WorldWideScience

Sample records for germcode gcr event-based

  1. Overview of the Graphical User Interface for the GERMcode (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERMcode calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERMcode also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERMcode accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERMcode for application to thick target experiments. The GERMcode provides scientists participating in NSRL experiments with the data needed for the interpretation of their
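The Poisson hit statistics mentioned above can be illustrated with a short sketch (a minimal illustration, not the GERMcode itself; the fluence and nuclear-area values are hypothetical): the expected number of ion traversals is the particle fluence times the projected cellular area, and the probability of k hits follows a Poisson distribution.

```python
import math

def hit_probabilities(fluence_per_cm2, area_um2, max_hits=5):
    """Poisson probabilities of k = 0..max_hits ion traversals of a cell
    nucleus, for a particle fluence (ions/cm^2) and nuclear area (um^2).
    Illustrative only; not the GERMcode implementation."""
    area_cm2 = area_um2 * 1e-8                  # 1 um^2 = 1e-8 cm^2
    mean_hits = fluence_per_cm2 * area_cm2      # expected traversals per cell
    return [math.exp(-mean_hits) * mean_hits**k / math.factorial(k)
            for k in range(max_hits + 1)]

# Hypothetical example: 1e6 ions/cm^2 on a 100 um^2 nucleus gives a mean of
# exactly 1 hit, so P(0 hits) = P(1 hit) = exp(-1) ~ 0.368.
probs = hit_probabilities(1e6, 100.0)
```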

  2. Mixed-field GCR Simulations for Radiobiological Research using Ground Based Accelerators

    Science.gov (United States)

    Kim, Myung-Hee Y.; Rusek, Adam; Cucinotta, Francis

    Space radiation comprises a large number of particle types and energies with differing ionization power, from high-energy protons to high charge and energy (HZE) particles and secondary neutrons produced by galactic cosmic rays (GCR). Ground-based accelerators such as the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL) are used to simulate space radiation for radiobiology research and for dosimetry, electronics parts, and shielding testing using mono-energetic beams of single ion species. As a tool to support research on new risk assessment models, we have developed a stochastic model of heavy ion beams and space radiation effects, the GCR Event-based Risk Model computer code (GERMcode). For radiobiological research on mixed-field space radiation, a new GCR simulator at NSRL is proposed. The NSRL-GCR simulator, which implements a rapid switching mode and higher-energy beam extraction up to 1.5 GeV/u, can integrate multiple ions into a single simulation to create a GCR Z-spectrum in major energy bins. After considering the GCR environment and the energy limitations of NSRL, a GCR reference field is proposed based on extensive simulation studies using the GERMcode. The GCR reference field is shown to reproduce the Z and LET spectra of GCR behind shielding to within 20 percent of simulated full GCR environments behind shielding. A major challenge for space radiobiology research is to relate chronic GCR exposure of up to 3 years to simulations with cell and animal models of human risks. We discuss possible approaches to map important biological time scales in experimental models using ground-based simulation with extended exposures of up to a few weeks and fractionation approaches at a GCR simulator.

  3. GCR Environmental Models I: Sensitivity Analysis for GCR Environments

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.

    2014-01-01

    Accurate galactic cosmic ray (GCR) models are required to assess crew exposure during long-duration missions to the Moon or Mars. Many of these models have been developed and compared to available measurements, with uncertainty estimates usually stated to be less than 15%. However, when the models are evaluated over a common epoch and propagated through to effective dose, relative differences exceeding 50% are observed. This indicates that the metrics used to communicate GCR model uncertainty can be better tied to exposure quantities of interest for shielding applications. This is the first of three papers focused on addressing this need. In this work, the focus is on quantifying the extent to which each GCR ion and energy group, prior to entering any shielding material or body tissue, contributes to effective dose behind shielding. Results can be used to more accurately calibrate model-free parameters and provide a mechanism for refocusing validation efforts on measurements taken over important energy regions. Results can also be used as references to guide future nuclear cross-section measurements and radiobiology experiments. It is found that GCR with Z>2 and boundary energies below 500 MeV/n induce less than 5% of the total effective dose behind shielding. This finding is important given that most of the GCR models are developed and validated against Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer (ACE/CRIS) measurements taken below 500 MeV/n. It is therefore possible for two models to very accurately reproduce the ACE/CRIS data while inducing very different effective dose values behind shielding.

  4. Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)

    Science.gov (United States)

    Kim, Myung-Hee; Cucinotta, Francis A.

    2010-01-01

    The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. 
The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their

  5. Isotopic dependence of GCR fluence behind shielding

    International Nuclear Information System (INIS)

    Cucinotta, Francis A.; Wilson, John W.; Saganti, Premkumar; Hu, Xiaodong; Kim, Myung-Hee Y.; Cleghorn, Timothy; Zeitlin, Cary; Tripathi, Ram K.

    2006-01-01

    In this paper we consider the effects of the isotopic composition of the primary galactic cosmic rays (GCR), nuclear fragmentation cross sections, and the isotopic-grid on the solution to transport models used for shielding studies. Satellite measurements are used to describe the isotopic composition of the GCR. For the nuclear interaction data-base and transport solution, we use the quantum multiple scattering theory of nuclear fragmentation (QMSFRG) and the high-charge and energy (HZETRN) transport code, respectively. The QMSFRG model is shown to accurately describe existing fragmentation data, including a proper description of the odd-even effects as a function of the isospin of the projectile nucleus. The principal finding of this study is that large errors (±100%) will occur in the mass-fluence spectra when comparing transport models that use a complete isotopic-grid (∼170 ions) to ones that use a reduced isotopic-grid, for example the 59-ion grid used in the HZETRN code in the past; however, less significant errors (<±20%) occur in the elemental-fluence spectra. Because a complete isotopic-grid is readily handled on small computer workstations and is needed for several applications studying GCR propagation and scattering, it is recommended that complete isotopic-grids be used for future GCR studies.

  6. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), the NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit with the opportunity for hands-on demonstrations. Brief descriptions of each tool are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties for beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  7. Characterization of GCR-lightlike warped product of indefinite Sasakian manifolds

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar

    2014-07-01

    In this paper we prove that there do not exist warped product GCR-lightlike submanifolds of the form M = N⊥ ×λ NT, where N⊥ is an anti-invariant submanifold tangent to V and NT an invariant submanifold of M‾, other than GCR-lightlike products in an indefinite Sasakian manifold. We also obtain characterization theorems for a GCR-lightlike submanifold to be locally a GCR-lightlike warped product.

  8. GCR and SPE Radiation Effects in Materials

    Science.gov (United States)

    Waller, Jess; Rojdev, Kristina; Nichols, Charles

    2016-01-01

    This Year 3 project provides risk reduction data to assess galactic cosmic ray (GCR) and solar particle event (SPE) space radiation damage in materials used in manned low-Earth orbit, lunar, interplanetary, and Martian surface missions. Long-duration (up to 50 years) space radiation damage is being quantified for materials used in inflatable structures (1st priority) and in space suit and habitable composite materials (2nd priority). The data collected have relevance for nonmetallic materials (polymers and composites) used in NASA missions where long-duration reliability is needed in continuous or intermittent space radiation fluxes.

  9. Opening a Window on ICME Evolution and GCR Modulation During Propagation in the Innermost Heliosphere

    Science.gov (United States)

    Winslow, R. M.; Lugaz, N.; Schwadron, N.; Farrugia, C. J.; Guo, J.; Wimmer-Schweingruber, R. F.; Wilson, J. K.; Joyce, C.; Jordan, A.; Lawrence, D. J.

    2017-12-01

    We use multipoint spacecraft observations to study interplanetary coronal mass ejection (ICME) evolution and subsequent galactic cosmic ray (GCR) modulation during propagation in the inner heliosphere. We illustrate ICME propagation effects through two different case studies. The first ICME was launched from the Sun on 29 December 2011 and was observed in near-perfect longitudinal conjunction at MESSENGER and STEREO A. Despite the close longitudinal alignment, we infer from force-free field modeling that the orientation of the underlying flux rope rotated ∼80° in latitude and ∼65° in longitude. Based on spacecraft measurements as well as ENLIL model simulations of the steady-state solar wind, we find that interactions involving magnetic reconnection with corotating structures in the solar wind dramatically alter the ICME magnetic field. In particular, we observed at STEREO A a highly turbulent region with distinct properties within the flux rope that was not observed at MESSENGER; we attribute this region to interaction between the ICME and a heliospheric plasma sheet/current sheet. This is a concrete example of a sequence of events that can increase the complexity of ICMEs during propagation and should serve as a caution against using very distant observations to predict the geoeffectiveness of large interplanetary transients. Our second case study investigates changes with heliospheric distance in GCR modulation by an ICME event (launched on 12 February 2014) observed in near-conjunction at all four of the inner solar system planets. The ICME caused Forbush decreases (FDs) in the GCR count rates at Mercury (MESSENGER), Earth/Moon (ACE/LRO), and Mars (MSL). At all three locations, the pre-ICME background GCR rate was well-matched, but the depth of the FD of GCR fluxes with similar energy ranges diminished with distance from the Sun. A larger difference in FD size was observed between Mercury and Earth than between Earth and Mars, partly owing to the much larger

  10. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  11. A Reference Field for GCR Simulation and an LET-Based Implementation at NSRL

    Science.gov (United States)

    Slaba, Tony C.; Blattnig, Steve R.; Walker, Steven A.; Norbury, John W.

    2015-01-01

    Exposure to galactic cosmic rays (GCR) on long duration deep space missions presents a serious health risk to astronauts, with large uncertainties connected to the biological response. In order to reduce the uncertainties and gain understanding about the basic mechanisms through which space radiation initiates cancer and other endpoints, radiobiology experiments are performed. Some of the accelerator facilities supporting such experiments have matured to a point where simulating the broad range of particles and energies characteristic of the GCR environment in a single experiment is feasible from a technology, usage, and cost perspective. In this work, several aspects of simulating the GCR environment in the laboratory are discussed. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at the NASA Space Radiation Laboratory (NSRL) limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is within physical uncertainties, allowing a single reference field for deep space missions to be defined. Third, an approach for simulating the reference field at NSRL is presented. The approach allows for the linear energy transfer (LET) spectrum of the reference field to be approximately represented with discrete ion and energy beams and implicitly maintains a reasonably accurate charge spectrum (or, average quality factor). Drawbacks of the proposed methodology are discussed and weighed against alternative simulation strategies. The neutron component and track structure characteristics of the proposed strategy are discussed in this context.
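The discrete-beam idea described above can be pictured with a minimal sketch (an illustration under stated assumptions, not the NSRL implementation): partition a continuous LET fluence spectrum into log-spaced bins and represent each bin by one beam carrying the bin's total fluence at its fluence-weighted mean LET. This construction preserves both the total fluence and the overall fluence-weighted mean LET of the reference field.

```python
import numpy as np

def discretize_let_spectrum(let_values, fluences, n_beams):
    """Approximate a continuous LET fluence spectrum with n_beams discrete
    beams: partition the LET axis into log-spaced bins and give each beam
    the bin's total fluence at its fluence-weighted mean LET.
    Sketch only; not the NSRL beam-selection algorithm."""
    edges = np.logspace(np.log10(let_values.min()),
                        np.log10(let_values.max()), n_beams + 1)
    # Assign each spectrum point to a bin (clip so the max LET lands in the last bin).
    idx = np.clip(np.digitize(let_values, edges) - 1, 0, n_beams - 1)
    beam_fluence = np.zeros(n_beams)
    beam_let = np.zeros(n_beams)
    for b in range(n_beams):
        mask = idx == b
        if mask.any():
            w = fluences[mask]
            beam_fluence[b] = w.sum()
            beam_let[b] = np.average(let_values[mask], weights=w)
    return beam_let, beam_fluence
```

By construction, summing beam fluences recovers the total fluence, and summing LET-weighted beam fluences recovers the spectrum's total LET-weighted fluence, so the average quality factor computed from a piecewise-constant Q(LET) is approximately maintained.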

  12. Solar Energetic Particles (SEP) and Galactic Cosmic Rays (GCR) as tracers of solar wind conditions near Saturn: Event lists and applications

    Science.gov (United States)

    Roussos, E.; Jackman, C. M.; Thomsen, M. F.; Kurth, W. S.; Badman, S. V.; Paranicas, C.; Kollmann, P.; Krupp, N.; Bučík, R.; Mitchell, D. G.; Krimigis, S. M.; Hamilton, D. C.; Radioti, A.

    2018-01-01

    The lack of an upstream solar wind monitor poses a major challenge to any study that investigates the influence of the solar wind on the configuration and the dynamics of Saturn's magnetosphere. Here we show how Cassini MIMI/LEMMS observations of Solar Energetic Particle (SEP) and Galactic Cosmic Ray (GCR) transients, both linked to energetic processes in the heliosphere such as Interplanetary Coronal Mass Ejections (ICMEs) and Corotating Interaction Regions (CIRs), can be used to trace enhanced solar wind conditions at Saturn's distance. SEP protons can be easily distinguished from magnetospheric ions, particularly at the MeV energy range. Many SEPs are also accompanied by strong GCR Forbush decreases. GCRs are detectable as a low count-rate noise signal in a large number of LEMMS channels. As SEPs and GCRs can easily penetrate into the outer and middle magnetosphere, they can be monitored continuously, even when Cassini is not situated in the solar wind. A survey of the MIMI/LEMMS dataset between 2004 and 2016 resulted in the identification of 46 SEP events. Most events last more than two weeks and have their lowest occurrence rate around the extended solar minimum between 2008 and 2010, suggesting that they are associated with ICMEs rather than CIRs, which are the main source of activity during the declining phase and the minimum of the solar cycle. We also list 17 time periods (>50 days each) in which GCRs show a clear solar periodicity (∼13 or 26 days). The 13-day period, which derives from two CIRs per solar rotation, dominates over the 26-day period in only one of the 17 cases catalogued. This interval belongs to the second half of 2008, when expansions of Saturn's electron radiation belts were previously reported to show a similar periodicity. That observation not only links the variability of Saturn's electron belts to solar wind processes, but also indicates that the source of the observed periodicity in GCRs may be local. In this case GCR

  13. Preliminary Sensitivity Study on Gas-Cooled Reactor for NHDD System Using MARS-GCR

    International Nuclear Information System (INIS)

    Lee, Seung Wook; Jeong, Jae Jun; Lee, Won Jae

    2005-01-01

    A Gas-Cooled Reactor (GCR) is considered one of the most promising tools for massive hydrogen production without CO2 emission. To date, two types of GCR are regarded as viable nuclear reactors for hydrogen production: the Prismatic Modular Reactor (PMR) and the Pebble Bed Reactor (PBR). In this paper, a preliminary sensitivity study on the two types of GCR is carried out using MARS-GCR to determine the effect of reactor inlet temperature, outlet temperature, and system pressure on the peak fuel and reactor pressure vessel (RPV) temperatures for both the PMR and the PBR.

  14. Trehalose, glycogen and ethanol metabolism in the gcr1 mutant of Saccharomyces cerevisiae

    DEFF Research Database (Denmark)

    Seker, Tamay; Hamamci, H.

    2003-01-01

    Since Gcr1p is pivotal in controlling the transcription of glycolytic enzymes and trehalose metabolism seems to be one of the control points of glycolysis, we examined trehalose and glycogen synthesis in response to a 2% glucose pulse during batch growth in a gcr1 (glucose regulation-1) mutant lacking a fully functional glycolytic pathway and in the wild-type strain. An increase in both trehalose and glycogen stores was observed 1 and 2 h after the pulse, followed by a steady decrease in both the wild-type and the gcr1 mutant. The accumulation was faster, while the subsequent degradation was slower, in gcr1 cells compared to wild-type ones. Although there was no distinct glucose consumption in the mutant cells, it seems that the glucose repression mechanism is similar in the gcr1 mutant and in the wild-type strain, at least with respect to trehalose and glycogen metabolism.

  15. Evaluation of abrasion of a drainage mixture modified with crushed rubber waste (GCR)

    Directory of Open Access Journals (Sweden)

    Yee Wan Yung Vargas

    2017-02-01

    Conclusion: The results showed a marked influence of the mixing temperature (between asphalt and GCR) and the compaction temperature (modified asphalt and aggregate) on the behavior of the MD modified with GCR.

  16. GCR flux 9-day variations with LISA Pathfinder

    International Nuclear Information System (INIS)

    Grimani, C; Benella, S; Fabi, M; Finetti, N; Telloni, D

    2017-01-01

    Galactic cosmic-ray (GCR) energy spectra in the heliosphere vary with the level of solar activity, the state of solar polarity, and interplanetary transient magnetic structures of solar origin. A high counting rate particle detector (PD) aboard LISA Pathfinder (LPF) allows for the measurement of galactic cosmic-ray and solar energetic particle (SEP) integral fluxes at energies > 70 MeV n⁻¹ at rates up to 6500 counts s⁻¹. Data are gathered with a sampling time of 15 s. A study of GCR flux depressions associated with the third harmonic of the Sun's rotation period (∼9 days) is presented here. (paper)
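A minimal sketch of how such a periodicity could be pulled out of a count-rate series (the data below are synthetic, not LPF measurements): build an hourly series with a small ∼9-day modulation on a constant background and locate the dominant peak of its periodogram.

```python
import numpy as np

# Synthetic count-rate series (hypothetical, not LPF data): hourly samples
# over 180 days with a 2% ~9-day modulation on a 1000 counts/s background.
rng = np.random.default_rng(0)
dt_days = 1.0 / 24.0
t = np.arange(0, 180, dt_days)
rate = 1000.0 * (1.0 + 0.02 * np.cos(2 * np.pi * t / 9.0)) \
       + rng.normal(0.0, 1.0, t.size)

# Periodogram of the mean-subtracted series; the strongest peak should sit
# near the 9-day third harmonic of the solar rotation period.
spec = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
freq = np.fft.rfftfreq(t.size, d=dt_days)     # cycles per day
peak_period_days = 1.0 / freq[np.argmax(spec[1:]) + 1]
```

In practice, a Lomb-Scargle periodogram would be preferable for real data with gaps or uneven sampling.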

  17. Opening a Window on ICME-driven GCR Modulation in the Inner Solar System

    Science.gov (United States)

    Winslow, Reka M.; Schwadron, Nathan A.; Lugaz, Noé; Guo, Jingnan; Joyce, Colin J.; Jordan, Andrew P.; Wilson, Jody K.; Spence, Harlan E.; Lawrence, David J.; Wimmer-Schweingruber, Robert F.; Mays, M. Leila

    2018-04-01

    Interplanetary coronal mass ejections (ICMEs) often cause Forbush decreases (Fds) in the flux of galactic cosmic rays (GCRs). We investigate how a single ICME, launched from the Sun on 2014 February 12, affected GCR fluxes at Mercury, Earth, and Mars. We use GCR observations from MESSENGER at Mercury, ACE/LRO at the Earth/Moon, and MSL at Mars. We find that Fds are steeper and deeper closer to the Sun, and that the magnitude of the magnetic field in the ICME magnetic ejecta as well as the “strength” of the ICME sheath both play a large role in modulating the depth of the Fd. Based on our results, we hypothesize that (1) the Fd size decreases exponentially with heliocentric distance, and (2) that two-step Fds are more common closer to the Sun. Both hypotheses will be directly verifiable by the upcoming Parker Solar Probe and Solar Orbiter missions. This investigation provides the first systematic study of the changes in GCR modulation as a function of distance from the Sun using nearly contemporaneous observations at Mercury, Earth/Moon, and Mars, which will be critical for validating our physical understanding of the modulation process throughout the heliosphere.
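The first hypothesis above suggests a simple fitting recipe, sketched below with purely illustrative FD magnitudes (not the MESSENGER/ACE/MSL values): if FD(r) = FD0·exp(-r/L), then ln FD is linear in heliocentric distance r, so an ordinary least-squares line through (r, ln FD) yields the decay length L.

```python
import numpy as np

# Hypothetical Forbush-decrease magnitudes (%) at Mercury, Earth, and Mars
# distances; illustrative values only, not the measurements in this study.
r_au = np.array([0.39, 1.0, 1.5])
fd_pct = np.array([18.0, 7.0, 5.0])

# FD(r) = FD0 * exp(-r / L)  =>  ln FD = ln FD0 - r / L (linear in r),
# so a degree-1 polynomial fit recovers both parameters.
slope, intercept = np.polyfit(r_au, np.log(fd_pct), 1)
decay_length_au = -1.0 / slope        # e-folding distance L in AU
fd0_pct = np.exp(intercept)           # extrapolated amplitude at r = 0
```

With only three points the fit is under-constrained, but the same recipe applies directly once Parker Solar Probe and Solar Orbiter add measurements at smaller r.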

  18. The GCR2 gene family is not required for ABA control of seed germination and early seedling development in Arabidopsis.

    Directory of Open Access Journals (Sweden)

    Jianjun Guo

    BACKGROUND: The plant hormone abscisic acid (ABA) regulates diverse processes of plant growth and development. It has recently been proposed that GCR2 functions as a G-protein-coupled receptor (GPCR) for ABA. However, the structural relationships and functionality of GCR2 have been challenged by several independent studies. A central question in this controversy is whether gcr2 mutants are insensitive to ABA, because gcr2 mutants were shown to display reduced sensitivity to ABA under one experimental condition (e.g., 22 °C, continuous white light at 150 μmol m⁻² s⁻¹) but wild-type sensitivity under another, slightly different condition (e.g., 23 °C, 14/10 h photoperiod at 120 μmol m⁻² s⁻¹). It has been hypothesized that gcr2 appears only weakly insensitive to ABA because two other GCR2-like genes in Arabidopsis, GCL1 and GCL2, compensate for the loss of function of GCR2. PRINCIPAL FINDINGS: In order to test this hypothesis, we isolated a putative loss-of-function allele of GCL2, and then generated all possible combinations of mutations in each member of the GCR2 gene family. We found that all double mutants, including gcr2 gcl1, gcr2 gcl2, and gcl1 gcl2, as well as the gcr2 gcl1 gcl2 triple mutant, displayed wild-type sensitivity to ABA in seed germination and early seedling development assays, demonstrating that the GCR2 gene family is not required for ABA responses in these processes. CONCLUSION: These results provide compelling genetic evidence that GCR2 is unlikely to act as a receptor for ABA in the context of either seed germination or early seedling development.

  19. Model for GCR-particle fluxes in stony meteorites and production rates of cosmogenic nuclides

    International Nuclear Information System (INIS)

    Reedy, R.C.

    1984-01-01

    A model is presented for the differential fluxes of galactic-cosmic-ray (GCR) particles with energies above 1 MeV inside any spherical stony meteorite as a function of the meteorite's radius and the sample's depth. This model is based on the Reedy-Arnold equations for the energy-dependent fluxes of GCR particles in the moon and is an extension of flux parameters that were derived for several meteorites of various sizes. This flux is used to calculate the production rates of many cosmogenic nuclides as a function of radius and depth. The peak production rates for most nuclides made by the reactions of energetic GCR particles occur near the centers of meteorites with radii of 40 to 70 g cm⁻². Although the model has some limitations, it reproduces well the basic trends for the depth-dependent production of cosmogenic nuclides in stony meteorites of various radii. These production profiles agree fairly well with measurements of cosmogenic nuclides in meteorites. Some of these production profiles differ from those calculated by others. The chemical dependence of the production rates for several nuclides varies with size and depth. 25 references, 8 figures

  20. GCR1, a transcriptional activator in Saccharomyces cerevisiae, complexes with RAP1 and can function without its DNA binding domain.

    Science.gov (United States)

    Tornow, J; Zeng, X; Gao, W; Santangelo, G M

    1993-01-01

    In Saccharomyces cerevisiae, efficient expression of glycolytic and translational component genes requires two DNA binding proteins, RAP1 (which binds to UASRPG) and GCR1 (which binds to the CT box). We generated deletions in GCR1 to test the validity of several different models for GCR1 function. We report here that the C-terminal half of GCR1, which includes the domain required for DNA binding to the CT box in vitro, can be removed without affecting GCR1-dependent transcription of either the glycolytic gene ADH1 or the translational component genes TEF1 and TEF2. We have also identified an activation domain within a segment of the GCR1 protein (the N-terminal third) that is essential for in vivo function. RAP1 and GCR1 can be co-immunoprecipitated from whole cell extracts, suggesting that they form a complex in vivo. The data are most consistent with a model in which GCR1 is attracted to DNA through contact with RAP1. PMID:8508768

  1. The UK MK III GCR experimental physics programme at AEE Winfrith

    Energy Technology Data Exchange (ETDEWEB)

    Johnstone, I

    1972-06-15

    The UK programme of reactor physics experiments in support of the Mk III GCR project started in 1968/69 and has now reached its third main phase. The overall programme is broadly summarised in this report.

  2. Unigenic Evolution: A Novel Genetic Method Localizes a Putative Leucine Zipper That Mediates Dimerization of the Saccharomyces cerevisiae Regulator Gcr1p

    Science.gov (United States)

    Deminoff, S. J.; Tornow, J.; Santangelo, G. M.

    1995-01-01

    The GCR1 gene of Saccharomyces cerevisiae encodes a transcriptional activator that complexes with Rap1p and, through UAS(RPG) elements (Rap1p DNA binding sites), stimulates efficient expression of glycolytic and translational component genes. To map the functionally important domains in Gcr1p, we combined multiple rounds of random mutagenesis in vitro with in vivo selection of functional genes to locate conserved, or hypomutable, regions. We name this method unigenic evolution, a statistical analysis of mutations in evolutionary variants of a single gene in an otherwise isogenic background. Examination of the distribution of 315 mutations in 24 variant alleles allowed the localization of four hypomutable regions in GCR1 (A, B, C, and D). Dispensable N-terminal (intronic) and C-terminal portions of the evolved region of GCR1 were included in the analysis as controls and were, as expected, not hypomutable. The analysis of several insertion, deletion, and point mutations, combined with a comparison of the hypomutability and hydrophobicity plots of Gcr1p, suggested that some of the hypomutable regions may individually or in combination correspond to functionally important surface domains. In particular, we determined that region D contains a putative leucine zipper and is necessary and sufficient for Gcr1p homodimerization. PMID:8601472

  3. EDF's (Electricite de France) in service control for GCR type reactor vessels

    International Nuclear Information System (INIS)

    Douillet, M.G.

    1979-01-01

    This paper presents the performance of the data acquisition and processing systems developed by the French EDF for controlling and testing the mechanical properties (thermal stresses, deformations, cracks, etc.) of prestressed concrete vessels for GCR type reactors.

  4. Efficient transcription of the glycolytic gene ADH1 and three translational component genes requires the GCR1 product, which can act through TUF/GRF/RAP binding sites.

    OpenAIRE

    Santangelo, G M; Tornow, J

    1990-01-01

    Glycolytic gene expression in Saccharomyces cerevisiae is thought to be activated by the GCR and TUF proteins. We tested the hypothesis that GCR function is mediated by TUF/GRF/RAP binding sites (UASRPG elements). We found that UASRPG-dependent activation of a heterologous gene and transcription of ADH1, TEF1, TEF2, and RP59 were sensitive to GCR1 disruption. GCR is not required for TUF/GRF/RAP expression or in vitro DNA-binding activity.

  5. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    Science.gov (United States)

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

    German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise (low RSD), linear (r > 0.99 over the range 0.01-1384 fmol/μL), and sensitive (low LLOD and LLOQ). Using LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies. © 2017 The Authors. Clinical & Experimental Allergy published by John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.
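    Absolute quantification by MRM ultimately rests on a calibration curve: peak areas measured for known amounts of a reference peptide are fit by linear regression, and the fit is inverted for unknowns. A schematic sketch of that step (the peptide amounts and peak areas are invented for illustration; the actual assay's calibration design is not described in the abstract):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration points: known amounts (fmol/uL) vs. MRM peak areas.
amounts = [0.1, 1.0, 10.0, 100.0]
areas = [52.0, 498.0, 5010.0, 49900.0]
slope, intercept = fit_line(amounts, areas)

def quantify(area):
    """Invert the calibration to estimate allergen amount in an unknown run."""
    return (area - intercept) / slope
```

The linearity figure quoted in the abstract (r > 0.99 over 0.01-1384 fmol/μL) is exactly the quality of such a fit over the assay's working range.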

  7. Constitutive Modeling of the Flow Stress of GCr15 Continuous Casting Bloom in the Heavy Reduction Process

    Science.gov (United States)

    Ji, Cheng; Wang, Zilin; Wu, Chenhui; Zhu, Miaoyong

    2018-04-01

    According to the calculation results of a 3D thermomechanical-coupled finite-element (FE) model of GCr15 bearing steel bloom during a heavy reduction (HR) process, the variation ranges of the strain rate and strain under HR were determined. In addition, the hot deformation behavior of the GCr15 bearing steel was studied over the temperature range from 1023 K to 1573 K (750 °C to 1300 °C) at strain rates of 0.001, 0.01, and 0.1 s⁻¹ in single-pass thermosimulation compression experiments. To ensure the accuracy of the constitutive model, the temperature range was divided into two intervals at the fully austenitic temperature of GCr15 steel [1173 K (900 °C)]. Two sets of material parameters for the constitutive model were derived from the true stress-strain curves of the two temperature intervals. A flow stress constitutive model was established using a revised Arrhenius-type constitutive equation, which considers the relationships among the material parameters and the true strain and describes dynamic softening during hot compression. Considering the effect of glide and climb on the deformation mechanism, the Arrhenius-type constitutive equation was further modified by a physically based approach. The model is most accurate over temperatures ranging from 1173 K to 1573 K (900 °C to 1300 °C) under HR deformation conditions (excluding the range from 1273 K to 1573 K (1000 °C to 1300 °C) at a strain rate of 0.1 s⁻¹). To ensure the convergence of the FE calculation, an approximate method was used to estimate the flow stress at temperatures greater than 1573 K (1300 °C).
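    The Arrhenius-type constitutive equation family referenced above is conventionally written via the Zener-Hollomon parameter, Z = ε̇·exp(Q/RT), with flow stress σ = (1/α)·arcsinh[(Z/A)^(1/n)]. A sketch of that standard hyperbolic-sine form (the parameter values below are placeholders, not the fitted GCr15 values from the paper):

```python
from math import asinh, exp

R = 8.314  # universal gas constant, J/(mol*K)

def flow_stress(strain_rate, T, Q, A, alpha, n):
    """Hyperbolic-sine Arrhenius flow stress (MPa).
    Z is the Zener-Hollomon parameter combining strain rate (1/s) and
    absolute temperature T (K); Q is the deformation activation energy."""
    Z = strain_rate * exp(Q / (R * T))
    return asinh((Z / A) ** (1.0 / n)) / alpha

# Illustrative parameters only (not the paper's fitted material constants).
sigma = flow_stress(strain_rate=0.01, T=1323.0,
                    Q=350e3, A=1.2e13, alpha=0.012, n=5.0)
```

The revised form in the paper additionally makes Q, A, α, and n functions of true strain, which is what lets one equation follow the dynamic-softening portion of the measured stress-strain curves.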

  8. Elemental GCR Observations during the 2009-2010 Solar Minimum Period

    Science.gov (United States)

    Lave, K. A.; Israel, M. H.; Binns, W. R.; Christian, E. R.; Cummings, A. C.; Davis, A. J.; deNolfo, G. A.; Leske, R. A.; Mewaldt, R. A.; Stone, E. C.; hide

    2013-01-01

    Using observations from the Cosmic Ray Isotope Spectrometer (CRIS) onboard the Advanced Composition Explorer (ACE), we present new measurements of the galactic cosmic ray (GCR) elemental composition and energy spectra for the species B through Ni in the energy range approx. 50-550 MeV/nucleon during the record setting 2009-2010 solar minimum period. These data are compared with our observations from the 1997-1998 solar minimum period, when solar modulation in the heliosphere was somewhat higher. For these species, we find that the intensities during the 2009-2010 solar minimum were approx. 20% higher than those in the previous solar minimum, and in fact were the highest GCR intensities recorded during the space age. Relative abundances for these species during the two solar minimum periods differed by small but statistically significant amounts, which are attributed to the combination of spectral shape differences between primary and secondary GCRs in the interstellar medium and differences between the levels of solar modulation in the two solar minima. We also present the secondary-to-primary ratios B/C and (Sc+Ti+V)/Fe for both solar minimum periods, and demonstrate that these ratios are reasonably well fit by a simple "leaky-box" galactic transport model that is combined with a spherically symmetric solar modulation model.
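    For orientation, the steady-state leaky-box prediction for a purely secondary species has a simple closed form: the secondary-to-primary ratio grows with the escape mean free path and is damped by destruction of the secondary. A sketch under that textbook model (the path lengths below are round illustrative numbers, not the values fitted to the CRIS data):

```python
def secondary_to_primary(lam_esc, lam_prod, lam_loss):
    """Steady-state leaky-box ratio of a secondary to its primary parent.
    lam_esc : escape mean free path from the Galaxy (g/cm^2)
    lam_prod: spallation path length for primary -> secondary production
    lam_loss: total destruction path length of the secondary
    """
    return (lam_esc / lam_prod) / (1.0 + lam_esc / lam_loss)

# Illustrative values in g/cm^2: escape ~7, C->B production ~25, B loss ~10.
bc = secondary_to_primary(7.0, 25.0, 10.0)  # roughly 0.16
```

The energy dependence of B/C then enters through the energy dependence of the escape path length, which is what ratio measurements like these constrain.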

  9. Transient Cosmic-ray Events beyond the Heliopause: Interpreting Voyager-1 Observations

    Energy Technology Data Exchange (ETDEWEB)

    Kóta, J.; Jokipii, J. R. [Lunar and Planetary Laboratory, University of Arizona, Tucson, AZ 85721-0092 (United States)

    2017-04-20

    In 2013 March and 2014 May, Voyager-1 (V1) experienced small but significant increases in the flux of galactic cosmic rays (GCRs) in the hundred MeV/n range. Additionally, V1 also saw episodic depletion of GCR flux around perpendicular pitch angles. We discuss the pitch-angle distribution and the time profiles of these events. In a previous paper, we interpreted the 2013 "bump" as the GCRs remotely sensing a shock that reached the magnetic field line passing through V1: particles gained energy as they were reflected on the approaching region of the stronger magnetic field of the disturbance. Here, we point out that energy gain is not restricted to reflected particles; GCRs passing through the disturbance also gain energy. The effect should be present in a broad range of pitch angles, with the maximum increase of GCR intensity predicted to occur at the critical reflection angle. In this paper, the shock is not step-like but a gradual increase of the magnetic field strength, B, over a few days, in agreement with V1 measurements. This smooths the profile of the predicted bump in the GCR flux. We also address the linear episodic decreases seen around perpendicular pitch angles. These events are interpreted in terms of adiabatic cooling behind the shock due to the slow weakening of B. We present simple numerical model calculations and find that a gradual shock followed by a slow decrease of B, as observed, may account for both the episodic increases and the anisotropic depletion of GCR fluxes.

  10. Criteria for confirming sequence periodicity identified by Fourier transform analysis: application to GCR2, a candidate plant GPCR?

    Science.gov (United States)

    Illingworth, Christopher J R; Parkes, Kevin E; Snell, Christopher R; Mullineaux, Philip M; Reynolds, Christopher A

    2008-03-01

    Methods to determine periodicity in protein sequences are useful for inferring function. Fourier transformation is one approach, but care is required to ensure the periodicity is genuine. Here we have shown that empirically derived statistical tables can be used as a measure of significance. Genuine protein sequence data, rather than randomly generated sequences, were used as the statistical backdrop. The method has been applied to G-protein coupled receptor (GPCR) sequences by Fourier transformation of hydrophobicity values, codon frequencies, and the extent of over-representation of codon pairs; the latter being related to translational step times. Genuine periodicity was observed in the hydrophobicity, whereas the apparent periodicity (as inferred from previously reported measures) in the translation step times was not validated statistically. GCR2 has recently been proposed as the plant GPCR receptor for the hormone abscisic acid. It has homology to the Lanthionine synthetase C-like family of proteins, an observation confirmed by fold recognition. Application of the Fourier transform algorithm to the GCR2 family revealed strongly predicted sevenfold periodicity in hydrophobicity, suggesting why GCR2 has been reported to be a GPCR despite negative indications in most transmembrane prediction algorithms. The underlying multiple sequence alignment, also required for the Fourier transform analysis of periodicity, indicated that the hydrophobic regions around the 7 GXXG motifs commence near the C-terminal end of each of the 7 inner helices of the alpha-toroid and continue to the N-terminal region of the helix. The results clearly explain why GCR2 has been understandably but erroneously predicted to be a GPCR.
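    The Fourier step itself can be sketched compactly: convert the sequence to hydropathy values (the standard Kyte-Doolittle scale is used here), mean-center, and take the power spectrum; repeated hydrophobic structure shows up as a dominant frequency bin at n/period. This toy version omits the paper's empirical significance tables, and the test sequence is synthetic:

```python
import cmath

# Standard Kyte-Doolittle hydropathy scale.
KD = {'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
      'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
      'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
      'Y': -1.3, 'V': 4.2}

def power_spectrum(seq):
    """DFT power of the mean-centered hydropathy profile; bin k
    corresponds to a periodicity of len(seq)/k residues."""
    vals = [KD[a] for a in seq]
    mean = sum(vals) / len(vals)
    vals = [v - mean for v in vals]
    n = len(vals)
    return [abs(sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, v in enumerate(vals))) ** 2
            for k in range(1, n // 2 + 1)]

# Synthetic sequence with hydropathy period 4 (n = 32, so bin k = 8 dominates).
spec = power_spectrum("ILKE" * 8)
```

For a genuine seven-transmembrane candidate the question is whether the peak is significant against the empirical background, which is exactly where the statistical tables come in.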

  11. GCR flux reconstruction during the last three centuries validated by the Ti-44 in meteorites and Be-10 in ice

    Science.gov (United States)

    Cini Castagnoli, G.; Cane, D.; Taricco, C.; Bhandari, N.

    2003-04-01

    In a previous work [1] we deduced that during the prolonged minima of solar activity since 1700 the galactic cosmic ray (GCR) flux was roughly twice as high as what can be inferred from GCR modulation based solely on the Sunspot Number series. This flux was also higher than what has been observed in recent decades by neutron monitors and by balloon- and spacecraft-borne detectors, and was confirmed by the three fresh-fall meteorites that we measured during solar cycle 22. Recently we deduced the GCR annual mean spectra for the last 300 years [2], starting from the open solar magnetic flux proposed by Solanki et al. [3]. Using the GCR flux we calculated the 44Ti (T1/2 = 59.2 y) activity in meteorites, taking into account the cross sections for its production from the main target elements Fe and Ni. We compare the calculated activity with our measurements of cosmogenic 44Ti in different chondrites that fell in the period 1810-1997. The results are in close agreement in both phase and amplitude. The same procedure was adopted to calculate the production rate of 10Be in the atmosphere. Normalizing to the ice concentration during solar cycles 20 and 21, we obtain good agreement with the 10Be profile in the Dye3 core [4]. These results demonstrate that our inference of the GCR flux over the past 300 years is reliable. [1] Bonino G., Cini Castagnoli G., Bhandari N., Taricco C., Science, 270, 1648, 1995. [2] Bonino G., Cini Castagnoli G., Cane D., Taricco C. and Bhandari N., Proc. XXVII Intern. Cosmic Ray Conf. (Hamburg, 2001) 3769-3772. [3] Solanki S.K., Schüssler M. and Fligge M., Nature, 408, 445, 2000. [4] Beer J. et al., private communication.
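    The 44Ti comparison relies on a standard production-decay balance: the nuclide inventory N obeys dN/dt = P(t) − λN, where P(t) tracks the GCR flux, and the measured quantity is the activity λN at the fall date. A minimal numerical sketch of that balance (simple Euler integration; the production history here is illustrative, not the reconstructed flux):

```python
from math import log

HALF_LIFE = 59.2          # 44Ti half-life in years
LAM = log(2) / HALF_LIFE  # decay constant, 1/yr

def activity_at_fall(production, dt=1.0):
    """Integrate dN/dt = P(t) - LAM*N over a yearly production history
    (values proportional to the GCR flux) and return the activity LAM*N
    at the end of the series (the meteorite's fall date)."""
    n = 0.0
    for p in production:
        n += (p - LAM * n) * dt
    return LAM * n
```

A constant production rate drives the activity to saturation at the production rate itself, which is why a meteorite's 44Ti activity integrates roughly the last few half-lives (on the order of a century or two) of GCR flux, making a suite of falls from 1810-1997 a record of the modulation history.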

  12. Energetic particles in the heliosphere and GCR modulation: Reviewing of SH-posters

    International Nuclear Information System (INIS)

    Struminsky, Alexei

    2013-01-01

    This rapporteur paper addresses the SH poster session titled 'Energetic particles in the heliosphere (solar and anomalous CRs, GCR modulation)' of the 23rd European Cosmic Ray Symposium (ECRS) and the 32nd Russian Cosmic Ray Conference (RCRC). The 65 posters presented are tentatively divided into five sections: Instruments and Methods; Solar Energetic Particles; Short Term Variations; Long Term Variations; Heliosphere.

  13. Model for spatial synthesis of the automated control system of a GCR type reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lazarevic, B; Matausek, M [Institut za nuklearne nauke ' Boris Kidric' , Vinca, Belgrade (Yugoslavia)

    1966-07-01

    This paper describes a model developed for the synthesis of the spatial distribution of automated control elements in the reactor. It is a general, reliable mathematical model for analyzing transient states and for synthesizing the automated control and regulation systems of GCR type reactors. A one-dimensional system was defined under the assumption that the time dependence of the parameters of the neutron diffusion equation is identical throughout the reactor volume and that the spatial distribution of neutrons is time independent. It is shown that this assumption is satisfactory for the short-term variations that are relevant to safety analysis.

  14. In vitro manganese-dependent cross-talk between Streptococcus mutans VicK and GcrR: implications for overlapping stress response pathways.

    Directory of Open Access Journals (Sweden)

    Jennifer S Downey

    Streptococcus mutans, a major acidogenic component of the dental plaque biofilm, has a key role in caries etiology. Previously, we demonstrated that the VicRK two-component signal transduction system modulates biofilm formation, oxidative stress and acid tolerance responses in S. mutans. Using in vitro phosphorylation assays, we demonstrate here for the first time that, in addition to activating its cognate response regulator protein, the sensor kinase VicK can transphosphorylate a non-cognate stress regulatory response regulator, GcrR, in the presence of manganese. Manganese is an important micronutrient that has previously been correlated with caries incidence, and which serves as an effector of SloR-mediated metalloregulation in S. mutans. Our findings support regulatory effects of manganese on VicRK, GcrR and SloR; the cross-regulatory networks formed by these components are more complex than previously appreciated. Using DNaseI footprinting we observed overlapping DNA binding specificities for VicR and GcrR in native promoters, consistent with these proteins being part of the same transcriptional regulon. Our results also support a role for SloR as a positive regulator of the vicRK two-component signaling system, since its transcription was drastically reduced in a SloR-deficient mutant. These findings demonstrate the regulatory complexities of the S. mutans manganese-dependent response, which involves cross-talk between non-cognate signal transduction systems (VicRK and GcrR) to modulate stress response pathways.

  15. A 3D Monte Carlo model of radiation affecting cells, and its application to neuronal cells and GCR irradiation

    Science.gov (United States)

    Ponomarev, Artem; Sundaresan, Alamelu; Kim, Angela; Vazquez, Marcelo E.; Guida, Peter; Kim, Myung-Hee; Cucinotta, Francis A.

    A 3D Monte Carlo model of radiation transport in matter is applied to study the effect of heavy ion radiation on human neuronal cells. Central nervous system effects, including cognitive impairment, are suspected from the heavy ion component of galactic cosmic radiation (GCR) during space missions. The model can count, for instance, the number of direct hits from ions, which have the greatest effect on the cells. For comparison, remote hits, received through δ-rays from projectiles traversing space outside the volume of the cell, are also simulated and their contribution is estimated. To simulate tissue effects of irradiation, cellular matrices of neuronal cells derived from confocal microscopy were simulated in our model. To produce this realistic model of brain tissue, image segmentation was used to identify cells in images of cell cultures. The segmented cells were inserted pixel by pixel into the modeled physical space, which represents a volume of interacting cells with periodic boundary conditions (PBCs). PBCs were used to extrapolate the model results to macroscopic tissue structures. Specific spatial patterns of cell apoptosis are expected from GCR, as heavy ions produce concentrated damage along their trajectories. The apoptotic cell patterns were modeled based on the action cross sections for apoptosis, which were estimated from available experimental data. The cell patterns were characterized with an autocorrelation function, whose values are higher for non-random cell patterns, and the values of the autocorrelation function were compared for X-ray and Fe ion irradiations. The autocorrelation function indicates the directionality effects present in apoptotic neuronal cells after GCR exposure.
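    The autocorrelation measure described above can be sketched directly on a binary apoptosis mask: cells concentrated along an ion track score high at offsets parallel to the track and near zero elsewhere, while a random pattern scores uniformly low. A toy 2D version using the same periodic boundary conditions as the model (the mask is synthetic; the paper's cross-section-based simulation is not reproduced):

```python
def autocorr(mask, dx, dy):
    """Normalized autocorrelation of a binary apoptosis mask at offset
    (dx, dy), with periodic boundary conditions as in the tissue model."""
    h, w = len(mask), len(mask[0])
    num = sum(mask[y][x] * mask[(y + dy) % h][(x + dx) % w]
              for y in range(h) for x in range(w))
    den = sum(mask[y][x] for y in range(h) for x in range(w))
    return num / den if den else 0.0

# Synthetic "ion track": apoptotic cells along one column of an 8x8 grid.
track = [[1 if x == 2 else 0 for x in range(8)] for y in range(8)]
```

For the track pattern, the autocorrelation is 1.0 at a unit offset along the track and 0.0 perpendicular to it, the directional signature that distinguishes GCR-like damage from the isotropic patterns expected from X rays.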

  16. Simulation of the GCR spectrum in the Mars curiosity rover's RAD detector using MCNP6

    Science.gov (United States)

    Ratliff, Hunter N.; Smith, Michael B. R.; Heilbronn, Lawrence

    2017-08-01

    The paper presents results from MCNP6 simulations of galactic cosmic ray (GCR) propagation down through the Martian atmosphere to the surface, and a comparison with RAD measurements made there. This effort is part of a collaborative modeling workshop for space radiation hosted by the Southwest Research Institute (SwRI). All modeling teams were tasked with simulating the galactic cosmic ray (GCR) spectrum through the Martian atmosphere and the Radiation Assessment Detector (RAD) on board the Curiosity rover. The detector had two separate particle acceptance angles, 4π and 30° off zenith. All ions with Z = 1 through Z = 28 were tracked in both scenarios, while some additional secondary particles were only tracked in the 4π cases. The MCNP6 4π absorbed dose rate was 307.3 ± 1.3 μGy/day while RAD measured 233 μGy/day. Using the ICRP-60 dose equivalent conversion factors built into MCNP6, the simulated 4π dose equivalent rate was found to be 473.1 ± 2.4 μSv/day while RAD reported 710 μSv/day.
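    The ICRP-60 conversion mentioned above weights absorbed dose by a quality factor Q that depends on unrestricted LET L (in keV/μm): Q = 1 below 10, Q = 0.32L − 2.2 between 10 and 100, and Q = 300/√L above 100. A sketch of that weighting (the dose/LET components in the example are illustrative, not the simulated Mars spectrum):

```python
from math import sqrt

def q_icrp60(L):
    """ICRP-60 quality factor as a function of unrestricted LET L (keV/um)."""
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / sqrt(L)

def dose_equivalent(components):
    """Sum Q(L)-weighted absorbed doses. components is a list of
    (absorbed_dose_uGy, LET_keV_per_um) pairs; returns uSv."""
    return sum(d * q_icrp60(L) for d, L in components)

# Illustrative mix: 100 uGy at low LET plus 10 uGy at 100 keV/um.
h = dose_equivalent([(100.0, 5.0), (10.0, 100.0)])  # about 398 uSv
```

Because Q rises steeply with LET, a small high-LET component dominates the dose equivalent, which is why the absorbed-dose and dose-equivalent comparisons with RAD can disagree in opposite directions.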

  17. TRANSIENT GALACTIC COSMIC-RAY MODULATION DURING SOLAR CYCLE 24: A COMPARATIVE STUDY OF TWO PROMINENT FORBUSH DECREASE EVENTS

    International Nuclear Information System (INIS)

    Zhao, L.-L.; Zhang, H.

    2016-01-01

    Forbush decrease (FD) events are of great interest for transient galactic cosmic-ray (GCR) modulation study. In this study, we perform comparative analysis of two prominent Forbush events during cycle 24, occurring on 2012 March 8 (Event 1) and 2015 June 22 (Event 2), utilizing the measurements from the worldwide neutron monitor (NM) network. Despite their comparable magnitudes, the two Forbush events are distinctly different in terms of evolving GCR energy spectrum and energy dependence of the recovery time. The recovery time of Event 1 is strongly dependent on the median energy, compared to the nearly constant recovery time of Event 2 over the studied energy range. Additionally, while the evolutions of the energy spectra during the two FD events exhibit similar variation patterns, the spectrum of Event 2 is significantly harder, especially at the time of deepest depression. These differences are essentially related to their associated solar wind disturbances. Event 1 is associated with a complicated shock-associated interplanetary coronal mass ejection (ICME) disturbance with large radial extent, probably formed by the merging of multiple shocks and transient flows, and which delivered a glancing blow to Earth. Conversely, Event 2 is accompanied by a relatively simple halo ICME with small radial extent that hit Earth more head-on.

  19. Draft Title 40 CFR 191 compliance certification application for the Waste Isolation Pilot Plant. Volume 6: Appendix GCR Volume 1

    International Nuclear Information System (INIS)

    1995-01-01

    The Geological Characterization Report (GCR) for the WIPP site presents, in one document, a compilation of geologic information available to August, 1978, which is judged to be relevant to studies for the WIPP. The Geological Characterization Report for the WIPP site is neither a preliminary safety analysis report nor an environmental impact statement; these documents, when prepared, should be consulted for appropriate discussion of safety analysis and environmental impact. The Geological Characterization Report of the WIPP site is a unique document and at this time is not required by regulatory process. An overview is presented of the purpose of the WIPP, the purpose of the Geological Characterization Report, the site selection criteria, the events leading to studies in New Mexico, the status of studies, and the techniques employed during geological characterization.

  1. Accident sequence precursor events with age-related contributors

    Energy Technology Data Exchange (ETDEWEB)

    Murphy, G.A.; Kohn, W.E.

    1995-12-31

    The Accident Sequence Precursor (ASP) Program at ORNL analyzed about 14,000 Licensee Event Reports (LERs) filed by US nuclear power plants from 1987 through 1993. There were 193 events identified as precursors to potential severe core accident sequences. These are reported in NUREG/CR-4674, Volumes 7 through 20. Under the NRC Nuclear Plant Aging Research program, the authors evaluated these events to determine the extent to which component aging played a role. Events were selected that involved age-related equipment degradation that initiated an event or contributed to an event sequence. For the 7-year period, ORNL identified 36 events that involved aging degradation as a contributor to an ASP event. Except for 1992, the percentage of age-related events within the total number of ASP events over the 7-year period (approximately 19%) appears fairly consistent up to 1991. No correlation between plant age and number of precursor events was found. A summary list of the age-related events is presented in the report.

  3. Modelling of plate-out under gas-cooled reactor (GCR) accident conditions

    International Nuclear Information System (INIS)

    Taig, A.R.

    1981-01-01

    The importance of plate-out in mitigating consequences of gas-cooled reactor accidents, and its place in assessing these consequences, are discussed. The data requirements of a plate-out modelling program are discussed, and a brief description is given of parallel work programs on thermal/hydraulic reactor behaviour and fuel modelling, both of which will provide inputs to the plate-out program under development. The representation of a GCR system used in SRD studies is presented, and the equations governing iodine adsorption, desorption and transport round the circuit are derived. The status of SRD's plate-out program is described, and the type of sensitivity studies to be undertaken with the partially-developed computer program in order to identify the most useful lines for future research is discussed. (author)

  4. Microstructure of warm rolling and pearlitic transformation of ultrafine-grained GCr15 steel

    International Nuclear Information System (INIS)

    Sun, Jun-Jie; Lian, Fu-Liang; Liu, Hong-Ji; Jiang, Tao; Guo, Sheng-Wu; Du, Lin-Xiu; Liu, Yong-Ning

    2014-01-01

    Pearlitic transformation mechanisms have been investigated in ultrafine-grained GCr15 steel. The ultrafine-grained steel, whose grain size was less than 1 μm, was prepared by thermo-mechanical treatment at 873 K followed by annealing at 923 K for 2 h. Pearlitic transformation was conducted by reheating the ultrafine-grained samples at 1073 K and 1123 K for different periods of time and then cooling in air. Scanning electron microscope observation shows that normal lamellar pearlite cannot form (granular cementite and ferrite form instead) when the grain size is less than approximately 4 (±0.6) μm, which yields a critical grain size for normal lamellar pearlitic transformation in this chromium-alloyed steel. The result confirms that grain size has a great influence on pearlitic transformation by increasing the diffusion rate of carbon atoms in the ultrafine-grained steel, and the addition of chromium does not change this pearlitic phase transformation rule. Meanwhile, the grain growth rate is reduced by chromium alloying, which is beneficial for forming fine grains during austenitizing, thus facilitating pearlitic transformation by divorced eutectoid transformation. Moreover, chromium can form a relatively high concentration gradient at the frontier of the undissolved carbide, which promotes carbide formation there, i.e., chromium promotes divorced eutectoid transformation. - Highlights: • Ultrafine-grained GCr15 steel was obtained by warm rolling and annealing technology. • Reduction of grain size changes the pearlite morphology from lamellar to granular. • Adding Cr does not change the normal pearlitic phase transformation rule in UFG steel. • Cr carbide resists grain growth and facilitates pearlitic transformation by DET.

  5. Inner heliosphere spatial gradients of GCR protons and alpha particles in the low GeV range

    Science.gov (United States)

    Gieseler, J.; Boezio, M.; Casolino, M.; De Simone, N.; Di Felice, V.; Heber, B.; Martucci, M.; Picozza, P.

    2013-12-01

    The spacecraft Ulysses was launched in October 1990, in the maximum phase of solar cycle 22, reached its final, highly inclined (80.2°) Keplerian orbit around the Sun in February 1992, and was finally switched off in June 2009. The Kiel Electron Telescope (KET) aboard Ulysses measures electrons from 3 MeV to a few GeV and protons and helium in the energy range from 6 MeV/nucleon to above 2 GeV/nucleon. In order to investigate the radial and latitudinal gradients of galactic cosmic rays (GCR), it is essential to know their intensity variations for a stationary observer in the heliosphere, because the Ulysses measurements reflect not only the spatial but also the temporal variation of the energetic particle intensities. This was accomplished in the past with the Interplanetary Monitoring Platform-J (IMP 8) until it was lost in 2006. Fortunately, the satellite-borne experiment PAMELA (Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics) was launched in June 2006 and can be used as a reliable 1 AU baseline for measurements of the KET aboard Ulysses. With these tools at hand, we have the opportunity to determine the spatial gradients of GCR protons and alpha particles at about 0.1 to 1 GeV/n in the inner heliosphere during the extended minimum of solar cycle 23. We then compare these gradients with those obtained during the previous A>0 solar magnetic cycle.
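    With a 1 AU baseline in hand, the usual estimate of the non-local radial gradient is the logarithmic intensity ratio divided by the radial separation. A sketch of that calculation (the intensities and heliocentric distance below are illustrative, not KET/PAMELA data):

```python
from math import log

def radial_gradient(i_ulysses, i_1au, r_au):
    """Non-local radial intensity gradient in %/AU, from the ratio of a
    distant-observer count rate (e.g. Ulysses/KET at r_au) to a
    time-matched 1 AU baseline (e.g. PAMELA)."""
    return 100.0 * log(i_ulysses / i_1au) / (r_au - 1.0)

# Illustrative: 10% higher intensity at 3 AU than at 1 AU.
g_r = radial_gradient(1.1, 1.0, 3.0)  # a few %/AU
```

Dividing out the 1 AU time series is what removes the temporal modulation, leaving the purely spatial variation that the gradient is meant to capture.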

  6. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  7. RadWorks Storm Shelter Design for Solar Particle Event Shielding

    Science.gov (United States)

    Simon, Matthew A.; Cerro, Jeffrey; Clowdsley, Martha

    2013-01-01

    In order to enable long-duration human exploration beyond low-Earth orbit, the risks associated with exposure of astronaut crews to space radiation must be mitigated with practical and affordable solutions. The space radiation environment beyond the magnetosphere is primarily a combination of two types of radiation: galactic cosmic rays (GCR) and solar particle events (SPE). While mitigating GCR exposure remains an open issue, reducing astronaut exposure to SPEs is achievable through material shielding because they are made up primarily of medium-energy protons. In order to ensure astronaut safety for long durations beyond low-Earth orbit, SPE radiation exposure must be mitigated. However, the increasingly demanding spacecraft propulsive performance for these ambitious missions requires minimal mass and volume radiation shielding solutions which leverage available multi-functional habitat structures and logistics as much as possible. This paper describes the efforts of NASA's RadWorks Advanced Exploration Systems (AES) Project to design minimal mass SPE radiation shelter concepts leveraging available resources. Discussion items include a description of the shelter trade space, the prioritization process used to identify the four primary shelter concepts chosen for maturation, a summary of each concept's design features, a description of the radiation analysis process, and an assessment of the parasitic mass of each concept.

  8. Simulations of GCR interactions within planetary bodies using GEANT4

    Science.gov (United States)

    Mesick, K.; Feldman, W. C.; Stonehill, L. C.; Coupland, D. D. S.

    2017-12-01

    On planetary bodies with little to no atmosphere, Galactic Cosmic Rays (GCRs) can hit the body and produce neutrons, primarily through nuclear spallation within the top few meters of the surface. These neutrons undergo further nuclear interactions with elements near the planetary surface, and some will escape the surface and can be detected by landed or orbiting neutron radiation detector instruments. The neutron leakage signal at fast neutron energies provides a measure of the average atomic mass of the near-surface material, and in the epithermal and thermal energy ranges it is highly sensitive to the presence of hydrogen. Gamma-rays can also escape the surface, produced at characteristic energies depending on surface composition, and can be detected by gamma-ray instruments. The intra-nuclear cascade (INC) that occurs when high-energy GCRs interact with elements within a planetary surface to produce the leakage neutron and gamma-ray signals is highly complex, and therefore Monte Carlo based radiation transport simulations are commonly used for predicting and interpreting measurements from planetary neutron and gamma-ray spectroscopy instruments. In the past, the simulation code widely used for this type of analysis has been MCNPX [1], which was benchmarked against data from the Lunar Neutron Probe Experiment (LPNE) on Apollo 17 [2]. In this work, we consider the validity of the radiation transport code GEANT4 [3], another widely used but open-source code, by benchmarking simulated predictions of the LPNE experiment against the Apollo 17 data. We consider the impact of different physics model options on the results, and show which models best describe the INC based on agreement with the Apollo 17 data. The success of this validation then gives us confidence in using GEANT4 to simulate GCR-induced neutron leakage signals on Mars, in relevance to a re-analysis of Mars Odyssey Neutron Spectrometer data. References: [1] D.B. Pelowitz, Los Alamos National Laboratory, LA-CP-05

  9. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used ...

  10. The effects of ion implantation on the microhardness and friction coefficients of GCr15 and Cr4Mo4V bearing materials

    International Nuclear Information System (INIS)

    Yang Qifa; Xiang Deguang; Lu Haolin

    1988-01-01

    Experimental results on the microhardness and friction coefficients of GCr15 and Cr4Mo4V bearing materials implanted with Cr, Mo, N and B ions are reported in this paper. It is found that implantation with Cr, Mo, N and B ions increases the microhardness and reduces the friction coefficients of both materials. The friction coefficients of the Cr+Mo+N and Cr+Mo+B ion-implanted samples are reduced to 1/3 of those of the unimplanted samples.

  11. Investigations on femtosecond laser modified micro-textured surface with anti-friction property on bearing steel GCr15

    Science.gov (United States)

    Yang, Lijun; Ding, Ye; Cheng, Bai; He, Jiangtao; Wang, Genwang; Wang, Yang

    2018-03-01

    This work puts forward femtosecond laser modification of micro-textured surfaces on bearing steel GCr15 in order to reduce frictional wear and enhance load capacity in service. Multi-pulse femtosecond laser ablation experiments are conducted to determine the laser spot radius as well as the single-pulse threshold fluence and pulse incubation coefficient of the bulk material. Analytical models are set up in combination with hydrodynamic lubrication theory. Corresponding simulations, based on the Navier-Stokes (N-S) equation, explore the influence of the surface and cross-sectional morphology of the textures on the hydrodynamic lubrication effect. Technological experiments focus on the impact of femtosecond laser machining variables, such as scanning times, scanning velocity, pulse frequency and scanning gap, on groove morphology and on realizing the optimized textures proposed by the simulations; the underlying mechanisms are analyzed from multiple perspectives. Results of unidirectional rotating friction tests suggest that a spherical texture with a depth-to-width ratio of 0.2 significantly improves tribological properties under low load and low velocity compared with un-textured and other textured surfaces, which also verifies the accuracy of the simulations and the feasibility of femtosecond laser modification of micro-textured surfaces.
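
    The single-pulse threshold fluence and pulse incubation coefficient mentioned above are commonly tied together by the incubation (accumulation) model F_th(N) = F_th(1) · N^(S−1), where S is the incubation coefficient and S = 1 means no incubation. The sketch below illustrates that relation with assumed values; it is not parameterised from this paper.

```python
# Hedged sketch with illustrative values (not from the paper): the multi-pulse
# ablation threshold fluence under the incubation model
#     F_th(N) = F_th(1) * N**(S - 1)
def multi_pulse_threshold(fth1_j_cm2, n_pulses, s_incubation):
    """N-pulse threshold fluence in J/cm^2 for incubation coefficient S."""
    return fth1_j_cm2 * n_pulses ** (s_incubation - 1.0)

# Assumed parameters for a steel-like surface
fth1, s = 0.20, 0.85  # J/cm^2 and dimensionless, both illustrative
for n in (1, 10, 100):
    print(n, round(multi_pulse_threshold(fth1, n, s), 4))
```

With S below 1, damage accumulation lowers the effective threshold as the pulse count grows, which is why the incubation coefficient must be measured before choosing texturing parameters.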

  12. WGS-based surveillance of third-generation cephalosporin-resistant Escherichia coli from bloodstream infections in Denmark

    DEFF Research Database (Denmark)

    Roer, Louise; Hansen, Frank; Thomsen, Martin Christen Frølund

    2017-01-01

    To evaluate a genome-based surveillance of all Danish third-generation cephalosporin-resistant Escherichia coli (3GC-R Ec) from bloodstream infections between 2014 and 2015, focusing on horizontally transferable resistance mechanisms. A collection of 552 3GC-R Ec isolates were whole-genome sequenced and characterized by using the batch uploader from the Center for Genomic Epidemiology (CGE) and automatically analysed using the CGE tools according to resistance profile, MLST, serotype and fimH subtype. Additionally, the phylogenetic relationship of the isolates was analysed by SNP analysis ... clone, here observed for the first time in Denmark. Additionally, the analysis revealed three individual cases with possible persistence of closely related clones collected more than 13 months apart. Continuous WGS-based national surveillance of 3GC-R Ec, in combination with more detailed ...

  13. Evaluating shielding effectiveness for reducing space radiation cancer risks

    International Nuclear Information System (INIS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2006-01-01

    We discuss calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPE). The PDFs are used in significance tests for evaluating the effectiveness of potential radiation shielding approaches. Uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments are considered in models of cancer risk PDFs. Competing mortality risks and functional correlations in radiation quality factor uncertainties are included in the calculations. We show that the cancer risk uncertainty, defined as the ratio of the upper value of the 95% confidence interval (CI) to the point estimate, is about 4-fold for lunar and Mars mission risk projections. For long-stay lunar (>180 d) or Mars missions, GCR risks may exceed radiation risk limits that are based on acceptable levels of risk; for example, the upper 95% CI exceeds 10% fatal risk for males and females on a Mars mission. For reducing GCR cancer risks, shielding materials are marginally effective because of the penetrating nature of GCR and secondary radiation produced in tissue by relativistic particles. At the present time, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding, based on a significance test that accounts for radiobiology uncertainties in GCR risk projection.

  14. Practical Applications of Cosmic Ray Science: Spacecraft, Aircraft, Ground-Based Computation and Control Systems, Exploration, and Human Health and Safety

    Science.gov (United States)

    Koontz, Steve

    2015-01-01

    In this presentation a review of galactic cosmic ray (GCR) effects on microelectronic systems and human health and safety is given. The methods used to evaluate and mitigate unwanted cosmic ray effects in ground-based, atmospheric flight, and space flight environments are also reviewed. Not all GCR effects are undesirable, however: we also briefly review how observation and analysis of GCR interactions with planetary atmospheres and surfaces can reveal important compositional and geophysical data on Earth and elsewhere. About 1000 GCR particles enter every square meter of Earth’s upper atmosphere every second, roughly the same number striking every square meter of the International Space Station (ISS) and every other low-Earth orbit spacecraft. GCR particles are high-energy ionized atomic nuclei (90% protons, 9% alpha particles, 1% heavier nuclei) traveling very close to the speed of light. The GCR particle flux is even higher in interplanetary space because the geomagnetic field provides some limited magnetic shielding. Collisions of GCR particles with atomic nuclei in planetary atmospheres and/or regolith as well as spacecraft materials produce nuclear reactions and energetic, highly penetrating secondary particle showers. Three twentieth-century technology developments have driven an ongoing evolution of basic cosmic ray science into a set of practical engineering tools needed to design, test, and verify the safety and reliability of modern complex technological systems and to assess effects on human health and safety. The key technology developments are: 1) high-altitude commercial and military aircraft; 2) manned and unmanned spacecraft; and 3) increasingly complex and sensitive solid-state micro-electronics systems. Space and geophysical exploration needs drove the development of the instruments and analytical tools needed to recover compositional and structural data from GCR-induced nuclear reactions and secondary particle showers. Finally, the

  15. Methods to Load Balance a GCR Pressure Solver Using a Stencil Framework on Multi- and Many-Core Architectures

    Directory of Open Access Journals (Sweden)

    Milosz Ciznicki

    2015-01-01

    The recent advent of novel multi- and many-core architectures forces application programmers to deal with hardware-specific implementation details and to be familiar with software optimisation techniques in order to benefit from new high-performance computing machines. Extra care must be taken with communication-intensive algorithms, which may become a bottleneck in the forthcoming era of exascale computing. This paper presents a high-level stencil framework implemented for the EULerian or LAGrangian model (EULAG) that efficiently utilises multi- and many-core architectures. Only the efficient use of both multi-core processors (CPUs) and graphics processing units (GPUs), together with a flexible data decomposition method, can deliver the maximum performance needed to scale the communication-intensive Generalized Conjugate Residual (GCR) elliptic solver with preconditioner.
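
    For reference, the Generalized Conjugate Residual iteration the solver is named after can be sketched in a few lines. This is a minimal, unpreconditioned NumPy illustration of the textbook GCR algorithm with full orthogonalisation, not EULAG's production implementation.

```python
import numpy as np

# Sketch of the Generalized Conjugate Residual (GCR) iteration for a
# nonsymmetric linear system A x = b. Each step minimises the residual norm
# over the current search space by keeping the vectors A p_i orthonormal.
def gcr(A, b, x0=None, tol=1e-10, maxiter=200):
    x = np.zeros(len(b)) if x0 is None else x0.copy()
    r = b - A @ x
    ps, aps = [], []                      # search directions p_i and A p_i
    for _ in range(maxiter):
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        p, ap = r.copy(), A @ r
        for pj, apj in zip(ps, aps):      # orthogonalise A p against earlier A p_j
            beta = ap @ apj
            p -= beta * pj
            ap -= beta * apj
        nrm = np.linalg.norm(ap)
        p, ap = p / nrm, ap / nrm
        alpha = r @ ap                    # residual-minimising step length
        x += alpha * p
        r -= alpha * ap
        ps.append(p)
        aps.append(ap)
    return x

# Small, well-conditioned nonsymmetric test problem
rng = np.random.default_rng(1)
A = np.eye(50) * 4 + rng.normal(0.0, 0.3, (50, 50))
b = rng.normal(size=50)
x = gcr(A, b)
print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

Storing all previous directions makes memory grow per iteration, which is one reason production solvers restart or truncate the recurrence and, as in EULAG, add a preconditioner.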

  16. US/FRG umbrella agreement for cooperation in GCR Development. Fuel, fission products, and graphite subprogram. Quarterly status report, July 1, 1982-September 30, 1982

    International Nuclear Information System (INIS)

    Turner, R.F.

    1982-10-01

    This report describes the status of the cooperative work being performed in the Fuel, Fission Product, and Graphite Subprogram under the HTR-Implementing Agreement of the United States/Federal Republic of Germany Umbrella Agreement for Cooperation in GCR Development. The status is described relative to the commitments in the Subprogram Plan for Fuel, Fission Products, and Graphite, Revision 5, April 1982. The work described was performed during the period July 1, 1982 through September 30, 1982 in the HTGR Base Technology Program at Oak Ridge National Laboratory, the HTGR Fuel and Plant Technology Programs at General Atomic Company (GA), and the Project HTR-Brennstoffkreislauf of the Entwicklungsgemeinschaft HTR at KFA Julich, HRB Mannheim, HOBEG Hanau, and SIGRI Meitingen. The requirement for and format of this quarterly status report are specified in the HTR Implementing Agreement procedures for cooperation. Responsibility for preparation of the quarterly report alternates between GA and KFA

  17. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
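
    The DEVS interface described above (a time-advance function, an output function, and internal and external transition functions) can be sketched as a toy atomic model with a minimal event-driven simulation loop. Class and function names are illustrative, not taken from Zeigler's formalism tools.

```python
# Hedged sketch of a DEVS-style atomic model: behaviour is defined by
# time_advance (ta), output (lambda), delta_int and delta_ext.
class PingModel:
    """Emits 'ping' every 5 time units until an external 'stop' arrives."""
    def __init__(self):
        self.phase = "active"

    def time_advance(self):               # ta(s): time until next internal event
        return 5.0 if self.phase == "active" else float("inf")

    def output(self):                     # lambda(s): emitted at internal events
        return "ping"

    def delta_int(self):                  # internal transition: stay active
        self.phase = "active"

    def delta_ext(self, elapsed, msg):    # external transition on input events
        if msg == "stop":
            self.phase = "passive"

def run(model, horizon, ext_events):
    """Toy event-driven loop; ext_events is a list of (time, message)."""
    t, out = 0.0, []
    events = sorted(ext_events)
    while True:
        t_int = t + model.time_advance()
        if events and events[0][0] < t_int:
            t_ext, msg = events.pop(0)    # external event arrives first
            model.delta_ext(t_ext - t, msg)
            t = t_ext
        elif t_int <= horizon:
            out.append((t_int, model.output()))
            model.delta_int()
            t = t_int
        else:
            return out

print(run(PingModel(), 20, [(12.0, "stop")]))
```

The controller-style behaviour is visible in the trace: pings are emitted at t = 5 and t = 10, and the external "stop" at t = 12 silences the model before the next scheduled internal event.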

  18. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    Science.gov (United States)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

    The yield of chromosomal aberrations has been shown to increase in the lymphocytes of astronauts after long-duration missions of several months in space. Chromosome exchanges, especially translocations, are positively correlated with many cancers and are therefore a potential biomarker of cancer risk associated with radiation exposure. Although extensive studies have been carried out on the induction of chromosomal aberrations by low- and high-LET radiation in human lymphocytes, fibroblasts, and epithelial cells exposed in vitro, there is a lack of data on chromosome aberrations induced by low dose-rate chronic exposure and by mixed field beams such as those expected in space. Chromosome aberration studies at NSRL will provide the biological validation needed to extend the computational models over a broader range of experimental conditions (more complicated mixed fields leading up to the galactic cosmic ray (GCR) simulator), helping to reduce uncertainties in radiation quality effects and dose-rate dependence in cancer risk models. These models can then be used to answer some of the open questions regarding requirements for a full GCR reference field, including particle type and number, energy, dose rate, and delivery order. In this study, we designed a simplified mixed field beam with a combination of proton, helium, oxygen, and iron ions with shielding, or proton, helium, oxygen, and titanium ions without shielding. Human fibroblast cells were irradiated with these mixed field beams as well as with each single beam at acute and chronic dose rates, and chromosome aberrations (CA) were measured with 3-color fluorescence in situ hybridization (FISH) chromosome painting methods. The frequency and types of CA induced at acute and chronic dose rates with single and mixed field beams will be discussed. A computational chromosome and radiation-induced DNA damage model, BDSTRACKS (Biological Damage by Stochastic Tracks), was updated to simulate various types of CA induced by

  19. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host-based network monitoring tools work by polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within WFP. Any future work should focus on these methods.

  20. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor’s ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor’s asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, the low weight, low power and high speed make them ideally suited to meeting the demanding
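
    The per-pixel mechanism described above (an event fires when the change in log light intensity since the last event exceeds a contrast threshold) can be illustrated by converting a short frame sequence into an event stream. This is a hedged simulation sketch, not any sensor vendor's API.

```python
import numpy as np

# Toy model of an event-based pixel array: each pixel fires an event when its
# log intensity has changed by more than `threshold` since its last event,
# independently of all other pixels.
def frames_to_events(frames, times, threshold=0.2):
    """Convert a frame sequence to (t, x, y, polarity) events."""
    ref = np.log(frames[0] + 1e-6)        # per-pixel log-intensity reference
    events = []
    for t, frame in zip(times[1:], frames[1:]):
        logi = np.log(frame + 1e-6)
        diff = logi - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1
            events.append((t, x, y, polarity))
            ref[y, x] = logi[y, x]        # reset reference after each event
    return events

# A 2x2 scene where one pixel brightens: only that pixel emits an event
f0 = np.full((2, 2), 1.0)
f1 = f0.copy()
f1[0, 1] = 2.0                            # log(2) ~ 0.69 exceeds the threshold
evts = frames_to_events([f0, f1], [0.0, 0.01])
print(evts)
```

Because unchanged pixels stay silent, the output is sparse and spatiotemporal, which is the property the paper exploits for low-bandwidth space imaging.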

  1. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  2. Comparison of CREME (cosmic-ray effects on microelectronics) model LET (linear energy transfer) spaceflight dosimetry data

    Energy Technology Data Exchange (ETDEWEB)

    Letaw, J.R.; Adams, J.H.

    1986-07-15

    The galactic cosmic radiation (GCR) component of space radiation is the dominant cause of single-event phenomena in microelectronic circuits when Earth's magnetic shielding is low. Spaceflights outside the magnetosphere and in high-inclination orbits are examples of such circumstances. In high-inclination orbits, low-energy (high-LET) particles are transmitted through the field only at extreme latitudes, but can dominate the orbit-averaged dose. GCR is an important part of the radiation dose to astronauts under the same conditions. As a test of the CREME environmental model and the particle transport codes used to estimate single-event upsets, we have compiled existing measurements of HZE doses from missions where GCR is expected to be important: Apollo 16 and 17, Skylab, the Apollo-Soyuz Test Project, and Kosmos 782. The LET spectra due to direct ionization from GCR have been estimated for each of these missions. The resulting comparisons with data validate the CREME model predictions of high-LET galactic cosmic-ray fluxes to within a factor of two. Some systematic differences between the model and the data are identified.

  3. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically, a four-cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine ... problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  4. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT/Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real time, reaction rules are concerned with the invocation of actions in response to events and actionable situations: they state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which have for the most part been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
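
    The core reaction-rule idea (actions invoked when an event matches a rule and its condition holds) can be sketched as a minimal event-condition-action dispatcher. Names here are illustrative rather than drawn from any of the surveyed systems.

```python
# Hedged sketch of an event-condition-action (ECA) rule engine: each rule
# states the condition under which its action fires in response to an event.
class RuleEngine:
    def __init__(self):
        self.rules = []                   # list of (event_type, condition, action)

    def on(self, event_type, condition, action):
        """Register a rule: ON event IF condition DO action."""
        self.rules.append((event_type, condition, action))

    def dispatch(self, event):
        """Deliver one event; return the results of every action that fired."""
        fired = []
        for etype, cond, act in self.rules:
            if event["type"] == etype and cond(event):
                fired.append(act(event))
        return fired

engine = RuleEngine()
engine.on("temperature",
          condition=lambda e: e["value"] > 100,
          action=lambda e: f"alert: overheating ({e['value']} C)")

print(engine.dispatch({"type": "temperature", "value": 120}))
print(engine.dispatch({"type": "temperature", "value": 80}))
```

The first dispatch fires the rule and returns the alert; the second matches the event type but fails the condition, so no action is invoked.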

  5. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  6. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We have indexed about 15 billion real data events and about 25 billion simulated events thus far, and have designed the system to accommodate future data, with expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  7. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes XML in a streaming fashion with minimal memory consumption. This paper discusses challenges in creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
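
    The event-based processing model discussed above can be illustrated with Python's standard xml.sax module: the parser pushes start-element, character, and end-element events to a handler instead of materialising a tree, so memory use stays flat regardless of document size.

```python
import io
import xml.sax

# A minimal SAX-style handler: collect the text of every <title> element by
# reacting to parser events rather than navigating a DOM tree.
class TitleCollector(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.titles = []

    def startElement(self, name, attrs):
        if name == "title":
            self.in_title = True
            self.titles.append("")

    def characters(self, content):
        if self.in_title:                 # characters() may fire in chunks
            self.titles[-1] += content

    def endElement(self, name):
        if name == "title":
            self.in_title = False

xml_doc = "<library><book><title>DEVS</title></book><book><title>SAX</title></book></library>"
handler = TitleCollector()
xml.sax.parse(io.StringIO(xml_doc), handler)
print(handler.titles)
```

The static-analysis problem the paper addresses is the dual of this example: guaranteeing that the events a SAX program *emits* (rather than consumes) always form well-formed, valid XML.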

  8. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (event-based PM) or at a specific time (time-based PM) while performing an ongoing activity. Strategic monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged to verify the presence of the PM cue in the environment. The present study aims to provide the first evidence of event-related potentials (ERPs) associated with time-based PM, and to examine differences and commonalities between event- and time-based PM tasks in the ERPs related to strategic monitoring mechanisms. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, which might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the 'readiness mode'.

  9. Modeling Space Radiation with Bleomycin

    Data.gov (United States)

    National Aeronautics and Space Administration — Space radiation is a mixed field of solar particle event (proton) radiation and Galactic Cosmic Ray (GCR) particles with different energy levels. These radiation events...

  10. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
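
    The recursive, probabilistic-branching solution strategy described above can be illustrated with a toy scenario enumerator: recursion walks every branch and carries the product of branch probabilities, so extremely low-probability scenarios emerge analytically rather than from Monte Carlo sampling. Names and numbers are illustrative, not from OBEST.

```python
# Hedged sketch of recursive scenario enumeration with probabilistic branching:
# every path through the branch points becomes a scenario whose likelihood is
# the product of its branch probabilities.
def enumerate_scenarios(stages, prefix=(), prob=1.0):
    """stages: list of dicts mapping outcome -> branch probability."""
    if not stages:
        return [(prefix, prob)]
    scenarios = []
    for outcome, p in stages[0].items():
        scenarios += enumerate_scenarios(stages[1:], prefix + (outcome,), prob * p)
    return scenarios

# Two sequential events, each with probabilistic branching (illustrative values)
stages = [
    {"pump_ok": 0.99, "pump_fails": 0.01},
    {"valve_ok": 0.999, "valve_fails": 0.001},
]
for path, p in enumerate_scenarios(stages):
    print(path, p)
```

Note that the rare double-failure scenario appears with its exact analytical probability of 1e-05; a Monte Carlo simulation of modest sample size would likely never observe it.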

  11. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated for this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders, based on Myrinet technology, which will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders, which will assemble full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization covers the architecture of the Readout Builder, the TCP/IP configuration, and hardware selection.
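    The core of any event builder is merging fragments from many sources into complete events keyed by an event number. The sketch below is a toy model of that idea only, not the CMS implementation (no networking, and 4 sources instead of 72).

```python
from collections import defaultdict

N_SOURCES = 4  # CMS Readout Builders merge 72 sources; 4 keeps the demo small

def build_events(fragments):
    """fragments: iterable of (event_id, source_id, payload).
    Yields (event_id, {source_id: payload}) once every source contributed."""
    pending = defaultdict(dict)
    for event_id, source_id, payload in fragments:
        pending[event_id][source_id] = payload
        if len(pending[event_id]) == N_SOURCES:
            yield event_id, pending.pop(event_id)

# Fragments of two events arriving interleaved, as on a real network.
stream = [(1, 0, b"a"), (2, 0, b"e"), (1, 1, b"b"), (1, 2, b"c"),
          (2, 1, b"f"), (1, 3, b"d"), (2, 2, b"g"), (2, 3, b"h")]
events = list(build_events(stream))
print(events[0][0])  # event 1 completes first
```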

  12. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information about nuclear events occurring at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with a 'Scale' rating, a consistent safety significance indicator, which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of INES began in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety significant events of level 2 or higher, based on the INES reports. (author)

  13. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, to assess which best serve the various use cases, and to consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  14. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, to assess which best serve the various use cases, and to consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.
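    One of the use cases named above, identifying event duplication and the processing step at which it occurred, amounts to grouping indexed records by their event key. The sketch below is a hypothetical illustration; the record schema (run number, event number, processing step) is an assumption, not the real EventIndex layout.

```python
# Flag events indexed more than once, keeping the processing steps involved.

def find_duplicates(records):
    """records: iterable of (run, event, step).
    Returns {(run, event): [steps...]} for keys seen more than once."""
    seen = {}
    for run, event, step in records:
        seen.setdefault((run, event), []).append(step)
    return {key: steps for key, steps in seen.items() if len(steps) > 1}

index = [
    (358031, 101, "RAW"),
    (358031, 102, "RAW"),
    (358031, 101, "derivation"),   # same event indexed again downstream
]
dupes = find_duplicates(index)
print(dupes)  # {(358031, 101): ['RAW', 'derivation']}
```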

  15. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong influence on the market. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach to this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)

  16. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong influence on the market. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach to this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil price variation. (author)
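    The essential property used above is that the fine-to-coarse modes plus the residual trend sum back to the original series, so an event's impact can be read off individual modes. The sketch below is a heavily simplified stand-in: true EMD sifts with cubic-spline envelopes of local extrema, whereas here plain moving averages produce the fine-to-coarse split, illustrating the reconstruction property only.

```python
# Split a series into fine-to-coarse "modes" plus a residual trend such that
# modes + trend reconstruct the original exactly (up to rounding).

def smooth(x, w):
    half = w // 2
    return [sum(x[max(0, i - half):i + half + 1]) /
            len(x[max(0, i - half):i + half + 1]) for i in range(len(x))]

def decompose(x, windows=(3, 7, 15)):
    modes, residual = [], list(x)
    for w in windows:
        trend = smooth(residual, w)
        modes.append([a - b for a, b in zip(residual, trend)])  # detail at this scale
        residual = trend
    return modes, residual  # residual plays the role of the average trend

series = [float(i % 5) + i * 0.1 for i in range(60)]
modes, trend = decompose(series)
recon = [m0 + m1 + m2 + t
         for m0, m1, m2, t in zip(modes[0], modes[1], modes[2], trend)]
```

    By construction mode k is the difference between successive smoothings, so the telescoping sum collapses back to the input, the same identity the EMD-based analysis relies on when attributing an event's impact to dominant modes.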

  17. Observation of galactic cosmic ray spallation events from the SoHO mission 20-Year operation of LASCO

    Science.gov (United States)

    Koutchmy, S.; Tavabi, E.; Urtado, O.

    2018-05-01

    A shower of secondary Cosmic Ray (CR) particles is produced at high altitudes in the Earth's atmosphere, so primordial Galactic Cosmic Rays (GCRs) are never directly measured outside the Earth's magnetosphere and atmosphere. They approach the Earth and other planets in a complex rigidity-dependent pattern and are generally excluded by the magnetosphere. GCRs revealed by images of single nuclear reactions, also called spallation events, are described here. Such an event was seen on Nov. 29, 2015 in a routine image from the unique LASCO C3 space coronagraph taken during the Solar and Heliospheric Observatory (SoHO) mission, which observes uninterruptedly at the Lagrangian L1 point. The spallation signature of a GCR identified well outside the Earth's magnetosphere was obtained for the first time. The resulting image includes different diverging linear "tracks" of varying intensity leading to a single pixel; this frame identifies the interaction site on the silicon CCD chip of the coronagraph camera. There was no solar flare reported at that time, no Coronal Mass Ejection (CME), and no evidence of optical debris around the spacecraft. More examples of smaller CR events have been discovered throughout the 20 years of continuous observations from SoHO. We evaluate the probable energy of these events, suggesting a plausible galactic source.

  18. CAMAC data acquisition system based on micro VAXII

    International Nuclear Information System (INIS)

    Yin Xijin; Shen Cuihua; Bai Xiaowei; Li Weisheng

    1993-01-01

    The CAMAC data acquisition system based on a MicroVAX II computer provides high-speed, zero-suppressed, 256-parameter CAMAC acquisition. It consists of three parts: a control logic unit, the CAMAC readout system, and the host computer system. When the control logic unit is triggered by an external electronic selection signal, it produces a pilot signal that keeps all of the parameters of a particular event together. Event-mode data are collected using a CAMAC fast crate controller. The host computer system is equipped with the following peripheral devices: 1. at least two M990 GCR (6250 bpi) magnetic tape drives operating at 75 inches per second or faster; 2. a Tektronix 4014 storage scope; 3. an LND3-AE laser printer or a copier capable of making hard copies of the Tektronix 4014 screen; 4. a control console device and a line printer; 5. an X-press color graphics terminal; 6. a DEC network. During real-time acquisition, the system can handle and analyse the data stream on-line, monitor and control the experiment, and dynamically display spectra on the Tektronix 4014

  19. RADIATION PROTECTION FOR HUMAN SPACEFLIGHT

    OpenAIRE

    Hellweg, C.E.; Baumstark-Khan, C.; Berger, T.

    2017-01-01

    Space is a special workplace, not only because of microgravity and the dependency on life support systems, but also owing to constant, considerable exposure to a natural radiation source: cosmic radiation. Galactic cosmic rays (GCR) and solar cosmic radiation (SCR) are the primary sources of the radiation field in space. Whereas the GCR component comprises all particles from protons to heavy ions with energies up to 10¹¹ GeV, the SCR component ejected in Solar Energetic Particle events (S...

  20. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified
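    The idea above, wave-theory predictions emerging from detection events generated one by one, can be illustrated with a deliberately crude stand-in. The sketch below is not the authors' model (their simulation uses deterministic event-by-event learning machines, not random draws); it only shows how individual detection events can reproduce a wave result, here Malus's law for light through a polarizer at angle theta.

```python
import math
import random

# Generate detection events one-by-one; their relative frequency should
# approach the Maxwell-theory prediction cos^2(theta).

def simulate(theta, n, seed=1):
    rng = random.Random(seed)
    p = math.cos(theta) ** 2        # per-photon detection probability
    return sum(rng.random() < p for _ in range(n)) / n

theta = math.radians(30)
freq = simulate(theta, 100_000)
print(freq)  # close to cos^2(30 deg) = 0.75
```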

  1. Spatial gradients of GCR protons in the inner heliosphere derived from Ulysses COSPIN/KET and PAMELA measurements

    Science.gov (United States)

    Gieseler, J.; Heber, B.

    2016-05-01

    Context. During the transition from solar cycle 23 to 24, from 2006 to 2009, the Sun was in an unusual solar minimum with very low activity over a long period. These exceptional conditions included a very low interplanetary magnetic field (IMF) strength and a high tilt angle, both of which play an important role in the modulation of galactic cosmic rays (GCR) in the heliosphere. Thus, the radial and latitudinal gradients of GCRs are expected to depend not only on the solar magnetic epoch, but also on the overall modulation level. Aims: We determine the non-local radial and latitudinal gradients of protons in the rigidity range from ~0.45 to 2 GV. Methods: This was accomplished by using data from the satellite-borne experiment Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) at Earth and from the Kiel Electron Telescope (KET) onboard Ulysses on its highly inclined Keplerian orbit around the Sun, with aphelion at Jupiter's orbit. Results: In comparison to the previous A > 0 solar magnetic epoch, we find that the absolute value of the latitudinal gradient is lower at higher rigidities and higher at lower rigidities. This energy dependence is therefore a crucial test for models that describe cosmic ray transport in the inner heliosphere.
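    As a back-of-the-envelope illustration of how a non-local radial gradient is extracted from two simultaneous flux measurements: assuming an exponential radial dependence J(r) ∝ exp(G_r · r) and ignoring the latitude term, the gradient follows from the flux ratio. The numbers below are invented for illustration, not PAMELA/KET data.

```python
import math

def radial_gradient(j_ulysses, j_earth, dr_au):
    """Radial gradient in percent per AU, assuming J(r) ~ exp(G_r * r)."""
    return 100.0 * math.log(j_ulysses / j_earth) / dr_au

# Invented fluxes: 35% more flux at Ulysses, 4 AU farther out.
g = radial_gradient(j_ulysses=1.35, j_earth=1.0, dr_au=4.0)
print(round(g, 2))  # 7.5 (%/AU) for these made-up numbers
```

    In the published analyses the latitudinal term is solved for simultaneously, using Ulysses' excursions out of the ecliptic; this sketch keeps only the radial part.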

  2. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  3. Space Weather Nowcasting of Atmospheric Ionizing Radiation for Aviation Safety

    Science.gov (United States)

    Mertens, Christopher J.; Wilson, John W.; Blattnig, Steve R.; Solomon, Stan C.; Wiltberger, J.; Kunches, Joseph; Kress, Brian T.; Murray, John J.

    2007-01-01

    There is a growing concern for the health and safety of commercial aircrew and passengers due to their exposure to ionizing radiation with high linear energy transfer (LET), particularly at high latitudes. The International Commission on Radiological Protection (ICRP), the EPA, and the FAA consider the crews of commercial aircraft to be radiation workers. During solar energetic particle (SEP) events, radiation exposure can exceed annual limits, and the number of serious health effects is expected to be quite high if precautions are not taken. There is a need for a capability to monitor the real-time, global background radiation levels from galactic cosmic rays (GCR) at commercial airline altitudes and to provide analytical input for airline operations decisions on altering flight paths and altitudes to mitigate and reduce radiation exposure levels during a SEP event. The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model is a new initiative to provide a global, real-time radiation dosimetry package for archiving and assessing biologically harmful radiation exposure levels at commercial airline altitudes. The NAIRAS model brings to bear the best available suite of Sun-Earth observations and models for simulating the atmospheric ionizing radiation environment. Observations are utilized from the ground (neutron monitors), from the atmosphere (the METO analysis), and from space (NASA/ACE and NOAA/GOES). Atmospheric observations provide the overhead shielding information, and the ground- and space-based observations provide boundary conditions on the GCR and SEP energy flux distributions for transport and dosimetry simulations. Dose rates are calculated using the parametric AIR (Atmospheric Ionizing Radiation) model and the physics-based HZETRN (High Charge and Energy Transport) code. Empirical models of the near-Earth radiation environment (GCR/SEP energy flux distributions and geomagnetic cut-off rigidity) are benchmarked
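    The geomagnetic cut-off rigidity mentioned above determines which GCR/SEP particles can reach a given latitude, and is why exposure is highest on high-latitude routes. A common first approximation is the Størmer vertical cut-off for a dipole field, R_c ≈ 14.9 · cos⁴(λ) GV with λ the geomagnetic latitude; NAIRAS itself uses more detailed cut-off models, so the sketch below is only the textbook approximation.

```python
import math

def vertical_cutoff_gv(geomag_lat_deg):
    """Stormer vertical cut-off rigidity (GV) for a dipole field."""
    return 14.9 * math.cos(math.radians(geomag_lat_deg)) ** 4

print(round(vertical_cutoff_gv(0), 1))   # 14.9 GV at the geomagnetic equator
print(round(vertical_cutoff_gv(60), 2))  # 0.93 GV at high latitude
```

    The steep cos⁴ dependence shows why polar routes see far more of the low-rigidity particle population than equatorial ones.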

  4. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry by analysing existing practices and by identifying and removing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes are based on a survey of the event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, owing to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include a focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrading the safety culture of an organization's personnel, and especially of its senior management, appears to be an effective route to improvement. Increasing the competencies, capabilities and independence of event investigation teams, elaborating comprehensive software, and ensuring a positive approach, adequate support and impartiality from management could also help to improve the quality of event investigations. (orig.)

  5. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims to improve the quality of event investigations in the nuclear industry by analysing existing practices and by identifying and removing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes are based on a survey of the event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, owing to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include a focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrading the safety culture of an organization's personnel, and especially of its senior management, appears to be an effective route to improvement. Increasing the competencies, capabilities and independence of event investigation teams, elaborating comprehensive software, and ensuring a positive approach, adequate support and impartiality from management could also help to improve the quality of event investigations. (orig.)

  6. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using rule-based methods is difficult, and methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to the event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word, and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back-propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  7. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  8. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  9. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  10. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.
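    The qualitative abstraction described above maps each fault to a signature of measurement deviations from nominal, and a local diagnoser keeps only the faults consistent with the deviation events observed so far. The sketch below is a toy illustration of that matching step; the measurements and signatures are invented, not taken from the paper's multi-tank system.

```python
# Qualitative fault signatures: '+' above nominal, '-' below, '0' no deviation.
SIGNATURES = {
    "blocked_pipe": {"flow": "-", "pressure": "+"},
    "tank_leak":    {"flow": "-", "pressure": "-"},
    "sensor_bias":  {"flow": "0", "pressure": "+"},
}

def diagnose(observed):
    """observed: dict of measurement -> deviation sign seen so far.
    Returns faults consistent with every observed deviation event."""
    return sorted(
        fault for fault, sig in SIGNATURES.items()
        if all(sig[m] == s for m, s in observed.items())
    )

print(diagnose({"flow": "-"}))                   # ['blocked_pipe', 'tank_leak']
print(diagnose({"flow": "-", "pressure": "+"}))  # ['blocked_pipe']
```

    Each new deviation event monotonically shrinks the candidate set, which is what lets local diagnosers reach a globally correct answer while exchanging only a few qualitative observations.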

  11. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media relates to specific events. Users who post different contents and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby their contents, we address the task of user classification with a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate Weibo's properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organization/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  12. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract Adverse Event terms and encode them as Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision of 81 percent and a recall of 92 percent. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.
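    For reference, the 81 percent/92 percent figures above are standard precision and recall. The counts below are invented to reproduce those rates, not the study's actual confusion matrix.

```python
# Precision = TP / (TP + FP); recall = TP / (TP + FN).

def precision_recall(tp, fp, fn):
    return tp / (tp + fp), tp / (tp + fn)

# e.g. 92 correctly extracted terms, 21 spurious extractions, 8 missed terms:
p, r = precision_recall(tp=92, fp=21, fn=8)
print(round(p, 2), round(r, 2))  # 0.81 0.92
```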

  13. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., the correspondence between raters' scores and an established criterion). The aims were to provide a review of the reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement; in addition, to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience of stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values exceeded the reference thresholds for high reproducibility for both methodologies. Accuracy (the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups than for non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
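    The agreement percentages discussed above can, for interval-based data, be computed as the share of intervals on which two judges gave the same stuttered/fluent verdict. The ratings below are invented toy data.

```python
# Percent agreement between two judges over the same set of intervals.

def percent_agreement(judge_a, judge_b):
    matches = sum(a == b for a, b in zip(judge_a, judge_b))
    return 100.0 * matches / len(judge_a)

a = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]  # 1 = interval judged stuttered
b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(percent_agreement(a, b))  # 80.0
```

    Raw percent agreement does not correct for chance; chance-corrected statistics such as Cohen's kappa are often reported alongside it.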

  14. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
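    The historical VaR that the abstract builds on is simply an empirical quantile of past returns: the 95% one-day VaR is (minus) the 5th percentile of the sampled return history. A minimal sketch with invented returns:

```python
# Historical Value-at-Risk: the loss threshold exceeded on only
# (1 - confidence) of historical days.

def historical_var(returns, confidence=0.95):
    ordered = sorted(returns)
    index = int((1.0 - confidence) * len(ordered))
    return -ordered[index]

returns = [-0.042, -0.025, -0.012, -0.006, 0.001, 0.004, 0.008,
           0.011, 0.015, 0.021, 0.025, 0.026, 0.030, 0.033, 0.035,
           0.002, -0.001, 0.009, -0.018, 0.017]
print(historical_var(returns))  # 0.025, i.e. a 2.5% one-day loss at 95%
```

    The paper's hypothesis concerns exactly this sample: a few news-driven outlier returns sit in the tail being quantiled, so they can dominate the estimate.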

  15. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
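The feature-extraction step described, subtracting the fundamental component from the acquired voltage signal, can be sketched with a least-squares fit; the 60 Hz fundamental and sampling rate below are illustrative assumptions, and the residual would then be fed to the SVM classifier:

```python
import numpy as np

def remove_fundamental(v, fs=15360.0, f0=60.0):
    """Least-squares fit of the fundamental (f0) sine/cosine pair and
    subtraction from the acquired voltage signal; the residual carries
    the disturbance signature used as classifier input."""
    t = np.arange(len(v)) / fs
    X = np.column_stack([np.sin(2 * np.pi * f0 * t),
                         np.cos(2 * np.pi * f0 * t)])
    coeffs, *_ = np.linalg.lstsq(X, v, rcond=None)
    return v - X @ coeffs

fs, f0 = 15360.0, 60.0                      # assumed rates, for illustration
t = np.arange(1024) / fs
clean = np.sin(2 * np.pi * f0 * t)          # pure fundamental
sag = clean.copy()
sag[400:600] *= 0.5                         # synthetic voltage sag
residual = remove_fundamental(sag, fs, f0)  # disturbance signature
```

For an undistorted signal the residual is essentially zero, while a sag leaves a clear signature in the affected samples.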

  16. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, making the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event-based network communication and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA

  17. Wind Farm Grid Integration Using VSC Based HVDC Transmission - An Overview

    DEFF Research Database (Denmark)

    Chaudhary, Sanjay Kumar; Teodorescu, Remus; Rodriguez, Pedro

    2008-01-01

    The paper gives an overview of HVAC and HVDC connection of wind farms to the grid, with an emphasis on Voltage Source Converter (VSC)-based HVDC for large wind farms requiring long-distance cable connection. The flexible control capabilities of a VSC-based HVDC system enable smooth integration of the wind farm into the power grid while meeting the Grid Code Requirements (GCR). Operation of a wind farm with VSC-based HVDC connection is described.

  18. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Weibo media, known as a real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content and exhibit different behavior or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under these circumstances, in order to effectively organize and manage the huge number of users, and thereby further manage their content, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  19. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  20. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors, and no methods or theories exist for the thorium-based molten salt reactor (TMSR). With TMSR research and development in China, initiating event analysis and evaluation is increasingly important. The research can be developed from PWR analysis theories and methods. Based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis. The analysis of TMSR initiating events is preliminarily studied and described. The research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  1. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia, using propofol administration and the bispectral index as the controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
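The event generator plus controller idea can be illustrated with a simple send-on-delta scheme; this is a hedged sketch of the general technique (a PI loop on a first-order plant), not the authors' PIDPlus implementation, and all gains and thresholds are illustrative:

```python
def send_on_delta_pi(kp=2.0, ki=1.0, dt=0.01, T=10.0, delta=0.05, setpoint=1.0):
    """First-order plant y' = -y + u under PI control. A send-on-delta
    event generator forwards the measurement to the controller only
    when it differs from the last forwarded value by more than delta,
    reducing controller updates while keeping the loop closed."""
    steps = int(T / dt)
    y, y_s, integ, events = 0.0, 0.0, 0.0, 0
    for _ in range(steps):
        if abs(y - y_s) > delta:   # event: forward a fresh measurement
            y_s = y
            events += 1
        e = setpoint - y_s
        integ += e * dt
        u = kp * e + ki * integ    # PI law computed on the held sample
        y += dt * (-y + u)         # forward-Euler step of the plant
    return y, events, steps

y_final, n_events, n_steps = send_on_delta_pi()
```

The output settles near the setpoint while the controller receives far fewer measurement updates than there are simulation steps.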

  2. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  3. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  4. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  5. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    A new paradigm in the field of access control systems with fuzzy authorization is proposed. Consider a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of the presentation of events (news) that occurred earlier in the network. We propose a mathematical method that compactly stores the history of events, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It has been established that the largest absolute components are restored earliest. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with a frequently changing set of participants, mesh networks, and multi-agent systems.

  6. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  7. ACTINIDE AND ULTRA-HEAVY ABUNDANCES IN THE LOCAL GALACTIC COSMIC RAYS: AN ANALYSIS OF THE RESULTS FROM THE LDEF ULTRA-HEAVY COSMIC-RAY EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Donnelly, J. [Dublin Institute of Technology (DIT), School of Physics, Kevin Street, Dublin 8 (Ireland); Thompson, A.; O'Sullivan, D.; Daly, J.; Drury, L. [School of Cosmic Physics, Dublin Institute for Advanced Studies, 31 Fitzwilliam Place, Dublin 2 (Ireland); Domingo, V.; Wenzel, K.-P. [European Space Research and Technology Centre (ESTEC), Keplerlaan 1, Postbus 299, 2200 AG Noordwijk (Netherlands)

    2012-03-01

    The LDEF Ultra-Heavy Cosmic-Ray Experiment (UHCRE) detected Galactic cosmic rays (GCRs) of charge Z ≥ 70 in Earth orbit with an exposure factor of 170 m² sr yr, much larger than any other experiment. The major results include the first statistically significant uniform sample of GCR actinides with 35 events passing quality cuts, evidence for the existence of transuranic nuclei in the GCR with one ₉₆Cm candidate event, and a low ₈₂Pb/₇₈Pt ratio consistent with other experiments. The probability of the existence of a transuranic component is estimated as 96%, while the most likely ₉₂U/₉₀Th ratio is found to be 0.4 within a wide 70% confidence interval ranging from 0 to 0.96. Overall, the results are consistent with a volatility-based acceleration bias and source material which is mainly ordinary interstellar medium material with some recent contamination by freshly synthesized material. Uncertainty in the key ₉₂U/₉₀Th ratio is dominated by statistical errors resulting from the small sample size and any improved determination will thus require an experiment with a substantially larger exposure factor than the UHCRE.

  8. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy ratio and polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
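Three of the listed input features (average absolute amplitude, variance, energy ratio) are straightforward to compute in a moving window; a sketch on a synthetic trace, with rectilinearity omitted since it requires three-component data and the window length chosen arbitrarily for illustration:

```python
import numpy as np

def window_features(x, win=50):
    """Moving-window features of the kind used as detector inputs:
    mean absolute amplitude, variance, and the ratio of window energy
    to total trace energy."""
    n = len(x) - win + 1
    total_energy = np.sum(x**2) + 1e-12
    feats = np.empty((n, 3))
    for i in range(n):
        w = x[i:i + win]
        feats[i, 0] = np.mean(np.abs(w))            # average absolute amplitude
        feats[i, 1] = np.var(w)                     # variance
        feats[i, 2] = np.sum(w**2) / total_energy   # energy ratio
    return feats

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 1000)
trace[500:550] += 1.0              # synthetic low-S/N "event"
F = window_features(trace, win=50)
```

The feature matrix (one row per window position) would then be fed to the network; the amplitude feature peaks at the embedded event.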

  9. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    The objective of this research is to apply the devotee-based brand equity (DBBE) model to discover the constructs that measure a religious event as a business brand on the basis of devotees' perception. Structural equation modeling (SEM) was applied to test the hypothesized model, with confirmatory factor analysis (CFA) used to analyze the measurement model and assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information may be beneficial to event management and sponsors in building brands and operating visitor destinations. More importantly, the brands of these religious events in Pakistan can be built into a strong tourism product.

  10. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  11. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  12. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.

  13. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  14. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has some advantages over traditional ones with continuous feedback. Considering the effect of time-varying transmission delays, the criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.
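The event-driven communication logic described, transmitting feedback only when needed, can be sketched on a scalar plant; the gains and threshold below are illustrative, and transmission delays are ignored for brevity:

```python
def event_triggered_sim(a=1.0, k=3.0, x0=1.0, dt=1e-3, T=5.0, delta=0.02):
    """Scalar unstable plant x' = a*x + u stabilized over a network:
    the controller uses the last transmitted state x_s (u = -k*x_s),
    and a new transmission is triggered only when the measurement
    error |x - x_s| exceeds the threshold delta."""
    steps = int(T / dt)
    x, x_s, events = x0, x0, 0
    for _ in range(steps):
        if abs(x - x_s) > delta:      # event condition: transmit the state
            x_s = x
            events += 1
        x += dt * (a * x - k * x_s)   # forward-Euler step of x' = a*x + u
    return x, events, steps

x_final, n_events, n_steps = event_triggered_sim()
```

The state converges to a small neighborhood of the origin while the number of transmissions stays well below the number of simulation steps, which is the feedback-effort saving the paper targets.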

  15. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, Electric Power Research Institute Fire Loss Data and then collated into one database developed in the personal computer database management system, dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are

  16. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  17. Improving the Critic Learning for Event-Based Nonlinear $H_{\\infty }$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with the event-based nonlinear H∞ state feedback control design. First of all, the H∞ control problem is regarded as a two-player zero-sum game and the adaptive critic mechanism is used to achieve the minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation process of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability issue is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis on the lower bound of the minimal intersample time. Finally, the applications to aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the present novel design method.

  18. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. To this end, a certain understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of upcoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute the probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
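
    The n-Gram idea above can be sketched in a few lines: count how often each event follows another in training sequences, then predict the most probable follow-up. The surgical phase names below are invented placeholders, not the authors' actual ontology terms.

```python
from collections import Counter, defaultdict

# Hypothetical training sequences of surgical events (placeholder names).
training_sequences = [
    ["incision", "dissection", "clipping", "cutting", "suturing"],
    ["incision", "dissection", "coagulation", "clipping", "cutting"],
    ["incision", "dissection", "clipping", "cutting", "irrigation"],
]

# Bigram counts: how often each event follows a given event.
bigram_counts = defaultdict(Counter)
for seq in training_sequences:
    for prev, nxt in zip(seq, seq[1:]):
        bigram_counts[prev][nxt] += 1

def predict_next(event):
    """Return the most probable follow-up event and its probability."""
    counts = bigram_counts[event]
    total = sum(counts.values())
    nxt, c = counts.most_common(1)[0]
    return nxt, c / total

print(predict_next("dissection"))  # → ('clipping', 0.666...)
```

    A real system would use the Description Logics layer to normalize observed events into ontology concepts before the n-Gram counting, and would likely use higher-order n-Grams with smoothing.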

  19. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
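
    The core computation is simple: given the observed times of mitotic events in a field of view, take the interevent intervals and estimate the mean intermitotic time. The sketch below does this on synthetic event times drawn from a homogeneous Poisson process (the paper fits a nonhomogeneous one); the rate value is only loosely inspired by the A549 figure.

```python
import numpy as np

# Synthetic mitotic event times: exponential interevent intervals with a
# mean loosely based on the A549 intermitotic time (~21 h). This is a
# homogeneous-Poisson simplification of the paper's nonhomogeneous model.
rng = np.random.default_rng(0)
mean_intermitotic_h = 21.1
intervals = rng.exponential(mean_intermitotic_h, size=200)
event_times = np.cumsum(intervals)

# Interevent times (times between observed mitoses) and their mean.
observed_intervals = np.diff(event_times)
mean_interevent = observed_intervals.mean()
print(round(float(mean_interevent), 1))   # should be near 21.1 on average
```

    For the nonhomogeneous case in the paper, one would instead fit λ(t) = λ₀·exp(kt) to the event counts, e.g. by maximum likelihood over the observed event times.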

  20. Lessons from the restructuring of the Danish planning system and its impact on the Greater Copenhagen Region

    DEFF Research Database (Denmark)

    Galland, Daniel

    2013-01-01

    This paper explores the rise and decay of regional planning policies and institutions in the Greater Copenhagen Region (GCR) since the postwar era. The paper develops an understanding based on spatial selectivity and spatial rescaling as regards the fluctuating planning context in the GCR through...

  1. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  2. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger, which is based on processors running programs that filter events using the full event information. Computing power on the order of one second on a Pentium II processor is required per event. The architectural design is driven by cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as the operating system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...

  3. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory

    Science.gov (United States)

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.

    2017-01-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  4. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    The occurrence of events in power plants can seriously disrupt power generation, which makes online recording and examination of events highly important. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, procedures for the design and implementation of this system, its features, and the results gained are explained. The system provides predefined levels of online access to all event data for all its users in power plants, dispatching centers, regional utilities, and top-level management. By implementing an electric power industry intranet, an expandable modular system that can be used in different sectors of the industry is offered. The web-based online event recording and examination system offers the following advantages: - Online recording of events in power plants. - Examination of events in regional utilities. - Access to event data. - Preparation of managerial reports.

  5. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intentions. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical to successful time estimation and thus to time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environmental cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  6. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might extend current knowledge of drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in the life science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, automatically acquiring knowledge of molecular events in research articles, has attracted community-wide efforts recently. Most approaches are based on statistical models, requiring large-scale annotated corpora to precisely estimate model parameters. However, such corpora are usually difficult to obtain in practice. Therefore, employing un-annotated data through semi-supervised learning is a feasible solution for biomedical event extraction and has attracted growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences, but also the hidden topics embedded in the sentences are used for describing the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that more than 2.2% improvement in F-score on biomedical event extraction is achieved by the proposed framework when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely
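
    The distance idea can be sketched concretely: represent each sentence by a hidden-topic distribution and let an un-annotated sentence inherit the event annotation of its nearest annotated neighbour. The topic vectors and event labels below are made up for illustration; in the framework they would come from a topic model and the annotated corpus.

```python
import numpy as np

# Annotated sentences: (made-up 4-topic distribution, event annotation).
annotated = {
    "gene X activates protein Y": (np.array([0.7, 0.1, 0.1, 0.1]), "Positive_regulation"),
    "protein Y binds protein Z":  (np.array([0.1, 0.7, 0.1, 0.1]), "Binding"),
}

def cosine(a, b):
    """Cosine similarity between two topic distributions."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def assign_annotation(topic_vec):
    """Give an un-annotated sentence the label of its closest neighbour."""
    best = max(annotated.items(), key=lambda kv: cosine(topic_vec, kv[1][0]))
    return best[1][1]

unannotated_vec = np.array([0.6, 0.2, 0.1, 0.1])  # closest to the first sentence
print(assign_annotation(unannotated_vec))          # → Positive_regulation
```

    The paper combines this topic-based closeness with structural similarity of the sentences; the sketch keeps only the topic part to show the mechanism.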

  7. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    Full Text Available This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD), sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in reducing network bandwidth usage. However, it cannot detect packet dropout situations because data transmission and reception do not use a periodic time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme called modified SOD in which sensor data are sent when either the change of sensor output exceeds a given threshold or the time elapsed exceeds a given interval. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
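
    The modified SOD rule described above is easy to state in code: transmit when either the value change since the last transmission exceeds a threshold, or too many samples have passed. The signal and parameters below are illustrative.

```python
# Modified send-on-delta (SOD): send on a large value change OR after a
# maximum silent gap, so the estimator gets periodic "heartbeats" that
# make packet dropouts detectable.
def modified_sod(samples, delta, max_gap):
    """Return indices of samples that would be transmitted."""
    sent = [0]                       # always send the first sample
    last_value, last_idx = samples[0], 0
    for i, v in enumerate(samples[1:], start=1):
        if abs(v - last_value) > delta or (i - last_idx) >= max_gap:
            sent.append(i)
            last_value, last_idx = v, i
    return sent

# A signal with one jump followed by a slow drift: plain SOD would stay
# silent during the drift, but max_gap forces a transmission.
signal = [0.0, 0.01, 0.02, 0.03, 0.5, 0.51, 0.52, 0.53, 0.54, 0.55]
print(modified_sod(signal, delta=0.2, max_gap=4))  # → [0, 4, 8]
```

    Index 4 is sent by the delta condition (the jump to 0.5); index 8 is sent by the time condition alone, which is exactly the heartbeat that plain SOD lacks.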

  8. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, which is a
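
    The classic Rocchio update that the proposed algorithm builds on moves the query vector toward relevant feedback examples and away from non-relevant ones: q' = αq + β·mean(relevant) − γ·mean(non-relevant). The sketch below shows that baseline formula with illustrative weights and vectors; it is not the paper's ARF variant.

```python
import numpy as np

def rocchio(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classic Rocchio relevance-feedback update of a query vector."""
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return q

query = np.array([1.0, 0.0, 0.0])
relevant = np.array([[0.0, 1.0, 0.0], [0.0, 0.8, 0.2]])
nonrelevant = np.array([[0.0, 0.0, 1.0]])
print(rocchio(query, relevant, nonrelevant))  # → [1.0, 0.675, -0.075]
```

    In video event retrieval the vectors would be feature embeddings of events or shots, and the updated query re-ranks the collection.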

  9. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are remarkably meaningful to ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft getting into an icing region

  10. Space Radiation: The Number One Risk to Astronaut Health beyond Low Earth Orbit

    Science.gov (United States)

    Chancellor, Jeffery C.; Scott, Graham B. I.; Sutton, Jeffrey P.

    2014-01-01

    Projecting a vision for space radiobiological research necessitates understanding the nature of the space radiation environment and how radiation risks influence mission planning, timelines and operational decisions. Exposure to space radiation increases the risks of astronauts developing cancer, experiencing central nervous system (CNS) decrements, exhibiting degenerative tissue effects or developing acute radiation syndrome. One or more of these deleterious health effects could develop during future multi-year space exploration missions beyond low Earth orbit (LEO). Shielding is an effective countermeasure against solar particle events (SPEs), but is ineffective in protecting crew members from the biological impacts of fast moving, highly-charged galactic cosmic radiation (GCR) nuclei. Astronauts traveling on a protracted voyage to Mars may be exposed to SPE radiation events, overlaid on a more predictable flux of GCR. Therefore, ground-based research studies employing model organisms seeking to accurately mimic the biological effects of the space radiation environment must concatenate exposures to both proton and heavy ion sources. New techniques in genomics, proteomics, metabolomics and other “omics” areas should also be intelligently employed and correlated with phenotypic observations. This approach will more precisely elucidate the effects of space radiation on human physiology and aid in developing personalized radiological countermeasures for astronauts. PMID:25370382

  11. Space Radiation: The Number One Risk to Astronaut Health beyond Low Earth Orbit

    Directory of Open Access Journals (Sweden)

    Jeffery C. Chancellor

    2014-09-01

    Full Text Available Projecting a vision for space radiobiological research necessitates understanding the nature of the space radiation environment and how radiation risks influence mission planning, timelines and operational decisions. Exposure to space radiation increases the risks of astronauts developing cancer, experiencing central nervous system (CNS) decrements, exhibiting degenerative tissue effects or developing acute radiation syndrome. One or more of these deleterious health effects could develop during future multi-year space exploration missions beyond low Earth orbit (LEO). Shielding is an effective countermeasure against solar particle events (SPEs), but is ineffective in protecting crew members from the biological impacts of fast moving, highly-charged galactic cosmic radiation (GCR) nuclei. Astronauts traveling on a protracted voyage to Mars may be exposed to SPE radiation events, overlaid on a more predictable flux of GCR. Therefore, ground-based research studies employing model organisms seeking to accurately mimic the biological effects of the space radiation environment must concatenate exposures to both proton and heavy ion sources. New techniques in genomics, proteomics, metabolomics and other “omics” areas should also be intelligently employed and correlated with phenotypic observations. This approach will more precisely elucidate the effects of space radiation on human physiology and aid in developing personalized radiological countermeasures for astronauts.

  12. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.

  13. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  14. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposed a dietary management approach for major sports events based on the "HACCP" management model, according to the characteristics of catering activities at major sports events. Major sports events are no longer merely showcases of high-level competitive sport; they have become comprehensive, complex special events involving social, political, economic, cultural and other factors. Sporting events are thus expected to reach more diverse goals and objectives of economic, political, cultural, technological and other ...

  15. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the trend line of PWV ascents and descents. The increment of PWV over a short period (two to six hours) and the PWV slope over a longer period (a few hours to more than ten hours) during the PWV ascending phase are used as predictive factors with which to forecast a precipitation event. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecast two to six hours in advance of the precipitation event based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July, 2015. A good result was acquired using the proposed method, and about 74% of precipitation events were predicted some ten to thirty minutes before their onset with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and nowcasting precipitation forecasting.
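
    The two predictive factors described above reduce to a short-window PWV increment and a longer-window least-squares slope, each compared to a threshold. The sketch below shows that logic on a synthetic 5-minute PWV series; the thresholds are placeholders, not the paper's calibrated values.

```python
import numpy as np

def pwv_predictors(pwv, dt_hours):
    """Least-squares slope (mm/h) and total increment (mm) over a window."""
    t = np.arange(len(pwv)) * dt_hours
    slope = np.polyfit(t, pwv, 1)[0]
    increment = pwv[-1] - pwv[0]
    return slope, increment

def forecast_rain(pwv, dt_hours, slope_thresh=1.0, incr_thresh=5.0):
    """Flag a precipitation event when both factors exceed thresholds."""
    slope, incr = pwv_predictors(pwv, dt_hours)
    return bool(slope > slope_thresh and incr > incr_thresh)

# Synthetic ascending PWV: 40 mm rising to 52 mm over 4 hours,
# sampled every 5 minutes.
pwv = np.linspace(40.0, 52.0, 49)
print(forecast_rain(pwv, dt_hours=5 / 60))  # → True
```

    In operation the window would slide along the real-time RT-PPP PWV stream, and the thresholds would be tuned regionally to balance hit rate against false alarms.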

  16. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of parallel event building systems for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems. This includes simulations of a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  17. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz with a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, assigning the farm node destination for each event, and controlling the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while maintaining dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  18. Central FPGA-based destination and load control in the LHCb MHz event readout

    Science.gov (United States)

    Jacobsson, R.

    2012-10-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz with a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, assigning the farm node destination for each event, and controlling the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while maintaining dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.
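
    The pull side of the scheme rests on node credits: a farm node grants credits (event requests), and the master only assigns an event to a node that still holds credit, so slow or failed nodes simply stop receiving data. The toy scheduler below illustrates that mechanism; it is an illustrative software analogue, not the LHCb FPGA implementation.

```python
from collections import deque

class CreditScheduler:
    """Toy credit-based destination assignment (round-robin over credits)."""

    def __init__(self):
        self.ready = deque()   # nodes with at least one credit
        self.credits = {}

    def grant(self, node, n=1):
        """A farm node grants n event requests (credits)."""
        if self.credits.get(node, 0) == 0:
            self.ready.append(node)
        self.credits[node] = self.credits.get(node, 0) + n

    def assign(self, event_id):
        """Pick a destination for an event, or None (back-pressure)."""
        if not self.ready:
            return None
        node = self.ready[0]
        self.credits[node] -= 1
        if self.credits[node] == 0:
            self.ready.popleft()           # node exhausted its credits
        else:
            self.ready.rotate(-1)          # round-robin among credited nodes
        return node

sched = CreditScheduler()
sched.grant("node-A", 2)
sched.grant("node-B", 1)
assignments = [sched.assign(e) for e in range(4)]
print(assignments)  # → ['node-A', 'node-B', 'node-A', None]
```

    The final `None` is the load-control behaviour: with all credits spent, the master holds events (regulating the trigger rate) instead of pushing data at an unresponsive node.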

  19. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to the display of events can have broader usage and impact for experts and public alike.

  20. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  1. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations and challenges our ability to estimate locations at all. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it at near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and the EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.

  2. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

Automatic detection of abnormal behavior in CCTV cameras is important for improving security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the offender following the victim, or by the offender interacting with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos to be observed.
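A minimal sketch of the rule-based classification step for a single track is shown below. The paper does not give its features or thresholds, so the speed-based rule and the numeric cut-offs here are invented for illustration:

```python
def classify_track(speeds, stop_thresh=0.2, walk_thresh=2.0):
    """Toy rule-based single-track classifier: decide among stop/walk/run
    from per-segment speeds (m/s). Thresholds are assumptions, not the
    values used in the paper."""
    mean_speed = sum(speeds) / len(speeds)
    if mean_speed < stop_thresh:
        return "stop"
    if mean_speed < walk_thresh:
        return "walk"
    return "run"

print(classify_track([0.05, 0.1, 0.0]))   # → stop
print(classify_track([1.2, 1.4, 1.3]))    # → walk
print(classify_track([3.5, 4.0, 3.8]))    # → run
```

A full system would add track-shape features (net displacement for "loiter", heading change for "turn") and pairwise features for the interaction classes.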

  3. National inventory of Global Change relevant research in Norway; Nasjonal kartlegging av global change-relevant forskning

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-05-01

The Norwegian Global Change Committee has made an inventory of global change research (GCR) projects funded by the Research Council of Norway (RCN) in 2001. In the absence of a rigid definition, GCR was defined as research that can be considered relevant to the science agenda of the four major international global change programmes DIVERSITAS, IGBP, IHDP and WCRP. Relevance was judged based on the objectives stated for each of the international programmes and their core projects. No attempt was made to check whether the projects had any kind of link to the programmes they were considered relevant for. The grants provided by the RCN in 2001 to GCR as defined above amount to about 77 mill. NOK. Based on a recent survey of climate change research it is reasonable to estimate that the RCN finances between 30 and 40% of all GCR in Norway. Accordingly, the total value of Norwegian research relevant to the four international global change programmes in 2001 can be estimated at 192-254 mill. NOK.

  4. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments pose great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of the observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept, named attribute correlation confidence, to evaluate the fitting degree between a sensor reading and the abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.
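To make the attribute-dependency idea concrete, the toy Python fragment below scores a pair of readings against a two-node Bayesian network. The network, its conditional probability tables, and the scoring rule (joint probability under the learned model) are all invented simplifications of the paper's definition, which also involves learned structure and spatiotemporal checks:

```python
# Toy two-node Bayesian network: temperature level -> smoke level.
# CPT values are invented for illustration only.
p_temp = {"high": 0.2, "normal": 0.8}
p_smoke_given_temp = {"high":   {"yes": 0.9,  "no": 0.1},
                      "normal": {"yes": 0.05, "no": 0.95}}

def attribute_correlation_confidence(temp, smoke):
    # Score a reading by its joint probability under the dependency model:
    # readings that fit the learned attribute correlations score higher.
    return p_temp[temp] * p_smoke_given_temp[temp][smoke]

print(attribute_correlation_confidence("high", "yes"))  # attributes agree
print(attribute_correlation_confidence("high", "no"))   # attributes conflict
```

A high-temperature/no-smoke reading gets a much lower score, flagging a likely faulty node rather than a real fire event.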

  5. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises with ground truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat events on this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure. The results are compared with the existing technique and significant improvements are recorded.
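The final grouping step (adjacent shots with the same label form one beat event) can be sketched in a few lines. This is an illustrative reconstruction of that step, not the authors' code, and the labels are invented:

```python
from itertools import groupby

def beats_from_shot_labels(labels):
    """Group maximal runs of adjacent shots sharing a label into beat
    events, returned as (label, first_shot, last_shot) triples."""
    events, i = [], 0
    for label, run in groupby(labels):
        n = len(list(run))
        events.append((label, i, i + n - 1))
        i += n
    return events

print(beats_from_shot_labels(["chase", "chase", "fight", "fight", "fight", "chase"]))
# → [('chase', 0, 1), ('fight', 2, 4), ('chase', 5, 5)]
```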

  6. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

Psi-calculi constitute a parametric framework for nominal process calculi, where constraint-based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extend prime event structures to allow finite representations of (omega-)regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which…

  7. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AEs) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses to prevent medication errors, which is characterized by a decrease of adverse events (AEs). Methods: This study consisted of two sta...

  8. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs
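The risk weighting described above is simple arithmetic: an event's overall importance is its per-bin importance weighted by each bin's risk and summed. The sketch below uses invented bins, events, and numbers purely to illustrate the scheme:

```python
def overall_importance(bin_importance, bin_risk):
    """Sum each event's per-bin importance weighted by the bin risk.
    A sketch of the weighting scheme described in the abstract."""
    events = {e for imp in bin_importance.values() for e in imp}
    return {e: sum(bin_risk[b] * bin_importance[b].get(e, 0.0)
                   for b in bin_importance)
            for e in events}

# Hypothetical accident bins and per-bin importance of two event types
bin_importance = {"harbor_collision": {"fire": 0.6, "grounding": 0.1},
                  "inland_grounding": {"fire": 0.1, "grounding": 0.8}}
bin_risk = {"harbor_collision": 2.0, "inland_grounding": 1.0}
print(overall_importance(bin_importance, bin_risk))
```

With these made-up numbers, "fire" dominates (2.0 × 0.6 + 1.0 × 0.1 = 1.3) because it matters most in the riskiest bin, which is exactly the ranking effect the report aims for.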

  9. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

Mineralization is a special type of singularity event, and can be considered a rare event because, within a specific study area, the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map mineral prospectivity in the southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and aeromagnetic anomalies, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
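The odds-ratio ranking of evidence variables follows directly from the fitted logistic-regression coefficients: for a one-unit increase in a variable, the odds of mineralization multiply by exp(beta). The coefficients below are invented for illustration; the paper's fitted values are not given in the record:

```python
import math

def odds_ratio(beta):
    """Odds ratio for a one-unit increase of an evidence variable
    in logistic regression: exp(beta)."""
    return math.exp(beta)

# Hypothetical fitted coefficients for three evidence layers
coefs = {"skarn_alteration": 1.6, "fault_density": 0.7, "aeromag_anomaly": 0.4}
for name, b in sorted(coefs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: OR = {odds_ratio(b):.2f}")
```

An odds ratio above 1 marks a variable positively associated with mineralization; ranking variables by odds ratio gives the relative-importance ordering reported in the abstract.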

  10. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis' visit to the US) in the New York City—Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  11. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis increasingly became a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions made it possible to identify - and in some cases also to quantify - several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the…

  12. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.
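The idea of tagging grid events with XML for transport between producer and consumer can be illustrated with a tiny round trip. The tag names and message content below are invented, since the record does not give the protocol's schema:

```python
import xml.etree.ElementTree as ET

# Producer side: build a toy XML event message (schema invented).
msg = ET.Element("event", attrib={"type": "job-status"})
ET.SubElement(msg, "producer").text = "scheduler.grid.example"
ET.SubElement(msg, "consumer").text = "monitor.grid.example"
ET.SubElement(msg, "payload").text = "job 42 completed"
wire = ET.tostring(msg, encoding="unicode")
print(wire)

# Consumer side: parse the same text back into a tree.
parsed = ET.fromstring(wire)
print(parsed.find("payload").text)  # → job 42 completed
```

The point of the XML tagging is exactly this symmetry: any consumer that understands the agreed tags can reconstruct the event without sharing code with the producer.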

  13. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can… the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events….

  14. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr… We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.

  15. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  16. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  17. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting… of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible…

  18. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  19. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediment entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for the reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during the various phases of construction.
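The MLR step amounts to ordinary least squares on per-event predictors. The pure-Python sketch below fits a small model via the normal equations; the two predictors and all numbers are invented stand-ins for USLE-style event factors (real work would use a statistics package and the monitored datasets):

```python
def fit_mlr(X, y):
    """Ordinary least squares via normal equations (illustrative sketch).
    X: rows of predictors; an intercept column is added automatically."""
    rows = [[1.0] + list(r) for r in X]
    n = len(rows[0])
    # Build A = X^T X and b = X^T y
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

# Hypothetical events: (rainfall erosivity, bare-soil fraction) -> soil loss,
# generated here from soil_loss = 0.5 + 0.7*erosivity + 2.0*bare_fraction.
X = [(10, 0.9), (5, 0.5), (20, 0.8), (15, 0.3), (8, 0.6)]
y = [9.3, 5.0, 16.1, 11.6, 7.3]
print([round(c, 2) for c in fit_mlr(X, y)])  # → [0.5, 0.7, 2.0]
```

Because the toy data are exactly linear, the fit recovers the generating coefficients; with real event data the residual scatter is what makes these simulations so uncertain.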

  20. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are meteorology (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model of the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  1. Event-based rainfall-runoff modelling of the Kelantan River Basin

    International Nuclear Information System (INIS)

    Basarudin, Z; Adnan, N A; Latif, A R A; Syafiqah, N; Tahir, W

    2014-01-01

Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are meteorology (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model of the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  2. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
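The state-independent part of the triggering condition can be illustrated with a toy test: an event fires only when the measurement error exceeds an exponentially decaying threshold. The constants below are invented, and the paper's full threshold also mixes in neighbours' states at their own event times, which this sketch omits:

```python
import math

def should_trigger(error_norm, t, c0=0.5, alpha=0.8):
    """Event-trigger test against a state-independent exponentially
    decaying threshold c0*exp(-alpha*t). Constants are illustrative."""
    return error_norm > c0 * math.exp(-alpha * t)

# As the threshold decays, the same measurement error eventually triggers.
for t in [0.0, 1.0, 2.0, 3.0]:
    print(t, should_trigger(0.15, t))
# → False at t=0 and t=1, True at t=2 and t=3
```

The strictly positive decaying floor is also what rules out Zeno behaviour: between any two events a minimum error must accumulate, so triggers cannot pile up infinitely fast.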

  3. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector width in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting the performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
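The bank-size effect can be demonstrated with a toy Monte Carlo of the event loop itself. This is an illustrative model, not the one derived in the paper: particles live for a random number of events, each iteration processes the active particles in SIMD batches of the vector width, and partially filled batches waste lanes. As the bank drains, the waste grows, so larger banks keep the lanes fuller for longer:

```python
import random

def vector_efficiency(bank_size, vector_width, mean_events=10.0, seed=1):
    """Toy model of event-based transport: fraction of SIMD lane-slots
    doing useful work as a bank of particles drains (assumptions: no bank
    refill, one shared event type, constant event cost)."""
    rng = random.Random(seed)
    remaining = [1 + int(rng.expovariate(1.0 / mean_events))
                 for _ in range(bank_size)]
    useful = wasted = 0
    while remaining:
        active = len(remaining)
        batches = -(-active // vector_width)          # ceil division
        useful += active                              # filled lanes
        wasted += batches * vector_width - active     # idle lanes
        remaining = [r - 1 for r in remaining if r > 1]
    return useful / (useful + wasted)

for n in (8, 32, 160, 640):
    print(n, round(vector_efficiency(n, vector_width=8), 3))
```

Running this shows efficiency rising toward 1 as the bank grows relative to the vector width, qualitatively matching the paper's observation that the bank must be much larger than the vector for high efficiency.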

  4. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

Full Text Available A rapid progress in intelligent sensing technology creates new interest in the development of analysis and design of non-conventional sampling schemes. The investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. The related works in adaptive sampling are summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and of the asymptotic integral sampling effectiveness, are derived. Simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
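The two schemes compared in the paper can be sketched side by side. The first is the classic linear send-on-delta rule; the second triggers on the time-integral of the deviation, in the spirit of the integral criterion (the discrete-time form, step size, and thresholds here are illustrative assumptions):

```python
def send_on_delta(samples, delta):
    """Report a new value only when it deviates from the last reported
    value by at least delta (linear send-on-delta/level-crossing)."""
    reported = [samples[0]]
    for x in samples[1:]:
        if abs(x - reported[-1]) >= delta:
            reported.append(x)
    return reported

def send_on_integral(samples, dt, threshold):
    """Report when the accumulated time-integral of the absolute
    deviation from the last reported value reaches a threshold
    (a discrete sketch of the integral criterion)."""
    reported, acc = [samples[0]], 0.0
    for x in samples[1:]:
        acc += abs(x - reported[-1]) * dt
        if acc >= threshold:
            reported.append(x)
            acc = 0.0
    return reported

signal = [0.0, 0.1, 0.25, 0.3, 0.9, 1.0, 1.05, 1.6]
print(send_on_delta(signal, delta=0.5))              # → [0.0, 0.9, 1.6]
print(send_on_integral(signal, dt=1.0, threshold=0.5))  # → [0.0, 0.3, 0.9, 1.6]
```

On this toy signal the integral rule also reacts to a persistent small deviation (reporting 0.3), which the pure level-crossing rule ignores until the jump to 0.9.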

  5. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, the chapter 15 accident analysis of the generic SAR, and the reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual occurrences of non-ABWR digital I and C software failure events, which were reported to the LER of USNRC or the IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well…

  6. Life review based on remembering specific positive events in active aging.

    Science.gov (United States)

    Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther

    2015-02-01

    The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.

  7. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to interchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.
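The communication saving behind such event-based strategies can be illustrated with a simple send-on-delta rule, a common event-triggering condition (a generic sketch under our own assumptions; it is not the specific triggering condition used in the paper):

```python
def send_on_delta(samples, delta):
    """Transmit a sample only when it deviates from the last
    transmitted value by more than `delta` (the event condition)."""
    sent = []
    last = None
    for t, x in enumerate(samples):
        if last is None or abs(x - last) > delta:
            sent.append((t, x))  # event: transmit over the RF link
            last = x
    return sent

# A slowly varying signal triggers far fewer transmissions than
# periodic (discrete-time) sampling, which would send every sample.
signal = [0.0, 0.05, 0.1, 0.9, 0.95, 1.0, 1.02]
events = send_on_delta(signal, delta=0.5)
```

Here only two of the seven samples are transmitted, which is the source of the communication-efficiency gain reported above.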

  8. Supervision in the PC based prototype for the ATLAS event filter

    CERN Document Server

    Bee, C P; Etienne, F; Fede, E; Meessen, C; Nacasch, R; Qian, Z; Touchard, F

    1999-01-01

    A prototype of the ATLAS event filter based on commodity PCs linked by a Fast Ethernet switch has been developed in Marseille. The present contribution focuses on the supervision aspects of the prototype, based on Java and Java mobile agent technology. (5 refs).

  9. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, was integrated into the design of our system. Unlike traditional event reporting systems that only collect and store the reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  10. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Science.gov (United States)

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of the events and have better accuracy of localization compared with other algorithms. PMID:22163972
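The trust-index bookkeeping in the final phase can be sketched as follows (an illustrative reconstruction only; the step size, the [0, 1] bounds, and the initial trust value of 0.5 are our assumptions, not values taken from the paper):

```python
def update_trust(trust, reports, estimated_status, step=0.1):
    """Subtract on Negative, Add on Positive: decrease a node's trust
    index when its binary report disagrees with the estimated event
    status around it, and increase it when the report agrees."""
    updated = {}
    for node, reported in reports.items():
        t = trust.get(node, 0.5)
        if reported == estimated_status[node]:
            t = min(1.0, t + step)   # add on positive
        else:
            t = max(0.0, t - step)   # subtract on negative
        updated[node] = t
    return updated

trust = {"n1": 0.5, "n2": 0.5}
reports = {"n1": 1, "n2": 0}
status = {"n1": 1, "n2": 1}  # both nodes lie inside the event region
trust = update_trust(trust, reports, status)
```

In the next reporting cycle, node `n2`'s lowered trust index reduces its weight in the localization, which is how faulty nodes are suppressed.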

  11. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of the events and have better accuracy of localization compared with other algorithms.

  12. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector width to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution times of events being carried out in a single event-iteration.
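The effect of bank size on vector efficiency can be illustrated with a toy draining-bank model (our own simplification, not the model from the paper: each particle is assumed to terminate with a fixed probability per event, and lanes in a partially filled vector group are wasted):

```python
import math
import random

def vector_efficiency(bank_size, vector_width, survival=0.9, seed=1):
    """Estimate average lane utilization when particles in the bank
    terminate randomly, so later event-iterations run with a
    partially drained bank and waste lanes in the last group."""
    rng = random.Random(seed)
    alive = bank_size
    events = ops = 0
    while alive > 0:
        events += alive                            # useful work done
        ops += math.ceil(alive / vector_width)     # vector ops issued
        # each particle survives this event with probability `survival`
        alive = sum(1 for _ in range(alive) if rng.random() < survival)
    return events / (ops * vector_width)

# Larger banks amortize the partially filled tail groups over more events.
eff_large = vector_efficiency(bank_size=160, vector_width=8)
eff_small = vector_efficiency(bank_size=16, vector_width=8)
```

The larger bank spends most of its events with nearly full vector groups, reproducing qualitatively the paper's observation that the bank must be much larger than the vector width for high efficiency.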

  13. Using Forbush Decreases to Derive the Transit Time of ICMEs Propagating from 1 AU to Mars

    Science.gov (United States)

    Freiherr von Forstner, Johan L.; Guo, Jingnan; Wimmer-Schweingruber, Robert F.; Hassler, Donald M.; Temmer, Manuela; Dumbović, Mateja; Jian, Lan K.; Appel, Jan K.; Čalogović, Jaša; Ehresmann, Bent; Heber, Bernd; Lohf, Henning; Posner, Arik; Steigies, Christian T.; Vršnak, Bojan; Zeitlin, Cary J.

    2018-01-01

    The propagation of 15 interplanetary coronal mass ejections (ICMEs) from Earth's orbit (1 AU) to Mars (˜1.5 AU) has been studied with their propagation speed estimated from both measurements and simulations. The enhancement of magnetic fields related to ICMEs and their shock fronts causes the so-called Forbush decrease, which can be detected as a reduction of galactic cosmic rays measured on ground. We have used galactic cosmic ray (GCR) data from in situ measurements at Earth, from both STEREO A and STEREO B as well as GCR measurements by the Radiation Assessment Detector (RAD) instrument on board Mars Science Laboratory on the surface of Mars. A set of ICME events has been selected during the periods when Earth (or STEREO A or STEREO B) and Mars locations were nearly aligned on the same side of the Sun in the ecliptic plane (so-called opposition phase). Such lineups allow us to estimate the ICMEs' transit times between 1 and 1.5 AU by estimating the delay time of the corresponding Forbush decreases measured at each location. We investigate the evolution of their propagation speeds before and after passing Earth's orbit and find that the deceleration of ICMEs due to their interaction with the ambient solar wind may continue beyond 1 AU. We also find a substantial variance of the speed evolution among different events revealing the dynamic and diverse nature of eruptive solar events. Furthermore, the results are compared to simulation data obtained from two CME propagation models, namely the Drag-Based Model and ENLIL plus cone model.
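Once the Forbush decrease onsets have been timed at both locations, the transit-time estimate reduces to simple kinematics; a minimal sketch (the 48-hour delay is a made-up example value, not one of the 15 events):

```python
AU_KM = 1.495978707e8  # kilometers per astronomical unit

def mean_transit_speed(delay_hours, r1_au=1.0, r2_au=1.5):
    """Mean ICME speed between two heliocentric distances, from the
    delay between the Forbush decrease onsets observed at each."""
    distance_km = (r2_au - r1_au) * AU_KM
    return distance_km / (delay_hours * 3600.0)  # km/s

# e.g. a 48-hour delay between the decreases seen at Earth and at Mars
speed = mean_transit_speed(48.0)
```

Comparing this mean transit speed with the speed measured near 1 AU is what reveals whether deceleration continues beyond Earth's orbit.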

  14. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

    A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However it remains challenging to measure and identify the severity information of ADEs. The objective of the study is to develop and evaluate a semantic web based approach for building a knowledge base of severe ADEs based on the FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from SIDe Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Event (CTCAE) grading system and classified the ADEs into the CTCAE in the Web Ontology Language (OWL). We identified and validated 2,444 unique Drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4 and 2,196 pairs in Grade 3.

  15. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset.
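A single-hidden-layer feed-forward scorer of the kind described can be sketched in a few lines of NumPy (illustrative only: the layer sizes, tanh/sigmoid activations, and random weights are placeholders, not the tuned network from the paper):

```python
import numpy as np

def detect(window, w1, b1, w2, b2, threshold=0.5):
    """Score a waveform window with one hidden layer (tanh) and a
    sigmoid output; flag a microseismic event when the score exceeds
    the threshold."""
    h = np.tanh(window @ w1 + b1)                  # hidden layer
    score = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))   # output neuron
    return bool(score > threshold), float(score)

# Placeholder weights; in practice these come from training.
rng = np.random.default_rng(0)
w1, b1 = rng.normal(size=(32, 8)), np.zeros(8)
w2, b2 = rng.normal(size=8), 0.0
flagged, score = detect(rng.normal(size=32), w1, b1, w2, b2)
```

In deployment the window would slide along the continuous waveform, with the trained weights substituted for the random ones.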

  16. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Wireless sensor networks have strong dynamics and uncertainty, including network topological changes and node disappearance or addition, and face various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region-similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, in case of lacking regional data, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD) to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that for an unknown security event forecast, the forecast accuracy and effects of the similarity forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  17. Badhwar-O'Neill 2011 Galactic Cosmic Ray Model Update and Future Improvements

    Science.gov (United States)

    O'Neill, Pat M.; Kim, Myung-Hee Y.

    2014-01-01

    The Badhwar-O'Neill Galactic Cosmic Ray (GCR) Model, based on actual GCR measurements, is used by deep space mission planners for the certification of micro-electronic systems and the analysis of radiation health risks to astronauts on space missions. The BO GCR Model provides the GCR flux in deep space (outside the Earth's magnetosphere) for any given time from 1645 to the present. The energy spectrum from 50 MeV/n to 20 GeV/n is provided for ions from hydrogen to uranium. This work describes the most recent version of the BO GCR model (BO'11). BO'11 determines the GCR flux at a given time by applying an empirical time-delay function to past sunspot activity. We describe the GCR measurement data used in the BO'11 update - modern data from BESS, PAMELA, CAPRICE, and ACE, weighted more heavily than the older balloon data used for the previous BO model (BO'10). We look at the GCR flux for the last 24 solar minima and show how much greater the flux was for the cycle 24 minimum in 2010. The BO'11 Model uses the traditional, steady-state Fokker-Planck differential equation to account for particle transport in the heliosphere due to diffusion, convection, and adiabatic deceleration. It assumes a radially symmetrical diffusion coefficient derived from magnetic disturbances caused by sunspots carried onward by a constant solar wind. A more complex differential equation is now being tested to account for particle transport in the heliosphere in the next-generation BO model. This new model is time-dependent (no longer a steady-state model). In the new model, the dynamics and anti-symmetrical features of the actual heliosphere are accounted for, so empirical time-delay functions will no longer be required. The new model will be capable of simulating the more subtle features of modulation - such as the Sun's polarity and the dependence of modulation on gradient and curvature drift. This improvement is expected to significantly improve the fidelity of the BO GCR model. Preliminary results of its

  18. Development of diagnostic prediction tools for bacteraemia caused by third-generation cephalosporin-resistant enterobacteria in suspected bacterial infections: a nested case-control study.

    Science.gov (United States)

    Rottier, W C; van Werkhoven, C H; Bamberg, Y R P; Dorigo-Zetsma, J W; van de Garde, E M; van Hees, B C; Kluytmans, J A J W; Kuck, E M; van der Linden, P D; Prins, J M; Thijsen, S F T; Verbon, A; Vlaminckx, B J M; Ammerlaan, H S M; Bonten, M J M

    2018-03-23

    Current guidelines for empirical antibiotic treatment predict the presence of third-generation cephalosporin-resistant enterobacterial bacteraemia (3GCR-E-Bac) in case of infection only poorly, thereby increasing unnecessary carbapenem use. We aimed to develop diagnostic scoring systems which can better predict the presence of 3GCR-E-Bac. A retrospective nested case-control study was performed that included patients ≥18 years of age from eight Dutch hospitals in whom blood cultures were obtained and intravenous antibiotics were initiated. Each patient with 3GCR-E-Bac was matched to four control infection episodes within the same hospital, based on blood-culture date and onset location (community or hospital). Starting from 32 commonly described clinical risk factors at infection onset, selection strategies were used to derive scoring systems for the probability of community- and hospital-onset 3GCR-E-Bac. 3GCR-E-Bac occurred in 90 of 22,506 (0.4%) community-onset infections and in 82 of 8,110 (1.0%) hospital-onset infections, and these cases were matched to 360 community-onset and 328 hospital-onset control episodes. The derived community-onset and hospital-onset scoring systems consisted of six and nine predictors, respectively. With the selected score cut-offs, the models identified 3GCR-E-Bac with sensitivity equal to existing guidelines (community-onset: 54.3%; hospital-onset: 81.5%). However, they reduced the proportion of patients classified as at risk for 3GCR-E-Bac (i.e. eligible for empirical carbapenem therapy) by 40% (95% CI 21-56%) and 49% (95% CI 39-58%) in community-onset and hospital-onset infections, respectively. These prediction scores for 3GCR-E-Bac, specifically geared towards the initiation of empirical antibiotic treatment, may improve the balance between inappropriate antibiotics and carbapenem overuse. Copyright © 2018 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.

  19. Drug Interactions Between Hepatoprotective Agents Ursodeoxycholic Acid or Glycyrrhizin and Ombitasvir/Paritaprevir/Ritonavir in Healthy Japanese Subjects.

    Science.gov (United States)

    Zha, Jiuhong; Badri, Prajakta S; Ding, Bifeng; Uchiyama, Naotaka; Alves, Katia; Rodrigues, Lino; Redman, Rebecca; Dutta, Sandeep; Menon, Rajeev M

    2015-11-01

    The 2 direct-acting antiviral combination (2D) of ombitasvir and paritaprevir (coadministered with ritonavir) is being evaluated for the treatment of chronic hepatitis C virus infection in Japan. Ursodeoxycholic acid (UDCA) and glycyrrhizin (GCR) are hepatoprotective agents widely used in Japan. A drug-drug interaction (DDI) study was conducted to guide dosing recommendations for UDCA and GCR when coadministered with the 2D regimen. DDIs between the 2D regimen (ombitasvir/paritaprevir/ritonavir 25/150/100 mg orally once daily) and UDCA (50 mg orally 3 times daily) or GCR (80 mg intravenously once daily) were evaluated in a 2-arm, multiple-dose study in 24 Japanese healthy subjects under fed conditions. Pharmacokinetic and safety evaluations were performed when UDCA or GCR and the 2D regimen were administered alone and during coadministration. Exposures from coadministration of the 2D regimen plus UDCA or GCR versus the 2D regimen, UDCA, or GCR alone were compared using repeated-measures analyses of natural logarithms of the maximum plasma concentration (Cmax) and area under the curve (AUC). After coadministration of the 2D regimen and UDCA, steady-state exposures (Cmax and AUC) of ombitasvir, paritaprevir, and ritonavir showed a ≤9% change, and UDCA exposures showed a ≤20% change compared with administration alone. When the 2D regimen and GCR were coadministered, steady-state exposures of ombitasvir, paritaprevir, and ritonavir were not affected (≤9% change), GCR AUC increased by 49%, and GCR Cmax was unaffected (<1% change). No dose adjustment is needed for UDCA, GCR, or the 2D regimen when UDCA or GCR is coadministered with the 2D regimen in hepatitis C virus-infected patients under fed conditions. Clinical monitoring of patients using GCR is recommended due to an approximately 50% increase in GCR AUC when coadministered with the 2D regimen. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.
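Exposure comparisons such as the reported 49% increase in GCR AUC are conventionally expressed as ratios of geometric means computed on the natural-log scale; a minimal sketch with made-up AUC values (the numbers below are illustrative, not the study's data):

```python
import math

def geometric_mean_ratio(test_values, reference_values):
    """Ratio of geometric means, computed on the natural-log scale as
    in a standard bioequivalence-style exposure comparison."""
    log_diff = (sum(map(math.log, test_values)) / len(test_values)
                - sum(map(math.log, reference_values)) / len(reference_values))
    return math.exp(log_diff)

# Hypothetical GCR AUC values: alone vs. coadministered with the 2D regimen
auc_alone = [100.0, 120.0, 90.0]
auc_combined = [150.0, 178.0, 135.0]
gmr = geometric_mean_ratio(auc_combined, auc_alone)  # ratio ~1.49, i.e. ~+49%
```

A ratio near 1.0 would indicate no interaction; the illustrative values above are chosen to mirror the ~49% AUC increase reported for GCR.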

  20. Event-based motion correction for PET transmission measurements with a rotating point source

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Meikle, Steven R; Fulton, Roger

    2011-01-01

    Accurate attenuation correction is important for quantitative positron emission tomography (PET) studies. When performing transmission measurements using an external rotating radioactive source, object motion during the transmission scan can distort the attenuation correction factors computed as the ratio of the blank to transmission counts, and cause errors and artefacts in reconstructed PET images. In this paper we report a compensation method for rigid body motion during PET transmission measurements, in which list mode transmission data are motion corrected event-by-event, based on known motion, to ensure that all events which traverse the same path through the object are recorded on a common line of response (LOR). As a result, the motion-corrected transmission LOR may record a combination of events originally detected on different LORs. To ensure that the corresponding blank LOR records events from the same combination of contributing LORs, the list mode blank data are spatially transformed event-by-event based on the same motion information. The number of counts recorded on the resulting blank LOR is then equivalent to the number of counts that would have been recorded on the corresponding motion-corrected transmission LOR in the absence of any attenuating object. The proposed method has been verified in phantom studies with both stepwise movements and continuous motion. We found that attenuation maps derived from motion-corrected transmission and blank data agree well with those of the stationary phantom and are significantly better than uncorrected attenuation data.
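The event-by-event correction amounts to applying the inverse of the known rigid-body motion to both endpoints of each list-mode event's LOR. A 2-D sketch with a pure rotation (the actual method uses the full measured 3-D rigid-body transform):

```python
import math

def correct_event(p1, p2, angle_rad):
    """Rotate both LOR endpoints by the inverse of the known object
    rotation, so events that traversed the same path through the
    object land on a common corrected LOR."""
    c, s = math.cos(-angle_rad), math.sin(-angle_rad)
    rot = lambda p: (c * p[0] - s * p[1], s * p[0] + c * p[1])
    return rot(p1), rot(p2)

# an event detected after the object had rotated by 90 degrees
q1, q2 = correct_event((1.0, 0.0), (-1.0, 0.0), math.pi / 2)
```

Applying the same transform to the list-mode blank data, as the paper describes, keeps the blank and transmission LORs consistent so their ratio still yields valid attenuation correction factors.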

  1. Solar Energetic Particles Events and Human Exploration: Measurements in a Space Habitat

    Science.gov (United States)

    Narici, L.; Berrilli, F.; Casolino, M.; Del Moro, D.; Forte, R.; Giovannelli, L.; Martucci, M.; Mergè, M.; Picozza, P.; Rizzo, A.; Scardigli, S.; Sparvoli, R.; Zeitlin, C.

    2016-12-01

    Solar activity is the source of space weather disturbances. Flares, CMEs, and coronal holes modulate the physical conditions of circumterrestrial and interplanetary space and ultimately the fluxes of high-energy ionized particles, i.e., solar energetic particles (SEPs) and the galactic cosmic ray (GCR) background. This ionizing radiation affects spacecraft and biological systems, so it is an important issue for the human exploration of space. During deep space travel (for example, a trip to Mars), radiation risk thresholds may well be exceeded by the crew, so mitigation countermeasures must be employed. Solar particle events (SPEs) constitute high risks due to their impulsive high dose rates. SPE forecasting appears to be needed, specifically tailored to the needs of human exploration. Understanding which SPE parameters produce events leading to higher health risks for astronauts in deep space is therefore a first-priority issue. Measurements of SPE effects with active devices in LEO inside the ISS can produce important information for the specific SEP measured, relative to the specific detector location in the ISS (in a human habitat with shielding typical of crewed spacecraft). Active detectors can select data from specific geomagnetic regions along the orbits, allowing geomagnetic selections that best mimic deep space radiation. We present results from data acquired in 2010-2012 by the ALTEA detector system inside the ISS (18 SPEs detected). We compare these data with data from the PAMELA detector on a LEO satellite, with RAD data during Curiosity's journey to Mars, with GOES data, and with several solar physical parameters. While several features of the radiation modulation are easily understood as effects of the geomagnetic field (as an example, we report a proportionality between the flux in the ISS and the energetic proton flux measured by GOES), some features appear more difficult to interpret. The final goal of this work is to find the

  2. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    Science.gov (United States)

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.
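Event-driven designs of this kind gate the control update with a triggering condition on the state error; a generic sketch (the specific triggering rule, threshold, and critic-based update in the paper differ):

```python
def should_update(x, x_last_event, threshold):
    """Trigger a new control computation only when the gap between the
    current state and the state at the last event exceeds a threshold."""
    gap = sum((a - b) ** 2 for a, b in zip(x, x_last_event)) ** 0.5
    return gap > threshold

# Between events the controller holds its last output, saving computation.
updates = 0
x_event = [0.0, 0.0]
for x in ([0.1, 0.0], [0.2, 0.1], [0.6, 0.4], [0.65, 0.45]):
    if should_update(x, x_event, threshold=0.5):
        updates += 1          # recompute the (critic-based) control law
        x_event = list(x)     # and reset the event state
```

Only one of the four samples triggers an update, illustrating why event-driven formulations reduce the computational burden relative to time-driven ones.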

  3. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    Science.gov (United States)

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, extension of Pub/Sub systems in DTSNs has become a promising research topic. However, due to the unique frequent partitioning characteristic of DTSNs, extension of a Pub/Sub system in a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
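The queue-management drop rule can be sketched as follows (the field names and exact drop conditions are our assumptions based on the description above, not the protocol's actual data structures):

```python
def manage_queue(queue, now):
    """Keep an event only while it is still within its survival time
    and has not yet been delivered to the interested subscribers."""
    kept = []
    for ev in queue:
        expired = now - ev["created"] > ev["survival_time"]
        if ev["delivered"] or expired:
            continue  # drop to minimize transmission overhead
        kept.append(ev)
    return kept

queue = [
    {"created": 0, "survival_time": 10, "delivered": False},  # still valid
    {"created": 0, "survival_time": 3, "delivered": False},   # expired
    {"created": 5, "survival_time": 10, "delivered": True},   # delivered
]
queue = manage_queue(queue, now=6)
```

Dropping expired and already-delivered events is what keeps the per-community queues small despite the long disconnection periods typical of DTSNs.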

  4. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine, having contributed most to the relief of human misery and the remarkable increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to vaccination worldwide, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  5. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high-luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first-level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order of 10-100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high-performance workstation ATM interface, and (3) a VMEbus ATM interface. The requirement for traffic shaping in ATM-based event builders is discussed, and an analysis of the performance of several such schemes is presented.

  6. Event-based proactive interference in rhesus monkeys.

    Science.gov (United States)

    Devkar, Deepna T; Wright, Anthony A

    2016-10-01

    Three rhesus monkeys (Macaca mulatta) were tested in a same/different memory task for proactive interference (PI) from prior trials. PI occurs when a previous sample stimulus appears as a test stimulus on a later trial, does not match the current sample stimulus, and the wrong response "same" is made. Trial-unique pictures (scenes, objects, animals, etc.) were used on most trials, except on trials where the test stimulus matched a potentially interfering sample stimulus from a prior trial (1, 2, 4, 8, or 16 trials prior). Greater interference occurred when fewer trials separated interference and test. PI functions showed a continuum of interference. Delays between sample and test stimuli and intertrial intervals were manipulated to test how PI might vary as a function of elapsed time. Contrary to a similar study with pigeons, these time manipulations had no discernible effect on the monkeys' PI, as shown by complete overlap of PI functions with no statistical differences or interactions. These results suggested that interference was strictly based upon the number of intervening events (trials with other pictures) without regard to elapsed time. The monkeys' apparent event-based interference was further supported by retesting with a novel set of 1,024 pictures. PI from novel pictures 1 or 2 trials prior was greater than from familiar pictures (a familiar set of 1,024 pictures). Moreover, when potentially interfering novel stimuli were 16 trials prior, performance accuracy was actually greater than accuracy on baseline trials (no interference), suggesting that remembering stimuli from 16 trials prior was a cue that this stimulus was not the sample stimulus on the current trial - a somewhat surprising conclusion particularly given monkeys.

  7. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.
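
    The reachable-events notion can be illustrated with a small least-fixpoint computation. This is a hypothetical encoding of our own (the paper's event structures with circular dependencies, and their correspondence with PCL provability, are considerably richer):

```python
def reachable_events(events, enablings):
    """Least-fixpoint computation of reachable events.

    `enablings` maps each event to a list of alternative enabling sets;
    an event becomes reachable once every event in one of its enabling
    sets is reachable. (Hypothetical encoding; the paper's model with
    circular dependencies is richer than this sketch.)
    """
    done = set()
    changed = True
    while changed:
        changed = False
        for e in events:
            if e not in done and any(s <= done for s in enablings.get(e, [set()])):
                done.add(e)
                changed = True
    return done

# Toy "circular" contract: a requires b and vice versa; with ordinary
# enabling this deadlocks, while c is unconditional and enables d.
enablings = {"a": [{"b"}], "b": [{"a"}], "c": [set()], "d": [{"c"}]}
print(sorted(reachable_events({"a", "b", "c", "d"}, enablings)))  # ['c', 'd']
```

    The circular pair a/b stays unreachable under plain enabling; resolving such circularities is precisely what the paper's new relation is designed for.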

  8. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at one's own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method makes it possible to incorporate a quantitative estimate of the safety significance of operational events. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA-based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  9. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    Science.gov (United States)

    Tian, X.; Zhang, Y.

    2018-03-01

    The Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical principle, in which the functional is defined by an integral. The principle gives a variational description of nonconservative systems even when the Lagrangian is independent of time. This paper focuses on Noether's theorem and its inverse for a Birkhoffian system in event space based on the Herglotz variational problem. Firstly, according to the Herglotz variational principle of a Birkhoffian system, the corresponding principle in event space is established. Secondly, its parametric equations and two basic formulae for the variation of the Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of the Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between Noether symmetry and conserved quantity, Noether's theorem is derived. Under classical conditions, Noether's theorem of a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical one. In addition, Noether's inverse theorem of the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. At the end of the paper, an example is given to illustrate the application of the results.
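
    For orientation, the classical Herglotz construction (stated here for a Lagrangian system; the Birkhoffian setting above generalizes this structure) defines the action z by an ordinary differential equation rather than an integral:

```latex
\dot{z}(t) = L\bigl(t,\, q(t),\, \dot{q}(t),\, z(t)\bigr), \qquad z(t_0) = z_0,
```

    with the quantity to be extremized being the terminal value z(t_1). When L does not depend on z, direct integration gives z(t_1) = z_0 + \int_{t_0}^{t_1} L\,dt, which recovers the classical action functional; the dependence of L on z is what allows nonconservative behaviour.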

  10. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    Science.gov (United States)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event-triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise, effectively reduces the flow of communication between plant and controller, and improves output performance. Using a Lyapunov approach, stability in the mean-square sense is proved. A simulated example illustrates the properties of our approach.
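
    As a rough illustration of integral-type event triggering (our own simplified sketch, not the paper's mechanism or its Lyapunov design), a transmission can be released only when the integral of the squared error since the last event exceeds a threshold, so that zero-mean measurement noise must accumulate before it can cause an event:

```python
import random

def integral_event_trigger(error_signal, dt=0.01, threshold=0.05):
    """Integral-type event trigger (illustrative): fire when the running
    integral of the squared error since the last event crosses a threshold,
    so isolated noise spikes do not immediately cause transmissions."""
    integral = 0.0
    events = []
    for k, e in enumerate(error_signal):
        integral += e * e * dt
        if integral > threshold:
            events.append(k)     # transmit measurement at sample k
            integral = 0.0       # reset the integral at each transmission
    return events

random.seed(0)
noisy_error = [0.5 + 0.1 * random.gauss(0, 1) for _ in range(500)]
events = integral_event_trigger(noisy_error)
print(len(events), "transmissions for 500 samples")
```

    A static trigger comparing |e| against a bound would fire on every noisy sample here; the integral form trades a bounded detection delay for a much lower communication rate.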

  11. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the initial moisture conditions estimation prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  12. A Community-Based Event Delivery Protocol in Publish/Subscribe Systems for Delay Tolerant Sensor Networks

    Directory of Open Access Journals (Sweden)

    Haigang Gong

    2009-09-01

    Full Text Available The basic operation of a Delay Tolerant Sensor Network (DTSN) is to perform pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the unique frequent-partitioning characteristic of DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanging communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. Queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.

  13. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
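
    A toy software model of the row-then-column arbitration just described (a fixed-priority simplification of ours; real AER arbiters are asynchronous arbiter trees and are not strictly lowest-index-first):

```python
def arbitrate(pending):
    """Fixed-priority row-then-column arbitration (illustrative sketch of
    the arbiter-based AER generator): colliding events are queued and
    sequenced onto the bus rather than discarded."""
    order = []
    pending = set(pending)
    while pending:
        row = min(r for r, _ in pending)              # row arbiter picks a row
        col = min(c for r, c in pending if r == row)  # column arbiter within it
        order.append((row, col))                      # address placed on the AER bus
        pending.remove((row, col))
    return order

# three simultaneous (colliding) events are serialized; none is lost
print(arbitrate({(2, 1), (0, 3), (0, 1)}))  # [(0, 1), (0, 3), (2, 1)]
```

    In the collision-discarding approach the same three simultaneous events would yield at most one valid address; arbitration preserves all of them at the cost of extra latency.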

  14. A Markovian event-based framework for stochastic spiking neural networks.

    Science.gov (United States)

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived in such classical cases of neural networks as the linear integrate-and-fire neuron models with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
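
    The event-based viewpoint can be mimicked numerically: for a single noisy leaky integrate-and-fire neuron one samples the first-passage time to threshold and discards the membrane trajectory. The sketch below uses toy parameters of our own and brute-force simulation (the paper derives the transition probabilities analytically rather than by simulation):

```python
import math, random

def next_spike_time(v0, theta=1.0, tau=1.0, mu=1.5, sigma=0.3, dt=1e-3):
    """Sample the (random) time to the next spike of a noisy leaky
    integrate-and-fire neuron started at v0, by Euler-Maruyama simulation:
        dV = (-V/tau + mu) dt + sigma dW,   spike when V >= theta.
    Only the spike time is returned -- the event-based description."""
    v, t = v0, 0.0
    while v < theta:
        v += (-v / tau + mu) * dt + sigma * math.sqrt(dt) * random.gauss(0, 1)
        t += dt
    return t

random.seed(1)
# from a fixed reset potential the intervals are i.i.d. draws from the
# interspike-interval distribution, so the spike train is Markov
isis = [next_spike_time(0.0) for _ in range(5)]
print([round(t, 3) for t in isis])
```

    In a network, the state at the last spike (membrane values, pending delayed spikes) replaces the single reset value, which is why the transition probability of the chain is tied to the interspike-interval distribution.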

  15. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach in computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-system problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. By employing a computational experimentation methodology, we model group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the accumulated influence of temporally successive events. Each strategy is examined through three simulation experiments, including two made-up scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and emotion accumulation in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution in extreme incidents, emergency, and security study domains.

  16. Event Shape Sorting: selecting events with similar evolution

    Directory of Open Access Journals (Sweden)

    Tomášik Boris

    2017-01-01

    Full Text Available We present a novel method for the organisation of events. The method is based on comparing event-by-event histograms of a chosen quantity Q that is measured for each particle in every event. The events are organised in such a way that those with similar shapes of their Q-histograms end up placed close to each other. We apply the method to histograms of the azimuthal angle of the produced hadrons in ultrarelativistic nuclear collisions. By selecting events with similar azimuthal shapes of their hadron distribution, one chooses events which likely underwent similar evolution from the initial state to the freeze-out. Such events can more easily be compared to theoretical simulations where all conditions can be controlled. We illustrate the method on data simulated by the AMPT model.
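
    A drastically simplified stand-in for the idea (an assumption of ours: the actual Event Shape Sorting compares full Q-histograms iteratively, not a single scalar) is to order events by a one-number azimuthal shape measure such as the magnitude of the second Fourier coefficient:

```python
import math

def q2(phis):
    """Magnitude of the second Fourier coefficient of the azimuthal
    distribution -- a one-number proxy for an event's azimuthal shape."""
    n = len(phis)
    cx = sum(math.cos(2 * p) for p in phis) / n
    sy = sum(math.sin(2 * p) for p in phis) / n
    return math.hypot(cx, sy)

def sort_events_by_shape(events):
    """Simplified stand-in for Event Shape Sorting: order events so that
    those with similar azimuthal shape end up next to each other.  The
    published method iteratively compares full event-by-event histograms;
    here we merely sort on the scalar shape measure q2."""
    return sorted(events, key=q2)

iso = [k * 2 * math.pi / 8 for k in range(8)]     # isotropic event, q2 ~ 0
narrow = [0.0] * 8                                 # fully aligned event, q2 = 1
print([round(q2(e), 2) for e in sort_events_by_shape([narrow, iso])])
```

    Neighbouring events in the sorted sequence then have similar azimuthal shapes, which is the property one wants before comparing event classes with model simulations.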

  17. ICPP criticality event of October 17, 1978. Facts and sequential description of criticality event and precursor events

    International Nuclear Information System (INIS)

    1979-01-01

    On October 17, during the period of approximately 8:15 to 8:40 p.m., a criticality event occurred in the base of the IB column, H-100. The inventory of medium short-lived fission products used to determine the number of fissions indicates that the criticality occurred in the column H-100 aqueous phase, and the sampling of the column wall with counting of the filings clearly indicates that the event occurred in the column base. The events leading up to the accident are described. The event produced no personnel injury, no on- or off-site contamination, and no damage to equipment or property.

  18. A heavy ion spectrometer system for the measurement of projectile fragmentation of relativistic heavy ions

    International Nuclear Information System (INIS)

    Engelage, J.; Crawford, H.J.; Greiner, L.; Kuo, C.

    1996-06-01

    The Heavy Ion Spectrometer System (HISS) at the LBL Bevalac provided a unique facility for measuring projectile fragmentation cross sections important in deconvolving the Galactic Cosmic Ray (GCR) source composition. The general characteristics of the apparatus specific to this application are described and the main features of the event reconstruction and analysis used in the TRANSPORT experiment are discussed

  19. Social importance enhances prospective memory: evidence from an event-based task.

    Science.gov (United States)

    Walter, Stefan; Meier, Beat

    2017-07-01

    Prospective memory performance can be enhanced by task importance, for example by promising a reward. Typically, this comes at a cost in the ongoing task. However, previous research has suggested that social importance (e.g., providing a social motive) can enhance prospective memory performance without additional monitoring costs in activity-based and time-based tasks. The aim of the present study was to investigate the influence of social importance in an event-based task. We compared four conditions: social importance, promising a reward, both social importance and promising a reward, and standard prospective memory instructions (control condition). The results showed enhanced prospective memory performance for all importance conditions compared to the control condition. Although ongoing task performance was slowed in all conditions with a prospective memory task when compared to a baseline condition with no prospective memory task, additional costs occurred only when social importance and reward were present simultaneously. Alone, neither social importance nor promising a reward produced additional slowing when compared to the cost in the standard (control) condition. Thus, social importance and reward can enhance event-based prospective memory at no additional cost.

  20. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female), performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory perfor...

  1. Identifying Typhoon Tracks based on Event Synchronization derived Spatially Embedded Climate Networks

    Science.gov (United States)

    Ozturk, Ugur; Marwan, Norbert; Kurths, Jürgen

    2017-04-01

    Complex networks are commonly used for investigating the spatiotemporal dynamics of complex systems, e.g. extreme rainfall. Directed networks in particular are very effective tools for identifying climatic patterns on spatially embedded networks. They can capture the network flux, and thus the principal dynamics of the spreading of significant phenomena. Network measures, such as network divergence, bear the source-receptor relations of directed networks. However, it is still a challenge to capture fast-evolving atmospheric events, i.e. typhoons. In this study, we propose a new technique, namely Radial Ranks, to detect the general pattern of typhoons' forward direction based on the strength parameter of event synchronization over Japan. We suggest subsetting a circular zone of high correlation around the selected grid point based on the strength parameter. Radial sums of the strength parameter along vectors within this zone, the radial ranks, are measured for potential directions, which allows us to trace the network flux over long distances. We also employed the delay parameter of event synchronization to identify and separate the individual behaviors of frontal storms and typhoons.
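
    The strength and delay parameters of event synchronization referred to above can be sketched as follows (a fixed-window simplification of the Quiroga et al. measure; the study's actual implementation on gridded rainfall data, with adaptive windows, is more involved):

```python
def event_sync(tx, ty, tau):
    """Event synchronization with a fixed coincidence window tau.
    Returns (Q, q): Q in [0, 1] is the strength of synchronization
    between the two event-time series; q in [-1, 1] is the delay
    asymmetry, positive when events in tx tend to precede those in ty."""
    def count(a, b):
        # number of events in `a` that occur shortly after an event in `b`
        c = 0.0
        for ta in a:
            for tb in b:
                d = ta - tb
                if 0 < d <= tau:
                    c += 1.0
                elif d == 0:
                    c += 0.5  # simultaneous events count half to each side
        return c
    cxy, cyx = count(tx, ty), count(ty, tx)
    norm = (len(tx) * len(ty)) ** 0.5
    return (cxy + cyx) / norm, (cyx - cxy) / norm

# y fires a fixed short delay after x: full strength, x leads y
Q, q = event_sync([1.0, 2.0, 3.0], [1.1, 2.1, 3.1], tau=0.2)
print(Q, q)  # 1.0 1.0
```

    Computed pairwise between grid cells, the strength values form the weighted directed network on which the Radial Ranks zone and radial sums are then evaluated.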

  2. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability than the latter.
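
    The Shannon-entropy backbone of such indices can be sketched in a few lines (illustrative only; the actual GT-STAF IFIs involve joint, conditional and mutual entropies computed over the event-derived Q and F matrices):

```python
import math

def shannon_entropy(freqs):
    """Shannon entropy (in bits) of a relations-frequency vector, the
    basic quantity behind the entropy-based IFIs described above.
    Frequencies are normalized to probabilities; zero counts are skipped."""
    total = sum(freqs)
    probs = [f / total for f in freqs if f > 0]
    return -sum(p * math.log2(p) for p in probs)

# four equiprobable substructure "events" carry log2(4) = 2 bits
print(shannon_entropy([5, 5, 5, 5]))  # 2.0
```

    A molecule whose substructure events are spread evenly over the frequency matrix thus scores maximal entropy, while one dominated by a single substructure class scores near zero, which is what makes the index a dissimilarity-sensitive descriptor.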

  3. Event-based prospective memory in mildly and severely autistic children.

    Science.gov (United States)

    Sheppard, Daniel P; Kvavilashvili, Lia; Ryder, Nuala

    2016-01-01

    There is a growing body of research into the development of prospective memory (PM) in typically developing children, but research is limited in autistic children (Aut) and rarely includes children with more severe symptoms. This study is the first to specifically compare event-based PM in severely autistic children to that in mildly autistic and typically developing children. Fourteen mildly autistic children and 14 severely autistic children, aged 5-13 years, were matched for educational attainment with 26 typically developing children aged 5-6 years. Three PM tasks and a retrospective memory task were administered. Results showed that severely autistic children performed less well than typically developing children on two PM tasks, but mildly autistic children did not differ from either group. No group differences were found on the most motivating (toy reward) task. The findings suggest that naturalistic tasks and motivation are important factors in PM success in severely autistic children and highlight the need to consider the heterogeneity of autism and symptom severity in relation to performance on event-based PM tasks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Framework for event-based semidistributed modeling that unifies the SCS-CN method, VIC, PDM, and TOPMODEL

    Science.gov (United States)

    Bartlett, M. S.; Parolari, A. J.; McDonnell, J. J.; Porporato, A.

    2016-09-01

    Hydrologists and engineers may choose from a range of semidistributed rainfall-runoff models such as VIC, PDM, and TOPMODEL, all of which predict runoff from a distribution of watershed properties. However, these models are not easily compared to event-based data and are missing ready-to-use analytical expressions that are analogous to the SCS-CN method. The SCS-CN method is an event-based model that describes the runoff response with a rainfall-runoff curve that is a function of the cumulative storm rainfall and antecedent wetness condition. Here we develop an event-based probabilistic storage framework and distill semidistributed models into analytical, event-based expressions for describing the rainfall-runoff response. The event-based versions called VICx, PDMx, and TOPMODELx also are extended with a spatial description of the runoff concept of "prethreshold" and "threshold-excess" runoff, which occur, respectively, before and after infiltration exceeds a storage capacity threshold. For total storm rainfall and antecedent wetness conditions, the resulting ready-to-use analytical expressions define the source areas (fraction of the watershed) that produce runoff by each mechanism. They also define the probability density function (PDF) representing the spatial variability of runoff depths that are cumulative values for the storm duration, and the average unit area runoff, which describes the so-called runoff curve. These new event-based semidistributed models and the traditional SCS-CN method are unified by the same general expression for the runoff curve. Since the general runoff curve may incorporate different model distributions, it may ease the way for relating such distributions to land use, climate, topography, ecology, geology, and other characteristics.
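
    The classical SCS-CN rainfall-runoff curve that the framework generalizes can be written in a few lines (standard textbook form with depths in inches and the conventional initial-abstraction ratio of 0.2; the paper's VICx/PDMx/TOPMODELx expressions replace the underlying storage distribution):

```python
def scs_cn_runoff(P, CN, ia_ratio=0.2):
    """Classical SCS-CN event runoff (depths in inches):
        S  = 1000/CN - 10                 (potential retention)
        Ia = ia_ratio * S                 (initial abstraction)
        Q  = (P - Ia)^2 / (P - Ia + S)    for P > Ia, else 0
    The paper shows that its event-based semidistributed models share
    this general runoff-curve form with different storage distributions."""
    S = 1000.0 / CN - 10.0
    Ia = ia_ratio * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

# 4 inches of storm rainfall on a CN = 80 watershed
print(round(scs_cn_runoff(P=4.0, CN=80), 2))  # 2.04
```

    Antecedent wetness enters through the curve number CN (a wetter watershed gets a higher CN, hence smaller retention S and more runoff), which is the event-based behaviour the probabilistic storage framework reproduces and extends.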

  5. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...

  6. Research on Crowdsourcing Emergency Information Extraction Based on Event Frames

    Science.gov (United States)

    Yang, Bo; Wang, Jizhou; Ma, Weijun; Mao, Xi

    2018-01-01

    At present, common information extraction methods cannot accurately extract structured emergency event information, general information retrieval tools cannot completely identify emergency geographic information, and these approaches offer no accurate assessment of their extraction results. This paper therefore proposes an emergency information extraction technique based on an event framework. It mainly includes an emergency information extraction model (EIEM), a complete address recognition method (CARM) and an accuracy evaluation model of emergency information (AEMEI). EIEM can extract emergency information in structured form and compensates for the lack of network data acquisition in emergency mapping. CARM uses a hierarchical model and a shortest-path algorithm and allows toponym fragments to be joined into a full address. AEMEI analyzes the results of the emergency event extraction and summarizes the advantages and disadvantages of the event framework. Experiments show that the event frame technique can solve the emergency information extraction problem and provides reference cases for other applications. When an emergency disaster is about to occur, the relevant departments can query data on emergencies that have occurred in the past and make arrangements in advance for disaster defense and mitigation. The technology can reduce the number of casualties and the property damage in a country and worldwide. This is of great significance to the state and society.

  7. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow the specification, distributed execution and verification of pervasive event

  8. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  9. Distinct and shared cognitive functions mediate event- and time-based prospective memory impairment in normal ageing

    Science.gov (United States)

    Gonneaud, Julie; Kalpouzos, Grégoria; Bon, Laetitia; Viader, Fausto; Eustache, Francis; Desgranges, Béatrice

    2011-01-01

    Prospective memory (PM) is the ability to remember to perform an action at a specific point in the future. Regarded as multidimensional, PM involves several cognitive functions that are known to be impaired in normal aging. In the present study, we set out to investigate the cognitive correlates of PM impairment in normal aging. Manipulating cognitive load, we assessed event- and time-based PM, as well as several cognitive functions, including executive functions, working memory and retrospective episodic memory, in healthy subjects covering the entire span of adulthood. We found that normal aging was characterized by PM decline in all conditions and that event-based PM was more sensitive to the effects of aging than time-based PM. Whatever the conditions, PM was linked to inhibition and processing speed. However, while event-based PM was mainly mediated by binding and retrospective memory processes, time-based PM was mainly related to inhibition. The only distinction between the cognitive correlates of high- and low-load PM lay in an additional, but marginal, correlation between updating and the high-load PM condition. The association of distinct cognitive functions, as well as shared mechanisms, with event- and time-based PM confirms that each type of PM relies on a different set of processes. PMID:21678154

  10. Mind the gap: modelling event-based and millennial-scale landscape dynamics

    NARCIS (Netherlands)

    Baartman, J.E.M.

    2012-01-01

    This research looks at landscape dynamics – erosion and deposition – from two different perspectives: long-term landscape evolution over millennial timescales on the one hand and short-term event-based erosion and deposition on the other. For the first, landscape evolution models (LEMs) are

  11. Investigations of Forbush decreases in the PAMELA experiment

    Science.gov (United States)

    Lagoida, I. A.; Voronov, S. A.; Mikhailov, V. V.

    2017-01-01

    A phenomenon in cosmic ray physics now called a Forbush decrease (FD), or Forbush effect, was discovered by S. Forbush in 1937 [1]: a sudden decrease of galactic cosmic ray (GCR) intensity near the Earth. However, despite long-term investigation, the nature of this phenomenon is still not completely understood. Today this effect is studied mostly with neutron monitors and muon hodoscopes located on the Earth's surface, but these instruments detect only the products of GCR interactions with the Earth's atmosphere. Satellite detectors make it possible to obtain more accurate information about the characteristics of FDs. Examples of FDs registered by the PAMELA telescope and observed with the Oulu neutron monitor are presented. About 10 events with amplitudes of more than 3% were registered from 2006 to 2016 in the PAMELA experiment.

  12. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  13. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
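    A minimal sketch of the probability-replacement idea (hypothetical tree and numbers; PREDICTOR's actual event tree and fault tree models are far richer):

```python
def evaluate(node, probs):
    """Recursively evaluate a fault tree, assuming independent basic events.
    Nodes are ("basic", name), ("and", [children]) or ("or", [children])."""
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    children = [evaluate(c, probs) for c in node[1]]
    if kind == "and":                      # all children must fail
        p = 1.0
        for c in children:
            p *= c
        return p
    q = 1.0                                # "or": at least one child fails
    for c in children:
        q *= (1.0 - c)
    return 1.0 - q

tree = ("or", [("basic", "pump_a"),
               ("and", [("basic", "pump_b"), ("basic", "valve")])])
prior = {"pump_a": 0.01, "pump_b": 0.05, "valve": 0.02}
updated = dict(prior, pump_a=0.5)          # off-normal condition reported
print(evaluate(tree, prior), evaluate(tree, updated))
```

    Replacing a prior with a situation-specific probability and re-evaluating the tree mirrors, in miniature, how the off-the-shelf probabilities are dynamically swapped and propagated.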

  14. Gas-cooled reactors for advanced terrestrial applications

    International Nuclear Information System (INIS)

    Kesavan, K.; Lance, J.R.; Jones, A.R.; Spurrier, F.R.; Peoples, J.A.; Porter, C.A.; Bresnahan, J.D.

    1986-01-01

    The conceptual design of a power plant based on an inert-gas-cooled nuclear reactor coupled to an open-air Brayton power conversion cycle is presented. The power system, called the Westinghouse GCR/ATA (Gas-Cooled Reactors for Advanced Terrestrial Applications), is designed to meet modern military needs and offers the advantages of secure, reliable, and safe electrical power. The GCR/ATA concept is adaptable over a range of 1 to 10 MWe power output. Design descriptions of a compact, air-transportable forward-base unit for 1 to 3 MWe output and a fixed-base, permanent installation for 3 to 10 MWe output are presented.

  15. Are there persistent physical atmospheric responses to galactic cosmic rays?

    International Nuclear Information System (INIS)

    Benestad, Rasmus E

    2013-01-01

    Variations in the annual mean of the galactic cosmic ray flux (GCR) are compared with annual variations in the most common meteorological variables: temperature, mean sea-level barometric pressure, and precipitation statistics. A multiple regression analysis was used to explore the potential for a GCR response on timescales longer than a year and to identify ‘fingerprint’ patterns in time and space associated with GCR as well as greenhouse gas (GHG) concentrations and the El Niño–Southern Oscillation (ENSO). The response pattern associated with GCR consisted of a negative temperature anomaly that was limited to parts of eastern Europe, and a weak anomaly in the sea-level pressure (SLP), but coincided with higher pressure over the Norwegian Sea. It had a similarity to the North Atlantic Oscillation (NAO) in the northern hemisphere and a wave train in the southern hemisphere. A set of Monte Carlo simulations nevertheless indicated that the weak amplitude of the global mean temperature response associated with GCR could easily be due to chance (p-value = 0.6), and there has been no trend in the GCR. Hence, there is little empirical evidence that links GCR to the recent global warming. (letter)

  16. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving becomes important for increasing the lifecycle of the network. Data fusion combines information from several sources to provide a unified scenario, which can significantly save sensor energy and enhance sensing accuracy. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which significantly reduces the energy consumption of intracluster communication. The distances between cluster heads and the event, and the energy of the clusters, are fuzzified so that fuzzy logic can be used to select the clusters that will participate in data uploading and fusion. Fuzzy logic is also used by the cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final event decision is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters, calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
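    A toy sketch of the fuzzy cluster-selection step (membership functions, thresholds, and cluster data all invented for illustration; the paper's k-means and decision-level fusion stages are omitted):

```python
def membership_low(x, lo, hi):
    """Triangular-style membership: 1 at/below lo, falling to 0 at/above hi."""
    if x <= lo:
        return 1.0
    if x >= hi:
        return 0.0
    return (hi - x) / (hi - lo)

def select_clusters(clusters, max_dist=100.0, max_energy=1.0):
    """Fuzzy-AND (min) of 'close to the event' and 'high residual energy';
    clusters scoring above 0.5 participate in uploading and fusion."""
    chosen = []
    for name, dist, energy in clusters:
        close = membership_low(dist, 10.0, max_dist)
        energetic = 1.0 - membership_low(energy, 0.1 * max_energy, max_energy)
        if min(close, energetic) > 0.5:
            chosen.append(name)
    return chosen

# (name, distance to event, residual energy) -- synthetic cluster heads
clusters = [("c1", 20.0, 0.9), ("c2", 95.0, 0.8), ("c3", 15.0, 0.15)]
print(select_clusters(clusters))
```

    Here the far-away cluster and the energy-depleted cluster are excluded, so they spend no energy on fusion traffic.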

  17. 2-D Modelling of Long Period Variations of Galactic Cosmic Ray Intensity

    International Nuclear Information System (INIS)

    Siluszyk, M; Iskra, K; Alania, M

    2015-01-01

    A new two-dimensional (2-D) time dependent model describing long-period variations of the Galactic Cosmic Ray (GCR) intensity has been developed. New approximations for the changes of the magnitude B of the Interplanetary Magnetic Field (IMF), the tilt angle δ of the Heliospheric Neutral Sheet (HNS), and drift effects of the GCR particles have been included in the model. Moreover, temporal changes of the exponent γ expressing the power-law rigidity dependence of the amplitudes of the 11-year variation of the GCR intensity have been added. We show that changes of the expected GCR particle density precede changes of the GCR intensity measured by the Moscow neutron monitor by about 18 months. Thus ∼18 months can be taken as an effective delay time between the expected intensity, caused by the combined influence of the changes of the parameters implemented in the time-dependent 2-D model, and the GCR intensity measured by neutron monitors during solar cycle 21. (paper)
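    The quoted ~18-month delay is the kind of quantity a simple lagged-correlation scan can estimate. A sketch on synthetic series (the 18-sample shift and the series themselves are invented):

```python
import math

def best_lag(x, y, max_lag):
    """Return the lag (in samples) at which x, delayed, best matches y,
    using the Pearson correlation over the overlapping part."""
    def score(lag):
        xs, ys = x[:len(x) - lag], y[lag:]
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        num = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        dx = sum((a - mx) ** 2 for a in xs) ** 0.5
        dy = sum((b - my) ** 2 for b in ys) ** 0.5
        return num / (dx * dy) if dx and dy else 0.0
    return max(range(max_lag + 1), key=score)

model = [math.sin(0.1 * t) for t in range(200)]                 # "expected" density
observed = [math.sin(0.1 * (t - 18)) for t in range(200)]       # delayed by 18 samples
print(best_lag(model, observed, 40))
```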

  18. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
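    A plain p-chart sketch of the attribute-control-chart idea (synthetic yearly fractions; the paper's binomial Markov extended process, which accounts for autocorrelation between extreme days, is not reproduced here):

```python
def p_chart_limits(fractions, n):
    """Attribute (p) control chart: centre line and 3-sigma limits for the
    annual fraction of extreme days, assuming approximately binomial counts."""
    pbar = sum(fractions) / len(fractions)
    sigma = (pbar * (1 - pbar) / n) ** 0.5
    return max(0.0, pbar - 3 * sigma), pbar, min(1.0, pbar + 3 * sigma)

# Yearly fractions of "extreme" days out of n = 365 observations (synthetic).
history = [0.05, 0.06, 0.04, 0.05, 0.07, 0.05]
lcl, centre, ucl = p_chart_limits(history, 365)
new_year = 0.11
print(lcl, centre, ucl)
print("signal" if not (lcl <= new_year <= ucl) else "in control")
```

    A year whose fraction of extreme days falls outside the limits signals a change relative to the reference period, which is the basic monitoring logic the study builds on.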

  19. Pull-Based Distributed Event-Triggered Consensus for Multiagent Systems With Directed Topologies.

    Science.gov (United States)

    Yi, Xinlei; Lu, Wenlian; Chen, Tianping

    2017-01-01

    This paper investigates the consensus problem with pull-based event-triggered feedback control. For each agent, the diffusion coupling feedback is based on the states of its in-neighbors at its latest triggering time, and the next triggering time of the agent is determined by its in-neighbors' information. General directed topologies, including the irreducible and reducible cases, are investigated. The scenario of distributed continuous communication is considered first. It is proved that if the network topology has a spanning tree, then the event-triggered coupling algorithm realizes consensus for the multiagent system. The results are then extended to discontinuous communication, i.e., self-triggered control, where each agent computes its next triggering time in advance without having to observe the system's states continuously. The effectiveness of the theoretical results is finally illustrated by a numerical example.
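    A much-simplified sketch of event-triggered consensus on a directed cycle (the trigger rule and all parameters are invented; the paper's pull-based rule, which uses in-neighbour information to set triggering times, is more involved):

```python
def consensus(adj, x, steps=200, dt=0.05, eps=1e-3):
    """Each agent moves toward the last *broadcast* states of its in-neighbours;
    an agent rebroadcasts only when it has drifted more than eps since its
    last trigger (a much-simplified event-triggering rule)."""
    n = len(x)
    broadcast = list(x)
    triggers = 0
    for _ in range(steps):
        dx = [sum(broadcast[j] - x[i] for j in adj[i]) for i in range(n)]
        for i in range(n):
            x[i] += dt * dx[i]
            if abs(x[i] - broadcast[i]) > eps:
                broadcast[i] = x[i]
                triggers += 1
    return x, triggers

adj = {0: [2], 1: [0], 2: [1]}   # directed cycle: contains a spanning tree
state, triggers = consensus(adj, [0.0, 1.0, 2.0])
print(state, triggers)
```

    Because the cycle contains a spanning tree, the states contract toward a common value while broadcasts stop once agents stay within the eps band.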

  20. wayGoo recommender system: personalized recommendations for events scheduling, based on static and real-time information

    Science.gov (United States)

    Thanos, Konstantinos-Georgios; Thomopoulos, Stelios C. A.

    2016-05-01

    wayGoo is a fully functional application whose main functionalities include content geolocation, event scheduling, and indoor navigation. However, significant information about events does not reach users' attention, either because of the volume of this information or because some of it comes from real-time data sources. The purpose of this work is to facilitate event management by prioritizing the presented events based on users' interests, using both static and real-time data. Through the wayGoo interface, users select conceptual topics that interest them. These topics constitute a browsing-behavior vector which is used to learn users' interests implicitly, without being intrusive. The system then estimates user preferences and returns an event list sorted from the most preferred to the least. User preferences are modeled via a naïve Bayesian network which consists of: a) the 'decision' random variable, corresponding to users' decision on attending an event; b) the 'distance' random variable, modeled by a linear regression that estimates the probability that the distance between a user and each event destination is not discouraging; c) the 'seat availability' random variable, modeled by a linear regression that estimates the probability that the seat availability is encouraging; and d) the 'relevance' random variable, modeled by clustering-based collaborative filtering, which determines the relevance of each event to users' interests. Finally, experimental results show that the proposed system contributes essentially to assisting users in browsing and selecting events to attend.
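    A toy weighted-product ranking in the spirit of the described preference model (all probabilities and weights are invented, and this is a simple score, not the paper's actual Bayesian network):

```python
def rank_events(events, weights=(0.5, 0.3, 0.2)):
    """Score each event by a weighted product of the probabilities that
    distance, seat availability, and interest relevance each favour
    attendance, then sort descending (toy model)."""
    w_dist, w_seat, w_rel = weights
    scored = []
    for name, p_dist, p_seat, p_rel in events:
        score = (p_dist ** w_dist) * (p_seat ** w_seat) * (p_rel ** w_rel)
        scored.append((score, name))
    return [name for score, name in sorted(scored, reverse=True)]

# (name, P(distance ok), P(seats ok), P(relevant)) -- synthetic events
events = [("concert", 0.9, 0.8, 0.4),
          ("lecture", 0.7, 0.9, 0.9),
          ("fair", 0.3, 0.5, 0.2)]
print(rank_events(events))
```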

  1. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on advanced control for autonomous microgrids. In order to improve performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated...

  2. Measurement of the underlying event using track-based event shapes in Z→l{sup +}l{sup -} events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-momentum energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire), located near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb{sup -1}) were recorded in 2011 with the ATLAS experiment. Events in which a Z-boson was produced in the hard sub-process and subsequently decayed into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons of the Z-decay; this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called 'pile-up' (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called 'Underlying Event' and can be compared with predictions of Monte Carlo event generators directly, i.e. without the need to run time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters in a technique called 'tuning'. It became apparent, however, that the underlying Sjoestrand-Zijl model is unable to give a good description of the measured event-shape distributions.
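    As an illustration of a track-based event shape, here is a sketch computing transverse sphericity from charged-track momenta (a related but simpler observable than those measured in the thesis; the tracks are invented):

```python
import math

def transverse_sphericity(tracks):
    """Tracks are (px, py) pairs; build the 2x2 transverse momentum tensor
    and return S_T = 2*lambda_2 / (lambda_1 + lambda_2)."""
    sxx = sum(px * px for px, py in tracks)
    syy = sum(py * py for px, py in tracks)
    sxy = sum(px * py for px, py in tracks)
    tr = sxx + syy
    disc = math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2)
    lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2
    return 2 * lam2 / (lam1 + lam2)

# Pencil-like (back-to-back) event -> S_T ~ 0; isotropic event -> S_T ~ 1.
print(transverse_sphericity([(10, 0), (-10, 0)]))
print(transverse_sphericity([(5, 0), (-5, 0), (0, 5), (0, -5)]))
```

    Soft activity from the Underlying Event tends to push such shape variables away from the back-to-back limit, which is what makes them sensitive probes.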

  3. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences.

  4. Cosmic rays linked to rapid mid-latitude cloud changes

    Directory of Open Access Journals (Sweden)

    B. A. Laken

    2010-11-01

    The effect of the Galactic Cosmic Ray (GCR) flux on Earth's climate is highly uncertain. Using a novel sampling approach based around observing periods of significant cloud changes, a statistically robust relationship is identified between short-term GCR flux changes and the most rapid mid-latitude (60°–30° N/S) cloud decreases operating over daily timescales; this signal is verified in surface-level air temperature (SLAT) reanalysis data. A General Circulation Model (GCM) experiment is used to test the causal relationship of the observed cloud changes to the detected SLAT anomalies. Results indicate that the anomalous cloud changes were responsible for producing the observed SLAT changes, implying that if there is a causal relationship between significant decreases in the rate of GCR flux (~0.79 GU, where GU denotes a change of 1% of the 11-year solar cycle amplitude in four days) and decreases in cloud cover (~1.9 CU, where CU denotes a change of 1% cloud cover in four days), an increase in SLAT (~0.05 KU, where KU denotes a temperature change of 1 K in four days) can be expected. The influence of GCRs is clearly distinguishable from changes in solar irradiance and the interplanetary magnetic field. However, the results of the GCM experiment are found to be somewhat limited by the ability of the model to successfully reproduce observed cloud cover. These results provide perhaps the most compelling evidence presented thus far of a GCR-climate relationship. From this analysis we conclude that a GCR-climate relationship is governed by both short-term GCR changes and internal atmospheric precursor conditions.
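    A sketch of the Monte Carlo significance test underlying such composite (epoch-superposed) analyses, on a synthetic noise series with invented key dates (not the paper's data):

```python
import random

def composite_p_value(series, key_dates, n_trials=2000, seed=1):
    """Significance of a composite mean anomaly: compare the mean of `series`
    at `key_dates` against means over randomly drawn date sets."""
    observed = sum(series[d] for d in key_dates) / len(key_dates)
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        sample = rng.sample(range(len(series)), len(key_dates))
        fake = sum(series[d] for d in sample) / len(key_dates)
        if abs(fake) >= abs(observed):
            hits += 1
    return hits / n_trials

rng = random.Random(0)
noise = [rng.gauss(0.0, 1.0) for _ in range(1000)]   # pure-noise "temperature" series
p = composite_p_value(noise, [10, 200, 450, 700, 900])
print(p)
```

    A large p-value on pure noise, as here, is the same logic by which the paper's Monte Carlo test judged whether a composite anomaly could be due to chance.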

  5. Influence of Secondary Cooling Mode on Solidification Structure and Macro-segregation Behavior for High-carbon Continuous Casting Bloom

    Science.gov (United States)

    Dou, Kun; Yang, Zhenguo; Liu, Qing; Huang, Yunhua; Dong, Hongbiao

    2017-07-01

    A cellular automaton-finite element coupling model for high-carbon continuously cast bloom of GCr15 steel is established to simulate the solidification structure and to investigate the influence of different secondary cooling modes on characteristic parameters such as the equiaxed crystal ratio, grain size, and secondary dendrite arm spacing, with the effects of phase transformation and electromagnetic stirring taken into consideration. On this basis, the evolution of carbon macro-segregation for GCr15 steel bloom is investigated via industrial tests. Based on the above analysis, the relationship among secondary cooling modes, the characteristic parameters of the solidification structure, and carbon macro-segregation is illustrated to obtain an optimum secondary cooling strategy and to alleviate the degree of carbon macro-segregation for GCr15 steel bloom in the continuous casting process. The evaluation method for element macro-segregation is applicable to various steel types.

  6. MadEvent: automatic event generation with MadGraph

    International Nuclear Information System (INIS)

    Maltoni, Fabio; Stelzer, Tim

    2003-01-01

    We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)

  7. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    This paper suggests an event-based scenario manager capable of creating and editing a scenario for shipbuilding process simulation based on multibody dynamics. To configure the various situations in shipyards and connect easily with multibody dynamics, the proposed method has two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel, such as a body or joint. The user can make up a scenario by combining actors. The Action List contains the information for arranging and executing the actors. Since the shipbuilding process is an event-based sequence, all simulation models were configured using the Discrete EVent System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards, such as the lifting and erection of a block and heavy-load lifting operations using multiple cranes.

  8. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events, and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, so complicated structural models can be used. This is particularly beneficial for intermediate-size events registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
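    A bare-bones sketch of matched-filter detection with a stacked correlation and a noise-relative threshold (single station, synthetic template and noise; the paper's method stacks strain Green's tensor correlations over station groups):

```python
import math
import random

def matched_filter_detect(record, template, threshold_sigma=5.0):
    """Slide a template over a continuous record, form the correlation
    (detection function), and flag samples exceeding threshold_sigma times
    the detection function's own standard deviation above its mean."""
    m = len(template)
    corr = [sum(record[i + k] * template[k] for k in range(m))
            for i in range(len(record) - m + 1)]
    mean = sum(corr) / len(corr)
    std = (sum((c - mean) ** 2 for c in corr) / len(corr)) ** 0.5
    return [i for i, c in enumerate(corr) if c - mean > threshold_sigma * std]

rng = random.Random(2)
template = [math.exp(-0.2 * t) * math.sin(0.8 * t) for t in range(30)]
record = [0.05 * rng.gauss(0, 1) for _ in range(500)]
for k, v in enumerate(template):          # bury one event at sample 240
    record[240 + k] += v
print(matched_filter_detect(record, template))
```

    The buried waveform is recovered even though it never rises far above the noise at any single sample, which is the point of full-waveform detection for non-impulsive sources.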

  9. OLTARIS: An Efficient Web-Based Tool for Analyzing Materials Exposed to Space Radiation

    Science.gov (United States)

    Slaba, Tony; McMullen, Amelia M.; Thibeault, Sheila A.; Sandridge, Chris A.; Clowdsley, Martha S.; Blatting, Steve R.

    2011-01-01

    The near-Earth space radiation environment includes energetic galactic cosmic rays (GCR), high intensity proton and electron belts, and the potential for solar particle events (SPE). These sources may penetrate shielding materials and deposit significant energy in sensitive electronic devices on board spacecraft and satellites. Material and design optimization methods may be used to reduce the exposure and extend the operational lifetime of individual components and systems. Since laboratory experiments are expensive and may not cover the range of particles and energies relevant for space applications, such optimization may be done computationally with efficient algorithms that include the various constraints placed on the component, system, or mission. In the present work, the web-based tool OLTARIS (On-Line Tool for the Assessment of Radiation in Space) is presented, and the applicability of the tool for rapidly analyzing exposure levels within either complicated shielding geometries or user-defined material slabs exposed to space radiation is demonstrated. An example approach for material optimization is also presented. Slabs of various advanced multifunctional materials are defined and exposed to several space radiation environments. The materials and thicknesses defining each layer in the slab are then systematically adjusted to arrive at an optimal slab configuration.

  10. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulating train movement with an energy-saving factor taken into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains must be accelerated or braked frequently to control the headway distance, leading to more energy consumption. (general)
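    A minimal discrete event sketch of trains traversing a single line divided into blocks (all parameters invented; the paper's model additionally treats acceleration, braking, and energy consumption):

```python
import heapq

def simulate(n_trains, n_blocks, run_time=4.0, headway=1.0):
    """Discrete event simulation of trains traversing successive blocks of a
    single line; a train may enter a block only `headway` after the train
    ahead has released it. Events are (time, train, block) tuples in a heap."""
    release = [0.0] * n_blocks          # time each block becomes free
    arrivals = {}
    queue = [(i * headway, i, 0) for i in range(n_trains)]
    heapq.heapify(queue)
    while queue:
        t, train, block = heapq.heappop(queue)
        t = max(t, release[block])      # wait if the block is still occupied
        done = t + run_time
        release[block] = done + headway
        if block + 1 < n_blocks:
            heapq.heappush(queue, (done, train, block + 1))
        else:
            arrivals[train] = done
    return arrivals

print(simulate(n_trains=3, n_blocks=4))
```

    The growing gap between successive arrival times shows the cumulative delay that the preceding trains impose on the following ones.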

  11. Arachne - A web-based event viewer for MINERvA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.

    2011-01-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  12. Arachne—A web-based event viewer for MINERνA

    International Nuclear Information System (INIS)

    Tagg, N.; Brangham, J.; Chvojka, J.; Clairemont, M.; Day, M.; Eberly, B.; Felix, J.; Fields, L.; Gago, A.M.; Gran, R.; Harris, D.A.; Kordosky, M.; Lee, H.; Maggi, G.; Maher, E.; Mann, W.A.; Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A.

    2012-01-01

    Neutrino interaction events in the MINERνA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERνA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  13. Arachne - A web-based event viewer for MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N.; /Otterbein Coll.; Brangham, J.; /Otterbein Coll.; Chvojka, J.; /Rochester U.; Clairemont, M.; /Otterbein Coll.; Day, M.; /Rochester U.; Eberly, B.; /Pittsburgh U.; Felix, J.; /Guanajuato U.; Fields, L.; /Northwestern U.; Gago, A.M.; /Lima, Pont. U. Catolica; Gran, R.; /Maryland U.; Harris, D.A.; /Fermilab /William-Mary Coll.

    2011-11-01

    Neutrino interaction events in the MINERvA detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINERvA to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  14. Mining web-based data to assess public response to environmental events

    International Nuclear Information System (INIS)

    Cha, YoonKyung; Stow, Craig A.

    2015-01-01

    We explore how the analysis of web-based data, such as Twitter and Google Trends, can be used to assess the social relevance of an environmental accident. The concept and methods are applied to the shutdown of the drinking water supply in the city of Toledo, Ohio, USA. Toledo's notice, which persisted from August 1 to 4, 2014, was a high-profile event that directly affected approximately half a million people and received wide recognition. The notice was given when excessive levels of microcystin, a byproduct of cyanobacteria blooms, were discovered at the drinking water treatment plant on Lake Erie. Twitter mining results illustrated the instant response to the Toledo incident, the associated collective knowledge, and public perception. The results from Google Trends, on the other hand, revealed how the Toledo event raised public attention to the associated environmental issue, harmful algal blooms, in a long-term context. Thus, when jointly applied, Twitter and Google Trends analyses offer complementary perspectives. Web content aggregated through mining approaches provides a social standpoint, such as public perception and interest, and offers context for establishing and evaluating environmental management policies. - The joint application of Twitter and Google Trends analysis to an environmental event offered both short- and long-term patterns of public perception of and interest in the event.

  15. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  16. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop-type manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any changes that occur in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs, or the net profit can be analysed. And this can be done before the changes are made, without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object-oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.

  17. A Hospital Nursing Adverse Events Reporting System Project: An Approach Based on the Systems Development Life Cycle.

    Science.gov (United States)

    Cao, Yingjuan; Ball, Marion

    2017-01-01

    Based on the Systems Development Life Cycle, a hospital-based nursing adverse event reporting system was developed, implemented and integrated with the current Hospital Information System (HIS). Besides the positive outcomes in terms of timeliness and efficiency, this approach has brought an enormous change in how nurses report, analyze and respond to adverse events.

  18. Effect of Aluminum Alloying on the Hot Deformation Behavior of Nano-bainite Bearing Steel

    Science.gov (United States)

    Yang, Z. N.; Dai, L. Q.; Chu, C. H.; Zhang, F. C.; Wang, L. W.; Xiao, A. P.

    2017-12-01

    Interest in using aluminum in nano-bainite steel, especially for high-carbon bearing steel, is gradually growing. In this study, GCr15SiMo and GCr15SiMoAl steels are used to investigate the effect of Al alloying on the hot deformation behavior of bearing steel. Results show that the addition of Al not only notably increases the flow stress of the steel, owing to the strong strengthening effect of Al on the austenite phase, but also accelerates strain softening through its effect of increasing the stacking fault energy. Al alloying also increases the activation energy of deformation. Two constitutive equations with an accuracy higher than 0.99 are proposed. The constructed processing maps show expanded instability regions for GCr15SiMoAl steel compared with GCr15SiMo steel. This finding is consistent with the occurrence of cracking on the GCr15SiMoAl specimens, revealing that Al alloying reduces the high-temperature plasticity of the bearing steel. Conversely, GCr15SiMoAl steel possesses a smaller grain size than GCr15SiMo steel, manifesting a positive effect of Al on bearing steel. Attention should therefore be paid to the hot working process of bearing steel alloyed with Al.
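
    Constitutive equations for hot deformation are commonly of the sinh-type Arrhenius form, strain rate = A·[sinh(ασ)]^n·exp(−Q/RT), summarized by the Zener-Hollomon parameter Z = ε̇·exp(Q/RT). The sketch below inverts that relation for flow stress; all constants are made-up illustrations, not the fitted values for the GCr15SiMo steels.

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def flow_stress(strain_rate, T, Q, A, alpha, n):
        """Flow stress (MPa) from the sinh-type Arrhenius constitutive
        relation via the Zener-Hollomon parameter Z = strain_rate*exp(Q/RT)."""
        Z = strain_rate * math.exp(Q / (R * T))
        return math.asinh((Z / A) ** (1.0 / n)) / alpha

    # Toy constants: Q in J/mol, alpha in 1/MPa; higher Q (as Al alloying
    # causes) raises Z and hence the predicted flow stress.
    sigma = flow_stress(strain_rate=0.1, T=1273.0, Q=400e3,
                        A=1e14, alpha=0.012, n=5.0)
    print(round(sigma, 1))
    ```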

  19. Microbial Response to Spaceflight Conditions

    OpenAIRE

    Moeller, R.

    2017-01-01

    Space radiation, including Galactic Cosmic Rays (GCR) and Solar Particle Events (SPE), represents a major hazard for biological systems beyond Earth. Spores of Bacillus subtilis have been shown to be suitable dosimeters for probing extreme terrestrial and extraterrestrial environmental conditions in astrobiological and environmental studies. During dormancy spores are metabolically inactive; thus substantial DNA, protein, tRNA and ribosome damage can accumulate while the spo...

  20. Combined adaptive multiple subtraction based on optimized event tracing and extended wiener filtering

    Science.gov (United States)

    Tan, Jun; Song, Peng; Li, Jinshan; Wang, Lei; Zhong, Mengxuan; Zhang, Xiaobo

    2017-06-01

    The surface-related multiple elimination (SRME) method is based on a feedback formulation and has become one of the most widely preferred multiple suppression methods. However, there are differences between the predicted multiples and those in the source seismic records, so conventional adaptive multiple subtraction methods are often barely able to suppress multiples effectively in actual production. This paper introduces a combined adaptive multiple attenuation method based on an optimized event tracing technique and extended Wiener filtering. The method first uses the multiple records predicted by SRME to generate a multiple velocity spectrum, then separates the original record into an approximate primary record and an approximate multiple record by applying the optimized event tracing method and short-time-window FK filtering. After applying the extended Wiener filtering method, residual multiples in the approximate primary record can be eliminated and the damaged primary can be restored from the approximate multiple record. This approach combines the advantages of multiple elimination based on optimized event tracing with those of the extended Wiener filtering technique. It is well suited to suppressing typical hyperbolic and other types of multiples while minimizing damage to the primary. Synthetic and field data tests show that this method produces better multiple elimination results than the traditional multi-channel Wiener filter method and is more suitable for multiple elimination in complicated geological areas.
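
    The matching-filter step of Wiener-style adaptive subtraction can be sketched as a least-squares problem: estimate a short filter f that shapes the predicted multiple m to the multiple present in the recorded trace d, then subtract. This minimal single-channel example uses illustrative spike traces and omits the event-tracing and FK-filtering stages of the actual method.

    ```python
    import numpy as np

    n, nf = 200, 5
    primary = np.zeros(n); primary[40] = 1.0     # toy primary event
    true_h = np.array([0.0, 0.6, -0.2])          # unknown distortion of the multiple
    m = np.zeros(n); m[120] = 1.0                # SRME-style predicted multiple
    d = primary + np.convolve(m, true_h)[:n]     # recorded trace

    # Convolution matrix M so that M @ f == np.convolve(m, f)[:n]
    M = np.column_stack([np.roll(m, k) for k in range(nf)])
    for k in range(1, nf):
        M[:k, k] = 0.0                           # zero the wrap-around samples

    f, *_ = np.linalg.lstsq(M, d, rcond=None)    # Wiener matching filter
    d_primary = d - M @ f                        # multiple-subtracted trace

    print(np.round(f, 3))                        # recovers true_h (zero-padded)
    ```

    Because the primary at sample 40 lies outside the column space of M, it survives the subtraction untouched, which is the minimum-damage property the abstract emphasizes.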

  1. Event Displays for the Visualization of CMS Events

    CERN Document Server

    Jones, Christopher Duncan

    2010-01-01

    During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was, by design, directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user feedback and user interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering as well as collection-level filtering is supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.
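
    The plugin-registration scheme described above can be sketched as a simple type-keyed registry: converters from data collections to visual proxies register themselves by collection type, and the display looks them up at run time. The names here are illustrative, not the actual Fireworks API.

    ```python
    # Registry mapping collection type -> proxy-builder function.
    PROXY_BUILDERS = {}

    def register_proxy_builder(collection_type):
        """Decorator registering a converter for one collection type."""
        def deco(fn):
            PROXY_BUILDERS[collection_type] = fn
            return fn
        return deco

    @register_proxy_builder("Track")
    def build_track_proxy(item):
        return {"shape": "line", "pt": item["pt"]}

    @register_proxy_builder("CaloCluster")
    def build_cluster_proxy(item):
        return {"shape": "tower", "energy": item["energy"]}

    def visualize(event):
        """Convert every item of every collection via its registered plugin."""
        return [PROXY_BUILDERS[ctype](item)
                for ctype, items in event.items() for item in items]

    # Toy "event" with two collections keyed by type.
    event = {"Track": [{"pt": 25.0}], "CaloCluster": [{"energy": 40.0}]}
    print(visualize(event))
    ```

    Keeping the converters behind a registry is what lets the same extraction code run both standalone and, eventually, as a service inside the full framework.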

  2. Event Display for the Visualization of CMS Events

    Science.gov (United States)

    Bauerdick, L. A. T.; Eulisse, G.; Jones, C. D.; Kovalskyi, D.; McCauley, T.; Mrak Tadel, A.; Muelmenstaedt, J.; Osborne, I.; Tadel, M.; Tu, Y.; Yagil, A.

    2011-12-01

    During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was, by design, directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user feedback and user interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering as well as collection-level filtering is supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.

  3. Event Display for the Visualization of CMS Events

    International Nuclear Information System (INIS)

    Bauerdick, L A T; Eulisse, G; Jones, C D; McCauley, T; Osborne, I; Kovalskyi, D; Tadel, A Mrak; Muelmenstaedt, J; Tadel, M; Tu, Y; Yagil, A

    2011-01-01

    During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was, by design, directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user feedback and user interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering as well as collection-level filtering is supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.

  4. Emerging Radiation Health-Risk Mitigation Technologies

    International Nuclear Information System (INIS)

    Wilson, J.W.; Cucinotta, F.A.; Schimmerling, W.

    2004-01-01

    Past space missions beyond the confines of the Earth's protective magnetic field have been of short duration, and protection from the effects of solar particle events was of primary concern. The extension of operational infrastructure beyond low-Earth orbit to enable routine access to more interesting regions of space will require protection from the hazards of accumulated exposure to Galactic Cosmic Rays (GCR). There are significant challenges in providing protection from long-duration exposure to GCR: the human risks of these exposures are highly uncertain, and safety requirements place unreasonable demands on supplying sufficient shielding materials in the design. A vigorous approach to future radiation health-risk mitigation requires a triage of techniques (using biological and technical factors) and a reduction of the uncertainty in radiation risk models. The present paper discusses the triage of factors for risk mitigation with the associated materials issues and engineering design methods

  5. A summary of recent results from the GRAPES-3 experiment

    Directory of Open Access Journals (Sweden)

    Gupta S.K.

    2017-01-01

    Full Text Available The GRAPES-3 experiment is a combination of a high-density extensive air shower (EAS) array of nearly 400 plastic scintillator detectors and a large 560 m2 area tracking muon telescope with an energy threshold Eμ > 1 GeV. GRAPES-3 has been operating continuously in Ooty, India since 2000. By accurately correcting for the effects of atmospheric pressure and temperature, the muon telescope provides a high-precision directional survey of the galactic cosmic ray (GCR) intensity. This telescope has been used to observe the acceleration of muons during thunderstorm events. The recent discovery of a transient weakening of the Earth's magnetic shield through the detection of a GCR burst was the highlight of the GRAPES-3 results. A major expansion is ongoing to further enhance the capability of the GRAPES-3 muon telescope by doubling its area.

  6. The study of lymphocytes glucocorticoid receptor in severe head injury

    International Nuclear Information System (INIS)

    Li Dapei; Wang Haodan; Zhao Qihuang

    1994-01-01

    Glucocorticoid receptors (GCR) of peripheral lymphocytes from 14 patients with severe head injury and 11 normal volunteers were studied by means of the single-point method of radioligand binding assay. All patients received surgical therapy and glucocorticoid at routine dosage. The results show that the GCR level of these patients is lower than that of normal subjects, while the plasma cortisol level is much higher. These changes correlate closely with the patients' clinical outcome. It is indicated that the GCR level can reflect the degree of stress in these patients and their response to glucocorticoid therapy. Using peripheral lymphocytes instead of brain biopsy for the measurement of GCR can reflect the GCR changes of brain tissue; it is more convenient to obtain the sample and more acceptable to the patients

  7. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing. ... 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection...

  8. Event-triggered hybrid control based on multi-Agent systems for Microgrids

    DEFF Research Database (Denmark)

    Dou, Chun-xia; Liu, Bin; Guerrero, Josep M.

    2014-01-01

    This paper is focused on a multi-agent-system-based event-triggered hybrid control for intelligently restructuring the operating mode of a microgrid (MG) to ensure energy supply with high security, stability and cost effectiveness. Because the microgrid is composed of different types of distributed energy resources, it is a typical hybrid dynamic network. Considering the complex hybrid behaviors, a hierarchical decentralized coordinated control scheme is first constructed based on a multi-agent system; then, the hybrid model of the microgrid is built by using differential hybrid Petri...

  9. Fluence-based and microdosimetric event-based methods for radiation protection in space

    International Nuclear Information System (INIS)

    Curtis, S.B.

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report No. 137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/linear energy transfer (LET) method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the conventional method these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, concepts such as equivalent dose and organ dose equivalent are discussed and various problems regarding the measurement or estimation of these quantities are presented. (author)
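
    For contrast with the two novel methodologies, the conventional quality factor/LET method evaluates dose equivalent as H = Σ Q(L)·D(L) over the LET spectrum. The sketch below uses the ICRP 60 piecewise Q(L) relationship; the three-bin "spectrum" is made-up for illustration.

    ```python
    import math

    def q_icrp60(L):
        """ICRP 60 quality factor vs. unrestricted LET L in keV/um."""
        if L < 10:
            return 1.0
        if L <= 100:
            return 0.32 * L - 2.2
        return 300.0 / math.sqrt(L)

    # Illustrative LET bins (keV/um) and absorbed doses (mGy).
    spectrum = [(0.3, 0.50), (25.0, 0.10), (150.0, 0.02)]
    H = sum(q_icrp60(L) * D for L, D in spectrum)   # dose equivalent, mSv
    print(round(H, 3))
    ```

    Fluence-based and microdosimetric event-based methods replace the Q(L) weighting with fluence-derived or lineal-energy-derived weights, respectively, but the bookkeeping over the measured spectrum is analogous.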

  10. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, so it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management: RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted to different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
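
    A declarative event- and rule-based framework of the kind described can be sketched as rules evaluated over a temporal stream of readings: each rule inspects the new observation plus the per-tag history and emits a complex event when its condition holds. The "shipped" rule below is a made-up example, not from the paper.

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Reading:
        tag: str        # unique object ID
        location: str   # reader location
        time: float     # observation timestamp

    def shipped_rule(history, r):
        """Fire when a tag seen at 'dock' is seen at 'gate' within 10 units."""
        if r.location != "gate":
            return None
        for prev in history.get(r.tag, []):
            if prev.location == "dock" and 0 <= r.time - prev.time <= 10:
                return ("SHIPPED", r.tag, r.time)
        return None

    def process(stream, rules):
        """Run every rule against each reading, keeping per-tag history."""
        history, events = {}, []
        for r in stream:
            for rule in rules:
                e = rule(history, r)
                if e:
                    events.append(e)
            history.setdefault(r.tag, []).append(r)
        return events

    stream = [Reading("T1", "dock", 0.0),
              Reading("T2", "dock", 1.0),
              Reading("T1", "gate", 5.0),    # within window -> SHIPPED
              Reading("T2", "gate", 20.0)]   # too late -> no event
    print(process(stream, [shipped_rule]))
    ```

    Making rules plain functions over (history, reading) keeps the framework generic: adapting it to a new RFID application means writing new rules, not new plumbing.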

  11. Event-Based Prospective Memory Is Resistant but Not Immune to Proactive Interference.

    Science.gov (United States)

    Oates, Joyce M; Peynircioglu, Zehra F

    2016-01-01

    Recent evidence suggests that proactive interference (PI) does not hurt event-based prospective memory (ProM) the way it does retrospective memory (RetroM) (Oates, Peynircioglu, & Bates, 2015). We investigated this apparent resistance further. Introducing a distractor task to ensure we were testing ProM rather than vigilance in Experiment 1 and tripling the number of lists to provide more opportunity for PI buildup in Experiment 2 still did not produce performance decrements. However, when the ProM task was combined with a RetroM task in Experiment 3, a comparable buildup and release was observed in the ProM task as well. It appears that event-based ProM is indeed somewhat resistant to PI, but this resistance can break down when the ProM task comprises the same stimuli as an embedded RetroM task. We discuss the results using the ideas of cue overload and distinctiveness, as well as shared attentional and working memory resources.

  12. Prediction problem for target events based on the inter-event waiting time

    Science.gov (United States)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we are focusing on. In the series of earthquakes, as McCann et al. show [W.R. Mc Cann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, usually there are no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound of prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets up a lower bound of efficiency for effective prediction of stock market crashes.
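
    The window-based prediction idea above (declare an alarm only inside the interval where the gap distribution ξ concentrates, as with seismic gaps) can be illustrated with synthetic data. The gap distribution and window below are made-up; efficiency is judged by catching events while keeping the total alarm time small.

    ```python
    import random

    random.seed(1)
    # Synthetic inter-event waiting times concentrated in [8, 12]
    # ("seismic gap" style regularity).
    gaps = [random.uniform(8.0, 12.0) for _ in range(2000)]

    # Predictor: after each target event, alarm during [t_lo, t_hi].
    t_lo, t_hi = 8.0, 12.0
    hits = sum(t_lo <= g <= t_hi for g in gaps)       # events caught by the alarm
    # Fraction of total time spent in the alarmed state.
    alarm_fraction = sum(min(g, t_hi) - min(g, t_lo) for g in gaps) / sum(gaps)

    hit_rate = hits / len(gaps)
    print(hit_rate, round(alarm_fraction, 3))
    ```

    A random predictor's hit rate equals its alarm-time fraction; here the hit rate far exceeds the alarm fraction, which is the kind of outperformance the paper's efficiency bounds formalize. For a gap distribution with a large coefficient of variation (e.g. stock market crashes), no such narrow window exists and the achievable gain shrinks.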

  13. Simulating the influence of life trajectory events on transport mode behavior in an agent-based system

    NARCIS (Netherlands)

    Verhoeven, M.; Arentze, T.A.; Timmermans, H.J.P.; Waerden, van der P.J.H.J.

    2007-01-01

    This paper describes the results of a study on the impact of lifecycle or life trajectory events on activity-travel decisions. The lifecycle trajectory of individual agents can be easily incorporated in an agent-based simulation system. This paper focuses on two lifecycle events, change in

  14. A 21.7 kb DNA segment on the left arm of yeast chromosome XIV carries WHI3, GCR2, SPX18, SPX19, an homologue to the heat shock gene SSB1 and 8 new open reading frames of unknown function.

    Science.gov (United States)

    Jonniaux, J L; Coster, F; Purnelle, B; Goffeau, A

    1994-12-01

    We report the amino acid sequences of 13 open reading frames (ORFs > 299 bp) located on a 21.7 kb DNA segment from the left arm of chromosome XIV of Saccharomyces cerevisiae. Five open reading frames had been entirely or partially sequenced previously: WHI3, GCR2, SPX19, SPX18 and a heat shock gene similar to SSB1. The products of the 8 other ORFs are new putative proteins, among which N1394 is probably a membrane protein. N1346 contains a leucine zipper pattern, and the corresponding ORF presents an HAP (global regulator of respiratory genes) upstream activating sequence in its promoter region. N1386 shares homologies with the DNA structure-specific recognition protein family (SSRPs), and the corresponding ORF is preceded by an MCB (MluI cell cycle box) upstream activating factor.

  15. Cellular track model of biological damage to mammalian cell cultures from galactic cosmic rays

    International Nuclear Information System (INIS)

    Cucinotta, F.A.; Katz, R.; Wilson, J.W.; Townsend, L.W.; Nealy, J.E.; Shinn, J.L.

    1991-02-01

    The assessment of biological damage from galactic cosmic rays (GCR) is of current interest for exploratory-class space missions, where the highly ionizing, high-energy, high-charge (HZE) particles are the major concern. The relative biological effectiveness (RBE) values determined by ground-based experiments with HZE particles are well described by a parametric track theory of cell inactivation. Using the track model and a deterministic GCR transport code, the biological damage to mammalian cell cultures is considered for 1 year in free space at solar minimum for typical spacecraft shielding. Included are the effects of projectile and target fragmentation. The RBE values for the GCR spectrum, which are fluence dependent in the track model, are found to be more severe than the quality factors identified by International Commission on Radiological Protection Publication 26 and seem to obey a simple scaling law with the duration period in free space
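
    The fluence dependence of RBE can be illustrated with a textbook-style toy calculation: exponential ion survival in fluence against a shouldered reference (gamma-ray) survival curve, so the dose ratio at a given survival level changes with that level. This is not the paper's parametric track model, and all parameter values are invented.

    ```python
    import math

    sigma = 1.0e-6     # ion inactivation cross section, cm^2 (made-up)
    let_ion = 100.0    # ion LET, keV/um (made-up)
    D0, n = 2.0, 3.0   # shouldered gamma survival S = 1-(1-exp(-D/D0))**n

    def ion_dose(F):
        # absorbed dose (Gy) for fluence F (cm^-2) at unit density
        return 1.602e-9 * let_ion * F

    def rbe(s):
        """RBE at survival level s: gamma dose / ion dose giving the same s."""
        F = -math.log(s) / sigma                      # ion fluence for S_ion = s
        d_gamma = -D0 * math.log(1.0 - (1.0 - s) ** (1.0 / n))
        return d_gamma / ion_dose(F)

    # The shoulder of the gamma curve makes RBE fall as dose rises.
    print(round(rbe(0.5), 1), round(rbe(0.1), 1))
    ```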

  16. Cellular track model of biological damage to mammalian cell cultures from galactic cosmic rays

    Science.gov (United States)

    Cucinotta, Francis A.; Katz, Robert; Wilson, John W.; Townsend, Lawrence W.; Nealy, John E.; Shinn, Judy L.

    1991-01-01

    The assessment of biological damage from galactic cosmic rays (GCR) is of current interest for exploratory-class space missions, where the highly ionizing, high-energy, high-charge (HZE) particles are the major concern. The relative biological effectiveness (RBE) values determined by ground-based experiments with HZE particles are well described by a parametric track theory of cell inactivation. Using the track model and a deterministic GCR transport code, the biological damage to mammalian cell cultures is considered for 1 year in free space at solar minimum for typical spacecraft shielding. Included are the effects of projectile and target fragmentation. The RBE values for the GCR spectrum, which are fluence dependent in the track model, are found to be more severe than the quality factors identified by International Commission on Radiological Protection Publication 26 and seem to obey a simple scaling law with the duration period in free space.

  17. Influence of ground level enhancements on the terrestrial production of {sup 10}Be, {sup 14}C and {sup 36}Cl

    Energy Technology Data Exchange (ETDEWEB)

    Herbst, Konstantin; Heber, Bernd [IEAP, Christian-Albrechts-Universitaet zu Kiel, Kiel (Germany); Beer, Juerg [Swiss Federal Institute of Aquatic Science and Technology, EAWAG (Switzerland); Tylka, Allan J. [Space Science Division, Naval Research Laboratory, Washington, DC (United States); Dietrich, William F. [Praxis, Inc., Alexandria, VA (United States)

    2014-07-01

    Cosmogenic radionuclides are a product of the interaction of primary cosmic rays, in particular galactic cosmic rays (GCRs), with the Earth's atmosphere, but only primary particles with energies above several hundred MeV can trigger the necessary reaction chains. Because GCRs are modulated by solar activity on their way through the interplanetary medium, the GCR-induced cosmogenic radionuclide production is anti-correlated with the solar cycle. During phases of strong solar activity, solar energetic particle (SEP) events also occur frequently. In particular, SEP events that can be detected by ground-based instruments, so-called ground level enhancements (GLEs), may contribute strongly to cosmogenic radionuclide production. Besides the variation due to the modulation of GCRs, we investigate the influence of 58 GLEs that occurred within the past five solar cycles and discuss the possibility of detecting such events in present ice-core and tree-ring records. In addition, an estimate is given of the probability of finding such events over the past 10,000 years, also known as the Holocene, under different modulation conditions.

  18. Monte Carlo transport model comparison with 1A GeV accelerated iron experiment: heavy-ion shielding evaluation of NASA space flight-crew foodstuff

    Science.gov (United States)

    Stephens, D. L. Jr; Townsend, L. W.; Miller, J.; Zeitlin, C.; Heilbronn, L.

    2002-01-01

    Deep-space manned flight as a reality depends on a viable solution to the radiation problem. Both acute and chronic radiation health threats are known to exist, with solar particle events an example of the former and galactic cosmic rays (GCR) of the latter. In this experiment, iron ions of 1A GeV are used to simulate GCR and to determine the secondary radiation field created as the GCR-like particles interact with a thick target. A NASA-prepared food pantry locker was subjected to the iron beam and the secondary fluence recorded. A modified version of the Monte Carlo heavy-ion transport code developed by Zeitlin at LBNL is compared with the experimental fluence. The foodstuff is modeled as mixed nuts as defined by the 71st edition of the Chemical Rubber Company (CRC) Handbook of Physics and Chemistry. The results indicate good agreement between the experimental data and the model. The agreement is quantified using a linear fit to ordered pairs of data, with the intercept forced to zero. The fitted slope is 0.825 and the R2 value is 0.429 over the resolved fluence region. Removal of an outlier, Z=14, gives values of 0.888 and 0.705 for slope and R2, respectively. © 2002 COSPAR. Published by Elsevier Science Ltd. All rights reserved.
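
    The zero-intercept fit used above to quantify model-experiment agreement reduces to slope b = Σxy/Σx², with R² computed from the residuals about that line. A sketch on made-up ordered pairs (not the measured fluences):

    ```python
    def fit_through_origin(x, y):
        """Least-squares line y = b*x (intercept forced to zero) and R^2."""
        b = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
        ss_res = sum((yi - b * xi) ** 2 for xi, yi in zip(x, y))
        ybar = sum(y) / len(y)
        ss_tot = sum((yi - ybar) ** 2 for yi in y)
        return b, 1.0 - ss_res / ss_tot

    # Illustrative (model, experiment) pairs; a slope near 1 would mean
    # the model tracks the measurements.
    x = [1.0, 2.0, 3.0, 4.0]
    y = [0.9, 2.1, 2.7, 4.3]
    b, r2 = fit_through_origin(x, y)
    print(round(b, 3), round(r2, 3))
    ```

    Dropping an outlying pair changes both b and R², which is why the abstract reports the fit with and without the Z=14 point.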

  19. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    As today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  20. Charge-dependent correlations from event-by-event anomalous hydrodynamics

    Energy Technology Data Exchange (ETDEWEB)

    Hirono, Yuji [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Hirano, Tetsufumi [Department of Physics, Sophia University, Tokyo 102-8554 (Japan); Kharzeev, Dmitri E. [Department of Physics and Astronomy, Stony Brook University, Stony Brook, NY 11794-3800 (United States); Department of Physics and RIKEN-BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2016-12-15

    We report on our recent attempt at quantitative modeling of the Chiral Magnetic Effect (CME) in heavy-ion collisions. We perform 3+1 dimensional anomalous hydrodynamic simulations on an event-by-event basis, with constitutive equations that contain the anomaly-induced effects. We also develop a model of the initial condition for the axial charge density that captures the statistical nature of the random chirality imbalances created by the color flux tubes. Based on event-by-event hydrodynamic simulations of hundreds of thousands of collisions, we calculate the correlation functions that are measured in experiments, and discuss how the anomalous transport affects these observables.

  1. The Agency of Event

    DEFF Research Database (Denmark)

    Nicholas, Paul; Tamke, Martin; Riiber, Jacob

    2014-01-01

    This paper explores the notion of agency within event-based models. We present an event-based modeling approach that links interdependent generative, analytic and decision making sub-models within a system of exchange. Two case study projects demonstrate the underlying modeling concepts and metho...

  2. High temperature materials

    International Nuclear Information System (INIS)

    2003-01-01

    The aim of this workshop is to share the needs of high temperature and nuclear fuel materials for future nuclear systems, to take stock of the status of research in this domain and to propose cooperative work between the different research organisations. The future nuclear systems are the very high temperature (850 to 1200 deg. C) gas cooled reactors (GCR) and the molten salt reactors (MSR). These systems include not only the reactor but also the fabrication and reprocessing of the spent fuel. This document brings together the transparencies of 13 communications among the 25 given at the workshop: 1) characteristics and needs of future systems: specifications, materials and fuel needs for fast spectrum GCR and very high temperature GCR; 2) high temperature materials out of neutron flux: thermal barriers: materials, resistance, lifetimes; nickel-base metal alloys: status of knowledge, mechanical behaviour, possible applications; corrosion linked with the gas coolant: knowledge and problems to be solved; super-alloys for turbines: alloys for blades and discs; corrosion linked with MSR: knowledge and problems to be solved; 3) materials for reactor core structure: nuclear graphite and carbon; fuel assembly structure materials of the GCR with fast neutron spectrum: status of knowledge and ceramics and cermets needs; silicon carbide as fuel confinement material, study of irradiation induced defects; migration of fission products, I and Cs in SiC; 4) materials for hydrogen production: status of the knowledge and needs for the thermochemical cycle; 5) technologies: GCR components and the associated material needs: compact exchangers, pumps, turbines; MSR components: valves, exchangers, pumps. (J.S.)

  3. High temperature materials; Materiaux a hautes temperatures

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    The aim of this workshop is to share the needs of high temperature and nuclear fuel materials for future nuclear systems, to take stock of the status of research in this domain and to propose cooperative work between the different research organisations. The future nuclear systems are the very high temperature (850 to 1200 deg. C) gas cooled reactors (GCR) and the molten salt reactors (MSR). These systems include not only the reactor but also the fabrication and reprocessing of the spent fuel. This document brings together the transparencies of 13 communications among the 25 given at the workshop: 1) characteristics and needs of future systems: specifications, materials and fuel needs for fast spectrum GCR and very high temperature GCR; 2) high temperature materials out of neutron flux: thermal barriers: materials, resistance, lifetimes; nickel-base metal alloys: status of knowledge, mechanical behaviour, possible applications; corrosion linked with the gas coolant: knowledge and problems to be solved; super-alloys for turbines: alloys for blades and discs; corrosion linked with MSR: knowledge and problems to be solved; 3) materials for reactor core structure: nuclear graphite and carbon; fuel assembly structure materials of the GCR with fast neutron spectrum: status of knowledge and ceramics and cermets needs; silicon carbide as fuel confinement material, study of irradiation induced defects; migration of fission products, I and Cs in SiC; 4) materials for hydrogen production: status of the knowledge and needs for the thermochemical cycle; 5) technologies: GCR components and the associated material needs: compact exchangers, pumps, turbines; MSR components: valves, exchangers, pumps. (J.S.)

  4. QTL Mapping of Grain Quality Traits Using Introgression Lines Carrying Oryza rufipogon Chromosome Segments in Japonica Rice.

    Science.gov (United States)

    Yun, Yeo-Tae; Chung, Chong-Tae; Lee, Young-Ju; Na, Han-Jung; Lee, Jae-Chul; Lee, Sun-Gye; Lee, Kwang-Won; Yoon, Young-Hwan; Kang, Ju-Won; Lee, Hyun-Sook; Lee, Jong-Yeol; Ahn, Sang-Nag

    2016-12-01

    Improved eating quality is a major breeding target in japonica rice due to market demand. Consequently, quantitative trait loci (QTL) for glossiness of cooked rice and amylose content associated with eating quality have received much research focus because of their importance in rice quality. In this study, QTL associated with 12 grain quality traits were identified using 96 introgression lines (ILs) of rice developed from an interspecific cross between the Korean elite O. sativa japonica cultivar 'Hwaseong' and O. rufipogon over 7 years. QTL analyses indicated that the QTL qDTH6 for heading date, detected on chromosome 6, is associated with variance in grain traits. Most QTLs detected in this study clustered near the qDTH6 locus on chromosome 6, suggesting an effect of qDTH6. O. rufipogon alleles negatively affected grain quality traits except for a few QTLs, including qGCR9 for glossiness of cooked rice on chromosome 9. To characterize the effect of the O. rufipogon locus harboring qGCR9, four lines with a single but different O. rufipogon segment near qGCR9 were compared to Hwaseong. Three lines (O. rufipogon ILs) having the O. rufipogon segment between RM242 and RM245 in common showed higher glossiness of cooked rice than Hwaseong and the other line (Hwaseong IL), indicating that qGCR9 is located in the 3.4-Mb region between RM242 and RM245. The higher glossiness of cooked rice conferred by the O. rufipogon allele might be associated with protein content, considering that the three lines had lower protein content than Hwaseong (P < 0.1). These three O. rufipogon ILs showed higher yield than Hwaseong and the Hwaseong IL due to an increase in spikelets per panicle and grain weight, indicating linkage of qGCR9 and yield component QTLs. The qGCR9 locus is of particular interest because of its independence from other undesirable grain quality traits in O. rufipogon. SSR markers linked to qGCR9 can be used to develop high-quality japonica lines and offer a starting point for map-based

  5. Galactic cosmic ray and El Nino Southern Oscillation trends in International Satellite Cloud Climatology Project D2 low-cloud properties

    DEFF Research Database (Denmark)

    Marsh, N.; Svensmark, Henrik

    2003-01-01

    [1] The recently reported correlation between clouds and galactic cosmic rays (GCR) implies the existence of a previously unknown process linking solar variability and climate. An analysis of the interannual variability of International Satellite Cloud Climatology Project D2 (ISCCP-D2) low-cloud properties over the period July 1983 to August 1994 suggests that low clouds are statistically related to two processes, (1) GCR and (2) El Nino-Southern Oscillation (ENSO), with GCR explaining a greater percentage of the total variance. Areas where satellites have an unobstructed view of low cloud possess a strong correlation with GCR, which suggests that low-cloud properties observed in these regions are less likely to be contaminated from overlying cloud. The GCR-low cloud correlation cannot easily be explained by internal climate processes, changes in direct solar forcing, or UV-ozone interactions...

  6. Electromagnetic Dissociation and Spacecraft Electronics Damage

    Science.gov (United States)

    Norbury, John W.

    2016-01-01

    When protons or heavy ions from galactic cosmic rays (GCR) or solar particle events (SPE) interact with target nuclei in spacecraft, there can be two different types of interactions. The more familiar strong nuclear interaction often dominates and is responsible for nuclear fragmentation in either the GCR or SPE projectile nucleus or the spacecraft target nucleus. (Of course, the proton does not break up, except possibly to produce pions or other hadrons.) The less familiar, second type of interaction is due to the very strong electromagnetic fields that exist when two charged nuclei pass very close to each other. This process is called electromagnetic dissociation (EMD) and primarily results in the emission of neutrons, protons and light ions (isotopes of hydrogen and helium). The cross section for particle production is approximately defined as the number of particles produced in nucleus-nucleus collisions or other types of reactions. (There are various kinematic and other factors which multiply the particle number to arrive at the cross section.) Strong, nuclear interactions usually dominate the nuclear reactions of most interest that occur between GCR and target nuclei. However, for heavy nuclei (near Fe and beyond) at high energy the EMD cross section can be much larger than the strong nuclear interaction cross section. This paper poses a question: Are there projectile or target nuclei combinations in the interaction of GCR or SPE where the EMD reaction cross section plays a dominant role? If the answer is affirmative, then EMD mechanisms should be an integral part of codes that are used to predict damage to spacecraft electronics. The question can become more fine-tuned and one can ask about total reaction cross sections as compared to double differential cross sections. These issues will be addressed in the present paper.
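    As a back-of-the-envelope illustration of why EMD can dominate for heavy collision partners: the strong-interaction (geometric) reaction cross section grows only with the nuclear radii, roughly as A^(1/3) summed and squared, while the Weizsacker-Williams photon flux that drives EMD grows as the square of the partner's charge. The sketch below uses a textbook Bradt-Peters-style form with an assumed r0 = 1.2 fm and omits the overlap correction; the numbers are order-of-magnitude only, not values from this paper.

```python
import math

def geometric_cross_section_mb(a_proj, a_targ, r0_fm=1.2):
    # Bradt-Peters-style geometric reaction cross section (millibarns):
    # sigma_R ~ pi * r0^2 * (Ap^(1/3) + At^(1/3))^2, overlap term omitted
    radius_sum = r0_fm * (a_proj ** (1 / 3) + a_targ ** (1 / 3))  # fm
    return math.pi * radius_sum ** 2 * 10.0  # 1 fm^2 = 10 mb

# For a fixed Fe projectile, the geometric cross section grows slowly with
# target mass, while the EMD cross section scales roughly as Z_target^2:
sigma_al = geometric_cross_section_mb(56, 27)    # Fe on Al
sigma_pb = geometric_cross_section_mb(56, 208)   # Fe on Pb
emd_boost = (82 / 13) ** 2                       # Z_t^2 ratio, Pb vs. Al
```

The Z² factor of roughly 40 between Pb and Al targets, against only a modest geometric increase, is the qualitative reason EMD can rival or exceed the strong-interaction cross section for heavy, high-energy systems.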

  7. An Event-Based Approach to Design a Teamwork Training Scenario and Assessment Tool in Surgery.

    Science.gov (United States)

    Nguyen, Ngan; Watson, William D; Dominguez, Edward

    2016-01-01

    Simulation is a technique recommended for teaching and measuring teamwork, but few published methodologies are available on how best to design simulation for teamwork training in surgery and health care in general. The purpose of this article is to describe a general methodology, called the event-based approach to training (EBAT), to guide the design of simulation for teamwork training, and to discuss its application to surgery. The EBAT methodology draws on the science of training by systematically introducing training exercise events that are linked to training requirements (i.e., the competencies being trained and learning objectives) and performance assessment. The EBAT process involves several linked steps, which we applied as follows. Of the 4 teamwork competencies endorsed by the Agency for Healthcare Research and Quality and the Department of Defense, "communication" was chosen to be the focus of our training efforts. A total of 5 learning objectives were defined based on 5 validated teamwork and communication techniques. Diagnostic laparoscopy was chosen as the clinical context to frame the training scenario, and 29 KSAs were defined based on a review of the published literature on patient safety and input from subject matter experts. Critical events included those that correspond to a specific phase in the normal flow of a surgical procedure as well as clinical events that may occur when performing the operation. Similar to the targeted KSAs, targeted responses to the critical events were developed based on the existing literature and input from content experts. Finally, a 29-item EBAT-derived checklist was created to assess communication performance. Like any instructional tool, simulation is only effective if it is designed and implemented appropriately.
It is recognized that the effectiveness of simulation depends on whether (1) it is built upon a theoretical framework, (2) it uses preplanned structured exercises or events to allow learners the opportunity to exhibit the targeted KSAs, (3) it assesses performance, and (4

  8. Location aware event driven multipath routing in Wireless Sensor Networks: Agent based approach

    Directory of Open Access Journals (Sweden)

    A.V. Sutagundar

    2013-03-01

    Wireless Sensor Networks (WSNs) demand reliable and energy-efficient paths for critical information delivery to the sink node from an event occurrence node. Multipath routing facilitates reliable data delivery in the case of critical information. This paper proposes an event-triggered multipath routing scheme for WSNs employing a set of static and mobile agents. Every sensor node is assumed to know the location information of the sink node and itself. The proposed scheme works as follows: (1) The event node computes an arbitrary midpoint between itself and the sink node by using location information. (2) The event node establishes a shortest path from itself to the sink node along the reference axis by using a mobile agent; the mobile agent collects the connectivity information and other parameters of all the nodes on the way and provides the information to the sink node. (3) The event node finds the arbitrary locations of the special (middle) intermediate nodes (above/below the reference axis) by using the midpoint location information from step 1. (4) The mobile agent clones itself at the event node; the clones carry the event type and discover the paths passing through the special intermediate nodes, so that each path above/below the reference axis looks like an arc. While migrating from one sensor node to another along the traversed path, each mobile agent gathers node information (such as node ID, location information, residual energy, available bandwidth, and neighbor connectivity) and delivers it to the sink node. (5) The sink node constructs a partial topology connecting the event and sink nodes by using the connectivity information delivered by the mobile agents. Using the partial topology information, the sink node finds the multiple paths and a path weight factor based on link efficiency, energy ratio, and hop distance. (6) The sink node selects the number of paths among the available paths based upon the criticality of the event, and (7) if the event is non
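    Step (1) and the path-weighting idea of step (5) can be sketched as follows; the abstract does not give the exact weight formula, so the coefficients combining link efficiency, energy ratio, and hop distance below are illustrative assumptions.

```python
def midpoint(event_xy, sink_xy):
    # Step (1): arbitrary midpoint between the event node and the sink node
    return ((event_xy[0] + sink_xy[0]) / 2.0, (event_xy[1] + sink_xy[1]) / 2.0)

def path_weight(link_efficiency, energy_ratio, hop_distance, w=(0.4, 0.4, 0.2)):
    # Hypothetical weight factor: higher link efficiency and residual-energy
    # ratio are better, longer hop distance is penalized. The coefficients
    # are assumptions, not taken from the paper.
    return w[0] * link_efficiency + w[1] * energy_ratio + w[2] / max(hop_distance, 1)

# The sink ranks the candidate paths (shortest path plus the two arcs)
# by weight factor and picks paths according to event criticality.
paths = [
    {"id": "arc_above", "eff": 0.9, "energy": 0.8, "hops": 6},
    {"id": "shortest",  "eff": 0.7, "energy": 0.5, "hops": 4},
    {"id": "arc_below", "eff": 0.6, "energy": 0.9, "hops": 7},
]
ranked = sorted(paths, key=lambda p: path_weight(p["eff"], p["energy"], p["hops"]),
                reverse=True)
```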

  9. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet, based on a multi-agent system approach with which renewable energy resources can be fully utilized to meet load demand with high security and good dynamic quality. In the design of the control, a multi-agent system...

  10. Establishment of nuclear knowledge and information infrastructure; establishment of web-based database system for nuclear events

    Energy Technology Data Exchange (ETDEWEB)

    Park, W. J.; Kim, K. J. [Korea Atomic Energy Research Institute , Taejeon (Korea); Lee, S. H. [Korea Institute of Nuclear Safety, Taejeon (Korea)

    2001-05-01

    Nuclear event data reported by nuclear power plants are useful for preventing nuclear accidents, by examining the causes of initiating events and removing weak points in operational safety, and for improving nuclear safety at the design and operation stages by backfitting operational experiences and practices. The 'Nuclear Event Evaluation Database' (NEED) system, previously distributed on CD-ROM media, has been upgraded to the NEED-Web (Web-based Nuclear Event Evaluation Database) version to manage event data in a network-based database system, and the event data and statistics are provided to authorized users through the Nuclear Portal Site and to the public through Internet Web services. The efforts to establish the NEED-Web system will improve the integrity of event data for Korean nuclear power plants and the usability of data services, and will enhance confidence building and transparency to the public in nuclear safety. 11 refs., 27 figs. (Author)

  11. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan, Emilio Molina, Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings, developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows users to define multiple ontologies to adapt to multiple tasks, and offers the possibility of cross-annotating audio data. Moreover, it is easy to install and deploy...

  12. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    The crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The individual behavior of a human outside the crowd is determined by many factors associated with his intellectual activities, but inside the crowd a person loses his identity and begins to obey simpler laws of behavior. One approach to the construction of a multi-level model of the crowd using the discrete-event multiagent approach is described in the paper. Based on this analysi...

  13. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.
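    The rule-based deduction of implicit events can be sketched as a small forward-chaining loop; the rule set and event tuples below are hypothetical stand-ins for the paper's biomedical inference rules.

```python
# Each rule maps an explicit event type to an implied event type over the
# same (agent, theme) pair, e.g. "X binds the promoter of G" implies
# "X regulates the expression of G". Both rules here are assumed examples.
RULES = {
    "binds_promoter_of": "regulates_expression_of",
    "phosphorylates":    "regulates_activity_of",
}

def infer(explicit_events, rules):
    """Forward-chain to a fixed point: keep deriving implied events until
    no new event can be added."""
    derived = set(explicit_events)
    frontier = list(explicit_events)
    while frontier:
        event_type, agent, theme = frontier.pop()
        implied = rules.get(event_type)
        if implied:
            new_event = (implied, agent, theme)
            if new_event not in derived:
                derived.add(new_event)
                frontier.append(new_event)
    return derived

events = infer({("binds_promoter_of", "FNR", "dmsA")}, RULES)
```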

  14. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  15. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are those natural or man-made hazards to a site and facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential of causing initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, also the back-up systems implemented for risk reduction. Plant safety may be especially threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards that induce initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation. Offsite support, e.g., delivery of diesel fuel oil, usually credited in the deterministic safety analysis, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards that induce both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify relevant external hazards for a certain Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or from each relevant combination.
Screening analyses of external events performed at AREVA are based on the approach provided
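    The screening idea can be sketched as a frequency-times-consequence cutoff; the hazard frequencies, conditional core-damage probabilities, and threshold below are invented for illustration and are not the actual AREVA screening criteria.

```python
# Illustrative screening sketch: a hazard (or hazard combination) is
# screened in when its estimated core-damage frequency contribution,
# f_hazard * P(core damage | hazard), exceeds a screening threshold.
# All numbers are assumed, not from any real PSA.
SCREENING_THRESHOLD = 1e-7   # per year, assumed

def screened_in(hazard_freq_per_yr, p_core_damage_given_hazard):
    return hazard_freq_per_yr * p_core_damage_given_hazard > SCREENING_THRESHOLD

hazards = {
    "seismic":           (1e-4, 1e-2),   # (frequency per year, conditional CDP)
    "external_flooding": (1e-3, 1e-3),
    "tornado":           (1e-5, 1e-4),
}
screened = sorted(h for h, (f, p) in hazards.items() if screened_in(f, p))
```

With these invented numbers, seismic and external flooding contribute 1e-6/yr each and are screened in for detailed event tree analysis, while tornado (1e-9/yr) is screened out.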

  16. Various sizes of sliding event bursts in the plastic flow of metallic glasses based on a spatiotemporal dynamic model

    Energy Technology Data Exchange (ETDEWEB)

    Ren, Jingli, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn; Chen, Cun [School of Mathematics and Statistics, Zhengzhou University, Zhengzhou 450001 (China); Wang, Gang, E-mail: renjl@zzu.edu.cn, E-mail: g.wang@shu.edu.cn [Laboratory for Microstructures, Shanghai University, Shanghai 200444 (China); Cheung, Wing-Sum [Department of Mathematics, The University of HongKong, HongKong (China); Sun, Baoan; Mattern, Norbert [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Siegmund, Stefan [Department of Mathematics, TU Dresden, D-01062 Dresden (Germany); Eckert, Jürgen [IFW-dresden, Institute for Complex Materials, P.O. Box 27 01 16, D-01171 Dresden (Germany); Institute of Materials Science, TU Dresden, D-01062 Dresden (Germany)

    2014-07-21

    This paper presents a spatiotemporal dynamic model based on the interaction between multiple shear bands in the plastic flow of metallic glasses during compressive deformation. Sliding events of various sizes burst out during plastic deformation as shear branches of different scales were generated; microscopic creep events and delocalized sliding events were analyzed based on the established model. The paper discusses the spatially uniform solutions and the traveling wave solution. The phase space of the spatially uniform system reflected the chaotic state of the system at a lower strain rate. Moreover, numerical simulation showed that microscopic creep events were manifested at a lower strain rate, whereas delocalized sliding events were manifested at a higher strain rate.

  17. An adverse events potential costs analysis based on Drug Programs in Poland. Dermatology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-09-01

    The aim of the project, carried out within the Polish Society for Pharmacoeconomics (PTFE), was to estimate the potential costs of treatment of the side effects which (theoretically) may occur as a result of treatments for the selected diseases. This paper deals solely with dermatology-related events. Herein, several Drug Programs financed by the National Health Fund in Poland in 2012 were analyzed. The adverse events were selected based on the Summary of Product Characteristics (SPC) of the chosen products. We focused the project on those potential adverse events which were defined in the SPC as frequent and very frequent. The results are presented according to their therapeutic areas, and in this paper the focus is upon dermatology. Events described as 'very common' had an incidence of ≥ 1/10, and those described as 'common' an incidence of ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. In our work, we employed only the total direct costs incurred by the public payer, based on valid individual cost data from February 2014. Moreover, we calculated the total spending from the public payer's perspective as well as the patient's perspective, and the percentage of each component of the total cost in detail. The paper thus informs the reader of the estimated costs of treatment of side effects related to dermatologic symptoms and reactions. Based on our work, we can state that the treatment of skin adverse drug reactions generates a significant cost, one incurred by both the public payer and the patient.

  18. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    The operational experience (OE) feedback provides improvements in all industrial activities. Identification of the most important and valuable groups of events within the accumulated experience is important in order to focus the detailed investigation of events. The paper describes a new ranking method and compares it with three others. The methods are described and applied to OE events reported by nuclear power plants in France and Germany over twenty years. The results show that the different ranking methods only roughly agree on which of the event groups are the most important ones. In the new ranking method, the analytic hierarchy process is applied in order to assure a consistent and comprehensive weighting determination for the ranking indexes. The proposed method allows a transparent and flexible ranking of event groups and identification of the most important OE for further, more detailed investigation in order to complete the feedback. (orig.)
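    The analytic hierarchy process step can be sketched as follows: weights for the ranking indexes are taken from the principal eigenvector of a pairwise-comparison matrix, with a consistency check. The three indexes and the judgment values below are illustrative, not the ones used in the paper.

```python
import numpy as np

# Pairwise-comparison matrix over three hypothetical ranking indexes
# (e.g. safety relevance vs. frequency vs. investigation cost). The
# judgments 3, 5, 2 are invented for illustration.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # principal eigenvalue lambda_max
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # priority weights, sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.58                              # random index for n = 3
```

The usual acceptability rule is a consistency ratio CR below 0.1; inconsistent pairwise judgments would have to be revised before the weights are used for ranking.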

  19. Analysis of adverse events of renal impairment related to platinum-based compounds using the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Naganuma, Misa; Motooka, Yumi; Sasaoka, Sayaka; Hatahira, Haruna; Hasegawa, Shiori; Fukuda, Akiho; Nakao, Satoshi; Shimada, Kazuyo; Hirade, Koseki; Mori, Takayuki; Yoshimura, Tomoaki; Kato, Takeshi; Nakamura, Mitsuhiro

    2018-01-01

    Platinum compounds cause several adverse events, such as nephrotoxicity, gastrointestinal toxicity, myelosuppression, ototoxicity, and neurotoxicity. We evaluated the incidence of renal impairment as an adverse event related to the administration of platinum compounds, using the Japanese Adverse Drug Event Report (JADER) database. We analyzed adverse events associated with the use of platinum compounds reported from April 2004 to November 2016. The reporting odds ratio with its 95% confidence interval was used to detect a signal for each renal impairment incidence. We evaluated the time-to-onset profile of renal impairment, assessed the hazard type using the Weibull shape parameter, and applied the association rule mining technique to discover undetected relationships such as possible risk factors. In total, 430,587 reports in the JADER database were analyzed. The reporting odds ratios (95% confidence interval) for renal impairment resulting from the use of cisplatin, oxaliplatin, carboplatin, and nedaplatin were 2.7 (2.5-3.0), 0.6 (0.5-0.7), 0.8 (0.7-1.0), and 1.3 (0.8-2.1), respectively. The lower limit of the reporting odds ratio (95% confidence interval) for cisplatin was >1. The median (lower-upper quartile) onset time of renal impairment following the use of platinum-based compounds was 6.0-8.0 days. The Weibull shape parameter β and 95% confidence interval upper limit of oxaliplatin were impairment during cisplatin use in real-world setting. The present findings demonstrate that the incidence of renal impairment following cisplatin use should be closely monitored when patients are hypertensive or diabetic, or when they are co-administered furosemide, loxoprofen, or pemetrexed. In addition, healthcare professionals should closely assess a patient's background prior to treatment.
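    The reporting odds ratio is a standard disproportionality measure computed from a 2x2 contingency table of spontaneous reports; a minimal sketch with invented counts (not JADER values) follows, including the lower-bound-above-1 signal criterion used in the study.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """a: target drug, target event    b: target drug, other events
       c: other drugs, target event    d: other drugs, other events
    Returns (ROR, lower 95% bound, upper 95% bound)."""
    ror = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(ROR)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

# Hypothetical counts for illustration only.
ror, lower, upper = reporting_odds_ratio(100, 900, 50, 1350)
signal_detected = lower > 1.0   # signal criterion: lower 95% CI bound > 1
```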

  20. Activation of glucocorticoid receptors in Müller glia is protective to retinal neurons and suppresses microglial reactivity.

    Science.gov (United States)

    Gallina, Donika; Zelinka, Christopher Paul; Cebulla, Colleen M; Fischer, Andy J

    2015-11-01

    Reactive microglia and macrophages are prevalent in damaged retinas. Glucocorticoid signaling is known to suppress inflammation and the reactivity of microglia and macrophages. In the vertebrate retina, the glucocorticoid receptor (GCR) is known to be activated and localized to the nuclei of Müller glia (Gallina et al., 2014). Accordingly, we investigated how signaling through GCR influences the survival of neurons using the chick retina in vivo as a model system. We applied intraocular injections of GCR agonist or antagonist, assessed microglial reactivity, and the survival of retinal neurons following different damage paradigms. Microglial reactivity was increased in retinas from eyes that were injected with vehicle, and this reactivity was decreased by GCR-agonist dexamethasone (Dex) and increased by GCR-antagonist RU486. We found that activation of GCR suppresses the reactivity of microglia and inhibited the loss of retinal neurons resulting from excitotoxicity. We provide evidence that the protection-promoting effects of Dex were maintained when the microglia were selectively ablated. Similarly, intraocular injections of Dex protected ganglion cells from colchicine-treatment and protected photoreceptors from damage caused by retinal detachment. We conclude that activation of GCR promotes the survival of ganglion cells in colchicine-damaged retinas, promotes the survival of amacrine and bipolar cells in excitotoxin-damaged retinas, and promotes the survival of photoreceptors in detached retinas. We propose that suppression of microglial reactivity is secondary to activation of GCR in Müller glia, and this mode of signaling is an effective means to lessen the damage and vision loss resulting from different types of retinal damage. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  2. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
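    The event-triggered update rule itself can be illustrated on a simple linear discrete-time system; here the feedback gain is fixed by hand and the trigger is a plain state-error threshold, whereas the paper learns both the controller and the trigger threshold via HDP, so this is only a sketch of the triggering mechanism.

```python
import numpy as np

# Toy discrete-time system x(k+1) = A x(k) + B u(k); the control law
# u = K x_event is recomputed only when the state has drifted more than
# a threshold away from the state at the last update. A, B, K, and the
# threshold are illustrative choices, not from the paper.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
K = np.array([[-5.0, -3.0]])        # hand-picked stabilizing gain for (A, B)

x = np.array([[1.0], [0.0]])
x_event = x.copy()                  # state at the last control update
u = K @ x_event
updates = 0
for _ in range(200):
    if np.linalg.norm(x - x_event) > 0.05:   # event-triggered condition
        x_event = x.copy()
        u = K @ x_event                      # update control law
        updates += 1
    x = A @ x + B @ u                        # control held between events
```

Counting `updates` against the 200 simulation steps shows the computation/transmission saving relative to a periodically updated controller, which is the motivation stated in the abstract.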

  3. Parallelization of the unstructured Navier-Stokes solver LILAC for the aero-thermal analysis of a gas-cooled reactor

    International Nuclear Information System (INIS)

    Kim, J. T.; Kim, S. B.; Lee, W. J.

    2004-01-01

    The LILAC code is currently under development for analysing the thermo-hydraulics of the gas-cooled reactor (GCR), especially the high-temperature GCR, which is one of the Gen IV nuclear reactors. The LILAC code was originally developed for the analysis of thermo-hydraulics in a molten pool, and it is now being modified to resolve the compressible gas flows in the GCR. As the internal flow geometries of the GCR and its aero-thermal flows become more complex, the number of computational cells increases and eventually exceeds the computing power of current desktop computers. To overcome this problem and properly resolve the physics of interest in the GCR, the LILAC code has been parallelized by decomposition of the computational domain (grid). Some benchmark problems are solved with the parallelized LILAC code, and its speed-up characteristics under parallel computation are evaluated and described in the article.
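    The domain-decomposition strategy can be sketched with a 1-D diffusion step in which the grid is split into subdomains with ghost (halo) cells; the sketch below runs serially, whereas the actual parallelized solver would exchange halos between processes (e.g. via MPI).

```python
import numpy as np

def step_decomposed(u, nparts, alpha=0.1):
    """One explicit diffusion step with the grid split into nparts
    subdomains. Each subdomain is padded with one halo value from each
    neighbour (or its own boundary value at the global edges), so the
    partitioned update reproduces the single-domain result exactly."""
    chunks = np.array_split(u, nparts)
    out = []
    for i, c in enumerate(chunks):
        left = chunks[i - 1][-1] if i > 0 else c[0]            # halo exchange
        right = chunks[i + 1][0] if i < nparts - 1 else c[-1]
        ext = np.concatenate(([left], c, [right]))
        out.append(c + alpha * (ext[:-2] - 2 * c + ext[2:]))   # Laplacian
    return np.concatenate(out)

u0 = np.linspace(0.0, 1.0, 64)
u1 = step_decomposed(u0, nparts=4)
```

The design point is that only one value per subdomain boundary must be communicated per step; verifying that the 4-partition result matches the undecomposed one is the standard correctness check before measuring parallel speed-up.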

  4. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events is expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailor climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis, and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands, in the summer of 2014, which resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly on the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of the precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate.
The two
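    The non-linear delta transformation described above can be sketched as a direct rescaling of the observed intensities. In this hedged illustration, the per-degree scaling rates and the intensity split are invented placeholders, not the study's fitted dew-point relation:

```python
# Hedged sketch of a non-linear delta transformation: rescale each observed
# intensity for a warmer climate with a rate that grows with intensity. The
# 7%/14% per-degree rates and the 10 mm/h split are illustrative placeholders,
# not the study's fitted dew-point relation.

def delta_transform(intensity_mm_h, delta_t_degC):
    """Scale an observed rain intensity (mm/h) for a dew-point rise delta_t."""
    # Low intensities scale near the Clausius-Clapeyron rate (~7% per degree);
    # high intensities at roughly twice that (super-CC behaviour for extremes).
    rate = 0.07 if intensity_mm_h < 10.0 else 0.14
    return intensity_mm_h * (1.0 + rate) ** delta_t_degC

# Apply to an observed time series (mm/h) for a +2 degC warmer climate:
observed = [2.0, 15.0, 40.0, 8.0]
future = [delta_transform(i, 2.0) for i in observed]
```

The non-linearity here is only in the intensity-dependent rate; the actual study derives the rate from the observed summer dew-point dependence.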

  5. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with predicted states is proposed as the sequential decision model, and a Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
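    A minimal sketch of the Q-learning ingredient mentioned above. The paper's model is a networked distributed Markov decision process with predicted states; this single-agent toy, with invented states, actions, and rewards, only illustrates the tabular Q-update:

```python
import random

# Toy tabular Q-learning for a single traffic signal. States, actions, the
# dynamics, and the rewards are invented for illustration; the paper's
# networked distributed MDP with state prediction is not reproduced here.
states = ["free", "busy", "congested"]
actions = ["extend_green", "switch_phase"]
Q = {(s, a): 0.0 for s in states for a in actions}

def step(state, action):
    # Toy dynamics and reward: switching the phase clears congestion.
    if state == "congested" and action == "switch_phase":
        return "busy", 1.0
    if state == "busy" and action == "extend_green":
        return "free", 0.5
    return state, -0.1

alpha, gamma, eps = 0.5, 0.9, 0.1
random.seed(0)
for _ in range(200):                      # short episodes from congestion
    state = "congested"
    for _ in range(5):
        a = (random.choice(actions) if random.random() < eps
             else max(actions, key=lambda x: Q[(state, x)]))
        nxt, r = step(state, a)
        best_next = max(Q[(nxt, b)] for b in actions)
        Q[(state, a)] += alpha * (r + gamma * best_next - Q[(state, a)])
        state = nxt

best = max(actions, key=lambda a: Q[("congested", a)])
```

After training, the greedy policy in the congested state prefers switching the phase, which is the action the toy reward favors.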

  6. Lunar soil as shielding against space radiation

    Energy Technology Data Exchange (ETDEWEB)

    Miller, J. [Lawrence Berkeley National Laboratory, MS 83R0101, 1 Cyclotron Road, Berkeley, CA 94720 (United States)], E-mail: miller@lbl.gov; Taylor, L. [Planetary Geosciences Institute, Department of Earth and Planetary Sciences, University of Tennessee, Knoxville, TN 37996 (United States); Zeitlin, C. [Southwest Research Institute, Boulder, CO 80302 (United States); Heilbronn, L. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Guetersloh, S. [Department of Nuclear Engineering, Texas A and M University, College Station, TX 77843 (United States); DiGiuseppe, M. [Northrop Grumman Corporation, Bethpage, NY 11714 (United States); Iwata, Y.; Murakami, T. [National Institute of Radiological Sciences, Chiba 263-8555 (Japan)

    2009-02-15

    We have measured the radiation transport and dose reduction properties of lunar soil with respect to selected heavy ion beams with charges and energies comparable to some components of the galactic cosmic radiation (GCR), using soil samples returned by the Apollo missions and several types of synthetic soil glasses and lunar soil simulants. The suitability for shielding studies of synthetic soil and soil simulants as surrogates for lunar soil was established, and the energy deposition as a function of depth for a particular heavy ion beam passing through a new type of lunar highland simulant was measured. A fragmentation and energy loss model was used to extend the results over a range of heavy ion charges and energies, including protons at solar particle event (SPE) energies. The measurements and model calculations indicate that a modest amount of lunar soil affords substantial protection against primary GCR nuclei and SPE, with only modest residual dose from surviving charged fragments of the heavy beams.

  7. Event-Based Impulsive Control of Continuous-Time Dynamic Systems and Its Application to Synchronization of Memristive Neural Networks.

    Science.gov (United States)

    Zhu, Wei; Wang, Dandan; Liu, Lu; Feng, Gang

    2017-08-18

    This paper investigates exponential stabilization of continuous-time dynamic systems (CDSs) via event-based impulsive control (EIC) approaches, where the impulsive instants are determined by a state-dependent triggering condition. Global exponential stability criteria via EIC are derived for nonlinear and linear CDSs, respectively. It is also shown that there is no Zeno behavior in the concerned closed-loop control system. In addition, the developed event-based impulsive scheme is applied to the synchronization problem of master and slave memristive neural networks. Furthermore, a self-triggered impulsive control scheme is developed to avoid continuous communication between the master and slave systems. Finally, two numerical simulation examples are presented to illustrate the effectiveness of the proposed event-based impulsive controllers.
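    The event-based triggering idea can be illustrated on a scalar system. This is a hedged toy, not the paper's construction: an unstable flow dx/dt = a·x receives an impulse x → b·x only when a state-dependent condition fires:

```python
# Hedged toy (not the paper's construction): an unstable scalar flow
# dx/dt = a*x is stabilized by impulses x -> b*x that are applied only when a
# state-dependent triggering condition fires, not at fixed periodic instants.
a, b = 0.5, 0.4          # growth rate; impulse gain (|b| < 1 shrinks the state)
dt, sigma = 0.001, 1.5   # Euler step; trigger once the state grows by sigma
x = 1.0
x_last = x               # state value at the most recent impulse instant
impulses = 0
for _ in range(20000):   # simulate 20 time units
    x += a * x * dt      # flow between impulses (forward Euler)
    if abs(x) >= sigma * abs(x_last):   # event-based triggering condition
        x *= b           # impulsive control action
        x_last = x
        impulses += 1
```

Each trigger-impulse cycle multiplies the state by roughly sigma·b = 0.6 < 1, so the state decays while the inter-impulse time stays bounded away from zero, i.e. no Zeno behavior in this toy.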

  8. RETRIEVAL EVENTS EVALUATION

    International Nuclear Information System (INIS)

    Wilson, T.

    1999-01-01

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the Design Analysis ''Retrieval Equipment and Strategy'' (Reference 6) from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for the developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the requirements of the Systems Description Document input criteria comparison, as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts; the ''Retrieval Equipment and Strategy'' analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in ''Development Plan for Retrieval Events Evaluation'' (Reference 3), includes evaluation and criteria of the following: impacts to retrieval from the emplacement drift based on DBEs/BDBEs, and changes to the invert configuration, for the preclosure period; and impacts to retrieval from the main drifts based on DBEs/BDBEs for the preclosure period.

  9. New developments in file-based infrastructure for ATLAS event selection

    Energy Technology Data Exchange (ETDEWEB)

    Gemmeren, P van; Malon, D M [Argonne National Laboratory, Argonne, Illinois 60439 (United States); Nowak, M, E-mail: gemmeren@anl.go [Brookhaven National Laboratory, Upton, NY 11973-5000 (United States)

    2010-04-01

    In ATLAS software, TAGs are event metadata records that can be stored in various technologies, including ROOT files and relational databases. TAGs are used to identify and extract events that satisfy certain selection predicates, which can be coded as SQL-style queries. TAG collection files support in-file metadata to store information describing all events in the collection. Event Selector functionality has been augmented to provide such collection-level metadata to subsequent algorithms. The ATLAS I/O framework has been extended to allow computational processing of TAG attributes to select or reject events without reading the event data. This capability enables physicists to use more detailed selection criteria than are feasible in an SQL query. For example, the TAGs contain enough information not only to check the number of electrons, but also to calculate their distance to the closest jet, a calculation that would be difficult to express in SQL. Another new development allows ATLAS to write TAGs directly into event data files. This feature can improve performance by supporting advanced event selection capabilities, including computational processing of TAG information, without the need for external TAG file or database access.
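    The kind of computational TAG selection described above, a cut on the distance to the closest jet, might look like the following sketch. The record layout and field names are invented for illustration, not ATLAS's actual TAG schema:

```python
import math

# Hypothetical sketch of computational TAG-based selection: keep events with
# an electron whose angular distance Delta-R to the closest jet exceeds 0.4,
# a cut that is awkward to express in SQL. Field names are invented.

def delta_r(eta1, phi1, eta2, phi2):
    """Angular distance with phi wrapped into [0, pi]."""
    dphi = abs(phi1 - phi2)
    if dphi > math.pi:
        dphi = 2.0 * math.pi - dphi
    return math.hypot(eta1 - eta2, dphi)

def has_isolated_electron(tag):
    return any(
        min(delta_r(e["eta"], e["phi"], j["eta"], j["phi"])
            for j in tag["jets"]) > 0.4
        for e in tag["electrons"])

events = [
    {"electrons": [{"eta": 0.1, "phi": 0.2}],
     "jets": [{"eta": 0.15, "phi": 0.25}]},     # electron inside a jet
    {"electrons": [{"eta": 1.0, "phi": 2.0}],
     "jets": [{"eta": -1.0, "phi": -1.0}]},     # well-separated electron
]
selected = [i for i, t in enumerate(events) if has_isolated_electron(t)]
```

The point of the in-file TAG development is that such a predicate can run over TAG attributes alone, rejecting events before any event data is read.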

  10. Active Dosimetry on Recent Space Flights

    International Nuclear Information System (INIS)

    Beaujean, R.; Kopp, J.; Reitz, G.

    1999-01-01

    The radiation exposure inside the spacecraft in low earth orbit was investigated with a telescope based on two silicon planar detectors during three NASA shuttle-to-MIR missions (inclination 51.6 deg, altitude about 380 km). Count and dose rate profiles were measured, as well as separate linear energy transfer (LET) spectra, for the galactic cosmic rays (GCR) and the trapped radiation encountered in the South Atlantic Anomaly (SAA). Effective quality factors are deduced from the converted LET spectra (in water) in the range 0.1-120 keV/μm according to ICRP 60. Measured mission-averaged dose rates in silicon are in the range 98-108 μGy/d and 137-178 μGy/d for the GCR and SAA contributions, respectively. The deduced effective quality factors are 2.95-3.29 (GCR) and 1.18-1.25 (SAA), resulting in mission-averaged dose equivalent rates of 631-716 μSv/d for the three comparable missions. (author)

  11. Cardiovascular Events in Cancer Patients Treated with Highly or Moderately Emetogenic Chemotherapy: Results from a Population-Based Study

    International Nuclear Information System (INIS)

    Vo, T. T.; Nelson, J. J.

    2012-01-01

    Studies on cardiovascular safety in cancer patients treated with highly or moderately emetogenic chemotherapy (HEC or MEC), who may have taken the antiemetic, aprepitant, have been limited to clinical trials and postmarketing spontaneous reports. Our study explored background rates of cardiovascular disease (CVD) events among HEC- or MEC-treated cancer patients in a population-based setting to contextualize events seen in a new drug development program and to determine at a high level whether rates differed by aprepitant usage. Medical and pharmacy claims data from the 2005-2007 IMPACT National Benchmark Database were classified into emetogenic chemotherapy categories and CVD outcomes. Among 5827 HEC/MEC-treated patients, frequencies were highest for hypertension (16-21%) and composites of venous (7-12%) and arterial thromboembolic events (4-7%). Aprepitant users generally did not experience higher frequencies of events compared to nonusers. Our study serves as a useful benchmark of background CVD event rates in a population-based setting of cancer patients.

  12. Multiplexing real-time timed events

    NARCIS (Netherlands)

    Holenderski, M.J.; Cools, W.A.; Bril, R.J.; Lukkien, J.J.

    2009-01-01

    This paper presents the design and implementation of RELTEQ, a timed event management algorithm based on relative event times, supporting long event interarrival time, long lifetime of the event queue, no drift and low overhead. It is targeted at embedded operating systems. RELTEQ has been conceived
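    The relative-time idea at the core of RELTEQ can be sketched as follows. This is a simplified illustration of storing each event's time relative to its predecessor (so no absolute timestamps, and hence no wide, drift-prone counters, are kept), not the full published algorithm:

```python
# Simplified illustration of RELTEQ-style relative event times (not the
# published algorithm): each queued event stores its delay relative to its
# predecessor, so long interarrival times never require wide absolute counters.

class RelQueue:
    def __init__(self):
        self.q = []                      # list of [relative_delay, label]

    def insert(self, t, label):
        """Insert an event due t time units from now."""
        i = 0
        while i < len(self.q) and t >= self.q[i][0]:
            t -= self.q[i][0]            # express t relative to event i
            i += 1
        self.q.insert(i, [t, label])
        if i + 1 < len(self.q):
            self.q[i + 1][0] -= t        # re-relativize the old successor

    def pop(self):
        """Remove the earliest event; returns (delay_since_previous, label)."""
        delay, label = self.q.pop(0)
        return delay, label

q = RelQueue()
q.insert(50, "c"); q.insert(10, "a"); q.insert(30, "b")
order = [q.pop() for _ in range(3)]      # relative delays: 10, 20, 20
```

Events at absolute times 10, 30, and 50 are stored as deltas 10, 20, and 20, each small regardless of how far in the future the last event lies.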

  13. Location-based technologies for supporting elderly pedestrian in "getting lost" events.

    Science.gov (United States)

    Pulido Herrera, Edith

    2017-05-01

    Localization-based technologies promise to keep older adults with dementia safe and to support them and their caregivers during getting-lost events. This paper summarizes mainly technological contributions to supporting the target group in these events. Important aspects of the getting-lost phenomenon, such as its definition and the ethical issues involved, are also briefly addressed. Papers were selected from scientific databases and gray literature. Since the topic is still in its infancy, other terms, e.g. wandering, were used to find contributions associated with getting lost. Three trends in applying localization systems were identified: personal locators, perimeter systems, and assistance systems. The first barely considered the older adult's opinion, while assistance systems may involve context awareness to improve support for both the elderly person and the caregiver. Since few studies report multidisciplinary work with a special focus on getting lost, there is no strong evidence of the real efficiency of localization systems, nor are there guidelines for designing systems for the target group. Further research on getting lost is required to obtain insights for developing customizable systems. Moreover, considering the condition of the older adult might increase the impact of developments that combine localization technologies and artificial intelligence techniques. Implications for Rehabilitation: Whilst there is no cure for dementias such as Alzheimer's, it is feasible to take advantage of technological developments to diminish their negative impact. For instance, location-based systems may provide information for early diagnosis of Alzheimer's disease by assessing the navigational impairments of older adults. Assessing the latest supportive technologies and methodologies may provide insights for adopting strategies to properly manage getting-lost events. More user-centered designs will provide appropriate assistance to older adults.
Namely, customizable systems could assist older adults

  14. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of a knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used to calibrate near-real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings, and casualty estimates. Such calibration compensates for factors that affect the reliability of expected damage and loss assessments in "emergency" mode. The knowledge base contains descriptions of the consequences of past earthquakes in the area under study, together with the distribution of the built environment and population at the time of each event. Computer simulation of the events recorded in the knowledge base allows determination of sets of regional calibration coefficients, including ratings of seismological surveys, peculiarities of shaking intensity attenuation, and changes in building stock and population distribution, in order to minimize the error of loss estimates for damaging earthquakes in "emergency" mode. References: 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data bases used in worldwide systems for earthquake loss estimation in emergency mode: Wenchuan earthquake. In: Proc. TIEMS 2010 Conference, Beijing, China, 2010. 3. Frolova, N. I., Larionov, V. I., Bonnin, J., Sushchev, S. P., Ugarov, A. N., Kozlov, M. A.: Loss caused by earthquakes: rapid estimates. Natural Hazards (Journal of the International Society for the Prevention and Mitigation of Natural Hazards), vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  15. MODULATION OF GALACTIC COSMIC RAYS OBSERVED AT L1 IN SOLAR CYCLE 23

    Energy Technology Data Exchange (ETDEWEB)

    Fludra, A., E-mail: Andrzej.Fludra@stfc.ac.uk [RAL Space, STFC Rutherford Appleton Laboratory, Harwell, Didcot OX11 0QX (United Kingdom)

    2015-01-20

    We analyze a unique 15 yr record of galactic cosmic-ray (GCR) measurements made by the SOHO Coronal Diagnostic Spectrometer NIS detectors, recording integrated GCR numbers with energies above 1.0 GeV between 1996 July and 2011 June. We are able to closely reproduce the main features of the SOHO/CDS GCR record using the modulation potential calculated from neutron monitor data by Usoskin et al. The GCR numbers show a clear solar cycle modulation: they decrease by 50% from the 1997 minimum to the 2000 maximum of the solar cycle, then return to the 1997 level in 2007 and continue to rise, in 2009 December reaching a level 25% higher than in 1997. This 25% increase is in contrast with the behavior of Ulysses/KET GCR protons extrapolated to 1 AU in the ecliptic plane, showing the same level in 2008-2009 as in 1997. The GCR numbers are inversely correlated with the tilt angle of the heliospheric current sheet. In particular, the continued increase of SOHO/CDS GCRs from 2007 until 2009 is correlated with the decrease of the minimum tilt angle from 30° in mid-2008 to 5° in late 2009. The GCR level then drops sharply from 2010 January, again consistent with a rapid increase of the tilt angle to over 35°. This shows that the extended 2008 solar minimum was different from the 1997 minimum in terms of the structure of the heliospheric current sheet.

  16. Ultra high molecular weight polyethylene (UHMWPE) fiber epoxy composite hybridized with Gadolinium and Boron nanoparticles for radiation shielding

    Science.gov (United States)

    Mani, Venkat; Prasad, Narasimha S.; Kelkar, Ajit

    2016-09-01

    Deep space radiation poses a major threat to astronauts and their spacecraft during long-duration space exploration missions. The two sources of radiation that are of concern are the galactic cosmic radiation (GCR) and the short-lived secondary neutron radiation generated by the fragmentation that occurs when GCR strikes target nuclei in a spacecraft. Energy loss during the interaction of GCR with the shielding material increases with the charge-to-mass ratio of the shielding material. Hydrogen, with no neutron in its nucleus, has the highest charge-to-mass ratio and is the most effective elemental shield against GCR. Some polymers, because of their high hydrogen content, also serve as radiation shielding materials. Ultra High Molecular Weight Polyethylene (UHMWPE) fibers, apart from possessing radiation shielding properties by virtue of their high hydrogen content, are known for their extraordinary properties. An effective radiation shielding material is one that offers protection from GCR and also impedes the secondary neutron radiation resulting from the fragmentation process. Neutrons produced by fragmentation do not respond to the Coulombic interactions that shield against GCR; to prevent the deleterious effects of secondary neutrons, neutron-capturing targets such as Gadolinium are required. In this paper, radiation shielding studies carried out on sandwich panels fabricated by the vacuum-assisted resin transfer molding (VARTM) process are presented. VARTM is a manufacturing process used for making large composite structures by infusing resin into base materials formed with woven fabric or fiber using vacuum pressure. Using the VARTM process, the hybridization of Epoxy/UHMWPE composites with Gadolinium nanoparticles, Boron, and Boron carbide nanoparticles in the form of sandwich panels was successfully carried out.
The preliminary results from neutron radiation tests show that greater than 99% shielding performance was

  17. Event group importance measures for top event frequency analyses

    International Nuclear Information System (INIS)

    1995-01-01

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.

  18. Event group importance measures for top event frequency analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-31

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.
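    One plausible reading of a group importance measure, sketched under stated assumptions: the risk-reduction importance of a group is taken here as the drop in top event frequency when every failure rate in the group is set to zero. The two-cut-set top event and the rates are invented for illustration:

```python
# Hedged sketch of a group risk-reduction importance: the drop in top event
# frequency when every failure rate in the group is set to zero. The top
# event expression (two cut sets) and the rates below are invented.

def top_freq(rates):
    # Top event occurs via the cut set {pump, valve} or via {operator} alone.
    return rates["pump"] * rates["valve"] + rates["operator"]

rates = {"pump": 1e-3, "valve": 2e-3, "operator": 5e-7}

def group_risk_reduction(group):
    zeroed = {k: (0.0 if k in group else v) for k, v in rates.items()}
    return top_freq(rates) - top_freq(zeroed)

rr_hw = group_risk_reduction({"pump", "valve"})  # hardware group
rr_op = group_risk_reduction({"operator"})       # human-action group
```

Treating the two hardware rates as one group shows that their joint cut set dominates this toy top event, the kind of group-level statement the extended measures are designed to make.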

  19. Nest-crowdcontrol: Advanced video-based crowd monitoring for large public events

    OpenAIRE

    Monari, Eduardo; Fischer, Yvonne; Anneken, Mathias

    2015-01-01

    Current video surveillance systems still lack intelligent video and data analysis modules for supporting the situation awareness of decision makers. Especially in mass gatherings like large public events, the decision maker would benefit from different views of the area, especially from crowd density estimations. This article describes a multi-camera system called NEST and its application to crowd density analysis. First, the overall system design is presented. Based on this, the crowd densit...

  20. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate the cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by consolidating event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  1. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

    The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and the basis for that structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to help classify and categorize the information in order to feed operating experience back to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events; the list was based on the Human Performance Investigation Process (HPIP) that had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  2. Dust events in Beijing, China (2004–2006): comparison of ground-based measurements with columnar integrated observations

    Directory of Open Access Journals (Sweden)

    Z. J. Wu

    2009-09-01

    Full Text Available Ambient particle number size distributions spanning three years were used to characterize the frequency and intensity of atmospheric dust events in the urban areas of Beijing, China, in combination with AERONET sun/sky radiometer data. Dust events were classified into two types based on differences in the particle number and volume size distributions and in local weather conditions. This categorization was confirmed by aerosol index images, columnar aerosol optical properties, and vertical potential temperature profiles. During the type-1 events, dust particles dominated the total particle volume concentration (<10 μm), with a relative share over 70%. Anthropogenic particles in the Aitken and accumulation modes played a subordinate role here because of high wind speeds (>4 m s−1). The type-2 events occurred in rather stagnant air masses and were characterized by a lower volume fraction of coarse mode particles (on average, 55%). Columnar optical properties showed that the superposition of dust and anthropogenic aerosols in type-2 events resulted in a much higher AOD (average: 1.51) than for the rather pure dust aerosols in type-1 events (average AOD: 0.36). A discrepancy was found between the ground-based and column-integrated particle volume size distributions, especially for the coarse mode particles. This discrepancy likely originates from both the limited comparability of particle volume size distributions derived from Sun photometer data and in situ number size distributions, and the inhomogeneous vertical distribution of particles during dust events.

  3. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    Science.gov (United States)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique used to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers and can discover intrinsic patterns in the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoders and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy, and encode the signal regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate that deep neural networks are robust classifiers and can be deployed in real environments to monitor the seismicity of restless volcanoes.
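    The fixed-length feature scheme quoted above (5 LPC coefficients on each of three non-overlapping segments) can be sketched as follows; the least-squares AR fit is a simple stand-in for standard LPC estimation such as Levinson-Durbin:

```python
import numpy as np

# Hedged sketch of the feature scheme: 5 LPC coefficients on each of three
# non-overlapping segments, giving a fixed 15-value vector regardless of the
# event's duration. A least-squares AR fit stands in for standard LPC.

def lpc(x, order=5):
    # Predict x[n] from its `order` previous samples and return the weights.
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def event_features(signal, segments=3, order=5):
    parts = np.array_split(np.asarray(signal, dtype=float), segments)
    return np.concatenate([lpc(p, order) for p in parts])

rng = np.random.default_rng(0)
sig = np.sin(0.2 * np.arange(600)) + 0.01 * rng.standard_normal(600)
feats = event_features(sig)   # 3 segments x 5 coefficients = 15 features
```

Because the segments are fractions of the whole signal rather than fixed windows, long and short events both map to the same 15-dimensional input for the classifier.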

  4. Astrophysical Li-7 as a product of big bang nucleosynthesis and galactic cosmic-ray spallation

    Science.gov (United States)

    Olive, Keith A.; Schramm, David N.

    1992-01-01

    The astrophysical Li-7 abundance is considered to be largely primordial, while the Be and B abundances are thought to be due to galactic cosmic ray (GCR) spallation reactions on top of a much smaller big bang component. But GCR spallation should also produce Li-7. As a consistency check on the combination of big bang nucleosynthesis and GCR spallation, the Be and B data from a sample of hot population II stars is used to subtract from the measured Li-7 abundance an estimate of the amount generated by GCR spallation for each star in the sample, and then to add to this baseline an estimate of the metallicity-dependent augmentation of Li-7 due to spallation. The singly reduced primordial Li-7 abundance is still consistent with big bang nucleosynthesis, and a single GCR spallation model can fit the Be, B, and corrected Li-7 abundances for all the stars in the sample.

  5. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for estimating the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulations were performed at various pulse durations and spot sizes. A correlation between the single event threshold LET and the upset threshold laser energy under local irradiation was found, and a computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation. Two estimation techniques are suggested. The first is based on determining the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area; the second uses the photocurrent peak value instead of this ratio. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)

  6. Event-Based Color Segmentation With a High Dynamic Range Sensor

    Directory of Open Access Journals (Sweden)

    Alexandre Marcireau

    2018-04-01

    Full Text Available This paper introduces a color asynchronous neuromorphic event-based camera and a methodology to process color output from the device to perform color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond. Our color vision sensor prototype is a combination of three Asynchronous Time-based Image Sensors, sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, thus showing how low level processing benefits from asynchronous acquisition and high temporal resolution data. The resulting color segmentation and tracking performance is assessed both with an indoor controlled scene and two outdoor uncontrolled scenes. The tracking's mean error to the ground truth for the objects of the outdoor scenes ranges from two to twenty pixels.

  7. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t0. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, provided the photometric errors are small enough. We calculate the expected fraction of ML events where the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and the lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)

  8. PROFESSIONAL AND NONPROFESSIONAL EVENT MANAGERS: AGENTS’ CHARACTERISTICS OF EVENT-ACTIVITIES FIELD

    Directory of Open Access Journals (Sweden)

    Natalia Nikolaevna Startseva

    2015-03-01

    Full Text Available The main research question focuses on the study of the «new», non-traditional, recently formed professional group of event managers in modern Russia. This article defines the boundaries of the event-manager group and its numerical and structural composition. Drawing on existing stereotypes about the event community and on the perceptions of event managers, heads of event agencies, and customers (that is, all agents involved in event activities), the author constructs the images of the professional and the nonprofessional event manager. The scientific, theoretical, and practical significance of the work lies in characterizing the agents of the event-activities field and outlining the prospects for the formation of event managers as a professional group in modern Russia. The conceptual soundness and validity of the study rest on comparative, functional, and activity-based theoretical and methodological approaches. The results can be used for further research on event managers and other professional groups, as well as in courses such as «Sociology of Professions and Professional Groups» and «Sociology of Culture and Spiritual Life».

  9. A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events

    Science.gov (United States)

    Laurenza, M.; Alberti, T.; Cliver, E. W.

    2018-04-01

    The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.
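
    The POD and FAR figures quoted above follow from a standard 2x2 forecast contingency table; a quick sketch reproducing the ≥S2 numbers (41 hits out of 55 events, and 13 false alarms among 54 warnings, as given in the abstract):

    ```python
    def verification_scores(hits, misses, false_alarms):
        """Categorical forecast verification measures:
        POD = hits / (hits + misses)
        FAR = false alarms / (hits + false alarms)."""
        pod = hits / (hits + misses)
        far = false_alarms / (hits + false_alarms)
        return pod, far

    # The >=S2 figures quoted above: 41 of 55 events detected (14 missed),
    # with 13 false alarms among the 54 warnings issued.
    pod, far = verification_scores(hits=41, misses=14, false_alarms=13)
    # pod = 41/55 (about 0.75), far = 13/54 (about 0.24)
    ```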

  10. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
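
    The factual-versus-counterfactual comparison at the heart of event attribution is usually summarized as a probability (risk) ratio; a minimal sketch with invented probabilities, not values from the study:

    ```python
    def probability_ratio(p_factual, p_counterfactual):
        """Probability (risk) ratio used in event attribution: PR > 1 means the
        event is more likely in the factual world with human-induced climate
        change; PR < 1 means anthropogenic forcing made it less likely."""
        return p_factual / p_counterfactual

    # Illustrative numbers only: a drought with a 10% chance per summer in the
    # factual world and a 5% chance in the counterfactual world gives PR = 2.
    pr = probability_ratio(0.10, 0.05)
    ```

    The disagreement among GCMs described above corresponds to some models giving PR > 1, others PR near or below 1, depending also on how the counterfactual world is constructed.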

  11. Allowing Brief Delays in Responding Improves Event-Based Prospective Memory for Young Adults Living with HIV Disease

    OpenAIRE

    Loft, Shayne; Doyle, Katie L.; Naar-King, Sylvie; Outlaw, Angulique Y.; Nichols, Sharon L.; Weber, Erica; Blackstone, Kaitlin; Woods, Steven Paul

    2014-01-01

    Event-based prospective memory (PM) tasks require individuals to remember to perform an action when they encounter a specific cue in the environment, and have clear relevance for daily functioning for individuals with HIV. In many everyday tasks, the individual must not only maintain the intent to perform the PM task, but the PM task response also competes with the alternative and more habitual task response. The current study examined whether event-based PM can be improved by slowing down th...

  12. Lyapunov design of event-based controllers for the rendez-vous of coupled systems

    NARCIS (Netherlands)

    De Persis, Claudio; Postoyan, Romain

    2014-01-01

    The objective is to present a new type of triggering conditions together with new proof concepts for the event-based coordination of multi-agents. As a first step, we focus on the rendez-vous of two identical systems modeled as double integrators with additional damping in the velocity dynamics. The

  13. Generalization of the event-based Carnevale-Hines integration scheme for integrate-and-fire models

    NARCIS (Netherlands)

    van Elburg, R.A.J.; van Ooyen, A.

    2009-01-01

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  14. Generalization of the Event-Based Carnevale-Hines Integration Scheme for Integrate-and-Fire Models

    NARCIS (Netherlands)

    van Elburg, Ronald A. J.; van Ooyen, Arjen

    An event-based integration scheme for an integrate-and-fire neuron model with exponentially decaying excitatory synaptic currents and double exponential inhibitory synaptic currents has been introduced by Carnevale and Hines. However, the integration scheme imposes nonphysiological constraints on

  15. Power System Event Ranking Using a New Linear Parameter-Varying Modeling with a Wide Area Measurement System-Based Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Abolhasani Jabali

    2017-07-01

    Full Text Available Detecting critical power system events for Dynamic Security Assessment (DSA) is required for reliability improvement. The approach proposed in this paper investigates the effects of events on dynamic behavior during the nonlinear system response, whereas common approaches use steady-state conditions after events. This paper presents new and enhanced indices for event ranking based on time-domain simulation and polytopic linear parameter-varying (LPV) modeling of a power system. In the proposed approach, a polytopic LPV representation is generated via linearization about selected points of the nonlinear dynamic behavior of the power system using wide-area measurement system (WAMS) concepts, and event ranking is then based on the frequency response of the system models at the vertices. The nonlinear behavior of the system at the time of fault occurrence is therefore considered for event ranking. The proposed algorithm is applied to a power system using nonlinear simulation. The comparison of the results, especially under different fault conditions, shows the advantages of the proposed approach and indices.

  16. Arachne-A web-based event viewer for MINER{nu}A

    Energy Technology Data Exchange (ETDEWEB)

    Tagg, N., E-mail: ntagg@otterbein.edu [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Brangham, J. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Chvojka, J. [Rochester, NY 14610 (United States); Clairemont, M. [Department of Physics, Otterbein University, 1 South Grove Street, Westerville, OH 43081 (United States); Day, M. [Rochester, NY 14610 (United States); Eberly, B. [Department of Physics and Astronomy, University of Pittsburgh, Pittsburgh, PA 15260 (United States); Felix, J. [Lascurain de Retana No. 5, Col. Centro. Guanajuato, Guanajuato 36000 (Mexico); Fields, L. [Northwestern University, Evanston, IL 60208 (United States); Gago, A.M. [Seccion Fisica, Departamento de Ciencias, Pontificia Universidad Catolica del Peru, Apartado 1761, Lima (Peru); Gran, R. [Department of Physics, University of Minnesota - Duluth, Duluth, MN 55812 (United States); Harris, D.A. [Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kordosky, M. [Department of Physics, College of William and Mary, Williamsburg, VA 23187 (United States); Lee, H. [Rochester, NY 14610 (United States); Maggi, G. [Departamento de Fisica, Universidad Tecnica Federico Santa Maria, Avda. Espana 1680 Casilla 110-V Valparaiso (Chile); Maher, E. [Massachusetts College of Liberal Arts, 375 Church Street, North Adams, MA 01247 (United States); Mann, W.A. [Physics Department, Tufts University, Medford, MA 02155 (United States); Marshall, C.M.; McFarland, K.S.; McGowan, A.M.; Mislivec, A. [Rochester, NY 14610 (United States); and others

    2012-06-01

    Neutrino interaction events in the MINER{nu}A detector are visually represented with a web-based tool called Arachne. Data are retrieved from a central server via AJAX, and client-side JavaScript draws images into the user's browser window using the draft HTML 5 standard. These technologies allow neutrino interactions to be viewed by anyone with a web browser, allowing for easy hand-scanning of particle interactions. Arachne has been used in MINER{nu}A to evaluate neutrino data in a prototype detector, to tune reconstruction algorithms, and for public outreach and education.

  17. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS).

    Science.gov (United States)

    Rigi, Amin; Baghaei Naeini, Fariborz; Makris, Dimitrios; Zweiri, Yahya

    2018-01-24

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The approach is validated using a high-speed camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that affected the results significantly. Nevertheless, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential for the sensor to be used in manipulation applications, especially in dynamic environments.

  18. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  19. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    Science.gov (United States)

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
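
    The time-based mechanics described above (stochastic accrual, follow-up delay, probability-based DLT assignment, cohort expansion) can be sketched as a tiny discrete event simulation, in Python rather than SAS; the 3+3 rules are simplified and the parameter names are illustrative, not the paper's macro interface:

    ```python
    import random

    def simulate_3plus3_cohort(p_dlt, accrual_rate, eval_weeks, rng):
        """Minimal discrete-event sketch of one 3+3 dose level: patients accrue
        as a Poisson process (accrual_rate per week), each needs eval_weeks of
        follow-up before DLT status is known, and the cohort expands from 3 to
        6 patients if exactly one DLT is seen. Returns (patients, dlts,
        weeks_until_all_evaluable)."""
        def enroll(n, t_start):
            t, eval_times, dlts = t_start, [], 0
            for _ in range(n):
                t += rng.expovariate(accrual_rate)   # sampled inter-arrival time
                eval_times.append(t + eval_weeks)    # follow-up completes here
                dlts += rng.random() < p_dlt         # probability-based DLT
            return max(eval_times), dlts             # time all are evaluable
        t_done, dlts = enroll(3, 0.0)
        patients = 3
        if dlts == 1:                                # expand cohort: the "+3"
            t_done, extra = enroll(3, t_done)
            patients, dlts = 6, dlts + extra
        return patients, dlts, t_done

    rng = random.Random(7)
    patients, dlts, weeks = simulate_3plus3_cohort(
        p_dlt=0.2, accrual_rate=1.0, eval_weeks=4.0, rng=rng)
    ```

    Comparing designs then amounts to running many replicates of such cohorts under 3+3 versus "rolling 6" enrollment rules and summarizing the study-duration metrics.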

  20. Event Completion: Event Based Inferences Distort Memory in a Matter of Seconds

    Science.gov (United States)

    Strickland, Brent; Keil, Frank

    2011-01-01

    We present novel evidence that implicit causal inferences distort memory for events only seconds after viewing. Adults watched videos of someone launching (or throwing) an object. However, the videos omitted the moment of contact (or release). Subjects falsely reported seeing the moment of contact when it was implied by subsequent footage but did…

  1. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Boydell, Katherine M.

    2011-01-01

    A symbolic interaction (BLUMER, 1969; MEAD, 1934; PRUS, 1996; PRUS & GRILLS, 2003) approach was taken to study the collective event (PRUS, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed us to study the process of the creation of the dance and its capacity to convert abstract research into concrete form a...

  2. Making Sense of Collective Events: The Co-creation of a Research-based Dance

    OpenAIRE

    Katherine M. Boydell

    2011-01-01

    A symbolic interaction (Blumer, 1969; Mead, 1934; Prus, 1996; Prus & Grills, 2003) approach was taken to study the collective event (Prus, 1997) of creating a research-based dance on pathways to care in first episode psychosis. Viewing the co-creation of a research-based dance as collective activity attends to the processual aspects of an individual's experiences. It allowed the authors to study the process of the creation of the dance and its capacity to convert abstract research into concre...

  3. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremor around subduction zones has become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low-frequency seismic waves overlapping each other, a finding that provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. They are traditionally studied with a direct shear or double direct shear experimental setup, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented: a cylindrical sample subjected to triaxial stress. As in the first model, a weak gouge layer at 45 degrees is added to the sample, on which shear slipping is allowed. Several simulations are conducted on this sample: while the confining stress is maintained at the same level, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are identified solely from the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by
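
    PFC itself is a commercial DEM package, but the loading-rate-controlled stick slip behaviour described above has a much simpler classroom analogue, the 1-D spring-slider; the sketch below is that analogue, not the authors' model:

    ```python
    def spring_slider(load_rate, k=1.0, mu_s=1.0, mu_d=0.6, dt=0.01, steps=5000):
        """1-D spring-slider analogue of stick slip: a driving plate advances at
        load_rate, stretching a spring of stiffness k attached to a block; the
        block sticks until the spring force exceeds static friction mu_s, then
        slips until the force relaxes to the dynamic level mu_d. Returns the
        force history; each sudden drop in the series is a slip event."""
        x_driver = x_block = 0.0
        history = []
        for _ in range(steps):
            x_driver += load_rate * dt            # steady external loading
            force = k * (x_driver - x_block)
            if force > mu_s:                      # static friction exceeded
                x_block += (force - mu_d) / k     # instantaneous slip
                force = mu_d
            history.append(force)
        return history

    h = spring_slider(load_rate=0.1)
    slips = sum(1 for a, b in zip(h, h[1:]) if b < a)   # count stress drops
    ```

    Varying `load_rate` (the analogue of the displacement rate above) changes the recurrence interval of the stress drops, which is the qualitative behaviour the triaxial simulations probe.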

  4. Event-by-event particle multiplicity fluctuations in Pb-Pb collisions with ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Arslandok, Mesut [Institut fuer Kernphysik, Goethe-Universitaet Frankfurt (Germany); Collaboration: ALICE-Collaboration

    2014-07-01

    The study of event-by-event fluctuations of identified hadrons may reveal the degrees of freedom of the strongly interacting matter created in heavy-ion collisions. Particle identification based on the measurement of the specific ionization energy loss dE/dx works well on a statistical basis but suffers from ambiguities when applied on the event-by-event level. A novel experimental technique called the "Identity Method" was recently proposed to overcome such limitations. The method follows a probabilistic approach using the inclusive dE/dx distributions measured in the ALICE TPC, and determines the moments of the multiplicity distributions by an unfolding procedure. In this contribution, the status of an event-by-event fluctuation analysis that applies the Identity Method to Pb-Pb data from ALICE is presented.

  5. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective.

    Science.gov (United States)

    Ji, Lingling; Peng, Huamao; Xue, Xiaotong

    2017-01-01

    This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults' daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults. (2) Considering the number of events, there was an age-related positivity effect only for knowledge-related goals, and (3) older adults' ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals.

  6. Age Differences in the Experience of Daily Life Events: A Study Based on the Social Goals Perspective

    Directory of Open Access Journals (Sweden)

    Lingling Ji

    2017-09-01

    Full Text Available This study examined age differences in daily life events related to different types of social goals based on the socioemotional selectivity theory (SST), and determined whether the positivity effect existed in the context of social goals in older adults’ daily lives. Over a course of 14 days, 49 older adults and 36 younger adults wrote about up to three life events daily and rated the valence of each event. The findings indicated that (1) although both older and younger adults recorded events related to both emotional and knowledge-acquisition goals, the odds ratio for reporting a higher number of events related to emotional goals compared to the number of events related to knowledge-acquisition goals was 2.12 times higher in older adults than that observed in younger adults. (2) Considering the number of events, there was an age-related positivity effect only for knowledge-related goals, and (3) older adults’ ratings for events related to emotional and knowledge-acquisition goals were significantly more positive compared to those observed in younger adults. These findings supported the SST, and to some extent, the positivity effect was demonstrated in the context of social goals.

  7. An energy estimation framework for event-based methods in Non-Intrusive Load Monitoring

    International Nuclear Information System (INIS)

    Giri, Suman; Bergés, Mario

    2015-01-01

    Highlights: • Energy estimation in NILM has not yet accounted for the complexity of appliance models. • We present a data-driven framework for appliance modeling in supervised NILM. • We test the framework on 3 houses and report average accuracies of 5.9–22.4%. • Appliance models facilitate the estimation of energy consumed by the appliance. - Abstract: Non-Intrusive Load Monitoring (NILM) is a set of techniques used to estimate the electricity consumed by individual appliances in a building from measurements of the total electrical consumption. Most commonly, NILM works by first attributing any significant change in the total power consumption (also known as an event) to a specific load and subsequently using these attributions (i.e., the labels for the events) to estimate energy for each load. For this last step, most published work in the field makes simplifying assumptions to make the problem more tractable. In this paper, we present a framework for creating appliance models based on classification labels and aggregate power measurements that can help to relax many of these assumptions. Our framework automatically builds models for appliances to perform energy estimation. The model relies on feature extraction, clustering via affinity propagation, perturbation of extracted states to ensure that they mimic appliance behavior, creation of finite state models, correction of any errors in classification that might violate the model, and estimation of energy based on the corrected labels. We evaluate our framework on 3 houses from standard datasets in the field and show that the framework can learn data-driven models based on event labels and use them to estimate energy with lower error margins (e.g., 1.1–42.3%) than when using the heuristic models used by others.
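
    The final step, estimating energy from corrected event labels, can be sketched in a few lines; the example below is an illustrative stand-in for the paper's learned finite-state appliance models, with made-up event data:

    ```python
    from collections import defaultdict

    def energy_from_events(events, t_end):
        """Illustrative event-based energy estimation for NILM: each labelled
        event is (time_s, appliance, power_delta_W); between events an
        appliance's power draw is assumed constant, so its energy is the sum
        of power times duration. Returns watt-hours per appliance."""
        power = defaultdict(float)          # current power draw per appliance
        energy_ws = defaultdict(float)      # accumulated energy, watt-seconds
        t_prev = 0.0
        for t, name, delta in sorted(events):
            for app, p in power.items():    # integrate up to this event
                energy_ws[app] += p * (t - t_prev)
            power[name] += delta            # apply the labelled power change
            t_prev = t
        for app, p in power.items():        # integrate the final interval
            energy_ws[app] += p * (t_end - t_prev)
        return {app: e / 3600.0 for app, e in energy_ws.items()}

    # Fridge draws 100 W for the first 1800 s, then again for the last 1800 s.
    wh = energy_from_events(
        [(0.0, "fridge", 100.0), (1800.0, "fridge", -100.0),
         (5400.0, "fridge", 100.0)], t_end=7200.0)
    # 100 W for 3600 s total = 100.0 Wh
    ```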

  8. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE

    Directory of Open Access Journals (Sweden)

    J. Ajayakumar

    2017-10-01

    Full Text Available With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data could be highly skewed by variations in population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the “North American storm complex” in December 2015, the “snowstorm Jonas” in January 2016, the “West Virginia floods” in June 2016, and the “Hurricane Matthew” in October 2016. Analysis is conducted on geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate them. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with

  9. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Science.gov (United States)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data could be highly skewed by variations in population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis on four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and the "Hurricane Matthew" in October 2016. Analysis is conducted on geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, while developing software solutions to support analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful
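
    The population-based normalization both records advocate can be sketched as a simple rate conversion; the county names and counts below are invented for illustration:

    ```python
    def rate_per_10k(counts, population):
        """Population-based normalization: convert raw geo-tagged tweet counts
        into tweets per 10,000 residents so that densely populated counties do
        not dominate the spatio-temporal response map."""
        return {c: 10_000 * counts[c] / population[c] for c in counts}

    rates = rate_per_10k({"urban": 5000, "rural": 50},
                         {"urban": 1_000_000, "rural": 8_000})
    # urban: 50.0 tweets per 10k residents; rural: 62.5, so the rural
    # response is relatively stronger despite far fewer raw tweets.
    ```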

  10. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  11. Search for gamma-ray events in the BATSE data base

    Science.gov (United States)

    Lewin, Walter

    1994-01-01

    We find large location errors and error radii in the locations of channel 1 Cygnus X-1 events. These errors and their associated uncertainties are a result of low signal-to-noise ratios (a few sigma) in the two brightest detectors for each event. The untriggered events suffer from similarly low signal-to-noise ratios, and their location errors are expected to be at least as large as those found for Cygnus X-1 with a given signal-to-noise ratio. The statistical error radii are consistent with those found for Cygnus X-1 and with the published estimates. We therefore expect approximately 20 - 30 deg location errors for the untriggered events. Hence, many of the untriggered events occurring within a few months of the triggered activity from SGR 1900 plus 14 are indeed consistent with the SGR source location, although Cygnus X-1 is also a good candidate.

  12. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    Science.gov (United States)

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
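
    The beacon-data-age token rule described above can be sketched in a couple of lines; the vehicle ids and timestamps are illustrative only:

    ```python
    def next_token_holder(last_beacon_time, now):
        """Token circulation by beacon data age rather than round-robin: hand
        the token to the platoon member whose status information is stalest,
        i.e. whose time since the last collected beacon is largest. The input
        maps vehicle id -> timestamp of its last collected beacon."""
        return max(last_beacon_time, key=lambda vid: now - last_beacon_time[vid])

    # Vehicle "c" beaconed longest ago (age 0.9 s), so it transmits next.
    holder = next_token_holder({"a": 9.8, "b": 9.5, "c": 9.1}, now=10.0)
    ```

    Because the stalest member always transmits next, a vehicle whose beacon was lost is automatically scheduled again, which is how the scheme obtains its repeated transmission opportunities.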

  13. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies

    Directory of Open Access Journals (Sweden)

    Ali Balador

    2018-03-01

    Full Text Available Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.

  14. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Full Text Available Social media is valuable in propagating information during disasters for its timely and available characteristics nowadays, and assists in making decisions when tagged with locations. Considering the ambiguity and inaccuracy in some social data, additional authoritative data are needed for important verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis after the fact. Moreover, current works organize the data from the perspective of the spatial location, but not from the perspective of the disaster, making it difficult to dynamically analyze the disaster. All of the disaster-related data around the affected locations need to be retrieved. To solve these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeded as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and attributes were extracted from the web using a natural language process (NLP) and used in the semantic similarity match of the geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed here within the GEGIS and shows that the system would be effective when typhoons occur.
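    The semantic similarity match in step (2) can be illustrated with a simple set-overlap score; the GEGIS paper's actual ontology-based measure is richer, and all terms and resource names below are invented for the example.

```python
def jaccard(a, b):
    """Set-overlap similarity between two term lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_resources(event_terms, resources):
    """Order geospatial resources by term overlap with the geo-event terms."""
    return sorted(resources, key=lambda r: jaccard(event_terms, resources[r]),
                  reverse=True)

# invented event terms (as if extracted by NLP) and resource tag sets
event = ["typhoon", "wind", "rainfall", "coast"]
res = {
    "rain_gauge_net": ["rainfall", "gauge"],
    "wind_stations": ["wind", "coast", "typhoon"],
    "soil_map": ["soil", "geology"],
}
print(rank_resources(event, res))  # wind_stations first, soil_map last
```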

  15. Event-based computer simulation model of aspect-type experiments strictly satisfying Einstein's locality conditions

    NARCIS (Netherlands)

    De Raedt, Hans; De Raedt, Koen; Michielsen, Kristel; Keimpema, Koenraad; Miyashita, Seiji

    2007-01-01

    Inspired by Einstein-Podolsky-Rosen-Bohm experiments with photons, we construct an event-based simulation model in which every essential element in the ideal experiment has a counterpart. The model satisfies Einstein's criterion of local causality and does not rely on concepts of quantum and

  16. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Full Text Available Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes for two or three consecutive daytime nucleation events, followed by subsequent particle growth during the same day. We analysed 108 and 31 such days observed in polluted industrial and moderately polluted rural environments, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  17. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications.

    Science.gov (United States)

    Costa, Daniel G; Duran-Faundez, Cristian; Andrade, Daniel C; Rocha-Junior, João B; Peixoto, João Paulo Just

    2018-04-03

    Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.
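    One way to read the proposed pipeline: classify each post against known event types, then give the source nodes in the affected area the priority of the most critical detected event. A minimal sketch with an invented keyword-to-priority table (the article's classifier is more sophisticated than keyword matching):

```python
# invented keyword-to-priority table; a real system would use trained classifiers
EVENT_PRIORITY = {"fire": 3, "flood": 3, "accident": 2, "concert": 1}

def classify(tweet):
    """Return the highest-priority event keyword mentioned in a post, if any."""
    words = set(tweet.lower().split())
    hits = [k for k in EVENT_PRIORITY if k in words]
    return max(hits, key=EVENT_PRIORITY.get) if hits else None

def sensing_priority(tweets):
    """Sensing priority assigned to source nodes in the monitored area."""
    events = (classify(t) for t in tweets)
    return max((EVENT_PRIORITY[e] for e in events if e), default=0)

print(sensing_priority(["huge fire downtown", "traffic accident on 5th"]))  # 3
```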

  18. TwitterSensing: An Event-Based Approach for Wireless Sensor Networks Optimization Exploiting Social Media in Smart City Applications

    Directory of Open Access Journals (Sweden)

    Daniel G. Costa

    2018-04-01

    Full Text Available Modern cities are subject to periodic or unexpected critical events, which may bring economic losses or even put people in danger. When some monitoring systems based on wireless sensor networks are deployed, sensing and transmission configurations of sensor nodes may be adjusted exploiting the relevance of the considered events, but efficient detection and classification of events of interest may be hard to achieve. In Smart City environments, several people spontaneously post information in social media about some event that is being observed and such information may be mined and processed for detection and classification of critical events. This article proposes an integrated approach to detect and classify events of interest posted in social media, notably in Twitter, and the assignment of sensing priorities to source nodes. By doing so, wireless sensor networks deployed in Smart City scenarios can be optimized for higher efficiency when monitoring areas under the influence of the detected events.

  19. Event-by-event simulation of quantum cryptography protocols

    NARCIS (Netherlands)

    Zhao, S.; Raedt, H. De

    We present a new approach to simulate quantum cryptography protocols using event-based processes. The method is validated by simulating the BB84 protocol and the Ekert protocol, both without and with the presence of an eavesdropper.
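    As a rough illustration of what "event-based" means here, the sketch below simulates BB84 one detection event at a time: each event carries one bit and one basis choice per party, and only basis-matched events survive sifting. This is a toy (ideal channel, no eavesdropper, invented function names), not the authors' simulation model.

```python
import random

def bb84_sift(n, seed=0):
    """Event-by-event BB84 without an eavesdropper: one photon per event;
    bits are kept only when Alice's and Bob's bases happen to agree."""
    rng = random.Random(seed)
    key_a, key_b = [], []
    for _ in range(n):
        bit = rng.randint(0, 1)        # Alice's random bit
        basis_a = rng.randint(0, 1)    # Alice's preparation basis
        basis_b = rng.randint(0, 1)    # Bob's measurement basis
        if basis_a == basis_b:         # announced publicly during sifting
            key_a.append(bit)
            key_b.append(bit)          # ideal channel: Bob reads the bit exactly
    return key_a, key_b

ka, kb = bb84_sift(1000)
print(len(ka), ka == kb)  # roughly half the events survive sifting; keys agree
```

    Detecting an eavesdropper would amount to intercepting events with a randomly chosen basis before Bob, which introduces disagreements between the sifted keys.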

  20. Development of knowledge-based operator support system for steam generator water leak events in FBR plants

    International Nuclear Information System (INIS)

    Arikawa, Hiroshi; Ida, Toshio; Matsumoto, Hiroyuki; Kishida, Masako

    1991-01-01

    A knowledge engineering approach to operator support systems would be useful in maintaining safe and steady operation of nuclear plants. This paper describes a knowledge-based operation support system which assists the operators during steam generator water leak events in FBR plants. We have developed a real-time expert system. The expert system adopts hierarchical knowledge representation corresponding to the 'plant abnormality model'. A technique of signal validation which uses knowledge of symptom propagation is applied to diagnosis. In order to verify the knowledge base concerning steam generator water leak events in FBR plants, a simulator is linked to the expert system. It is revealed that diagnosis based on the 'plant abnormality model' and signal validation using knowledge of symptom propagation could work successfully. Also, it is suggested that the expert system could be useful in supporting FBR plant operations. (author)
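    A knowledge-based diagnosis of this kind can be caricatured as rule matching over observed symptoms. The rules, symptom names, and scoring below are invented and far simpler than the hierarchical 'plant abnormality model'; the sketch only shows the shape of the idea.

```python
# Toy forward-matching diagnosis over a symptom-propagation knowledge base.
# Rule form: frozenset of required symptoms -> diagnosis (all names invented).
RULES = {
    frozenset({"cover_gas_pressure_up", "hydrogen_in_sodium"}): "SG water leak",
    frozenset({"hydrogen_in_sodium"}): "possible micro-leak",
}

def diagnose(symptoms):
    """Return the diagnosis whose rule matches the most observed symptoms."""
    matches = [(len(cond), dx) for cond, dx in RULES.items() if cond <= symptoms]
    return max(matches)[1] if matches else "no hypothesis"

print(diagnose({"hydrogen_in_sodium", "cover_gas_pressure_up"}))  # SG water leak
```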

  1. OGLE-2016-BLG-0168 Binary Microlensing Event: Prediction and Confirmation of the Microlens Parallax Effect from Space-based Observations

    Energy Technology Data Exchange (ETDEWEB)

    Shin, I.-G.; Yee, J. C.; Jung, Y. K. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Udalski, A.; Skowron, J.; Mróz, P.; Soszyński, I.; Poleski, R.; Szymański, M. K.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M. [Warsaw University Observatory, Al. Ujazdowskie 4,00-478 Warszawa (Poland); Novati, S. Calchi [IPAC, Mail Code 100-22, California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Han, C. [Department of Physics, Chungbuk National University, Cheongju 371-763 (Korea, Republic of); Albrow, M. D. [University of Canterbury, Department of Physics and Astronomy, Private Bag 4800, Christchurch 8020 (New Zealand); Gould, A. [Department of Astronomy, Ohio State University, 140 W. 18th Avenue, Columbus, OH 43210 (United States); Chung, S.-J.; Hwang, K.-H.; Ryu, Y.-H. [Korea Astronomy and Space Science Institute, 776 Daedeokdae-ro, Yuseong-Gu, Daejeon 34055 (Korea, Republic of); Collaboration: OGLE Collaboration; KMTNet Group; Spitzer Team; and others

    2017-11-01

    The microlens parallax is a crucial observable for conclusively identifying the nature of lens systems in microlensing events containing or composed of faint (even dark) astronomical objects such as planets, neutron stars, brown dwarfs, and black holes. With the commencement of a new era of microlensing in collaboration with space-based observations, the microlens parallax can be routinely measured. In addition, space-based observations can provide opportunities to verify the microlens parallax measured from ground-only observations and to find a unique solution to the lensing light-curve analysis. Furthermore, since most space-based observations cannot cover the full light curves of lensing events, it is also necessary to verify the reliability of the information extracted from fragmentary space-based light curves. We conduct a test based on the microlensing event OGLE-2016-BLG-0168, created by a binary lens system consisting of almost equal mass M-dwarf stars, to demonstrate that it is possible to verify the microlens parallax and to resolve degeneracies using the space-based light curve even though the observations are fragmentary. Since space-based observatories will frequently produce fragmentary light curves due to their short observing windows, the methodology of this test will be useful for next-generation microlensing experiments that combine space-based and ground-based collaboration.

  2. Superbugs in the supermarket? Assessing the rate of contamination with third-generation cephalosporin-resistant gram-negative bacteria in fresh Australian pork and chicken.

    Science.gov (United States)

    McLellan, Jade E; Pitcher, Joshua I; Ballard, Susan A; Grabsch, Elizabeth A; Bell, Jan M; Barton, Mary; Grayson, M Lindsay

    2018-01-01

    Antibiotic misuse in food-producing animals is potentially associated with human acquisition of multidrug-resistant (MDR; resistance to ≥ 3 drug classes) bacteria via the food chain. We aimed to determine if MDR Gram-negative (GNB) organisms are present in fresh Australian chicken and pork products. We sampled raw chicken drumsticks (CD) and pork ribs (PR) from 30 local supermarkets/butchers across Melbourne on two occasions. Specimens were sub-cultured onto selective media for third-generation cephalosporin-resistant (3GCR) GNBs, with species identification and antibiotic susceptibility determined for all unique colonies. Isolates were assessed by PCR for SHV, TEM, CTX-M, AmpC and carbapenemase genes (encoding IMP, VIM, KPC, OXA-48, NDM). From 120 specimens (60 CD, 60 PR), 112 (93%) grew a 3GCR-GNB (n = 164 isolates; 86 CD, 78 PR); common species were Acinetobacter baumannii (37%), Pseudomonas aeruginosa (13%) and Serratia fonticola (12%), but only one E. coli isolate. Fifty-nine (36%) had evidence of 3GCR alone, 93/163 (57%) displayed 3GCR plus resistance to one additional antibiotic class, and 9/163 (6%) were 3GCR plus resistance to two additional classes. Of 158 DNA specimens, all were negative for ESBL/carbapenemase genes, except 23 (15%) which were positive for AmpC, with 22/23 considered to be inherently chromosomal, but the sole E. coli isolate contained a plasmid-mediated CMY-2 AmpC. We found low rates of MDR-GNBs in Australian chicken and pork meat, but potential 3GCR-GNBs are common (93% of specimens). Testing programs that only assess for E. coli are likely to severely underestimate the diversity of 3GCR organisms in fresh meat.

  3. LCP method for a planar passive dynamic walker based on an event-driven scheme

    Science.gov (United States)

    Zheng, Xu-Dong; Wang, Qi

    2018-06-01

    The main purpose of this paper is to present a linear complementarity problem (LCP) method for a planar passive dynamic walker with round feet based on an event-driven scheme. The passive dynamic walker is treated as a planar multi-rigid-body system. The dynamic equations of the passive dynamic walker are obtained by using Lagrange's equations of the second kind. The normal forces and frictional forces acting on the feet of the passive walker are described based on a modified Hertz contact model and Coulomb's law of dry friction. The state transition problem of stick-slip between feet and floor is formulated as an LCP, which is solved with an event-driven scheme. Finally, to validate the methodology, four gaits of the walker are simulated: the stance leg neither slips nor bounces; the stance leg slips without bouncing; the stance leg bounces without slipping; the walker stands after walking several steps.
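    The LCP at the heart of this formulation (find z, w ≥ 0 with w = Mz + q and complementarity w·z = 0) can be illustrated with a brute-force solver: enumerate which components of z are allowed to be positive, solve the resulting linear system, and accept the first sign-feasible candidate. This enumeration is only practical for very small systems and is not the paper's event-driven scheme; the example matrix and vector are invented.

```python
import itertools
import numpy as np

def solve_lcp(M, q):
    """Brute-force LCP: find z, w >= 0 with w = M z + q and w.z = 0 (small n only)."""
    n = len(q)
    for active in itertools.product([False, True], repeat=n):
        idx = [i for i in range(n) if active[i]]   # components with z_i > 0, w_i = 0
        z = np.zeros(n)
        try:
            if idx:
                z[idx] = np.linalg.solve(M[np.ix_(idx, idx)], -q[idx])
        except np.linalg.LinAlgError:
            continue                                # singular sub-matrix: skip
        w = M @ z + q
        if (z >= -1e-9).all() and (w >= -1e-9).all():
            return z, w                             # complementarity holds by construction
    return None

M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, 1.0])
z, w = solve_lcp(M, q)
print(z, w)
```

    In the stick-slip setting, the active set plays the role of which contacts are slipping versus sticking; the event-driven scheme in the paper switches sets only at detected transition events instead of enumerating them.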

  4. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  5. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  6. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    J. King

    2004-01-01

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report

  7. Gas reactor international cooperative program. Interim report: assessment of gas-cooled reactor economics

    Energy Technology Data Exchange (ETDEWEB)

    1979-08-01

    A computer analysis of domestic economic incentive is presented. Included are the sample computer data set for ten combinations of reprocessing and reactor assumptions; basic data set and computer output; higher uranium availability computer output; 50 percent higher GCR fabrication cost computer output; 50 percent higher GCR reprocessing cost computer output; year 1990 and year 2000 GCR introduction scenario computer outputs; 75 percent perceived capacity factor for PBR computer output; and capital cost of GCRs 1.2 times that of LWRs.

  8. Creating personalized memories from social events: Community-based support for multi-camera recordings of school concerts

    OpenAIRE

    Guimaraes R.L.; Cesar P.; Bulterman D.C.A.; Zsombori V.; Kegel I.

    2011-01-01

    The wide availability of relatively high-quality cameras makes it easy for many users to capture video fragments of social events such as concerts, sports events or community gatherings. The wide availability of simple sharing tools makes it nearly as easy to upload individual fragments to on-line video sites. Current work on video mashups focuses on the creation of a video summary based on the characteristics of individual media fragments, but it fails to address the interpersona...

  9. The taxable events for the Value-Added Tax (VAT) based on a Comparative Law approach

    Directory of Open Access Journals (Sweden)

    Walker Villanueva Gutiérrez

    2014-07-01

    Full Text Available This article analyzes the definitions of the main taxable events for the Value-Added Tax (VAT) based on a comparative approach to the legislation of different countries (Spain, Mexico, Chile, Colombia, Argentina and Peru). In this regard, it analyzes which legislations offer definitions according to the principles of generality, fiscal neutrality and legal certainty for VAT. Moreover, it points out that the VAT systems of those countries do not require as a condition for the configuration of the taxable events that the transactions involve a «value added» or a final consumption. In the specific case of «supplies of goods», the VAT systems have a similar definition of the taxable event, although there are a few differences. However, in the case of «supplies of services», which is the most important taxable event for VAT, there are important differences at the time each country defines it. This is not a desirable effect for the international trade of services, since the lack of harmonization produces double taxation or double non-taxation.

  10. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events.

    Science.gov (United States)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz; Pedersen, Carsten Bøcker; Mors, Ole; Christensen, Jakob

    2018-04-17

    It is not possible to fully assess intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of intention of suicide attempts/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK-algorithm) based on hospital discharge diagnosis and emergency room contacts. Of all 101 530 people identified with an incident suicide attempt/self-harm contact at Danish hospitals between 1995 and 2012 using the DK-algorithm, we selected a random sample of 475 people. We validated the DK-algorithm against medical records applying the definitions and terminology of the Columbia Classification Algorithm of Suicide Assessment of suicidal events, nonsuicidal events, and indeterminate or potentially suicidal events. We calculated positive predictive values (PPVs) of the DK-algorithm to identify suicidal events overall, by gender, age groups, and calendar time. We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxications and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). The DK-algorithm can reliably identify self-harm with suicidal intention in 52% of the identified cases of suicide attempts/self-harm. The PPVs could be used for quantitative bias analysis and implemented as weights in future studies to estimate the proportion of suicidal events among cases identified via the DK-algorithm. Copyright © 2018 John Wiley & Sons, Ltd.
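    The PPV arithmetic behind these figures is straightforward: the proportion of algorithm-identified cases confirmed as suicidal events, with a score-based confidence interval. In the sketch below, the count of 184 confirmed events is inferred from the reported 51.5% of 357 reviewed records and is used only for illustration; the Wilson score interval is one standard choice for such a proportion.

```python
from math import sqrt

def ppv_wilson(tp, n, z=1.96):
    """Positive predictive value tp/n with a Wilson score 95% CI."""
    p = tp / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return p, centre - half, centre + half

# 184 confirmed suicidal events among the 357 reviewed records; the count 184
# is inferred from the reported 51.5% overall PPV, for illustration only
p, lo, hi = ppv_wilson(184, 357)
print(f"PPV {100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f})")
```

    With these assumed counts the interval comes out close to the published 46.4-56.7% range.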

  11. VLF Observation of Long Ionospheric Recovery Events

    Science.gov (United States)

    Cotts, B. R.; Inan, U. S.

    2006-12-01

    On the evening of 20 November 1992, three early/fast events were observed on the great circle path (GCP) from the NAU transmitter in Puerto Rico to Gander (GA), Newfoundland. These events were found to have significantly longer recovery times (up to 20 minutes) than any previously documented events. Typical early/fast events and Lightning-induced Electron Precipitation (LEP) events affect the D-region ionosphere near the night-time VLF-reflection height of ~85 km and exhibit recovery to pre-event levels of gigantic jets. In this context, preliminary results indicate that the lightning-associated VLF long recovery events appear to be more common in oceanic thunderstorms. In this paper, we present occurrence statistics and other measured properties of VLF long recovery events, observed on all sea-based and land-based VLF great circle paths.

  12. The CMS Event Builder

    CERN Document Server

    Brigljevic, V; Cano, E; Cittolin, Sergio; Csilling, Akos; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutleber, J; Jacobs, C; Kozlovszky, Miklos; Larsen, H; Magrans de Abril, Ildefons; Meijers, F; Meschi, E; Murray, S; Oh, A; Orsini, L; Pollet, L; Rácz, A; Samyn, D; Scharff-Hansen, P; Schwick, C; Sphicas, Paris; ODell, V; Suzuki, I; Berti, L; Maron, G; Toniolo, N; Zangrando, L; Ninane, A; Erhan, S; Bhattacharya, S; Branson, J G

    2003-01-01

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test-bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements on throughput and scaling are presented. The architecture of the baseline CMS event builder will be outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from fragmen...
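    The core of any event builder is grouping fragments by event number until every source has reported. The sketch below uses an invented fragment format and a toy source count; the real CMS builder is a two-stage switched network handling ~500 sources at 100 GByte/s, not a single in-memory loop.

```python
from collections import defaultdict

N_SOURCES = 4  # toy value; the CMS event builder combines about 500 sources

def build_events(fragments):
    """Assemble (event_id, source_id, payload) fragments into full events;
    an event is complete once every source has contributed a fragment."""
    pending = defaultdict(dict)   # event_id -> {source_id: payload}
    complete = []
    for ev, src, payload in fragments:
        pending[ev][src] = payload
        if len(pending[ev]) == N_SOURCES:
            complete.append((ev, pending.pop(ev)))
    return complete

frags = [(1, s, f"d{s}") for s in range(N_SOURCES)] + [(2, 0, "x")]
events = build_events(frags)
print([ev for ev, _ in events])  # event 1 is complete, event 2 still pending
```

    The two-stage architecture in the abstract splits exactly this grouping step: the first stage concatenates fragments into super-fragments, and the second stage completes the events from them.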

  13. Modeling Documents with Event Model

    Directory of Open Access Journals (Sweden)

    Longhui Wang

    2015-08-01

    Full Text Available Currently deep learning has made great breakthroughs in visual and speech processing, mainly because it draws lessons from the hierarchical way in which the brain deals with images and speech. In the field of NLP, a topic model is one of the important ways of modeling documents. Topic models are built on a generative model that clearly does not match the way humans write. In this paper, we propose Event Model, which is unsupervised and based on the language processing mechanism of neurolinguistics, to model documents. In Event Model, documents are descriptions of concrete or abstract events seen, heard, or sensed by people, and words are objects in the events. Event Model has two stages: word learning and dimensionality reduction. Word learning is to learn the semantics of words based on deep learning. Dimensionality reduction is the process of representing a document as a low-dimensional vector by a linear model that is completely different from topic models. Event Model achieves state-of-the-art results on document retrieval tasks.

  14. Research on a Hierarchical Dynamic Automatic Voltage Control System Based on the Discrete Event-Driven Method

    Directory of Open Access Journals (Sweden)

    Yong Min

    2013-06-01

    Full Text Available In this paper, concepts and methods of hybrid control systems are adopted to establish a hierarchical dynamic automatic voltage control (HD-AVC) system, realizing the dynamic voltage stability of power grids. An HD-AVC system model consisting of three layers is built based on the hybrid control method and discrete event-driven mechanism. In the Top Layer, discrete events are designed to drive the corresponding control block so as to avoid solving complex multiple objective functions, the power system’s characteristic matrix is formed and the minimum amplitude eigenvalue (MAE) is calculated through linearized differential-algebraic equations. MAE is applied to judge the system’s voltage stability and security and construct discrete events. The Middle Layer is responsible for management and operation, which is also driven by discrete events. Control values of the control buses are calculated based on the characteristics of power systems and the sensitivity method. Then control values generate control strategies through the interface block. In the Bottom Layer, various control devices receive and implement the control commands from the Middle Layer. In this way, a closed-loop power system voltage control is achieved. Computer simulations verify the validity and accuracy of the HD-AVC system, and verify that the proposed HD-AVC system is more effective than normal voltage control methods.
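    The MAE test described above amounts to finding the eigenvalue of the linearized system matrix closest to the origin and checking that all eigenvalues stay in the left half-plane. A minimal sketch with an invented 2x2 state matrix standing in for the linearized differential-algebraic system:

```python
import numpy as np

def min_amplitude_eigenvalue(A):
    """Minimum amplitude eigenvalue: the eigenvalue of the linearized system
    matrix closest to the origin (the mode nearest instability)."""
    eig = np.linalg.eigvals(A)
    return eig[np.argmin(np.abs(eig))]

def stable(A):
    """Small-signal check: every eigenvalue in the open left half-plane."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

# invented 2x2 state matrix standing in for the linearized DAE system
A = np.array([[-0.5, 1.0], [0.0, -2.0]])
mae = min_amplitude_eigenvalue(A)
print(mae, stable(A))  # -0.5 is the eigenvalue closest to the origin; stable
```

    A discrete event in the Top Layer would then fire when |MAE| falls below some security threshold, signalling that a mode is approaching instability.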

  15. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars, because it is easy for KMTNet to be saturated around the peak of the events because of its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be significantly detected in the near future.

  16. Management of investment-construction projects basing on the matrix of key events

    Directory of Open Access Journals (Sweden)

    Morozenko Andrey Aleksandrovich

    2016-11-01

    Full Text Available The article considers the current problematic issues in the management of investment-construction projects, and examines the questions of efficiency increase of construction operations on the basis of the formation of a reflex-adaptive organizational structure. The authors analyzed the necessity of forming a matrix of key events in the investment-construction project (ICP), which will create the optimal structure of the project, based on the work program for its implementation. For convenience of representing programs of the project implementation in time, the authors recommend consolidating the works into separate, economically independent functional blocks. It is proposed to use an algorithm of forming the matrix of an investment-construction project, considering the economic independence of the functional blocks and the stages of the ICP implementation. The use of an extended network model is justified, which is supplemented by organizational and structural constraints at different stages of the project, highlighting key events fundamentally influencing the further course of the ICP implementation.

  17. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  18. Single-event effect ground test issues

    International Nuclear Information System (INIS)

    Koga, R.

    1996-01-01

    Ground-based single event effect (SEE) testing of microcircuits permits characterization of device susceptibility to various radiation induced disturbances, including: (1) single event upset (SEU) and single event latchup (SEL) in digital microcircuits; (2) single event gate rupture (SEGR), and single event burnout (SEB) in power transistors; and (3) bit errors in photonic devices. These characterizations can then be used to generate predictions of device performance in the space radiation environment. This paper provides a general overview of ground-based SEE testing and examines in critical depth several underlying conceptual constructs relevant to the conduct of such tests and to the proper interpretation of results. These more traditional issues are contrasted with emerging concerns related to the testing of modern, advanced microcircuits.

  19. Signal detection to identify serious adverse events (neuropsychiatric events) in travelers taking mefloquine for chemoprophylaxis of malaria

    Directory of Open Access Journals (Sweden)

    Naing C

    2012-08-01

    Full Text Available Cho Naing,1,3 Kyan Aung,1 Syed Imran Ahmed,2 Joon Wah Mak3 1School of Medical Sciences, 2School of Pharmacy and Health Sciences, 3School of Postgraduate Studies and Research, International Medical University, Kuala Lumpur, Malaysia. Background: For all medications, there is a trade-off between benefits and potential for harm. It is important for patient safety to detect drug–event combinations and to analyze them by appropriate statistical methods. Mefloquine is used as chemoprophylaxis for travelers going to regions with known chloroquine-resistant Plasmodium falciparum malaria. As such, there is concern about serious adverse events associated with mefloquine chemoprophylaxis. The objective of the present study was to assess whether any signal would be detected for the serious adverse events of mefloquine, based on data in clinicoepidemiological studies. Materials and methods: We extracted data on adverse events related to mefloquine chemoprophylaxis from two published datasets. Disproportionality in the reporting of adverse events, such as neuropsychiatric events versus other adverse events, was presented in a 2 × 2 contingency table. A reporting odds ratio (ROR) with corresponding 95% confidence interval (CI) data-mining algorithm was applied for signal detection. A safety signal is considered significant when the ROR estimate and the lower limit of the corresponding 95% CI are ≥2. Results: Two datasets addressing adverse events of mefloquine chemoprophylaxis (one from a published article and one from a Cochrane systematic review) were included in the analyses. The reporting odds ratio was 1.58 (95% CI: 1.49–1.68) based on published data in the selected article, and 1.195 (95% CI: 0.94–1.44) based on data in the selected Cochrane review. Overall, in both datasets, the lower limits of the 95% CIs of the reporting odds ratios were less than 2. Conclusion: Based on the available data, the findings suggested that no significant signals for serious adverse events pertinent to neuropsychiatric events were detected.
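The signal-detection criterion described above (ROR point estimate and lower 95% CI limit both ≥2) follows directly from a 2 × 2 contingency table. A minimal sketch, with hypothetical counts and illustrative function names (not from the paper):

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR and its 95% CI from a 2x2 contingency table:
    a = target events, exposed;   b = other events, exposed;
    c = target events, unexposed; d = other events, unexposed."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of ln(ROR)
    lower = math.exp(math.log(ror) - 1.96 * se)
    upper = math.exp(math.log(ror) + 1.96 * se)
    return ror, lower, upper

def is_signal(ror, lower):
    # Criterion used in the study: both values must be >= 2.
    return ror >= 2 and lower >= 2

# Hypothetical counts, not the study's data:
ror, lower, upper = reporting_odds_ratio(20, 10, 10, 20)
```

With these toy counts the ROR is 4.0, but the lower CI limit falls below 2, so no signal would be declared, which mirrors the paper's decision logic.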

  20. VLSI-based video event triggering for image data compression

    Science.gov (United States)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.

  1. Ant colony optimization and event-based dynamic task scheduling and staffing for software projects

    Science.gov (United States)

    Ellappan, Vijayan; Ashwini, J.

    2017-11-01

    In software development organizations, project planning for medium- to very-large-scale projects is an extremely complex and challenging undertaking, even when treated as a manual process. Software project planning needs to address both the problem of task scheduling and the problem of human resource allocation (also called staffing), because most of the resources in software projects are people. We propose a machine learning approach that finds solutions for scheduling by learning from existing planning policies, together with an event-based scheduler that updates the task schedule produced by the learning algorithm in response to events such as the start of the project, the times at which resources become free from finished tasks, and the times at which employees join or leave the project within the software development plan. Updating the schedule through the event-based scheduler makes the planning method dynamic. The approach uses system-dynamics components to model the interrelated flows of tasks, errors and personnel throughout the different development stages, and is calibrated to industrial data. It extends past software project management research by examining a simulation-based process with a distinctive model, integrating it with a knowledge-based system for risk assessment and cost estimation, and using a decision-modeling platform.
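The event-driven rescheduling loop described in this record can be sketched as a simple priority-queue dispatcher; event names and the callback are illustrative assumptions, not the paper's implementation:

```python
import heapq

def run(events, reschedule):
    """Pop scheduling-relevant events in time order and re-run the
    scheduling policy after each one.
    events: list of (time, description) tuples;
    reschedule: callback that rebuilds the task/staff assignment."""
    heapq.heapify(events)
    log = []
    while events:
        t, what = heapq.heappop(events)
        log.append((t, what))
        reschedule(t, what)   # e.g., re-invoke the learned scheduling policy
    return log

# Usage: three events arriving out of order are handled chronologically.
calls = []
run([(5, "employee_left"), (0, "project_start"), (3, "task_finished")],
    lambda t, e: calls.append(t))
```

The heap keeps the scheduler "dynamic" in the abstract's sense: each new event (staff change, freed resource) triggers a re-plan rather than relying on a fixed up-front schedule.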

  2. Valenced cues and contexts have different effects on event-based prospective memory.

    Science.gov (United States)

    Graf, Peter; Yu, Martin

    2015-01-01

    This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  3. Valenced cues and contexts have different effects on event-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Peter Graf

    Full Text Available This study examined the separate and joint influences of cue valence and context valence on event-based prospective memory task performance. We manipulated the valence of cues and contexts with pictures from the International Affective Picture System. The participants, undergraduate students, showed higher performance when neutral rather than valenced pictures were used for cueing prospective memory. In addition, neutral pictures were more effective as cues when they occurred in a valenced context than in the context of neutral pictures, but the effectiveness of valenced cues did not vary across contexts that differed in valence. The finding of an interaction between cue and context valence indicates that their respective influences on event-based prospective memory task performance cannot be understood in isolation from each other. Our findings are not consistent with the prevailing view, which holds that the scope of attention is broadened and narrowed, respectively, by positively and negatively valenced stimuli. Instead, our findings are more supportive of the recent proposal that the scope of attention is determined by the motivational intensity associated with valenced stimuli. Consistent with this proposal, we speculate that the motivational intensity associated with different retrieval cues determines the scope of attention, that contexts with different valence values determine participants' task engagement, and that prospective memory task performance is determined jointly by attention scope and task engagement.

  4. The space radiation environment

    International Nuclear Information System (INIS)

    Robbins, D.E.

    1997-01-01

    There are three primary sources of space radiation: galactic cosmic rays (GCR), trapped belt radiation, and solar particle events (SPE). All are composed of ions, the nuclei of atoms. Their energies range from a few MeV u⁻¹ to over a GeV u⁻¹. These ions can fragment when they interact with spacecraft materials and produce energetic neutrons and ions of lower atomic mass. Absorbed dose rates inside a typical spacecraft (like the Space Shuttle) in a low-inclination (28.5°) orbit range between 0.05 and 2 mGy d⁻¹, depending on the altitude and flight inclination (angle of the orbit with the equator). The quality factor of radiation in orbit depends on the relative contributions of trapped belt radiation and GCR, and the dose rate varies with both orbital altitude and inclination. The corresponding equivalent dose rate ranges between 0.1 and 4 mSv d⁻¹. In high-inclination orbits, like that of the Mir Space Station and as planned for the International Space Station, blood-forming organ (BFO) equivalent dose rates as high as 1.5 mSv d⁻¹ have been observed. Thus, on a 1 y mission, a crew member could receive a total dose of 0.55 Sv. Maximum equivalent dose rates measured in high-altitude passes through the South Atlantic Anomaly (SAA) were 10 mSv h⁻¹. For an interplanetary space mission (e.g., to Mars), annual doses from GCR alone range between 150 mSv y⁻¹ at solar maximum and 580 mSv y⁻¹ at solar minimum. Large SPE, like the October 1989 series, are more apt to occur in the years around solar maximum. In free space, such an event could contribute another 300 mSv, assuming that a warning system and safe haven can be used effectively with operational procedures to minimize crew exposures. Thus, the total dose for a 3 y mission to Mars could exceed 2 Sv.
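The mission totals quoted in this record follow from simple arithmetic on the stated rates; a quick back-of-the-envelope check (all input values taken from the abstract):

```python
# Worst-case BFO equivalent dose rate in a high-inclination orbit (mSv/day).
BFO_RATE = 1.5
leo_one_year_sv = BFO_RATE * 365 / 1000.0      # about 0.55 Sv per year

# Interplanetary mission: GCR at solar minimum plus one large SPE (mSv).
GCR_SOLAR_MIN = 580    # mSv per year at solar minimum
LARGE_SPE = 300        # e.g., an October 1989-class event series
mars_three_year_sv = (3 * GCR_SOLAR_MIN + LARGE_SPE) / 1000.0   # 2.04 Sv
```

The 3-year Mars figure of roughly 2 Sv assumes the solar-minimum GCR rate throughout plus a single large SPE, consistent with the "could exceed 2 Sv" statement.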

  5. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  6. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    Full Text Available This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
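The authors provide their own toolbox with full metric implementations; purely as a standalone illustration (not the toolbox's API), here is a sketch of a segment-based F-score with instance-based averaging over all segment/class pairs:

```python
import numpy as np

def segment_based_f1(ref, est):
    """Segment-based F-score for polyphonic sound event detection.
    ref, est: binary arrays of shape (n_segments, n_classes), where
    entry [t, c] = 1 means class c is active in segment t. Overlapping
    events are simply multiple 1s in the same row."""
    tp = np.logical_and(ref == 1, est == 1).sum()  # correctly detected activity
    fp = np.logical_and(ref == 0, est == 1).sum()  # spurious detections
    fn = np.logical_and(ref == 1, est == 0).sum()  # missed activity
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Two segments, two classes; one active event is missed.
ref = np.array([[1, 0], [1, 1]])
est = np.array([[1, 0], [0, 1]])
score = segment_based_f1(ref, est)
```

Event-based variants instead match whole (onset, offset, class) triples rather than per-segment activity, which is why the two definitions can disagree on the same output.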

  7. De-Virtualizing Social Events: Understanding the Gap between Online and Offline Participation for Event Invitations

    OpenAIRE

    Huang, Ai-Ju; Wang, Hao-Chuan; Yuan, Chien Wen

    2013-01-01

    One growing use of computer-based communication media is for gathering people to initiate or sustain social events. Although the use of computer-mediated communication and social network sites such as Facebook for event promotion is becoming popular, online participation in an event does not always translate to offline attendance. In this paper, we report on an interview study of 31 participants that examines how people handle online event invitations and what influences their online and offl...

  8. Seeing Iconic Gestures While Encoding Events Facilitates Children's Memory of These Events.

    Science.gov (United States)

    Aussems, Suzanne; Kita, Sotaro

    2017-11-08

    An experiment with 72 three-year-olds investigated whether encoding events while seeing iconic gestures boosts children's memory representation of these events. The events, shown in videos of actors moving in an unusual manner, were presented with either iconic gestures depicting how the actors performed these actions, interactive gestures, or no gesture. In a recognition memory task, children in the iconic gesture condition remembered actors and actions better than children in the control conditions. Iconic gestures were categorized based on how much of the actors was represented by the hands (feet, legs, or body). Only iconic hand-as-body gestures boosted actor memory. Thus, seeing iconic gestures while encoding events facilitates children's memory of those aspects of events that are schematically highlighted by gesture.

  9. Breaking The Millisecond Barrier On SpiNNaker: Implementing Asynchronous Event-Based Plastic Models With Microsecond Resolution

    Directory of Open Access Journals (Sweden)

    Xavier Lagorce

    2015-06-01

    Full Text Available Spike-based neuromorphic sensors such as retinas and cochleas change the way in which the world is sampled. Instead of producing data sampled at a constant rate, these sensors output spikes that are asynchronous and event driven. The event-based nature of neuromorphic sensors implies a complete paradigm shift in current perception algorithms towards those that emphasize the importance of precise timing. The spikes produced by these sensors usually have a time resolution on the order of microseconds. This high temporal resolution is a crucial factor in learning tasks; it is also widely exploited in biological neural networks. Sound localization, for instance, relies on detecting time lags between the two ears which, in the barn owl, reaches a temporal resolution of 5 microseconds. Currently available neuromorphic computation platforms such as SpiNNaker often limit their users to a time resolution on the order of milliseconds that is not compatible with the asynchronous outputs of neuromorphic sensors. To overcome these limitations and allow for the exploration of new types of neuromorphic computing architectures, we introduce a novel software framework on the SpiNNaker platform. This framework allows for simulations of spiking networks and plasticity mechanisms using a completely asynchronous and event-based scheme running with a microsecond time resolution. Results on two example networks using this new implementation are presented.

  10. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the Dam-forming phase or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed mass slides into the river and perhaps even impacts the topographic barrier on the opposite bank. Consequently, the approach to analyzing LP and HF waveforms developed in this study has a high potential for identifying the five dam-forming landquake events (DFLEs) in near real-time using broadband seismic records, which can provide timely warnings of impending floods to downstream residents.

  11. Local Times of Galactic Cosmic Ray Intensity Maximum and Minimum in the Diurnal Variation

    Directory of Open Access Journals (Sweden)

    Su Yeon Oh

    2006-06-01

    Full Text Available The diurnal variation of galactic cosmic ray (GCR) flux intensity observed by ground-based neutron monitors (NM) shows a sinusoidal pattern with an amplitude of 1~2% of the daily mean. We carried out a statistical study of tendencies in the local times of the GCR intensity daily maximum and minimum. To test the influences of solar activity and of location (cut-off rigidity) on the distribution of the local times of maximum and minimum GCR intensity, we examined the data of 1996 (solar minimum) and 2000 (solar maximum) at the low-latitude Haleakala (latitude: 20.72°N, cut-off rigidity: 12.91 GV) and the high-latitude Oulu (latitude: 65.05°N, cut-off rigidity: 0.81 GV) NM stations. The most frequent local times of the GCR intensity daily maximum and minimum come about 2~3 hours later in the solar activity maximum year 2000 than in the solar activity minimum year 1996. The Oulu NM station, whose cut-off rigidity is smaller, has its most frequent local times of GCR intensity maximum and minimum 2~3 hours later than those of the Haleakala station. This feature is more evident at solar maximum. The phase of the daily variation in GCR depends on the interplanetary magnetic field, which varies with solar activity, and on the cut-off rigidity, which varies with geographic latitude.
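The local time of the daily maximum described above can be extracted by fitting the first diurnal harmonic to hourly count rates. A minimal sketch using synthetic data (function and variable names are illustrative, not the study's pipeline):

```python
import numpy as np

def diurnal_phase(hours, counts):
    """Fit I(t) ~ I0 * (1 + A*cos(2*pi*(t - t_max)/24)) by harmonic
    regression. Returns the fractional amplitude A and the local time
    of daily maximum t_max (hours).
    hours: local times (0-23); counts: hourly NM count rates."""
    rel = counts / counts.mean() - 1.0            # fractional deviation
    w = 2 * np.pi * np.asarray(hours) / 24.0
    a = 2 * np.mean(rel * np.cos(w))              # cosine coefficient
    b = 2 * np.mean(rel * np.sin(w))              # sine coefficient
    amp = np.hypot(a, b)                          # amplitude (fraction of mean)
    t_max = (np.arctan2(b, a) * 24 / (2 * np.pi)) % 24
    return amp, t_max

# Synthetic day: 1.5% diurnal wave peaking at 15:00 local time.
hours = np.arange(24)
counts = 100.0 * (1 + 0.015 * np.cos(2 * np.pi * (hours - 15) / 24))
amp, t_max = diurnal_phase(hours, counts)
```

With a full day of uniformly spaced samples, the harmonic regression recovers the injected amplitude (1.5%) and phase (15 h) exactly.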

  12. Event monitoring of parallel computations

    Directory of Open Access Journals (Sweden)

    Gruzlikov Alexander M.

    2015-06-01

    Full Text Available The paper considers the monitoring of parallel computations for detection of abnormal events. It is assumed that computations are organized according to an event model, and monitoring is based on specific test sequences.

  13. GALACTIC COSMIC-RAY INTENSITY MODULATION BY COROTATING INTERACTION REGION STREAM INTERFACES AT 1 au

    Energy Technology Data Exchange (ETDEWEB)

    Guo, X. [State Key Laboratory of Space Weather, National Space Science Center, Chinese Academy of Sciences, Beijing, 100190 (China); Florinski, V. [Center for Space Plasma and Aeronomic Research, University of Alabama, Huntsville, AL 35899 (United States)

    2016-07-20

    We present a new model that couples galactic cosmic-ray (GCR) propagation with magnetic turbulence transport and the MHD background evolution in the heliosphere. The model is applied to the problem of the formation of corotating interaction regions (CIRs) during the last solar minimum, in the period between 2007 and 2009. The numerical model simultaneously calculates the large-scale supersonic solar wind properties and its small-scale turbulent content from 0.3 au to the termination shock. Cosmic rays are then transported through the background thus computed, with diffusion coefficients derived from the solar wind turbulent properties, using a stochastic Parker approach. Our results demonstrate that GCR variations depend on the ratio of the diffusion coefficients in the fast and slow solar winds. Stream interfaces inside the CIRs always lead to depressions of the GCR intensity. On the other hand, heliospheric current sheet (HCS) crossings do not appreciably affect GCR intensities in the model, which is consistent with observations under quiet solar wind conditions. Therefore, variations in diffusion coefficients associated with CIR stream interfaces are more important for GCR propagation than the drift effects of the HCS during a negative solar minimum.

  14. GALACTIC COSMIC-RAY INTENSITY MODULATION BY COROTATING INTERACTION REGION STREAM INTERFACES AT 1 au

    International Nuclear Information System (INIS)

    Guo, X.; Florinski, V.

    2016-01-01

    We present a new model that couples galactic cosmic-ray (GCR) propagation with magnetic turbulence transport and the MHD background evolution in the heliosphere. The model is applied to the problem of the formation of corotating interaction regions (CIRs) during the last solar minimum, in the period between 2007 and 2009. The numerical model simultaneously calculates the large-scale supersonic solar wind properties and its small-scale turbulent content from 0.3 au to the termination shock. Cosmic rays are then transported through the background thus computed, with diffusion coefficients derived from the solar wind turbulent properties, using a stochastic Parker approach. Our results demonstrate that GCR variations depend on the ratio of the diffusion coefficients in the fast and slow solar winds. Stream interfaces inside the CIRs always lead to depressions of the GCR intensity. On the other hand, heliospheric current sheet (HCS) crossings do not appreciably affect GCR intensities in the model, which is consistent with observations under quiet solar wind conditions. Therefore, variations in diffusion coefficients associated with CIR stream interfaces are more important for GCR propagation than the drift effects of the HCS during a negative solar minimum.

  15. Qualitative Event-Based Diagnosis: Case Study on the Second International Diagnostic Competition

    Science.gov (United States)

    Daigle, Matthew; Roychoudhury, Indranil

    2010-01-01

    We describe a diagnosis algorithm entered into the Second International Diagnostic Competition. We focus on the first diagnostic problem of the industrial track of the competition, in which a diagnosis algorithm must detect, isolate, and identify faults in an electrical power distribution testbed and provide corresponding recovery recommendations. The diagnosis algorithm embodies a model-based approach, centered around qualitative event-based fault isolation. Faults produce deviations in measured values from model-predicted values. The sequence of these deviations is matched to those predicted by the model in order to isolate faults. We augment this approach with model-based fault identification, which determines fault parameters and helps to further isolate faults. We describe the diagnosis approach, provide diagnosis results from running the algorithm on provided example scenarios, and discuss the issues faced and lessons learned in implementing the approach.
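The core idea of qualitative event-based fault isolation, matching the observed sequence of measurement deviations against each fault's predicted sequence, can be shown with a toy sketch (fault names, measurement names, and signatures below are hypothetical, not from the competition testbed):

```python
# Each fault maps to its predicted sequence of qualitative deviations:
# ('+' = measurement above model prediction, '-' = below).
SIGNATURES = {
    "fault_A": [("V1", "-"), ("I2", "+")],
    "fault_B": [("V1", "-"), ("I2", "-")],
    "fault_C": [("I2", "+"), ("V1", "-")],
}

def isolate(observed):
    """Return the faults whose predicted deviation sequence is consistent
    with the deviations observed so far (prefix match)."""
    return sorted(f for f, sig in SIGNATURES.items()
                  if sig[:len(observed)] == observed)

first = isolate([("V1", "-")])                      # two candidates remain
second = isolate([("V1", "-"), ("I2", "+")])        # unique isolation
```

Each new deviation event prunes the candidate set, which is why the *order* of deviations carries diagnostic information beyond their values alone.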

  16. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, it is difficult to obtain such failure data owing to insufficient data, changing environments, or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise probability distributions of time to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, actual basic event failure probabilities collected from the operational experiences of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes a limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
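The two ingredients described above, qualitative failure possibilities represented as triangular fuzzy numbers and a conversion to a crisp failure probability, can be sketched as follows. The qualitative scale and the use of Onisawa's logarithmic transformation are assumptions for illustration, not the paper's calibrated values:

```python
def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Illustrative qualitative scale mapping expert words to triangular
# fuzzy numbers on [0, 1] (assumed values).
SCALE = {
    "very low":  (0.0, 0.10, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "moderate":  (0.3, 0.50, 0.7),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.90, 1.0),
}

def failure_probability(term):
    """Convert a qualitative failure possibility to a failure probability
    via Onisawa's transformation, commonly paired with this kind of
    fuzzy reliability approach."""
    fps = centroid(SCALE[term])   # crisp failure possibility score
    if fps <= 0:
        return 0.0
    k = ((1.0 - fps) / fps) ** (1.0 / 3.0) * 2.301
    return 1.0 / (10 ** k)

p = failure_probability("moderate")   # fps = 0.5 -> 10**-2.301 ~ 5e-3
```

The transformation compresses the [0, 1] possibility scale onto the many-decades range typical of component failure probabilities, which is what lets qualitative expert judgments stand in for missing historical failure data.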

  17. Design of a Scalable Event Notification Service: Interface and Architecture

    National Research Council Canada - National Science Library

    Carzaniga, Antonio; Rosenblum, David S; Wolf, Alexander L

    1998-01-01

    Event-based distributed systems are programmed to operate in response to events. An event notification service is an application-independent infrastructure that supports the construction of event-based systems...

  18. Reactor protection system software test-case selection based on input-profile considering concurrent events and uncertainties

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Lee, Seung Jun; Cho, Jaehyun; Jung, Wondea

    2016-01-01

    Recently, input-profile-based testing for safety-critical software has been proposed for determining the number of test cases and quantifying the failure probability of the software. The input-profile of reactor protection system (RPS) software comprises the inputs that cause activation of the system for emergency shutdown of a reactor. This paper presents a method to determine the input-profile of RPS software that considers concurrent events/transients. A deviation of a process parameter value begins with an event and increases owing to concurrent multi-events, depending on the correlation of process parameters and the severity of incidents. A case of reactor trip caused by feedwater loss and main steam line break is simulated and analyzed to determine the RPS software input-profile and estimate the number of test cases. Different sizes of main steam line breaks (e.g., small, medium, and large breaks) with total loss of feedwater supply are considered in constructing the input-profile. The uncertainties of the simulation related to input-profile-based software testing are also included. Our study is expected to provide an option for determining test cases and quantifying RPS software failure probability. (author)

  19. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    Directory of Open Access Journals (Sweden)

    Ninna Reitzel Jensen

    2015-06-01

    Full Text Available Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death, disability, etc. In our treatment of participating life insurance, we have special focus on the bonus schemes “consolidation” and “additional benefits”, and one goal is to formalize how these work and interact. Another goal is to describe similarities and differences between participating life insurance and unit-linked insurance. By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across product types. This enables comparison of participating life insurance products and unit-linked insurance products, thus building a bridge between the two different ways of formalizing life insurance products. Finally, our model distinguishes itself from the existing literature by taking into account the Markov model for the state of the policyholder and, hereby, facilitating event risk.

  20. Spatio-Temporal Story Mapping Animation Based On Structured Causal Relationships Of Historical Events

    Science.gov (United States)

    Inoue, Y.; Tsuruoka, K.; Arikawa, M.

    2014-04-01

    In this paper, we propose a user interface that displays visual animations on geographic maps and timelines to depict historical stories by representing causal relationships among events over time. We have been developing an experimental software system for the spatio-temporal visualization of historical stories on tablet computers. The proposed system helps people learn historical stories effectively through visual animations based on hierarchical structures of timelines and maps at different scales.

  1. Ontology-based knowledge management for personalized adverse drug events detection.

    Science.gov (United States)

    Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue

    2011-01-01

    Since Adverse Drug Events (ADEs) have become a leading cause of death around the world, there is a high demand for helping clinicians or patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with the focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define an ADE ontology to uniformly manage ADE knowledge from multiple sources. We take advantage of the rich semantics of the terminology SNOMED-CT and apply it to ADE detection via semantic query and reasoning.

  2. Leading indicators of community-based violent events among adults with mental illness.

    Science.gov (United States)

    Van Dorn, R A; Grimm, K J; Desmarais, S L; Tueller, S J; Johnson, K L; Swartz, M S

    2017-05-01

    The public health, public safety and clinical implications of violent events among adults with mental illness are significant; however, the causes and consequences of violence and victimization among adults with mental illness are complex and not well understood, which limits the effectiveness of clinical interventions and risk management strategies. This study examined interrelationships between violence, victimization, psychiatric symptoms, substance use, homelessness and in-patient treatment over time. Available data were integrated from four longitudinal studies of adults with mental illness. Assessments took place at baseline, and at 1, 3, 6, 9, 12, 15, 18, 24, 30 and 36 months, depending on the parent studies' protocol. Data were analysed with the autoregressive cross-lag model. Violence and victimization were leading indicators of each other and affective symptoms were a leading indicator of both. Drug and alcohol use were leading indicators of violence and victimization, respectively. All psychiatric symptom clusters - affective, positive, negative, disorganized cognitive processing - increased the likelihood of experiencing at least one subsequent symptom cluster. Sensitivity analyses identified few group-based differences in the magnitude of effects in this heterogeneous sample. Violent events demonstrated unique and shared indicators and consequences over time. Findings indicate mechanisms for reducing violent events, including trauma-informed therapy, targeting internalizing and externalizing affective symptoms with cognitive-behavioral and psychopharmacological interventions, and integrating substance use and psychiatric care. Finally, mental illness and violence and victimization research should move beyond demonstrating concomitant relationships and instead focus on lagged effects with improved spatio-temporal contiguity.

  3. INES rating of radiation protection related events

    International Nuclear Information System (INIS)

    Hort, M.

    2009-01-01

    In this presentation, a short review of the INES rating of events concerning radiation protection is given, based on the new edition of the INES User's Manual. The presentation comprises a brief history of the scale's development, a general description of the scale, and the main principles of INES rating. Several examples of the use of the scale for radiation protection related events are given. Throughout the presentation, the term 'radiation protection related events' is used for events related to radiation sources and transport outside nuclear installations. (authors)

  4. Event recognition in personal photo collections via multiple instance learning-based classification of multiple images

    Science.gov (United States)

    Ahmad, Kashif; Conci, Nicola; Boato, Giulia; De Natale, Francesco G. B.

    2017-11-01

    Over the last few years, rapid growth has been witnessed in the number of digital photos produced per year. This growth poses challenges for the organization and management of multimedia collections, and one viable solution consists of arranging the media on the basis of the underlying events. However, album-level annotation and the presence of irrelevant pictures in photo collections make event-based organization of personal photo albums a challenging task. To tackle these challenges, in contrast to conventional approaches relying on fully supervised learning, we propose a pipeline for event recognition in personal photo collections based on a multiple-instance learning (MIL) strategy. MIL is a modified form of supervised learning that fits well for such applications with weakly labeled data. The experimental evaluation of the proposed approach is carried out on two large-scale datasets: a self-collected dataset and a benchmark dataset. On both, our approach significantly outperforms the existing state of the art.
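
The standard MIL assumption — a bag (here, a photo album) is positive if at least one of its instances is — can be sketched as follows; the features, weights, and linear scorer are illustrative stand-ins for the paper's learned models:

```python
# Minimal multiple-instance-learning sketch (illustrative, not the
# authors' pipeline): a photo album is a "bag" of per-image feature
# vectors; the bag label is driven by its most event-like instance.

def instance_score(features, weights):
    """Linear score for a single photo's (hypothetical) feature vector."""
    return sum(f * w for f, w in zip(features, weights))

def classify_bag(bag, weights, threshold=0.5):
    """Standard MIL assumption: a bag is positive if ANY instance is."""
    best = max(instance_score(x, weights) for x in bag)
    return 1 if best >= threshold else 0

# Hypothetical album: two irrelevant photos and one event-like photo.
album = [[0.1, 0.0], [0.2, 0.1], [0.9, 0.8]]
weights = [0.6, 0.4]
print(classify_bag(album, weights))  # the single strong instance decides
```

Irrelevant pictures lower no score under this rule, which is why the MIL formulation tolerates weak, album-level labels.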

  5. Logical Discrete Event Systems in a trace theory based setting

    NARCIS (Netherlands)

    Smedinga, R.

    1993-01-01

    Discrete event systems can be modelled using a triple consisting of some alphabet (representing the events that might occur), and two trace sets (sets of possible strings) denoting the possible behaviour and the completed tasks of the system. Using this definition we are able to formulate and solve

  6. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirements of applications in computational science continue to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN, because communication overhead can drastically slow down the computation. In this paper, we introduce an event-based simulator to investigate the performance of parallel algorithms executed over the WAN. The event-based simulator, known as SIMPAR (SIMulator for PARallel computation), simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data. Hence, SIMPAR may prove to be a valuable tool for investigating the types of applications and computing resources required to provide an uninterrupted flow of processed data for real-time visualization. The results obtained from the simulation show concurrence with the expected performance under the L-BSP model.

  7. Event-based model diagnosis of rainfall-runoff model structures

    International Nuclear Information System (INIS)

    Stanzel, P.

    2012-01-01

    The objective of this research is a comparative evaluation of different rainfall-runoff model structures. Comparative model diagnostics facilitate the assessment of strengths and weaknesses of each model. The application of multiple models allows an analysis of simulation uncertainties arising from the selection of model structure, as compared with effects of uncertain parameters and precipitation input. Four different model structures, including conceptual and physically based approaches, are compared. In addition to runoff simulations, results for soil moisture and the runoff components of overland flow, interflow and base flow are analysed. Catchment runoff is simulated satisfactorily by all four model structures and shows only minor differences. Systematic deviations from runoff observations provide insight into model structural deficiencies. While physically based model structures capture some single runoff events better, they do not generally outperform conceptual model structures. Contributions to uncertainty in runoff simulations stemming from the choice of model structure show similar dimensions to those arising from parameter selection and the representation of precipitation input. Variations in precipitation mainly affect the general level and peaks of runoff, while different model structures lead to different simulated runoff dynamics. Large differences between the four analysed models are detected for simulations of soil moisture and, even more pronounced, runoff components. Soil moisture changes are more dynamical in the physically based model structures, which is in better agreement with observations. Streamflow contributions of overland flow are considerably lower in these models than in the more conceptual approaches. Observations of runoff components are rarely made and are not available in this study, but are shown to have high potential for an effective selection of appropriate model structures (author) [de

  8. Discrete event systems diagnosis and diagnosability

    CERN Document Server

    Sayed-Mouchaweh, Moamar

    2014-01-01

    Discrete Event Systems: Diagnosis and Diagnosability addresses the problem of fault diagnosis of Discrete Event Systems (DES). This book provides the basic techniques and approaches necessary for the design of an efficient fault diagnosis system for a wide range of modern engineering applications. The different techniques and approaches are classified according to several criteria, such as: the modeling tools (automata, Petri nets) used to construct the model; the information (qualitative, based on event occurrences and/or state outputs; quantitative, based on signal processing and data analysis) needed to analyze and achieve the diagnosis; and the decision structure (centralized, decentralized) required to achieve the diagnosis. The goal of this classification is to select the efficient method for achieving fault diagnosis according to the application constraints. This book focuses on the centralized and decentralized event based diagnosis approaches using formal language and automata as mode...

  9. Prescription-event monitoring: developments in signal detection.

    Science.gov (United States)

    Ferreira, Germano

    2007-01-01

    Prescription-event monitoring (PEM) is a non-interventional intensive method for post-marketing drug safety monitoring of newly licensed medicines. PEM studies are cohort studies where exposure is obtained from a centralised service and outcomes from simple questionnaires completed by general practitioners. Follow-up forms are sent for selected events. Because PEM captures all events and not only suspected adverse drug reactions, PEM cohorts potentially differ in the distribution of the number of events per person, depending on the nature of the drug under study. This variance can be related either to the condition for which the drug is prescribed (e.g. a condition causing high morbidity will have, on average, a higher number of events per person than a condition with lower morbidity) or to the drug effect itself. This paper describes an exploratory investigation of the distortion caused by product-related variations in the number of events to the interpretation of proportional reporting ratio (PRR) values ("the higher the PRR, the greater the strength of the signal") computed using drug-cohort data. We studied this effect by assessing the agreement between the PRR based on events (event of interest vs all other events) and the PRR based on cases (cases with the event of interest vs cases with any other events). PRRs were calculated for all combinations reported for ten selected drugs against a comparator of 81 other drugs. Three of the ten drugs had a cohort with an apparently higher proportion of patients with a lower number of events. The PRRs based on events were systematically higher than the PRRs based on cases for the combinations reported for these three drugs. Additionally, when applying the threshold criteria for signal screening (n > or =3, PRR > or =1.5 and Chi-squared > or =4), the binary agreement was generally high but apparently lower for these three drugs.
    In conclusion, the distribution of events per patient in drug cohorts shall be
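
The PRR arithmetic and the screening thresholds quoted above can be made concrete with a small sketch; the 2×2 counts below are hypothetical:

```python
# Proportional reporting ratio (PRR) on a 2x2 contingency table.
# The screening thresholds n>=3, PRR>=1.5, chi-squared>=4 are those
# quoted in the abstract; the counts are hypothetical.

def prr(a, b, c, d):
    """a: drug & event, b: drug & other events,
       c: other drugs & event, d: other drugs & other events."""
    return (a / (a + b)) / (c / (c + d))

def chi_squared(a, b, c, d):
    """Pearson chi-squared over all four cells of the same table."""
    n = a + b + c + d
    total = 0.0
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        exp = row * col / n
        total += (obs - exp) ** 2 / exp
    return total

def signal(a, b, c, d):
    """Apply the screening rule: n>=3, PRR>=1.5, chi-squared>=4."""
    return a >= 3 and prr(a, b, c, d) >= 1.5 and chi_squared(a, b, c, d) >= 4

print(prr(10, 90, 20, 880))     # event reported 4.5x more often with the drug
print(signal(10, 90, 20, 880))  # passes all three screening criteria
```

The event-based vs case-based distinction in the abstract changes what is counted in each cell (event reports vs patients), not the PRR formula itself.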

  10. Forecasting of integral parameters of solar cosmic ray events according to initial characteristics of an event

    International Nuclear Information System (INIS)

    Belovskij, M.N.; Ochelkov, Yu.P.

    1981-01-01

    A method for forecasting the integral proton flux of solar cosmic ray (SCR) events, based on the initial characteristics of the phenomenon, is proposed. The efficiency of the method is substantiated, the accuracy of forecasting is estimated, and retrospective forecasting of real events is carried out. The parameters of the universal function describing the time evolution of SCR events are presented. The proposed method is suitable for forecasting practically all SCR events. The timeliness of the given forecasting is not worse than that of forecasting based on SCR propagation models [ru

  11. Tracking the evolution of stream DOM source during storm events using end member mixing analysis based on DOM quality

    Science.gov (United States)

    Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin

    2015-04-01

    The source of river dissolved organic matter (DOM) during storm events has not been well constrained, which is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater), which exist in a small forested watershed (the Ehwa Brook, South Korea), to the stream DOM during two storm events, using end member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was dominant for the DOM source in event 2 with a higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM for both events. The contribution of leaf litter peaked while that of deeper soils decreased to minima at peak flows. Our results demonstrated that EMMA based on DOM properties could be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
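
EMMA resolves a stream sample into end-member contributions by mass balance; a reduced two-end-member, one-tracer sketch of the idea follows (the study uses four end members and multiple spectroscopic tracers, which require a least-squares solution, and the tracer values below are hypothetical):

```python
# Two-end-member mixing sketch. For one conservative tracer c and end
# members with signatures c1, c2:
#   c_mix = f1*c1 + f2*c2,  with  f1 + f2 = 1,
# so the fraction of end member 1 is f1 = (c_mix - c2) / (c1 - c2).

def mixing_fraction(c_mix, c1, c2):
    """Return (f1, f2), the contributions of the two end members."""
    if c1 == c2:
        raise ValueError("end members are indistinguishable for this tracer")
    f1 = (c_mix - c2) / (c1 - c2)
    return f1, 1.0 - f1

# Hypothetical fluorescence-index signatures:
# leaf litter ~1.9, groundwater ~1.3, stream sample 1.75.
f_litter, f_gw = mixing_fraction(1.75, 1.9, 1.3)
print(round(f_litter, 2), round(f_gw, 2))  # leaf litter dominates
```

With more end members than tracer equations, the system is solved in a least-squares sense subject to the fractions summing to one, which is what the full EMMA does.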

  12. A Hexose Transporter Homologue Controls Glucose Repression in the Methylotrophic Yeast Hansenula polymorpha

    NARCIS (Netherlands)

    Stasyk, Oleh V.; Stasyk, Olena G.; Komduur, Janet; Veenhuis, Marten; Cregg, James M.; Sibirny, Andrei A.

    2004-01-01

    Peroxisome biogenesis and synthesis of peroxisomal enzymes in the methylotrophic yeast Hansenula polymorpha are under the strict control of glucose repression. We identified an H. polymorpha glucose catabolite repression gene (HpGCR1) that encodes a hexose transporter homologue. Deficiency in GCR1

  13. Real-time identification of residential appliance events based on power monitoring

    Science.gov (United States)

    Yang, Zhao; Zhu, Zhicheng; Wei, Zhiqiang; Yin, Bo; Wang, Xiuwei

    2018-03-01

    Energy monitoring for specific home appliances has been regarded as a prerequisite for reducing residential energy consumption. To enhance the accuracy of identifying the operation status of household appliances and to keep pace with the development of the smart power grid, this paper puts forward the integration of electric current and power data on the basis of an existing algorithm. If the average power difference over several adjacent cycles deviates from the baseline beyond a pre-assigned threshold value, an event is flagged. Based on the MATLAB platform and simulations of domestic appliances, the results of the tested data and verified algorithm indicate that the power method achieves the desired results of appliance identification.
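
The flagging rule described in the abstract can be sketched as follows; the window length and threshold are illustrative choices, not the paper's:

```python
# Sketch of the detection rule: the mean power of a few adjacent mains
# cycles is compared with the preceding baseline, and an event is
# flagged when the difference exceeds a pre-assigned threshold.

def detect_events(cycle_power, window=3, threshold=50.0):
    """cycle_power: average power (W) of successive mains cycles.
    Returns the indices at which a power-change event is first seen."""
    events = []
    i = window
    while i <= len(cycle_power) - window:
        before = sum(cycle_power[i - window:i]) / window
        after = sum(cycle_power[i:i + window]) / window
        if abs(after - before) > threshold:
            events.append(i)
            i += 2 * window  # debounce: jump past the transition region
        else:
            i += 1
    return events

# Fridge (~100 W) running, then a ~2 kW kettle switching on:
trace = [100, 102, 99, 101, 100, 2100, 2098, 2101, 2099]
print(detect_events(trace))  # one switching event detected
```

A real implementation would also combine the current waveform, as the abstract proposes, to separate appliances whose power steps are similar.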

  14. Reflex vocal fold adduction in the porcine model: the effects of stimuli delivered to various sensory nerves.

    Science.gov (United States)

    Woo, Jeong-Soo; Hundal, Jagdeep S; Sasaki, Clarence T; Abdelmessih, Mikhail W; Kelleher, Stephen P

    2008-10-01

    The aim of this study was to identify a panel of sensory nerves capable of eliciting an evoked glottic closure reflex (GCR) and to quantify the glottic closing force (GCF) of these responses in a porcine model. In 5 pigs, the internal branch of the superior laryngeal nerve (iSLN) and the trigeminal, pharyngeal plexus, glossopharyngeal, radial, and intercostal nerves were surgically isolated and electrically stimulated. During stimulation of each nerve, the GCR was detected by laryngeal electromyography and the GCF was measured with a pressure transducer. The only nerve that elicited the GCR in the 5 pigs was the iSLN. The average GCF was 288.9 mm Hg. This study demonstrates that the only afferent nerve that elicits the GCR in pigs is the iSLN, and that it should remain the focus of research for the rehabilitation of patients with absent or defective reflex vocal fold adduction.

  15. Prototype Biology-Based Radiation Risk Module Project

    Science.gov (United States)

    Terrier, Douglas; Clayton, Ronald G.; Patel, Zarana; Hu, Shaowen; Huff, Janice

    2015-01-01

    Biological effects of space radiation and risk mitigation are strategic knowledge gaps for the Evolvable Mars Campaign. The current epidemiology-based NASA Space Cancer Risk (NSCR) model contains large uncertainties (HAT #6.5a) due to lack of information on the radiobiology of galactic cosmic rays (GCR) and lack of human data. The use of experimental models that most accurately replicate the response of human tissues is critical for precision in risk projections. Our proposed study will compare DNA damage, histological, and cell kinetic parameters after irradiation in normal 2D human cells versus 3D tissue models, and it will use a multi-scale computational model (CHASTE) to investigate various biological processes that may contribute to carcinogenesis, including radiation-induced cellular signaling pathways. This cross-disciplinary work, with biological validation of an evolvable mathematical computational model, will help reduce uncertainties within NSCR and aid risk mitigation for radiation-induced carcinogenesis.

  16. Subsurface event detection and classification using Wireless Signal Networks.

    Science.gov (United States)

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-11-05

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of variations in radio signal strength across the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data on how soil properties affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated variations in wireless signal strength can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the regions where the event occurs, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment in which the data were measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
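
A minimal sketch of classifying one window of signal-strength features by minimum distance follows; the class signatures are hypothetical, and the paper's Bayesian decision-theoretic weighting is omitted from this plain Euclidean version:

```python
# Minimum-distance classification of a window of signal-strength
# readings: assign the window's feature vector to the geo-event class
# whose training mean is nearest in Euclidean distance.
import math

def euclidean(u, v):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def classify_window(features, class_means):
    """Return the label of the nearest class mean."""
    return min(class_means,
               key=lambda label: euclidean(features, class_means[label]))

# Hypothetical mean RSSI-change signatures (dB) per geo-event class:
class_means = {
    "water_intrusion": [-12.0, -10.0, -11.0],  # strong attenuation
    "density_change":  [-4.0, -3.5, -4.2],
    "no_event":        [0.2, -0.1, 0.0],
}

window = [-11.2, -9.4, -10.7]  # measured RSSI deltas in one window
print(classify_window(window, class_means))
```

The two-step structure in the paper means this classifier only runs on windows that the detection step has already flagged.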

  17. Discrete event dynamic system (DES)-based modeling for dynamic material flow in the pyroprocess

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Kim, Kiho; Kim, Ho Dong; Lee, Han Soo

    2011-01-01

    A modeling and simulation methodology was proposed in order to implement the dynamic material flow of the pyroprocess. Since the static mass balance provides only limited information on the material flow, it is hard to predict dynamic behavior in response to events. Therefore, a discrete event system (DES)-based model named PyroFlow was developed at the Korea Atomic Energy Research Institute (KAERI). PyroFlow is able to calculate the dynamic mass balance and also show various dynamic operational results in real time. Using PyroFlow, it is easy to rapidly predict results that are hard to foresee from a static analysis, such as the throughput of a unit process, the product accumulated in a buffer, and the operation status. As preliminary simulations, bottleneck analyses of the pyroprocess were carried out, and it was consequently shown that the operation strategy influences the productivity of the pyroprocess.
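
PyroFlow itself is not public in this record; a generic discrete-event-simulation core of the kind such models are built on can be sketched as follows (the process step names and times are hypothetical):

```python
# Generic discrete-event-simulation core: events are processed in time
# order from a priority queue, and each event can schedule follow-ons.
import heapq

class Simulator:
    def __init__(self):
        self.now = 0.0
        self._queue = []   # (time, seq, action); seq keeps ordering stable
        self._seq = 0
        self.log = []

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

def batch_arrives(sim):
    sim.log.append((sim.now, "batch arrives at unit process"))
    sim.schedule(8.0, batch_done)   # hypothetical 8 h processing time

def batch_done(sim):
    sim.log.append((sim.now, "batch moved to buffer"))

sim = Simulator()
sim.schedule(0.0, batch_arrives)
sim.schedule(4.0, batch_arrives)    # second batch, staggered start
sim.run()
for t, msg in sim.log:
    print(t, msg)
```

Bottleneck analysis of the kind described then amounts to inspecting such a log for queues that grow at a unit process faster than it drains.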

  18. Gas cooled reactor decommissioning. Packaging of waste for disposal in the United Kingdom deep repository

    International Nuclear Information System (INIS)

    Barlow, S.V.; Wisbey, S.J.; Wood, P.

    1998-01-01

    United Kingdom Nirex Limited has been established to develop and operate a deep underground repository for the disposal of the UK's intermediate and certain low level radioactive waste. The UK has a significant Gas Cooled Reactor (GCR) programme, including both Magnox and AGR (Advanced Gas-cooled Reactor) capacity, amounting to 26 Magnox reactors and 15 AGR reactors, as well as research and prototype reactor units such as the Windscale AGR and the Windscale Piles. Some of these units are already undergoing decommissioning, and Nirex has estimated that some 15,000 m³ (conditioned volume) will come forward for disposal from GCR decommissioning before 2060. This volume does not include final stage (Stage 3) decommissioning arisings from commercial reactors, since the generating utilities in the UK are proposing to adopt a deferred safe store strategy for these units. Intermediate level wastes arising from GCR decommissioning need to be packaged in a form suitable for on-site interim storage and eventual deep disposal in the planned repository. In the absence of Conditions for Acceptance for a repository in the UK, the dimensions, key features and minimum performance requirements for waste packages are defined in Waste Package Specifications. These form the basis for all assessments of the suitability of wastes for disposal, including GCR wastes. This paper describes the nature and characteristics of GCR decommissioning wastes which are intended for disposal in a UK repository. The Nirex Waste Package Specifications, and the key technical issues identified when considering GCR decommissioning waste against the performance requirements within the specifications, are discussed. (author)

  19. Pattern recognition based on time-frequency analysis and convolutional neural networks for vibrational events in φ-OTDR

    Science.gov (United States)

    Xu, Chengjin; Guan, Junjun; Bao, Ming; Lu, Jiangang; Ye, Wei

    2018-01-01

    Based on vibration signals detected by a phase-sensitive optical time-domain reflectometer distributed optical fiber sensing system, this paper presents an implementation of time-frequency analysis and a convolutional neural network (CNN) to classify different types of vibrational events. First, spectral subtraction and the short-time Fourier transform are used to enhance the time-frequency features of vibration signals and transform different types of vibration signals into spectrograms, which are input to the CNN for automatic feature extraction and classification. Finally, by replacing the soft-max layer in the CNN with a multiclass support vector machine, the performance of the classifier is enhanced. Experiments show that after using this method to process 4000 vibration signal samples generated by four different vibration events, namely, digging, walking, vehicles passing, and damaging, the recognition rates of vibration events are over 90%. The experimental results prove that this method can automatically perform effective feature selection and greatly improve the classification accuracy of vibrational events in distributed optical fiber sensing systems.

  20. Characterising Event-Based DOM Inputs to an Urban Watershed

    Science.gov (United States)

    Croghan, D.; Bradley, C.; Hannah, D. M.; Van Loon, A.; Sadler, J. P.

    2017-12-01

    Dissolved Organic Matter (DOM) composition in urban streams is dominated by terrestrial inputs after rainfall events. Urban streams have particularly strong terrestrial-riverine connections due to direct input from terrestrial drainage systems. Event-driven DOM inputs can have substantial adverse effects on water quality. Despite this, DOM from important catchment sources such as road drains and Combined Sewage Overflows (CSOs) remains poorly characterised within urban watersheds. We studied DOM sources within an urbanised, headwater watershed in Birmingham, UK. Samples from terrestrial sources (roads, roofs and a CSO) were collected manually after the onset of rainfall events of varying magnitude, and again within 24 hrs of the event ending. Terrestrial samples were analysed for fluorescence, absorbance and Dissolved Organic Carbon (DOC) concentration. Fluorescence and absorbance indices were calculated, and Parallel Factor Analysis (PARAFAC) was undertaken to aid sample characterisation. Substantial differences in fluorescence, absorbance and DOC were observed between source types. PARAFAC-derived components linked to organic pollutants were generally highest within road-derived samples, whilst humic-like components tended to be highest within roof samples. Samples taken from the CSO generally showed low fluorescence; however, this likely represents a dilution effect. Variation within source groups was particularly high, and local land use seemed to be the driving factor for road and roof drain DOM character and DOC quantity. Furthermore, high variation in fluorescence, absorbance and DOC was apparent between all sources depending on event type. Drier antecedent conditions in particular were linked to a greater presence of terrestrially-derived components and higher DOC content. Our study indicates that high variations in DOM character occur between source types and over small spatial scales. Road drains located on main roads appear to contain the poorest

  1. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulic simulation models, can outperform a signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed in increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
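
The model-based, multi-site idea can be sketched as follows; the residual statistic, threshold, and readings are illustrative assumptions, not the authors' formulation:

```python
# Sketch of model-based, multi-site event detection: compare each
# station's reading with the value predicted by a simulation model and
# raise ONE joint alarm from all residuals, rather than per-station
# alarms. Numbers and the threshold are illustrative.

def residuals(measured, predicted):
    return [m - p for m, p in zip(measured, predicted)]

def multi_site_alarm(measured, predicted, threshold=1.0):
    """Joint decision: alarm if the mean squared residual across the
    whole network exceeds the threshold."""
    r = residuals(measured, predicted)
    msr = sum(x * x for x in r) / len(r)
    return msr > threshold

# Chlorine readings (mg/L) vs. water-quality-model predictions:
predicted = [0.8, 0.7, 0.9, 0.6]
normal    = [0.82, 0.68, 0.88, 0.61]
event     = [0.81, 0.15, 0.32, 0.59]   # depressed chlorine at two sites
print(multi_site_alarm(normal, predicted, threshold=0.01))  # no alarm
print(multi_site_alarm(event, predicted, threshold=0.01))   # joint alarm
```

The joint statistic is what lets moderate anomalies at several stations, each too small to trip a per-station rule, accumulate into a single confident detection.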

  2. Eruptive event generator based on the Gibson-Low magnetic configuration

    Science.gov (United States)

    Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.

    2017-08-01

    Coronal mass ejections (CMEs), a kind of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become increasingly accessible, the demand grows for user-friendly tools that facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on the Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.

  3. Increments in insulin sensitivity during intensive treatment are closely correlated with decrements in glucocorticoid receptor mRNA in skeletal muscle from patients with Type II diabetes

    DEFF Research Database (Denmark)

    Vestergaard, H; Bratholm, P; Christensen, N J

    2001-01-01

    decreased significantly after intensive insulin treatment. A close correlation was found between increments in glucose uptake during intensive treatment and decrements in skeletal muscle total GCR mRNA (r=0.95, Pmultiple regression analysis), and between glucose uptake and alpha/alpha 2 GCR mRNA...

  4. Triggerless Readout with Time and Amplitude Reconstruction of Event Based on Deconvolution Algorithm

    International Nuclear Information System (INIS)

    Kulis, S.; Idzik, M.

    2011-01-01

    In future linear colliders like CLIC, where the period between bunch crossings is in the sub-nanosecond range (500 ps), an appropriate detection technique with triggerless signal processing is needed. In this work we discuss a technique, based on a deconvolution algorithm, suitable for the time and amplitude reconstruction of an event. In the implemented method, the output of a relatively slow shaper (spanning many bunch-crossing periods) is sampled and digitised in an ADC, and the deconvolution procedure is then applied to the digital data. The time of an event can be found with a precision of a few percent of the sampling time. The signal-to-noise ratio is only slightly decreased after passing through the deconvolution filter. The theoretical and Monte Carlo studies performed are confirmed by the results of preliminary measurements obtained with a dedicated system comprising a radiation source, a silicon sensor, front-end electronics, an ADC and further digital processing implemented on a PC. (author)
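
The deconvolution step can be illustrated with a minimal discrete inverse filter; the shaper response below is hypothetical, and the authors' actual filter is not specified in this record:

```python
# Deconvolution sketch: given the shaper's sampled impulse response h
# (with h[0] != 0), recover the input sequence x from the sampled
# output y by the recursive inverse filter
#   x[n] = ( y[n] - sum_{k=1..n} h[k] * x[n-k] ) / h[0].

def convolve(x, h):
    """Forward model: what the ADC records for input x."""
    y = [0.0] * len(x)
    for n in range(len(x)):
        for k in range(min(n + 1, len(h))):
            y[n] += h[k] * x[n - k]
    return y

def deconvolve(y, h):
    """Recursive inverse filter recovering x from y."""
    x = []
    for n in range(len(y)):
        acc = y[n]
        for k in range(1, min(n + 1, len(h))):
            acc -= h[k] * x[n - k]
        x.append(acc / h[0])
    return x

h = [1.0, 0.8, 0.5, 0.2]        # hypothetical slow-shaper response
x = [0, 0, 3.0, 0, 0, 0, 0]     # single event: amplitude 3 at sample 2
y = convolve(x, h)              # what the ADC would record
print([round(v, 6) for v in deconvolve(y, h)])  # time & amplitude recovered
```

On noiseless samples the inverse is exact; the abstract's point is that with real noise the filter costs only a small signal-to-noise penalty.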

  5. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is highly vulnerable to flood disasters because it experiences heavy rainfall events throughout the year. Flood is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets in order to understand the rainfall characteristics (intensity, pattern, etc.) preceding flood disasters in areas of monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are the CHIRPS satellite rainfall dataset at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  6. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    Full Text Available The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning is proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Secondly, based on the feature vector set, a back-propagation neural network (BPNN) is applied as the machine-learning component to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performance of the new method, the classification accuracies of a support vector machine (SVM), a naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show that the new method obtains the best classification performance. In addition, the influence of the scale factor q and the number of training samples n on the discrimination results is discussed. It is found that the classification accuracy of the new method reaches its highest value when q = 8–15 or 8–20 and n = 140.
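    The feature construction described above can be sketched minimally, assuming a standard Bandt-Pompe permutation entropy and simple coarse-graining for the scale factors (the parameters and test signals below are illustrative, not the paper's settings):

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalised (0..1) permutation entropy of a 1-D signal (Bandt-Pompe)."""
    patterns = {}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i : i + order * delay : delay]
        pattern = tuple(np.argsort(window))          # ordinal pattern of the window
        patterns[pattern] = patterns.get(pattern, 0) + 1
    total = sum(patterns.values())
    pe = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return pe / math.log(math.factorial(order))      # normalise by max entropy

def feature_vector(signal, scales=range(1, 6)):
    """Coarse-grain the signal at several scale factors and compute the
    permutation entropy at each, forming one multiscale feature vector."""
    feats = []
    for s in scales:
        trimmed = signal[: len(signal) // s * s]
        coarse = trimmed.reshape(-1, s).mean(axis=1)  # non-overlapping averages
        feats.append(permutation_entropy(coarse))
    return np.array(feats)

rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)       # broadband, irregular waveform
tone = np.sin(0.2 * np.arange(2000))    # smooth, regular waveform

fv_noise = feature_vector(noise)
fv_tone = feature_vector(tone)
```

    In this toy example the broadband noise yields higher entropy across scales than the regular tone; it is this kind of separation in the feature vectors that a downstream classifier such as a BPNN exploits.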

  7. A data base approach for prediction of deforestation-induced mass wasting events

    Science.gov (United States)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The objective of the present investigation is to examine the raster-based information system as a tool for predicting the location of the clear-cut mountain slopes most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.

  8. Cosmogenic nuclides in the Martian surface: constraints for sample recovery and transport

    International Nuclear Information System (INIS)

    Englert, P.A.J.

    1988-01-01

    Stable and radioactive cosmogenic nuclides and radiation damage effects such as cosmic ray tracks can provide information on the surface history of Mars. A recent overview of developments in cosmogenic nuclide research for historical studies of predominantly extraterrestrial materials was published previously. The information content of cosmogenic nuclides and radiation damage effects produced in the Martian surface is based on the different ways the primary galactic and solar cosmic radiation (GCR, SCR) and the secondary particle cascade interact. Generally, the kind and extent of the interactions, as seen in the products, depend on the following factors: (1) composition, energy and intensity of the primary SCR and GCR; (2) composition, energy and intensity of the GCR-induced cascade of secondary particles; (3) the target geometry, i.e., the spatial parameters of Martian surface features with respect to the primary radiation source; (4) the target chemistry, i.e., the chemical composition of the Martian surface at the sampling location down to the minor element level or lower; and (5) the duration of the exposure. These factors are not independent of each other and have a major influence on sampling strategies and techniques

  9. Generation of a Chinese Hamster Ovary Cell Line Producing Recombinant Human Glucocerebrosidase

    Science.gov (United States)

    Novo, Juliana Branco; Morganti, Ligia; Moro, Ana Maria; Paes Leme, Adriana Franco; Serrano, Solange Maria de Toledo; Raw, Isaias; Ho, Paulo Lee

    2012-01-01

    Impaired activity of the lysosomal enzyme glucocerebrosidase (GCR) results in the inherited metabolic disorder known as Gaucher disease. Current treatment consists of enzyme replacement therapy by administration of exogenous GCR. Although effective, it is exceptionally expensive, and patients worldwide have limited access to this medicine. In Brazil, the public healthcare system provides the drug free of charge for all Gaucher's patients, at a cost on the order of $84 million per year. However, production of GCR by public institutions in Brazil would significantly reduce therapy costs. Here, we describe a robust protocol for the generation of a cell line producing recombinant human GCR. The protein was expressed in CHO-DXB11 (dhfr−) cells after stable transfection and gene amplification with methotrexate. As expected, glycosylated GCR was detected by immunoblotting assay both in cell-associated (~64 and 59 kDa) and secreted (63–69 kDa) forms. Analysis of subclones allowed the selection of stable CHO cells producing a secreted functional enzyme, with a calculated productivity of 5.14 pg/cell/day for the highest producer. Although laborious, traditional methods of screening high-producing recombinant cells may represent a valuable alternative for generating expensive biopharmaceuticals in countries with limited resources. PMID:23091360

  10. Generation of a Chinese Hamster Ovary Cell Line Producing Recombinant Human Glucocerebrosidase

    Directory of Open Access Journals (Sweden)

    Juliana Branco Novo

    2012-01-01

    Full Text Available Impaired activity of the lysosomal enzyme glucocerebrosidase (GCR) results in the inherited metabolic disorder known as Gaucher disease. Current treatment consists of enzyme replacement therapy by administration of exogenous GCR. Although effective, it is exceptionally expensive, and patients worldwide have limited access to this medicine. In Brazil, the public healthcare system provides the drug free of charge for all Gaucher's patients, at a cost on the order of $84 million per year. However, production of GCR by public institutions in Brazil would significantly reduce therapy costs. Here, we describe a robust protocol for the generation of a cell line producing recombinant human GCR. The protein was expressed in CHO-DXB11 (dhfr−) cells after stable transfection and gene amplification with methotrexate. As expected, glycosylated GCR was detected by immunoblotting assay both in cell-associated (~64 and 59 kDa) and secreted (63–69 kDa) forms. Analysis of subclones allowed the selection of stable CHO cells producing a secreted functional enzyme, with a calculated productivity of 5.14 pg/cell/day for the highest producer. Although laborious, traditional methods of screening high-producing recombinant cells may represent a valuable alternative for generating expensive biopharmaceuticals in countries with limited resources.

  11. Subsurface Event Detection and Classification Using Wireless Signal Networks

    Directory of Open Access Journals (Sweden)

    Muhannad T. Suleiman

    2012-11-01

    Full Text Available Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. After event detection, the window-based classifier classifies geo-events within the regions where they occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment using data measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
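    The classification step can be illustrated with a simplified nearest-class-mean (minimum distance) sketch; the full method adds the Bayesian decision-theory formulation and the detection step, and the feature values and class names below are invented:

```python
import numpy as np

def train_class_means(features, labels):
    """Mean feature vector per geo-event class (the class 'templates')."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify_window(window_features, class_means):
    """Assign a classification window to the class whose mean feature
    vector is nearest (Euclidean distance) to the window's centroid."""
    centroid = window_features.mean(axis=0)
    return min(class_means, key=lambda c: np.linalg.norm(centroid - class_means[c]))

# Invented signal-strength-variation features for two geo-event classes.
train_x = np.array([[0.1, 0.2], [0.2, 0.1],    # e.g. background / dry soil
                    [0.9, 1.0], [1.0, 0.9]])   # e.g. water intrusion
train_y = np.array(["dry", "dry", "wet", "wet"])

means = train_class_means(train_x, train_y)
window = np.array([[0.95, 0.95], [1.05, 0.90]])  # samples inside one detection window
label = classify_window(window, means)           # -> "wet"
```

    Classifying the window's centroid rather than individual samples is what makes the decision window-based: a single noisy sample inside the window cannot flip the class on its own.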

  12. International Sport Events: Improving Marketing

    Directory of Open Access Journals (Sweden)

    Margarita Kerzaitė

    2014-04-01

    Full Text Available The report and the article present a comprehensive analysis of the need to improve international sport event marketing, highlighting the role of international sport events in contemporary society and the challenges arising in the context of globalization, and comparing the opinions of various authors on aspects of classification and the benefits for the host country. The article and the report reveal the main problems encountered in organizing international sport events and assess perspectives for their solution. They summarize the opportunities international sport events offer, chiefly the modernization of marketing tools through correction of the marketing mix, based on a systematic synthesis of marketing concepts and adaptation/standardization needs, and identify the most important factors in the marketing mix for the main marketing objectives. The article is based on analysis of the latest scientific literature.

  13. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    …existing market of tourist services to the requirements of consumers. This process prompts the search for science-based organizational and methodological recommendations for such adaptation in the regions of Russia and their further development. It is based on the results of the IV All-Russian Fair of open event tourism «Russian open Event Expo» and the III All-Russian competition in the field of event tourism, 12–14 November 2015 in Khanty-Mansiysk.

  14. Severity Classification of a Seismic Event based on the Magnitude-Distance Ratio Using Only One Seismological Station

    Directory of Open Access Journals (Sweden)

    Luis Hernán Ochoa Gutiérrez

    2014-07-01

    Full Text Available Seismic event characterization is often accomplished using algorithms based only on information received at the seismological stations located closest to the particular event, while ignoring historical data received at those stations; these historical data are stored and go unseen at this stage. This characterization process can delay the emergency response, costing valuable time in the mitigation of adverse effects on the affected population. Seismological stations have recorded data during many events that have been characterized by classical methods, and these data can be used as prior "knowledge" to train such stations to recognize patterns. This knowledge can be used to make faster characterizations from a single three-component broadband station by applying bio-inspired algorithms or recently developed stochastic methods, such as kernel methods. We trained a Support Vector Machine (SVM) algorithm with seismograph data recorded by INGEOMINAS's National Seismological Network at a three-component station located near Bogota, Colombia. As input model descriptors, we used: (1) the integral of the Fourier transform/power spectrum for each component, divided into 7 windows of 2 seconds beginning at the P onset time, and (2) the ratio between the calculated logarithm of magnitude (Mb) and epicentral distance. We used 986 events with magnitudes greater than 3 recorded from late 2003 to 2008. The algorithm classifies events with magnitude-distance ratios (a measure of the severity of possible damage caused by an earthquake) greater than a background value. This value can be used to estimate the magnitude based on a known epicentral distance, which is calculated from the difference between P and S onset times. This rapid (<20 seconds) magnitude estimate can be used for rapid response strategies. The results obtained in this work confirm that many hypocentral parameters and a rapid location of a seismic event can be obtained using a few
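    The first descriptor family, the integral of the power spectrum in consecutive post-onset windows, can be sketched as follows (the sampling rate, window count and synthetic traces below are illustrative assumptions, not the network's actual parameters):

```python
import numpy as np

def spectral_energy_features(trace, fs=50.0, n_windows=7, win_seconds=2.0):
    """Integral of the power spectrum in consecutive windows after the P onset,
    one feature per window (shown here for a single component)."""
    wlen = int(fs * win_seconds)
    feats = []
    for k in range(n_windows):
        seg = trace[k * wlen : (k + 1) * wlen]
        power = np.abs(np.fft.rfft(seg)) ** 2   # power spectrum of the window
        feats.append(power.sum())               # its integral, as one descriptor
    return np.array(feats)

# Two synthetic post-onset traces: a strong and a weak arrival in noise.
rng = np.random.default_rng(1)
fs = 50.0
t = np.arange(int(7 * 2 * fs)) / fs
strong = 10.0 * np.sin(2 * np.pi * 5.0 * t) + rng.standard_normal(t.size)
weak = 0.5 * np.sin(2 * np.pi * 5.0 * t) + rng.standard_normal(t.size)

f_strong = spectral_energy_features(strong, fs)
f_weak = spectral_energy_features(weak, fs)
```

    A stronger arrival produces larger spectral-energy features in every window; combined with the magnitude-distance ratio descriptor, this is the separation an SVM can then learn.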

  15. Evaluation of the Health Protection Event-Based Surveillance for the London 2012 Olympic and Paralympic Games.

    Science.gov (United States)

    Severi, E; Kitching, A; Crook, P

    2014-06-19

    The Health Protection Agency (HPA) (currently Public Health England) implemented the Health Protection Event-Based Surveillance (EBS) to provide additional national epidemic intelligence for the 2012 London Olympic and Paralympic Games (the Games). We describe EBS and evaluate the system attributes. EBS aimed at identifying, assessing and reporting to the HPA Olympic Coordination Centre (OCC) possible national infectious disease threats that may significantly impact the Games. EBS reported events in England from 2 July to 12 September 2012. EBS sourced events from reports from local health protection units and from screening an electronic application 'HPZone Dashboard' (DB). During this period, 147 new events were reported to EBS, mostly food-borne and vaccine-preventable diseases: 79 from regional units, 144 from DB (76 from both). EBS reported 61 events to the OCC: 21 of these were reported onwards. EBS sensitivity was 95.2%; positive predictive value was 32.8%; reports were timely (median one day; 10th percentile: 0 days - same day; 90th percentile: 3.6 days); completeness was 99.7%; stability was 100%; EBS simplicity was assessed as good; the daily time per regional or national unit dedicated to EBS was approximately 4 hours (weekdays) and 3 hours (weekends). OCC directors judged EBS as efficient, fast and responsive. EBS provided reliable, reassuring, timely, simple and stable national epidemic intelligence for the Games.

  16. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fitting for the Automatic IDentification of functional Events (AIDE) directly from real-world fNIRS neuroimaging data. To investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments, respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions respectively (condition 1, social: interact with a person; condition 2, non-social: interact with an object). AIDE managed to recover 3/4 and 3/6 events for conditions 1 and 2, respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is

  17. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Fries, R.J.; Rodriguez, R.; Ramirez, E.

    2010-08-14

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R{sub AA} can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q̂ extracted from fits to the measured nuclear modification factor R{sub AA} in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q̂ to fit R{sub AA} in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v{sub 2} and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.

  18. Event-by-event jet quenching

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, R. [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); Fries, R.J., E-mail: rjfries@comp.tamu.ed [Cyclotron Institute and Physics Department, Texas A and M University, College Station, TX 77843 (United States); RIKEN/BNL Research Center, Brookhaven National Laboratory, Upton, NY 11973 (United States); Ramirez, E. [Physics Department, University of Texas El Paso, El Paso, TX 79968 (United States)

    2010-09-27

    High momentum jets and hadrons can be used as probes for the quark gluon plasma (QGP) formed in nuclear collisions at high energies. We investigate the influence of fluctuations in the fireball on jet quenching observables by comparing propagation of light quarks and gluons through averaged, smooth QGP fireballs with event-by-event jet quenching using realistic inhomogeneous fireballs. We find that the transverse momentum and impact parameter dependence of the nuclear modification factor R{sub AA} can be fit well in an event-by-event quenching scenario within experimental errors. However the transport coefficient q extracted from fits to the measured nuclear modification factor R{sub AA} in averaged fireballs underestimates the value from event-by-event calculations by up to 50%. On the other hand, after adjusting q to fit R{sub AA} in the event-by-event analysis we find residual deviations in the azimuthal asymmetry v{sub 2} and in two-particle correlations, that provide a possible faint signature for a spatial tomography of the fireball. We discuss a correlation function that is a measure for spatial inhomogeneities in a collision and can be constrained from data.

  19. An event database for rotational seismology

    Science.gov (United States)

    Salvermoser, Johannes; Hadziioannou, Celine; Hable, Sarah; Chow, Bryant; Krischer, Lion; Wassermann, Joachim; Igel, Heiner

    2016-04-01

    The ring laser sensor (G-ring) located at Wettzell, Germany, has routinely observed earthquake-induced rotational ground motions around a vertical axis since its installation in 2003. Here we present results from a recently installed event database, the first to provide ring laser event data in an open-access format. Based on the GCMT event catalogue and some search criteria, seismograms from the ring laser and the collocated broadband seismometer are extracted and processed. The ObsPy-based processing scheme generates plots showing waveform fits between rotation rate and transverse acceleration and extracts characteristic wavefield parameters such as peak ground motions, noise levels, Love wave phase velocities and waveform coherence. For each event, these parameters are stored in a text file (a json dictionary) which is easily readable and accessible on the website. The database contains >10000 events starting in 2007 (Mw > 4.5). It is updated daily and therefore provides recent events with a maximum time lag of 24 hours. The user interface allows events to be filtered by epoch, magnitude, and source area, whereupon the events are displayed on a zoomable world map. We investigate how well the rotational motions are compatible with the expectations from the surface wave magnitude scale. In addition, the website offers some Python source code examples for downloading and processing the openly accessible waveforms.
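    Because each event is stored as a plain json dictionary, filtering the database client-side is straightforward. A minimal sketch (the field names and values below are invented for illustration; the actual schema is defined by the database):

```python
import json

# Illustrative per-event parameter dictionaries, mimicking the json text files
# the database stores (field names here are assumptions, not the real schema).
records = [
    json.dumps({"event_id": "evt_a", "Mw": 8.3, "peak_rotation_rate": 5.2e-7}),
    json.dumps({"event_id": "evt_b", "Mw": 8.2, "peak_rotation_rate": 4.1e-7}),
    json.dumps({"event_id": "evt_c", "Mw": 4.6, "peak_rotation_rate": 3.0e-9}),
]

def filter_by_magnitude(json_records, min_mw):
    """Parse each json dictionary and keep events at or above min_mw."""
    events = [json.loads(r) for r in json_records]
    return [e for e in events if e["Mw"] >= min_mw]

large = filter_by_magnitude(records, 8.0)
ids = [e["event_id"] for e in large]
```

    The same pattern extends to the other stored parameters (noise levels, phase velocities, coherence), since every event file parses to one flat dictionary.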

  20. NEBULAS A High Performance Data-Driven Event-Building Architecture based on an Asynchronous Self-Routing Packet-Switching Network

    CERN Multimedia

    Costa, M; Letheren, M; Djidi, K; Gustafsson, L; Lazraq, T; Minerskjold, M; Tenhunen, H; Manabe, A; Nomachi, M; Watase, Y

    2002-01-01

    RD31 : The project is evaluating a new approach to event building for level-two and level-three processor farms at high rate experiments. It is based on the use of commercial switching fabrics to replace the traditional bus-based architectures used in most previous data acquisition systems. Switching fabrics permit the construction of parallel, expandable, hardware-driven event builders that can deliver higher aggregate throughput than the bus-based architectures. A standard industrial switching fabric technology is being evaluated. It is based on Asynchronous Transfer Mode (ATM) packet-switching network technology. Commercial, expandable ATM switching fabrics and processor interfaces, now being developed for the future Broadband ISDN infrastructure, could form the basis of an implementation. The goals of the project are to demonstrate the viability of this approach, to evaluate the trade-offs involved in make versus buy options, to study the interfacing of the physics frontend data buffers to such a fabric, a...

  1. Spent fuels conditioning and irradiated nuclear fuel elements examination: the STAR facility and its abilities

    Energy Technology Data Exchange (ETDEWEB)

    Boussard, F.; Huillery, R. [CEA Centre d'Etudes de Cadarache, 13 - Saint-Paul-lez-Durance (France). Dept. d'Etudes des Combustibles; Averseng, J.L.; Serpantie, J.P. [Novatome Industries, 92 - Le Plessis-Robinson (France)]

    1994-12-31

    This paper is a presentation of the STAR facility, a high activity laboratory located in Cadarache Nuclear Research Center (France). The purpose of the STAR facility and of the associated processes, is the treatment, cleaning and conditioning of spent fuels from Gas Cooled Reactors (GCR) and in particular of about 2300 spent GCR fuel cartridges irradiated more than 20 years ago in Electricite de France (EDF) or CEA Uranium Graphite GCR. The processes are: to separate the nuclear fuel from the clad remains, to chemically stabilize the nuclear material and to condition it in sealed canisters. An additional objective of STAR consists in non-destructive or destructive examinations and tests on PWR rods or FBR pins in the frame of fuel development programs. The paper describes the STAR facility conceptual design (safety design rules, hot cells..) and the different options corresponding to the GCR reconditioning process and to further research and development works on various fuel types. (J.S.). 3 figs.

  2. Spent fuels conditioning and irradiated nuclear fuel elements examination: the STAR facility and its abilities

    International Nuclear Information System (INIS)

    Boussard, F.; Huillery, R.

    1994-01-01

    This paper is a presentation of the STAR facility, a high activity laboratory located in Cadarache Nuclear Research Center (France). The purpose of the STAR facility and of the associated processes, is the treatment, cleaning and conditioning of spent fuels from Gas Cooled Reactors (GCR) and in particular of about 2300 spent GCR fuel cartridges irradiated more than 20 years ago in Electricite de France (EDF) or CEA Uranium Graphite GCR. The processes are: to separate the nuclear fuel from the clad remains, to chemically stabilize the nuclear material and to condition it in sealed canisters. An additional objective of STAR consists in non-destructive or destructive examinations and tests on PWR rods or FBR pins in the frame of fuel development programs. The paper describes the STAR facility conceptual design (safety design rules, hot cells..) and the different options corresponding to the GCR reconditioning process and to further research and development works on various fuel types. (J.S.). 3 figs

  3. Mathematical foundations of event trees

    International Nuclear Information System (INIS)

    Papazoglou, Ioannis A.

    1998-01-01

    A mathematical foundation from first principles of event trees is presented. The main objective of this formulation is to offer a formal basis for developing automated computer assisted construction techniques for event trees. The mathematical theory of event trees is based on the correspondence between the paths of the tree and the elements of the outcome space of a joint event. The concept of a basic cylinder set is introduced to describe joint event outcomes conditional on specific outcomes of basic events or unconditional on the outcome of basic events. The concept of outcome space partition is used to describe the minimum amount of information intended to be preserved by the event tree representation. These concepts form the basis for an algorithm for systematic search for and generation of the most compact (reduced) form of an event tree consistent with the minimum amount of information the tree should preserve. This mathematical foundation allows for the development of techniques for automated generation of event trees corresponding to joint events which are formally described through other types of graphical models. Such a technique has been developed for complex systems described by functional blocks and it is reported elsewhere. On the quantification issue of event trees, a formal definition of a probability space corresponding to the event tree outcomes is provided. Finally, a short discussion is offered on the relationship of the presented mathematical theory with the more general use of event trees in reliability analysis of dynamic systems
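    The correspondence between tree paths and elements of the joint outcome space, and the idea of a reduced tree that preserves only a chosen partition of outcomes, can be illustrated with a minimal sketch (the basic events, outcomes and merging rule below are invented for illustration):

```python
from itertools import product

# Basic events and their outcome sets (illustrative).
basic_events = {"initiator": ["occurs"],
                "pump_start": ["success", "failure"],
                "valve_open": ["success", "failure"]}

def event_tree_paths(events):
    """Each path of the tree corresponds to one element of the joint outcome
    space: the Cartesian product of the basic-event outcome sets."""
    names = list(events)
    return [dict(zip(names, combo)) for combo in product(*events.values())]

def reduced_tree(paths):
    """Reduce the tree by merging paths the outcome partition does not
    distinguish: here, pump failure leads to the same end state regardless
    of the valve outcome, so those paths collapse into one."""
    seen, reduced = set(), []
    for p in paths:
        key = (p["pump_start"],) if p["pump_start"] == "failure" else tuple(p.values())
        if key not in seen:
            seen.add(key)
            reduced.append(p)
    return reduced

paths = event_tree_paths(basic_events)   # 1 * 2 * 2 = 4 full paths
compact = reduced_tree(paths)            # 3 paths after reduction
```

    Here the pump failure makes the valve outcome irrelevant to the end state, so the two corresponding paths merge and the four full paths reduce to three, which is the kind of compact (reduced) form the algorithm in the paper searches for systematically.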

  4. On-line detection of apnea/hypopnea events using SpO2 signal: a rule-based approach employing binary classifier models.

    Science.gov (United States)

    Koley, Bijoy Laxmi; Dey, Debangshu

    2014-01-01

    This paper presents an online method for automatic detection of apnea/hypopnea events from the oxygen saturation (SpO2) signal, measured at the fingertip by a Bluetooth nocturnal pulse oximeter. Event detection is performed by identifying abnormal data segments in the recorded SpO2 signal, employing a binary classifier model based on a support vector machine (SVM). Each abnormal segment is then further analyzed to detect the different states within it, i.e., steady, desaturation, and resaturation, with the help of another SVM-based binary ensemble classifier model. Finally, a heuristically obtained rule-based system is used to identify the apnea/hypopnea events from the time-sequenced decisions of these classifier models. In the development phase, a set of 34 time-domain features was extracted from the segmented SpO2 signal using an overlapped windowing technique. An optimal set of features was then selected using a recursive feature elimination technique. A total of 34 subjects were included in the study. The results show average event detection accuracies of 96.7% and 93.8% for the offline and online tests, respectively. The proposed system provides a direct estimate of the apnea/hypopnea index with the help of a relatively inexpensive and widely available pulse oximeter. Moreover, the system can be monitored and accessed by physicians through LAN/WAN/Internet and can be extended for deployment on Bluetooth-enabled mobile phones.
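    The segmentation step described above can be sketched as follows, assuming 50%-overlapped windows and a small illustrative subset of time-domain descriptors (the paper's actual 34 features and window settings are not reproduced here):

```python
import numpy as np

def overlapped_windows(signal, win=30, step=15):
    """Segment the SpO2 trace into 50%-overlapped windows (lengths in samples)."""
    return [signal[i : i + win] for i in range(0, len(signal) - win + 1, step)]

def window_features(seg):
    """A few simple time-domain descriptors a segment classifier might use
    (an illustrative subset, not the paper's full feature set)."""
    return np.array([seg.mean(),            # average saturation
                     seg.min(),             # lowest saturation in the window
                     seg.max() - seg.min(), # saturation range
                     np.diff(seg).std()])   # sample-to-sample variability

# Synthetic SpO2 trace: steady ~97% with one desaturation/resaturation episode.
spo2 = np.full(120, 97.0)
spo2[50:70] = np.linspace(97, 88, 20)   # desaturation ramp
spo2[70:90] = np.linspace(88, 97, 20)   # resaturation ramp

feats = np.array([window_features(w) for w in overlapped_windows(spo2)])
dip_window = int(np.argmin(feats[:, 1]))  # window containing the lowest saturation
```

    Each row of `feats` is one candidate input to the SVM segment classifier; windows overlapping the desaturation episode stand out clearly in the minimum-saturation and range descriptors.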

  5. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    Energy Technology Data Exchange (ETDEWEB)

    Khristova, R; Kaltchev, B; Dimitrov, B [Energoproekt, Sofia (Bulgaria); Nedyalkova, D; Sonev, A [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs.

  6. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    International Nuclear Information System (INIS)

    Khristova, R.; Kaltchev, B.; Dimitrov, B.; Nedyalkova, D.; Sonev, A.

    1995-01-01

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs

  7. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. The software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window-based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rates and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
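    The threshold-based event detection mentioned above can be illustrated with a toy upward-crossing detector. This is a hedged sketch, not g-PRIME's actual algorithm, and `min_gap` is an assumed refractory parameter:

```python
def detect_events(trace, threshold, min_gap=1):
    """Toy upward-crossing detector in the spirit of g-PRIME's
    threshold-and-window event detection (not the tool's actual code).
    Returns sample indices where the trace crosses `threshold` upward,
    suppressing crossings closer than `min_gap` samples to the last event."""
    events = []
    for i in range(1, len(trace)):
        if trace[i - 1] < threshold <= trace[i]:
            if not events or i - events[-1] >= min_gap:
                events.append(i)
    return events

def inter_event_intervals(events):
    """Sample-count intervals between successive detected events."""
    return [b - a for a, b in zip(events, events[1:])]
```

    The interval list is the raw material for the inter-event rate and histogram displays the abstract describes.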

  8. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
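    The idea of decreasing a SEP probability forecast as no event appears can be sketched as a Bayesian update against a delay-time survival function. The exponential form and the e-folding time `tau` below are assumptions for illustration only; Kahler and Ling derive their decay from the observed flare-to-onset delay-time distributions:

```python
import math

def dynamic_sep_probability(p0, hours_since_flare, tau=12.0):
    """Decay an initial SEP event probability `p0` as time passes with no
    event observed. The paper builds its decay from observed flare-to-onset
    delay distributions; here an exponential survival function with an
    assumed e-folding time `tau` (hours) stands in for that distribution."""
    # Fraction of (assumed exponential) onset delays longer than t:
    survival = math.exp(-hours_since_flare / tau)
    # Bayes: P(event still to come | no onset seen by t)
    return p0 * survival / (p0 * survival + (1.0 - p0))
```

    At t = 0 the forecast equals the initial probability, and it decays monotonically toward zero as the waiting time grows, mirroring the qualitative behavior of the paper's dynamic forecast Pd.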

  9. Radiation 2006. In association with the Polymer Division, Royal Australian Chemical Institute. Incorporating the 21st AINSE Radiation Chemistry Conference and the 18th Radiation Biology Conference, conference handbook

    International Nuclear Information System (INIS)

    Wroe, Andrew; Rosenfeld, A. B.; Cornelius, I. M.; Pisacane, V. L.; Ziegler, J. F.; Nelson, M. E.; Cucinotta, F.; Zaider, M.; Dicello, J. F.

    2006-01-01

    Full text: Humans exploring outer space are exposed to space radiation composed of high-energy protons and heavy ions. In deep space, the radiation environment consists mainly of galactic cosmic radiation (GCR). In the energy range from 100 MeV per nucleon to 10 GeV per nucleon, the GCR consists of 87 percent protons, 12 percent helium ions, and 1 percent heavier ions (Simpson 1983). Protons are also the major component of solar particle events (SPEs), with a smaller contribution from helium and heavier ions emitted from the Sun. Organizations planning and conducting space travel, such as NASA and ESA, have a fundamental interest in evaluating adverse health effects induced by GCR and SPEs in human space explorers and their offspring. In future space missions both personnel and electronic devices will be required to perform for longer periods within a radiation environment. For these applications it is imperative that heterogeneous shielding structures, biological structures, and the secondaries produced by such structures be investigated thoroughly. The spectra of secondaries can be investigated and monitored utilising solid-state microdosimeters such as those developed at the CMRP. The effectiveness of such shielding in inhibiting the production of harmful secondaries can also be investigated, and in turn optimised, utilising Monte Carlo radiation transport simulation studies that use microdosimetry as the measurement parameter. The microdosimetric spectra produced by a solar proton radiation field traversing heterogeneous structures, such as spacecraft shielding and the astronaut, will be investigated as part of this simulation study. The effectiveness of the microdosimeter in such an environment will also be assessed and presented.

  10. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    Science.gov (United States)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of optical fiber prewarning system (OFPS), this study proposed a hierarchical recognition algorithm (HRA). Compared with traditional methods, which employ only a complex algorithm that includes multiple extracted features and complex classifiers to increase the recognition rate with a considerable decrease in recognition speed, HRA takes advantage of the continuity of intrusion events, thereby creating a staged recognition flow inspired by stress reaction. HRA is expected to achieve high-level recognition accuracy with less time consumption. First, this work analyzed the continuity of intrusion events and then presented the algorithm based on the mechanism of stress reaction. Finally, it verified the time consumption through theoretical analysis and experiments, and the recognition accuracy was obtained through experiments. Experiment results show that the processing speed of HRA is 3.3 times faster than that of a traditional complicated algorithm and has a similar recognition rate of 98%. The study is of great significance to fast intrusion event recognition in OFPS.

  11. A membrane glucocorticoid receptor mediates the rapid/non-genomic actions of glucocorticoids in mammalian skeletal muscle fibres.

    Science.gov (United States)

    Pérez, María Hernández-Alcalá; Cormack, Jonathan; Mallinson, David; Mutungi, Gabriel

    2013-10-15

    Glucocorticoids (GCs) are steroid hormones released from the adrenal gland in response to stress. They are also some of the most potent anti-inflammatory and immunosuppressive drugs currently in clinical use. They exert most of their physiological and pharmacological actions through the classical/genomic pathway. However, they also have rapid/non-genomic actions whose physiological and pharmacological functions are still poorly understood. Therefore, the primary aim of this study was to investigate the rapid/non-genomic effects of two widely prescribed glucocorticoids, beclomethasone dipropionate (BDP) and prednisolone acetate (PDNA), on force production in isolated, intact, mouse skeletal muscle fibre bundles. The results show that the effects of both GCs on maximum isometric force (Po) were fibre-type dependent. Thus, they increased Po in the slow-twitch fibre bundles without significantly affecting that of the fast-twitch fibre bundles. The increase in Po occurred within 10 min and was insensitive to the transcriptional inhibitor actinomycin D. Also, it was maximal at ∼250 nM and was blocked by the glucocorticoid receptor (GCR) inhibitor RU486 and a monoclonal anti-GCR, suggesting that it was mediated by a membrane (m) GCR. Both muscle fibre types expressed a cytosolic GCR. However, a mGCR was present only in the slow-twitch fibres. The receptor was more abundant in oxidative than in glycolytic fibres and was confined mainly to the periphery of the fibres where it co-localised with laminin. From these findings we conclude that the rapid/non-genomic actions of GCs are mediated by a mGCR and that they are physiologically/therapeutically beneficial, especially in slow-twitch muscle fibres.

  12. Trail-Based Search for Efficient Event Report to Mobile Actors in Wireless Sensor and Actor Networks.

    Science.gov (United States)

    Xu, Zhezhuang; Liu, Guanglun; Yan, Haotian; Cheng, Bin; Lin, Feilong

    2017-10-27

    In wireless sensor and actor networks, when an event is detected, the sensor node needs to transmit an event report to inform the actor. Since the actor moves in the network to execute missions, its location is always unavailable to the sensor nodes. A popular solution is a search strategy that can forward the data to a node without its location information. However, most existing works have not considered the mobility of the node, and thus generate significant energy consumption or transmission delay. In this paper, we propose the trail-based search (TS) strategy, which takes advantage of the actor's mobility to improve search efficiency. The main idea of TS is that, as the actor moves in the network, it leaves a trail composed of continuous footprints. The search packet with the event report is transmitted in the network to search for the actor or its footprints. Once an effective footprint is discovered, the packet is forwarded along the trail until it is received by the actor. Moreover, we derive the condition that guarantees trail connectivity, and propose a redundancy reduction scheme based on TS (TS-R) to reduce the nontrivial transmission redundancy generated by the trail. Theoretical and numerical analysis is provided to prove the efficiency of TS. Compared with the well-known expanding ring search (ERS), TS significantly reduces energy consumption and search delay.
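    Two pieces of the TS strategy lend themselves to a small sketch: a trail-connectivity check (consecutive footprints within radio range — an assumed formalization of the paper's condition) and forwarding along the trail once a footprint is hit:

```python
def trail_connected(footprints, radio_range):
    """Assumed form of the paper's trail-connectivity condition: each pair
    of consecutive footprints (x, y) must lie within radio range, so a
    search packet can hop footprint-to-footprint toward the actor."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return all(dist(a, b) <= radio_range
               for a, b in zip(footprints, footprints[1:]))

def forward_along_trail(footprints, hit_index):
    """Once a search packet hits footprint `hit_index`, relay it along the
    remaining trail; the last footprint approximates the actor's position."""
    return footprints[hit_index:]
```

    The TS-R redundancy reduction and the derivation of the exact connectivity condition are in the paper itself; the functions above only illustrate the basic mechanism.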

  13. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The target of the project was to develop applications which observe the functions of a protection system by using modern microprocessor-based relays. Microprocessor-based relays have three essential capabilities: communication with the SCADA, an internal clock to produce time-stamped event data, and the capability to register certain values during the fault. Using the above features, some new functions for event analysis were developed in the project.

  14. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    Science.gov (United States)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution, and often destroy abundant gelatinous animals resulting in species undersampling. Video camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50m to 4000m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames is developed by detecting whether or not there is an interesting candidate object for an animal present in a particular sequence of underwater video -- video frames that do not contain any "interesting" events. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are

  15. The Advanced Photon Source event system

    International Nuclear Information System (INIS)

    Lenkszus, F.R.; Laird, R.

    1995-01-01

    The Advanced Photon Source, like many other facilities, requires a means of transmitting timing information to distributed control system I/O controllers. The APS event system provides the means of distributing medium resolution/accuracy timing events throughout the facility. It consists of VME event generators and event receivers which are interconnected with 100 Mbit/sec fiber optic links at distances of up to 650 m in either a star or a daisy chain configuration. The system's event throughput rate is 10 Mevents/sec with a peak-to-peak timing jitter down to 100 ns, depending on the source of the event. It is integrated into the EPICS-based APS control system through record and device support. Event generators broadcast timing events over fiber optic links to event receivers which are programmed to decode specific events. Event generators generate events in response to external inputs, from internal programmable event sequence RAMs, and from VME bus writes. The event receivers can be programmed to generate both pulse and set/reset level outputs to synchronize hardware, and to generate interrupts to initiate EPICS record processing. In addition, each event receiver contains a time stamp counter which is used to provide synchronized time stamps to EPICS records

  16. Modification of corrosion resistances of steels by rare earths ion implantation

    International Nuclear Information System (INIS)

    Hu Zhaomin; Zhang Weiguo; Liu Fengying; Shao Tongyi; Xiang Xuyang; Gao Fengqin; Li Gongpan

    1987-01-01

    Five rare earth (RE) elements have been implanted into steel No.45 and GCr15 bearing steel, respectively. The corrosion resistances of the specimens have been examined using the electrochemical dynamic potential method, in a NaAc/HAc solution for the steel No.45 specimens and in a NaAc/HAc solution containing 0.1 mol/l NaCl for the GCr15 bearing steel specimens. It has been found that the aqueous solution corrosion resistance of steel No.45 is markedly improved by implantation of RE elements, and the pitting corrosion properties of GCr15 bearing steel are significantly improved by heavy RE element implantation

  17. Evidence of nearby supernovae affecting life on Earth

    DEFF Research Database (Denmark)

    Svensmark, Henrik

    2012-01-01

    A statistical analysis indicates that the Solar system has experienced many large short-term increases in the flux of Galactic cosmic rays (GCR) from nearby SNe. The hypothesis that a high GCR flux should coincide with cold conditions on the Earth is borne out by comparing the general geological record...

  18. Event Index - a LHCb Event Search System

    CERN Document Server

    INSPIRE-00392208; Kazeev, Nikita; Redkin, Artem

    2015-12-23

    LHC experiments generate up to $10^{12}$ events per year. This paper describes Event Index - an event search system. Event Index's primary function is quickly selecting subsets of events from a combination of conditions, such as the estimated decay channel or stripping lines output. Event Index is essentially Apache Lucene optimized for read-only indexes distributed over independent shards on independent nodes.
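    The conjunctive selection Event Index performs over its Lucene shards can be mimicked with a toy inverted index. The attribute names below are hypothetical placeholders, not LHCb's actual stripping-line identifiers:

```python
def build_index(events):
    """Toy inverted index: map each (attribute, value) pair to the set of
    event ids carrying it, loosely mimicking a read-only Lucene shard."""
    index = {}
    for eid, attrs in events.items():
        for key, value in attrs.items():
            index.setdefault((key, value), set()).add(eid)
    return index

def select(index, **conditions):
    """Conjunctive selection: intersect the posting sets of all conditions."""
    postings = [index.get(item, set()) for item in conditions.items()]
    return set.intersection(*postings) if postings else set()
```

    In the real system each shard holds its own index and the intersection is distributed across independent nodes; the sketch collapses that into a single dictionary.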

  19. Toward zero waste: Composting and recycling for sustainable venue based events

    International Nuclear Information System (INIS)

    Hottle, Troy A.; Bilec, Melissa M.; Brown, Nicholas R.; Landis, Amy E.

    2015-01-01

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO₂-equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO₂ eq. emissions and energy use because of the retention of high-value materials, but are compounded by the difficulty in managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO₂ eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night (with the

  20. Toward zero waste: Composting and recycling for sustainable venue based events

    Energy Technology Data Exchange (ETDEWEB)

    Hottle, Troy A., E-mail: troy.hottle@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 370 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States); Bilec, Melissa M., E-mail: mbilec@pitt.edu [University of Pittsburgh, Civil and Environmental Engineering, 153 Benedum Hall, 3700 O’Hara Street, Pittsburgh, PA 15261-3949 (United States); Brown, Nicholas R., E-mail: nick.brown@asu.edu [Arizona State University, University Sustainability Practices, 1130 East University Drive, Suite 206, Tempe, AZ 85287 (United States); Landis, Amy E., E-mail: amy.landis@asu.edu [Arizona State University, School of Sustainable Engineering and the Built Environment, 375 Interdisciplinary Science and Technology Building 4 (ISTB4), 781 East Terrace Road, Tempe, AZ 85287-6004 (United States)

    2015-04-15

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycle and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO₂-equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO₂ eq. emissions and energy use because of the retention of high-value materials, but are compounded by the difficulty in managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO₂ eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates; the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series. Contamination rates were reduced from 34% in the first game to 11% on the second night

  1. Low earth orbit radiation dose distribution in a phantom head

    International Nuclear Information System (INIS)

    Konradi, A.; Badhwar, G.D.; Cash, B.L.; Hardy, K.A.

    1992-01-01

    In order to compare analytical methods with data obtained during exposure to space radiation, a phantom head instrumented with a large number of radiation detectors was flown on the Space Shuttle on three occasions: 8 August 1989 (STS-28), 28 February 1990 (STS-36), and 24 April 1990 (STS-31). The objective of this experiment was to measure the inhomogeneity of the dose distribution within a phantom head volume. The orbits of these missions were complementary: STS-28 and STS-36 had high inclinations and low altitudes, while STS-31 had a low inclination and high altitude. In the cases of STS-28 and STS-36, the main contribution to the radiation dose comes from galactic cosmic rays (GCR), with a minor to negligible part supplied by the inner belt through the South Atlantic Anomaly (SAA), and for STS-28 an even smaller one from a proton enhancement during a solar flare-associated proton event. For STS-31, the inner belt protons dominate and the GCR contribution is almost negligible. The internal dose distribution is consistent with the mass distribution of the orbiter and the self-shielding and physical location of the phantom head. (author)

  2. Root cause analysis of relevant events

    International Nuclear Information System (INIS)

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed, all of them from the Atucha I and Embalse nuclear power plants. It is concluded that the systematic methodology used also allows the investigation of minor events that were precursors of the events selected. (author)

  3. Integrated Initiating Event Performance Indicators

    International Nuclear Information System (INIS)

    S. A. Eide; Dale M. Rasmuson; Corwin L. Atwood

    2005-01-01

    The U.S. Nuclear Regulatory Commission Industry Trends Program (ITP) collects and analyses industry-wide data, assesses the safety significance of results, and communicates results to Congress and other stakeholders. This paper outlines potential enhancements in the ITP to comprehensively cover the Initiating Events Cornerstone of Safety. Future work will address other cornerstones of safety. The proposed Tier 1 activity involves collecting data on ten categories of risk-significant initiating events, trending the results, and comparing early performance with prediction limits (allowable numbers of events, above which NRC action may occur). Tier 1 results would be used to monitor industry performance at the level of individual categories of initiating events. The proposed Tier 2 activity involves integrating the information for individual categories of initiating events into a single risk-based indicator, termed the Baseline Risk Index for Initiating Events or BRIIE. The BRIIE would be evaluated yearly and compared against a threshold. BRIIE results would be reported to Congress on a yearly basis

  4. Results from a data acquisition system prototype project using a switch-based event builder

    Energy Technology Data Exchange (ETDEWEB)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D. (Fermi National Accelerator Lab., Batavia, IL (United States)); Bowden, M.; Booth, A. (Superconducting Super Collider Lab., Dallas, TX (United States)); Cancelo, G. (La Plata Univ. Nacional (Argentina))

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs.

  5. Results from a data acquisition system prototype project using a switch-based event builder

    International Nuclear Information System (INIS)

    Black, D.; Andresen, J.; Barsotti, E.; Baumbaugh, A.; Esterline, D.; Knickerbocker, K.; Kwarciany, R.; Moore, G.; Patrick, J.; Swoboda, C.; Treptow, K.; Trevizo, O.; Urish, J.; VanConant, R.; Walsh, D.; Bowden, M.; Booth, A.; Cancelo, G.

    1991-11-01

    A prototype of a high bandwidth parallel event builder has been designed and tested. The architecture is based on a simple switching network and is adaptable to a wide variety of data acquisition systems. An eight channel system with a peak throughput of 160 Megabytes per second has been implemented. It is modularly expandable to 64 channels (over one Gigabyte per second). The prototype uses a number of relatively recent commercial technologies, including very high speed fiber-optic data links, high integration crossbar switches and embedded RISC processors. It is based on an open architecture which permits the installation of new technologies with little redesign effort. 5 refs., 6 figs

  6. Comparative Effectiveness of Tacrolimus-Based Steroid Sparing versus Steroid Maintenance Regimens in Kidney Transplantation: Results from Discrete Event Simulation.

    Science.gov (United States)

    Desai, Vibha C A; Ferrand, Yann; Cavanaugh, Teresa M; Kelton, Christina M L; Caro, J Jaime; Goebel, Jens; Heaton, Pamela C

    2017-10-01

    Corticosteroids used as immunosuppressants to prevent acute rejection (AR) and graft loss (GL) following kidney transplantation are associated with serious cardiovascular and other adverse events. Evidence from short-term randomized controlled trials suggests that many patients on a tacrolimus-based immunosuppressant regimen can withdraw from steroids without increased AR or GL risk. To measure the long-term tradeoff between GL and adverse events for a heterogeneous-risk population and determine the optimal timing of steroid withdrawal. A discrete event simulation was developed including, as events, AR, GL, myocardial infarction (MI), stroke, cytomegalovirus, and new onset diabetes mellitus (NODM), among others. Data from the United States Renal Data System were used to estimate event-specific parametric regressions, which accounted for steroid-sparing regimen (avoidance, early 7-d withdrawal, 6-mo withdrawal, 12-mo withdrawal, and maintenance) as well as patients' demographics, immunologic risks, and comorbidities. Regression-equation results were used to derive individual time-to-event Weibull distributions, used, in turn, to simulate the course of patients over 20 y. Patients on steroid avoidance or an early-withdrawal regimen were more likely to experience AR (45.9% to 55.0% v. 33.6%, P events and other outcomes with no worsening of AR or GL rates compared with steroid maintenance.
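    The simulation described above draws individual times-to-event from Weibull distributions and schedules whichever event fires first. A competing-risks sketch of that step, under assumed shape/scale parameters (not the USRDS-fitted values from the study):

```python
import math
import random

def weibull_time(shape, scale, rng):
    """Inverse-CDF draw of a time-to-event from a Weibull distribution."""
    return scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)

def next_event(hazards, rng):
    """Competing-risks step of a discrete event simulation: draw a
    time-to-event for every outcome and schedule whichever fires first.
    `hazards` maps event name -> (shape, scale); values are illustrative,
    not the USRDS-fitted parameters from the study."""
    draws = {name: weibull_time(k, lam, rng)
             for name, (k, lam) in hazards.items()}
    winner = min(draws, key=draws.get)
    return winner, draws[winner]
```

    A full run would repeat this step over a 20-year horizon, updating each patient's covariates (and hence the regression-derived Weibull parameters) after every scheduled event.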

  7. Rocket and ground-based study of an auroral breakup event

    International Nuclear Information System (INIS)

    Marklund, G.

    1982-02-01

    On 27 January, 1979 the substorm-GEOS rocket S23H was launched from ESRANGE, Kiruna, shortly after the onset of an intense magnetospheric substorm over northern Scandinavia. Rocket electric field and particle observations have been used to calculate ionospheric currents and heating rates. These results are generally consistent with the ground magnetic and optical observations. An important finding emerging from a comparison of this event with a pre-breakup event earlier on this day is that the ionospheric substorm-related electric field could be split up into two parts, namely: 1) an ambient LT dependent field, probably of magnetospheric origin 2) superimposed on this a small-scale electric field associated with the bright auroral structures, being southward for both events. This is shown to have important consequences on the location of the ionospheric currents and the Joule energy discussion relative to the auroral forms. (Author)

  8. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full text: Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.
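    The greedy heuristic among the three strategies can be sketched as fixing one configurable node at a time and keeping the option that scores best under a caller-supplied log-fitness function. The fitness measure itself is an assumption here; in the paper it would quantify how well the configured model replays the branch's event log:

```python
def greedy_configure(config_options, fitness):
    """Greedy heuristic (sketch): fix one configurable node at a time,
    keeping the option whose partial configuration scores best under a
    caller-supplied fitness function. `fitness` is assumed to accept a
    partial {node: option} mapping and return a comparable score."""
    choice = {}
    for node, options in config_options.items():
        choice[node] = max(options,
                           key=lambda opt: fitness({**choice, node: opt}))
    return choice
```

    Unlike the exhaustive-search strategy, this visits each node once, so its cost grows linearly rather than exponentially in the number of configurable nodes, at the price of possibly missing the global optimum when choices interact.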

  9. Effect of a ward-based pharmacy team on preventable adverse drug events in surgical patients (SUREPILL study)

    NARCIS (Netherlands)

    de Boer, M.; Boeker, E. B.; Ramrattan, M. A.; Kiewiet, J. J. S.; Ram, K.; Gombert-Handoko, K. B.; van Lent-Evers, N. A. E. M.; Kuks, P. F. M.; Mulder, W. M. C.; Breslau, P. J.; Oostenbroek, R. J.; Dijkgraaf, M. G. W.; Lie-A-Huen, L.; Boermeester, M. A.

    2015-01-01

    Surgical patients are at risk of adverse drug events (ADEs) causing morbidity and mortality. Much harm is preventable. Ward-based pharmacy interventions to reduce medication-related harm have not been evaluated in surgical patients. This multicentre prospective clinical trial evaluated a

  10. Event segmentation ability uniquely predicts event memory.

    Science.gov (United States)

    Sargent, Jesse Q; Zacks, Jeffrey M; Hambrick, David Z; Zacks, Rose T; Kurby, Christopher A; Bailey, Heather R; Eisenberg, Michelle L; Beck, Taylor M

    2013-11-01

    Memory for everyday events plays a central role in tasks of daily living, autobiographical memory, and planning. Event memory depends in part on segmenting ongoing activity into meaningful units. This study examined the relationship between event segmentation and memory in a lifespan sample to answer the following question: Is the ability to segment activity into meaningful events a unique predictor of subsequent memory, or is the relationship between event perception and memory accounted for by general cognitive abilities? Two hundred and eight adults ranging from 20 to 79 years old segmented movies of everyday events and attempted to remember the events afterwards. They also completed psychometric ability tests and tests measuring script knowledge for everyday events. Event segmentation and script knowledge both explained unique variance in event memory above and beyond the psychometric measures, and did so as strongly in older as in younger adults. These results suggest that event segmentation is a basic cognitive mechanism, important for memory across the lifespan. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Y-90 microsphere therapy: prevention of adverse events.

    Science.gov (United States)

    Schultz, Cheryl C; Campbell, Janice; Bakalyar, Donovan; Beauvais, Michele; Feng, Wenzheng; Savin, Michael

    2009-08-01

    Thirty-three (33) events that were inconsistent with intended treatment for 471 Y-90 microsphere deliveries were analyzed from 2001 to 2007. Each occurrence was categorized, based on root-cause analysis, as a device/product defect and/or an operator error event. Events were further categorized, if there was an adverse outcome, as spill/leak, termination, recatheterization, dose deviation, and/or a regulatory medical event. Of 264 Y-90 Therasphere (MDS Nordion, Ottawa, Ontario, Canada) treatments, 15 events were reported (5.7%). Of 207 Y-90 SIR-Spheres (Sirtex, Wilmington, MA) treatments, 18 events were reported (8.7%). Twenty-five (25) of 33 events (76%) were device/product defects: 73% for Therasphere (11 of 15) and 78% for SIR-Spheres (14 of 18). There were 31 adverse outcomes associated with the 33 events: 15 were leaks and/or spills, 9 resulted in termination of the dose administration, 3 resulted in recatheterization for dose compensation, 2 were dose deviations (doses differing from the prescribed dose by between 10% and 20%), and 2 were reported as regulatory medical events. Fifty-five (55) corrective actions were taken: 39 (71%) were related to the manufacturer and 16 (29%) were hospital based. This process of analyzing each event and measuring our outcomes has been effective at minimizing adverse events and improving patient safety.

  12. Detection of anomalous events

    Science.gov (United States)

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or another classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
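
    As a rough illustration of scoring events by anomalousness, the sketch below fits a normal distribution to baseline traffic and scores new observations by negative log-likelihood, with a user-set threshold regulating the alert rate. All numbers and names are illustrative, not taken from the patent:

```python
import math

def gaussian_anomaly_score(x, mean, std):
    # anomalousness as negative log-likelihood under a fitted normal model;
    # rarer observations receive higher scores
    z = (x - mean) / std
    return 0.5 * z * z + math.log(std * math.sqrt(2 * math.pi))

def fit(values):
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, math.sqrt(var)

baseline = [10.0, 11.0, 9.0, 10.5, 9.5]   # e.g. bytes/s from a network flow
mean, std = fit(baseline)
scores = {x: gaussian_anomaly_score(x, mean, std) for x in (10.0, 30.0)}
threshold = 5.0                            # user-tunable: regulates false alerts
alerts = [x for x, s in scores.items() if s > threshold]
```

Raising the threshold trades missed detections for fewer false alerts, which is the "regulatability" property described above; other density models could be substituted for the Gaussian.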

  13. Event-based state estimation for a class of complex networks with time-varying delays: A comparison principle approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Wenbing [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Wang, Zidong [Department of Computer Science, Brunel University London, Uxbridge, Middlesex, UB8 3PH (United Kingdom); Liu, Yurong, E-mail: yrliu@yzu.edu.cn [Department of Mathematics, Yangzhou University, Yangzhou 225002 (China); Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia); Ding, Derui [Shanghai Key Lab of Modern Optical System, Department of Control Science and Engineering, University of Shanghai for Science and Technology, Shanghai 200093 (China); Alsaadi, Fuad E. [Communication Systems and Networks (CSN) Research Group, Faculty of Engineering, King Abdulaziz University, Jeddah 21589 (Saudi Arabia)

    2017-01-05

    The paper is concerned with the state estimation problem for a class of time-delayed complex networks with an event-triggering communication protocol. A novel event generator function, which is dependent not only on the measurement output but also on a predefined positive constant, is proposed in the hope of reducing the communication burden. A new concept of exponentially ultimate boundedness is provided to quantify the estimation performance. By means of the comparison principle, some sufficient conditions are obtained to guarantee that the estimation error is exponentially ultimately bounded, and then the estimator gains are obtained in terms of the solution of certain matrix inequalities. Furthermore, a rigorous proof is provided to show that the designed triggering condition is free of Zeno behavior. Finally, a numerical example is given to illustrate the effectiveness of the proposed event-based estimator. - Highlights: • An event-triggered estimator is designed for complex networks with time-varying delays. • A novel event generator function is proposed to reduce the communication burden. • The comparison principle is utilized to derive the sufficient conditions. • The designed triggering condition is shown to be free of Zeno behavior.
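
    A minimal sketch of such an event generator (the parameters sigma and delta below are assumptions, not values from the paper): a measurement is transmitted to the estimator only when its deviation from the last transmitted value exceeds a threshold depending on the current output and a predefined positive constant:

```python
# Illustrative event-triggered transmission rule: send y(k) only when
# (y - y_last)^2 > sigma * y^2 + delta, so quiet periods cost no bandwidth.
def run_event_trigger(measurements, sigma=0.05, delta=0.01):
    sent = []
    y_last = None
    for k, y in enumerate(measurements):
        if y_last is None or (y - y_last) ** 2 > sigma * y * y + delta:
            sent.append((k, y))       # the estimator receives an update
            y_last = y
    return sent

ys = [1.00, 1.01, 1.02, 1.50, 1.51, 2.20]
events = run_event_trigger(ys)        # only significant changes are transmitted
```

The additive constant delta is what rules out Zeno behavior in continuous time: successive triggering instants cannot accumulate because each transmission requires a finite deviation.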

  14. The ATLAS Event Service: A New Approach to Event Processing

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00070566; De, Kaushik; Guan, Wen; Maeno, Tadashi; Nilsson, Paul; Oleynik, Danila; Panitkin, Sergey; Tsulaia, Vakhtang; van Gemmeren, Peter; Wenaus, Torre

    2015-01-01

    The ATLAS Event Service (ES) implements a new fine grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine grained outputs that give ES based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabi...

  15. The Frasnian-Famennian mass killing event(s), methods of identification and evaluation

    Science.gov (United States)

    Geldsetzer, H. H. J.

    1988-01-01

    The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments was repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact, and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternate F-F positions were suggested, caused by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for the mass killing event(s) is supplied by trace element and stable isotope geochemistry, but not with the same success as for the K/T boundary, probably due to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis, which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, the western United States, France, Germany and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.

  16. Identification of Tropical-Extratropical Interactions and Extreme Precipitation Events in the Middle East based on Potential Vorticity and Moisture Transport

    KAUST Repository

    de Vries, A. J.; Ouwersloot, H. G.; Feldstein, S. B.; Riemer, M.; El Kenawy, A. M.; McCabe, Matthew; Lelieveld, J.

    2017-01-01

    A potential vorticity (PV) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based

  17. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    International Nuclear Information System (INIS)

    Raedt, H De; Michielsen, K

    2014-01-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)

  18. Negative Life Events Scale for Students (NLESS)

    Science.gov (United States)

    Buri, John R.; Cromett, Cristina E.; Post, Maria C.; Landis, Anna Marie; Alliegro, Marissa C.

    2015-01-01

    Rationale is presented for the derivation of a new measure of stressful life events for use with students [Negative Life Events Scale for Students (NLESS)]. Ten stressful life events questionnaires were reviewed, and the more than 600 items mentioned in these scales were culled based on the following criteria: (a) only long-term and unpleasant…

  19. Text messaging as a strategy to address the limits of audio-based communication during mass-gathering events with high ambient noise.

    Science.gov (United States)

    Lund, Adam; Wong, Daniel; Lewis, Kerrie; Turris, Sheila A; Vaisler, Sean; Gutman, Samuel

    2013-02-01

    The provision of medical care in environments with high levels of ambient noise (HLAN), such as concerts or sporting events, presents unique communication challenges. Audio transmissions can be incomprehensible to the receivers. Text-based communications may be a valuable primary and/or secondary means of communication in this type of setting. The objective was to evaluate the usability of text-based communications in parallel with standard two-way radio communications during mass-gathering (MG) events in the context of HLAN. This Canadian study used outcome survey methods to evaluate the performance of communication devices during MG events. Ten standard commercially available handheld smart phones loaded with basic voice and data plans were assigned to health care providers (HCPs) for use as an adjunct to the medical team's typical radio-based communication. Common text messaging and chat platforms were trialed. Both efficacy and provider satisfaction were evaluated. During a 23-month period, the smart phones were deployed at 17 events with HLAN for a total of 40 event days, or approximately 460 hours of active use. Survey responses from health care providers (177) and dispatchers (26) were analyzed. The response rate was unknown due to the method of recruitment. Of the 155 HCP responses to the question measuring difficulty of communication in environments with HLAN, 68.4% agreed that they "occasionally" or "frequently" found it difficult to clearly understand voice communications via two-way radio. Similarly, of the 23 dispatcher responses to the same item, 65.2% indicated that HLAN "occasionally" or "frequently" negatively affected the ability to communicate clearly with team members. Of the 168 HCP responses to the item assessing whether text-based communication improved the ability to understand and respond to calls when compared to radio alone, 86.3% "agreed" or "strongly agreed" that this was the case.
The dispatcher responses (n = 21) to the same item also

  20. Atherosclerosis profile and incidence of cardiovascular events: a population-based survey

    Directory of Open Access Journals (Sweden)

    Bullano Michael F

    2009-09-01

    Full Text Available Abstract Background Atherosclerosis is a chronic progressive disease often presenting as clinical cardiovascular disease (CVD) events. This study evaluated the characteristics of individuals with a diagnosis of atherosclerosis and estimated the incidence of CVD events to assist in the early identification of high-risk individuals. Methods Respondents to the US SHIELD baseline survey were followed for 2 years to observe incident self-reported CVD. Respondents had subclinical atherosclerosis if they reported a diagnosis of narrow or blocked arteries/carotid artery disease without a past clinical CVD event (heart attack, stroke or revascularization). Characteristics of those with atherosclerosis and incident CVD were compared with those who did not report atherosclerosis at baseline but had CVD in the following 2 years using chi-square tests. A logistic regression model identified characteristics associated with atherosclerosis and incident events. Results Of 17,640 respondents, 488 (2.8%) reported having subclinical atherosclerosis at baseline. Subclinical atherosclerosis was associated with age, male gender, dyslipidemia, circulation problems, hypertension, past smoking, and a cholesterol test in the past year (OR = 2.2). Conclusion Self-report of subclinical atherosclerosis identified an extremely high-risk group with a >25% risk of a CVD event in the next 2 years. These characteristics may be useful for identifying individuals for more aggressive diagnostic and therapeutic efforts.

  1. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key to understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed as Google's search engine, and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated twice using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster. After relocation, event families are found to be tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and estimate focal mechanisms following the method of Shelly et al. (2016), where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites.
Prepared by LLNL under Contract DE-AC52-07NA27344
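
    The indirect-linkage idea can be illustrated with a toy PageRank over a signal-correlation graph, where an edge stands for waveform cross-correlation above some threshold; two events never directly correlated can still end up in the same family through shared neighbours. The graph and parameters below are invented for illustration:

```python
# Minimal PageRank by power iteration over a small linkage graph.
def pagerank(links, damping=0.85, iters=50):
    nodes = sorted({n for a, b in links for n in (a, b)})
    out = {n: [b for a, b in links if a == n] for n in nodes}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            targets = out[n] or nodes      # dangling nodes spread mass evenly
            for t in targets:
                new[t] += damping * rank[n] / len(targets)
        rank = new
    return rank

# symmetric links = cross-correlation above threshold; e4 only links outward
links = [("e1", "e2"), ("e2", "e1"), ("e2", "e3"), ("e3", "e2"),
         ("e1", "e3"), ("e3", "e1"), ("e4", "e1")]
r = pagerank(links)   # e1/e2/e3 form a tight family; e4 ranks low
```

High-rank events act as family "hubs" around which similar events cluster, which is the grouping exploited before relocation.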

  2. Robust Initial Wetness Condition Framework of an Event-Based Rainfall–Runoff Model Using Remotely Sensed Soil Moisture

    Directory of Open Access Journals (Sweden)

    Wooyeon Sunwoo

    2017-01-01

    Full Text Available Runoff prediction in limited-data areas is vital for hydrological applications, such as the design of infrastructure and flood defenses, runoff forecasting, and water management. Rainfall–runoff models may be useful for simulation of runoff generation, particularly event-based models, which offer a practical modeling scheme because of their simplicity. However, there is a need to reduce the uncertainties related to the estimation of the initial wetness condition (IWC) prior to a rainfall event. Soil moisture is one of the most important variables in rainfall–runoff modeling, and remotely sensed soil moisture is recognized as an effective way to improve the accuracy of runoff prediction. In this study, the IWC was evaluated based on remotely sensed soil moisture by using the Soil Conservation Service Curve Number (SCS-CN) method, which is one of the representative event-based models used for reducing the uncertainty of runoff prediction. Four proxy variables for the IWC were determined from measurements of total rainfall depth (API5), ground-based soil moisture (SSMinsitu), remotely sensed surface soil moisture (SSM), and the soil water index (SWI) provided by the advanced scatterometer (ASCAT). To obtain a robust IWC framework, this study consists of two main parts: the validation of remotely sensed soil moisture, and the evaluation of runoff prediction using the four proxy variables with a set of rainfall–runoff events in the East Asian monsoon region. The results showed an acceptable agreement between remotely sensed soil moisture (SSM and SWI) and ground-based soil moisture data (SSMinsitu). In the proxy variable analysis, the SWI indicated the optimal value among the proposed proxy variables. In the runoff prediction analysis considering various infiltration conditions, the SSM and SWI proxy variables significantly reduced the runoff prediction error as compared with API5, by 60% and 66%, respectively. Moreover, the proposed IWC framework with
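
    The SCS-CN relation referred to above converts a storm's rainfall depth into direct runoff; in a framework like this one, the curve number would be conditioned on the chosen IWC proxy. A minimal sketch with illustrative values (a wetter antecedent condition corresponds to a higher effective CN):

```python
# Standard SCS-CN runoff equation: Q = (P - Ia)^2 / (P - Ia + S), with
# S = 25400/CN - 254 (mm) and Ia = 0.2*S by the conventional assumption.
def scs_cn_runoff(P_mm, CN, ia_ratio=0.2):
    S = 25400.0 / CN - 254.0          # potential maximum retention (mm)
    Ia = ia_ratio * S                 # initial abstraction (mm)
    if P_mm <= Ia:
        return 0.0                    # all rainfall abstracted, no runoff
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

# the same 60 mm storm on dry vs. wet antecedent conditions
dry = scs_cn_runoff(60.0, CN=65)
wet = scs_cn_runoff(60.0, CN=85)
```

This is why a good IWC estimate matters: the predicted runoff is highly sensitive to the antecedent wetness encoded in CN.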

  3. Sentiment Diffusion of Public Opinions about Hot Events: Based on Complex Network.

    Directory of Open Access Journals (Sweden)

    Xiaoqing Hao

    Full Text Available To study the sentiment diffusion of online public opinions about hot events, we collected people's posts through web data mining techniques. We calculated the sentiment value of each post based on a sentiment dictionary. Next, we divided those posts into five different orientations of sentiment: strongly positive (P), weakly positive (p), neutral (o), weakly negative (n), and strongly negative (N). These sentiments are combined into modes through coarse graining. We constructed a sentiment mode complex network of online public opinions (SMCOP) with modes as nodes and the conversion relations in chronological order between different types of modes as edges. We calculated the strength, k-plex clique, clustering coefficient and betweenness centrality of the SMCOP. The results show that the strength distribution obeys a power law. Most posts' sentiments are weakly positive or neutral, whereas few are strongly negative. There are weakly positive subgroups and neutral subgroups with ppppp and ooooo as the core modes, respectively. Few modes have large betweenness centrality values, and most modes convert to each other with these higher-betweenness-centrality modes as mediums. Therefore, relevant persons or institutes can take measures to guide people's sentiments regarding online hot events according to this sentiment diffusion mechanism.
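
    The coarse-graining step can be sketched as follows; the sentiment thresholds and the 5-post mode width are assumptions for illustration, not values from the paper:

```python
# Map sentiment values to the five orientations, group consecutive posts
# into 5-symbol modes, and link chronologically adjacent modes as edges.
def orientation(v):
    if v >= 2:  return "P"   # strongly positive
    if v > 0:   return "p"   # weakly positive
    if v == 0:  return "o"   # neutral
    if v > -2:  return "n"   # weakly negative
    return "N"               # strongly negative

def mode_edges(values, width=5):
    symbols = "".join(orientation(v) for v in values)
    modes = [symbols[i:i + width] for i in range(0, len(symbols) - width + 1, width)]
    return list(zip(modes, modes[1:]))   # conversion relations in time order

vals = [1, 1, 0, 0, 3, -1, 0, 1, 1, 0]   # dictionary-based sentiment values
edges = mode_edges(vals)                  # e.g. one edge "ppooP" -> "noppo"
```

Accumulating such edges over many posts yields the weighted directed network (SMCOP) whose strength and betweenness statistics the study analyzes.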

  4. Vision-based Event Detection of the Sit-to-Stand Transition

    Directory of Open Access Journals (Sweden)

    Victor Shia

    2015-12-01

    Full Text Available Sit-to-stand (STS) motions are one of the most important activities of daily living, as they serve as a precursor to mobility and walking. However, there exists no standard method of segmenting STS motions. This is partially due to the variety of different sensors and modalities used to study the STS motion, such as force plates, vision, and accelerometers, each providing different types of data, and to the variability of the STS motion in video data. In this work, we present a method using motion capture to detect events in the STS motion by estimating ground reaction forces, thereby eliminating the variability in joint angles from visual data. We illustrate the accuracy of this method with 10 subjects, with an average difference of 16.5 ms in event times obtained via motion capture vs. force plate. This method serves as a proof of concept for detecting events in the STS motion via video that are comparable to those obtained via a force plate.
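
    The core idea, estimating vertical ground reaction force from motion-capture centre-of-mass trajectories, can be sketched with a central finite difference and Newton's second law, F = m(g + a_z). The data below are synthetic and the implementation is a simplified illustration, not the paper's pipeline:

```python
# Estimate vertical GRF from sampled centre-of-mass heights (metres).
G = 9.81  # gravitational acceleration, m/s^2

def vertical_grf(com_z, mass_kg, dt):
    grf = []
    for k in range(1, len(com_z) - 1):
        # central second difference approximates vertical COM acceleration
        a_z = (com_z[k + 1] - 2 * com_z[k] + com_z[k - 1]) / dt ** 2
        grf.append(mass_kg * (G + a_z))
    return grf

# a stationary COM should give GRF equal to body weight
f = vertical_grf([0.9, 0.9, 0.9, 0.9], mass_kg=70.0, dt=0.01)
```

Peaks and valleys in the estimated GRF then serve as the STS event markers that were compared against force-plate timings.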

  5. VLSI implementation of a 2.8 Gevent/s packet based AER interface with routing and event sorting functionality

    Directory of Open Access Journals (Sweden)

    Stefan eScholze

    2011-10-01

    Full Text Available State-of-the-art large-scale neuromorphic systems require sophisticated spike event communication between units of the neural network. We present a high-speed communication infrastructure for a waferscale neuromorphic system, based on application-specific neuromorphic communication ICs in an FPGA-maintained environment. The ICs implement configurable axonal delays, as required for certain types of dynamic processing or for emulating spike-based learning among distant cortical areas. Measurements are presented which show the efficacy of these delays in influencing the behaviour of neuromorphic benchmarks. The specialized, dedicated AER communication in most current systems requires separate, low-bandwidth configuration channels. In contrast, the configuration of the waferscale neuromorphic system is also handled by the digital packet-based pulse channel, which transmits configuration data at the full bandwidth otherwise used for pulse transmission. The overall so-called pulse communication subgroup (ICs and FPGA) delivers a factor of 25-50 higher event transmission rate than other current neuromorphic communication infrastructures.
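
    Packet-based AER framing can be illustrated by packing (address, timestamp) events into fixed-size packets of 32-bit words sorted by time. The 16/16-bit field split and the packet size here are assumptions for illustration, not the IC's actual wire format:

```python
# Pack one spike event into a 32-bit word: high 16 bits = neuron address,
# low 16 bits = timestamp; then group time-sorted words into packets.
def pack_event(addr, ts):
    assert addr < (1 << 16) and ts < (1 << 16)
    return (addr << 16) | ts

def make_packets(events, packet_size=4):
    words = [pack_event(a, t) for a, t in sorted(events, key=lambda e: e[1])]
    return [words[i:i + packet_size] for i in range(0, len(words), packet_size)]

evts = [(7, 30), (3, 10), (9, 20)]   # (neuron address, time step)
pkts = make_packets(evts)            # one packet, events sorted by time
```

Time-sorting before framing is the "event sorting functionality" of the title: receivers can then apply configurable axonal delays by simply offsetting timestamps.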

  6. Performance of the CMS Event Builder

    Energy Technology Data Exchange (ETDEWEB)

    Andre, J.M.; et al.

    2017-11-22

    The data acquisition system (DAQ) of the CMS experiment at the CERN Large Hadron Collider assembles events at a rate of 100 kHz, transporting event data at high aggregate throughput to the high-level trigger farm. The DAQ architecture is based on state-of-the-art network technologies for the event building. For the data concentration, 10/40 Gbit/s Ethernet technologies are used together with a reduced TCP/IP protocol implemented in FPGA for reliable transport between custom electronics and commercial computing hardware. A 56 Gbit/s Infiniband FDR Clos network has been chosen for the event builder. This paper presents the implementation and performance of the event-building system.
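
    The event-building step itself can be sketched as buffering fragments by event number until every readout source has contributed; the source names and payloads below are invented and the real system is of course a distributed network application, not a loop:

```python
# Toy event builder: collect fragments per event id; an event is complete
# once all readout units (sources) have delivered their fragment.
from collections import defaultdict

def build_events(fragments, sources):
    buffers = defaultdict(dict)
    complete = []
    for src, ev_id, payload in fragments:
        buffers[ev_id][src] = payload
        if set(buffers[ev_id]) == set(sources):
            complete.append((ev_id, buffers.pop(ev_id)))  # fully assembled
    return complete

srcs = ["RU0", "RU1"]                                    # hypothetical readout units
frags = [("RU0", 1, b"aa"), ("RU0", 2, b"bb"), ("RU1", 1, b"cc")]
events = build_events(frags, srcs)                       # event 2 stays incomplete
```

The engineering challenge described in the abstract is doing exactly this at 100 kHz over a Clos network, which is why fragment transport dominates the design.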

  7. LHCb Online event processing and filtering

    CERN Document Server

    Alessio, F; Brarda, L; Frank, M; Franek, B; Galli, D; Gaspar, C; Van Herwijnen, E; Jacobsson, R; Jost, B; Köstner, S; Moine, G; Neufeld, N; Somogyi, P; Stoica, R; Suman, S

    2008-01-01

    The first level trigger of LHCb accepts one million events per second. After preprocessing in custom FPGA-based boards these events are distributed to a large farm of PC-servers using a high-speed Gigabit Ethernet network. Synchronisation and event management is achieved by the Timing and Trigger system of LHCb. Due to the complex nature of the selection of B-events, which are the main interest of LHCb, a full event-readout is required. Event processing on the servers is parallelised on an event basis. The reduction factor is typically 1/500. The remaining events are forwarded to a formatting layer, where the raw data files are formed and temporarily stored. A small part of the events is also forwarded to a dedicated farm for calibration and monitoring. The files are subsequently shipped to the CERN Tier0 facility for permanent storage and from there to the various Tier1 sites for reconstruction. In parallel files are used by various monitoring and calibration processes running within the LHCb Online system. ...

  8. Event-by-event simulation of Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    Zhao, Shuang; De Raedt, Hans; Michielsen, Kristel

    We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting

  9. Report on Fukushima Daiichi NPP precursor events

    International Nuclear Information System (INIS)

    2014-01-01

    The main questions to be answered by this report were: Could the Fukushima Daiichi NPP accident have been prevented? And can a next severe accident be prevented? To answer the first question, the report addressed several aspects: first, whether precursors to the Fukushima Daiichi NPP accident existed in the operating experience; second, the reasons why these precursors did not evolve into a severe accident; third, whether lessons learned from these precursor events were adequately considered by member countries; and finally, whether the operating experience feedback system needs to be improved, based on the previous analysis. To address the second, much more challenging question, the report considered precursor events identified through a search and analysis of the IRS database as well as precursor events based on risk significance. Both methods can point out areas where further work may be needed, even though this depends heavily on design and site-specific factors. From the operating experience side, more effort is needed to ensure timely and full implementation of lessons learnt from precursor events. Concerning risk considerations, a combined use of risk precursors and operating experience may drive effective changes to plants to reduce risk. The report also contains a short description and evaluation of selected precursors that are related to the course of the Fukushima Daiichi NPP accident. The report addresses the question of whether operating experience feedback can be effectively used to identify plant vulnerabilities and minimize the potential for severe core damage accidents. Based on several of the precursor events, national or international in-depth evaluations were started. The vulnerability of NPPs to external and internal flooding has clearly been addressed. In addition to the IRS-based investigation, the WGRISK was asked to identify important precursor events based on risk significance. These precursors have

  10. Multi-spacecraft observations of ICMEs propagating beyond Earth orbit during MSL/RAD flight and surface phases

    Science.gov (United States)

    von Forstner, J.; Guo, J.; Wimmer-Schweingruber, R. F.; Hassler, D.; Temmer, M.; Vrsnak, B.; Čalogović, J.; Dumbovic, M.; Lohf, H.; Appel, J. K.; Heber, B.; Steigies, C. T.; Zeitlin, C.; Ehresmann, B.; Jian, L. K.; Boehm, E.; Boettcher, S. I.; Burmeister, S.; Martin-Garcia, C.; Brinza, D. E.; Posner, A.; Reitz, G.; Matthiae, D.; Rafkin, S. C.; weigle, G., II; Cucinotta, F.

    2017-12-01

    The propagation of interplanetary coronal mass ejections (ICMEs) between Earth's orbit (1 AU) and Mars (~1.5 AU) has been studied, with their propagation speed estimated from both measurements and simulations. The enhancement of the magnetic fields related to ICMEs and their shock fronts causes so-called Forbush decreases, which can be detected as a reduction of galactic cosmic rays measured on the ground or on a spacecraft. We have used galactic cosmic ray (GCR) data from in-situ measurements at Earth, from both STEREO A and B, as well as the GCR measurements by the Radiation Assessment Detector (RAD) instrument onboard Mars Science Laboratory (MSL), both on the surface of Mars and during its flight to Mars in 2011-2012. A set of ICME events was selected during the periods when Earth (or STEREO A or B) and the MSL location were nearly aligned on the same side of the Sun in the ecliptic plane (the so-called opposition phase). Such lineups allow us to estimate the ICMEs' transit times between 1 AU and the MSL location from the delay time of the corresponding Forbush decreases measured at each location. We investigate the evolution of their propagation speeds after passing Earth's orbit and find that the deceleration of ICMEs due to their interaction with the ambient solar wind continues beyond 1 AU. The results are compared to simulation data obtained from two CME propagation models, namely the Drag-Based Model (DBM) and the WSA-ENLIL plus cone model.
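
    The transit-time estimate described above reduces to a simple calculation: the mean speed between 1 AU and Mars follows from the Forbush-decrease delay, and comparing it with the in-situ speed at 1 AU reveals the deceleration. The numbers below are illustrative, not from the study:

```python
# Mean ICME transit speed from the Forbush-decrease delay between two radii.
AU_KM = 1.496e8  # one astronomical unit in km

def mean_transit_speed(delta_au, delay_hours):
    """Average speed (km/s) over a radial separation delta_au given the delay."""
    return delta_au * AU_KM / (delay_hours * 3600.0)

v_transit = mean_transit_speed(0.5, delay_hours=50.0)  # 1 AU -> ~1.5 AU
v_at_earth = 600.0                                     # km/s, hypothetical in-situ value
decelerating = v_transit < v_at_earth                  # drag by ambient solar wind
```

A transit speed below the 1 AU in-situ speed is the signature that drag-based deceleration continues beyond Earth's orbit, which is what the DBM comparison tests.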

  11. Patient stratification and identification of adverse event correlations in the space of 1190 drug related adverse events

    DEFF Research Database (Denmark)

    Roitmann, Eva; Eriksson, Robert; Brunak, Søren

    2014-01-01

    New pharmacovigilance methods are needed as a consequence of the morbidity caused by drugs. We exploit fine-grained drug related adverse event information extracted by text mining from electronic medical records (EMRs) to stratify patients based on their adverse events and to determine adverse...

  12. Organizational Learning in Rare Events

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Tyler, Beverly; Beukel, Karin

    When organizations encounter rare events they often find it challenging to extract learning from the experience. We analyze opportunities for organizational learning in one such rare event, namely Intellectual Property (IP) litigation, i.e., when organizations take disputes regarding their intellectual property ... the organization little discretion to utilize any learning from past litigation success. Thus, learning appears to be most beneficial in infringement cases. Based on statistical analysis of 10,211 litigation court cases in China, we find support for our hypotheses. Our findings suggest that organizations can learn...

  13. Tracking Real-Time Changes in Working Memory Updating and Gating with the Event-Based Eye-Blink Rate

    NARCIS (Netherlands)

    Rac-Lubashevsky, R.; Slagter, H.A.; Kessler, Y.

    2017-01-01

    Effective working memory (WM) functioning depends on the gating process that regulates the balance between maintenance and updating of WM. The present study used the event-based eye-blink rate (ebEBR), which presumably reflects phasic striatal dopamine activity, to examine how the cognitive

  14. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement.
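The "repeated evaluations of the top event in a Monte Carlo fashion" that the record describes can be sketched generically: sample each basic event from its probability, then test whether any minimal cut set is fully realized. This is a simplified stand-in for TEMAC's matrix approach, with made-up event probabilities:

```python
import random

def top_event_mc(cut_sets, probs, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the top-event probability of a fault tree,
    given its minimal cut sets: the top event occurs when every basic
    event in at least one cut set occurs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        state = {e: rng.random() < p for e, p in probs.items()}
        if any(all(state[e] for e in cs) for cs in cut_sets):
            hits += 1
    return hits / n_samples

# Hypothetical tree with two cut sets: {A, B} or {C}
probs = {"A": 0.1, "B": 0.2, "C": 0.05}
cut_sets = [{"A", "B"}, {"C"}]
print(top_event_mc(cut_sets, probs))  # close to the exact value 0.069
```

Sensitivity to a base event probability can then be estimated by re-running the sampler with that probability perturbed.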

  15. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
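The edge-event idea in this record — each link re-samples its two endpoint states only when an event-detecting rule fires — can be illustrated with a toy discrete-time simulation. This is a simplified sketch, not the paper's detection rules: here an edge event fires whenever an endpoint has drifted more than a fixed threshold since the last sample.

```python
def sampled_consensus(x0, edges, h=0.05, threshold=0.01, steps=400):
    """Sampled-data consensus of single integrators: each edge re-samples
    its endpoint states only when an endpoint has changed by more than
    `threshold` since the last sample (a stand-in edge-event rule)."""
    x = list(x0)
    last = {e: (x[e[0]], x[e[1]]) for e in edges}  # last sampled states per edge
    for _ in range(steps):
        u = [0.0] * len(x)
        for e in edges:
            i, j = e
            # edge event: re-sample if either endpoint drifted too far
            if abs(x[i] - last[e][0]) > threshold or abs(x[j] - last[e][1]) > threshold:
                last[e] = (x[i], x[j])
            xi, xj = last[e]
            u[i] += xj - xi
            u[j] += xi - xj
        x = [xi + h * ui for xi, ui in zip(x, u)]
    return x

# Path graph 0-1-2: states converge near the preserved average (1.0)
print(sampled_consensus([0.0, 1.0, 2.0], [(0, 1), (1, 2)]))
```

Because each edge uses the same sampled pair symmetrically, the state average is preserved exactly, and the agents settle within roughly the event threshold of consensus while sampling far less often than every step.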

  16. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  17. A data-based model to locate mass movements triggered by seismic events in Sichuan, China.

    Science.gov (United States)

    de Souza, Fabio Teodoro

    2014-01-01

    Earthquakes affect the entire world and have catastrophic consequences. On May 12, 2008, an earthquake of magnitude 7.9 on the Richter scale occurred in the Wenchuan area of Sichuan province in China. This event, together with subsequent aftershocks, caused many avalanches, landslides, debris flows, collapses, and quake lakes and induced numerous unstable slopes. This work proposes a methodology that uses a data mining approach and geographic information systems to predict these mass movements based on their association with the main and aftershock epicenters, geologic faults, riverbeds, and topography. A dataset comprising 3,883 mass movements is analyzed, and some models to predict the location of these mass movements are developed. These predictive models could be used by the Chinese authorities as an important tool for identifying risk areas and rescuing survivors during similar events in the future.

  18. Corpuscular event-by-event simulation of quantum optics experiments : application to a quantum-controlled delayed-choice experiment

    NARCIS (Netherlands)

    De Raedt, Hans; Delina, M; Jin, Fengping; Michielsen, Kristel

    2012-01-01

    A corpuscular simulation model of optical phenomena that does not require knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one by one is discussed. The event-based corpuscular model gives a unified

  19. Science-based risk assessments for rare events in a changing climate

    Science.gov (United States)

    Sobel, A. H.; Tippett, M. K.; Camargo, S. J.; Lee, C. Y.; Allen, J. T.

    2014-12-01

    History shows that substantial investments in protection against any specific type of natural disaster usually occur only after (usually shortly after) that specific type of disaster has happened in a given place. This is true even when it was well known before the event that there was a significant risk that it could occur. Presumably what psychologists Kahneman and Tversky have called "availability bias" is responsible, at least in part, for these failures to act on known but out-of-sample risks. While understandable, this human tendency prepares us poorly for events which are very rare (on the time scales of human lives) and even more poorly for a changing climate, as historical records become a poorer guide. A more forward-thinking and rational approach would require scientific risk assessments that can place meaningful probabilities on events that are rare enough to be absent from the historical record, and that can account for the influences of both anthropogenic climate change and low-frequency natural climate variability. The set of tools available for doing such risk assessments is still quite limited, particularly for some of the most extreme events such as tropical cyclones and tornadoes. We will briefly assess the state of the art for these events in particular, and describe some of our ongoing research to develop new tools for quantitative risk assessment using hybrids of statistical methods and physical understanding of the hazards.

  20. Tsunami Source Identification on the 1867 Tsunami Event Based on the Impact Intensity

    Science.gov (United States)

    Wu, T. R.

    2014-12-01

    The 1867 Keelung tsunami event has drawn significant attention from people in Taiwan, not only because the location is very close to three nuclear power plants that are only about 20 km away from Taipei city, but also because of the ambiguity of the tsunami sources. This event is unique in many aspects. First, it was documented in many sources, in many languages and with similar descriptions. Second, the tsunami deposit was discovered recently. Based on the literature, an earthquake, a 7-meter tsunami height, volcanic smoke, and oceanic smoke were observed. Previous studies concluded that this tsunami was generated by an earthquake with a magnitude around Mw 7.0 along the Shanchiao Fault. However, numerical results showed that even a Mw 8.0 earthquake was not able to generate a 7-meter tsunami. Considering the steep bathymetry and intense volcanic activities along the Keelung coast, one reasonable hypothesis is that different types of tsunami sources existed, such as a submarine landslide or volcanic eruption. In order to confirm this scenario, last year we proposed the Tsunami Reverse Tracing Method (TRTM) to find the possible locations of the tsunami sources. This method helped us rule out impossible far-field tsunami sources. However, the near-field sources still remain unclear. This year, we further developed a new method named 'Impact Intensity Analysis' (IIA). In the IIA method, the study area is divided into a sequence of tsunami sources, and numerical simulations of each source are conducted with the COMCOT (Cornell Multi-grid Coupled Tsunami Model) tsunami model. After that, the resulting wave height from each source to the study site is collected and plotted. This method successfully helped us to identify the impact factor of the near-field potential sources. The IIA result (Fig. 1) shows that the 1867 tsunami event was a multi-source event. A mild tsunami was triggered by a Mw 7.0 earthquake, and then followed by the submarine

  1. Event-Entity-Relationship Modeling in Data Warehouse Environments

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    We use the event-entity-relationship model (EVER) to illustrate the use of entity-based modeling languages for conceptual schema design in data warehouse environments. EVER is a general-purpose information modeling language that supports the specification of both general schema structures and multi-dimensional schemes that are customized to serve specific information needs. EVER is based on an event concept that is very well suited for multi-dimensional modeling because measurement data often represent events in multi-dimensional databases...

  2. TOURISMOLOGICAL CLASSIFICATION OF SPORTING EVENTS

    Directory of Open Access Journals (Sweden)

    Željko Bjeljac

    2017-04-01

    Sporting events are programs dominated by creative and complex facilities, primarily sports, but also recreation and entertainment. As such, they achieve tourism effects and goals and have a socio-economic importance for the city, region or state. Depending on the size and importance of a sports event, sport has a different role in the context of promoting a tourist destination, as well as different values. Each sport discipline has its own criteria by which athletes are ranked individually or as a team. The subject of the research is to determine the criteria for the categorization of sporting events, in order to determine the importance of sporting events as an element of the tourist offer (individually or as part of a tourist destination). Also, the paper presents a comparative analysis of similar methodologies for the categorization of sporting events. Based on the research presented in the paper, there are four groups of criteria: economic, media, social and environmental. In addition, the paper gives a classification of traditional sporting events in the case of Serbia, dividing them into four groups.

  3. Auxiliary bearing design considerations for gas cooled reactors

    International Nuclear Information System (INIS)

    Penfield, S.R. Jr.; Rodwell, E.

    2001-01-01

    The need to avoid contamination of the primary system, along with other perceived advantages, has led to the selection of electromagnetic bearings (EMBs) in most ongoing commercial-scale gas cooled reactor (GCR) designs. However, one implication of magnetic bearings is the requirement to provide backup support to mitigate the effects of failures or overload conditions. The demands on these auxiliary or 'catcher' bearings have been substantially escalated by the recent development of direct Brayton cycle GCR concepts. Conversely, there has been only limited directed research in the area of auxiliary bearings, particularly for vertically oriented turbomachines. This paper explores the current state-of-the-art for auxiliary bearings and the implications for current GCR designs. (author)

  4. Significance of glucocorticoids and their receptors in patients with nephritic syndrome

    International Nuclear Information System (INIS)

    Yang Liusong; Li Dapei; Liu Deyi; Wang Weiyue; Wang Haodan

    1996-01-01

    The glucocorticoid receptor (GCR) in 34 patients with nephritic syndrome (NS) and 40 normal controls was investigated by radioligand binding assay. The results show that the GCR levels of NS patients correlate well with the results of treatment with glucocorticoids (GC). Patients who are sensitive to GC treatment have much higher levels of GCR than those who are not responsive to GC treatment (P<0.01) and than the normal controls. The plasma ACTH and cortisol of the same subjects were also measured, and the results show that NS patients have much lower levels of these two hormones than the normal controls, but no significant correlation is noted between these levels and the GC treatment effects.

  5. Automatic, ECG-based detection of autonomic arousals and their association with cortical arousals, leg movements, and respiratory events in sleep

    DEFF Research Database (Denmark)

    Olsen, Mads; Schneider, Logan Douglas; Cheung, Joseph

    2018-01-01

    The current definition of sleep arousals neglects to address the diversity of arousals and their systemic cohesion. Autonomic arousals (AA) are autonomic activations often associated with cortical arousals (CA), but they may also occur in isolation in relation to a respiratory event or a leg movement event, or spontaneously, without any other physiological associations. AA should be acknowledged as essential events to understand and explore the systemic implications of arousals. We developed an automatic AA detection algorithm based on intelligent feature selection and advanced machine learning ... or respiratory events. This indicates that most FP constitute autonomic activations that are indistinguishable from those with cortical cohesion. The proposed algorithm provides an automatic system trained in a clinical environment, which can be utilized to analyse the systemic and clinical impacts of arousals.

  6. Event Discrimination Using Seismoacoustic Catalog Probabilities

    Science.gov (United States)

    Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.

    2017-12-01

    Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.
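The fusion step this record describes — pairing seismic and acoustic detections "based on similarity of origin time and location" — can be sketched with a simple threshold match. The thresholds and planar coordinates below are illustrative assumptions, not those used in the catalogs:

```python
from math import hypot

def fuse_catalogs(seismic, acoustic, dt_max=10.0, dx_max=5.0):
    """Pair seismic and acoustic events whose origin times (s) and
    epicenters (km, planar x/y) are similar, forming seismoacoustic
    events. Thresholds are illustrative only."""
    fused = []
    for s in seismic:
        for a in acoustic:
            close_in_time = abs(s["t"] - a["t"]) <= dt_max
            close_in_space = hypot(s["x"] - a["x"], s["y"] - a["y"]) <= dx_max
            if close_in_time and close_in_space:
                fused.append((s["id"], a["id"]))
    return fused

seis = [{"id": "s1", "t": 100.0, "x": 0.0, "y": 0.0}]
acou = [{"id": "a1", "t": 104.0, "x": 2.0, "y": 1.0},
        {"id": "a2", "t": 500.0, "x": 0.0, "y": 0.0}]
print(fuse_catalogs(seis, acou))  # [('s1', 'a1')]
```

Membership in such a fused catalog is then the evidence behind the surface-blast probabilities the record mentions.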

  7. Hospital deaths and adverse events in Brazil

    Directory of Open Access Journals (Sweden)

    Pavão Ana Luiza B

    2011-09-01

    Background: Adverse events are considered a major international problem related to the performance of health systems. Evaluating the occurrence of adverse events involves, as with any other outcome measure, determining the extent to which the observed differences can be attributed to the patient's risk factors or to variations in the treatment process, and this in turn highlights the importance of measuring differences in the severity of the cases. The current study aims to evaluate the association between deaths and adverse events, adjusted for patient risk factors. Methods: The study is based on a random sample of 1103 patient charts from hospitalizations in the year 2003 in 3 teaching hospitals in the state of Rio de Janeiro, Brazil. The methodology involved a retrospective review of patient charts in two stages: a screening phase and an evaluation phase. Logistic regression was used to evaluate the relationship between hospital deaths and adverse events. Results: The overall mortality rate was 8.5%, while the rate related to the occurrence of an adverse event was 2.9% (32/1103) and that related to preventable adverse events was 2.3% (25/1103). Among the 94 deaths analyzed, 34% were related to cases involving adverse events, and 26.6% of deaths occurred in cases whose adverse events were considered preventable. The models tested showed good discriminatory capacity. The unadjusted odds ratio (OR 11.43) and the odds ratio adjusted for patient risk factors (OR 8.23) between death and preventable adverse events were high. Conclusions: Despite discussions in the literature regarding the limitations of evaluating preventable adverse events based on peer review, the results presented here emphasize that adverse events are not only prevalent, but are associated with serious harm and even death. These results also highlight the importance of risk adjustment and multivariate models in the study of adverse events.

  8. Spatial-Temporal Event Detection from Geo-Tagged Tweets

    Directory of Open Access Journals (Sweden)

    Yuqian Huang

    2018-04-01

    As one of the most popular social networking services in the world, Twitter allows users to post messages along with their current geographic locations. Such georeferenced or geo-tagged Twitter datasets can benefit location-based services, targeted advertising and geosocial studies. Our study focused on the detection of small-scale spatial-temporal events and their textual content. First, we used Spatial-Temporal Density-Based Spatial Clustering of Applications with Noise (ST-DBSCAN) to spatially-temporally cluster the tweets. Then, the word frequencies were summarized for each cluster and the potential topics were modeled by the Latent Dirichlet Allocation (LDA) algorithm. Using two years of Twitter data from four college cities in the U.S., we were able to determine the spatial-temporal patterns of two known events, two unknown events and one recurring event, which were then further explored and modeled to identify the semantic content of the events. This paper presents our process and recommendations both for finding event-related tweets and for understanding the spatial-temporal behaviors and semantic nature of the detected events.
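The clustering step named in this record extends DBSCAN so that a point's neighbourhood must be dense in both space and time. A minimal pure-Python sketch of that idea (parameters and points are made up; it is not the authors' implementation):

```python
def st_dbscan(points, eps_s, eps_t, min_pts):
    """Minimal ST-DBSCAN-style clustering: a neighbour must be within
    eps_s in planar distance AND eps_t in time. points: list of (x, y, t).
    Returns one cluster label per point (-1 = noise)."""
    def neighbours(i):
        xi, yi, ti = points[i]
        return [j for j, (x, y, t) in enumerate(points)
                if ((x - xi) ** 2 + (y - yi) ** 2) ** 0.5 <= eps_s
                and abs(t - ti) <= eps_t]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1           # provisionally noise
            continue
        cluster += 1
        labels[i] = cluster
        queue = list(nbrs)
        while queue:                 # expand the cluster from core points
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise reached from a core: border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:
                queue.extend(jn)
    return labels

# Three nearby tweets within a few time units, one distant outlier
pts = [(0, 0, 0), (0.1, 0, 1), (0.2, 0.1, 2), (5, 5, 100)]
print(st_dbscan(pts, eps_s=0.5, eps_t=5, min_pts=2))  # [0, 0, 0, -1]
```

Per-cluster word frequencies would then feed the LDA topic-modeling step the record describes.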

  9. Flood modelling with a distributed event-based parsimonious rainfall-runoff model: case of the karstic Lez river catchment

    Directory of Open Access Journals (Sweden)

    M. Coustau

    2012-04-01

    Rainfall-runoff models are crucial tools for the statistical prediction of flash floods and real-time forecasting. This paper focuses on a karstic basin in the South of France and proposes a distributed parsimonious event-based rainfall-runoff model, coherent with the poor knowledge of both evaporative and underground fluxes. The model combines an SCS runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The efficiency of the model is discussed, not only to satisfactorily simulate floods but also to obtain powerful relationships between the initial condition of the model and various predictors of the initial wetness state of the basin, such as the base flow, the Hu2 index from the Meteo-France SIM model and the piezometric levels of the aquifer. The advantage of using meteorological radar rainfall in flood modelling is also assessed. Model calibration proved to be satisfactory using an hourly time step, with Nash criterion values ranging between 0.66 and 0.94 for eighteen of the twenty-one selected events. The radar rainfall inputs significantly improved the simulations or the assessment of the initial condition of the model for 5 events at the beginning of autumn, mostly in September–October (mean improvement of Nash is 0.09; correction of the initial condition ranges from −205 to 124 mm), but were less efficient for the events at the end of autumn. In this period, the weak vertical extension of the precipitation system and the low altitude of the 0 °C isotherm could affect the efficiency of radar measurements due to the distance between the basin and the radar (~60 km). The model initial condition S is correlated with the three tested predictors (R2 > 0.6). The interpretation of the model suggests that groundwater does not affect the first peaks of the flood, but can strongly impact subsequent peaks in the case of a multi-storm event. Because this kind of model is based on a limited
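The SCS runoff component used per grid cell in this record is the standard Curve Number relation Q = (P − Ia)²/(P − Ia + S) with Ia = 0.2·S. A minimal sketch of that equation alone (the curve number and rainfall depth below are illustrative, not values from the study, and the Lag and Route routing step is omitted):

```python
def scs_runoff(p_mm: float, cn: float) -> float:
    """SCS Curve Number direct runoff (mm) for event rainfall p_mm,
    using the standard initial abstraction Ia = 0.2 * S."""
    s = 25400.0 / cn - 254.0   # potential maximum retention (mm)
    ia = 0.2 * s               # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0             # all rainfall absorbed before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Illustrative event: 100 mm of rain on a catchment with CN = 75
print(round(scs_runoff(100.0, 75.0), 1), "mm")  # 41.1 mm
```

In the paper's event-based setting, the initial condition S is the calibrated quantity that the wetness predictors (base flow, Hu2, piezometric levels) are correlated with.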

  10. Positive predictive value of a register-based algorithm using the Danish National Registries to identify suicidal events

    DEFF Research Database (Denmark)

    Gasse, Christiane; Danielsen, Andreas Aalkjaer; Pedersen, Marianne Giørtz

    2018-01-01

    PURPOSE: It is not possible to fully assess intention of self-harm and suicidal events using information from administrative databases. We conducted a validation study of intention of suicide attempts/self-harm contacts identified by a commonly applied Danish register-based algorithm (DK...) ... events overall, by gender, age groups, and calendar time. RESULTS: We retrieved medical records for 357 (75%) people. The PPV of the DK-algorithm to identify suicidal events was 51.5% (95% CI: 46.4-56.7) overall, 42.7% (95% CI: 35.2-50.5) in males, and 58.5% (95% CI: 51.6-65.1) in females. The PPV varied further across age groups and calendar time. After excluding cases identified via the DK-algorithm by unspecific codes of intoxications and injury, the PPV improved slightly (56.8% [95% CI: 50.0-63.4]). CONCLUSIONS: The DK-algorithm can reliably identify self-harm with suicidal intention in 52...

  11. Time-to-event methodology improved statistical evaluation in register-based health services research.

    Science.gov (United States)

    Bluhmki, Tobias; Bramlage, Peter; Volk, Michael; Kaltheuner, Matthias; Danne, Thomas; Rathmann, Wolfgang; Beyersmann, Jan

    2017-02-01

    Complex longitudinal sampling and the observational structure of patient registers in health services research are associated with methodological challenges regarding data management and statistical evaluation. We exemplify common pitfalls and want to stimulate discussions on the design, development, and deployment of future longitudinal patient registers and register-based studies. For illustrative purposes, we use data from the prospective, observational, German DIabetes Versorgungs-Evaluation register. One aim was to explore predictors for the initiation of a basal insulin supported therapy in patients with type 2 diabetes initially prescribed to glucose-lowering drugs alone. Major challenges are missing mortality information, time-dependent outcomes, delayed study entries, different follow-up times, and competing events. We show that time-to-event methodology is a valuable tool for improved statistical evaluation of register data and should be preferred to simple case-control approaches. Patient registers provide rich data sources for health services research. Analyses are accompanied with the trade-off between data availability, clinical plausibility, and statistical feasibility. Cox' proportional hazards model allows for the evaluation of the outcome-specific hazards, but prediction of outcome probabilities is compromised by missing mortality information. Copyright © 2016 Elsevier Inc. All rights reserved.
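This record argues for time-to-event methodology over simple case-control counting; the most basic such tool, handling the censoring and differing follow-up times the record highlights, is the Kaplan-Meier estimator. A minimal stdlib sketch with made-up follow-up data (the study itself uses Cox regression, which this does not reproduce):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each distinct event time.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, S(t)) pairs."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            n += 1
            d += data[i][1]
            i += 1
        if d:  # survival drops only at observed events, not censorings
            s *= 1.0 - d / at_risk
            curve.append((t, s))
        at_risk -= n
    return curve

# Hypothetical follow-up (months): events at 2, 3, 5; censored at 3 and 7
print(kaplan_meier([2, 3, 3, 5, 7], [1, 1, 0, 1, 0]))
```

Censored patients leave the risk set without forcing the survival curve down, which is exactly the information a naive case-control tabulation discards.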

  12. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and time resolution data obtained by the telescopes aboard Hinode revealed the new interesting dynamics in solar atmosphere. In order to detect such events and estimate the velocity of dynamics automatically, we examined the estimation methods of the optical flow based on the OpenCV that is the computer vision library. We applied the methods to the prominence eruption observed by NoRH, and the polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. It indicates that the optical flow estimation methods in the OpenCV library are very useful to analyze the solar phenomena.

  13. Increasing the Operational Value of Event Messages

    Science.gov (United States)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

    Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one-at-a-time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create component-based, middleware-based, standards-based general purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.

  14. Event-by-event simulation of quantum phenomena: Application to Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    De Raedt, H.; De Raedt, K.; Michielsen, K.; Keimpema, K.; Miyashita, S.

    We review the data gathering and analysis procedure used in real Einstein-Podolsky-Rosen-Bohm experiments with photons and we illustrate the procedure by analyzing experimental data. Based on this analysis, we construct event-based computer simulation models in which every essential element in the

  15. Ptaquiloside from bracken in stream water at base flow and during storm events

    DEFF Research Database (Denmark)

    Clauson-Kaas, Frederik; Ramwell, Carmel; Hansen, Hans Chr. Bruun

    2016-01-01

    ...not decrease over the course of the event. In the stream, the throughfall contribution to PTA cannot be separated from a possible below-ground input from litter, rhizomes and soil. Catchment-specific factors such as the soil pH, topography, hydrology, and bracken coverage will evidently affect the level of PTA ... rainfall and PTA concentration in the stream, with a reproducible time lag of approx. 1 h from onset of rain to elevated concentrations, and returning rather quickly (about 2 h) to base flow concentration levels. The concentration of PTA behaved similarly to an inert tracer (Cl(-)) in the pulse experiment...

  16. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation

    Directory of Open Access Journals (Sweden)

    Dongmei Huang

    2017-09-01

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.

  17. A Multi-Objective Partition Method for Marine Sensor Networks Based on Degree of Event Correlation.

    Science.gov (United States)

    Huang, Dongmei; Xu, Chenyixuan; Zhao, Danfeng; Song, Wei; He, Qi

    2017-09-21

    Existing marine sensor networks acquire data from sea areas that are geographically divided, and store the data independently in their affiliated sea area data centers. In the case of marine events across multiple sea areas, the current network structure needs to retrieve data from multiple data centers, and thus severely affects real-time decision making. In this study, in order to provide a fast data retrieval service for a marine sensor network, we use all the marine sensors as the vertices, establish the edge based on marine events, and abstract the marine sensor network as a graph. Then, we construct a multi-objective balanced partition method to partition the abstract graph into multiple regions and store them in the cloud computing platform. This method effectively increases the correlation of the sensors and decreases the retrieval cost. On this basis, an incremental optimization strategy is designed to dynamically optimize existing partitions when new sensors are added into the network. Experimental results show that the proposed method can achieve the optimal layout for distributed storage in the process of disaster data retrieval in the China Sea area, and effectively optimize the result of partitions when new buoys are deployed, which eventually will provide efficient data access service for marine events.
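The graph abstraction in this record — sensors as vertices, edges weighted by event correlation, partitioned into balanced regions — can be illustrated with a simple greedy heuristic. This is a rough stand-in, not the paper's multi-objective method; the correlation weights below are invented:

```python
def partition_sensors(weights, k):
    """Greedy balanced partition of a sensor-correlation graph: vertices
    are visited in order of total correlation and placed in the region
    where they share the most edge weight, subject to a size cap.
    weights: dict {(i, j): w} with i < j. Returns region index per vertex."""
    verts = sorted({v for e in weights for v in e})
    cap = -(-len(verts) // k)  # ceil: max vertices per region
    region, sizes = {}, [0] * k

    def w(u, v):
        return weights.get((min(u, v), max(u, v)), 0.0)

    # place strongly-correlated vertices first
    order = sorted(verts, key=lambda u: -sum(w(u, v) for v in verts))
    for u in order:
        gains = [sum(w(u, v) for v in verts if region.get(v) == r)
                 for r in range(k)]
        # best gain wins; ties go to the emptier region
        best = max((g, -sizes[r], r)
                   for r, g in enumerate(gains) if sizes[r] < cap)[2]
        region[u] = best
        sizes[best] += 1
    return region

# Two tightly correlated sensor pairs (0,1) and (2,3), weak cross link
corr = {(0, 1): 0.9, (2, 3): 0.8, (1, 2): 0.1}
reg = partition_sensors(corr, 2)
print(reg[0] == reg[1], reg[2] == reg[3])  # True True
```

Keeping correlated sensors in one region is what lets a multi-area event be served from a single data center, which is the retrieval-cost argument the record makes.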

  18. Using fuzzy arithmetic in containment event trees

    International Nuclear Information System (INIS)

    Rivera, S.S.; Baron, Jorge H.

    2000-01-01

    The use of fuzzy arithmetic is proposed for the evaluation of containment event trees. Concepts such as improbable, very improbable, and so on, which are subjective by nature, are represented by fuzzy numbers. The quantitative evaluation of containment event trees is based on the extension principle, by which operations on real numbers are extended to operations on fuzzy numbers. Expert knowledge is represented as states of the base variable with a normal distribution, which is taken to represent the membership function. Finally, this paper presents the results of an example calculation of a containment event tree for the CAREM-25 nuclear power plant, presently in the detailed design stage in Argentina. (author)
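For monotone operations such as multiplying branch probabilities along an event-tree path, the extension principle reduces to interval arithmetic on alpha-cuts. A minimal sketch using triangular fuzzy numbers (the record itself uses normal-shaped membership functions, and the "improbable" values below are invented):

```python
def tri_alpha_cut(a, m, b, alpha):
    """Alpha-cut interval [lo, hi] of a triangular fuzzy number (a, m, b)."""
    return (a + alpha * (m - a), b - alpha * (b - m))

def fuzzy_mul(x, y, alpha):
    """Extension-principle product of two triangular fuzzy numbers at a
    given alpha level, via interval arithmetic on their alpha-cuts."""
    (x1, x2), (y1, y2) = tri_alpha_cut(*x, alpha), tri_alpha_cut(*y, alpha)
    prods = (x1 * y1, x1 * y2, x2 * y1, x2 * y2)
    return (min(prods), max(prods))

# Two "improbable" branch probabilities as triangular fuzzy numbers
p_branch1 = (0.01, 0.05, 0.10)
p_branch2 = (0.001, 0.005, 0.01)
print(fuzzy_mul(p_branch1, p_branch2, 0.0))  # support of the path probability
print(fuzzy_mul(p_branch1, p_branch2, 1.0))  # modal value
```

Sweeping alpha from 0 to 1 rebuilds the full membership function of the path probability from its nested alpha-cut intervals.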

  19. Online Event Selection at the CMS experiment

    CERN Document Server

    Konecki, M

    2004-01-01

    Triggering in the high-rate environment of the LHC is a challenging task. The CMS experiment has developed a two-stage trigger system. The Level-1 Trigger is based on custom hardware devices and is designed to reduce the 40 MHz LHC bunch-crossing rate to a maximum event rate of ~100 kHz. The further reduction of the event rate to O(100 Hz), suitable for permanent storage, is performed in the High-Level Trigger (HLT) which is based on a farm of commercial processors. The methods used for object identification and reconstruction are presented. The CMS event selection strategy is discussed. The performance of the HLT is also given.

  20. Modified Dugdale cracks and Fictitious cracks

    DEFF Research Database (Denmark)

    Nielsen, Lauge Fuglsang

    1998-01-01

    A number of theories are presented in the literature on crack mechanics by which the strength of damaged materials can be predicted. Among these are theories based on the well-known Dugdale model of a crack prevented from spreading by self-created constant cohesive flow stresses acting in local ... areas, so-called fictitious cracks, in front of the crack. The Modified Dugdale theory presented in this paper is also based on the concept of Dugdale cracks. Any cohesive stress distribution, however, can be considered in front of the crack. Formally the strength of a material weakened by a modified Dugdale crack is the same as if it had been weakened by the well-known Griffith crack, namely sigma_CR = (E*G_CR/(pi*l))^(1/2), where E and l are Young's modulus and crack half-length respectively, and G_CR is the so-called critical energy release rate. The physical significance of G_CR, however, is different
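The Griffith-form strength expression quoted in this record is straightforward to evaluate numerically. A small sketch with illustrative, concrete-like input values (not taken from the paper), assuming consistent SI units:

```python
from math import pi, sqrt

def critical_stress(E, G_cr, l):
    """Critical remote stress sigma_CR = sqrt(E * G_cr / (pi * l)) for a
    crack of half-length l, in consistent SI units (Pa, N/m, m)."""
    return sqrt(E * G_cr / (pi * l))

# Illustrative values: E = 30 GPa, G_CR = 100 N/m, l = 5 mm
sigma = critical_stress(30e9, 100.0, 0.005)
print(round(sigma / 1e6, 1), "MPa")  # 13.8 MPa
```

The strength falls off as 1/sqrt(l), so doubling the crack half-length reduces the predicted strength by a factor of sqrt(2).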