WorldWideScience

Sample records for detection event rates

  1. Standard and Nonstandard Neutrino-Nucleus Reactions Cross Sections and Event Rates to Neutrino Detection Experiments

    Directory of Open Access Journals (Sweden)

    D. K. Papoulias

    2015-01-01

    In this work, we explore ν-nucleus processes from a nuclear theory point of view and obtain results with a high confidence level based on accurate nuclear structure cross-section calculations. Besides cross sections, the present study includes simulated signals expected to be recorded by nuclear detectors and differential event rates, as well as the total number of events predicted to be measured. Our original cross-section calculations are focused on measurable rates for the standard model process, but we also perform calculations for various channels of the nonstandard neutrino-nucleus reactions and obtain promising results within the current upper limits of the corresponding exotic parameters. We concentrate on the possibility of detecting (i) supernova neutrinos by using massive detectors like those of the GERDA and SuperCDMS dark matter experiments and (ii) laboratory neutrinos produced near the spallation neutron source facilities (at Oak Ridge National Lab) by the COHERENT experiment. Our nuclear calculations take advantage of the relevant experimental sensitivity and employ the severe bounds extracted for the exotic parameters entering the Lagrangians of various particle physics models, specifically those resulting from the charged lepton flavour violating μ⁻ → e⁻ experiments (Mu2e and COMET).

  2. The Diurnal Variation of the Wimp Detection Event Rates in Directional Experiments

    CERN Document Server

    Vergados, J D

    2009-01-01

    The recent WMAP data have confirmed that exotic dark matter, together with the vacuum energy (cosmological constant), dominates in the flat Universe. Modern particle theories naturally provide viable cold dark matter candidates with masses in the GeV-TeV region. Supersymmetry provides the lightest supersymmetric particle (LSP), theories in extra dimensions supply the lightest Kaluza-Klein particle (LKP), etc. The nature of dark matter can only be unraveled by its direct detection in the laboratory. All such candidates will be called WIMPs (Weakly Interacting Massive Particles). In any case the direct dark matter search, which amounts to detecting the recoiling nucleus following its collision with a WIMP, is central to particle physics and cosmology. In this work we briefly review the theoretical elements relevant to direct dark matter detection experiments, paying particular attention to directional experiments, i.e. experiments in which not only the energy but also the direction of the recoiling nucleus is ob...

  3. Detection of anomalous events

    Science.gov (United States)

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm can be user-configured to adjust the number of false alerts. The anomaly detector can be used for a variety of probability density functions, including normal Gaussian distributions, irregular distributions, as well as functions associated with continuous or discrete variables.

  4. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interest (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers, but in the future more sophisticated methods might also be defined. The camera provides a 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind, the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices, for example at ASDEX Upgrade and COMPASS, with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.

  5. Rate based failure detection

    Science.gov (United States)

    Johnson, Brett Emery Trabun; Gamage, Thoshitha Thanushka; Bakken, David Edward

    2018-01-02

    This disclosure describes, in part, a system management component and failure detection component for use in a power grid data network to identify anomalies within the network and systematically adjust the quality of service of data published by publishers and subscribed to by subscribers within the network. In one implementation, subscribers may identify a desired data rate, a minimum acceptable data rate, desired latency, minimum acceptable latency and a priority for each subscription. The failure detection component may identify an anomaly within the network and a source of the anomaly. Based on the identified anomaly, data rates and/or data paths may be adjusted in real-time to ensure that the power grid data network does not become overloaded and/or fail.

  6. Event detection intelligent camera development

    International Nuclear Information System (INIS)

    Szappanos, A.; Kocsis, G.; Molnar, A.; Sarkozi, J.; Zoletnik, S.

    2008-01-01

    A new camera system, the 'event detection intelligent camera' (EDICAM), is being developed for the video diagnostics of the W7-X stellarator, which consists of 10 distinct and standalone measurement channels, each holding a camera. Different operation modes will be implemented for both continuous and triggered readout. Hardware level trigger signals will be generated from real time image processing algorithms optimized for digital signal processor (DSP) and field programmable gate array (FPGA) architectures. At full resolution a camera sends 12-bit sampled 1280 × 1024 pixel frames at 444 fps, which amounts to about 1.43 terabytes over half an hour. Analysing such a huge amount of data is time consuming and computationally complex. We plan to overcome this problem with EDICAM's preprocessing concepts. The EDICAM camera system integrates the advantages of CMOS sensor chip technology and fast network connections. EDICAM is built up from three different modules with two interfaces: a sensor module (SM) with reduced hardware and functional elements, to achieve a small, compact size and robust operation in a harsh environment; an image processing and control unit (IPCU) module, which handles all user-predefined events and runs image processing algorithms to generate trigger signals; and a 10 Gigabit Ethernet compatible image readout card that functions as the network interface for the PC. In this contribution all the concepts of EDICAM and the functions of the distinct modules are described.
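
    The quoted raw data volume follows directly from the frame geometry. The arithmetic below is a rough consistency check of the numbers above (my own calculation, not taken from the paper), assuming exactly 12 bits per pixel and binary terabytes (TiB).

    ```python
    # Rough consistency check of the quoted raw data rate (my own arithmetic).
    pixels_per_frame = 1280 * 1024
    bits_per_pixel = 12
    fps = 444
    seconds = 30 * 60  # half an hour

    total_bytes = pixels_per_frame * bits_per_pixel / 8 * fps * seconds
    print(f"{total_bytes / 2**40:.2f} TiB")  # ~1.43, matching the quoted figure
    ```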

  7. INES rating of radiation protection related events

    International Nuclear Information System (INIS)

    Hort, M.

    2009-01-01

    Based on the draft of the new edition of the INES User's Manual, this presentation gives a short review of the use of the INES rating for events concerning radiation protection. The presentation comprises a brief history of the scale's development, a general description of the scale and the main principles of the INES rating. Several examples of the use of the scale for radiation protection related events are mentioned. In the presentation, the term 'radiation protection related events' is used for radiation source and transport related events outside nuclear installations. (authors)

  8. A simple strategy for fall events detection

    KAUST Repository

    Harrou, Fouzi

    2017-01-20

    The paper concerns the detection of fall events based on human silhouette shape variations. The detection of fall events is addressed from the statistical point of view as an anomaly detection problem. Specifically, the paper investigates the multivariate exponentially weighted moving average (MEWMA) control chart to detect fall events. Towards this end, a set of ratios for five partial occupancy areas of the human body is collected for each frame and used as the input data to the MEWMA chart. The MEWMA fall detection scheme has been successfully applied to two publicly available fall detection databases, the UR fall detection dataset (URFD) and the fall detection dataset (FDD). The monitoring strategy developed was able to provide early alerts in the event of fall situations.
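
    As an illustration of the charting step described above, the sketch below computes MEWMA statistics from a sequence of per-frame ratio vectors; the smoothing constant, the baseline estimates and the alarm threshold are assumptions for the example, not values from the paper.

    ```python
    import numpy as np

    def mewma_statistics(X, lam=0.1):
        """Multivariate EWMA (MEWMA) T^2-type statistics for a sequence of
        feature vectors X (n_frames x p), e.g. five body-area occupancy ratios.
        A minimal sketch; lam and the alarm threshold h are illustrative choices."""
        X = np.asarray(X, dtype=float)
        mu = X.mean(axis=0)              # in practice: estimated from fall-free data
        sigma = np.cov(X, rowvar=False)  # baseline covariance of the ratios
        z = np.zeros_like(mu)
        stats = []
        for t, x in enumerate(X, start=1):
            z = lam * (x - mu) + (1 - lam) * z
            # covariance of z_t used by the MEWMA chart
            cov_z = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t)) * sigma
            stats.append(z @ np.linalg.solve(cov_z, z))
        return np.array(stats)           # flag a fall candidate when stats[t] > h
    ```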

  9. Semblance for microseismic event detection

    Czech Academy of Sciences Publication Activity Database

    Staněk, František; Anikiev, D.; Valenta, Jan; Eisner, Leo

    2015-01-01

    Roč. 201, č. 3 (2015), s. 1362-1369 ISSN 0956-540X R&D Projects: GA ČR GAP210/12/2451 Institutional support: RVO:67985891 Keywords: microseismic event * microseismic monitoring * source mechanisms Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 2.484, year: 2015

  10. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  11. Detection of neutral current events

    International Nuclear Information System (INIS)

    Innocenti, P.G.

    1979-01-01

    The topics investigated in the course of the study can be broadly divided into three classes: (i) Inclusive measurements of the scattered electron for the determination of structure functions, scaling violations, deltaL/deltaT and weak interaction effects. (ii) Exclusive measurements of the current jet (momentum, energy, particle composition) for the study of fragmentation functions, for search of new particles, new quarks and QCD effects in the jet. (iii) Search for heavy leptons by detection and identification of their decay products. (orig.)

  12. Generalized Detectability for Discrete Event Systems

    Science.gov (United States)

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated the detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on the observer, whose construction is of exponential complexity, while the new algorithms are based on a new automaton called the detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by requiring only that certain pairs of states be distinguished. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432

  13. Cartan invariants and event horizon detection

    Science.gov (United States)

    Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.

    2018-04-01

    We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.

  14. Semantic Context Detection Using Audio Event Fusion

    Directory of Open Access Journals (Sweden)

    Cheng Wen-Huang

    2006-01-01

    Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, that is, gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine, SVM) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.

  15. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
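
    To make the segment-based view concrete, the sketch below computes a segment-based F-score from binary activity matrices. It follows the commonly used definitions in simplified form and is not the toolbox implementation referenced in the abstract.

    ```python
    import numpy as np

    def segment_based_f1(reference, estimated):
        """Segment-based F-score for polyphonic sound event detection.
        reference, estimated: binary arrays (n_segments x n_classes) marking which
        event classes are active in each fixed-length segment.
        A simplified sketch of the commonly used definitions."""
        reference = np.asarray(reference, bool)
        estimated = np.asarray(estimated, bool)
        tp = np.logical_and(reference, estimated).sum()
        fp = np.logical_and(~reference, estimated).sum()
        fn = np.logical_and(reference, ~estimated).sum()
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    ```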

  16. Detection of goal events in soccer videos

    Science.gov (United States)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2005-01-01

    In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) event candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.

  17. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  18. A simple strategy for fall events detection

    KAUST Repository

    Harrou, Fouzi; Zerrouki, Nabil; Sun, Ying; Houacine, Amrane

    2017-01-01

    the multivariate exponentially weighted moving average (MEWMA) control chart to detect fall events. Towards this end, a set of ratios for five partial occupancy areas of the human body for each frame are collected and used as the input data to MEWMA chart

  19. Adaptive prediction applied to seismic event detection

    International Nuclear Information System (INIS)

    Clark, G.A.; Rodgers, P.W.

    1981-01-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data

  20. Adaptive prediction applied to seismic event detection

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G.A.; Rodgers, P.W.

    1981-09-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data.
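
    As a rough illustration of the prediction-error approach used in the two reports above, the sketch below implements a one-step Widrow-Hoff LMS predictor. The filter order and step size are illustrative assumptions, and a normalized (NLMS) update is often preferable for real seismograms.

    ```python
    import numpy as np

    def lms_prediction_error(x, order=8, mu=0.01):
        """One-step LMS (Widrow-Hoff) predictor. Returns the prediction-error
        sequence; a sustained jump in |error| indicates a possible event onset.
        order and mu are illustrative, not values from the report."""
        x = np.asarray(x, dtype=float)
        w = np.zeros(order)
        err = np.zeros(len(x))
        for n in range(order, len(x)):
            past = x[n - order:n][::-1]   # most recent samples first
            pred = w @ past
            err[n] = x[n] - pred
            w += 2 * mu * err[n] * past   # Widrow-Hoff gradient update
        return err
    ```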

  1. Abnormal Event Detection Using Local Sparse Representation

    DEFF Research Database (Denmark)

    Ren, Huamin; Moeslund, Thomas B.

    2014-01-01

    We propose to detect abnormal events via a sparse subspace clustering algorithm. Unlike most existing approaches, which search for optimized normal bases and detect abnormality based on least square error or reconstruction error from the learned normal patterns, we propose an abnormality measurem...... is found that satisfies: the distance between its local space and the normal space is large. We evaluate our method on two public benchmark datasets: UCSD and Subway Entrance datasets. The comparison to the state-of-the-art methods validates our method's effectiveness....

  2. An Examination of Three Spatial Event Cluster Detection Methods

    Directory of Open Access Journals (Sweden)

    Hensley H. Mariathas

    2015-03-01

    In spatial disease surveillance, geographic areas with large numbers of disease cases are to be identified, so that targeted investigations can be pursued. Geographic areas with high disease rates are called disease clusters and statistical cluster detection tests are used to identify geographic areas with higher disease rates than expected by chance alone. In some situations, disease-related events rather than individuals are of interest for geographical surveillance, and methods to detect clusters of disease-related events are called event cluster detection methods. In this paper, we examine three distributional assumptions for the events in cluster detection: compound Poisson, approximate normal and multiple hypergeometric (exact). The methods differ in the choice of distributional assumption for the potentially multiple correlated events per individual. The methods are illustrated on emergency department (ED) presentations by children and youth (age < 18 years) because of substance use in the province of Alberta, Canada, from 1 April 2007 to 31 March 2008. Simulation studies are conducted to investigate Type I error and the power of the clustering methods.

  3. Fast radio burst event rate counts - I. Interpreting the observations

    Science.gov (United States)

    Macquart, J.-P.; Ekers, R. D.

    2018-02-01

    The fluence distribution of the fast radio burst (FRB) population (the 'source count' distribution, N(>F) ∝ F^α) is a crucial diagnostic of its distance distribution, and hence the progenitor evolutionary history. We critically reanalyse current estimates of the FRB source count distribution. We demonstrate that the Lorimer burst (FRB 010724) is subject to discovery bias, and should be excluded from all statistical studies of the population. We re-examine the evidence for flat, α > -1, source count estimates based on the ratio of single-beam to multiple-beam detections with the Parkes multibeam receiver, and show that current data imply only a very weak constraint of α ≲ -1.3. A maximum-likelihood analysis applied to the portion of the Parkes FRB population detected above the observational completeness fluence of 2 Jy ms yields α = -2.6^{+0.7}_{-1.3}. Uncertainties in the location of each FRB within the Parkes beam render estimates of the Parkes event rate uncertain in both normalizing survey area and the estimated post-beam-corrected completeness fluence; this uncertainty needs to be accounted for when comparing the event rate against event rates measured at other telescopes.
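
    For context, a standard maximum-likelihood estimator for the cumulative slope of a pure power law above a completeness limit F_0 (a textbook result, not necessarily the exact likelihood used by the authors) is

    ```latex
    % Textbook ML estimator for a cumulative power-law slope above a completeness
    % limit F_0 (here 2 Jy ms); shown for reference, not the paper's own derivation.
    \hat{\alpha} \;=\; -\left[\frac{1}{N}\sum_{i=1}^{N}\ln\frac{F_i}{F_0}\right]^{-1},
    \qquad
    \sigma_{\hat{\alpha}} \;\approx\; \frac{|\hat{\alpha}|}{\sqrt{N}},
    \qquad N(>F)\propto F^{\alpha},
    ```

    for N bursts with fluences F_i ≥ F_0.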

  4. First CNGS events detected by LVD

    International Nuclear Information System (INIS)

    Agafonova, N.Yu.; Boyarkin, V.V.; Kuznetsov, V.V.; Kuznetsov, V.A.; Malguin, A.S.; Ryasny, V.G.; Ryazhskaya, O.G.; Yakushev, V.F.; Zatsepin, G.T.; Aglietta, M.; Bonardi, A.; Fulgione, W.; Galeotti, P.; Porta, A.; Saavedra, O.; Vigorito, C.; Antonioli, P.; Bari, G.; Giusti, P.; Menghetti, H.; Persiani, R.; Pesci, A.; Sartorelli, G.; Selvi, M.; Zichichi, A.; Bruno, G.; Ghia, P.L.; Garbini, M.; Kemp, E.; Pless, I.A.; Votano, L.

    2007-01-01

    The CERN Neutrino to Gran Sasso (CNGS) project aims to produce a high energy, wide band ν_μ beam at CERN and send it toward the INFN Gran Sasso National Laboratory (LNGS), 732 km away. Its main goal is the observation of the ν_τ appearance, through neutrino flavour oscillation. The beam started its operation in August 2006 for about 12 days: a total amount of 7.6 × 10^17 protons were delivered to the target. The LVD detector, installed in hall A of the LNGS and mainly dedicated to the study of supernova neutrinos, was fully operating during the whole CNGS running time. A total number of 569 events were detected in coincidence with the beam spill time. This is in good agreement with the expected number of events from Monte Carlo simulations. (orig.)

  5. First CNGS events detected by LVD

    International Nuclear Information System (INIS)

    Selvi, M.

    2007-01-01

    The CERN Neutrino to Gran Sasso (CNGS) project aims to produce a high energy, wide band ν_μ beam at CERN and send it towards the INFN Gran Sasso National Laboratory (LNGS), 732 km away. Its main goal is the observation of the ν_τ appearance, through neutrino flavour oscillation. The beam started its operation in August 2006 for about 12 days: a total amount of 7.6 × 10^17 protons were delivered to the target. The LVD detector, installed in hall A of the LNGS and mainly dedicated to the study of supernova neutrinos, was fully operating during the whole CNGS running time. A total number of 569 events were detected in coincidence with the beam spill time. This is in good agreement with the expected number of events from Monte Carlo simulations.

  6. LAN attack detection using Discrete Event Systems.

    Science.gov (United States)

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol and any IP-MAC pairing sent by a host is accepted without verification. This weakness in the ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairing static, modify the existing ARP, or patch the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC or changes to the ARP. A DES model is built for the LAN under both a normal and a compromised (i.e., spoofed request/response) situation based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic in the LAN. Following that, a DES detector is built to determine, from observed ARP-related events, whether the LAN is operating under a normal or a compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Multilingual event extraction for epidemic detection.

    Science.gov (United States)

    Lejeune, Gaël; Brixtel, Romain; Doucet, Antoine; Lucas, Nadine

    2015-10-01

    This paper presents a multilingual news surveillance system applied to tele-epidemiology. It has been shown that multilingual approaches improve timeliness in detection of epidemic events across the globe, eliminating the wait for local news to be translated into major languages. We present here a system to extract epidemic events in potentially any language, provided a Wikipedia seed for common disease names exists. The Daniel system presented herein relies on properties that are common to news writing (the journalistic genre), the most useful being repetition and saliency. Wikipedia is used to screen common disease names to be matched with repeated characters strings. Language variations, such as declensions, are handled by processing text at the character-level, rather than at the word level. This additionally makes it possible to handle various writing systems in a similar fashion. As no multilingual ground truth existed to evaluate the Daniel system, we built a multilingual corpus from the Web, and collected annotations from native speakers of Chinese, English, Greek, Polish and Russian, with no connection or interest in the Daniel system. This data set is available online freely, and can be used for the evaluation of other event extraction systems. Experiments for 5 languages out of 17 tested are detailed in this paper: Chinese, English, Greek, Polish and Russian. The Daniel system achieves an average F-measure of 82% in these 5 languages. It reaches 87% on BEcorpus, the state-of-the-art corpus in English, slightly below top-performing systems, which are tailored with numerous language-specific resources. The consistent performance of Daniel on multiple languages is an important contribution to the reactivity and the coverage of epidemiological event detection systems. Most event extraction systems rely on extensive resources that are language-specific. While their sophistication induces excellent results (over 90% precision and recall), it restricts their

  8. Detecting Seismic Events Using a Supervised Hidden Markov Model

    Science.gov (United States)

    Burks, L.; Forrest, R.; Ray, J.; Young, C.

    2017-12-01

    We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which the transition occurs from noise to signal. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No: SAND2017-8154 A
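
    Since STA/LTA triggering appears here both as a baseline and as an input feature, the sketch below shows a basic STA/LTA ratio computation. The window lengths are illustrative placeholders that depend on the sampling rate; this is not the authors' implementation.

    ```python
    import numpy as np

    def sta_lta(trace, sta_len=50, lta_len=1000):
        """Classic STA/LTA onset indicator on a 1-D seismogram. Returns the ratio
        of a short-term to a long-term average of the squared signal; a trigger is
        declared when the ratio crosses a chosen threshold. Window lengths (in
        samples) are illustrative and depend on the sampling rate."""
        energy = np.asarray(trace, dtype=float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        ratio = np.zeros(len(energy))
        for n in range(lta_len, len(energy)):
            sta = (csum[n + 1] - csum[n + 1 - sta_len]) / sta_len
            lta = (csum[n + 1] - csum[n + 1 - lta_len]) / lta_len
            ratio[n] = sta / lta if lta > 0 else 0.0
        return ratio
    ```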

  9. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    Science.gov (United States)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local-scale), a large aftershock sequence (regional-scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P-and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning, Seism. Res. Lett. , 83, 531-540, doi: 10

  10. Detecting surface events at the COBRA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Tebruegge, Jan [Exp. Physik IV, TU Dortmund (Germany); Collaboration: COBRA-Collaboration

    2015-07-01

    The aim of the COBRA experiment is to prove the existence of neutrinoless double-beta decay and to measure its half-life. For this purpose the COBRA demonstrator, a prototype for a large-scale experiment, is operated at the Gran Sasso Underground Laboratory (LNGS) in Italy. The demonstrator is a detector array made of 64 Cadmium-Zinc-Telluride (CdZnTe) semiconductor detectors in the coplanar grid anode configuration. Each detector is 1 × 1 × 1 cm³ in size. This setup is used to investigate the experimental issues of operating CdZnTe detectors in low background mode and identify potential background components. As the 'detector = source' principle is used, the neutrinoless double-beta decay that COBRA searches for occurs within the whole detector volume. Consequently, events on the surface of the detectors are considered background. These surface events are a main background component, stemming mainly from natural radioactivity, especially radon. This talk explains to what extent surface events occur and shows how these are recognized and vetoed in the analysis using pulse shape discrimination algorithms.

  11. The rate of adverse events during IV conscious sedation.

    Science.gov (United States)

    Schwamburger, Nathan T; Hancock, Raymond H; Chong, Chol H; Hartup, Grant R; Vandewalle, Kraig S

    2012-01-01

    Conscious sedation has become an integral part of dentistry; it is often used to reduce anxiety or fear in some patients during oral surgery, periodontal surgery, implant placement, and general dentistry procedures. The purpose of this study was to evaluate the frequency of adverse events during IV conscious sedation provided by credentialed general dentists and periodontists in the United States Air Force (USAF). Sedation clinical records (Air Force Form 1417) from calendar year 2009 were requested from all USAF bases. A total of 1,468 records were reviewed and 19 adverse events were noted in 17 patients. IV complication (infiltration) was the most common adverse event. The overall adverse event rate was 1.3 per 100 patients treated. The results of this study show that moderate sedation provided by general dentists and periodontists in the USAF has a low incidence of adverse events, and conscious sedation remains a viable option for providers for the reduction of anxiety in select patients.
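
    The quoted overall rate is consistent with the raw counts given above (my arithmetic, shown only as a check):

    ```latex
    % Check of the reported overall adverse event rate (my own arithmetic).
    \frac{19\ \text{adverse events}}{1468\ \text{records}}\times 100 \;\approx\; 1.3\ \text{per 100 patients treated}.
    ```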

  12. Detection limit for rate fluctuations in inhomogeneous Poisson processes

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.

  13. Detection limit for rate fluctuations in inhomogeneous Poisson processes.

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.

  14. Subsurface Event Detection and Classification Using Wireless Signal Networks

    Directory of Open Access Journals (Sweden)

    Muhannad T. Suleiman

    2012-11-01

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). The wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data of soil properties on how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for the subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator of geo-events, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. With the event detection, the window-based classifier classifies geo-events on the event occurring regions that are called a classification window. The proposed window-based classification method is evaluated with a water leakage experiment in which the data has been measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal network can detect and classify subsurface events.

  15. Subsurface event detection and classification using Wireless Signal Networks.

    Science.gov (United States)

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-11-05

    Subsurface environment sensing and monitoring applications such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept, Wireless Signal Networks (WSiNs). The wireless signal networks take advantage of the variations of radio signal strength on the distributed underground sensor nodes of WSiNs to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list and experimental data of soil properties on how radio propagation is affected by soil properties in subsurface communication environments. Experiments demonstrated that calibrated wireless signal strength variations can be used as indicators to sense changes in the subsurface environment. The concept of WSiNs for the subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion using actual underground sensor nodes. To classify geo-events using the measured signal strength as a main indicator of geo-events, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier for wireless signal networks has two steps: event detection and event classification. With the event detection, the window-based classifier classifies geo-events on the event occurring regions that are called a classification window. The proposed window-based classification method is evaluated with a water leakage experiment in which the data has been measured in laboratory experiments. In these experiments, the proposed detection and classification method based on wireless signal network can detect and classify subsurface events.
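
    As a simplified illustration of the classification step described above, the sketch below assigns a detected window of received-signal-strength features to the nearest class template. The feature construction and class templates are assumptions for the example; the Bayesian decision rule in the paper reduces to this minimum-distance form only under equal priors and a shared covariance.

    ```python
    import numpy as np

    def classify_window(window, class_means):
        """Minimum-distance classification of a geo-event from a window of
        received-signal-strength features (n_samples x n_nodes).
        class_means: dict mapping event label -> mean feature vector.
        A simplified sketch, not the paper's exact classifier."""
        feature = np.asarray(window, dtype=float).mean(axis=0)  # per-node average RSS change
        distances = {label: np.linalg.norm(feature - np.asarray(mu))
                     for label, mu in class_means.items()}
        return min(distances, key=distances.get)                # nearest class template
    ```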

  16. Learning Multimodal Deep Representations for Crowd Anomaly Event Detection

    Directory of Open Access Journals (Sweden)

    Shaonian Huang

    2018-01-01

    Anomaly event detection in crowd scenes is extremely important; however, the majority of existing studies merely use hand-crafted features to detect anomalies. In this study, a novel unsupervised deep learning framework is proposed to detect anomaly events in crowded scenes. Specifically, low-level visual features, energy features, and motion map features are simultaneously extracted based on spatiotemporal energy measurements. Three convolutional restricted Boltzmann machines are trained to model the mid-level feature representation of normal patterns. Then a multimodal fusion scheme is utilized to learn the deep representation of crowd patterns. Based on the learned deep representation, a one-class support vector machine model is used to detect anomaly events. The proposed method is evaluated using two available public datasets and compared with state-of-the-art methods. The experimental results show its competitive performance for anomaly event detection in video surveillance.

  17. THE PRODUCTION RATE OF SN Ia EVENTS IN GLOBULAR CLUSTERS

    International Nuclear Information System (INIS)

    Washabaugh, Pearce C.; Bregman, Joel N.

    2013-01-01

    In globular clusters, dynamical evolution produces luminous X-ray emitting binaries at a rate about 200 times greater than in the field. If globular clusters also produce SN Ia at a high rate, it would account for much of the SN Ia production in early-type galaxies and provide insight into their formation. Here we use archival Hubble Space Telescope (HST) images of nearby galaxies that have hosted an SN Ia to examine the rate at which globular clusters produce these events. The location of the SN Ia is registered on an HST image obtained before the event or after the supernova (SN) faded. Of the 36 nearby galaxies examined, 21 had sufficiently good data to search for globular cluster hosts. None of the 21 SNe have a definite globular cluster counterpart, although there are some ambiguous cases. This places an upper limit on the enhancement rate of SN Ia production in globular clusters of about 42 at the 95% confidence level, which is an order of magnitude lower than the enhancement rate for luminous X-ray binaries. Even if all of the ambiguous cases are considered as having a globular cluster counterpart, the upper bound for the enhancement rate is 82 at the 95% confidence level, still a factor of several below that needed to account for half of the SN Ia events. Barring unforeseen selection effects, we conclude that globular clusters are not responsible for producing a significant fraction of the SN Ia events in early-type galaxies.

  18. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Abnormal event detection is one of the vital tasks in wireless sensor networks. However, the faults of nodes and the poor deployment environment have brought great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept named attribute correlation confidence to evaluate the fitting degree between the sensor reading and the abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.

  19. Pooling overdispersed binomial data to estimate event rate.

    Science.gov (United States)

    Young-Xu, Yinong; Chan, K Arnold

    2008-08-19

    The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.

  20. Pooling overdispersed binomial data to estimate event rate

    Directory of Open Access Journals (Sweden)

    Chan K Arnold

    2008-08-01

    Background The beta-binomial model is one of the methods that can be used to validly combine event rates from overdispersed binomial data. Our objective is to provide a full description of this method and to update and broaden its applications in clinical and public health research. Methods We describe the statistical theories behind the beta-binomial model and the associated estimation methods. We supply information about statistical software that can provide beta-binomial estimations. Using a published example, we illustrate the application of the beta-binomial model when pooling overdispersed binomial data. Results In an example regarding the safety of oral antifungal treatments, we had 41 treatment arms with event rates varying from 0% to 13.89%. Using the beta-binomial model, we obtained a summary event rate of 3.44% with a standard error of 0.59%. The parameters of the beta-binomial model took the values of 1.24 for alpha and 34.73 for beta. Conclusion The beta-binomial model can provide a robust estimate for the summary event rate by pooling overdispersed binomial data from different studies. The explanation of the method and the demonstration of its applications should help researchers incorporate the beta-binomial method as they aggregate probabilities of events from heterogeneous studies.
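
    As a quick consistency check of the reported figures (my own arithmetic, not part of the paper), the marginal event probability implied by the fitted beta distribution is its mean:

    ```latex
    % Mean of the fitted beta distribution; agrees with the reported 3.44% to rounding.
    \hat{p} \;=\; \frac{\alpha}{\alpha+\beta} \;=\; \frac{1.24}{1.24+34.73} \;\approx\; 0.0345 \;\approx\; 3.4\%.
    ```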

  1. Abnormal global and local event detection in compressive sensing domain

    Science.gov (United States)

    Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem

    2018-05-01

    Abnormal event detection, also known as anomaly detection, is one challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection to fight against factors such as occlusion and illumination change. In this paper, a new algorithm is proposed. It can locate the abnormal events on one frame, and detect the global abnormal frame. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. Then, the abnormal detection mission is constructed as a one-class classification task via merely learning from the training normal samples. Experiments demonstrate that our algorithm performs well on the benchmark abnormal detection datasets against state-of-the-art methods.

  2. Abnormal global and local event detection in compressive sensing domain

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2018-05-01

    Abnormal event detection, also known as anomaly detection, is one challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection to fight against factors such as occlusion and illumination change. In this paper, a new algorithm is proposed. It can locate the abnormal events on one frame, and detect the global abnormal frame. The proposed algorithm employs a sparse measurement matrix designed to represent the movement feature based on optical flow efficiently. Then, the abnormal detection mission is constructed as a one-class classification task via merely learning from the training normal samples. Experiments demonstrate that our algorithm performs well on the benchmark abnormal detection datasets against state-of-the-art methods.
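
    To illustrate the one-class classification step described in both versions of this record, the sketch below fits a one-class SVM to normal-frame features only and flags test frames with negative decision scores. The feature matrices are random placeholders, and the kernel and nu values are illustrative, not the paper's.

    ```python
    import numpy as np
    from sklearn.svm import OneClassSVM

    # Placeholder feature matrices: rows are per-frame motion descriptors
    # (e.g., compressed optical-flow measurements); only normal data is used for fitting.
    normal_features = np.random.rand(500, 64)   # hypothetical training frames
    test_features = np.random.rand(100, 64)     # hypothetical test frames

    clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_features)
    scores = clf.decision_function(test_features)   # negative scores -> flagged abnormal
    abnormal_frames = np.where(scores < 0)[0]
    print(abnormal_frames)
    ```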

  3. Event storm detection and identification in communication systems

    International Nuclear Information System (INIS)

    Albaghdadi, Mouayad; Briley, Bruce; Evens, Martha

    2006-01-01

    Event storms are the manifestation of an important class of abnormal behaviors in communication systems. They occur when a large number of nodes throughout the system generate a set of events within a small period of time. It is essential for network management systems to detect every event storm and identify its cause, in order to prevent and repair potential system faults. This paper presents a set of techniques for the effective detection and identification of event storms in communication systems. First, we introduce a new algorithm to synchronize events to a single node in the system. Second, the system's event log is modeled as a normally distributed random process. This is achieved by using data analysis techniques to explore and then model the statistical behavior of the event log. Third, event storm detection is proposed using a simple test statistic combined with an exponential smoothing technique to overcome the non-stationary behavior of event logs. Fourth, the system is divided into non-overlapping regions to locate the main contributing regions of a storm. We show that this technique provides us with a method for event storm identification. Finally, experimental results from a commercially deployed multimedia communication system that uses these techniques demonstrate their effectiveness
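
    The detection step described above (a simple test statistic combined with exponential smoothing over an approximately normal event log) might look like the following sketch. The smoothing constant, threshold and baseline estimates are illustrative assumptions, not the values used in the paper.

    ```python
    import numpy as np

    def storm_alarm(counts, mu, sigma, alpha=0.3, k=4.0):
        """Flag event-storm intervals from a sequence of per-interval event counts.
        mu, sigma: baseline mean and standard deviation of the (assumed roughly
        normal) event log under normal operation. alpha and k are illustrative.
        Exponential smoothing tracks slow drift so only sudden bursts raise alarms."""
        smoothed = mu
        alarms = []
        for c in counts:
            z = (c - smoothed) / sigma      # simple standardized test statistic
            alarms.append(z > k)            # storm if the count is far above the trend
            smoothed = alpha * c + (1 - alpha) * smoothed
        return np.array(alarms)
    ```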

  4. Detecting failure events in buildings: a numerical and experimental analysis

    OpenAIRE

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam-column connections in instrumented buildings in real time through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green’s functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  5. Online Detection of Abnormal Events in Video Streams

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2013-01-01

    an image descriptor and online nonlinear classification method. We introduce the covariance matrix of the optical flow and image intensity as a descriptor encoding moving information. The nonlinear online support vector machine (SVM) first learns a limited set of the training frames to provide a basic reference model, then updates the model and detects abnormal events in the current frame. We finally apply the method to detect abnormal events on a benchmark video surveillance dataset to demonstrate the effectiveness of the proposed technique.

  6. Detecting impacts of extreme events with ecological in situ monitoring networks

    Directory of Open Access Journals (Sweden)

    M. D. Mahecha

    2017-09-01

    Extreme hydrometeorological conditions typically impact ecophysiological processes on land. Satellite-based observations of the terrestrial biosphere provide an important reference for detecting and describing the spatiotemporal development of such events. However, in-depth investigations of ecological processes during extreme events require additional in situ observations. The question is whether the density of existing ecological in situ networks is sufficient for analysing the impact of extreme events, and what the expected event detection rates of ecological in situ networks of a given size are. To assess these issues, we build a baseline of extreme reductions in the fraction of absorbed photosynthetically active radiation (FAPAR), identified by a new event detection method tailored to identify extremes of regional relevance. We then investigate the event detection success rates of hypothetical networks of varying sizes. Our results show that large extremes can be reliably detected with relatively small networks, but also reveal a linear decay of detection probabilities towards smaller extreme events in log–log space. For instance, networks with ≈ 100 randomly placed sites in Europe yield a ≥ 90 % chance of detecting the eight largest (typically very large) extreme events, but only a ≥ 50 % chance of capturing the 39 largest events. These findings are consistent with probability-theoretic considerations, but the slopes of the decay rates deviate due to temporal autocorrelation and the exact implementation of the extreme event detection algorithm. Using the examples of AmeriFlux and NEON, we then investigate to what degree ecological in situ networks can capture extreme events of a given size. Consistent with our theoretical considerations, we find that today's systematically designed networks (i.e. NEON) reliably detect the largest extremes, but that the extreme event detection rates are not higher than would
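
    The probability-theoretic considerations mentioned above can be made explicit in an idealized form (my simplification, not the paper's exact calculation): if an extreme event covers a fraction f of the study domain and n monitoring sites are placed uniformly at random, the chance that at least one site samples the event is

    ```latex
    % Idealized detection probability for n uniformly random sites and an event
    % covering a fraction f of the domain (ignores spatial correlation).
    P_{\mathrm{detect}} \;=\; 1-(1-f)^{n},
    \qquad
    P_{\mathrm{detect}} \ge 0.9 \;\Longleftrightarrow\; f \;\ge\; 1-0.1^{1/n} \;\approx\; 0.023 \ \ (n=100).
    ```

    In this idealization, only events covering more than a few per cent of the domain are near-certain to be caught by a 100-site network, which is qualitatively consistent with the detection rates quoted above.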

  7. Evaluation of misplaced event count rate using a scintillation camera

    International Nuclear Information System (INIS)

    Yanagimoto, Shin-ichi; Tomomitsu, Tatsushi; Muranaka, Akira

    1985-01-01

    Misplaced event count rates were evaluated using an acrylic scatter body of various thicknesses and a gamma camera. The count rate in the region of interest (ROI) within the camera view field, which was thought to represent part of the misplaced event count rate, increased as the thickness of the scatter body was increased to 5 cm, followed by a steep decline in the count rate. On the other hand, the ratio of the count rate in the ROI to the total count rate continuously increased as the thickness of the scatter body was increased. As the thickness of the scatter body was increased, the count rates increased, and the increments of increase were greater in the lower energy region of the photopeak than in the higher energy region. In energy ranges other than the photopeak, the influence of the scatter body on the count rate in the ROI was greatest at 76 keV, which was the lowest energy we examined. (author)

  8. Leak rate models and leak detection

    International Nuclear Information System (INIS)

    1992-01-01

    Leak detection may be carried out by a number of detection systems, but selection of the systems must be carefully adapted to the fluid state and the location of the leak in the reactor coolant system. Computer programs for the calculation of leak rates contain different models to take into account the fluid state before its entrance into the crack, and they have to be verified by experiments; agreement between experiments and calculations is generally not satisfactory for very small leak rates resulting from narrow cracks or from a closing bending moment.

  9. Non-Linguistic Vocal Event Detection Using Online Random

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Tan, Zheng-Hua; Christensen, Mads Græsbøll

    2014-01-01

    areas such as object detection, face recognition, and audio event detection. This paper proposes to use the online random forest technique for detecting laughter and filler and for analyzing the importance of various features for non-linguistic vocal event classification through permutation. The results...... show that according to the Area Under Curve measure the online random forest achieved 88.1% compared to 82.9% obtained by the baseline support vector machines for laughter classification, and 86.8% compared to 83.6% for filler classification....

  10. TACKLING EVENT DETECTION IN THE CONTEXT OF VIDEO SURVEILLANCE

    Directory of Open Access Journals (Sweden)

    Raducu DUMITRESCU

    2011-11-01

    Full Text Available In this paper we address the problem of event detection in the context of video surveillance systems. First we deal with background extraction. Three methods are tested, namely: frame differencing, running average and an estimate of the median filtering technique. This provides information about changing contents. Further, we use this information to address human presence detection in the scene. This is carried out through a contour-based approach. Contours are extracted from moving regions and parameterized. Human silhouettes show particular signatures of these parameters. Experimental results demonstrate the potential of this approach to event detection. However, these are our first preliminary results for this application.
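
    Two of the three background-extraction methods listed above (frame differencing and a running average) reduce to a few array operations. The sketch below, using OpenCV and NumPy, is only an illustration of those two steps; the threshold and learning rate are placeholders, not the values used in the paper.

```python
# Sketch of frame differencing and a running-average background model.
# Threshold and learning rate are illustrative, not the paper's values.
import cv2
import numpy as np


def masks_running_average(frames, alpha=0.02, thresh=25):
    """Yield a binary change mask per frame using B <- (1-alpha)*B + alpha*F."""
    background = None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        if background is None:
            background = gray.copy()
        cv2.accumulateWeighted(gray, background, alpha)   # update background
        diff = cv2.absdiff(gray, background)
        yield (diff > thresh).astype(np.uint8) * 255


def masks_frame_differencing(frames, thresh=25):
    """Yield a binary change mask per frame from consecutive-frame differences."""
    prev = None
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            yield (cv2.absdiff(gray, prev) > thresh).astype(np.uint8) * 255
        prev = gray
```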

  11. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhangbing Zhou

    2015-12-01

    Full Text Available With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to the sink node(s). When sensory data are collected at the sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified on the edges reflects the extent to which sensory data deviate from a normal sensing range. Event sources are then determined as the barycenters of this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
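
    The final step described above (event coverage as a weighted graph whose barycenter marks the likely source) can be mocked up with networkx. The sketch below is only illustrative: the node deviations, the topology and the weighting rule (inverse of the summed deviations, so that strongly deviating links are "short") are made up, not taken from the article.

```python
# Sketch: represent detected event coverage as a weighted graph and take its
# barycenter as the likely event source.  Deviation values, topology and the
# inverse-deviation weighting rule are illustrative, not the article's.
import networkx as nx

# hypothetical sensor nodes with the deviation of their readings from normal
deviation = {"s1": 0.5, "s2": 0.7, "s3": 0.9, "s4": 0.3, "s5": 0.1}

G = nx.Graph()
for u, v in [("s1", "s2"), ("s2", "s3"), ("s3", "s4"), ("s4", "s5"), ("s1", "s3")]:
    # edge weight: small when both endpoints deviate strongly from normal
    G.add_edge(u, v, weight=1.0 / (deviation[u] + deviation[v]))

# barycenter = node(s) minimising the sum of weighted distances to all others
print(nx.barycenter(G, weight="weight"))      # -> ['s3'] for these toy numbers
```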

  12. Net reclassification index at event rate: properties and relationships.

    Science.gov (United States)

    Pencina, Michael J; Steyerberg, Ewout W; D'Agostino, Ralph B

    2017-12-10

    The net reclassification improvement (NRI) is an attractively simple summary measure quantifying improvement in performance because of addition of new risk marker(s) to a prediction model. Originally proposed for settings with well-established classification thresholds, it quickly extended into applications with no thresholds in common use. Here we aim to explore properties of the NRI at event rate. We express this NRI as a difference in performance measures for the new versus old model and show that the quantity underlying this difference is related to several global as well as decision analytic measures of model performance. It maximizes the relative utility (standardized net benefit) across all classification thresholds and can be viewed as the Kolmogorov-Smirnov distance between the distributions of risk among events and non-events. It can be expressed as a special case of the continuous NRI, measuring reclassification from the 'null' model with no predictors. It is also a criterion based on the value of information and quantifies the reduction in expected regret for a given regret function, casting the NRI at event rate as a measure of incremental reduction in expected regret. More generally, we find it informative to present plots of standardized net benefit/relative utility for the new versus old model across the domain of classification thresholds. Then, these plots can be summarized with their maximum values, and the increment in model performance can be described by the NRI at event rate. We provide theoretical examples and a clinical application on the evaluation of prognostic biomarkers for atrial fibrillation. Copyright © 2016 John Wiley & Sons, Ltd.
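
    One concrete reading of the abstract's "difference in performance measures" can be sketched numerically: evaluate each model's sensitivity + specificity − 1 at the classification threshold equal to the event rate, and take new minus old. The toy code below (plain NumPy, simulated risks) is only an illustration of that reading, not the authors' estimator, and the simulated data are made up.

```python
# Toy sketch of the NRI at event rate as a difference in model performance:
# for each model, sensitivity + specificity - 1 at the threshold set to the
# observed event rate, then new minus old.  Simulated data; not the authors'
# implementation.
import numpy as np

def performance_at_event_rate(risk, y):
    event_rate = y.mean()                     # threshold = observed event rate
    pred = risk >= event_rate
    sens = pred[y == 1].mean()                # P(risk >= rate | event)
    spec = (~pred[y == 0]).mean()             # P(risk <  rate | non-event)
    return sens + spec - 1

def nri_at_event_rate(risk_old, risk_new, y):
    return (performance_at_event_rate(risk_new, y)
            - performance_at_event_rate(risk_old, y))

rng = np.random.default_rng(1)
y = rng.binomial(1, 0.2, 5000)                          # ~20% event rate
risk_old = np.clip(0.2 + 0.10 * (y - 0.2) + rng.normal(0, 0.10, y.size), 0, 1)
risk_new = np.clip(0.2 + 0.25 * (y - 0.2) + rng.normal(0, 0.10, y.size), 0, 1)
print(nri_at_event_rate(risk_old, risk_new, y))         # positive: new model helps
```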

  13. COMPARISON OF FOUR METHODS TO DETECT ADVERSE EVENTS IN HOSPITAL

    Directory of Open Access Journals (Sweden)

    Inge Dhamanti

    2015-09-01

    Full Text Available Detecting adverse events has become one of the challenges in patient safety, and methods to detect adverse events are therefore critical for improving patient safety. The purpose of this paper is to compare the strengths and weaknesses of several methods of identifying adverse events in hospital, including medical record reviews, self-reported incidents, information technology, and patient self-reports. This study is a literature review comparing and analysing these methods to determine which can best be implemented by hospitals. All four methods have proved able to detect adverse events in hospitals, but each has strengths and limitations that need to be overcome. There is no single best method that will give the best results for adverse event detection in hospital. Thus, to detect more adverse events that should have been preventable, or that have already occurred, hospitals should combine more than one detection method, because each method has a different sensitivity.

  14. Application of a conversion factor to estimate the adenoma detection rate from the polyp detection rate.

    LENUS (Irish Health Repository)

    Francis, Dawn L

    2011-03-01

    The adenoma detection rate (ADR) is a quality benchmark for colonoscopy. Many practices find it difficult to determine the ADR because it requires a combination of endoscopic and histologic findings. It may be possible to apply a conversion factor to estimate the ADR from the polyp detection rate (PDR).
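
    The arithmetic behind such an estimate is a single multiplication, estimated ADR ≈ conversion factor × PDR; the factor in the sketch below is a placeholder for illustration only, not the value derived in this study.

```python
# Illustration only: estimated ADR = conversion factor x PDR.
# The 0.64 factor is a placeholder, not the value reported in the study.
def estimated_adr(polyp_detection_rate, conversion_factor=0.64):
    return conversion_factor * polyp_detection_rate

print(estimated_adr(0.40))   # a 40% PDR would map to an estimated ADR of ~26%
```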

  15. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

    The complexity of the work situation during abnormal situations is a major topic in a discussion of safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of event complexity are related to operator performance. One aspect of complexity is the operator's subjective experience of the difficulties of the event. Another related aspect of complexity is subject matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full-scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire. The Complexity Profiling Questionnaire contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity. This questionnaire was used for the development of a measure of subjective complexity. The results from the study indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, were able to predict crew performance quite well. The results further indicated that a measure of subjective complexity could be developed that was related to crew performance. Subjective complexity was found to be related to subjective workload. (author)

  16. Leak rate measurements and detection systems

    International Nuclear Information System (INIS)

    Kupperman, D.; Shack, W.J.; Claytor, T.

    1983-10-01

    A research program is under way to evaluate and develop improved leak detection systems. The primary focus of the work has been on acoustic emission detection of leaks. Leaks from artificial flaws, laboratory-generated intergranular stress corrosion cracks (IGSCCs) and thermal fatigue cracks, and field-induced IGSCCs from reactor piping have been examined. The effects of pressure, temperature, and leak rate and geometry on the acoustic signature are under study. The use of cross-correlation techniques for leak location and pattern recognition and autocorrelation for source discrimination is also being considered.

  17. Analytic 3D image reconstruction using all detected events

    International Nuclear Information System (INIS)

    Kinahan, P.E.; Rogers, J.G.

    1988-11-01

    We present the results of testing a previously presented algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET volume-imaging scanner. By using two iterations of an analytic filter-backprojection method, the algorithm is not constrained by the requirement of a spatially invariant detector point spread function, which limits normal analytic techniques. Removing this constraint allows the incorporation of all detected events, regardless of orientation, which improves the statistical quality of the final reconstructed image.

  18. Spatial-Temporal Event Detection from Geo-Tagged Tweets

    Directory of Open Access Journals (Sweden)

    Yuqian Huang

    2018-04-01

    Full Text Available As one of the most popular social networking services in the world, Twitter allows users to post messages along with their current geographic locations. Such georeferenced or geo-tagged Twitter datasets can benefit location-based services, targeted advertising and geosocial studies. Our study focused on the detection of small-scale spatial-temporal events and their textual content. First, we used Spatial-Temporal Density-Based Spatial Clustering of Applications with Noise (ST-DBSCAN) to spatially-temporally cluster the tweets. Then, the word frequencies were summarized for each cluster and the potential topics were modeled by the Latent Dirichlet Allocation (LDA) algorithm. Using two years of Twitter data from four college cities in the U.S., we were able to determine the spatial-temporal patterns of two known events, two unknown events and one recurring event, which were then further explored and modeled to identify the semantic content of the events. This paper presents our process and recommendations for both finding event-related tweets as well as understanding the spatial-temporal behaviors and semantic natures of the detected events.
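
    A stripped-down version of this two-stage pipeline (space-time clustering of geo-tagged tweets, then per-cluster topic modelling) can be put together with scikit-learn. The sketch below is only an approximation: plain DBSCAN on rescaled (lon, lat, time) coordinates stands in for ST-DBSCAN, and all parameter values and column choices are illustrative.

```python
# Sketch: cluster geo-tagged tweets in space-time, then model each cluster's
# topics with LDA.  Plain DBSCAN on rescaled coordinates approximates
# ST-DBSCAN; all parameters are illustrative.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer


def cluster_tweets(lon, lat, t_hours, eps=0.5, min_samples=10):
    # rescale time so that `eps` means roughly "0.5 degrees or a few hours"
    X = np.column_stack([lon, lat, np.asarray(t_hours) / 6.0])
    return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X)


def cluster_topics(texts, labels, n_topics=3):
    """Fit one LDA model per spatial-temporal cluster (label -1 is noise)."""
    topics = {}
    for c in set(labels) - {-1}:
        docs = [t for t, l in zip(texts, labels) if l == c]
        counts = CountVectorizer(stop_words="english").fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
        topics[c] = lda.fit(counts)
    return topics
```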

  19. Event-Triggered Fault Detection of Nonlinear Networked Systems.

    Science.gov (United States)

    Li, Hongyi; Chen, Ziran; Wu, Ligang; Lam, Hak-Keung; Du, Haiping

    2017-04-01

    This paper investigates the problem of fault detection for nonlinear discrete-time networked systems under an event-triggered scheme. A polynomial fuzzy fault detection filter is designed to generate a residual signal and detect faults in the system. A novel polynomial event-triggered scheme is proposed to determine the transmission of the signal. A fault detection filter is designed to guarantee that the residual system is asymptotically stable and satisfies the desired performance. Polynomially approximated membership functions obtained by Taylor series are employed for the filtering analysis. Furthermore, sufficient conditions are represented in terms of sums of squares (SOSs) and can be solved by SOS tools in the MATLAB environment. A numerical example is provided to demonstrate the effectiveness of the proposed results.

  20. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormality detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
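
    The descriptor-plus-classifier chain described here (orientation histogram of the optical flow, kernel PCA, one-class SVM) can be outlined in a few lines. The sketch below assumes OpenCV and scikit-learn and is not the authors' code; the bin count, kernel and parameters are illustrative.

```python
# Sketch: histogram-of-flow-orientation descriptor per frame, then kernel PCA
# and a one-class SVM trained on normal frames.  Parameters are illustrative.
import cv2
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import OneClassSVM


def hof_descriptor(prev_gray, gray, n_bins=16):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])   # angle in radians
    # orientation histogram weighted by flow magnitude, normalised to sum 1
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)


def descriptors(gray_frames):
    return np.array([hof_descriptor(g0, g1)
                     for g0, g1 in zip(gray_frames, gray_frames[1:])])


def build_detector(normal_gray_frames):
    """Fit on frames assumed normal; pipeline.predict gives -1 for abnormal."""
    model = make_pipeline(KernelPCA(n_components=8, kernel="rbf"),
                          OneClassSVM(nu=0.05))
    return model.fit(descriptors(normal_gray_frames))
```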

  1. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormality detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  2. Event detection for car park entries by video-surveillance

    Science.gov (United States)

    Coquin, Didier; Tailland, Johan; Cintract, Michel

    2007-10-01

    Intelligent surveillance has become an important research issue due to the high cost and low efficiency of human supervisors, and machine intelligence is required to provide a solution for automated event detection. In this paper we describe a real-time system that has been used for detecting car park entries, using an adaptive background learning algorithm and two indicators representing activity and identity to overcome the difficulty of tracking objects.

  3. Context-aware event detection smartphone application for first responders

    Science.gov (United States)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms like Twitter, Facebook, etc. has provided seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding the events happening in their immediate vicinity in a real-time fashion. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is mostly delivered with apt annotations and ground truth that would be missing in traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). Further, when appropriately employed, this real-time data can support the detection of localized events like fires, accidents and shootings as they unfold, and can pinpoint individuals being affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that can provide an augmented reality view of the appropriate detected events in a given geographical location (localized) and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework which deals with data challenges like identifying and aggregating important events, analyzing and correlating the events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field-tested with live user-generated feeds.

  4. TNO at TRECVID 2013: Multimedia Event Detection and Instance Search

    NARCIS (Netherlands)

    Bouma, H.; Azzopardi, G.; Spitters, M.M.; Wit, J.J. de; Versloot, C.A.; Zon, R.W.L. van der; Eendebak, P.T.; Baan, J.; Hove, R.J.M. ten; Eekeren, A.W.M. van; Haar, F.B. ter; Hollander, R.J.M. den; Huis, R.J. van; Boer, M.H.T. de; Antwerpen, G. van; Broekhuijsen, B.J.; Daniele, L.M.; Brandt, P.; Schavemaker, J.G.M.; Kraaij, W.; Schutte, K.

    2013-01-01

    We describe the TNO system and the evaluation results for TRECVID 2013 Multimedia Event Detection (MED) and instance search (INS) tasks. The MED system consists of a bag-of-word (BOW) approach with spatial tiling that uses low-level static and dynamic visual features, an audio feature and high-level

  5. Distributed Event Detection in Wireless Sensor Networks for Disaster Management

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Poel, Mannes; Taghikhaki, Zahra; Havinga, Paul J.M.

    2010-01-01

    Recently, wireless sensor networks (WSNs) have become mature enough to go beyond being simple fine-grained continuous monitoring platforms and become one of the enabling technologies for disaster early-warning systems. Event detection functionality of WSNs can be of great help and importance for

  6. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

    Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann

  7. On Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Suresh, Mahima Agumbe

    2013-05-01

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil and gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures have been proven costly and imprecise, particularly when dealing with large-scale distribution systems. In this article, to the best of our knowledge, for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. We propose the idea of using sensors that move along the edges of the network and detect events (i.e., attacks). To localize the events, sensors detect proximity to beacons, which are devices with known placement in the network. We formulate the problem of minimizing the cost of monitoring infrastructure (i.e., minimizing the number of sensors and beacons deployed) in a predetermined zone of interest, while ensuring a degree of coverage by sensors and a required accuracy in locating events using beacons. We propose algorithms for solving the aforementioned problem and demonstrate their effectiveness with results obtained from a realistic flow network simulator.

  8. Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring

    Directory of Open Access Journals (Sweden)

    Yingchi Mao

    2017-12-01

    Full Text Available Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection of contamination events, and warning to prevent pollution from spreading, is one of the most important issues when pollution occurs. In order to comprehensively reduce the event detection deviation, a spatial–temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) is proposed. The M-STED approach includes three parts. The first part is that M-STED adopts a Rule K algorithm to select backbone nodes as the nodes in the CDS, which forward the sensed data of the multiple water parameters. The second part is to determine the state of each backbone node with back-propagation neural network models and sequential Bayesian analysis at the current timestamp. The third part is to establish a spatial model with Bayesian networks to estimate the state of the backbones at the next timestamp and trace the "outlier" node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with S-STED, an event detection algorithm using a single water parameter. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability.

  9. Automatic Detection and Classification of Audio Events for Road Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Noor Almaadeed

    2018-06-01

    Full Text Available This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, several visual surveillance systems have been proposed for road monitoring to detect accidents, with the aim of improving safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate when compared against methods that use individual temporal and spectral features.

  10. Towards Detecting the Crowd Involved in Social Events

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-10-01

    Full Text Available Knowing how people interact with urban environments is fundamental for a variety of fields, ranging from transportation to social science. Despite the fact that human mobility patterns have been a major topic of study in recent years, a challenge in understanding large-scale human behavior when a certain event occurs remains, due to a lack of either relevant data or suitable approaches. A psychological crowd refers to a group of people who are usually located at different places and show different behaviors, but who are very sensitively driven to take the same act (gather together) by a certain event, which has been theoretically studied by social psychologists since the 19th century. This study aims to propose a computational approach using a machine learning method to model psychological crowds, contributing to the better understanding of human activity patterns under events. Psychological features and the mental unity of the crowd are computed to detect the involved individuals. A national event happening across the USA in April 2015 is analyzed using geotagged tweets as a case study to test our approach. The result shows that 81% of individuals in the crowd can be successfully detected. Through investigating the geospatial pattern of the involved users, not only can the event-related users be identified, but those users unobserved before the event can also be uncovered. The proposed approach can effectively represent the psychological features and measure the mental unity of the psychological crowd, which sheds light on the study of large-scale psychological crowds and provides an innovative way of understanding human behavior under events.

  11. Neutron detector for detecting rare events of spontaneous fission

    International Nuclear Information System (INIS)

    Ter-Akop'yan, G.M.; Popeko, A.G.; Sokol, E.A.; Chelnokov, L.P.; Smirnov, V.I.; Gorshkov, V.A.

    1981-01-01

    The neutron detector for registering rare events of spontaneous fission by detecting multiple neutron emission is described. The detector consists of a plexiglass block of 550 mm diameter and 700 mm height, in the centre of which there is a 160 mm diameter through-channel for the sample under investigation. The detector comprises 56 ³He-filled counters (up to 7 atm pressure) with a 1% CO₂ addition. The counters are 500 mm long and 32 mm in diameter. Fission events are selected by an electronic system which allows determining the number of detected neutrons, the numbers of triggered counters, the signal amplitude and the time of each detected fission event. A block diagram of the neutron detector electronics is presented and its operating principle is considered. For protection against cosmic radiation, the detector is surrounded by a system of plastic scintillators and placed behind a 6 m thick concrete shield. The results of measurements of the background radiation are given. It has been found that the single-neutron background is about 150 counts per hour and that the single-neutron detection efficiency equals 0.483 ± 0.005 for a 10 l sensitive detector volume. By means of the described detector, the parameters of the multiplicity distribution of prompt neutrons from the spontaneous fission of ²⁵⁶Fm are measured. The average multiplicity equals 3.59 ± 0.06, with a dispersion of 2.30 ± 0.65.

  12. Detection and interpretation of seismoacoustic events at German infrasound stations

    Science.gov (United States)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in the seismometer recordings as well. Different natural and artificial sources, like meteoroids as well as industrial and mining activity, generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events, distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into the ground and into the air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by the German infrasound stations, as well as some conclusions on the benefit of a combined seismoacoustic analysis, is presented within this study.

  13. Rate-limiting events in hyperthermic cell killing

    International Nuclear Information System (INIS)

    Landry, J.; Marceau, N.

    1978-01-01

    The inactivation rate of HeLa cells for temperatures ranging from 41 to 55 °C and treatment durations varying from 2 to 300 min was analyzed in thermodynamic terms by considering the dependence of the cell free energy (ΔG⁺) on temperature. Within this temperature range the loss of proliferative capacity exhibits a complex temperature dependence which is characterized by entropy and enthalpy values that gradually decrease as temperature increases. This complex process of heat-induced cell killing was postulated to be the result of a series of reactions, each of them being alternatively rate limiting within a certain temperature range. From this kinetic scheme a mathematical model was derived and, in the case of HeLa cells, the use of a least-squares parameter search procedure (as applied to the derived survival regression function) demonstrated that three such sequential reactions were sufficient to explain all experimental data points obtained within the 41 to 55 °C range. The proposed model was also shown to be adequate for explaining survival data of HeLa cells exposed to nanosecond heat pulses of infrared laser energy. Considerations of thermodynamic properties of known biochemical reactions suggest plausible rate-limiting events in hyperthermic cell killing.

  14. Polygraph lie detection on real events in a laboratory setting.

    Science.gov (United States)

    Bradley, M T; Cullen, M C

    1993-06-01

    This laboratory study dealt with real-life intense emotional events. Subjects generated embarrassing stories from their experience, then submitted to polygraph testing and, by lying, denied their stories and, by telling the truth, denied a randomly assigned story. Money was given as an incentive to be judged innocent on each story. An interrogator, blind to the stories, used Control Question Tests and found subjects more deceptive when lying than when truthful. Stories interacted with order such that lying on the second story was more easily detected than lying on the first. Embarrassing stories provide an alternative to the use of mock crimes to study lie detection in the laboratory.

  16. Complexity of deciding detectability in discrete event systems

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš

    2018-01-01

    Roč. 93, July (2018), s. 257-261 ISSN 0005-1098 Institutional support: RVO:67985840 Keywords : discrete event systems * finite automata * detectability Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 5.451, year: 2016 https://www.sciencedirect.com/science/article/pii/S0005109818301730

  17. Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China

    Science.gov (United States)

    Sheng, M.; Chu, R.; Wei, Z.

    2016-12-01

    On landslides, slope movement and the fracturing of the rock mass often lead to microearthquakes, which are recorded as weak signals on seismographs. The temporal and spatial distribution characteristics of regional instability, as well as the impact of external factors on the unstable regions, can be understood and analyzed by monitoring these microseismic events. The microseismic method can provide information from inside the landslide, which can be used to supplement geodetic methods for monitoring the movement of the landslide surface. Compared to drilling on the landslide, the microseismic method is more economical and safer. The Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming after the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months at intervals of 200-500 meters. First, we used regional earthquakes for the time correction of the seismometers, to eliminate the influence of inaccurate GPS clocks and the subsurface structure beneath the stations. Because of the low velocity of the loose medium, the travel time difference of microseismic events across the landslide reaches up to 5 s. Based on travel times and waveform characteristics, we found many microseismic events and converted them into envelopes as templates; we then used a sliding-window cross-correlation technique based on the waveform envelope to detect the other microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on their locations, we found that most of them were located at the front of the landslide, while the others were located at the back end. The bottom and top of the landslide accumulated considerable energy and deformed strongly, so the radiated waves could be recorded by all stations. Moreover, the bottom, with more events, appears very active. In addition, many smaller events occurred in the middle part of the landslide, where released

  18. Towards Optimal Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Agumbe Suresh, Mahima

    2012-01-03

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil & gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures, have been proven costly and imprecise, especially when dealing with large scale distribution systems. In this paper, to the best of our knowledge for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. Sensor nodes move along the edges of the network and detect events (i.e., attacks) and proximity to beacon nodes with known placement in the network. We formulate the problem of minimizing the cost of monitoring infrastructure (i.e., minimizing the number of sensor and beacon nodes deployed), while ensuring a degree of sensing coverage in a zone of interest and a required accuracy in locating events. We propose algorithms for solving these problems and demonstrate their effectiveness with results obtained from a high fidelity simulator.

  19. Approaches to proton single-event rate calculations

    International Nuclear Information System (INIS)

    Petersen, E.L.

    1996-01-01

    This article discusses the fundamentals of proton-induced single-event upsets and of the various methods that have been developed to calculate upset rates. Two types of approaches are used based on nuclear-reaction analysis. Several aspects can be analyzed using analytic methods, but a complete description is not available. The paper presents an analytic description for the component due to elastic-scattering recoils. There have been a number of studies made using Monte Carlo methods. These can completely describe the reaction processes, including the effect of nuclear reactions occurring outside the device-sensitive volume. They have not included the elastic-scattering processes. The article describes the semiempirical approaches that are most widely used. The quality of previous upset predictions relative to space observations is discussed and leads to comments about the desired quality of future predictions. Brief sections treat the possible testing limitation due to total ionizing dose effects, the relationship of proton and heavy-ion upsets, upsets due to direct proton ionization, and relative proton and cosmic-ray upset rates

  20. Drowsiness detection using heart rate variability.

    Science.gov (United States)

    Vicente, José; Laguna, Pablo; Bartra, Ariadna; Bailón, Raquel

    2016-06-01

    It is estimated that 10-30 % of road fatalities are related to drowsy driving. Driver drowsiness detection based on biological and vehicle signals is being studied in preventive car safety. Autonomic nervous system activity, which can be measured noninvasively from the heart rate variability (HRV) signal obtained from the surface electrocardiogram, presents alterations during stress, extreme fatigue and drowsiness episodes. We hypothesized that these alterations manifest in the HRV signal and thus could be used to detect driver drowsiness. We analyzed three driving databases in which drivers presented different sleep-deprivation levels, and in which each driving minute was annotated as drowsy or awake. We developed two different drowsiness detectors based on HRV. While the drowsiness episode detector assessed each minute of driving as "awake" or "drowsy" with seven HRV-derived features (positive predictive value 0.96, sensitivity 0.59, specificity 0.98 on 3475 min of driving), the sleep-deprivation detector discerned whether a driver was suitable for driving or not, at driving onset, as a function of his sleep-deprivation state. The sleep-deprivation state was estimated from the first three minutes of driving using only one HRV feature (positive predictive value 0.80, sensitivity 0.62, specificity 0.88 on 30 drivers). Incorporating drowsiness assessment based on the HRV signal may add significant improvements to existing car safety systems.
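
    The kind of HRV-derived features such detectors are built on can be computed directly from the beat-to-beat (RR) intervals. The sketch below shows a few standard time-domain features only; it does not reproduce the seven features or thresholds used in this study, and the sample RR values are made up.

```python
# A few standard time-domain HRV features from RR intervals in milliseconds.
# Illustrative only; not the feature set or thresholds of the study above.
import numpy as np

def hrv_time_domain(rr_ms):
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_hr_bpm": 60_000.0 / rr.mean(),      # mean heart rate
        "sdnn_ms": rr.std(ddof=1),                # overall variability
        "rmssd_ms": np.sqrt(np.mean(diff ** 2)),  # short-term variability
        "pnn50": np.mean(np.abs(diff) > 50.0),    # share of large beat-to-beat changes
    }

print(hrv_time_domain([812, 798, 840, 830, 815, 790, 805, 825]))  # made-up beats
```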

  1. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t₀. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, providing the photometric errors are small enough. We calculate the expected fraction of ML events where the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t₀ ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)

  2. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes, such as the heart rate generator, can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data-adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further, we conclude that the presence of respiratory coupling can only be detected under low-noise conditions and if a data-adaptive respiratory band is used.
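
    The Lomb approach mentioned above handles the uneven sampling directly, without interpolation. The sketch below uses SciPy's Lomb-Scargle periodogram on a synthetic beat series whose RR intervals are modulated at a respiration-like frequency; the beat model, rates and frequency band are illustrative, not taken from the recordings analysed in the paper.

```python
# Lomb-Scargle periodogram of an unevenly sampled RR-interval series (SciPy).
# Synthetic beats: ~2 Hz heart rate with RR modulated at ~0.4 Hz ("respiration"),
# well below the Nyquist limit of half the average heart rate (~1 Hz).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

t, beats = 0.0, []
while t < 120.0:                                   # two minutes of beats
    rr = 0.5 + 0.02 * np.sin(2 * np.pi * 0.4 * t) + 0.005 * rng.normal()
    t += rr
    beats.append(t)
beats = np.array(beats)

rr_series = np.diff(beats)                         # unevenly "sampled" signal...
sample_times = beats[1:]                           # ...at the beat occurrence times

freqs_hz = np.linspace(0.05, 1.0, 400)
power = lombscargle(sample_times, rr_series - rr_series.mean(),
                    2 * np.pi * freqs_hz)          # lombscargle expects rad/s
print("peak near (Hz):", freqs_hz[np.argmax(power)])   # should be close to 0.4
```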

  3. Machine learning for the automatic detection of anomalous events

    Science.gov (United States)

    Fisher, Wendy D.

    In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insights to our real-world problems, and evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, results consistently show a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising. The best performance is achieved with the Haar wavelets. We achieve up to 97
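
    The multivariate Gaussian model mentioned for Chapter 3 reduces to fitting a mean and covariance on data assumed normal and flagging samples whose density falls below a threshold. The sketch below (NumPy/SciPy, synthetic data, arbitrary epsilon) illustrates that idea only; it is not the dissertation's code and omits the wavelet denoising step.

```python
# Multivariate-Gaussian anomaly detection sketch: fit mean/covariance on data
# assumed normal, flag samples whose density is below epsilon.  Synthetic data
# and threshold are illustrative; wavelet denoising is omitted.
import numpy as np
from scipy.stats import multivariate_normal

def fit_gaussian(X_normal):
    mu = X_normal.mean(axis=0)
    cov = np.cov(X_normal, rowvar=False)
    return multivariate_normal(mean=mu, cov=cov)

def flag_anomalies(model, X, epsilon=1e-4):
    return model.pdf(X) < epsilon                 # True where a sample looks anomalous

rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 3))                     # "normal" sensor features
X_test = np.vstack([rng.normal(size=(5, 3)),             # more normal samples
                    rng.normal(loc=6.0, size=(3, 3))])   # injected anomalies
model = fit_gaussian(X_train)
print(flag_anomalies(model, X_test))                     # last three should be True
```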

  4. Detection of red tide events in the Ariake Sound, Japan

    Science.gov (United States)

    Ishizaka, Joji

    2003-05-01

    High-resolution SeaWiFS data were used to detect a red tide event that occurred in the Ariake Sound, Japan, in the winter of 2000 to 2001. The area is a small embayment surrounded by tidal flats, and it is known as one of the most productive coastal areas of Japan. The red tide event damaged seaweed (Nori) culture, and its relation to the reclamation at Isahaya Bay in the Sound has been discussed. SeaWiFS chlorophyll data showed that the red tide started in early December 2000 from Isahaya Bay, although a direct relationship to the reclamation was not clear. The red tide persisted to the end of February. Monthly averages of SeaWiFS data from May 1998 to December 2001 indicated that chlorophyll increased twice a year, in early summer and in fall after the rain. The red tide event was part of the fall bloom, which started later and continued longer than in other years. Ocean color is useful for detecting red tides; however, improved algorithms are required to accurately estimate chlorophyll in highly turbid water and to discriminate toxic flagellates.

  5. LIMITS ON THE EVENT RATES OF FAST RADIO TRANSIENTS FROM THE V-FASTR EXPERIMENT

    International Nuclear Information System (INIS)

    Wayth, Randall B.; Tingay, Steven J.; Deller, Adam T.; Brisken, Walter F.; Thompson, David R.; Wagstaff, Kiri L.; Majid, Walid A.

    2012-01-01

    We present the first results from the V-FASTR experiment, a commensal search for fast transient radio bursts using the Very Long Baseline Array (VLBA). V-FASTR is unique in that the widely spaced VLBA antennas provide a discriminant against non-astronomical signals and a mechanism for the localization and identification of events that is not possible with single dishes or short baseline interferometers. Thus far, V-FASTR has accumulated over 1300 hr of observation time with the VLBA, between 90 cm and 3 mm wavelength (327 MHz-86 GHz), providing the first limits on fast transient event rates at high radio frequencies (>1.4 GHz). V-FASTR has blindly detected bright individual pulses from seven known pulsars but has not detected any single-pulse events that would indicate high-redshift impulsive bursts of radio emission. At 1.4 GHz, V-FASTR puts limits on fast transient event rates comparable with the PALFA survey at the Arecibo telescope, but generally at lower sensitivities, and comparable to the 'fly's eye' survey at the Allen Telescope Array, but with less sky coverage. We also illustrate the likely performance of the Phase 1 SKA dish array for an incoherent fast transient search fashioned on V-FASTR.

  6. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Directory of Open Access Journals (Sweden)

    Laurissa Tokarchuk

    Full Text Available In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However, events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The

  7. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Science.gov (United States)

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The novel combination of

  8. Unsupervised behaviour-specific dictionary learning for abnormal event detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Liu, Weifeng; Olsen, Søren Ingvor

    2015-01-01

    the training data is only a small proportion of the surveillance data. Therefore, we propose behavior-specific dictionaries (BSD) through unsupervised learning, pursuing atoms from the same type of behavior to represent one behavior dictionary. To further improve the dictionary by introducing information from...... potential infrequent normal patterns, we refine the dictionary by searching ‘missed atoms’ that have compact coefficients. Experimental results show that our BSD algorithm outperforms state-of-the-art dictionaries in abnormal event detection on the public UCSD dataset. Moreover, BSD has less false alarms...

  9. A novel seizure detection algorithm informed by hidden Markov model event states

    Science.gov (United States)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the onset of unequivocal epileptic activity (unequivocal epileptic onset, UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
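
    A much-simplified stand-in for this state-based idea can be put together with hmmlearn: learn a handful of Gaussian-emission states from windowed features and raise an alarm whenever the decoded state is one previously associated with seizure onset. The sketch below is only a toy, with a plain GaussianHMM in place of the Bayesian nonparametric switching process and an entirely hypothetical feature matrix.

```python
# Toy stand-in for the state-based detector: a GaussianHMM (hmmlearn) learns
# dynamic states from per-window features; we alarm on the state dominating a
# hypothetical injected high-power burst.  Everything here is illustrative.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, size=(500, 4))     # hypothetical baseline windows
burst = rng.normal(4.0, 1.0, size=(50, 4))         # hypothetical "onset" windows
X = np.vstack([baseline, burst, baseline])

model = GaussianHMM(n_components=3, covariance_type="diag",
                    n_iter=50, random_state=0).fit(X)
states = model.predict(X)

onset_state = np.bincount(states[500:550]).argmax()    # state dominating the burst
alarms = np.flatnonzero(states == onset_state)
print("first alarm at window:", alarms[0] if alarms.size else None)
```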

  10. How does structured sparsity work in abnormal event detection?

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    … the training, which is due to the fact that abnormal videos are limited or even unavailable in advance in most video surveillance applications. As a result, there could be only one label in the training data, which hampers supervised learning; 2) even though there are multiple types of normal behaviors, how … many normal patterns lie in the whole surveillance data is still unknown. This is because there is a huge amount of video surveillance data and only a small proportion is used in algorithm learning; consequently, the normal patterns in the training data could be incomplete. As a result, any sparse … structure learned from the training data could have a high bias and ruin the precision of abnormal event detection. Therefore, in this paper we propose an algorithm to solve the abnormality detection problem by sparse representation, in which local structured sparsity is preserved in the coefficients. To better …

  11. Description and detection of burst events in turbulent flows

    Science.gov (United States)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.

  12. Collaborative Event-Driven Coverage and Rate Allocation for Event Miss-Ratio Assurances in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Ozgur Sanli H

    2010-01-01

    Full Text Available Wireless sensor networks are often required to provide event miss-ratio assurance for a given event type. To meet such assurances along with minimum energy consumption, this paper shows how a node's activation and rate assignment is dependent on its distance to event sources, and proposes a practical coverage and rate allocation (CORA protocol to exploit this dependency in realistic environments. Both uniform event distribution and nonuniform event distribution are considered and the notion of ideal correlation distance around a clusterhead is introduced for on-duty node selection. In correlation distance guided CORA, rate assignment assists coverage scheduling by determining which nodes should be activated for minimizing data redundancy in transmission. Coverage scheduling assists rate assignment by controlling the amount of overlap among sensing regions of neighboring nodes, thereby providing sufficient data correlation for rate assignment. Extensive simulation results show that CORA meets the required event miss-ratios in realistic environments. CORA's joint coverage scheduling and rate allocation reduce the total energy expenditure by 85%, average battery energy consumption by 25%, and the overhead of source coding up to 90% as compared to existing rate allocation techniques.

  13. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
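
    As a rough illustration of storing a compact wavelet representation of a parameter trend, the sketch below compresses a synthetic heart-rate trend with a Haar wavelet using PyWavelets; the wavelet choice, decomposition level, and data are assumptions, not details taken from the paper.

```python
# Hypothetical compact wavelet representation of a hemodynamic trend.
import numpy as np
import pywt

trend = np.concatenate([np.full(64, 80.0), np.full(64, 55.0)])  # abrupt drop in heart rate
coeffs = pywt.wavedec(trend, 'haar', level=4)
approx, details = coeffs[0], coeffs[1:]

# The coarse approximation is a compact descriptor suitable for storage in a
# relational table; large-magnitude detail coefficients mark candidate events.
print("coarse approximation length:", len(approx))
print("max |detail| per level:", [float(np.max(np.abs(d))) for d in details])
```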

  14. Detecting and characterising ramp events in wind power time series

    International Nuclear Information System (INIS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-01-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has been typically addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
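
    The sketch below gives one possible reading of a continuous ramp-intensity index built from power-output gradients evaluated over several time scales; the scales, weighting, and normalization are assumptions and do not reproduce the authors' ramp function exactly.

```python
# Hypothetical ramp-intensity index: signed power changes over several lags are averaged.
import numpy as np

def ramp_index(power, scales=(1, 2, 4, 8)):
    p = np.asarray(power, dtype=float)
    idx = np.zeros_like(p)
    for s in scales:
        diff = np.zeros_like(p)
        diff[s:] = p[s:] - p[:-s]          # power change over a lag of s steps
        idx += diff / len(scales)
    return idx                              # positive: ramp-up, negative: ramp-down

power = np.concatenate([np.full(20, 10.0), np.linspace(10, 60, 10), np.full(20, 60.0)])
print(np.round(ramp_index(power), 1))       # index rises during the ramp, is ~0 elsewhere
```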

  15. High rate of adverse events following circumcision of young male ...

    African Journals Online (AJOL)

    (94) refusing circumcision by the TK technique; 34 men were randomised to the FG group and 35 to the TK group, and 32 and 24 patients were circumcised by the FG and TK methods respectively, of whom 29 and 19 respectively attended the post-circumcision visit. All 12 adverse event sheets corresponded to the TK group ...

  16. Relationship between medication event rates and the Leapfrog computerized physician order entry evaluation tool.

    Science.gov (United States)

    Leung, Alexander A; Keohane, Carol; Lipsitz, Stuart; Zimlichman, Eyal; Amato, Mary; Simon, Steven R; Coffey, Michael; Kaufman, Nathan; Cadet, Bismarck; Schiff, Gordon; Seger, Diane L; Bates, David W

    2013-06-01

    The Leapfrog CPOE evaluation tool has been promoted as a means of monitoring computerized physician order entry (CPOE). We sought to determine the relationship between Leapfrog scores and the rates of preventable adverse drug events (ADE) and potential ADE. A cross-sectional study of 1000 adult admissions in five community hospitals from October 1, 2008 to September 30, 2010 was performed. Observed rates of preventable ADE and potential ADE were compared with scores reported by the Leapfrog CPOE evaluation tool. The primary outcome was the rate of preventable ADE and the secondary outcome was the composite rate of preventable ADE and potential ADE. Leapfrog performance scores were highly related to the primary outcome. A 43% relative reduction in the rate of preventable ADE was predicted for every 5% increase in Leapfrog scores (rate ratio 0.57; 95% CI 0.37 to 0.88). In absolute terms, four fewer preventable ADE per 100 admissions were predicted for every 5% increase in overall Leapfrog scores (rate difference -4.2; 95% CI -7.4 to -1.1). A statistically significant relationship between Leapfrog scores and the secondary outcome, however, was not detected. Our findings support the use of the Leapfrog tool as a means of evaluating and monitoring CPOE performance after implementation, as addressed by current certification standards. Scores from the Leapfrog CPOE evaluation tool closely relate to actual rates of preventable ADE. Leapfrog testing may alert providers to potential vulnerabilities and highlight areas for further improvement.
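
    The quoted rate ratio lends itself to a small worked example; extrapolating it to other score increments assumes a log-linear (Poisson-style) relationship, which is an assumption on top of the abstract's single reported figure.

```python
# Worked arithmetic based only on the rate ratio quoted in the abstract
# (0.57 per 5% increase in Leapfrog score); the log-linear extrapolation is an assumption.
rate_ratio_per_5pct = 0.57
for delta in (5, 10, 15):
    rr = rate_ratio_per_5pct ** (delta / 5)
    print(f"{delta}% score increase -> predicted preventable-ADE rate ratio {rr:.2f} "
          f"({(1 - rr) * 100:.0f}% relative reduction)")
```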

  17. Motion Pattern Extraction and Event Detection for Automatic Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Benabbas Yassine

    2011-01-01

    Full Text Available Efficient analysis of human behavior in video surveillance scenes is a very challenging problem. Most traditional approaches fail when applied in real conditions, with contexts involving large numbers of people, appearance ambiguity, and occlusion. In this work, we propose to deal with this problem by modeling the global motion information obtained from optical flow vectors. The obtained direction and magnitude models learn the dominant motion orientations and magnitudes at each spatial location of the scene and are used to detect the major motion patterns. The applied region-based segmentation algorithm groups local blocks that share the same motion direction and speed and allows a subregion of the scene to appear in different patterns. The second part of the approach consists of detecting events related to groups of people, namely merge, split, walk, run, local dispersion, and evacuation, by analyzing the instantaneous optical flow vectors and comparing them with the learned models. The approach is validated and experimented on standard datasets of the computer vision community. The qualitative and quantitative results are discussed.

  18. Balloon-Borne Infrasound Detection of Energetic Bolide Events

    Science.gov (United States)

    Young, Eliot F.; Ballard, Courtney; Klein, Viliam; Bowman, Daniel; Boslough, Mark

    2016-10-01

    Infrasound is usually defined as sound waves below 20 Hz, the nominal limit of human hearing. Infrasound waves propagate over vast distances through the Earth's atmosphere: the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization) has 48 installed infrasound-sensing stations around the world to detect nuclear detonations and other disturbances. In February 2013, several CTBTO infrasound stations detected infrasound signals from a large bolide that exploded over Chelyabinsk, Russia. Some stations recorded signals that had circumnavigated the Earth, over a day after the original event. The goal of this project is to improve upon the sensitivity of the CTBTO network by putting microphones on small, long-duration super-pressure balloons, with the overarching goal of studying the small end of the NEO population by using the Earth's atmosphere as a witness plate. A balloon-borne infrasound sensor is expected to have two advantages over ground-based stations: a lack of wind noise and a concentration of infrasound energy in the "stratospheric duct" between roughly 5 - 50 km altitude. To test these advantages, we have built a small balloon payload with five calibrated microphones. We plan to fly this payload on a NASA high-altitude balloon from Ft Sumner, NM in August 2016. We have arranged for three large explosions to take place in Socorro, NM while the balloon is aloft to assess the sensitivity of balloon-borne vs. ground-based infrasound sensors. We will report on the results from this test flight and the prospects for detecting/characterizing small bolides in the stratosphere.

  19. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Science.gov (United States)

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  20. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Directory of Open Access Journals (Sweden)

    Vernon Lawhern

    Full Text Available Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  1. Single Versus Multiple Events Error Potential Detection in a BCI-Controlled Car Game With Continuous and Discrete Feedback.

    Science.gov (United States)

    Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R

    2016-03-01

    This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
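
    A minimal sketch, not the authors' code, of the multiple-events idea: per-event error-potential probabilities within one motor-imagery trial are averaged and thresholded, so weak single-event detection rates can still yield a reliable trial-level decision. The threshold value and toy scores are assumptions.

```python
# Hypothetical multiple-events (ME) combination of single-event ErrP classifier outputs.
import numpy as np

def trial_is_erroneous(single_event_scores, threshold=0.5):
    """single_event_scores: per-event probabilities that an error potential occurred."""
    return float(np.mean(single_event_scores)) > threshold

print(trial_is_erroneous([0.4, 0.7, 0.65]))   # True: evidence accumulates across events
print(trial_is_erroneous([0.4, 0.3, 0.45]))   # False: trial accepted as correct
```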

  2. Gender Differences in Rating Stressful Events, Depression, and Depressive Cognition.

    Science.gov (United States)

    Sowa, Claudia J.; Lustman, Patrick J.

    1984-01-01

    Administered the Life Stress Questionnaire, the Beck Depression Inventory, and the Automatic Thought Questionnaire to 140 students. Results showed significant sex differences. Men reported more stressful life change, but women rated the impact of stressors more severely and had higher depression. Men exhibited greater distortions in cognitive…

  3. Nuclear and Radiological Event Scale Turns 20. INES Helps Authorities Rate Events and Communicate Their Significance

    International Nuclear Information System (INIS)

    Verlini, Giovanni

    2011-01-01

    Originally developed in the 1990s jointly by IAEA and Nuclear Energy Agency of the Organization for Economic Co-operation and Development (OECD/NEA) and Member States experts, INES was last revised in 2008 to become a more versatile and informative tool. INES is now designed to address events associated with the transport, storage and use of radioactive material and radiation sources, whether they occur at a nuclear installation or not.

  4. High frame-rate neutron radiography of dynamic events

    International Nuclear Information System (INIS)

    Bossi, R.H.; Robinson, A.H.; Barton, J.P.

    1981-01-01

    A system has been developed to perform neutron radiographic analysis of dynamic events having a duration of several milliseconds. The system has been operated in the range of 2000 to 10,000 frames/second. Synchronization has provided high-speed-motion neutron radiographs for evaluation of the firing cycle of 7.62 mm munition rounds within a steel rifle barrel. The system has also been used to demonstrate the ability to produce neutron radiographic movies of two-phase flow. The equipment uses the Oregon State University TRIGA reactor capable of pulsing to 3000 MW peak power, a neutron beam collimator, a scintillator neutron conversion screen coupled to an image intensifier, and a 16 mm high speed movie camera. The peak neutron flux incident at the object position is approximately 4 × 10¹¹ n/cm²·s with a pulse, full width at half maximum, of 9 ms. Special studies have been performed on the scintillator conversion screens and on the effects of statistical limitations on the image quality. Modulation transfer function analysis has been used to assist in the evaluation of the system performance.

  5. High frame-rate neutron radiography of dynamic events

    International Nuclear Information System (INIS)

    Bossi, R.H.; Robinson, A.H.; Barton, J.P.

    1983-01-01

    A system has been developed to perform neutron radiographic analysis of dynamic events having a duration of several milliseconds. The system has been operated in the range of 2000 to 10,000 frames/second. Synchronization has provided high-speed-motion neutron radiographs for evaluation of the firing cycle of 7.62 mm munition rounds within a steel rifle barrel. The system has also been used to demonstrate the ability to produce neutron radiographic movies of two-phase flow. The equipment uses the Oregon State University TRIGA reactor capable of pulsing to 3000 MW peak power, a neutron beam collimator, a scintillator neutron conversion screen coupled to an image intensifier, and a 16 mm high speed movie camera. The peak neutron flux incident at the object position is approximately 4 × 10¹¹ n/cm²·s with a pulse, full width at half maximum, of 9 ms. Special studies have been performed on the scintillator conversion screens and on the effects of statistical limitations on the image quality. Modulation transfer function analysis has been used to assist in the evaluation of the system performance. (Auth.)

  6. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where the detailed data of the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. In power load event detection, the paper presents a new transient detection algorithm. By turn-on and turn-off transient waveform analysis, it can accurately detect the edge point when a device is switched on or switched off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that the incorporation of the new edge detection and turn-on and turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method has achieved a recognition rate of more than ninety percent.
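
    A minimal sketch of the edge-detection step only, under an assumed step threshold; the paper's edge-symbol analysis and SVM classification stages are not reproduced here.

```python
# Hypothetical edge-based switch-event detector for an aggregate power trace.
import numpy as np

def detect_switch_events(power, min_step=50.0):
    """Return (index, step) pairs where aggregate power jumps by >= min_step watts."""
    p = np.asarray(power, dtype=float)
    steps = np.diff(p)
    return [(i + 1, float(s)) for i, s in enumerate(steps) if abs(s) >= min_step]

aggregate = [100, 102, 101, 350, 352, 351, 99, 100]   # an appliance turns on, then off
print(detect_switch_events(aggregate))                 # [(3, 249.0), (6, -252.0)]
```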

  7. Optimal detection of burst events in gravitational wave interferometric observatories

    International Nuclear Information System (INIS)

    Vicere, Andrea

    2002-01-01

    We consider the problem of detecting a burst signal of unknown shape in the data from gravitational wave interferometric detectors. We introduce a statistic which generalizes the excess power statistic proposed first by Flanagan and Hughes, and then extended by Anderson et al. to the multiple detector case. The statistic that we propose is shown to be optimal for an arbitrary noise spectral characteristic, under the two hypotheses that the noise is Gaussian, albeit colored, and that the prior for the signal is uniform. The statistic derivation is based on the assumption that a signal affects only N parallel samples in the data stream, but that no other information is a priori available, and that the value of the signal at each sample can be arbitrary. This is the main difference from previous works, where different assumptions were made, such as a signal distribution uniform with respect to the metric induced by the (inverse) noise correlation matrix. The two choices are equivalent if the noise is white, and in that limit the two statistics do indeed coincide. In the general case, we believe that the statistic we propose may be more appropriate, because it does not reflect the characteristics of the noise affecting the detector on the supposed distribution of the gravitational wave signal. Moreover, we show that the proposed statistic can be easily implemented in its exact form, combining standard time-series analysis tools which can be efficiently implemented. We generalize this version of an excess power statistic to the multiple detector case, considering first a noise uncorrelated among the different instruments, and then including the effect of correlated noise. We discuss exact and approximate forms of the statistic; the choice depends on the characteristics of the noise and on the assumed length of the burst event. As an example, we show the sensitivity of the network of interferometers to a δ-function burst
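
    For orientation, the excess power statistic that the paper generalizes can be written in a standard textbook form as the noise-weighted energy in a block of N samples; the notation below is generic and is not copied from the article.

```latex
% A standard form of the excess power statistic (notation is an assumption):
% the noise-weighted energy in a block of N frequency samples is compared
% against a threshold set by the noise-only distribution.
\mathcal{E} \;=\; \sum_{k=1}^{N} \frac{|\tilde{x}_k|^2}{S_n(f_k)},
\qquad \text{burst candidate if } \mathcal{E} > \mathcal{E}_{\mathrm{threshold}} .
```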

  8. Latency and mode of error detection as reflected in Swedish licensee event reports

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola; Salo, Ilkka [Stockholm Univ., (Sweden). Dept. of Psychology

    2002-03-01

    Licensee event reports (LERs) from an industry provide important information feedback about safety to the industry itself, the regulators and to the public. LERs from four nuclear power reactors were analyzed to find out about detection times, mode of detection and qualitative differences in reports from different reactors. The reliability of the coding was satisfactory and measured as the covariance between the ratings from two independent judges. The results showed differences in detection time across the reactors. On the average about ten percent of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On the average 40 percent of the errors were detected in regular tests and 40 per cent through alarms. Operators found about 10 per cent of the errors through noticing something abnormal in the plant. The remaining errors were detected in various other ways. There were qualitative differences between the LERs from the different reactors reflecting the different conditions in the plants. The number of reports differed by a magnitude 1:2 between the different plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events to be able to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable in order to maintain safety of an industry and that the differences between plants found in the analyses of this study indicate how error reports can be used to initiate further investigations for improved safety.

  9. Latency and mode of error detection as reflected in Swedish licensee event reports

    International Nuclear Information System (INIS)

    Svenson, Ola; Salo, Ilkka

    2002-03-01

    Licensee event reports (LERs) from an industry provide important information feedback about safety to the industry itself, the regulators and to the public. LERs from four nuclear power reactors were analyzed to find out about detection times, mode of detection and qualitative differences in reports from different reactors. The reliability of the coding was satisfactory and measured as the covariance between the ratings from two independent judges. The results showed differences in detection time across the reactors. On the average about ten percent of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On the average 40 percent of the errors were detected in regular tests and 40 per cent through alarms. Operators found about 10 per cent of the errors through noticing something abnormal in the plant. The remaining errors were detected in various other ways. There were qualitative differences between the LERs from the different reactors reflecting the different conditions in the plants. The number of reports differed by a magnitude 1:2 between the different plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events to be able to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable in order to maintain safety of an industry and that the differences between plants found in the analyses of this study indicate how error reports can be used to initiate further investigations for improved safety

  10. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • We present a description of EDICAM's operation principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent event driven processing to be able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. This camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example ASDEX Upgrade, KSTAR and COMPASS, with the first version of the firmware. A new firmware and software package is under development. It allows predefined events to be detected in real time, and therefore the camera is capable of changing its own operation or giving warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) with a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of the tests of the different software elements.

  11. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    Science.gov (United States)

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs were approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, has been introduced by Google to support large-scale data intensive applications. In this study, we proposed a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and tested it in mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrated that the MapReduce programming model could improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
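
    A minimal map/reduce-style sketch of PRR computation over spontaneous reports; the record layout, helper names, and toy data are assumptions, not the authors' code, and the real work would run the map and reduce phases on a distributed framework rather than in a single process.

```python
# Hypothetical map/reduce decomposition of PRR signal detection.
from collections import Counter

def map_report(report):
    """report = (set_of_drugs, set_of_events); emit count keys for the 2x2 table."""
    drugs, events = report
    yield ("N",), 1
    for d in drugs:
        yield ("drug", d), 1
        for e in events:
            yield ("pair", d, e), 1
    for e in events:
        yield ("event", e), 1

def reduce_counts(keyed):
    counts = Counter()
    for key, value in keyed:
        counts[key] += value
    return counts

def prr(counts, drug, event):
    a = counts[("pair", drug, event)]                      # reports with drug & event
    drug_n, event_n, n = counts[("drug", drug)], counts[("event", event)], counts[("N",)]
    c = event_n - a                                        # other drugs & event
    return (a / drug_n) / (c / (n - drug_n))

reports = [({"X"}, {"rash"}), ({"X"}, {"rash", "nausea"}),
           ({"Y"}, {"nausea"}), ({"Y"}, {"rash"}), ({"Y"}, {"headache"})]
counts = reduce_counts(kv for r in reports for kv in map_report(r))
print(round(prr(counts, "X", "rash"), 2))                  # 3.0 for this toy dataset
```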

  12. Developing Fluorescence Sensor Systems for Early Detection of Nitrification Events in Chloraminated Drinking Water Distribution Systems

    Science.gov (United States)

    Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...

  13. Event Rates in Randomized Clinical Trials Evaluating Cardiovascular Interventions and Devices

    NARCIS (Netherlands)

    Mahmoud, Karim D.; Lennon, Ryan J.; Holmes, David R.

    2015-01-01

    Randomized clinical trials (RCTs) are considered the gold standard for evidence-based medicine. However, an accurate estimation of the event rate is crucial for their ability to test clinical hypotheses. Overestimation of event rates reduces the required sample size but can compromise the

  14. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  15. Automated Detection of Financial Events in News Text

    NARCIS (Netherlands)

    F.P. Hogenboom (Frederik)

    2014-01-01

    Today’s financial markets are inextricably linked with financial events like acquisitions, profit announcements, or product launches. Information extracted from news messages that report on such events could hence be beneficial for financial decision making. The ubiquity of news,

  16. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    As today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  17. Prescription-event monitoring: developments in signal detection.

    Science.gov (United States)

    Ferreira, Germano

    2007-01-01

    Prescription-event monitoring (PEM) is a non-interventional intensive method for post-marketing drug safety monitoring of newly licensed medicines. PEM studies are cohort studies where exposure is obtained from a centralised service and outcomes from simple questionnaires completed by general practitioners. Follow-up forms are sent for selected events. Because PEM captures all events and not only the suspected adverse drug reactions, PEM cohorts potentially differ with respect to the distribution of the number of events per person, depending on the nature of the drug under study. This variance can be related either to the condition for which the drug is prescribed (e.g. a condition causing high morbidity will have, on average, a higher number of events per person compared with a condition with lower morbidity) or to the drug effect itself. This paper describes an exploratory investigation of the distortion that product-related variation in the number of events causes in the interpretation of proportional reporting ratio (PRR) values ("the higher the PRR, the greater the strength of the signal") computed using drug-cohort data. We studied this effect by assessing the agreement between the PRR based on events (event of interest vs all other events) and the PRR based on cases (cases with the event of interest vs cases with any other events). PRRs were calculated for all combinations reported for ten selected drugs against a comparator of 81 other drugs. Three of the ten drugs had a cohort with an apparently higher proportion of patients with a lower number of events. The PRRs based on events were systematically higher than the PRRs based on cases for the combinations reported for these three drugs. Additionally, when applying the threshold criteria for signal screening (n ≥ 3, PRR ≥ 1.5 and χ² ≥ 4), the binary agreement was generally high but apparently lower for these three drugs. In conclusion, the distribution of events per patient in drug cohorts shall be
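
    For reference, the PRR is conventionally defined from a 2×2 table of reports, with the screening thresholds quoted in the abstract; the cell labels below follow the usual convention rather than the paper's own notation.

```latex
% Standard 2x2 definition of the PRR (a = reports with the drug and the event of
% interest, b = drug with other events, c = other drugs with the event,
% d = other drugs with other events). Cell labels are the textbook convention.
\mathrm{PRR} \;=\; \frac{a/(a+b)}{c/(c+d)},
\qquad \text{signal if } n \ge 3,\; \mathrm{PRR} \ge 1.5,\; \chi^2 \ge 4 .
```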

  18. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than its currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and getting a more complete set of stations than the current system which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations born from the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed events rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  19. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) Problem in control theory concerns of monitoring a system to identify when a fault has occurred. Two approaches can be distinguished for the FD: Signal processing based FD and Model-based FD. The former concerns of developing algorithms to directly infer faults from sensors' readings, while the latter uses a simulation model of the real-system to analyze the discrepancy between sensors' readings and expected values from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed the signal processing based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes a physically based water quality and hydraulics simulation models, can outperform the signal processing based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (higher true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Acoustic Event Detection in Multichannel Audio Using Gated Recurrent Neural Networks with High‐Resolution Spectral Features

    Directory of Open Access Journals (Sweden)

    Hyoung‐Gook Kim

    2017-12-01

    Full Text Available Recently, deep recurrent neural networks have achieved great success in various machine learning tasks, and have also been applied for sound event detection. The detection of temporally overlapping sound events in realistic environments is much more challenging than in monophonic detection problems. In this paper, we present an approach to improve the accuracy of polyphonic sound event detection in multichannel audio based on gated recurrent neural networks in combination with auditory spectral features. In the proposed method, human hearing perception‐based spatial and spectral‐domain noise‐reduced harmonic features are extracted from multichannel audio and used as high‐resolution spectral inputs to train gated recurrent neural networks. This provides a fast and stable convergence rate compared to long short‐term memory recurrent neural networks. Our evaluation reveals that the proposed method outperforms the conventional approaches.

  1. Probability of detection of clinical seizures using heart rate changes.

    Science.gov (United States)

    Osorio, Ivan; Manly, B F J

    2015-08-01

    Heart rate-based seizure detection is a viable complement or alternative to ECoG/EEG. This study investigates the role of various biological factors on the probability of clinical seizure detection using heart rate. Regression models were applied to 266 clinical seizures recorded from 72 subjects to investigate if factors such as age, gender, years with epilepsy, etiology, seizure site origin, seizure class, and data collection centers, among others, shape the probability of EKG-based seizure detection. Clinical seizure detection probability based on heart rate changes is significantly dependent on several of these factors. The probability of detecting clinical seizures (>0.8 in the majority of subjects) using heart rate is highest for complex partial seizures, increases with a patient's years with epilepsy, is lower for females than for males and is unrelated to the side of hemisphere origin. Clinical seizure detection probability using heart rate is multi-factorially dependent and sufficiently high (>0.8) in most cases to be clinically useful. Knowledge of the role that these factors play in shaping said probability will enhance its applicability and usefulness. Heart rate is a reliable and practical signal for extra-cerebral detection of clinical seizures originating from or spreading to central autonomic network structures. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  2. Signal detection to identify serious adverse events (neuropsychiatric events) in travelers taking mefloquine for chemoprophylaxis of malaria

    Directory of Open Access Journals (Sweden)

    Naing C

    2012-08-01

    Full Text Available Cho Naing (1,3), Kyan Aung (1), Syed Imran Ahmed (2), Joon Wah Mak (3); 1: School of Medical Sciences, 2: School of Pharmacy and Health Sciences, 3: School of Postgraduate Studies and Research, International Medical University, Kuala Lumpur, Malaysia. Background: For all medications, there is a trade-off between benefits and potential for harm. It is important for patient safety to detect drug-event combinations and analyze them by appropriate statistical methods. Mefloquine is used as chemoprophylaxis for travelers going to regions with known chloroquine-resistant Plasmodium falciparum malaria. As such, there is a concern about serious adverse events associated with mefloquine chemoprophylaxis. The objective of the present study was to assess whether any signal would be detected for the serious adverse events of mefloquine, based on data in clinicoepidemiological studies. Materials and methods: We extracted data on adverse events related to mefloquine chemoprophylaxis from the two published datasets. Disproportionality reporting of adverse events such as neuropsychiatric events and other adverse events was presented in a 2 × 2 contingency table. A reporting odds ratio (ROR) and corresponding 95% confidence interval (CI) data-mining algorithm was applied for the signal detection. The safety signals are considered significant when the ROR estimates and the lower limits of the corresponding 95% CI are ≥2. Results: Two datasets addressing adverse events of mefloquine chemoprophylaxis (one from a published article and one from a Cochrane systematic review) were included for analyses. The reporting odds ratio was 1.58 (95% CI: 1.49–1.68) based on published data in the selected article, and 1.195 (95% CI: 0.94–1.44) based on data in the selected Cochrane review. Overall, in both datasets, the lower limits of the 95% CI of the reporting odds ratio were less than 2. Conclusion: Based on available data, findings suggested that signals for serious adverse events pertinent to neuropsychiatric events were
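
    A minimal sketch of the reporting odds ratio (ROR) with a 95% CI on the log scale, matching the signal rule described in the abstract (signal if the ROR and the lower CI bound are both ≥ 2). The cell labels follow the usual 2×2 convention and the numbers are toy values, not the paper's data.

```python
# Hypothetical ROR computation with a Wald-type 95% confidence interval.
import math

def ror_with_ci(a, b, c, d):
    """a: drug & event, b: drug & other events, c: other drugs & event, d: other drugs & other events."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = math.exp(math.log(ror) - 1.96 * se), math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

ror, lo, hi = ror_with_ci(120, 880, 90, 910)   # toy 2x2 counts
print(f"ROR={ror:.2f}, 95% CI ({lo:.2f}, {hi:.2f}), signal={ror >= 2 and lo >= 2}")
```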

  3. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling is likely limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and, as a result, might be significantly exposed to false positive alarms. This work is aimed at decreasing this limitation through integrating local and spatial hydraulic data understanding into an event detection model. The spatial analysis complements the local event detection effort through discovering events with lower signatures by exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Initiating Event Rates at U.S. Nuclear Power Plants. 1988 - 2013

    International Nuclear Information System (INIS)

    Schroeder, John A.; Bower, Gordon R.

    2014-01-01

    Analyzing initiating event rates is important because it indicates performance among plants and also provides inputs to several U.S. Nuclear Regulatory Commission (NRC) risk-informed regulatory activities. This report presents an analysis of initiating event frequencies at U.S. commercial nuclear power plants since each plant's low-power license date. The evaluation is based on the operating experience from fiscal year 1988 through 2013 as reported in licensee event reports. Staff engineers with nuclear power plant experience reviewed each event report since the last update to this report for the presence of valid scrams or reactor trips at power. To be included in the study, an event had to meet all of the following criteria: it includes an unplanned reactor trip (not a scheduled reactor trip on the daily operations schedule), the sequence of events starts when the reactor is critical and at or above the point of adding heat, it occurs at a U.S. commercial nuclear power plant (excluding Fort St. Vrain and LaCrosse), and it is reported by a licensee event report. This report displays occurrence rates (baseline frequencies) for the categories of initiating events that contribute to the NRC's Industry Trends Program. Sixteen initiating event groupings are trended and displayed. Initiators are plotted separately for initiating events with different occurrence rates for boiling water reactors and pressurized water reactors. p-values are given for the possible presence of a trend over the most recent 10 years.
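
    A minimal sketch of a baseline-frequency estimate of the kind such reports tabulate: a Poisson occurrence rate per unit of exposure with a Jeffreys-style interval. The Jeffreys prior and the toy numbers are assumptions, not values or methods quoted from this report.

```python
# Hypothetical occurrence-rate estimate for an initiating event category.
from scipy.stats import gamma

def event_rate(n_events, exposure_rcy):
    """Posterior mean and 90% interval for a Poisson rate with a Jeffreys prior."""
    shape = n_events + 0.5
    mean = shape / exposure_rcy
    lo, hi = gamma.ppf([0.05, 0.95], a=shape, scale=1.0 / exposure_rcy)
    return mean, lo, hi

print(event_rate(n_events=12, exposure_rcy=850.0))  # events per reactor critical year
```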

  5. Arduino-based noise robust online heart-rate detection.

    Science.gov (United States)

    Das, Sangita; Pal, Saurabh; Mitra, Madhuchhanda

    2017-04-01

    This paper introduces a noise robust real time heart rate detection system from electrocardiogram (ECG) data. An online data acquisition system is developed to collect ECG signals from human subjects. Heart rate is detected using window-based autocorrelation peak localisation technique. A low-cost Arduino UNO board is used to implement the complete automated process. The performance of the system is compared with PC-based heart rate detection technique. Accuracy of the system is validated through simulated noisy ECG data with various levels of signal to noise ratio (SNR). The mean percentage error of detected heart rate is found to be 0.72% for the noisy database with five different noise levels.
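
    A minimal sketch, not the authors' firmware, of window-based autocorrelation peak localisation: heart rate is estimated from the dominant autocorrelation lag of an ECG window. The sampling rate, allowed heart-rate range, and synthetic test signal are assumptions.

```python
# Hypothetical autocorrelation-based heart-rate estimator for an ECG window.
import numpy as np

def heart_rate_bpm(ecg_window, fs=250, hr_range=(40, 200)):
    x = np.asarray(ecg_window, dtype=float) - np.mean(ecg_window)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]          # lags 0 .. n-1
    lag_min, lag_max = int(fs * 60 / hr_range[1]), int(fs * 60 / hr_range[0])
    peak_lag = lag_min + int(np.argmax(acf[lag_min:lag_max]))    # dominant beat period
    return 60.0 * fs / peak_lag

t = np.arange(0, 10, 1 / 250.0)
synthetic = np.sin(2 * np.pi * 1.2 * t) ** 31                    # sharp peaks at 1.2 Hz (~72 bpm)
print(round(heart_rate_bpm(synthetic), 1))
```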

  6. [Detection of adverse events in hospitalized adult patients by using the Global Trigger Tool method].

    Science.gov (United States)

    Guzmán-Ruiz, O; Ruiz-López, P; Gómez-Cámara, A; Ramírez-Martín, M

    2015-01-01

    To identify and characterize adverse events (AE) in an Internal Medicine Department of a district hospital using an extension of the Global Trigger Tool (GTT), analyzing the diagnostic validity of the tool. An observational, analytical, descriptive and retrospective study was conducted on 2013 clinical charts from an Internal Medicine Department in order to detect EA through the identification of 'triggers' (an event often related to an AE). The 'triggers' and AE were located by systematic review of clinical documentation. The AE were characterized after they were identified. A total of 149 AE were detected in 291 clinical charts during 2013, of which 75.3% were detected directly by the tool, while the rest were not associated with a trigger. The percentage of charts that had at least one AE was 35.4%. The most frequent AE found was pressure ulcer (12%), followed by delirium, constipation, nosocomial respiratory infection and altered level of consciousness by drugs. Almost half (47.6%) of the AE were related to drug use, and 32.2% of all AE were considered preventable. The tool demonstrated a sensitivity of 91.3% (95%CI: 88.9-93.2) and a specificity of 32.5% (95%CI: 29.9-35.1). It had a positive predictive value of 42.5% (95%CI: 40.1-45.1) and a negative predictive value of 87.1% (95%CI: 83.8-89.9). The tool used in this study is valid, useful and reproducible for the detection of AE. It also serves to determine rates of injury and to observe their progression over time. A high frequency of both AE and preventable events were observed in this study. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  7. An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.

    Science.gov (United States)

    Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein

    2017-12-22

    The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
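
    A minimal sketch of event-triggered segmentation: the impact is located as the peak of the acceleration magnitude, and fixed-length pre-impact, impact, and post-impact segments are cut around it. The segment lengths and sampling rate are assumptions, not the values used in the paper.

```python
# Hypothetical impact-aligned segmentation of an acceleration-magnitude trace.
import numpy as np

def segment_by_impact(acc_mag, fs=50, pre_s=1.0, impact_s=0.5, post_s=1.0):
    a = np.asarray(acc_mag, dtype=float)
    peak = int(np.argmax(a))                                     # assumed impact location
    half = int(fs * impact_s / 2)
    pre = a[max(0, peak - int(fs * pre_s) - half):max(0, peak - half)]
    impact = a[max(0, peak - half):peak + half]
    post = a[peak + half:peak + half + int(fs * post_s)]
    return pre, impact, post                                     # features are then extracted per stage

signal = np.concatenate([np.full(100, 1.0), [6.0], np.full(100, 0.3)])  # rest, impact spike, lying still
pre, impact, post = segment_by_impact(signal)
print(len(pre), len(impact), len(post))
```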

  8. Development and validation of a Thai stressful life events rating scale for patients with a diagnosis of schizophrenic methamphetamine abuse

    Directory of Open Access Journals (Sweden)

    Ek-uma Imkome

    2017-04-01

    Full Text Available This study aimed to psychometrically test a Thai Stressful Life Events Rating Scale (TSLERS. Factor analysis was done on data collected from 313 patients with schizophrenia and methamphetamine abuse in Thailand from April to May, 2015. Results identified the following problems impacting physical and mental health: social relationship and social concerns, money, family life, life security, and career. Evaluation of the psychometric scale properties demonstrated acceptable validity and reliability. TSLERS provided scientific and empirical data about stressful life events of patients with schizophrenia and methamphetamine abuse, and was suitable for stress detection and suggesting further innovations.

  9. Review of nuclear power reactor coolant system leakage events and leak detection requirements

    International Nuclear Information System (INIS)

    Chokshi, N.C.; Srinivasan, M.; Kupperman, D.S.; Krishnaswamy, P.

    2005-01-01

    In response to the vessel head event at the Davis-Besse reactor, the U.S. Nuclear Regulatory Commission (NRC) formed a Lessons Learned Task Force (LLTF). Four action plans were formulated to respond to the recommendations of the LLTF. The action plans involved efforts on barrier integrity, stress corrosion cracking (SCC), operating experience, and inspection and program management. One part of the action plan on barrier integrity was an assessment to identify potential safety benefits from changes in requirements pertaining to leakage in the reactor coolant system (RCS). In this effort, experiments and models were reviewed to identify correlations between crack size, crack-tip-opening displacement (CTOD), and leak rate in the RCS. Sensitivity studies using the Seepage Quantification of Upsets In Reactor Tubes (SQUIRT) code were carried out to correlate crack parameters, such as crack size, with leak rate for various types of crack configurations in RCS components. A database that identifies the leakage source, leakage rate, and resulting actions from RCS leaks discovered in U.S. light water reactors was developed. Humidity monitoring systems for detecting leakage and acoustic emission crack monitoring systems for the detection of crack initiation and growth before a leak occurs were also considered. New approaches to the detection of a leak in the reactor head region by monitoring boric-acid aerosols were also considered. (authors)

  10. Multivariate algorithms for initiating event detection and identification in nuclear power plants

    International Nuclear Information System (INIS)

    Wu, Shun-Chi; Chen, Kuang-You; Lin, Ting-Han; Chou, Hwai-Pwu

    2018-01-01

    Highlights: • Multivariate algorithms for NPP initiating event detection and identification. • Recordings from multiple sensors are simultaneously considered for detection. • Both spatial and temporal information is used for event identification. • Untrained event isolation avoids falsely relating an untrained event. • Efficacy of the algorithms is verified with data from the Maanshan NPP simulator. -- Abstract: To prevent escalation of an initiating event into a severe accident, promptly detecting its occurrence and precisely identifying its type are essential. In this study, several multivariate algorithms for initiating event detection and identification are proposed to help maintain safe operations of nuclear power plants (NPPs). By monitoring changes in the NPP sensing variables, an event is detected when the preset thresholds are exceeded. Unlike existing approaches, recordings from sensors of the same type are simultaneously considered for detection, and no subjective reasoning is involved in setting these thresholds. To facilitate efficient event identification, a spatiotemporal feature extractor is proposed. The extracted features consist of the temporal traits used by existing techniques and the spatial signature of an event. Through an F-score-based feature ranking, only those that are most discriminant in classifying the events under consideration will be retained for identification. Moreover, an untrained event isolation scheme is introduced to avoid relating an untrained event to those in the event dataset so that improper recovery actions can be prevented. Results from experiments containing data of 12 event classes and a total of 125 events generated using Taiwan’s Maanshan NPP simulator are provided to illustrate the efficacy of the proposed algorithms.
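
    A minimal sketch of F-score-style feature ranking for a binary split (one event class versus the rest); the multi-class extension used in the paper is not reproduced, and the formula below is the common Fisher-score form rather than the paper's exact definition.

```python
# Hypothetical F-score computation for ranking one candidate feature.
import numpy as np

def f_score(feature_pos, feature_neg):
    xp, xn = np.asarray(feature_pos, float), np.asarray(feature_neg, float)
    x_all = np.concatenate([xp, xn])
    num = (xp.mean() - x_all.mean()) ** 2 + (xn.mean() - x_all.mean()) ** 2
    den = xp.var(ddof=1) + xn.var(ddof=1)
    return num / den

pos = [5.1, 5.3, 4.9, 5.2]        # sensor values during one event class
neg = [3.0, 3.2, 2.9, 3.1]        # values during all other classes
print(round(f_score(pos, neg), 2))  # larger score -> more discriminative feature
```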

  11. Towards Optimal Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Agumbe Suresh, Mahima; Stoleru, Radu; Denton, Ron; Zechman, Emily; Shihada, Basem

    2012-01-01

    infrastructure (i.e., minimizing the number of sensor and beacon nodes deployed), while ensuring a degree of sensing coverage in a zone of interest and a required accuracy in locating events. We propose algorithms for solving these problems and demonstrate

  12. On the event detected by the Mont Blanc underground neutrino detector on February 23, 1987

    Energy Technology Data Exchange (ETDEWEB)

    Dadykin, V L; Zatsepin, G T; Korchagin, V B

    1988-02-01

    The event detected by the Mont Blanc Soviet-Italian scintillation detector on February 23, 1987 at 2:52:37 is discussed. The corrected energies of the pulses of the event and the probability of the event being imitated by the background are presented.

  13. Automatic detection of whole night snoring events using non-contact microphone.

    Directory of Open Access Journals (Sweden)

    Eliran Dafna

Full Text Available OBJECTIVE: Although awareness of sleep disorders is increasing, limited information is available on whole night detection of snoring. Our study aimed to develop and validate a robust, high performance, and sensitive whole-night snore detector based on non-contact technology. DESIGN: Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. PATIENTS: Sixty-seven subjects (age 52.5 ± 13.5 years, BMI 30.8 ± 4.7 kg/m², m/f 40/27) referred for PSG for obstructive sleep apnea diagnoses were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. MEASUREMENTS AND RESULTS: To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental). A feature selection process was applied to select the most discriminative features extracted from time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4% based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2% with sensitivity of 98.0% (snore as a snore) and specificity of 98.3% (noise as noise). CONCLUSIONS: Audio-based features extracted from time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients.
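A minimal sketch of the training stage the abstract describes: an AdaBoost classifier fitted on a few generic time/spectral features and scored with ten-fold cross-validation. The feature set (RMS energy, zero-crossing rate, spectral centroid) and the toy episodes are assumptions for illustration, not the study's selected features or recordings.

```python
# Hedged sketch of the training stage only; the paper's own feature selection
# process is not reproduced here.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

def episode_features(x, fs):
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(x)))) / 2.0
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    return [rms, zcr, centroid]

# toy labelled episodes: "snore" episodes are louder and low-frequency
rng = np.random.default_rng(1)
fs, n = 8000, 4000
def toy_episode(snore):
    t = np.arange(n) / fs
    if snore:
        return 0.8 * np.sin(2 * np.pi * 120 * t) + 0.1 * rng.normal(size=n)
    return 0.3 * rng.normal(size=n)

X = np.array([episode_features(toy_episode(s), fs) for s in [1] * 80 + [0] * 80])
y = np.array([1] * 80 + [0] * 80)
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
print("10-fold CV accuracy:", cross_val_score(clf, X, y, cv=10).mean())
```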

  14. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    Science.gov (United States)

    2010-06-01

Excerpt fragments only; the recoverable content is the caption of Figure 2.1, "Verizon Data Breach Report: Detective Controls by percent of breach victims" [11], a reference to the 2008 CSI Computer Crime and Security Survey, and the cited reference: Christopher Novak, Christopher Porter, Bryan Sartin, Peter Tippett, and J. Andrew Valentine, Data Breach Investigations Report, technical report.

  15. On Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Suresh, Mahima Agumbe; Stoleru, Radu; Zechman, Emily M.; Shihada, Basem

    2013-01-01

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil and gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against

  16. Organization of pulse-height analysis programs for high event rates

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, C E [Argonne National Lab., Ill. (USA)

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.
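The organizational point above, that the data-ready service routine should check for a further pending event before returning, can be sketched as follows. The original implementation is not shown in this record, so `adc_fifo`, `read_adc`, and `data_ready_isr` below are hypothetical stand-ins for the ADC interface.

```python
# Hedged sketch of the idea, not of the original program: inside the service
# routine, keep pulling events while another conversion is already waiting, so
# the per-interrupt entry/exit overhead is paid once per burst of events
# instead of once per event.
import random
from collections import deque

adc_fifo = deque()                     # stands in for the ADC's ready register
histogram = [0] * 1024                 # pulse-height spectrum

def adc_data_ready():
    return bool(adc_fifo)

def read_adc():
    return adc_fifo.popleft()

def data_ready_isr():
    """Service routine: drain every pending event before 'returning'."""
    while True:
        histogram[read_adc()] += 1
        if not adc_data_ready():       # only return when nothing else is waiting
            break

# simulate a burst of events arriving faster than single-event servicing
adc_fifo.extend(random.randrange(1024) for _ in range(50))
data_ready_isr()
print(sum(histogram))                  # 50 events binned with one ISR entry
```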

  17. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application

    Directory of Open Access Journals (Sweden)

    Angel Mur

    2016-04-01

Full Text Available In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method. To this end, an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classification performance, the proposed method allows detecting events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time.

  18. Fault Detection Using the Zero Crossing Rate | Osuagwu | Nigerian ...

    African Journals Online (AJOL)

A method of fault detection based on the zero crossing rate of the signal, Z1, and the zero crossing rate of the first-order difference signal, Z2, is presented. It is shown that the parameter pair (Z1, Z2) possesses adequate discriminating potential to classify a signature as good or defective. The parameter pair also carries ...
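A small sketch of the two parameters named in the abstract, assuming the usual definition of zero-crossing rate; the signals and the implied decision rule are illustrative, not the paper's data or thresholds.

```python
# Hedged sketch: compute the zero-crossing rate of a signal (Z1) and of its
# first-order difference (Z2); a defective signature typically shows higher Z2.
import numpy as np

def zero_crossing_rate(x):
    s = np.signbit(np.asarray(x, dtype=float)).astype(np.int8)
    return np.count_nonzero(np.diff(s)) / (len(s) - 1)

def z1_z2(x):
    return zero_crossing_rate(x), zero_crossing_rate(np.diff(x))

fs = 1000
t = np.arange(0, 1, 1 / fs)
good = np.sin(2 * np.pi * 25 * t)                                    # smooth signature
faulty = good + 0.4 * np.random.default_rng(0).normal(size=t.size)   # broadband fault

for name, sig in [("good", good), ("faulty", faulty)]:
    z1, z2 = z1_z2(sig)
    print(f"{name}: Z1={z1:.3f}  Z2={z2:.3f}")
```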

  19. Rate modulation detection thresholds for cochlear implant users.

    Science.gov (United States)

    Brochier, Tim; McKay, Colette; McDermott, Hugh

    2018-02-01

    The perception of temporal amplitude modulations is critical for speech understanding by cochlear implant (CI) users. The present study compared the ability of CI users to detect sinusoidal modulations of the electrical stimulation rate and current level, at different presentation levels (80% and 40% of the dynamic range) and modulation frequencies (10 and 100 Hz). Rate modulation detection thresholds (RMDTs) and amplitude modulation detection thresholds (AMDTs) were measured and compared to assess whether there was a perceptual advantage to either modulation method. Both RMDTs and AMDTs improved with increasing presentation level and decreasing modulation frequency. RMDTs and AMDTs were correlated, indicating that a common processing mechanism may underlie the perception of rate modulation and amplitude modulation, or that some subject-dependent factors affect both types of modulation detection.

  20. Detection and location of multiple events by MARS. Final report

    International Nuclear Information System (INIS)

    Wang, J.; Masso, J.F.; Archambeau, C.B.; Savino, J.M.

    1980-09-01

    Seismic data from two explosions was processed using the Systems Science and Software MARS (Multiple Arrival Recognition System) seismic event detector in an effort to determine their relative spatial and temporal separation on the basis of seismic data alone. The explosions were less than 1.0 kilometer apart and were separated by less than 0.5 sec in origin times. The seismic data consisted of nine local accelerograms (r < 1.0 km) and four regional (240 through 400 km) seismograms. The MARS processing clearly indicates the presence of multiple explosions, but the restricted frequency range of the data inhibits accurate time picks and hence limits the precision of the event location

  1. The necessity of recognizing all events in x-ray detection

    International Nuclear Information System (INIS)

    Papp, T.; Maxwell, J.A.; Papp, A.T.

    2008-01-01

…x-ray detection. Examples will be given of the detection of x-rays in nuclear backgrounds, and of industrial measurements for RoHS and WEEE compliance with input rates of up to several hundred thousand counts per second. The availability of all the events allows one to see the other part of the spectrum, and thus offers explanations for why the basic parameters are in such a bad shape.

  2. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and then a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to the shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented based on the criteria of precision, recall, and F-measure. The results are compared with an existing technique and significant improvements are recorded.

  3. Falls event detection using triaxial accelerometry and barometric pressure measurement.

    Science.gov (United States)

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Celler, Branko G; Lovell, Nigel H

    2009-01-01

    A falls detection system, employing a Bluetooth-based wearable device, containing a triaxial accelerometer and a barometric pressure sensor, is described. The aim of this study is to evaluate the use of barometric pressure measurement, as a surrogate measure of altitude, to augment previously reported accelerometry-based falls detection algorithms. The accelerometry and barometric pressure signals obtained from the waist-mounted device are analyzed by a signal processing and classification algorithm to discriminate falls from activities of daily living. This falls detection algorithm has been compared to two existing algorithms which utilize accelerometry signals alone. A set of laboratory-based simulated falls, along with other tasks associated with activities of daily living (16 tests) were performed by 15 healthy volunteers (9 male and 6 female; age: 23.7 +/- 2.9 years; height: 1.74 +/- 0.11 m). The algorithm incorporating pressure information detected falls with the highest sensitivity (97.8%) and the highest specificity (96.7%).
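A hedged sketch of the general idea of augmenting accelerometry with barometric altitude: flag a candidate impact from the acceleration magnitude and confirm it with a sustained pressure-derived altitude drop. The thresholds, window lengths, and the `detect_fall` helper are illustrative assumptions, not the study's trained algorithm.

```python
# Hedged sketch of the decision logic only (illustrative thresholds).
import numpy as np

G = 9.81

def pressure_to_altitude(p_hpa):
    """Standard-atmosphere conversion of pressure (hPa) to altitude (m)."""
    return 44330.0 * (1.0 - (np.asarray(p_hpa) / 1013.25) ** 0.1903)

def detect_fall(acc_xyz, p_hpa, fs, impact_g=2.5, drop_m=0.4):
    """acc_xyz: (n, 3) in m/s^2, p_hpa: (n,) barometric pressure, fs: Hz."""
    mag = np.linalg.norm(acc_xyz, axis=1) / G
    alt = pressure_to_altitude(p_hpa)
    for i in np.flatnonzero(mag > impact_g):             # candidate impacts
        pre = alt[max(0, i - 2 * fs):i].mean()            # altitude before impact
        post = alt[i + fs:i + 3 * fs].mean() if i + 3 * fs <= len(alt) else np.nan
        if np.isfinite(post) and pre - post > drop_m:     # waist ended up lower
            return True, i / fs
    return False, None

# toy trace: 10 s at 50 Hz, impact at t = 5 s with ~0.9 m altitude drop
fs, n = 50, 500
acc = np.tile([0.0, 0.0, G], (n, 1))
acc[250] = [0.0, 0.0, 4.0 * G]
p = np.full(n, 1003.00)
p[260:] += 0.11                                           # ~0.9 m lower after the fall
print(detect_fall(acc, p, fs))
```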

  4. Bidirectional RNN for Medical Event Detection in Electronic Health Records.

    Science.gov (United States)

    Jagannatha, Abhyuday N; Yu, Hong

    2016-06-01

    Sequence labeling for extraction of medical events and their attributes from unstructured text in Electronic Health Record (EHR) notes is a key step towards semantic understanding of EHRs. It has important applications in health informatics including pharmacovigilance and drug surveillance. The state of the art supervised machine learning models in this domain are based on Conditional Random Fields (CRFs) with features calculated from fixed context windows. In this application, we explored recurrent neural network frameworks and show that they significantly out-performed the CRF models.

  5. Power System Extreme Event Detection: The VulnerabilityFrontier

    Energy Technology Data Exchange (ETDEWEB)

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

In this work we apply graph theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We present the usefulness of this analysis with a complete analysis of a 30 bus system, and present results for larger systems.

  6. DOUBLE COMPACT OBJECTS. III. GRAVITATIONAL-WAVE DETECTION RATES

    Energy Technology Data Exchange (ETDEWEB)

    Dominik, Michal; Belczynski, Krzysztof; Bulik, Tomasz [Astronomical Observatory, University of Warsaw, Al. Ujazdowskie 4, 00-478 Warsaw (Poland); Berti, Emanuele [Department of Physics and Astronomy, The University of Mississippi, University, MS 38677 (United States); O’Shaughnessy, Richard [Center for Gravitation, Cosmology, and Astrophysics, University of Wisconsin-Milwaukee, Milwaukee, WI (United States); Mandel, Ilya [School of Physics and Astronomy, University of Birmingham, Edgbaston, Birmingham B15 2TT (United Kingdom); Fryer, Christopher [CCS-2, MSD409, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Holz, Daniel E. [Enrico Fermi Institute, Department of Physics, and Kavli Institute for Cosmological Physics University of Chicago, Chicago, IL 60637 (United States); Pannarale, Francesco [School of Physics and Astronomy, Cardiff University, The Parade, Cardiff CF24 3AA (United Kingdom)

    2015-06-20

    The unprecedented range of second-generation gravitational-wave (GW) observatories calls for refining the predictions of potential sources and detection rates. The coalescence of double compact objects (DCOs)—i.e., neutron star–neutron star (NS–NS), black hole–neutron star (BH–NS), and black hole–black hole (BH–BH) binary systems—is the most promising source of GWs for these detectors. We compute detection rates of coalescing DCOs in second-generation GW detectors using the latest models for their cosmological evolution, and implementing inspiral-merger-ringdown gravitational waveform models in our signal-to-noise ratio calculations. We find that (1) the inclusion of the merger/ringdown portion of the signal does not significantly affect rates for NS–NS and BH–NS systems, but it boosts rates by a factor of ∼1.5 for BH–BH systems; (2) in almost all of our models BH–BH systems yield by far the largest rates, followed by NS–NS and BH–NS systems, respectively; and (3) a majority of the detectable BH–BH systems were formed in the early universe in low-metallicity environments. We make predictions for the distributions of detected binaries and discuss what the first GW detections will teach us about the astrophysics underlying binary formation and evolution.

  7. Reliability of the peer-review process for adverse event rating.

    Directory of Open Access Journals (Sweden)

    Alan J Forster

Full Text Available Adverse events are poor patient outcomes caused by medical care. Their identification requires the peer-review of poor outcomes, which may be unreliable. Combining physician ratings might improve the accuracy of adverse event classification. To evaluate the variation in peer-reviewer ratings of adverse outcomes; determine the impact of this variation on estimates of reviewer accuracy; and determine the number of reviewers who judge an adverse event occurred that is required to ensure that the true probability of an adverse event exceeded 50%, 75% or 95%. Thirty physicians rated 319 case reports giving details of poor patient outcomes following hospital discharge. They rated whether medical management caused the outcome using a six-point ordinal scale. We conducted latent class analyses to estimate the prevalence of adverse events as well as the sensitivity and specificity of each reviewer. We used this model and Bayesian calculations to determine the probability that an adverse event truly occurred for each patient as a function of their number of positive ratings. The overall median score on the 6-point ordinal scale was 3 (IQR 2, 4) but the individual rater median score ranged from a minimum of 1 (in four reviewers) to a maximum median score of 5. The overall percentage of cases rated as an adverse event was 39.7% (3798/9570). The median kappa for all pair-wise combinations of the 30 reviewers was 0.26 (IQR 0.16, 0.42; Min = -0.07, Max = 0.62). Reviewer sensitivity and specificity for adverse event classification ranged from 0.06 to 0.93 and 0.50 to 0.98, respectively. The estimated prevalence of adverse events using a latent class model with a common sensitivity and specificity for all reviewers (0.64 and 0.83, respectively) was 47.6%. For patients to have a 95% chance of truly having an adverse event, at least 3 of 3 reviewers are required to deem the outcome an adverse event. Adverse event classification is unreliable. To be certain that a case
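The Bayesian step in the abstract, the probability that an adverse event truly occurred given the number of positive reviews, can be reproduced approximately with the reported common sensitivity (0.64), specificity (0.83) and prevalence (47.6%), assuming reviewers are conditionally independent (an assumption of this sketch, not necessarily of the paper's latent class model).

```python
# Hedged sketch: posterior probability of a true adverse event when k of n
# independent reviewers rate the case as an adverse event.
from math import comb

def p_event_given_k_of_n(k, n, prevalence=0.476, sens=0.64, spec=0.83):
    like_event = comb(n, k) * sens**k * (1 - sens)**(n - k)
    like_no_event = comb(n, k) * (1 - spec)**k * spec**(n - k)
    num = prevalence * like_event
    return num / (num + (1 - prevalence) * like_no_event)

for k in range(4):
    print(f"{k} of 3 positive reviews -> P(adverse event) = "
          f"{p_event_given_k_of_n(k, 3):.3f}")
# with these numbers, only 3 of 3 positive reviews pushes the probability past 0.95
```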

  8. Event detection in athletics for personalized sports content delivery

    DEFF Research Database (Denmark)

    Katsarakis, N.; Pnevmatikakis, A.

    2009-01-01

Broadcasting of athletics is nowadays biased towards running (sprint and longer distances) sports. Personalized content delivery can change that for users who wish to focus on different content. Using a combination of video signal processing algorithms and live information that accompanies the video of large-scale sports like the Olympics, a system can attend to the preferences of users by selecting the most suitable camera view for them. There are two types of camera selection for personalized content delivery; according to the between sport camera selection, the view is changed between two sports. The paper presents the algorithms needed for the extraction of the events that trigger both between and within sport camera selection, and describes a system that handles user preferences, live information and video-generated events to offer personalized content to the users.

  9. Probabilistic pipe fracture evaluations for leak-rate-detection applications

    International Nuclear Information System (INIS)

    Rahman, S.; Ghadiali, N.; Paul, D.; Wilkowski, G.

    1995-04-01

Regulatory Guide 1.45, "Reactor Coolant Pressure Boundary Leakage Detection Systems," was published by the U.S. Nuclear Regulatory Commission (NRC) in May 1973, and provides guidance on leak detection methods and system requirements for Light Water Reactors. Additionally, leak detection limits are specified in plant Technical Specifications and are different for Boiling Water Reactors (BWRs) and Pressurized Water Reactors (PWRs). These leak detection limits are also used in leak-before-break evaluations performed in accordance with Draft Standard Review Plan, Section 3.6.3, "Leak Before Break Evaluation Procedures," where a margin of 10 on the leak detection limit is used in determining the crack size considered in subsequent fracture analyses. This study was requested by the NRC to: (1) evaluate the conditional failure probability for BWR and PWR piping for pipes that were leaking at the allowable leak detection limit, and (2) evaluate the margin of 10 to determine if it was unnecessarily large. A probabilistic approach was undertaken to conduct fracture evaluations of circumferentially cracked pipes for leak-rate-detection applications. Sixteen nuclear piping systems in BWR and PWR plants were analyzed to evaluate conditional failure probability and effects of crack-morphology variability on the current margins used in leak rate detection for leak-before-break.

  10. A Bayes linear Bayes method for estimation of correlated event rates.

    Science.gov (United States)

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
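For orientation, a sketch of the conjugate gamma-Poisson building block that underlies models of this kind; the Bayes linear Bayes machinery, homogenization factors, and multivariate gamma prior of the paper are not reproduced, and the prior and data below are made up.

```python
# Hedged sketch: a gamma prior on a single Poisson event rate updated with
# observed counts (conjugate update), not the paper's correlated-rate model.
from scipy import stats

prior_shape, prior_rate = 2.0, 4.0      # prior mean 0.5 events per unit exposure
events, exposure = 7, 10.0              # observed: 7 events over 10 time units

post_shape = prior_shape + events
post_rate = prior_rate + exposure

posterior = stats.gamma(a=post_shape, scale=1.0 / post_rate)
print("posterior mean rate:", posterior.mean())            # (2+7)/(4+10) ~= 0.643
print("95% credible interval:", posterior.interval(0.95))
```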

  11. Spatial-temporal event detection in climate parameter imagery.

    Energy Technology Data Exchange (ETDEWEB)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely-sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and in Southern India.

  12. BIRDNEST: Bayesian Inference for Ratings-Fraud Detection

    OpenAIRE

    Hooi, Bryan; Shah, Neil; Beutel, Alex; Gunnemann, Stephan; Akoglu, Leman; Kumar, Mohit; Makhija, Disha; Faloutsos, Christos

    2015-01-01

    Review fraud is a pervasive problem in online commerce, in which fraudulent sellers write or purchase fake reviews to manipulate perception of their products and services. Fake reviews are often detected based on several signs, including 1) they occur in short bursts of time; 2) fraudulent user accounts have skewed rating distributions. However, these may both be true in any given dataset. Hence, in this paper, we propose an approach for detecting fraudulent reviews which combines these 2 app...

  13. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    Science.gov (United States)

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility under an airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
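A simplified sketch of the time-interval idea for one energy region, assuming a known background rate: the time spanned by the last n inter-arrival intervals follows a gamma distribution under background alone, so an improbably short span raises an alarm. The parameters and the `interval_alarm` helper are illustrative, not the authors' multi-energy system.

```python
# Hedged sketch of time-interval (time-between-counts) detection.
import numpy as np
from scipy import stats

def interval_alarm(event_times, n=10, background_rate=2.0, alpha=1e-3):
    """event_times: sorted arrival times (s) in one energy region."""
    t = np.asarray(event_times, dtype=float)
    if len(t) < n + 1:
        return False
    window = t[-1] - t[-(n + 1)]                  # time spanned by the last n intervals
    p_value = stats.gamma.cdf(window, a=n, scale=1.0 / background_rate)
    return p_value < alpha                         # too fast for background alone

rng = np.random.default_rng(0)
background = np.cumsum(rng.exponential(1 / 2.0, size=200))        # 2 cps background
with_source = np.concatenate(
    [background, background[-1] + np.cumsum(rng.exponential(1 / 20.0, size=40))])
print("background only:", interval_alarm(background))
print("source present: ", interval_alarm(with_source))
```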

  14. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time-windows with different filters. It is expected to have an advantage compared to traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, for events for which the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms as these events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
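A toy sketch of the detection function: cross-correlate a template with continuous data at each station, stack the correlations, and pick times where the stack exceeds a threshold relative to the noise level. Real use would correlate strain Green's tensors computed for the study region; here the templates, noise levels, and threshold are invented for illustration.

```python
# Hedged sketch of a stacked matched-filter detection function.
import numpy as np

def detection_function(traces, templates):
    """traces, templates: lists of 1-D arrays, one pair per station."""
    stacks = []
    for data, tmpl in zip(traces, templates):
        t = tmpl - tmpl.mean()
        t /= (np.linalg.norm(t) + 1e-12)                  # unit-energy template
        stacks.append(np.correlate(data, t, mode="valid"))  # sliding correlation
    n = min(map(len, stacks))
    return np.sum([s[:n] for s in stacks], axis=0)

rng = np.random.default_rng(0)
templates = [np.sin(2 * np.pi * np.arange(200) / 40.0) * np.hanning(200)
             for _ in range(4)]
traces = []
for tmpl in templates:                                    # bury the event at sample 3000
    tr = 0.5 * rng.normal(size=6000)
    tr[3000:3200] += tmpl
    traces.append(tr)

d = detection_function(traces, templates)
threshold = d.mean() + 8 * d.std()                        # illustrative noise-relative level
picks = np.flatnonzero(d > threshold)
print("detections near sample:", picks.min() if picks.size else "none")
```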

  15. Estimating the effect of a rare time-dependent treatment on the recurrent event rate.

    Science.gov (United States)

    Smith, Abigail R; Zhu, Danting; Goodrich, Nathan P; Merion, Robert M; Schaubel, Douglas E

    2018-05-30

    In many observational studies, the objective is to estimate the effect of treatment or state-change on the recurrent event rate. If treatment is assigned after the start of follow-up, traditional methods (eg, adjustment for baseline-only covariates or fully conditional adjustment for time-dependent covariates) may give biased results. We propose a two-stage modeling approach using the method of sequential stratification to accurately estimate the effect of a time-dependent treatment on the recurrent event rate. At the first stage, we estimate the pretreatment recurrent event trajectory using a proportional rates model censored at the time of treatment. Prognostic scores are estimated from the linear predictor of this model and used to match treated patients to as yet untreated controls based on prognostic score at the time of treatment for the index patient. The final model is stratified on matched sets and compares the posttreatment recurrent event rate to the recurrent event rate of the matched controls. We demonstrate through simulation that bias due to dependent censoring is negligible, provided the treatment frequency is low, and we investigate a threshold at which correction for dependent censoring is needed. The method is applied to liver transplant (LT), where we estimate the effect of development of post-LT End Stage Renal Disease (ESRD) on rate of days hospitalized. Copyright © 2018 John Wiley & Sons, Ltd.

  16. Super-Eddington Accretion in Tidal Disruption Events: the Impact of Realistic Fallback Rates on Accretion Rates

    Science.gov (United States)

    Wu, Samantha; Coughlin, Eric R.; Nixon, Chris

    2018-04-01

    After the tidal disruption of a star by a massive black hole, disrupted stellar debris can fall back to the hole at a rate significantly exceeding its Eddington limit. To understand how black hole mass affects the duration of super-Eddington accretion in tidal disruption events, we first run a suite of simulations of the disruption of a Solar-like star by a supermassive black hole of varying mass to directly measure the fallback rate onto the hole, and we compare these fallback rates to the analytic predictions of the "frozen-in" model. Then, adopting a Zero-Bernoulli Accretion flow as an analytic prescription for the accretion flow around the hole, we investigate how the accretion rate onto the black hole evolves with the more accurate fallback rates calculated from the simulations. We find that numerically-simulated fallback rates yield accretion rates onto the hole that can, depending on the black hole mass, be nearly an order of magnitude larger than those predicted by the frozen-in approximation. Our results place new limits on the maximum black hole mass for which super-Eddington accretion occurs in tidal disruption events.

  17. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, and a following sequence analysis for the classification of contamination events. The contribution of this study is an improvement of contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analysis conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular than event time measurements), and incorporating the time factor through a time decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability as compared to previous modeling attempts of contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
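A minimal sketch of the two roles the SVM weights play in the abstract: re-balancing the very unequal class sizes and down-weighting old observations with a time-decay coefficient. The kernel, decay constant, and synthetic water-quality data are assumptions; the paper calibrates these by data-driven optimization.

```python
# Hedged sketch: per-sample weights = class re-balancing * exponential time decay.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_normal, n_event = 400, 20
X = np.vstack([rng.normal(0, 1, size=(n_normal, 3)),          # routine water quality
               rng.normal(3, 1, size=(n_event, 3))])           # contamination events
y = np.array([0] * n_normal + [1] * n_event)
t = np.arange(len(y))                                           # sample time index

class_w = len(y) / (2.0 * np.bincount(y))                       # blur the class-size gap
decay = np.exp(-(t.max() - t) / 200.0)                          # newer samples weigh more
sample_weight = class_w[y] * decay

clf = SVC(kernel="rbf", gamma="scale").fit(X, y, sample_weight=sample_weight)
print("predicted for a suspicious reading:", clf.predict([[2.8, 3.1, 2.9]])[0])
```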

  18. Impact of whole-genome duplication events on diversification rates in angiosperms.

    Science.gov (United States)

    Landis, Jacob B; Soltis, Douglas E; Li, Zheng; Marx, Hannah E; Barker, Michael S; Tank, David C; Soltis, Pamela S

    2018-03-01

    Polyploidy or whole-genome duplication (WGD) pervades the evolutionary history of angiosperms. Despite extensive progress in our understanding of WGD, the role of these events in promoting diversification is still not well understood. We seek to clarify the possible association between WGD and diversification rates in flowering plants. Using a previously published phylogeny spanning all land plants (31,749 tips) and WGD events inferred from analyses of the 1000 Plants (1KP) transcriptome data, we analyzed the association of WGDs and diversification rates following numerous WGD events across the angiosperms. We used a stepwise AIC approach (MEDUSA), a Bayesian mixture model approach (BAMM), and state-dependent diversification analyses (MuSSE) to investigate patterns of diversification. Sister-clade comparisons were used to investigate species richness after WGDs. Based on the density of 1KP taxon sampling, 106 WGDs were unambiguously placed on the angiosperm phylogeny. We identified 334-530 shifts in diversification rates. We found that 61 WGD events were tightly linked to changes in diversification rates, and state-dependent diversification analyses indicated higher speciation rates for subsequent rounds of WGD. Additionally, 70 of 99 WGD events showed an increase in species richness compared to the sister clade. Forty-six of the 106 WGDs analyzed appear to be closely associated with upshifts in the rate of diversification in angiosperms. Shifts in diversification do not appear more likely than random within a four-node lag phase following a WGD; however, younger WGD events are more likely to be followed by an upshift in diversification than older WGD events. © 2018 Botanical Society of America.

  19. Differences in the rates of patient safety events by payer: implications for providers and policymakers.

    Science.gov (United States)

    Spencer, Christine S; Roberts, Eric T; Gaskin, Darrell J

    2015-06-01

The reduction of adverse patient safety events and the equitable treatment of patients in hospitals are clinical and policy priorities. Health services researchers have identified disparities in the quality of care provided to patients, both by demographic characteristics and insurance status. However, less is known about the extent to which disparities reflect differences in the places where patients obtain care, versus disparities in the quality of care provided to different groups of patients in the same hospital. In this study, we examine whether the rate of adverse patient safety events differs by the insurance status of patients within the same hospital. Using discharge data from hospitals in 11 states, we compared risk-adjusted rates for 13 AHRQ Patient Safety Indicators by Medicare, Medicaid, and Private payer insurance status, within the same hospitals. We used multivariate regression to assess the relationship between insurance status and rates of adverse patient safety events within hospitals. Medicare and Medicaid patients experienced significantly more adverse safety events than private pay patients for 12 and 7 Patient Safety Indicators, respectively (at P < 0.05), and had significantly lower event rates than private payers on 2 Patient Safety Indicators. Risk-adjusted Patient Safety Indicator rates varied with patients' insurance within the same hospital. More research is needed to determine the cause of differences in care quality received by patients at the same hospital, especially if quality measures are to be used for payment.

  20. Heart rate detection from an electronic weighing scale

    International Nuclear Information System (INIS)

    González-Landaeta, R; Casas, O; Pallàs-Areny, R

    2008-01-01

We propose a novel technique for beat-to-beat heart rate detection based on the ballistocardiographic (BCG) force signal from a subject standing on a common electronic weighing scale. The detection relies on sensing force variations related to the blood acceleration in the aorta, works even if wearing footwear and does not require any sensors attached to the body because it uses the load cells in the scale. We have devised an approach to estimate the sensitivity and frequency response of three commercial weighing scales to assess their capability to detect the BCG force signal. Static sensitivities ranged from 490 nV V⁻¹ N⁻¹ to 1670 nV V⁻¹ N⁻¹. The frequency response depended on the subject's mass but it was broad enough for heart rate estimation. We have designed an electronic pulse detection system based on off-the-shelf integrated circuits to sense heart-beat-related force variations of about 0.24 N. The signal-to-noise ratio of the main peaks of the force signal detected was higher than 30 dB. A Bland–Altman plot was used to compare the RR time intervals estimated from the ECG and BCG force signals for 17 volunteers. The error was ±21 ms, which makes the proposed technique suitable for short-term monitoring of the heart rate.

  1. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    Science.gov (United States)

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers in effectively detecting and classifying different types of fall and non-fall events. It was discovered that the first level of the proposed hierarchical decision tree algorithm implements fall detection using fifth-order cumulants and support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
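A compact sketch in the spirit of the abstract: second- and fifth-order cumulants estimated from central moments of each acceleration window, fed to an SVM. The windowing, toy signals, and two-feature set are illustrative assumptions; the paper's hierarchical decision tree and full feature comparison are not reproduced.

```python
# Hedged sketch: cumulants from central moments (kappa2 = m2, kappa5 = m5 - 10*m3*m2)
# as features for an SVM fall/non-fall classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def cumulant_features(x):
    d = np.asarray(x, dtype=float)
    d = d - d.mean()
    m2, m3, m5 = (np.mean(d ** p) for p in (2, 3, 5))
    return [m2, m5 - 10.0 * m3 * m2]          # [kappa2, kappa5]

rng = np.random.default_rng(0)
def window(fall):
    base = np.abs(rng.normal(1.0, 0.05, size=200))             # ~1 g daily activity
    if fall:
        base[100:110] += rng.uniform(2.0, 4.0, size=10)         # short impact burst
    return base

X = np.array([cumulant_features(window(f)) for f in [1] * 100 + [0] * 100])
y = np.array([1] * 100 + [0] * 100)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)
print("hold-out accuracy:", clf.score(Xte, yte))
```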

  2. Adenoma detection rate varies greatly during colonoscopy training

    NARCIS (Netherlands)

    van Doorn, Sascha C.; Klanderman, Robert B.; Hazewinkel, Yark; Fockens, Paul; Dekker, Evelien

    2015-01-01

    The adenoma detection rate (ADR) is considered the most important quality indicator for colonoscopy and varies widely among colonoscopists. It is unknown whether the ADR of gastroenterology consultants can already be predicted during their colonoscopy training. To evaluate the ADR of fellows in

  3. EUROCAT website data on prenatal detection rates of congenital anomalies

    NARCIS (Netherlands)

    Garne, Ester; Dolk, Helen; Loane, Maria; Boyd, Patricia A.

    2010-01-01

    The EUROCAT website www.eurocat-network.eu publishes prenatal detection rates for major congenital anomalies using data from European population-based congenital anomaly registers, covering 28% of the EU population as well as non-EU countries. Data are updated annually. This information can be

  4. Novel Method For Low-Rate Ddos Attack Detection

    Science.gov (United States)

    Chistokhodova, A. A.; Sidorov, I. D.

    2018-05-01

    The relevance of the work is associated with an increasing number of advanced types of DDoS attacks, in particular, low-rate HTTP-flood. Last year, the power and complexity of such attacks increased significantly. The article is devoted to the analysis of DDoS attacks detecting methods and their modifications with the purpose of increasing the accuracy of DDoS attack detection. The article details low-rate attacks features in comparison with conventional DDoS attacks. During the analysis, significant shortcomings of the available method for detecting low-rate DDoS attacks were found. Thus, the result of the study is an informal description of a new method for detecting low-rate denial-of-service attacks. The architecture of the stand for approbation of the method is developed. At the current stage of the study, it is possible to improve the efficiency of an already existing method by using a classifier with memory, as well as additional information.

  5. EUROCAT website data on prenatal detection rates of congenital anomalies

    DEFF Research Database (Denmark)

    Garne, Ester; Dolk, Helen; Loane, Maria

    2010-01-01

    The EUROCAT website www.eurocat-network.eu publishes prenatal detection rates for major congenital anomalies using data from European population-based congenital anomaly registers, covering 28% of the EU population as well as non-EU countries. Data are updated annually. This information can be us...

  6. Object-Oriented Query Language For Events Detection From Images Sequences

    Science.gov (United States)

    Ganea, Ion Eugen

    2015-09-01

This paper presents a method to represent the events extracted from image sequences and the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of the index classes in order to improve the accuracy of the query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detection of incidents, the final goal of the research.

  7. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    Science.gov (United States)

    2016-06-01

Master's thesis, June 2016; approved for public release, distribution is unlimited. The platform, Simkit, was utilized to create a discrete event simulation (DES) model of the Polaris 2.1 gamma ray imaging radiation detection device. After carefully constructing the DES

  8. Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems

    Science.gov (United States)

    2009-03-01

Sauvageon, Agogino, Mehr, and Tumer [2006], for instance, use a fourth-degree polynomial within an event detection algorithm to sense high... cancer, and coronary artery disease. His study examines the age at which to begin screening exams, the intervals between the exams, and (possibly... AM, Mehr AF, and Tumer IY. 2006. "Comparison of Event Detection Methods for Centralized Sensor Networks." IEEE Sensors Applications Symposium 2006.

  9. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    Science.gov (United States)

    Lee, Seong-Hun

    2014-11-01

There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for four crops' 12 events: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can be used for screening, gene-specific and event-specific analysis of biotech crops as efficient detection methods, saving workload and time. © 2014 Society of Chemical Industry.

  10. Detecting Smoking Events Using Accelerometer Data Collected Via Smartwatch Technology: Validation Study.

    Science.gov (United States)

    Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F; Valafar, Homayoun

    2017-12-13

Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. This study aimed to examine the feasibility of detecting smoking behavior using smartwatches. The second aim of this study was to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, for which they logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model was successful in detecting 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from the participants that did not adhere to study protocols, the true positive detection rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, resulting in a 2.8% false positive rate. Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self

  11. An integrated logit model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, for integrating the single alarms. The discrete choice model is jointly calibrated with other components of the event detection system framework on a training data set using genetic algorithms. The fusing of the individual indicator probabilities, which is left out of focus in many existing event detection system models, is confirmed to be a crucial part of the system and can be modelled with a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
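A sketch of the fusion step only, assuming per-indicator outlier scores are already available: a logistic (logit) model maps the vector of single-indicator scores to one event probability instead of combining alarms with simple heuristics. The synthetic scores and the use of scikit-learn's LogisticRegression are assumptions; the paper estimates its discrete choice model jointly with the rest of the system via genetic algorithms.

```python
# Hedged sketch: logit fusion of per-indicator alarm scores into one probability.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
event = rng.random(n) < 0.05                                   # rare contamination events
# per-indicator outlier scores in [0, 1]; events raise all three indicators
scores = rng.beta(1, 8, size=(n, 3))
scores[event] = rng.beta(6, 2, size=(event.sum(), 3))

fusion = LogisticRegression().fit(scores, event.astype(int))
new_reading = np.array([[0.7, 0.65, 0.8]])
print("fused event probability:", fusion.predict_proba(new_reading)[0, 1])
```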

  12. Energy-Efficient Fault-Tolerant Dynamic Event Region Detection in Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Enemark, Hans-Jacob; Zhang, Yue; Dragoni, Nicola

    2015-01-01

Fault-tolerant event detection is fundamental to wireless sensor network applications. Existing approaches usually adopt neighborhood collaboration for better detection accuracy, while needing more energy consumption due to communication. Focusing on energy efficiency, this paper makes an improvement to a hybrid algorithm for dynamic event region detection, such as real-time tracking of chemical leakage regions. Considering the characteristics of moving-away dynamic events, we propose a return-back condition for the hybrid algorithm from distributed neighborhood collaboration, in which a node makes its detection decision based on decisions received from its spatial and temporal neighbors, to local non-communicative decision making. The simulation results demonstrate that the improved algorithm does not degrade the detection accuracy of the original algorithm, while it has better energy...

  13. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    Science.gov (United States)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was
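For reference, a minimal sketch of the conventional STA/LTA trigger that the study evaluates as a baseline; the window lengths, trigger ratio, and synthetic trace are illustrative, not the Farnsworth Field configuration.

```python
# Hedged sketch: short-term-average / long-term-average (STA/LTA) event trigger.
import numpy as np

def sta_lta(x, fs, sta_win=0.5, lta_win=10.0):
    x2 = np.asarray(x, dtype=float) ** 2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(x2, np.ones(ns) / ns, mode="same")
    lta = np.convolve(x2, np.ones(nl) / nl, mode="same")
    return sta / np.maximum(lta, 1e-12)

fs = 100
rng = np.random.default_rng(0)
trace = rng.normal(size=60 * fs)
trace[3000:3100] += 8 * np.sin(2 * np.pi * 10 * np.arange(100) / fs)  # small event at t = 30 s

ratio = sta_lta(trace, fs)
triggers = np.flatnonzero(ratio > 4.0)
print("first trigger at t =", triggers[0] / fs if triggers.size else None, "s")
```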

  14. Gamma-ray dose rate increase at rainfall events and their air-mass origins

    International Nuclear Information System (INIS)

    Iyogi, Takashi; Hisamatsu, Shun'ichi; Inaba, Jiro

    2007-01-01

    The environmental γ-ray dose rate and precipitation rates were measured at our institute, in Rokkasho, Aomori, Japan. We analyzed 425 rainfall events in which the precipitation rate was over 0.5 mm from April through November during the years 2003 to 2005. Backward trajectories for 5 d starting from 1000 m above Rokkasho at the time of the maximum dose rate in a rainfall event, were calculated by using the HYSPLIT model of the NOAA Air Resources Laboratory. The trajectories for 5 d were classified by visual inspection according to the passage areas; Pacific Ocean, Asian Continent and Japan Islands. The increase of cumulative environmental γ-ray dose during a rainfall event was plotted against the precipitation in the event, and their relationship was separately examined according to the air-mass passage area, i.e. origin of the air-mass. Our results showed that the origin of air-mass was an important factor affecting the increase of environmental γ-ray dose rate by rainfall. (author)

  15. QRS peak detection for heart rate monitoring on Android smartphone

    Science.gov (United States)

    Pambudi Utomo, Trio; Nuryani, Nuryani; Darmanto

    2017-11-01

In this study, an Android smartphone is used for heart rate monitoring and for displaying the electrocardiogram (ECG) graph. Heart rate determination is based on QRS peak detection. Two methods are studied to detect the QRS complex peak: Peak Threshold and Peak Filter. The acquisition of ECG data uses an AD8232 module from Analog Devices, three electrodes, and a Microcontroller Arduino UNO R3. To record the ECG data from a patient, three electrodes are attached to particular surfaces of the patient's body. The patient's heart activity recorded by the AD8232 module is read by the Arduino UNO R3 as analog data. Then, the analog data are converted into a voltage value (mV) and processed to obtain the QRS complex peak. The heart rate value is calculated by the Microcontroller Arduino UNO R3 using the QRS complex peak. Voltage, heart rate, and the QRS complex peak are sent to the Android smartphone by Bluetooth HC-05. The ECG data are displayed as a graph by the Android smartphone. To evaluate the performance of the QRS complex peak detection methods, three parameters are used: positive predictive value, accuracy and sensitivity. The positive predictive value, accuracy, and sensitivity of the Peak Threshold method are 92.39%, 70.30%, 74.62%, and those of the Peak Filter method are 98.38%, 82.47%, 83.61%, respectively.
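A hedged sketch of the Peak Threshold idea: mark samples above a fixed threshold, enforce a refractory period so each QRS complex yields one peak, and convert the RR intervals to beats per minute. The threshold, refractory period, and toy ECG are assumptions, not the values used with the AD8232 data.

```python
# Hedged sketch: threshold-based QRS peak detection and heart rate estimation.
import numpy as np

def detect_qrs(ecg_mv, fs, threshold_mv=0.6, refractory_s=0.25):
    x = np.asarray(ecg_mv, dtype=float)
    peaks, last = [], -np.inf
    for i in np.flatnonzero(x > threshold_mv):
        if (i - last) / fs > refractory_s:        # one peak per QRS complex
            peaks.append(i)
            last = i
    return np.array(peaks)

def heart_rate_bpm(peaks, fs):
    rr = np.diff(peaks) / fs                      # RR intervals in seconds
    return 60.0 / rr.mean() if rr.size else float("nan")

# toy ECG: 1 mV spikes at 75 beats per minute on a noisy baseline
fs, dur = 250, 10
t = np.arange(0, dur, 1 / fs)
ecg = 0.05 * np.random.default_rng(0).normal(size=t.size)
ecg[(np.arange(0, dur, 0.8) * fs).astype(int)] += 1.0
peaks = detect_qrs(ecg, fs)
print("estimated heart rate: %.1f bpm" % heart_rate_bpm(peaks, fs))
```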

  16. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    Science.gov (United States)

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

    One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection correlating to approximately 1-10 copies of rice haploid genomes. In this quantitative PCR assay, the square regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A Generalized Approach to Model the Spectra and Radiation Dose Rate of Solar Particle Events on the Surface of Mars

    Science.gov (United States)

    Guo, Jingnan; Zeitlin, Cary; Wimmer-Schweingruber, Robert F.; McDole, Thoren; Kühl, Patrick; Appel, Jan C.; Matthiä, Daniel; Krauss, Johannes; Köhler, Jan

    2018-01-01

    For future human missions to Mars, it is important to study the surface radiation environment during extreme and elevated conditions. In the long term, it is mainly galactic cosmic rays (GCRs) modulated by solar activity that contribute to the radiation on the surface of Mars, but intense solar energetic particle (SEP) events may induce acute health effects. Such events may enhance the radiation level significantly and should be detected as immediately as possible to prevent severe damage to humans and equipment. However, the energetic particle environment on the Martian surface is significantly different from that in deep space due to the influence of the Martian atmosphere. Depending on the intensity and shape of the original solar particle spectra, as well as particle types, the surface spectra may induce entirely different radiation effects. In order to give immediate and accurate alerts while avoiding unnecessary ones, it is important to model and well understand the atmospheric effect on the incoming SEPs, including both protons and helium ions. In this paper, we have developed a generalized approach to quickly model the surface response of any given incoming proton/helium ion spectra and have applied it to a set of historical large solar events, thus providing insights into the possible variety of surface radiation environments that may be induced during SEP events. Based on the statistical study of more than 30 significant solar events, we have obtained an empirical model for estimating the surface dose rate directly from the intensities of a power-law SEP spectra.

  18. Impact of sensor detection limits on protecting water distribution systems from contamination events

    International Nuclear Information System (INIS)

    McKenna, Sean Andrew; Hart, David Blaine; Yarrington, Lane

    2006-01-01

    Real-time water quality sensors are becoming commonplace in water distribution systems. However, field deployable, contaminant-specific sensors are still in the development stage. As development proceeds, the necessary operating parameters of these sensors must be determined to protect consumers from accidental and malevolent contamination events. This objective can be quantified in several different ways including minimization of: the time necessary to detect a contamination event, the population exposed to contaminated water, the extent of the contamination within the network, and others. We examine the ability of a sensor set to meet these objectives as a function of both the detection limit of the sensors and the number of sensors in the network. A moderately sized distribution network is used as an example and different sized sets of randomly placed sensors are considered. For each combination of a certain number of sensors and a detection limit, the mean values of the different objectives across multiple random sensor placements are calculated. The tradeoff between the necessary detection limit in a sensor and the number of sensors is evaluated. Results show that for the example problem examined here, a sensor detection limit of 0.01 of the average source concentration is adequate for maximum protection. Detection of events is dependent on the detection limit of the sensors, but for those events that are detected, the values of the performance measures are not a function of the sensor detection limit. The results of replacing a single sensor in a network with a sensor having a much lower detection limit show that while this replacement can improve results, the majority of the additional events detected had performance measures of relatively low consequence.

  19. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  20. Insertable cardiac event recorder in detection of atrial fibrillation after cryptogenic stroke: an audit report.

    Science.gov (United States)

    Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas

    2013-07-01

    Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder for the detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke and eligibility for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that a cardiac event recorder could be inserted in approximately one third of patients with cryptogenic stroke and that it detected new AF in approximately one quarter of these patients.

  1. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy-ratio and polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
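
    For concreteness, here is a minimal sketch of the kind of single-hidden-layer feed-forward network described above, applied to four moving-window features (mean absolute amplitude, variance, energy ratio, and a crude stand-in for polarization rectilinearity). The weights are random placeholders standing in for a trained network, and the window length, layer size, and feature definitions are assumptions rather than the authors' configuration.

      import numpy as np

      rng = np.random.default_rng(0)

      def sigmoid(x):
          return 1.0 / (1.0 + np.exp(-x))

      # Placeholder weights for a 4-input, 8-neuron hidden layer, 1-output network;
      # in the described workflow these would come from training on labelled windows.
      W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
      W2, b2 = rng.normal(size=(1, 8)), np.zeros(1)

      def window_features(w):
          """Four features of one waveform window."""
          half = len(w) // 2
          e1, e2 = np.sum(w[:half] ** 2), np.sum(w[half:] ** 2)
          lag1 = np.abs(np.corrcoef(w[:-1], w[1:])[0, 1])  # stand-in for rectilinearity
          return np.array([np.mean(np.abs(w)), np.var(w), e2 / (e1 + 1e-12), lag1])

      def event_probability(w):
          """Forward pass giving a relative event probability for one window."""
          h = sigmoid(W1 @ window_features(w) + b1)
          return float(sigmoid(W2 @ h + b2))

      # Sliding-window scan over a synthetic noisy trace containing one weak event.
      trace = rng.normal(scale=0.1, size=2000)
      trace[1200:1260] += 0.5 * np.sin(2 * np.pi * 0.2 * np.arange(60))
      probs = [event_probability(trace[i:i + 100]) for i in range(0, len(trace) - 100, 50)]
      print("maximum relative probability:", round(max(probs), 3))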

  2. Real-time detection and classification of anomalous events in streaming data

    Science.gov (United States)

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.

  3. Adverse Event Rates Associated with Transforaminal and Interlaminar Epidural Steroid Injections: A Multi-Institutional Study.

    Science.gov (United States)

    El-Yahchouchi, Christine A; Plastaras, Christopher T; Maus, Timothy P; Carr, Carrie M; McCormick, Zachary L; Geske, Jennifer R; Smuck, Matthew; Pingree, Matthew J; Kennedy, David J

    2016-02-01

    Transforaminal epidural steroid injections (TFESI) have demonstrated efficacy and effectiveness in treatment of radicular pain. Despite little evidence of efficacy/effectiveness, interlaminar epidural steroid injections (ILESI) are advocated by some as primary therapy for radicular pain due to purported greater safety. To assess immediate and delayed adverse event rates of TFESI and ILESI injections at three academic medical centers utilizing International Spine Intervention Society practice guidelines. Quality assurance databases from a Radiology and two physical medicine and rehabilitation (PM&R) practices were interrogated. Medical records were reviewed, verifying immediate and delayed adverse events. There were no immediate major adverse events of neurologic injury or hemorrhage in 16,638 consecutive procedures in all spine segments (14,956 TFESI; 1,682 ILESI). Vasovagal reactions occurred in 1.2% of procedures, more frequently (P = 0.004) in TFESI (1.3%) than ILESI (0.5%). Dural punctures occurred in 0.06% of procedures, more commonly after ILESI (0.2% vs 0.04%, P = 0.006). Delayed follow-up of PM&R patients (92.5% and 78.5%, next business day) and radiology patients (63.1%, 2 weeks) identified no major adverse events of neurologic injury, hemorrhage, or infection. There were no significant differences in delayed minor adverse event rates. Central steroid response (sleeplessness, flushing, nonpositional headache) was seen in 2.6% of both TFESI and ILESI patients. 2.1% of TFESI and 1.8% of ILESI patients reported increased pain. No long-term sequelae were seen from any immediate or delayed minor adverse event. Both transforaminal and ILESI are safely performed with low immediate and delayed adverse event rates when informed by evidence-based procedural guidelines. By demonstrating comparable safety, this study suggests that the choice between ILESI and TFESIs can be based on documented efficacy and effectiveness and not driven by safety concerns.

  4. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    International Nuclear Information System (INIS)

    Semmens, Kathryn Alese; Ramage, Joan; Bartsch, Annett; Liston, Glen E

    2013-01-01

    High latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite detected events were validated with ground station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and weather station data. All melt events coincided with above freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change

  5. High cardiovascular event rates in patients with asymptomatic carotid stenosis: the REACH Registry

    DEFF Research Database (Denmark)

    Aichner, F T; Topakian, R; Alberts, M J

    2009-01-01

    BACKGROUND AND PURPOSE: Data on current cardiovascular event rates in patients with asymptomatic carotid artery stenosis (ACAS) are sparse. We compared the 1-year outcomes of patients with ACAS > or =70% versus patients without ACAS in an international, prospective cohort of outpatients with or a...

  6. Measurement Differences from Rating Posttraumatic Stress Disorder Symptoms in Response to Differentially Distressing Traumatic Events

    Science.gov (United States)

    Elhai, Jon D.; Fine, Thomas H.

    2012-01-01

    The authors explored differences in posttraumatic stress disorder (PTSD) symptoms as a result of rating symptoms from two separate, differentially distressing traumatic events. In an initial sample of 400 nonclinical participants, the authors inquired through a web survey about previous psychological trauma, instructing participants to nominate…

  7. Ratings of Severity of Life Events by Ninth-Grade Students.

    Science.gov (United States)

    Hutton, Jerry B.; And Others

    1987-01-01

    Special education, basic, and honors ninth-grade students (n=60) rated the severity of stress for each of the life events on the Source of Stress Inventory (Chandler, 1981). There was a significant positive relationship between the Chandler rankings (teachers and mental health workers) and the student rankings. (Author/NB)

  8. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in a series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for stochastic performance evaluation of degraded nuclear piping systems. It was accomplished here in three distinct stages. First, a statistical analysis was conducted to characterize various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as material properties of pipe, crack morphology variables, and location of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate performance of degraded piping systems. It is based on accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, statistical characterization of various input variables, and state-of-the-art methods of modern structural reliability theory. From this model, the conditional probability of failure as a function of leak-rate detection capability of the piping systems can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. Results clearly showed that the model provides satisfactory estimates of conditional failure probability with much less computational effort when compared with those obtained from Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications.

  9. 123I-MIBG imaging detects cardiac involvement and predicts cardiac events in Churg-Strauss syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Horiguchi, Yoriko; Morita, Yukiko [National Hospital Organization Sagamihara National Hospital, Department of Cardiology, Sagamihara City, Kanagawa (Japan); Tsurikisawa, Naomi; Akiyama, Kazuo [National Hospital Organization Sagamihara National Hospital, Clinical Research Centre for Allergy and Rheumatology, Sagamihara City, Kanagawa (Japan)

    2011-02-15

    In Churg-Strauss syndrome (CSS) it is important to detect cardiac involvement, which predicts poor prognosis. This study evaluated whether 123I-metaiodobenzylguanidine (MIBG) scintigraphy could detect cardiac damage and predict cardiac events in CSS. 123I-MIBG scintigraphy was performed in 28 patients with CSS, 12 of whom had cardiac involvement. The early and delayed heart to mediastinum ratio (early H/M and delayed H/M) and washout rate were calculated by using 123I-MIBG scintigraphy and compared with those in control subjects. Early H/M and delayed H/M were significantly lower and the washout rate was significantly higher in patients with cardiac involvement than in those without and in controls (early H/M, p = 0.0024, p = 0.0001; delayed H/M, p = 0.0002, p = 0.0001; washout rate, p = 0.0012, p = 0.0052 vs those without and vs controls, respectively). Accuracy for detecting cardiac involvement was 86% for delayed H/M and washout rate and 79% for early H/M and B-type natriuretic peptide (BNP). Kaplan-Meier analysis showed significantly lower cardiac event-free rates in patients with early H/M ≤ 2.18 and BNP > 21.8 pg/ml than those with early H/M > 2.18 and BNP ≤ 21.8 pg/ml (log-rank test p = 0.006). Cardiac sympathetic nerve function was damaged in CSS patients with cardiac involvement. 123I-MIBG scintigraphy was useful in detecting cardiac involvement and in predicting cardiac events. (orig.)

  10. 123I-MIBG imaging detects cardiac involvement and predicts cardiac events in Churg-Strauss syndrome

    International Nuclear Information System (INIS)

    Horiguchi, Yoriko; Morita, Yukiko; Tsurikisawa, Naomi; Akiyama, Kazuo

    2011-01-01

    In Churg-Strauss syndrome (CSS) it is important to detect cardiac involvement, which predicts poor prognosis. This study evaluated whether 123 I-metaiodobenzylguanidine (MIBG) scintigraphy could detect cardiac damage and predict cardiac events in CSS. 123 I-MIBG scintigraphy was performed in 28 patients with CSS, 12 of whom had cardiac involvement. The early and delayed heart to mediastinum ratio (early H/M and delayed H/M) and washout rate were calculated by using 123 I-MIBG scintigraphy and compared with those in control subjects. Early H/M and delayed H/M were significantly lower and the washout rate was significantly higher in patients with cardiac involvement than in those without and in controls (early H/M, p = 0.0024, p = 0.0001; delayed H/M, p = 0.0002, p = 0.0001; washout rate, p = 0.0012, p = 0.0052 vs those without and vs controls, respectively). Accuracy for detecting cardiac involvement was 86% for delayed H/M and washout rate and 79% for early H/M and B-type natriuretic peptide (BNP). Kaplan-Meier analysis showed significantly lower cardiac event-free rates in patients with early H/M ≤ 2.18 and BNP > 21.8 pg/ml than those with early H/M > 2.18 and BNP ≤ 21.8 pg/ml (log-rank test p = 0.006). Cardiac sympathetic nerve function was damaged in CSS patients with cardiac involvement. 123 I-MIBG scintigraphy was useful in detecting cardiac involvement and in predicting cardiac events. (orig.)

  11. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz: 1/N and 1/T, where N is the number of demands during the test period for no failures and T is the test period for no failures. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary.
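
    To make the 1/T rule of thumb concrete, here is a hedged sketch of a gamma-conjugate Bayesian update for a failure rate when zero failures have been observed over a test period T. The choice of a weak Gamma prior below is my assumption for illustration, not necessarily the prior used by the authors.

      from scipy.stats import gamma

      def posterior_rate_summary(k_failures, T_hours, a0=1.0, b0=1e-6):
          """Gamma(a0, b0) prior (shape/rate) on a Poisson failure rate; after observing
          k failures in time T the posterior is Gamma(a0 + k, b0 + T)."""
          a, b = a0 + k_failures, b0 + T_hours
          mean = a / b                                  # posterior mean failure rate
          upper95 = gamma.ppf(0.95, a, scale=1.0 / b)   # one-sided 95% credible bound
          return mean, upper95

      # Zero failures in 10,000 h of testing: the posterior mean reproduces the 1/T rule.
      mean, upper95 = posterior_rate_summary(0, 10_000)
      print(f"posterior mean = {mean:.2e} per hour, 95% upper bound = {upper95:.2e} per hour")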

  12. Detection of water-quality contamination events based on multi-sensor fusion using an extended Dempster–Shafer method

    International Nuclear Information System (INIS)

    Hou, Dibo; He, Huimei; Huang, Pingjie; Zhang, Guangxin; Loaiciga, Hugo

    2013-01-01

    This study presents a method for detecting contamination events of sources of drinking water based on the Dempster–Shafer (D-S) evidence theory. The detection method has the purpose of protecting water supply systems against accidental and intentional contamination events. This purpose is achieved by first predicting future water-quality parameters using an autoregressive (AR) model. The AR model predicts future water-quality parameters using recent measurements of these parameters made with automated (on-line) water-quality sensors. Next, a probabilistic method assigns probabilities to the time series of residuals formed by comparing predicted water-quality parameters with threshold values. Finally, the D-S fusion method searches for anomalous probabilities of the residuals and uses the result of that search to determine whether the current water quality is normal (that is, free of pollution) or contaminated. The D-S fusion method is extended and improved in this paper by weighted averaging of water-contamination evidence and by the analysis of the persistence of anomalous probabilities of water-quality parameters. The extended D-S fusion method makes determinations that have a high probability of being correct concerning whether or not a source of drinking water has been contaminated. This paper's method for detecting water-contamination events was tested with water-quality time series from automated (on-line) water quality sensors. In addition, a small-scale, experimental, water-pipe network was tested to detect water-contamination events. The two tests demonstrated that the extended D-S fusion method achieves a low false alarm rate and high probabilities of detecting water contamination events. (paper)
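
    The core of the fusion step described above is Dempster's rule of combination. The following minimal sketch combines evidence from two water-quality indicators over the frame {normal, contaminated}; the mass values are invented for illustration, and the paper's extensions (weighted averaging of evidence and persistence analysis of anomalous probabilities) are not reproduced here.

      def dempster_combine(m1, m2):
          """Dempster's rule for two basic probability assignments given as dicts that
          map frozensets of hypotheses to masses; the full frame represents ignorance."""
          combined, conflict = {}, 0.0
          for a, ma in m1.items():
              for b, mb in m2.items():
                  inter = a & b
                  if inter:
                      combined[inter] = combined.get(inter, 0.0) + ma * mb
                  else:
                      conflict += ma * mb
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      N, C = frozenset({"normal"}), frozenset({"contaminated"})
      U = N | C  # mass assigned to the whole frame, i.e. "don't know"

      sensor1 = {C: 0.6, N: 0.1, U: 0.3}  # e.g. evidence from a residual-chlorine anomaly
      sensor2 = {C: 0.5, N: 0.2, U: 0.3}  # e.g. evidence from a turbidity anomaly
      print(dempster_combine(sensor1, sensor2))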

  13. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    Science.gov (United States)

    Banerjee, Torsha

    Detection (PERD) for WSNs. When a single event occurs, a child of the tree sends a Flagged Polynomial (FP) to its parent, if the readings approximated by it fall outside the data range defining the existing phenomenon. After the aggregation process is over, the root, having the two polynomials P and FP, can be queried for FP (approximating the new event region) instead of flooding the whole network. For multiple such events, instead of computing a polynomial corresponding to each new event, areas with the same data range are combined by the corresponding tree nodes and the aggregated coefficients are passed on. Results reveal that a new event can be detected by PERD while the error in detection remains constant and is less than a threshold of 10%. As the node density increases, accuracy and delay for event detection are found to remain almost constant, making PERD highly scalable. Whenever an event occurs in a WSN, data are generated by nearby sensors, and relaying the data to the base station (BS) makes sensors closer to the BS run out of energy at a much faster rate than sensors in other parts of the network. This gives rise to an unequal distribution of residual energy in the network and makes those sensors with lower remaining energy levels die at a much faster rate than others. We propose a scheme for enhancing network lifetime using mobile cluster heads (CHs) in a WSN. To maintain the remaining energy more evenly, some energy-rich nodes are designated as CHs, which move in a controlled manner towards sensors rich in energy and data. This eliminates the multihop transmission required by the static sensors and thus increases the overall lifetime of the WSN. We combine the idea of clustering and mobile CHs to first form clusters of static sensor nodes. A collaborative strategy among the CHs further increases the lifetime of the network. The time taken for transmitting data to the BS is reduced further by making the CHs follow a connectivity strategy that always maintains a connected path to the BS

  14. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  15. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Sofia Maria Dima

    2016-01-01

    Full Text Available Event detection in realistic WSN environments is a critical research domain, and environmental monitoring comprises one of its most pronounced applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation on the performance of such systems when applied in wireless environments. Aiming at addressing this shortage, in this paper an advanced multimodal approach based on fuzzy logic is followed. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming towards power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming towards network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.

  16. A semi-automated method for rapid detection of ripple events on interictal voltage discharges in the scalp electroencephalogram.

    Science.gov (United States)

    Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A

    2017-02-01

    High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
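
    A hedged sketch of the first step described above, flagging candidate intervals in which high-frequency activity is elevated, using a ripple-band filter and an envelope threshold. The band edges, threshold, minimum duration, and sampling rate are illustrative assumptions, and the seven features of the second step are not implemented.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      FS = 1000  # assumed scalp-EEG sampling rate in Hz

      def candidate_ripple_intervals(eeg, band=(80.0, 200.0), z_thresh=3.0, min_ms=30):
          """(start, stop) sample indices where the ripple-band envelope stays more than
          z_thresh standard deviations above its mean for at least min_ms."""
          b, a = butter(4, band, btype="bandpass", fs=FS)
          envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
          z = (envelope - envelope.mean()) / envelope.std()
          intervals, start = [], None
          for i, above in enumerate(z > z_thresh):
              if above and start is None:
                  start = i
              elif not above and start is not None:
                  if (i - start) * 1000 / FS >= min_ms:
                      intervals.append((start, i))
                  start = None
          return intervals

      # Synthetic 10 s background with one embedded 120 Hz burst of 80 ms.
      rng = np.random.default_rng(1)
      t = np.arange(0, 10, 1 / FS)
      eeg = rng.normal(scale=1.0, size=t.size)
      burst = (t > 4.0) & (t < 4.08)
      eeg[burst] += 5.0 * np.sin(2 * np.pi * 120 * t[burst])
      print(candidate_ripple_intervals(eeg))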

  17. [Incidence rate of adverse reaction/event by Qingkailing injection: a Meta-analysis of single rate].

    Science.gov (United States)

    Ai, Chun-ling; Xie, Yan-ming; Li, Ming-quan; Wang, Lian-xin; Liao, Xing

    2015-12-01

    To systematically review the incidence rate of adverse drug reactions/events associated with Qingkailing injection, databases including PubMed, EMbase, the Cochrane Library, CNKI, VIP, WanFang Data, and CBM were searched from their inception to July 30, 2015. Two reviewers independently screened the literature according to the inclusion and exclusion criteria, extracted the data, and cross-checked it. Meta-analysis was then performed with the R 3.2.0 software, and subgroup sensitivity analyses were performed based on age, mode of medication, observation time, and research quality. Sixty-three studies involving 9,793 patients treated with Qingkailing injection were included, and 367 cases of adverse reactions/events were reported in total. The incidence rate of adverse reactions affecting the skin and mucosa was 2% [95% CI (0.02; 0.03)]; of digestive system adverse reactions, 6% [95% CI (0.05; 0.07)]; and of injection-site adverse reactions, 4% [95% CI (0.02; 0.07)]. For digestive system adverse reactions/events, the incidence in children and adults was 4.6% [0.021 1; 0.097 7] and 6.9% [0.053 5; 0.089 8], respectively. For adverse reactions/events mainly involving skin and mucous membrane damage, the incidence with observation times > 7 days and ≤ 7 days was 3% [0.012 9; 0.068 3] and 1.9% [0.007 8; 0.046 1], respectively. Subgroup analysis showed that, for the different types of adverse reactions/events, the incidence with combination therapy was higher than with the single drug, and the difference was statistically significant. The occurrence of adverse reactions is influenced by factors such as drug combination and age, and these influencing factors vary across populations; therefore, clinicians should exercise particular care when prescribing Qingkailing injection for children and the elderly and should implement individualized medication.

  18. Detection of microsleep events in a car driving simulation study using electrocardiographic features

    Directory of Open Access Journals (Sweden)

    Lenis Gustavo

    2016-09-01

    Full Text Available Microsleep events (MSE) are short intrusions of sleep under the demand of sustained attention. They can pose a major threat to safety while driving a car and are considered one of the most significant causes of traffic accidents. Driver’s fatigue and MSE account for up to 20% of all car crashes in Europe and at least 100,000 accidents in the US every year. Unfortunately, there is no standardized test to quantify the degree of vigilance of a driver. To account for this problem, different approaches based on biosignal analysis have been studied in the past. In this paper, we investigate an electrocardiographic-based detection of MSE using morphological and rhythmical features. 14 records from a car driving simulation study with a high incidence of MSE were analyzed, and the behavior of the ECG features before and after an MSE in relation to reference baseline values (without drowsiness) was investigated. The results show that MSE cannot be detected (or predicted) using only the ECG. However, in the presence of MSE, the rhythmical and morphological features were observed to be significantly different from the ones calculated for the reference signal without sleepiness. In particular, when MSE were present, the heart rate diminished while the heart rate variability increased. The time distances between the P wave and the R peak, and between the R peak and the T wave, as well as their dispersion, also increased. This demonstrates a noticeable change in the autonomic regulation of the heart. In the future, these ECG parameters could be used as a surrogate measure of fatigue.
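
    As a minimal illustration of the rhythmical features mentioned above, the sketch below computes mean heart rate and two common heart-rate-variability measures (SDNN and RMSSD) from R-peak times for a baseline segment and a "drowsy" segment. The R-peak times are synthetic, and this reduced feature set is only a simplification of the ECG features analysed in the study.

      import numpy as np

      def hrv_features(r_peak_times_s):
          """Heart rate and simple HRV measures from R-peak times given in seconds."""
          rr = np.diff(r_peak_times_s)  # R-R intervals in seconds
          return {
              "mean_hr_bpm": 60.0 / rr.mean(),
              "sdnn_ms": 1000.0 * rr.std(ddof=1),                       # overall variability
              "rmssd_ms": 1000.0 * np.sqrt(np.mean(np.diff(rr) ** 2)),  # beat-to-beat variability
          }

      rng = np.random.default_rng(2)
      # Baseline: ~75 bpm with modest variability; "drowsy": slower and more variable,
      # mimicking the lower heart rate and higher HRV reported around microsleep events.
      baseline_rr = rng.normal(0.80, 0.02, size=120)
      drowsy_rr = rng.normal(0.95, 0.05, size=120)
      print("baseline:", hrv_features(np.cumsum(baseline_rr)))
      print("drowsy:  ", hrv_features(np.cumsum(drowsy_rr)))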

  19. Unfamiliar voice identification: Effect of post-event information on accuracy and voice ratings

    Directory of Open Access Journals (Sweden)

    Harriet Mary Jessica Smith

    2014-04-01

    Full Text Available This study addressed the effect of misleading post-event information (PEI) on voice ratings, identification accuracy, and confidence, as well as the link between verbal recall and accuracy. Participants listened to a dialogue between male and female targets, then read misleading information about voice pitch. Participants engaged in verbal recall, rated voices on a feature checklist, and made a lineup decision. Accuracy rates were low, especially on target-absent lineups. Confidence and accuracy were unrelated, but the number of facts recalled about the voice predicted later lineup accuracy. There was a main effect of misinformation on ratings of target voice pitch, but there was no effect on identification accuracy or confidence ratings. As voice lineup evidence from earwitnesses is used in courts, the findings have potential applied relevance.

  20. On the Mass and Luminosity Functions of Tidal Disruption Flares: Rate Suppression due to Black Hole Event Horizons

    Science.gov (United States)

    van Velzen, S.

    2018-01-01

    The tidal disruption of a star by a massive black hole is expected to yield a luminous flare of thermal emission. About two dozen of these stellar tidal disruption flares (TDFs) may have been detected in optical transient surveys. However, explaining the observed properties of these events within the tidal disruption paradigm is not yet possible. This theoretical ambiguity has led some authors to suggest that optical TDFs are due to a different process, such as a nuclear supernova or accretion disk instabilities. Here we present a test of a fundamental prediction of the tidal disruption event scenario: a suppression of the flare rate due to the direct capture of stars by the black hole. Using a recently compiled sample of candidate TDFs with black hole mass measurements, plus a careful treatment of selection effects in this flux-limited sample, we confirm that the dearth of observed TDFs from high-mass black holes is statistically significant. All the TDF impostor models we consider fail to explain the observed mass function; the only scenario that fits the data is a suppression of the rate due to direct captures. We find that this suppression can explain the low volumetric rate of the luminous TDF candidate ASASSN-15lh, thus supporting the hypothesis that this flare belongs to the TDF family. Our work is the first to present the optical TDF luminosity function. A steep power law is required to explain the observed rest-frame g-band luminosity, dN/dL_g ∝ L_g^-2.5. The mean event rate of the flares in our sample is ≈ 1 × 10^-4 galaxy^-1 yr^-1, consistent with the theoretically expected tidal disruption rate.

  1. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
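
    A hedged sketch of a clustering-based anomaly score in the spirit of the framework above: observations that lie far from their nearest cluster centroid are treated as outliers. The use of k-means, the number of clusters, and the percentile threshold are illustrative choices; the actual algorithm, its handling of missing data, and the interactive cloud tool are not reproduced.

      import numpy as np
      from sklearn.cluster import KMeans

      def anomaly_scores(X, n_clusters=2, random_state=0):
          """Distance of each observation to its nearest k-means centroid."""
          km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state).fit(X)
          return np.min(km.transform(X), axis=1)  # transform() gives distances to all centroids

      rng = np.random.default_rng(3)
      # Synthetic "pixels": two normal regimes plus a handful of anomalous observations.
      normal = np.vstack([rng.normal([0, 0], 0.3, (500, 2)), rng.normal([3, 3], 0.3, (500, 2))])
      anomalies = rng.normal([1.5, -2.0], 0.2, (5, 2))
      X = np.vstack([normal, anomalies])

      scores = anomaly_scores(X)
      threshold = np.percentile(scores, 99.5)
      print("flagged indices:", np.where(scores > threshold)[0])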

  2. Detecting Forest Disturbance Events from MODIS and Landsat Time Series for the Conterminous United States

    Science.gov (United States)

    Zhang, G.; Ganguly, S.; Saatchi, S. S.; Hagen, S. C.; Harris, N.; Yu, Y.; Nemani, R. R.

    2013-12-01

    Spatial and temporal patterns of forest disturbance and regrowth processes are key for understanding aboveground terrestrial vegetation biomass and carbon stocks at regional-to-continental scales. The NASA Carbon Monitoring System (CMS) program seeks key input datasets, especially information related to impacts due to natural/man-made disturbances in forested landscapes of the Conterminous U.S. (CONUS), that would reduce uncertainties in current carbon stock estimation and emission models. This study provides an end-to-end forest disturbance detection framework based on pixel time series analysis from MODIS (Moderate Resolution Imaging Spectroradiometer) and Landsat surface spectral reflectance data. We applied the BFAST (Breaks for Additive Seasonal and Trend) algorithm to the Normalized Difference Vegetation Index (NDVI) data for the time period from 2000 to 2011. A harmonic seasonal model was implemented in BFAST to decompose the time series into seasonal and interannual trend components in order to detect abrupt changes in the magnitude and direction of these components. To apply BFAST over the whole CONUS, we built a parallel computing setup for processing massive time-series data using the high performance computing facility of the NASA Earth Exchange (NEX). In the implementation process, we extracted the dominant deforestation events from the magnitude of abrupt changes in both seasonal and interannual components, and estimated dates for corresponding deforestation events. We estimated the recovery rate for deforested regions through regression models developed between NDVI values and time since disturbance for all pixels. A similar implementation of the BFAST algorithm was performed over selected Landsat scenes (all Landsat cloud free data was used to generate NDVI from atmospherically corrected spectral reflectances) to demonstrate the spatial coherence in retrieval layers between MODIS and Landsat. In future, the application of this largely parallel disturbance
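
    A hedged sketch of the season-plus-trend idea behind a BFAST-style analysis: fit a harmonic seasonal model with a linear trend to an NDVI series and flag the time at which the running mean of the residuals changes most abruptly. This is a deliberately crude stand-in for BFAST's structural break tests, with synthetic data and invented parameters, not the processing chain used in the study.

      import numpy as np

      def harmonic_design(t_years, n_harmonics=2):
          """Design matrix with intercept, linear trend, and annual harmonics."""
          cols = [np.ones_like(t_years), t_years]
          for k in range(1, n_harmonics + 1):
              cols += [np.sin(2 * np.pi * k * t_years), np.cos(2 * np.pi * k * t_years)]
          return np.column_stack(cols)

      def flag_break(t_years, ndvi, window=12):
          """Fit the season+trend model, then return the time where the running mean of
          the residuals jumps the most -- a crude stand-in for a structural break test."""
          X = harmonic_design(t_years)
          coef, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
          resid = ndvi - X @ coef
          running = np.convolve(resid, np.ones(window) / window, mode="valid")
          i = int(np.argmax(np.abs(np.diff(running)))) + window // 2
          return t_years[i]

      # Synthetic 16-day NDVI series, 2000-2011, with a deforestation drop in mid-2006.
      t = np.arange(2000, 2012, 1 / 23)
      ndvi = 0.6 + 0.15 * np.sin(2 * np.pi * t) + np.random.default_rng(7).normal(0, 0.03, t.size)
      ndvi[t > 2006.5] -= 0.3
      print(f"estimated break near {flag_break(t, ndvi):.2f}")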

  3. Detection of unusual events and trends in complex non-stationary data streams

    International Nuclear Information System (INIS)

    Charlton-Perez, C.; Perez, R.B.; Protopopescu, V.; Worley, B.A.

    2011-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for diverse applications, ranging from power plant operation to homeland security. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden events inside intermittent signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.

  4. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    Science.gov (United States)

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  5. Nonstationarities in the occurrence rates of flood events in Portuguese watersheds

    Directory of Open Access Journals (Sweden)

    A. T. Silva

    2012-01-01

    Full Text Available An exploratory analysis on the variability of flood occurrence rates in 10 Portuguese watersheds is made, to ascertain if that variability is concurrent with the principle of stationarity. A peaks-over-threshold (POT) sampling technique is applied to 10 long series of mean daily streamflows and to 4 long series of daily rainfall in order to sample the times of occurrence (POT time data) of the peak values of those series. The kernel occurrence rate estimator, coupled with a bootstrap approach, was applied to the POT time data to obtain the time-dependent estimated occurrence rate curves, λˆ(t), of floods and extreme rainfall events. The results of the analysis show that the occurrence of those events constitutes an inhomogeneous Poisson process, hence the occurrence rates are nonstationary. An attempt was made to assess whether the North Atlantic Oscillation (NAO) cast any influence on the occurrence rate of floods in the study area. Although further research is warranted, it was found that years with a less-than-average occurrence of floods tend to occur when the winter NAO is in the positive phase, and years with a higher occurrence of floods (more than twice the average) tend to occur when the winter NAO is in the negative phase. Although the number of analyzed watersheds and their uneven spatial distribution hinders the generalization of the findings to the country scale, the authors conclude that the mathematical formulation of the flood frequency models relying on stationarity commonly employed in Portugal should be revised in order to account for possible nonstationarities in the occurrence rates of such events.

  6. Why conventional detection methods fail in identifying the existence of contamination events.

    Science.gov (United States)

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
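
    To make the comparison above more concrete, here is a hedged sketch of the simplest family of detectors involved: a Euclidean-distance test of the current multi-parameter sample against baseline statistics. The parameter names, threshold, and synthetic data are illustrative, and the PE and LPF variants evaluated in the paper are not reproduced. Note how the spike-like sample is flagged while the small gradual drift is not, which is the kind of behaviour the study highlights.

      import numpy as np

      def distance_detector(current, baseline, threshold=3.0):
          """Flag an event when the Euclidean norm of the baseline-standardized
          water-quality vector exceeds a threshold."""
          mu, sigma = baseline.mean(axis=0), baseline.std(axis=0, ddof=1)
          score = float(np.linalg.norm((current - mu) / sigma))
          return score, score > threshold

      rng = np.random.default_rng(4)
      # Columns: pH, turbidity, conductivity, chlorine residual (synthetic baseline data).
      baseline = rng.normal([7.2, 0.5, 300.0, 1.0], [0.05, 0.05, 5.0, 0.05], (200, 4))

      samples = {
          "normal":  np.array([7.22, 0.52, 302.0, 0.98]),
          "spike":   np.array([7.25, 2.00, 340.0, 0.40]),  # sudden spike-like contamination
          "gradual": np.array([7.22, 0.56, 304.0, 0.95]),  # slow drift, much harder to flag
      }
      for name, sample in samples.items():
          print(name, distance_detector(sample, baseline))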

  7. The Informational Content of Credit Ratings in Brazil: An Event Study

    Directory of Open Access Journals (Sweden)

    Flávia Cruz de Souza Murcia

    2014-03-01

    Full Text Available This study analyzes the effect of credit rating announcements on stock returns in the Brazilian market during 1997-2011. We conducted an event study using a sample of 242 observations of listed companies, 179 from Standard and Poor’s and 63 from Moody’s, to analyze stock market reaction. Abnormal returns have been computed using the Market Model and the CAPM for three windows: three days (-1, +1), 11 days (-5, +5), and 21 days (-10, +10). We find statistically significant abnormal returns on days -1 and 0 for all three types of rating announcement tested: initial ratings, downgrades, and upgrades. For downgrades, consistent with prior studies, our results also showed negative abnormal returns for practically all windows tested. Overall, our findings show that rating announcements do have information content, as they impact stock returns and cause abnormal returns, especially when they bring ‘bad news’ to the market.
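
    A minimal sketch of the market-model abnormal-return computation that underlies an event study of this kind: estimate alpha and beta over a pre-event window, then cumulate abnormal returns over an event window such as (-1, +1). The returns are synthetic and the estimation-window length is an assumption, not the configuration used in the study.

      import numpy as np

      def market_model_car(stock_ret, market_ret, event_idx, est_len=120, window=(-1, 1)):
          """Cumulative abnormal return around event_idx using market-model expected returns."""
          est = slice(event_idx - est_len + window[0], event_idx + window[0])
          beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)  # OLS of R_i on R_m
          ev = slice(event_idx + window[0], event_idx + window[1] + 1)
          abnormal = stock_ret[ev] - (alpha + beta * market_ret[ev])
          return float(abnormal.sum())

      rng = np.random.default_rng(5)
      market = rng.normal(0.0005, 0.01, 400)
      stock = 0.0002 + 1.2 * market + rng.normal(0, 0.01, 400)
      stock[300] -= 0.04  # e.g. a price reaction on the day of a downgrade announcement
      print(f"CAR(-1,+1) around the announcement: {market_model_car(stock, market, 300):.4f}")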

  8. Detectability of changes in cosmic-ray counting rate measured with the Liulin detector

    International Nuclear Information System (INIS)

    Malusek, A.; Kubancak, J.; Ambrozova, I.

    2011-05-01

    Experimental data are needed to improve and validate models predicting the dynamics of solar particle events because the mechanisms of processes leading to the acceleration of solar energetic particles are not yet fully understood. The aim of this work was to examine whether the spectrometer of deposited energy, Liulin, positioned at the Lomnický štít mountain observatory can collect such data. Decision thresholds and detection limits for the increase or decrease in the average number of particles detected by Liulin were determined for a period in February 2011. The changes in counts corresponding to the decision thresholds and detection limits relative to the average number of detected particles were about 17% and 33%, respectively. The Forbush decrease with a maximum change of about 6.8%, which started on February 18, was detectable neither during the 10-minute acquisition time nor during any other, longer period. Statistical analysis showed that an acquisition time of about 7 hours would be needed to detect a 5% decrease. As this time was shorter than the duration of the Forbush decrease (about 56 hours), we theorize that the current placement of the Liulin detector inside a living room shielded by a thick concrete ceiling may have had an adverse impact on the detectability of the cosmic ray counting rate decrease. To test this hypothesis, we recommend positioning the Liulin detector outside the main observatory building. (author)
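
    A hedged sketch of the counting-statistics reasoning behind decision thresholds and detection limits for a change in a counting rate: with Poisson counts, a decrease only becomes detectable once it exceeds a few standard deviations of the difference between the baseline and the measurement. The coverage factors and the assumed counting rate below are generic illustrative choices, not the values or exact formulation used in the Liulin analysis.

      import math

      K_ALPHA = 1.645  # 5% false-alarm probability
      K_BETA = 1.645   # 5% false-negative probability

      def relative_decision_threshold(mean_counts):
          """Smallest relative change distinguishable from statistical fluctuations when two
          acquisitions with the same expected counts are compared."""
          return K_ALPHA * math.sqrt(2.0 * mean_counts) / mean_counts

      def relative_detection_limit(mean_counts):
          """Relative change detected with high probability, not merely suspected."""
          return (K_ALPHA + K_BETA) * math.sqrt(2.0 * mean_counts) / mean_counts

      rate = 0.3  # assumed counting rate in counts per second (illustrative only)
      counts_10min = rate * 600
      print(f"10-min acquisition: decision threshold ~ {100 * relative_decision_threshold(counts_10min):.0f}%, "
            f"detection limit ~ {100 * relative_detection_limit(counts_10min):.0f}%")

      # Acquisition time needed before a 5% decrease reaches the detection limit.
      counts_needed = 2.0 * ((K_ALPHA + K_BETA) / 0.05) ** 2
      print(f"time to detect a 5% decrease: {counts_needed / rate / 3600:.1f} h")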

  9. Effect of parameters in moving average method for event detection enhancement using phase sensitive OTDR

    Science.gov (United States)

    Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum

    2017-04-01

    We analyze the relations of the parameters in the moving average method used to enhance the event detectability of a phase-sensitive optical time domain reflectometer (OTDR). If the external events have a unique vibration frequency, then the control parameters of the moving average method should be optimized in order to detect these events efficiently. A phase-sensitive OTDR was implemented with a pulsed light source, composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier, and a fiber Bragg grating filter, and a light-receiving part consisting of a photo-detector and a high-speed data acquisition system. The moving average method is operated with the control parameters: the total number of raw traces, M, the number of averaged traces, N, and the step size of the moving window, n. The raw traces are obtained with the phase-sensitive OTDR while sound signals are generated by a speaker. Using these trace data, the relation between the control parameters is analyzed. The results show that, if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
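
    A minimal sketch of the moving-average scheme described above: from M raw traces, averages of N consecutive traces are formed every n traces, and the variance of the averaged traces along the slow (trace) axis localizes the vibration. The trace generation, repetition rate, and parameter values below are synthetic placeholders rather than the experimental configuration; keeping N small relative to the vibration period, as in the comment below, reflects the abstract's point that the control parameters must be tuned to the event frequency.

      import numpy as np

      def moving_average_traces(raw_traces, N, n):
          """Average N consecutive traces, advancing the window by n traces each step.
          raw_traces has shape (M, samples); the result has shape (windows, samples)."""
          M = raw_traces.shape[0]
          return np.stack([raw_traces[s:s + N].mean(axis=0) for s in range(0, M - N + 1, n)])

      rng = np.random.default_rng(6)
      M, samples, fiber_pos = 2000, 500, 240
      traces = rng.normal(0.0, 1.0, (M, samples))
      # A 10 Hz vibration at one fiber position, with an assumed 1 kHz trace repetition rate.
      traces[:, fiber_pos] += 3.0 * np.sin(2 * np.pi * 10 * np.arange(M) / 1000.0)

      # N = 20 spans only a fifth of the 10 Hz period, so averaging suppresses noise by
      # sqrt(N) while barely attenuating the vibration; a much larger N would cancel it.
      averaged = moving_average_traces(traces, N=20, n=5)
      variance_profile = averaged.var(axis=0)
      print("strongest position:", int(np.argmax(variance_profile)), "expected:", fiber_pos)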

  10. Secure access control and large scale robust representation for online multimedia event detection.

    Science.gov (United States)

    Liu, Changyu; Lu, Bin; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches.

  11. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    Directory of Open Access Journals (Sweden)

    Changyu Liu

    2014-01-01

    Full Text Available We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large scale robust representation issue when we want to integrate traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag of words tiling approach was then adopted to encode these feature vectors for bridging the gap between the objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms the state-of-the-art approaches.

  12. Use of wireless sensor networks for distributed event detection in disaster management applications

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Poel, Mannes; Taghikhaki, Zahra; Havinga, Paul J.M.

    Recently, wireless sensor networks (WSNs) have become mature enough to go beyond being simple fine-grained continuous monitoring platforms and have become one of the enabling technologies for early-warning disaster systems. Event detection functionality of WSNs can be of great help and importance

  13. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied on the entire example dataset

  14. Process variant comparison: using event logs to detect differences in behavior and business rules

    NARCIS (Netherlands)

    Bolt, A.; de Leoni, M.; van der Aalst, W.M.P.

    2018-01-01

    This paper addresses the problem of comparing different variants of the same process. We aim to detect relevant differences between processes based on what was recorded in event logs. We use transition systems to model behavior and to highlight differences. Transition systems are annotated with

  15. Estimating rate of occurrence of rare events with empirical bayes: A railway application

    International Nuclear Information System (INIS)

    Quigley, John; Bedford, Tim; Walls, Lesley

    2007-01-01

    Classical approaches to estimating the rate of occurrence of events perform poorly when data are few. Maximum likelihood estimators result in overly optimistic point estimates of zero for situations where there have been no events. Alternative empirical-based approaches have been proposed based on median estimators or non-informative prior distributions. While these alternatives offer an improvement over point estimates of zero, they can be overly conservative. Empirical Bayes procedures offer an unbiased approach through pooling data across different hazards to support stronger statistical inference. This paper considers the application of empirical Bayes to high-consequence, low-frequency events, where estimates are required for risk mitigation decision support, such as demonstrating that risks are as low as reasonably practicable. A summary of empirical Bayes methods is given and the choices of estimation procedures to obtain interval estimates are discussed. The approaches illustrated within the case study are based on the estimation of the rate of occurrence of train derailments within the UK. The usefulness of empirical Bayes within this context is discussed.
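
    A hedged sketch of a simple parametric empirical Bayes (gamma-Poisson) estimator of the kind discussed above: a gamma prior is fitted by moment matching to event counts pooled across hazards, and each hazard's rate estimate is then shrunk toward the pooled value, so hazards with zero observed events no longer get a point estimate of zero. The counts and exposures are invented, and the moment-matching step is a simplification of the estimation procedures considered in the paper.

      import numpy as np

      def empirical_bayes_rates(counts, exposures):
          """Gamma-Poisson empirical Bayes: fit a Gamma(alpha, beta) prior on the rates by
          moment matching across hazards, then return the posterior mean rate of each."""
          counts, exposures = np.asarray(counts, float), np.asarray(exposures, float)
          raw = counts / exposures
          m = raw.mean()
          # Method-of-moments prior variance, removing the within-hazard Poisson noise.
          v = max(raw.var(ddof=1) - np.mean(counts / exposures ** 2), 1e-12)
          alpha, beta = m ** 2 / v, m / v
          return (alpha + counts) / (beta + exposures)  # posterior means, shrunk toward m

      # Hypothetical hazards: event counts over different exposures (e.g. million train-miles).
      counts = [0, 1, 0, 3, 2, 0, 5]
      exposures = [120, 80, 200, 150, 90, 60, 300]
      for c, e, r in zip(counts, exposures, empirical_bayes_rates(counts, exposures)):
          print(f"{c} events over {e} units: MLE = {c / e:.4f}, EB estimate = {r:.4f}")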

  16. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Wurzburg in Bavaria.

  17. Event detection and exception handling strategies in the ASDEX Upgrade discharge control system

    International Nuclear Information System (INIS)

    Treutterer, W.; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2013-01-01

    Highlights: •Event detection and exception handling is integrated in control system architecture. •Pulse control with local exception handling and pulse supervision with central exception handling are strictly separated. •Local exception handling limits the effect of an exception to a minimal part of the controlled system. •Central Exception Handling solves problems requiring coordinated action of multiple control components. -- Abstract: Thermonuclear plasmas are governed by nonlinear characteristics: plasma operation can be classified into scenarios with pronounced features like L and H-mode, ELMs or MHD activity. Transitions between them may be treated as events. Similarly, technical systems are also subject to events such as failure of measurement sensors, actuator saturation or violation of machine and plant operation limits. Such situations often are handled with a mixture of pulse abortion and iteratively improved pulse schedule reference programming. In case of protection-relevant events, however, the complexity of even a medium-sized device as ASDEX Upgrade requires a sophisticated and coordinated shutdown procedure rather than a simple stop of the pulse. The detection of events and their intelligent handling by the control system has been shown to be valuable also in terms of saving experiment time and cost. This paper outlines how ASDEX Upgrade's discharge control system (DCS) detects events and handles exceptions in two stages: locally and centrally. The goal of local exception handling is to limit the effect of an unexpected or asynchronous event to a minimal part of the controlled system. Thus, local exception handling facilitates robustness to failures but keeps the decision structures lean. A central state machine deals with exceptions requiring coordinated action of multiple control components. DCS implements the state machine by means of pulse schedule segments containing pre-programmed waveforms to define discharge goal and control

  18. Event detection and exception handling strategies in the ASDEX Upgrade discharge control system

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, W., E-mail: Wolfgang.Treutterer@ipp.mpg.de; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2013-10-15

    Highlights: •Event detection and exception handling is integrated in control system architecture. •Pulse control with local exception handling and pulse supervision with central exception handling are strictly separated. •Local exception handling limits the effect of an exception to a minimal part of the controlled system. •Central Exception Handling solves problems requiring coordinated action of multiple control components. -- Abstract: Thermonuclear plasmas are governed by nonlinear characteristics: plasma operation can be classified into scenarios with pronounced features like L and H-mode, ELMs or MHD activity. Transitions between them may be treated as events. Similarly, technical systems are also subject to events such as failure of measurement sensors, actuator saturation or violation of machine and plant operation limits. Such situations often are handled with a mixture of pulse abortion and iteratively improved pulse schedule reference programming. In case of protection-relevant events, however, the complexity of even a medium-sized device as ASDEX Upgrade requires a sophisticated and coordinated shutdown procedure rather than a simple stop of the pulse. The detection of events and their intelligent handling by the control system has been shown to be valuable also in terms of saving experiment time and cost. This paper outlines how ASDEX Upgrade's discharge control system (DCS) detects events and handles exceptions in two stages: locally and centrally. The goal of local exception handling is to limit the effect of an unexpected or asynchronous event to a minimal part of the controlled system. Thus, local exception handling facilitates robustness to failures but keeps the decision structures lean. A central state machine deals with exceptions requiring coordinated action of multiple control components. DCS implements the state machine by means of pulse schedule segments containing pre-programmed waveforms to define discharge goal and control

  19. Detectability and detection rate of acute cerebral hemisphere infarcts on CT and diffusion-weighted MRI

    International Nuclear Information System (INIS)

    Urbach, H.; Flacke, S.; Keller, E.; Textor, J.; Berlis, A.; Reul, J.; Schild, H.H.; Hartmann, A.; Solymosi, L.

    2000-01-01

    Our purpose was to compare the detectability and detection rate of acute ischaemic cerebral hemisphere infarcts on CT and diffusion-weighted MRI (DWI). We investigated 32 consecutive patients with acute hemisphere stroke with unenhanced CT and DWI within 6 h of stroke onset. The interval between CT and DWI ranged from 15 to 180 min (mean 60 min). Infarct detectability on CT and DWI was determined by comparing the initial CT, DWI and later reference images in a consensus reading of five independent examiners. The "true" detection rate was assessed by analysing all single readings. Two patients had intracerebral haematomas on DWI and CT and were excluded. There were 27 patients with ischaemic infarcts; all were visible on DWI and proven by follow-up. DWI was negative in three patients without a final diagnosis of infarct (100 % sensitivity, 100 % specificity, χ² = 30, P < 0.0001; […] χ² = 1.48, P = 0.224). With regard to the single readings (30 examinations × 5 examiners = 150 readings), 63 CT readings were true positive and 72 false negative (sensitivity 47 %, specificity 86 %, χ² = 2.88, P = 0.089). Of the DWI readings 128 were true positive and 7 false negative (sensitivity 95 %, specificity 87 %, χ² = 70.67, P < 0.0001). Interobserver agreement was substantial for CT (κ = 0.72, 95 % confidence interval 0.6–0.84) and DWI (κ = 0.82, 95 % confidence interval 0.46–1). Taken together, detectability and detection rate of acute (< 6 h) hemisphere infarcts are significantly higher with DWI than with CT. (orig.)

  20. A Comparative Study of Data Mining Algorithms for High Detection Rate in Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Nabeela Ashraf

    2018-01-01

    Full Text Available Due to the fast growth of the internet over the last decades, network security problems are increasing rapidly. Humans cannot handle the speed of processes and the huge amount of data required to handle network anomalies. Therefore, substantial automation is needed in both speed and accuracy. An Intrusion Detection System (IDS) is one of the approaches to recognize illegal access and rare attacks and to secure networks. In this paper, the Naive Bayes, J48 and Random Forest classifiers are compared to compute the detection rate and accuracy of IDS. For the experiments, the NSL-KDD dataset is used.
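
    As a rough companion to the comparison described above, the sketch below trains the three classifier families with scikit-learn and reports accuracy and the detection rate (recall on the attack class). It is only an illustration: a synthetic dataset stands in for NSL-KDD, and DecisionTreeClassifier stands in for Weka's J48, so the numbers will not match the paper's.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, recall_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for NSL-KDD features; class 1 plays the role of "attack".
X, y = make_classification(n_samples=5000, n_features=30, weights=[0.6, 0.4], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

models = {
    "Naive Bayes":   GaussianNB(),
    "J48 (approx.)": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

for name, clf in models.items():
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    # "Detection rate" here = recall on the attack class (label 1).
    print(f"{name:14s} accuracy={accuracy_score(y_te, pred):.3f} "
          f"detection rate={recall_score(y_te, pred, pos_label=1):.3f}")
```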

  1. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for the extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of the label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% of drug label changes were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were otherwise undetectable. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.

  2. High detection rate of dog circovirus in diarrheal dogs.

    Science.gov (United States)

    Hsu, Han-Siang; Lin, Ting-Han; Wu, Hung-Yi; Lin, Lee-Shuan; Chung, Cheng-Shu; Chiou, Ming-Tang; Lin, Chao-Nan

    2016-06-17

    Diarrhea is one of the most common clinical symptoms reported in companion animal clinics. Dog circovirus (DogCV) is a new mammalian circovirus that is considered to be a cause of alimentary syndromes such as diarrhea, vomiting and hemorrhagic enteritis. DogCV has previously only been identified in the United States, Italy, Germany (GenBank accession number: KF887949) and China (GenBank accession number: KT946839). Therefore, the aims of this study were to determine the prevalence of DogCV in Taiwan and to explore the correlation between diarrhea and DogCV infection. Clinical specimens were collected between 2012 and 2014 from 207 dogs suffering from diarrhea and 160 healthy dogs. In this study, we developed a sensitive and specific SYBR Green-based real-time PCR assay to detect DogCV in naturally infected animals. Of the analyzed fecal samples from diarrheal dogs and healthy dogs, 58 (28.0 %) and 19 (11.9 %), respectively, were DogCV positive. The difference in DogCV prevalence was highly significant (P = 0.0002755) in diarrheal dogs. This is the first study to reveal that DogCV is currently circulating in domestic dogs in Taiwan and to demonstrate its high detection rate in dogs with diarrhea.
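
    The prevalence difference reported above can be checked directly from the counts in the abstract with a standard 2 × 2 test. The snippet below does this with SciPy; the abstract does not state which test the authors used, so the P value is only expected to be of the same order as the reported one.

```python
from scipy.stats import chi2_contingency, fisher_exact

# DogCV-positive / negative counts taken from the abstract.
table = [[58, 207 - 58],    # diarrheal dogs
         [19, 160 - 19]]    # healthy dogs

chi2, p, dof, _ = chi2_contingency(table)
odds, p_fisher = fisher_exact(table)

print(f"chi-square={chi2:.2f}, p={p:.2e}, dof={dof}")
print(f"Fisher's exact: OR={odds:.2f}, p={p_fisher:.2e}")
```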

  3. Ontology-based knowledge management for personalized adverse drug events detection.

    Science.gov (United States)

    Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue

    2011-01-01

    Since adverse drug events (ADEs) have become a leading cause of death around the world, there is a high demand for tools that help clinicians or patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with the focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define an ADE ontology to uniformly manage the ADE knowledge from multiple sources. We take advantage of the rich semantics of the terminology SNOMED-CT and apply it to ADE detection via semantic query and reasoning.

  4. Direct SUSY dark matter detection-theoretical rates due to the spin

    International Nuclear Information System (INIS)

    Vergados, J D

    2004-01-01

    The recent WMAP data have confirmed that exotic dark matter together with the vacuum energy (cosmological constant) dominate in the flat Universe. Thus direct dark matter detection, consisting of detecting the recoiling nucleus, is central to particle physics and cosmology. Supersymmetry provides a natural dark matter candidate, the lightest supersymmetric particle (LSP). The relevant cross sections arise out of two mechanisms: (i) the coherent mode, due to the scalar interaction and (ii) the spin contribution arising from the axial current. In this paper we will focus on the spin contribution, which is expected to dominate for light targets. For both modes it is possible to obtain detectable rates, but in most models the expected rates are much lower than the present experimental goals. So one should exploit two characteristic signatures of the reaction, namely the modulation effect and in directional experiments the correlation of the event rates with the sun's motion. In standard non-directional experiments the modulation is small, less than 2 per cent. In the case of the directional event rates we would like to suggest that the experiments exploit two features of the process, which are essentially independent of the SUSY model employed, namely: (1) the forward-backward asymmetry, with respect to the sun's direction of motion, is very large and (2) the modulation is much larger, especially if the observation is made in a plane perpendicular to the sun's velocity. In this case the difference between maximum and minimum can be larger than 40 per cent and the phase of the earth at the maximum is direction dependent

  5. Detections of Planets in Binaries Through the Channel of Chang–Refsdal Gravitational Lensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Han, Cheongho [Department of Physics, Chungbuk National University, Cheongju 361-763 (Korea, Republic of); Shin, In-Gu; Jung, Youn Kil [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States)

    2017-02-01

    Chang–Refsdal (C–R) lensing, which refers to the gravitational lensing of a point mass perturbed by a constant external shear, provides a good approximation in describing lensing behaviors of either a very wide or a very close binary lens. C–R lensing events, which are identified by short-term anomalies near the peak of high-magnification lensing light curves, are routinely detected from lensing surveys, but not much attention is paid to them. In this paper, we point out that C–R lensing events provide an important channel to detect planets in binaries, both in close and wide binary systems. Detecting planets through the C–R lensing event channel is possible because the planet-induced perturbation occurs in the same region of the C–R lensing-induced anomaly and thus the existence of the planet can be identified by the additional deviation in the central perturbation. By presenting the analysis of the actually observed C–R lensing event OGLE-2015-BLG-1319, we demonstrate that dense and high-precision coverage of a C–R lensing-induced perturbation can provide a strong constraint on the existence of a planet in a wide range of planet parameters. The sample of an increased number of microlensing planets in binary systems will provide important observational constraints in giving shape to the details of planet formation, which have been restricted to the case of single stars to date.

  6. Compact Binary Mergers and the Event Rate of Fast Radio Bursts

    Science.gov (United States)

    Cao, Xiao-Feng; Yu, Yun-Wei; Zhou, Xia

    2018-05-01

    Fast radio bursts (FRBs) are usually suggested to be associated with mergers of compact binaries consisting of white dwarfs (WDs), neutron stars (NSs), or black holes (BHs). We test these models by fitting the observational distributions in both redshift and isotropic energy of 22 Parkes FRBs, where, as usual, the rates of compact binary mergers (CBMs) are connected with cosmic star formation rates by a power-law distributed time delay. It is found that the observational distributions can well be produced by the CBM model with a characteristic delay time from several tens to several hundreds of megayears and an energy function index 1.2 ≲ γ ≲ 1.7, where a tentative fixed spectral index β = 0.8 is adopted for all FRBs. Correspondingly, the local event rate of FRBs is constrained to (3–6) × 10^4 f_b^-1 (T/270 s)^-1 (A/2π)^-1 Gpc^-3 yr^-1 for an adopted minimum FRB energy of E_min = 3 × 10^39 erg, where f_b is the beaming factor of the radiation, T is the duration of each pointing observation, and A is the sky area of the survey. This event rate, about an order of magnitude higher than the rates of NS–NS/NS–BH mergers, indicates that the most promising origin of FRBs in the CBM scenario could be mergers of WD–WD binaries. Here a massive WD could be produced, since no FRB was found to be associated with an SN Ia. Alternatively, if all FRBs can repeat on a timescale much longer than the period of current observations, then they could also originate from a young active NS that forms from relatively rare NS–NS mergers and accretion-induced collapses of WD–WD binaries.

  7. Fundamental aspects of seismic event detection, magnitude estimation and their interrelation

    International Nuclear Information System (INIS)

    Ringdal, F.

    1977-01-01

    The main common subject of the papers forming this thesis is statistical model development within the seismological disciplines of seismic event detection and event magnitude estimation. As more high quality seismic data become available as a result of recent seismic network developments, the opportunity will exist for large scale application and further refinement of these models. It is hoped that the work presented here will facilitate improved understanding of the basic issues, both within earthquake-explosion discrimination, in the framework of which most of this work originated, and in seismology in general. (Auth.)

  8. Detection of rain events in radiological early warning networks with spectro-dosimetric systems

    Science.gov (United States)

    Dąbrowski, R.; Dombrowski, H.; Kessler, P.; Röttger, A.; Neumaier, S.

    2017-10-01

    Short-term pronounced increases of the ambient dose equivalent rate due to rainfall are a well-known phenomenon. Increases of the same order of magnitude or even below may also be caused by a nuclear or radiological event, i.e. by artificial radiation. Hence, it is important to be able to identify natural rain events in dosimetric early warning networks and to distinguish them from radiological events. Novel spectrometric systems based on scintillators may be used to differentiate between the two scenarios, because the measured gamma spectra provide significant nuclide-specific information. This paper describes three simple, automatic methods to check whether an increase of the ambient dose equivalent rate Ḣ*(10) is caused by a rain event or by artificial radiation. These methods were applied to measurements of three spectrometric systems based on CeBr3, LaBr3 and SrI2 scintillation crystals, investigated and tested for their practicability at a free-field reference site of PTB.

  9. Facilitating adverse drug event detection in pharmacovigilance databases using molecular structure similarity: application to rhabdomyolysis

    Science.gov (United States)

    Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul

    2011-01-01

    Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
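
    The chemical side of the approach above rests on comparing molecular fingerprints of drugs. The snippet below is a minimal illustration of such a comparison with RDKit (Morgan fingerprints and Tanimoto similarity) on a few arbitrary, well-known molecules; it does not reproduce the paper's combination of similarity scores with AERS signal statistics.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Arbitrary example molecules (SMILES); not the rhabdomyolysis-associated drugs
# analysed in the paper.
smiles = {
    "aspirin":       "CC(=O)Oc1ccccc1C(=O)O",
    "ibuprofen":     "CC(C)Cc1ccc(cc1)C(C)C(=O)O",
    "acetaminophen": "CC(=O)Nc1ccc(O)cc1",
}

# 2048-bit Morgan (ECFP4-like) fingerprints.
fps = {name: AllChem.GetMorganFingerprintAsBitVect(Chem.MolFromSmiles(s), 2, nBits=2048)
       for name, s in smiles.items()}

query = "aspirin"
for name, fp in fps.items():
    if name != query:
        sim = DataStructs.TanimotoSimilarity(fps[query], fp)
        print(f"Tanimoto({query}, {name}) = {sim:.2f}")
```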

  10. Microfluidic Arrayed Lab-On-A-Chip for Electrochemical Capacitive Detection of DNA Hybridization Events.

    Science.gov (United States)

    Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza

    2017-01-01

    A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing modeling that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.

  11. Simultaneous Event-Triggered Fault Detection and Estimation for Stochastic Systems Subject to Deception Attacks.

    Science.gov (United States)

    Li, Yunji; Wu, QingE; Peng, Li

    2018-01-23

    In this paper, a synthesized design of a fault-detection filter and fault estimator is considered for a class of discrete-time stochastic systems in the framework of an event-triggered transmission scheme subject to unknown disturbances and deception attacks. A random variable obeying the Bernoulli distribution is employed to characterize the phenomena of randomly occurring deception attacks. To obtain a fault-detection residual that is sensitive only to faults while robust to disturbances, a coordinate transformation approach is exploited. This approach transforms the considered system into two subsystems, with the unknown disturbances removed from one of the subsystems. The gain of the fault-detection filter is derived by minimizing an upper bound of the filter error covariance. Meanwhile, system faults can be reconstructed by the remote fault estimator. A recursive approach is developed to obtain the fault estimator gains as well as to guarantee the fault estimator performance. Furthermore, the corresponding event-triggered sensor data transmission scheme is also presented for improving the working life of the wireless sensor node when measurement information is transmitted aperiodically. Finally, a scaled version of an industrial system consisting of a local PC, remote estimator and wireless sensor node is used to experimentally evaluate the proposed theoretical results. In particular, a novel fault-alarming strategy is proposed so that the real-time capacity of fault detection is guaranteed when the event condition is triggered.

  12. Integrated hydraulic and organophosphate pesticide injection simulations for enhancing event detection in water distribution systems.

    Science.gov (United States)

    Schwartz, Rafi; Lahav, Ori; Ostfeld, Avi

    2014-10-15

    As a complementary step towards solving the general event detection problem of water distribution systems, injections of the organophosphate pesticides chlorpyrifos (CP) and parathion (PA) were simulated at various locations within example networks, and hydraulic parameters were calculated over a 24-h duration. The uniqueness of this study is that the chemical reactions and byproducts of the contaminants' oxidation were also simulated, as well as other indicative water quality parameters such as alkalinity, acidity, pH and the total concentration of free chlorine species. The information on the change in water quality parameters induced by the contaminant injection may facilitate on-line detection of an actual event involving this specific substance and pave the way for the development of a generic methodology for detecting events involving the introduction of pesticides into water distribution systems. Simulation of the contaminant injection was performed at several nodes within two different networks. For each injection, concentrations of the relevant contaminants' mother and daughter species, free chlorine species and water quality parameters were simulated at nodes downstream of the injection location. The results indicate that injection of these substances can be detected under certain conditions by a very rapid drop in Cl2, functioning as the indicative parameter, as well as a drop in alkalinity concentration and a small decrease in pH, both functioning as supporting parameters, whose usage may reduce false positive alarms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Detection of visual events along the apparent motion trace in patients with paranoid schizophrenia.

    Science.gov (United States)

    Sanders, Lia Lira Olivier; Muckli, Lars; de Millas, Walter; Lautenschlager, Marion; Heinz, Andreas; Kathmann, Norbert; Sterzer, Philipp

    2012-07-30

    Dysfunctional prediction in sensory processing has been suggested as a possible causal mechanism in the development of delusions in patients with schizophrenia. Previous studies in healthy subjects have shown that while the perception of apparent motion can mask visual events along the illusory motion trace, such motion masking is reduced when events are spatio-temporally compatible with the illusion, and, therefore, predictable. Here we tested the hypothesis that this specific detection advantage for predictable target stimuli on the apparent motion trace is reduced in patients with paranoid schizophrenia. Our data show that, although target detection along the illusory motion trace is generally impaired, both patients and healthy control participants detect predictable targets more often than unpredictable targets. Patients had a stronger motion masking effect when compared to controls. However, patients showed the same advantage in the detection of predictable targets as healthy control subjects. Our findings reveal stronger motion masking but intact prediction of visual events along the apparent motion trace in patients with paranoid schizophrenia and suggest that the sensory prediction mechanism underlying apparent motion is not impaired in paranoid schizophrenia. Copyright © 2012. Published by Elsevier Ireland Ltd.

  14. Incorporating an Exercise Detection, Grading, and Hormone Dosing Algorithm Into the Artificial Pancreas Using Accelerometry and Heart Rate

    OpenAIRE

    Jacobs, Peter G.; Resalat, Navid; El Youssef, Joseph; Reddy, Ravi; Branigan, Deborah; Preiser, Nicholas; Condon, John; Castle, Jessica

    2015-01-01

    In this article, we present several important contributions necessary for enabling an artificial endocrine pancreas (AP) system to better respond to exercise events. First, we show how exercise can be automatically detected using body-worn accelerometer and heart rate sensors. During a 22 hour overnight inpatient study, 13 subjects with type 1 diabetes wearing a Zephyr accelerometer and heart rate monitor underwent 45 minutes of mild aerobic treadmill exercise while controlling their glucose ...
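
    The automatic exercise detection described above combines accelerometry with heart rate. The sketch below is a deliberately simple threshold-and-window illustration of that idea, not the authors' validated algorithm: the thresholds, window length and the toy signals are all invented for the example.

```python
import numpy as np

def detect_exercise(accel_counts, heart_rate, resting_hr,
                    accel_thresh=500.0, hr_rise=20.0, window=12):
    """Flag minutes as exercise when a moving average of accelerometer activity
    and the heart-rate elevation over resting both exceed illustrative thresholds."""
    kernel = np.ones(window) / window
    accel_ma = np.convolve(accel_counts, kernel, mode="same")
    hr_ma = np.convolve(heart_rate, kernel, mode="same")
    return (accel_ma > accel_thresh) & (hr_ma > resting_hr + hr_rise)

# Toy 120-minute trace: 45 minutes of treadmill exercise starting at minute 30.
rng = np.random.default_rng(11)
accel = rng.normal(100, 30, 120)
accel[30:75] += 900
hr = rng.normal(70, 3, 120)
hr[30:75] += 45

flags = detect_exercise(accel, hr, resting_hr=70.0)
print("detected exercise minutes:", np.flatnonzero(flags))
```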

  15. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO

    Directory of Open Access Journals (Sweden)

    Lixin Yan

    2016-07-01

    Full Text Available The ability to identify hazardous traffic events is already considered as one of the most effective solutions for reducing the occurrence of crashes. Only certain particular hazardous traffic events have been studied in previous studies, which were mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of the prediction accuracy. The conclusions can provide reference evidence for the development of dangerous situation warning products and the design of intelligent vehicles.
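
    The two-stage structure described above (a feature-selection step followed by an SMO-trained support vector machine) can be sketched as follows. This is only an analogue built with scikit-learn on synthetic data: a mutual-information filter stands in for the Markov blanket step, and SVC (whose libsvm backend uses an SMO-type optimiser) stands in for the SMO classifier used in the paper.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the driving/vehicle features (speed, brake pressure, ...).
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           weights=[0.8, 0.2], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=1)

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=8),   # stands in for the Markov-blanket filter
    StandardScaler(),
    SVC(kernel="rbf", C=1.0),                # libsvm trains SVC with an SMO-type solver
)
clf.fit(X_tr, y_tr)
print(f"hazardous-event identification accuracy: {clf.score(X_te, y_te):.3f}")
```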

  16. Syncope During Competitive Events: Interrogating Heart Rate Monitor Watches May Be Useful!

    Science.gov (United States)

    Thabouillot, Oscar; Bostanci, Kevin; Bouvier, Francois; Dumitrescu, Nicolae; Stéfuriac, Maria; Paule, Philippe; Roche, Nicolas-Charles

    2017-12-01

    This is a case report of a 45-year-old man who reported complete amnesia during the very first kilometer of a 10-km run. He was wearing a heart rate monitor (HRM). The interrogation of his HRM watch showed 200 bpm tachycardia beginning in the first kilometer and increasing up to 220 bpm during the last kilometer. The patient was asked to wear a Holter-monitor (Holter Research Laboratory; Helena, Montana USA) electrocardiogram (ECG) while practicing a training session. This examination allowed for the diagnosis of an adrenergic paroxysmal atrial fibrillation (AF) with an impressive auriculo-ventricular conduction over 260 bpm. This case highlights that non-medical devices, such as connected watches, can be helpful to diagnose arrhythmias. Thabouillot O , Bostanci K , Bouvier F , Dumitrescu N , Stéfuriac M , Paule P , Roche NC . Syncope during competitive events: interrogating heart rate monitor watches may be useful! Prehosp Disaster Med. 2017;32(6):691-693.

  17. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
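
    One simple way to make the 'learning mode / detection mode' idea concrete is a conjugate gamma-Poisson formulation: background counts update a gamma posterior on the background rate, and a new measurement raises an alarm when its posterior-predictive tail probability falls below a chosen false-alarm level. The sketch below shows only that generic scheme; it does not reproduce the nonparametric, multi-channel, multi-detector algorithm proposed by the author.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# --- Learning mode: accumulate background counts in fixed counting intervals. ---
true_bkg_rate = 12.0                       # counts per interval (unknown to the detector)
bkg_counts = rng.poisson(true_bkg_rate, size=500)

alpha0, beta0 = 0.5, 1e-6                  # weak gamma prior on the background rate
alpha = alpha0 + bkg_counts.sum()          # conjugate update from the background data
beta = beta0 + len(bkg_counts)

# --- Detection mode: posterior predictive for a new interval is negative binomial. ---
def alarm(new_count, false_alarm_prob=1e-3):
    # P(N >= new_count | background) under the gamma-Poisson posterior predictive.
    p_tail = stats.nbinom.sf(new_count - 1, alpha, beta / (beta + 1.0))
    return p_tail < false_alarm_prob, p_tail

for n in (14, 25, 40):
    is_alarm, p = alarm(n)
    print(f"counts={n:3d}  tail prob={p:.2e}  alarm={is_alarm}")
```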

  18. Direct risk standardisation: a new method for comparing casemix adjusted event rates using complex models.

    Science.gov (United States)

    Nicholl, Jon; Jacques, Richard M; Campbell, Michael J

    2013-10-29

    Comparison of outcomes between populations or centres may be confounded by any casemix differences and standardisation is carried out to avoid this. However, when the casemix adjustment models are large and complex, direct standardisation has been described as "practically impossible", and indirect standardisation may lead to unfair comparisons. We propose a new method of directly standardising for risk rather than standardising for casemix which overcomes these problems. Using a casemix model which is the same model as would be used in indirect standardisation, the risk in individuals is estimated. Risk categories are defined, and event rates in each category for each centre to be compared are calculated. A weighted sum of the risk category specific event rates is then calculated. We have illustrated this method using data on 6 million admissions to 146 hospitals in England in 2007/8 and an existing model with over 5000 casemix combinations, and a second dataset of 18,668 adult emergency admissions to 9 centres in the UK and overseas and a published model with over 20,000 casemix combinations and a continuous covariate. Substantial differences between conventional directly casemix standardised rates and rates from direct risk standardisation (DRS) were found. Results based on DRS were very similar to Standardised Mortality Ratios (SMRs) obtained from indirect standardisation, with similar standard errors. Direct risk standardisation using our proposed method is as straightforward as using conventional direct or indirect standardisation, always enables fair comparisons of performance to be made, can use continuous casemix covariates, and was found in our examples to have similar standard errors to the SMR. It should be preferred when there is a risk that conventional direct or indirect standardisation will lead to unfair comparisons.
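
    The DRS recipe in the abstract amounts to four steps: estimate each patient's risk from the casemix model, bin patients into risk categories, compute category-specific event rates per centre, and take a weighted sum of those rates with common weights. The pandas sketch below walks through those steps on simulated admissions; the risk model, weights and outcome mechanism are invented for illustration and are not the models used in the paper.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Illustrative admissions: a centre label, a model-estimated risk per patient,
# and the observed outcome. In practice the risk comes from the casemix model.
n = 20000
df = pd.DataFrame({
    "centre": rng.choice(["A", "B", "C"], size=n),
    "risk": rng.beta(2, 18, size=n),                 # predicted probability of the event
})
df["event"] = rng.random(n) < df["risk"] * df["centre"].map({"A": 0.9, "B": 1.0, "C": 1.2})

# 1. Define risk categories from the pooled risk distribution.
df["risk_cat"] = pd.qcut(df["risk"], q=5, labels=False)

# 2. Event rate in each risk category for each centre.
rates = df.groupby(["centre", "risk_cat"])["event"].mean().unstack()

# 3. Common weights = share of all admissions in each risk category.
weights = df["risk_cat"].value_counts(normalize=True).sort_index()

# 4. Directly risk-standardised rate per centre: weighted sum of category rates.
drs = (rates * weights).sum(axis=1)
print(drs.round(4))
```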

  19. Exploring the relationship between analgesic event rate and pain intensity in kidney stone surgery: A Repeated Time to Event Pilot Study

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Pedersen, Katja Venborg; Christrup, Lona Louring

    RV Juul (1), KV Pedersen (2,4), LL Christrup (1), AE Olesen (1,3), AM Drewes (3), PJS Osther (4), TM Lund (1); (1) Department of Drug Design […] a relationship with pain intensity has not yet been established. The aim of this pilot study was to discuss how best to investigate the relationship between the repeated time-to-event (RTTE) hazard of analgesic events and pain intensity in postoperative pain. Methods: Data were available from 44 patients undergoing kidney stone surgery […]. Gompertz and exponential distribution models were evaluated. Post-hoc linear mixed-effects modelling was performed between the estimated RTTE hazard and the observed NRS using the lme4 package in R (3). Results: A Gompertz distribution model adequately described the data, with a baseline event rate of 0.64 h-1 (RSE 25 […]

  20. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    International Nuclear Information System (INIS)

    Perez, Rafael B.; Protopopescu, Vladimir A.; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method

  1. Loss of Excitation Detection in Doubly Fed Induction Generator by Voltage and Reactive Power Rate

    Directory of Open Access Journals (Sweden)

    M. J. Abbasi

    2016-12-01

    Full Text Available The doubly fed induction generator (DFIG) is one of the most popular technologies used in wind power systems. With the growing use of DFIGs and the increasing dependence of power systems on them in recent years, the protection of these generators against internal faults has received more attention. Loss of excitation (LOE) is among the most frequent failures in electric generators. However, LOE detection studies have heretofore usually been confined to synchronous generators. Common LOE detection methods are based on the impedance trajectory, which makes the system slow and also prone to interpreting a stable power swing (SPS) as a LOE fault. This paper suggests a new method to detect LOE based on variables measured at the DFIG terminal. In this combined method for LOE detection, the rates of change of both the terminal voltage and the output reactive power are utilized, and for SPS detection, a fast Fourier transform (FFT) analysis of the output instantaneous active power is used. The performance of the proposed method was evaluated using the Matlab/Simulink interface for various power capacities and operating conditions. The results proved the method's quickness, simplicity and security.

  2. Energy Reconstruction for Events Detected in TES X-ray Detectors

    Science.gov (United States)

    Ceballos, M. T.; Cardiel, N.; Cobo, B.

    2015-09-01

    The processing of the X-ray events detected by a TES (Transition Edge Sensor) device (such as the one that will be proposed in the ESA AO call for instruments for the Athena mission (Nandra et al. 2013) as a high spectral resolution instrument, X-IFU (Barret et al. 2013)) is a multi-step procedure that starts with the detection of the current pulses in a noisy signal and ends with their energy reconstruction. For this last stage, an energy calibration process is required to convert the pseudo-energies measured in the detector to the real energies of the incoming photons, accounting for possible nonlinearity effects in the detector. We present the details of the energy calibration algorithm we implemented as the last part of the Event Processing software that we are developing for the X-IFU instrument, which permits the calculation of the calibration constants in an analytical way.

  3. A novel CUSUM-based approach for event detection in smart metering

    Science.gov (United States)

    Zhu, Zhicheng; Zhang, Shuai; Wei, Zhiqiang; Yin, Bo; Huang, Xianqing

    2018-03-01

    Non-intrusive load monitoring (NILM) plays a significant role in raising consumer awareness of household electricity use and thereby reducing overall energy consumption in society. With regard to monitoring low-power loads, many researchers have introduced CUSUM into NILM systems, since the traditional event detection method is not as effective as expected. Because the original CUSUM faces limitations when a small shift remains below the threshold, we improve the test statistic so that the permissible deviation rises gradually as the data size increases. This paper proposes a novel event detection method and a corresponding criterion that can be used in NILM systems to recognize transient states and to help the labelling task. Its performance has been tested in a real scenario where eight different appliances are connected to the main line of electric power.
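
    For reference, the classic two-sided CUSUM edge detector that the paper builds on can be sketched as below. The fixed allowance (drift) and threshold here are illustrative; the paper's contribution, an allowance that grows with the data size, is noted in the comments but not implemented.

```python
import numpy as np

def cusum_events(power, drift=5.0, threshold=60.0):
    """Two-sided CUSUM change detector on an aggregate power signal (watts).

    drift (allowance) absorbs small fluctuations; threshold sets the alarm level.
    The paper modifies the allowance so that it grows with the data size; here the
    classic fixed-allowance form is shown for simplicity.
    """
    events, s_pos, s_neg, ref = [], 0.0, 0.0, power[0]
    for i, x in enumerate(power[1:], start=1):
        s_pos = max(0.0, s_pos + (x - ref) - drift)
        s_neg = max(0.0, s_neg - (x - ref) - drift)
        if s_pos > threshold or s_neg > threshold:
            events.append(i)                        # appliance switch-on/off candidate
            s_pos, s_neg, ref = 0.0, 0.0, x         # restart after an alarm
    return events

# Toy signal: 40 W baseline, a 100 W appliance turns on at t=200 and off at t=500.
rng = np.random.default_rng(3)
power = 40 + rng.normal(0, 3, 800)
power[200:500] += 100
print(cusum_events(power))   # expected to flag indices near 200 and 500
```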

  4. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    ... of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure ... curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system based on the method of neurocomputing. ... 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for the automatic detection ...

  5. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Science.gov (United States)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions without movement, in order to segment moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, persons and vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified, it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and the results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.

  6. Indoor acrolein emission and decay rates resulting from domestic cooking events

    Science.gov (United States)

    Seaman, Vincent Y.; Bennett, Deborah H.; Cahill, Thomas M.

    2009-12-01

    Acrolein (2-propenal) is a common constituent of both indoor and outdoor air, can exacerbate asthma in children, and may contribute to other chronic lung diseases. Recent studies have found high indoor levels of acrolein and other carbonyls compared to outdoor ambient concentrations. Heated cooking oils produce considerable amounts of acrolein, thus cooking is likely an important source of indoor acrolein. A series of cooking experiments were conducted to determine the emission rates of acrolein and other volatile carbonyls for different types of cooking oils (canola, soybean, corn and olive oils) and deep-frying different food items. Similar concentrations and emission rates of carbonyls were found when different vegetable oils were used to deep-fry the same food product. The food item being deep-fried was generally not a significant source of carbonyls compared to the cooking oil. The oil cooking events resulted in high concentrations of acrolein that were in the range of 26.4-64.5 μg m⁻³. These concentrations exceed all the chronic regulatory exposure limits and many of the acute exposure limits. The air exchange rate and the decay rate of the carbonyls were monitored to estimate the half-life of the carbonyls. The half-life for acrolein was 14.4 ± 2.6 h, which indicates that indoor acrolein concentrations can persist for a considerable time after cooking in poorly-ventilated homes.
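
    The decay-rate estimate behind a half-life figure like the one above typically comes from a log-linear fit of concentration against time (first-order decay, half-life = ln 2 / k). The snippet below shows that arithmetic on an invented post-cooking concentration series; it uses the reported 14.4 h half-life only to generate the synthetic data and is not the study's analysis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical post-cooking acrolein concentrations (ug/m3) at hourly time points,
# generated from a first-order decay with a 14.4 h half-life plus measurement noise.
t = np.arange(0.0, 10.0)                       # hours after the cooking event
k_true = np.log(2) / 14.4
c = 50.0 * np.exp(-k_true * t) * rng.lognormal(0.0, 0.03, t.size)

# First-order decay: ln C(t) = ln C0 - k t, so -k is the slope of a log-linear fit.
slope, intercept = np.polyfit(t, np.log(c), 1)
k_fit = -slope
print(f"decay constant k = {k_fit:.4f} 1/h, half-life = {np.log(2) / k_fit:.1f} h")
```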

  7. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    Energy Technology Data Exchange (ETDEWEB)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi E-mail: luigi.palatella@df.unipi.it; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We argue that the recent discovery of the non-Poissonian statistics of the seismic main-shocks is a special case of a more general approach to the detection of the distribution of the time increments between one crucial but invisible event and the next. We make the conjecture that the proposed approach can be applied to the analysis of terrorist network with significant benefits for the Intelligence Community.

  8. Automated Feature and Event Detection with SDO AIA and HMI Data

    Science.gov (United States)

    Davey, Alisdair; Martens, P. C. H.; Attrill, G. D. R.; Engell, A.; Farid, S.; Grigis, P. C.; Kasper, J.; Korreck, K.; Saar, S. H.; Su, Y.; Testa, P.; Wills-Davey, M.; Savcheva, A.; Bernasconi, P. N.; Raouafi, N.-E.; Delouille, V. A.; Hochedez, J. F..; Cirtain, J. W.; Deforest, C. E.; Angryk, R. A.; de Moortel, I.; Wiegelmann, T.; Georgouli, M. K.; McAteer, R. T. J.; Hurlburt, N.; Timmons, R.

    The Solar Dynamics Observatory (SDO) represents a new frontier in quantity and quality of solar data. At about 1.5 TB/day, the data will not be easily digestible by solar physicists using the same methods that have been employed for images from previous missions. In order for solar scientists to use the SDO data effectively they need meta-data that will allow them to identify and retrieve data sets that address their particular science questions. We are building a comprehensive computer vision pipeline for SDO, abstracting complete metadata on many of the features and events detectable on the Sun without human intervention. Our project unites more than a dozen individual, existing codes into a systematic tool that can be used by the entire solar community. The feature finding codes will run as part of the SDO Event Detection System (EDS) at the Joint Science Operations Center (JSOC; joint between Stanford and LMSAL). The metadata produced will be stored in the Heliophysics Event Knowledgebase (HEK), which will be accessible on-line for the rest of the world directly or via the Virtual Solar Observatory (VSO) . Solar scientists will be able to use the HEK to select event and feature data to download for science studies.

  9. A Macro-Observation Scheme for Abnormal Event Detection in Daily-Life Video Sequences

    Directory of Open Access Journals (Sweden)

    Chiu Wei-Yao

    2010-01-01

    Full Text Available We propose a macro-observation scheme for abnormal event detection in daily life. The proposed macro-observation representation records the time-space energy of the motions of all moving objects in a scene without segmenting individual object parts. The energy history of each pixel in the scene is instantly updated with exponential weights without explicitly specifying the duration of each activity. Since possible activities in daily life are numerous and distinct from each other and not all abnormal events can be foreseen, images from a video sequence that spans sufficient repetition of normal day-to-day activities are first randomly sampled. A constrained clustering model is proposed to partition the sampled images into groups. A newly observed event that has a distinct distance from any of the cluster centroids is then classified as an anomaly. The proposed method has been evaluated on the daily work of a laboratory and on the BEHAVE benchmark dataset. The experimental results reveal that it can detect abnormal events such as burglary and fighting well, as long as they last for a sufficient duration of time. The proposed method can be used as a support system for scenes that require full-time monitoring personnel.

  10. Biometric Quantization through Detection Rate Optimized Bit Allocation

    Directory of Open Access Journals (Sweden)

    C. Chen

    2009-01-01

    Full Text Available Extracting binary strings from real-valued biometric templates is a fundamental step in many biometric template protection systems, such as fuzzy commitment, fuzzy extractor, secure sketch, and helper data systems. Previous work has been focusing on the design of optimal quantization and coding for each single feature component, yet the binary string—concatenation of all coded feature components—is not optimal. In this paper, we present a detection rate optimized bit allocation (DROBA principle, which assigns more bits to discriminative features and fewer bits to nondiscriminative features. We further propose a dynamic programming (DP approach and a greedy search (GS approach to achieve DROBA. Experiments of DROBA on the FVC2000 fingerprint database and the FRGC face database show good performances. As a universal method, DROBA is applicable to arbitrary biometric modalities, such as fingerprint texture, iris, signature, and face. DROBA will bring significant benefits not only to the template protection systems but also to the systems with fast matching requirements or constrained storage capability.
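
    The greedy-search flavour of DROBA can be illustrated as follows: under a total bit budget, each bit is assigned to the feature whose per-feature detection probability is degraded least by taking one more bit, so that the product over features (the overall detection rate) stays as high as possible. The detection-probability table below is a synthetic stand-in, not derived from the FVC2000 or FRGC data, and the sketch omits the dynamic-programming variant.

```python
import numpy as np

rng = np.random.default_rng(5)
n_features, total_bits, max_bits = 12, 24, 4

# detection_prob[i, b]: probability that feature i of a genuine sample falls into
# the enrolled quantisation interval when b bits are spent on it (b = 0..max_bits).
# Synthetic stand-in: discriminative features (large s) degrade slowly with b.
s = rng.uniform(1.5, 5.0, n_features)
bits = np.arange(max_bits + 1)
detection_prob = np.exp(-(bits[None, :] / s[:, None]) ** 2)

alloc = np.zeros(n_features, dtype=int)
for _ in range(total_bits):
    # Greedy step: give the next bit to the feature whose overall detection rate
    # (product over features) is reduced the least by that extra bit.
    cand = np.where(alloc < max_bits)[0]
    loss_ratio = detection_prob[cand, alloc[cand] + 1] / detection_prob[cand, alloc[cand]]
    alloc[cand[np.argmax(loss_ratio)]] += 1

overall = detection_prob[np.arange(n_features), alloc].prod()
print("bits per feature:", alloc)
print(f"overall detection rate: {overall:.3f}")
```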

  11. Detecting adverse events in surgery: comparing events detected by the Veterans Health Administration Surgical Quality Improvement Program and the Patient Safety Indicators.

    Science.gov (United States)

    Mull, Hillary J; Borzecki, Ann M; Loveland, Susan; Hickson, Kathleen; Chen, Qi; MacDonald, Sally; Shin, Marlena H; Cevasco, Marisa; Itani, Kamal M F; Rosen, Amy K

    2014-04-01

    The Patient Safety Indicators (PSIs) use administrative data to screen for select adverse events (AEs). In this study, VA Surgical Quality Improvement Program (VASQIP) chart review data were used as the gold standard to measure the criterion validity of 5 surgical PSIs. Independent chart review was also used to determine reasons for PSI errors. The sensitivity, specificity, and positive predictive value of PSI software version 4.1a were calculated among Veterans Health Administration hospitalizations (2003-2007) reviewed by VASQIP (n = 268,771). Nurses re-reviewed a sample of hospitalizations for which PSI and VASQIP AE detection disagreed. Sensitivities ranged from 31% to 68%, specificities from 99.1% to 99.8%, and positive predictive values from 31% to 72%. Reviewers found that coding errors accounted for some PSI-VASQIP disagreement; some disagreement was also the result of differences in AE definitions. These results suggest that the PSIs have moderate criterion validity; however, some surgical PSIs detect different AEs than VASQIP. Future research should explore using both methods to evaluate surgical quality. Published by Elsevier Inc.

  12. Temporal and spatial predictability of an irrelevant event differently affect detection and memory of items in a visual sequence

    Directory of Open Access Journals (Sweden)

    Junji eOhyama

    2016-02-01

    Full Text Available We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of the spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable conditions, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection reaction times were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising) events. Evidently, temporal predictability has differential effects on the detection and memory of a visual item embedded in a sequence of images.

  13. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    Science.gov (United States)

    De Geronimo, Gianluigi [Syosset, NY; Bolotnikov, Aleksey E [South Setauket, NY; Carini, Gabriella [Port Jefferson, NY

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, a cathode electrode, a collecting grid and a non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 Volts is applied to the cathode electrode. A voltage greater than the voltage applied to the cathode is applied to the non-collecting grid. A voltage greater than the voltage applied to the non-collecting grid is applied to the collecting grid. The collecting grid and non-collecting grid signals are summed and subtracted, creating a sum and a difference, respectively. The difference and the sum are divided, creating a ratio. A gain coefficient for each depth (the distance between the ionizing event and the collecting grid) is determined, whereby the difference between the collecting electrode and the non-collecting electrode multiplied by the corresponding gain coefficient is the depth-corrected energy of an ionizing event. Therefore, the energy of each ionizing event is the difference between the collecting grid and the non-collecting grid multiplied by the corresponding gain coefficient. The depth of the ionizing event can also be determined from the ratio.
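
    Numerically, the readout arithmetic described above is just a few operations per event: form the sum and the difference of the two grid signals, use their ratio as a depth proxy, look up the gain coefficient for that depth, and scale the difference by it. The sketch below shows that arithmetic with a made-up gain-versus-ratio calibration table and assumes the ratio is taken as difference over sum; a real calibration would come from measurements of the specific detector.

```python
import numpy as np

# Hypothetical calibration: gain coefficient vs. the difference/sum ratio
# (the ratio serves as a proxy for interaction depth). In practice this table
# would come from calibration measurements of the actual co-planar grid detector.
ratio_grid = np.linspace(0.2, 1.0, 9)
gain_grid = 1.0 + 0.25 * (1.0 - ratio_grid)        # deeper events get a larger gain

def corrected_energy(collecting, non_collecting):
    diff = collecting - non_collecting              # difference of the two grid signals
    total = collecting + non_collecting             # sum of the two grid signals
    ratio = diff / total                            # depth proxy (assumed convention)
    gain = np.interp(ratio, ratio_grid, gain_grid)  # per-depth gain coefficient
    return gain * diff                              # depth-corrected energy

# Two events with the same deposited energy at different interaction depths.
print(corrected_energy(np.array([120.0, 95.0]), np.array([10.0, 35.0])))
```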

  14. Ion-ion coincidence imaging at high event rate using an in-vacuum pixel detector

    Science.gov (United States)

    Long, Jingming; Furch, Federico J.; Durá, Judith; Tremsin, Anton S.; Vallerga, John; Schulz, Claus Peter; Rouzée, Arnaud; Vrakking, Marc J. J.

    2017-07-01

    A new ion-ion coincidence imaging spectrometer based on a pixelated complementary metal-oxide-semiconductor detector has been developed for the investigation of molecular ionization and fragmentation processes in strong laser fields. Used as part of a velocity map imaging spectrometer, the detection system comprises a set of microchannel plates and a Timepix detector. A fast time-to-digital converter (TDC) is used to enhance the ion time-of-flight resolution by correlating timestamps registered separately by the Timepix detector and the TDC. In addition, sub-pixel spatial resolution is achieved […]. A proof-of-principle experiment on strong-field dissociative double ionization of carbon dioxide molecules (CO2) is presented, using a 400 kHz repetition rate laser system. The experimental results demonstrate that the spectrometer can detect multiple ions in coincidence, making it a valuable tool for studying the fragmentation dynamics of molecules in strong laser fields.

  15. The influence of disturbance events on survival and dispersal rates of Florida box turtles

    Science.gov (United States)

    Dodd, C.K.; Ozgul, A.; Oli, M.K.

    2006-01-01

    Disturbances have the potential to cause long-term effects to ecosystem structure and function, and they may affect individual species in different ways. Long-lived vertebrates such as turtles may be at risk from such events, inasmuch as their life histories preclude rapid recovery should extensive mortality occur. We applied capture–mark–recapture models to assess disturbance effects on a population of Florida box turtles (Terrapene carolina bauri) on Egmont Key, Florida, USA. Near the midpoint of the study, a series of physical disturbances affected the island, from salt water overwash associated with several tropical storms to extensive removal of nonindigenous vegetation. These disturbances allowed us to examine demographic responses of the turtle population and to determine if they affected dispersal throughout the island. Adult survival rates did not vary significantly either between sexes or among years of the study. Survival rates did not vary significantly between juvenile and adult turtles, or among years of the study. Furthermore, neither adult nor juvenile survival rates differed significantly between pre- and post-disturbance. However, dispersal rates varied significantly among the four major study sites, and dispersal rates were higher during the pre-disturbance sampling periods compared to post-disturbance. Our results suggest few long-term effects on the demography of the turtle population. Florida box turtles responded to tropical storms and vegetation control by moving to favorable habitats minimally affected by the disturbances and remaining there. As long as turtles and perhaps other long-lived vertebrates can disperse to non-disturbed habitat, and high levels of mortality do not occur in a population, a long life span may allow them to wait out the impact of disturbance with potentially little effect on long-term population processes.

  16. Combined passive detection and ultrafast active imaging of cavitation events induced by short pulses of high-intensity ultrasound.

    Science.gov (United States)

    Gateau, Jérôme; Aubry, Jean-François; Pernot, Mathieu; Fink, Mathias; Tanter, Mickaël

    2011-03-01

    The activation of natural gas nuclei to induce larger bubbles is possible using short ultrasonic excitations of high amplitude, and is required for ultrasound cavitation therapies. However, little is known about the distribution of nuclei in tissues. Therefore, the acoustic pressure level necessary to generate bubbles in a targeted zone and their exact location are currently difficult to predict. To monitor the initiation of cavitation activity, a novel all-ultrasound technique sensitive to single nucleation events is presented here. It is based on combined passive detection and ultrafast active imaging over a large volume using the same multi-element probe. Bubble nucleation was induced using a focused transducer (660 kHz, f-number = 1) driven by a high-power electric burst (up to 300 W) of one to two cycles. Detection was performed with a linear array (4 to 7 MHz) aligned with the single-element focal point. In vitro experiments in gelatin gel and muscular tissue are presented. The synchronized passive detection enabled radio-frequency data to be recorded, comprising high-frequency coherent wave fronts as signatures of the acoustic emissions linked to the activation of the nuclei. Active change detection images were obtained by subtracting echoes collected in the unnucleated medium. These indicated the appearance of stable cavitating regions. Because of the ultrafast frame rate, active detection occurred as quickly as 330 μs after the high-amplitude excitation and the dynamics of the induced regions were studied individually.

  17. Wireless AE Event and Environmental Monitoring for Wind Turbine Blades at Low Sampling Rates

    Science.gov (United States)

    Bouzid, Omar M.; Tian, Gui Y.; Cumanan, K.; Neasham, J.

    Integrating acoustic wireless technology into structural health monitoring (SHM) applications introduces new challenges because of the requirements for high sampling rates, additional communication bandwidth, memory space, and power resources. To circumvent these challenges, this chapter proposes a novel wireless SHM technique based on acoustic emission (AE), with field deployment on the structure of a wind turbine. The solution samples below the Nyquist rate, and features extracted from the aliased AE signals, rather than signals reconstructed on board the wireless nodes, are used to monitor AE events such as wind, rain, strong hail, and bird strikes under different environmental conditions, in conjunction with artificial AE sources. A time-domain feature extraction algorithm, together with principal component analysis (PCA), is used to extract and classify the relevant information, which in turn is used to recognise the test condition represented by the response signals. The proposed technique yields a significant data reduction during the monitoring of wind turbine blades.
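    A hedged sketch of the feature-extraction and classification step follows. It is not the chapter's implementation: the time-domain features (RMS, peak, kurtosis), the PCA-plus-nearest-centroid classifier and the synthetic windows are all assumptions chosen to show the idea of classifying aliased AE windows from low-dimensional features.

    # Illustrative sketch of classifying AE events from time-domain features of
    # sub-Nyquist (aliased) windows using PCA; scikit-learn is assumed available.
    import numpy as np
    from scipy.stats import kurtosis
    from sklearn.decomposition import PCA
    from sklearn.neighbors import NearestCentroid
    from sklearn.pipeline import make_pipeline

    def time_features(windows: np.ndarray) -> np.ndarray:
        """Compute simple per-window features: RMS, peak amplitude, kurtosis."""
        rms = np.sqrt(np.mean(windows ** 2, axis=1))
        peak = np.max(np.abs(windows), axis=1)
        kurt = kurtosis(windows, axis=1)
        return np.column_stack([rms, peak, kurt])

    rng = np.random.default_rng(1)
    signals = rng.normal(size=(200, 512))            # placeholder aliased AE windows
    signals[100:] *= 3.0                             # second condition: higher-energy events
    labels = np.array([0] * 100 + [1] * 100)         # e.g., 0 = rain, 1 = hail (hypothetical labels)

    clf = make_pipeline(PCA(n_components=2), NearestCentroid())
    clf.fit(time_features(signals), labels)
    print("training accuracy:", clf.score(time_features(signals), labels))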

  18. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are
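    The bulletin comparison described above can be pictured with a small sketch: confirmed industrial events are associated with automatic-bulletin origins by origin time, and the epicentral mislocation is computed. The event lists, the 60 s association window and the distance formula below are illustrative assumptions, not the project's processing.

    # Hedged sketch of comparing confirmed events with an automatic bulletin.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance in km between two epicentres."""
        r = 6371.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    confirmed = [{"time": 1000.0, "lat": 67.10, "lon": 20.60},   # known mining explosions (made up)
                 {"time": 5000.0, "lat": 69.40, "lon": 30.80}]
    automatic = [{"time": 1012.0, "lat": 67.35, "lon": 21.10},
                 {"time": 5055.0, "lat": 69.90, "lon": 31.90}]

    for ev in confirmed:
        # associate the closest automatic origin within +/- 60 s of the confirmed origin time
        candidates = [a for a in automatic if abs(a["time"] - ev["time"]) < 60.0]
        if not candidates:
            print("unmatched confirmed event at t =", ev["time"])
            continue
        best = min(candidates, key=lambda a: abs(a["time"] - ev["time"]))
        err = haversine_km(ev["lat"], ev["lon"], best["lat"], best["lon"])
        print(f"mislocation for event at t={ev['time']:.0f} s: {err:.1f} km")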

  19. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    International Nuclear Information System (INIS)

    Kvaerna, T.; Gibbons, S.J.; Ringdal, F.; Harris, D.B.

    2007-01-01

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  20. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS).

    Science.gov (United States)

    Rigi, Amin; Baghaei Naeini, Fariborz; Makris, Dimitrios; Zweiri, Yahya

    2018-01-24

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The proposed approach is validated using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that affected the results significantly. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential of the sensor being used for manipulation applications, especially in dynamic environments.

  1. Event rates, hospital utilization, and costs associated with major complications of diabetes: a multicountry comparative analysis.

    Directory of Open Access Journals (Sweden)

    Philip M Clarke

    2010-02-01

    Full Text Available Diabetes imposes a substantial burden globally in terms of premature mortality, morbidity, and health care costs. Estimates of economic outcomes associated with diabetes are essential inputs to policy analyses aimed at prevention and treatment of diabetes. Our objective was to estimate and compare event rates, hospital utilization, and costs associated with major diabetes-related complications in high-, middle-, and low-income countries. Incidence and history of diabetes-related complications, hospital admissions, and length of stay were recorded in 11,140 patients with type 2 diabetes participating in the Action in Diabetes and Vascular Disease (ADVANCE) study (mean age at entry 66 y). The probability of hospital utilization and number of days in hospital for major events associated with coronary disease, cerebrovascular disease, congestive heart failure, peripheral vascular disease, and nephropathy were estimated for three regions (Asia, Eastern Europe, and Established Market Economies) using multiple regression analysis. The resulting estimates of days spent in hospital were multiplied by regional estimates of the costs per hospital bed-day from the World Health Organization to compute annual acute and long-term costs associated with the different types of complications. To assist comparability, costs are reported in international dollars (Int$), which represent a hypothetical currency that allows the same quantities of goods or services to be purchased regardless of country, standardized on purchasing power in the United States. A cost calculator accompanying this paper enables the estimation of costs for individual countries and translation of these costs into local currency units. The probability of attending a hospital following an event was highest for heart failure (93%-96% across regions) and lowest for nephropathy (15%-26%). The average numbers of days in hospital given at least one admission were greatest for stroke (17-32 d across
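    The costing step described above (expected days in hospital per event multiplied by a regional cost per bed-day) reduces to a simple calculation; the sketch below uses invented placeholder numbers, not ADVANCE estimates or WHO bed-day costs.

    # Minimal illustration of the acute-cost calculation. All figures are hypothetical.
    hospital_prob = {"heart failure": 0.95, "stroke": 0.90, "nephropathy": 0.20}   # P(admission | event)
    days_if_admitted = {"heart failure": 14.0, "stroke": 25.0, "nephropathy": 10.0}
    cost_per_bed_day_int = 200.0     # assumed regional cost per hospital bed-day, Int$

    for event, p in hospital_prob.items():
        expected_days = p * days_if_admitted[event]
        cost = expected_days * cost_per_bed_day_int
        print(f"{event}: expected acute cost per event = Int$ {cost:,.0f}")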

  2. 2Kx2K resolution element photon counting MCP sensor with >200 kHz event rate capability

    CERN Document Server

    Vallerga, J V

    2000-01-01

    Siegmund Scientific undertook a NASA Small Business Innovative Research (SBIR) contract to develop a versatile, high-performance photon (or particle) counting detector combining recent technical advances in all aspects of Microchannel Plate (MCP) detector development in a low cost, commercially viable package that can support a variety of applications. The detector concept consists of a set of MCPs whose output electron pulses are read out with a crossed delay line (XDL) anode and associated high-speed event encoding electronics. The delay line anode allows high-resolution photon event centroiding at very high event rates and can be scaled to large formats (>40 mm) while maintaining good linearity and high temporal stability. The optimal sensitivity wavelength range is determined by the choice of opaque photocathodes. Specific achievements included: high spatial resolution; event rates of >200 000 events s⁻¹; local rates of >100 events s⁻¹ per resolution element; event timing of <1 ns; and low background ...
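    The crossed-delay-line readout encodes the event position in the difference of signal arrival times at the two ends of the delay line. The sketch below shows that relation; the 80 ns total delay and 40 mm length are assumed values, not the parameters of this detector.

    # Sketch of the XDL centroiding principle (illustrative parameters).
    def xdl_position(t_left_ns: float, t_right_ns: float,
                     total_delay_ns: float = 80.0, length_mm: float = 40.0) -> float:
        """Return the event position (mm) along the delay line from the end-arrival times."""
        # t_left + t_right is constant (= total delay); the difference encodes position
        frac = 0.5 * (1.0 + (t_left_ns - t_right_ns) / total_delay_ns)
        return frac * length_mm

    print(xdl_position(30.0, 50.0))   # earlier arrival at the left end -> 15.0 mm from that end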

  3. Soft error rate analysis methodology of multi-Pulse-single-event transients

    International Nuclear Information System (INIS)

    Zhou Bin; Huo Mingxue; Xiao Liyi

    2012-01-01

    As transistor feature sizes scale down, soft errors in combinational logic caused by high-energy particle radiation are gaining more and more concern. In this paper, a combinational logic soft error analysis methodology considering multi-pulse single-event transients (MPSETs) and re-convergence with multiple transient pulses is proposed. In the proposed approach, the voltage pulse produced at the standard cell output is approximated by a triangle waveform and characterized by three parameters: pulse width, the transition time of the first edge, and the transition time of the second edge. For pulses whose amplitude is smaller than the supply voltage, an edge extension technique is proposed. Moreover, an efficient electrical masking model that comprehensively considers transition time, delay, width, and amplitude is proposed, along with an approach that uses the transition times of the two edges and the pulse width to compute the pulse amplitude. Finally, our proposed firstly-independently-propagating-secondly-mutually-interacting (FIP-SMI) model is used to handle the more practical case of re-convergence gates with multiple transient pulses. For MPSETs, a random generation model is also proposed. Compared to estimates obtained using circuit-level simulations in HSpice, the proposed soft error rate analysis algorithm shows about 10% error in SER estimation with a speedup of 300 when single-pulse single-event transients (SPSETs) are considered. We also demonstrate that runtime and SER decrease as P0 increases, using designs from the ISCAS-85 benchmarks. (authors)
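    The triangle-pulse representation lends itself to a small sketch. The masking rule below (a pulse narrower than the gate delay is filtered out, otherwise its width and amplitude are degraded) is a deliberate simplification for illustration and is not the paper's electrical masking model.

    # Hedged sketch of a triangle SET pulse propagating through a chain of gates.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TrianglePulse:
        width: float       # full width at the switching threshold (ps)
        t_rise: float      # transition time of the leading edge (ps)
        t_fall: float      # transition time of the trailing edge (ps)
        amplitude: float   # peak voltage (V)

    def propagate(pulse: TrianglePulse, gate_delay: float, vdd: float = 1.0) -> Optional[TrianglePulse]:
        """Propagate a pulse through one gate; return None if it is electrically masked."""
        if pulse.width <= gate_delay:                  # too narrow to switch the gate output
            return None
        out_width = pulse.width - gate_delay           # simple degradation rule (assumption)
        out_amp = min(vdd, pulse.amplitude * out_width / pulse.width)
        return TrianglePulse(out_width, pulse.t_rise, pulse.t_fall, out_amp)

    p = TrianglePulse(width=120.0, t_rise=20.0, t_fall=30.0, amplitude=0.9)
    for stage, delay in enumerate([25.0, 40.0, 70.0], start=1):
        p = propagate(p, delay)
        if p is None:
            print(f"pulse electrically masked at stage {stage}")
            break
        print(f"stage {stage}: width={p.width:.0f} ps, amplitude={p.amplitude:.2f} V")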

  4. Developing a Digital Medicine System in Psychiatry: Ingestion Detection Rate and Latency Period.

    Science.gov (United States)

    Profit, Deborah; Rohatagi, Shashank; Zhao, Cathy; Hatch, Ainslie; Docherty, John P; Peters-Strickland, Timothy S

    2016-09-01

    A digital medicine system (DMS) has been developed to measure and report adherence to an atypical antipsychotic, aripiprazole, in psychiatric patients. The DMS consists of 3 components: ingestible sensor embedded in a medication tablet, wearable sensor, and secure mobile and cloud-based applications. An umbrella study protocol was designed to rapidly assess the technical performance and safety of the DMS in multiple substudies to guide the technology development. Two sequential substudies enrolled 30 and 29 healthy volunteers between March-April 2014 and February-March 2015, respectively, to assess detection accuracy of the ingestible sensor by the DMS and the latency period between ingestion and detection of the ingestion by the wearable sensor or the cloud-based server. The first substudy identified areas for improvement using early versions of the wearable sensor and the mobile application. The second substudy tested updated versions of the components and showed an overall ingestion detection rate of 96.6%. Mean latency times for the signal transmission were 1.1-1.3 minutes (from ingestion to the wearable sensor detection) and 6.2-10.3 minutes (from the wearable sensor detection to the server detection). Half of transmissions were completed in < 2 minutes, and ~90% of ingestions were registered by the smartphone within 30 minutes of ingestion. No serious adverse events, discontinuations, or clinically significant laboratory/vital signs findings were reported. The DMS implementing modified versions of the smartphone application and the wearable sensor has the technical capability to detect and report tablet ingestion with high accuracy and acceptable latency time. ClinicalTrials.gov identifier: NCT02091882. © Copyright 2016 Physicians Postgraduate Press, Inc.

  5. Automatic detection of lexical change: an auditory event-related potential study.

    Science.gov (United States)

    Muller-Gass, Alexandra; Roye, Anja; Kirmse, Ursula; Saupe, Katja; Jacobsen, Thomas; Schröger, Erich

    2007-10-29

    We investigated the detection of rare task-irrelevant changes in the lexical status of speech stimuli. Participants performed a nonlinguistic task on word and pseudoword stimuli that occurred, in separate conditions, rarely or frequently. Task performance for pseudowords deteriorated relative to words, suggesting unintentional lexical analysis. Furthermore, rare word and pseudoword changes had a similar effect on the event-related potentials, starting as early as 165 ms. This is the first demonstration of the automatic detection of a change in lexical status that is not based on a co-occurring acoustic change. We propose that, following lexical analysis of the incoming stimuli, a mental representation of the lexical regularity is formed and used as a template against which lexical change can be detected.

  6. EVENT DETECTION USING MOBILE PHONE MASS GPS DATA AND THEIR RELIABILITY VERIFICATION BY DMSP/OLS NIGHT LIGHT IMAGE

    Directory of Open Access Journals (Sweden)

    A. Yuki

    2016-06-01

    Full Text Available In this study, we developed a method to detect a sudden population concentration on a certain day and area, that is, an "Event," across Japan in 2012 using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, the areas and days where Events occurred were detected by aggregating the mass of stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated the reasonable reliability of the proposed Event detection method by comparing the detection results with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
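    A rough sketch of the grid/day aggregation step follows: stay locations are snapped to roughly 1-km cells, unique visitors are counted per cell and day, and a day is flagged when its count far exceeds the cell's typical level. The grid construction, the "5x the median" rule and the synthetic stay records are assumptions, not the paper's criteria.

    # Toy illustration of Event detection by grid-cell/day visitor counts.
    from collections import defaultdict
    from statistics import median

    def cell(lat: float, lon: float) -> tuple:
        # ~1 km grid: 0.009 deg of latitude; longitude scaling ignored for brevity
        return (int(lat / 0.009), int(lon / 0.009))

    # (user_id, day, lat, lon) stay records -- synthetic example data
    stays = [("u%d" % i, "2012-08-15", 35.6595, 139.7005) for i in range(500)] + \
            [("u%d" % i, d, 35.6595, 139.7005) for i in range(40) for d in ("2012-08-14", "2012-08-16")]

    visitors = defaultdict(set)                       # (cell, day) -> set of users
    for uid, day, lat, lon in stays:
        visitors[(cell(lat, lon), day)].add(uid)

    counts = defaultdict(dict)                        # cell -> {day: number of unique visitors}
    for (c, day), users in visitors.items():
        counts[c][day] = len(users)

    for c, by_day in counts.items():
        typical = median(by_day.values())
        for day, n in by_day.items():
            if n > 5 * typical:                       # crude "Event" rule (assumption)
                print(f"Event candidate in cell {c} on {day}: {n} visitors (typical {typical})")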

  7. Predictive modeling of structured electronic health records for adverse drug event detection.

    Science.gov (United States)

    Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik

    2015-01-01

    The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. Feature selection leads to increased predictive performance for both data types, in isolation and

  8. An Ensemble Approach for Emotion Cause Detection with Event Extraction and Multi-Kernel SVMs

    Institute of Scientific and Technical Information of China (English)

    Ruifeng Xu; Jiannan Hu; Qin Lu; Dongyin Wu; Lin Gui

    2017-01-01

    In this paper, we present a new and challenging task for emotion analysis, namely emotion cause extraction. In this task, we focus on detecting the emotion cause, i.e., the reason or stimulus for an emotion, rather than performing regular emotion classification or emotion component extraction. Since no open dataset for this task was available, we first designed and annotated an emotion cause dataset that follows the scheme of the W3C Emotion Markup Language. We then present an emotion cause detection method that uses an event extraction framework, in which a tree-structure-based representation method is used to represent the events. Since the distribution of events is imbalanced in the training data, we propose an under-sampling-based bagging algorithm to solve this problem. Even with a limited training set, the proposed approach can still extract sufficient features for analysis by bagging multi-kernel SVMs. Evaluations show that our approach achieves an F-measure 7.04% higher than the state-of-the-art methods.
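    The under-sampling-based bagging idea can be sketched compactly: each base SVM is trained on all minority examples plus a random balanced subset of the majority class, and predictions are combined by majority vote. The sketch omits the paper's tree-structured event features and multi-kernel learning, and uses synthetic data.

    # Sketch of under-sampling bagging for an imbalanced binary task (scikit-learn assumed).
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (500, 10)), rng.normal(1.5, 1, (50, 10))])
    y = np.array([0] * 500 + [1] * 50)                # 1 = emotion-cause clause (minority class)

    def undersample_bagging(X, y, n_estimators=11):
        minority = np.flatnonzero(y == 1)
        majority = np.flatnonzero(y == 0)
        models = []
        for _ in range(n_estimators):
            sampled = rng.choice(majority, size=len(minority), replace=False)
            idx = np.concatenate([minority, sampled])     # balanced bootstrap of the training set
            models.append(SVC(kernel="rbf").fit(X[idx], y[idx]))
        return models

    def predict(models, X):
        votes = np.mean([m.predict(X) for m in models], axis=0)
        return (votes >= 0.5).astype(int)

    models = undersample_bagging(X, y)
    print("recall on the minority class:", predict(models, X)[y == 1].mean())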

  9. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality studies and weather condition monitoring. In such infrastructures, many sensor nodes are distributed over a specific area, and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time-series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
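    A much-simplified sketch of window-level anomaly ranking is shown below: each window is scored against the sensor's own history (temporal term) and against neighbouring sensors over the same window (spatial term). This additive scoring rule is an illustration under assumed data shapes, not the paper's algorithm.

    # Rank anomalous windows across multiple sensors measuring one parameter.
    import numpy as np

    def window_scores(data: np.ndarray, win: int = 24) -> np.ndarray:
        """data: (n_sensors, n_samples). Returns (n_sensors, n_windows) anomaly scores."""
        n_sensors, n_samples = data.shape
        n_win = n_samples // win
        w = data[:, :n_win * win].reshape(n_sensors, n_win, win)
        means = w.mean(axis=2)                               # per-sensor, per-window mean
        temporal = np.abs(means - np.median(means, axis=1, keepdims=True))
        spatial = np.abs(means - np.median(means, axis=0, keepdims=True))
        return temporal + spatial                            # higher = more anomalous

    rng = np.random.default_rng(2)
    data = rng.normal(20.0, 0.5, size=(6, 24 * 30))          # 6 temperature sensors, 30 days hourly
    data[3, 24 * 10:24 * 11] += 8.0                          # sensor 3 drifts for one day
    scores = window_scores(data)
    sensor, window = np.unravel_index(np.argmax(scores), scores.shape)
    print(f"most anomalous: sensor {sensor}, window {window}")   # expect sensor 3, window 10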

  10. Vision-based Event Detection of the Sit-to-Stand Transition

    Directory of Open Access Journals (Sweden)

    Victor Shia

    2015-12-01

    Full Text Available Sit-to-stand (STS motions are one of the most important activities of daily living as they serve as a precursor to mobility and walking. However, there exist no standard method of segmenting STS motions. This is partially due to the variety of different sensors and modalities used to study the STS motion such as force plate, vision, and accelerometers, each providing different types of data, and the variability of the STS motion in video data. In this work, we present a method using motion capture to detect events in the STS motion by estimating ground reaction forces, thereby eliminating the variability in joint angles from visual data. We illustrate the accuracy of this method with 10 subjects with an average difference of 16.5ms in event times obtained via motion capture vs force plate. This method serves as a proof of concept for detecting events in the STS motion via video which are comparable to those obtained via force plate.

  11. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    Science.gov (United States)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column through towing collection nets behind ships. Tow nets are limited in spatial resolution, and often destroy abundant gelatinous animals resulting in species undersampling. Video camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50m to 4000m, and provide high-resolution data at the scale of the individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames is developed by detecting whether or not there is an interesting candidate object for an animal present in a particular sequence of underwater video -- video frames that do not contain any "interesting" events. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are
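    The tracking stage mentioned above uses linear Kalman filters over the candidate locations returned by the attention stage. Below is a generic constant-velocity Kalman filter of that kind; the noise levels and the measurement sequence are invented for illustration and are not MBARI's parameters.

    # Minimal constant-velocity Kalman filter tracking one candidate across frames.
    import numpy as np

    dt = 1.0
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
    H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
    Q = 0.01 * np.eye(4)          # process noise (assumed)
    R = 2.0 * np.eye(2)           # measurement noise (assumed)

    x = np.array([100.0, 50.0, 0.0, 0.0])   # initial state (x, y, vx, vy) from the first detection
    P = 10.0 * np.eye(4)

    measurements = [(102, 51), (105, 53), (108, 54), (111, 56)]   # candidate locations per frame
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new candidate location
        y = np.asarray(z, float) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(4) - K @ H) @ P
        print(f"track position: ({x[0]:.1f}, {x[1]:.1f}), velocity: ({x[2]:.2f}, {x[3]:.2f})")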

  12. Performance Evaluation of Wireless Sensor Networks for Event-Detection with Shadowing-Induced Radio Irregularities

    Directory of Open Access Journals (Sweden)

    Giuseppe De Marco

    2007-01-01

    Full Text Available In this paper, we study a particular application of wireless sensor networks for event detection and tracking. In this kind of application, the transport of data is simplified, and guaranteeing a minimum number of packets at the monitoring node is the only constraint on the performance of the sensor network. This minimum number of packets is called the event-reliability. Contrary to other studies on the subject, here we consider the behavior of such a network in the presence of a realistic radio model, such as shadowing of the radio signal. With this setting, we extend our previous analysis of the event-reliability approach for the transport of data. In particular, both regular and random networks are considered. The contribution of this work is to show via simulations that, in the presence of randomness or irregularities in the radio channel, the event-reliability can be jeopardized, that is, the constraint on the minimum number of packets at the sink node may not be satisfied.
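    How shadowing can break event-reliability is easy to illustrate with a log-normal shadowing link model: each sensor's report reaches the sink only if its received power clears a sensitivity threshold, and the event counts as detected only if at least R packets arrive. The radio parameters, node placement and reliability target below are illustrative assumptions.

    # Monte Carlo sketch of event-reliability under log-normal shadowing.
    import numpy as np

    rng = np.random.default_rng(3)

    def delivered(dist_m, tx_dbm=0.0, pl0_db=40.0, n=3.0, sigma_db=6.0, sens_dbm=-90.0):
        """Received power = Ptx - PL(d0) - 10*n*log10(d) - X_sigma; True if above sensitivity."""
        shadowing = rng.normal(0.0, sigma_db, size=len(dist_m))
        prx = tx_dbm - pl0_db - 10.0 * n * np.log10(dist_m) - shadowing
        return prx > sens_dbm

    event_reliability = 10                       # minimum packets required at the sink
    distances = rng.uniform(5.0, 60.0, size=30)  # 30 reporting sensors, metres from the sink

    trials = 1000
    success = sum(delivered(distances).sum() >= event_reliability for _ in range(trials))
    print(f"event-reliability constraint met in {success / trials:.1%} of trials")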

  13. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  14. Femtomolar detection of single mismatches by discriminant analysis of DNA hybridization events using gold nanoparticles.

    Science.gov (United States)

    Ma, Xingyi; Sim, Sang Jun

    2013-03-21

    Even though DNA-based nanosensors have been demonstrated for quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for further understanding of DNA mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate the hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combined in a suitable orientation following the hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single mismatched (MMT) and perfectly matched DNA (PMT) were different. Therefore, we obtained an optimized salt concentration that allowed for discrimination of MMT from PMT without stringent control of temperature or pH. The results indicated this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.

  15. Detection of genetically modified maize events in Brazilian maize-derived food products

    Directory of Open Access Journals (Sweden)

    Maria Regina Branquinho

    2013-09-01

    Full Text Available The Brazilian government has approved many transgenic maize lines for commercialization and has established a threshold of 1% for food labeling, which underscores the need for monitoring programs. Thirty-four samples, including flours and different types of nacho chips, were analyzed by conventional and real-time PCR in 2011 and 2012. The events MON810, Bt11, and TC1507 were detected in most of the samples, and NK603 was present only in the samples analyzed in 2012. The authorized lines GA21 and T25 and the unauthorized Bt176 were not detected. All samples collected in 2011 that were positive in the qualitative tests showed a transgenic content higher than 1%, and none of them was correctly labeled. Regarding the samples collected in 2012, all positive samples were quantified above the threshold, and 47.0% were not correctly labeled. The overall results indicated that the major genetically modified organisms detected were the MON810, TC1507, Bt11, and NK603 events. Some industries that had failed to label their products in 2011 started labeling them in 2012, demonstrating compliance with the current legislation and respect for consumer rights. Although these results are encouraging, they clearly demonstrate the need for continuous monitoring programs to assure consumers that food products are labeled properly.

  16. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    Science.gov (United States)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time-series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the archived data rearranged into spatio-temporal matrices, which allow easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization is greater the larger the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cube concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events using data from the Tropical Rainfall Measuring Mission (TRMM).
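    The "data rod" idea (reorganizing time-step grids so that a point time series is contiguous) can be shown with a toy in-memory example; real archives live in files rather than a NumPy array, and the shapes below are illustrative.

    # Toy illustration of reorganizing (time, lat, lon) arrays into per-pixel time series.
    import numpy as np

    n_t, n_lat, n_lon = 365, 180, 360
    archive = np.random.default_rng(4).random((n_t, n_lat, n_lon)).astype(np.float32)

    # "Data cube" reorganisation: make time the fastest-varying (contiguous) axis, so
    # archive_rods[i, j, :] is a ready-made time series ("rod") for grid cell (i, j).
    archive_rods = np.ascontiguousarray(np.moveaxis(archive, 0, -1))

    lat_idx, lon_idx = 100, 250
    rod = archive_rods[lat_idx, lon_idx]          # point time series without scanning 365 time steps
    print(rod.shape, float(rod.mean()))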

  17. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    Science.gov (United States)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

    Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure that the moment tensor solutions are similar to those of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
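    The core of a match filter is normalised cross-correlation of a template against continuous data, with detections declared above a noise-based threshold. The single-channel sketch below uses a synthetic waveform and an assumed 8x-MAD threshold; real implementations stack correlations over many stations and components.

    # Bare-bones matched-filter detection on synthetic data.
    import numpy as np

    def normalized_xcorr(data: np.ndarray, template: np.ndarray) -> np.ndarray:
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        cc = np.empty(len(data) - n + 1)
        for i in range(len(cc)):
            w = data[i:i + n]
            s = w.std()
            cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
        return cc

    rng = np.random.default_rng(5)
    template = np.sin(2 * np.pi * 0.03 * np.arange(300)) * np.hanning(300)   # stand-in template waveform
    data = rng.normal(0.0, 1.0, 20000)
    for onset in (4000, 12500):                      # two "repeating" events buried in noise
        data[onset:onset + 300] += 2.0 * template

    cc = normalized_xcorr(data, template)
    threshold = 8.0 * np.median(np.abs(cc - np.median(cc)))    # 8 times the median absolute deviation
    detections = np.flatnonzero(cc > threshold)
    print("detections near samples:", detections.min(), detections.max())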

  18. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    Directory of Open Access Journals (Sweden)

    Andrew Cron

    Full Text Available Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a
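    For a single sample, the non-hierarchical version of this idea can be sketched with a Dirichlet-process Gaussian mixture: fit the mixture, then report very small components as candidate rare subsets. The data, the component count and the "under 0.5% of events" rule below are illustrative assumptions, and this sketch does not implement the paper's hierarchical (HDPGMM) extension or GPU acceleration.

    # Single-sample DPGMM sketch using scikit-learn's BayesianGaussianMixture.
    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    rng = np.random.default_rng(6)
    bulk = rng.normal([0.0, 0.0], 1.0, size=(20000, 2))          # abundant cell population
    rare = rng.normal([4.0, 4.0], 0.2, size=(20, 2))             # ~0.1% antigen-specific subset
    X = np.vstack([bulk, rare])

    dpgmm = BayesianGaussianMixture(
        n_components=10,
        weight_concentration_prior_type="dirichlet_process",
        weight_concentration_prior=0.01,
        max_iter=500,
        random_state=0,
    ).fit(X)

    labels = dpgmm.predict(X)
    counts = np.bincount(labels, minlength=10)
    for k, c in enumerate(counts):
        if 0 < c < 0.005 * len(X):                               # crude "rare subset" rule (assumption)
            print(f"candidate rare subset: component {k}, {c} events, mean {dpgmm.means_[k].round(2)}")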

  19. Signal Detection of Imipenem Compared to Other Drugs from Korea Adverse Event Reporting System Database.

    Science.gov (United States)

    Park, Kyounghoon; Soukavong, Mick; Kim, Jungmee; Kwon, Kyoung Eun; Jin, Xue Mei; Lee, Joongyub; Yang, Bo Ram; Park, Byung Joo

    2017-05-01

    To detect signals of adverse drug events after imipenem treatment using the Korea Institute of Drug Safety & Risk Management-Korea adverse event reporting system database (KIDS-KD). We performed data mining using KIDS-KD, which was constructed from spontaneously reported adverse event (AE) reports between December 1988 and June 2014. We detected signals by calculating the proportional reporting ratio, reporting odds ratio, and information component of imipenem. We defined a signal as any AE that satisfied all three indices. The signals were compared with the drug labels of nine countries. There were 807,582 spontaneous AE reports in the KIDS-KD. Among those, the number of antibiotic-related AEs was 192,510; 3,382 reports were associated with imipenem. The most common imipenem-associated AE was drug eruption, reported 353 times. We calculated signals by comparing imipenem with all other antibiotics and with all other drugs; 58 and 53 signals, respectively, satisfied all three indices. We compared the drug labelling information of nine countries, including the USA, the UK, Japan, Italy, Switzerland, Germany, France, Canada, and South Korea, and discovered that the following signals were currently not included in drug labels: hypokalemia, cardiac arrest, cardiac failure, Parkinson's syndrome, myocardial infarction, and prostate enlargement. Hypokalemia was an additional signal compared with all other antibiotics, and the other signals were not different compared with all other antibiotics and all other drugs. We detected new signals that were not listed on the drug labels of nine countries. However, further pharmacoepidemiologic research is needed to evaluate the causality of these signals. © Copyright: Yonsei University College of Medicine 2017
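    The reporting odds ratio used above comes from a 2x2 table of reports (drug of interest vs. comparator, event of interest vs. all other events). The worked example below reuses the imipenem/drug-eruption counts quoted in the abstract but invents the comparator counts, so the resulting figure is illustrative only.

    # Reporting odds ratio (ROR) with a 95% confidence interval from a 2x2 table.
    import math

    def reporting_odds_ratio(a, b, c, d):
        """a: drug & event, b: drug & other events, c: other drugs & event, d: other drugs & other events."""
        ratio = (a / b) / (c / d)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)        # standard error of log(ROR)
        lo = math.exp(math.log(ratio) - 1.96 * se)
        hi = math.exp(math.log(ratio) + 1.96 * se)
        return ratio, lo, hi

    ror, lo, hi = reporting_odds_ratio(a=353, b=3382 - 353, c=12000, d=780000)   # c, d are hypothetical
    print(f"ROR = {ror:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    # A signal is typically flagged when the lower CI bound exceeds 1 and the other
    # indices (proportional reporting ratio, information component) also meet their criteria.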

  20. Quench detection of fast plasma events for the JT-60SA central solenoid

    International Nuclear Information System (INIS)

    Murakami, Haruyuki; Kizu, Kaname; Tsuchiya, Katsuhiko; Kamiya, Koji; Takahashi, Yoshikazu; Yoshida, Kiyoshi

    2012-01-01

    Highlights: ► The pick-up coil method is used for quench detection in the JT-60SA magnet system. ► Disk-shaped pick-up coils are inserted in the CS module to compensate the inductive voltage. ► The applicability of the pick-up coils is evaluated by two-dimensional analysis. ► The pick-up coils are applicable during disruptions, mini collapses and other plasma events. - Abstract: The JT-60 is planned to be modified into a full superconducting tokamak referred to as the JT-60 Super Advanced (JT-60SA). The maximum temperature of the magnet during a quench might exceed several hundred kelvin, which would damage the superconducting magnet itself. A high-precision quench detection system is therefore one of the key technologies in the superconducting magnet protection system. The pick-up coil method, which uses voltage taps to detect the normal-zone voltage, is used for quench detection in the JT-60SA superconducting magnet system. Disk-shaped pick-up coils are inserted in the central solenoid (CS) module to compensate the inductive voltage. In a previous study, the quench detection system required a large number of pick-up coils. The reliability of the quench detection system would be improved by simplifying it, for example by reducing the number of pick-up coils; simplification is also important for reducing the total cost of the protection system. Hence, the design method is improved by increasing the number of optimization parameters. The improved design method can reduce the number of pick-up coils without reducing the detection sensitivity; consequently, the protection system can be designed with higher reliability and lower cost. The applicability of the disk-shaped pick-up coil for the quench detection system is evaluated by two-dimensional analysis. In the previous study, however, the analysis model only took into account the CS, EF (equilibrium field) coils and plasma. Therefore, applicability of the disk-shaped pick-up coil for

  1. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, but not impedance abnormalities. To compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center in Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients have been followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic event 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none has experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  2. An analog cell to detect single event transients in voltage references

    Energy Technology Data Exchange (ETDEWEB)

    Franco, F.J., E-mail: fjfranco@fis.ucm.es [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Palomar, C. [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Izquierdo, J.G. [Centro de Láseres Ultrarrápidos, Facultad de Químicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Agapito, J.A. [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain)

    2015-01-11

    A reliable voltage reference is mandatory in mixed-signal systems. However, this family of components can undergo very long single event transients when operating in radiation environments such as space and nuclear facilities due to the impact of heavy ions. The purpose of the present paper is to demonstrate how a simple cell can be used to detect these transients. The cell was implemented with typical COTS components and its behavior was verified by SPICE simulations and in a laser facility. Different applications of the cell are explored as well.

  3. Comparison of brand versus generic antiepileptic drug adverse event reporting rates in the U.S. Food and Drug Administration Adverse Event Reporting System (FAERS).

    Science.gov (United States)

    Rahman, Md Motiur; Alatawi, Yasser; Cheng, Ning; Qian, Jingjing; Plotkina, Annya V; Peissig, Peggy L; Berg, Richard L; Page, David; Hansen, Richard A

    2017-09-01

    Despite the cost-saving role of generic antiepileptic drugs (AEDs), debate exists as to whether generic substitution of branded AEDs may lead to therapeutic failure and increased toxicity. This study compared adverse event (AE) reporting rates for brand vs. authorized generic (AG) vs. generic AEDs. Since AGs are pharmaceutically identical to brand but perceived as generics, the generic vs. AG comparison minimized potential bias against generics. Events reported to the U.S. Food and Drug Administration Adverse Event Reporting System between January 2004 and March 2015 with lamotrigine, carbamazepine, and oxcarbazepine listed as primary or secondary suspect were classified as brand, generic, or AG based on the manufacturer. Disproportionality analyses using the reporting odds ratio (ROR) assessed the relative rate of reporting of labeled AEs compared with reporting these events with all other drugs. The Breslow-Day statistic compared RORs across brand, AG, and other generics for all three drugs of interest, using a Bonferroni-corrected P value; overall, brands and generics have similar reporting rates after accounting for generic perception biases. Disproportional suicide reporting was observed for generics compared with AGs and brand, although this finding needs further study. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. From b → sγ to the LSP detection rates in minimal string unification models

    International Nuclear Information System (INIS)

    Khalil, S.; Masiero, A.; Shafi, Q.

    1997-04-01

    We exploit the measured branching ratio for b → sγ to derive lower limits on the sparticle and Higgs masses in the minimal string unification models. For the LSP ('bino'), chargino and the lightest Higgs, these turn out to be 50, 90 and 75 GeV respectively. Taking account of the upper bounds on the mass spectrum from the LSP relic abundance, we estimate the direct detection rate for the latter to vary from 10⁻¹ to 10⁻⁴ events/kg/day. The muon flux, produced by neutrinos from the annihilating LSPs, varies in the range 10⁻² - 10⁻⁹ muons/m²/day. (author). 26 refs, 9 figs

  5. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
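    One common formulation of the original swinging door segmentation is sketched below: a segment is extended while every new point stays inside the "doors" hinged at the segment start offset by a tolerance epsilon, and ramps are then read from the segment endpoints. The tolerance, the toy wind power series and the ramp-rate summary are illustrative; the paper's dynamic-programming merging step is not shown.

    # Compact swinging door segmentation sketch (one common formulation).
    def swinging_door(values, eps):
        """Return the indices that start each segment of the piecewise-linear fit."""
        starts = [0]
        up_slope, low_slope = float("-inf"), float("inf")
        for i in range(1, len(values)):
            start = starts[-1]
            dt = i - start
            up_slope = max(up_slope, (values[i] - (values[start] + eps)) / dt)
            low_slope = min(low_slope, (values[i] - (values[start] - eps)) / dt)
            if up_slope > low_slope:              # the doors have crossed: close the segment here
                starts.append(i)
                up_slope, low_slope = float("-inf"), float("inf")
        return starts

    wind_power = [10, 11, 12, 30, 55, 80, 82, 81, 60, 35, 12, 11]   # toy wind power series (MW)
    boundaries = swinging_door(wind_power, eps=3.0) + [len(wind_power) - 1]
    for a, b in zip(boundaries, boundaries[1:]):
        if a == b:
            continue
        rate = (wind_power[b] - wind_power[a]) / (b - a)
        print(f"segment {a}-{b}: ramp rate {rate:+.1f} MW per step")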

  6. Highly specific detection of genetic modification events using an enzyme-linked probe hybridization chip.

    Science.gov (United States)

    Zhang, M Z; Zhang, X F; Chen, X M; Chen, X; Wu, S; Xu, L L

    2015-08-10

    The enzyme-linked probe hybridization chip utilizes a method based on ligase-hybridizing probe chip technology, with the principle of using thio-primers for protection against enzyme digestion and using lambda DNA exonuclease to cut the multiple PCR products obtained from the sample being tested into single strands for hybridization. The 5'-end amino-labeled probe was fixed onto the aldehyde chip and hybridized with the single-stranded PCR product, followed by addition of a fluorescently modified probe that was then enzymatically linked with the adjacent, substrate-bound probe in order to achieve highly specific, parallel, and high-throughput detection. Specificity and sensitivity testing demonstrated that enzyme-linked probe hybridization technology could be applied to the specific detection of eight genetic modification events at the same time, with a sensitivity reaching 0.1% and accurate, efficient, and stable results.

  7. A power filter for the detection of burst events based on time-frequency spectrum estimation

    International Nuclear Information System (INIS)

    Guidi, G M; Cuoco, E; Vicere, A

    2004-01-01

    We propose as a statistic for the detection of bursts in a gravitational wave interferometer the 'energy' of the events estimated with a time-dependent calculation of the spectrum. This statistic has an asymptotic Gaussian distribution with known statistical moments, which makes it possible to perform a uniformly most powerful test (McDonough R N and Whalen A D 1995 Detection of Signals in Noise (New York: Academic)) on the energy mean. We estimate the receiver operating characteristic (ROC, from the same book) of this statistic for different levels of the signal-to-noise ratio in the specific case of a simulated noise having the spectral density expected for Virgo, using test signals taken from a library of possible waveforms emitted during the collapse of the core of type II supernovae

  8. The effect of sporting events on emergency department attendance rates in a district general hospital in Northern Ireland.

    Science.gov (United States)

    McGreevy, A; Millar, L; Murphy, B; Davison, G W; Brown, R; O'Donnell, M E

    2010-10-01

    Previous studies have reported a conflicting relationship between the effect of live and televised sporting events on attendance rates to emergency departments (ED). The objectives of this study were to investigate the relationship of major sporting events on emergency department attendance rates and to determine the potential effects of such events on service provision. A retrospective analysis of ED attendances to a district general hospital (DGH) and subsequent admissions over a 24-h period following live and televised sporting activities was performed over a 5-year period. Data were compiled from the hospital's emergency record books including the number of attendances, patient demographics, clinical complaint and outcome. Review patients were excluded. Analysis of sporting events was compiled for live local, regional and national events as well as world-wide televised sporting broadcasts. A total of 137,668 (80,445 men) patients attended from April 2002 to July 2007. Mean attendance rate per day was 80 patients (men = 47). Mean admission rate was 13.6 patients per day. Major sporting events during the study period included; Soccer: 4 FA Cup and 1 World Cup (WC) finals; Rugby: 47 Six Nations, 25 Six nations games involving Ireland, 1 WC final, 2 WC semi-finals, 2 WC quarter-finals and 4 WC games involving Ireland; and Gaelic Football [Gaelic Athletic Association (GAA)]: 5 All-Ireland finals, 11 semi-finals, 11 quarter-finals and 5 provincial finals. There was a significantly higher patient admission rate during the soccer FA Cup final, Rugby Six Nations and games involving Ireland and for GAA semi- and quarter-final games (p = 0.001-0.01). There was no difference identified in total attendance or non-admission rates for sporting events throughout the study period. Although there was no correlation identified between any of these sporting events and total emergency department attendances (r 0.07), multinomial logistic regression demonstrated that FA Cup final (p

  9. Adverse Events Associated with Hospitalization or Detected through the RAI-HC Assessment among Canadian Home Care Clients

    Science.gov (United States)

    Doran, Diane; Hirdes, John P.; Blais, Régis; Baker, G. Ross; Poss, Jeff W.; Li, Xiaoqiang; Dill, Donna; Gruneir, Andrea; Heckman, George; Lacroix, Hélène; Mitchell, Lori; O'Beirne, Maeve; Foebel, Andrea; White, Nancy; Qian, Gan; Nahm, Sang-Myong; Yim, Odilia; Droppo, Lisa; McIsaac, Corrine

    2013-01-01

    Background: The occurrence of adverse events (AEs) in care settings is a patient safety concern that has significant consequences across healthcare systems. Patient safety problems have been well documented in acute care settings; however, similar data for clients in home care (HC) settings in Canada are limited. The purpose of this Canadian study was to investigate AEs in HC, specifically those associated with hospitalization or detected through the Resident Assessment Instrument for Home Care (RAI-HC). Method: A retrospective cohort design was used. The cohort consisted of HC clients from the provinces of Nova Scotia, Ontario, British Columbia and the Winnipeg Regional Health Authority. Results: The overall incidence rate of AEs associated with hospitalization ranged from 6% to 9%. The incidence rate of AEs determined from the RAI-HC was 4%. Injurious falls, injuries other than falls, and medication-related events were the most frequent AEs associated with hospitalization, whereas new caregiver distress was the most frequent AE identified through the RAI-HC. Conclusion: The incidence of AEs from all sources of data ranged from 4% to 9%. More resources are needed to target strategies for addressing safety risks in HC in a broader context. Tools such as the RAI-HC and its Clinical Assessment Protocols, already available in Canada, could be very useful in the assessment and management of HC clients who are at safety risk. PMID:23968676

  10. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars because it is easy for KMTNet to be saturated around the peak of the events because of its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be significantly detected in the near future.

  11. Detection of Water Contamination Events Using Fluorescence Spectroscopy and Alternating Trilinear Decomposition Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Yu

    2017-01-01

    Full Text Available Methods based on conventional indices and UV-visible spectroscopy have been widely applied in the field of water quality abnormality detection. This paper presents a qualitative analysis approach to detect water contamination events involving unknown pollutants. Fluorescence spectra were used as the water quality monitoring tool, and a detection method for unknown contaminants in water based on alternating trilinear decomposition (ATLD) is proposed to analyze the excitation and emission spectra of the samples. The Delaunay triangulation interpolation method was used to pre-treat the three-dimensional fluorescence spectra in order to estimate the effect of Rayleigh and Raman scattering; the ATLD model was applied to establish a model of normal water samples, and the residual matrix was obtained by subtracting the measured matrix from the model matrix; the residual sum of squares obtained from the residual matrix, compared against a threshold, was used to qualitatively discriminate test samples and to distinguish drinking water samples from organic pollutant samples. The results of the study indicate that ATLD modeling with three-dimensional fluorescence spectra can provide a tool for qualitatively detecting unknown organic pollutants in water. The method based on fluorescence spectra can be complementary to methods based on conventional indices and UV-visible spectroscopy.

  12. Endpoint visual detection of three genetically modified rice events by loop-mediated isothermal amplification.

    Science.gov (United States)

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-11-07

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%−0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  13. Development of electrochemical biosensor for detection of pathogenic microorganism in Asian dust events.

    Science.gov (United States)

    Yoo, Min-Sang; Shin, Minguk; Kim, Younghun; Jang, Min; Choi, Yoon-E; Park, Si Jae; Choi, Jonghoon; Lee, Jinyoung; Park, Chulhwan

    2017-05-01

    We developed a single-walled carbon nanotube (SWCNT)-based electrochemical biosensor for the detection of Bacillus subtilis, one of the microorganisms observed in Asian dust events, which causes respiratory diseases such as asthma and pneumonia. The SWCNTs act as a transducer, converting the biological antigen/antibody reaction into an electrical signal, while 1-pyrenebutanoic acid succinimidyl ester (1-PBSE) and anti-B. subtilis antibody serve as a chemical linker and a receptor, respectively, for the capture of the target microorganism in the developed biosensor. The detection range (10^2-10^10 CFU/mL) and the detection limit (10^2 CFU/mL) of the developed biosensor were identified, while the response time was 10 min. The amount of target B. subtilis was the highest in the specificity test of the developed biosensor, compared with the other tested microorganisms (Staphylococcus aureus, Flavobacterium psychrolimnae, and Aquabacterium commune). In addition, target B. subtilis detected by the developed biosensor was observed by scanning electron microscopy (SEM). Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Endpoint Visual Detection of Three Genetically Modified Rice Events by Loop-Mediated Isothermal Amplification

    Directory of Open Access Journals (Sweden)

    Qing Zhu

    2012-11-01

    Full Text Available Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%–0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  15. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSN), energy saving is important for increasing the lifecycle of the network. Data fusion enables combining information from several sources to provide a unified scenario, which can significantly save sensor energy and enhance the accuracy of the sensed data. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which can significantly reduce the energy consumption of intracluster communication. The distances between cluster heads and the event, together with the energy of the clusters, are fuzzified, and fuzzy logic is used to select the clusters that will participate in data uploading and fusion. The fuzzy logic method is also used by cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final decision on the event is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
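
    The following is a minimal sketch of the cluster-then-fuzzy-select idea under stated assumptions: node positions, residual energies, the event location, the membership functions, and the selection threshold are all invented for illustration and are not taken from the paper.

```python
# Cluster nodes with k-means, fuzzify cluster-head distance to the event and
# cluster energy, then select clusters whose fuzzy score exceeds a threshold.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 100, size=(60, 2))      # sensor positions (m), hypothetical
energy = rng.uniform(0.2, 1.0, size=60)        # normalised residual energy per node
event = np.array([30.0, 70.0])                 # assumed event location

km = KMeans(n_clusters=6, n_init=10, random_state=1).fit(nodes)
heads = km.cluster_centers_                    # centroids stand in for cluster heads

# Simple fuzzy memberships: closer to the event -> higher; more energy -> higher.
dist = np.linalg.norm(heads - event, axis=1)
mu_close = np.clip(1.0 - dist / dist.max(), 0.0, 1.0)
cluster_energy = np.array([energy[km.labels_ == c].mean() for c in range(6)])
mu_energy = (cluster_energy - cluster_energy.min()) / np.ptp(cluster_energy)

# Fuzzy "AND" (minimum) as the participation score.
score = np.minimum(mu_close, mu_energy)
participating = np.where(score > 0.4)[0]       # clusters that upload and fuse data
print(participating, np.round(score, 2))
```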

  16. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2013-12-01

    Full Text Available The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, the online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of training objects in a regularized least squares sense. The online LS-OC-SVM learns a training set with a limited number of samples to provide a basic normal model, then updates the model with the remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem. Each frame of the video is characterized by a covariance matrix descriptor encoding the motion information and is then classified as a normal or an abnormal frame. Experiments are conducted on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset to demonstrate the promising results of the proposed online LS-OC-SVM method.
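
    A minimal sketch of frame-level normal/abnormal classification with a one-class SVM follows; scikit-learn's batch OneClassSVM stands in for the online least-squares variant described above, and the feature vectors are assumed to be flattened motion covariance descriptors with invented values.

```python
# Train on normal frames only, then flag test frames as normal (+1) or abnormal (-1).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
normal_features = rng.normal(0.0, 1.0, size=(500, 25))           # training frames (normal only)
test_features = np.vstack([rng.normal(0.0, 1.0, size=(20, 25)),
                           rng.normal(4.0, 1.0, size=(5, 25))])  # last 5 simulate anomalies

clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal_features)
labels = clf.predict(test_features)            # +1 = normal frame, -1 = abnormal frame
print(labels)
```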

  17. The impact of ambient dose rate measuring network and precipitation radar system for detection of environmental radioactivity released by accident

    International Nuclear Information System (INIS)

    Bleher, M; Stoehlker, U.

    2003-01-01

    For the surveillance of environmental radioactivity, the German measuring network of BfS consists of more than 2000 stations where the ambient gamma dose rate is continuously measured. This network is a helpful tool to detect and localise enhanced environmental contamination from artificial radionuclides. The early-warning threshold is so low that an additional dose rate contribution of 0.07 μGy/h is already detectable. However, this threshold is frequently exceeded during precipitation events, caused by washout of natural activity in the air. Therefore, the precipitation radar system of the German Weather Service provides valuable information on whether an increase in the ambient dose rate is due to natural or man-made events. In case of an accidental release, the data of this radar system show small-area precipitation events and potential local hot spots not detected by the measuring network. For the phase of cloud passage, the ambient dose rate measuring network provides a reliable database for the evaluation of the current situation and its further development. It is possible to compare measured dose rate data with derived intervention levels for countermeasures such as "sheltering". Thus, critical regions can be identified and it is possible to verify implemented countermeasures. During and after this phase of cloud passage, the measured data of the monitoring network help to adapt the results of the national decision support systems PARK and RODOS. Therefore, it is necessary to derive the actual additional contribution to the ambient dose rate. Map representations of measured dose rates are rapidly available and helpful for optimising measurement strategies of mobile systems and collection strategies for samples of agricultural products. (orig.)

  18. The association of colonoscopy quality indicators with the detection of screen-relevant lesions, adverse events, and postcolonoscopy cancers in an asymptomatic Canadian colorectal cancer screening population.

    Science.gov (United States)

    Hilsden, Robert J; Dube, Catherine; Heitman, Steven J; Bridges, Ronald; McGregor, S Elizabeth; Rostom, Alaa

    2015-11-01

    Although several quality indicators of colonoscopy have been defined, quality assurance activities should be directed at the measurement of quality indicators that are predictive of key screening colonoscopy outcomes. The goal of this study was to examine the association among established quality indicators and the detection of screen-relevant lesions (SRLs), adverse events, and postcolonoscopy cancers. Historical cohort study. Canadian colorectal cancer screening center. A total of 18,456 asymptomatic men and women ages 40 to 74, at either average risk or increased risk for colorectal cancer because of a family history, who underwent a screening colonoscopy from 2008 to 2010. Using univariate and multivariate analyses, we explored the association among procedural quality indicators and 3 colonoscopy outcomes: detection of SRLs, adverse events, and postcolonoscopy cancers. The crude rates of SRLs, adverse events, and postcolonoscopy cancers were 240, 6.44, and .54 per 1000 colonoscopies, respectively. Several indicators, including endoscopist withdrawal time (OR, 1.3; 95% CI, 1.2-1.4) and cecal intubation rate (OR, 13.9; 95% CI, 1.9-96.9), were associated with the detection of SRLs. No quality indicator was associated with the risk of adverse events. Endoscopist average withdrawal time over 6 minutes (OR, .12; 95% CI, .002-.85) and SRL detection rate over 20% (OR, .17; 95% CI, .03-.74) were associated with a reduced risk of postcolonoscopy cancers. Single-center study. Quality assurance programs should prioritize the measurement of endoscopist average withdrawal time and adenoma (SRL) detection rate. Copyright © 2015 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  19. Detecting regular sound changes in linguistics as events of concerted evolution.

    Science.gov (United States)

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Detection of ULF geomagnetic signals associated with seismic events in Central Mexico using Discrete Wavelet Transform

    Directory of Open Access Journals (Sweden)

    O. Chavez

    2010-12-01

    Full Text Available The geomagnetic observatory of Juriquilla, Mexico, located at longitude –100.45°, latitude 20.70°, and 1946 m a.s.l., has been operational since June 2004, compiling geomagnetic field measurements with a three-component fluxgate magnetometer. In this paper, the results of the analysis of these measurements in relation to important seismic activity in the period 2007 to 2009 are presented. For this purpose, we used superposed epochs of the Discrete Wavelet Transform of filtered signals for the three components of the geomagnetic field during relative seismic calm, and compared them with seismic events of magnitudes Ms > 5.5 which occurred in Mexico. The analysed epochs consisted of 18 h of observations for a dataset corresponding to 18 different earthquakes (EQs). The time series were processed for a period of 9 h prior to and 9 h after each seismic event. This data processing was compared with the same number of observations during a seismic calm. The proposed methodology proved to be an efficient tool to detect signals associated with seismic activity, especially when the seismic event occurs at a distance (D) from the observatory to the EQ such that the ratio D/ρ < 1.8, where ρ is the radius of the earthquake preparation zone. The methodology presented herein shows important anomalies in the Ultra Low Frequency range (ULF; 0.005–1 Hz), primarily from 0.25 to 0.5 Hz. Furthermore, the time variance (σ²) increases prior to, during and after the seismic event in relation to the obtained coefficient D1, principally in the Bx (N-S) and By (E-W) geomagnetic components. Therefore, this paper proposes and develops a new methodology to extract the abnormal signals of the geomagnetic anomalies related to different stages of the EQs.
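
    Below is a minimal sketch of the wavelet step described above: decompose one geomagnetic component with a discrete wavelet transform and track the variance of the finest-detail (D1) coefficients in successive windows. The 1 Hz sampling rate, Daubechies wavelet, window length, and synthetic data are assumptions for illustration.

```python
# Variance of the first-level detail coefficients (D1) per 1-hour window.
import numpy as np
import pywt

fs = 1.0                                   # 1 Hz sampling (assumed)
rng = np.random.default_rng(3)
bx = rng.normal(0.0, 1.0, size=18 * 3600)  # 18 h of a synthetic Bx component (nT)

window = 3600                              # 1-hour windows
d1_variance = []
for start in range(0, len(bx) - window + 1, window):
    coeffs = pywt.wavedec(bx[start:start + window], "db4", level=4)
    d1 = coeffs[-1]                        # finest-scale detail (0.25-0.5 Hz at 1 Hz sampling)
    d1_variance.append(np.var(d1))

print(np.round(d1_variance, 3))            # anomalous hours would appear as variance spikes
```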

  1. Statistical improvement in detection level of gravitational microlensing events from their light curves

    Science.gov (United States)

    Ibrahim, Ichsan; Malasan, Hakim L.; Kunjaya, Chatief; Timur Jaelani, Anton; Puannandra Putri, Gerhana; Djamal, Mitra

    2018-04-01

    In astronomy, the brightness of a source is typically expressed in terms of magnitude. Conventionally, the magnitude is defined by the logarithm of the received flux. This relationship is known as the Pogson formula. For received flux with a small signal-to-noise ratio (S/N), however, the formula gives a large magnitude error. We investigate whether the use of the inverse hyperbolic sine function (hereafter referred to as the Asinh magnitude) in the modified formulae could allow for an alternative calculation of magnitudes for small S/N flux, and whether the new approach is better for representing the brightness in that regime. We study the possibility of increasing the detection level of gravitational microlensing using 40 selected microlensing light curves from the 2013 and 2014 seasons and by using the Asinh magnitude. Photometric data for the selected events are obtained from the Optical Gravitational Lensing Experiment (OGLE). We found that utilization of the Asinh magnitude makes the events brighter compared to using the logarithmic magnitude, by about 3.42 × 10^-2 magnitude on average, with an average difference in error between the logarithmic and the Asinh magnitude of about 2.21 × 10^-2 magnitude. The microlensing events OB140847 and OB140885 are found to have the largest difference values among the selected events. Using a Gaussian fit to find the peak for OB140847 and OB140885, we conclude statistically that the Asinh magnitude gives better mean squared values of the regression and narrower residual histograms than the Pogson magnitude. Based on these results, we also attempt to propose a limit in magnitude value below which use of the Asinh magnitude is optimal for small S/N data.
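
    A minimal sketch comparing the Pogson and Asinh magnitudes for a low-S/N flux is given below; it follows the common SDSS-style asinh-magnitude definition, which may differ in detail from the paper's formula, and the softening parameter b and flux values are assumptions.

```python
# Pogson vs. asinh ("luptitude") magnitudes and their first-order errors.
import numpy as np

def pogson_mag(flux):
    return -2.5 * np.log10(flux)

def asinh_mag(flux, b=1e-2):
    # m = -(2.5 / ln 10) * [asinh(f / 2b) + ln b]   (SDSS-style definition, assumed)
    return -2.5 / np.log(10) * (np.arcsinh(flux / (2 * b)) + np.log(b))

def pogson_err(flux, sigma_f):
    return 2.5 / np.log(10) * sigma_f / flux

def asinh_err(flux, sigma_f, b=1e-2):
    return 2.5 / np.log(10) * sigma_f / np.sqrt(flux**2 + 4 * b**2)

flux, sigma_f = 0.03, 0.02          # normalised flux with S/N = 1.5 (invented)
print(pogson_mag(flux), asinh_mag(flux))
print(pogson_err(flux, sigma_f), asinh_err(flux, sigma_f))
# The asinh error stays finite as flux -> 0, whereas the Pogson error diverges.
```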

  2. Building a Global Catalog of Nonvolcanic Tremor Events Using an Automatic Detection Algorithm

    Science.gov (United States)

    Bagley, B. C.; Revenaugh, J.

    2009-12-01

    Nonvolcanic tremor is characterized by a long-period seismic event containing a series of low-frequency earthquakes (LFEs). Tremor has been detected in regions of subduction (e.g. Kao et al. 2007, 2008; Shelly 2006) and beneath the San Andreas fault near Cholame, California (e.g. Nadeau and Dolenc, 2005). In some cases tremor events seem to have periodicity, and these are often referred to as episodic tremor and slip (ETS). The origin of nonvolcanic tremor has been ascribed to shear slip along plate boundaries and/or high pore-fluid pressure. The apparent periodicity and tectonic setting associated with ETS has led to the suggestion that there may be a link between ETS and megathrust earthquakes. Until recently tremor detection has been a manual process requiring visual inspection of seismic data. In areas that have dense seismic arrays (e.g. Japan) waveform cross-correlation techniques have been successfully employed (e.g. Obara, 2002). Kao et al. (2007) developed an algorithm for automatic detection of seismic tremor that can be used in regions without dense arrays. This method has been used to create the Tremor Activity Monitoring System (TAMS), which is used by the Geological Survey of Canada to monitor northern Cascadia. So far the study of nonvolcanic tremor has been limited to regions of subduction or along major transform faults. It is unknown if tremor events occur in other tectonic settings, or if the current detection schemes will be useful for finding them. We propose to look for tremor events in non-subduction regions. It is possible that if tremor exists in other regions it will have different characteristics and may not trigger the TAMS system or be amenable to other existing detection schemes. We are developing algorithms for searching sparse array data sets for quasi-harmonic energy bursts in hopes of recognizing and cataloging nonvolcanic tremor in an expanded tectonic setting. Statistical comparisons against the TAMS algorithm will be made if

  3. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino, E-mail: martino@cardiol.br; Peixoto, Giselle de Lima; Lima, José Jayme Galvão de; Siqueira, Sérgio Freitas de; Costa, Roberto; Gowdak, Luís Henrique Wolff [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Paula, Flávio Jota de [Unidade de Transplante Renal - Divisão de Urologia do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Kalil Filho, Roberto; Ramires, José Antônio Franchini [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil)

    2015-11-15

    The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  4. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Directory of Open Access Journals (Sweden)

    Rodrigo Tavares Silva

    2015-11-01

    Full Text Available Abstract Background: The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective: We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods: A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results: During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002) and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions: In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  5. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Science.gov (United States)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983

  6. A novel hardware implementation for detecting respiration rate using photoplethysmography.

    Science.gov (United States)

    Prinable, Joseph; Jones, Peter; Thamrin, Cindy; McEwan, Alistair

    2017-07-01

    Asthma is a serious public health problem. Continuous monitoring of breathing may offer an alternative way to assess disease status. In this paper we present a novel hardware implementation for the capture and storage of a photoplethysmography (PPG) signal. The LED duty cycle was altered to determine its effect on respiratory rate accuracy. The oximeter was mounted on the left index finger of ten healthy volunteers. The breathing rate derived from the oximeter was validated against a nasal airflow sensor. The duty cycle of the pulse oximeter was changed between 5%, 10% and 25% at a sample rate of 500 Hz. A PPG signal and a reference signal were captured for each duty cycle. The PPG signals were post-processed in Matlab to derive a respiration rate using an existing Matlab toolbox. At a 25% duty cycle the RMSE was <2 breaths per minute for the top-performing algorithm. The RMSE increased to over 5 breaths per minute when the duty cycle was reduced to 5%. The power consumed by the hardware for a 5%, 10% and 25% duty cycle was 5.4 mW, 7.8 mW, and 15 mW, respectively. For clinical assessment of respiratory rate, an RMSE of <2 breaths per minute is recommended. Further work is required to determine utility in asthma management. However, for non-clinical applications such as fitness tracking, lower accuracy may be sufficient to allow a reduced duty cycle setting.
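
    The sketch below illustrates one generic way of deriving a respiratory rate from a PPG recording, by band-pass filtering the respiration-induced modulation and counting peaks; it is not the Matlab toolbox used in the study, and the synthetic signal and cut-off frequencies are assumptions.

```python
# Isolate the 0.1-0.5 Hz (6-30 breaths/min) component of a PPG signal and count breaths.
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

fs = 500.0                                           # sample rate (Hz), as in the study
t = np.arange(0, 60, 1 / fs)                         # one minute of data
ppg = (np.sin(2 * np.pi * 1.2 * t)                   # ~72 bpm cardiac component
       + 0.3 * np.sin(2 * np.pi * 0.25 * t)          # ~15 breaths/min respiratory modulation
       + 0.05 * np.random.default_rng(4).normal(size=t.size))

sos = butter(2, [0.1, 0.5], btype="band", fs=fs, output="sos")
resp = sosfiltfilt(sos, ppg)                         # respiratory component of the PPG

peaks, _ = find_peaks(resp, distance=int(2 * fs))    # breaths at least 2 s apart
breaths_per_min = len(peaks) * 60.0 / (t[-1] - t[0])
print(round(breaths_per_min, 1))
```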

  7. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable on stair-ascent and stair-descent terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
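
    A minimal sketch of the jerk-plus-peak idea follows: differentiate the acceleration to obtain jerk, smooth it, and pick prominent peaks as candidate gait events. The synthetic signal, smoothing window, and peak thresholds are assumptions and do not reproduce the published peak-heuristics rules.

```python
# Candidate gait events from peaks of the smoothed absolute jerk signal.
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                        # accelerometer sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
accel = np.sin(2 * np.pi * 1.0 * t) ** 7          # spiky ~1 Hz gait-like acceleration
accel += 0.05 * np.random.default_rng(5).normal(size=t.size)

jerk = np.gradient(accel, 1 / fs)                 # time derivative of acceleration
jerk_smooth = np.convolve(np.abs(jerk), np.ones(10) / 10, mode="same")

# Candidate events: prominent jerk peaks separated by at least 0.4 s.
peaks, _ = find_peaks(jerk_smooth, distance=int(0.4 * fs),
                      prominence=0.5 * jerk_smooth.max())
print(np.round(t[peaks], 2))                      # candidate event times (s)
```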

  8. Data filtering and expected muon and neutrino event rates in the KM3NeT neutrino telescope

    Energy Technology Data Exchange (ETDEWEB)

    Shanidze, Rezo [ECAP, University of Erlangen-Nuremberg, Erwin-Rommel-Str.1, 91058 Erlangen (Germany); Collaboration: ANTARES-KM3NeT-Erlangen-Collaboration

    2011-07-01

    KM3NeT is a future Mediterranean deep-sea neutrino telescope with an instrumented volume of several cubic kilometres. The neutrino and muon events in KM3NeT will be reconstructed from the signals collected by the telescope's photo detectors. However, in the deep sea the dominant sources of photon signals are decays of 40K nuclei and bioluminescence. The selection of neutrino and muon events requires the implementation of fast and efficient data filtering algorithms for the reduction of accidental background event rates. Possible data filtering and triggering schemes for the KM3NeT neutrino telescope and the expected muon and neutrino event rates are discussed.

  9. Prevalent Rate of Nonalbuminuric Renal Insufficiency and Its Association with Cardiovascular Disease Event in Korean Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Hye Won Lee

    2016-12-01

    Full Text Available Background: Nonalbuminuric renal insufficiency is a unique category of diabetic kidney disease. The objectives of the study were to evaluate the prevalence of nonalbuminuric renal insufficiency and to investigate its relationship with previous cardiovascular disease (CVD) events in Korean patients with type 2 diabetes mellitus (T2DM). Methods: Laboratory and clinical data of 1,067 subjects with T2DM were obtained and reviewed. Study subjects were allocated into four subgroups according to the CKD classification. Major CVD events included coronary, cerebrovascular, and peripheral vascular events. Results: The nonalbuminuric stage ≥3 CKD group, when compared with the albuminuric stage ≥3 CKD group, had shorter diabetic duration, lower concentrations of glycated hemoglobin, high density lipoprotein cholesterol, and high-sensitivity C-reactive protein, lower prevalence of retinopathy and previous CVD, and a higher rate of treatment with angiotensin-converting enzyme inhibitors/angiotensin II receptor blockers. The nonalbuminuric stage ≥3 CKD group showed a greater association with prior CVD events than the no-CKD group; however, the albuminuric stage ≥3 CKD group showed a further, significant increase in the prevalence of prior CVD events when CKD categories were applied as covariates. Compared with the normal estimated glomerular filtration rate (eGFR) and nonalbuminuria categories, the association with prior CVD events became significant for declined eGFR (stronger for eGFR of <30 mL/min/1.73 m²) and for albuminuria. Conclusion: The results show that subjects with nonalbuminuric stage ≥3 CKD are more significantly associated with the occurrence of prior CVD events than those with normal eGFR with or without albuminuria. Compared with the normal eGFR and nonalbuminuria categories, the combination of an increased degree of albuminuria and declined eGFR becomes significant for the association with prior CVD events.

  10. Next-to-next-leading order correction to 3-jet rate and event-shape ...

    Indian Academy of Sciences (India)

    portunity to test QCD by measuring the energy dependence of different ... event shape data was not satisfactory largely due to the scale uncertainty of the perturbative ... (α_s)^3 dC̄/dy + O(α_s^4). (5) Here the event-shape distribution is normalized to the ... [1] A Gehrmann-De Ridder, T Gehrmann, E W N Glover and G Heinrich, J.

  11. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) are those that significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetration in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) for ramp detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring in clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.
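
    Below is a minimal sketch of the first OpSDA stage only: a basic swinging-door-style piecewise-linear segmentation of a power series. The dynamic-programming merge and clear-sky removal are not shown, and the tolerance epsilon and synthetic series are assumptions.

```python
# Swinging-door-style segmentation: keep extending a segment while a single
# line from the anchor can approximate all points within +/- epsilon.
import numpy as np

def swinging_door_segments(t, p, epsilon):
    """Return indices of the points that start each piecewise-linear segment."""
    starts = [0]
    anchor = 0
    s_low, s_high = -np.inf, np.inf          # feasible slope interval for current segment
    for i in range(1, len(p)):
        dt = t[i] - t[anchor]
        s_low = max(s_low, (p[i] - epsilon - p[anchor]) / dt)
        s_high = min(s_high, (p[i] + epsilon - p[anchor]) / dt)
        if s_low > s_high:                   # "doors" have crossed: close segment at i - 1
            anchor = i - 1
            starts.append(anchor)
            dt = t[i] - t[anchor]
            s_low = (p[i] - epsilon - p[anchor]) / dt
            s_high = (p[i] + epsilon - p[anchor]) / dt
    return starts

t = np.arange(100, dtype=float)              # minutes (synthetic)
p = np.concatenate([np.linspace(0, 50, 40),  # ramp-up
                    50 + 3 * np.random.default_rng(6).normal(size=20),
                    np.linspace(50, 5, 40)]) # ramp-down
print(swinging_door_segments(t, p, epsilon=2.0))
```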

  12. Leveraging KVM Events to Detect Cache-Based Side Channel Attacks in a Virtualization Environment

    Directory of Open Access Journals (Sweden)

    Ady Wahyudi Paundu

    2018-01-01

    Full Text Available Cache-based side channel attack (CSCa) techniques in virtualization systems are becoming more advanced, while defense methods against them are still perceived as nonpractical. The most recent CSCa variant, called Flush + Flush, has shown that current detection methods can be easily bypassed. Within this work, we introduce a novel monitoring approach to detect CSCa operations inside a virtualization environment. We utilize Kernel Virtual Machine (KVM) event data in the kernel and process this data using a machine learning technique to identify any CSCa operation in the guest Virtual Machine (VM). We evaluate our approach using Receiver Operating Characteristic (ROC) diagrams of multiple attack and benign operation scenarios. Our method successfully separates the CSCa datasets from the non-CSCa datasets, in both trained and nontrained data scenarios. The successful classification also includes the Flush + Flush attack scenario. We are also able to explain the classification results by extracting the set of most important features that separate both classes using their Fisher scores, and we show that our monitoring approach can detect CSCa in general. Finally, we evaluate the overhead of our CSCa monitoring method and show that it has a negligible computational overhead on the host and the guest VM.

  13. Ultra-Low Power Sensor System for Disaster Event Detection in Metro Tunnel Systems

    Directory of Open Access Journals (Sweden)

    Jonah VINCKE

    2017-05-01

    Full Text Available In this extended paper, the concept for an ultra-low power wireless sensor network (WSN) for underground tunnel systems is presented, highlighting the chosen sensors. Its objectives are the detection of emergency events arising either from natural disasters, such as flooding or fire, or from terrorist attacks using explosives. Earlier works have demonstrated that the power consumption for the communication can be reduced such that the data acquisition (i.e., sensor) sub-system becomes the most significant energy consumer. By using ultra-low power components for the smoke detector, a hydrostatic pressure sensor for water ingress detection and a passive acoustic emission sensor for explosion detection, all considered threats are covered while the energy consumption can be kept very low relative to the data acquisition. In addition to [1], the sensor system is integrated into a sensor board. The total average power consumption for operating the sensor sub-system is measured to be 35.9 µW for lower and 7.8 µW for upper nodes.

  14. The effect of whole-blood donor adverse events on blood donor return rates.

    Science.gov (United States)

    Newman, Bruce H; Newman, Daniel T; Ahmad, Raffat; Roth, Arthur J

    2006-08-01

    Some blood donation-related adverse events (AEs) can negatively impact the blood donor return rate (BDRR) and decrease donor retention. One-thousand randomly selected whole-blood donors were interviewed 3 weeks after a 525-mL index whole-blood donation for seven AEs. The number of return visits and duration of follow-up were recorded for each of the 1000 donors. A negative binomial regression analysis was used to determine the contribution of the four most common AEs to the BDRR, and interactions between these AEs were also evaluated. The four most common AEs were bruise alone (15.1%), sore arm "alone" (7.0%), fatigue "alone" (5.1%), and donor reaction "alone" (4.2%), where "alone" is defined to also include donors who had a bruise but no other AE. The estimated BDRR for donations without AEs was 1.32 visits per year. The estimated BDRRs for the four most common AEs were: bruise alone, 1.32 visits per year; sore arm alone, 1.30 visits per year (2% reduction in BDRR); fatigue alone, 1.06 visits per year (20% reduction in BDRR); and donor reaction alone, 0.87 visits per year (34% reduction in BDRR). The BDRR for donor reaction, fatigue, and sore arm together was 0.20 visits per year (85% reduction in BDRR). Donor reaction had the most negative impact on the BDRR. There appears to be a synergistic effect between donor reaction, fatigue, and sore arm. Theoretically, amelioration of some AEs has the potential to improve BDRRs.
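
    As a rough illustration of the kind of analysis described, the sketch below fits a negative binomial regression of return-visit counts on adverse-event indicators with follow-up time as exposure, using statsmodels; all data, coefficients, and the dispersion parameter are invented.

```python
# Negative binomial regression of return visits per year on AE indicators.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 1000
donor_reaction = rng.binomial(1, 0.04, n)
fatigue = rng.binomial(1, 0.05, n)
sore_arm = rng.binomial(1, 0.07, n)
followup_years = rng.uniform(0.5, 2.0, n)                    # exposure time (invented)

# Simulate visit counts whose rate drops with each adverse event (over-simplified).
rate = 1.3 * np.exp(-0.4 * donor_reaction - 0.2 * fatigue - 0.05 * sore_arm)
visits = rng.poisson(rate * followup_years)

X = sm.add_constant(np.column_stack([donor_reaction, fatigue, sore_arm]))
model = sm.GLM(visits, X,
               family=sm.families.NegativeBinomial(alpha=0.5),
               exposure=followup_years).fit()
print(model.params)          # exp(coef) approximates the multiplicative change in return rate
```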

  15. First events from the CNGS neutrino beam detected in the OPERA experiment

    CERN Document Server

    Acquafredda, R.; Ambrosio, M.; Anokhina, A.; Aoki, S.; Ariga, A.; Arrabito, L.; Autiero, D.; Badertscher, A.; Bergnoli, A.; Bersani Greggio, F.; Besnier, M.; Beyer, M.; Bondil-Blin, S.; Borer, K.; Boucrot, J.; Boyarkin, V.; Bozza, C.; Brugnera, R.; Buontempo, S.; Caffari, Y.; Campagne, Jean-Eric; Carlus, B.; Carrara, E.; Cazes, A.; Chaussard, L.; Chernyavsky, M.; Chiarella, V.; Chon-Sen, N.; Chukanov, A.; Ciesielski, R.; Consiglio, L.; Cozzi, M.; Dal Corso, F.; D'Ambrosio, N.; Damet, J.; De Lellis, G.; Declais, Y.; Descombes, T.; De Serio, M.; Di Capua, F.; Di Ferdinando, D.; Di Giovanni, A.; Di Marco, N.; Di Troia, C.; Dmitrievski, S.; Dracos, M.; Duchesneau, D.; Dulach, B.; Dusini, S.; Ebert, J.; Enikeev, R.; Ereditato, A.; Esposito, L.S.; Fanin, C.; Favier, J.; Felici, G.; Ferber, T.; Fournier, L.; Franceschi, A.; Frekers, D.; Fukuda, T.; Fukushima, C.; Galkin, V.I.; Galkin, V.A.; Gallet, R.; Garfagnini, A.; Gaudiot, G.; Giacomelli, G.; Giarmana, O.; Giorgini, M.; Girard, L.; Girerd, C.; Goellnitz, C.; Goldberg, J.; Gornoushkin, Y.; Grella, G.; Grianti, F.; Guerin, C.; Guler, M.; Gustavino, C.; Hagner, C.; Hamane, T.; Hara, T.; Hauger, M.; Hess, M.; Hoshino, K.; Ieva, M.; Incurvati, M.; Jakovcic, K.; Janicsko Csathy, J.; Janutta, B.; Jollet, C.; Juget, F.; Kazuyama, M.; Kim, S.H.; Kimura, M.; Knuesel, J.; Kodama, K.; Kolev, D.; Komatsu, M.; Kose, U.; Krasnoperov, A.; Kreslo, I.; Krumstein, Z.; Laktineh, I.; de La Taille, C.; Le Flour, T.; Lieunard, S.; Ljubicic, A.; Longhin, A.; Malgin, A.; Manai, K.; Mandrioli, G.; Mantello, U.; Marotta, A.; Marteau, J.; Martin-Chassard, G.; Matveev, V.; Messina, M.; Meyer, L.; Micanovic, S.; Migliozzi, P.; Miyamoto, S.; Monacelli, Piero; Monteiro, I.; Morishima, K.; Moser, U.; Muciaccia, M.T.; Mugnier, P.; Naganawa, N.; Nakamura, M.; Nakano, T.; Napolitano, T.; Natsume, M.; Niwa, K.; Nonoyama, Y.; Nozdrin, A.; Ogawa, S.; Olchevski, A.; Orlandi, D.; Ossetski, D.; Paoloni, A.; Park, B.D.; Park, I.G.; Pastore, A.; Patrizii, L.; Pellegrino, L.; Pessard, H.; Pilipenko, V.; Pistillo, C.; Polukhina, N.; Pozzato, M.; Pretzl, K.; Publichenko, P.; Raux, L.; Repellin, J.P.; Roganova, T.; Romano, G.; Rosa, G.; Rubbia, A.; Ryasny, V.; Ryazhskaya, O.; Ryzhikov, D.; Sadovski, A.; Sanelli, C.; Sato, O.; Sato, Y.; Saveliev, V.; Savvinov, N.; Sazhina, G.; Schembri, A.; Schmidt Parzefall, W.; Schroeder, H.; Schutz, H.U.; Scotto Lavina, L.; Sewing, J.; Shibuya, H.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Song, J.S.; Spaeti, R.; Spinetti, M.; Stanco, L.; Starkov, N.; Stipcevic, M.; Strolin, Paolo Emilio; Sugonyaev, V.; Takahashi, S.; Tereschenko, V.; Terranova, F.; Tezuka, I.; Tioukov, V.; Tikhomirov, I.; Tolun, P.; Toshito, T.; Tsarev, V.; Tsenov, R.; Ugolino, U.; Ushida, N.; Van Beek, G.; Verguilov, V.; Vilain, P.; Votano, L.; Vuilleumier, J.L.; Waelchli, T.; Waldi, R.; Weber, M.; Wilquet, G.; Wonsak, B.; Wurth, R.; Wurtz, J.; Yakushev, V.; Yoon, C.S.; Zaitsev, Y.; Zamboni, I.; Zimmerman, R.

    2006-01-01

    The OPERA neutrino detector at the underground Gran Sasso Laboratory (LNGS) was designed to perform the first detection of neutrino oscillations in appearance mode, through the study of nu_mu to nu_tau oscillations. The apparatus consists of a lead/emulsion-film target complemented by electronic detectors. It is placed in the high-energy, long-baseline CERN to LNGS beam (CNGS) 730 km away from the neutrino source. In August 2006 a first run with CNGS neutrinos was successfully conducted. A first sample of neutrino events was collected, statistically consistent with the integrated beam intensity. After a brief description of the beam and of the various sub-detectors, we report on the achievement of this milestone, presenting the first data and some analysis results.

  16. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and temporal resolution data obtained by the telescopes aboard Hinode revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical flow estimation methods based on OpenCV, the computer vision library. We applied the methods to the prominence eruption observed by NoRH and the polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for the methods. This indicates that the optical flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
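
    A minimal sketch of dense optical flow estimation with OpenCV is shown below; the two synthetic frames and Farneback parameters are illustrative, and a real analysis would read the solar images and convert pixel displacements to physical velocities using the plate scale and cadence.

```python
# Dense Farneback optical flow between two consecutive frames.
import cv2
import numpy as np

rng = np.random.default_rng(8)
base = rng.uniform(0, 255, size=(256, 256)).astype(np.uint8)
prev = cv2.GaussianBlur(base, (15, 15), 0)        # smooth synthetic "intensity" frame
nxt = np.roll(prev, shift=3, axis=1)              # same frame shifted by 3 pixels

# Arguments: prev, next, flow, pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
flow = cv2.calcOpticalFlowFarneback(prev, nxt, None, 0.5, 3, 15, 3, 5, 1.2, 0)
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
print(float(np.median(mag)))                      # median apparent displacement (pixels/frame),
                                                  # expected to be close to the 3-pixel shift
```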

  17. Detection of Cardiopulmonary Activity and Related Abnormal Events Using Microsoft Kinect Sensor.

    Science.gov (United States)

    Al-Naji, Ali; Chahl, Javaan

    2018-03-20

    Monitoring of cardiopulmonary activity is a challenge when attempted under adverse conditions, including different sleeping postures, environmental settings, and an unclear region of interest (ROI). This study proposes an efficient remote imaging system based on a Microsoft Kinect v2 sensor for the observation of the cardiopulmonary signal and the detection of related abnormal cardiopulmonary events (e.g., tachycardia, bradycardia, tachypnea, bradypnea, and central apnoea) in many possible sleeping postures within varying environmental settings, including total darkness and whether or not the subject is covered by a blanket. The proposed system extracts the signal from the abdominal-thoracic region, where cardiopulmonary activity is most pronounced, using a real-time image sequence captured by the Kinect v2 sensor. The proposed system shows promising results in any sleep posture, regardless of illumination conditions and unclear ROI, even in the presence of a blanket, whilst being reliable, safe, and cost-effective.

  18. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system operating at the edge of stability, in which only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are based solely on a few low-range, high-noise sensor readings. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated both in a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
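
    The sketch below is a minimal echo state network illustrating the reservoir computing idea: a fixed random recurrent reservoir driven by noisy sensor inputs, with only a linear readout trained by ridge regression. The event-flagging task and all sizes are assumptions, not the robot experiments described.

```python
# Fixed random reservoir + ridge-regression readout on a synthetic sensor stream.
import numpy as np

rng = np.random.default_rng(9)
T, n_in, n_res = 2000, 2, 100

# Synthetic noisy range-sensor stream and a target "event" label.
u = rng.uniform(0, 1, size=(T, n_in)) + 0.1 * rng.normal(size=(T, n_in))
y = ((u[:, 0] < 0.3) & (u[:, 1] < 0.3)).astype(float)   # event: both readings small

# Fixed random reservoir, rescaled to a spectral radius below 1 (echo state property).
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Ridge-regression readout (the only trained part of the system).
lam = 1e-2
W_out = np.linalg.solve(states.T @ states + lam * np.eye(n_res), states.T @ y)
pred = (states @ W_out) > 0.5
print("training accuracy:", np.mean(pred == y.astype(bool)))
```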

  19. Decision support methods for the detection of adverse events in post-marketing data.

    Science.gov (United States)

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals, who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use, those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
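
    As one concrete example of such a statistical measure of reporting frequency, the sketch below computes the proportional reporting ratio (PRR) from a 2x2 contingency table of spontaneous reports; the counts are invented, and real screening systems apply this measure (or Bayesian shrinkage variants) across all drug-event combinations.

```python
# Proportional reporting ratio (PRR) for one drug-event combination.
def proportional_reporting_ratio(a, b, c, d):
    """a: reports of the event with the drug, b: other events with the drug,
    c: the event with all other drugs, d: other events with all other drugs."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts for a single drug-event pair.
print(round(proportional_reporting_ratio(a=30, b=970, c=200, d=49800), 2))
```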

  20. Detection of Cardiopulmonary Activity and Related Abnormal Events Using Microsoft Kinect Sensor

    Directory of Open Access Journals (Sweden)

    Ali Al-Naji

    2018-03-01

    Full Text Available Monitoring of cardiopulmonary activity is a challenge when attempted under adverse conditions, including different sleeping postures, environmental settings, and an unclear region of interest (ROI). This study proposes an efficient remote imaging system based on a Microsoft Kinect v2 sensor for the observation of the cardiopulmonary signal and the detection of related abnormal cardiopulmonary events (e.g., tachycardia, bradycardia, tachypnea, bradypnea, and central apnoea) in many possible sleeping postures within varying environmental settings, including total darkness and whether or not the subject is covered by a blanket. The proposed system extracts the signal from the abdominal-thoracic region, where cardiopulmonary activity is most pronounced, using a real-time image sequence captured by the Kinect v2 sensor. The proposed system shows promising results in any sleep posture, regardless of illumination conditions and unclear ROI, even in the presence of a blanket, whilst being reliable, safe, and cost-effective.

  1. Improved detection of congestive heart failure via probabilistic symbolic pattern recognition and heart rate variability metrics.

    Science.gov (United States)

    Mahajan, Ruhi; Viangteeravat, Teeradache; Akbilgic, Oguz

    2017-12-01

    A timely diagnosis of congestive heart failure (CHF) is crucial to evade a life-threatening event. This paper presents a novel probabilistic symbol pattern recognition (PSPR) approach to detect CHF in subjects from their cardiac interbeat (R-R) intervals. PSPR discretizes each continuous R-R interval time series by mapping them onto an eight-symbol alphabet and then models the pattern transition behavior in the symbolic representation of the series. The PSPR-based analysis of the discretized series from 107 subjects (69 normal and 38 CHF subjects) yielded discernible features to distinguish normal subjects and subjects with CHF. In addition to PSPR features, we also extracted features using the time-domain heart rate variability measures such as average and standard deviation of R-R intervals. An ensemble of bagged decision trees was used to classify two groups resulting in a five-fold cross-validation accuracy, specificity, and sensitivity of 98.1%, 100%, and 94.7%, respectively. However, a 20% holdout validation yielded an accuracy, specificity, and sensitivity of 99.5%, 100%, and 98.57%, respectively. Results from this study suggest that features obtained with the combination of PSPR and long-term heart rate variability measures can be used in developing automated CHF diagnosis tools. Copyright © 2017 Elsevier B.V. All rights reserved.
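
    A minimal sketch of the symbolisation step is given below, assuming quantile binning onto an eight-symbol alphabet (the paper's exact discretisation may differ); the transition probabilities and simple time-domain HRV measures form a feature vector that could feed a classifier such as bagged trees.

```python
# Map R-R intervals to 8 symbols and estimate the symbol-transition matrix.
import numpy as np

rng = np.random.default_rng(10)
rr = 0.8 + 0.05 * rng.normal(size=500)                  # synthetic R-R intervals (s)

edges = np.quantile(rr, np.linspace(0, 1, 9)[1:-1])     # 7 internal edges -> 8 bins
symbols = np.digitize(rr, edges)                        # values 0..7

trans = np.zeros((8, 8))
for s_from, s_to in zip(symbols[:-1], symbols[1:]):
    trans[s_from, s_to] += 1
trans /= np.maximum(trans.sum(axis=1, keepdims=True), 1)    # row-normalised probabilities

hrv_features = [rr.mean(), rr.std()]                    # simple time-domain HRV measures
features = np.concatenate([trans.ravel(), hrv_features])    # classifier input vector
print(features.shape)
```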

  2. Convolutional neural networks for event-related potential detection: impact of the architecture.

    Science.gov (United States)

    Cecotti, H

    2017-07-01

    The detection of brain responses at the single-trial level in the electroencephalogram (EEG), such as event-related potentials (ERPs), is a difficult problem that requires different processing steps to extract relevant discriminant features. While most of the signal processing and classification techniques for the detection of brain responses are based on linear algebra, different pattern recognition techniques such as the convolutional neural network (CNN), a type of deep learning technique, have attracted interest as they are able to process the signal after limited pre-processing. In this study, we propose to investigate the performance of CNNs in relation to their architecture and to how they are evaluated: a single system for each subject, or one system for all subjects. More particularly, we want to address the change in performance that can be observed between specializing a neural network to a single subject and considering a neural network for a group of subjects, taking advantage of the larger number of trials from different subjects. The results support the conclusion that a convolutional neural network trained on different subjects can lead to an AUC above 0.9 by using an appropriate architecture with spatial filtering and shift-invariant layers.
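
    Below is a minimal sketch of a CNN of the kind discussed, with a spatial-filtering convolution across electrodes followed by a shift-invariant temporal convolution and pooling; the layer sizes, channel and sample counts, and random input are assumptions rather than the architecture evaluated in the study.

```python
# Small CNN for single-trial ERP detection (target vs. non-target).
import torch
import torch.nn as nn

n_channels, n_samples = 32, 128          # EEG electrodes x time samples per epoch (assumed)

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=(n_channels, 1)),           # spatial filters across electrodes
    nn.ReLU(),
    nn.Conv2d(8, 8, kernel_size=(1, 15), padding=(0, 7)),   # temporal, shift-invariant filters
    nn.ReLU(),
    nn.AvgPool2d(kernel_size=(1, 8)),                       # temporal pooling
    nn.Flatten(),
    nn.Linear(8 * (n_samples // 8), 2),                     # target vs. non-target ERP
)

x = torch.randn(4, 1, n_channels, n_samples)                # a batch of 4 EEG epochs
print(model(x).shape)                                       # -> torch.Size([4, 2])
```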

  3. General programmable Level-1 trigger with 3D-Flow assembly system for calorimeters of different sizes and event rates

    International Nuclear Information System (INIS)

    Crosetto, D.

    1992-12-01

    Experience demonstrates that fine tuning on the trigger of an experiment is often achieved only after running the experiment and analyzing the first data acquired. It is desirable that identification and, consequently, selection of interesting events be made on a more refined identification of particles. Use of an innovative parallel-processing system architecture together with an instruction set allows identification of objects (particles) among the data coming from a calorimeter in a programmable manner, utilizing the information related to their shape in two- or three-dimensional form, rather than applying only a programmable threshold proportional to their energy. The architecture is flexible, allowing execution of simple algorithms as well as complex pattern recognition algorithms. It is scalable in the sense that the same hardware can be used for small or large calorimeters having a slow or fast event rate. The simple printed circuit board (accommodating 16 x 3D-Flow processors) on a 4 in. x 4 in. board described herein uses the same hardware to build a large Level-1 programmable trigger (by interconnecting many boards in a matrix array) and is capable of implementing simple or complex pattern recognition algorithms at different event input rates (by cascading boards one on top of another). With the same hardware one can build low-cost, programmable Level-1 triggers for a small and low-event-rate calorimeter, or high-performance, programmable Level-1 triggers for a large calorimeter capable of sustaining up to 60 million events per second

  4. The power to detect recent fragmentation events using genetic differentiation methods.

    Directory of Open Access Journals (Sweden)

    Michael W Lloyd

    Full Text Available Habitat loss and fragmentation are imminent threats to biological diversity worldwide and thus are fundamental issues in conservation biology. Increased isolation alone has been implicated as a driver of negative impacts in populations associated with fragmented landscapes. Genetic monitoring and the use of measures of genetic divergence have been proposed as means to detect changes in landscape connectivity. Our goal was to evaluate the sensitivity of Wright's Fst, Hedrick's G'st, Sherwin's MI, and Jost's D to recent fragmentation events across a range of population sizes and sampling regimes. We constructed an individual-based model, which used a factorial design to compare the effects of varying population size, presence or absence of overlapping generations, and presence or absence of population sub-structuring. Increases in population size, overlapping generations, and population sub-structuring each reduced Fst, G'st, MI, and D. The signal of fragmentation was detected within two generations for all metrics. However, the magnitude of the change in each was small in all cases, and when Ne was >100 individuals it was extremely small. Multi-generational sampling and population estimates are required to differentiate the signal of background divergence from changes in Fst, G'st, MI, and D associated with fragmentation. Finally, the window during which rapid change in Fst, G'st, MI, and D between generations occurs can be small, and if missed would lead to inconclusive results. For these reasons, use of Fst, G'st, MI, or D for detecting and monitoring changes in connectivity is likely to prove difficult in real-world scenarios. We advocate use of genetic monitoring only in conjunction with estimates of actual movement among patches, such that one could compare current movement with the genetic signature of past movement to determine whether there has been a change.
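
    For reference, the sketch below computes two of the differentiation measures named above for a single biallelic locus: Gst (the multi-allele analogue of Wright's Fst) and Jost's D; the allele frequencies are invented, and G'st and Sherwin's MI require additional terms omitted here.

```python
# Gst and Jost's D for one biallelic locus across n subpopulations.
import numpy as np

p = np.array([0.60, 0.45, 0.30])          # allele-A frequency in each subpopulation (invented)
n = len(p)

hs = np.mean(2 * p * (1 - p))             # mean within-subpopulation heterozygosity
p_bar = p.mean()
ht = 2 * p_bar * (1 - p_bar)              # total heterozygosity of the pooled population

gst = (ht - hs) / ht                      # (Ht - Hs) / Ht
jost_d = ((ht - hs) / (1 - hs)) * (n / (n - 1))
print(round(gst, 3), round(jost_d, 3))
```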

  5. Detection of adverse events in general surgery using the "Trigger Tool" methodology.

    Science.gov (United States)

    Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro

    2015-02-01

    Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE detected by the "Global Trigger Tool" methodology in surgical patients, their characteristics, and the validity of the tool. Retrospective, observational study of patients admitted to a general surgery department who underwent a surgical operation in a third-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the "Global Trigger Tool" methodology. Once an AE was identified, a harm category was assigned, including the grade to which the AE could have been avoided and its relation to the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient. 56.2% were deemed preventable. 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted "Global Trigger Tool" methodology has demonstrated to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Published by Elsevier España, S.L.U. All rights reserved.
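
    The sketch below shows how the reported validity measures (sensitivity, specificity, positive and negative predictive values) follow from a 2x2 table of trigger-tool results against a reference review; the counts are invented purely to show the arithmetic, not the study's data.

```python
# Screening-validity measures from a 2x2 confusion table.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts of trigger-tool positives/negatives vs. reference review.
print(screening_metrics(tp=86, fp=11, fn=14, tn=161))
```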

  6. Pulse oximetry recorded from the Phone Oximeter for detection of obstructive sleep apnea events with and without oxygen desaturation in children.

    Science.gov (United States)

    Garde, Ainara; Dehkordi, Parastoo; Wensley, David; Ansermino, J Mark; Dumont, Guy A

    2015-01-01

    Obstructive sleep apnea (OSA) disrupts normal ventilation during sleep and can lead to serious health problems in children if left untreated. Polysomnography, the gold standard for OSA diagnosis, is resource intensive and requires a specialized laboratory. Thus, we proposed to use the Phone Oximeter™, a portable device integrating pulse oximetry with a smartphone, to detect OSA events. As a proportion of OSA events occur without oxygen desaturation (defined as SpO2 decreases ≥ 3%), we suggest combining SpO2 and pulse rate variability (PRV) analysis to identify all OSA events and provide a more detailed sleep analysis. We recruited 160 children and recorded pulse oximetry consisting of SpO2 and photoplethysmography (PPG) using the Phone Oximeter™, alongside standard polysomnography. A sleep technician visually scored all OSA events with and without oxygen desaturation from polysomnography. We divided pulse oximetry signals into 1-min signal segments and extracted several features from SpO2 and PPG analysis in the time and frequency domains. Segments with OSA, especially the ones with oxygen desaturation, presented greater SpO2 variability and modulation reflected in the spectral domain than segments without OSA. Segments with OSA also showed higher heart rate and sympathetic activity through the PRV analysis relative to segments without OSA. PRV analysis was more sensitive than SpO2 analysis for identification of OSA events without oxygen desaturation. Combining SpO2 and PRV analysis enhanced OSA event detection through a multiple logistic regression model. The area under the ROC curve increased from 81% to 87%. Thus, the Phone Oximeter™ might be useful to monitor sleep and identify OSA events with and without oxygen desaturation at home.
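
    To illustrate the feature-combination step described above, here is a minimal sketch that feeds per-minute SpO2 and PRV summary features into a logistic regression and reports an AUC; the feature names, the synthetic data, and the scikit-learn model are assumptions for illustration, not the study's actual pipeline.

```python
# Minimal sketch of the feature-combination idea: per-minute SpO2 and
# pulse-rate-variability (PRV) features feeding a logistic regression.
# Feature names and the synthetic data are illustrative only, not the
# study's actual feature set.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 400  # 1-minute segments

# Hypothetical features: SpO2 variability, SpO2 spectral power,
# mean pulse rate, and an LF/HF ratio from PRV analysis.
X = rng.normal(size=(n, 4))
# Synthetic labels loosely tied to the features (1 = OSA segment).
logits = 1.2 * X[:, 0] + 0.8 * X[:, 2] + 0.6 * X[:, 3] - 0.5
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print(f"in-sample AUC: {auc:.2f}")  # the study reports ~0.87 on real data
```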

  7. Tracking Real-Time Changes in Working Memory Updating and Gating with the Event-Based Eye-Blink Rate

    NARCIS (Netherlands)

    Rac-Lubashevsky, R.; Slagter, H.A.; Kessler, Y.

    2017-01-01

    Effective working memory (WM) functioning depends on the gating process that regulates the balance between maintenance and updating of WM. The present study used the event-based eye-blink rate (ebEBR), which presumably reflects phasic striatal dopamine activity, to examine how the cognitive

  8. Atrial fibrillation detection by heart rate variability in Poincare plot.

    Science.gov (United States)

    Park, Jinho; Lee, Sangwook; Jeon, Moongu

    2009-12-11

    Atrial fibrillation (AFib) is one of the prominent causes of stroke, and its risk increases with age. We need to detect AFib correctly as early as possible to avoid medical disaster because it is likely to proceed into a more serious form in a short time. If we can make a portable AFib monitoring system, it will be helpful to many elderly people because we cannot predict when a patient will have a spasm of AFib. We analyzed heart beat variability from inter-beat intervals obtained by a wavelet-based detector. We made a Poincare plot using the inter-beat intervals. By analyzing the plot, we extracted three feature measures characterizing AFib and non-AFib: the number of clusters, mean stepping increment of inter-beat intervals, and dispersion of the points around a diagonal line in the plot. We divided the distribution of the number of clusters into two parts and calculated the mean value of the lower part by the k-means clustering method. We classified data whose number of clusters is more than one and less than this mean value as non-AFib data. In the other case, we tried to discriminate AFib from non-AFib using a support vector machine with the other feature measures: the mean stepping increment and dispersion of the points in the Poincare plot. We found that the Poincare plot from non-AFib data showed some pattern, while the plot from AFib data showed an irregularly irregular shape. In the case of non-AFib data, the definite pattern in the plot manifested itself as a limited number of clusters or one closely packed cluster. In the case of AFib data, the number of clusters in the plot was one or too many. We evaluated the accuracy using leave-one-out cross-validation. Mean sensitivity and mean specificity were 91.4% and 92.9%, respectively. Because pulse beats of ventricles are less likely to be influenced by baseline wandering and noise, we used the inter-beat intervals to diagnose AFib. We visually displayed regularity of the inter-beat intervals by way of the Poincare plot. We tried to design an
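
    To make the feature definitions above concrete, here is a small sketch of two of them, the mean stepping increment and the dispersion of Poincare-plot points around the diagonal, computed from a synthetic RR-interval series; the exact normalisations used in the paper may differ.

```python
# Minimal sketch of two of the Poincare-plot features described above:
# mean stepping increment of inter-beat intervals and dispersion of the
# points around the diagonal. The RR series is synthetic/illustrative.
import numpy as np

def poincare_features(rr):
    """rr: 1-D array of inter-beat (RR) intervals in seconds."""
    x, y = rr[:-1], rr[1:]               # successive-interval pairs (the plot)
    step = np.abs(np.diff(rr))           # beat-to-beat increments
    mean_step = step.mean() / rr.mean()  # normalised mean stepping increment
    # Perpendicular distance of each point from the identity line y = x
    d_perp = np.abs(y - x) / np.sqrt(2)
    dispersion = d_perp.std()
    return mean_step, dispersion

regular = 0.8 + 0.02 * np.random.default_rng(1).standard_normal(300)
irregular = 0.8 + 0.15 * np.random.default_rng(2).standard_normal(300)
print(poincare_features(regular))    # small values -> non-AFib-like
print(poincare_features(irregular))  # larger values -> AFib-like
```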

  9. Maximum Redshift of Gravitational Wave Merger Events

    Science.gov (United States)

    Koushiappas, Savvas M.; Loeb, Abraham

    2017-12-01

    Future generations of gravitational wave detectors will have the sensitivity to detect gravitational wave events at redshifts far beyond any detectable electromagnetic sources. We show that if the observed event rate is greater than one event per year at redshifts z ≥ 40, then the probability distribution of primordial density fluctuations must be significantly non-Gaussian or the events originate from primordial black holes. The nature of the excess events can be determined from the redshift distribution of the merger rate.

  10. FOREWORD: 3rd Symposium on Large TPCs for Low Energy Event Detection

    Science.gov (United States)

    Irastorza, Igor G.; Colas, Paul; Gorodetzky, Phillippe

    2007-05-01

    The Third International Symposium on large TPCs for low-energy rare-event detection was held at Carré des sciences, Poincaré auditorium, 25 rue de la Montagne Ste Geneviève in Paris on 11-12 December 2006. This prestigious location belonging to the Ministry of Research is hosted in the former Ecole Polytechnique. The meeting, held in Paris every two years, gathers a significant community of physicists involved in rare event detection. Its purpose is an extensive discussion of present and future projects using large TPCs for low energy, low background detection of rare events (low-energy neutrinos, dark matter, solar axions). The use of a new generation of Micro-Pattern Gaseous Detectors (MPGD) appears to be a promising way to reach this goal. The program this year was enriched by a new session devoted to the detection challenge of polarized gamma rays, relevant novel experimental techniques and the impact on particle physics, astrophysics and astronomy. A very particular feature of this conference is the large variety of talks ranging from purely theoretical to purely experimental subjects including novel technological aspects. This allows discussion and exchange of useful information and new ideas that are emerging to address particle physics experimental challenges. The scientific highlights at the Symposium came on many fronts: the status of low-energy neutrino physics and double-beta decay; new ideas on double-beta decay experiments; gamma-ray polarization measurement combining high-precision TPCs with MPGD read-out; dark matter challenges in both axion and WIMP searches, with new emerging ideas for detection improvements; and progress in gaseous and liquid TPCs for rare event detection. Georges Charpak opened the meeting with a talk on gaseous detectors for applications in the bio-medical field. He also underlined the importance of new MPGD detectors for both physics and applications. There were about 100 registered participants at the symposium. The successful

  11. Bioluminescence Detection of Cells Having Stabilized p53 in Response to a Genotoxic Event

    Directory of Open Access Journals (Sweden)

    Alnawaz Rehemtulla

    2004-01-01

    Full Text Available Inactivation of p53 is one of the most frequent molecular events in neoplastic transformation. Approximately 60% of all human tumors have mutations in both p53 alleles. Wild-type p53 activity is regulated in large part by the proteasome-dependent degradation of p53, resulting in a short p53 half-life in unstressed and untransformed cells. Activation of p53 by a variety of stimuli, including DNA damage induced by genotoxic drugs or radiation, is accomplished by stabilization of wild-type p53. The stabilized and active p53 can result in either cell-cycle arrest or apoptosis. Surprisingly, the majority of tumor-associated, inactivating p53 mutations also result in p53 accumulation. Thus, constitutive elevation of p53 levels in cells is a reliable measure of p53 inactivation, whereas transiently increased p53 levels reflect a recent genotoxic stress. In order to facilitate noninvasive imaging of p53 accumulation, we here describe the construction of a p53-luciferase fusion protein. Induction of DNA damage in cells expressing the fusion protein resulted in a time-dependent accumulation of the fusion that was noninvasively detected using bioluminescence imaging and validated by Western blot analysis. The p53-Luc protein retains p53 function because its expression in HCT116 cells lacking functional p53 resulted in activation of p21 expression as well as induction of apoptosis in response to a DNA damaging event. Employed in a transgenic animal model, the proposed p53-reporter fusion protein will be useful for studying p53 activation in response to exposure to DNA-damaging carcinogenic agents. It could also be used to study p53 stabilization as a result of inactivating p53 mutations. Such studies will further our understanding of p53's role as the “guardian of the genome” and its function in tumorigenesis.

  12. Robust event-triggered MPC with guaranteed asymptotic bound and average sampling rate

    NARCIS (Netherlands)

    Brunner, F.D.; Heemels, W.P.M.H.; Allgower, F.

    2017-01-01

    We propose a robust event-triggered model predictive control (MPC) scheme for linear time-invariant discrete-time systems subject to bounded additive stochastic disturbances and hard constraints on the input and state. For given probability distributions of the disturbances acting on the system, we

  13. Next-to-next-leading order correction to 3-jet rate and event-shape ...

    Indian Academy of Sciences (India)

    The coupling constant, α_s, was measured by two different methods: first by employing the three-jet observables. Combining all the data, the value of α_s at next-to-next-to-leading order (NNLO) was determined to be 0.117 ± 0.004 (hard) ± 0.006 (theo). Secondly, from the event-shape distributions, the strong coupling constant, ...

  14. PET-CT detection rate of primary breast cancer lesions. Correlation with the clinicopathological factors

    International Nuclear Information System (INIS)

    Ogawa, Tomoko; Tozaki, Mitsuhiro; Fukuma, Eisuke

    2008-01-01

    One hundred and forty lesions of primary breast cancer underwent positron emission tomography (PET)-CT between June 2006 and May 2007. The PET-CT detection rate of primary breast cancer lesions was 72.1%. The detection rate was 52.1% for invasive cancers ≤20 mm and 92.8% for invasive breast cancers >20 mm, a significant difference. In the present study, no significant relationship was observed with tumor type; however, invasive lobular carcinoma showed a lower detection rate (58.3%). The PET-CT results were not significantly affected by estrogen or progesterone receptor status or by distant metastasis. A significant correlation with the PET-CT detection rate was found for HER2 status, tumor grade, and axillary lymph node status. The detection rate for invasive cancers ≤20 mm was 100% when the interval between prior diagnostic Mammotome biopsies and PET-CT was less than 3 weeks and 18.8% when the interval was more than 3 weeks, also a significant difference. Mammotome biopsies may therefore affect the PET-CT detection rate. Invasive cancers ≤20 mm showed a low detection rate; PET-CT is therefore considered insufficient for the detection of early breast cancer. (author)

  15. Correction to the count-rate detection limit and sample/blank time-allocation methods

    International Nuclear Information System (INIS)

    Alvarez, Joseph L.

    2013-01-01

    A common form of count-rate detection limits contains a propagation of uncertainty error. This error originated in methods to minimize uncertainty in the subtraction of the blank counts from the gross sample counts by allocation of blank and sample counting times. Correct uncertainty propagation showed that the time allocation equations have no solution. This publication presents the correct form of count-rate detection limits. -- Highlights: •The paper demonstrated a proper method of propagating uncertainty of count rate differences. •The standard count-rate detection limits were in error. •Count-time allocation methods for minimum uncertainty were in error. •The paper presented the correct form of the count-rate detection limit. •The paper discussed the confusion between count-rate uncertainty and count uncertainty
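
    For context, the sketch below shows the textbook propagation of uncertainty for a net count rate (gross sample rate minus blank rate) under Poisson statistics; the counts and counting times are illustrative, and this is the standard expression rather than the corrected detection-limit formula derived in the paper.

```python
# Minimal sketch: uncertainty of a net count rate r = Ns/ts - Nb/tb,
# assuming Poisson counting statistics (var(N) = N). Illustrative numbers;
# this is the textbook propagation, not the paper's corrected limit.
from math import sqrt

def net_rate_and_sigma(n_sample, t_sample, n_blank, t_blank):
    rate = n_sample / t_sample - n_blank / t_blank
    # var(r) = Ns/ts^2 + Nb/tb^2 for independent Poisson counts
    sigma = sqrt(n_sample / t_sample**2 + n_blank / t_blank**2)
    return rate, sigma

r, s = net_rate_and_sigma(n_sample=480, t_sample=600, n_blank=420, t_blank=600)
print(f"net rate = {r:.4f} +/- {s:.4f} counts/s")
```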

  16. Detection of sentinel lymph node in breast cancer and malignant melanoma - Influence of some factors on detection success rate

    International Nuclear Information System (INIS)

    Krafta, O.; Safarcika, K.; Stepien, A.

    2004-01-01

    Full text: The aim of this study was to compare three radiopharmaceuticals for sentinel lymph node detection in breast cancer and malignant melanoma patients. We examined 100 women and 2 men with breast cancer (average age 59.3 years) and 167 patients with malignant melanoma (69 men with mean age of 58.6 years and 98 women with mean age of 53.6 years). Lymphoscintigraphy was performed in all patients after injection of the radiotracer, one of the three: NANOCIS (average particle size 100 nm), SENTISCINT (particle size 100-600 nm), and NANOCOLL (particle size under 80 nm). Dynamic scintigraphy was performed in melanoma patients, while breast cancer patients were subjected to static imaging at 1-2 and 22 hours after injection. In patients with melanoma, surgery was done on the same day to remove the primary tumor, the sentinel lymph node, and other nodes wherever required. In breast cancer patients, surgery was generally done on the second day after radiotracer injection. In the operating theatre, isosulfan blue dye and a gamma probe were used to detect sentinel lymph nodes. In breast cancer patients, scintigraphy detected a total of 231 lymph nodes but failed to show a sentinel lymph node in 7 patients (success rate of lymphoscintigraphy 93.1%). Using the gamma probe, 158 lymph nodes were detected in 89 patients, but sentinel nodes were missed in 9 patients (success rate of the probe was 89.9%). 146 lymph nodes could be visualised using blue dye in 92 patients but were missed in 12 patients (detection rate by dye was 87%). In 2 patients the sentinel lymph node could not be detected by any method. In patients with melanoma, scintigraphy showed 304 lymph nodes. However, it did not detect a sentinel lymph node in 9 patients (success rate of lymphoscintigraphy was 94.6%). 104 patients were examined by means of the gamma probe and 132 lymph nodes were detected, with no lymph node found in 13 patients (success rate of the probe 87.5%). Using blue dye in 140 patients, 131 nodes were found but were

  17. Short gamma-ray burst formation rate from BATSE data using E_p-L_p correlation and the minimum gravitational-wave event rate of a coalescing compact binary

    Energy Technology Data Exchange (ETDEWEB)

    Yonetoku, Daisuke; Sawano, Tatsuya; Toyanago, Asuka [College of Science and Engineering, School of Mathematics and Physics, Kanazawa University, Kakuma, Kanazawa, Ishikawa 920-1192 (Japan); Nakamura, Takashi [Department of Physics, Kyoto University, Kyoto 606-8502 (Japan); Takahashi, Keitaro, E-mail: yonetoku@astro.s.kanazawa-u.ac.jp, E-mail: takashi@tap.scphys.kyoto-u.ac.jp [Faculty of Science, Kumamoto University, Kurokami, Kumamoto 860-8555 (Japan)

    2014-07-01

    Using 72 short gamma-ray bursts (SGRBs) with well determined spectral data observed by BATSE, we determine their redshift and luminosity by applying the E_p-L_p correlation for SGRBs found by Tsutsui et al. For 53 SGRBs with an observed flux brighter than 4 × 10^−6 erg cm^−2 s^−1, the cumulative redshift distribution up to z = 1 agrees well with that of 22 Swift SGRBs. This suggests that the redshift determination by the E_p-L_p correlation for SGRBs works well. The minimum event rate at z = 0 is estimated as R_on-axis^min = 6.3 (+3.1/−3.9) × 10^−10 events Mpc^−3 yr^−1, so that the minimum beaming angle is 0.6°-7.8° assuming a merging rate of 10^−7 to 4 × 10^−6 events Mpc^−3 yr^−1 suggested from the binary pulsar data. Interestingly, this angle is consistent with that for SGRB 130603B of ∼4°-8°. On the other hand, if we assume a beaming angle of ∼6° suggested from four SGRBs with the observed beaming angle value, then the minimum event rate including off-axis SGRBs is estimated as R_all^min = 1.15 (+0.56/−0.66) × 10^−7 events Mpc^−3 yr^−1. If SGRBs are induced by the coalescence of binary neutron stars (NSs) and/or black holes (BHs), then this event rate leads to a minimum gravitational-wave detection rate of 3.8 (+1.8/−2.2) events yr^−1 for an NS-NS binary and 146 (+71/−83) events yr^−1 for an NS-BH binary, respectively, by a worldwide network with KAGRA, advanced-LIGO, advanced-VIRGO, and GEO.
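
    For context, the beaming correction implied by the numbers above can be reproduced with the standard relation R_all = R_on-axis / (1 − cos θ_jet) for two opposite jet cones; the sketch below uses the central values quoted in the abstract and assumes this conventional beaming-fraction definition.

```python
# Minimal sketch of the beaming correction implied above: the total merger
# rate follows from the observed on-axis SGRB rate divided by the beaming
# fraction f_b = 1 - cos(theta_jet), i.e. the sky fraction covered by two
# opposite jet cones of half-angle theta_jet (assumed convention).
# Numbers are the central values quoted in the abstract.
from math import cos, radians

R_on_axis = 6.3e-10        # events Mpc^-3 yr^-1 (on-axis SGRB rate at z = 0)
theta_jet_deg = 6.0        # assumed jet half-opening angle in degrees

f_b = 1.0 - cos(radians(theta_jet_deg))
R_all = R_on_axis / f_b
print(f"beaming fraction f_b = {f_b:.2e}")
print(f"total rate R_all    = {R_all:.2e} events Mpc^-3 yr^-1")  # ~1.15e-7
```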

  18. A video event trigger for high frame rate, high resolution video technology

    Science.gov (United States)

    Williams, Glenn L.

    1991-12-01

    When video replaces film the digitized video data accumulates very rapidly, leading to a difficult and costly data storage problem. One solution exists for cases when the video images represent continuously repetitive 'static scenes' containing negligible activity, occasionally interrupted by short events of interest. Minutes or hours of redundant video frames can be ignored, and not stored, until activity begins. A new, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term or short term changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pretrigger and post-trigger storage techniques are then adaptable for archiving the digital stream from only the significant video images.

  19. Detecting Micro-seismicity and Long-duration Tremor-like Events from the Oklahoma Wavefield Experiment

    Science.gov (United States)

    Li, C.; Li, Z.; Peng, Z.; Zhang, C.; Nakata, N.

    2017-12-01

    Oklahoma has experienced an abrupt increase in induced seismicity in the last decade. An important way to fully understand seismic activities in Oklahoma is to obtain more complete earthquake catalogs and detect different types of seismic events. The IRIS Community Wavefield Demonstration Experiment was deployed near Enid, Oklahoma in the summer of 2016. The dataset from this ultra-dense array provides an excellent opportunity for detecting microseismicity in that region with wavefield approaches. Here we examine continuous waveforms recorded by 3 seismic lines using local coherence for ultra-dense arrays (Li et al., 2017), which is a measure of the cross-correlation of the waveform at each station with those of its nearby stations. So far we have detected more than 5,000 events from 06/22/2016 to 07/20/2016, and the majority of them are not listed in the regional catalog of Oklahoma or in global catalogs, indicating that they are local events. We also identify 15-20 long-period long-duration events, some of them lasting for more than 500 s. Such events have been found at major plate-boundary faults (also known as deep tectonic tremor), as well as during hydraulic fracturing and at slow-moving landslides and glaciers. Our next step is to locate these possible tremor-like events with their relative arrival times across the array and compare their occurrence times with solid-earth tides and injection histories to better understand their driving mechanisms.

  20. On the feasibility of using satellite gravity observations for detecting large-scale solid mass transfer events

    Science.gov (United States)

    Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros

    2017-10-01

    The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in |0.4| and |0.18| mGal changes in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.

  1. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    Science.gov (United States)

    Van Strien, Jan W.; Isbell, Lynne A.

    2017-01-01

    Studies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to partially exposed snake models and scale patterns on the snake skin. Here, we examined whether snake skin patterns and partially exposed snakes elicit a larger EPN in humans. In Task 1, we employed pictures with close-ups of snake skins, lizard skins, and bird plumage. In Task 2, we employed pictures of partially exposed snakes, lizards, and birds. Participants watched a random rapid serial visual presentation of these pictures. The EPN was scored as the mean activity (225–300 ms after picture onset) at occipital and parieto-occipital electrodes. Consistent with previous studies, and with the Snake Detection Theory, the EPN was significantly larger for snake skin pictures than for lizard skin and bird plumage pictures, and for lizard skin pictures than for bird plumage pictures. Likewise, the EPN was larger for partially exposed snakes than for partially exposed lizards and birds. The results suggest that the EPN snake effect is partly driven by snake skin scale patterns which are otherwise rare in nature. PMID:28387376

  2. Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets

    Science.gov (United States)

    Bazzica, A.; van Gemert, J. C.; Liem, C. C. S.; Hanjalic, A.

    2017-05-01

    Acoustic events often have a visual counterpart. Knowledge of visual information can aid the understanding of complex auditory scenes, even when only a stereo mixdown is available in the audio domain, e.g., identifying which musicians are playing in large musical ensembles. In this paper, we consider a vision-based approach to note onset detection. As a case study we focus on challenging, real-world clarinetist videos and carry out preliminary experiments on a 3D convolutional neural network based on multiple streams and purposely avoiding temporal pooling. We release an audiovisual dataset with 4.5 hours of clarinetist videos together with cleaned annotations which include about 36,000 onsets and the coordinates for a number of salient points and regions of interest. By performing several training trials on our dataset, we learned that the problem is challenging. We found that the CNN model is highly sensitive to the optimization algorithm and hyper-parameters, and that treating the problem as binary classification may prevent the joint optimization of precision and recall. To encourage further research, we publicly share our dataset, annotations and all models and detail which issues we came across during our preliminary experiments.

  3. First Satellite-detected Perturbations of Outgoing Longwave Radiation Associated with Blowing Snow Events over Antarctica

    Science.gov (United States)

    Yang, Yuekui; Palm, Stephen P.; Marshak, Alexander; Wu, Dong L.; Yu, Hongbin; Fu, Qiang

    2014-01-01

    We present the first satellite-detected perturbations of the outgoing longwave radiation (OLR) associated with blowing snow events over the Antarctic ice sheet using data from Cloud-Aerosol Lidar with Orthogonal Polarization and Clouds and the Earth's Radiant Energy System. Significant cloud-free OLR differences are observed between the clear and blowing snow sky, with the sign and magnitude depending on season and time of day. During nighttime, OLRs are usually larger when blowing snow is present; the average difference in OLR between skies without and with blowing snow over the East Antarctic Ice Sheet is about 5.2 W/m2 for the winter months of 2009. During daytime, in contrast, the OLR perturbation is usually smaller or even has the opposite sign. The observed seasonal variations and day-night differences in the OLR perturbation are consistent with theoretical calculations of the influence of blowing snow on OLR. Detailed atmospheric profiles are needed to quantify the radiative effect of blowing snow from the satellite observations.

  4. Real Time Robot Soccer Game Event Detection Using Finite State Machines with Multiple Fuzzy Logic Probability Evaluators

    Directory of Open Access Journals (Sweden)

    Elmer P. Dadios

    2009-01-01

    Full Text Available This paper presents a new algorithm for real-time event detection using Finite State Machines with multiple Fuzzy Logic Probability Evaluators (FLPEs). A machine referee for a robot soccer game is developed and is used as the platform to test the proposed algorithm. A novel technique to detect collisions and other events in a microrobot soccer game under inaccurate and insufficient information is presented. Robot collisions are used to determine goalkeeper-charging and goal-score events, which are crucial for the machine referee's decisions. The Main State Machine (MSM) handles the schedule of event activation. The FLPEs calculate the probabilities of the true occurrence of the events. Final decisions about the occurrences of events are made by comparing the calculated probabilities against crisp threshold values. The outputs of FLPEs can be combined to calculate the probability of an event composed of subevents. By using multiple fuzzy logic systems, each FLPE utilizes a minimal number of rules and can be tuned individually. Experimental results show the accuracy and robustness of the proposed algorithm.
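
    To illustrate how subevent probabilities from fuzzy evaluators can be combined and thresholded, here is a toy sketch; the membership functions, the product combination rule, and the threshold are hypothetical choices, not the paper's actual rule base.

```python
# Toy sketch of combining fuzzy probability evaluators: each subevent
# evaluator maps crisp measurements to a probability via simple membership
# functions, and a composite event is accepted when the combined probability
# crosses a threshold. Shapes, combination rule, and threshold are
# illustrative assumptions only.

def ramp(x, lo, hi):
    """Piecewise-linear membership: 0 below lo, 1 above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def collision_prob(distance_cm, rel_speed_cm_s):
    near = 1.0 - ramp(distance_cm, 2.0, 10.0)   # closer -> more collision-like
    fast = ramp(rel_speed_cm_s, 5.0, 40.0)      # faster -> more collision-like
    return near * fast                           # product as a simple AND

def goal_score_prob(ball_in_goal_area, collision_p):
    # Composite event built from a subevent probability and a crisp cue.
    return (0.9 if ball_in_goal_area else 0.1) * (1.0 - collision_p)

p_collision = collision_prob(distance_cm=3.0, rel_speed_cm_s=30.0)
p_goal = goal_score_prob(ball_in_goal_area=True, collision_p=p_collision)
THRESHOLD = 0.5
print(f"collision p={p_collision:.2f}, goal p={p_goal:.2f}, "
      f"goal event: {p_goal > THRESHOLD}")
```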

  5. Sample Size Estimation for Negative Binomial Regression Comparing Rates of Recurrent Events with Unequal Follow-Up Time.

    Science.gov (United States)

    Tang, Yongqiang

    2015-01-01

    A sample size formula is derived for negative binomial regression for the analysis of recurrent events, in which subjects can have unequal follow-up time. We obtain sharp lower and upper bounds on the required size, which are easy to compute. The upper bound is generally only slightly larger than the required size, and hence can be used to approximate the sample size. The lower and upper size bounds can be decomposed into two terms. The first term relies on the mean number of events in each group, and the second term depends on two factors that measure, respectively, the extent of between-subject variability in event rates, and follow-up time. Simulation studies are conducted to assess the performance of the proposed method. An application of our formulae to a multiple sclerosis trial is provided.
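
    To show the general structure of such formulae, the sketch below implements a commonly used normal-approximation sample size for comparing two negative binomial event rates (a mean-events term plus a dispersion term), assuming equal follow-up and 1:1 allocation; this is a generic approximation in the same spirit, not the paper's exact bounds, and the example numbers are illustrative.

```python
# Generic normal-approximation sample size per group for comparing two
# negative binomial event rates with equal follow-up t and 1:1 allocation.
# A standard approximation with a mean-count term plus a dispersion term;
# NOT the paper's exact formula. Numbers below are purely illustrative.
from math import log, ceil
from scipy.stats import norm

def nb_sample_size(rate0, rate1, t, dispersion, alpha=0.05, power=0.9):
    mu0, mu1 = rate0 * t, rate1 * t          # expected events per subject
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    var_term = 1 / mu0 + 1 / mu1 + 2 * dispersion
    return ceil(z**2 * var_term / log(rate1 / rate0)**2)

# Example: reduce an annual relapse rate from 0.8 to 0.6 over 2 years,
# with dispersion (between-subject heterogeneity) 0.5:
print(nb_sample_size(rate0=0.8, rate1=0.6, t=2.0, dispersion=0.5))
```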

  6. Do co-intoxicants increase adverse event rates in the first 24 hours in patients resuscitated from acute opioid overdose?

    Science.gov (United States)

    Mirakbari, Seyed Mostafa; Innes, Grant D; Christenson, Jim; Tilley, Jessica; Wong, Hubert

    2003-01-01

    Patients frequently arrive in emergency departments (EDs) after being resuscitated from opioid overdose. Autopsy studies suggest that multidrug intoxication is a major risk factor for adverse outcomes after acute heroin overdose in patients. If this is true, there may be high-risk drug combinations that identify patients who require more intensive monitoring and prolonged observation. Our objective was to determine the impact of co-intoxication with alcohol, cocaine, or CNS depressant drugs on short-term adverse event rates in patients resuscitated from acute opioid overdose. Data were extracted from the database of a prospective opioid overdose cohort study conducted between May 1997 and 1999. Patients were prospectively enrolled if they received naloxone for presumed opioid overdose. Investigators gathered clinical, demographic, and other predictor variables, including co-intoxicants used. Patients were followed to identify prespecified adverse outcome events occurring within 24 h, and multiple logistic regression was used to determine the association of concomitant drug use on short-term adverse event rates. Of 1155 patients studied, 58 (5%) had pure opioid overdose and 922 (80%) reported co-intoxicants, including alcohol, cocaine, and CNS depressants. Overall, out of 1056 patients with known outcome status there were 123 major adverse events (11.6%) and 194 minor adverse events (18.4%). After adjustment for age, gender, HIV status, cardiovascular disease, pulmonary disease and diabetes, we found that coadministration of alcohol, cocaine, or CNS depressants, alone or in combination, was not associated with increased risk of death or adverse events during the 24 h follow-up period. In patients resuscitated from acute opioid overdose, short-term outcomes are similar for patients with pure opioid overdose and multidrug intoxications. A history of cointoxication cannot be used to identify high-risk patients who require more intensive ED monitoring or prolonged

  7. Age-specific vibrissae growth rates: a tool for determining the timing of ecologically important events in Steller sea lions

    Science.gov (United States)

    Rea, L.D.; Christ, A.M.; Hayden, A.B.; Stegall, V.K.; Farley, S.D.; Stricker, Craig A.; Mellish, J.E.; Maniscalco, John M.; Waite, J.N.; Burkanov, V.N.; Pitcher, K.W.

    2015-01-01

    Steller sea lions (SSL; Eumetopias jubatus) grow their vibrissae continually, providing a multiyear record suitable for ecological and physiological studies based on stable isotopes. An accurate age-specific vibrissae growth rate is essential for registering a chronology along the length of the record, and for interpreting the timing of ecologically important events. We utilized four methods to estimate the growth rate of vibrissae in fetal, rookery pup, young-of-the-year (YOY), yearling, subadult, and adult SSL. The majority of vibrissae were collected from SSL live-captured in Alaska and Russia between 2000 and 2013 (n = 1,115), however, vibrissae were also collected from six adult SSL found dead on haul-outs and rookeries during field excursions to increase the sample size of this underrepresented age group. Growth rates of vibrissae were generally slower in adult (0.44 ± 0.15 cm/mo) and subadult (0.61 ± 0.10 cm/mo) SSL than in YOY (0.87 ± 0.28 cm/mo) and fetal (0.73 ± 0.05 cm/mo) animals, but there was high individual variability in these growth rates within each age group. Some variability in vibrissae growth rates was attributed to the somatic growth rate of YOY sea lions between capture events (P = 0.014, r2 = 0.206, n = 29).

  8. Semiquantitative SPECT myocardial perfusion with dipyridamole in patients unable to exercise. Event rate during 4 years of follow up

    International Nuclear Information System (INIS)

    Balestrini, V.R.; Arja, V.J.; Sandrin, A.L.; Calvo, G.R.; Gomez Bosh, Z.; Quiroga, W.; Balestrini, C.E.; Joekes, S.

    2002-01-01

    The increasing number of patients (P) who cannot reach an adequate level of exercise for the evaluation of CAD leads us to use the pharmacological and technical tools available for this subgroup of P. Aim: to evaluate the prognostic significance of myocardial perfusion SPECT imaging with pharmacological stress in P without LBBB, unable to exercise. Material and Methods: 209 P were included. Mean age: 65 years old (39-88), male 66%. Clinical: Pre test likelihood 8: 28%; SDS 0 and SRS 0: 31.7%; SDS 0 + SRS >=1: 21%; SDS >=1: 47.3%. III) Follow up: 13 patients underwent early revascularization prompted by the SPECT study results, 10 patients were lost to follow-up, and 186 were followed up for a mean of 1086 days. Cumulative event rates: 1st year SCE 9.7%, HCE 1.6%; 2nd year SCE 14%, HCE 4.3%; 3rd year SCE 17.7%, HCE 5.4%; 4th year SCE 21%, HCE 5.4%. The relationship between scintigraphic indices and event rates is presented. Conclusion: There was a relationship between scintigraphic indices and hard cardiac events. Semiquantitative myocardial perfusion imaging with dipyridamole stress was a safe test and useful for discriminating groups of P with different risks of events

  9. Boarding is associated with higher rates of medication delays and adverse events but fewer laboratory-related delays.

    Science.gov (United States)

    Sri-On, Jiraporn; Chang, Yuchiao; Curley, David P; Camargo, Carlos A; Weissman, Joel S; Singer, Sara J; Liu, Shan W

    2014-09-01

    Hospital crowding and emergency department (ED) boarding are large and growing problems. To date, there has been a paucity of information regarding the quality of care received by patients boarding in the ED compared with the care received by patients on an inpatient unit. We compared the rate of delays and adverse events at the event level that occur while boarding in the ED vs while on an inpatient unit. This study was a secondary analysis of data from medical record review and administrative databases at 2 urban academic teaching hospitals from August 1, 2004, through January 31, 2005. We measured delayed repeat cardiac enzymes, delayed partial thromboplastin time level checks, delayed antibiotic administration, delayed administration of home medications, and adverse events. We compared the incidence of events during ED boarding vs while on an inpatient unit. Among 1431 patient medical records, we identified 1016 events. Emergency department boarding was associated with an increased risk of home medication delays (risk ratio [RR], 1.54; 95% confidence interval [CI], 1.26-1.88), delayed antibiotic administration (RR, 2.49; 95% CI, 1.72-3.52), and adverse events (RR, 2.36; 95% CI, 1.15-4.72). On the contrary, ED boarding was associated with fewer delays in repeat cardiac enzymes (RR, 0.17; 95% CI, 0.09-0.27) and delayed partial thromboplastin time checks (RR, 0.54; 95% CI, 0.27-0.96). Compared with inpatient units, ED boarding was associated with more medication-related delays and adverse events but fewer laboratory-related delays. Until we can eliminate ED boarding, it is critical to identify areas for improvement. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Event-Associated Oxygen Consumption Rate Increases ca. Five-Fold When Interictal Activity Transforms into Seizure-Like Events In Vitro

    Directory of Open Access Journals (Sweden)

    Karl Schoknecht

    2017-09-01

    Full Text Available Neuronal injury due to seizures may result from a mismatch of energy demand and adenosine triphosphate (ATP) synthesis. However, ATP demand and oxygen consumption rates have not yet been accurately determined for different patterns of epileptic activity, such as interictal and ictal events. We studied interictal-like and seizure-like epileptiform activity induced by the GABAA antagonist bicuculline alone, and with co-application of the M-current blocker XE-991, in rat hippocampal slices. Metabolic changes were investigated based on recording partial oxygen pressure, extracellular potassium concentration, and intracellular flavin adenine dinucleotide (FAD) redox potential. Recorded data were used to calculate oxygen consumption and relative ATP consumption rates, cellular ATP depletion, and changes in the FAD/FADH2 ratio by applying a reaction-diffusion model and a two-compartment metabolic model. Oxygen-consumption rates were ca. five times higher during seizure activity than during interictal activity. Additionally, ATP consumption was higher during seizure activity (~94% above control) than during interictal activity (~15% above control). Modeling of FAD transients based on partial pressure of oxygen recordings confirmed increased energy demand during both seizure and interictal activity and predicted actual FAD autofluorescence recordings, thereby validating the model. Quantifying metabolic alterations during epileptiform activity has translational relevance as it may help to understand the contribution of energy supply and demand mismatches to seizure-induced injury.

  11. On the merging rates of envelope-deprived components of binary systems which can give rise to supernova events

    International Nuclear Information System (INIS)

    Tornambè, Amedeo

    1989-01-01

    We derive theoretical rates of mergings of envelope-deprived components of binary systems, which can give rise to supernova events. The effects of the various assumptions one is forced to make on the physical properties of the progenitor system and of its evolutionary behaviour through common envelope phases are discussed. Four cases have been analysed: CO-CO, He-CO, He-He double degenerate mergings and He star-CO dwarf merging. (author)

  12. Exploring similarities and differences in hospital adverse event rates between Norway and Sweden using Global Trigger Tool

    OpenAIRE

    Deilkås, Ellen Tveter; Risberg, Madeleine Borgstedt; Haugen, Marion; Lindström, Jonas Christoffer; Nylén, Urban; Rutberg, Hans; Soop, Michael

    2017-01-01

    Objectives: In this paper, we explore similarities and differences in hospital adverse event (AE) rates between Norway and Sweden by reviewing medical records with the Global Trigger Tool (GTT). Design: All acute care hospitals in both countries performed medical record reviews, except one in Norway. Records were randomly selected from all eligible admissions in 2013. Eligible admissions were patients 18 years of age or older, undergoing care with an in-hospital stay of at least 24 hours, exc...

  13. Development and validation of a Thai stressful life events rating scale for patients with a diagnosis of schizophrenic methamphetamine abuse

    OpenAIRE

    Ek-uma Imkome; Jintana Yunibhand; Waraporn Chaiyawat

    2017-01-01

    This study aimed to psychometrically test a Thai Stressful Life Events Rating Scale (TSLERS). Factor analysis was done on data collected from 313 patients with schizophrenia and methamphetamine abuse in Thailand from April to May, 2015. Results identified the following problems impacting physical and mental health: social relationship and social concerns, money, family life, life security, and career. Evaluation of the psychometric scale properties demonstrated acceptable validity ...

  14. A Compound Detection System Based on Ultrasonic Flow Rate and Concentration

    Directory of Open Access Journals (Sweden)

    Qing-Hui WANG

    2014-02-01

    Full Text Available This paper proposes a new detection system for monitoring gas concentration and flow rate. The velocity difference of an ultrasonic wave propagated bi-directionally through the measured gas is recorded and used to compute gas concentration and flow rate online. Temperature compensation, return-signal processing, and error analysis algorithms are applied to improve the accuracy. The experimental results show that, compared with single-sensor measurement of gas flow rate or concentration, the proposed detection system offers lower cost and higher accuracy and can be applied where simultaneous monitoring of gas concentration and flow rate is needed.
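
    As background for the bi-directional propagation principle described above, the sketch below shows the standard transit-time relations: flow velocity from the difference of the downstream and upstream travel times, and the speed of sound (often used as a proxy for gas concentration) from the same pair of times; the path geometry and numbers are hypothetical, not the paper's hardware.

```python
# Minimal sketch of the transit-time principle behind bi-directional
# ultrasonic measurement: downstream/upstream travel times give both the
# flow velocity and the speed of sound (the latter often serving as a
# proxy for gas concentration). Path length, angle-free geometry, and the
# example times are hypothetical, not the paper's actual hardware.

def transit_time_analysis(t_down, t_up, path_length):
    """t_down, t_up: travel times (s) with and against the flow;
    path_length: acoustic path length (m), assumed parallel to the flow."""
    v_flow = 0.5 * path_length * (1.0 / t_down - 1.0 / t_up)   # m/s
    c_sound = 0.5 * path_length * (1.0 / t_down + 1.0 / t_up)  # m/s
    return v_flow, c_sound

# Example: 0.20 m path, sound speed ~343 m/s, flow ~5 m/s
t_down = 0.20 / (343.0 + 5.0)
t_up = 0.20 / (343.0 - 5.0)
v, c = transit_time_analysis(t_down, t_up, path_length=0.20)
print(f"flow velocity ~ {v:.2f} m/s, speed of sound ~ {c:.1f} m/s")
```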

  15. Time-Frequency Analysis of Terahertz Radar Signals for Rapid Heart and Breath Rate Detection

    National Research Council Canada - National Science Library

    Massar, Melody L

    2008-01-01

    We develop new time-frequency analytic techniques which facilitate the detection of a person's heart and breath rates from the Doppler shift the movement of their body induces in a terahertz radar signal...

  16. A new method to detect event-related potentials based on Pearson's correlation.

    Science.gov (United States)

    Giroldini, William; Pederzoli, Luciano; Bilucaglia, Marco; Melloni, Simone; Tressoldi, Patrizio

    2016-12-01

    Event-related potentials (ERPs) are widely used in brain-computer interface applications and in neuroscience. Normal EEG activity is rich in background noise, and therefore, in order to detect ERPs, it is usually necessary to take the average from multiple trials to reduce the effects of this noise. The noise produced by EEG activity itself is not correlated with the ERP waveform and so, by calculating the average, the noise is decreased by a factor inversely proportional to the square root of N, where N is the number of averaged epochs. This is the easiest strategy currently used to detect ERPs, which is based on averaging all the ERP waveforms, these waveforms being time- and phase-locked. In this paper, a new method called GW6 is proposed, which calculates the ERP using a mathematical method based only on Pearson's correlation. The result is a graph with the same time resolution as the classical ERP and which shows only positive peaks representing the increase, in consonance with the stimuli, in EEG signal correlation over all channels. This new method is also useful for selectively identifying and highlighting some hidden components of the ERP response that are not phase-locked, and that are usually hidden in the standard and simple method based on the averaging of all the epochs. These hidden components seem to be caused by variations (between each successive stimulus) of the ERP's inherent phase latency period (jitter), although the same stimulus across all EEG channels produces a reasonably constant phase. For this reason, this new method could be very helpful for investigating these hidden components of the ERP response and for developing applications for scientific and medical purposes. Moreover, this new method is more resistant to EEG artifacts than the standard calculation of the average and could be very useful in research and neurology. The method we are proposing can be directly used in the form of a process written in the well
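
    As a reminder of the 1/sqrt(N) averaging argument above, this small sketch simulates a fixed, phase-locked ERP buried in uncorrelated EEG-like noise and shows the residual noise after averaging shrinking roughly with the square root of the number of epochs; the waveform shape and noise level are synthetic assumptions.

```python
# Minimal sketch of the classical averaging argument: uncorrelated background
# noise shrinks roughly as 1/sqrt(N) when N time-locked epochs are averaged,
# while the phase-locked ERP survives. Waveform and noise are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 0.6, 300)                       # 600 ms epoch
erp = 5e-6 * np.exp(-((t - 0.3) / 0.05) ** 2)      # a 5 uV "P300-like" bump

def residual_noise(n_epochs, noise_uv=20e-6):
    epochs = erp + noise_uv * rng.standard_normal((n_epochs, t.size))
    return np.std(epochs.mean(axis=0) - erp)       # noise left after averaging

for n in (1, 16, 64, 256):
    print(f"N={n:4d}  residual noise ~ {residual_noise(n)*1e6:.2f} uV "
          f"(expected ~ {20/np.sqrt(n):.2f} uV)")
```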

  17. Deficits in cue detection underlie event-based prospective memory impairment in major depression: an eye tracking study.

    Science.gov (United States)

    Chen, Siyi; Zhou, Renlai; Cui, Hong; Chen, Xinyin

    2013-10-30

    This study examined the cue detection in the non-focal event-based prospective memory (PM) of individuals with and without a major depressive disorder using behavioural and eye tracking assessments. The participants were instructed to search on each trial for a different target stimulus that could be present or absent and to make prospective responses to the cue object. PM tasks included cue only and target plus cue, whereas ongoing tasks included target only and distracter only. The results showed that a) participants with depression performed more poorly than those without depression in PM; b) participants with depression showed more fixations and longer total and average fixation durations in both ongoing and PM conditions; c) participants with depression had lower scores on accuracy in target-plus-cue trials than in cue-only trials and had a higher gaze rate of targets on hits and misses in target-plus-cue trials than did those without depression. The results indicate that the state of depression may impair top-down cognitive control function, which in turn results in particular deficits in the engagement of monitoring for PM cues. Copyright © 2013. Published by Elsevier Ireland Ltd.

  18. A secure distributed logistic regression protocol for the detection of rare adverse drug events.

    Science.gov (United States)

    El Emam, Khaled; Samet, Saeed; Arbuckle, Luk; Tamblyn, Robyn; Earle, Craig; Kantarcioglu, Murat

    2013-05-01

    There is limited capacity to assess the comparative risks of medications after they enter the market. For rare adverse events, the pooling of data from multiple sources is necessary to have the power and sufficient population heterogeneity to detect differences in safety and effectiveness in genetic, ethnic and clinically defined subpopulations. However, combining datasets from different data custodians or jurisdictions to perform an analysis on the pooled data creates significant privacy concerns that would need to be addressed. Existing protocols for addressing these concerns can result in reduced analysis accuracy and can allow sensitive information to leak. To develop a secure distributed multi-party computation protocol for logistic regression that provides strong privacy guarantees. We developed a secure distributed logistic regression protocol using a single analysis center with multiple sites providing data. A theoretical security analysis demonstrates that the protocol is robust to plausible collusion attacks and does not allow the parties to gain new information from the data that are exchanged among them. The computational performance and accuracy of the protocol were evaluated on simulated datasets. The computational performance scales linearly as the dataset sizes increase. The addition of sites results in an exponential growth in computation time. However, for up to five sites, the time is still short and would not affect practical applications. The model parameters are the same as the results on pooled raw data analyzed in SAS, demonstrating high model accuracy. The proposed protocol and prototype system would allow the development of logistic regression models in a secure manner without requiring the sharing of personal health information. This can alleviate one of the key barriers to the establishment of large-scale post-marketing surveillance programs. We extended the secure protocol to account for correlations among patients within sites through
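
    For orientation, the sketch below shows only the non-secure skeleton of distributed logistic regression with one analysis center and several data-providing sites (local gradients aggregated centrally); the secure multi-party computation layer that the protocol above actually contributes is deliberately omitted, and the data are synthetic.

```python
# Toy skeleton of distributed logistic regression with one analysis center
# and several data-holding sites: each site computes a local gradient and
# the center aggregates. This deliberately OMITS the secure multi-party
# computation layer that the protocol above is about; it only shows the
# distributed-fitting structure. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])

def make_site(n):
    X = rng.normal(size=(n, 3))
    y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)
    return X, y

sites = [make_site(200) for _ in range(3)]   # three data custodians

def local_gradient(w, X, y):
    p = 1 / (1 + np.exp(-X @ w))
    return X.T @ (p - y)                     # gradient of the local NLL

w = np.zeros(3)
n_total = sum(len(X) for X, _ in sites)
for _ in range(2000):                        # plain gradient descent
    grad = sum(local_gradient(w, X, y) for X, y in sites)   # aggregation
    w -= 2.0 * grad / n_total
print("estimated coefficients:", np.round(w, 2))   # roughly [1, -2, 0.5]
```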

  19. Pacific Northwest geomorphology and hydrology: rates and probabilities of selected processes and events

    International Nuclear Information System (INIS)

    Tubbs, D.W.

    1979-01-01

    This report presents results of one of the geomorphological and hydrological studies that have been conducted for the release scenario analysis of the Waste Isolation Safety Assessment Program (WISAP). Three general topics are considered: (1) determination of rates of denudation, (2) estimation of the probability of flooding due to each of several causes, and (3) evaluation of other surface processes that should be considered in the release scenario analysis. The third general topic was ultimately narrowed to the possible effects of landsliding. Rates of erosion are expressed as centimeters per 100 years, except that the original units are retained in figures taken from other sources. Probabilities are also expressed per 100 years

  20. Enriched Encoding: Reward Motivation Organizes Cortical Networks for Hippocampal Detection of Unexpected Events

    OpenAIRE

    Murty, Vishnu P.; Adcock, R. Alison

    2013-01-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expec...

  1. Adaptive heart rate-based epileptic seizure detection using real-time user feedback

    DEFF Research Database (Denmark)

    De Cooman, Thomas; Kjær, Troels Wesenberg; Van Huffel, Sabine

    2017-01-01

    Automated seizure detection in a home environment has been of increased interest over the last couple of decades. Heart rate-based seizure detection is a way to detect temporal lobe epilepsy seizures at home, but patient-independent algorithms have been shown to be insufficiently accurate due to the high patient...... with incorrect user feedback, making it ideal for implementation in a home environment for a seizure warning system....

  2. Glaucoma progression detection by retinal nerve fiber layer measurement using scanning laser polarimetry: event and trend analysis.

    Science.gov (United States)

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S

    2012-06-01

    To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) to measure the retinal nerve fiber layer (RNFL) thickness in order to evaluate the progression of glaucoma. Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA, linear regression analysis of the visual field index). SLP progression was defined as a reduction of RNFL exceeding the predetermined repeatability coefficient in three consecutive exams, as compared to the baseline measure (EA). The slope of RNFL thickness change over time was determined by linear regression analysis (TA). Twenty-two eyes (14.5%) progressed according to the VF EA, 16 (10.5%) by VF TA, 37 (24.3%) by SLP EA and 19 (12.5%) by SLP TA. Agreement between VF and SLP progression was poor in both EA and TA (VF EA vs. SLP EA, k = 0.110; VF TA vs. SLP TA, k = 0.129). The mean (±standard deviation) progression rate of RNFL thickness as measured by SLP TA did not significantly differ between VF EA progressors and non-progressors (-0.224 ± 0.148 µm/yr vs. -0.218 ± 0.151 µm/yr, p = 0.874). SLP TA and EA showed similar levels of sensitivity when VF progression was considered as the reference standard. RNFL thickness as measurement by SLP was shown to be capable of detecting glaucoma progression. Both EA and TA of SLP showed poor agreement with VF outcomes in detecting glaucoma progression.
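
    To make the two progression criteria above concrete, the sketch below applies an event-style check (loss from baseline exceeding a repeatability coefficient on three consecutive exams) and a trend-style linear-regression slope to a synthetic RNFL series; the series and the repeatability coefficient are illustrative values, not data from the study.

```python
# Minimal sketch of the two progression criteria described above, applied to
# a series of RNFL thickness measurements: "event analysis" flags a loss from
# baseline exceeding the repeatability coefficient on three consecutive
# exams, and "trend analysis" fits a linear slope over time. Values are
# illustrative only.
import numpy as np

def event_progression(rnfl, repeatability_coeff):
    """rnfl: thickness per exam (um), first value = baseline."""
    loss = rnfl[0] - np.asarray(rnfl[1:])
    flagged = loss > repeatability_coeff
    # three consecutive flagged follow-up exams
    return any(all(flagged[i:i + 3]) for i in range(len(flagged) - 2))

def trend_slope(years, rnfl):
    """Linear-regression slope of RNFL thickness (um/year)."""
    return np.polyfit(years, rnfl, 1)[0]

years = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
rnfl = np.array([55.0, 54.6, 54.1, 53.3, 52.8, 52.4, 51.9, 51.6, 51.1])
print(event_progression(rnfl, repeatability_coeff=2.5))   # True here
print(f"{trend_slope(years, rnfl):.2f} um/year")          # ~ -1.0 um/year
```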

  3. Minimum Symbol Error Rate Detection in Single-Input Multiple-Output Channels with Markov Noise

    DEFF Research Database (Denmark)

    Christensen, Lars P.B.

    2005-01-01

    Minimum symbol error rate detection in Single-Input Multiple-Output (SIMO) channels with Markov noise is presented. The special case of zero-mean Gauss-Markov noise is examined more closely, as it only requires knowledge of the second-order moments. In this special case, it is shown that optimal detection...

  4. Contemporary statistical procedures (Parametric Empirical Bayes) and nuclear plant event rates

    International Nuclear Information System (INIS)

    Gaver, D.P.; Worledge, D.H.

    1985-01-01

    The conduct of a nuclear power plant probabilistic risk assessment (PRA) recognizes that each of a great many vital components and systems is subject to failure. One aspect of the PRA procedure is to quantify individual item failure propensity, often in terms of the failure rate parameter of an exponential distribution or Poisson process, and then to combine rates so as to effectively infer the probability of plant failure, e.g., core damage. The formal method of combination of such rates involves use of fault-tree analysis. The defensibility of the final fault-tree result depends both upon the adequacy of the failure representations of its components, and upon the correctness and inclusiveness of the fault tree logic. This paper focuses upon the first issue, in particular, upon contemporary proposals for deriving estimates of individual rates. The purpose of the paper is to present, in basically non-mathematical terms, the essential nature of some of these proposals, and an assessment of how they might fit into, and contribute positively to, a more defensible or trustworthy PRA process
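
    To illustrate the flavour of parametric empirical Bayes rate estimation referred to above, the sketch below shrinks component-specific Poisson failure rates toward a gamma prior fitted to the fleet by a simple method of moments; the counts, exposures, and the moment fit are illustrative choices, not the specific procedure the paper discusses.

```python
# Minimal sketch of a gamma-Poisson (parametric empirical Bayes) shrinkage
# estimate of component failure rates: fit a gamma prior to the fleet by a
# simple method of moments, then combine it with each item's own counts.
# Counts/exposures are illustrative; the moment fit is a common shortcut,
# not necessarily the estimation procedure discussed in the paper.
import numpy as np

failures = np.array([0, 1, 3, 0, 2, 5, 1, 0])          # events per component
exposure = np.array([4., 6., 5., 3., 8., 7., 5., 2.])  # reactor-years

crude = failures / exposure
m, v = crude.mean(), crude.var(ddof=1)   # assumes v > 0 across the fleet
beta = m / v                             # gamma prior: mean m = alpha/beta,
alpha = m * beta                         # variance v = alpha/beta^2

posterior_mean = (alpha + failures) / (beta + exposure)
for f, t, c, p in zip(failures, exposure, crude, posterior_mean):
    print(f"x={f} T={t:>4}  crude={c:.3f}  EB={p:.3f}")
```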

  5. The necessity of recognizing all events in X-ray detection.

    Science.gov (United States)

    Papp, T; Maxwell, J A; Papp, A T

    2010-01-01

    In our work studying the properties of inner-shell ionization, we are troubled that the experimental data used to determine the basic parameters of X-ray physics have a large and unexplainable scatter. As we looked into the problems, we found that many of them contradict simple logic, elementary arithmetic, and even parity and angular momentum conservation laws. We have identified that the main source of the problems, other than the human factor, is rooted in the signal processing electronics. To overcome these problems we have developed a fully digital signal processor, which not only has excellent resolution and line shape, but also allows proper accounting of all events. This is achieved by processing all events and separating them into two or more spectra (maximum 16), where the first spectrum is the accepted or good spectrum and the second spectrum is the spectrum of all rejected events. The availability of all the events allows one to see the other part of the spectrum. To our surprise, the total information explains many of the shortcomings and contradictions of the X-ray database. The data processing methodology cannot be established on the partial and fractional information offered by other approaches. Comparing Monte Carlo detector modeling results with the partial spectra is ambiguous. It suggests that the metrology of calibration by radioactive sources, as well as other X-ray measurements, could be improved by the availability of the proper accounting of all events. It is not enough to know that an event was rejected and to increment the input counter; it is necessary to know what was rejected and why it happened: whether it was noise, a disturbed event, a retarded event, a true event, or any pile-up combination of these events. Such information is supplied by our processor reporting the events rejected by each discriminator in separate spectra. Several industrial applications of this quality-assurance-capable signal processor are presented. Copyright 2009

  6. Individual differences in heart rate variability are associated with the avoidance of negative emotional events.

    Science.gov (United States)

    Katahira, Kentaro; Fujimura, Tomomi; Matsuda, Yoshi-Taka; Okanoya, Kazuo; Okada, Masato

    2014-12-01

    Although the emotional outcome of a choice generally affects subsequent decisions, humans can inhibit the influence of emotion. Heart rate variability (HRV) has emerged as an objective measure of individual differences in the capacity for inhibitory control. In the present study, we investigated how individual differences in HRV at rest are associated with the emotional effects of the outcome of a choice on subsequent decision making using a decision-making task in which emotional pictures appeared as decision outcomes. We used a reinforcement learning model to characterize the observed behaviors according to several parameters, namely, the learning rate and the motivational value of positive and negative pictures. Consequently, we found that individuals with a lower resting HRV exhibited a greater negative motivational value in response to negative pictures, suggesting that these individuals tend to avoid negative pictures compared with individuals with a higher resting HRV. Copyright © 2014 Elsevier B.V. All rights reserved.
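
    To make the model-based description above concrete, here is a minimal, hypothetical Rescorla-Wagner-style sketch with a learning rate and separate motivational values for positive and negative pictures; the parameter values, the two-option task, and the softmax choice rule are illustrative assumptions, not the study's fitted model. Making r_neg more negative, as suggested for lower resting HRV, increases avoidance of the option that more often yields negative pictures.

```python
# Minimal sketch of the kind of reinforcement-learning model described above:
# a Rescorla-Wagner/Q-learning update with a learning rate and separate
# motivational values for positive and negative picture outcomes. Parameters
# and the choice rule are illustrative, not the study's fitted model.
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.3                # learning rate
r_pos, r_neg = 1.0, -1.5   # motivational values of positive/negative pictures
beta = 3.0                 # softmax inverse temperature
p_neg = [0.2, 0.6]         # hypothetical chance of a negative picture per option

q = np.zeros(2)            # learned values of the two options
choices = []
for _ in range(200):
    p_choose_1 = 1 / (1 + np.exp(-beta * (q[1] - q[0])))   # softmax, 2 options
    c = int(rng.uniform() < p_choose_1)
    outcome = r_neg if rng.uniform() < p_neg[c] else r_pos  # emotional outcome
    q[c] += alpha * (outcome - q[c])                        # RW update
    choices.append(c)

print("final option values:", np.round(q, 2))
print("share of choices avoiding the riskier option:", 1 - np.mean(choices))
```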

  7. Left Ventricular Wall Stress-Mass-Heart Rate Product and Cardiovascular Events in Treated Hypertensive Patients

    DEFF Research Database (Denmark)

    Devereux, Richard B; Bang, Casper N; Roman, Mary J

    2015-01-01

    -varying covariate in Cox models assessing predictors of the LIFE primary composite end point (cardiovascular death, MI, or stroke), its individual components, and all-cause mortality. At baseline, the triple product in both treatment groups was, compared with normal adults, elevated in 70% of patients. During...... more, greater heart rate reduction with atenolol resulted in larger reduction of the triple product. Lower triple product during antihypertensive treatment was strongly, independently associated with lower rates of the LIFE primary composite end point, cardiovascular death, and MI, but not stroke.......In the Losartan Intervention for End Point Reduction in Hypertension (LIFE) study, 4.8 years' losartan- versus atenolol-based antihypertensive treatment reduced left ventricular hypertrophy and cardiovascular end points, including cardiovascular death and stroke. However, there was no difference...

  8. High cardiovascular event rates in patients with asymptomatic carotid stenosis: the REACH Registry

    DEFF Research Database (Denmark)

    Aichner, F T; Topakian, R; Alberts, M J

    2009-01-01

    /absence of ACAS at the time of inclusion. RESULTS: Compared with patients without ACAS (n = 30 329), patients with ACAS (n = 3164) had higher age- and sex-adjusted 1-year rates of transient ischaemic attack (3.51% vs. 1.61%, P ....26%, P = 0.04), cardiovascular death (2.29% vs. 1.52%, P = 0.002), the composite end-point cardiovascular death/myocardial infarction/stroke (6.03% vs. 4.29%, P

  9. Sovereign Rating Events and Bank Share Prices in the Italian Market

    OpenAIRE

    S. Caselli; G. Gandolfi; M. Soana

    2013-01-01

    The paper examines the spillover effect of Eurozone sovereign rating changes announced by Standard and Poor’s, Moody’s, and Fitch on domestic bank share prices in the period 2002–2012. This spillover effect appears strongly negative in the case of downgrades, but insignificant for upgrades. Bank losses following sovereign downgrades are greater during the financial crisis of 2008 and in the PIIGS countries (Portugal, Ireland, Italy, Greece, and Spain). Surprisingly, announcement of sovereign ...

  10. The tell-tale heart: heart rate fluctuations index objective and subjective events during a game of chess.

    Science.gov (United States)

    Leone, María J; Petroni, Agustín; Fernandez Slezak, Diego; Sigman, Mariano

    2012-01-01

    During a decision-making process, the body changes. These somatic changes have been related to specific cognitive events and have also been postulated to assist decision-making by indexing the possible outcomes of different options. We used chess to analyze heart rate (HR) modulations during specific cognitive events. In a chess game, players have a limited time budget to make about 40 moves (decisions) that can be objectively evaluated and retrospectively assigned to specific subjectively perceived events, such as setting a goal and the process of reaching a known goal. We show that HR signals events: it predicts the conception of a plan, the concrete analysis of variations, or the likelihood of blundering through fluctuations preceding the move, and it reflects reactions, such as a blunder made by the opponent, through fluctuations following the move. Our data demonstrate that even though HR constitutes a relatively broad marker integrating a myriad of physiological variables, its dynamics are rich enough to reveal relevant episodes of inner thought.

  11. The Detection and Correction of Bias in Student Ratings of Instruction.

    Science.gov (United States)

    Haladyna, Thomas; Hess, Robert K.

    1994-01-01

    A Rasch model was used to detect and correct bias in Likert rating scales used to assess student perceptions of college teaching, using a database of ratings. Statistical corrections were significant, supporting the model's potential utility. Recommendations are made for a theoretical rationale and further research on the model. (Author/MSE)

  12. Ten-year detection rate of brain arteriovenous malformations in a large, multiethnic, defined population.

    Science.gov (United States)

    Gabriel, Rodney A; Kim, Helen; Sidney, Stephen; McCulloch, Charles E; Singh, Vineeta; Johnston, S Claiborne; Ko, Nerissa U; Achrol, Achal S; Zaroff, Jonathan G; Young, William L

    2010-01-01

    To evaluate whether increased neuroimaging use is associated with increased brain arteriovenous malformation (BAVM) detection, we examined detection rates in the Kaiser Permanente Medical Care Program of northern California between 1995 and 2004. We reviewed medical records, radiology reports, and administrative databases to identify BAVMs, intracranial aneurysms (IAs: subarachnoid hemorrhage [SAH] and unruptured aneurysms), and other vascular malformations (OVMs: dural fistulas, cavernous malformations, Vein of Galen malformations, and venous malformations). Poisson regression (with robust standard errors) was used to test for trend. Random-effects meta-analysis generated a pooled measure of BAVM detection rate from 6 studies. We identified 401 BAVMs (197 ruptured, 204 unruptured), 570 OVMs, and 2892 IAs (2079 SAHs and 813 unruptured IAs). Detection rates per 100 000 person-years were 1.4 (95% CI, 1.3 to 1.6) for BAVMs, 2.0 (95% CI, 1.8 to 2.3) for OVMs, and 10.3 (95% CI, 9.9 to 10.7) for IAs. Neuroimaging utilization increased 12% per year during the time period (PIAs (PIAs (P4) per 100 000 person-years, without heterogeneity between studies (P=0.25). Rates for BAVMs, OVMs, and IAs in this large, multiethnic population were similar to those in other series. During 1995 to 2004, a period of increasing neuroimaging utilization, we did not observe an increased rate of detection of unruptured BAVMs, despite increased detection of unruptured IAs.
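
    As a minimal sketch of the rate arithmetic reported above, the snippet below converts a case count and person-time into a rate per 100,000 person-years with a simple Poisson confidence interval. The study itself used Poisson regression with robust standard errors; the person-time value here is a hypothetical figure chosen only so that 401 BAVMs reproduce the reported ~1.4 per 100,000 person-years.

```python
# Rate per 100,000 person-years with a simple Poisson (normal-approximation)
# 95% CI.  The person-years figure is hypothetical, chosen so that 401 BAVMs
# give back roughly the reported 1.4 per 100,000 person-years.
import math


def detection_rate(cases, person_years, per=100_000):
    rate = cases / person_years * per
    se = math.sqrt(cases) / person_years * per   # Poisson SE on the count
    return rate, (rate - 1.96 * se, rate + 1.96 * se)


rate, ci = detection_rate(cases=401, person_years=28_600_000)
print(f"{rate:.2f} per 100,000 person-years, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```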

  13. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation

    Science.gov (United States)

    Camuñas-Mesa, Luis A.; Domínguez-Cordero, Yaisel L.; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2018-01-01

    Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network. PMID:29515349
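
    The rate saturation mechanism described above enforces a minimum separation in time between consecutive events, emulating a refractory period. The sketch below shows that filtering idea on a generic (timestamp, x, y) event stream; the event format and refractory period are illustrative, not the FPGA implementation.

```python
# Illustrative refractory ("rate saturation") filter: events arriving at a
# given (x, y) address within the refractory period of the previous kept
# event at that address are dropped.
def refractory_filter(events, t_refr_us=1000):
    """events: time-sorted iterable of (timestamp_us, x, y) tuples."""
    last_kept = {}
    kept = []
    for t, x, y in events:
        if t - last_kept.get((x, y), -float("inf")) >= t_refr_us:
            kept.append((t, x, y))
            last_kept[(x, y)] = t
    return kept


stream = [(0, 3, 5), (200, 3, 5), (1500, 3, 5), (1600, 7, 2)]
print(refractory_filter(stream))  # the event at t=200 us is suppressed
```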

  14. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation

    Directory of Open Access Journals (Sweden)

    Luis A. Camuñas-Mesa

    2018-02-01

    Full Text Available Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network.

  15. A Configurable Event-Driven Convolutional Node with Rate Saturation Mechanism for Modular ConvNet Systems Implementation.

    Science.gov (United States)

    Camuñas-Mesa, Luis A; Domínguez-Cordero, Yaisel L; Linares-Barranco, Alejandro; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2018-01-01

    Convolutional Neural Networks (ConvNets) are a particular type of neural network often used for many applications like image recognition, video analysis or natural language processing. They are inspired by the human brain, following a specific organization of the connectivity pattern between layers of neurons known as receptive field. These networks have been traditionally implemented in software, but they are becoming more computationally expensive as they scale up, having limitations for real-time processing of high-speed stimuli. On the other hand, hardware implementations show difficulties to be used for different applications, due to their reduced flexibility. In this paper, we propose a fully configurable event-driven convolutional node with rate saturation mechanism that can be used to implement arbitrary ConvNets on FPGAs. This node includes a convolutional processing unit and a routing element which allows to build large 2D arrays where any multilayer structure can be implemented. The rate saturation mechanism emulates the refractory behavior in biological neurons, guaranteeing a minimum separation in time between consecutive events. A 4-layer ConvNet with 22 convolutional nodes trained for poker card symbol recognition has been implemented in a Spartan6 FPGA. This network has been tested with a stimulus where 40 poker cards were observed by a Dynamic Vision Sensor (DVS) in 1 s time. Different slow-down factors were applied to characterize the behavior of the system for high speed processing. For slow stimulus play-back, a 96% recognition rate is obtained with a power consumption of 0.85 mW. At maximum play-back speed, a traffic control mechanism downsamples the input stimulus, obtaining a recognition rate above 63% when less than 20% of the input events are processed, demonstrating the robustness of the network.

  16. Variety Is Not the Spice of Life for People with Autism Spectrum Disorders: Frequency Ratings of Central, Variable and Inappropriate Aspects of Common Real-Life Events

    Science.gov (United States)

    Loth, Eva; Happe, Francesca; Gomez, Juan Carlos

    2010-01-01

    This study used a novel rating task to investigate whether high-functioning individuals with autism spectrum disorder (ASD) have difficulties distinguishing essential from variable aspects of familiar events. Participants read stories about everyday events and judged how often central, variable, and inappropriate event-components normally occur in…

  17. Varying Definitions for Periprocedural Myocardial Infarction Alter Event Rates and Prognostic Implications

    Science.gov (United States)

    Idris, Hanan; Lo, Sidney; Shugman, Ibrahim M.; Saad, Yousef; Hopkins, Andrew P.; Mussap, Christian; Leung, Dominic; Thomas, Liza; Juergens, Craig P.; French, John K.

    2014-01-01

    Background Periprocedural myocardial infarction (PMI) has had several definitions in the last decade, including the Society for Cardiovascular Angiography and Interventions (SCAI) definition, which requires marked biomarker elevations congruent with surgical PMI criteria. Methods and Results The aim of this study was to examine the definition-based frequencies of PMI and whether they influenced the reported association between PMI and increased rates of late death/myocardial infarction (MI). We studied 742 patients; 492 (66%) had normal troponin T (TnT) levels and 250 (34%) had elevated, but stable or falling, TnT levels. PMI, using the 2007 and the 2012 universal definitions, occurred in 172 (23.2%) and in 99 (13.3%) patients, respectively, whereas 19 (2.6%) met the SCAI PMI definition. Among patients with PMI using the 2012 definition, occlusion of a side branch ≤1 mm occurred in 48 patients (48.5%) and was the most common angiographic finding for PMI. The rates of death/MI at 2 years in patients with, compared to those without, PMI were 14.7% versus 10.1% (P=0.087) based on the 2007 definition, 16.9% versus 10.3% (P=0.059) based on the 2012 definition, and 29.4% versus 10.7% (P=0.015) based on the SCAI definition. Conclusion In this study, PMI, according to the SCAI definition, was associated with more-frequent late death/MI, with ≈20% of all patients who had PMI using the 2007 universal MI definition not having SCAI-defined PMI. Categorizing these latter patients as SCAI-defined no PMI did not alter the rate of death/MI among no-PMI patients. PMID:25359403

  18. On the merging rates of envelope-deprived components of binary systems which can give rise to supernova events

    Science.gov (United States)

    Tornambe, Amedeo

    1989-08-01

    Theoretical rates of mergings of envelope-deprived components of binary systems, which can give rise to supernova events are described. The effects of the various assumptions on the physical properties of the progenitor system and of its evolutionary behavior through common envelope phases are discussed. Four cases have been analyzed: CO-CO, He-CO, He-He double degenerate mergings and He star-CO dwarf merging. It is found that, above a critical efficiency of the common envelope action in system shrinkage, the rate of CO-CO mergings is not strongly sensitive to the efficiency. Below this critical value, no CO-CO systems will survive for times larger than a few Gyr. In contrast, He-CO dwarf systems will continue to merge at a reasonable rate up to 20 Gyr, and more, also under extreme conditions.

  19. Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment

    National Research Council Canada - National Science Library

    Borovies, Drew A

    2007-01-01

    .... In actuality, even extremely uncertain or incomplete detections and counter-detections of opposing entities can provide enough data for entities to make reasonably intelligent decisions on the virtual battlefield...

  20. A Data-Driven Monitoring Technique for Enhanced Fall Events Detection

    KAUST Repository

    Zerrouki, Nabil; Harrou, Fouzi; Sun, Ying; Houacine, Amrane

    2016-01-01

    Fall detection is a crucial issue in the health care of seniors. In this work, we propose an innovative method for detecting falls via simple human body descriptors. The extracted features are discriminative enough to describe human postures

  1. Accuracy and precision of equine gait event detection during walking with limb and trunk mounted inertial sensors

    DEFF Research Database (Denmark)

    Olsen, Emil; Andersen, Pia Haubro; Pfau, Thilo

    2012-01-01

    The increased variations of temporal gait events when pathology is present are good candidate features for objective diagnostic tests. We hypothesised that the gait events hoof-on/off and stance can be detected accurately and precisely using features from trunk and distal limb-mounted Inertial....... Accuracy (bias) and precision (SD of bias) was calculated to compare force plate and IMU timings for gait events. Data were collected from seven horses. One hundred and twenty three (123) front limb steps were analysed; hoof-on was detected with a bias (SD) of -7 (23) ms, hoof-off with 0.7 (37) ms...... and front limb stance with -0.02 (37) ms. A total of 119 hind limb steps were analysed; hoof-on was found with a bias (SD) of -4 (25) ms, hoof-off with 6 (21) ms and hind limb stance with 0.2 (28) ms. IMUs mounted on the distal limbs and sacrum can detect gait events accurately and precisely....

  2. The impact of realistic models of mass segregation on the event rate of extreme-mass ratio inspirals and cusp re-growth

    Science.gov (United States)

    Amaro-Seoane, Pau; Preto, Miguel

    2011-05-01

    One of the most interesting sources of gravitational waves (GWs) for LISA is the inspiral of compact objects onto a massive black hole (MBH), commonly referred to as an 'extreme-mass ratio inspiral' (EMRI). The small object, typically a stellar black hole, emits significant amounts of GW along each orbit in the detector bandwidth. The slow, adiabatic inspiral of these sources will allow us to map spacetime around MBHs in detail, as well as to test our current conception of gravitation in the strong regime. The event rate of this kind of source has been addressed many times in the literature and the numbers reported fluctuate by orders of magnitude. On the other hand, recent observations of the Galactic centre revealed a dearth of giant stars inside the inner parsec relative to the numbers theoretically expected for a fully relaxed stellar cusp. The possibility of unrelaxed nuclei (or, equivalently, nuclei with no or only a very shallow cusp, or core) adds substantial uncertainty to the estimates. Having this timely question in mind, we run a significant number of direct-summation N-body simulations with up to half a million particles to calibrate a much faster orbit-averaged Fokker-Planck code. We show that, under quite generic initial conditions, the time required for the growth of a relaxed, mass-segregated stellar cusp is shorter than a Hubble time for MBHs with sufficiently low M•, boosting the EMRI rates by a factor of ~10 in comparison with what would result from a 7/4 Bahcall-Wolf cusp and resulting in ~250 events per Gyr per Milky Way-type galaxy. Such an intrinsic rate should translate roughly into ~10²-7×10² stellar black hole EMRIs detected by LISA over a mission lifetime of 2 or 5 years, respectively, depending on the detailed assumptions regarding LISA detection capabilities.

  3. A Compound Detection System Based on Ultrasonic Flow Rate and Concentration

    OpenAIRE

    Qing-Hui WANG; Fang MU; Li-Feng WEI

    2014-01-01

    This paper proposes a new detection system for monitoring gas concentration and flow rate. Velocity difference of ultrasonic wave in bi-directional propagation in measured gas is recorded and utilized for computing the online gas concentration and flow rate. Meanwhile, the temperature compensation, return signal processing and error analysis algorithms are applied to improve the accuracy. The experimental results show that, compared with the single sensor measurement of gas flow rate or conce...

  4. Validating administrative data for the detection of adverse events in older hospitalized patients

    Directory of Open Access Journals (Sweden)

    Ackroyd-Stolarz S

    2014-08-01

    Full Text Available Stacy Ackroyd-Stolarz,1,2 Susan K Bowles,3–5 Lorri Giffin6 1Performance Excellence Portfolio, Capital District Health Authority, Halifax, Nova Scotia, Canada; 2Department of Emergency Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 3Geriatric Medicine, Capital District Health Authority, Halifax, Nova Scotia, Canada; 4College of Pharmacy and Division of Geriatric Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 5Department of Pharmacy at Capital District Health Authority, Halifax, Nova Scotia, Canada; 6South Shore Family Health, Bridgewater, Nova Scotia, Canada Abstract: Older hospitalized patients are at risk of experiencing adverse events including, but not limited to, hospital-acquired pressure ulcers, fall-related injuries, and adverse drug events. A significant challenge in monitoring and managing adverse events is lack of readily accessible information on their occurrence. Purpose: The objective of this retrospective cross-sectional study was to validate diagnostic codes for pressure ulcers, fall-related injuries, and adverse drug events found in routinely collected administrative hospitalization data. Methods: All patients 65 years of age or older discharged between April 1, 2009 and March 31, 2011 from a provincial academic health sciences center in Canada were eligible for inclusion in the validation study. For each of the three types of adverse events, a random sample of 50 patients whose records were positive and 50 patients whose records were not positive for an adverse event was sought for review in the validation study (n=300 records in total. A structured health record review was performed independently by two health care providers with experience in geriatrics, both of whom were unaware of the patient's status with respect to adverse event coding. A physician reviewed 40 records (20 reviewed by each health care provider to establish interrater agreement. Results: A total of 39 pressure ulcers, 56 fall

  5. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to general surgery of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS recorded for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  6. Scaling A Moment-Rate Function For Small To Large Magnitude Events

    Science.gov (United States)

    Archuleta, Ralph; Ji, Chen

    2017-04-01

    Since the 1980's seismologists have recognized that peak ground acceleration (PGA) and peak ground velocity (PGV) scale differently with magnitude for large and moderate earthquakes. In a recent paper (Archuleta and Ji, GRL 2016) we introduced an apparent moment-rate function (aMRF) that accurately predicts the scaling with magnitude of PGA, PGV, PWA (Wood-Anderson Displacement) and the ratio PGA/2πPGV (dominant frequency) for earthquakes 3.3 ≤ M ≤ 5.3. This apparent moment-rate function is controlled by two temporal parameters, tp and td, which are related to the time for the moment-rate function to reach its peak amplitude and the total duration of the earthquake, respectively. These two temporal parameters lead to a Fourier amplitude spectrum (FAS) of displacement that has two corners in between which the spectral amplitudes decay as 1/f, f denotes frequency. At higher or lower frequencies, the FAS of the aMRF looks like a single-corner Aki-Brune omega squared spectrum. However, in the presence of attenuation the higher corner is almost certainly masked. Attempting to correct the spectrum to an Aki-Brune omega-squared spectrum will produce an "apparent" corner frequency that falls between the double corner frequency of the aMRF. We reason that the two corners of the aMRF are the reason that seismologists deduce a stress drop (e.g., Allmann and Shearer, JGR 2009) that is generally much smaller than the stress parameter used to produce ground motions from stochastic simulations (e.g., Boore, 2003 Pageoph.). The presence of two corners for the smaller magnitude earthquakes leads to several questions. Can deconvolution be successfully used to determine scaling from small to large earthquakes? Equivalently will large earthquakes have a double corner? If large earthquakes are the sum of many smaller magnitude earthquakes, what should the displacement FAS look like for a large magnitude earthquake? Can a combination of such a double-corner spectrum and random
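
    The abstract describes a displacement spectrum with two corner frequencies and a 1/f decay between them. The snippet below evaluates one common generic double-corner functional form to illustrate that shape; it is not necessarily the exact spectrum of the authors' apparent moment-rate function, and the corner frequencies are arbitrary.

```python
# Generic double-corner displacement spectrum: flat below f1, roughly 1/f
# between f1 and f2, roughly 1/f^2 above f2.  Corner frequencies and the
# functional form are illustrative, not the authors' fitted aMRF spectrum.
import math


def double_corner_fas(f, omega0=1.0, f1=0.5, f2=5.0):
    return omega0 / (math.sqrt(1 + (f / f1) ** 2) * math.sqrt(1 + (f / f2) ** 2))


for f in (0.1, 1.0, 10.0, 50.0):
    print(f"{f:5.1f} Hz -> {double_corner_fas(f):.4f}")
```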

  7. Link Between RI-ISI and Inspection Qualification: Relationship between Defect Detection Rate and Margin of Detection

    International Nuclear Information System (INIS)

    Shepherd, Barrie; Goujon, Sophie; Whittle, John

    2007-01-01

    Quantitative risk-informed in-service inspection (RI-ISI) requires a quantitative measurement of inspection effectiveness if the risk change associated with an inspection is to be determined. Knowing the probability of detection (POD) as a function of defect depth (through wall dimension) would provide ideal information. However the main in-service inspection method for nuclear plant is ultrasonics, for which defect detection capability depends on a wide variety of parameters besides defect depth, such as defect orientation, roughness, location, shape etc. In recognition of this the European approach to inspection qualification is generally based on some combination of technical justification, and practical trials on a relatively limited number of defects. This inspection qualification process involves demonstrating that defects of concern will generate responses in excess of the specified recording level or noise, depending on the inspection. It is not currently designed to quantify the probability with which defects will be detected. The work described in this report has been performed in order to help address the problem of how the information generated during inspection qualification can be used as an input for RI-ISI. The approach adopted has been to recognise that as the defect response increases above the recording or noise level, the probability of detecting defects is likely to increase. The work therefore involved an investigation of the relationship between POD (strictly speaking defect detection rate) and margin of detection. It involved blind manual and automated ultrasonic trials on artificial defects in test plates designed to generate a range of signal responses. The detection rate for defects which provided signals at a particular level above noise or above a recording level was then measured. A relationship between defect detection rate and margin of detection has been established based on these trials. In addition to establishing a stronger link
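
    The study relates defect detection rate to the margin by which a defect's signal exceeds the noise or recording level. A logistic curve is one simple way to picture such a relationship; the sketch below is purely illustrative and does not reproduce the trial's measured detection rates or fitted parameters.

```python
# Illustrative logistic probability-of-detection (POD) curve as a function of
# the margin (in dB) by which a defect signal exceeds the noise/recording
# level.  The midpoint and slope are arbitrary, not the trial's fit.
import math


def pod(margin_db, m50=2.0, slope=1.5):
    return 1.0 / (1.0 + math.exp(-slope * (margin_db - m50)))


for m in (0, 2, 6, 12):
    print(f"margin {m:2d} dB -> POD {pod(m):.3f}")
```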

  8. THE DETECTION RATE OF EARLY UV EMISSION FROM SUPERNOVAE: A DEDICATED GALEX/PTF SURVEY AND CALIBRATED THEORETICAL ESTIMATES

    Energy Technology Data Exchange (ETDEWEB)

    Ganot, Noam; Gal-Yam, Avishay; Ofek, Eran O.; Sagiv, Ilan; Waxman, Eli; Lapid, Ofer [Department of Particle Physics and Astrophysics, Faculty of Physics, The Weizmann Institute of Science, Rehovot 76100 (Israel); Kulkarni, Shrinivas R.; Kasliwal, Mansi M. [Cahill Center for Astrophysics, California Institute of Technology, Pasadena, CA 91125 (United States); Ben-Ami, Sagi [Smithsonian Astrophysical Observatory, Harvard-Smithsonian Ctr. for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Chelouche, Doron; Rafter, Stephen [Physics Department, Faculty of Natural Sciences, University of Haifa, 31905 Haifa (Israel); Behar, Ehud; Laor, Ari [Physics Department, Technion Israel Institute of Technology, 32000 Haifa (Israel); Poznanski, Dovi; Nakar, Ehud; Maoz, Dan [School of Physics and Astronomy, Tel Aviv University, 69978 Tel Aviv (Israel); Trakhtenbrot, Benny [Institute for Astronomy, ETH Zurich, Wolfgang-Pauli-Strasse 27 Zurich 8093 (Switzerland); Neill, James D.; Barlow, Thomas A.; Martin, Christofer D., E-mail: noam.ganot@gmail.com [California Institute of Technology, 1200 East California Boulevard, MC 278-17, Pasadena, CA 91125 (United States); Collaboration: ULTRASAT Science Team; WTTH consortium; GALEX Science Team; Palomar Transient Factory; and others

    2016-03-20

    The radius and surface composition of an exploding massive star, as well as the explosion energy per unit mass, can be measured using early UV observations of core-collapse supernovae (SNe). We present the first results from a simultaneous GALEX/PTF search for early ultraviolet (UV) emission from SNe. Six SNe II and one Type II superluminous SN (SLSN-II) are clearly detected in the GALEX near-UV (NUV) data. We compare our detection rate with theoretical estimates based on early, shock-cooling UV light curves calculated from models that fit existing Swift and GALEX observations well, combined with volumetric SN rates. We find that our observations are in good agreement with calculated rates assuming that red supergiants (RSGs) explode with fiducial radii of 500 R_⊙, explosion energies of 10^51 erg, and ejecta masses of 10 M_⊙. Exploding blue supergiants and Wolf–Rayet stars are poorly constrained. We describe how such observations can be used to derive the progenitor radius, surface composition, and explosion energy per unit mass of such SN events, and we demonstrate why UV observations are critical for such measurements. We use the fiducial RSG parameters to estimate the detection rate of SNe during the shock-cooling phase (<1 day after explosion) for several ground-based surveys (PTF, ZTF, and LSST). We show that the proposed wide-field UV explorer ULTRASAT mission is expected to find >85 SNe per year (∼0.5 SN per deg^2), independent of host galaxy extinction, down to an NUV detection limit of 21.5 mag AB. Our pilot GALEX/PTF project thus convincingly demonstrates that a dedicated, systematic SN survey at the NUV band is a compelling method to study how massive stars end their life.

  9. Evaluating cessation of the type 2 oral polio vaccine by modeling pre- and post-cessation detection rates.

    Science.gov (United States)

    Kroiss, Steve J; Famulare, Michael; Lyons, Hil; McCarthy, Kevin A; Mercer, Laina D; Chabot-Couture, Guillaume

    2017-10-09

    The globally synchronized removal of the attenuated Sabin type 2 strain from the oral polio vaccine (OPV) in April 2016 marked a major change in polio vaccination policy. This change will provide a significant reduction in the burden of vaccine-associated paralytic polio (VAPP), but may increase the risk of circulating vaccine-derived poliovirus (cVDPV2) outbreaks during the transition period. This risk can be monitored by tracking the disappearance of Sabin-like type 2 (SL2) using data from the polio surveillance system. We studied SL2 prevalence in 17 countries in Africa and Asia from 2010 to 2016, using acute flaccid paralysis surveillance data. We modeled the peak and decay of SL2 prevalence following mass vaccination events using a beta-binomial model for the detection rate, and a Ricker function for the temporal dependence. We found that type 2 circulated the longest of all serotypes after a vaccination campaign, but that SL2 prevalence returned to baseline levels in approximately 50 days. Post-cessation model predictions identified 19 anomalous SL2 detections outside of model predictions in Afghanistan, India, Nigeria, Pakistan, and western Africa. Our models established benchmarks for the duration of SL2 detection after OPV2 cessation. As predicted, SL2 detection rates have plummeted, except in Nigeria where OPV2 use continued for some time in response to recent cVDPV2 detections. However, the anomalous SL2 detections suggest specific areas that merit enhanced monitoring for signs of cVDPV2 outbreaks. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
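
    The peak-and-decay shape modeled above with a Ricker function can be sketched directly; the parameter values below are illustrative only, chosen to peak a few weeks after a campaign and decay toward baseline on roughly the reported ~50-day timescale.

```python
# Ricker-type peak-and-decay curve for the expected SL2 detection rate after
# a vaccination campaign.  Parameter values are illustrative only.
import math


def ricker(t_days, a=0.02, b=0.05):
    """Expected detection rate t_days after a campaign (peaks near 1/b days)."""
    return a * t_days * math.exp(-b * t_days)


for t in (5, 20, 50, 100):
    print(f"day {t:3d}: {ricker(t):.4f}")
```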

  10. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    Science.gov (United States)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is the detection of seismic events in a long-term, sustained time series record. Because the time-varying Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify the effective seismic event in the time series through Change Point Detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) can be treated as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record through CPD. There are 3 advantages of SDAR. 1. Anti-noise ability. The SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection; therefore, it is an appropriate technique for low-SNR data. 2. Real-time estimation. When new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property. The SDAR introduces a discounting parameter to decrease the influence of the present statistic value on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these 3 advantages, the SDAR method can handle the non-stationary time-varying long
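
    A much-simplified sketch of the sequentially discounting AR idea is given below: an AR(1) model whose sufficient statistics are updated online with a discounting factor, and whose per-sample negative log-likelihood serves as a change-point score. This is an illustration of the principle, not the full SDAR algorithm used in the study.

```python
# Simplified SDAR-style scorer: discounted online AR(1) estimation; the
# per-sample negative log-likelihood spikes when the local statistics change.
import math


class SDAR1:
    def __init__(self, r=0.02):
        self.r = r           # discounting factor
        self.mu = 0.0        # discounted mean
        self.c0 = 1e-6       # discounted variance of x
        self.c1 = 0.0        # discounted lag-1 covariance
        self.sigma2 = 1.0    # discounted residual variance
        self.prev = None

    def score(self, x):
        r = self.r
        self.mu = (1 - r) * self.mu + r * x
        if self.prev is None:
            self.prev = x
            return 0.0
        dx, dprev = x - self.mu, self.prev - self.mu
        self.c0 = (1 - r) * self.c0 + r * dprev * dprev
        self.c1 = (1 - r) * self.c1 + r * dx * dprev
        a = self.c1 / self.c0                      # AR(1) coefficient estimate
        err = x - (self.mu + a * dprev)            # one-step prediction error
        self.sigma2 = (1 - r) * self.sigma2 + r * err * err
        self.prev = x
        return 0.5 * (math.log(2 * math.pi * self.sigma2) + err * err / self.sigma2)


model = SDAR1()
trace = [0.1, -0.2, 0.05, 0.0, 3.5, 3.0, 0.1]      # the jump mimics an event onset
print([round(model.score(x), 2) for x in trace])
```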

  11. Effects of episodic sediment supply on bedload transport rate in mountain rivers. Detecting debris flow activity using continuous monitoring

    Science.gov (United States)

    Uchida, Taro; Sakurai, Wataru; Iuchi, Takuma; Izumiyama, Hiroaki; Borgatti, Lisa; Marcato, Gianluca; Pasuto, Alessandro

    2018-04-01

    Monitoring of sediment transport from hillslopes to channel networks as a consequence of floods with suspended and bedload transport, hyperconcentrated flows, debris and mud flows is essential not only for scientific issues, but also for prevention and mitigation of natural disasters, i.e. for hazard assessment, land use planning and design of torrent control interventions. In steep, potentially unstable terrains, ground-based continuous monitoring of hillslope and hydrological processes is still highly localized and expensive, especially in terms of manpower. In recent years, new seismic and acoustic methods have been developed for continuous bedload monitoring in mountain rivers. Since downstream bedload transport rate is controlled by upstream sediment supply from tributary channels and bed-external sources, continuous bedload monitoring might be an effective tool for detecting the sediments mobilized by debris flow processes in the upper catchment and thus represent an indirect method to monitor slope instability processes at the catchment scale. However, there is poor information about the effects of episodic sediment supply from upstream bed-external sources on downstream bedload transport rate at a single flood time scale. We have examined the effects of sediment supply due to upstream debris flow events on downstream bedload transport rate along the Yotagiri River, central Japan. To do this, we have conducted continuous bedload observations using a hydrophone (Japanese pipe microphone) located 6.4 km downstream the lower end of a tributary affected by debris flows. Two debris flows occurred during the two-years-long observation period. As expected, bedload transport rate for a given flow depth showed to be larger after storms triggering debris flows. That is, although the magnitude of sediment supply from debris flows is not large, their effect on bedload is propagating >6 km downstream at a single flood time scale. This indicates that continuous bedload

  12. [Performance and optimisation of a trigger tool for the detection of adverse events in hospitalised adult patients].

    Science.gov (United States)

    Guzmán Ruiz, Óscar; Pérez Lázaro, Juan José; Ruiz López, Pedro

    To characterise the performance of the triggers used in the detection of adverse events (AE) in hospitalised adult patients and to define a simplified panel of triggers to facilitate the detection of AE. Cross-sectional study of charts of patients from an internal medicine service to detect AE through systematic review of the charts and identification of triggers (clinical events often related to AE), determining whether an AE occurred in the context in which the trigger appeared. Once an AE was detected, the triggers that detected it were characterised. Logistic regression was applied to select the triggers with the greatest AE detection capability. A total of 291 charts were reviewed, with a total of 562 triggers in 103 patients, of which 163 were involved in detecting an AE. The triggers that detected the most AE were "A.1. Pressure ulcer" (9.82%), "B.5. Laxative or enema" (8.59%), "A.8. Agitation" (8.59%), "A.9. Over-sedation" (7.98%), "A.7. Haemorrhage" (6.75%) and "B.4. Antipsychotic" (6.75%). A simplified model was obtained using logistic regression, which included the variable "Number of drugs" and the triggers "Over-sedation", "Urinary catheterisation", "Readmission in 30 days", "Laxative or enema" and "Abrupt medication stop". This model had an 81% probability of correctly classifying charts with or without AE (p<0.001; 95% confidence interval: 0.763-0.871). A high number of triggers were associated with AE. The simplified model is capable of detecting a large number of AE with a minimum of elements. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  13. Cost and detection rate of glaucoma screening with imaging devices in a primary care center

    Directory of Open Access Journals (Sweden)

    Anton A

    2017-02-01

    Full Text Available Alfonso Anton,1–4 Monica Fallon,3,5 Francesc Cots,2 María A Sebastian,6 Antonio Morilla-Grasa,4 Sergi Mojal,3 Xavier Castells2 1Medicine School, Universidad Internacional de Cataluña, 2Servei d’Estudies, Parc de Salut Mar, 3Instituto Hospital del Mar de Investigaciones Médicas (IMIM, 4Glaucoma Department, Instituto Catalán de Retina (ICR, 5Universidad Autónoma de Barcelona, 6Centro de Atención Primaria Larrard, Barcelona, Spain Purpose: To analyze the cost and detection rate of a screening program for detecting glaucoma with imaging devices. Materials and methods: In this cross-sectional study, a glaucoma screening program was applied in a population-based sample randomly selected from a population of 23,527. Screening targeted the population at risk of glaucoma. Examinations included optic disk tomography (Heidelberg retina tomograph [HRT], nerve fiber analysis, and tonometry. Subjects who met at least 2 of 3 endpoints (HRT outside normal limits, nerve fiber index ≥30, or tonometry ≥21 mmHg were referred for glaucoma consultation. The currently established (“conventional” detection method was evaluated by recording data from primary care and ophthalmic consultations in the same population. The direct costs of screening and conventional detection were calculated by adding the unit costs generated during the diagnostic process. The detection rate of new glaucoma cases was assessed. Results: The screening program evaluated 414 subjects; 32 cases were referred for glaucoma consultation, 7 had glaucoma, and 10 had probable glaucoma. The current detection method assessed 677 glaucoma suspects in the population, of whom 29 were diagnosed with glaucoma or probable glaucoma. Glaucoma screening and the conventional detection method had detection rates of 4.1% and 3.1%, respectively, and the cost per case detected was 1,410 and 1,435€, respectively. The cost of screening 1 million inhabitants would be 5.1 million euros and would allow
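
    A quick arithmetic check of the screening figures quoted above (17 detected cases out of 414 screened subjects, at a reported cost of 1,410 euros per case detected); the per-subject cost is derived here and is not a figure reported in the abstract.

```python
# Back-of-the-envelope check (inputs from the abstract; the per-subject cost
# is derived, not reported).
screened, detected, cost_per_case = 414, 17, 1410.0   # 7 glaucoma + 10 probable
print(f"detection rate: {100 * detected / screened:.1f}%")          # ~4.1%
print(f"approx. cost per subject screened: "
      f"{detected * cost_per_case / screened:.0f} EUR")
```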

  14. Research and development of a high-temperature helium-leak detection system (joint research). Part 1 survey on leakage events and current leak detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Sakaba, Nariaki; Nakazawa, Toshio; Kawasaki, Kozo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Urakami, Masao; Saisyu, Sadanori [Japan Atomic Power Co., Tokyo (Japan)

    2003-03-01

    In High Temperature Gas-cooled Reactors (HTGR), the detection of leakage of helium at an early stage is very important for the safety and stability of operations. Since helium is a colourless gas, it is generally difficult to identify the location and the amount of leakage when very little leakage has occurred. The purpose of this R and D is to develop a helium leak detection system for the high temperature environment appropriate to the HTGR. As the first step in the development, this paper describes the result of surveying leakage events at nuclear facilities inside and outside Japan and current gas leakage detection technology to adapt optical-fibre detection technology to HTGRs. (author)

  15. Novel Fingertip Image-Based Heart Rate Detection Methods for a Smartphone

    Directory of Open Access Journals (Sweden)

    Rifat Zaman

    2017-02-01

    Full Text Available We hypothesize that our fingertip image-based heart rate detection methods using a smartphone reliably detect the heart rhythm and rate of subjects. We propose fingertip curve line movement-based and fingertip image intensity-based detection methods, which both use the movement of successive fingertip images obtained from smartphone cameras. To investigate the performance of the proposed methods, the heart rhythm and rate obtained with the proposed methods are compared to those of the conventional method, which is based on average image pixel intensity. Using a smartphone, we collected 120 s of pulsatile time series data from each recruited subject. The results show that the proposed fingertip curve line movement-based method detects heart rate with a maximum deviation of 0.0832 Hz and 0.124 Hz using time- and frequency-domain based estimation, respectively, compared to the conventional method. Moreover, the proposed fingertip image intensity-based method detects heart rate with a maximum deviation of 0.125 Hz and 0.03 Hz using time- and frequency-based estimation, respectively.
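
    Frequency-domain heart-rate estimation of the kind mentioned above can be sketched as taking the dominant FFT peak of the fingertip intensity series inside a physiological band. The band limits, frame rate and synthetic test signal below are assumptions for illustration, not the paper's exact processing chain.

```python
# FFT-based heart-rate estimate from a fingertip intensity series: pick the
# dominant frequency in a physiological band (~0.7-3.5 Hz), convert to bpm.
import numpy as np


def estimate_hr_bpm(intensity, fs):
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    band = (freqs >= 0.7) & (freqs <= 3.5)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]


fs = 30.0                                   # typical smartphone frame rate
t = np.arange(0, 120, 1 / fs)               # 120 s recording, as in the study
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
print(round(estimate_hr_bpm(signal, fs), 1))   # ~72 bpm for a 1.2 Hz pulse
```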

  16. A Data-Driven Monitoring Technique for Enhanced Fall Events Detection

    KAUST Repository

    Zerrouki, Nabil

    2016-07-26

    Fall detection is a crucial issue in the health care of seniors. In this work, we propose an innovative method for detecting falls via simple human body descriptors. The extracted features are discriminative enough to describe human postures and not too computationally complex, allowing fast processing. Fall detection is addressed as a statistical anomaly detection problem. The proposed approach combines principal component analysis (PCA) modeling with the exponentially weighted moving average (EWMA) monitoring chart. The EWMA scheme is applied to the ignored principal components to detect the presence of falls. Using two different fall detection datasets, URFD and FDD, we have demonstrated the greater sensitivity and effectiveness of the developed method over conventional PCA-based methods.
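
    The PCA-plus-EWMA scheme described above can be sketched as: fit PCA on descriptors of normal activity, monitor the magnitude of the data projected onto the ignored components, and flag samples whose EWMA statistic exceeds a control limit. Feature content, thresholds and the simulated "fall" below are illustrative assumptions.

```python
# PCA residual + EWMA monitoring sketch: train PCA on "normal activity"
# descriptors, track the norm of the projection onto the ignored components,
# and raise an alarm when its EWMA exceeds a control limit.
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 6))                 # illustrative descriptors
mu_x = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mu_x, full_matrices=False)
P_res = vt[3:].T @ vt[3:]                         # projector onto ignored PCs

resid = np.linalg.norm((train - mu_x) @ P_res, axis=1)
mu_r, sd_r = resid.mean(), resid.std()


def ewma_alarms(stream, lam=0.2, L=3.0):
    limit = mu_r + L * sd_r * np.sqrt(lam / (2 - lam))
    z, alarms = mu_r, []
    for i, x in enumerate(stream):
        r = np.linalg.norm((x - mu_x) @ P_res)
        z = lam * r + (1 - lam) * z               # EWMA update
        if z > limit:
            alarms.append(i)
    return alarms


test = rng.normal(size=(50, 6))
test[40:] += 5.0                                  # simulated fall-like shift
print(ewma_alarms(test))                          # indices flagged as falls
```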

  17. INTEGRAL Detection of the First Prompt Gamma-Ray Signal Coincident with the Gravitational-wave Event GW170817

    DEFF Research Database (Denmark)

    Savchenko, V.; Ferrigno, C.; Kuulkers, E.

    2017-01-01

    We report the INTernational Gamma-ray Astrophysics Laboratory (INTEGRAL) detection of the short gamma-ray burst GRB 170817A (discovered by Fermi-GBM) with a signal-to-noise ratio of 4.6, and, for the first time, its association with the gravitational waves (GWs) from binary neutron star (BNS......) merging event GW170817 detected by the LIGO and Virgo observatories. The significance of association between the gamma-ray burst observed by INTEGRAL and GW170817 is 3.2σ, while the association between the Fermi-GBM and INTEGRAL detections is 4.2σ. GRB 170817A was detected by the SPI-ACS instrument about...

  18. Technological advances for improving adenoma detection rates: The changing face of colonoscopy.

    Science.gov (United States)

    Ishaq, Sauid; Siau, Keith; Harrison, Elizabeth; Tontini, Gian Eugenio; Hoffman, Arthur; Gross, Seth; Kiesslich, Ralf; Neumann, Helmut

    2017-07-01

    Worldwide, colorectal cancer is the third commonest cancer. Over 90% of cases follow an adenoma-to-cancer sequence over many years. Colonoscopy is the gold standard method for cancer screening and early adenoma detection. However, considerable variation exists between endoscopists' detection rates. This review considers the effects of different endoscopic techniques on adenoma detection. Two areas of technological interest were considered: (1) optical technologies and (2) mechanical technologies. Optical solutions, including FICE, NBI, i-SCAN and high-definition colonoscopy, showed mixed results. In contrast, mechanical advances, such as cap-assisted colonoscopy, FUSE, EndoCuff and G-EYE™, showed promise, with reported detection rates of up to 69%. However, before definitive recommendations can be made for their incorporation into daily practice, further studies and comparison trials are required. Copyright © 2017 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.

  19. LASER: A Maximum Likelihood Toolkit for Detecting Temporal Shifts in Diversification Rates From Molecular Phylogenies

    Directory of Open Access Journals (Sweden)

    Daniel L. Rabosky

    2006-01-01

    Full Text Available Rates of species origination and extinction can vary over time during evolutionary radiations, and it is possible to reconstruct the history of diversification using molecular phylogenies of extant taxa only. Maximum likelihood methods provide a useful framework for inferring temporal variation in diversification rates. LASER is a package for the R programming environment that implements maximum likelihood methods based on the birth-death process to test whether diversification rates have changed over time. LASER contrasts the likelihood of phylogenetic data under models where diversification rates have changed over time to alternative models where rates have remained constant over time. Major strengths of the package include the ability to detect temporal increases in diversification rates and the inference of diversification parameters under multiple rate-variable models of diversification. The program and associated documentation are freely available from the R package archive at http://cran.r-project.org.

  20. One Novel Multiple-Target Plasmid Reference Molecule Targeting Eight Genetically Modified Canola Events for Genetically Modified Canola Detection.

    Science.gov (United States)

    Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao

    2017-09-27

    Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes of matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including the analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and performance of pCAN in the analysis of various canola samples, etc. The LODs are 15 copies for RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R 2 ) are all higher than 0.99. The quantification bias values varied from 0.47% to 20.68% with relative standard deviation (RSD) from 1.06% to 24.61% in the quantification of simulated samples. Furthermore, 10 practical canola samples sampled from imported shipments in the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those assays using commercial certified materials as the calibrator. Concluding from these results, we believe that this newly developed pCAN plasmid is one good candidate for being a plasmid DNA reference material in the detection and quantification of the eight GM canola events in routine analysis.

  1. INTEGRAL Detection of the First Prompt Gamma-Ray Signal Coincident with the Gravitational-wave Event GW170817

    Energy Technology Data Exchange (ETDEWEB)

    Savchenko, V.; Ferrigno, C.; Bozzo, E.; Courvoisier, T. J.-L. [ISDC, Department of Astronomy, University of Geneva, Chemin d’Écogia, 16 CH-1290 Versoix (Switzerland); Kuulkers, E. [European Space Research and Technology Centre (ESA/ESTEC), Keplerlaan 1, 2201 AZ Noordwijk (Netherlands); Bazzano, A.; Natalucci, L.; Rodi, J. [INAF-Institute for Space Astrophysics and Planetology, Via Fosso del Cavaliere 100, I-00133-Rome (Italy); Brandt, S.; Chenevez, J. [DTU Space, National Space Institute Elektrovej, Building 327 DK-2800 Kongens Lyngby (Denmark); Diehl, R.; Von Kienlin, A. [Max-Planck-Institut für Extraterrestrische Physik, Garching (Germany); Domingo, A. [Centro de Astrobiología (CAB-CSIC/INTA, ESAC Campus), Camino bajo del Castillo S/N, E-28692 Villanueva de la Cañada, Madrid (Spain); Hanlon, L.; Martin-Carrillo, A. [Space Science Group, School of Physics, University College Dublin, Belfield, Dublin 4 (Ireland); Jourdain, E. [IRAP, Université de Toulouse, CNRS, UPS, CNES, 9 Av. Roche, F-31028 Toulouse (France); Laurent, P.; Lebrun, F. [APC, AstroParticule et Cosmologie, Université Paris Diderot, CNRS/IN2P3, CEA/Irfu, Observatoire de Paris Sorbonne Paris Cité, 10 rue Alice Domont et Léonie Duquet, F-75205 Paris Cedex 13 (France); Lutovinov, A. [Space Research Institute of Russian Academy of Sciences, Profsoyuznaya 84/32, 117997 Moscow (Russian Federation); Mereghetti, S. [INAF, IASF-Milano, via E.Bassini 15, I-20133 Milano (Italy); and others

    2017-10-20

    We report the INTernational Gamma-ray Astrophysics Laboratory (INTEGRAL) detection of the short gamma-ray burst GRB 170817A (discovered by Fermi-GBM) with a signal-to-noise ratio of 4.6, and, for the first time, its association with the gravitational waves (GWs) from binary neutron star (BNS) merging event GW170817 detected by the LIGO and Virgo observatories. The significance of association between the gamma-ray burst observed by INTEGRAL and GW170817 is 3.2σ, while the association between the Fermi-GBM and INTEGRAL detections is 4.2σ. GRB 170817A was detected by the SPI-ACS instrument about 2 s after the end of the GW event. We measure a fluence of (1.4 ± 0.4 ± 0.6) × 10^−7 erg cm^−2 (75–2000 keV), where, respectively, the statistical error is given at the 1σ confidence level, and the systematic error corresponds to the uncertainty in the spectral model and instrument response. We also report on the pointed follow-up observations carried out by INTEGRAL, starting 19.5 hr after the event, and lasting for 5.4 days. We provide a stringent upper limit on any electromagnetic signal in a very broad energy range, from 3 keV to 8 MeV, constraining the soft gamma-ray afterglow flux to <7.1 × 10^−11 erg cm^−2 s^−1 (80–300 keV). Exploiting the unique capabilities of INTEGRAL, we constrained the gamma-ray line emission from radioactive decays that are expected to be the principal source of the energy behind a kilonova event following a BNS coalescence. Finally, we put a stringent upper limit on any delayed bursting activity, for example, from a newly formed magnetar.

  2. Shilling attack detection for recommender systems based on credibility of group users and rating time series.

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Qu, Qiang; Zeng, Jun; Cheng, Tian

    2018-01-01

    Recommender systems are vulnerable to shilling attacks. Forged user-generated content, such as user ratings and reviews, is used by attackers to manipulate recommendation rankings. Shilling attack detection in recommender systems is of great significance for maintaining the fairness and sustainability of recommender systems. Current studies have problems in terms of the poor universality of algorithms, difficulty in selecting user profile attributes, and lack of an optimization mechanism. In this paper, a shilling behaviour detection structure based on abnormal group user discovery and rating time series analysis is proposed. The paper adds to the current understanding in the field by studying a credibility evaluation model in depth, built on a rating prediction model that derives proximity-based predictions. A method for detecting suspicious ratings based on suspicious time windows and target item analysis is proposed: suspicious rating time segments are determined by constructing a time series and examining the data streams of the rated items. To analyse features of shilling attacks through group users' credibility, an abnormal group user discovery method based on time series and time windows is proposed. Standard testing datasets are used to verify the effectiveness of the proposed method.
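
    One piece of the approach above, flagging suspicious rating time windows for a target item, can be sketched with a simple per-window count and a z-score rule. This stands in for, and is much simpler than, the paper's full credibility and group-user analysis; the window length and threshold are arbitrary.

```python
# Flag time windows in which a target item receives an anomalously high
# number of ratings relative to its own history (simple z-score rule).
from collections import Counter


def suspicious_windows(timestamps, window=3600, z_thresh=3.0):
    counts = Counter(int(t // window) for t in timestamps)
    values = list(counts.values())
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    sd = var ** 0.5 or 1.0
    return [w for w, c in counts.items() if (c - mean) / sd > z_thresh]


# hypothetical rating times (seconds); a burst of 50 ratings lands in window 10
times = [i * 600 for i in range(60)] + [36_000 + i for i in range(50)]
print(suspicious_windows(times))   # -> [10]
```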

  3. Interference of detection rate of lumbar disc herniation by socioeconomic status.

    Science.gov (United States)

    Ji, Gyu Yeul; Oh, Chang Hyun; Jung, Nak-Yong; An, Seong Dae; Choi, Won-Seok; Kim, Jung Hoon

    2013-03-01

    Retrospective study. The objective of the study was to evaluate the relationship between the detection rate of lumbar disc herniation and socioeconomic status. Income is an important determinant of public health, yet there are no reports on the relationship between socioeconomic status and the detection rate of disc herniation. In this study, 443 cases underwent lumbar computed tomography for lumbar disc herniation, and questionnaires were reviewed regarding socioeconomic status, the presence of back pain or radiating pain, and the presence of a medical certificate (documenting medical or surgical treatment for the pain) during the Korean conscription examination. Without considering the presence of a medical certificate, there was no difference in spinal physical grade according to socioeconomic status (p=0.290). However, when the presence of a medical certificate was considered, statistically significant differences according to socioeconomic status were observed among the 249 cases with a medical certificate (p=0.028). The detection rate was lower in individuals of low economic status than in those of the high economic class. A common reason for not submitting a medical certificate is that it is considered neither necessary nor financially affordable by people of lower socioeconomic status. The prevalence of lumbar disc herniation does not differ according to socioeconomic status, but the detection rate was affected by socioeconomic status. Socioeconomic status is an important factor in detecting lumbar disc herniation.

  4. Pulse Rate and Transit Time Analysis to Predict Hypotension Events After Spinal Anesthesia During Programmed Cesarean Labor.

    Science.gov (United States)

    Bolea, Juan; Lázaro, Jesús; Gil, Eduardo; Rovira, Eva; Remartínez, José M; Laguna, Pablo; Pueyo, Esther; Navarro, Augusto; Bailón, Raquel

    2017-09-01

    Prophylactic treatment has been proved to reduce the incidence of hypotension after spinal anesthesia during cesarean labor. However, pharmacological prophylaxis could carry undesirable side-effects for mother and fetus, so the prediction of hypotension becomes an important challenge. Hypotension events are hypothesized to be related to a malfunctioning of autonomic nervous system (ANS) regulation of blood pressure. In this work, ANS responses to positional changes of 51 pregnant women programmed for a cesarean labor were explored for hypotension prediction. Lateral and supine decubitus and sitting position were considered while electrocardiographic and pulse photoplethysmographic signals were recorded. Features based on heart rate variability, pulse rate variability (PRV), and pulse transit time (PTT) analysis were used in a logistic regression classifier. The results showed that PRV irregularity changes from supine to lateral decubitus, assessed by approximate entropy, together with the standard deviation of PTT in supine decubitus, formed the combination of features that achieved the best classification results: sensitivity of 76%, specificity of 70%, and accuracy of 72%, with normotensive as the positive class. Peripheral regulation and blood pressure changes, measured by PRV and PTT analysis, could help predict hypotension events, reducing prophylactic side-effects in the low-risk population.
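
    A minimal sketch of the classification step, under the assumption that the two selected features (the supine-to-lateral change in PRV approximate entropy and the standard deviation of PTT in supine decubitus) have already been computed per subject; the synthetic arrays are placeholders and scikit-learn is an assumed library choice, not necessarily the authors' toolchain.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      d_apen = rng.normal(size=51)            # placeholder: change in PRV approximate entropy
      ptt_sd = np.abs(rng.normal(size=51))    # placeholder: SD of pulse transit time (supine)
      y = rng.integers(0, 2, size=51)         # placeholder labels: 1 = normotensive (positive class)

      X = np.column_stack([d_apen, ptt_sd])
      clf = LogisticRegression()
      acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
      print(f"cross-validated accuracy: {acc:.2f}")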

  5. Network Events on Multiple Space and Time Scales in Cultured Neural Networks and in a Stochastic Rate Model.

    Directory of Open Access Journals (Sweden)

    Guido Gigante

    2015-11-01

    Full Text Available Cortical networks, in-vitro as well as in-vivo, can spontaneously generate a variety of collective dynamical events such as network spikes, UP and DOWN states, global oscillations, and avalanches. Though each of them has been variously recognized in previous works as expression of the excitability of the cortical tissue and the associated nonlinear dynamics, a unified picture of the determinant factors (dynamical and architectural is desirable and not yet available. Progress has also been partially hindered by the use of a variety of statistical measures to define the network events of interest. We propose here a common probabilistic definition of network events that, applied to the firing activity of cultured neural networks, highlights the co-occurrence of network spikes, power-law distributed avalanches, and exponentially distributed 'quasi-orbits', which offer a third type of collective behavior. A rate model, including synaptic excitation and inhibition with no imposed topology, synaptic short-term depression, and finite-size noise, accounts for all these different, coexisting phenomena. We find that their emergence is largely regulated by the proximity to an oscillatory instability of the dynamics, where the non-linear excitable behavior leads to a self-amplification of activity fluctuations over a wide range of scales in space and time. In this sense, the cultured network dynamics is compatible with an excitation-inhibition balance corresponding to a slightly sub-critical regime. Finally, we propose and test a method to infer the characteristic time of the fatigue process, from the observed time course of the network's firing rate. Unlike the model, possessing a single fatigue mechanism, the cultured network appears to show multiple time scales, signalling the possible coexistence of different fatigue mechanisms.

  6. Online surveillance of media health event reporting in Nepal: digital disease detection from a One Health perspective.

    Science.gov (United States)

    Schwind, Jessica S; Norman, Stephanie A; Karmacharya, Dibesh; Wolking, David J; Dixit, Sameer M; Rajbhandari, Rajesh M; Mekaru, Sumiko R; Brownstein, John S

    2017-09-21

    Traditional media and the internet are crucial sources of health information. Media can significantly shape public opinion, knowledge and understanding of emerging and endemic health threats. As digital communication rapidly progresses, local access and dissemination of health information contribute significantly to global disease detection and reporting. Health event reports in Nepal (October 2013-December 2014) were used to characterize Nepal's media environment from a One Health perspective using HealthMap - a global online disease surveillance and mapping tool. Event variables (location, media source type, disease or risk factor of interest, and affected species) were extracted from HealthMap. A total of 179 health reports were captured from various sources including newspapers, inter-government agency bulletins, individual reports, and trade websites, yielding 108 (60%) unique articles. Human health events were reported most often (n = 85; 79%), followed by animal health events (n = 23; 21%), with no reports focused solely on environmental health. By expanding event coverage across all of the health sectors, media in developing countries could play a crucial role in national risk communication efforts and could enhance early warning systems for disasters and disease outbreaks.

  7. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    NARCIS (Netherlands)

    J.W. van Strien (Jan); L.A. Isbell (Lynne A.)

    2017-01-01

    textabstractStudies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to

  8. Detection and monitoring of two dust storm events by multispectral modis images.

    Digital Repository Service at National Institute of Oceanography (India)

    Mehta P.S.; Kunte, P.D.

    of Oman, over Arabian Sea to the coast of Pakistan. The dust storm lasted over the Arabian Sea till 30th March. MODIS sensors on both Terra and Aqua Satellites captured images of both events. From the difference in emissive/transmissive nature...

  9. Oil spill detection and remote sensing : an overview with focus on recent events

    International Nuclear Information System (INIS)

    Coolbaugh, T.S.

    2008-01-01

    Several offshore oil spills occurred during the period from November to December 2007 in various parts of the world, each highlighting the need to quickly detect oil spills in marine settings. Several factors must be considered in order to determine the best technical approach for successful detection and oil spill monitoring. These include the reason for detection or monitoring; the location of the spill; the scale of spatial coverage; availability of detection equipment and time to deploy; high specificity for petroleum oil; weather conditions at and above the spill site; and cost of the detection approach. This paper outlined some of the key attributes of several remote sensing options that are available today or being considered. The approaches used to enhance visualization or detection of spills include traditional electromagnetic spectrum-based approaches such as ultraviolet (UV), visible, infrared (IR), radar, and fluorescence-based systems. Analytical approaches such as chemical analysis for the detection of volatile organic compounds (VOCs) or monitoring of electrical conductivity of the water surface may also provide a warning that hydrocarbons have been released. 12 refs., 3 tabs., 11 figs

  10. Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences

    DEFF Research Database (Denmark)

    Hansen, Søren; Blanke, Mogens

    2013-01-01

    desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired...... false alarm probability. A data based method is used to determine the validity of the methods proposed. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft....

  11. Detecting Specific Health-Related Events Using an Integrated Sensor System for Vital Sign Monitoring

    Directory of Open Access Journals (Sweden)

    Mourad Adnane

    2009-09-01

    Full Text Available In this paper, a new method for the detection of apnea/hypopnea periods in physiological data is presented. The method is based on the intelligent combination of an integrated sensor system for long-time cardiorespiratory signal monitoring and dedicated signal-processing packages. Integrated sensors are a PVDF film and conductive fabric sheets. The signal processing package includes dedicated respiratory cycle (RC and QRS complex detection algorithms and a new method using the respiratory cycle variability (RCV for detecting apnea/hypopnea periods in physiological data. Results show that our method is suitable for online analysis of long time series data.
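
    To make the idea concrete, here is a minimal sketch (not the authors' RCV algorithm) that flags candidate apnea/hypopnea periods as gaps between detected respiratory-cycle onsets exceeding both a multiple of the subject's median breath interval and an absolute minimum duration; the thresholds are illustrative assumptions.

      import numpy as np

      def apnea_candidates(breath_onsets, factor=2.5, min_gap=10.0):
          """breath_onsets: times (s) of detected respiratory-cycle onsets."""
          onsets = np.asarray(breath_onsets, dtype=float)
          intervals = np.diff(onsets)
          threshold = max(factor * float(np.median(intervals)), min_gap)
          events = []
          for start, gap in zip(onsets[:-1], intervals):
              if gap > threshold:                      # no breath detected for too long
                  events.append((float(start), float(start + gap)))
          return events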

  12. Factors influencing variation in physician adenoma detection rates: a theory-based approach for performance improvement.

    Science.gov (United States)

    Atkins, Louise; Hunkeler, Enid M; Jensen, Christopher D; Michie, Susan; Lee, Jeffrey K; Doubeni, Chyke A; Zauber, Ann G; Levin, Theodore R; Quinn, Virginia P; Corley, Douglas A

    2016-03-01

    Interventions to improve physician adenoma detection rates for colonoscopy have generally not been successful, and there are little data on the factors contributing to variation that may be appropriate targets for intervention. We sought to identify factors that may influence variation in detection rates by using theory-based tools for understanding behavior. We separately studied gastroenterologists and endoscopy nurses at 3 Kaiser Permanente Northern California medical centers to identify potentially modifiable factors relevant to physician adenoma detection rate variability by using structured group interviews (focus groups) and theory-based tools for understanding behavior and eliciting behavior change: the Capability, Opportunity, and Motivation behavior model; the Theoretical Domains Framework; and the Behavior Change Wheel. Nine factors potentially associated with adenoma detection rate variability were identified, including 6 related to capability (uncertainty about which types of polyps to remove, style of endoscopy team leadership, compromised ability to focus during an examination due to distractions, examination technique during withdrawal, difficulty detecting certain types of adenomas, and examiner fatigue and pain), 2 related to opportunity (perceived pressure due to the number of examinations expected per shift and social pressure to finish examinations before scheduled breaks or the end of a shift), and 1 related to motivation (valuing a meticulous examination as the top priority). Examples of potential intervention strategies are provided. By using theory-based tools, this study identified several novel and potentially modifiable factors relating to capability, opportunity, and motivation that may contribute to adenoma detection rate variability and be appropriate targets for future intervention trials. Copyright © 2016 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  13. Direct inference of SNP heterozygosity rates and resolution of LOH detection.

    Directory of Open Access Journals (Sweden)

    Xiaohong Li

    2007-11-01

    Full Text Available Single nucleotide polymorphisms (SNPs) have been increasingly utilized to investigate somatic genetic abnormalities in premalignancy and cancer. Loss of heterozygosity (LOH) is a common alteration observed during cancer development, and SNP assays have been used to identify LOH at specific chromosomal regions. The design of such studies requires consideration of the resolution for detecting LOH throughout the genome and identification of the number and location of SNPs required to detect genetic alterations in specific genomic regions. Our study evaluated SNP distribution patterns and used probability models, Monte Carlo simulation, and real human subject genotype data to investigate the relationships between the number of SNPs, SNP heterozygosity (HET) rates, and the sensitivity (resolution) for detecting LOH. We report that the variances of SNP heterozygosity rates in dbSNP are high for a large proportion of SNPs. Two statistical methods proposed for directly inferring SNP heterozygosity rates require much smaller (intermediate) sample sizes and are feasible for practical use in SNP selection or verification. Using HapMap data, we showed that a region of LOH greater than 200 kb can be reliably detected, with losses smaller than 50 kb having a substantially lower detection probability when using all SNPs currently in the HapMap database. Higher densities of SNPs may exist in certain local chromosomal regions, providing some opportunity to reliably detect LOH in segments smaller than 50 kb. These results suggest that the interpretation of results from genome-wide scans for LOH using commercial arrays needs to consider the relationships among inter-SNP distance, detection probability, and sample size for a specific study. New experimental designs for LOH studies would also benefit from considering the power of detection and the sample sizes required to accomplish the proposed aims.

  14. A new islanding detection technique for multiple mini hydro based on rate of change of reactive power and load connecting strategy

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Bakar, A.H.A.; Karimi, M.

    2013-01-01

    Highlights: • The requirement of DG interconnection with the existing power system is discussed. • Various islanding detection techniques are discussed with their merits and demerits. • A new islanding detection strategy is proposed for multiple mini-hydro type DGs. • The proposed strategy is based on dq/dt and a load connecting strategy. • The effectiveness of the strategy is verified on various other cases. - Abstract: The interconnection of distributed generation (DG) into distribution networks is undergoing a rapid global expansion. It enhances the system's reliability while simultaneously reducing pollution problems related to the generation of electrical power. To fully utilize the benefits of DGs, certain technical issues need to be addressed. One of the most important issues in this context is islanding detection. This paper presents a new islanding detection technique that is suitable for multiple mini-hydro type DG units. The proposed strategy is based on the rate of change of reactive power and a load connecting strategy to detect islanding within the system. For a large power mismatch, islanding is detected by the rate of change of reactive power alone. However, for a close power mismatch, the rate of change of reactive power initiates a load connecting strategy, which in turn alters the load on the distribution network. This load variation in the distribution network causes a variation in the rate of change of reactive power, which is utilized to distinguish islanding from other events. The simulation results show that the proposed strategy is effective in detecting islanding occurrence in a distribution network.
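
    A minimal decision sketch of this two-stage logic, under assumed thresholds and with the load-connecting step represented by a caller-supplied callback; all names and numbers here are illustrative, not values from the paper.

      def islanding_decision(q_samples, dt, trip_threshold, probe_threshold, connect_probe_load):
          """q_samples: recent reactive-power measurements (var); dt: sample period (s)."""
          dq_dt = (q_samples[-1] - q_samples[-2]) / dt
          if abs(dq_dt) > trip_threshold:
              return "ISLANDING"            # large power mismatch: dq/dt alone is decisive
          if abs(dq_dt) > probe_threshold:
              connect_probe_load()          # close power mismatch: perturb the possible island
              return "PROBING"              # re-evaluate dq/dt once the load change takes effect
          return "GRID-CONNECTED"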

  15. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    Science.gov (United States)

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. Firstly, the spectrum of each observation is transformed using the discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. The value of Hotelling's T² statistic is calculated and used to detect outliers. A contamination-event alarm is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
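
    A minimal sketch of the wavelet-plus-PCA monitoring chain described above; the libraries (numpy, PyWavelets), the 'coif1' wavelet, and the component count are assumptions, and the T² control limit is left to the caller.

      import numpy as np
      import pywt

      def wavelet_features(spectrum, wavelet="coif1", level=3):
          # Concatenate all wavelet coefficient levels into one feature vector.
          coeffs = pywt.wavedec(np.asarray(spectrum, dtype=float), wavelet, level=level)
          return np.concatenate(coeffs)

      def t2_statistic(baseline_spectra, new_spectrum, n_components=3):
          """Hotelling's T2 of a new spectrum against a PCA model of baseline spectra."""
          X = np.array([wavelet_features(s) for s in baseline_spectra])
          mu = X.mean(axis=0)
          U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
          P = Vt[:n_components].T                          # principal directions
          lam = (S[:n_components] ** 2) / (len(X) - 1)     # component variances
          t = (wavelet_features(new_spectrum) - mu) @ P    # scores of the new observation
          return float(np.sum(t ** 2 / lam))               # compare against a T2 control limit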

  16. Rates of medical errors and preventable adverse events among hospitalized children following implementation of a resident handoff bundle.

    Science.gov (United States)

    Starmer, Amy J; Sectish, Theodore C; Simon, Dennis W; Keohane, Carol; McSweeney, Maireade E; Chung, Erica Y; Yoon, Catherine S; Lipsitz, Stuart R; Wassner, Ari J; Harper, Marvin B; Landrigan, Christopher P

    2013-12-04

    Handoff miscommunications are a leading cause of medical errors. Studies comprehensively assessing handoff improvement programs are lacking. To determine whether introduction of a multifaceted handoff program was associated with reduced rates of medical errors and preventable adverse events, fewer omissions of key data in written handoffs, improved verbal handoffs, and changes in resident-physician workflow. Prospective intervention study of 1255 patient admissions (642 before and 613 after the intervention) involving 84 resident physicians (42 before and 42 after the intervention) from July-September 2009 and November 2009-January 2010 on 2 inpatient units at Boston Children's Hospital. Resident handoff bundle, consisting of standardized communication and handoff training, a verbal mnemonic, and a new team handoff structure. On one unit, a computerized handoff tool linked to the electronic medical record was introduced. The primary outcomes were the rates of medical errors and preventable adverse events measured by daily systematic surveillance. The secondary outcomes were omissions in the printed handoff document and resident time-motion activity. Medical errors decreased from 33.8 per 100 admissions (95% CI, 27.3-40.3) to 18.3 per 100 admissions (95% CI, 14.7-21.9; P < .001), and preventable adverse events decreased from 3.3 per 100 admissions (95% CI, 1.7-4.8) to 1.5 (95% CI, 0.51-2.4) per 100 admissions (P = .04) following the intervention. There were fewer omissions of key handoff elements on printed handoff documents, especially on the unit that received the computerized handoff tool (significant reductions of omissions in 11 of 14 categories with computerized tool; significant reductions in 2 of 14 categories without computerized tool). Physicians spent a greater percentage of time in a 24-hour period at the patient bedside after the intervention than before (10.6%; 95% CI, 9.2%-12.2% vs 8.3%; 95% CI, 7.1%-9.8%; P = .03). The average duration of verbal

  17. Event rate and reaction time performance in ADHD: Testing predictions from the state regulation deficit hypothesis using an ex-Gaussian model.

    Science.gov (United States)

    Metin, Baris; Wiersema, Jan R; Verguts, Tom; Gasthuys, Roos; van Der Meere, Jacob J; Roeyers, Herbert; Sonuga-Barke, Edmund

    2014-12-06

    According to the state regulation deficit (SRD) account, ADHD is associated with a problem using effort to maintain an optimal activation state under demanding task settings such as very fast or very slow event rates. This leads to a prediction of disrupted performance at event rate extremes reflected in higher Gaussian response variability that is a putative marker of activation during motor preparation. In the current study, we tested this hypothesis using ex-Gaussian modeling, which distinguishes Gaussian from non-Gaussian variability. Twenty-five children with ADHD and 29 typically developing controls performed a simple Go/No-Go task under four different event-rate conditions. There was an accentuated quadratic relationship between event rate and Gaussian variability in the ADHD group compared to the controls. The children with ADHD had greater Gaussian variability at very fast and very slow event rates but not at moderate event rates. The results provide evidence for the SRD account of ADHD. However, given that this effect did not explain all group differences (some of which were independent of event rate) other cognitive and/or motivational processes are also likely implicated in ADHD performance deficits.
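
    A minimal sketch of the ex-Gaussian decomposition used here, relying on scipy's exponnorm distribution (parameterized by K = tau/sigma); the synthetic reaction times are placeholders for one subject-by-event-rate condition.

      import numpy as np
      from scipy.stats import exponnorm

      def exgauss_params(rts):
          """Fit an ex-Gaussian to reaction times; return mu, sigma (Gaussian) and tau (exponential)."""
          K, loc, scale = exponnorm.fit(np.asarray(rts))
          return {"mu": loc, "sigma": scale, "tau": K * scale}

      rts = exponnorm.rvs(1.5, loc=0.35, scale=0.05, size=500, random_state=0)  # synthetic RTs (s)
      print(exgauss_params(rts))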

  18. Exploring similarities and differences in hospital adverse event rates between Norway and Sweden using Global Trigger Tool.

    Science.gov (United States)

    Deilkås, Ellen Tveter; Risberg, Madeleine Borgstedt; Haugen, Marion; Lindstrøm, Jonas Christoffer; Nylén, Urban; Rutberg, Hans; Michael, Soop

    2017-03-20

    In this paper, we explore similarities and differences in hospital adverse event (AE) rates between Norway and Sweden by reviewing medical records with the Global Trigger Tool (GTT). All acute care hospitals in both countries performed medical record reviews, except one in Norway. Records were randomly selected from all eligible admissions in 2013. Eligible admissions were patients 18 years of age or older, undergoing care with an in-hospital stay of at least 24 hours, excluding psychiatric care and rehabilitation. Reviews were done according to GTT methodology. Similar contexts for healthcare and similar socioeconomic and demographic characteristics have inspired the Nordic countries to exchange experiences from measuring and monitoring quality and patient safety in healthcare. The co-operation has promoted the use of GTT to monitor national and local rates of AEs in hospital care. 10 986 medical records were reviewed in Norway and 19 141 medical records in Sweden. No significant difference in overall AE rates was found between the two countries. The rate was 13.0% (95% CI 11.7% to 14.3%) in Norway and 14.4% (95% CI 12.6% to 16.3%) in Sweden. There were significantly higher AE rates of surgical complications in Norwegian hospitals compared with Swedish hospitals. Swedish hospitals had significantly higher rates of pressure ulcers, falls and 'other' AEs. Among more severe AEs, Norwegian hospitals had significantly higher rates of surgical complications than Swedish hospitals. Swedish hospitals had significantly higher rates of postpartum AEs. The level of patient safety in acute care hospitals, as assessed by GTT, was essentially the same in both countries. The differences between the countries in the rates of several types of AEs provide new incentives for Norwegian and Swedish governing bodies to address patient safety issues. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please

  19. Mining the IPTV Channel Change Event Stream to Discover Insight and Detect Ads

    Directory of Open Access Journals (Sweden)

    Matej Kren

    2016-01-01

    Full Text Available IPTV has been widely deployed throughout the world, bringing significant advantages to users in terms of the channel offering, video on demand, and interactive applications. One aspect that has been often neglected is the ability of precise and unobtrusive telemetry. TV set-top boxes that are deployed in modern IPTV systems can be thought of as capable sensor nodes that collect vast amounts of data, representing both the user activity and the quality of service delivered by the system itself. In this paper we focus on the user-generated events and analyze how the data stream of channel change events received from the entire IPTV network can be mined to obtain insight about the content. We demonstrate that it is possible to predict the occurrence of TV ads with high probability and show that the approach could be extended to model the user behavior and classify the viewership in multiple dimensions.
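
    As an illustration of mining the channel-change stream (an assumed simplification, not the paper's model), the sketch below bins change events for one channel and flags bins whose count is an outlier against a rolling baseline, since ad breaks tend to produce synchronized surges of zapping; the bin width and z-threshold are assumptions.

      import numpy as np

      def ad_candidate_bins(event_times, bin_s=10, window_bins=60, z=4.0):
          """event_times: timestamps (s) of channel-change events away from one channel."""
          t = np.asarray(event_times, dtype=float)
          bins = np.bincount(((t - t.min()) // bin_s).astype(int))
          flagged = []
          for i in range(window_bins, len(bins)):
              base = bins[i - window_bins:i]                 # rolling baseline of recent bins
              mu, sd = base.mean(), base.std() or 1.0
              if (bins[i] - mu) / sd > z:                    # synchronized surge of zapping
                  flagged.append(i)
          return flagged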

  20. Detection and Modeling of Non-Tidal Oceanic Effects on the Earth's Rotation Rate

    Science.gov (United States)

    Marcus, S. L.; Chao, Y.; Dickey, J. O.; Gegout, P.

    1998-01-01

    Sub-decadal changes in the Earth's rotation rate, and hence in the length-of-day (LOD), are largely controlled by variations in atmospheric angular momentum. Results from two oceanic general circulation models (OGCMs), forced by observed wind stress and heat flux for the years 1992-1994, show that ocean current and mass distribution changes also induce detectable LOD variations.

  1. Leadership training to improve adenoma detection rate in screening colonoscopy: A randomised trial

    NARCIS (Netherlands)

    M.F. Kaminski (Michal); J. Anderson (John); R.M. Valori (Roland ); E. Kraszewska (Ewa); M. Rupinski (Maciej); J. Pachlewski (Jacek); E. Wronska (Ewa); M. Bretthauer (Michael); S. Thomas-Gibson (Siwan); E.J. Kuipers (Ernst); J. Regula (J.)

    2016-01-01

    textabstractObjective Suboptimal adenoma detection rate (ADR) at colonoscopy is associated with increased risk of interval colorectal cancer. It is uncertain how ADR might be improved. We compared the effect of leadership training versus feedback only on colonoscopy quality in a countrywide

  2. The association of incidentally detected heart valve calcification with future cardiovascular events

    OpenAIRE

    Gondrie, Martijn J. A.; van der Graaf, Yolanda; Jacobs, Peter C.; Oen, Ay L.; Mali, Willem P. Th. M.

    2010-01-01

    Objectives This study aims to investigate the prognostic value of incidental aortic valve calcification (AVC), mitral valve calcification (MVC) and mitral annular calcification (MAC) for cardiovascular events and non-rheumatic valve disease in particular on routine diagnostic chest CT. Methods The study followed a case-cohort design. 10410 patients undergoing chest CT were followed for a median period of 17 months. Patients referred for cardiovascular disease were excluded. A random sample of...

  3. Passive Visual Analytics of Social Media Data for Detection of Unusual Events

    OpenAIRE

    Rustagi, Kush; Chae, Junghoon

    2016-01-01

    Now that social media sites have gained substantial traction, huge amounts of un-analyzed valuable data are being generated. Posts containing images and text have spatiotemporal data attached as well, having immense value for increasing situational awareness of local events, providing insights for investigations and understanding the extent of incidents, their severity, and consequences, as well as their time-evolving nature. However, the large volume of unstructured social media data hinders...

  4. Enriched encoding: reward motivation organizes cortical networks for hippocampal detection of unexpected events.

    Science.gov (United States)

    Murty, Vishnu P; Adcock, R Alison

    2014-08-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical-hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions-a potentially unique contribution of the hippocampus to reward learning. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Detection of explosive cough events in audio recordings by internal sound analysis.

    Science.gov (United States)

    Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P

    2017-07-01

    We present a new method for the discrimination of explosive cough events, which is based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Besides, this feature set seems to generalize well when it is trained on a small data set of patients, with a variety of respiratory and cardiovascular diseases, and tested on a bigger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved in those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.

  6. Adding Endoscopist-Directed Flexible Endoscopic Evaluation of Swallowing to the Videofluoroscopic Swallowing Study Increased the Detection Rates of Penetration, Aspiration, and Pharyngeal Residue.

    Science.gov (United States)

    Park, Won Young; Lee, Tae Hee; Ham, Nam Seok; Park, Ji Woong; Lee, Yang Gyun; Cho, Sang Jin; Lee, Joon Seong; Hong, Su Jin; Jeon, Seong Ran; Kim, Hyun Gun; Cho, Joo Young; Kim, Jin Oh; Cho, Jun Hyung; Lee, Ji Sung

    2015-09-23

    Currently, the videofluoroscopic swallowing study (VFSS) is the standard tool for evaluating dysphagia. We evaluated whether the addition of endoscopist-directed flexible endoscopic evaluation of swallowing (FEES) to VFSS could improve the detection rates of penetration, aspiration, and pharyngeal residue; compared the diagnostic efficacy of VFSS and endoscopist-directed FEES; and assessed the adverse events of FEES. In a single tertiary referral center, a retrospective analysis of prospectively collected data was conducted. Fifty consecutive patients suspected of oropharyngeal dysphagia were enrolled in this study between January 2012 and July 2012. The agreement in the detection of penetration and aspiration between VFSS and FEES of viscous food (κ=0.34; 95% confidence interval [CI], 0.15 to 0.53) and liquid food (κ=0.22; 95% CI, 0.02 to 0.42) was "fair." The agreement in the detection of pharyngeal residue between the two tests was "substantial" with viscous food (κ=0.63; 95% CI, 0.41 to 0.94) and "fair" with liquid food (κ=0.37; 95% CI, 0.10 to 0.63). Adding FEES to VFSS significantly increased the detection rates of penetration, aspiration, and pharyngeal residue. No severe adverse events were noted during FEES, except for two cases of epistaxis, which stopped spontaneously without requiring any packing. This study demonstrated that the addition of endoscopist-directed FEES to VFSS increased the detection rates of penetration, aspiration, and pharyngeal residue.

  7. Low-Rate DDoS Attack Detection Using Expectation of Packet Size

    Directory of Open Access Journals (Sweden)

    Lu Zhou

    2017-01-01

    Full Text Available Low-rate Distributed Denial-of-Service (low-rate DDoS) attacks are a new challenge to cyberspace, as the attackers send a large amount of attack packets, similar to normal traffic, to throttle legitimate flows. In this paper, we propose a measurement, the expectation of packet size, that is based on the distribution difference of packet sizes to distinguish two typical low-rate DDoS attacks, the constant attack and the pulsing attack, from legitimate traffic. The experimental results, obtained using a series of real datasets collected at different times and with different tolerance factors, are presented to demonstrate the effectiveness of the proposed measurement. In addition, extensive experiments are performed to show that the proposed measurement can detect low-rate DDoS attacks not only in the short and long terms but also for low and high packet rates. Furthermore, the false-negative rates and the adjudication distance can be adjusted based on the detection sensitivity requirements.
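
    A minimal sketch of the expectation-of-packet-size idea; the window length and tolerance are illustrative assumptions, and the baseline is assumed to come from an attack-free warm-up period.

      import numpy as np

      def detect_windows(packet_sizes, window=500, baseline_mean=None, tol=0.25):
          """packet_sizes: observed packet sizes (bytes) in arrival order; returns alarmed window starts."""
          sizes = np.asarray(packet_sizes, dtype=float)
          if baseline_mean is None:
              baseline_mean = sizes[:window].mean()          # assume an attack-free warm-up
          alarms = []
          for start in range(0, len(sizes) - window, window):
              e = sizes[start:start + window].mean()         # expectation of packet size in this window
              if abs(e - baseline_mean) / baseline_mean > tol:
                  alarms.append(start)
          return alarms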

  8. A visible light imaging device for cardiac rate detection with reduced effect of body movement

    Science.gov (United States)

    Jiang, Xiaotian; Liu, Ming; Zhao, Yuejin

    2014-09-01

    A visible light imaging system to detect the human cardiac rate is proposed in this paper. A color camera and several LEDs, acting as the lighting source, were used to avoid interference from ambient light. The cardiac rate could be acquired from the subject's forehead based on photoplethysmography (PPG) theory. A template matching method was applied after video capture. The video signal was decomposed into three channels (RGB), a region of interest was chosen, and its average gray value was computed. The green channel provides an excellent pulse waveform owing to blood's strong absorption of green light. Through the fast Fourier transform, the cardiac rate was accurately obtained. Beyond accurate cardiac-rate estimation, the template matching method reduces the effects of body movement to a large extent, so the pulse wave can be detected even while subjects are moving and the waveform is substantially improved. Several experiments were conducted on volunteers, and the results agree closely with those obtained from a finger-clip pulse oximeter. This method of detecting the cardiac rate and the pulse wave largely reduces the effects of body movement and could see wide use in the future.
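
    A minimal sketch of the green-channel PPG pipeline; the template-matching stabilization is omitted, and the region of interest, frame source, and cardiac band limits are assumptions.

      import numpy as np

      def heart_rate_bpm(frames, roi, fps):
          """frames: iterable of HxWx3 RGB arrays; roi: (y0, y1, x0, x1) forehead region; fps: frame rate."""
          y0, y1, x0, x1 = roi
          g = np.array([f[y0:y1, x0:x1, 1].mean() for f in frames])   # mean green value per frame
          g = g - g.mean()                                            # remove DC component
          spectrum = np.abs(np.fft.rfft(g))
          freqs = np.fft.rfftfreq(len(g), d=1.0 / fps)
          band = (freqs > 0.7) & (freqs < 3.5)                        # plausible cardiac band, ~42-210 bpm
          return 60.0 * freqs[band][np.argmax(spectrum[band])]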

  9. Simulation of overpressure events with a Laguna Verde model for the RELAP code to conditions of extended power up rate

    International Nuclear Information System (INIS)

    Rodriguez H, A.; Araiza M, E.; Fuentes M, L.; Ortiz V, J.

    2012-10-01

    In this work, the main results of simulations of overpressure events are presented, using a model of the Laguna Verde nuclear power plant developed for the RELAP/SCDAPSIM code. The starting point is a Laguna Verde model representing a steady state at conditions similar to operation of the plant at the Extended Power Uprate (EPU). The simulated pressure transients are compared with those documented in the Final Safety Analysis Report (FSAR) of Laguna Verde. Results are shown for the turbine trip transient, with and without main-turbine bypass, and for the closure of all main steam isolation valves. A preliminary simulation was performed and, based on its results, some adjustments were made for operation at the EPU, taking into account the plant's Operational Technical Specifications. The final simulation results were compared with, and analyzed against, the content of the FSAR. The plant response to the transients, as reflected in the RELAP model, was satisfactory. Finally, comments on possible improvements to the model are included, for example regarding the response times of the plant's protection and mitigation systems. (Author)

  10. Short Personality and Life Event scale for detection of suicide attempters.

    Science.gov (United States)

    Artieda-Urrutia, Paula; Delgado-Gómez, David; Ruiz-Hernández, Diego; García-Vega, Juan Manuel; Berenguer, Nuria; Oquendo, Maria A; Blasco-Fontecilla, Hilario

    2015-01-01

    To develop a brief and reliable psychometric scale to identify individuals at risk for suicidal behaviour. Case-control study. 182 individuals (61 suicide attempters, 57 psychiatric controls, and 64 psychiatrically healthy controls) aged 18 or older, admitted to the Emergency Department at Puerta de Hierro University Hospital in Madrid, Spain. All participants completed a form including their socio-demographic and clinical characteristics, and the Personality and Life Events scale (27 items). To assess Axis I diagnoses, all psychiatric patients (including suicide attempters) were administered the Mini International Neuropsychiatric Interview. Descriptive statistics were computed for the socio-demographic factors. Additionally, χ² independence tests were applied to evaluate differences in socio-demographic and clinical variables, and the Personality and Life Events scale between groups. A stepwise linear regression with backward variable selection was conducted to build the Short Personality Life Event (S-PLE) scale. In order to evaluate the accuracy, a ROC analysis was conducted. The internal reliability was assessed using Cronbach's α, and the external reliability was evaluated using a test-retest procedure. The S-PLE scale, composed of just 6 items, showed good performance in discriminating between medical controls, psychiatric controls and suicide attempters in an independent sample. For instance, the S-PLE scale discriminated between past suicide and past non-suicide attempters with sensitivity of 80% and specificity of 75%. The area under the ROC curve was 88%. A factor analysis extracted only one factor, revealing a single dimension of the S-PLE scale. Furthermore, the S-PLE scale provides values of internal and external reliability between poor (test-retest: 0.55) and acceptable (Cronbach's α: 0.65) ranges. Administration time is about one minute. The S-PLE scale is a useful and accurate instrument for estimating the risk of suicidal behaviour in

  11. MEDUSA: Mining Events to Detect Undesirable uSer Actions in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, D.; Bolzoni, D.; Hartel, Pieter H.; Jha, Somesh; Sommer, Robin; Kreibich, Christian

    2010-01-01

    Standard approaches for detecting malicious behaviors, e.g. monitoring network traffic, cannot address process-related threats in SCADA(Supervisory Control And Data Acquisition) systems. These threats take place when an attacker gains user access rights and performs actions which look legitimate,

  12. Extranuclear detection of histones and nucleosomes in activated human lymphoblasts as an early event in apoptosis.

    NARCIS (Netherlands)

    Gabler, C.; Blank, N.; Hieronymus, T.; Schiller, M.; Berden, J.H.M.; Kalden, J.R.; Lorenz, H.M.

    2004-01-01

    OBJECTIVE: To evaluate the presence of histones and nucleosomes in cell lysates of freshly isolated peripheral blood mononuclear cells (PBMC), fully activated lymphoblasts, or lymphoblasts after induction of apoptosis. METHODS: Each histone class (H1, H2A, H2B, H3, and H4) was detected by western

  13. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    Science.gov (United States)

    2016-04-25


  14. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    Directory of Open Access Journals (Sweden)

    Junho Ahn

    2016-05-01

    Full Text Available We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period.

  15. Exploring design tradeoffs of a distributed algorithm for cosmic ray event detection

    Science.gov (United States)

    Yousaf, S.; Bakhshi, R.; van Steen, M.; Voulgaris, S.; Kelley, J. L.

    2013-03-01

    Many sensor networks, including large particle detector arrays measuring high-energy cosmic-ray air showers, traditionally rely on centralised trigger algorithms to find spatial and temporal coincidences of individual nodes. Such schemes suffer from scalability problems, especially if the nodes communicate wirelessly or have bandwidth limitations. However, nodes which instead communicate with each other can, in principle, use a distributed algorithm to find coincident events themselves without communication with a central node. We present such an algorithm and consider various design tradeoffs involved, in the context of a potential trigger for the Auger Engineering Radio Array (AERA).
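
    A minimal sketch of the coincidence-finding step a node could run over gossiped hit timestamps (not the AERA trigger itself; the window and node-count threshold are assumptions):

      def coincidences(hits, window=1e-6, min_nodes=3):
          """hits: list of (timestamp, node_id) tuples exchanged between neighbouring nodes."""
          hits = sorted(hits)                            # sort by timestamp
          found = []
          i = 0
          for j in range(len(hits)):
              while hits[j][0] - hits[i][0] > window:    # slide the window's left edge
                  i += 1
              nodes = {node for _, node in hits[i:j + 1]}
              if len(nodes) >= min_nodes:                # enough distinct nodes fired close in time
                  found.append((hits[i][0], hits[j][0], len(nodes)))
          return found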

  16. Analyzing direct dark matter detection data with unrejected background events by the AMIDAS website

    International Nuclear Information System (INIS)

    Shan, Chung-Lin

    2012-01-01

    In this talk I have presented the data analysis results of extracting properties of halo WIMPs: the mass and the (ratios between the) spin-independent and spin-dependent couplings/cross sections on nucleons by the AMIDAS website, taking into account possible unrejected background events in the analyzed data sets. Although a non-standard astronomical setup has been used to generate pseudodata sets for our analyses, it has been found that, without prior information/assumption about the local density and velocity distribution of halo Dark Matter, these WIMP properties have been reconstructed with ∼2% to ≲30% deviations from the input values.

  17. Detecting subsurface fluid leaks in real-time using injection and production rates

    Science.gov (United States)

    Singh, Harpreet; Huerta, Nicolas J.

    2017-12-01

    CO2 injection into geologic formations for either enhanced oil recovery or carbon storage introduces a risk for undesired fluid leakage into overlying groundwater or to the surface. Despite decades of subsurface CO2 production and injection, the technologies and methods for detecting CO2 leaks are still costly and prone to large uncertainties. This is especially true for pressure-based monitoring methods, which require the use of simplified geological and reservoir flow models to simulate the pressure behavior as well as background noise affecting pressure measurements. In this study, we propose a method to detect the time and volume of fluid leakage based on real-time measurements of well injection and production rates. The approach utilizes analogies between fluid flow and capacitance-resistance modeling. Unlike other leak detection methods (e.g. pressure-based), the proposed method does not require geological and reservoir flow models to simulate the behavior that often carry significant sources of uncertainty; therefore, with our approach the leak can be detected with greater certainty. The method can be applied to detect when a leak begins by tracking a departure in fluid production rate from the expected pattern. The method has been tuned to detect the effect of boundary conditions and fluid compressibility on leakage. To highlight the utility of this approach we use our method to detect leaks for two scenarios. The first scenario simulates a fluid leak from the storage formation into an above-zone monitoring interval. The second scenario simulates intra-reservoir migration between two compartments. We illustrate this method to detect fluid leakage in three different reservoirs with varying levels of geological and structural complexity. The proposed leakage detection method has three novelties: i) requires only readily-available data (injection and production rates), ii) accounts for fluid compressibility and boundary effects, and iii) in addition to
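
    A minimal capacitance-resistance-style sketch of the departure test (an illustration under assumed parameters, not the authors' formulation); tau, gain, and the threshold volume would be fitted or chosen from a leak-free history.

      import numpy as np

      def leak_onset(injection, production, dt, tau, gain, threshold_volume):
          """injection/production: rate series (same units); returns first index where a leak is indicated."""
          q_hat = np.empty(len(production), dtype=float)
          q_hat[0] = production[0]
          decay = np.exp(-dt / tau)
          missing_volume = 0.0
          for t in range(1, len(production)):
              q_hat[t] = q_hat[t - 1] * decay + (1.0 - decay) * gain * injection[t]   # expected production
              missing_volume += (q_hat[t] - production[t]) * dt                       # unaccounted-for fluid
              if missing_volume > threshold_volume:
                  return t
          return None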

  18. The Tendency of the Crest Factor Helps Detect Nascent Events; Electronic Circuit, Software and Applications to Signals from Diverse Fields

    Directory of Open Access Journals (Sweden)

    Núñez-Pérez Ricardo Francisco

    2014-04-01

    Full Text Available Within signal-analysis techniques in the time domain, the crest factor (CF) is undoubtedly one of the simplest and fastest to implement using electronic circuits and/or software, which is why it has been used reliably for machinery monitoring and for evaluating power-supply quality. One of the major manufacturers of instruments for these purposes, Bruel and Kjaer, defines the crest factor of a repetitive voltage or current signal as the ratio of its peak level to its rms value over a certain period of time. In this paper, we experimentally explore the potential of the CF and its trend for detecting the onset and evolution of events in various fields of knowledge, either by generating it with a developed electronic circuit or by computing it with routines implemented in the programs DADISP and LabVIEW. The results for all of the above factors and trends are validated and checked against the proposed features and specifications through a comparison between them. The results were acceptable, so the tools were applied to detect early faults in electrical machines, to identify differences in chaotic behavior between circuits exhibiting such dynamics, to detect abnormal respiratory distress or rales in patients, and to detect harmful distortions in electrical current, all based on simulations and measurements for each of the 4 cases studied. Other original applications of the CF are proposed: (a) controlling chaos in electronic circuits that stir/mix industrial processes and (b) correcting the power factor of non-linear and inductive loads. A medium-term study of a CF that considers the maximum peak-to-peak signal level is contemplated, as it is thought that it could improve event detection.
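
    The crest factor itself reduces to a peak-to-RMS ratio per analysis window, and the "tendency" is its slope over recent windows; a minimal sketch follows, with the window length and slope threshold as assumptions.

      import numpy as np

      def crest_factor_trend(signal, fs, window_s=1.0, slope_threshold=0.05):
          """Return per-window crest factors, their trend slope, and whether the trend looks rising."""
          n = int(window_s * fs)
          cfs = []
          for start in range(0, len(signal) - n, n):
              w = np.asarray(signal[start:start + n], dtype=float)
              cfs.append(np.max(np.abs(w)) / np.sqrt(np.mean(w ** 2)))   # CF = peak / RMS
          slope = np.polyfit(np.arange(len(cfs)), cfs, 1)[0]             # linear trend of the CF series
          return cfs, slope, slope > slope_threshold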

  19. ON THE DETECTABILITY OF A PREDICTED MESOLENSING EVENT ASSOCIATED WITH THE HIGH PROPER MOTION STAR VB 10

    International Nuclear Information System (INIS)

    Lépine, Sébastien; DiStefano, Rosanne

    2012-01-01

    Extrapolation of the astrometric motion of the nearby low-mass star VB 10 indicates that sometime in late 2011 December or during the first 2-3 months of 2012, the star will make a close approach to a background point source. Based on astrometric uncertainties, we estimate a 1 in 2 chance that the distance of closest approach ρ min will be less than 100 mas, a 1 in 5 chance that ρ min min J planet on a moderately wide (≈0.18 AU–0.84 AU) orbit, there is a chance (1% to more than 10%, depending on the distance of closest approach and orbital period and inclination) that a passage of the planet closer to the background source will result in a secondary event of higher magnification. The detection of secondary events could be made possible with a several-times-per-night multi-site monitoring campaign.

  20. Cardiovascular disease (CVD) and chronic kidney disease (CKD) event rates in HIV-positive persons at high predicted CVD and CKD risk

    DEFF Research Database (Denmark)

    Boyd, Mark A; Mocroft, Amanda; Ryom, Lene

    2017-01-01

    BACKGROUND: The Data Collection on Adverse Events of Anti-HIV Drugs (D:A:D) study has developed predictive risk scores for cardiovascular disease (CVD) and chronic kidney disease (CKD, defined as confirmed estimated glomerular filtration rate [eGFR] ≤ 60 ml/min/1.73 m2) events in HIV...

  1. 5 CFR Appendix A to Subpart F of... - List of Events for Which Inclusion of NAFI Service May Affect the Rate of Annuity Payable

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false List of Events for Which Inclusion of... of Part 847—List of Events for Which Inclusion of NAFI Service May Affect the Rate of Annuity Payable... of annuity. CSRS disability retirement Commencing date of annuity. 1 FERS disability retirement First...

  2. Does Prison Crowding Predict Higher Rates of Substance Use Related Parole Violations? A Recurrent Events Multi-Level Survival Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A Ruderman

    Full Text Available This administrative data-linkage cohort study examines the association between prison crowding and the rate of post-release parole violations in a random sample of prisoners released with parole conditions in California, for an observation period of two years (January 2003 through December 2004). Crowding overextends prison resources needed to adequately protect inmates and provide drug rehabilitation services. Violence and lack of access to treatment are known risk factors for drug use and substance use disorders. These and other psychosocial effects of crowding may lead to higher rates of recidivism in California parolees. Rates of parole violation for parolees exposed to high and medium levels of prison crowding were compared to parolees with low prison crowding exposure. Hazard ratios (HRs) with 95% confidence intervals (CIs) were estimated using a Cox model for recurrent events. Our dataset included 13070 parolees in California, combining individual level parolee data with aggregate level crowding data for multilevel analysis. Comparing parolees exposed to high crowding with those exposed to low crowding, the effect sizes from greatest to least were absconding violations (HR 3.56 95% CI: 3.05-4.17), drug violations (HR 2.44 95% CI: 2.00-2.98), non-violent violations (HR 2.14 95% CI: 1.73-2.64), violent and serious violations (HR 1.88 95% CI: 1.45-2.43), and technical violations (HR 1.86 95% CI: 1.37-2.53). Prison crowding predicted higher rates of parole violations after release from prison. The effect was magnitude-dependent and particularly strong for drug charges. Further research into whether adverse prison experiences, such as crowding, are associated with recidivism and drug use in particular may be warranted.

  3. Does Prison Crowding Predict Higher Rates of Substance Use Related Parole Violations? A Recurrent Events Multi-Level Survival Analysis.

    Science.gov (United States)

    Ruderman, Michael A; Wilson, Deirdra F; Reid, Savanna

    2015-01-01

    This administrative data-linkage cohort study examines the association between prison crowding and the rate of post-release parole violations in a random sample of prisoners released with parole conditions in California, for an observation period of two years (January 2003 through December 2004). Crowding overextends prison resources needed to adequately protect inmates and provide drug rehabilitation services. Violence and lack of access to treatment are known risk factors for drug use and substance use disorders. These and other psychosocial effects of crowding may lead to higher rates of recidivism in California parolees. Rates of parole violation for parolees exposed to high and medium levels of prison crowding were compared to parolees with low prison crowding exposure. Hazard ratios (HRs) with 95% confidence intervals (CIs) were estimated using a Cox model for recurrent events. Our dataset included 13070 parolees in California, combining individual level parolee data with aggregate level crowding data for multilevel analysis. Comparing parolees exposed to high crowding with those exposed to low crowding, the effect sizes from greatest to least were absconding violations (HR 3.56 95% CI: 3.05-4.17), drug violations (HR 2.44 95% CI: 2.00-2.98), non-violent violations (HR 2.14 95% CI: 1.73-2.64), violent and serious violations (HR 1.88 95% CI: 1.45-2.43), and technical violations (HR 1.86 95% CI: 1.37-2.53). Prison crowding predicted higher rates of parole violations after release from prison. The effect was magnitude-dependent and particularly strong for drug charges. Further research into whether adverse prison experiences, such as crowding, are associated with recidivism and drug use in particular may be warranted.

  4. Does Prison Crowding Predict Higher Rates of Substance Use Related Parole Violations? A Recurrent Events Multi-Level Survival Analysis

    Science.gov (United States)

    Ruderman, Michael A.; Wilson, Deirdra F.; Reid, Savanna

    2015-01-01

    Objective This administrative data-linkage cohort study examines the association between prison crowding and the rate of post-release parole violations in a random sample of prisoners released with parole conditions in California, for an observation period of two years (January 2003 through December 2004). Background Crowding overextends prison resources needed to adequately protect inmates and provide drug rehabilitation services. Violence and lack of access to treatment are known risk factors for drug use and substance use disorders. These and other psychosocial effects of crowding may lead to higher rates of recidivism in California parolees. Methods Rates of parole violation for parolees exposed to high and medium levels of prison crowding were compared to parolees with low prison crowding exposure. Hazard ratios (HRs) with 95% confidence intervals (CIs) were estimated using a Cox model for recurrent events. Our dataset included 13070 parolees in California, combining individual level parolee data with aggregate level crowding data for multilevel analysis. Results Comparing parolees exposed to high crowding with those exposed to low crowding, the effect sizes from greatest to least were absconding violations (HR 3.56 95% CI: 3.05–4.17), drug violations (HR 2.44 95% CI: 2.00–2.98), non-violent violations (HR 2.14 95% CI: 1.73–2.64), violent and serious violations (HR 1.88 95% CI: 1.45–2.43), and technical violations (HR 1.86 95% CI: 1.37–2.53). Conclusions Prison crowding predicted higher rates of parole violations after release from prison. The effect was magnitude-dependent and particularly strong for drug charges. Further research into whether adverse prison experiences, such as crowding, are associated with recidivism and drug use in particular may be warranted. PMID:26492490

  5. DETECTION OF SUPERSONIC DOWNFLOWS AND ASSOCIATED HEATING EVENTS IN THE TRANSITION REGION ABOVE SUNSPOTS

    Energy Technology Data Exchange (ETDEWEB)

    Kleint, L.; Martínez-Sykora, J. [Bay Area Environmental Research Institute, 625 2nd Street, Ste. 209, Petaluma, CA (United States); Antolin, P. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Tian, H.; Testa, P.; Reeves, K. K.; McKillop, S.; Saar, S.; Golub, L. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Judge, P. [High Altitude Observatory/NCAR, P.O. Box 3000, Boulder, CO 80307 (United States); De Pontieu, B.; Wuelser, J. P.; Boerner, P.; Hurlburt, N.; Lemen, J.; Tarbell, T. D.; Title, A. [Lockheed Martin Solar and Astrophysics Laboratory, 3251 Hanover St., Org. ADBS, Bldg. 252, Palo Alto, CA 94304 (United States); Carlsson, M.; Hansteen, V. [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029, Blindern, NO-0315 Oslo (Norway); Jaeggli, S., E-mail: lucia.kleint@fhnw.ch [Department of Physics, Montana State University, Bozeman, P.O. Box 173840, Bozeman, MT 59717 (United States); and others

    2014-07-10

    Interface Region Imaging Spectrograph data allow us to study the solar transition region (TR) with an unprecedented spatial resolution of 0.″33. On 2013 August 30, we observed bursts of high Doppler shifts suggesting strong supersonic downflows of up to 200 km s⁻¹ and weaker, slightly slower upflows in the spectral lines Mg II h and k, C II 1336, Si IV 1394 Å, and 1403 Å, that are correlated with brightenings in the slitjaw images (SJIs). The bursty behavior lasts throughout the 2 hr observation, with average burst durations of about 20 s. The locations of these short-lived events appear to be the umbral and penumbral footpoints of EUV loops. Fast apparent downflows are observed along these loops in the SJIs and in the Atmospheric Imaging Assembly, suggesting that the loops are thermally unstable. We interpret the observations as cool material falling from coronal heights, and especially coronal rain produced along the thermally unstable loops, which leads to an increase of intensity at the loop footpoints, probably indicating an increase of density and temperature in the TR. The rain speeds are on the higher end of previously reported speeds for this phenomenon, and possibly higher than the free-fall velocity along the loops. On other observing days, similar bright dots are sometimes aligned into ribbons, resembling small flare ribbons. These observations provide a first insight into small-scale heating events in sunspots in the TR.

  6. NuSTAR Detection of X-Ray Heating Events in the Quiet Sun

    Science.gov (United States)

    Kuhar, Matej; Krucker, Säm; Glesener, Lindsay; Hannah, Iain G.; Grefenstette, Brian W.; Smith, David M.; Hudson, Hugh S.; White, Stephen M.

    2018-04-01

    The explanation of the coronal heating problem potentially lies in the existence of nanoflares, numerous small-scale heating events occurring across the whole solar disk. In this Letter, we present the first imaging spectroscopy X-ray observations of three quiet Sun flares during the Nuclear Spectroscopic Telescope ARray (NuSTAR) solar campaigns on 2016 July 26 and 2017 March 21, concurrent with the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) observations. Two of the three events showed time lags of a few minutes between peak X-ray and extreme ultraviolet emissions. Isothermal fits with rather low temperatures in the range 3.2–4.1 MK and emission measures of (0.6–15) × 10⁴⁴ cm⁻³ describe their spectra well, resulting in thermal energies in the range (2–6) × 10²⁶ erg. NuSTAR spectra did not show any signs of a nonthermal or higher temperature component. However, as the estimated upper limits of (hidden) nonthermal energy are comparable to the thermal energy estimates, the lack of a nonthermal component in the observed spectra is not a constraining result. The estimated Geostationary Operational Environmental Satellite (GOES) classes from the fitted values of temperature and emission measure fall between 1/1000 and 1/100 A class level, making them eight orders of magnitude fainter in soft X-ray flux than the largest solar flares.

  7. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence of a particular sequence of images containing a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
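
    As an illustration of the kind of background-subtraction rule mentioned above (and not the published program itself), the following minimal Python sketch flags a camera-trap still as a candidate event when a sufficient fraction of pixels differs strongly from a per-site reference image; the file names and both thresholds are illustrative assumptions.

      # Minimal sketch (not the published program): flag a camera-trap still as a
      # candidate event by differencing it against a per-site reference image.
      # File names and both thresholds are illustrative assumptions.
      import numpy as np
      from PIL import Image

      def load_gray(path):
          return np.asarray(Image.open(path).convert("L"), dtype=float)

      def is_candidate(image_path, background_path, diff_thresh=25.0, frac_thresh=0.02):
          img = load_gray(image_path)
          bg = load_gray(background_path)           # e.g. a median of empty frames from the same camera
          changed = np.abs(img - bg) > diff_thresh  # pixels differing strongly from the background
          frac = changed.mean()                     # fraction of changed pixels
          return frac > frac_thresh, frac

      flag, frac = is_candidate("site03_0412.jpg", "site03_background.jpg")
      print(f"candidate event: {flag} (changed-pixel fraction {frac:.3f})")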

  8. Evaluation of epidemic intelligence systems integrated in the early alerting and reporting project for the detection of A/H5N1 influenza events.

    Directory of Open Access Journals (Sweden)

    Philippe Barboza

    Full Text Available The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems' performance. Timeliness and sensitivity were rated highly, illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7–13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in systems' designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users.

  9. Evaluation of Epidemic Intelligence Systems Integrated in the Early Alerting and Reporting Project for the Detection of A/H5N1 Influenza Events

    Science.gov (United States)

    Barboza, Philippe; Vaillant, Laetitia; Mawudeku, Abla; Nelson, Noele P.; Hartley, David M.; Madoff, Lawrence C.; Linge, Jens P.; Collier, Nigel; Brownstein, John S.; Yangarber, Roman; Astagneau, Pascal; on behalf of the Early Alerting, Reporting Project of the Global Health Security Initiative

    2013-01-01

    The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems performance. Timeliness and sensitivity were rated highly illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7–13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in system's designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users. PMID:23472077

  10. Evaluation of epidemic intelligence systems integrated in the early alerting and reporting project for the detection of A/H5N1 influenza events.

    Science.gov (United States)

    Barboza, Philippe; Vaillant, Laetitia; Mawudeku, Abla; Nelson, Noele P; Hartley, David M; Madoff, Lawrence C; Linge, Jens P; Collier, Nigel; Brownstein, John S; Yangarber, Roman; Astagneau, Pascal

    2013-01-01

    The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems performance. Timeliness and sensitivity were rated highly illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7-13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in system's designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users.
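
    For readers who want to reproduce the kind of per-system metrics quoted above, the following minimal Python sketch computes sensitivity, positive predictive value, and F1-score from sets of detected versus reference events, together with the sensitivity of a "virtual combined system" taken as the union of all detections; the event identifiers are invented for illustration.

      # Minimal sketch: per-system detection metrics and a "virtual combined system"
      # built as the union of all signals. Event identifiers are invented.
      reference = {"E01", "E02", "E03", "E04", "E05", "E06", "E07", "E08"}  # confirmed A/H5N1 events
      systems = {
          "sysA": {"E01", "E02", "E05", "X10"},        # "X10" matches no reference event (false positive)
          "sysB": {"E02", "E03", "E06", "X11", "X12"},
      }

      def metrics(detected, reference):
          tp = len(detected & reference)
          sensitivity = tp / len(reference)
          ppv = tp / len(detected) if detected else 0.0
          f1 = 2 * ppv * sensitivity / (ppv + sensitivity) if (ppv + sensitivity) else 0.0
          return sensitivity, ppv, f1

      for name, detected in systems.items():
          sens, ppv, f1 = metrics(detected, reference)
          print(f"{name}: sensitivity={sens:.2f} ppv={ppv:.2f} f1={f1:.2f}")

      combined = set().union(*systems.values())        # virtual combined system
      print(f"combined sensitivity: {len(combined & reference) / len(reference):.2f}")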

  11. Device to detect the presence of a pure signal in a discrete noisy signal, measured at a constant average noise rate, with a probability of false detection lower than a predetermined value

    International Nuclear Information System (INIS)

    Poussier, E.; Rambaut, M.

    1986-01-01

    Detection consists of a measurement of a counting rate. A probability of false detection is associated with this counting rate and with an estimated average noise rate. Detection then consists in comparing this false-detection probability to a predetermined false-detection rate. The comparison can use tabulated values. Application is made to corpuscular radiation detection [fr]
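
    A minimal sketch of the comparison described in this record, under the common assumption of Poisson-distributed background counts (the record itself does not state the noise model): the probability that background alone produces at least the observed number of counts is computed and compared with a predetermined false-detection probability.

      # Minimal sketch: declare a detection only if background alone is unlikely to
      # produce the observed counts. Assumes Poisson-distributed background counts,
      # which the record does not state explicitly.
      from scipy.stats import poisson

      def detect(observed_counts, background_rate, live_time, max_false_prob=1e-3):
          expected_bg = background_rate * live_time              # mean background counts in the window
          p_false = poisson.sf(observed_counts - 1, expected_bg) # P(N >= observed | background only)
          return p_false <= max_false_prob, p_false

      decision, p = detect(observed_counts=20, background_rate=2.0, live_time=4.0)  # illustrative numbers
      print(f"detection: {decision}, chance probability: {p:.2e}")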

  12. Using additional external inputs to forecast water quality with an artificial neural network for contamination event detection in source water

    Science.gov (United States)

    Schmidt, F.; Liu, S.

    2016-12-01

    Source water quality plays an important role for the safety of drinking water and early detection of its contamination is vital to taking appropriate countermeasures. However, compared to drinking water, it is more difficult to detect contamination events because its environment is less controlled and numerous natural causes contribute to a high variability of the background values. In this project, Artificial Neural Networks (ANNs) and a Contamination Event Detection Process (CED Process) were used to identify events in river water. The ANN models the response of basic water quality sensors obtained in laboratory experiments in an off-line learning stage and continuously forecasts future values of the time line in an on-line forecasting step. During this second stage, the CED Process compares the forecast to the measured value and classifies it as regular background or event value, which modifies the ANN's continuous learning and influences its forecasts. In addition to this basic setup, external information is fed to the CED Process: A so-called Operator Input (OI) is provided to inform about unusual water quality levels that are unrelated to the presence of contamination, for example due to cooling water discharge from a nearby power plant. This study's primary goal is to evaluate how well the OI fits into the design of the combined forecasting ANN and CED Process and to understand its effects on the online forecasting stage. To test this, data from laboratory experiments conducted previously at the School of Environment, Tsinghua University, have been used to perform simulations highlighting features and drawbacks of this method. Applying the OI has been shown to have a positive influence on the ANN's ability to handle a sudden change in background values, which is unrelated to contamination. However, it might also mask the presence of an event, an issue that underlines the necessity to have several instances of the algorithm run in parallel. Other difficulties
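
    The following minimal sketch illustrates only the classification step described above, with a trivial moving-average forecaster standing in for the ANN; the threshold rule, window length, and synthetic values are assumptions, not the authors' implementation. It also shows how an operator-input flag can suppress event classification during a known, non-contamination disturbance.

      # Minimal sketch of the classification step only: compare a forecast with the
      # measurement and classify the sample as background or event; an operator
      # input flag suppresses event classification during known, non-contamination
      # disturbances. A moving average stands in for the ANN forecaster.
      import numpy as np

      def classify(history, measured, operator_input=False, k=3.0, window=10):
          recent = np.asarray(history[-window:])
          forecast = recent.mean()                  # stand-in for the ANN forecast
          sigma = recent.std() + 1e-9               # recent background variability
          if operator_input:
              return "background (operator-flagged condition)"
          return "event" if abs(measured - forecast) > k * sigma else "background"

      rng = np.random.default_rng(1)
      history = list(rng.normal(7.40, 0.05, 200))   # synthetic background values (e.g. pH)
      print(classify(history, measured=7.42))                       # ordinary background
      print(classify(history, measured=6.80))                       # large deviation -> event
      print(classify(history, measured=6.80, operator_input=True))  # suppressed by operator input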

  13. Multiscale vision model for event detection and reconstruction in two-photon imaging data

    DEFF Research Database (Denmark)

    Brazhe, Alexey; Mathiesen, Claus; Lind, Barbara Lykke

    2014-01-01

    on a modified multiscale vision model, an object detection framework based on the thresholding of wavelet coefficients and hierarchical trees of significant coefficients followed by nonlinear iterative partial object reconstruction, for the analysis of two-photon calcium imaging data. The framework is discussed...... of the multiscale vision model is similar in the denoising, but provides a better segmentation of the image into meaningful objects, whereas other methods need to be combined with dedicated thresholding and segmentation utilities....
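
    As a heavily simplified illustration of the wavelet-coefficient-thresholding building block mentioned above (not the full multiscale vision model, which additionally builds hierarchical trees of significant coefficients and reconstructs objects iteratively), the following Python sketch uses PyWavelets; the wavelet, decomposition level, and threshold rule are assumptions.

      # Heavily simplified sketch: hard-threshold the detail wavelet coefficients of
      # one imaging frame and reconstruct. This is only one building block of a
      # multiscale detection scheme, not the full multiscale vision model.
      import numpy as np
      import pywt

      def threshold_frame(frame, wavelet="db2", level=3, k=3.0):
          coeffs = pywt.wavedec2(frame, wavelet, level=level)
          out = [coeffs[0]]                         # keep the approximation band
          for detail in coeffs[1:]:
              out.append(tuple(
                  pywt.threshold(band, k * np.std(band), mode="hard")  # keep "significant" coefficients
                  for band in detail
              ))
          return pywt.waverec2(out, wavelet)

      frame = np.random.poisson(5, (128, 128)).astype(float)  # synthetic noisy two-photon frame
      print(threshold_frame(frame).shape)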

  14. CISN ShakeAlert: Faster Warning Information Through Multiple Threshold Event Detection in the Virtual Seismologist (VS) Early Warning Algorithm

    Science.gov (United States)

    Cua, G. B.; Fischer, M.; Caprio, M.; Heaton, T. H.; Cisn Earthquake Early Warning Project Team

    2010-12-01

    The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of 3 EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system that could potentially be implemented in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the ongoing earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network since July 2008, and at the Northern California Seismic Network since February 2009. We discuss recent enhancements to the VS EEW algorithm that are being integrated into CISN ShakeAlert. We developed and continue to test a multiple-threshold event detection scheme, which uses different association / location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to initiate an event declaration, with the goal of reducing false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and the requirement of at least 4 picks as implemented by the Binder Earthworm phase associator) into an on-site/regional approach capable of providing a continuously evolving stream of EEW information starting from the first P-detection. Real-time and offline analysis on Swiss and California waveform datasets indicate that the
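
    A minimal sketch of the multiple-threshold idea described above: a single station may declare an event if its P-pick peak amplitude exceeds a high threshold, while weaker picks require several associated stations. The threshold values and pick format are illustrative assumptions, not the VS/ShakeAlert implementation.

      # Minimal sketch of multiple-threshold event declaration: one very strong P pick
      # can declare an event immediately; weaker picks require several associated
      # stations. Threshold values and the pick format are illustrative.
      HIGH_AMP = 0.5    # peak amplitude above which a single station suffices (units arbitrary)
      MIN_PICKS = 4     # picks required to declare an event at lower amplitudes

      def declare_event(picks):
          """picks: list of dicts like {'station': 'PAS', 'peak_amp': 0.12}."""
          if any(p["peak_amp"] >= HIGH_AMP for p in picks):
              return True, "single-station declaration (high amplitude)"
          if len(picks) >= MIN_PICKS:
              return True, f"associated declaration ({len(picks)} picks)"
          return False, "waiting for more picks"

      print(declare_event([{"station": "PAS", "peak_amp": 0.8}]))
      print(declare_event([{"station": s, "peak_amp": 0.05} for s in ("A", "B", "C")]))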

  15. The impact of realistic models of mass segregation on the event rate of extreme-mass ratio inspirals and cusp re-growth

    International Nuclear Information System (INIS)

    Amaro-Seoane, Pau; Preto, Miguel

    2011-01-01

    One of the most interesting sources of gravitational waves (GWs) for LISA is the inspiral of compact objects on to a massive black hole (MBH), commonly referred to as an 'extreme-mass ratio inspiral' (EMRI). The small object, typically a stellar black hole, emits significant amounts of GW along each orbit in the detector bandwidth. The slow, adiabatic inspiral of these sources will allow us to map spacetime around MBHs in detail, as well as to test our current conception of gravitation in the strong regime. The event rate of this kind of source has been addressed many times in the literature and the numbers reported fluctuate by orders of magnitude. On the other hand, recent observations of the Galactic centre revealed a dearth of giant stars inside the inner parsec relative to the numbers theoretically expected for a fully relaxed stellar cusp. The possibility of unrelaxed nuclei (or, equivalently, with no or only a very shallow cusp, or core) adds substantial uncertainty to the estimates. Having this timely question in mind, we run a significant number of direct-summation N-body simulations with up to half a million particles to calibrate a much faster orbit-averaged Fokker-Planck code. We show that, under quite generic initial conditions, the time required for the growth of a relaxed, mass-segregated stellar cusp is shorter than a Hubble time for MBHs with M• ≲ 10⁶ M⊙ (i.e. nuclei in the range of LISA). We then investigate the regime of strong mass segregation (SMS) for models with two different stellar mass components. Given the most recent stellar mass normalization for the inner parsec of the Galactic centre, SMS has the significant impact of boosting the EMRI rates by a factor of ∼10 in comparison to what would result from a 7/4 Bahcall–Wolf cusp, resulting in ∼250 events per Gyr per Milky Way type galaxy. Such an intrinsic rate should translate roughly into ∼10²–7 × 10² sbh's (EMRIs) detected by LISA over a mission lifetime of 2 or 5

  16. Combination of High Rate, Real-time GNSS and Accelerometer Observations - Preliminary Results Using a Shake Table and Historic Earthquake Events.

    Science.gov (United States)

    Jackson, Michael; Passmore, Paul; Zimakov, Leonid; Raczka, Jared

    2014-05-01

    One of the fundamental requirements of an Earthquake Early Warning (EEW) system (and other mission critical applications) is to quickly detect and process the information from the strong motion event, i.e. event detection and location, magnitude estimation, and the peak ground motion estimation at the defined targeted site, thus allowing the civil protection authorities to provide pre-programmed emergency response actions: Slow down or stop rapid transit trains and high-speed trains; shutoff of gas pipelines and chemical facilities; stop elevators at the nearest floor; send alarms to hospitals, schools and other civil institutions. An important question associated with the EEW system is: can we measure displacements in real time with sufficient accuracy? Scientific GNSS networks are moving towards a model of real-time data acquisition, storage integrity, and real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 Hz) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies and other mission critical applications, such as volcano monitoring, building, bridge and dam monitoring systems. REF TEK a Division of Trimble has developed the integrated GNSS/Accelerograph system, model 160-09SG, which consists of REF TEK's fourth generation electronics, a 147-01 high-resolution ANSS Class A accelerometer, and Trimble GNSS receiver and antenna capable of real time, on board Precise Point Positioning (PPP) techniques with satellite clock and orbit corrections delivered to the receiver directly via L-band satellite communications. The test we

  17. Reliable Maintenance of Wireless Sensor Networks for Event-Detection Applications

    Institute of Scientific and Technical Information of China (English)

    胡四泉; 杨金阳; 王俊峰

    2011-01-01

    The reliability maintenance of a wireless sensor network is key to ensuring that alarm messages are delivered reliably and on time to the monitoring center in an event-detection application. Based on the unreliable links in wireless sensor networks and the network characteristics of event-detection applications, this paper proposes MPRRM, a multiple-path redundant reliability maintenance algorithm. Both analytical and simulation results show that the MPRRM algorithm is superior to previously published solutions in the metrics of reliability, false positive rate, latency and message overhead.

  18. Stage-discharge curve for Guillemard Bridge streamflow station based on the rating curve method using historical flood event data

    International Nuclear Information System (INIS)

    Ros, F C; Sidek, L M; Desa, M N; Arifin, K; Tosaka, H

    2013-01-01

    The purposes of stage-discharge curves range from water quality and flood modelling studies to the projection of climate change scenarios. As the river bed often changes due to the annual monsoon seasons, which sometimes bring massive floods, the capacity of the river changes, causing a shift in control. This study uses historical flood event data from 1960 to 2009 to calculate the stage-discharge curve for Guillemard Bridge on Sg. Kelantan. Regression analysis was done to check the quality of the data and examine the correlation between the two variables, Q and H. The mean values of the two variables were then used to find 'a', the difference between zero gauge height and the level of zero flow, together with K and n, to fit the rating curve equation Q = K(H − a)^n and finally to plot the stage-discharge rating curve. Regression analysis of the historical flood data indicates that 91 percent of the original uncertainty has been explained by the analysis, with a standard error of 0.085.
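
    A minimal sketch of fitting the rating-curve form Q = K(H − a)^n by linear regression in log space, scanning a small grid of offsets a; the stage and discharge values are synthetic placeholders, not the Guillemard Bridge record.

      # Minimal sketch: fit Q = K * (H - a)^n by linear regression in log space,
      # scanning a grid of offsets 'a'. Stage/discharge values are synthetic
      # placeholders, not the Guillemard Bridge record.
      import numpy as np

      H = np.array([8.2, 9.1, 10.5, 12.0, 14.3, 16.8])             # stage (m), illustrative
      Q = np.array([150.0, 320.0, 780.0, 1500.0, 3200.0, 6100.0])  # discharge (m3/s), illustrative

      best = None
      for a in np.arange(0.0, 8.0, 0.1):                           # zero-flow level must lie below min(H)
          x, y = np.log(H - a), np.log(Q)
          n, logK = np.polyfit(x, y, 1)                            # slope = n, intercept = log(K)
          sse = float(np.sum((y - (n * x + logK)) ** 2))
          if best is None or sse < best[0]:
              best = (sse, a, np.exp(logK), n)

      sse, a, K, n = best
      print(f"Q = {K:.2f} * (H - {a:.1f})^{n:.2f}   (log-space SSE: {sse:.4f})")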

  19. Growth rate of late passage sarcoma cells is independent of epigenetic events but dependent on the amount of chromosomal aberrations

    International Nuclear Information System (INIS)

    Becerikli, Mustafa; Jacobsen, Frank; Rittig, Andrea; Köhne, Wiebke; Nambiar, Sandeep; Mirmohammadsadegh, Alireza; Stricker, Ingo; Tannapfel, Andrea; Wieczorek, Stefan; Epplen, Joerg Thomas; Tilkorn, Daniel; Steinstraesser, Lars

    2013-01-01

    Soft tissue sarcomas (STS) are characterized by co-participation of several epigenetic and genetic events during tumorigenesis. Having bypassed cellular senescence barriers during oncogenic transformation, the factors further affecting growth rate of STS cells remain poorly understood. Therefore, we investigated the role of gene silencing (DNA promoter methylation of LINE-1, PTEN), genetic aberrations (karyotype, KRAS and BRAF mutations) as well as their contribution to the proliferation rate and migratory potential that underlies “initial” and “final” passage sarcoma cells. Three different cell lines were used, SW982 (synovial sarcoma), U2197 (malignant fibrous histiocytoma (MFH)) and HT1080 (fibrosarcoma). Increased proliferative potential of final passage STS cells was not associated with significant differences in methylation (LINE-1, PTEN) and mutation status (KRAS, BRAF), but it was dependent on the amount of chromosomal aberrations. Collectively, our data demonstrate that these fairly differentiated/advanced cancer cell lines have still the potential to gain an additional spontaneous growth benefit without external influences and that maintenance of increased proliferative potential towards longevity of STS cells (having crossed senescence barriers) may be independent of overt epigenetic alterations. -- Highlights: Increased proliferative potential of late passage STS cells was: • Not associated with epigenetic changes (methylation changes at LINE-1, PTEN). • Not associated with mutation status of KRAS, BRAF. • Dependent on presence/absence of chromosomal aberrations

  20. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water

    Directory of Open Access Journals (Sweden)

    Michael D. Besmer

    2017-11-01

    Full Text Available Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources. This offers water utilities and public health authorities systematic ways to evaluate and optimize their
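
    As a rough illustration of how sampling frequency relates to the probability of catching a short-lived contamination event (using a simplified fixed-interval model, not the study's data), the following Python sketch estimates that probability by simulation; the event duration and sampling intervals are assumptions.

      # Rough sketch: Monte Carlo estimate of the probability that fixed-interval
      # sampling places at least one sample inside a short-lived event. For this
      # simple model the answer is analytically min(1, duration / interval); the
      # simulation is kept for clarity. Numbers are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(0)

      def detection_probability(sampling_interval_h, event_duration_h, n_trials=100_000):
          start = rng.uniform(0.0, sampling_interval_h, n_trials)  # event start within one interval
          hit = start + event_duration_h >= sampling_interval_h    # next sample falls inside the event
          return hit.mean()

      for interval_h in (14 * 24, 7 * 24, 3.5 * 24, 24):           # bi-weekly down to daily
          p = detection_probability(interval_h, event_duration_h=36.0)
          print(f"sampling every {interval_h / 24:4.1f} d -> detection probability {p:.2f}")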

  1. Detection of exudates in fundus imagery using a constant false-alarm rate (CFAR) detector

    Science.gov (United States)

    Khanna, Manish; Kapoor, Elina

    2014-05-01

    Diabetic retinopathy is the leading cause of blindness in adults in the United States. The presence of exudates in fundus imagery is an early sign of diabetic retinopathy, so detection of these lesions is essential in preventing further ocular damage. In this paper we present a novel technique to automatically detect exudates in fundus imagery that is robust against spatial and temporal variations of background noise. The detection threshold is adjusted dynamically, based on the local noise statistics around the pixel under test, in order to maintain a pre-determined, constant false alarm rate (CFAR). The CFAR detector is often used to detect bright targets in radar imagery, where the background clutter can vary considerably from scene to scene and with angle to the scene. Similarly, the CFAR detector addresses the challenge of detecting exudate lesions in RGB and multispectral fundus imagery, where the background clutter often exhibits variations in brightness and texture. These variations present a challenge to common, global thresholding detection algorithms and other methods. Performance of the CFAR algorithm was tested against a publicly available, annotated diabetic retinopathy database, and preliminary testing suggests that the CFAR detector is superior to techniques such as Otsu thresholding.
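
    A minimal sketch of a cell-averaging CFAR test for a single pixel of a 2D image: background statistics are estimated from a ring of training pixels around the pixel under test (excluding a guard band), and the threshold is scaled to a chosen false-alarm rate. The window sizes and the Gaussian-background scaling are assumptions; this is not the paper's detector.

      # Minimal sketch of a cell-averaging CFAR test for one pixel: estimate background
      # statistics in a ring of training pixels around the pixel under test (excluding
      # a guard band), then threshold at mean + k*std. Window sizes and the
      # Gaussian-background scaling k are assumptions; this is not the paper's detector.
      import numpy as np
      from scipy.stats import norm

      def cfar_pixel(img, r, c, guard=2, train=6, pfa=1e-4):
          half = guard + train
          r0, c0 = max(0, r - half), max(0, c - half)
          window = img[r0:r + half + 1, c0:c + half + 1].astype(float)
          rr, cc = r - r0, c - c0                             # pixel under test in window coordinates
          mask = np.ones(window.shape, dtype=bool)
          mask[max(0, rr - guard):rr + guard + 1, max(0, cc - guard):cc + guard + 1] = False
          training = window[mask]                             # ring of training pixels
          k = norm.isf(pfa)                                   # scaling for the target false-alarm rate
          threshold = training.mean() + k * training.std()
          return bool(img[r, c] > threshold), float(threshold)

      img = np.random.normal(100.0, 5.0, (64, 64))
      img[32, 32] = 160.0                                     # synthetic bright, exudate-like pixel
      print(cfar_pixel(img, 32, 32))                          # expected: detection (True)
      print(cfar_pixel(img, 10, 10))                          # expected: no detection (False)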

  2. A new method for detecting interactions between the senses in event-related potentials

    DEFF Research Database (Denmark)

    Gondan, Matthias; Röder, B.

    2006-01-01

    Event-related potentials (ERPs) can be used in multisensory research to determine the point in time when different senses start to interact, for example, the auditory and the visual system. For this purpose, the ERP to bimodal stimuli (AV) is often compared to the sum of the ERPs to auditory (A......) and visual (V) stimuli: AV - (A + V). If the result is non-zero, this is interpreted as an indicator for multisensory interactions. Using this method, several studies have demonstrated auditory-visual interactions as early as 50 ms after stimulus onset. The subtraction requires that A, V, and AV do...... not contain common activity: This activity would be subtracted twice from one ERP and would, therefore, contaminate the result. In the present study, ERPs to unimodal, bimodal, and trimodal auditory, visual, and tactile stimuli (T) were recorded. We demonstrate that (T + TAV) - (TA + TV) is equivalent to AV...

  3. Study of the Convergence in State Estimators for LTI Systems with Event Detection

    Directory of Open Access Journals (Sweden)

    Juan C. Posada

    2016-01-01

    Full Text Available The methods frequently used to estimate the state of an LTI system require that the precise value of the output variable is known at all times, or at equidistant sampling times. In LTI systems in which the output signal is measured through binary sensors (detectors), the traditional approach to state-observer design is not applicable even though the system has a complete observability matrix. This type of state-observer design is known as passive. It is necessary, then, to introduce a new state estimation technique, which allows reconstructing the state from information on the variable's crossing of a detector's action threshold (switch). This paper therefore studies the convergence of this type of estimator in finite time, allowing us to establish, theoretically, whether some family of the proposed models can be estimated in a convergent way through the use of the event-based estimation technique.

  4. A novel approach for analyzing data on recurrent events with duration to estimate the combined cumulative rate of both variables over time

    Directory of Open Access Journals (Sweden)

    Sudipta Bhattacharya

    2018-06-01

    Full Text Available Recurrent adverse events, once they occur, often continue for some duration of time in clinical trials, and the number of events along with their durations is clinically considered a measure of the severity of the disease under study. While there are methods available for analyzing recurrent events or durations, or for analyzing both side by side, no effort has been made so far to combine them and present them as a single measure. However, this single-valued combined measure may help clinicians assess the overall effect of recurrence of incidents comprising events and durations. A non-parametric approach is adopted here to develop an estimator for estimating the combined rate of both the recurrence of events and the event continuation, that is, the duration per event. The proposed estimator produces a single numerical value, the interpretation and meaningfulness of which are discussed through the analysis of a real-life clinical dataset. The algebraic expression of the variance is derived, asymptotic normality of the estimator is noted, and a demonstration is provided of how the estimator can be used in the setup of statistical hypothesis testing. Further possible development of the estimator is also noted, to adjust for the dependence of event occurrences on the history of the process generating recurrent events through covariates, and for the case of dependent censoring. Keywords: Recurrent events, Duration per event, Intensity, Nelson-Aalen estimator

  5. Public High School Four-Year On-Time Graduation Rates and Event Dropout Rates: School Years 2010-11 and 2011-12. First Look. NCES 2014-391

    Science.gov (United States)

    Stetser, Marie C.; Stillwell, Robert

    2014-01-01

    This National Center for Education Statistics (NCES) First Look report introduces new data for two separate measures of 4-year on-time graduation rates as well as event dropout rates for school year (SY) 2010-11 and SY 2011-12. Specifically this report provides the following: (1) Four-year adjusted cohort graduation rate (ACGR) data reported by…

  6. Universal design of a microcontroller and IoT system to detect the heart rate

    Science.gov (United States)

    Uwamahoro, Raphael; Mushikiwabeza, Alexie; Minani, Gerard; Mohan Murari, Bhaskar

    2017-11-01

    Heart rate analysis provides vital information about the present condition of the human body and helps medical professionals in the diagnosis of various malfunctions of the body. The limited ability of vision-impaired and blind people to access medical devices causes a considerable loss of life. In this paper, we develop a heart rate detection system that is usable by people with normal and abnormal vision. The system is based on a non-invasive method of measuring the variation of tissue blood flow by means of a photo transmitter and detector at the fingertip, known as photoplethysmography (PPG). The detected signal is first passed through an active low-pass filter and then amplified by a two-stage high-gain amplifier. The amplified signal is fed into the microcontroller, which calculates the heart rate and reports the heartbeat via a sound system and a liquid crystal display (LCD). To distinguish arrhythmia, normal heart rate, and abnormal working conditions of the system, recognition is provided through different sounds, LCD readings, and light-emitting diodes (LEDs).
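
    The following minimal sketch illustrates the heart-rate calculation stage only (here in Python on a host computer, whereas the described device performs it on a microcontroller): the digitized PPG samples are band-pass filtered around typical pulse frequencies and peaks are counted; the sampling rate, filter band, and synthetic signal are assumptions.

      # Minimal sketch of the heart-rate calculation from digitized PPG samples
      # (done here in Python; the described device does it on a microcontroller).
      # Sampling rate, filter band, and the synthetic signal are assumptions.
      import numpy as np
      from scipy.signal import butter, filtfilt, find_peaks

      fs = 100.0                                            # sampling rate (Hz), assumed
      t = np.arange(0, 30, 1 / fs)
      ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)  # synthetic ~72 bpm pulse

      b, a = butter(3, [0.67 / (fs / 2), 3.0 / (fs / 2)], btype="band")  # 0.67-3 Hz pulse band
      filtered = filtfilt(b, a, ppg)

      peaks, _ = find_peaks(filtered, distance=int(0.4 * fs))            # ~0.4 s refractory period
      bpm = 60.0 * (len(peaks) - 1) / ((peaks[-1] - peaks[0]) / fs)
      print(f"estimated heart rate: {bpm:.0f} bpm")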

  7. Measurement of the modification and interference rate of urinary albumin detected by size-exclusion HPLC

    International Nuclear Information System (INIS)

    Markó, Lajos; Molnár, Gergő Attila; Wagner, Zoltán; Szijártó, István; Mérei, Ákos; Wittmann, István; Böddi, Katalin; Szabó, Zoltán; Matus, Zoltán; Kőszegi, Tamás; Nagy, Géza

    2009-01-01

    The measurement of the excretion of urinary albumin (albuminuria) is an important and well-established method to assess clinical outcomes. A high-performance liquid chromatography (HPLC) method has been introduced to measure albuminuria. Using this method, it was found that commonly used immunological methods do not measure a fraction of urinary albumin. Some authors presumed that the reason of immuno-unreactivity is the modification of urinary albumin; some others presumed that the difference is merely because of interference. In order to decide this question, we established an HPLC method equipped with tandem UV and fluorescent detection to assess the changes in the detectability of albumin with the rate of modification. For this measurement, differently modified forms of albumin were used. Urine samples of diabetic patients were also measured to find a potential connection between the modification rate and clinical parameters. Secondly, we have established a reversed phase HPLC method to assess the interference rate. We conclude that albumin modification does not affect immunoreactivity. The modification rate of urinary albumin in diabetic patients showed a correlation with renal function. The interference rate of the albumin peak was found to be 12.7% on average, which does not explain the difference between the two methods

  8. Invariance of the bit error rate in the ancilla-assisted homodyne detection

    International Nuclear Information System (INIS)

    Yoshida, Yuhsuke; Takeoka, Masahiro; Sasaki, Masahide

    2010-01-01

    We investigate the minimum achievable bit error rate of the discrimination of binary coherent states with the help of arbitrary ancillary states. We adopt homodyne measurement with a common phase of the local oscillator and classical feedforward control. After one ancillary state is measured, its outcome is referred to the preparation of the next ancillary state and the tuning of the next mixing with the signal. It is shown that the minimum bit error rate of the system is invariant under the following operations: feedforward control, deformations, and introduction of any ancillary state. We also discuss the possible generalization of the homodyne detection scheme.

  9. Reaction rate constant of HO2+O3 measured by detecting HO2 from photofragment fluorescence

    Science.gov (United States)

    Manzanares, E. R.; Suto, Masako; Lee, Long C.; Coffey, Dewitt, Jr.

    1986-01-01

    A room-temperature discharge-flow system investigation of the rate constant for the reaction HO2 + O3 → OH + 2O2 has detected HO2 through the OH(A–X) fluorescence produced by photodissociative excitation of HO2 at 147 nm. A reaction rate constant of (1.9 ± 0.3) × 10⁻¹⁵ cm³ molecule⁻¹ s⁻¹ is obtained from the first-order decay of HO2 in excess O3; this agrees well with published data.

  10. The Rates of Type I X-ray Bursts from Transients Observed with RXTE: Evidence for Black Hole Event Horizons

    Science.gov (United States)

    Remillard, R. A.; Lin, D.; Cooper, R. L.; Narayan, R.

    2005-12-01

    We measure the rates of type I X-ray bursts from a likely complete sample of 37 non-pulsing Galactic X-ray transients observed with the RXTE ASM during 1996-2004. Our strategy is to test the prevailing paradigms for these sources, which are well-categorized in the literature as either neutron-star systems or black hole candidates. Burst rates are measured as a function of the bolometric luminosity, and the results are compared with burst models for neutron stars and for heavy compact objects with a solid surface. We use augmented versions of the models developed by Narayan & Heyl (2002; 2003). For a given mass, we consider a range of conditions in both the radius and the temperature at the boundary below the accretion layer. We find 135 type I bursts in 3.7 Ms of PCA light curves for the neutron-star group, and the burst rate function is generally consistent with the model predictions for bursts from accreting neutron stars. On the other hand, none of the 20 burst candidates passed the spectral criteria for type I bursts in 6.5 Ms of PCA light curves for black-hole binaries and candidates. The burst function upper limits are inconsistent with the predictions of the burst model for heavy compact objects with a solid surface. The consistency probability is found to be below 10⁻⁷ for dynamical black-hole binaries, falling to below 10⁻¹³ for the additional exposures of black-hole candidates. These results provide indirect evidence that black holes do have event horizons. This research was supported, in part, by NASA science programs.

  11. Individual polyp detection rate in routine daily endoscopy practice depends on case-mix.

    Science.gov (United States)

    Loffeld, R J L F; Liberov, B; Dekkers, P E P

    2015-07-01

    The adenoma detection rate (ADR), a marker of endoscopic quality, is confounded by selection bias. It is not known what the ADR is in normal daily practice. To study the polyp detection rate (PDR) in different endoscopists in the course of years. All consecutive endoscopies of the colon done in 11 years were included. Endoscopies in the regular surveillance programme after polyp removal and after surgery because of colorectal cancer or diverticular disease were scored separately. The number of yearly procedures per endoscopist and presence of polyps, anastomoses, surveillance and cancer were noted. In the period of 11 years, 14,908 consecutive endoscopies of colon and rectum were done by four endoscopists. Two endoscopists had a significantly lower PDR than the other two (p case-mix of patients presented for endoscopy. This result debates the use of the ADR as quality indicator for individual endoscopists.

  12. A multiplex microplatform for the detection of multiple DNA methylation events using gold-DNA affinity.

    Science.gov (United States)

    Sina, Abu Ali Ibn; Foster, Matthew Thomas; Korbie, Darren; Carrascosa, Laura G; Shiddiky, Muhammad J A; Gao, Jing; Dey, Shuvashis; Trau, Matt

    2017-10-07

    We report a new multiplexed strategy for the electrochemical detection of regional DNA methylation across multiple regions. Using the sequence-dependent affinity of bisulfite-treated DNA towards gold surfaces, the method integrates the high sensitivity of a micro-fabricated multiplex device comprising a microarray of gold electrodes with the powerful multiplexing capability of multiplex PCR. The synergy of this combination enables the monitoring of the methylation changes across several genomic regions simultaneously from as low as 500 pg μl⁻¹ of DNA with no sequencing requirement.

  13. Factors influencing the detection rate of drug-related problems in community pharmacy

    DEFF Research Database (Denmark)

    Westerlund, T; Almarsdóttir, Anna Birna; Melander, A

    1999-01-01

    This study analyzes relationships between the number of drug-related problems detected in community pharmacy practice and the educational level and other characteristics of pharmacy personnel and their work sites. Random samples of pharmacists, prescriptionists and pharmacy technicians were drawn...... by each professional. The regression analysis showed the educational level of the professional to have a statistically significant effect on the detection rate, with pharmacists finding on average 2.5 more drug-related problems per 100 patients than prescriptionists and about 3.6 more than technicians. The results of this study indicate the importance of education and training of pharmacy personnel in the detection of drug-related problems. This finding speaks in favor of increasing the pharmacist-to-other-personnel ratio, provided the higher costs will be offset by societal benefits....

  14. Predictive value of night-time heart rate for cardiovascular events in hypertension. The ABP-International study.

    Science.gov (United States)

    Palatini, Paolo; Reboldi, Gianpaolo; Beilin, Lawrence J; Eguchi, Kazuo; Imai, Yutaka; Kario, Kazuomi; Ohkubo, Takayoshi; Pierdomenico, Sante D; Saladini, Francesca; Schwartz, Joseph E; Wing, Lindon; Verdecchia, Paolo

    2013-09-30

    Data from prospective cohort studies regarding the association between ambulatory heart rate (HR) and cardiovascular events (CVE) are conflicting. To investigate whether ambulatory HR predicts CVE in hypertension, we performed 24-hour ambulatory blood pressure and HR monitoring in 7600 hypertensive patients aged 52 ± 16 years from Italy, U.S.A., Japan, and Australia, included in the 'ABP-International' registry. All were untreated at baseline examination. Standardized hazard ratios for ambulatory HRs were computed, stratifying for cohort, and adjusting for age, gender, blood pressure, smoking, diabetes, serum total cholesterol and serum creatinine. During a median follow-up of 5.0 years there were 639 fatal and nonfatal CVE. In a multivariable Cox model, night-time HR predicted fatal combined with nonfatal CVE more closely than 24h HR (p=0.007 and =0.03, respectively). Daytime HR and the night:day HR ratio were not associated with CVE (p=0.07 and =0.18, respectively). The hazard ratio of the fatal combined with nonfatal CVE for a 10-beats/min increment of the night-time HR was 1.13 (95% CI, 1.04-1.22). This relationship remained significant when subjects taking beta-blockers during the follow-up (hazard ratio, 1.15; 95% CI, 1.05-1.25) or subjects who had an event within 5 years after enrollment (hazard ratio, 1.23; 95% CI, 1.05-1.45) were excluded from analysis. At variance with previous data obtained from general populations, ambulatory HR added to the risk stratification for fatal combined with nonfatal CVE in the hypertensive patients from the ABP-International study. Night-time HR was a better predictor of CVE than daytime HR. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  15. How Newspaper-Article-Events, Other Stock Market Indices, and the Foreign Currency Rate Affect the Philippine Stock Market

    OpenAIRE

    Percival S. Gabriel

    2013-01-01

    Eugene Fama in his “Efficient Market Hypothesis” introduced the term newspaper-article-event. The aim of this paper is to find out if newspaper-article-events which are presented and discussed in newspaper articles and which could collage to create an atmosphere of investment, together with the indices of other stock markets (treated as other events) and the performance of the Philippine Peso against the US Dollar (considered as another event) could affect the closing Philippine Stock Market ...

  16. Double symbol error rates for differential detection of narrow-band FM

    Science.gov (United States)

    Simon, M. K.

    1985-01-01

    This paper evaluates the double symbol error rate (average probability of two consecutive symbol errors) in differentially detected narrow-band FM. Numerical results are presented for the special case of MSK with a Gaussian IF receive filter. It is shown that, not unlike similar results previously obtained for the single error probability of such systems, large inaccuracies in predicted performance can occur when intersymbol interference is ignored.

  17. Non-contact detection of cardiac rate based on visible light imaging device

    Science.gov (United States)

    Zhu, Huishi; Zhao, Yuejin; Dong, Liquan

    2012-10-01

    We have developed a non-contact method to detect the human cardiac rate at a distance under general lighting conditions. Using the video signal of the human face region captured by a webcam, we acquire the cardiac rate based on photoplethysmography (PPG) theory. The cardiac rate detection method mainly exploits the different absorptivities of blood at various wavelengths of light. First, we decompose the video signal into the three RGB color channels and choose the face region as the region of interest, over which the average gray value is taken. Then we plot a mean-gray-value curve for each color channel as a function of time. When the imaging device has good color fidelity, the green channel signal shows the photoplethysmography information most clearly, but the red and blue channel signals can provide additional physiological information on account of the light-absorption characteristics of blood at those wavelengths. We divide the red channel signal by the green channel signal to acquire the pulse wave. Filtering the pulse wave with a passband from 0.67 Hz to 3 Hz and applying a frequency-spectrum superposition algorithm, we design a frequency-extraction algorithm to obtain the cardiac rate. Finally, we tested the method on 30 volunteers of different genders and ages. The results of the experiments agree relatively well, with a difference of about 2 bpm. Through the experiment, we deduce that photoplethysmography based on visible light can also be used to detect other physiological information.
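
    A minimal sketch of the pipeline described above, starting from already-extracted per-frame mean values of the red and green channels over the face region: the R/G ratio is band-pass filtered to 0.67–3 Hz and the dominant spectral peak is read off as the cardiac frequency; the frame rate and the synthetic channel traces are assumptions.

      # Minimal sketch: from per-frame mean values of the red and green channels over
      # the face region, take the R/G ratio, band-pass it to 0.67-3 Hz, and read the
      # cardiac rate from the dominant FFT peak. Frame rate and traces are synthetic.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fps = 30.0
      t = np.arange(0, 20, 1 / fps)
      green = 120 + 0.8 * np.sin(2 * np.pi * 1.3 * t) + 0.5 * np.random.randn(t.size)
      red = 150 + 0.3 * np.sin(2 * np.pi * 1.3 * t) + 0.5 * np.random.randn(t.size)

      pulse = red / green                                    # channel ratio used in the record
      b, a = butter(3, [0.67 / (fps / 2), 3.0 / (fps / 2)], btype="band")
      pulse = filtfilt(b, a, pulse - pulse.mean())

      spectrum = np.abs(np.fft.rfft(pulse))
      freqs = np.fft.rfftfreq(pulse.size, d=1 / fps)
      band = (freqs >= 0.67) & (freqs <= 3.0)
      cardiac_hz = freqs[band][np.argmax(spectrum[band])]
      print(f"estimated cardiac rate: {60 * cardiac_hz:.0f} bpm")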

  18. Fast joint detection-estimation of evoked brain activity in event-related FMRI using a variational approach

    Science.gov (United States)

    Chaari, Lotfi; Vincent, Thomas; Forbes, Florence; Dojat, Michel; Ciuciu, Philippe

    2013-01-01

    In standard within-subject analyses of event-related fMRI data, two steps are usually performed separately: detection of brain activity and estimation of the hemodynamic response. Because these two steps are inherently linked, we adopt the so-called region-based Joint Detection-Estimation (JDE) framework that addresses this joint issue using a multivariate inference for detection and estimation. JDE is built by making use of a regional bilinear generative model of the BOLD response and constraining the parameter estimation by physiological priors using temporal and spatial information in a Markovian model. In contrast to previous works that use Markov Chain Monte Carlo (MCMC) techniques to sample the resulting intractable posterior distribution, we recast the JDE into a missing data framework and derive a Variational Expectation-Maximization (VEM) algorithm for its inference. A variational approximation is used to approximate the Markovian model in the unsupervised spatially adaptive JDE inference, which allows automatic fine-tuning of spatial regularization parameters. It provides a new algorithm that exhibits interesting properties in terms of estimation error and computational cost compared to the previously used MCMC-based approach. Experiments on artificial and real data show that VEM-JDE is robust to model mis-specification and provides computational gain while maintaining good performance in terms of activation detection and hemodynamic shape recovery. PMID:23096056

  19. A High Performance Impedance-based Platform for Evaporation Rate Detection.

    Science.gov (United States)

    Chou, Wei-Lung; Lee, Pee-Yew; Chen, Cheng-You; Lin, Yu-Hsin; Lin, Yung-Sheng

    2016-10-17

    This paper describes the method of a novel impedance-based platform for the detection of the evaporation rate. The model compound hyaluronic acid was employed here for demonstration purposes. Multiple evaporation tests on the model compound as a humectant with various concentrations in solutions were conducted for comparison purposes. A conventional weight loss approach is known as the most straightforward, but time-consuming, measurement technique for evaporation rate detection. Yet, a clear disadvantage is that a large volume of sample is required and multiple sample tests cannot be conducted at the same time. For the first time in literature, an electrical impedance sensing chip is successfully applied to a real-time evaporation investigation in a time sharing, continuous and automatic manner. Moreover, as little as 0.5 ml of test samples is required in this impedance-based apparatus, and a large impedance variation is demonstrated among various dilute solutions. The proposed high-sensitivity and fast-response impedance sensing system is found to outperform a conventional weight loss approach in terms of evaporation rate detection.

  20. Microlensing events by Proxima Centauri in 2014 and 2016: Opportunities for mass determination and possible planet detection

    Energy Technology Data Exchange (ETDEWEB)

    Sahu, Kailash C.; Bond, Howard E.; Anderson, Jay [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Dominik, Martin, E-mail: ksahu@stsci.edu, E-mail: jayander@stsci.edu, E-mail: heb11@psu.edu, E-mail: md35@st-andrews.ac.uk [SUPA, School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews KY16 9SS (United Kingdom)

    2014-02-20

    We have found that Proxima Centauri, the star closest to our Sun, will pass close to a pair of faint background stars in the next few years. Using Hubble Space Telescope (HST) images obtained in 2012 October, we determine that the passage close to a mag 20 star will occur in 2014 October (impact parameter 1.″6), and to a mag 19.5 star in 2016 February (impact parameter 0.″5). As Proxima passes in front of these stars, the relativistic deflection of light will cause shifts in the positions of the background stars of ∼0.5 and 1.5 mas, respectively, readily detectable by HST imaging, and possibly by Gaia and ground-based facilities such as the Very Large Telescope. Measurement of these astrometric shifts offers a unique and direct method to measure the mass of Proxima. Moreover, if Proxima has a planetary system, the planets may be detectable through their additional microlensing signals, although the probability of such detections is small. With astrometric accuracies of 0.03 mas (achievable with HST spatial scanning), centroid shifts caused by Jovian planets are detectable at separations of up to 2.″0 (corresponding to 2.6 AU at the distance of Proxima), and centroid shifts by Earth-mass planets are detectable within a small band of 8 mas (corresponding to 0.01 AU) around the source trajectories. Jovian planets within a band of about 28 mas (corresponding to 0.036 AU) around the source trajectories would produce a brightening of the source by >0.01 mag and could hence be detectable. Estimated timescales of the astrometric and photometric microlensing events due to a planet range from a few hours to a few days, and both methods would provide direct measurements of the planetary mass.
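
    As an order-of-magnitude check of the quoted shifts (not a reproduction of the authors' analysis), the following sketch uses the standard point-lens relations θ_E ≈ sqrt(4GM/(c²D_L)) for a distant source and a major-image shift of ≈ θ_E²/Δθ for impact parameters Δθ much larger than θ_E; Proxima's mass (≈0.12 M⊙) and distance (≈1.30 pc) are assumed values not stated in the record.

      # Order-of-magnitude check of the quoted astrometric shifts using standard
      # point-lens formulas. Proxima's mass (~0.12 M_sun) and distance (~1.30 pc)
      # are assumed values not stated in the record.
      import numpy as np

      G = 6.674e-11                 # m^3 kg^-1 s^-2
      c = 2.998e8                   # m/s
      M_sun = 1.989e30              # kg
      pc = 3.086e16                 # m
      mas = np.radians(1 / 3.6e6)   # one milliarcsecond in radians

      M = 0.12 * M_sun              # assumed mass of Proxima
      D_L = 1.30 * pc               # assumed distance of Proxima

      theta_E = np.sqrt(4 * G * M / (c**2 * D_L))   # Einstein radius, distant-source limit
      print(f"theta_E ~ {theta_E / mas:.0f} mas")

      for impact_arcsec in (1.6, 0.5):              # impact parameters of the two passages
          beta = impact_arcsec * 1000 * mas
          shift = theta_E**2 / beta                 # major-image shift for beta >> theta_E
          print(f'impact {impact_arcsec}" -> shift ~ {shift / mas:.1f} mas')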