WorldWideScience

Sample records for model-based event detection

  1. A Cyber-Attack Detection Model Based on Multivariate Analyses

    Science.gov (United States)

    Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi

    In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequences via quantification method IV and group similar sequences via cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in realistic cases where normal and attack activities are intermingled.

  2. Detection of anomalous events

    Science.gov (United States)

    Ferragut, Erik M.; Laska, Jason A.; Bridges, Robert A.

    2016-06-07

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The system can include a plurality of anomaly detectors that together implement an algorithm to identify low-probability events and detect atypical traffic patterns. The anomaly detector provides for comparability of disparate sources of data (e.g., network flow data and firewall logs). Additionally, the anomaly detector allows for regulatability, meaning that the algorithm is user-configurable to adjust the number of false alerts. The anomaly detector can be used with a variety of probability density functions, including normal Gaussian distributions and irregular distributions, as well as functions associated with continuous or discrete variables.
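
The core idea of the record above, scoring events by their improbability under a fitted density with a user-tunable alert threshold, can be sketched as follows. This is a minimal single-feature Gaussian sketch, not the patented system, which handles many distribution families and data sources:

```python
import math

def fit_gaussian(samples):
    """Estimate mean and variance of a 1-D feature (e.g. bytes per flow)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, max(var, 1e-12)

def anomaly_score(x, mean, var):
    """Negative log-likelihood under the fitted Gaussian: rare events score high."""
    return 0.5 * math.log(2 * math.pi * var) + (x - mean) ** 2 / (2 * var)

def detect(stream, mean, var, threshold):
    """Flag events whose score exceeds a user-set threshold.

    Raising the threshold lowers the false-alert rate, mirroring the
    'regulatability' idea in the abstract."""
    return [x for x in stream if anomaly_score(x, mean, var) > threshold]
```

Fitting on a baseline window and then scoring live events gives a tunable trade-off between missed detections and false alerts.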

  3. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, we abstract a description of Trojan network behavior; then we establish a library of characteristic behaviors according to defined rules; finally, we use a support vector machine algorithm to determine whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)

  4. Model Based Fault Detection in a Centrifugal Pump Application

    DEFF Research Database (Denmark)

    Kallesøe, Carsten; Cocquempot, Vincent; Izadi-Zamanabadi, Roozbeh

    2006-01-01

    A model based approach for fault detection in a centrifugal pump, driven by an induction motor, is proposed in this paper. The fault detection algorithm is derived using a combination of structural analysis, observer design and Analytical Redundancy Relation (ARR) design. Structural considerations...

  5. Nonlinear Model-Based Fault Detection for a Hydraulic Actuator

    NARCIS (Netherlands)

    Van Eykeren, L.; Chu, Q.P.

    2011-01-01

    This paper presents a model-based fault detection algorithm for a specific fault scenario of the ADDSAFE project. The fault considered is the disconnection of a control surface from its hydraulic actuator. Detecting this type of fault as fast as possible helps to operate an aircraft more cost

  6. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    The operational experience (OE) feedback provides improvements in all industrial activities. Identifying the most important and valuable groups of events within the accumulated experience is important in order to focus detailed investigation of events. The paper describes a new ranking method and compares it with three others. The methods are described and applied to twenty years of OE events from nuclear power plants in France and Germany. The results show that the different ranking methods only roughly agree on which event groups are the most important. In the new method, the analytic hierarchy process is applied to ensure consistent and comprehensive weighting of the ranking indexes. The proposed method allows transparent and flexible ranking of event groups and identification of the most important OE for further, more detailed investigation to complete the feedback. (orig.)

  7. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-11-13

    that may include users, applications, operating systems, servers, hosts, routers, switches, control planes, and instrumentation planes, many of ... which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology ... configuration errors in environment designs for several cyber range events. The rest of the paper is organized as follows. Section 2 provides an overview of

  8. Zero-inflated Poisson model based likelihood ratio test for drug safety signal detection.

    Science.gov (United States)

    Huang, Lan; Zheng, Dan; Zalkikar, Jyoti; Tiwari, Ram

    2017-02-01

    In recent decades, numerous methods have been developed for mining large drug safety databases such as the Food and Drug Administration's (FDA's) Adverse Event Reporting System, where data matrices are formed with drugs as columns and adverse events as rows. Often, a large number of cells in these data matrices have zero counts. Some of these are "true zeros", indicating that the drug-adverse event pair cannot occur; they are distinguished from the remaining zero counts, which simply indicate that the drug-adverse event pair has not occurred, or has not been reported, yet. In this paper, a zero-inflated Poisson (ZIP) model based likelihood ratio test (LRT) method is proposed to identify drug-adverse event pairs with disproportionately high reporting rates, also called signals. The maximum likelihood estimates of the ZIP model parameters are obtained using the expectation-maximization algorithm. The ZIP-based LRT is also modified to handle stratified analyses for binary and categorical covariates (e.g. gender and age) in the data. The proposed method is shown to asymptotically control the type I error and false discovery rate, and its finite-sample performance for signal detection is evaluated through a simulation study. The simulation results show that the ZIP-based LRT performs similarly to the plain Poisson LRT when the estimated percentage of true zeros in the database is small. Both the ZIP-based LRT and the Poisson LRT are applied to six selected drugs, from the 2006 to 2011 Adverse Event Reporting System database, with varying percentages of observed zero-count cells.
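
As a rough illustration of the likelihood-ratio idea, here is the plain Poisson LRT statistic that the paper uses as its comparison baseline, without the zero-inflation component. Expected cell counts are taken from the table margins under an independence assumption; both functions are a simplified sketch, not the paper's method:

```python
import math

def poisson_llr(n, expected):
    """Log-likelihood ratio for one drug-adverse-event cell under a plain
    Poisson model: large values suggest disproportionately high reporting."""
    if n == 0 or n <= expected:
        return 0.0
    return n * math.log(n / expected) - (n - expected)

def expected_counts(table):
    """Expected cell counts from row (adverse event) and column (drug)
    margins, assuming independence between drugs and adverse events."""
    total = sum(sum(row) for row in table)
    row_sums = [sum(row) for row in table]
    col_sums = [sum(col) for col in zip(*table)]
    return [[r * c / total for c in col_sums] for r in row_sums]
```

A cell with an observed count well above its margin-based expectation yields a large statistic and is a signal candidate.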

  9. Fuzzy model-based observers for fault detection in CSTR.

    Science.gov (United States)

    Ballesteros-Moncada, Hazael; Herrera-López, Enrique J; Anzurez-Marín, Juan

    2015-11-01

    Among the vast variety of fuzzy model-based observers reported in the literature, which would be the proper one to use for fault detection in a class of chemical reactors? In this study, four fuzzy model-based observers for sensor fault detection in a Continuous Stirred Tank Reactor (CSTR) were designed and compared. The designs include (i) a Luenberger fuzzy observer, (ii) a Luenberger fuzzy observer with sliding modes, (iii) a Walcott-Zak fuzzy observer, and (iv) an Utkin fuzzy observer. A negative fault signal, an oscillating fault signal, and a bounded random noise signal with a maximum value of ±0.4 were used to evaluate and compare the performance of the fuzzy observers. The Utkin fuzzy observer showed the best performance under the tested conditions. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
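
The observer-based detection principle shared by all four designs can be sketched with an ordinary (non-fuzzy) Luenberger observer on a toy linear system. The matrices A, C, and the gain L below are illustrative values chosen so that A - LC is stable; they are not the CSTR model from the paper:

```python
import numpy as np

# Toy linear plant surrogate: x' = A x, measurement y = C x (+ sensor fault).
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
C = np.array([[1.0, 0.0]])
L = np.array([[0.5],
              [0.1]])   # observer gain (assumed; makes A - L C stable)

def run_observer(fault_at=None, steps=50):
    """Discrete-time Luenberger observer: the output residual y - C x_hat
    stays near zero until an additive sensor fault appears."""
    x = np.array([[1.0], [1.0]])   # true state
    xh = np.zeros((2, 1))          # observer estimate
    residuals = []
    for k in range(steps):
        y = C @ x
        if fault_at is not None and k >= fault_at:
            y = y + 2.0            # additive sensor fault
        r = (y - C @ xh).item()    # residual = fault indicator
        residuals.append(abs(r))
        xh = A @ xh + L * r        # observer correction step
        x = A @ x
    return residuals
```

Without a fault the residual decays toward zero; injecting the sensor fault makes it jump, which is the detection event.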

  10. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event driven imaging which is capable of focusing image readout to Regions of Interests (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers but in the future more sophisticated methods might also be defined. The camera provides 444 Hz frame rate at full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices for example at ASDEX Upgrade and COMPASS with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.

  11. Event detection intelligent camera development

    International Nuclear Information System (INIS)

    Szappanos, A.; Kocsis, G.; Molnar, A.; Sarkozi, J.; Zoletnik, S.

    2008-01-01

    A new camera system, the 'event detection intelligent camera' (EDICAM), is being developed for the video diagnostics of the W7-X stellarator, which consist of 10 distinct and standalone measurement channels, each holding a camera. Different operation modes will be implemented for continuous and for triggered readout. Hardware-level trigger signals will be generated by real-time image processing algorithms optimized for digital signal processor (DSP) and field programmable gate array (FPGA) architectures. At full resolution a camera sends 12-bit sampled 1280 x 1024 pixel frames at 444 fps, which amounts to 1.43 terabytes over half an hour. Analysing such a huge amount of data is time consuming and has a high computational complexity. We plan to overcome this problem with EDICAM's preprocessing concepts. The EDICAM camera system integrates the advantages of CMOS sensor chip technology and fast network connections. EDICAM is built up from three modules with two interfaces: a sensor module (SM) with reduced hardware and functional elements, designed to be small and compact and to operate robustly in a harsh environment; an image processing and control unit (IPCU) module, which handles all user-predefined events and runs image processing algorithms to generate trigger signals; and a 10 Gigabit Ethernet compatible image readout card that functions as the network interface for the PC. In this contribution the concepts of EDICAM and the functions of the distinct modules are described.

  12. Discrete event model-based simulation for train movement on a single-line railway

    International Nuclear Information System (INIS)

    Xu Xiao-Ming; Li Ke-Ping; Yang Li-Xing

    2014-01-01

    The aim of this paper is to present a discrete event model-based approach to simulating train movement that takes an energy-saving factor into account. We conduct extensive case studies to show the dynamic characteristics of the traffic flow and demonstrate the effectiveness of the proposed approach. The simulation results indicate that the proposed discrete event model-based simulation approach is suitable for characterizing the movements of a group of trains on a single railway line with fewer iterations and less CPU time. Additionally, some other qualitative and quantitative characteristics are investigated. In particular, because of the cumulative influence of the preceding trains, the following trains must accelerate or brake frequently to control the headway distance, leading to more energy consumption. (general)
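
The flavor of a discrete event simulation of trains on a single line can be sketched with a minimal event queue. This is an illustrative toy, not the paper's model, which also tracks speed profiles and energy consumption:

```python
import heapq

def simulate(departures, run_time, min_headway):
    """Minimal discrete event simulation of trains entering a single line.

    Each event is (time, train_id); a train may only depart min_headway
    after the previous one actually left, so delays cascade to the
    following trains, loosely mirroring the headway coupling described
    in the abstract."""
    events = [(t, i) for i, t in enumerate(departures)]
    heapq.heapify(events)
    last_departure = float("-inf")
    log = []
    while events:
        t, train = heapq.heappop(events)
        if t < last_departure + min_headway:
            # reschedule the event at the earliest admissible time
            heapq.heappush(events, (last_departure + min_headway, train))
            continue
        last_departure = t
        log.append((train, t, t + run_time))   # (id, depart, arrive)
    return log
```

With requested departures closer together than the minimum headway, the simulation spreads the actual departures out, and the delay propagates down the queue.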

  13. Model-based fault detection algorithm for photovoltaic system monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Saidi, Ahmed

    2018-01-01

    Reliable detection of faults in PV systems plays an important role in improving their reliability, productivity, and safety. This paper addresses the detection of faults in the direct current (DC) side of photovoltaic (PV) systems using a

  14. Analytical Model-based Fault Detection and Isolation in Control Systems

    DEFF Research Database (Denmark)

    Vukic, Z.; Ozbolt, H.; Blanke, M.

    1998-01-01

    The paper gives an introduction to and an overview of the field of fault detection and isolation for control systems. A summary of analytical (quantitative model-based) methods and their implementation is presented. The focus is on the analytical model-based fault-detection and fault...

  15. A simple strategy for fall events detection

    KAUST Repository

    Harrou, Fouzi

    2017-01-20

    The paper concerns the detection of fall events based on human silhouette shape variations. The detection of fall events is addressed from the statistical point of view as an anomaly detection problem. Specifically, the paper investigates the multivariate exponentially weighted moving average (MEWMA) control chart to detect fall events. Towards this end, a set of ratios for five partial occupancy areas of the human body for each frame are collected and used as the input data to MEWMA chart. The MEWMA fall detection scheme has been successfully applied to two publicly available fall detection databases, the UR fall detection dataset (URFD) and the fall detection dataset (FDD). The monitoring strategy developed was able to provide early alert mechanisms in the event of fall situations.
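
The MEWMA chart described above can be sketched as follows. The smoothing parameter lam and control limit ucl are illustrative values, not the paper's settings, and the baseline statistics are estimated from early frames for simplicity:

```python
import numpy as np

def mewma(data, lam=0.3, ucl=12.0):
    """Multivariate EWMA control chart.

    data: (n, p) array of feature vectors, e.g. the five partial occupancy
    area ratios per frame mentioned in the abstract. Returns the T^2-like
    statistic per sample and the indices exceeding the control limit."""
    data = np.asarray(data, dtype=float)
    n, p = data.shape
    mean = data[: n // 2].mean(axis=0)            # baseline from early frames
    cov = np.cov(data[: n // 2].T) + 1e-9 * np.eye(p)
    cov_z = (lam / (2 - lam)) * cov               # asymptotic EWMA covariance
    inv = np.linalg.inv(cov_z)
    z = np.zeros(p)
    stats = []
    for x in data:
        z = lam * (x - mean) + (1 - lam) * z      # EWMA recursion
        stats.append(float(z @ inv @ z))          # quadratic-form statistic
    alarms = [i for i, s in enumerate(stats) if s > ucl]
    return stats, alarms
```

A sustained shift in the monitored ratios, as a fall would produce, drives the statistic above the control limit within a few frames.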

  16. Model-Based Design of Tree WSNs for Decentralized Detection

    Directory of Open Access Journals (Sweden)

    Ashraf Tantawy

    2015-08-01

    The classical decentralized detection problem of finding the optimal decision rules at the sensors and fusion center, as well as variants that introduce physical channel impairments, has been studied extensively in the literature. The deployment of WSNs in decentralized detection applications brings new challenges to the field: protocols for the different communication layers have to be co-designed to optimize the detection performance. In this paper, we consider the communication network design problem for a tree WSN. We pursue a system-level approach in which a complete model of the system is developed that captures the interactions between the different layers, as well as different sensor quality measures. For network optimization, we propose a hierarchical optimization algorithm that lends itself to the tree structure, requiring only local network information. The proposed design approach shows superior performance over several contentionless and contention-based network design approaches.

  17. Model-based fault detection algorithm for photovoltaic system monitoring

    KAUST Repository

    Harrou, Fouzi

    2018-02-12

    Reliable detection of faults in PV systems plays an important role in improving their reliability, productivity, and safety. This paper addresses the detection of faults in the direct current (DC) side of photovoltaic (PV) systems using a statistical approach. Specifically, a simulation model that mimics the theoretical performances of the inspected PV system is designed. Residuals, which are the difference between the measured and estimated output data, are used as a fault indicator. Indeed, residuals are used as the input for the Multivariate CUmulative SUM (MCUSUM) algorithm to detect potential faults. We evaluated the proposed method by using data from an actual 20 MWp grid-connected PV system located in the province of Adrar, Algeria.
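
The residual-plus-CUSUM pipeline described above can be illustrated with a simplified univariate one-sided CUSUM, a stand-in for the multivariate MCUSUM the paper actually uses. The slack k and threshold h are illustrative tuning values:

```python
def cusum(residuals, k=0.5, h=5.0):
    """One-sided CUSUM on model residuals (measured - simulated output).

    s_t = max(0, s_{t-1} + r_t - k); an alarm is raised whenever s_t > h.
    Small, zero-mean residuals keep s near zero, while a persistent
    positive bias, as a fault would produce, accumulates until it
    crosses the threshold."""
    s, alarms = 0.0, []
    for t, r in enumerate(residuals):
        s = max(0.0, s + r - k)
        if s > h:
            alarms.append(t)
    return alarms
```

The accumulation makes the chart sensitive to small persistent faults that a fixed per-sample threshold would miss.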

  18. Semblance for microseismic event detection

    Czech Academy of Sciences Publication Activity Database

    Staněk, František; Anikiev, D.; Valenta, Jan; Eisner, Leo

    2015-01-01

    Roč. 201, č. 3 (2015), s. 1362-1369 ISSN 0956-540X R&D Projects: GA ČR GAP210/12/2451 Institutional support: RVO:67985891 Keywords : microseismic event * microseismic monitoring * source mechanisms Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 2.484, year: 2015

  19. A new binaural detection model based on contralateral inhibition

    NARCIS (Netherlands)

    Breebaart, D.J.; Kohlrausch, A.G.; Dau, T.; Hohmann, V.; Kollmeier, B.

    1999-01-01

    Binaural models attempt to explain binaural phenomena in terms of neural mechanisms that extract binaural information from accoustic stimuli. In this paper, a model setup is presented that can be used to simulate binaural detection tasks. In contrast to the most often used cross correlation between

  20. Model-Based Detection of Pipe Leakage at Joints

    International Nuclear Information System (INIS)

    Kim, Taejin; Youn, Byeng D.; Woo, Sihyong

    2015-01-01

    Time domain reflectometry (TDR) is widely used for wire failure detection. It transmits a pulse that is reflected at the boundaries of different characteristic impedances. By analyzing the reflected signal, TDR makes it possible to locate the failure. In this study, TDR was used to detect the water leakage at a pipe joint. A wire attached to the pipe surface was soaked by water when a leak occurred, which affected the characteristic impedance of the wet part, resulting in a change in the reflected signal. To infer the leakage from the TDR signal, we first developed a finite difference time domain-based forward model that provided the output of the TDR signal given the configuration of the transmission line. Then, by solving the inverse problem, the locations of the leaks were found
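
Two standard relations underlie the TDR idea in this record: the reflection coefficient at an impedance boundary, and locating the boundary from the pulse's round-trip delay. A minimal sketch, with all numeric values illustrative rather than taken from the paper:

```python
def reflection_coefficient(z_dry, z_wet):
    """Reflection at the boundary between the dry and the water-soaked
    section of the sensing wire: gamma = (Z2 - Z1) / (Z2 + Z1)."""
    return (z_wet - z_dry) / (z_wet + z_dry)

def leak_location(round_trip_s, velocity_m_s):
    """Distance to the wet joint from the pulse's round-trip delay: the
    pulse travels to the discontinuity and back, hence the factor of two."""
    return velocity_m_s * round_trip_s / 2.0
```

A drop in characteristic impedance at the wet section produces a negative reflection, and the arrival time of that reflection places the leak along the pipe.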

  1. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED), with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one can exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis, since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that correlates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learned from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  2. Detection of neutral current events

    International Nuclear Information System (INIS)

    Innocenti, P.G.

    1979-01-01

    The topics investigated in the course of the study can be broadly divided into three classes: (i) Inclusive measurements of the scattered electron for the determination of structure functions, scaling violations, deltaL/deltaT and weak interaction effects. (ii) Exclusive measurements of the current jet (momentum, energy, particle composition) for the study of fragmentation functions, for search of new particles, new quarks and QCD effects in the jet. (iii) Search for heavy leptons by detection and identification of their decay products. (orig.)

  3. Generalized Detectability for Discrete Event Systems

    Science.gov (United States)

    Shu, Shaolong; Lin, Feng

    2011-01-01

    In our previous work, we investigated detectability of discrete event systems, which is defined as the ability to determine the current and subsequent states of a system based on observation. For different applications, we defined four types of detectabilities: (weak) detectability, strong detectability, (weak) periodic detectability, and strong periodic detectability. In this paper, we extend our results in three aspects. (1) We extend detectability from deterministic systems to nondeterministic systems. Such a generalization is necessary because there are many systems that need to be modeled as nondeterministic discrete event systems. (2) We develop polynomial algorithms to check strong detectability. The previous algorithms are based on observer whose construction is of exponential complexity, while the new algorithms are based on a new automaton called detector. (3) We extend detectability to D-detectability. While detectability requires determining the exact state of a system, D-detectability relaxes this requirement by asking only to distinguish certain pairs of states. With these extensions, the theory on detectability of discrete event systems becomes more applicable in solving many practical problems. PMID:21691432
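
State estimation for a nondeterministic discrete event system, the operation underlying both the observer and the detector automata mentioned above, can be sketched as a subset-construction update. This is a generic illustration of current-state estimation, not the paper's polynomial detector algorithm:

```python
def observe(transitions, initial_states, observation):
    """Current-state estimate of a nondeterministic finite automaton.

    transitions: dict mapping (state, event) -> set of successor states.
    After each observed event, the estimate is the set of states the
    system could currently be in; along a given run, the state is
    determined exactly once the estimate shrinks to a singleton."""
    current = set(initial_states)
    for event in observation:
        nxt = set()
        for s in current:
            nxt |= transitions.get((s, event), set())
        current = nxt
    return current
```

D-detectability relaxes the singleton requirement: the estimate only needs to separate certain specified pairs of states rather than pin down one state.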

  4. Cartan invariants and event horizon detection

    Science.gov (United States)

    Brooks, D.; Chavy-Waddy, P. C.; Coley, A. A.; Forget, A.; Gregoris, D.; MacCallum, M. A. H.; McNutt, D. D.

    2018-04-01

    We show that it is possible to locate the event horizon of a black hole (in arbitrary dimensions) by the zeros of certain Cartan invariants. This approach accounts for the recent results on the detection of stationary horizons using scalar polynomial curvature invariants, and improves upon them since the proposed method is computationally less expensive. As an application, we produce Cartan invariants that locate the event horizons for various exact four-dimensional and five-dimensional stationary, asymptotically flat (or (anti) de Sitter), black hole solutions and compare the Cartan invariants with the corresponding scalar curvature invariants that detect the event horizon.

  5. Semantic Context Detection Using Audio Event Fusion

    Directory of Open Access Journals (Sweden)

    Cheng Wen-Huang

    2006-01-01

    Semantic-level content analysis is a crucial issue in achieving efficient content retrieval and management. We propose a hierarchical approach that models audio events over a time series in order to accomplish semantic context detection. Two levels of modeling, audio event and semantic context modeling, are devised to bridge the gap between physical audio features and semantic concepts. In this work, hidden Markov models (HMMs) are used to model four representative audio events, namely gunshot, explosion, engine, and car braking, in action movies. At the semantic context level, generative (ergodic hidden Markov model) and discriminative (support vector machine (SVM)) approaches are investigated to fuse the characteristics and correlations among audio events, which provide cues for detecting gunplay and car-chasing scenes. The experimental results demonstrate the effectiveness of the proposed approaches and provide a preliminary framework for information mining by using audio characteristics.

  6. Metrics for Polyphonic Sound Event Detection

    Directory of Open Access Journals (Sweden)

    Annamaria Mesaros

    2016-05-01

    This paper presents and discusses various metrics proposed for evaluation of polyphonic sound event detection systems used in realistic situations where there are typically multiple sound sources active simultaneously. The system output in this case contains overlapping events, marked as multiple sounds detected as being active at the same time. The polyphonic system output requires a suitable procedure for evaluation against a reference. Metrics from neighboring fields such as speech recognition and speaker diarization can be used, but they need to be partially redefined to deal with the overlapping events. We present a review of the most common metrics in the field and the way they are adapted and interpreted in the polyphonic case. We discuss segment-based and event-based definitions of each metric and explain the consequences of instance-based and class-based averaging using a case study. In parallel, we provide a toolbox containing implementations of presented metrics.
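
A segment-based metric of the kind the paper reviews can be sketched as follows: time is divided into fixed segments, and true/false positives and false negatives are counted per segment over the sets of active event labels. This is a minimal instance-based sketch in the spirit of the segment-based definitions, not the paper's toolbox:

```python
def segment_metrics(reference, estimate, n_segments):
    """Segment-based precision/recall/F1 for polyphonic sound event detection.

    reference and estimate map a segment index to the set of event labels
    active in that segment; overlapping events are naturally handled
    because each segment carries a set of labels."""
    tp = fp = fn = 0
    for seg in range(n_segments):
        ref = reference.get(seg, set())
        est = estimate.get(seg, set())
        tp += len(ref & est)
        fp += len(est - ref)
        fn += len(ref - est)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1
```

Event-based metrics would instead match whole reference and estimated events, which penalizes fragmented detections differently.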

  7. Detection of goal events in soccer videos

    Science.gov (United States)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2005-01-01

    In this paper, we present an automatic extraction of goal events in soccer videos using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio content comprises three steps: 1) extraction of audio features from a video sequence, 2) candidate detection of highlight events based on the information provided by the feature extraction methods and a hidden Markov model (HMM), and 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method with the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Non-negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games, comprising eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.

  8. A Sensitivity Analysis of a Computer Model-Based Leak Detection System for Oil Pipelines

    OpenAIRE

    Zhe Lu; Yuntong She; Mark Loewen

    2017-01-01

    Improving leak detection capability to eliminate undetected releases is an area of focus for the energy pipeline industry, and the pipeline companies are working to improve existing methods for monitoring their pipelines. Computer model-based leak detection methods that detect leaks by analyzing the pipeline hydraulic state have been widely employed in the industry, but their effectiveness in practical applications is often challenged by real-world uncertainties. This study quantitatively ass...

  9. Model-based failure detection for cylindrical shells from noisy vibration measurements.

    Science.gov (United States)

    Candy, J V; Fisher, K A; Guidry, B L; Chambers, D H

    2014-12-01

    Model-based processing is a theoretically sound methodology to address difficult objectives in complex physical problems involving multi-channel sensor measurement systems. It involves the incorporation of analytical models of both physical phenomenology (complex vibrating structures, noisy operating environment, etc.) and the measurement processes (sensor networks and including noise) into the processor to extract the desired information. In this paper, a model-based methodology is developed to accomplish the task of online failure monitoring of a vibrating cylindrical shell externally excited by controlled excitations. A model-based processor is formulated to monitor system performance and detect potential failure conditions. The objective of this paper is to develop a real-time, model-based monitoring scheme for online diagnostics in a representative structural vibrational system based on controlled experimental data.

  10. A simple strategy for fall events detection

    KAUST Repository

    Harrou, Fouzi; Zerrouki, Nabil; Sun, Ying; Houacine, Amrane

    2017-01-01

    the multivariate exponentially weighted moving average (MEWMA) control chart to detect fall events. Towards this end, a set of ratios for five partial occupancy areas of the human body for each frame are collected and used as the input data to MEWMA chart

  11. Adaptive prediction applied to seismic event detection

    International Nuclear Information System (INIS)

    Clark, G.A.; Rodgers, P.W.

    1981-01-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data
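
The Widrow-Hoff LMS predictor used as an onset indicator can be sketched as follows: the filter predicts each sample from the preceding ones, and the prediction error jumps when the signal stops matching the learned background. The trace below is synthetic and illustrative, not seismic data:

```python
import random

def lms_onset(signal, order=4, mu=0.01):
    """Widrow-Hoff LMS filter in a prediction configuration: predict each
    sample from the previous `order` samples and return the absolute
    prediction error, which serves as the onset indicator."""
    w = [0.0] * order
    errors = []
    for t in range(order, len(signal)):
        x = signal[t - order:t]
        y = sum(wi * xi for wi, xi in zip(w, x))   # one-step prediction
        e = signal[t] - y
        errors.append(abs(e))
        # LMS weight update: w <- w + 2*mu*e*x
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, x)]
    return errors

# Synthetic trace: background noise, then a small "event" starting at t = 150.
random.seed(1)
trace = [random.gauss(0.0, 0.1) for _ in range(160)]
for t in range(150, 160):
    trace[t] += 5.0
errors = lms_onset(trace)
```

On the unpredictable background the error stays at the noise level; the sudden amplitude step at the onset produces a clear spike in the error sequence.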

  12. Adaptive prediction applied to seismic event detection

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G.A.; Rodgers, P.W.

    1981-09-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data.

  13. Observer-Based and Regression Model-Based Detection of Emerging Faults in Coal Mills

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Lin, Bao; Jørgensen, Sten Bay

    2006-01-01

    In order to improve the reliability of power plants it is important to detect faults as fast as possible, which makes it interesting to find the most efficient detection method. Since modeling of large-scale systems is time consuming, it is interesting to compare a model-based method with data-driven ones....

  14. Model-based fault detection for proton exchange membrane fuel cell ...

    African Journals Online (AJOL)

    In this paper, an intelligent model-based fault detection (FD) scheme is developed for proton exchange membrane fuel cell (PEMFC) dynamic systems using independent radial basis function (RBF) networks. The novelty is that these RBF networks are used to model the PEMFC dynamic systems and residuals are generated based ...

  15. Abnormal Event Detection Using Local Sparse Representation

    DEFF Research Database (Denmark)

    Ren, Huamin; Moeslund, Thomas B.

    2014-01-01

    We propose to detect abnormal events via a sparse subspace clustering algorithm. Unlike most existing approaches, which search for optimized normal bases and detect abnormality based on the least-squares error or reconstruction error from the learned normal patterns, we propose an abnormality measurem... is found that satisfies: the distance between its local space and the normal space is large. We evaluate our method on two public benchmark datasets: the UCSD and Subway Entrance datasets. The comparison to the state-of-the-art methods validates our method's effectiveness.

  16. A Sensitivity Analysis of a Computer Model-Based Leak Detection System for Oil Pipelines

    Directory of Open Access Journals (Sweden)

    Zhe Lu

    2017-08-01

    Full Text Available Improving leak detection capability to eliminate undetected releases is an area of focus for the energy pipeline industry, and the pipeline companies are working to improve existing methods for monitoring their pipelines. Computer model-based leak detection methods that detect leaks by analyzing the pipeline hydraulic state have been widely employed in the industry, but their effectiveness in practical applications is often challenged by real-world uncertainties. This study quantitatively assessed the effects of uncertainties on leak detectability of a commonly used real-time transient model-based leak detection system. Uncertainties in fluid properties, field sensors, and the data acquisition system were evaluated. Errors were introduced into the input variables of the leak detection system individually and collectively, and the changes in leak detectability caused by the uncertainties were quantified using simulated leaks. This study provides valuable quantitative results contributing towards a better understanding of how real-world uncertainties affect leak detection. A general ranking of the importance of the uncertainty sources was obtained: from high to low it is time skew, bulk modulus error, viscosity error, and polling time. It was also shown that inertia-dominated pipeline systems were less sensitive to uncertainties compared to friction-dominated systems.

  17. Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2017-03-01

    The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS which utilizes physically based water quality and hydraulics simulation models can outperform a signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed in increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Model based Fault Detection and Isolation for Driving Motors of a Ground Vehicle

    Directory of Open Access Journals (Sweden)

    Young-Joon Kim

    2016-04-01

    Full Text Available This paper proposes a model-based current sensor and position sensor fault detection and isolation algorithm for the driving motors of an in-wheel independent-drive electric vehicle. From a low-level perspective, fault diagnosis is conducted and analyzed to enhance robustness and stability. Composing the state equation of the interior permanent magnet synchronous motor (IPMSM), current sensor and position sensor faults are diagnosed with parity equations. The validity and usefulness of the algorithm are confirmed with IPMSM fault-occurrence simulation data.
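
    The parity-equation idea can be illustrated on a toy linear model rather than a full IPMSM: residuals built from the left null space of the stacked observability matrix vanish for fault-free outputs regardless of the initial state, and react to a sensor fault. The matrices and fault magnitude below are invented for illustration:

```python
import numpy as np

# Toy autonomous model x(k+1) = A x(k), y(k) = C x(k); both states measured.
# (A real IPMSM model would include inputs and electrical parameters.)
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
C = np.eye(2)

s = 3                                   # parity window length
# Stacked observability matrix O = [C; C A; C A^2]
O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(s)])
U, _, _ = np.linalg.svd(O)
W = U[:, np.linalg.matrix_rank(O):].T   # parity relations: W @ O = 0

# Fault-free output window stacked as Y = [y(0); y(1); y(2)]
x = np.array([1.0, -0.5])
ys = []
for _ in range(s):
    ys.append(C @ x)
    x = A @ x
Y = np.concatenate(ys)
r_ok = W @ Y                            # residual is zero for any x(0)

# A bias fault on sensor 1 in the last sample makes the residual nonzero
Y_fault = Y.copy()
Y_fault[4] += 0.3
r_fault = W @ Y_fault
```

    Because `W` annihilates the observability matrix, the residual is insensitive to the (unknown) state and responds only to faults and noise.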

  19. First CNGS events detected by LVD

    International Nuclear Information System (INIS)

    Agafonova, N.Yu.; Boyarkin, V.V.; Kuznetsov, V.V.; Kuznetsov, V.A.; Malguin, A.S.; Ryasny, V.G.; Ryazhskaya, O.G.; Yakushev, V.F.; Zatsepin, G.T.; Aglietta, M.; Bonardi, A.; Fulgione, W.; Galeotti, P.; Porta, A.; Saavedra, O.; Vigorito, C.; Antonioli, P.; Bari, G.; Giusti, P.; Menghetti, H.; Persiani, R.; Pesci, A.; Sartorelli, G.; Selvi, M.; Zichichi, A.; Bruno, G.; Ghia, P.L.; Garbini, M.; Kemp, E.; Pless, I.A.; Votano, L.

    2007-01-01

    The CERN Neutrino to Gran Sasso (CNGS) project aims to produce a high energy, wide band ν_μ beam at CERN and send it toward the INFN Gran Sasso National Laboratory (LNGS), 732 km away. Its main goal is the observation of ν_τ appearance through neutrino flavour oscillation. The beam started its operation in August 2006 for about 12 days: a total of 7.6 × 10^17 protons were delivered to the target. The LVD detector, installed in hall A of the LNGS and mainly dedicated to the study of supernova neutrinos, was fully operating during the whole CNGS running time. A total of 569 events were detected in coincidence with the beam spill time. This is in good agreement with the expected number of events from Monte Carlo simulations. (orig.)

  20. First CNGS events detected by LVD

    International Nuclear Information System (INIS)

    Selvi, M.

    2007-01-01

    The Cern Neutrino to Gran Sasso (CNGS) project aims to produce a high energy, wide band ν_μ beam at Cern and send it towards the INFN Gran Sasso National Laboratory (LNGS), 732 km away. Its main goal is the observation of ν_τ appearance through neutrino flavour oscillation. The beam started its operation in August 2006 for about 12 days: a total of 7.6 × 10^17 protons were delivered to the target. The LVD detector, installed in hall A of the LNGS and mainly dedicated to the study of supernova neutrinos, was fully operating during the whole CNGS running time. A total of 569 events were detected in coincidence with the beam spill time. This is in good agreement with the expected number of events from Monte Carlo simulations

  1. LAN attack detection using Discrete Event Systems.

    Science.gov (United States)

    Hubballi, Neminath; Biswas, Santosh; Roopa, S; Ratti, Ritesh; Nandi, Sukumar

    2011-01-01

    Address Resolution Protocol (ARP) is used for determining the link layer or Medium Access Control (MAC) address of a network host, given its Internet Layer (IP) or Network Layer address. ARP is a stateless protocol, and any IP-MAC pairing sent by a host is accepted without verification. This weakness in ARP may be exploited by malicious hosts in a Local Area Network (LAN) by spoofing IP-MAC pairs. Several schemes have been proposed in the literature to circumvent these attacks; however, these techniques either make IP-MAC pairings static, modify the existing ARP, or patch the operating systems of all the hosts. In this paper we propose a Discrete Event System (DES) approach to an Intrusion Detection System (IDS) for LAN-specific attacks which does not require any extra constraint such as static IP-MAC pairings or changes to the ARP. A DES model is built for the LAN under both normal and compromised (i.e., spoofed request/response) situations based on the sequences of ARP-related packets. Sequences of ARP events in normal and spoofed scenarios are similar, thereby rendering the same DES models for both cases. To create different ARP events under normal and spoofed conditions, the proposed technique uses active ARP probing. However, this probing adds extra ARP traffic in the LAN. A DES detector is then built to determine, from observed ARP-related events, whether the LAN is operating under a normal or compromised situation. The scheme also minimizes extra ARP traffic by probing the source IP-MAC pair of only those ARP packets which are yet to be determined as genuine/spoofed by the detector. Also, spoofed IP-MAC pairs determined by the detector are stored in tables to detect other LAN attacks triggered by spoofing, namely man-in-the-middle (MiTM), denial of service, etc. The scheme is successfully validated in a test bed. Copyright © 2010 ISA. Published by Elsevier Ltd. All rights reserved.
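
    The active-probing discrimination at the heart of the scheme can be sketched abstractly: after probing the claimed IP, a genuine binding yields one consistent MAC while a spoofed one yields conflicting MACs. The event names and probe logic below are illustrative, not the paper's exact DES model:

```python
# Hypothetical event-driven sketch of the genuine/spoofed decision.
def classify_arp(reply_ip, reply_mac, probe_responses):
    """Classify an observed ARP reply after active probing.

    probe_responses: list of (ip, mac) pairs received in response to an
    active ARP probe of reply_ip. A genuine binding produces a single
    consistent MAC; conflicting MACs for the same IP indicate spoofing."""
    macs = {mac for ip, mac in probe_responses if ip == reply_ip}
    macs.add(reply_mac)
    return "GENUINE" if len(macs) == 1 else "SPOOFED"

verdict_ok = classify_arp("10.0.0.5", "aa:bb", [("10.0.0.5", "aa:bb")])
verdict_bad = classify_arp("10.0.0.5", "aa:bb", [("10.0.0.5", "cc:dd")])
```

    In the paper this decision is driven by a DES detector over ARP event sequences; the sketch only captures the consistency check that makes normal and spoofed event sequences distinguishable.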

  2. Real-Time Model-Based Leak-Through Detection within Cryogenic Flow Systems

    Science.gov (United States)

    Walker, M.; Figueroa, F.

    2015-01-01

    The timely detection of leaks within cryogenic fuel replenishment systems is of significant importance to operators on account of the safety and economic impacts associated with material loss and operational inefficiencies. The associated loss of pressure control also affects the stability and the ability to control the phase of cryogenic fluids during replenishment operations. Current research dedicated to providing Prognostics and Health Management (PHM) coverage of such cryogenic replenishment systems has focused on the detection of leaks to atmosphere involving relatively simple model-based diagnostic approaches that, while effective, are unable to isolate the fault to specific piping system components. The authors have extended this research to focus on the detection of leaks through closed valves that are intended to isolate sections of the piping system from the flow and pressurization of cryogenic fluids. The described approach employs model-based detection of leak-through conditions based on correlations of pressure changes across isolation valves and attempts to isolate the faults to specific valves. Implementation of this capability is enabled by knowledge and information embedded in the domain model of the system. The approach has been used effectively to detect such leak-through faults during cryogenic operational testing at the Cryogenic Testbed at NASA's Kennedy Space Center.

  3. Observer and data-driven model based fault detection in Power Plant Coal Mills

    DEFF Research Database (Denmark)

    Fogh Odgaard, Peter; Lin, Bao; Jørgensen, Sten Bay

    2008-01-01

    This paper presents and compares model-based and data-driven fault detection approaches for coal mill systems. The first approach detects faults with an optimal unknown input observer developed from a simplified energy balance model. Due to the time-consuming effort of developing a first-principles model with motor power as the controlled variable, data-driven methods for fault detection are also investigated. Regression models that represent normal operating conditions (NOCs) are developed with both static and dynamic principal component analysis and partial least squares methods. The residual between the process measurement and the NOC model prediction is used for fault detection. A hybrid approach, where a data-driven model is employed to derive an optimal unknown input observer, is also implemented. The three methods are evaluated with case studies on coal mill data, which include a fault....
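
    The static-PCA variant of the NOC-residual idea can be sketched as follows; the two correlated "mill" signals, the single retained component, and the empirical 99% control limit are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(2)

# Normal operating condition (NOC) data: two correlated process signals
n = 500
load = rng.normal(size=n)
X_noc = np.column_stack([load, 0.8 * load]) + 0.05 * rng.normal(size=(n, 2))
mean = X_noc.mean(axis=0)

# Static PCA model of the NOC: retain the dominant principal component
_, _, Vt = np.linalg.svd(X_noc - mean, full_matrices=False)
P = Vt[:1].T                             # loading vector

def spe(x):
    """Squared prediction error of a sample against the PCA NOC model."""
    xc = x - mean
    return float(np.sum((xc - P @ (P.T @ xc)) ** 2))

limit = np.quantile([spe(x) for x in X_noc], 0.99)   # empirical control limit
normal_sample = np.array([1.0, 0.8])     # respects the NOC correlation
faulty_sample = np.array([1.0, 1.6])     # breaks the NOC correlation
```

    A sample that breaks the learned correlation structure produces a squared prediction error far above the control limit, which is the fault-detection signal.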

  4. Multilingual event extraction for epidemic detection.

    Science.gov (United States)

    Lejeune, Gaël; Brixtel, Romain; Doucet, Antoine; Lucas, Nadine

    2015-10-01

    This paper presents a multilingual news surveillance system applied to tele-epidemiology. It has been shown that multilingual approaches improve timeliness in the detection of epidemic events across the globe, eliminating the wait for local news to be translated into major languages. We present here a system to extract epidemic events in potentially any language, provided a Wikipedia seed for common disease names exists. The Daniel system presented herein relies on properties that are common to news writing (the journalistic genre), the most useful being repetition and saliency. Wikipedia is used to screen common disease names to be matched with repeated character strings. Language variations, such as declensions, are handled by processing text at the character level rather than at the word level. This additionally makes it possible to handle various writing systems in a similar fashion. As no multilingual ground truth existed to evaluate the Daniel system, we built a multilingual corpus from the Web and collected annotations from native speakers of Chinese, English, Greek, Polish and Russian with no connection or interest in the Daniel system. This data set is freely available online and can be used for the evaluation of other event extraction systems. Experiments on 5 of the 17 languages tested are detailed in this paper: Chinese, English, Greek, Polish and Russian. The Daniel system achieves an average F-measure of 82% in these 5 languages. It reaches 87% on BEcorpus, the state-of-the-art corpus in English, slightly below top-performing systems, which are tailored with numerous language-specific resources. The consistent performance of Daniel across multiple languages is an important contribution to the reactivity and the coverage of epidemiological event detection systems. Most event extraction systems rely on extensive resources that are language-specific. While their sophistication induces excellent results (over 90% precision and recall), it restricts their

  5. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested on both business-like event logs, as recorded in a higher-education enterprise resource planning system, and a real case scenario involving a set of Dutch municipalities.

  6. Mixture model-based clustering and logistic regression for automatic detection of microaneurysms in retinal images

    Science.gov (United States)

    Sánchez, Clara I.; Hornero, Roberto; Mayo, Agustín; García, María

    2009-02-01

    Diabetic retinopathy is one of the leading causes of blindness and vision defects in developed countries. Early detection and diagnosis are crucial to avoid visual complications. Microaneurysms are the first ocular signs of this disease, and their detection is of paramount importance for the development of a computer-aided diagnosis technique that permits a prompt diagnosis. However, the detection of microaneurysms in retinal images is a difficult task due to the wide variability that these images usually present in screening programs. We propose a statistical approach based on mixture model-based clustering and logistic regression which is robust to changes in the appearance of retinal fundus images. The method is evaluated on the public database proposed by the Retinopathy Online Challenge in order to obtain an objective performance measure and to allow a comparative study with other proposed algorithms.
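
    The two-stage pipeline can be sketched with a hand-rolled EM fit of a two-component 1-D Gaussian mixture (for candidate-pixel clustering) followed by a small logistic regression (for candidate classification). The synthetic intensities and the two stand-in features are assumptions, not the paper's actual descriptors:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stage 1 -- mixture-model clustering: separate dark candidate pixels from
# background with a two-component 1-D Gaussian mixture fitted by EM.
background = rng.normal(0.7, 0.05, size=900)
lesions = rng.normal(0.3, 0.05, size=100)
intensity = np.concatenate([background, lesions])

def em_gmm_1d(x, iters=60):
    mu = np.array([x.min(), x.max()])
    sd = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities of each component for each pixel
        lik = pi * np.exp(-((x[:, None] - mu) ** 2) / (2 * sd**2)) / sd
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and standard deviations
        nk = r.sum(axis=0)
        pi, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return mu, r

mu, resp = em_gmm_1d(intensity)
dark = int(np.argmin(mu))
candidates = resp[:, dark] > 0.5           # candidate microaneurysm pixels

# Stage 2 -- logistic regression on candidate-region features (a synthetic
# 2-feature stand-in for shape/contrast descriptors).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

def fit_logistic(X, y, lr=0.5, iters=1000):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)  # gradient of the logistic loss
    return w

w = fit_logistic(X, y)
pred = (np.hstack([X, np.ones((len(X), 1))]) @ w) > 0
accuracy = (pred == (y > 0.5)).mean()
```

    The mixture model isolates the dark cluster without labels, and the supervised classifier then separates true lesions from spurious candidates.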

  7. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    Science.gov (United States)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
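
    The residual-monitoring core of such a piecewise-linear scheme can be sketched as follows; the trim-point table, gains, fault, and threshold are invented for illustration and are far simpler than an engine model:

```python
import numpy as np

# Sketch: piecewise-linear model with two trim points; an anomaly is a
# residual between "sensed" output and model-predicted output that
# exceeds a threshold.
trim_points = {               # name -> (trim input, trim output, local gain)
    "low":  (0.2, 10.0, 40.0),
    "high": (0.8, 50.0, 60.0),
}

def model_output(u):
    """Predict the output by linearizing around the nearest trim point."""
    name = min(trim_points, key=lambda k: abs(trim_points[k][0] - u))
    u0, y0, gain = trim_points[name]
    return y0 + gain * (u - u0)

def residuals(inputs, sensed):
    return np.array([y - model_output(u) for u, y in zip(inputs, sensed)])

inputs = np.array([0.2, 0.3, 0.7, 0.8])
nominal = np.array([model_output(u) for u in inputs])
faulty = nominal + np.array([0.0, 0.0, 8.0, 8.0])   # drift on later samples
r_nominal = residuals(inputs, nominal)
r_faulty = residuals(inputs, faulty)
threshold = 3.0
```

    Updating the trim-point table from steady-state data, as the paper does, directly tightens these residuals and hence the detection threshold that can be used.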

  8. Ensemble regression model-based anomaly detection for cyber-physical intrusion detection in smart grids

    DEFF Research Database (Denmark)

    Kosek, Anna Magdalena; Gehrke, Oliver

    2016-01-01

    The shift from centralised large production to distributed energy production has several consequences for current power system operation. The replacement of large power plants by growing numbers of distributed energy resources (DERs) increases the dependency of the power system on small-scale, distributed production. Many of these DERs can be accessed and controlled remotely, posing a cybersecurity risk. This paper investigates an intrusion detection system which evaluates DER operation in order to discover unauthorized control actions. The proposed anomaly detection method is based....

  9. Tuning and Test of Fragmentation Models Based on Identified Particles and Precision Event Shape Data

    CERN Document Server

    Abreu, P; Adye, T; Ajinenko, I; Alekseev, G D; Alemany, R; Allport, P P; Almehed, S; Amaldi, Ugo; Amato, S; Andreazza, A; Andrieux, M L; Antilogus, P; Apel, W D; Åsman, B; Augustin, J E; Augustinus, A; Baillon, Paul; Bambade, P; Barão, F; Barate, R; Barbi, M S; Bardin, Dimitri Yuri; Baroncelli, A; Bärring, O; Barrio, J A; Bartl, Walter; Bates, M J; Battaglia, Marco; Baubillier, M; Baudot, J; Becks, K H; Begalli, M; Beillière, P; Belokopytov, Yu A; Belous, K S; Benvenuti, Alberto C; Berggren, M; Bertini, D; Bertrand, D; Besançon, M; Bianchi, F; Bigi, M; Bilenky, S M; Billoir, P; Bloch, D; Blume, M; Bolognese, T; Bonesini, M; Bonivento, W; Booth, P S L; Bosio, C; Botner, O; Boudinov, E; Bouquet, B; Bourdarios, C; Bowcock, T J V; Bozzo, M; Branchini, P; Brand, K D; Brenke, T; Brenner, R A; Bricman, C; Brown, R C A; Brückman, P; Brunet, J M; Bugge, L; Buran, T; Burgsmüller, T; Buschmann, P; Buys, A; Cabrera, S; Caccia, M; Calvi, M; Camacho-Rozas, A J; Camporesi, T; Canale, V; Canepa, M; Cankocak, K; Cao, F; Carena, F; Carroll, L; Caso, Carlo; Castillo-Gimenez, M V; Cattai, A; Cavallo, F R; Chabaud, V; Charpentier, P; Chaussard, L; Checchia, P; Chelkov, G A; Chen, M; Chierici, R; Chliapnikov, P V; Chochula, P; Chorowicz, V; Chudoba, J; Cindro, V; Collins, P; Contreras, J L; Contri, R; Cortina, E; Cosme, G; Cossutti, F; Cowell, J H; Crawley, H B; Crennell, D J; Crosetti, G; Cuevas-Maestro, J; Czellar, S; Dahl-Jensen, Erik; Dahm, J; D'Almagne, B; Dam, M; Damgaard, G; Dauncey, P D; Davenport, Martyn; Da Silva, W; Defoix, C; Deghorain, A; Della Ricca, G; Delpierre, P A; Demaria, N; De Angelis, A; de Boer, Wim; De Brabandere, S; De Clercq, C; La Vaissière, C de; De Lotto, B; De Min, A; De Paula, L S; De Saint-Jean, C; Dijkstra, H; Di Ciaccio, Lucia; Di Diodato, A; Djama, F; Dolbeau, J; Dönszelmann, M; Doroba, K; Dracos, M; Drees, J; Drees, K A; Dris, M; Durand, J D; Edsall, D M; Ehret, R; Eigen, G; Ekelöf, T J C; Ekspong, Gösta; Elsing, M; Engel, J P; Erzen, B; 
Espirito-Santo, M C; Falk, E; Fassouliotis, D; Feindt, Michael; Ferrer, A; Fichet, S; Filippas-Tassos, A; Firestone, A; Fischer, P A; Föth, H; Fokitis, E; Fontanelli, F; Formenti, F; Franek, B J; Frenkiel, P; Fries, D E C; Frodesen, A G; Frühwirth, R; Fulda-Quenzer, F; Fuster, J A; Galloni, A; Gamba, D; Gandelman, M; García, C; García, J; Gaspar, C; Gasparini, U; Gavillet, P; Gazis, E N; Gelé, D; Gerber, J P; Gokieli, R; Golob, B; Gopal, Gian P; Gorn, L; Górski, M; Guz, Yu; Gracco, Valerio; Graziani, E; Green, C; Grefrath, A; Gris, P; Grosdidier, G; Grzelak, K; Gumenyuk, S A; Gunnarsson, P; Günther, M; Guy, J; Hahn, F; Hahn, S; Hajduk, Z; Hallgren, A; Hamacher, K; Harris, F J; Hedberg, V; Henriques, R P; Hernández, J J; Herquet, P; Herr, H; Hessing, T L; Higón, E; Hilke, Hans Jürgen; Hill, T S; Holmgren, S O; Holt, P J; Holthuizen, D J; Hoorelbeke, S; Houlden, M A; Hrubec, Josef; Huet, K; Hultqvist, K; Jackson, J N; Jacobsson, R; Jalocha, P; Janik, R; Jarlskog, C; Jarlskog, G; Jarry, P; Jean-Marie, B; Johansson, E K; Jönsson, L B; Jönsson, P E; Joram, Christian; Juillot, P; Kaiser, M; Kapusta, F; Karafasoulis, K; Karlsson, M; Karvelas, E; Katsanevas, S; Katsoufis, E C; Keränen, R; Khokhlov, Yu A; Khomenko, B A; Khovanskii, N N; King, B J; Kjaer, N J; Klapp, O; Klein, H; Klovning, A; Kluit, P M; Köne, B; Kokkinias, P; Koratzinos, M; Korcyl, K; Kostyukhin, V; Kourkoumelis, C; Kuznetsov, O; Kreuter, C; Kronkvist, I J; Krumshtein, Z; Krupinski, W; Kubinec, P; Kucewicz, W; Kurvinen, K L; Lacasta, C; Laktineh, I; Lamsa, J; Lanceri, L; Lane, D W; Langefeld, P; Lapin, V; Laugier, J P; Lauhakangas, R; Leder, Gerhard; Ledroit, F; Lefébure, V; Legan, C K; Leitner, R; Lemonne, J; Lenzen, Georg; Lepeltier, V; Lesiak, T; Libby, J; Liko, D; Lindner, R; Lipniacka, A; Lippi, I; Lörstad, B; Loken, J G; López, J M; Loukas, D; Lutz, P; Lyons, L; Naughton, J M; Maehlum, G; Mahon, J R; Maio, A; Malmgren, T G M; Malychev, V; Mandl, F; Marco, J; Marco, R P; Maréchal, B; Margoni, M; Marin, 
J C; Mariotti, C; Markou, A; Martínez-Rivero, C; Martínez-Vidal, F; Martí i García, S; Masik, J; Matorras, F; Matteuzzi, C; Matthiae, Giorgio; Mazzucato, M; McCubbin, M L; McKay, R; McNulty, R; Medbo, J; Merk, M; Meroni, C; Meyer, S; Meyer, W T; Myagkov, A; Michelotto, M; Migliore, E; Mirabito, L; Mitaroff, Winfried A; Mjörnmark, U; Moa, T; Møller, R; Mönig, K; Monge, M R; Morettini, P; Müller, H; Mulders, M; Mundim, L M; Murray, W J; Muryn, B; Myatt, Gerald; Naraghi, F; Navarria, Francesco Luigi; Navas, S; Nawrocki, K; Negri, P; Neumann, W; Neumeister, N; Nicolaidou, R; Nielsen, B S; Nieuwenhuizen, M; Nikolaenko, V; Niss, P; Nomerotski, A; Normand, Ainsley; Oberschulte-Beckmann, W; Obraztsov, V F; Olshevskii, A G; Onofre, A; Orava, Risto; Österberg, K; Ouraou, A; Paganini, P; Paganoni, M; Pagès, P; Pain, R; Palka, H; Papadopoulou, T D; Papageorgiou, K; Pape, L; Parkes, C; Parodi, F; Passeri, A; Pegoraro, M; Peralta, L; Pernegger, H; Pernicka, Manfred; Perrotta, A; Petridou, C; Petrolini, A; Petrovykh, M; Phillips, H T; Piana, G; Pierre, F; Plaszczynski, S; Podobrin, O; Pol, M E; Polok, G; Poropat, P; Pozdnyakov, V; Privitera, P; Pukhaeva, N; Pullia, Antonio; Radojicic, D; Ragazzi, S; Rahmani, H; Rames, J; Ratoff, P N; Read, A L; Reale, M; Rebecchi, P; Redaelli, N G; Regler, Meinhard; Reid, D; Renton, P B; Resvanis, L K; Richard, F; Richardson, J; Rídky, J; Rinaudo, G; Ripp, I; Romero, A; Roncagliolo, I; Ronchese, P; Roos, L; Rosenberg, E I; Rosso, E; Roudeau, Patrick; Rovelli, T; Rückstuhl, W; Ruhlmann-Kleider, V; Ruiz, A; Rybicki, K; Saarikko, H; Sacquin, Yu; Sadovskii, A; Sahr, O; Sajot, G; Salt, J; Sánchez, J; Sannino, M; Schimmelpfennig, M; Schneider, H; Schwickerath, U; Schyns, M A E; Sciolla, G; Scuri, F; Seager, P; Sedykh, Yu; Segar, A M; Seitz, A; Sekulin, R L; Serbelloni, L; Shellard, R C; Siegrist, P; Silvestre, R; Simonetti, S; Simonetto, F; Sissakian, A N; Sitár, B; Skaali, T B; Smadja, G; Smirnov, N; Smirnova, O G; Smith, G R; Sokolov, A; Sosnowski, 
R; Souza-Santos, D; Spassoff, Tz; Spiriti, E; Sponholz, P; Squarcia, S; Stanescu, C; Stapnes, Steinar; Stavitski, I; Stevenson, K; Stichelbaut, F; Stocchi, A; Strauss, J; Strub, R; Stugu, B; Szczekowski, M; Szeptycka, M; Tabarelli de Fatis, T; Tavernet, J P; Chikilev, O G; Thomas, J; Tilquin, A; Timmermans, J; Tkatchev, L G; Todorov, T; Todorova, S; Toet, D Z; Tomaradze, A G; Tomé, B; Tonazzo, A; Tortora, L; Tranströmer, G; Treille, D; Trischuk, W; Tristram, G; Trombini, A; Troncon, C; Tsirou, A L; Turluer, M L; Tyapkin, I A; Tyndel, M; Tzamarias, S; Überschär, B; Ullaland, O; Uvarov, V; Valenti, G; Vallazza, E; van Apeldoorn, G W; van Dam, P; Van Eldik, J; Vassilopoulos, N; Vegni, G; Ventura, L; Venus, W A; Verbeure, F; Verlato, M; Vertogradov, L S; Vilanova, D; Vincent, P; Vitale, L; Vlasov, E; Vodopyanov, A S; Vrba, V; Wahlen, H; Walck, C; Waldner, F; Weierstall, M; Weilhammer, Peter; Weiser, C; Wetherell, Alan M; Wicke, D; Wickens, J H; Wielers, M; Wilkinson, G R; Williams, W S C; Winter, M; Witek, M; Woschnagg, K; Yip, K; Yushchenko, O P; Zach, F; Zaitsev, A; Zalewska-Bak, A; Zalewski, Piotr; Zavrtanik, D; Zevgolatakos, E; Zimin, N I; Zito, M; Zontar, D; Zucchelli, G C; Zumerle, G

    1996-01-01

    Event shape and charged particle inclusive distributions are measured using 750000 decays of the $Z$ to hadrons from the DELPHI detector at LEP. These precise data allow a decisive confrontation with models of the hadronization process. Improved tunings of the JETSET, ARIADNE and HERWIG parton shower models and the JETSET matrix element model are obtained by fitting the models to these DELPHI data as well as to identified particle distributions from all LEP experiments. The description of the data distributions by the models is critically reviewed, with special importance attributed to identified particles.

  10. Model-based approach for cyber-physical attack detection in water distribution systems.

    Science.gov (United States)

    Housh, Mashor; Ohar, Ziv

    2018-08-01

    Modern Water Distribution Systems (WDSs) are often controlled by Supervisory Control and Data Acquisition (SCADA) systems and Programmable Logic Controllers (PLCs) which manage their operation and maintain a reliable water supply. As such, and with the cyber layer becoming a central component of WDS operations, these systems are at a greater risk of being subjected to cyberattacks. This paper offers a model-based methodology that combines a detailed hydraulic understanding of WDSs with an anomaly detection algorithm for the identification of complex cyberattacks that cannot be fully identified by hydraulically based rules alone. The results show that the proposed algorithm is capable of achieving the best-known performance when tested on the data published in the BATtle of the Attack Detection ALgorithms (BATADAL) competition (http://www.batadal.net). Copyright © 2018. Published by Elsevier Ltd.

  11. Detection of Internal Short Circuit in Lithium Ion Battery Using Model-Based Switching Model Method

    Directory of Open Access Journals (Sweden)

    Minhwan Seo

    2017-01-01

    Full Text Available Early detection of an internal short circuit (ISCr) in a Li-ion battery can prevent it from undergoing thermal runaway, and thereby ensure battery safety. In this paper, a model-based switching model method (SMM) is proposed to detect the ISCr in the Li-ion battery. The SMM updates the model of the Li-ion battery with ISCr to improve the accuracy of the ISCr resistance estimates R_ISCf. The open circuit voltage (OCV) and the state of charge (SOC) are estimated by applying the equivalent circuit model, using the recursive least squares algorithm and the relation between OCV and SOC. As a fault index, R_ISCf is estimated from the estimated OCVs and SOCs to detect the ISCr, and is used to update the model; this process yields accurate estimates of OCV and R_ISCf. The next R_ISCf is then estimated and used to update the model iteratively. Simulation data from a MATLAB/Simulink model and experimental data verify that this algorithm yields highly accurate R_ISCf estimates for detecting the ISCr, thereby helping the battery management system to achieve early detection of the ISCr.
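
    The recursive-least-squares identification step can be illustrated with a generic RLS fit of OCV and internal resistance from terminal measurements, followed by a crude leakage-resistance fault index. The circuit values, leakage scenario, and detection threshold are invented, and the model is far simpler than the paper's switching model:

```python
import numpy as np

def rls(Phi, y, lam=0.99):
    """Recursive least squares with forgetting factor lam."""
    theta = np.zeros(Phi.shape[1])
    P = 1e3 * np.eye(Phi.shape[1])
    for phi, yk in zip(Phi, y):
        k = P @ phi / (lam + phi @ P @ phi)
        theta = theta + k * (yk - phi @ theta)
        P = (P - np.outer(k, phi @ P)) / lam
    return theta

# Simulated healthy cell: OCV = 3.7 V, internal resistance 0.05 ohm
rng = np.random.default_rng(4)
I = rng.uniform(0.5, 2.0, size=200)            # discharge current [A]
V = 3.7 - 0.05 * I + 0.001 * rng.normal(size=200)
Phi = np.column_stack([np.ones_like(I), I])    # model V = OCV - R*I
ocv_est, neg_r = rls(Phi, V)
r_est = -neg_r

# Crude ISC fault index: a 1-hour rest loses 1% SOC of a 2 Ah cell, so the
# average leakage current is 0.01 * 2.0 = 0.02 A through the short.
i_leak = 0.01 * 2.0                            # [A]
r_isc_est = ocv_est / i_leak                   # leakage resistance [ohm]
isc_detected = r_isc_est < 1000.0              # threshold on the fault index
```

    A healthy cell would show an effectively infinite leakage resistance; a low estimate flags a developing internal short.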

  12. Model-Based Building Detection from Low-Cost Optical Sensors Onboard Unmanned Aerial Vehicles

    Science.gov (United States)

    Karantzalos, K.; Koutsourakis, P.; Kalisperakis, I.; Grammatikopoulos, L.

    2015-08-01

    The automated and cost-effective detection of buildings at ultra-high spatial resolution is of major importance for various engineering and smart-city applications. To this end, in this paper, a model-based building detection technique has been developed that is able to extract and reconstruct buildings from UAV aerial imagery and low-cost imaging sensors. In particular, the developed approach computes, through advanced structure from motion, bundle adjustment and dense image matching, a DSM and a true orthomosaic from the numerous GoPro images, which are characterised by significant geometric distortions and a fish-eye effect. An unsupervised multi-region graph-cut segmentation and a rule-based classification are responsible for delivering the initial multi-class classification map. The DTM is then calculated based on an inpainting and mathematical morphology process. A data fusion process between the buildings detected from the DSM/DTM and the classification map feeds a grammar-based building reconstruction, and the scene buildings are extracted and reconstructed. Preliminary experimental results appear quite promising, with the quantitative evaluation indicating detection rates at object level of 88% regarding correctness and above 75% regarding detection completeness.

  13. Detection of Common Problems in Real-Time and Multicore Systems Using Model-Based Constraints

    Directory of Open Access Journals (Sweden)

    Raphaël Beamonte

    2016-01-01

    Full Text Available Multicore systems are complex in that multiple processes are running concurrently and can interfere with each other. Real-time systems add on top of that time constraints, making results invalid as soon as a deadline has been missed. Tracing is often the most reliable and accurate tool available to study and understand those systems. However, tracing requires that users understand the kernel events and their meaning. It is therefore not very accessible. Using modeling to generate source code or represent applications’ workflow is handy for developers and has emerged as part of the model-driven development methodology. In this paper, we propose a new approach to system analysis using model-based constraints, on top of userspace and kernel traces. We introduce the constraints representation and how traces can be used to follow the application’s workflow and check the constraints we set on the model. We then present a number of common problems that we encountered in real-time and multicore systems and describe how our model-based constraints could have helped to save time by automatically identifying the unwanted behavior.

  14. Observability analysis for model-based fault detection and sensor selection in induction motors

    International Nuclear Information System (INIS)

    Nakhaeinejad, Mohsen; Bryant, Michael D

    2011-01-01

    Sensors in different types and configurations provide information on the dynamics of a system. For a specific task, the question is whether the measurements carry enough information, or whether the sensor configuration can be changed to improve performance or to reduce costs. Observability analysis can answer these questions. This paper presents a general algorithm for nonlinear observability analysis with application to model-based diagnostics and sensor selection in three-phase induction motors. A bond graph model of the motor is developed and verified with experiments. A nonlinear observability matrix based on Lie derivatives is obtained from the state equations, and an observability index based on the singular value decomposition of the observability matrix is computed. Singular values and singular vectors are used to identify the most and least observable configurations of sensors and parameters. A complex-step derivative technique is used in the calculation of Jacobians to improve the computational performance of the observability analysis. The proposed algorithm can be applied to any nonlinear system to select the best configuration of sensors for model-based diagnostics or observer-based control, or to determine the level of sensor redundancy. Observability analysis on induction motors provides various sensor configurations with corresponding observability indices. Results show the redundancy levels for different sensors and provide a sensor selection guideline for model-based diagnostics and observer-based controllers. The results can also be used for sensor fault detection and to improve the reliability of the system by increasing the redundancy level in measurements.
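
    The linear special case of this analysis is easy to sketch: build the observability matrix for each candidate sensor configuration and use an SVD-based conditioning index to rank them. The second-order system below is a toy example, not an induction motor model:

```python
import numpy as np

# Toy linear(ized) system x' = A x; compare two candidate output matrices C.
A = np.array([[0.0, 1.0],
              [-2.0, -0.5]])

def observability_index(C):
    """Ratio of smallest to largest singular value of the observability
    matrix [C; CA; ...]; small values indicate poor observability."""
    n = A.shape[0]
    O = np.vstack([C @ np.linalg.matrix_power(A, i) for i in range(n)])
    s = np.linalg.svd(O, compute_uv=False)
    return s[-1] / s[0]

idx_position = observability_index(np.array([[1.0, 0.0]]))  # position sensor
idx_velocity = observability_index(np.array([[0.0, 1.0]]))  # velocity sensor
```

    For this system the position sensor yields a perfectly conditioned observability matrix (index 1), while the velocity sensor is observable but worse conditioned; the nonlinear version in the paper replaces the matrix powers with Lie derivatives of the output map.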

  15. Model-Based Water Wall Fault Detection and Diagnosis of FBC Boiler Using Strong Tracking Filter

    Directory of Open Access Journals (Sweden)

    Li Sun

    2014-01-01

    Full Text Available Fluidized bed combustion (FBC) boilers have received increasing attention in recent decades. Erosion of the water wall is one of the most common and serious faults in FBC boilers. Rather than measuring tube thickness directly, as ultrasonic methods do, water-wall wastage is treated here as an equivalent variation of the overall heat transfer coefficient in the furnace. In this paper, a model-based approach is presented to jointly estimate the internal states and the heat transfer coefficient from the noisy measurable outputs. The estimated parameter is compared with its normal value, and a modified Bayesian algorithm is then adopted for fault detection and diagnosis (FDD). Simulation results demonstrate that the approach is feasible and effective.
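
The core idea, recovering the heat-transfer coefficient from noisy outputs and comparing it with its nominal value, can be sketched roughly. A toy first-order lumped model and a least-squares estimate stand in for the paper's boiler model and strong tracking filter; all numbers are invented.

```python
import numpy as np

# Toy lumped furnace model (invented, not the paper's boiler model):
#   T[k+1] = T[k] + dt * U * (Tf - T[k]) + w,   y[k] = T[k] + v
# Water-wall wastage shows up as a drop in the overall heat-transfer
# coefficient U, recovered here by least squares as a stand-in for the
# strong tracking filter.
dt, Tf = 1.0, 500.0
U_nominal, U_true = 0.05, 0.02      # nominal vs. degraded coefficient
rng = np.random.default_rng(9)

T = np.empty(200)
T[0] = 300.0
for k in range(199):
    T[k + 1] = T[k] + dt * U_true * (Tf - T[k]) + 0.1 * rng.standard_normal()
y = T + 0.5 * rng.standard_normal(200)          # noisy measured outputs

# Regress temperature increments on the driving term dt * (Tf - y)
dT, drive = np.diff(y), dt * (Tf - y[:-1])
U_hat = (drive @ dT) / (drive @ drive)

fault = abs(U_hat - U_nominal) > 0.5 * (U_nominal - U_true)
print(U_hat, fault)    # U_hat near 0.02, so the fault is flagged
```

The fault decision is simply a threshold on the gap between the estimated and nominal coefficient, mirroring the paper's compare-then-diagnose step.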

  16. CHIRP-Like Signals: Estimation, Detection and Processing. A Sequential Model-Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J. V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-08-04

    Chirp signals have evolved primarily from radar/sonar signal processing applications, specifically attempting to estimate the location of a target in a surveillance/tracking volume. The chirp, essentially a sinusoidal signal whose instantaneous frequency changes at each time sample, has an interesting property: its autocorrelation approximates an impulse function. It is well known that a matched-filter detector in radar/sonar estimates the target range by cross-correlating a replica of the transmitted chirp with the measurement data reflected from the target back to the receiver, yielding a maximum peak corresponding to the echo time and therefore enabling the desired range estimate. In this application, we perform the same operation as a radar or sonar system: we transmit a “chirp-like pulse” into the target medium and attempt first to detect its presence and second to estimate its location or range. Our problem is complicated by disturbance signals from surrounding broadcast stations, extraneous sources of interference in our frequency bands, and of course the ever-present random noise from instrumentation. First, we discuss the chirp signal itself and illustrate its inherent properties; we then develop a model-based processing scheme enabling both detection and estimation of the signal from noisy measurement data.
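
The matched-filter range estimate described above can be demonstrated in a few lines; the sample rate, chirp parameters, delay, and noise levels below are invented for illustration.

```python
import numpy as np

fs = 1000.0                               # sample rate in Hz (assumed)
t = np.arange(0, 0.5, 1 / fs)             # 0.5 s replica
chirp = np.sin(2 * np.pi * (10 * t + 390 * t**2))   # ~10 -> 400 Hz sweep

true_delay = 300                          # echo delay in samples
rx = np.zeros(2048)
rx[true_delay:true_delay + len(chirp)] += 0.5 * chirp   # attenuated echo
rng = np.random.default_rng(0)
rx += 0.1 * rng.standard_normal(len(rx))  # instrumentation noise

# Matched filter: cross-correlate the replica with the received data;
# the correlation peak marks the echo time (hence the range).
corr = np.correlate(rx, chirp, mode="valid")
est_delay = int(np.argmax(np.abs(corr)))
print(est_delay)                          # at (or within a sample of) 300
```

Because the chirp's autocorrelation is impulse-like, the peak stands well above the interference-plus-noise floor even at modest SNR.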

  17. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    Science.gov (United States)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  18. Low contrast detectability and spatial resolution with model-based iterative reconstructions of MDCT images: a phantom and cadaveric study

    Energy Technology Data Exchange (ETDEWEB)

    Millon, Domitille; Coche, Emmanuel E. [Universite Catholique de Louvain, Department of Radiology and Medical Imaging, Cliniques Universitaires Saint Luc, Brussels (Belgium); Vlassenbroek, Alain [Philips Healthcare, Brussels (Belgium); Maanen, Aline G. van; Cambier, Samantha E. [Universite Catholique de Louvain, Statistics Unit, King Albert II Cancer Institute, Brussels (Belgium)

    2017-03-15

    To compare the image quality [low contrast (LC) detectability, noise, contrast-to-noise ratio (CNR) and spatial resolution (SR)] of MDCT images reconstructed with an iterative reconstruction (IR) algorithm and with a filtered back projection (FBP) algorithm. The experimental study was performed on a 256-slice MDCT. LC detectability, noise, CNR and SR were measured on a Catphan phantom scanned with decreasing doses (48.8 down to 0.7 mGy) and parameters typical of a chest CT examination. Images were reconstructed with FBP and a model-based IR algorithm. Additionally, human chest cadavers were scanned and reconstructed using the same technical parameters, and the images were analyzed to illustrate the phantom results. LC detectability and noise differed statistically significantly between the techniques, in favor of the model-based IR algorithm (p < 0.0001). At low doses, the noise in FBP images only enabled SR measurements of high-contrast objects. The superior CNR of the model-based IR algorithm enabled lower-dose measurements, which showed that SR was dose and contrast dependent. Cadaver images reconstructed with model-based IR illustrated that visibility and delineation of anatomical structure edges could deteriorate at low doses. Model-based IR improved LC detectability and enabled dose reduction; at low dose, SR became dose and contrast dependent. (orig.)
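
For reference, a minimal sketch of how a CNR figure of merit is measured from regions of interest; the image, ROI geometry, and noise level are synthetic stand-ins for the Catphan measurements.

```python
import numpy as np

def cnr(img, roi_mask, bg_mask):
    """Contrast-to-noise ratio: ROI/background mean difference over
    the background noise (standard deviation)."""
    contrast = abs(img[roi_mask].mean() - img[bg_mask].mean())
    return contrast / img[bg_mask].std()

# Synthetic low-contrast insert on a noisy uniform background
rng = np.random.default_rng(1)
img = 100.0 + 5.0 * rng.standard_normal((128, 128))   # noise sigma ~ 5
yy, xx = np.mgrid[:128, :128]
roi = (yy - 64) ** 2 + (xx - 64) ** 2 < 10 ** 2       # central disc
bg = (yy - 64) ** 2 + (xx - 64) ** 2 > 40 ** 2        # surrounding area
img[roi] += 10.0                                      # low-contrast object

print(cnr(img, roi, bg))    # close to 10 / 5 = 2
```

Lowering the noise (as model-based IR does) raises the CNR for the same contrast, which is why lower-dose LC measurements became possible.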

  19. Detecting surface events at the COBRA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Tebruegge, Jan [Exp. Physik IV, TU Dortmund (Germany); Collaboration: COBRA-Collaboration

    2015-07-01

    The aim of the COBRA experiment is to prove the existence of neutrinoless double-beta decay and to measure its half-life. For this purpose the COBRA demonstrator, a prototype for a large-scale experiment, is operated at the Gran Sasso Underground Laboratory (LNGS) in Italy. The demonstrator is a detector array made of 64 Cadmium-Zinc-Telluride (CdZnTe) semiconductor detectors in the coplanar-grid anode configuration, each 1 cm³ in size. This setup is used to investigate the experimental issues of operating CdZnTe detectors in low-background mode and to identify potential background components. Because the 'detector = source' principle is used, the neutrinoless double-beta decay that COBRA searches for occurs within the whole detector volume; consequently, events on the surface of the detectors are considered background. These surface events are a main background component, stemming mainly from natural radioactivity, especially radon. This talk explains to what extent surface events occur and shows how they are recognized and vetoed in the analysis using pulse shape discrimination algorithms.

  20. Subsurface Event Detection and Classification Using Wireless Signal Networks

    Directory of Open Access Journals (Sweden)

    Muhannad T. Suleiman

    2012-11-01

    Full Text Available Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept: Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of variations in radio signal strength at the distributed underground sensor nodes to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated variations in wireless signal strength can serve as indicators of changes in the subsurface environment. The WSiN concept for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion, using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After detection, it classifies geo-events within the regions where they occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.

  1. Subsurface event detection and classification using Wireless Signal Networks.

    Science.gov (United States)

    Yoon, Suk-Un; Ghazanfari, Ehsan; Cheng, Liang; Pamukcu, Sibel; Suleiman, Muhannad T

    2012-11-05

    Subsurface environment sensing and monitoring applications, such as detection of water intrusion or a landslide, which could significantly change the physical properties of the host soil, can be accomplished using a novel concept: Wireless Signal Networks (WSiNs). Wireless signal networks take advantage of variations in radio signal strength at the distributed underground sensor nodes to monitor and characterize the sensed area. To characterize subsurface environments for event detection and classification, this paper provides a detailed list of soil properties and experimental data on how they affect radio propagation in subsurface communication environments. Experiments demonstrated that calibrated variations in wireless signal strength can serve as indicators of changes in the subsurface environment. The WSiN concept for subsurface event detection is evaluated with applications such as detection of water intrusion, relative density change, and relative motion, using actual underground sensor nodes. To classify geo-events using the measured signal strength as the main indicator, we propose a window-based minimum distance classifier based on Bayesian decision theory. The window-based classifier has two steps: event detection and event classification. After detection, it classifies geo-events within the regions where they occur, called classification windows. The proposed window-based classification method is evaluated with a water leakage experiment measured in the laboratory. In these experiments, the proposed detection and classification method based on wireless signal networks can detect and classify subsurface events.
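
The classification step can be sketched as follows; the RSSI-change templates and window data are invented, and the paper's Bayesian machinery is reduced to nearest-template (minimum distance) classification.

```python
import numpy as np

# Hypothetical RSSI-change templates (dB) per geo-event class, e.g. the
# mean attenuation observed across three node pairs (invented numbers).
templates = {
    "water_intrusion": np.array([-6.0, -5.0, -4.0]),
    "density_change":  np.array([-2.0, -1.5, -1.0]),
    "relative_motion": np.array([3.0, 2.0, 1.0]),
}

def classify_window(window):
    """Minimum-distance classifier: average the RSSI changes over the
    classification window, then pick the nearest class template."""
    feat = window.mean(axis=0)
    return min(templates, key=lambda c: np.linalg.norm(feat - templates[c]))

# A window of noisy measurements resembling water intrusion
rng = np.random.default_rng(2)
window = np.array([-6.0, -5.0, -4.0]) + 0.5 * rng.standard_normal((20, 3))
print(classify_window(window))   # water_intrusion
```

Averaging over the window suppresses per-reading noise, which is what makes the simple distance rule workable.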

  2. Learning Multimodal Deep Representations for Crowd Anomaly Event Detection

    Directory of Open Access Journals (Sweden)

    Shaonian Huang

    2018-01-01

    Full Text Available Anomaly event detection in crowd scenes is extremely important; however, the majority of existing studies merely use hand-crafted features to detect anomalies. In this study, a novel unsupervised deep learning framework is proposed to detect anomaly events in crowded scenes. Specifically, low-level visual features, energy features, and motion map features are simultaneously extracted based on spatiotemporal energy measurements. Three convolutional restricted Boltzmann machines are trained to model the mid-level feature representation of normal patterns. Then a multimodal fusion scheme is utilized to learn the deep representation of crowd patterns. Based on the learned deep representation, a one-class support vector machine model is used to detect anomaly events. The proposed method is evaluated using two available public datasets and compared with state-of-the-art methods. The experimental results show its competitive performance for anomaly event detection in video surveillance.

  3. Abnormal global and local event detection in compressive sensing domain

    Science.gov (United States)

    Wang, Tian; Qiao, Meina; Chen, Jie; Wang, Chuanyun; Zhang, Wenjia; Snoussi, Hichem

    2018-05-01

    Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection, to cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can localize abnormal events within a frame and detect globally abnormal frames. The algorithm employs a sparse measurement matrix to represent the optical-flow-based movement feature efficiently. The detection problem is then cast as a one-class classification task, learned solely from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark abnormal-detection datasets against state-of-the-art methods.
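
The sparse measurement matrix idea, compressing a high-dimensional flow feature into a few random measurements, can be sketched with a very sparse random projection. The Achlioptas-style construction below is a stand-in for the paper's actual matrix design, and all sizes are invented.

```python
import numpy as np

def sparse_measurement_matrix(m, n, rng, s=3):
    """Very sparse random projection: entries are +/- sqrt(s/m) with
    probability 1/(2s) each, zero otherwise; it preserves vector norms
    in expectation."""
    vals = rng.choice([-1.0, 0.0, 1.0], size=(m, n),
                      p=[1 / (2 * s), 1 - 1 / s, 1 / (2 * s)])
    return np.sqrt(s / m) * vals

rng = np.random.default_rng(7)
Phi = sparse_measurement_matrix(64, 2048, rng)

flow_feature = rng.standard_normal(2048)   # stand-in flow-based feature
compressed = Phi @ flow_feature            # 2048 -> 64 measurements
print(compressed.shape,
      np.linalg.norm(compressed) / np.linalg.norm(flow_feature))
```

Because most entries are zero, the projection is cheap to apply per frame, while distances between features are approximately preserved for the downstream one-class classifier.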

  4. Abnormal global and local event detection in compressive sensing domain

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2018-05-01

    Full Text Available Abnormal event detection, also known as anomaly detection, is a challenging task in security video surveillance. It is important to develop effective and robust movement representation models for global and local abnormal event detection, to cope with factors such as occlusion and illumination change. In this paper, a new algorithm is proposed that can localize abnormal events within a frame and detect globally abnormal frames. The algorithm employs a sparse measurement matrix to represent the optical-flow-based movement feature efficiently. The detection problem is then cast as a one-class classification task, learned solely from normal training samples. Experiments demonstrate that our algorithm performs well on benchmark abnormal-detection datasets against state-of-the-art methods.

  5. Event storm detection and identification in communication systems

    International Nuclear Information System (INIS)

    Albaghdadi, Mouayad; Briley, Bruce; Evens, Martha

    2006-01-01

    Event storms are the manifestation of an important class of abnormal behaviors in communication systems. They occur when a large number of nodes throughout the system generate a set of events within a small period of time. It is essential for network management systems to detect every event storm and identify its cause, in order to prevent and repair potential system faults. This paper presents a set of techniques for the effective detection and identification of event storms in communication systems. First, we introduce a new algorithm to synchronize events to a single node in the system. Second, the system's event log is modeled as a normally distributed random process; this is achieved by using data analysis techniques to explore and then model the statistical behavior of the event log. Third, event storm detection is proposed using a simple test statistic combined with an exponential smoothing technique to overcome the non-stationary behavior of event logs. Fourth, the system is divided into non-overlapping regions to locate the main contributing regions of a storm; we show that this technique provides a method for event storm identification. Finally, experimental results from a commercially deployed multimedia communication system that uses these techniques demonstrate their effectiveness.
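
The third step can be approximated in a few lines: an exponentially smoothed baseline of per-interval event counts, and a simple z-like test statistic that flags storm intervals. The counts, threshold, and smoothing constant below are invented for illustration, not the paper's tuned values.

```python
import numpy as np

def detect_storms(counts, alpha=0.1, k=4.0, warmup=20):
    """Flag event-storm intervals: compare each interval's event count
    with an exponentially smoothed mean/variance of normal traffic."""
    mu = counts[:warmup].mean()
    var = counts[:warmup].var() + 1e-9
    flags = [False] * warmup
    for c in counts[warmup:]:
        z = (c - mu) / np.sqrt(var)
        flags.append(z > k)
        if z <= k:   # adapt the baseline on normal traffic only
            mu = (1 - alpha) * mu + alpha * c
            var = (1 - alpha) * var + alpha * (c - mu) ** 2
    return flags

# Synthetic event log: Poisson background with an injected storm
rng = np.random.default_rng(3)
counts = rng.poisson(20, 200).astype(float)
counts[120:125] += 200                      # storm across many nodes
flags = detect_storms(counts)
print(any(flags[120:125]))                  # True: the storm is flagged
```

Freezing the baseline while a storm is flagged keeps the smoothed mean from absorbing the storm itself, which is the point of combining the test statistic with exponential smoothing on non-stationary logs.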

  6. Detecting failure events in buildings: a numerical and experimental analysis

    OpenAIRE

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam-column connections in instrumented buildings in real time through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green’s functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  7. Online Detection of Abnormal Events in Video Streams

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2013-01-01

    The proposed approach combines an image descriptor with an online nonlinear classification method. We introduce the covariance matrix of the optical flow and image intensity as a descriptor encoding motion information. The nonlinear online support vector machine (SVM) first learns from a limited set of training frames to provide a basic reference model, then updates the model and detects abnormal events in the current frame. We finally apply the method to detect abnormal events on a benchmark video surveillance dataset to demonstrate the effectiveness of the proposed technique.
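
A sketch of the descriptor side: the covariance matrix of flow components serves as the frame descriptor, with a simple distance-to-reference rule standing in for the online SVM. The flow fields and threshold are synthetic, invented for illustration.

```python
import numpy as np

def cov_descriptor(flow):
    """Covariance descriptor of per-pixel features [u, v, |flow|]."""
    u, v = flow[..., 0].ravel(), flow[..., 1].ravel()
    return np.cov(np.stack([u, v, np.hypot(u, v)]))

# Synthetic flow fields: small incoherent motion = normal frames,
# large motion = abnormal frame (all numbers invented).
rng = np.random.default_rng(4)
normal_flows = [0.1 * rng.standard_normal((32, 32, 2)) for _ in range(50)]
abnormal_flow = 3.0 * rng.standard_normal((32, 32, 2))

# Stand-in for the online one-class SVM: distance to the mean
# descriptor of the normal training frames, with a fixed margin.
train = np.array([cov_descriptor(f) for f in normal_flows])
reference = train.mean(axis=0)
threshold = 3 * max(np.linalg.norm(d - reference) for d in train)

dist = np.linalg.norm(cov_descriptor(abnormal_flow) - reference)
print(dist > threshold)    # True: the abnormal frame is flagged
```

An online classifier would additionally update the reference model frame by frame; the covariance descriptor itself is what makes a compact per-frame summary of motion possible.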

  8. Non-Linguistic Vocal Event Detection Using Online Random Forest

    DEFF Research Database (Denmark)

    Abou-Zleikha, Mohamed; Tan, Zheng-Hua; Christensen, Mads Græsbøll

    2014-01-01

    areas such as object detection, face recognition, and audio event detection. This paper proposes to use the online random forest technique for detecting laughter and filler events, and for analyzing the importance of various features for non-linguistic vocal event classification through permutation. The results show that, by the Area Under Curve measure, the online random forest achieved 88.1% compared to 82.9% for the baseline support vector machine on laughter classification, and 86.8% compared to 83.6% on filler classification.
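
Feature-importance-through-permutation can be sketched independently of the forest itself: shuffle one feature column and measure the drop in Area Under Curve. The scoring function below is a stand-in for a trained online random forest, and the data are synthetic.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) form."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    npos = int((labels == 1).sum())
    nneg = len(labels) - npos
    u = ranks[labels == 1].sum() - npos * (npos + 1) / 2
    return u / (npos * nneg)

def permutation_importance(score_fn, X, y, rng):
    """AUC drop when each feature column is shuffled in turn."""
    base = auc(score_fn(X), y)
    drops = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        rng.shuffle(Xp[:, j])
        drops.append(base - auc(score_fn(Xp), y))
    return np.array(drops)

# Toy data: feature 0 carries the class, feature 1 is pure noise.
rng = np.random.default_rng(5)
y = rng.integers(0, 2, 500)
X = np.c_[y + 0.3 * rng.standard_normal(500), rng.standard_normal(500)]

score_fn = lambda X: X[:, 0]    # stand-in for a trained forest's score
drops = permutation_importance(score_fn, X, y, rng)
print(drops)    # large AUC drop for feature 0, none for feature 1
```

Features whose permutation barely moves the AUC contribute little to the classification, which is exactly the analysis the paper applies to its vocal-event features.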

  9. TACKLING EVENT DETECTION IN THE CONTEXT OF VIDEO SURVEILLANCE

    Directory of Open Access Journals (Sweden)

    Raducu DUMITRESCU

    2011-11-01

    Full Text Available In this paper we address the problem of event detection in the context of video surveillance systems. First we deal with background extraction; three methods are tested, namely frame differencing, running average, and an estimate of the median filtering technique. This provides information about changing content. We then use this information to detect human presence in the scene, through a contour-based approach: contours are extracted from moving regions and parameterized, and human silhouettes show particular signatures in these parameters. Experimental results demonstrate the potential of this approach for event detection; however, these are our first, preliminary results for this application.
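
The running-average variant of background extraction can be sketched as follows; frame sizes, thresholds, and the synthetic "person" region are invented for illustration.

```python
import numpy as np

def running_average_bg(frames, alpha=0.05, thresh=30):
    """Running-average background model: a pixel is foreground when it
    deviates from the background estimate by more than `thresh`."""
    bg = frames[0].astype(float)
    masks = []
    for f in frames[1:]:
        masks.append(np.abs(f - bg) > thresh)
        bg = (1 - alpha) * bg + alpha * f   # slow background adaptation
    return masks

# Synthetic scene: flat noisy background, then a bright region appears
rng = np.random.default_rng(6)
frames = [100 + rng.integers(-5, 5, (48, 64)) for _ in range(10)]
for f in frames[5:]:
    f[20:35, 30:38] = 220                   # "person" entering the scene

masks = running_average_bg(frames)
# masks[i] corresponds to frames[i + 1]
print(bool(masks[6][25, 33]), bool(masks[2][25, 33]))   # True False
```

The small `alpha` lets the model absorb slow illumination drift while still flagging fast changes; frame differencing is the `alpha = 1` extreme of the same update.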

  10. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhangbing Zhou

    2015-12-01

    Full Text Available With the advent of the Internet of Underwater Things, smart things are deployed in the ocean and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events may have occurred, accurate event coverage should be detected and potential event sources should be determined, so that prompt and proper responses can be enacted. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from the normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to the sink node(s). When sensory data are collected at the sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices correspond to sensor nodes and the weights on the edges reflect the extent to which sensory data deviate from the normal sensing range. Event sources are then determined as the barycenters of this graph. Experimental results show that our technique is more energy efficient, especially when the network topology is relatively steady.

  11. COMPARISON OF FOUR METHODS TO DETECT ADVERSE EVENTS IN HOSPITAL

    Directory of Open Access Journals (Sweden)

    Inge Dhamanti

    2015-09-01

    Full Text Available Detecting adverse events has become one of the challenges in patient safety; methods to detect adverse events are therefore critical for improving patient safety. The purpose of this paper is to compare the strengths and weaknesses of several methods of identifying adverse events in hospital, including medical record review, self-reported incidents, information technology, and patient self-reports. This study is a literature review comparing and analyzing these methods to determine which are best implemented by hospitals. All four methods have proved able to detect adverse events in hospitals, but each has strengths and limitations to be overcome. There is no single 'best' method that will give the best results for adverse event detection in hospital. Thus, to detect more adverse events, whether preventable or already occurred, hospitals should combine more than one detection method, since each method has a different sensitivity.

  12. Secure access control and large scale robust representation for online multimedia event detection.

    Science.gov (United States)

    Liu, Changyu; Lu, Bin; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, integrating traditional event detection algorithms into the online environment raises two issues: secure access control and large-scale robust representation. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for access control in dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.

  13. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    Directory of Open Access Journals (Sweden)

    Changyu Liu

    2014-01-01

    Full Text Available We developed an online multimedia event detection (MED) system. However, integrating traditional event detection algorithms into the online environment raises two issues: secure access control and large-scale robust representation. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for access control in dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.

  14. An Examination of Three Spatial Event Cluster Detection Methods

    Directory of Open Access Journals (Sweden)

    Hensley H. Mariathas

    2015-03-01

    Full Text Available In spatial disease surveillance, geographic areas with large numbers of disease cases are to be identified so that targeted investigations can be pursued. Geographic areas with high disease rates are called disease clusters, and statistical cluster detection tests are used to identify geographic areas with higher disease rates than expected by chance alone. In some situations, disease-related events rather than individuals are of interest for geographical surveillance, and methods to detect clusters of such events are called event cluster detection methods. In this paper, we examine three distributional assumptions for the events in cluster detection: compound Poisson, approximate normal, and multiple hypergeometric (exact). The methods differ in the choice of distributional assumption for the potentially multiple correlated events per individual. The methods are illustrated on emergency department (ED) presentations by children and youth (age < 18 years) because of substance use in the province of Alberta, Canada, from 1 April 2007 to 31 March 2008. Simulation studies are conducted to investigate the Type I error and power of the clustering methods.

  15. Analytic 3D image reconstruction using all detected events

    International Nuclear Information System (INIS)

    Kinahan, P.E.; Rogers, J.G.

    1988-11-01

    We present the results of testing a previously presented algorithm for three-dimensional image reconstruction that uses all gamma-ray coincidence events detected by a PET volume-imaging scanner. By using two iterations of an analytic filter-backprojection method, the algorithm is not constrained by the requirement of a spatially invariant detector point spread function, which limits normal analytic techniques. Removing this constraint allows the incorporation of all detected events, regardless of orientation, which improves the statistical quality of the final reconstructed image.

  16. Spatial-Temporal Event Detection from Geo-Tagged Tweets

    Directory of Open Access Journals (Sweden)

    Yuqian Huang

    2018-04-01

    Full Text Available As one of the most popular social networking services in the world, Twitter allows users to post messages along with their current geographic locations. Such georeferenced or geo-tagged Twitter datasets can benefit location-based services, targeted advertising, and geosocial studies. Our study focuses on the detection of small-scale spatial-temporal events and their textual content. First, we use Spatial-Temporal Density-Based Spatial Clustering of Applications with Noise (ST-DBSCAN) to cluster the tweets in space and time. Then, word frequencies are summarized for each cluster and potential topics are modeled by the Latent Dirichlet Allocation (LDA) algorithm. Using two years of Twitter data from four college cities in the U.S., we were able to determine the spatial-temporal patterns of two known events, two unknown events, and one recurring event, which were then further explored and modeled to identify the semantic content of the events. This paper presents our process and recommendations both for finding event-related tweets and for understanding the spatial-temporal behaviors and semantic nature of the detected events.
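
The clustering step can be sketched with a simplified ST-DBSCAN in which a neighbor must be close in both space and time; the coordinates, timestamps, and thresholds are synthetic, and the LDA topic step is omitted.

```python
import numpy as np
from collections import deque

def st_dbscan(pts, times, eps_s, eps_t, min_pts):
    """Simplified ST-DBSCAN: a neighbor must be close in BOTH space and
    time; clusters grow from core points; label -1 marks noise."""
    labels = np.full(len(pts), -1)
    def neighbors(i):
        d = np.linalg.norm(pts - pts[i], axis=1)
        return np.where((d <= eps_s) & (np.abs(times - times[i]) <= eps_t))[0]
    cid = 0
    for i in range(len(pts)):
        if labels[i] != -1 or len(neighbors(i)) < min_pts:
            continue
        labels[i] = cid
        queue = deque(neighbors(i))
        while queue:
            j = queue.popleft()
            if labels[j] == -1:
                labels[j] = cid
                if len(neighbors(j)) >= min_pts:   # core point: keep growing
                    queue.extend(neighbors(j))
        cid += 1
    return labels

# Two bursts of geo-tagged posts at different places/times, plus noise
rng = np.random.default_rng(8)
event1 = np.c_[rng.normal(0, 0.01, 30), rng.normal(0, 0.01, 30)]
event2 = np.c_[rng.normal(1, 0.01, 30), rng.normal(1, 0.01, 30)]
noise = rng.uniform(-1, 2, (10, 2))
pts = np.vstack([event1, event2, noise])
times = np.r_[rng.normal(10, 1, 30), rng.normal(100, 1, 30),
              rng.uniform(0, 200, 10)]

labels = st_dbscan(pts, times, eps_s=0.05, eps_t=5.0, min_pts=5)
print(len(set(labels[labels >= 0])))   # two detected events
```

The joint space-and-time neighborhood is what separates two bursts at the same place on different days, which a purely spatial DBSCAN would merge.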

  17. Event-Triggered Fault Detection of Nonlinear Networked Systems.

    Science.gov (United States)

    Li, Hongyi; Chen, Ziran; Wu, Ligang; Lam, Hak-Keung; Du, Haiping

    2017-04-01

    This paper investigates the problem of fault detection for nonlinear discrete-time networked systems under an event-triggered scheme. A polynomial fuzzy fault detection filter is designed to generate a residual signal and detect faults in the system, and a novel polynomial event-triggered scheme is proposed to determine the transmission of the signal. The fault detection filter guarantees that the residual system is asymptotically stable and satisfies the desired performance. Polynomial approximated membership functions obtained by Taylor series are employed for the filtering analysis. Furthermore, sufficient conditions are expressed in terms of sums of squares (SOS) and can be solved by SOS tools in the MATLAB environment. A numerical example is provided to demonstrate the effectiveness of the proposed results.

  18. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram-of-optical-flow-orientation descriptor and a classification method. The details of the descriptor are presented for describing the motion information of the global video frame or of the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing detection results are analyzed and explained. The proposed method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.

  19. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  20. Research in Model-Based Change Detection and Site Model Updating

    National Research Council Canada - National Science Library

    Nevatia, R

    1998-01-01

    .... Some of these techniques also are applicable to automatic site modeling and some of our change detection techniques may apply to detection of larger mobile objects, such as airplanes. We have implemented an interactive modeling system that works in conjunction with our automatic system to minimize the need for tedious interaction.

  1. Event detection for car park entries by video-surveillance

    Science.gov (United States)

    Coquin, Didier; Tailland, Johan; Cintract, Michel

    2007-10-01

    Intelligent surveillance has become an important research issue due to the high cost and low efficiency of human supervisors, and machine intelligence is required to provide a solution for automated event detection. In this paper we describe a real-time system that has been used for detecting car park entries, using an adaptive background learning algorithm and two indicators representing activity and identity to overcome the difficulty of tracking objects.
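The adaptive background learning step can be approximated by an exponential running average per pixel, with an activity indicator given by the fraction of pixels that deviate from the background. A toy grayscale sketch (the learning rate and threshold are assumed values, not the system's):

```python
def update_background(background, frame, alpha=0.05):
    """Exponential running-average background model: one grayscale value per
    pixel, updated slowly so transient objects do not get absorbed."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

def activity(background, frame, threshold=20):
    """Fraction of pixels whose deviation from the background exceeds the
    threshold: a crude activity indicator for detecting an entry."""
    changed = sum(1 for b, f in zip(background, frame) if abs(f - b) > threshold)
    return changed / len(frame)

bg = [100.0] * 8                                  # toy 8-pixel frame
car = [100, 100, 200, 210, 205, 100, 100, 100]    # a 'car' covers 3 pixels
a = activity(bg, car)
print(a)                          # 3 of 8 pixels deviate strongly
bg = update_background(bg, car)   # slowly absorb lasting scene changes
```

An entry event would be flagged when the activity indicator stays above a trigger level for several consecutive frames; an identity indicator would then be computed over the changed region.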

  2. A Kriging Model Based Finite Element Model Updating Method for Damage Detection

    Directory of Open Access Journals (Sweden)

    Xiuming Yang

    2017-10-01

    Full Text Available Model updating is an effective means of damage identification and surrogate modeling has attracted considerable attention for saving computational cost in finite element (FE model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF is rarely used as it usually changes dramatically with updating parameters. This paper presents a new surrogate model based model updating method taking advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC is used to build the objective function, whose nonlinear response surface is constructed by the Kriging model. Then, the efficient global optimization (EGO algorithm is introduced to get the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever and experimental test data of a laboratory three-story structure.

  3. Context-aware event detection smartphone application for first responders

    Science.gov (United States)

    Boddhu, Sanjay K.; Dave, Rakesh P.; McCartney, Matt; West, James A.; Williams, Robert L.

    2013-05-01

    The rise of social networking platforms like Twitter, Facebook, etc., has provided seamless sharing of information (as chat, video and other media) among their user communities on a global scale. Further, the proliferation of smartphones and their connectivity networks has empowered ordinary individuals to share and acquire information regarding the events happening in their immediate vicinity in a real-time fashion. This human-centric sensed data, generated in a "human-as-sensor" approach, is tremendously valuable, as it is delivered mostly with apt annotations and ground truth that would be missing in traditional machine-centric sensors, besides a high redundancy factor (the same data through multiple users). Further, when appropriately employed, this real-time data can support detecting localized events like fire, accidents, shooting, etc., as they unfold, and pinpoint individuals affected by those events. This spatiotemporal information, when made available to first responders in the event vicinity (or approaching it), can greatly assist them in making effective decisions to protect property and life in a timely fashion. In this vein, under the SATE and YATE programs, the research team at the AFRL Tec^Edge Discovery labs demonstrated the feasibility of developing smartphone applications that can provide an augmented-reality view of the appropriate detected events in a given (localized) geographical location and also provide an event search capability over a large geographic extent. In its current state, the application, through its backend connectivity, utilizes a data (text and image) processing framework which deals with data challenges such as identifying and aggregating important events, analyzing and correlating the events temporally and spatially, and building a search-enabled event database. Further, the smartphone application with its backend data processing workflow has been successfully field tested with live user-generated feeds.

  4. Optimization of single photon detection model based on GM-APD

    Science.gov (United States)

    Chen, Yu; Yang, Yi; Hao, Peiyu

    2017-11-01

    High-precision laser ranging over one hundred kilometers requires a detector with very strong detection capability for very weak light. At present, the Geiger-mode avalanche photodiode (GM-APD) is widely used: it has high sensitivity and high photoelectric conversion efficiency. Selecting and designing the detector parameters according to the system requirements is of great importance to improving photon detection efficiency, and design optimization requires a good model. In this paper, we study the existing Poisson-distribution model and take into account the important detector parameters of dark count rate, dead time, quantum efficiency and so on. We improve and optimize the detection model and select the appropriate parameters to achieve optimal photon detection efficiency. The simulation is carried out using MATLAB and compared with actual test results, verifying the rationality of the model. It has certain reference value in engineering applications.
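In the Poisson model the number of primary avalanche triggers in a range gate is Poisson-distributed, so detection probability follows from the mean number of signal and dark events. A simplified sketch that ignores dead time (the parameter values in the example are illustrative, not from the paper):

```python
import math

def detection_probability(mean_signal_photons, quantum_eff,
                          dark_count_rate, gate_time):
    """Poisson model of a Geiger-mode APD: probability that at least one
    avalanche (signal or dark) fires within the range gate.

    Triggers are Poisson with mean eta*n_s + r_dark*tau, so
    P_det = 1 - exp(-(eta*n_s + r_dark*tau))."""
    mean_events = quantum_eff * mean_signal_photons + dark_count_rate * gate_time
    return 1.0 - math.exp(-mean_events)

def false_alarm_probability(dark_count_rate, gate_time):
    """Probability that dark counts alone trigger the gate."""
    return 1.0 - math.exp(-dark_count_rate * gate_time)

# e.g. 2 photons/pulse, 30% quantum efficiency, 100 kHz dark counts, 1 us gate
p = detection_probability(2.0, 0.30, 1e5, 1e-6)
print(round(p, 4))
```

Sweeping quantum efficiency, dark count rate, and gate time through such a model is what allows parameters to be chosen for the best trade-off between detection probability and false alarms.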

  5. Model-based fault detection and isolation of a PWR nuclear power plant using neural networks

    International Nuclear Information System (INIS)

    Far, R.R.; Davilu, H.; Lucas, C.

    2008-01-01

    The proper and timely fault detection and isolation of industrial plants is of paramount importance to guarantee their safe and reliable operation. This paper presents the application of a neural-network-based scheme for fault detection and isolation for the pressurizer of a PWR nuclear power plant. The scheme consists of two components: residual generation and fault isolation. The first component generates residuals via the discrepancy between measurements coming from the plant and a nominal model; the neural network estimator is trained with healthy data collected from a full-scale simulator. In the second component, detection thresholds are used to encode the residuals as bipolar vectors which represent fault patterns. These patterns are stored in an associative memory based on a recurrent neural network. The proposed fault diagnosis tool is evaluated on-line via a full-scale simulator and is able to detect and isolate the main faults appearing in the pressurizer of a PWR. (orig.)
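The encoding step of the second component can be sketched directly: threshold each residual into a bipolar (+1/-1) entry and match it against stored fault signatures. The associative recurrent-network recall is approximated here by nearest-pattern lookup, and the pressurizer fault signatures below are hypothetical:

```python
def encode_residuals(residuals, thresholds):
    """Encode residuals as a bipolar vector: +1 if a residual exceeds its
    detection threshold, -1 otherwise."""
    return tuple(1 if abs(r) > t else -1 for r, t in zip(residuals, thresholds))

def isolate_fault(pattern, fault_patterns):
    """Return the stored fault signature closest to the observed bipolar
    pattern (a stand-in for the associative-memory recall)."""
    return max(fault_patterns,
               key=lambda name: sum(a == b for a, b in
                                    zip(pattern, fault_patterns[name])))

# Hypothetical signatures for three residuals of a pressurizer model.
signatures = {
    "no fault":         (-1, -1, -1),
    "heater fault":     (+1, -1, -1),
    "spray valve leak": (+1, +1, -1),
}
obs = encode_residuals([0.9, 0.05, 0.01], thresholds=[0.5, 0.5, 0.5])
fault = isolate_fault(obs, signatures)
print(fault)
```

The bipolar encoding makes isolation robust: small residual noise below the thresholds maps to the same pattern, so only a genuine discrepancy changes the recalled fault class.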

  6. TNO at TRECVID 2013: Multimedia Event Detection and Instance Search

    NARCIS (Netherlands)

    Bouma, H.; Azzopardi, G.; Spitters, M.M.; Wit, J.J. de; Versloot, C.A.; Zon, R.W.L. van der; Eendebak, P.T.; Baan, J.; Hove, R.J.M. ten; Eekeren, A.W.M. van; Haar, F.B. ter; Hollander, R.J.M. den; Huis, R.J. van; Boer, M.H.T. de; Antwerpen, G. van; Broekhuijsen, B.J.; Daniele, L.M.; Brandt, P.; Schavemaker, J.G.M.; Kraaij, W.; Schutte, K.

    2013-01-01

    We describe the TNO system and the evaluation results for TRECVID 2013 Multimedia Event Detection (MED) and instance search (INS) tasks. The MED system consists of a bag-of-word (BOW) approach with spatial tiling that uses low-level static and dynamic visual features, an audio feature and high-level

  7. Distributed Event Detection in Wireless Sensor Networks for Disaster Management

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Poel, Mannes; Taghikhaki, Zahra; Havinga, Paul J.M.

    2010-01-01

    Recently, wireless sensor networks (WSNs) have become mature enough to go beyond being simple fine-grained continuous monitoring platforms and become one of the enabling technologies for disaster early-warning systems. Event detection functionality of WSNs can be of great help and importance for

  8. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

    Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann ... under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office, and the Intelligence Advanced

  9. On Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Suresh, Mahima Agumbe

    2013-05-01

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil and gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures have been proven costly and imprecise, particularly when dealing with large-scale distribution systems. In this article, to the best of our knowledge, for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. We propose the idea of using sensors that move along the edges of the network and detect events (i.e., attacks). To localize the events, sensors detect proximity to beacons, which are devices with known placement in the network. We formulate the problem of minimizing the cost of monitoring infrastructure (i.e., minimizing the number of sensors and beacons deployed) in a predetermined zone of interest, while ensuring a degree of coverage by sensors and a required accuracy in locating events using beacons. We propose algorithms for solving the aforementioned problem and demonstrate their effectiveness with results obtained from a realistic flow network simulator.

  10. Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy

    Science.gov (United States)

    2014-03-27

    (Gerard, 2005:3) The NTSB report lists alarm management as one of the top five areas for improvement in pipeline SCADA systems (Gerard, 2005:1) ... Zhang, Qin, Wang, and Liang for leak detection in a SCADA-run pipeline system. A concept derived from information theory improved leak detection ... System for West Products Pipeline. Journal of Loss Prevention in the Process Industries, 22(6), 981-989. Zhu, B., & Sastry, S. (2010). SCADA

  11. Exploring the uncertainties of early detection results: model-based interpretation of mayo lung project

    Directory of Open Access Journals (Sweden)

    Berman Barbara

    2011-03-01

    Full Text Available Abstract Background The Mayo Lung Project (MLP), a randomized controlled clinical trial of lung cancer screening conducted between 1971 and 1986 among male smokers aged 45 or above, demonstrated an increase in lung cancer survival since the time of diagnosis, but no reduction in lung cancer mortality. Whether this result necessarily indicates a lack of mortality benefit for screening remains controversial. A number of hypotheses have been proposed to explain the observed outcome, including over-diagnosis, screening sensitivity, and population heterogeneity (initial difference in lung cancer risks between the two trial arms). This study is intended to provide model-based testing for some of these important arguments. Method Using a micro-simulation model, the MISCAN-lung model, we explore the possible influence of screening sensitivity, systematic error, over-diagnosis and population heterogeneity. Results Calibrating screening sensitivity, systematic error, or over-diagnosis does not noticeably improve the fit of the model, whereas calibrating population heterogeneity helps the model predict lung cancer incidence better. Conclusions Our conclusion is that the hypothesized imperfections in screening sensitivity, systematic error, and over-diagnosis do not in themselves explain the observed trial results. Model fit improvement achieved by accounting for population heterogeneity suggests a higher risk of cancer incidence in the intervention group as compared with the control group.

  12. Online model-based fault detection for grid connected PV systems monitoring

    KAUST Repository

    Harrou, Fouzi

    2017-12-14

    This paper presents an efficient fault detection approach to monitor the direct current (DC) side of photovoltaic (PV) systems. The key contribution of this work is combining both single diode model (SDM) flexibility and the cumulative sum (CUSUM) chart efficiency to detect incipient faults. In fact, unknown electrical parameters of SDM are firstly identified using an efficient heuristic algorithm, named Artificial Bee Colony algorithm. Then, based on the identified parameters, a simulation model is built and validated using a co-simulation between Matlab/Simulink and PSIM. Next, the peak power (Pmpp) residuals of the entire PV array are generated based on both real measured and simulated Pmpp values. Residuals are used as the input for the CUSUM scheme to detect potential faults. We validate the effectiveness of this approach using practical data from an actual 20 MWp grid-connected PV system located in the province of Adrar, Algeria.
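The CUSUM stage accumulates small, persistent deviations of the Pmpp residuals until a decision threshold is crossed, which is what makes it sensitive to incipient faults. A one-sided sketch with generic textbook parameters (the allowance k and threshold h below are not the paper's tuned values):

```python
def cusum(residuals, target=0.0, k=0.5, h=5.0):
    """One-sided upper CUSUM chart on a residual sequence.

    k is the allowance (slack) and h the decision interval, in the same
    units as the residuals. Returns the index of the first alarm, or None."""
    s = 0.0
    for i, r in enumerate(residuals):
        s = max(0.0, s + (r - target) - k)   # accumulate excess over target+k
        if s > h:
            return i
    return None

# Residuals near zero under normal operation, then a persistent +2 shift
# (an incipient fault) accumulates until the chart alarms.
normal = [0.1, -0.2, 0.05, -0.1, 0.2]
faulty = [2.0] * 10
alarm = cusum(normal + faulty)
print(alarm)
```

Unlike a fixed threshold on single residuals, the chart alarms a few samples after the shift begins even though no individual residual is dramatic, which is the behavior exploited for incipient PV faults.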

  13. FraudMiner: A Novel Credit Card Fraud Detection Model Based on Frequent Itemset Mining

    Directory of Open Access Journals (Sweden)

    K. R. Seeja

    2014-01-01

    Full Text Available This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced) and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers.
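The matching step can be sketched as a set-overlap comparison between an incoming transaction and the customer's mined legal and fraud itemsets. The attribute sets and the tie-breaking rule below are hypothetical illustrations; the paper mines the patterns per customer with frequent itemset mining:

```python
def match_transaction(transaction, legal_patterns, fraud_patterns):
    """Label a transaction by which of the customer's frequent itemsets it
    overlaps most; ties go to 'legal' (an assumed rule)."""
    def best_overlap(patterns):
        return max((len(transaction & p) for p in patterns), default=0)
    if best_overlap(fraud_patterns) > best_overlap(legal_patterns):
        return "fraud"
    return "legal"

# Hypothetical per-customer patterns over discretized attribute values.
legal = [{"amount=low", "country=home", "channel=pos"}]
fraud = [{"amount=high", "country=abroad", "channel=online"}]
tx = {"amount=high", "country=abroad", "channel=pos"}
decision = match_transaction(tx, legal, fraud)
print(decision)
```

Because patterns are mined separately for each customer, the majority class never swamps the minority class, which is how the approach sidesteps the imbalance problem.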

  14. FraudMiner: a novel credit card fraud detection model based on frequent itemset mining.

    Science.gov (United States)

    Seeja, K R; Zareapoor, Masoumeh

    2014-01-01

    This paper proposes an intelligent credit card fraud detection model for detecting fraud from highly imbalanced and anonymous credit card transaction datasets. The class imbalance problem is handled by finding legal as well as fraud transaction patterns for each customer by using frequent itemset mining. A matching algorithm is also proposed to find to which pattern (legal or fraud) the incoming transaction of a particular customer is closer, and a decision is made accordingly. In order to handle the anonymous nature of the data, no preference is given to any of the attributes and each attribute is considered equally for finding the patterns. The performance evaluation of the proposed model is done on the UCSD Data Mining Contest 2009 Dataset (anonymous and imbalanced) and it is found that the proposed model has a very high fraud detection rate, balanced classification rate, and Matthews correlation coefficient, and a much lower false alarm rate than other state-of-the-art classifiers.

  15. Online model-based fault detection for grid connected PV systems monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Saidi, Ahmed

    2017-01-01

    This paper presents an efficient fault detection approach to monitor the direct current (DC) side of photovoltaic (PV) systems. The key contribution of this work is combining both single diode model (SDM) flexibility and the cumulative sum (CUSUM) chart efficiency to detect incipient faults. In fact, unknown electrical parameters of SDM are firstly identified using an efficient heuristic algorithm, named Artificial Bee Colony algorithm. Then, based on the identified parameters, a simulation model is built and validated using a co-simulation between Matlab/Simulink and PSIM. Next, the peak power (Pmpp) residuals of the entire PV array are generated based on both real measured and simulated Pmpp values. Residuals are used as the input for the CUSUM scheme to detect potential faults. We validate the effectiveness of this approach using practical data from an actual 20 MWp grid-connected PV system located in the province of Adrar, Algeria.

  16. Detection of Outliers in Panel Data of Intervention Effects Model Based on Variance of Remainder Disturbance

    Directory of Open Access Journals (Sweden)

    Yanfang Lyu

    2015-01-01

    Full Text Available The presence of outliers can result in seriously biased parameter estimates. In order to detect outliers in panel data models, this paper presents a modeling method to assess the intervention effects based on the variance of remainder disturbance using an arbitrary strictly positive twice continuously differentiable function. This paper also provides a Lagrange Multiplier (LM) approach to detect and identify a general type of outlier. Furthermore, fixed effects models and random effects models are discussed to identify outliers and the corresponding LM test statistics are given. The LM test statistics for an individual-based model to detect outliers are given as a particular case. Finally, this paper performs an application using panel data and explains the advantages of the proposed method.

  17. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    Science.gov (United States)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local-scale), a large aftershock sequence (regional-scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning, Seism. Res. Lett., 83, 531-540, doi: 10
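The classic STA/LTA trigger that such pickers build on compares short- and long-term averages of signal energy. A minimal sketch (window lengths and threshold are illustrative, not NEIC operational settings):

```python
def sta_lta_trigger(signal, sta_len=3, lta_len=10, threshold=4.0):
    """Classic STA/LTA detector: alarm when the short-term average of the
    squared signal exceeds `threshold` times the long-term average.
    Returns the first sample index that triggers, or None."""
    energy = [x * x for x in signal]
    for i in range(lta_len, len(energy)):
        lta = sum(energy[i - lta_len:i]) / lta_len       # trailing noise level
        sta = sum(energy[i - sta_len + 1:i + 1]) / sta_len  # current window
        if lta > 0 and sta / lta > threshold:
            return i
    return None

noise = [0.1, -0.1] * 10          # 20 samples of low-amplitude noise
arrival = [2.0, -2.0, 2.0, -2.0]  # abrupt high-amplitude onset
onset = sta_lta_trigger(noise + arrival)
print(onset)
```

In an associator such as Glass 3.0, picks like this from many stations become the candidate detections that must satisfy the grid's nucleation criteria before an event is declared.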

  18. Model-based fault detection for generator cooling system in wind turbines using SCADA data

    DEFF Research Database (Denmark)

    Borchersen, Anders Bech; Kinnaert, Michel

    2016-01-01

    In this work, an early fault detection system for the generator cooling of wind turbines is presented and tested. It relies on a hybrid model of the cooling system. The parameters of the generator model are estimated by an extended Kalman filter. The estimated parameters are then processed by an ...

  19. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to large number of business process models. In the meantime, the location restriction embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the difference and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference template to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models to their corresponding refined process structure trees (PSTs), that is, decomposing a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task based process structure tree (TPST). As a consequence, the problem of detecting difference between two process models is transformed to detect difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on the divide and conquer strategy, where the difference is described by an edit script and we make the cost of the edit script close to minimum. The extensive experimental evaluation shows that our method can meet the real requirements in terms of precision and efficiency.

  20. A model-based clustering method to detect infectious disease transmission outbreaks from sequence variation.

    Directory of Open Access Journals (Sweden)

    Rosemary M McCloskey

    2017-11-01

    Full Text Available Clustering infections by genetic similarity is a popular technique for identifying potential outbreaks of infectious disease, in part because sequences are now routinely collected for clinical management of many infections. A diverse number of nonparametric clustering methods have been developed for this purpose. These methods are generally intuitive, rapid to compute, and readily scale with large data sets. However, we have found that nonparametric clustering methods can be biased towards identifying clusters of diagnosis-where individuals are sampled sooner post-infection-rather than the clusters of rapid transmission that are meant to be potential foci for public health efforts. We develop a fundamentally new approach to genetic clustering based on fitting a Markov-modulated Poisson process (MMPP), which represents the evolution of transmission rates along the tree relating different infections. We evaluated this model-based method alongside five nonparametric clustering methods using both simulated and actual HIV sequence data sets. For simulated clusters of rapid transmission, the MMPP clustering method obtained higher mean sensitivity (85%) and specificity (91%) than the nonparametric methods. When we applied these clustering methods to published sequences from a study of HIV-1 genetic clusters in Seattle, USA, we found that the MMPP method categorized about half (46%) as many individuals to clusters compared to the other methods. Furthermore, the mean internal branch lengths that approximate transmission rates were significantly shorter in clusters extracted using MMPP, but not by other methods. We determined that the computing time for the MMPP method scaled linearly with the size of trees, requiring about 30 seconds for a tree of 1,000 tips and about 20 minutes for 50,000 tips on a single computer. 
This new approach to genetic clustering has significant implications for the application of pathogen sequence analysis to public health, where

  1. Towards Detecting the Crowd Involved in Social Events

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-10-01

    Full Text Available Knowing how people interact with urban environments is fundamental for a variety of fields, ranging from transportation to social science. Although human mobility patterns have been a major topic of study in recent years, understanding large-scale human behavior when a certain event occurs remains a challenge, due to a lack of either relevant data or suitable approaches. A psychological crowd refers to a group of people who are usually located at different places and show different behaviors, but who are very sensitively driven to take the same act (gather together) by a certain event, a phenomenon studied theoretically by social psychologists since the 19th century. This study proposes a computational approach using a machine learning method to model psychological crowds, contributing to a better understanding of human activity patterns under events. Psychological features and the mental unity of the crowd are computed to detect the involved individuals. A national event happening across the USA in April 2015 is analyzed using geotagged tweets as a case study to test our approach. The result shows that 81% of individuals in the crowd can be successfully detected. Through investigating the geospatial pattern of the involved users, not only can the event-related users be identified but also those users unobserved before the event can be uncovered. The proposed approach can effectively represent the psychological features and measure the mental unity of the psychological crowd, which sheds light on the study of large-scale psychological crowds and provides an innovative way of understanding human behavior under events.

  2. A travel time forecasting model based on change-point detection method

    Science.gov (United States)

    LI, Shupeng; GUANG, Xiaoping; QIAN, Yongsheng; ZENG, Junwei

    2017-06-01

    Travel time parameters obtained from road traffic sensor data play an important role in traffic management practice. In this paper, a travel time forecasting model for urban road traffic sensor data is proposed based on the method of change-point detection. The first-order differential operation is used for preprocessing the actual loop data; a change-point detection algorithm is designed to classify the sequence of a large number of travel time data items into several patterns; then a travel time forecasting model is established based on the autoregressive integrated moving average (ARIMA) model. By computer simulation, different control parameters are chosen for an adaptive change-point search over the travel time series, which is divided into several sections of similar state. Then a linear weight function is used to fit the travel time sequence and to forecast travel time. The results show that the model has high accuracy in travel time forecasting.
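The differencing-based change-point step can be sketched as follows: a new section of "similar state" starts wherever the first-order difference of the travel-time series jumps. The threshold value here is illustrative; the paper searches for it adaptively:

```python
def change_points(series, diff_threshold):
    """Indices where the first-order difference of a travel-time series
    exceeds the threshold, splitting it into sections of similar state."""
    return [i for i in range(1, len(series))
            if abs(series[i] - series[i - 1]) > diff_threshold]

# Travel times (in seconds) with a congestion episode in the middle; each
# resulting section could then be fitted (ARIMA + linear weighting in the
# paper) to forecast the next values.
travel_times = [60, 61, 59, 62, 90, 92, 91, 93, 61, 60]
cps = change_points(travel_times, diff_threshold=10)
print(cps)
```

Segmenting first matters because fitting a single model across a regime change (free flow to congestion) would blur both regimes and degrade the forecast.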

  3. Neutron detector for detecting rare events of spontaneous fission

    International Nuclear Information System (INIS)

    Ter-Akop'yan, G.M.; Popeko, A.G.; Sokol, E.A.; Chelnokov, L.P.; Smirnov, V.I.; Gorshkov, V.A.

    1981-01-01

    The neutron detector for registering rare events of spontaneous fission by detecting multiple neutron emission is described. The detector is a block of plexiglas, 550 mm in diameter and 700 mm in height, in the centre of which there is a through channel of 160 mm diameter for the sample under investigation. The detector comprises 56 3He-filled counters (up to 7 atm pressure) with a 1% CO2 addition; the counters are 500 mm long and 32 mm in diameter. Fission events are selected by an electronic system which determines the number of detected neutrons, the numbers of triggered counters, the signal amplitude and the time of each detected fission event. A block diagram of the detector electronics is presented and its operation principle is considered. For protection against cosmic radiation the detector is surrounded by a system of plastic scintillators and placed behind a concrete shield of 6 m thickness. The results of background measurements are given: the background of single neutrons constitutes about 150 counts per hour, and the detection efficiency for a single neutron equals 0.483 ± 0.005 for a 10 l detector sensitive volume. By means of the described detector, the parameters of the multiplicity distribution of prompt neutrons for 256Fm spontaneous fission were measured; the average multiplicity equals 3.59 ± 0.06, with a dispersion of 2.30 ± 0.65.

  4. Detection and interpretation of seismoacoustic events at German infrasound stations

    Science.gov (United States)

    Pilger, Christoph; Koch, Karl; Ceranna, Lars

    2016-04-01

    Three infrasound arrays with collocated or nearby installed seismometers are operated by the Federal Institute for Geosciences and Natural Resources (BGR) as the German National Data Center (NDC) for the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Infrasound generated by seismoacoustic events is routinely detected at these infrasound arrays, but air-to-ground coupled acoustic waves occasionally show up in seismometer recordings as well. Different natural and artificial sources like meteoroids as well as industrial and mining activity generate infrasonic signatures that are simultaneously detected at microbarometers and seismometers. Furthermore, many near-surface sources like earthquakes and explosions generate both seismic and infrasonic waves that can be detected successively with both technologies. The combined interpretation of seismic and acoustic signatures provides additional information about the origin time and location of remote infrasound events or about the characterization of seismic events distinguishing man-made and natural origins. Furthermore, seismoacoustic studies help to improve the modelling of infrasound propagation and ducting in the atmosphere and allow quantifying the portion of energy coupled into ground and into air by seismoacoustic sources. An overview of different seismoacoustic sources and their detection by German infrasound stations as well as some conclusions on the benefit of a combined seismoacoustic analysis are presented within this study.

  5. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-01

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.
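
    The hybrid model-plus-statistics idea can be illustrated at its simplest as residual-based detection: compare measurements against a model prediction and flag deviations beyond a threshold calibrated on fault-free data. The sketch below uses synthetic data and hypothetical values throughout, not the paper's algorithms:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly energy use: a simple "model" prediction and
# measurements that drift away from it when a fault appears at hour 60.
predicted = 50 + 10 * np.sin(np.linspace(0, 6 * np.pi, 100))
measured = predicted + rng.normal(0, 1.0, 100)
measured[60:] += 8.0               # injected fault offset (illustrative)

residual = measured - predicted

# Threshold calibrated on the first 48 assumed fault-free hours.
mu, sigma = residual[:48].mean(), residual[:48].std()
flags = np.abs(residual - mu) > 3 * sigma

print(flags[60:].mean())           # fraction of faulty hours flagged
```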

  6. Detecting Seismic Events Using a Supervised Hidden Markov Model

    Science.gov (United States)

    Burks, L.; Forrest, R.; Ray, J.; Young, C.

    2017-12-01

    We explore the use of supervised hidden Markov models (HMMs) to detect seismic events in streaming seismogram data. Current methods for seismic event detection include simple triggering algorithms, such as STA/LTA and the Z-statistic, which can lead to large numbers of false positives that must be investigated by an analyst. The hypothesis of this study is that more advanced detection methods, such as HMMs, may decrease false positives while maintaining accuracy similar to current methods. We train a binary HMM classifier using 2 weeks of 3-component waveform data from the International Monitoring System (IMS) that was carefully reviewed by an expert analyst to pick all seismic events. Using an ensemble of simple and discrete features, such as the triggering of STA/LTA, the HMM predicts the time at which a transition occurs from noise to signal. Compared to the STA/LTA detection algorithm, the HMM detects more true events, but the false positive rate remains unacceptably high. Future work to potentially decrease the false positive rate may include using continuous features, a Gaussian HMM, and multi-class HMMs to distinguish between types of seismic waves (e.g., P-waves and S-waves). Acknowledgement: Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525. SAND No: SAND2017-8154 A
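
    The STA/LTA trigger mentioned above, whose binary output serves as one of the discrete features fed to the HMM, can be sketched as follows; the seismogram, window lengths, and threshold here are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic seismogram: unit-variance noise with a transient "event"
# at samples 500-559 (all values illustrative).
x = rng.normal(0, 1.0, 1000)
x[500:560] += 8 * np.sin(np.linspace(0, 20 * np.pi, 60))

def sta_lta(x, ns=10, nl=200):
    """Causal STA/LTA on signal energy: short-term average over the last
    ns samples divided by the long-term average of the nl samples
    preceding the short window."""
    e = x ** 2
    c = np.concatenate(([0.0], np.cumsum(e)))
    ratio = np.zeros(e.size)
    i = np.arange(nl + ns, e.size)
    sta = (c[i + 1] - c[i + 1 - ns]) / ns
    lta = (c[i + 1 - ns] - c[i + 1 - ns - nl]) / nl
    ratio[i] = sta / np.maximum(lta, 1e-12)
    return ratio

ratio = sta_lta(x)
triggers = ratio > 4.0    # binary feature of the kind fed to the HMM
```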

  7. Hybrid light transport model based bioluminescence tomography reconstruction for early gastric cancer detection

    Science.gov (United States)

    Chen, Xueli; Liang, Jimin; Hu, Hao; Qu, Xiaochao; Yang, Defu; Chen, Duofang; Zhu, Shouping; Tian, Jie

    2012-03-01

    Gastric cancer is the second leading cause of cancer-related death in the world, and it remains difficult to cure because it is usually at a late stage by the time it is found. Early gastric cancer detection is therefore an effective approach to decreasing gastric cancer mortality. Bioluminescence tomography (BLT) has been applied to detect early liver cancer and prostate cancer metastasis. However, gastric cancer commonly originates from the gastric mucosa and grows outwards, so the bioluminescent light passes through a non-scattering region formed by the gastric pouch as it propagates through tissue. Thus, current BLT reconstruction algorithms based on approximations to the radiative transfer equation are not optimal for this problem. To address this gastric-cancer-specific problem, this paper presents a novel reconstruction algorithm that uses a hybrid light transport model to describe bioluminescent light propagation in tissues. The hybrid model, formed by integrating radiosity theory with the diffusion equation, describes light propagation in the non-scattering region. After finite element discretization, the hybrid light transport model is converted into a minimization problem incorporating an l1-norm regularization term to capture the sparsity of the bioluminescent source distribution. The performance of the reconstruction algorithm is first demonstrated with a digital-mouse-based simulation, with reconstruction error less than 1 mm. An in situ gastric cancer-bearing nude mouse experiment is then conducted. The preliminary result demonstrates the ability of the novel BLT reconstruction algorithm in early gastric cancer detection.
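
    The l1-regularized minimization step can be illustrated with a generic iterative soft-thresholding (ISTA) loop on a toy linear system; the matrix sizes, sparsity, and regularization weight below are arbitrary assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy version of the l1-regularized source-recovery step: recover a
# sparse source vector x from linear measurements b = A @ x_true.
n, m = 40, 100
A = rng.normal(size=(n, m)) / np.sqrt(n)
x_true = np.zeros(m)
x_true[[7, 23, 61]] = [1.5, -2.0, 1.0]
b = A @ x_true

lam = 0.05
L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
x = np.zeros(m)
for _ in range(500):               # ISTA: gradient step + soft threshold
    g = A.T @ (A @ x - b)
    x = x - g / L
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)
```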

  8. Model-based assessment of the role of human-induced climate change in the 2005 Caribbean coral bleaching event

    Energy Technology Data Exchange (ETDEWEB)

    Donner, S.D. [Princeton Univ., NJ (United States). Woodrow Wilson School of Public and International Affairs; Knutson, T.R. [National Oceanic and Atmospheric Administration, Princeton, NJ (United States). Geophysical Fluid Dynamics Lab.; Oppenheimer, M. [Princeton Univ., NJ (United States). Dept. of Geosciences

    2007-03-27

    Episodes of mass coral bleaching around the world in recent decades have been attributed to periods of anomalously warm ocean temperatures. In 2005, the sea surface temperature (SST) anomaly in the tropical North Atlantic that may have contributed to the strong hurricane season caused widespread coral bleaching in the Eastern Caribbean. Here, the authors use two global climate models to evaluate the contribution of natural climate variability and anthropogenic forcing to the thermal stress that caused the 2005 coral bleaching event. Historical temperature data and simulations for the 1870-2000 period show that the observed warming in the region is unlikely to be due to unforced climate variability alone. Simulation of background climate variability suggests that anthropogenic warming may have increased the probability of occurrence of significant thermal stress events for corals in this region by an order of magnitude. Under scenarios of future greenhouse gas emissions, mass coral bleaching in the Eastern Caribbean may become a biannual event in 20-30 years. However, if corals and their symbionts can adapt by 1-1.5 °C, such mass bleaching events may not begin to recur at potentially harmful intervals until the latter half of the century. The delay could enable more time to alter the path of greenhouse gas emissions, although long-term 'committed warming' even after stabilization of atmospheric CO2 levels may still represent an additional long-term threat to corals.

  9. Model-based temperature noise monitoring methods for LMFBR core anomaly detection

    International Nuclear Information System (INIS)

    Tamaoki, Tetsuo; Sonoda, Yukio; Sato, Masuo; Takahashi, Ryoichi.

    1994-01-01

    Temperature noise, measured by thermocouples mounted at each core fuel subassembly, is considered to be the most useful signal for detecting and locating local cooling anomalies in an LMFBR core. However, the core outlet temperature noise contains background noise due to fluctuations in the operating parameters including reactor power. It is therefore necessary to reduce this background noise for highly sensitive anomaly detection by subtracting predictable components from the measured signal. In the present study, both a physical model and an autoregressive model were applied to noise data measured in the experimental fast reactor JOYO. The results indicate that the autoregressive model has a higher precision than the physical model in background noise prediction. Based on these results, an 'autoregressive model modification method' is proposed, in which a temporary autoregressive model is generated by interpolation or extrapolation of reference models identified under a small number of different operating conditions. The generated autoregressive model has shown sufficient precision over a wide range of reactor power in applications to artificial noise data produced by an LMFBR noise simulator even when the coolant flow rate was changed to keep a constant power-to-flow ratio. (author)
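
    The autoregressive background-prediction idea can be sketched generically: fit an AR model to the measured noise by least squares, then subtract the predictable component so that anomalies stand out in the residual. The surrogate signal and AR order below are illustrative, not JOYO data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Surrogate "core outlet temperature noise": an AR(2) background process.
n = 2000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal(0, 0.1)

p = 2  # AR order (illustrative)
# Least-squares fit of x[t] on its p previous values.
X = np.column_stack([x[p - k - 1 : n - k - 1] for k in range(p)])
y = x[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef                    # predictable background component
resid = y - pred                   # residual used for anomaly detection
print(coef, resid.std())
```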

  10. Polygraph lie detection on real events in a laboratory setting.

    Science.gov (United States)

    Bradley, M T; Cullen, M C

    1993-06-01

    This laboratory study dealt with real-life intense emotional events. Subjects generated embarrassing stories from their experience, then submitted to polygraph testing and, by lying, denied their stories and, by telling the truth, denied a randomly assigned story. Money was given as an incentive to be judged innocent on each story. An interrogator, blind to the stories, used Control Question Tests and found subjects more deceptive when lying than when truthful. Stories interacted with order such that lying on the second story was more easily detected than lying on the first. Embarrassing stories provide an alternative to the use of mock crimes to study lie detection in the laboratory.

  11. Complexity of deciding detectability in discrete event systems

    Czech Academy of Sciences Publication Activity Database

    Masopust, Tomáš

    2018-01-01

    Roč. 93, July (2018), s. 257-261 ISSN 0005-1098 Institutional support: RVO:67985840 Keywords: discrete event systems * finite automata * detectability Subject RIV: BA - General Mathematics OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 5.451, year: 2016 https://www.sciencedirect.com/science/article/pii/S0005109818301730

  13. Microseismic Events Detection on Xishancun Landslide, Sichuan Province, China

    Science.gov (United States)

    Sheng, M.; Chu, R.; Wei, Z.

    2016-12-01

    On landslides, slope movement and fracturing of the rock mass often generate microearthquakes, which are recorded as weak signals on seismographs. By monitoring these microseismic events, the temporal and spatial distribution of unstable regions, as well as the impact of external factors on them, can be understood and analyzed. The microseismic method can provide information from inside the landslide, complementing geodetic methods that monitor the movement of the landslide surface. Compared to drilling on a landslide, the microseismic method is also more economical and safer. Xishancun Landslide is located about 60 km northwest of the Wenchuan earthquake centroid; it has kept deforming since the earthquake, which greatly increases the probability of disasters. In the autumn of 2015, 30 seismometers were deployed on the landslide for 3 months at intervals of 200-500 meters. First, we used regional earthquakes for time correction of the seismometers to eliminate the influence of inaccurate GPS clocks and the subsurface structure beneath the stations. Due to the low velocity of the loose medium, the travel-time difference of microseismic events across the landslide reaches up to 5 s. Based on travel times and waveform characteristics, we identified many microseismic events and converted them into envelopes as templates; we then used a sliding-window cross-correlation technique based on waveform envelopes to detect further microseismic events. Consequently, 100 microseismic events were detected with waveforms recorded on all seismometers. Based on their locations, we found that most occurred at the front of the landslide while the others occurred at the back end. The bottom and top of the landslide accumulated considerable energy and deformed substantially, radiating waves that could be recorded by all stations; the bottom, with more events, appears particularly active. In addition, many smaller events occurred in the middle part of the landslide where released
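
    The sliding-window envelope cross-correlation technique described above can be sketched generically as follows; the synthetic record, envelope window, and detection threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

# Continuous record containing two occurrences of the same (synthetic)
# event waveform, at samples 400 and 2100.
n = 3000
x = rng.normal(0, 0.2, n)
event = np.hanning(80) * np.sin(np.linspace(0, 12 * np.pi, 80))
x[400:480] += event
x[2100:2180] += 0.8 * event

def envelope(sig, w=20):
    """Crude amplitude envelope: moving average of |sig|."""
    return np.convolve(np.abs(sig), np.ones(w) / w, mode="same")

env = envelope(x)
template = env[400:480]            # envelope template from a picked event

# Sliding normalized cross-correlation of the envelope against the template.
cc = np.array([np.corrcoef(env[i:i + 80], template)[0, 1]
               for i in range(n - 80)])
detections = np.where(cc > 0.8)[0]
```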

  14. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    Science.gov (United States)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.
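
    The probabilistic (Bayesian) inversion idea can be illustrated with a deliberately simplified forward model, a single reflection delay t = 2d/v, evaluated on a parameter grid; the cable speed, noise level, and fault distance below are hypothetical, far simpler than the paper's S-parameter forward model:

```python
import numpy as np

# Toy Bayesian inversion in the spirit of model-based TDR: infer fault
# distance d from a noisy reflection delay t = 2 d / v (v assumed known).
v = 2.0e8                          # propagation speed in cable, m/s
d_true = 12.0                      # metres (hypothetical)
sigma_t = 2.0e-9                   # timing noise, seconds

rng = np.random.default_rng(5)
t_obs = 2 * d_true / v + rng.normal(0, sigma_t)

d_grid = np.linspace(0, 30, 3001)  # uniform prior over 0-30 m
loglike = -0.5 * ((t_obs - 2 * d_grid / v) / sigma_t) ** 2
post = np.exp(loglike - loglike.max())
post /= post.sum()

d_map = d_grid[post.argmax()]      # MAP estimate of fault location
d_mean = (post * d_grid).sum()
d_sd = np.sqrt((post * (d_grid - d_mean) ** 2).sum())
print(d_map, d_sd)
```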

  15. Model-based chatter stability prediction and detection for the turning of a flexible workpiece

    Science.gov (United States)

    Lu, Kaibo; Lian, Zisheng; Gu, Fengshou; Liu, Hunju

    2018-02-01

    Machining long slender workpieces still presents a technical challenge on the shop floor due to their low stiffness and damping. Regenerative chatter is a major hindrance in machining processes, reducing the geometric accuracy and dynamic stability of the cutting system. This study is motivated by the fact that chatter occurrence is generally related to the cutting position in straight turning of slender workpieces, which has seldom been investigated comprehensively in the literature. In the present paper, a predictive chatter model for turning a tailstock-supported slender workpiece, accounting for the change of cutting position during machining, is explored. Based on linear stability analysis and the stiffness distribution at different cutting positions along the workpiece, the effect of the cutting tool's movement along the length of the workpiece on chatter stability is studied. As a result, an entire stability chart for a single cutting pass is constructed. From this stability chart, the critical cutting condition and the chatter onset location along the workpiece in a turning operation can be estimated. The difference between the predicted tool locations and the experimental results was within 9% at high cutting speeds. The predictive model also allows the dynamic behavior during chatter to be inferred: when chatter arises at some cutting location, it continues for a period of time until another specific location is reached. The experimental observations are in good agreement with this theoretical inference. For chatter detection, besides the delay strategy and overlap processing technique, a relative threshold algorithm is proposed that detects chatter by comparing the spectrum and variance of the acquired acceleration signals with references saved during stable cutting. The chatter monitoring method has shown reliability under various machining conditions.
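
    A relative-threshold chatter test of the kind described, comparing the variance and spectrum of acceleration signals against references saved during stable cutting, can be sketched as follows; the synthetic signals, chatter frequency, and threshold factors are illustrative assumptions, not the paper's tuned values:

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 1000.0                        # sampling rate, Hz (illustrative)

def accel(chatter):
    """Synthetic acceleration: broadband noise plus, when chattering,
    a strong tone at a hypothetical chatter frequency of 180 Hz."""
    t = np.arange(2048) / fs
    x = rng.normal(0, 1.0, t.size)
    if chatter:
        x += 5 * np.sin(2 * np.pi * 180 * t)
    return x

ref = accel(chatter=False)         # reference saved during stable cutting
ref_var = ref.var()
ref_peak = np.abs(np.fft.rfft(ref)).max()

def is_chatter(x, k_var=2.0, k_spec=3.0):
    """Relative-threshold test: variance or spectral peak well above
    the stable-cutting reference indicates chatter."""
    return bool(x.var() > k_var * ref_var or
                np.abs(np.fft.rfft(x)).max() > k_spec * ref_peak)
```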

  16. Towards Optimal Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Agumbe Suresh, Mahima

    2012-01-03

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil & gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures have proven costly and imprecise, especially when dealing with large-scale distribution systems. In this paper, to the best of our knowledge for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. Sensor nodes move along the edges of the network and detect events (i.e., attacks) and proximity to beacon nodes with known placement in the network. We formulate the problem of minimizing the cost of monitoring the infrastructure (i.e., minimizing the number of sensor and beacon nodes deployed) while ensuring a degree of sensing coverage in a zone of interest and a required accuracy in locating events. We propose algorithms for solving these problems and demonstrate their effectiveness with results obtained from a high-fidelity simulator.
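
    The coverage part of such a deployment-minimization problem is closely related to set cover, for which a greedy heuristic is the standard sketch; the candidate positions and coverage sets below are entirely hypothetical and not taken from the paper:

```python
# Greedy set cover as a stand-in for the coverage part of the deployment
# problem: choose few candidate sensor positions whose detection zones
# together cover every edge in a zone of interest (all data hypothetical).
edges = set(range(10))             # edges of the flow network to monitor
candidates = {                     # position -> edges its sensor covers
    "a": {0, 1, 2, 3},
    "b": {2, 3, 4, 5},
    "c": {5, 6, 7},
    "d": {7, 8, 9},
    "e": {0, 4, 8},
}

chosen, uncovered = [], set(edges)
while uncovered:
    # Pick the candidate covering the most still-uncovered edges.
    best = max(candidates, key=lambda p: len(candidates[p] & uncovered))
    if not candidates[best] & uncovered:
        break                      # remaining edges cannot be covered
    chosen.append(best)
    uncovered -= candidates[best]

print(chosen, uncovered)
```

The greedy rule gives a well-known ln(n)-factor approximation to the optimal cover, which is usually an acceptable trade-off for placement problems of this kind.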

  17. A glucose model based on support vector regression for the prediction of hypoglycemic events under free-living conditions.

    Science.gov (United States)

    Georga, Eleni I; Protopappas, Vasilios C; Ardigò, Diego; Polyzos, Demosthenes; Fotiadis, Dimitrios I

    2013-08-01

    The prevention of hypoglycemic events is of paramount importance in the daily management of insulin-treated diabetes. The use of short-term prediction algorithms of the subcutaneous (s.c.) glucose concentration may contribute significantly toward this direction. The literature suggests that, although the recent glucose profile is a prominent predictor of hypoglycemia, the overall patient's context greatly impacts its accurate estimation. The objective of this study is to evaluate the performance of a support vector regression (SVR) s.c. glucose model on hypoglycemia prediction. We extend our SVR model to predict separately the nocturnal events during sleep and the non-nocturnal (i.e., diurnal) ones over 30-min and 60-min horizons using information on recent glucose profile, meals, insulin intake, and physical activities for a hypoglycemic threshold of 70 mg/dL. We also introduce herein additional variables accounting for recurrent nocturnal hypoglycemia due to antecedent hypoglycemia, exercise, and sleep. SVR predictions are compared with those from two other machine learning techniques. The method is assessed on a dataset of 15 patients with type 1 diabetes under free-living conditions. Nocturnal hypoglycemic events are predicted with 94% sensitivity for both horizons and with time lags of 5.43 min and 4.57 min, respectively. As concerns the diurnal events, when physical activities are not considered, the sensitivity is 92% and 96% for a 30-min and 60-min horizon, respectively, with both time lags being less than 5 min. However, when such information is introduced, the diurnal sensitivity decreases by 8% and 3%, respectively. Both nocturnal and diurnal predictions show a high (>90%) precision. Results suggest that hypoglycemia prediction using SVR can be accurate and performs better in most diurnal and nocturnal cases compared with other techniques. It is advised that the problem of hypoglycemia prediction should be handled differently for nocturnal
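
    For illustration only, the alarm logic (predict s.c. glucose a horizon ahead and compare with the 70 mg/dL threshold) can be sketched with a simple linear extrapolation standing in for the SVR model; the CGM traces below are invented:

```python
import numpy as np

# Stand-in for short-term glucose prediction: extrapolate the recent
# CGM trend 30 min ahead and alarm below 70 mg/dL. (The paper uses
# support vector regression with meal/insulin/activity inputs; this
# linear extrapolation only illustrates the alarm logic.)
def predict_ahead(glucose, minutes_ahead=30, step=5, fit_points=6):
    """Fit a line to the last `fit_points` CGM samples (one every
    `step` minutes) and extrapolate `minutes_ahead` minutes."""
    recent = np.asarray(glucose[-fit_points:], dtype=float)
    t = np.arange(fit_points) * step
    slope, intercept = np.polyfit(t, recent, 1)
    return slope * (t[-1] + minutes_ahead) + intercept

falling = [130, 122, 113, 106, 97, 90]    # mg/dL, dropping fast (invented)
steady = [105, 104, 106, 105, 104, 105]   # mg/dL, stable (invented)

alarm = predict_ahead(falling) < 70
print(predict_ahead(falling), predict_ahead(steady), alarm)
```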

  18. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    Science.gov (United States)

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an In Silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or “QCP”) which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an In Silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model is seen to disagree while the
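
    The similarity-based estimation at the heart of SBM-style methods can be sketched generically: reconstruct each observation as a similarity-weighted blend of memorized normal states and monitor the residual. Everything below (kernel, memory matrix, state variables) is an illustrative assumption, not the proprietary SBM algorithm:

```python
import numpy as np

rng = np.random.default_rng(7)

def kernel(u, v, h=0.3):
    """Gaussian similarity between two state vectors."""
    return np.exp(-np.sum((u - v) ** 2) / (2 * h ** 2))

# Memory matrix: snapshots of two hypothetical vital-sign channels
# sampled over one cycle of normal variation, then normalized.
memory = np.column_stack([
    70 + 5 * np.sin(np.linspace(0, 2 * np.pi, 50)),
    100 + 3 * np.cos(np.linspace(0, 2 * np.pi, 50)),
])
memory = (memory - memory.mean(0)) / memory.std(0)

def estimate(x):
    """Similarity-weighted blend of memorized normal states."""
    w = np.array([kernel(x, m) for m in memory])
    w /= w.sum()
    return w @ memory

normal = memory[10] + rng.normal(0, 0.05, 2)   # near a memorized state
abnormal = np.array([3.0, -3.0])               # far outside normal range

r_normal = np.linalg.norm(normal - estimate(normal))
r_abnormal = np.linalg.norm(abnormal - estimate(abnormal))
print(r_normal, r_abnormal)
```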

  19. Model-based decision making in early clinical development: minimizing the impact of a blood pressure adverse event.

    Science.gov (United States)

    Stroh, Mark; Addy, Carol; Wu, Yunhui; Stoch, S Aubrey; Pourkavoos, Nazaneen; Groff, Michelle; Xu, Yang; Wagner, John; Gottesdiener, Keith; Shadle, Craig; Wang, Hong; Manser, Kimberly; Winchell, Gregory A; Stone, Julie A

    2009-03-01

    We describe how modeling and simulation guided program decisions following a randomized placebo-controlled single-rising oral dose first-in-man trial of compound A where an undesired transient blood pressure (BP) elevation occurred in fasted healthy young adult males. We proposed a lumped-parameter pharmacokinetic-pharmacodynamic (PK/PD) model that captured important aspects of the BP homeostasis mechanism. Four conceptual units characterized the feedback PD model: a sinusoidal BP set point, an effect compartment, a linear effect model, and a system response. To explore approaches for minimizing the BP increase, we coupled the PD model to a modified PK model to guide oral controlled-release (CR) development. The proposed PK/PD model captured the central tendency of the observed data. The simulated BP response obtained with theoretical release rate profiles suggested some amelioration of the peak BP response with CR. This triggered subsequent CR formulation development; we used actual dissolution data from these candidate CR formulations in the PK/PD model to confirm a potential benefit in the peak BP response. Though this paradigm has yet to be tested in the clinic, our model-based approach provided a common rational framework to more fully utilize the limited available information for advancing the program.

  20. Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains.

    Science.gov (United States)

    Pillow, Jonathan W; Ahmadian, Yashar; Paninski, Liam

    2011-01-01

    One of the central problems in systems neuroscience is to understand how neural spike trains convey sensory information. Decoding methods, which provide an explicit means for reading out the information contained in neural spike responses, offer a powerful set of tools for studying the neural coding problem. Here we develop several decoding methods based on point-process neural encoding models, or forward models that predict spike responses to stimuli. These models have concave log-likelihood functions, which allow efficient maximum-likelihood model fitting and stimulus decoding. We present several applications of the encoding model framework to the problem of decoding stimulus information from population spike responses: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus, the most probable stimulus to have generated an observed single- or multiple-neuron spike train response, given some prior distribution over the stimulus; (2) a Gaussian approximation to the posterior stimulus distribution that can be used to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the spike trains emitted by a neural population; and (4) a framework for the detection of change-point times (the time at which the stimulus undergoes a change in mean or variance) by marginalizing over the posterior stimulus distribution. We provide several examples illustrating the performance of these estimators with simulated and real neural data.
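
    Application (1), MAP decoding under a point-process encoding model, can be illustrated in miniature for a scalar stimulus and Poisson spike counts with an exponential nonlinearity; the weights, baseline, and grid below are arbitrary choices, and the paper's full treatment covers stimulus vectors and spike-train histories:

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy MAP stimulus decoding: each of 20 neurons fires with rate
# exp(a_i * s + b_i) for scalar stimulus s, with a standard normal
# prior on s.
a = rng.normal(0, 1.0, 20)         # per-neuron stimulus weights
b = np.full(20, 1.0)               # baseline log-rates
s_true = 0.8
counts = rng.poisson(np.exp(a * s_true + b))

s_grid = np.linspace(-3, 3, 6001)
# Poisson log-likelihood summed over neurons, plus the log prior.
loglike = (counts[:, None] * (a[:, None] * s_grid + b[:, None])
           - np.exp(a[:, None] * s_grid + b[:, None])).sum(0)
logpost = loglike - 0.5 * s_grid ** 2
s_map = s_grid[logpost.argmax()]
print(s_map)
```

Because the log-likelihood is concave in s, the grid search could equally be replaced by Newton's method, which is what makes MAP decoding tractable in higher dimensions.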

  1. Structural Damage Detection using Frequency Response Function Index and Surrogate Model Based on Optimized Extreme Learning Machine Algorithm

    Directory of Open Access Journals (Sweden)

    R. Ghiasi

    2017-09-01

    Full Text Available Utilizing surrogate models based on artificial intelligence methods for detecting structural damage has attracted the attention of many researchers in recent decades. In this study, a new kernel based on the Littlewood-Paley Wavelet (LPW is proposed for the Extreme Learning Machine (ELM algorithm to improve the accuracy of detecting multiple damages in structural systems. ELM is used as a metamodel (surrogate model of exact finite element analysis of structures in order to efficiently reduce the computational cost of the updating process. In the proposed two-step method, first a damage index, based on the Frequency Response Function (FRF of the structure, is used to identify the location of damages. In the second step, the severity of damages in the identified elements is detected using ELM. In order to evaluate the efficacy of ELM, the results obtained from the proposed kernel were compared with other kernels proposed for ELM as well as the Least Square Support Vector Machine algorithm. The numerical problems solved indicate that the accuracy of the ELM algorithm in detecting structural damages increases drastically when the LPW kernel is used.
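
    The ELM surrogate itself is simple to sketch: a random, untrained hidden layer followed by a single least-squares solve for the output weights. The toy 1-D target below stands in for the finite element response, and a plain tanh activation is used rather than the proposed Littlewood-Paley wavelet kernel:

```python
import numpy as np

rng = np.random.default_rng(9)

# Tiny Extreme Learning Machine regressor: random hidden layer, output
# weights solved in one least-squares step (no backprop).
X = np.linspace(-1, 1, 200)[:, None]
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=200)

n_hidden = 40
W = rng.normal(0, 3.0, (1, n_hidden))     # random input weights (frozen)
bias = rng.normal(0, 1.0, n_hidden)

H = np.tanh(X @ W + bias)                 # hidden-layer activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights

y_hat = H @ beta
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
print(rmse)
```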

  2. Machine learning for the automatic detection of anomalous events

    Science.gov (United States)

    Fisher, Wendy D.

    In this dissertation, we describe our research contributions for a novel approach to the application of machine learning for the automatic detection of anomalous events. We work in two different domains to ensure a robust data-driven workflow that could be generalized for monitoring other systems. Specifically, in our first domain, we begin with the identification of internal erosion events in earth dams and levees (EDLs) using geophysical data collected from sensors located on the surface of the levee. As EDLs across the globe reach the end of their design lives, effectively monitoring their structural integrity is of critical importance. The second domain of interest is related to mobile telecommunications, where we investigate a system for automatically detecting non-commercial base station routers (BSRs) operating in protected frequency space. The presence of non-commercial BSRs can disrupt the connectivity of end users, cause service issues for the commercial providers, and introduce significant security concerns. We provide our motivation, experimentation, and results from investigating a generalized novel data-driven workflow using several machine learning techniques. In Chapter 2, we present results from our performance study that uses popular unsupervised clustering algorithms to gain insights into our real-world problems, and evaluate our results using internal and external validation techniques. Using EDL passive seismic data from an experimental laboratory earth embankment, results consistently show a clear separation of events from non-events in four of the five clustering algorithms applied. Chapter 3 uses a multivariate Gaussian machine learning model to identify anomalies in our experimental data sets. For the EDL work, we used experimental data from two different laboratory earth embankments. Additionally, we explore five wavelet transform methods for signal denoising. The best performance is achieved with the Haar wavelets. We achieve up to 97
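
    The Haar-wavelet denoising step that performed best can be sketched with a hand-rolled orthonormal Haar transform and soft thresholding; the number of levels, threshold, and test signal below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(10)

def haar_denoise(x, levels=3, thresh=0.5):
    """1-D Haar wavelet denoising with soft thresholding of the detail
    coefficients. len(x) must be divisible by 2**levels."""
    x = np.asarray(x, dtype=float)
    approx, details = x, []
    for _ in range(levels):        # forward orthonormal Haar transform
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        details.append(np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0))
        approx = a
    for d in reversed(details):    # inverse transform
        up = np.empty(2 * approx.size)
        up[0::2] = (approx + d) / np.sqrt(2)
        up[1::2] = (approx - d) / np.sqrt(2)
        approx = up
    return approx

t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 4 * t)
noisy = clean + rng.normal(0, 0.3, t.size)
denoised = haar_denoise(noisy)
print(np.std(noisy - clean), np.std(denoised - clean))
```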

  3. Detection of red tide events in the Ariake Sound, Japan

    Science.gov (United States)

    Ishizaka, Joji

    2003-05-01

    High-resolution SeaWiFS data was used to detect a red tide event that occurred in the Ariake Sound, Japan, in the winter of 2000 to 2001. The area is a small embayment surrounded by tidal flats, and it is known as one of the most productive areas on the coast of Japan. The red tide event damaged seaweed (nori) culture, and its relation to the land reclamation at Isahaya Bay in the Sound has been discussed. SeaWiFS chlorophyll data showed the red tide started in early December 2000 from Isahaya Bay, although a direct relationship to the reclamation was not clear. The red tide persisted to the end of February. Monthly averages of SeaWiFS data from May 1998 to December 2001 indicated that chlorophyll increased twice a year, in early summer and in fall after the rain. The red tide event was part of the fall bloom, which started later and continued longer than in other years. Ocean color is useful for detecting red tides; however, the algorithms need to be improved to accurately estimate chlorophyll in highly turbid water and to discriminate toxic flagellates.

  4. Research on Healthy Anomaly Detection Model Based on Deep Learning from Multiple Time-Series Physiological Signals

    Directory of Open Access Journals (Sweden)

    Kai Wang

    2016-01-01

    Full Text Available Health is vital to every human being. To further improve its already respectable medical technology, the medical community is transitioning towards a proactive approach which anticipates and mitigates risks before illness occurs. This approach requires measuring human physiological signals and analyzing these data at regular intervals. In this paper, we present a novel approach that applies deep learning to physiological signal analysis to allow doctors to identify latent risks. However, extracting high-level information from physiological time-series data is a hard problem faced by the machine learning community. Therefore, in this approach, we apply a model based on a convolutional neural network that can automatically learn features from raw physiological signals in an unsupervised manner, and then use a multivariate Gaussian distribution anomaly detection method on the learned features to detect anomalous data. Our experiments show significant performance in physiological signal anomaly detection, making this a promising tool for doctors to identify early signs of illness even if the criteria are unknown a priori.
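
    The multivariate Gaussian anomaly detection step can be sketched directly: fit a mean and covariance on normal data, then score new points by squared Mahalanobis distance against a chi-square quantile. The two "features" below are invented placeholders for learned CNN features:

```python
import numpy as np

rng = np.random.default_rng(11)

# Multivariate Gaussian anomaly scoring on two hypothetical features:
# fit mean/covariance on normal data, flag low-density points.
normal = rng.multivariate_normal([60, 98], [[25, 8], [8, 4]], 500)

mu = normal.mean(0)
cov = np.cov(normal, rowvar=False)
cov_inv = np.linalg.inv(cov)

def mahalanobis2(x):
    """Squared Mahalanobis distance of x from the fitted Gaussian."""
    d = x - mu
    return d @ cov_inv @ d

threshold = 13.8                   # approx. chi-square(2) 0.999 quantile
typical = np.array([62.0, 98.5])   # near the normal cloud
odd = np.array([90.0, 80.0])       # far from it
print(mahalanobis2(typical), mahalanobis2(odd))
```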

  5. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Directory of Open Access Journals (Sweden)

    Laurissa Tokarchuk

    Full Text Available In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user generated content from social media. Compared with the traditional news media, social media services, such as Twitter, can provide more complete and timely information about the real-world events. However events are often like a puzzle and in order to solve the puzzle/understand the event, we must identify all the sub-events or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization currently typically analyse events based on partial data as conventional data collection methodologies are unable to collect comprehensive event data. This results in existing systems often being unable to report sub-events in real-time and often in completely missing sub-events or pieces in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM framework that leverages the temporal feature of an expanded set of news-worthy event content. In order to more comprehensively and accurately identify sub-events this framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be accomplished in real time so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, the content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contribute to better event detection by identifying additional valid sub-events. The
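
    The burst-detection step applied to a divided stream can be sketched as a simple rate test: count items per time window and flag windows far above a robust baseline. The timestamps, window length, and threshold below are synthetic illustrations, not the STRIM algorithm:

```python
import numpy as np

rng = np.random.default_rng(12)

# Toy burst detector: background chatter over 600 s plus a burst of
# extra posts between t = 300 s and t = 315 s (all values invented).
timestamps = np.sort(rng.uniform(0, 600, 900))
timestamps = np.sort(np.concatenate(
    [timestamps, rng.uniform(300, 315, 120)]))

window = 15.0                      # stream-division window, seconds
bins = np.arange(0, 600 + window, window)
counts, _ = np.histogram(timestamps, bins)

# Robust baseline: median rate with a MAD-based threshold.
baseline = np.median(counts)
mad = np.median(np.abs(counts - baseline)) + 1e-9
bursts = np.where(counts > baseline + 5 * mad)[0]
print(bursts, counts[bursts])
```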

  6. Piecing together the puzzle: Improving event content coverage for real-time sub-event detection using adaptive microblog crawling.

    Science.gov (United States)

    Tokarchuk, Laurissa; Wang, Xinyue; Poslad, Stefan

    2017-01-01

    In an age when people are predisposed to report real-world events through their social media accounts, many researchers value the benefits of mining user-generated content from social media. Compared with traditional news media, social media services such as Twitter can provide more complete and timely information about real-world events. However, events are often like a puzzle: in order to understand an event, we must identify all of its sub-events, or pieces. Existing Twitter event monitoring systems for sub-event detection and summarization typically analyse events based on partial data, as conventional data collection methodologies are unable to collect comprehensive event data. As a result, existing systems are often unable to report sub-events in real time and often completely miss sub-events, or pieces, in the broader event puzzle. This paper proposes a Sub-event detection by real-TIme Microblog monitoring (STRIM) framework that leverages the temporal features of an expanded set of newsworthy event content. To identify sub-events more comprehensively and accurately, the framework first proposes the use of adaptive microblog crawling. Our adaptive microblog crawler is capable of increasing the coverage of events while minimizing the amount of non-relevant content. We then propose a stream division methodology that can be performed in real time, so that the temporal features of the expanded event streams can be analysed by a burst detection algorithm. In the final steps of the framework, content features are extracted from each divided stream and recombined to provide a final summarization of the sub-events. The proposed framework is evaluated against traditional event detection using event recall and event precision metrics. Results show that improving the quality and coverage of event contents contributes to better event detection by identifying additional valid sub-events. The novel combination of
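A burst detection step of the kind this framework applies to its divided streams can be sketched as a simple trailing-window heuristic. This is illustrative only: the window length `window` and threshold factor `k` are assumed parameters, not taken from the paper.

```python
def detect_bursts(counts, window=5, k=2.0):
    """Flag time bins whose message count exceeds the trailing mean
    by k trailing standard deviations -- a simple burst heuristic."""
    bursts = []
    for i in range(window, len(counts)):
        history = counts[i - window:i]
        mean = sum(history) / window
        std = (sum((c - mean) ** 2 for c in history) / window) ** 0.5
        if counts[i] > mean + k * std:
            bursts.append(i)
    return bursts

# A sudden spike in per-bin counts is flagged as a burst at index 5.
print(detect_bursts([10, 11, 9, 10, 10, 50]))
```

A flat stream produces no bursts, so quiet periods are ignored at no extra cost.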

  7. Unsupervised behaviour-specific dictionary learning for abnormal event detection

    DEFF Research Database (Denmark)

    Ren, Huamin; Liu, Weifeng; Olsen, Søren Ingvor

    2015-01-01

    the training data is only a small proportion of the surveillance data. Therefore, we propose behavior-specific dictionaries (BSD) through unsupervised learning, pursuing atoms from the same type of behavior to represent one behavior dictionary. To further improve the dictionary by introducing information from...... potential infrequent normal patterns, we refine the dictionary by searching ‘missed atoms’ that have compact coefficients. Experimental results show that our BSD algorithm outperforms state-of-the-art dictionaries in abnormal event detection on the public UCSD dataset. Moreover, BSD has fewer false alarms......

  8. How does structured sparsity work in abnormal event detection?

    DEFF Research Database (Denmark)

    Ren, Huamin; Pan, Hong; Olsen, Søren Ingvor

    the training, which is due to the fact that abnormal videos are limited or even unavailable in advance in most video surveillance applications. As a result, there could be only one label in the training data, which hampers supervised learning; 2) Even though there are multiple types of normal behaviors, how...... many normal patterns lie in the whole surveillance data is still unknown. This is because there is a huge amount of video surveillance data and only a small proportion is used in algorithm learning; consequently, the normal patterns in the training data could be incomplete. As a result, any sparse...... structure learned from the training data could have a high bias and ruin the precision of abnormal event detection. Therefore, we propose in this paper an algorithm to solve the abnormality detection problem by sparse representation, in which local structured sparsity is preserved in coefficients. To better......

  9. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification

    International Nuclear Information System (INIS)

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy-cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification with UL-MBIR was high (0.67–0.89) compared to L-ASIR or UL-ASIR (0.11–0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818–0.860) was comparable to that for L-ASIR (0.696–0.844). The specificity was lower with UL-MBIR (0.79–0.92) than with L-ASIR or UL-ASIR (0.96–0.99), and a significant difference was seen for one reader (P < 0.01). With UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, we should pay attention to the slightly lower specificity.

  10. Model-based iterative reconstruction and adaptive statistical iterative reconstruction: dose-reduced CT for detecting pancreatic calcification.

    Science.gov (United States)

    Yasaka, Koichiro; Katsura, Masaki; Akahane, Masaaki; Sato, Jiro; Matsuda, Izuru; Ohtomo, Kuni

    2016-01-01

    Iterative reconstruction methods have attracted attention for reducing radiation doses in computed tomography (CT). To investigate the detectability of pancreatic calcification using dose-reduced CT reconstructed with model-based iterative reconstruction (MBIR) and adaptive statistical iterative reconstruction (ASIR). This prospective study, approved by the Institutional Review Board, included 85 patients (57 men, 28 women; mean age, 69.9 years; mean body weight, 61.2 kg). Unenhanced CT was performed three times with different radiation doses (reference-dose CT [RDCT], low-dose CT [LDCT], ultralow-dose CT [ULDCT]). From RDCT, LDCT, and ULDCT, images were reconstructed with filtered-back projection (R-FBP, used for establishing the reference standard), ASIR (L-ASIR), and MBIR and ASIR (UL-MBIR and UL-ASIR), respectively. A lesion (pancreatic calcification) detection test was performed by two blinded radiologists with a five-point certainty level scale. Dose-length products of RDCT, LDCT, and ULDCT were 410, 97, and 36 mGy-cm, respectively. Nine patients had pancreatic calcification. The sensitivity for detecting pancreatic calcification with UL-MBIR was high (0.67-0.89) compared to L-ASIR or UL-ASIR (0.11-0.44), and a significant difference was seen between UL-MBIR and UL-ASIR for one reader (P = 0.014). The area under the receiver-operating characteristic curve for UL-MBIR (0.818-0.860) was comparable to that for L-ASIR (0.696-0.844). The specificity was lower with UL-MBIR (0.79-0.92) than with L-ASIR or UL-ASIR (0.96-0.99), and a significant difference was seen for one reader (P < 0.01). With UL-MBIR, pancreatic calcification can be detected with high sensitivity; however, we should pay attention to the slightly lower specificity.

  11. Description and detection of burst events in turbulent flows

    Science.gov (United States)

    Schmid, P. J.; García-Gutierrez, A.; Jiménez, J.

    2018-04-01

    A mathematical and computational framework is developed for the detection and identification of coherent structures in turbulent wall-bounded shear flows. In a first step, this data-based technique will use an embedding methodology to formulate the fluid motion as a phase-space trajectory, from which state-transition probabilities can be computed. Within this formalism, a second step then applies repeated clustering and graph-community techniques to determine a hierarchy of coherent structures ranked by their persistencies. This latter information will be used to detect highly transitory states that act as precursors to violent and intermittent events in turbulent fluid motion (e.g., bursts). Used as an analysis tool, this technique allows the objective identification of intermittent (but important) events in turbulent fluid motion; however, it also lays the foundation for advanced control strategies for their manipulation. The techniques are applied to low-dimensional model equations for turbulent transport, such as the self-sustaining process (SSP), for varying levels of complexity.
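The state-transition step described above can be sketched once the phase-space trajectory has been clustered into integer state labels (the embedding and clustering themselves are out of scope here; this is a generic maximum-likelihood estimate, not the authors' implementation):

```python
def transition_matrix(labels, n_states):
    """Estimate state-transition probabilities from a sequence of
    cluster labels along a phase-space trajectory. Row i holds the
    probabilities of moving from state i to each other state."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(labels, labels[1:]):
        counts[a][b] += 1
    probs = []
    for row in counts:
        total = sum(row)
        probs.append([c / total if total else 0.0 for c in row])
    return probs

# Two-state trajectory: state 0 moves to state 1 twice as often as it stays.
P = transition_matrix([0, 0, 1, 0, 1, 1], 2)
```

Highly transitory states (rows with probability mass spread over many successors) are the candidate precursors the abstract refers to.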

  12. Model-based inversion for the characterization of crack-like defects detected by ultrasound in a cladded component

    International Nuclear Information System (INIS)

    Haiat, G.

    2004-03-01

    This work deals with the inversion of ultrasonic data. The industrial context of the study is the non-destructive evaluation of the internal walls of French reactor pressure vessels. Those inspections aim at detecting and characterizing cracks. Ultrasonic data correspond to echographic responses obtained with a transducer operating in pulse-echo mode. Cracks are detected by the crack-tip diffraction effect. Analysis of the measured data can become difficult because of the presence of a cladding whose surface is irregular; moreover, its constituent material differs from that of the reactor vessel. A model-based inverse method uses simulation of the propagation and diffraction of ultrasound, taking into account the irregular properties of the cladding surface as well as the heterogeneous nature of the component. The method developed was implemented and tested on a set of representative cases. Its performance was evaluated by the analysis of experimental results. The precision obtained in the laboratory on the experimental cases treated conforms to the industrial expectations motivating this study. (author)

  13. A novel airport extraction model based on saliency region detection for high spatial resolution remote sensing images

    Science.gov (United States)

    Lv, Wen; Zhang, Libao; Zhu, Yongchun

    2017-06-01

    The airport is one of the most crucial traffic facilities in military and civil fields. Automatic airport extraction in high spatial resolution remote sensing images has many applications, such as regional planning and military reconnaissance. Traditional airport extraction strategies are usually based on prior knowledge and locate the airport target by template matching and classification, which causes high computational complexity and large computing-resource costs for high spatial resolution remote sensing images. In this paper, we propose a novel automatic airport extraction model based on saliency region detection, airport runway extraction, and adaptive threshold segmentation. For saliency region detection, we choose the frequency-tuned (FT) model for computing airport saliency using low-level features of color and luminance, which is easy and fast to implement and can provide full-resolution saliency maps. For airport runway extraction, the Hough transform is adopted to count the number of parallel line segments. For adaptive threshold segmentation, the Otsu threshold segmentation algorithm is applied to obtain more accurate airport regions. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in the extraction of the airport.

  14. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
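The wavelet side of this approach can be illustrated with a hand-rolled one-level Haar transform (a stand-in for whatever wavelet family the authors used). The coarse-scale detail coefficients are the compact trend descriptors that would be stored in indexed database columns:

```python
def haar_level(signal):
    """One level of the (unnormalised) Haar transform:
    pairwise averages (approximation) and half-differences (detail)."""
    approx = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_coeffs(signal, levels):
    """Multi-level Haar decomposition: returns the final approximation
    and the detail coefficients at each level (fine to coarse)."""
    coeffs = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_level(approx)
        coeffs.append(detail)
    return approx, coeffs

# A step change in a trend survives as a compact two-sample approximation.
approx, coeffs = haar_coeffs([1, 1, 1, 1, 5, 5, 5, 5], 2)
```

Because each level halves the data, a long multiparameter trend compresses to a handful of coefficients, which is what makes the minimal-join queries described above feasible.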

  15. Detecting and characterising ramp events in wind power time series

    International Nuclear Information System (INIS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-01-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
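A simplified reading of such a continuous ramp index — the largest absolute power change evaluated over several time scales at each step — might look like the sketch below. The scale set and the absence of wavelet machinery are assumptions for illustration, not the paper's exact definition:

```python
def ramp_function(power, scales=(1, 2, 4)):
    """Continuous ramp-intensity index: at each time step, the largest
    absolute power gradient over a set of backward-looking time scales."""
    ramp = [0.0] * len(power)
    for t in range(len(power)):
        grads = [abs(power[t] - power[t - s]) for s in scales if t - s >= 0]
        ramp[t] = max(grads) if grads else 0.0
    return ramp

# The index peaks where the power output jumps, instead of a hard
# ramp/non-ramp label at an ad-hoc threshold.
r = ramp_function([0, 0, 0, 5, 5, 5], scales=(1, 2))
```

Keeping the index continuous is what lets a forecaster inspect, e.g., daily profiles of ramp-up and ramp-down intensity rather than a binary event count.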

  16. Motion Pattern Extraction and Event Detection for Automatic Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Benabbas Yassine

    2011-01-01

    Full Text Available Efficient analysis of human behavior in video surveillance scenes is a very challenging problem. Most traditional approaches fail when applied in real conditions and contexts involving large numbers of persons, appearance ambiguity, and occlusion. In this work, we propose to deal with this problem by modeling the global motion information obtained from optical flow vectors. The obtained direction and magnitude models learn the dominant motion orientations and magnitudes at each spatial location of the scene and are used to detect the major motion patterns. The applied region-based segmentation algorithm groups local blocks that share the same motion direction and speed, and allows a subregion of the scene to appear in different patterns. The second part of the approach consists in detecting events related to groups of people, namely merge, split, walk, run, local dispersion, and evacuation, by analyzing the instantaneous optical flow vectors and comparing them against the learned models. The approach is validated and experimented on standard datasets of the computer vision community. The qualitative and quantitative results are discussed.

  17. Balloon-Borne Infrasound Detection of Energetic Bolide Events

    Science.gov (United States)

    Young, Eliot F.; Ballard, Courtney; Klein, Viliam; Bowman, Daniel; Boslough, Mark

    2016-10-01

    Infrasound is usually defined as sound waves below 20 Hz, the nominal limit of human hearing. Infrasound waves propagate over vast distances through the Earth's atmosphere: the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization) has 48 installed infrasound-sensing stations around the world to detect nuclear detonations and other disturbances. In February 2013, several CTBTO infrasound stations detected infrasound signals from a large bolide that exploded over Chelyabinsk, Russia. Some stations recorded signals that had circumnavigated the Earth, over a day after the original event. The goal of this project is to improve upon the sensitivity of the CTBTO network by putting microphones on small, long-duration super-pressure balloons, with the overarching goal of studying the small end of the NEO population by using the Earth's atmosphere as a witness plate. A balloon-borne infrasound sensor is expected to have two advantages over ground-based stations: a lack of wind noise and a concentration of infrasound energy in the "stratospheric duct" between roughly 5 - 50 km altitude. To test these advantages, we have built a small balloon payload with five calibrated microphones. We plan to fly this payload on a NASA high-altitude balloon from Ft Sumner, NM in August 2016. We have arranged for three large explosions to take place in Socorro, NM while the balloon is aloft to assess the sensitivity of balloon-borne vs. ground-based infrasound sensors. We will report on the results from this test flight and the prospects for detecting/characterizing small bolides in the stratosphere.

  18. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Science.gov (United States)

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.
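DETECT itself is a MATLAB toolbox; as a language-neutral illustration of the underlying task — finding intervals of time where a signal's properties deviate from a baseline — a simple thresholding sketch in Python could be (the baseline statistics, deviation factor `k`, and minimum event length `min_len` are assumed inputs, not the toolbox's trained model):

```python
def event_intervals(signal, baseline_mean, baseline_std, k=3.0, min_len=2):
    """Return (start, end) index pairs where the signal stays more than
    k baseline standard deviations away from the baseline mean."""
    flags = [abs(x - baseline_mean) > k * baseline_std for x in signal]
    intervals, start = [], None
    for i, f in enumerate(flags):
        if f and start is None:
            start = i                      # event onset
        elif not f and start is not None:
            if i - start >= min_len:       # keep only long-enough events
                intervals.append((start, i - 1))
            start = None
    if start is not None and len(flags) - start >= min_len:
        intervals.append((start, len(flags) - 1))
    return intervals

# A sustained excursion is reported as one interval; a one-sample blip is not.
print(event_intervals([0, 0, 10, 10, 10, 0, 0], 0.0, 1.0))
```

The toolbox's contribution is training per-class models for this labelling step and scoring agreement with manual annotations; the interval bookkeeping above is the common core.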

  19. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Directory of Open Access Journals (Sweden)

    Vernon Lawhern

    Full Text Available Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  20. Model-Based Off-Nominal State Isolation and Detection System for Autonomous Fault Management, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed model-based Fault Management system addresses the need for cost-effective solutions that enable higher levels of onboard spacecraft autonomy to reliably...

  1. Model-Based Off-Nominal State Isolation and Detection System for Autonomous Fault Management, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed model-based Fault Management system addresses the need for cost-effective solutions that enable higher levels of onboard spacecraft autonomy to reliably...

  2. Optimal detection of burst events in gravitational wave interferometric observatories

    International Nuclear Information System (INIS)

    Vicere, Andrea

    2002-01-01

    We consider the problem of detecting a burst signal of unknown shape in the data from gravitational wave interferometric detectors. We introduce a statistic which generalizes the excess power statistic proposed first by Flanagan and Hughes, and then extended by Anderson et al. to the multiple detector case. The statistic that we propose is shown to be optimal for an arbitrary noise spectral characteristic, under the two hypotheses that the noise is Gaussian, albeit colored, and that the prior for the signal is uniform. The statistic derivation is based on the assumption that a signal affects only N parallel samples in the data stream, but that no other information is a priori available, and that the value of the signal at each sample can be arbitrary. This is the main difference from previous works, where different assumptions were made, such as a signal distribution uniform with respect to the metric induced by the (inverse) noise correlation matrix. The two choices are equivalent if the noise is white, and in that limit the two statistics do indeed coincide. In the general case, we believe that the statistic we propose may be more appropriate, because it does not impose the characteristics of the detector noise on the supposed distribution of the gravitational wave signal. Moreover, we show that the proposed statistic can be easily implemented in its exact form, combining standard time-series analysis tools which can be efficiently implemented. We generalize this version of an excess power statistic to the multiple detector case, considering first noise that is uncorrelated among the different instruments, and then including the effect of correlated noise. We discuss exact and approximate forms of the statistic; the choice depends on the characteristics of the noise and on the assumed length of the burst event. As an example, we show the sensitivity of the network of interferometers to a δ-function burst.
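The basic excess power idea (Flanagan and Hughes's version, not the generalized statistic this paper derives) is the sum of squares of N consecutive whitened samples, which under Gaussian noise follows a chi-squared distribution with N degrees of freedom. A sliding-window sketch, with whitening assumed already done:

```python
def excess_power(whitened, n):
    """Sliding excess-power statistic over windows of n consecutive
    whitened samples: sum of squares per window. Under Gaussian noise
    each value is chi-squared distributed with n degrees of freedom,
    so a burst shows up as a value far in the upper tail."""
    return [sum(x * x for x in whitened[i:i + n])
            for i in range(len(whitened) - n + 1)]

# A short burst of amplitude (3, 4) dominates the statistic in the
# windows that contain it.
print(excess_power([0, 0, 3, 4, 0, 0], 2))
```

A detection would then compare each window's value against a chi-squared tail threshold chosen for the desired false-alarm rate.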

  3. Developing Fluorescence Sensor Systems for Early Detection of Nitrification Events in Chloraminated Drinking Water Distribution Systems

    Science.gov (United States)

    Detection of nitrification events in chloraminated drinking water distribution systems remains an ongoing challenge for many drinking water utilities, including Dallas Water Utilities (DWU) and the City of Houston (CoH). Each year, these utilities experience nitrification events ...

  4. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  5. Automated Detection of Financial Events in News Text

    NARCIS (Netherlands)

    F.P. Hogenboom (Frederik)

    2014-01-01

    Today’s financial markets are inextricably linked with financial events like acquisitions, profit announcements, or product launches. Information extracted from news messages that report on such events could hence be beneficial for financial decision making. The ubiquity of news,

  6. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    As today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  7. Prescription-event monitoring: developments in signal detection.

    Science.gov (United States)

    Ferreira, Germano

    2007-01-01

    Prescription-event monitoring (PEM) is a non-interventional intensive method for post-marketing drug safety monitoring of newly licensed medicines. PEM studies are cohort studies where exposure is obtained from a centralised service and outcomes from simple questionnaires completed by general practitioners. Follow-up forms are sent for selected events. Because PEM captures all events and not only the suspected adverse drug reactions, PEM cohorts potentially differ with respect to the distribution of the number of events per person, depending on the nature of the drug under study. This variance can be related either to the condition for which the drug is prescribed (e.g. a condition causing high morbidity will, on average, have a higher number of events per person than a condition with lower morbidity) or to the drug effect itself. This paper describes an exploratory investigation of the distortion caused by product-related variations in the number of events to the interpretation of proportional reporting ratio (PRR) values ("the higher the PRR, the greater the strength of the signal") computed using drug-cohort data. We studied this effect by assessing the agreement between the PRR based on events (event of interest vs all other events) and the PRR based on cases (cases with the event of interest vs cases with any other events). PRRs were calculated for all combinations reported for ten selected drugs against a comparator of 81 other drugs. Three of the ten drugs had cohorts with an apparently higher proportion of patients with a lower number of events. The PRRs based on events were systematically higher than the PRRs based on cases for the combinations reported for these three drugs. Additionally, when applying the threshold criteria for signal screening (n ≥ 3, PRR ≥ 1.5 and chi-squared ≥ 4), the binary agreement was generally high but apparently lower for these three drugs. In conclusion, the distribution of events per patient in drug cohorts shall be
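The event-based PRR and the screening thresholds mentioned above (n ≥ 3 reports of the combination, PRR ≥ 1.5, chi-squared ≥ 4) can be computed from a 2 × 2 drug-event table. A sketch with the conventional cell layout assumed (this is the standard textbook formula, not this paper's case-based variant):

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a 2x2 drug-event table:
    a = target event, drug of interest;  b = other events, drug of interest;
    c = target event, comparators;       d = other events, comparators."""
    return (a / (a + b)) / (c / (c + d))

def is_signal(a, b, c, d):
    """Common screening rule: at least 3 reports of the combination,
    PRR >= 1.5, and Pearson chi-squared (1 df) >= 4."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return a >= 3 and prr(a, b, c, d) >= 1.5 and chi2 >= 4

# 10 of 100 reports for the drug mention the event, vs 10 of 900 for
# the comparators: a strong disproportionality.
print(prr(10, 90, 10, 890))
```

The distortion the paper studies arises because the b and d cells count *events*, so drugs whose cohorts generate many events per patient inflate the event-based denominator differently than a case-based one would.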

  8. Signal detection to identify serious adverse events (neuropsychiatric events) in travelers taking mefloquine for chemoprophylaxis of malaria

    Directory of Open Access Journals (Sweden)

    Naing C

    2012-08-01

    Full Text Available Cho Naing,1,3 Kyan Aung,1 Syed Imran Ahmed,2 Joon Wah Mak31School of Medical Sciences, 2School of Pharmacy and Health Sciences, 3School of Postgraduate Studies and Research, International Medical University, Kuala Lumpur, MalaysiaBackground: For all medications, there is a trade-off between benefits and potential for harm. It is important for patient safety to detect drug-event combinations and analyze them by appropriate statistical methods. Mefloquine is used as chemoprophylaxis for travelers going to regions with known chloroquine-resistant Plasmodium falciparum malaria. As such, there is a concern about serious adverse events associated with mefloquine chemoprophylaxis. The objective of the present study was to assess whether any signal would be detected for the serious adverse events of mefloquine, based on data in clinicoepidemiological studies.Materials and methods: We extracted data on adverse events related to mefloquine chemoprophylaxis from two published datasets. Disproportionality reporting of adverse events such as neuropsychiatric events and other adverse events was presented in a 2 × 2 contingency table. The reporting odds ratio (ROR) and corresponding 95% confidence interval (CI) data-mining algorithm was applied for signal detection. The safety signals are considered significant when the ROR estimates and the lower limits of the corresponding 95% CI are ≥2.Results: Two datasets addressing adverse events of mefloquine chemoprophylaxis (one from a published article and one from a Cochrane systematic review) were included for analyses. The reporting odds ratio was 1.58 (95% CI: 1.49–1.68) based on published data in the selected article, and 1.195 (95% CI: 0.94–1.44) based on data in the selected Cochrane review. Overall, in both datasets, the lower limits of the 95% CI of the reporting odds ratio were less than 2.Conclusion: Based on available data, findings suggested that signals for serious adverse events pertinent to neuropsychiatric event were
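The reporting odds ratio and its 95% CI under the usual log-normal approximation, together with the study's signal rule (both the point estimate and the lower CI limit ≥ 2), can be sketched as below. The cell layout a..d is the conventional 2 × 2 drug-event table; the paper's exact computation is not reproduced here:

```python
import math

def ror_with_ci(a, b, c, d):
    """Reporting odds ratio with a 95% CI from the log-normal
    approximation. a = target event with drug, b = other events with
    drug, c = target event without drug, d = other events without drug."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

def is_ror_signal(a, b, c, d):
    """Signal rule used in the study: ROR >= 2 and lower 95% CI limit >= 2."""
    ror, lo, _ = ror_with_ci(a, b, c, d)
    return ror >= 2 and lo >= 2

# A point estimate of exactly 2 is not a signal here, because the
# lower CI limit falls below 2.
print(ror_with_ci(20, 80, 100, 800))
```

This illustrates why the study found no signal: both datasets gave lower CI limits well under the ≥ 2 criterion even where the point estimate exceeded 1.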

  9. Network hydraulics inclusion in water quality event detection using multiple sensor stations data.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2015-09-01

    Event detection is one of the most challenging current topics in water distribution systems analysis: how regular on-line hydraulic (e.g., pressure, flow) and water quality (e.g., pH, residual chlorine, turbidity) measurements at different network locations can be efficiently utilized to detect water quality contamination events. This study describes an integrated event detection model which combines data from multiple sensor stations with network hydraulics. To date, event detection modelling has typically been limited to a single sensor station location and dataset. Single sensor station models are detached from network hydraulics insights and as a result might be significantly exposed to false positive alarms. This work aims to reduce this limitation by integrating an understanding of local and spatial hydraulic data into an event detection model. The spatial analysis complements the local event detection effort by discovering events with lower signatures through exploring the sensors' mutual hydraulic influences. The unique contribution of this study is in incorporating hydraulic simulation information into the overall event detection process of spatially distributed sensors. The methodology is demonstrated on two example applications using base runs and sensitivity analyses. Results show a clear advantage of the suggested model over single-sensor event detection schemes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, the faults of nodes and poor deployment environments have brought great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept named attribute correlation confidence to evaluate the fitting degree between the sensor reading and the abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.

  11. Multivariate algorithms for initiating event detection and identification in nuclear power plants

    International Nuclear Information System (INIS)

    Wu, Shun-Chi; Chen, Kuang-You; Lin, Ting-Han; Chou, Hwai-Pwu

    2018-01-01

    Highlights: •Multivariate algorithms for NPP initiating event detection and identification. •Recordings from multiple sensors are simultaneously considered for detection. •Both spatial and temporal information is used for event identification. •Untrained event isolation avoids falsely relating an untrained event. •Efficacy of the algorithms is verified with data from the Maanshan NPP simulator. -- Abstract: To prevent escalation of an initiating event into a severe accident, promptly detecting its occurrence and precisely identifying its type are essential. In this study, several multivariate algorithms for initiating event detection and identification are proposed to help maintain safe operations of nuclear power plants (NPPs). By monitoring changes in the NPP sensing variables, an event is detected when the preset thresholds are exceeded. Unlike existing approaches, recordings from sensors of the same type are simultaneously considered for detection, and no subjective reasoning is involved in setting these thresholds. To facilitate efficient event identification, a spatiotemporal feature extractor is proposed. The extracted features consist of the temporal traits used by existing techniques and the spatial signature of an event. Through an F-score-based feature ranking, only those that are most discriminant in classifying the events under consideration will be retained for identification. Moreover, an untrained event isolation scheme is introduced to avoid relating an untrained event to those in the event dataset so that improper recovery actions can be prevented. Results from experiments containing data of 12 event classes and a total of 125 events generated using a simulator of Taiwan's Maanshan NPP are provided to illustrate the efficacy of the proposed algorithms.
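
    The F-score feature-ranking step can be sketched in a few lines. The data and the two-class setting below are made up for illustration; the paper ranks spatiotemporal features across many event classes.

```python
import numpy as np

# F-score (Fisher score) per feature for two classes:
#   F(j) = (mean1_j - mean2_j)^2 / (var1_j + var2_j); higher = more discriminant.
rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 5.0, 1.0], [1.0, 1.0, 1.0], size=(100, 3))  # class 1 samples
X2 = rng.normal([0.1, 0.0, 1.0], [1.0, 1.0, 1.0], size=(100, 3))  # class 2 samples

def f_scores(a, b):
    num = (a.mean(axis=0) - b.mean(axis=0)) ** 2
    den = a.var(axis=0, ddof=1) + b.var(axis=0, ddof=1)
    return num / den

scores = f_scores(X1, X2)
ranking = np.argsort(scores)[::-1]  # retain only the top-ranked features
print(scores, ranking)              # feature 1 separates the classes best
```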

  12. Towards Optimal Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Agumbe Suresh, Mahima; Stoleru, Radu; Denton, Ron; Zechman, Emily; Shihada, Basem

    2012-01-01

    infrastructure (i.e., minimizing the number of sensor and beacon nodes deployed), while ensuring a degree of sensing coverage in a zone of interest and a required accuracy in locating events. We propose algorithms for solving these problems and demonstrate

  13. On the event detected by the Mont Blanc underground neutrino detector on February 23, 1987

    Energy Technology Data Exchange (ETDEWEB)

    Dadykin, V L; Zatsepin, G T; Korchagin, V B

    1988-02-01

    The event detected by the Mont Blanc Soviet-Italian scintillation detector on February 23, 1987 at 2:52:37 is discussed. The corrected energies of the pulses of the event and the probability of the event's imitation by the background are presented.

  14. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    Science.gov (United States)

    2010-06-01

    [Record excerpt consists of front-matter fragments of the report, including Figure 2.1, "Verizon Data Breach Report: Detective Controls by percent of breach victims" [11], and a citation of the Verizon Data Breach Investigations Report (Novak, Porter, Sartin, Tippett, and Valentine).]

  15. On Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Suresh, Mahima Agumbe; Stoleru, Radu; Zechman, Emily M.; Shihada, Basem

    2013-01-01

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil and gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against

  16. Unsupervised Event Characterization and Detection in Multichannel Signals: An EEG application

    Directory of Open Access Journals (Sweden)

    Angel Mur

    2016-04-01

    Full Text Available In this paper, we propose a new unsupervised method to automatically characterize and detect events in multichannel signals. This method is used to identify artifacts in electroencephalogram (EEG) recordings of brain activity. The proposed algorithm has been evaluated and compared with a supervised method; an example of the algorithm's performance in detecting artifacts is shown. The results show that although both methods obtain similar classifications, the proposed method can detect events without training data and can also be applied to signals whose events are unknown a priori. Furthermore, the proposed method provides an optimal window in which an optimal detection and characterization of events is found. The detection of events can be applied in real time.

  17. Improved Deep Belief Networks (IDBN) Dynamic Model-Based Detection and Mitigation for Targeted Attacks on Heavy-Duty Robots

    Directory of Open Access Journals (Sweden)

    Lianpeng Li

    2018-04-01

    Full Text Available In recent years, robots, especially heavy-duty robots, have become prime targets for targeted attacks. These attacks come from both the cyber-domain and the physical-domain. In order to improve the security of heavy-duty robots, this paper proposes a detection and mitigation mechanism based on improved deep belief networks (IDBN) and a dynamic model. The detection mechanism consists of two parts: (1) IDBN security checks, which can detect targeted attacks from the cyber-domain; (2) dynamic-model-based security detection, used to detect targeted attacks which could possibly lead to physical-domain damage. The mitigation mechanism was established on the basis of the detection mechanism and can mitigate transient and discontinuous attacks. Moreover, a test platform was established to carry out a performance evaluation of the proposed mechanism. The results show that the detection accuracy of IDBN for cyber-domain attacks reaches 96.2%, and the detection accuracy for attacks on physical-domain control commands reaches 94%. The performance evaluation test verified the reliability and high efficiency of the proposed detection and mitigation mechanism for heavy-duty robots.

  18. Detection and location of multiple events by MARS. Final report

    International Nuclear Information System (INIS)

    Wang, J.; Masso, J.F.; Archambeau, C.B.; Savino, J.M.

    1980-09-01

    Seismic data from two explosions were processed using the Systems Science and Software MARS (Multiple Arrival Recognition System) seismic event detector in an effort to determine their relative spatial and temporal separation on the basis of seismic data alone. The explosions were less than 1.0 kilometer apart and were separated by less than 0.5 sec in origin time. The seismic data consisted of nine local accelerograms (r < 1.0 km) and four regional (240 through 400 km) seismograms. The MARS processing clearly indicates the presence of multiple explosions, but the restricted frequency range of the data inhibits accurate time picks and hence limits the precision of the event location.

  19. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented based on the criteria of precision, recall, and F-measure. The results are compared with an existing technique and significant improvements are recorded.

  20. Model-Based Fault Detection and Isolation of a Liquid-Cooled Frequency Converter on a Wind Turbine

    DEFF Research Database (Denmark)

    Li, Peng; Odgaard, Peter Fogh; Stoustrup, Jakob

    2012-01-01

    advanced fault detection and isolation schemes. In this paper, an observer-based fault detection and isolation method for the cooling system in a liquid-cooled frequency converter on a wind turbine, built up in a scaled version in the laboratory, is presented. A dynamic model of the scaled cooling system is derived based on an energy balance equation. A fault analysis is conducted to determine the severity and occurrence rate of possible component faults and their end effects in the cooling system. A method using an unknown input observer is developed in order to detect and isolate the faults based on the developed dynamical model. The designed fault detection and isolation algorithm is applied to a set of measured experimental data in which different faults are artificially introduced into the scaled cooling system. The experimental results conclude that the different faults are successfully detected…

  1. Falls event detection using triaxial accelerometry and barometric pressure measurement.

    Science.gov (United States)

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Celler, Branko G; Lovell, Nigel H

    2009-01-01

    A falls detection system employing a Bluetooth-based wearable device containing a triaxial accelerometer and a barometric pressure sensor is described. The aim of this study is to evaluate the use of barometric pressure measurement, as a surrogate measure of altitude, to augment previously reported accelerometry-based falls detection algorithms. The accelerometry and barometric pressure signals obtained from the waist-mounted device are analyzed by a signal processing and classification algorithm to discriminate falls from activities of daily living. This falls detection algorithm has been compared to two existing algorithms which utilize accelerometry signals alone. A set of laboratory-based simulated falls, along with other tasks associated with activities of daily living (16 tests), was performed by 15 healthy volunteers (9 male and 6 female; age: 23.7 ± 2.9 years; height: 1.74 ± 0.11 m). The algorithm incorporating pressure information detected falls with the highest sensitivity (97.8%) and the highest specificity (96.7%).
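
    The fusion idea, an acceleration spike coinciding with an altitude drop inferred from barometric pressure, can be sketched as follows. The thresholds and the simple before/after pressure comparison are illustrative assumptions, not the paper's classification algorithm.

```python
import math

# Illustrative fusion rule: flag a fall when an acceleration-magnitude spike
# coincides with a barometrically inferred altitude decrease.
IMPACT_G = 2.5     # assumed acceleration-magnitude threshold, in g
ALT_DROP_M = 0.7   # assumed minimum altitude decrease around the impact, in metres

def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Standard international barometric height formula."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

def is_fall(acc_xyz, p_before_hpa, p_after_hpa):
    magnitude = math.sqrt(sum(a * a for a in acc_xyz))
    # altitude is lower after the event when pressure is higher
    drop = pressure_to_altitude(p_before_hpa) - pressure_to_altitude(p_after_hpa)
    return magnitude > IMPACT_G and drop > ALT_DROP_M

print(is_fall((0.3, 0.2, 3.1), 1013.25, 1013.40))  # spike + descent: a fall
print(is_fall((0.0, 0.0, 1.0), 1013.25, 1013.40))  # no spike: not a fall
```

    In the study itself the two signals feed a trained classifier rather than fixed thresholds; the sketch only shows why the pressure channel helps reject spike-only events such as sitting down heavily.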

  2. Bidirectional RNN for Medical Event Detection in Electronic Health Records.

    Science.gov (United States)

    Jagannatha, Abhyuday N; Yu, Hong

    2016-06-01

    Sequence labeling for the extraction of medical events and their attributes from unstructured text in Electronic Health Record (EHR) notes is a key step towards semantic understanding of EHRs. It has important applications in health informatics, including pharmacovigilance and drug surveillance. The state-of-the-art supervised machine learning models in this domain are based on Conditional Random Fields (CRFs) with features calculated from fixed context windows. In this application, we explored recurrent neural network frameworks and show that they significantly outperformed the CRF models.

  3. Power System Extreme Event Detection: The VulnerabilityFrontier

    Energy Technology Data Exchange (ETDEWEB)

    Lesieutre, Bernard C.; Pinar, Ali; Roy, Sandip

    2007-10-17

    In this work we apply graph theoretic tools to provide a close bound on a frontier relating the number of line outages in a grid to the power disrupted by the outages. This frontier describes the boundary of a space relating the possible severity of a disturbance in terms of power disruption, from zero to some maximum on the boundary, to the number of line outages involved in the event. We present the usefulness of this analysis with a complete analysis of a 30 bus system, and present results for larger systems.

  4. Event detection in athletics for personalized sports content delivery

    DEFF Research Database (Denmark)

    Katsarakis, N.; Pnevmatikakis, A.

    2009-01-01

    Broadcasting of athletics is nowadays biased towards running (sprint and longer distances) sports. Personalized content delivery can change that for users that wish to focus on different content. Using a combination of video signal processing algorithms and live information that accompanies the video of large-scale sports like the Olympics, a system can attend to the preferences of users by selecting the most suitable camera view for them. There are two types of camera selection for personalized content delivery. According to the between-sport camera selection, the view is changed between two… The paper details the algorithms needed for the extraction of the events that trigger both between- and within-sport camera selection, and describes a system that handles user preferences, live information and video-generated events to offer personalized content to the users.

  5. Spatial-temporal event detection in climate parameter imagery.

    Energy Technology Data Exchange (ETDEWEB)

    McKenna, Sean Andrew; Gutierrez, Karen A.

    2011-10-01

    Previously developed techniques that comprise statistical parametric mapping, with applications focused on human brain imaging, are examined and tested here for new applications in anomaly detection within remotely sensed imagery. Two approaches to analysis are developed: online, regression-based anomaly detection and conditional differences. These approaches are applied to two example spatial-temporal data sets: data simulated with a Gaussian field deformation approach and weekly NDVI images derived from global satellite coverage. Results indicate that anomalies can be identified in spatial-temporal data with the regression-based approach. Additionally, La Niña and El Niño climatic conditions are used as different stimuli applied to the Earth, and this comparison shows that El Niño conditions lead to significant decreases in NDVI in both the Amazon Basin and Southern India.
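
    A minimal sketch of online, regression-based anomaly detection on a single pixel's time series, assuming a synthetic seasonal NDVI-like signal rather than the study's data:

```python
import numpy as np

# Fit a seasonal-cycle regression to a weekly series and flag large residuals.
rng = np.random.default_rng(1)
t = np.arange(104)                         # two years of weekly samples
season = np.sin(2 * np.pi * t / 52.0)      # annual cycle regressor
y = 0.5 + 0.3 * season + rng.normal(0.0, 0.02, t.size)
y[60] -= 0.3                               # inject an anomalous drop at week 60

X = np.column_stack([np.ones(t.size), season])      # intercept + seasonal term
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # ordinary least squares
resid = y - X @ beta
z = (resid - resid.mean()) / resid.std()            # standardized residuals
anomalies = np.flatnonzero(np.abs(z) > 4.0)
print(anomalies)                                    # the injected week stands out
```

    In the imagery setting the same test runs per pixel, producing a spatial map of standardized residuals in the spirit of statistical parametric mapping.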

  6. A Decision Mixture Model-Based Method for Inshore Ship Detection Using High-Resolution Remote Sensing Images.

    Science.gov (United States)

    Bi, Fukun; Chen, Jing; Zhuang, Yin; Bian, Mingming; Zhang, Qingjun

    2017-06-22

    With the rapid development of optical remote sensing satellites, ship detection and identification based on large-scale remote sensing images has become a significant maritime research topic. Compared with traditional ocean-going vessel detection, inshore ship detection has received increasing attention in harbor dynamic surveillance and maritime management. However, because the harbor environment is complex and the gray-level information and texture features of docked ships and their connected dock regions are indistinguishable, most popular detection methods are limited in their calculation efficiency and detection accuracy. In this paper, a novel hierarchical method that combines an efficient candidate scanning strategy and an accurate candidate identification mixture model is presented for inshore ship detection in complex harbor areas. First, in the candidate region extraction phase, an omnidirectional intersected two-dimensional scanning (OITDS) strategy is designed to rapidly extract candidate regions from the land-water segmented images. In the candidate region identification phase, a decision mixture model (DMM) is proposed to identify real ships from candidate objects. Specifically, to improve robustness to the diversity of ships, a deformable part model (DPM) is employed to train a key-part sub-model and a whole-ship sub-model. Furthermore, to improve identification accuracy, a surrounding correlation context sub-model is built. Finally, these three sub-models are integrated into the proposed DMM. Experiments were performed on numerous large-scale harbor remote sensing images, and the results showed that the proposed method has high detection accuracy and rapid computational efficiency.

  7. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
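
    The core detection step, correlating a template against continuous data and stacking over stations, can be sketched with synthetic traces. The real method uses strain Green's tensors as the matched filter; the waveform, noise levels and threshold below are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic non-impulsive "event" waveform used as the matched filter
template = np.sin(2 * np.pi * np.arange(50) / 25.0) * np.hanning(50)

n, onset = 1000, 400
streams = []
for _ in range(3):                            # three stations, common onset
    trace = rng.normal(0.0, 0.3, n)           # background noise
    trace[onset:onset + 50] += template       # event buried in the noise
    streams.append(trace)

def detection_function(traces, tmpl):
    """Stack per-station correlations into a single scalar-valued series."""
    corrs = [np.correlate(tr, tmpl, mode="valid") for tr in traces]
    return np.sum(corrs, axis=0)

d = detection_function(streams, template)
threshold = 8.0 * np.median(np.abs(d - np.median(d)))  # noise-relative (MAD) level
picks = np.flatnonzero(d > threshold)
print(picks)  # cluster of samples near the true onset
```

    Stacking is what buys sensitivity: an event too weak to trigger on any single station still produces a coherent peak when the station correlations are summed at the correct lag.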

  8. Detecting impacts of extreme events with ecological in situ monitoring networks

    Directory of Open Access Journals (Sweden)

    M. D. Mahecha

    2017-09-01

    Full Text Available Extreme hydrometeorological conditions typically impact ecophysiological processes on land. Satellite-based observations of the terrestrial biosphere provide an important reference for detecting and describing the spatiotemporal development of such events. However, in-depth investigations of ecological processes during extreme events require additional in situ observations. The question is whether the density of existing ecological in situ networks is sufficient for analysing the impact of extreme events, and what the expected event detection rates of ecological in situ networks of a given size are. To assess these issues, we build a baseline of extreme reductions in the fraction of absorbed photosynthetically active radiation (FAPAR), identified by a new event detection method tailored to identify extremes of regional relevance. We then investigate the event detection success rates of hypothetical networks of varying sizes. Our results show that large extremes can be reliably detected with relatively small networks, but also reveal a linear decay of detection probabilities towards smaller extreme events in log–log space. For instance, networks with ≈ 100 randomly placed sites in Europe yield a ≥ 90 % chance of detecting the eight largest (typically very large) extreme events, but only a ≥ 50 % chance of capturing the 39 largest events. These findings are consistent with probability-theoretic considerations, but the slopes of the decay rates deviate due to temporal autocorrelation and the exact implementation of the extreme event detection algorithm. Using the examples of AmeriFlux and NEON, we then investigate to what degree ecological in situ networks can capture extreme events of a given size. Consistent with our theoretical considerations, we find that today's systematically designed networks (i.e. NEON) reliably detect the largest extremes, but that the extreme event detection rates are not higher than would

  9. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that raises alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular measurements than event-time measurements), and incorporating the time factor through a time-decay coefficient, ascribing higher importance to recent observations when classifying a time-step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is prominent in its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
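
    The two roles of the SVM weights can be sketched independently of the SVM itself. All numbers below are illustrative; in the study, the decay coefficient and other parameters are set by data-driven optimization.

```python
import numpy as np

# Per-sample weights combining (1) inverse-frequency class balancing, which
# blurs the size difference between the abundant normal class and the rare
# event class, and (2) exponential time decay favouring recent observations.
# The resulting weights would be supplied to a weighted SVM during training.
labels = np.array([0] * 95 + [1] * 5)      # 95 normal samples, 5 event samples
ages = np.arange(labels.size)[::-1]        # 0 = newest observation

class_w = labels.size / (2.0 * np.bincount(labels))  # [~0.53 for normal, 10 for event]
decay = np.exp(-0.05 * ages)                         # assumed time-decay coefficient
weights = class_w[labels] * decay

print(weights[:3], weights[-3:])  # old normal samples get tiny weight,
                                  # recent event samples the largest
```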

  10. Object-Oriented Query Language For Events Detection From Images Sequences

    Science.gov (United States)

    Ganea, Ion Eugen

    2015-09-01

    In this paper, a method is presented to represent events extracted from image sequences, together with the query language used for event detection. Using an object-oriented model, the spatial and temporal relationships between salient objects, and also between events, are stored and queried. This work aims to unify the storing and querying phases of video event processing. The object-oriented language syntax used for event processing allows the instantiation of index classes in order to improve the accuracy of query results. The experiments were performed on image sequences from the sports domain and show the reliability and robustness of the proposed language. To extend the language, a specific syntax will be added for constructing templates for abnormal events and for detection of incidents, as the final goal of the research.

  11. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    Science.gov (United States)

    2016-06-01

    Master's thesis, June 2016 (public release; distribution is unlimited). The Polaris 2.1 gamma ray imaging radiation detection device was modeled. The platform, Simkit, was utilized to create a discrete event simulation (DES) model of the Polaris. After carefully constructing the DES

  12. Event Detection Challenges, Methods, and Applications in Natural and Artificial Systems

    Science.gov (United States)

    2009-03-01

    Sauvageon, Agogino, Mehr, and Tumer [2006], for instance, use a fourth degree polynomial within an event detection algorithm to sense high... cancer , and coronary artery disease. His study examines the age at which to begin screening exams, the intervals between the exams, and (possibly...AM, Mehr AF, and Tumer IY. 2006. “Comparison of Event Detection Methods for Centralized Sensor Networks.” IEEE Sensors Applications Symposium 2006

  13. Screening DNA chip and event-specific multiplex PCR detection methods for biotech crops.

    Science.gov (United States)

    Lee, Seong-Hun

    2014-11-01

    There are about 80 biotech crop events that have been approved by safety assessment in Korea. They have been controlled by genetically modified organism (GMO) and living modified organism (LMO) labeling systems. The DNA-based detection method has been used as an efficient scientific management tool. Recently, multiplex polymerase chain reaction (PCR) and DNA chip methods have been developed for simultaneous detection of several biotech crop events. The event-specific multiplex PCR method was developed to detect five biotech maize events: MIR604, Event 3272, LY 038, MON 88017 and DAS-59122-7. The specificity was confirmed and the sensitivity was 0.5%. The screening DNA chip was developed from four endogenous genes of soybean, maize, cotton and canola, respectively, along with two regulatory elements and seven genes: P35S, tNOS, pat, bar, epsps1, epsps2, pmi, cry1Ac and cry3B. The specificity was confirmed and the sensitivity was 0.5% for the four crops' 12 events: one soybean, six maize, three cotton and two canola events. The multiplex PCR and DNA chip can serve as efficient detection methods for screening, gene-specific and event-specific analysis of biotech crops, saving workload and time. © 2014 Society of Chemical Industry.

  14. An integrated logit model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Housh, Mashor; Ostfeld, Avi

    2015-05-15

    The problem of contamination event detection in water distribution systems has become one of the most challenging research topics in water distribution systems analysis. Current attempts at event detection utilize a variety of approaches including statistical, heuristic, machine learning, and optimization methods. Several existing event detection systems share a common feature in which alarms are obtained separately for each of the water quality indicators. Unifying those single alarms from different indicators is usually performed by means of simple heuristics. A salient feature of the approach developed here is the use of a statistically oriented model for discrete choice prediction, estimated using the maximum likelihood method, to integrate the single alarms. The discrete choice model is jointly calibrated with the other components of the event detection system framework on a training data set using genetic algorithms. The fusing process of the individual indicator probabilities, which is left out of focus in many existing event detection system models, is confirmed to be a crucial part of the system and can be modelled by a discrete choice model to improve performance. The developed methodology is tested on real water quality data, showing improved performance in decreasing the number of false positive alarms and in its ability to detect events with higher probabilities, compared to previous studies. Copyright © 2015 Elsevier Ltd. All rights reserved.
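
    The fusion step can be sketched as a binary logit model over the single-indicator alarm probabilities. The coefficients below are invented for illustration; in the study they are calibrated jointly with the rest of the framework by maximum likelihood and genetic algorithms.

```python
import math

# Binary logit: map per-indicator alarm probabilities (e.g. pH, residual
# chlorine, turbidity) into one event probability instead of heuristically
# OR-ing the single alarms.
BETA0 = -4.0              # intercept: contamination events are rare a priori
BETAS = [3.0, 2.5, 1.5]   # assumed weight per water-quality indicator

def event_probability(indicator_alarms):
    utility = BETA0 + sum(b * p for b, p in zip(BETAS, indicator_alarms))
    return 1.0 / (1.0 + math.exp(-utility))

quiet = event_probability([0.05, 0.10, 0.00])  # all indicators calm
spike = event_probability([0.90, 0.85, 0.60])  # several indicators alarm together
print(quiet, spike)
```

    Because the coefficients are fitted rather than hand-tuned, indicators that alarm often for benign reasons receive smaller weights, which is how this fusion reduces false positives relative to simple heuristics.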

  15. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    Energy Technology Data Exchange (ETDEWEB)

    Grimm, Lars J., E-mail: Lars.grimm@duke.edu; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie [Department of Radiology, Duke University Medical Center, Box 3808, Durham, North Carolina 27710 (United States); Kuzmiak, Cherie M. [Department of Radiology, University of North Carolina School of Medicine, 2006 Old Clinic, CB No. 7510, Chapel Hill, North Carolina 27599 (United States); Mazurowski, Maciej A. [Duke University Medical Center, Box 2731 Medical Center, Durham, North Carolina 27710 (United States)

    2014-03-15

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739; 95% Confidence Interval: 0.543–0.680; p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  16. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features

    International Nuclear Information System (INIS)

    Grimm, Lars J.; Ghate, Sujata V.; Yoon, Sora C.; Kim, Connie; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-01-01

    Purpose: The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Methods: Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprising bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Results: Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502–0.739; 95% Confidence Interval: 0.543–0.680; p < 0.002). Conclusions: Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  17. Energy-Efficient Fault-Tolerant Dynamic Event Region Detection in Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Enemark, Hans-Jacob; Zhang, Yue; Dragoni, Nicola

    2015-01-01

    Fault-tolerant event detection is fundamental to wireless sensor network applications. Existing approaches usually adopt neighborhood collaboration for better detection accuracy, but need more energy consumption due to communication. Focusing on energy efficiency, this paper makes an improvement to a hybrid algorithm for dynamic event region detection, such as real-time tracking of chemical leakage regions. Considering the characteristics of moving-away dynamic events, we propose a return-back condition for the hybrid algorithm, from distributed neighborhood collaboration, in which a node makes its detection decision based on decisions received from its spatial and temporal neighbors, to local non-communicative decision making. The simulation results demonstrate that the improved algorithm does not degrade the detection accuracy of the original algorithm, while it has better energy efficiency.
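The neighborhood-collaboration mode described above is commonly realized as a majority vote over a node's own reading and its neighbors' decisions, so a single faulty sensor is outvoted. A minimal sketch (the paper's actual fusion rule may differ):

```python
def fused_decision(own_reading, neighbor_decisions):
    """Fault-tolerant event decision (sketch): fuse a node's binary
    reading (1 = event, 0 = no event) with its neighbors' decisions
    by majority vote, so one faulty sensor cannot flip the outcome."""
    votes = [own_reading] + list(neighbor_decisions)
    return sum(votes) > len(votes) / 2

# A faulty node reading 0 is corrected by four neighbors reporting 1.
print(fused_decision(0, [1, 1, 1, 1]))  # True
```

The return-back condition in the paper switches a node from this communicative mode to purely local decisions once the event region has moved away, saving the radio traffic the vote requires.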

  18. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    Science.gov (United States)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was
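The STA/LTA baseline that the AST algorithm was compared against can be sketched in a few lines; the window lengths, toy trace, and trigger threshold below are illustrative, not the study's settings:

```python
def sta_lta(signal, n_sta, n_lta):
    """Classic short-term average / long-term average detector: the
    ratio spikes when signal energy rises faster than the long-term
    background, flagging a candidate event onset."""
    ratios = []
    for i in range(n_lta, len(signal) + 1):
        sta = sum(abs(x) for x in signal[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in signal[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Flat background noise followed by a burst: the ratio rises well
# above 1 at the burst, where a threshold (e.g. 3) would trigger.
trace = [1.0] * 100 + [10.0] * 10
ratios = sta_lta(trace, n_sta=5, n_lta=50)
```

The weakness the abstract notes is visible here: the trigger threshold is fixed, so changing noise levels (as from electromagnetic interference) either swamp it with false triggers or hide real events, which is what motivates the adaptive, neighborhood-consistency tuning of AST.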

  19. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate), CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study

  20. Automatic Detection and Classification of Audio Events for Road Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Noor Almaadeed

    2018-06-01

    Full Text Available This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, research has produced several visual surveillance systems for road monitoring that detect accidents with the aim of improving safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs) to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy rate compared with methods that use individual temporal and spectral features.
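Time-domain features of the kind fused above can be computed per analysis frame; the sketch below shows two classic ones, short-time energy and zero-crossing rate (the paper additionally uses spectral and QTFD-derived features, which are not reproduced here):

```python
def frame_features(frame):
    """Short-time energy and zero-crossing rate for one audio frame.
    A high-frequency, noisy frame (e.g. tire skidding) yields a high
    zero-crossing rate; a low-frequency rumble yields a low one."""
    energy = sum(x * x for x in frame) / len(frame)
    zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / (len(frame) - 1)
    return energy, zcr

print(frame_features([0.5, -0.5, 0.5, -0.5]))  # (0.25, 1.0)
```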

  1. The Cognitive Processes Underlying Event-Based Prospective Memory In School Age Children and Young Adults: A Formal Model-Based Study

    OpenAIRE

    Smith, Rebekah E.; Bayen, Ute Johanna; Martin, Claudia

    2010-01-01

    Fifty 7-year-olds (29 female), 53 10-year-olds (29 female), and 36 young adults (19 female) performed a computerized event-based prospective memory task. All three groups differed significantly in prospective memory performance, with adults showing the best performance and 7-year-olds the poorest performance. We used a formal multinomial process tree model of event-based prospective memory to decompose age differences in cognitive processes that jointly contribute to prospective memory performance...

  2. Event-specific qualitative and quantitative detection of five genetically modified rice events using a single standard reference molecule.

    Science.gov (United States)

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Shin, Min-Ki; Moon, Gui-Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2017-07-01

    One novel standard reference plasmid, namely pUC-RICE5, was constructed as a positive control and calibrator for event-specific qualitative and quantitative detection of genetically modified (GM) rice (Bt63, Kemingdao1, Kefeng6, Kefeng8, and LLRice62). pUC-RICE5 contained fragments of a rice-specific endogenous reference gene (sucrose phosphate synthase) as well as the five GM rice events. An existing qualitative PCR assay approach was modified using pUC-RICE5 to create a quantitative method with limits of detection correlating to approximately 1-10 copies of rice haploid genomes. In this quantitative PCR assay, the square regression coefficients ranged from 0.993 to 1.000. The standard deviation and relative standard deviation values for repeatability ranged from 0.02 to 0.22 and 0.10% to 0.67%, respectively. The Ministry of Food and Drug Safety (Korea) validated the method and the results suggest it could be used routinely to identify five GM rice events. Copyright © 2017 Elsevier Ltd. All rights reserved.
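Quantification against a calibrator plasmid such as pUC-RICE5 follows the usual Ct-versus-log10(copies) standard-curve regression. A sketch with hypothetical calibration values (not the validated assay's parameters):

```python
def copies_from_ct(ct, slope, intercept):
    """Estimate template copy number from a qPCR Ct value using a
    standard-curve fit Ct = slope * log10(copies) + intercept.
    The slope and intercept below are hypothetical calibration
    numbers, not those of the pUC-RICE5 assay."""
    return 10 ** ((ct - intercept) / slope)

# A slope of -3.32 corresponds to ~100% amplification efficiency.
print(round(copies_from_ct(28.04, slope=-3.32, intercept=38.0)))  # 1000
```

The square regression coefficients of 0.993-1.000 reported above describe how well the calibration points fit such a line; the limit of detection is the lowest copy number the curve still resolves reliably.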

  3. Impact of sensor detection limits on protecting water distribution systems from contamination events

    International Nuclear Information System (INIS)

    McKenna, Sean Andrew; Hart, David Blaine; Yarrington, Lane

    2006-01-01

    Real-time water quality sensors are becoming commonplace in water distribution systems. However, field deployable, contaminant-specific sensors are still in the development stage. As development proceeds, the necessary operating parameters of these sensors must be determined to protect consumers from accidental and malevolent contamination events. This objective can be quantified in several different ways including minimization of: the time necessary to detect a contamination event, the population exposed to contaminated water, the extent of the contamination within the network, and others. We examine the ability of a sensor set to meet these objectives as a function of both the detection limit of the sensors and the number of sensors in the network. A moderately sized distribution network is used as an example and different sized sets of randomly placed sensors are considered. For each combination of a certain number of sensors and a detection limit, the mean values of the different objectives across multiple random sensor placements are calculated. The tradeoff between the necessary detection limit in a sensor and the number of sensors is evaluated. Results show that for the example problem examined here, a sensor detection limit of 0.01 of the average source concentration is adequate for maximum protection. Detection of events is dependent on the detection limit of the sensors, but for those events that are detected, the values of the performance measures are not a function of the sensor detection limit. The results of replacing a single sensor in a network with a sensor having a much lower detection limit show that while this replacement can improve results, the majority of the additional events detected had performance measures of relatively low consequence.

  4. Insertable cardiac event recorder in detection of atrial fibrillation after cryptogenic stroke: an audit report.

    Science.gov (United States)

    Etgen, Thorleif; Hochreiter, Manfred; Mundel, Markus; Freudenberger, Thomas

    2013-07-01

    Atrial fibrillation (AF) is the most frequent risk factor in ischemic stroke but often remains undetected. We analyzed the value of an insertable cardiac event recorder for detection of AF in a 1-year cohort of patients with cryptogenic ischemic stroke. All patients with cryptogenic stroke who were eligible for oral anticoagulation were offered the insertion of a cardiac event recorder. Regular follow-up for 1 year recorded the incidence of AF. Of the 393 patients with ischemic stroke, 65 (16.5%) had a cryptogenic stroke, and in 22 eligible patients an event recorder was inserted. After 1 year, AF was detected in 6 of 22 patients (27.3%). These preliminary data show that insertion of a cardiac event recorder was feasible in approximately one third of patients with cryptogenic stroke and detected new AF in approximately one quarter of these patients.

  5. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy ratio, and polarization rectilinearity. These features are calculated in a moving window of the same length for the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
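Moving-window features like those listed above can be computed as follows; rectilinearity is omitted because it requires 3-component data, so this is only a partial sketch of the network's input vector:

```python
def window_features(window):
    """Per-window inputs of the kind fed to the feed-forward network:
    average absolute amplitude, variance, and energy. Polarization
    rectilinearity (3-component) is omitted from this sketch."""
    n = len(window)
    mean = sum(window) / n
    avg_abs = sum(abs(x) for x in window) / n
    variance = sum((x - mean) ** 2 for x in window) / n
    energy = sum(x * x for x in window)
    return avg_abs, variance, energy

print(window_features([1.0, -1.0, 1.0, -1.0]))  # (1.0, 1.0, 4.0)
```

In the paper's scheme, these per-window vectors are evaluated along the whole waveform, and the network maps each one to a relative event probability rather than a hard 0/1 trigger.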

  6. Real-time detection and classification of anomalous events in streaming data

    Science.gov (United States)

    Ferragut, Erik M.; Goodall, John R.; Iannacone, Michael D.; Laska, Jason A.; Harrison, Lane T.

    2016-04-19

    A system is described for receiving a stream of events and scoring the events based on anomalousness and maliciousness (or other classification). The events can be displayed to a user in user-defined groupings in an animated fashion. The system can include a plurality of anomaly detectors that together implement an algorithm to identify low probability events and detect atypical traffic patterns. The atypical traffic patterns can then be classified as being of interest or not. In one particular example, in a network environment, the classification can be whether the network traffic is malicious or not.
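Scoring events by anomalousness is often done as the negative log of an event's empirical probability, which makes scores from disparate sources (e.g. network flows and firewall logs) comparable. A minimal sketch, not the patented system's actual scoring function:

```python
import math

def anomaly_score(event_count, total_events, floor=1e-9):
    """Rareness score (sketch): negative log of an event type's
    empirical probability, so low-probability events score high and
    scores from different data sources share a common scale."""
    p = max(event_count / total_events, floor)
    return -math.log(p)

# A once-in-10,000 event outranks one seen half the time.
print(anomaly_score(1, 10000) > anomaly_score(5000, 10000))  # True
```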

  7. Predicting error in detecting mammographic masses among radiology trainees using statistical models based on BI-RADS features.

    Science.gov (United States)

    Grimm, Lars J; Ghate, Sujata V; Yoon, Sora C; Kuzmiak, Cherie M; Kim, Connie; Mazurowski, Maciej A

    2014-03-01

    The purpose of this study is to explore Breast Imaging-Reporting and Data System (BI-RADS) features as predictors of individual errors made by trainees when detecting masses in mammograms. Ten radiology trainees and three expert breast imagers reviewed 100 mammograms comprised of bilateral medial lateral oblique and craniocaudal views on a research workstation. The cases consisted of normal and biopsy-proven benign and malignant masses. For cases with actionable abnormalities, the experts recorded breast (density and axillary lymph nodes) and mass (shape, margin, and density) features according to the BI-RADS lexicon, as well as the abnormality location (depth and clock face). For each trainee, a user-specific multivariate model was constructed to predict the trainee's likelihood of error based on BI-RADS features. The performance of the models was assessed using the area under the receiver operating characteristic curve (AUC). Despite the variability in errors between different trainees, the individual models were able to predict the likelihood of error for the trainees with a mean AUC of 0.611 (range: 0.502-0.739, 95% Confidence Interval: 0.543-0.680, p < 0.002). Patterns in detection errors for mammographic masses made by radiology trainees can be modeled using BI-RADS features. These findings may have potential implications for the development of future educational materials that are personalized to individual trainees.

  8. Early snowmelt events: detection, distribution, and significance in a major sub-arctic watershed

    International Nuclear Information System (INIS)

    Semmens, Kathryn Alese; Ramage, Joan; Bartsch, Annett; Liston, Glen E

    2013-01-01

    High latitude drainage basins are experiencing higher average temperatures, earlier snowmelt onset in spring, and an increase in rain on snow (ROS) events in winter, trends that climate models project into the future. Snowmelt-dominated basins are most sensitive to winter temperature increases that influence the frequency of ROS events and the timing and duration of snowmelt, resulting in changes to spring runoff. Of specific interest in this study are early melt events that occur in late winter preceding melt onset in the spring. The study focuses on satellite determination and characterization of these early melt events using the Yukon River Basin (Canada/USA) as a test domain. The timing of these events was estimated using data from passive (Advanced Microwave Scanning Radiometer—EOS (AMSR-E)) and active (SeaWinds on Quick Scatterometer (QuikSCAT)) microwave remote sensors, employing detection algorithms for brightness temperature (AMSR-E) and radar backscatter (QuikSCAT). The satellite detected events were validated with ground station meteorological and hydrological data, and the spatial and temporal variability of the events across the entire river basin was characterized. Possible causative factors for the detected events, including ROS, fog, and positive air temperatures, were determined by comparing the timing of the events to parameters from SnowModel and National Centers for Environmental Prediction North American Regional Reanalysis (NARR) outputs, and weather station data. All melt events coincided with above freezing temperatures, while a limited number corresponded to ROS (determined from SnowModel and ground data) and a majority to fog occurrence (determined from NARR). The results underscore the significant influence that warm air intrusions have on melt in some areas and demonstrate the large temporal and spatial variability over years and regions. The study provides a method for melt detection and a baseline from which to assess future change.

  9. Joint model-based clustering of nonlinear longitudinal trajectories and associated time-to-event data analysis, linked by latent class membership: with application to AIDS clinical studies.

    Science.gov (United States)

    Huang, Yangxin; Lu, Xiaosun; Chen, Jiaqing; Liang, Juan; Zangmeister, Miriam

    2017-10-27

    Longitudinal and time-to-event data are often observed together. Finite mixture models are currently used to analyze nonlinear heterogeneous longitudinal data, which, by releasing the homogeneity restriction of nonlinear mixed-effects (NLME) models, can cluster individuals into one of the pre-specified classes with class membership probabilities. This clustering may have clinical significance, and be associated with clinically important time-to-event data. This article develops a joint modeling approach to a finite mixture of NLME models for longitudinal data and proportional hazard Cox model for time-to-event data, linked by individual latent class indicators, under a Bayesian framework. The proposed joint models and method are applied to a real AIDS clinical trial data set, followed by simulation studies to assess the performance of the proposed joint model and a naive two-step model, in which finite mixture model and Cox model are fitted separately.

  10. Contamination Event Detection with Multivariate Time-Series Data in Agricultural Water Monitoring

    Directory of Open Access Journals (Sweden)

    Yingchi Mao

    2017-12-01

    Full Text Available Time series data of multiple water quality parameters are obtained from the water sensor networks deployed in the agricultural water supply network. The accurate and efficient detection of contamination events, with warning to prevent pollution from spreading, is one of the most important issues when pollution occurs. In order to comprehensively reduce event detection deviation, a spatial–temporal-based event detection approach with multivariate time-series data for water quality monitoring (M-STED) was proposed. The M-STED approach includes three parts. First, M-STED adopts a Rule K algorithm to select backbone nodes as the nodes in the CDS (connected dominating set) and forward the sensed data of multiple water parameters. Second, the state of each backbone node is determined with back propagation neural network models and sequential Bayesian analysis in the current timestamp. Third, a spatial model with Bayesian networks estimates the state of the backbones in the next timestamp and traces the "outlier" node to its neighborhoods to detect a contamination event. The experimental results indicate that the average detection rate is more than 80% with M-STED and the false detection rate is lower than 9%. The M-STED approach can improve the rate of detection by about 40% and reduce the false alarm rate by about 45%, compared with the event detection with a single water parameter algorithm, S-STED. Moreover, the proposed M-STED can exhibit better performance in terms of detection delay and scalability.
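The sequential Bayesian analysis used to determine a backbone node's state can be sketched as a per-observation posterior update; the prior and likelihood values below are purely illustrative:

```python
def posterior_event(prior, p_obs_given_event, p_obs_given_normal):
    """One step of sequential Bayesian analysis: update the
    contamination-event probability of a node after observing one
    reading, given the likelihood of that reading under the event
    and normal hypotheses (values here are illustrative)."""
    num = prior * p_obs_given_event
    return num / (num + (1 - prior) * p_obs_given_normal)

# Three consecutive outlier-like readings push a 10% prior toward 1.
p = 0.1
for _ in range(3):
    p = posterior_event(p, p_obs_given_event=0.9, p_obs_given_normal=0.2)
print(round(p, 3))  # 0.91
```

Accumulating evidence across timestamps this way is what lets M-STED suppress single-sample glitches while still reacting quickly to a sustained contamination signature.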

  11. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  12. On Event/Time Triggered and Distributed Analysis of a WSN System for Event Detection, Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Sofia Maria Dima

    2016-01-01

    Full Text Available Event detection in realistic WSN environments is a critical research domain, and environmental monitoring is one of its most prominent applications. Although efforts related to environmental applications have been presented in the current literature, there is a significant lack of investigation into the performance of such systems when applied in wireless environments. To address this shortage, this paper follows an advanced multimodal approach based on fuzzy logic. The proposed fuzzy inference system (FIS) is implemented on TelosB motes and evaluates the probability of fire detection while aiming towards power conservation. In addition to a straightforward centralized approach, a distributed implementation of the above FIS is also proposed, aiming towards network congestion reduction while optimally distributing the energy consumption among network nodes so as to maximize network lifetime. Moreover, this work proposes an event-based execution of the aforementioned FIS, aiming to further reduce the computational as well as the communication cost compared to a periodic, time-triggered FIS execution. As a final contribution, performance metrics acquired from all the proposed FIS implementation techniques are thoroughly compared and analyzed with respect to critical network conditions, aiming to offer a realistic evaluation and thus the extraction of objective conclusions.

  13. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactively query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.

  14. On-line detection of apnea/hypopnea events using SpO2 signal: a rule-based approach employing binary classifier models.

    Science.gov (United States)

    Koley, Bijoy Laxmi; Dey, Debangshu

    2014-01-01

    This paper presents an online method for automatic detection of apnea/hypopnea events, with the help of oxygen saturation (SpO2) signal, measured at fingertip by Bluetooth nocturnal pulse oximeter. Event detection is performed by identifying abnormal data segments from the recorded SpO2 signal, employing a binary classifier model based on a support vector machine (SVM). Thereafter the abnormal segment is further analyzed to detect different states within the segment, i.e., steady, desaturation, and resaturation, with the help of another SVM-based binary ensemble classifier model. Finally, a heuristically obtained rule-based system is used to identify the apnea/hypopnea events from the time-sequenced decisions of these classifier models. In the developmental phase, a set of 34 time domain-based features was extracted from the segmented SpO2 signal using an overlapped windowing technique. Later, an optimal set of features was selected on the basis of recursive feature elimination technique. A total of 34 subjects were included in the study. The results show average event detection accuracies of 96.7% and 93.8% for the offline and the online tests, respectively. The proposed system provides direct estimation of the apnea/hypopnea index with the help of a relatively inexpensive and widely available pulse oximeter. Moreover, the system can be monitored and accessed by physicians through LAN/WAN/Internet and can be extended to deploy in Bluetooth-enabled mobile phones.
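The final rule-based stage above turns the classifiers' time-sequenced decisions into event counts. As a hypothetical simplification (not the paper's exact rule set), an apnea/hypopnea event can be counted wherever a desaturation run is immediately followed by a resaturation run:

```python
def count_events(state_sequence):
    """Hypothetical simplification of the rule-based stage: count an
    apnea/hypopnea event wherever a 'desat' state is immediately
    followed by a 'resat' state in the time-sequenced decisions
    produced by the SVM classifiers."""
    return sum(1 for prev, cur in zip(state_sequence, state_sequence[1:])
               if prev == "desat" and cur == "resat")

states = ["steady", "desat", "desat", "resat", "steady", "desat", "resat"]
print(count_events(states))  # 2
```

Dividing such a count by the hours of recording gives the apnea/hypopnea index that the system estimates directly from the pulse oximeter.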

  15. Detection of unusual events and trends in complex non-stationary data streams

    International Nuclear Information System (INIS)

    Charlton-Perez, C.; Perez, R.B.; Protopopescu, V.; Worley, B.A.

    2011-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for diverse applications, ranging from power plant operation to homeland security. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden events inside intermittent signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method.

  16. Method for detecting binding events using micro-X-ray fluorescence spectrometry

    Science.gov (United States)

    Warner, Benjamin P.; Havrilla, George J.; Mann, Grace

    2010-12-28

    Method for detecting binding events using micro-X-ray fluorescence spectrometry. Receptors are exposed to at least one potential binder and arrayed on a substrate support. Each member of the array is exposed to X-ray radiation. The magnitude of a detectable X-ray fluorescence signal for at least one element can be used to determine whether a binding event between a binder and a receptor has occurred, and can provide information related to the extent of binding between the binder and receptor.

  17. Why conventional detection methods fail in identifying the existence of contamination events.

    Science.gov (United States)

    Liu, Shuming; Li, Ruonan; Smith, Kate; Che, Han

    2016-04-15

    Early warning systems are widely used to safeguard water security, but their effectiveness has raised many questions. To understand why conventional detection methods fail to identify contamination events, this study evaluates the performance of three contamination detection methods using data from a real contamination accident and two artificial datasets constructed using a widely applied contamination data construction approach. Results show that the Pearson correlation Euclidean distance (PE) based detection method performs better for real contamination incidents, while the Euclidean distance method (MED) and linear prediction filter (LPF) method are more suitable for detecting sudden spike-like variation. This analysis revealed why the conventional MED and LPF methods failed to identify existence of contamination events. The analysis also revealed that the widely used contamination data construction approach is misleading. Copyright © 2016 Elsevier Ltd. All rights reserved.
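The Euclidean distance (MED) style of detection evaluated above can be sketched as a distance-from-baseline check on each multivariate water-quality sample; the parameters, toy readings, and threshold are illustrative:

```python
import math

def distance_alarm(history, sample, threshold):
    """Euclidean-distance detection (sketch): flag a multivariate
    sample whose distance from the mean of a recent baseline window
    exceeds a threshold. Suited to sudden spike-like variation, per
    the study's findings."""
    dims = len(sample)
    mean = [sum(row[d] for row in history) / len(history) for d in range(dims)]
    dist = math.sqrt(sum((sample[d] - mean[d]) ** 2 for d in range(dims)))
    return dist > threshold

baseline = [(7.0, 0.5), (7.1, 0.5), (6.9, 0.5)]   # e.g. (pH, chlorine)
print(distance_alarm(baseline, (4.0, 0.1), threshold=1.0))  # True
```

The study's point is visible in the design: this check responds to abrupt jumps away from the baseline, which is why it misses the slower, correlated drifts seen in the real contamination accident that the PE-based method caught.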

  18. Effect of parameters in moving average method for event detection enhancement using phase sensitive OTDR

    Science.gov (United States)

    Kwon, Yong-Seok; Naeem, Khurram; Jeon, Min Yong; Kwon, Il-bum

    2017-04-01

    We analyze the relations of the parameters in the moving average method to enhance the event detectability of a phase sensitive optical time domain reflectometer (OTDR). If an external event has a unique vibration frequency, the control parameters of the moving average method should be optimized in order to detect the event efficiently. A phase sensitive OTDR was implemented with a pulsed light source, which is composed of a laser diode, a semiconductor optical amplifier, an erbium-doped fiber amplifier, and a fiber Bragg grating filter, and a light receiving part, which has a photo-detector and a high speed data acquisition system. The moving average method operates with three control parameters: the total number of raw traces, M; the number of averaged traces, N; and the step size of the moving window, n. The raw traces are obtained by the phase sensitive OTDR with sound signals generated by a speaker. Using these trace data, the relation of the control parameters is analyzed. The results show that if the event signal has a single frequency, optimal values of N and n exist for detecting the event efficiently.
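The moving average operation with parameters M, N, and n can be sketched directly; toy traces stand in for the phase-sensitive OTDR data:

```python
def moving_average(traces, N, n):
    """Moving average over the M raw traces (M = len(traces)):
    average N consecutive traces sample-by-sample, then slide the
    window forward by a step of n traces."""
    out = []
    for start in range(0, len(traces) - N + 1, n):
        group = traces[start:start + N]
        out.append([sum(t[i] for t in group) / N for i in range(len(group[0]))])
    return out

raw = [[0, 0], [2, 2], [4, 4], [6, 6]]   # M = 4 toy traces
print(moving_average(raw, N=2, n=1))     # [[1.0, 1.0], [3.0, 3.0], [5.0, 5.0]]
```

The trade-off the paper analyzes follows from this structure: larger N suppresses more noise but also averages out a vibration whose period is short relative to N traces, so N and n must be matched to the event's frequency.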

  19. Use of wireless sensor networks for distributed event detection in disaster management applications

    NARCIS (Netherlands)

    Bahrepour, M.; Meratnia, Nirvana; Poel, Mannes; Taghikhaki, Zahra; Havinga, Paul J.M.

    Recently, wireless sensor networks (WSNs) have become mature enough to go beyond being simple fine-grained continuous monitoring platforms and have become one of the enabling technologies for early-warning disaster systems. Event detection functionality of WSNs can be of great help and importance

  20. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network-based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset
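    As a rough illustration of the architecture described (a single-hidden-layer feed-forward network scoring a waveform window as event/no-event), here is a minimal forward pass in NumPy; the layer sizes, activations, and random weights are hypothetical — a real detector would be trained on labeled waveforms:

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """Single-hidden-layer feed-forward pass: tanh hidden units, sigmoid output."""
    h = np.tanh(x @ W1 + b1)                      # hidden layer activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))      # event probability in (0, 1)
    return p

rng = np.random.default_rng(1)
# Hypothetical sizes: a 32-sample waveform window, 8 hidden units
W1, b1 = rng.normal(size=(32, 8)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)) * 0.1, np.zeros(1)
window = rng.normal(size=(1, 32))
p = mlp_forward(window, W1, b1, W2, b2)
print(0.0 < p[0, 0] < 1.0)  # True
```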

  1. Process variant comparison: using event logs to detect differences in behavior and business rules

    NARCIS (Netherlands)

    Bolt, A.; de Leoni, M.; van der Aalst, W.M.P.

    2018-01-01

    This paper addresses the problem of comparing different variants of the same process. We aim to detect relevant differences between processes based on what was recorded in event logs. We use transition systems to model behavior and to highlight differences. Transition systems are annotated with

  2. A novel seizure detection algorithm informed by hidden Markov model event states

    Science.gov (United States)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
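    The predictive state assignment step — modeling each learned event state as a multidimensional Gaussian and assigning new iEEG feature vectors to the most likely state — can be sketched as below; the two states and their parameters are invented for illustration and are not taken from the study:

```python
import numpy as np
from scipy.stats import multivariate_normal

def assign_state(x, states):
    """Assign feature vector x to the event state with the highest Gaussian log-likelihood.

    states: list of (mean, cov) tuples, one per learned event state.
    """
    logp = [multivariate_normal.logpdf(x, mean=m, cov=c) for m, c in states]
    return int(np.argmax(logp))

# Two hypothetical 2-D event states: "baseline" and "seizure-onset"
baseline = (np.array([0.0, 0.0]), np.eye(2))
onset = (np.array([5.0, 5.0]), np.eye(2))
print(assign_state(np.array([4.8, 5.2]), [baseline, onset]))  # 1
```

    In the actual method the states themselves are discovered nonparametrically from the data; only states found to be specific to seizure onset would trigger a detection.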

  3. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held in October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus) «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  4. Event detection and exception handling strategies in the ASDEX Upgrade discharge control system

    International Nuclear Information System (INIS)

    Treutterer, W.; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2013-01-01

    Highlights: •Event detection and exception handling is integrated in control system architecture. •Pulse control with local exception handling and pulse supervision with central exception handling are strictly separated. •Local exception handling limits the effect of an exception to a minimal part of the controlled system. •Central Exception Handling solves problems requiring coordinated action of multiple control components. -- Abstract: Thermonuclear plasmas are governed by nonlinear characteristics: plasma operation can be classified into scenarios with pronounced features like L and H-mode, ELMs or MHD activity. Transitions between them may be treated as events. Similarly, technical systems are also subject to events such as failure of measurement sensors, actuator saturation or violation of machine and plant operation limits. Such situations often are handled with a mixture of pulse abortion and iteratively improved pulse schedule reference programming. In case of protection-relevant events, however, the complexity of even a medium-sized device as ASDEX Upgrade requires a sophisticated and coordinated shutdown procedure rather than a simple stop of the pulse. The detection of events and their intelligent handling by the control system has been shown to be valuable also in terms of saving experiment time and cost. This paper outlines how ASDEX Upgrade's discharge control system (DCS) detects events and handles exceptions in two stages: locally and centrally. The goal of local exception handling is to limit the effect of an unexpected or asynchronous event to a minimal part of the controlled system. Thus, local exception handling facilitates robustness to failures but keeps the decision structures lean. A central state machine deals with exceptions requiring coordinated action of multiple control components. DCS implements the state machine by means of pulse schedule segments containing pre-programmed waveforms to define discharge goal and control

  5. Event detection and exception handling strategies in the ASDEX Upgrade discharge control system

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, W., E-mail: Wolfgang.Treutterer@ipp.mpg.de; Neu, G.; Rapson, C.; Raupp, G.; Zasche, D.; Zehetbauer, T.

    2013-10-15

    Highlights: •Event detection and exception handling is integrated in control system architecture. •Pulse control with local exception handling and pulse supervision with central exception handling are strictly separated. •Local exception handling limits the effect of an exception to a minimal part of the controlled system. •Central Exception Handling solves problems requiring coordinated action of multiple control components. -- Abstract: Thermonuclear plasmas are governed by nonlinear characteristics: plasma operation can be classified into scenarios with pronounced features like L and H-mode, ELMs or MHD activity. Transitions between them may be treated as events. Similarly, technical systems are also subject to events such as failure of measurement sensors, actuator saturation or violation of machine and plant operation limits. Such situations often are handled with a mixture of pulse abortion and iteratively improved pulse schedule reference programming. In case of protection-relevant events, however, the complexity of even a medium-sized device as ASDEX Upgrade requires a sophisticated and coordinated shutdown procedure rather than a simple stop of the pulse. The detection of events and their intelligent handling by the control system has been shown to be valuable also in terms of saving experiment time and cost. This paper outlines how ASDEX Upgrade's discharge control system (DCS) detects events and handles exceptions in two stages: locally and centrally. The goal of local exception handling is to limit the effect of an unexpected or asynchronous event to a minimal part of the controlled system. Thus, local exception handling facilitates robustness to failures but keeps the decision structures lean. A central state machine deals with exceptions requiring coordinated action of multiple control components. DCS implements the state machine by means of pulse schedule segments containing pre-programmed waveforms to define discharge goal and control

  6. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Cards and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were otherwise undetectable. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well-placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
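    The abstract does not specify which statistical signal-detection method was applied; one standard disproportionality statistic used in pharmacovigilance is the proportional reporting ratio (PRR), sketched here with hypothetical report counts:

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR for a drug-event pair from a 2x2 table of report counts.

    a: reports mentioning the drug and the event   b: the drug, other events
    c: other drugs with the event                  d: other drugs, other events
    """
    rate_drug = a / (a + b)      # event rate among the drug's reports
    rate_other = c / (c + d)     # event rate among all other drugs' reports
    return rate_drug / rate_other

# Hypothetical counts: the pair is reported disproportionately often
prr = proportional_reporting_ratio(a=20, b=80, c=40, d=860)
print(round(prr, 2))  # 4.5
```

    A PRR well above 1 (often combined with minimum report counts and a chi-squared criterion) flags a drug-event pair as a signal for expert review.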

  7. Ontology-based knowledge management for personalized adverse drug events detection.

    Science.gov (United States)

    Cao, Feng; Sun, Xingzhi; Wang, Xiaoyuan; Li, Bo; Li, Jing; Pan, Yue

    2011-01-01

    Since adverse drug events (ADEs) have become a leading cause of death around the world, there is high demand for helping clinicians or patients identify possible hazards from drug effects. Motivated by this, we present a personalized ADE detection system, with a focus on applying ontology-based knowledge management techniques to enhance ADE detection services. The development of electronic health records makes it possible to automate personalized ADE detection, i.e., to take patient clinical conditions into account during ADE detection. Specifically, we define an ADE ontology to uniformly manage the ADE knowledge from multiple sources. We take advantage of the rich semantics of the terminology SNOMED-CT and apply them to ADE detection via semantic query and reasoning.

  8. Detections of Planets in Binaries Through the Channel of Chang–Refsdal Gravitational Lensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Han, Cheongho [Department of Physics, Chungbuk National University, Cheongju 361-763 (Korea, Republic of); Shin, In-Gu; Jung, Youn Kil [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States)

    2017-02-01

    Chang–Refsdal (C–R) lensing, which refers to the gravitational lensing of a point mass perturbed by a constant external shear, provides a good approximation for describing the lensing behaviors of either a very wide or a very close binary lens. C–R lensing events, which are identified by short-term anomalies near the peak of high-magnification lensing light curves, are routinely detected in lensing surveys, but little attention has been paid to them. In this paper, we point out that C–R lensing events provide an important channel to detect planets in binaries, both in close and wide binary systems. Detecting planets through the C–R lensing event channel is possible because the planet-induced perturbation occurs in the same region as the C–R lensing-induced anomaly, and thus the existence of the planet can be identified by the additional deviation in the central perturbation. By presenting an analysis of the actually observed C–R lensing event OGLE-2015-BLG-1319, we demonstrate that dense and high-precision coverage of a C–R lensing-induced perturbation can provide a strong constraint on the existence of a planet over a wide range of planet parameters. An enlarged sample of microlensing planets in binary systems will provide important observational constraints for shaping the details of planet formation, which have to date been restricted to the case of single stars.

  9. Fundamental aspects of seismic event detection, magnitude estimation and their interrelation

    International Nuclear Information System (INIS)

    Ringdal, F.

    1977-01-01

    The main common subject of the papers forming this thesis is statistical model development within the seismological disciplines of seismic event detection and event magnitude estimation. As more high quality seismic data become available as a result of recent seismic network developments, the opportunity will exist for large scale application and further refinement of these models. It is hoped that the work presented here will facilitate improved understanding of the basic issues, both within earthquake-explosion discrimination, in the framework of which most of this work originated, and in seismology in general. (Auth.)

  10. Facilitating adverse drug event detection in pharmacovigilance databases using molecular structure similarity: application to rhabdomyolysis

    Science.gov (United States)

    Vilar, Santiago; Harpaz, Rave; Chase, Herbert S; Costanzi, Stefano; Rabadan, Raul

    2011-01-01

    Background Adverse drug events (ADE) cause considerable harm to patients, and consequently their detection is critical for patient safety. The US Food and Drug Administration maintains an adverse event reporting system (AERS) to facilitate the detection of ADE in drugs. Various data mining approaches have been developed that use AERS to detect signals identifying associations between drugs and ADE. The signals must then be monitored further by domain experts, which is a time-consuming task. Objective To develop a new methodology that combines existing data mining algorithms with chemical information by analysis of molecular fingerprints to enhance initial ADE signals generated from AERS, and to provide a decision support mechanism to facilitate the identification of novel adverse events. Results The method achieved a significant improvement in precision in identifying known ADE, and a more than twofold signal enhancement when applied to the ADE rhabdomyolysis. The simplicity of the method assists in highlighting the etiology of the ADE by identifying structurally similar drugs. A set of drugs with strong evidence from both AERS and molecular fingerprint-based modeling is constructed for further analysis. Conclusion The results demonstrate that the proposed methodology could be used as a pharmacovigilance decision support tool to facilitate ADE detection. PMID:21946238
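    Molecular-fingerprint similarity of the kind used above to enhance ADE signals is commonly measured with the Tanimoto coefficient; a minimal sketch on hypothetical fingerprints, represented as sets of on-bits:

```python
def tanimoto(fp1, fp2):
    """Tanimoto similarity between two binary molecular fingerprints.

    Fingerprints are given as sets of on-bit indices; the coefficient is
    |intersection| / |union|, ranging from 0 (disjoint) to 1 (identical).
    """
    union = len(fp1 | fp2)
    return len(fp1 & fp2) / union if union else 0.0

# Hypothetical on-bit sets for two drugs
drug_a = {1, 4, 7, 9, 12}
drug_b = {1, 4, 9, 15}
print(tanimoto(drug_a, drug_b))  # 0.5
```

    Ranking the drugs already associated with an ADE by similarity to a candidate drug is one way such chemical information can support the initial AERS-derived signal.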

  11. Microfluidic Arrayed Lab-On-A-Chip for Electrochemical Capacitive Detection of DNA Hybridization Events.

    Science.gov (United States)

    Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza

    2017-01-01

    A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual-layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing model that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events, the device is able to demonstrate specific and dose-response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in in-field environmental monitoring.

  12. Simultaneous Event-Triggered Fault Detection and Estimation for Stochastic Systems Subject to Deception Attacks.

    Science.gov (United States)

    Li, Yunji; Wu, QingE; Peng, Li

    2018-01-23

    In this paper, a synthesized design of a fault-detection filter and fault estimator is considered for a class of discrete-time stochastic systems in the framework of an event-triggered transmission scheme subject to unknown disturbances and deception attacks. A random variable obeying the Bernoulli distribution is employed to characterize the phenomena of randomly occurring deception attacks. To achieve a fault-detection residual that is sensitive only to faults while robust to disturbances, a coordinate transformation approach is exploited. This approach transforms the considered system into two subsystems, and the unknown disturbances are removed from one of the subsystems. The gain of the fault-detection filter is derived by minimizing an upper bound on the filter error covariance. Meanwhile, system faults can be reconstructed by the remote fault estimator. A recursive approach is developed to obtain the fault estimator gains as well as guarantee the fault estimator performance. Furthermore, the corresponding event-triggered sensor data transmission scheme is also presented for improving the working life of the wireless sensor node when measurement information is transmitted aperiodically. Finally, a scaled version of an industrial system consisting of a local PC, remote estimator and wireless sensor node is used to experimentally evaluate the proposed theoretical results. In particular, a novel fault-alarming strategy is proposed so that the real-time capacity of fault detection is guaranteed when the event condition is triggered.
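    The randomly occurring deception attacks are characterized by a Bernoulli random variable; a minimal simulation of that attack model (the attack probability, bias, and measurement values below are illustrative, not from the paper):

```python
import random

def measurements_under_attack(y, attack_prob, bias, rng=random):
    """Corrupt each transmitted measurement with Bernoulli probability
    attack_prob, mimicking randomly occurring deception attacks that add
    a deceptive bias to the true value."""
    return [v + bias if rng.random() < attack_prob else v for v in y]

random.seed(3)
clean = [1.0] * 8                                   # true sensor readings
sent = measurements_under_attack(clean, attack_prob=0.4, bias=5.0)
print(sum(v != 1.0 for v in sent), "of 8 samples attacked")
```

    The detection filter in the paper must keep its residual sensitive to genuine faults while such intermittently corrupted measurements arrive.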

  13. Integrated hydraulic and organophosphate pesticide injection simulations for enhancing event detection in water distribution systems.

    Science.gov (United States)

    Schwartz, Rafi; Lahav, Ori; Ostfeld, Avi

    2014-10-15

    As a complementary step towards solving the general event detection problem of water distribution systems, injection of the organophosphate pesticides chlorpyrifos (CP) and parathion (PA) was simulated at various locations within example networks, and hydraulic parameters were calculated over a 24-h duration. The uniqueness of this study is that the chemical reactions and byproducts of the contaminants' oxidation were also simulated, as well as other indicative water quality parameters such as alkalinity, acidity, pH and the total concentration of free chlorine species. The information on the change in water quality parameters induced by the contaminant injection may facilitate on-line detection of an actual event involving this specific substance and pave the way to the development of a generic methodology for detecting events involving the introduction of pesticides into water distribution systems. Simulation of the contaminant injection was performed at several nodes within two different networks. For each injection, the concentrations of the relevant contaminants' mother and daughter species, free chlorine species and water quality parameters were simulated at nodes downstream of the injection location. The results indicate that injection of these substances can be detected under certain conditions by a very rapid drop in Cl2, functioning as the indicative parameter, as well as a drop in alkalinity concentration and a small decrease in pH, both functioning as supporting parameters, whose usage may reduce false positive alarms. Copyright © 2014 Elsevier Ltd. All rights reserved.
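    The detection logic suggested by these results — alarm on a rapid Cl2 drop (indicative parameter), confirmed by simultaneous drops in alkalinity and pH (supporting parameters) — might be sketched as follows; all thresholds and sample values are illustrative, not taken from the simulations:

```python
def detect_pesticide_event(cl2, alkalinity, ph,
                           cl2_drop=0.3, alk_drop=5.0, ph_drop=0.05):
    """Flag time steps where free chlorine falls sharply AND both supporting
    parameters also decrease, reducing false positives from Cl2 noise alone."""
    alarms = []
    for t in range(1, len(cl2)):
        if (cl2[t - 1] - cl2[t] >= cl2_drop
                and alkalinity[t - 1] - alkalinity[t] >= alk_drop
                and ph[t - 1] - ph[t] >= ph_drop):
            alarms.append(t)
    return alarms

cl2 = [1.0, 1.0, 0.4, 0.4]    # mg/L free chlorine; sharp drop at t=2
alk = [120, 120, 110, 110]    # mg/L as CaCO3
ph  = [7.8, 7.8, 7.6, 7.6]
print(detect_pesticide_event(cl2, alk, ph))  # [2]
```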

  14. Detection of visual events along the apparent motion trace in patients with paranoid schizophrenia.

    Science.gov (United States)

    Sanders, Lia Lira Olivier; Muckli, Lars; de Millas, Walter; Lautenschlager, Marion; Heinz, Andreas; Kathmann, Norbert; Sterzer, Philipp

    2012-07-30

    Dysfunctional prediction in sensory processing has been suggested as a possible causal mechanism in the development of delusions in patients with schizophrenia. Previous studies in healthy subjects have shown that while the perception of apparent motion can mask visual events along the illusory motion trace, such motion masking is reduced when events are spatio-temporally compatible with the illusion, and, therefore, predictable. Here we tested the hypothesis that this specific detection advantage for predictable target stimuli on the apparent motion trace is reduced in patients with paranoid schizophrenia. Our data show that, although target detection along the illusory motion trace is generally impaired, both patients and healthy control participants detect predictable targets more often than unpredictable targets. Patients had a stronger motion masking effect when compared to controls. However, patients showed the same advantage in the detection of predictable targets as healthy control subjects. Our findings reveal stronger motion masking but intact prediction of visual events along the apparent motion trace in patients with paranoid schizophrenia and suggest that the sensory prediction mechanism underlying apparent motion is not impaired in paranoid schizophrenia. Copyright © 2012. Published by Elsevier Ireland Ltd.

  15. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO)

    Directory of Open Access Journals (Sweden)

    Lixin Yan

    2016-07-01

    Full Text Available The ability to identify hazardous traffic events is already considered one of the most effective solutions for reducing the occurrence of crashes. Only certain particular hazardous traffic events have been studied in previous work, which was mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles.
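    The classification stage can be approximated with an SMO-family SVM solver such as the one behind scikit-learn's SVC (libsvm's solver is SMO-based); the feature names and synthetic data below are hypothetical stand-ins for the driving measurements, not the study's dataset:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Hypothetical feature rows: [speed std, brake-pressure std, steering accel.]
normal = rng.normal([0.5, 0.2, 0.1], 0.1, size=(50, 3))   # routine driving
hazard = rng.normal([2.0, 1.0, 0.8], 0.1, size=(50, 3))   # hazardous events
X = np.vstack([normal, hazard])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="rbf").fit(X, y)   # trained with libsvm's SMO-based solver
print(clf.predict([[1.9, 1.1, 0.7]]))  # [1]
```

    In the study, the Markov blanket step would first prune the feature set before such a classifier is fit, which is what the MB-SMO name refers to.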

  16. Latency and mode of error detection as reflected in Swedish licensee event reports

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola; Salo, Ilkka [Stockholm Univ., (Sweden). Dept. of Psychology

    2002-03-01

    Licensee event reports (LERs) from an industry provide important information feedback about safety to the industry itself, the regulators and to the public. LERs from four nuclear power reactors were analyzed to find out about detection times, mode of detection and qualitative differences in reports from different reactors. The reliability of the coding was satisfactory and measured as the covariance between the ratings from two independent judges. The results showed differences in detection time across the reactors. On the average about ten percent of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On the average 40 percent of the errors were detected in regular tests and 40 per cent through alarms. Operators found about 10 per cent of the errors through noticing something abnormal in the plant. The remaining errors were detected in various other ways. There were qualitative differences between the LERs from the different reactors reflecting the different conditions in the plants. The number of reports differed by a magnitude 1:2 between the different plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events to be able to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable in order to maintain safety of an industry and that the differences between plants found in the analyses of this study indicate how error reports can be used to initiate further investigations for improved safety.

  17. Latency and mode of error detection as reflected in Swedish licensee event reports

    International Nuclear Information System (INIS)

    Svenson, Ola; Salo, Ilkka

    2002-03-01

    Licensee event reports (LERs) from an industry provide important information feedback about safety to the industry itself, the regulators and to the public. LERs from four nuclear power reactors were analyzed to find out about detection times, mode of detection and qualitative differences in reports from different reactors. The reliability of the coding was satisfactory and measured as the covariance between the ratings from two independent judges. The results showed differences in detection time across the reactors. On the average about ten percent of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On the average 40 percent of the errors were detected in regular tests and 40 per cent through alarms. Operators found about 10 per cent of the errors through noticing something abnormal in the plant. The remaining errors were detected in various other ways. There were qualitative differences between the LERs from the different reactors reflecting the different conditions in the plants. The number of reports differed by a magnitude 1:2 between the different plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events to be able to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable in order to maintain safety of an industry and that the differences between plants found in the analyses of this study indicate how error reports can be used to initiate further investigations for improved safety

  18. Detection of Unusual Events and Trends in Complex Non-Stationary Data Streams

    International Nuclear Information System (INIS)

    Perez, Rafael B.; Protopopescu, Vladimir A.; Worley, Brian Addison; Perez, Cristina

    2006-01-01

    The search for unusual events and trends hidden in multi-component, nonlinear, non-stationary, noisy signals is extremely important for a host of different applications, ranging from nuclear power plant and electric grid operation to internet traffic and implementation of non-proliferation protocols. In the context of this work, we define an unusual event as a local signal disturbance and a trend as a continuous carrier of information added to and different from the underlying baseline dynamics. The goal of this paper is to investigate the feasibility of detecting hidden intermittent events inside non-stationary signal data sets corrupted by high levels of noise, by using the Hilbert-Huang empirical mode decomposition method

  19. Energy Reconstruction for Events Detected in TES X-ray Detectors

    Science.gov (United States)

    Ceballos, M. T.; Cardiel, N.; Cobo, B.

    2015-09-01

    The processing of the X-ray events detected by a TES (Transition Edge Sensor) device (such as the one that will be proposed in the ESA AO call for instruments for the Athena mission (Nandra et al. 2013) as a high spectral resolution instrument, X-IFU (Barret et al. 2013)) is a several-step procedure that starts with the detection of current pulses in a noisy signal and ends with their energy reconstruction. For this last stage, an energy calibration process is required to convert the pseudo-energies measured in the detector to the real energies of the incoming photons, accounting for possible nonlinearity effects in the detector. We present the details of the energy calibration algorithm we implemented as the last part of the Event Processing software that we are developing for the X-IFU instrument, which permits the calculation of the calibration constants in an analytical way.

  20. A novel CUSUM-based approach for event detection in smart metering

    Science.gov (United States)

    Zhu, Zhicheng; Zhang, Shuai; Wei, Zhiqiang; Yin, Bo; Huang, Xianqing

    2018-03-01

    Non-intrusive load monitoring (NILM) plays a significant role in raising consumer awareness of household electricity use and thereby reducing overall energy consumption in society. With regard to monitoring low-power loads, many researchers have introduced CUSUM into NILM systems, since the traditional event detection method is not as effective as expected. Because the original CUSUM faces limitations when a small shift stays below the threshold, we improve the test statistic so that the permissible deviation gradually rises as the data size increases. This paper proposes a novel event detection method and a corresponding criterion that can be used in NILM systems to recognize transient states and to help the labelling task. Its performance has been tested in a real scenario where eight different appliances are connected to the main line of electric power.
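    For reference, the classic one-sided CUSUM that the paper modifies looks like this, with a fixed allowance k and threshold h (the paper's variant instead lets the permissible deviation grow with the sample count); the power trace and parameter values are illustrative:

```python
def cusum_detect(signal, target, k=0.5, h=5.0):
    """Classic one-sided CUSUM: accumulate positive deviations from `target`
    beyond the allowance k; return the index of the first alarm where the
    statistic exceeds the threshold h, or None if no change is detected."""
    s = 0.0
    for t, x in enumerate(signal):
        s = max(0.0, s + (x - target) - k)
        if s > h:
            return t
    return None

# Baseline power around 100 W; a small appliance adds ~3 W from t=10
power = [100.0] * 10 + [103.0] * 10
print(cusum_detect(power, target=100.0))  # 12
```

    Note how a small persistent shift (3 W) still triggers after a few samples, whereas a simple per-sample threshold at 5 W would never fire — the property that makes CUSUM attractive for low-power loads.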

  1. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing. … 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection…

  2. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
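    A minimal version of the turn-on/turn-off edge detection step might look like this; the threshold and power trace are illustrative, and the paper's transient algorithm additionally analyzes the waveform shape around each edge:

```python
def detect_edges(power, threshold=30.0):
    """Label sample indices where power steps up (turn-on) or down (turn-off)
    by more than `threshold` watts between consecutive samples."""
    events = []
    for t in range(1, len(power)):
        delta = power[t] - power[t - 1]
        if delta > threshold:
            events.append((t, "on"))
        elif delta < -threshold:
            events.append((t, "off"))
    return events

trace = [50, 50, 250, 250, 250, 60, 60]   # a ~200 W appliance cycles once
print(detect_edges(trace))  # [(2, 'on'), (5, 'off')]
```

    Each detected edge would then be handed to the classification stage (frequency feature analysis plus an SVM) to decide which appliance switched.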

  3. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Science.gov (United States)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. Only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement, in order to segment moving objects. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
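
    The adaptive background-subtraction step can be sketched with an exponential running average, a minimal stand-in for the authors' technique; the blending factor `alpha` and deviation threshold `tau` below are illustrative assumptions.

```python
def update_background(bg, frame, alpha=0.05):
    # Exponential running average: slow adaptation absorbs illumination changes.
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def foreground_mask(bg, frame, tau=30.0):
    # Pixels deviating from the background model by more than tau are moving.
    return [abs(f - b) > tau for b, f in zip(bg, frame)]

bg = [100.0, 100.0, 100.0, 100.0]    # background model (4 pixels)
frame = [101.0, 99.0, 200.0, 100.0]  # pixel 2 is covered by a moving object
print(foreground_mask(bg, frame))    # [False, False, True, False]
bg = update_background(bg, frame)    # adapt the model for the next frame
```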

  4. Detection of invisible and crucial events: from seismic fluctuations to the war against terrorism

    Energy Technology Data Exchange (ETDEWEB)

    Allegrini, Paolo; Fronzoni, Leone; Grigolini, Paolo; Latora, Vito; Mega, Mirko S.; Palatella, Luigi E-mail: luigi.palatella@df.unipi.it; Rapisarda, Andrea; Vinciguerra, Sergio

    2004-04-01

    We argue that the recent discovery of the non-Poissonian statistics of the seismic main-shocks is a special case of a more general approach to the detection of the distribution of the time increments between one crucial but invisible event and the next. We make the conjecture that the proposed approach can be applied to the analysis of terrorist network with significant benefits for the Intelligence Community.

  5. Automated Feature and Event Detection with SDO AIA and HMI Data

    Science.gov (United States)

    Davey, Alisdair; Martens, P. C. H.; Attrill, G. D. R.; Engell, A.; Farid, S.; Grigis, P. C.; Kasper, J.; Korreck, K.; Saar, S. H.; Su, Y.; Testa, P.; Wills-Davey, M.; Savcheva, A.; Bernasconi, P. N.; Raouafi, N.-E.; Delouille, V. A.; Hochedez, J. F.; Cirtain, J. W.; Deforest, C. E.; Angryk, R. A.; de Moortel, I.; Wiegelmann, T.; Georgouli, M. K.; McAteer, R. T. J.; Hurlburt, N.; Timmons, R.

    The Solar Dynamics Observatory (SDO) represents a new frontier in quantity and quality of solar data. At about 1.5 TB/day, the data will not be easily digestible by solar physicists using the same methods that have been employed for images from previous missions. In order for solar scientists to use the SDO data effectively they need meta-data that will allow them to identify and retrieve data sets that address their particular science questions. We are building a comprehensive computer vision pipeline for SDO, extracting complete metadata on many of the features and events detectable on the Sun without human intervention. Our project unites more than a dozen individual, existing codes into a systematic tool that can be used by the entire solar community. The feature finding codes will run as part of the SDO Event Detection System (EDS) at the Joint Science Operations Center (JSOC; joint between Stanford and LMSAL). The metadata produced will be stored in the Heliophysics Event Knowledgebase (HEK), which will be accessible on-line for the rest of the world directly or via the Virtual Solar Observatory (VSO). Solar scientists will be able to use the HEK to select event and feature data to download for science studies.

  6. A Macro-Observation Scheme for Abnormal Event Detection in Daily-Life Video Sequences

    Directory of Open Access Journals (Sweden)

    Chiu Wei-Yao

    2010-01-01

    Full Text Available Abstract We propose a macro-observation scheme for abnormal event detection in daily life. The proposed macro-observation representation records the time-space energy of the motions of all moving objects in a scene without segmenting individual object parts. The energy history of each pixel in the scene is instantly updated with exponential weights, without explicitly specifying the duration of each activity. Since possible activities in daily life are numerous and distinct from each other, and not all abnormal events can be foreseen, images from a video sequence that spans sufficient repetition of normal day-to-day activities are first randomly sampled. A constrained clustering model is proposed to partition the sampled images into groups. A newly observed event whose distance from all of the cluster centroids is large is then classified as an anomaly. The proposed method has been evaluated on the daily work of a laboratory and on the BEHAVE benchmark dataset. The experimental results reveal that it can reliably detect abnormal events such as burglary and fighting, as long as they last for a sufficient duration of time. The proposed method can be used as a support system for scenes that otherwise require full-time monitoring personnel.
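
    The exponentially weighted energy update can be sketched per pixel as follows; this is a simplified scalar version, and the decay constant is an assumption, not a value from the paper.

```python
def update_energy(energy, motion, decay=0.9):
    """Per-pixel time-space energy with exponential weighting.

    `motion` is 1.0 where a pixel changed in the current frame, else 0.0.
    Past activity fades geometrically, so no activity duration is declared.
    """
    return [decay * e + (1 - decay) * m for e, m in zip(energy, motion)]

energy = [0.0, 0.0]
for _ in range(3):                    # pixel 0 moves for three frames
    energy = update_energy(energy, [1.0, 0.0])
print([round(e, 3) for e in energy])  # [0.271, 0.0]
```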

  7. Detecting adverse events in surgery: comparing events detected by the Veterans Health Administration Surgical Quality Improvement Program and the Patient Safety Indicators.

    Science.gov (United States)

    Mull, Hillary J; Borzecki, Ann M; Loveland, Susan; Hickson, Kathleen; Chen, Qi; MacDonald, Sally; Shin, Marlena H; Cevasco, Marisa; Itani, Kamal M F; Rosen, Amy K

    2014-04-01

    The Patient Safety Indicators (PSIs) use administrative data to screen for select adverse events (AEs). In this study, VA Surgical Quality Improvement Program (VASQIP) chart review data were used as the gold standard to measure the criterion validity of 5 surgical PSIs. Independent chart review was also used to determine reasons for PSI errors. The sensitivity, specificity, and positive predictive value of PSI software version 4.1a were calculated among Veterans Health Administration hospitalizations (2003-2007) reviewed by VASQIP (n = 268,771). Nurses re-reviewed a sample of hospitalizations for which PSI and VASQIP AE detection disagreed. Sensitivities ranged from 31% to 68%, specificities from 99.1% to 99.8%, and positive predictive values from 31% to 72%. Reviewers found that coding errors accounted for some PSI-VASQIP disagreement; some disagreement was also the result of differences in AE definitions. These results suggest that the PSIs have moderate criterion validity; however, some surgical PSIs detect different AEs than VASQIP. Future research should explore using both methods to evaluate surgical quality. Published by Elsevier Inc.
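
    For reference, the three validity metrics reported above are computed from a 2x2 confusion table against the gold-standard chart review; the counts below are purely illustrative, not the study's data.

```python
def screening_metrics(tp, fp, fn, tn):
    # Criterion validity of a screen (e.g. a PSI) against gold-standard review.
    sensitivity = tp / (tp + fn)  # share of true AEs that are flagged
    specificity = tn / (tn + fp)  # share of AE-free stays left unflagged
    ppv = tp / (tp + fp)          # share of flags that are true AEs
    return sensitivity, specificity, ppv

# Illustrative counts only -- not taken from the study.
sens, spec, ppv = screening_metrics(tp=68, fp=32, fn=32, tn=9868)
print(round(sens, 2), round(spec, 3), round(ppv, 2))  # 0.68 0.997 0.68
```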

  8. Temporal and spatial predictability of an irrelevant event differently affect detection and memory of items in a visual sequence

    Directory of Open Access Journals (Sweden)

    Junji eOhyama

    2016-02-01

    Full Text Available We examined how the temporal and spatial predictability of a task-irrelevant visual event affects the detection and memory of a visual item embedded in a continuously changing sequence. Participants observed 11 sequentially presented letters, during which a task-irrelevant visual event was either present or absent. Predictabilities of spatial location and temporal position of the event were controlled in 2 × 2 conditions. In the spatially predictable conditions, the event occurred at the same location within the stimulus sequence or at another location, while, in the spatially unpredictable conditions, it occurred at random locations. In the temporally predictable conditions, the event timing was fixed relative to the order of the letters, while in the temporally unpredictable condition, it could not be predicted from the letter order. Participants performed a working memory task and a target detection reaction time task. Memory accuracy was higher for a letter simultaneously presented at the same location as the event in the temporally unpredictable conditions, irrespective of the spatial predictability of the event. On the other hand, the detection reaction times were only faster for a letter simultaneously presented at the same location as the event when the event was both temporally and spatially predictable. Thus, to facilitate ongoing detection processes, an event must be predictable both in space and time, while memory processes are enhanced by temporally unpredictable (i.e., surprising events. Evidently, temporal predictability has differential effects on detection and memory of a visual item embedded in a sequence of images.

  9. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    Science.gov (United States)

    De Geronimo, Gianluigi [Syosset, NY; Bolotnikov, Aleksey E [South Setauket, NY; Carini, Gabriella [Port Jefferson, NY

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, a cathode electrode, a collecting grid and a non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 Volts is applied to the cathode electrode. A voltage greater than the voltage applied to the cathode is applied to the non-collecting grid. A voltage greater than the voltage applied to the non-collecting grid is applied to the collecting grid. The signals from the collecting grid and the non-collecting grid are summed and subtracted, creating a sum and a difference, respectively. The difference is divided by the sum, creating a ratio. A gain coefficient factor for each depth (the distance between the ionizing event and the collecting grid) is determined, whereby the difference between the collecting electrode and the non-collecting electrode multiplied by the corresponding gain coefficient is the depth-corrected energy of an ionizing event. Therefore, the energy of each ionizing event is the difference between the collecting grid and the non-collecting grid multiplied by the corresponding gain coefficient. The depth of the ionizing event can also be determined from the ratio.
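
    The signal arithmetic described in the claim can be sketched as follows; the gain-versus-ratio mapping below is a made-up placeholder, since real coefficients come from detector calibration.

```python
def depth_corrected_energy(collecting, non_collecting, gain_for_ratio):
    """Depth-corrected event energy from co-planar grid signals.

    `gain_for_ratio` maps the difference/sum ratio (a proxy for interaction
    depth) to a gain coefficient; real coefficients come from calibration.
    """
    diff = collecting - non_collecting
    ratio = diff / (collecting + non_collecting)
    return diff * gain_for_ratio(ratio)

# Hypothetical calibration: events farther from the collecting grid
# (smaller ratio) receive a larger gain.
gain = lambda r: 1.0 + 0.2 * (1.0 - r)
print(round(depth_corrected_energy(900.0, 100.0, gain), 1))  # 832.0
```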

  10. Presentation of the results of a Bayesian automatic event detection and localization program to human analysts

    Science.gov (United States)

    Kushida, N.; Kebede, F.; Feitio, P.; Le Bras, R.

    2016-12-01

    The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing and testing NET-VISA (Arora et al., 2013), a Bayesian automatic event detection and localization program, and evaluating its performance in a realistic operational mode. In our preliminary testing at the CTBTO, NET-VISA shows better performance than the currently operating automatic localization program. However, given CTBTO's role and its international context, a new technology should be introduced cautiously when it replaces a key piece of the automatic processing. We integrated the results of NET-VISA into the Analyst Review Station, extensively used by the analysts, so that they can check the accuracy and robustness of the Bayesian approach. We expect the workload of the analysts to be reduced because of the better performance of NET-VISA in finding missed events and in assembling a more complete set of stations than the current system, which has been operating for nearly twenty years. The results of a series of tests indicate that the expectations borne out by the automatic tests, which show an overall overlap improvement of 11%, meaning that the missed-event rate is cut by 42%, hold for the integrated interactive module as well. New events are found by analysts, which qualify for the CTBTO Reviewed Event Bulletin, beyond the ones analyzed through the standard procedures. Arora, N., Russell, S., and Sudderth, E., NET-VISA: Network Processing Vertically Integrated Seismic Analysis, 2013, Bull. Seismol. Soc. Am., 103, 709-729.

  11. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  12. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    International Nuclear Information System (INIS)

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-01

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  13. A Novel Event-Based Incipient Slip Detection Using Dynamic Active-Pixel Vision Sensor (DAVIS).

    Science.gov (United States)

    Rigi, Amin; Baghaei Naeini, Fariborz; Makris, Dimitrios; Zweiri, Yahya

    2018-01-24

    In this paper, a novel approach to detect incipient slip based on the contact area between a transparent silicone medium and different objects using a neuromorphic event-based vision sensor (DAVIS) is proposed. Event-based algorithms are developed to detect incipient slip, slip, stress distribution and object vibration. Thirty-seven experiments were performed on five objects with different sizes, shapes, materials and weights to compare the precision and response time of the proposed approach. The proposed approach is validated by using a high-speed conventional camera (1000 FPS). The results indicate that the sensor can detect incipient slippage with an average latency of 44.1 ms in an unstructured environment for various objects. It is worth mentioning that the experiments were conducted in an uncontrolled experimental environment, which added high noise levels that significantly affected the results. However, eleven of the experiments had a detection latency below 10 ms, which shows the capability of this method. The results are very promising and show a high potential for the sensor to be used in manipulation applications, especially in dynamic environments.

  14. Event Detection Intelligent Camera: Demonstration of flexible, real-time data taking and processing

    Energy Technology Data Exchange (ETDEWEB)

    Szabolics, Tamás, E-mail: szabolics.tamas@wigner.mta.hu; Cseh, Gábor; Kocsis, Gábor; Szepesi, Tamás; Zoletnik, Sándor

    2015-10-15

    Highlights: • We present a description of EDICAM's operating principles. • Firmware test results. • Software test results. • Further developments. - Abstract: An innovative fast camera (EDICAM – Event Detection Intelligent CAMera) was developed by MTA Wigner RCP in the last few years. This new concept was designed for intelligent, event-driven processing, able to detect predefined events and track objects in the plasma. The camera provides a moderate frame rate of 400 Hz at full frame resolution (1280 × 1024), and readout of smaller regions of interest can be done in the 1–140 kHz range even during exposure of the full image. One of the most important advantages of this hardware is a 10 Gbit/s optical link which ensures very fast communication and data transfer between the PC and the camera, enabling two levels of processing: primitive algorithms in the camera hardware and high-level processing in the PC. This camera hardware has successfully proven able to monitor the plasma in several fusion devices, for example at ASDEX Upgrade, KSTAR and COMPASS, with the first version of the firmware. A new firmware and software package is under development. It allows predefined events to be detected in real time, so the camera is able to change its own operation or to give warnings, e.g. to the safety system of the experiment. The EDICAM system can handle a huge amount of data (up to TBs) at a high data rate (950 MB/s) and will be used as the central element of the 10-camera overview video diagnostic system of the Wendelstein 7-X (W7-X) stellarator. This paper presents key elements of the newly developed built-in intelligence, stressing the revolutionary new features and the results of tests of the different software elements.

  15. Automatic detection of lexical change: an auditory event-related potential study.

    Science.gov (United States)

    Muller-Gass, Alexandra; Roye, Anja; Kirmse, Ursula; Saupe, Katja; Jacobsen, Thomas; Schröger, Erich

    2007-10-29

    We investigated the detection of rare task-irrelevant changes in the lexical status of speech stimuli. Participants performed a nonlinguistic task on word and pseudoword stimuli that occurred, in separate conditions, rarely or frequently. Task performance for pseudowords was deteriorated relative to words, suggesting unintentional lexical analysis. Furthermore, rare word and pseudoword changes had a similar effect on the event-related potentials, starting as early as 165 ms. This is the first demonstration of the automatic detection of change in lexical status that is not based on a co-occurring acoustic change. We propose that, following lexical analysis of the incoming stimuli, a mental representation of the lexical regularity is formed and used as a template against which lexical change can be detected.

  16. High-Performance Signal Detection for Adverse Drug Events using MapReduce Paradigm.

    Science.gov (United States)

    Fan, Kai; Sun, Xingzhi; Tao, Ying; Xu, Linhao; Wang, Chen; Mao, Xianling; Peng, Bo; Pan, Yue

    2010-11-13

    Post-marketing pharmacovigilance is important for public health, as many Adverse Drug Events (ADEs) are unknown when those drugs are approved for marketing. However, due to the large number of reported drugs and drug combinations, detecting ADE signals by mining these reports is becoming a challenging task in terms of computational complexity. Recently, a parallel programming model, MapReduce, was introduced by Google to support large-scale data-intensive applications. In this study, we propose a MapReduce-based algorithm for a common ADE detection approach, the Proportional Reporting Ratio (PRR), and test it by mining spontaneous ADE reports from the FDA. The purpose is to investigate the possibility of using the MapReduce principle to speed up biomedical data mining tasks, using this pharmacovigilance case as one specific example. The results demonstrate that the MapReduce programming model can improve the performance of a common signal detection algorithm for pharmacovigilance in a distributed computation environment at approximately linear speedup rates.
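
    The PRR itself is a simple disproportionality ratio over a 2x2 contingency table of spontaneous reports; in a MapReduce setting, mappers would emit per drug-event pair counts and reducers would aggregate the four cells before applying this formula. The counts below are toy values.

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio for one drug-event pair.

    a: reports with the drug and the event    b: the drug, other events
    c: other drugs with the event             d: other drugs, other events
    """
    return (a / (a + b)) / (c / (c + d))

# Toy counts: the event is reported 4x more often with the drug of interest.
print(prr(a=20, b=80, c=50, d=950))  # 4.0
```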

  17. EVENT DETECTION USING MOBILE PHONE MASS GPS DATA AND THEIR RELIABILITY VERIFICATION BY DMSP/OLS NIGHT LIGHT IMAGE

    Directory of Open Access Journals (Sweden)

    A. Yuki

    2016-06-01

    Full Text Available In this study, we developed a method to detect sudden population concentrations on a certain day and in a certain area, that is, an "Event," all over Japan in 2012, using mass GPS data provided by mobile phone users. First, the stay locations of all phone users were detected using existing methods. Second, the areas and days where Events occurred were detected by aggregating the mass stay locations into 1-km-square grid polygons. Finally, the proposed method could detect Events with an especially large number of visitors in the year by removing the influence of Events that occurred continuously throughout the year. In addition, we demonstrated reasonable reliability of the proposed Event detection method by comparing the results of Event detection with light intensities obtained from DMSP/OLS night light images. Our method can detect not only positive events such as festivals but also negative events such as natural disasters and road accidents. These results are expected to support policy development in urban planning, disaster prevention, and transportation management.
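
    The grid-aggregation step can be sketched as follows: stay points are binned into grid cells, and a cell is flagged as an Event when its daily count far exceeds its baseline. The 0.01-degree cell size (a rough 1-km stand-in), the threshold factor, and the coordinates are illustrative assumptions.

```python
from collections import Counter

def grid_cell(lat, lon, cell_deg=0.01):
    # Bin a point into a grid cell; 0.01 degree is a rough 1-km stand-in.
    return (round(lat // cell_deg), round(lon // cell_deg))

def detect_event_cells(day_points, baseline_counts, factor=3.0):
    # Flag cells whose daily visitor count far exceeds their usual baseline.
    today = Counter(grid_cell(lat, lon) for lat, lon in day_points)
    return [cell for cell, n in today.items()
            if n > factor * baseline_counts.get(cell, 1)]

baseline = {(3550, 13970): 10, (3551, 13971): 5}   # typical daily counts
points = [(35.505, 139.705)] * 40 + [(35.515, 139.715)] * 5
print(detect_event_cells(points, baseline))        # [(3550, 13970)]
```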

  18. [Detection of adverse events in hospitalized adult patients by using the Global Trigger Tool method].

    Science.gov (United States)

    Guzmán-Ruiz, O; Ruiz-López, P; Gómez-Cámara, A; Ramírez-Martín, M

    2015-01-01

    To identify and characterize adverse events (AE) in an Internal Medicine Department of a district hospital using an extension of the Global Trigger Tool (GTT), and to analyze the diagnostic validity of the tool. An observational, analytical, descriptive and retrospective study was conducted on clinical charts from 2013 in an Internal Medicine Department in order to detect AE through the identification of 'triggers' (an event often related to an AE). The 'triggers' and AE were located by systematic review of the clinical documentation. The AE were characterized after they were identified. A total of 149 AE were detected in 291 clinical charts during 2013, of which 75.3% were detected directly by the tool, while the rest were not associated with a trigger. The percentage of charts that had at least one AE was 35.4%. The most frequent AE found was pressure ulcer (12%), followed by delirium, constipation, nosocomial respiratory infection and altered level of consciousness due to drugs. Almost half (47.6%) of the AE were related to drug use, and 32.2% of all AE were considered preventable. The tool demonstrated a sensitivity of 91.3% (95%CI: 88.9-93.2) and a specificity of 32.5% (95%CI: 29.9-35.1). It had a positive predictive value of 42.5% (95%CI: 40.1-45.1) and a negative predictive value of 87.1% (95%CI: 83.8-89.9). The tool used in this study is valid, useful and reproducible for the detection of AE. It also serves to determine rates of injury and to observe their progression over time. A high frequency of both AE and preventable events was observed in this study. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  19. Predictive modeling of structured electronic health records for adverse drug event detection.

    Science.gov (United States)

    Zhao, Jing; Henriksson, Aron; Asker, Lars; Boström, Henrik

    2015-01-01

    The digitization of healthcare data, resulting from the increasingly widespread adoption of electronic health records, has greatly facilitated its analysis by computational methods and thereby enabled large-scale secondary use thereof. This can be exploited to support public health activities such as pharmacovigilance, wherein the safety of drugs is monitored to inform regulatory decisions about sustained use. To that end, electronic health records have emerged as a potentially valuable data source, providing access to longitudinal observations of patient treatment and drug use. A nascent line of research concerns predictive modeling of healthcare data for the automatic detection of adverse drug events, which presents its own set of challenges: it is not yet clear how to represent the heterogeneous data types in a manner conducive to learning high-performing machine learning models. Datasets from an electronic health record database are used for learning predictive models with the purpose of detecting adverse drug events. The use and representation of two data types, as well as their combination, are studied: clinical codes, describing prescribed drugs and assigned diagnoses, and measurements. Feature selection is conducted on the various types of data to reduce dimensionality and sparsity, while allowing for an in-depth feature analysis of the usefulness of each data type and representation. Within each data type, combining multiple representations yields better predictive performance compared to using any single representation. The use of clinical codes for adverse drug event detection significantly outperforms the use of measurements; however, there is no significant difference over datasets between using only clinical codes and their combination with measurements. For certain adverse drug events, the combination does, however, outperform using only clinical codes. 
Feature selection leads to increased predictive performance for both data types, in isolation and
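
    A crude form of the feature selection mentioned above, dropping near-constant columns from a combined clinical-code/measurement matrix, might look like the sketch below; the variance floor and the toy data are assumptions, not the study's actual method.

```python
def select_features(rows, min_variance=0.01):
    """Keep only columns whose variance exceeds a floor -- a crude stand-in
    for feature selection over sparse code/measurement representations."""
    n = len(rows)
    keep = []
    for j in range(len(rows[0])):
        vals = [r[j] for r in rows]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        if var > min_variance:
            keep.append(j)
    return keep

# Rows: [code_A, code_B, heart_rate]; code_B never varies, so it is dropped.
data = [[1, 0, 72.0], [0, 0, 88.0], [1, 0, 95.0]]
print(select_features(data))  # [0, 2]
```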

  20. An Ensemble Approach for Emotion Cause Detection with Event Extraction and Multi-Kernel SVMs

    Institute of Scientific and Technical Information of China (English)

    Ruifeng Xu; Jiannan Hu; Qin Lu; Dongyin Wu; Lin Gui

    2017-01-01

    In this paper, we present a new challenging task for emotion analysis, namely emotion cause extraction. In this task, we focus on the detection of the emotion cause, i.e., the reason or stimulant of an emotion, rather than regular emotion classification or emotion component extraction. Since no open dataset for this task was available, we first designed and annotated an emotion cause dataset which follows the scheme of the W3C Emotion Markup Language. We then present an emotion cause detection method using an event extraction framework, where a tree-structure-based representation method is used to represent the events. Since the distribution of events is imbalanced in the training data, we propose an under-sampling-based bagging algorithm to solve this problem. Even with a limited training set, the proposed approach can still extract sufficient features for analysis by bagging multi-kernel SVMs. Evaluations show that our approach achieves an F-measure 7.04% higher than the state-of-the-art methods.
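
    The under-sampling-based bagging idea can be sketched as follows: each bag combines all minority (cause) examples with an equally sized random sample of the majority class; each bag would then train one classifier (multi-kernel SVMs in the paper, omitted here) whose votes are aggregated. The class sizes and seed are illustrative.

```python
import random

def undersample_bags(majority, minority, n_bags, seed=0):
    """Balanced training bags: each bag pairs every minority (cause) example
    with an equal-size random sample of the majority class."""
    rng = random.Random(seed)
    return [rng.sample(majority, len(minority)) + minority
            for _ in range(n_bags)]

majority = list(range(100))   # 100 clauses without an emotion cause
minority = [-1, -2, -3]       # 3 clauses annotated as causes
bags = undersample_bags(majority, minority, n_bags=5)
print(len(bags), [len(b) for b in bags])  # 5 [6, 6, 6, 6, 6]
```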

  1. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, (1) the algorithm adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
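
    One minimal way to score a time window against neighboring sensor nodes, a crude stand-in for the paper's spatial-temporal model, is a z-score of the window mean relative to the neighbors' means; the sensor values below are invented.

```python
def window_anomaly_score(window, neighbor_windows):
    """Z-score of one node's window mean against nearby nodes' means.

    A large score suggests a data-quality problem or a genuinely local
    event at that node; a real system would also model temporal context.
    """
    mean = lambda xs: sum(xs) / len(xs)
    others = [mean(w) for w in neighbor_windows]
    mu = mean(others)
    sd = (sum((o - mu) ** 2 for o in others) / len(others)) ** 0.5
    return abs(mean(window) - mu) / sd if sd else 0.0

neighbors = [[20.0, 21.0], [20.5, 20.5], [19.5, 20.5]]  # e.g. temperature
print(round(window_anomaly_score([30.0, 31.0], neighbors), 1))  # 43.1
```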

  2. Vision-based Event Detection of the Sit-to-Stand Transition

    Directory of Open Access Journals (Sweden)

    Victor Shia

    2015-12-01

    Full Text Available Sit-to-stand (STS) motions are one of the most important activities of daily living, as they serve as a precursor to mobility and walking. However, there exists no standard method for segmenting STS motions. This is partially due to the variety of different sensors and modalities used to study the STS motion, such as force plates, vision, and accelerometers, each providing different types of data, and to the variability of the STS motion in video data. In this work, we present a method using motion capture to detect events in the STS motion by estimating ground reaction forces, thereby eliminating the variability in joint angles from visual data. We illustrate the accuracy of this method with 10 subjects, with an average difference of 16.5 ms in event times obtained via motion capture vs. force plate. This method serves as a proof of concept for detecting events in the STS motion via video that are comparable to those obtained via a force plate.
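
    The event-detection idea, finding the time at which the estimated vertical ground reaction force crosses a body-weight-based threshold, can be sketched as follows; the threshold fraction, sampling rate, and force values are assumptions for illustration, not the paper's parameters.

```python
def detect_seat_off(grf, body_weight, frac=0.9, dt_ms=10):
    """Return the time (ms) when the estimated vertical ground reaction
    force first exceeds a fraction of body weight -- a simple proxy for
    the seat-off event in a sit-to-stand motion."""
    for i, f in enumerate(grf):
        if f >= frac * body_weight:
            return i * dt_ms
    return None

# Estimated GRF (N) sampled at 100 Hz during one sit-to-stand; weight 700 N.
grf = [300, 320, 400, 550, 640, 700, 720]
print(detect_seat_off(grf, body_weight=700))  # 40
```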

  3. The necessity of recognizing all events in x-ray detection

    International Nuclear Information System (INIS)

    Papp, T.; Maxwell, J.A.; Papp, A.T.

    2008-01-01

    -ray detection. Examples will be given of the detection of x-rays in nuclear backgrounds and of industrial measurements for RoHS and WEEE compliance, with input rates of up to several hundred thousand counts per second. The availability of all the events allows one to see the other part of the spectrum, and thus offers explanations for why the basic parameters are in such bad shape.

  4. Detection of Visual Events in Underwater Video Using a Neuromorphic Saliency-based Attention System

    Science.gov (United States)

    Edgington, D. R.; Walther, D.; Cline, D. E.; Sherlock, R.; Salamy, K. A.; Wilson, A.; Koch, C.

    2003-12-01

    The Monterey Bay Aquarium Research Institute (MBARI) uses high-resolution video equipment on remotely operated vehicles (ROV) to obtain quantitative data on the distribution and abundance of oceanic animals. High-quality video data supplants the traditional approach of assessing the kinds and numbers of animals in the oceanic water column by towing collection nets behind ships. Tow nets are limited in spatial resolution and often destroy abundant gelatinous animals, resulting in species undersampling. Camera-based quantitative video transects (QVT) are taken through the ocean midwater, from 50 m to 4000 m, and provide high-resolution data at the scale of individual animals and their natural aggregation patterns. However, the current manual method of analyzing QVT video by trained scientists is labor intensive and poses a serious limitation to the amount of information that can be analyzed from ROV dives. Presented here is an automated system for detecting marine animals (events) visible in the videos. Automated detection is difficult due to the low contrast of many translucent animals and due to debris ("marine snow") cluttering the scene. Video frames are processed with an artificial intelligence attention selection algorithm that has proven a robust means of target detection in a variety of natural terrestrial scenes. The candidate locations identified by the attention selection module are tracked across video frames using linear Kalman filters. Typically, the occurrence of visible animals in the video footage is sparse in space and time. A notion of "boring" video frames -- frames that do not contain any interesting candidate object for an animal -- is developed by detecting whether an interesting candidate is present in a particular sequence of underwater video. If objects can be tracked successfully over several frames, they are stored as potentially "interesting" events. Based on low-level properties, interesting events are
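
    The per-candidate tracking step -- a linear Kalman filter carried across frames -- can be sketched in one dimension with a constant-velocity model. This is a generic textbook filter, not MBARI's implementation; the state layout and the noise levels q and r are illustrative assumptions.

```python
import numpy as np

def kalman_track(positions, dt=1.0, q=1e-3, r=0.25):
    """Track a candidate's position across frames with a
    constant-velocity linear Kalman filter; returns the filtered
    position estimate for each frame."""
    x = np.array([positions[0], 0.0])        # state: [position, velocity]
    P = np.eye(2)                            # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity transition
    Q = q * np.eye(2)                        # process noise
    H = np.array([[1.0, 0.0]])               # we observe position only
    est = []
    for z in positions:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the measured position z
        y = z - (H @ x)[0]                   # innovation
        S = (H @ P @ H.T)[0, 0] + r          # innovation variance
        K = (P @ H.T)[:, 0] / S              # Kalman gain
        x = x + K * y
        P = (np.eye(2) - np.outer(K, H[0])) @ P
        est.append(x[0])
    return np.array(est)
```

In the full system, one such filter per candidate lets detections be associated frame-to-frame, and tracks that persist over several frames become "interesting" events.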

  5. Performance Evaluation of Wireless Sensor Networks for Event-Detection with Shadowing-Induced Radio Irregularities

    Directory of Open Access Journals (Sweden)

    Giuseppe De Marco

    2007-01-01

    Full Text Available In this paper, we study a particular application of wireless sensor networks for event-detection and tracking. In this kind of application, the transport of data is simplified, and guaranteeing a minimum number of packets at the monitoring node is the only constraint on the performance of the sensor network. This minimum number of packets is called event-reliability. Contrary to other studies on the subject, here we consider the behavior of such a network in the presence of a realistic radio model, such as shadowing of the radio signal. With this setting, we extend our previous analysis of the event-reliability approach for the transport of data. In particular, both regular and random networks are considered. The contribution of this work is to show via simulations that, in the presence of randomness or irregularities in the radio channel, the event-reliability can be jeopardized, that is, the constraint on the minimum number of packets at the sink node may not be satisfied.

  6. Rapid and reliable detection and identification of GM events using multiplex PCR coupled with oligonucleotide microarray.

    Science.gov (United States)

    Xu, Xiaodan; Li, Yingcong; Zhao, Heng; Wen, Si-yuan; Wang, Sheng-qi; Huang, Jian; Huang, Kun-lun; Luo, Yun-bo

    2005-05-18

    To devise a rapid and reliable method for the detection and identification of genetically modified (GM) events, we developed a multiplex polymerase chain reaction (PCR) coupled with a DNA microarray system simultaneously aiming at many targets in a single reaction. The system included probes for screening gene, species reference gene, specific gene, construct-specific gene, event-specific gene, and internal and negative control genes. 18S rRNA was combined with species reference genes as internal controls to assess the efficiency of all reactions and to eliminate false negatives. Two sets of the multiplex PCR system were used to amplify four and five targets, respectively. Eight different structure genes could be detected and identified simultaneously for Roundup Ready soybean in a single microarray. The microarray specificity was validated by its ability to discriminate two GM maizes Bt176 and Bt11. The advantages of this method are its high specificity and greatly reduced false-positives and -negatives. The multiplex PCR coupled with microarray technology presented here is a rapid and reliable tool for the simultaneous detection of GM organism ingredients.

  7. Femtomolar detection of single mismatches by discriminant analysis of DNA hybridization events using gold nanoparticles.

    Science.gov (United States)

    Ma, Xingyi; Sim, Sang Jun

    2013-03-21

    Even though DNA-based nanosensors have been demonstrated for quantitative detection of analytes and diseases, hybridization events have never been numerically investigated for further understanding of DNA mediated interactions. Here, we developed a nanoscale platform with well-designed capture and detection gold nanoprobes to precisely evaluate the hybridization events. The capture gold nanoprobes were mono-laid on glass and the detection probes were fabricated via a novel competitive conjugation method. The two kinds of probes combined in a suitable orientation following the hybridization with the target. We found that hybridization efficiency was markedly dependent on electrostatic interactions between DNA strands, which can be tailored by adjusting the salt concentration of the incubation solution. Due to the much lower stability of the double helix formed by mismatches, the hybridization efficiencies of single mismatched (MMT) and perfectly matched DNA (PMT) were different. Therefore, we obtained an optimized salt concentration that allowed for discrimination of MMT from PMT without stringent control of temperature or pH. The results indicated this to be an ultrasensitive and precise nanosensor for the diagnosis of genetic diseases.

  8. Detection of genetically modified maize events in Brazilian maize-derived food products

    Directory of Open Access Journals (Sweden)

    Maria Regina Branquinho

    2013-09-01

    Full Text Available The Brazilian government has approved many transgenic maize lines for commercialization and has established a threshold of 1% for food labeling, which underscores the need for monitoring programs. Thirty-four samples, including flours and different types of nacho chips, were analyzed by conventional and real-time PCR in 2011 and 2012. The events MON810, Bt11, and TC1507 were detected in most of the samples, and NK603 was present only in the samples analyzed in 2012. The authorized lines GA21 and T25 and the unauthorized Bt176 were not detected. All samples positive in the qualitative tests collected in 2011 showed a transgenic content higher than 1%, and none of them was correctly labeled. Of the samples collected in 2012, all positive samples were quantified above the threshold, and 47.0% were not correctly labeled. The overall results indicated that the major genetically modified organisms detected were the MON810, TC1507, Bt11, and NK603 events. Some industries that had failed to label their products in 2011 started labeling them in 2012, demonstrating compliance with the current legislation and observance of consumer rights. Although these results are encouraging, they clearly demonstrate the need for continuous monitoring programs to assure consumers that food products are properly labeled.

  9. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    Science.gov (United States)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series), for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on-the-fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the archived data rearranged into spatio-temporal matrices, which allow for easy access to the data, both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events using data from the Tropical Rainfall Measuring Mission (TRMM).
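
    The reorganization the abstract describes -- archived per-time-step grids turned into point-time series -- amounts to stacking the grids along a time axis so that any "data rod" is one contiguous slice. A toy sketch (the GES DISC services of course operate on far larger archives):

```python
import numpy as np

def build_data_cube(time_step_grids):
    """Stack per-time-step 2-D grids (the archived layout) into a
    (lat, lon, time) cube, so every grid point's full time series
    is one contiguous slice."""
    return np.stack(time_step_grids, axis=-1)

def data_rod(cube, i, j):
    """Time series ('data rod') at grid point (i, j)."""
    return cube[i, j, :]
```

Reading a rod from the cube touches one contiguous run of memory, whereas reading it from the archived layout would touch every time-step file.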

  10. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    Science.gov (United States)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

    Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone, in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure that the moment tensor solutions are similar to those of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
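
    The match filter step can be sketched as a sliding normalized cross-correlation of the template against continuous data, with detections declared where the correlation stands out from its own background. This is a generic template-matching sketch, not the authors' pipeline; the MAD-based threshold is a common convention in earthquake template matching and is an assumption here.

```python
import numpy as np

def match_filter(template, data, threshold=8.0):
    """Slide a template over continuous data; return the normalized
    cross-correlation trace and sample indices where it exceeds
    `threshold` median-absolute-deviations above the trace median."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        ws = w.std()
        # correlation coefficient between template and this window
        cc[i] = 0.0 if ws == 0 else np.dot(t, (w - w.mean()) / ws) / n
    mad = np.median(np.abs(cc - np.median(cc))) + 1e-12
    picks = np.where(np.abs(cc - np.median(cc)) / mad > threshold)[0]
    return cc, picks
```

A perfect embedded copy of the template yields a correlation near 1 at the true offset, far above the background of the trace.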

  11. Hierarchical modeling for rare event detection and cell subset alignment across flow cytometry samples.

    Directory of Open Access Journals (Sweden)

    Andrew Cron

    Full Text Available Flow cytometry is the prototypical assay for multi-parameter single cell analysis, and is essential in vaccine and biomarker research for the enumeration of antigen-specific lymphocytes that are often found in extremely low frequencies (0.1% or less). Standard analysis of flow cytometry data relies on visual identification of cell subsets by experts, a process that is subjective and often difficult to reproduce. An alternative and more objective approach is the use of statistical models to identify cell subsets of interest in an automated fashion. Two specific challenges for automated analysis are to detect extremely low frequency event subsets without biasing the estimate by pre-processing enrichment, and the ability to align cell subsets across multiple data samples for comparative analysis. In this manuscript, we develop hierarchical modeling extensions to the Dirichlet Process Gaussian Mixture Model (DPGMM) approach we have previously described for cell subset identification, and show that the hierarchical DPGMM (HDPGMM) naturally generates an aligned data model that captures both commonalities and variations across multiple samples. HDPGMM also increases the sensitivity to extremely low frequency events by sharing information across multiple samples analyzed simultaneously. We validate the accuracy and reproducibility of HDPGMM estimates of antigen-specific T cells on clinically relevant reference peripheral blood mononuclear cell (PBMC) samples with known frequencies of antigen-specific T cells. These cell samples take advantage of retrovirally TCR-transduced T cells spiked into autologous PBMC samples to give a defined number of antigen-specific T cells detectable by HLA-peptide multimer binding. We provide open source software that can take advantage of both multiple processors and GPU-acceleration to perform the numerically-demanding computations. We show that hierarchical modeling is a useful probabilistic approach that can provide a

  12. Signal Detection of Imipenem Compared to Other Drugs from Korea Adverse Event Reporting System Database.

    Science.gov (United States)

    Park, Kyounghoon; Soukavong, Mick; Kim, Jungmee; Kwon, Kyoung Eun; Jin, Xue Mei; Lee, Joongyub; Yang, Bo Ram; Park, Byung Joo

    2017-05-01

    To detect signals of adverse drug events after imipenem treatment using the Korea Institute of Drug Safety & Risk Management-Korea adverse event reporting system database (KIDS-KD). We performed data mining using KIDS-KD, which was constructed from adverse event (AE) reports spontaneously submitted between December 1988 and June 2014. We detected signals by calculating the proportional reporting ratio, reporting odds ratio, and information component of imipenem, and defined a signal as any AE that satisfied all three indices. The signals were compared with the drug labels of nine countries. There were 807,582 spontaneous AE reports in the KIDS-KD. Among those, 192,510 were antibiotic-related AEs; 3,382 reports were associated with imipenem. The most common imipenem-associated AE was drug eruption, reported 353 times. We calculated signals by comparison with all other antibiotics and with all other drugs; 58 and 53 signals, respectively, satisfied all three indices. We compared the drug labelling information of nine countries, including the USA, the UK, Japan, Italy, Switzerland, Germany, France, Canada, and South Korea, and discovered that the following signals were currently not included in drug labels: hypokalemia, cardiac arrest, cardiac failure, Parkinson's syndrome, myocardial infarction, and prostate enlargement. Hypokalemia was an additional signal relative to all other antibiotics, and the other signals did not differ between the comparisons with all other antibiotics and with all other drugs. We detected new signals that were not listed on the drug labels of nine countries. However, further pharmacoepidemiologic research is needed to evaluate the causality of these signals. © Copyright: Yonsei University College of Medicine 2017
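
    The three indices behind the signal criterion have simple closed forms on the standard 2x2 contingency table of reports. A sketch of the point estimates only -- real signal detection, including this study's, also uses confidence bounds (e.g. chi-square for PRR, 95% CI for ROR and IC), which are omitted here:

```python
import math

def disproportionality(a, b, c, d):
    """Point estimates of PRR, ROR, and IC from a 2x2 report table:
      a: reports of the AE of interest with the drug of interest
      b: other AEs with the drug
      c: the AE with all other drugs
      d: other AEs with all other drugs
    """
    n = a + b + c + d
    prr = (a / (a + b)) / (c / (c + d))      # proportional reporting ratio
    ror = (a * d) / (b * c)                  # reporting odds ratio
    ic = math.log2((a * n) / ((a + b) * (a + c)))  # information component
    return prr, ror, ic
```

An AE over-reported with the drug relative to the background gives PRR and ROR above 1 and IC above 0; requiring all three to clear their usual thresholds mirrors the "satisfied all three indices" rule in the abstract.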

  13. Quench detection of fast plasma events for the JT-60SA central solenoid

    International Nuclear Information System (INIS)

    Murakami, Haruyuki; Kizu, Kaname; Tsuchiya, Katsuhiko; Kamiya, Koji; Takahashi, Yoshikazu; Yoshida, Kiyoshi

    2012-01-01

    Highlights: ► The pick-up coil method is used for quench detection in the JT-60SA magnet system. ► Disk-shaped pick-up coils are inserted in the CS module to compensate the inductive voltage. ► The applicability of the pick-up coils is evaluated by two-dimensional analysis. ► The pick-up coils remain applicable during disruptions, minor collapses, and other plasma events. - Abstract: The JT-60 is planned to be modified into a full-superconducting tokamak referred to as the JT-60 Super Advanced (JT-60SA). The maximum temperature of a magnet during a quench might exceed several hundred kelvin, which would damage the superconducting magnet itself. A high-precision quench detection system is therefore one of the key technologies of the superconducting magnet protection system. The pick-up coil method, which uses voltage taps to detect the normal-zone voltage, is used for quench detection in the JT-60SA superconducting magnet system. Disk-shaped pick-up coils are inserted in the central solenoid (CS) module to compensate the inductive voltage. In the previous study, the quench detection system required a large number of pick-up coils. The reliability of the quench detection system would be higher if the detection system were simplified, for example by reducing the number of pick-up coils; simplification is also important to reduce the total cost of the protection system. Hence, the design method is improved by increasing the number of optimization parameters. The improved design method can reduce the number of pick-up coils without reducing the detection sensitivity; consequently, the protection system can be designed with higher reliability and lower cost. The applicability of the disk-shaped pick-up coil to the quench detection system is evaluated by two-dimensional analysis. In the previous study, however, the analysis model only took into account the CS, EF (equilibrium field) coils and plasma. Therefore, applicability of the disk-shaped pick-up coil for

  14. Analysis of arrhythmic events is useful to detect lead failure earlier in patients followed by remote monitoring.

    Science.gov (United States)

    Nishii, Nobuhiro; Miyoshi, Akihito; Kubo, Motoki; Miyamoto, Masakazu; Morimoto, Yoshimasa; Kawada, Satoshi; Nakagawa, Koji; Watanabe, Atsuyuki; Nakamura, Kazufumi; Morita, Hiroshi; Ito, Hiroshi

    2018-03-01

    Remote monitoring (RM) has been advocated as the new standard of care for patients with cardiovascular implantable electronic devices (CIEDs). RM has allowed the early detection of adverse clinical events, such as arrhythmia, lead failure, and battery depletion. However, lead failure was often identified only by arrhythmic events, but not impedance abnormalities. To compare the usefulness of arrhythmic events with conventional impedance abnormalities for identifying lead failure in CIED patients followed by RM. CIED patients in 12 hospitals have been followed by the RM center in Okayama University Hospital. All transmitted data have been analyzed and summarized. From April 2009 to March 2016, 1,873 patients have been followed by the RM center. During the mean follow-up period of 775 days, 42 lead failure events (atrial lead 22, right ventricular pacemaker lead 5, implantable cardioverter defibrillator [ICD] lead 15) were detected. The proportion of lead failures detected only by arrhythmic events, which were not detected by conventional impedance abnormalities, was significantly higher than that detected by impedance abnormalities (arrhythmic event 76.2%, 95% CI: 60.5-87.9%; impedance abnormalities 23.8%, 95% CI: 12.1-39.5%). Twenty-seven events (64.7%) were detected without any alert. Of 15 patients with ICD lead failure, none has experienced inappropriate therapy. RM can detect lead failure earlier, before clinical adverse events. However, CIEDs often diagnose lead failure as just arrhythmic events without any warning. Thus, to detect lead failure earlier, careful human analysis of arrhythmic events is useful. © 2017 Wiley Periodicals, Inc.

  15. An analog cell to detect single event transients in voltage references

    Energy Technology Data Exchange (ETDEWEB)

    Franco, F.J., E-mail: fjfranco@fis.ucm.es [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Palomar, C. [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Izquierdo, J.G. [Centro de Láseres Ultrarrápidos, Facultad de Químicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain); Agapito, J.A. [Departamento de Física Aplicada III, Facultad de Físicas, Universidad Complutense de Madrid (UCM), 28040 Madrid (Spain)

    2015-01-11

    A reliable voltage reference is mandatory in mixed-signal systems. However, this family of components can undergo very long single event transients when operating in radiation environments such as space and nuclear facilities due to the impact of heavy ions. The purpose of the present paper is to demonstrate how a simple cell can be used to detect these transients. The cell was implemented with typical COTS components and its behavior was verified by SPICE simulations and in a laser facility. Different applications of the cell are explored as well.

  16. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern to the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
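
    The core of the original swinging door algorithm is easy to state: extend a segment while every point fits inside a tolerance "door" anchored at the segment start, and open a new segment when the door closes. A minimal sketch, with `eps` as the compression deviation threshold; the paper's optimized parameter selection and dynamic-programming merging of significant ramps are not shown:

```python
def swinging_door(values, eps):
    """Segment a time series with the basic swinging door algorithm.
    Returns the indices where new linear segments start."""
    starts = [0]
    a = 0                         # anchor: current segment start
    min_up = float('inf')         # tightest upper-door slope so far
    max_low = float('-inf')       # tightest lower-door slope so far
    for i in range(1, len(values)):
        dt = i - a
        min_up = min(min_up, (values[i] + eps - values[a]) / dt)
        max_low = max(max_low, (values[i] - eps - values[a]) / dt)
        if max_low > min_up:
            # door closed: previous point ends the segment, becomes anchor
            a = i - 1
            starts.append(a)
            min_up = (values[i] + eps - values[a]) / (i - a)
            max_low = (values[i] - eps - values[a]) / (i - a)
    return starts
```

On wind power data, the resulting segments approximate the series piecewise-linearly; segments whose slope and magnitude exceed ramp criteria are then the candidate WPREs.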

  17. Highly specific detection of genetic modification events using an enzyme-linked probe hybridization chip.

    Science.gov (United States)

    Zhang, M Z; Zhang, X F; Chen, X M; Chen, X; Wu, S; Xu, L L

    2015-08-10

    The enzyme-linked probe hybridization chip uses a method based on ligase-hybridizing probe chip technology: thio-primers protect against enzyme digestion, and lambda DNA exonuclease cuts the multiplex PCR products obtained from the sample being tested into single strands for hybridization. The 5'-end amino-labeled probe was fixed onto the aldehyde chip and hybridized with the single-stranded PCR product, followed by addition of a fluorescent-modified probe that was then enzymatically ligated to the adjacent, substrate-bound probe in order to achieve highly specific, parallel, and high-throughput detection. Specificity and sensitivity testing demonstrated that enzyme-linked probe hybridization technology could be applied to the specific detection of eight genetic modification events at the same time, with a sensitivity reaching 0.1% and the achievement of accurate, efficient, and stable results.

  18. A power filter for the detection of burst events based on time-frequency spectrum estimation

    International Nuclear Information System (INIS)

    Guidi, G M; Cuoco, E; Vicere, A

    2004-01-01

    We propose as a statistic for the detection of bursts in a gravitational wave interferometer the 'energy' of the events, estimated with a time-dependent calculation of the spectrum. This statistic has an asymptotic Gaussian distribution with known statistical moments, which makes it possible to perform a uniformly most powerful test (McDonough R N and Whalen A D 1995 Detection of Signals in Noise (New York: Academic)) on the energy mean. We estimate the receiver operating characteristic (ROC, from the same book) of this statistic for different levels of the signal-to-noise ratio in the specific case of a simulated noise having the spectral density expected for Virgo, using test signals taken from a library of possible waveforms emitted during the core collapse of type II supernovae.

  19. Automatic detection of whole night snoring events using non-contact microphone.

    Directory of Open Access Journals (Sweden)

    Eliran Dafna

    Full Text Available OBJECTIVE: Although awareness of sleep disorders is increasing, limited information is available on whole-night detection of snoring. Our study aimed to develop and validate a robust, high-performance, and sensitive whole-night snore detector based on non-contact technology. DESIGN: Sounds during polysomnography (PSG) were recorded using a directional condenser microphone placed 1 m above the bed. An AdaBoost classifier was trained and validated on manually labeled snoring and non-snoring acoustic events. PATIENTS: Sixty-seven subjects (age 52.5 ± 13.5 years, BMI 30.8 ± 4.7 kg/m², m/f 40/27) referred for PSG for obstructive sleep apnea diagnosis were prospectively and consecutively recruited. Twenty-five subjects were used for the design study; the validation study was blindly performed on the remaining forty-two subjects. MEASUREMENTS AND RESULTS: To train the proposed sound detector, >76,600 acoustic episodes collected in the design study were manually classified by three scorers into snore and non-snore episodes (e.g., bedding noise, coughing, environmental noise). A feature selection process was applied to select the most discriminative features extracted from the time and spectral domains. The average snore/non-snore detection rate (accuracy) for the design group was 98.4%, based on a ten-fold cross-validation technique. When tested on the validation group, the average detection rate was 98.2%, with sensitivity of 98.0% (snore detected as snore) and specificity of 98.3% (noise detected as noise). CONCLUSIONS: Audio-based features extracted from the time and spectral domains can accurately discriminate between snore and non-snore acoustic events. This audio analysis approach enables detection and analysis of snoring sounds from a full night in order to produce quantified measures for objective follow-up of patients.
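
    The kind of time- and spectral-domain features such a detector discriminates on can be illustrated with a few standard ones. These are illustrative choices, not the paper's actual feature set:

```python
import numpy as np

def acoustic_features(frame, fs):
    """A few standard time-domain (energy, zero-crossing rate) and
    spectral-domain (spectral centroid) features of one audio frame."""
    frame = frame - frame.mean()                 # remove DC offset
    energy = float(np.mean(frame ** 2))
    # zero-crossing rate: fraction of sample pairs that change sign
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2)
    # spectral centroid: magnitude-weighted mean frequency
    spec = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / fs)
    centroid = float((freqs * spec).sum() / (spec.sum() + 1e-12))
    return {"energy": energy, "zcr": zcr, "spectral_centroid": centroid}
```

Feature vectors like these, computed per frame, are what a boosted classifier such as AdaBoost would consume to separate snore from non-snore episodes.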

  20. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲2%). For confident detection of planets in EWCP events, it is necessary to have both higher-cadence monitoring and better photometric accuracy than current follow-up observation systems provide. The next-generation ground-based observation project, the Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in EWCP events using KMTNet. From this study, we find that EWCP events occur with a frequency of >50% in the case of ≲100 M⊕ planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳1 M⊕ planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giants, because KMTNet, with its constant exposure time, easily saturates around the peak of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and by close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems regardless of planet mass, will be detected in the near future.

  1. Detection of Water Contamination Events Using Fluorescence Spectroscopy and Alternating Trilinear Decomposition Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Yu

    2017-01-01

    Full Text Available Methods based on conventional indices and UV-visible spectroscopy have been widely applied to the detection of water quality abnormalities. This paper presents a qualitative analysis approach to detect water contamination events with unknown pollutants. Fluorescence spectra were used as the water quality monitoring tool, and a detection method for unknown contaminants in water based on alternating trilinear decomposition (ATLD) is proposed to analyze the excitation and emission spectra of the samples. The Delaunay triangulation interpolation method was used to pretreat the three-dimensional fluorescence spectra in order to correct for the effects of Rayleigh and Raman scattering; an ATLD model was built from normal water samples, and the residual matrix was obtained by subtracting the model matrix from the measured matrix; the residual sum of squares obtained from the residual matrix, compared against a threshold, was used to qualitatively discriminate test samples, distinguishing drinking water samples from organic pollutant samples. The results of the study indicate that ATLD modeling of three-dimensional fluorescence spectra can provide a tool for qualitatively detecting unknown organic pollutants in water. The fluorescence-based method can thus complement methods based on conventional indices and UV-visible spectroscopy.
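
    The decision step -- fit a model of normal samples, then threshold the residual sum of squares of each test sample -- can be sketched with a low-rank SVD model standing in for the trilinear ATLD decomposition. The thresholding logic is the same, but ATLD itself is not implemented here; this is an assumption-laden simplification.

```python
import numpy as np

def fit_normal_model(X, rank=2):
    """Fit a low-rank spectral basis to rows of X ('normal' samples).
    SVD stand-in for the trilinear ATLD model of the abstract."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:rank]                      # orthonormal spectral basis

def residual_ss(x, basis):
    """Residual sum of squares of sample x after projection onto the
    normal-sample basis (the 'residual matrix' statistic, per sample)."""
    proj = basis.T @ (basis @ x)
    return float(np.sum((x - proj) ** 2))

def is_contaminated(x, basis, threshold):
    return residual_ss(x, basis) > threshold
```

A sample explained by the normal model leaves a near-zero residual; a spectrum with a component outside the model's span leaves a large one and is flagged.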

  2. An Event-Triggered Machine Learning Approach for Accelerometer-Based Fall Detection.

    Science.gov (United States)

    Putra, I Putu Edy Suardiyana; Brusey, James; Gaura, Elena; Vesilo, Rein

    2017-12-22

    The fixed-size non-overlapping sliding window (FNSW) and fixed-size overlapping sliding window (FOSW) approaches are the most commonly used data-segmentation techniques in machine learning-based fall detection using accelerometer sensors. However, these techniques do not segment by fall stages (pre-impact, impact, and post-impact) and thus useful information is lost, which may reduce the detection rate of the classifier. Aligning the segment with the fall stage is difficult, as the segment size varies. We propose an event-triggered machine learning (EvenT-ML) approach that aligns each fall stage so that the characteristic features of the fall stages are more easily recognized. To evaluate our approach, two publicly accessible datasets were used. Classification and regression tree (CART), k-nearest neighbor (k-NN), logistic regression (LR), and the support vector machine (SVM) were used to train the classifiers. EvenT-ML gives classifier F-scores of 98% for a chest-worn sensor and 92% for a waist-worn sensor, and significantly reduces the computational cost compared with the FNSW- and FOSW-based approaches, with reductions of up to 8-fold and 78-fold, respectively. EvenT-ML achieves a significantly better F-score than existing fall detection approaches. These results indicate that aligning feature segments with fall stages significantly increases the detection rate and reduces the computational cost.
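
    The key idea -- trigger on the impact event and carve windows aligned to the fall stages, rather than sliding fixed windows blindly -- can be sketched as follows. The trigger threshold and window lengths are illustrative assumptions, not the paper's tuned values:

```python
import numpy as np

def align_fall_stages(acc_mag, fs, impact_g=2.0, pre_s=1.0, post_s=1.0):
    """Event-triggered segmentation of an acceleration-magnitude stream:
    locate the impact peak, then return pre-impact / impact / post-impact
    segments aligned to it. Returns None if no impact is detected."""
    idx = np.where(acc_mag > impact_g)[0]
    if len(idx) == 0:
        return None
    peak = idx[np.argmax(acc_mag[idx])]          # strongest impact sample
    pre = acc_mag[max(0, peak - int(pre_s * fs)):peak]
    impact = acc_mag[peak:peak + int(0.2 * fs)]  # short impact window
    post = acc_mag[peak + int(0.2 * fs):
                   peak + int(0.2 * fs) + int(post_s * fs)]
    return pre, impact, post
```

Features computed per stage (rather than per arbitrary window) are what lets the downstream CART/k-NN/LR/SVM classifiers see stage-specific structure.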

  3. Endpoint visual detection of three genetically modified rice events by loop-mediated isothermal amplification.

    Science.gov (United States)

    Chen, Xiaoyun; Wang, Xiaofu; Jin, Nuo; Zhou, Yu; Huang, Sainan; Miao, Qingmei; Zhu, Qing; Xu, Junfeng

    2012-11-07

    Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%−0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  4. Development of electrochemical biosensor for detection of pathogenic microorganism in Asian dust events.

    Science.gov (United States)

    Yoo, Min-Sang; Shin, Minguk; Kim, Younghun; Jang, Min; Choi, Yoon-E; Park, Si Jae; Choi, Jonghoon; Lee, Jinyoung; Park, Chulhwan

    2017-05-01

    We developed a single-walled carbon nanotube (SWCNT)-based electrochemical biosensor for the detection of Bacillus subtilis, one of the microorganisms observed in Asian dust events, which causes respiratory diseases such as asthma and pneumonia. The SWCNTs act as a transducer, converting the biological antigen/antibody reaction into an electrical signal, while 1-pyrenebutanoic acid succinimidyl ester (1-PBSE) and anti-B. subtilis antibody serve as a chemical linker and a receptor, respectively, for the adhesion of the target microorganism. The detection range (10²–10¹⁰ CFU/mL) and the detection limit (10² CFU/mL) of the developed biosensor were identified, and the response time was 10 min. In the specificity test, the amount of captured target B. subtilis was the highest among the tested microorganisms (Staphylococcus aureus, Flavobacterium psychrolimnae, and Aquabacterium commune). In addition, target B. subtilis detected by the developed biosensor was observed by scanning electron microscopy (SEM). Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Endpoint Visual Detection of Three Genetically Modified Rice Events by Loop-Mediated Isothermal Amplification

    Directory of Open Access Journals (Sweden)

    Qing Zhu

    2012-11-01

    Full Text Available Genetically modified (GM) rice KMD1, TT51-1, and KF6 are three of the most well known transgenic Bt rice lines in China. A rapid and sensitive molecular assay for risk assessment of GM rice is needed. Polymerase chain reaction (PCR), currently the most common method for detecting genetically modified organisms, requires temperature cycling and relatively complex procedures. Here we developed a visual and rapid loop-mediated isothermal amplification (LAMP) method to amplify three GM rice event-specific junction sequences. Target DNA was amplified and visualized by two indicators (SYBR green or hydroxy naphthol blue [HNB]) within 60 min at an isothermal temperature of 63 °C. Different kinds of plants were selected to ensure the specificity of detection and the results of the non-target samples were negative, indicating that the primer sets for the three GM rice varieties had good levels of specificity. The sensitivity of LAMP, with detection limits at low concentration levels (0.01%–0.005% GM), was 10- to 100-fold greater than that of conventional PCR. Additionally, the LAMP assay coupled with an indicator (SYBR green or HNB) facilitated analysis. These findings revealed that the rapid detection method was suitable as a simple field-based test to determine the status of GM crops.

  6. A Cluster-Based Fuzzy Fusion Algorithm for Event Detection in Heterogeneous Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    ZiQi Hao

    2015-01-01

    Full Text Available As limited energy is one of the tough challenges in wireless sensor networks (WSNs), energy saving is important for increasing the lifetime of the network. Data fusion combines information from several sources to provide a unified view, which can significantly save sensor energy and enhance sensing accuracy. In this paper, we propose a cluster-based data fusion algorithm for event detection. We use the k-means algorithm to form the nodes into clusters, which significantly reduces the energy consumption of intracluster communication. The distances between cluster heads and the event, together with the energy of the clusters, are fuzzified, and fuzzy logic is used to select the clusters that will participate in data uploading and fusion. Fuzzy logic is also used by the cluster heads for local decisions, and the local decision results are then sent to the base station. Decision-level fusion for the final event decision is performed by the base station according to the uploaded local decisions and the fusion support degree of the clusters, calculated by the fuzzy logic method. The effectiveness of this algorithm is demonstrated by simulation results.
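The two building blocks, k-means clustering of nodes and a fuzzy weight combining event distance with cluster energy, can be sketched as below. The triangular membership functions and the minimum t-norm are illustrative assumptions; the paper's actual rule base is not reproduced here.

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Tiny 1-D k-means used to group sensor node positions into clusters."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center
            groups[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

def fusion_weight(dist_to_event, energy, d_max=100.0, e_max=1.0):
    """Fuzzy-style participation weight: clusters that are close to the
    event AND energy-rich get a weight near 1 (minimum t-norm as the
    fuzzy AND).  d_max and e_max are assumed normalization constants."""
    near = max(0.0, 1.0 - dist_to_event / d_max)
    rich = min(1.0, energy / e_max)
    return min(near, rich)
```

Clusters whose weight exceeds a threshold would upload their local fuzzy decisions to the base station for decision-level fusion.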

  7. Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2013-12-01

    Full Text Available Abnormal event detection is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, the online least squares one-class support vector machine (online LS-OC-SVM), together with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of the training objects in a regularized least squares sense. The online LS-OC-SVM first learns a training set with a limited number of samples to provide a basic normal model, and then updates the model with the remaining data. In the sparse online scheme, the model complexity is controlled by a coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem: each video frame is characterized by a covariance matrix descriptor encoding the motion information and is then classified as normal or abnormal. Experiments conducted on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset demonstrate the promising results of the proposed online LS-OC-SVM method.
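The train-then-update workflow can be illustrated with a deliberately simplified online one-class detector: it keeps a running mean of feature vectors assumed normal and flags frames far from that mean. This is a stand-in for the idea only; it does not implement the least-squares SVM optimization or the coherence-based sparsification of the paper, and the threshold is an assumed value.

```python
class OnlineOneClass:
    """Much-simplified online one-class detector: maintains a running mean
    of normal feature vectors and flags inputs whose Euclidean distance to
    the mean exceeds a threshold.  Illustrates the online normal-model
    update of LS-OC-SVM, not its actual hyperplane solution."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.mean = None
        self.n = 0

    def update(self, x):
        """Incorporate a feature vector assumed to be normal."""
        if self.mean is None:
            self.mean = list(x)
            self.n = 1
        else:
            self.n += 1
            for i, xi in enumerate(x):
                self.mean[i] += (xi - self.mean[i]) / self.n  # running mean

    def is_abnormal(self, x):
        d = sum((xi - mi) ** 2 for xi, mi in zip(x, self.mean)) ** 0.5
        return d > self.threshold
```

In the paper, the feature vector per frame would be the (vectorized) covariance matrix descriptor of the motion information.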

  8. Detecting regular sound changes in linguistics as events of concerted evolution.

    Science.gov (United States)

    Hruschka, Daniel J; Branford, Simon; Smith, Eric D; Wilkins, Jon; Meade, Andrew; Pagel, Mark; Bhattacharya, Tanmoy

    2015-01-05

    Concerted evolution is normally used to describe parallel changes at different sites in a genome, but it is also observed in languages where a specific phoneme changes to the same other phoneme in many words in the lexicon—a phenomenon known as regular sound change. We develop a general statistical model that can detect concerted changes in aligned sequence data and apply it to study regular sound changes in the Turkic language family. Linguistic evolution, unlike the genetic substitutional process, is dominated by events of concerted evolutionary change. Our model identified more than 70 historical events of regular sound change that occurred throughout the evolution of the Turkic language family, while simultaneously inferring a dated phylogenetic tree. Including regular sound changes yielded an approximately 4-fold improvement in the characterization of linguistic change over a simpler model of sporadic change, improved phylogenetic inference, and returned more reliable and plausible dates for events on the phylogenies. The historical timings of the concerted changes closely follow a Poisson process model, and the sound transition networks derived from our model mirror linguistic expectations. We demonstrate that a model with no prior knowledge of complex concerted or regular changes can nevertheless infer the historical timings and genealogical placements of events of concerted change from the signals left in contemporary data. Our model can be applied wherever discrete elements—such as genes, words, cultural trends, technologies, or morphological traits—can change in parallel within an organism or other evolving group. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Detection of ULF geomagnetic signals associated with seismic events in Central Mexico using Discrete Wavelet Transform

    Directory of Open Access Journals (Sweden)

    O. Chavez

    2010-12-01

    Full Text Available The geomagnetic observatory of Juriquilla, Mexico, located at longitude –100.45°, latitude 20.70°, and 1946 m a.s.l., has been operational since June 2004, compiling geomagnetic field measurements with a three-component fluxgate magnetometer. In this paper, the results of the analysis of these measurements in relation to important seismic activity in the period 2007 to 2009 are presented. For this purpose, we used superposed epochs of the Discrete Wavelet Transform of filtered signals for the three components of the geomagnetic field during relative seismic calm and compared them with seismic events of magnitude Ms > 5.5 that occurred in Mexico. The analysed epochs consisted of 18 h of observations for a dataset corresponding to 18 different earthquakes (EQs). The time series were processed for a period of 9 h prior to and 9 h after each seismic event. This data processing was compared with the same number of observations during a seismic calm. The proposed methodology proved to be an efficient tool to detect signals associated with seismic activity, especially when the seismic events occur at a distance D from the observatory to the EQ such that the ratio D/ρ < 1.8, where ρ is the radius of the earthquake preparation zone. The methodology presented herein shows important anomalies in the Ultra Low Frequency range (ULF; 0.005–1 Hz), primarily from 0.25 to 0.5 Hz. Furthermore, the time variance (σ²) of the obtained D1 coefficient increases prior to, during, and after the seismic event, principally in the Bx (N-S) and By (E-W) geomagnetic components. Therefore, this paper proposes and develops a new methodology to extract the abnormal signals of geomagnetic anomalies related to different stages of EQs.
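The core signal-processing step, a discrete wavelet decomposition whose first-level detail coefficients (D1) are monitored through their variance σ², can be sketched with a minimal single-level Haar DWT. The Haar basis is an assumption for illustration; the study's actual mother wavelet and filtering chain are not specified in the abstract.

```python
def haar_dwt_level1(signal):
    """One level of the Haar discrete wavelet transform: returns the
    approximation (A1) and detail (D1) coefficient lists."""
    a, d = [], []
    for i in range(0, len(signal) - 1, 2):
        s, t = signal[i], signal[i + 1]
        a.append((s + t) / 2 ** 0.5)  # low-pass (approximation)
        d.append((s - t) / 2 ** 0.5)  # high-pass (detail, D1)
    return a, d

def variance(xs):
    """Population variance, the sigma^2 statistic tracked over epochs."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)
```

A rise of `variance(d)` over superposed epochs relative to seismically calm periods is the anomaly signature described in the abstract.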

  10. Statistical improvement in detection level of gravitational microlensing events from their light curves

    Science.gov (United States)

    Ibrahim, Ichsan; Malasan, Hakim L.; Kunjaya, Chatief; Timur Jaelani, Anton; Puannandra Putri, Gerhana; Djamal, Mitra

    2018-04-01

    In astronomy, the brightness of a source is typically expressed in terms of magnitude. Conventionally, the magnitude is defined by the logarithm of the received flux, a relationship known as the Pogson formula. For received flux with a small signal-to-noise ratio (S/N), however, the formula gives a large magnitude error. We investigate whether the use of the inverse hyperbolic sine function (hereafter referred to as the Asinh magnitude) in modified formulae could allow an alternative calculation of magnitudes for small S/N fluxes, and whether the new approach better represents the brightness in that regime. We study the possibility of increasing the detection level of gravitational microlensing using 40 selected microlensing light curves from the 2013 and 2014 seasons and the Asinh magnitude. Photometric data for the selected events are obtained from the Optical Gravitational Lensing Experiment (OGLE). We found that utilization of the Asinh magnitude makes the events brighter compared with the logarithmic magnitude, by an average of about 3.42 × 10⁻² magnitude, with an average difference in error between the logarithmic and the Asinh magnitude of about 2.21 × 10⁻² magnitude. The microlensing events OB140847 and OB140885 show the largest difference values among the selected events. Using a Gaussian fit to find the peak for OB140847 and OB140885, we conclude statistically that the Asinh magnitude gives better mean squared values of the regression and narrower residual histograms than the Pogson magnitude. Based on these results, we also attempt to propose a limiting magnitude below which use of the Asinh magnitude is optimal for small S/N data.
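The two magnitude definitions compared here can be written down directly. The asinh form follows Lupton, Gunn & Szalay (1999): m = -(2.5/ln 10)[asinh(x/2b) + ln b], where x is the flux normalized to the zero-point flux and b is a softening parameter; the value b = 10⁻³ below is an illustrative choice, not one taken from the paper.

```python
import math

LN10 = math.log(10.0)

def pogson_mag(flux_ratio):
    """Conventional Pogson magnitude: m = -2.5 log10(f / f0).
    Diverges as the flux ratio goes to zero."""
    return -2.5 * math.log10(flux_ratio)

def asinh_mag(flux_ratio, b=1e-3):
    """Asinh magnitude: m = -(2.5 / ln 10) * (asinh(x / 2b) + ln b).
    Agrees with Pogson for x >> b but stays finite (and well-behaved)
    for zero or even slightly negative measured fluxes."""
    return -(2.5 / LN10) * (math.asinh(flux_ratio / (2.0 * b)) + math.log(b))
```

For a bright source (x = 1) the two formulas agree to well under a millimagnitude with this b, while at zero flux the Pogson magnitude is undefined but the asinh magnitude remains finite.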

  11. Building a Global Catalog of Nonvolcanic Tremor Events Using an Automatic Detection Algorithm

    Science.gov (United States)

    Bagley, B. C.; Revenaugh, J.

    2009-12-01

    Nonvolcanic tremor is characterized by a long-period seismic event containing a series of low-frequency earthquakes (LFEs). Tremor has been detected in regions of subduction (e.g., Kao et al. 2007, 2008; Shelly 2006) and beneath the San Andreas fault near Cholame, California (e.g., Nadeau and Dolenc, 2005). In some cases tremor events seem to have periodicity, and these are often referred to as episodic tremor and slip (ETS). The origin of nonvolcanic tremor has been ascribed to shear slip along plate boundaries and/or high pore-fluid pressure. The apparent periodicity and tectonic setting associated with ETS have led to the suggestion that there may be a link between ETS and megathrust earthquakes. Until recently, tremor detection was a manual process requiring visual inspection of seismic data. In areas that have dense seismic arrays (e.g., Japan), waveform cross-correlation techniques have been successfully employed (e.g., Obara, 2002). Kao et al. (2007) developed an algorithm for automatic detection of seismic tremor that can be used in regions without dense arrays. This method has been used to create the Tremor Activity Monitoring System (TAMS), which is used by the Geological Survey of Canada to monitor northern Cascadia. So far, the study of nonvolcanic tremor has been limited to regions of subduction or along major transform faults. It is unknown whether tremor events occur in other tectonic settings, or whether the current detection schemes will be useful for finding them. We propose to look for tremor events in non-subduction regions. It is possible that if tremor exists in other regions it will have different characteristics and may not trigger the TAMS system or be amenable to other existing detection schemes. We are developing algorithms for searching sparse array datasets for quasi-harmonic energy bursts in hopes of recognizing and cataloging nonvolcanic tremor in an expanded tectonic setting. Statistical comparisons against the TAMS algorithm will be made if

  12. Single Versus Multiple Events Error Potential Detection in a BCI-Controlled Car Game With Continuous and Discrete Feedback.

    Science.gov (United States)

    Kreilinger, Alex; Hiebel, Hannah; Müller-Putz, Gernot R

    2016-03-01

    This work aimed to find and evaluate a new method for detecting errors in continuous brain-computer interface (BCI) applications. Instead of classifying errors on a single-trial basis, the new method was based on multiple events (MEs) analysis to increase the accuracy of error detection. In a BCI-driven car game, based on motor imagery (MI), discrete events were triggered whenever subjects collided with coins and/or barriers. Coins counted as correct events, whereas barriers were errors. This new method, termed ME method, combined and averaged the classification results of single events (SEs) and determined the correctness of MI trials, which consisted of event sequences instead of SEs. The benefit of this method was evaluated in an offline simulation. In an online experiment, the new method was used to detect erroneous MI trials. Such MI trials were discarded and could be repeated by the users. We found that, even with low SE error potential (ErrP) detection rates, feasible accuracies can be achieved when combining MEs to distinguish erroneous from correct MI trials. Online, all subjects reached higher scores with error detection than without, at the cost of longer times needed for completing the game. Findings suggest that ErrP detection may become a reliable tool for monitoring continuous states in BCI applications when combining MEs. This paper demonstrates a novel technique for detecting errors in online continuous BCI applications, which yields promising results even with low single-trial detection rates.
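The ME method's core step, averaging the single-event ErrP classifier outputs over the events of a motor-imagery trial and thresholding the result, can be sketched in a few lines. The 0.5 decision threshold below is an illustrative assumption, not a value reported in the abstract.

```python
def trial_is_error(event_probs, threshold=0.5):
    """Multiple-events (ME) combination: a motor-imagery trial is flagged
    as erroneous when the mean single-event error probability over its
    event sequence crosses the threshold.  Averaging suppresses the noise
    of individual single-event ErrP detections."""
    return sum(event_probs) / len(event_probs) > threshold
```

Even when individual single-event probabilities hover near chance, a consistent tilt across several events pushes the mean past the threshold, which is how the method tolerates low single-trial ErrP detection rates.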

  13. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino, E-mail: martino@cardiol.br; Peixoto, Giselle de Lima; Lima, José Jayme Galvão de; Siqueira, Sérgio Freitas de; Costa, Roberto; Gowdak, Luís Henrique Wolff [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Paula, Flávio Jota de [Unidade de Transplante Renal - Divisão de Urologia do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil); Kalil Filho, Roberto; Ramires, José Antônio Franchini [Instituto do Coração do Hospital das Clínicas da Faculdade de Medicina da Universidade de São Paulo, São Paulo, SP (Brazil)

    2015-11-15

    The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  14. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Directory of Open Access Journals (Sweden)

    Rodrigo Tavares Silva

    2015-11-01

    Full Text Available Abstract Background: The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective: We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods: A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results: During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002) and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions: In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT.

  15. Predictors of Arrhythmic Events Detected by Implantable Loop Recorders in Renal Transplant Candidates

    Science.gov (United States)

    Silva, Rodrigo Tavares; Martinelli Filho, Martino; Peixoto, Giselle de Lima; de Lima, José Jayme Galvão; de Siqueira, Sérgio Freitas; Costa, Roberto; Gowdak, Luís Henrique Wolff; de Paula, Flávio Jota; Kalil Filho, Roberto; Ramires, José Antônio Franchini

    2015-01-01

    Background The recording of arrhythmic events (AE) in renal transplant candidates (RTCs) undergoing dialysis is limited by conventional electrocardiography. However, continuous cardiac rhythm monitoring seems to be more appropriate due to automatic detection of arrhythmia, but this method has not been used. Objective We aimed to investigate the incidence and predictors of AE in RTCs using an implantable loop recorder (ILR). Methods A prospective observational study conducted from June 2009 to January 2011 included 100 consecutive ambulatory RTCs who underwent ILR and were followed-up for at least 1 year. Multivariate logistic regression was applied to define predictors of AE. Results During a mean follow-up of 424 ± 127 days, AE could be detected in 98% of patients, and 92% had more than one type of arrhythmia, with most considered potentially not serious. Sustained atrial tachycardia and atrial fibrillation occurred in 7% and 13% of patients, respectively, and bradyarrhythmia and non-sustained or sustained ventricular tachycardia (VT) occurred in 25% and 57%, respectively. There were 18 deaths, of which 7 were sudden cardiac events: 3 bradyarrhythmias, 1 ventricular fibrillation, 1 myocardial infarction, and 2 undetermined. The presence of a long QTc (odds ratio [OR] = 7.28; 95% confidence interval [CI], 2.01–26.35; p = 0.002), and the duration of the PR interval (OR = 1.05; 95% CI, 1.02–1.08; p < 0.001) were independently associated with bradyarrhythmias. Left ventricular dilatation (LVD) was independently associated with non-sustained VT (OR = 2.83; 95% CI, 1.01–7.96; p = 0.041). Conclusions In medium-term follow-up of RTCs, ILR helped detect a high incidence of AE, most of which did not have clinical relevance. The PR interval and presence of long QTc were predictive of bradyarrhythmias, whereas LVD was predictive of non-sustained VT. PMID:26351983

  16. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II), in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, southwestern Taiwan, after 2009 Typhoon Morakot provides the main material for this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck southern Taiwan from Aug. 6-10, 2009, and dumped nearly 2,000 mm of rainfall on the Chishan river watershed. The 24-hour, 48-hour, and 72-hour accumulated rainfalls in the Chishan river watershed exceeded the 200-year return-period accumulated rainfall. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km², equal to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of an extreme rainfall-induced landslide event are dense landslide distribution and a large share of downslope landslide areas owing to headward erosion and bank erosion during the flooding process. The area of downslope landslides in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times that of upslope landslide areas. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been proven to exceed 70%. The model performance and applicability of the four models in a landslide-prone watershed with dense distribution of rainfall
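Of the four methods compared, the frequency ratio (FR) is the simplest to state: for each class of a causative factor (slope, lithology, etc.), FR is the landslide density inside the class relative to the watershed-wide density, and the susceptibility index at a cell is the sum of the FR values of the classes the cell falls in. A minimal sketch (variable names are illustrative):

```python
def frequency_ratio(landslide_cells, class_cells, total_landslide, total_cells):
    """Frequency ratio for one factor class:
    (landslide cells in class / all landslide cells)
    / (cells in class / all cells).
    FR > 1 means the class is more landslide-prone than average."""
    return (landslide_cells / total_landslide) / (class_cells / total_cells)

def susceptibility_index(frs):
    """FR-based landslide susceptibility index of a cell: the sum of the
    FR values of the factor classes present at that cell."""
    return sum(frs)
```

The resulting index map is then thresholded (or ranked) to classify susceptibility and validated against the post-Morakot landslide inventory.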

  17. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can serve as a reliable input for controlling drop-foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered preferable due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed in previous studies to detect toe-off (TO) and heel-strike (HS) gait events. While these algorithms achieve relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair-ascent and stair-descent terrains. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the algorithm is robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in applications such as drop-foot correction devices and leg prostheses.
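The jerk-plus-peak-heuristics pipeline can be sketched as: differentiate the acceleration to get jerk, then pick local maxima subject to height and spacing constraints. The thresholds below are illustrative stand-ins for the paper's terrain-specific heuristics, which are not detailed in the abstract.

```python
def jerk(accel, dt):
    """Jerk signal: first difference of the acceleration samples."""
    return [(b - a) / dt for a, b in zip(accel, accel[1:])]

def find_peaks(signal, min_height, min_gap):
    """Peak heuristic: local maxima above min_height that are at least
    min_gap samples apart (enforcing a plausible minimum time between
    successive gait events).  Returns peak indices."""
    peaks, last = [], -min_gap
    for i in range(1, len(signal) - 1):
        if (signal[i] > min_height
                and signal[i] >= signal[i - 1]
                and signal[i] >= signal[i + 1]
                and i - last >= min_gap):
            peaks.append(i)
            last = i
    return peaks
```

In the full method, a time-frequency analysis of the jerk signal first estimates gait parameters (such as step period), which would set `min_gap` adaptively per terrain.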

  18. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetration in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) to detect SPREs. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring under clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power are used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA significantly improves on the performance of the SDA, and it performs as well as or better than L1-SW with substantially less computation time.
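The first stage, swinging-door segmentation of the power series into piecewise-linear pieces, can be sketched as follows. This is a compact re-derivation of the swinging door criterion (a segment closes when no straight line from its start can pass within a tolerance epsilon of every point seen so far), not NREL's OpSDA code; the dynamic-programming ramp-merging stage is omitted.

```python
def swinging_door_segments(values, epsilon):
    """Swinging door segmentation: track the admissible slope interval of
    a line through the segment start that covers every subsequent point
    within epsilon; when the interval empties (the 'doors' cross), close
    the segment at the previous point.  Returns (start, end) index pairs."""
    segments = []
    start, i = 0, 1
    slope_min, slope_max = float("-inf"), float("inf")
    while i < len(values):
        dx = i - start
        # admissible slope interval for covering point i within epsilon
        slope_min = max(slope_min, (values[i] - values[start] - epsilon) / dx)
        slope_max = min(slope_max, (values[i] - values[start] + epsilon) / dx)
        if slope_min > slope_max:          # doors crossed: close the segment
            segments.append((start, i - 1))
            start = i - 1                  # new pivot at last covered point
            slope_min, slope_max = float("-inf"), float("inf")
            i = start + 1                  # re-open the doors from the pivot
            continue
        i += 1
    segments.append((start, len(values) - 1))
    return segments
```

In OpSDA, adjacent segments whose slopes satisfy the ramp decision thresholds would then be merged into significant ramps by dynamic programming.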

  19. Leveraging KVM Events to Detect Cache-Based Side Channel Attacks in a Virtualization Environment

    Directory of Open Access Journals (Sweden)

    Ady Wahyudi Paundu

    2018-01-01

    Full Text Available Cache-based side channel attack (CSCa) techniques in virtualization systems are becoming more advanced, while defense methods against them are still perceived as impractical. The most recent CSCa variant, called Flush + Flush, has shown that current detection methods can be easily bypassed. In this work, we introduce a novel monitoring approach to detect CSCa operations inside a virtualization environment. We utilize Kernel-based Virtual Machine (KVM) event data in the kernel and process this data using a machine learning technique to identify any CSCa operation in the guest Virtual Machine (VM). We evaluate our approach using Receiver Operating Characteristic (ROC) diagrams of multiple attack and benign operation scenarios. Our method successfully separates the CSCa datasets from the non-CSCa datasets in both trained and non-trained data scenarios, including the Flush + Flush attack scenario. We are also able to explain the classification results by extracting the set of most important features that separate both classes using their Fisher scores, and we show that our monitoring approach can work to detect CSCa in general. Finally, we evaluate the overhead of our CSCa monitoring method and show that it imposes a negligible computation overhead on the host and the guest VM.

  20. Review of nuclear power reactor coolant system leakage events and leak detection requirements

    International Nuclear Information System (INIS)

    Chokshi, N.C.; Srinivasan, M.; Kupperman, D.S.; Krishnaswamy, P.

    2005-01-01

    In response to the vessel head event at the Davis-Besse reactor, the U.S. Nuclear Regulatory Commission (NRC) formed a Lessons Learned Task Force (LLTF). Four action plans were formulated to respond to the recommendations of the LLTF. The action plans involved efforts on barrier integrity, stress corrosion cracking (SCC), operating experience, and inspection and program management. One part of the action plan on barrier integrity was an assessment to identify potential safety benefits from changes in requirements pertaining to leakage in the reactor coolant system (RCS). In this effort, experiments and models were reviewed to identify correlations between crack size, crack-tip-opening displacement (CTOD), and leak rate in the RCS. Sensitivity studies using the Seepage Quantification of Upsets In Reactor Tubes (SQUIRT) code were carried out to correlate crack parameters, such as crack size, with leak rate for various types of crack configurations in RCS components. A database that identifies the leakage source, leakage rate, and resulting actions from RCS leaks discovered in U.S. light water reactors was developed. Humidity monitoring systems for detecting leakage and acoustic emission crack monitoring systems for the detection of crack initiation and growth before a leak occurs were also considered. New approaches to the detection of a leak in the reactor head region by monitoring boric-acid aerosols were also considered. (authors)

  1. Ultra-Low Power Sensor System for Disaster Event Detection in Metro Tunnel Systems

    Directory of Open Access Journals (Sweden)

    Jonah VINCKE

    2017-05-01

    Full Text Available In this extended paper, the concept for an ultra-low power wireless sensor network (WSN) for underground tunnel systems is presented, highlighting the chosen sensors. Its objectives are the detection of emergency events arising either from natural disasters, such as flooding or fire, or from terrorist attacks using explosives. Earlier works have demonstrated that the power consumption for communication can be reduced to the point that the data acquisition (i.e., sensor) sub-system becomes the most significant energy consumer. By using ultra-low power components for the smoke detector, a hydrostatic pressure sensor for water-ingress detection, and a passive acoustic emission sensor for explosion detection, all considered threats are covered while the energy consumption of data acquisition can be kept very low. In addition to [1], the sensor system is integrated into a sensor board. The total average power consumption for operating the sensor sub-system is measured to be 35.9 µW for lower and 7.8 µW for upper nodes.
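
    As a rough plausibility check of those figures, a back-of-the-envelope battery-life estimate can be computed from the reported averages (the battery capacity and voltage below are hypothetical assumptions, not values from the paper):

    ```python
    def lifetime_years(p_uw, capacity_mah=2400.0, voltage=3.0):
        """Idealized battery lifetime in years for a constant average draw
        p_uw (microwatts), ignoring self-discharge and conversion losses."""
        energy_j = capacity_mah * 1e-3 * 3600 * voltage   # mAh -> joules
        return energy_j / (p_uw * 1e-6) / (3600 * 24 * 365)

    lower = lifetime_years(35.9)   # lower nodes: 35.9 uW average
    upper = lifetime_years(7.8)    # upper nodes: 7.8 uW average
    ```

    Even with these toy assumptions, the sensor sub-system alone would run for decades on a single cell, which is why the communication sub-system, not sensing, usually dominates the design budget.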

  2. Detecting Smoking Events Using Accelerometer Data Collected Via Smartwatch Technology: Validation Study.

    Science.gov (United States)

    Cole, Casey A; Anshari, Dien; Lambert, Victoria; Thrasher, James F; Valafar, Homayoun

    2017-12-13

    Smoking is the leading cause of preventable death in the world today. Ecological research on smoking in context currently relies on self-reported smoking behavior. Emerging smartwatch technology may more objectively measure smoking behavior by automatically detecting smoking sessions using robust machine learning models. This study aimed to examine the feasibility of detecting smoking behavior using smartwatches. The second aim of this study was to compare the success of observing smoking behavior with smartwatches to that of conventional self-reporting. A convenience sample of smokers was recruited for this study. Participants (N=10) recorded 12 hours of accelerometer data using a mobile phone and smartwatch. During these 12 hours, they engaged in various daily activities, including smoking, and logged the beginning and end of each smoking session. Raw data were classified as either smoking or nonsmoking using a machine learning model for pattern recognition. The accuracy of the model was evaluated by comparing the output with a detailed description of a modeled smoking session. In total, 120 hours of data were collected from participants and analyzed. The accuracy of self-reported smoking was approximately 78% (96/123). Our model successfully detected 100 of 123 (81%) smoking sessions recorded by participants. After eliminating sessions from participants who did not adhere to study protocols, the true positive detection rate of the smartwatch-based detection increased to more than 90%. During the 120 hours of combined observation time, only 22 false positive smoking sessions were detected, resulting in a 2.8% false positive rate. Smartwatch technology can provide an accurate, nonintrusive means of monitoring smoking behavior in natural contexts. The use of machine learning algorithms for passively detecting smoking sessions may enrich ecological momentary assessment protocols and cessation intervention studies that often rely on self-report.
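
    A heavily simplified sketch of the windowed-accelerometer approach (synthetic toy signals and a nearest-centroid stand-in for the study's machine learning model; none of the parameters come from the paper):

    ```python
    import numpy as np

    def window_features(acc, win=50):
        """Slice a 1-D accelerometer-magnitude signal into fixed-length
        windows and compute simple per-window features: mean, std, range."""
        n = len(acc) // win
        w = acc[: n * win].reshape(n, win)
        return np.column_stack([w.mean(axis=1), w.std(axis=1), np.ptp(w, axis=1)])

    rng = np.random.default_rng(1)
    t = np.arange(1000) / 25.0                                  # 40 s at 25 Hz
    # Toy signals: a slow rhythmic oscillation standing in for hand-to-mouth
    # gestures, versus rest (both simulated, not real recordings)
    smoking = 1.0 + 0.8 * np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 0.1, t.size)
    resting = 1.0 + rng.normal(0, 0.1, t.size)

    F = np.vstack([window_features(smoking), window_features(resting)])
    labels = np.array([1] * 20 + [0] * 20)

    # Nearest-centroid classification of each 2-second window
    c1, c0 = F[labels == 1].mean(axis=0), F[labels == 0].mean(axis=0)
    pred = (np.linalg.norm(F - c1, axis=1) < np.linalg.norm(F - c0, axis=1)).astype(int)
    accuracy = (pred == labels).mean()
    ```

    This toy classifies the same windows it was fit on, purely to show the feature pipeline; a real study would train and evaluate on separate participants, as the paper does.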

  3. First events from the CNGS neutrino beam detected in the OPERA experiment

    CERN Document Server

    Acquafredda, R.; Ambrosio, M.; Anokhina, A.; Aoki, S.; Ariga, A.; Arrabito, L.; Autiero, D.; Badertscher, A.; Bergnoli, A.; Bersani Greggio, F.; Besnier, M.; Beyer, M.; Bondil-Blin, S.; Borer, K.; Boucrot, J.; Boyarkin, V.; Bozza, C.; Brugnera, R.; Buontempo, S.; Caffari, Y.; Campagne, Jean-Eric; Carlus, B.; Carrara, E.; Cazes, A.; Chaussard, L.; Chernyavsky, M.; Chiarella, V.; Chon-Sen, N.; Chukanov, A.; Ciesielski, R.; Consiglio, L.; Cozzi, M.; Dal Corso, F.; D'Ambrosio, N.; Damet, J.; De Lellis, G.; Declais, Y.; Descombes, T.; De Serio, M.; Di Capua, F.; Di Ferdinando, D.; Di Giovanni, A.; Di Marco, N.; Di Troia, C.; Dmitrievski, S.; Dracos, M.; Duchesneau, D.; Dulach, B.; Dusini, S.; Ebert, J.; Enikeev, R.; Ereditato, A.; Esposito, L.S.; Fanin, C.; Favier, J.; Felici, G.; Ferber, T.; Fournier, L.; Franceschi, A.; Frekers, D.; Fukuda, T.; Fukushima, C.; Galkin, V.I.; Galkin, V.A.; Gallet, R.; Garfagnini, A.; Gaudiot, G.; Giacomelli, G.; Giarmana, O.; Giorgini, M.; Girard, L.; Girerd, C.; Goellnitz, C.; Goldberg, J.; Gornoushkin, Y.; Grella, G.; Grianti, F.; Guerin, C.; Guler, M.; Gustavino, C.; Hagner, C.; Hamane, T.; Hara, T.; Hauger, M.; Hess, M.; Hoshino, K.; Ieva, M.; Incurvati, M.; Jakovcic, K.; Janicsko Csathy, J.; Janutta, B.; Jollet, C.; Juget, F.; Kazuyama, M.; Kim, S.H.; Kimura, M.; Knuesel, J.; Kodama, K.; Kolev, D.; Komatsu, M.; Kose, U.; Krasnoperov, A.; Kreslo, I.; Krumstein, Z.; Laktineh, I.; de La Taille, C.; Le Flour, T.; Lieunard, S.; Ljubicic, A.; Longhin, A.; Malgin, A.; Manai, K.; Mandrioli, G.; Mantello, U.; Marotta, A.; Marteau, J.; Martin-Chassard, G.; Matveev, V.; Messina, M.; Meyer, L.; Micanovic, S.; Migliozzi, P.; Miyamoto, S.; Monacelli, Piero; Monteiro, I.; Morishima, K.; Moser, U.; Muciaccia, M.T.; Mugnier, P.; Naganawa, N.; Nakamura, M.; Nakano, T.; Napolitano, T.; Natsume, M.; Niwa, K.; Nonoyama, Y.; Nozdrin, A.; Ogawa, S.; Olchevski, A.; Orlandi, D.; Ossetski, D.; Paoloni, A.; Park, B.D.; Park, I.G.; Pastore, A.; Patrizii, 
L.; Pellegrino, L.; Pessard, H.; Pilipenko, V.; Pistillo, C.; Polukhina, N.; Pozzato, M.; Pretzl, K.; Publichenko, P.; Raux, L.; Repellin, J.P.; Roganova, T.; Romano, G.; Rosa, G.; Rubbia, A.; Ryasny, V.; Ryazhskaya, O.; Ryzhikov, D.; Sadovski, A.; Sanelli, C.; Sato, O.; Sato, Y.; Saveliev, V.; Savvinov, N.; Sazhina, G.; Schembri, A.; Schmidt Parzefall, W.; Schroeder, H.; Schutz, H.U.; Scotto Lavina, L.; Sewing, J.; Shibuya, H.; Simone, S.; Sioli, M.; Sirignano, C.; Sirri, G.; Song, J.S.; Spaeti, R.; Spinetti, M.; Stanco, L.; Starkov, N.; Stipcevic, M.; Strolin, Paolo Emilio; Sugonyaev, V.; Takahashi, S.; Tereschenko, V.; Terranova, F.; Tezuka, I.; Tioukov, V.; Tikhomirov, I.; Tolun, P.; Toshito, T.; Tsarev, V.; Tsenov, R.; Ugolino, U.; Ushida, N.; Van Beek, G.; Verguilov, V.; Vilain, P.; Votano, L.; Vuilleumier, J.L.; Waelchli, T.; Waldi, R.; Weber, M.; Wilquet, G.; Wonsak, B.; Wurth, R.; Wurtz, J.; Yakushev, V.; Yoon, C.S.; Zaitsev, Y.; Zamboni, I.; Zimmerman, R.

    2006-01-01

    The OPERA neutrino detector at the underground Gran Sasso Laboratory (LNGS) was designed to perform the first detection of neutrino oscillations in appearance mode, through the study of nu_mu to nu_tau oscillations. The apparatus consists of a lead/emulsion-film target complemented by electronic detectors. It is placed in the high-energy, long-baseline CERN to LNGS beam (CNGS) 730 km away from the neutrino source. In August 2006 a first run with CNGS neutrinos was successfully conducted. A first sample of neutrino events was collected, statistically consistent with the integrated beam intensity. After a brief description of the beam and of the various sub-detectors, we report on the achievement of this milestone, presenting the first data and some analysis results.

  4. The Event Detection and the Apparent Velocity Estimation Based on Computer Vision

    Science.gov (United States)

    Shimojo, M.

    2012-08-01

    The high spatial and time resolution data obtained by the telescopes aboard Hinode have revealed new and interesting dynamics in the solar atmosphere. In order to detect such events and estimate the velocity of the dynamics automatically, we examined optical-flow estimation methods based on OpenCV, the computer vision library. We applied the methods to a prominence eruption observed by NoRH and to a polar X-ray jet observed by XRT. As a result, it is clear that the methods work well for solar images if the images are optimized for them. This indicates that the optical-flow estimation methods in the OpenCV library are very useful for analyzing solar phenomena.
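
    OpenCV's optical-flow routines (e.g. `cv2.calcOpticalFlowFarneback`) wrap estimators of this kind; the core least-squares step of the classic Lucas-Kanade method they build on can be sketched in plain NumPy (synthetic frames, single global translation only):

    ```python
    import numpy as np

    def lucas_kanade_global(im1, im2):
        """Estimate one global (dx, dy) translation between two frames by
        solving the optical-flow constraint Ix*dx + Iy*dy = -It in the
        least-squares sense over all pixels."""
        Iy, Ix = np.gradient(im1)          # spatial gradients (axis 0 = y)
        It = im2 - im1                     # temporal gradient
        A = np.column_stack([Ix.ravel(), Iy.ravel()])
        b = -It.ravel()
        (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
        return dx, dy

    # Synthetic test: a smooth blob shifted by one pixel in x between frames
    y, x = np.mgrid[0:64, 0:64]
    im1 = np.exp(-((x - 30) ** 2 + (y - 32) ** 2) / 50.0)
    im2 = np.exp(-((x - 31) ** 2 + (y - 32) ** 2) / 50.0)
    dx, dy = lucas_kanade_global(im1, im2)
    ```

    The recovered (dx, dy) is close to (1, 0). Real pipelines estimate flow per pixel or per feature point and add pyramids for large motions, which is what the OpenCV implementations provide.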

  5. Detection of Cardiopulmonary Activity and Related Abnormal Events Using Microsoft Kinect Sensor.

    Science.gov (United States)

    Al-Naji, Ali; Chahl, Javaan

    2018-03-20

    Monitoring of cardiopulmonary activity is a challenge when attempted under adverse conditions, including different sleeping postures, varying environmental settings, and an unclear region of interest (ROI). This study proposes an efficient remote imaging system based on a Microsoft Kinect v2 sensor for the observation of cardiopulmonary signals and the detection of related abnormal cardiopulmonary events (e.g., tachycardia, bradycardia, tachypnea, bradypnea, and central apnoea) in many possible sleeping postures within varying environmental settings, including total darkness and whether or not the subject is covered by a blanket. The proposed system extracts the signal from the abdominal-thoracic region, where cardiopulmonary activity is most pronounced, using a real-time image sequence captured by the Kinect v2 sensor. The proposed system shows promising results in any sleep posture, regardless of illumination conditions and an unclear ROI, even in the presence of a blanket, whilst being reliable, safe, and cost-effective.

  6. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system operating at the edge of stability, in which only a linear static readout layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks based solely on a few low-range, high-noise distance sensors. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated both in a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
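
    A minimal echo state network, one common RC instantiation, illustrates the fixed-reservoir/trained-readout split described above (toy short-term-memory task; all sizes and scalings are illustrative choices, not the paper's):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_res, delay = 100, 3

    # Fixed random reservoir, rescaled so its spectral radius is below 1
    # (the "edge of stability" regime the text refers to)
    W = rng.normal(0, 1, (n_res, n_res))
    W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-0.5, 0.5, n_res)

    def run_reservoir(u):
        """Drive the reservoir with a scalar input sequence, collect states."""
        x, states = np.zeros(n_res), []
        for ut in u:
            x = np.tanh(W @ x + W_in * ut)
            states.append(x.copy())
        return np.array(states)

    # Toy task: reproduce the input from `delay` steps ago
    u = rng.uniform(-1, 1, 500)
    states = run_reservoir(u)
    X, y = states[50:], u[50 - delay:-delay]   # drop a 50-step washout

    # Only the linear readout is trained, here by ridge regression
    w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
    nmse = np.mean((X @ w_out - y) ** 2) / np.var(y)
    ```

    The reservoir weights are never touched after initialization; all learning happens in the one-shot linear solve for `w_out`, which is what makes RC training cheap on small robots.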

  7. Decision support methods for the detection of adverse events in post-marketing data.

    Science.gov (United States)

    Hauben, M; Bate, A

    2009-04-01

    Spontaneous reporting is a crucial component of post-marketing drug safety surveillance despite its significant limitations. The size and complexity of some spontaneous reporting system databases represent a challenge for drug safety professionals, who traditionally have relied heavily on the scientific and clinical acumen of the prepared mind. Computer algorithms that calculate statistical measures of reporting frequency for huge numbers of drug-event combinations are increasingly used to support pharmacovigilance analysts screening large spontaneous reporting system databases. After an overview of pharmacovigilance and spontaneous reporting systems, we discuss the theory and application of contemporary computer algorithms in regular use and those under development, and the practical considerations involved in the implementation of computer algorithms within a comprehensive and holistic drug safety signal detection program.
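
    One of the simplest such reporting-frequency measures is the proportional reporting ratio (PRR), computed from a 2x2 table of report counts; a sketch with hypothetical counts (the numbers are invented for illustration):

    ```python
    def prr(a, b, c, d):
        """Proportional reporting ratio from a 2x2 report-count table:
        a = reports with the drug and the event, b = the drug without the event,
        c = all other drugs with the event, d = other drugs without the event."""
        return (a / (a + b)) / (c / (c + d))

    # Hypothetical counts for a single drug-event combination
    value = prr(a=20, b=180, c=100, d=9700)   # roughly 9.8
    ```

    A PRR well above 1 (often combined with count and chi-square thresholds) flags a drug-event pair for clinical review; it is a screening aid, not proof of causation.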

  8. Detection of Cardiopulmonary Activity and Related Abnormal Events Using Microsoft Kinect Sensor

    Directory of Open Access Journals (Sweden)

    Ali Al-Naji

    2018-03-01

    Full Text Available Monitoring of cardiopulmonary activity is a challenge when attempted under adverse conditions, including different sleeping postures, varying environmental settings, and an unclear region of interest (ROI). This study proposes an efficient remote imaging system based on a Microsoft Kinect v2 sensor for the observation of cardiopulmonary signals and the detection of related abnormal cardiopulmonary events (e.g., tachycardia, bradycardia, tachypnea, bradypnea, and central apnoea) in many possible sleeping postures within varying environmental settings, including total darkness and whether or not the subject is covered by a blanket. The proposed system extracts the signal from the abdominal-thoracic region, where cardiopulmonary activity is most pronounced, using a real-time image sequence captured by the Kinect v2 sensor. The proposed system shows promising results in any sleep posture, regardless of illumination conditions and an unclear ROI, even in the presence of a blanket, whilst being reliable, safe, and cost-effective.

  9. Standard and Nonstandard Neutrino-Nucleus Reactions Cross Sections and Event Rates to Neutrino Detection Experiments

    Directory of Open Access Journals (Sweden)

    D. K. Papoulias

    2015-01-01

    Full Text Available In this work, we explore ν-nucleus processes from a nuclear theory point of view and obtain results with a high confidence level based on accurate nuclear structure cross-section calculations. Besides cross sections, the present study includes simulated signals expected to be recorded by nuclear detectors and differential event rates, as well as the total number of events predicted to be measured. Our original cross-section calculations focus on measurable rates for the standard model process, but we also perform calculations for various channels of the nonstandard neutrino-nucleus reactions and come out with promising results within the current upper limits of the corresponding exotic parameters. We concentrate on the possibility of detecting (i) supernova neutrinos by using massive detectors like those of the GERDA and SuperCDMS dark matter experiments and (ii) laboratory neutrinos produced near the spallation neutron source facilities (at Oak Ridge National Lab) by the COHERENT experiment. Our nuclear calculations take advantage of the relevant experimental sensitivity and employ the severe bounds extracted for the exotic parameters entering the Lagrangians of various particle physics models, specifically those resulting from the charged-lepton-flavour-violating μ− → e− experiments (Mu2e and COMET).

  10. Energy efficient data representation and aggregation with event region detection in wireless sensor networks

    Science.gov (United States)

    Banerjee, Torsha

    Detection (PERD) for WSNs. When a single event occurs, a child of the tree sends a Flagged Polynomial (FP) to its parent if the readings approximated by it fall outside the data range defining the existing phenomenon. After the aggregation process is over, the root, which holds the two polynomials P and FP, can be queried for FP (approximating the new event region) instead of flooding the whole network. For multiple such events, instead of computing a polynomial corresponding to each new event, areas with the same data range are combined by the corresponding tree nodes and the aggregated coefficients are passed on. Results reveal that a new event can be detected by PERD while the error in detection remains constant and below a threshold of 10%. As the node density increases, the accuracy and delay of event detection remain almost constant, making PERD highly scalable. Whenever an event occurs in a WSN, data is generated by close-by sensors, and relaying the data to the base station (BS) makes sensors closer to the BS run out of energy at a much faster rate than sensors in other parts of the network. This gives rise to an unequal distribution of residual energy in the network and makes those sensors with a lower remaining energy level die much faster than others. We propose a scheme for enhancing network lifetime using mobile cluster heads (CHs) in a WSN. To keep the remaining energy more evenly distributed, some energy-rich nodes are designated as CHs, which move in a controlled manner towards sensors rich in energy and data. This eliminates the multihop transmission required by the static sensors and thus increases the overall lifetime of the WSN. We combine the ideas of clustering and mobile CHs to first form clusters of static sensor nodes. A collaborative strategy among the CHs further increases the lifetime of the network. The time taken for transmitting data to the BS is reduced further by making the CHs follow a connectivity strategy that always maintains a connected path to the BS.

  11. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    Science.gov (United States)

    2004-01-01

    ...login identity to the one under which the system call is executed, the parameters of the system call execution (file names including full paths)... anomaly detection; COAST-EIMDT, distributed on target hosts; EMERALD, distributed on target hosts and security servers, signature recognition and anomaly... uses a centralized architecture, and employs an anomaly detection technique for intrusion detection. The EMERALD project [80] proposes a

  12. Convolutional neural networks for event-related potential detection: impact of the architecture.

    Science.gov (United States)

    Cecotti, H

    2017-07-01

    The detection of brain responses at the single-trial level in the electroencephalogram (EEG), such as event-related potentials (ERPs), is a difficult problem that requires different processing steps to extract relevant discriminant features. While most of the signal and classification techniques for the detection of brain responses are based on linear algebra, pattern recognition techniques such as the convolutional neural network (CNN), a type of deep learning technique, have shown promise as they are able to process the signal after limited pre-processing. In this study, we investigate the performance of CNNs in relation to their architecture and to how they are evaluated: a single system for each subject, or one system for all subjects. More particularly, we address the change in performance observed between specializing a neural network to a subject and using one neural network for a group of subjects, which takes advantage of a larger number of trials from different subjects. The results support the conclusion that a convolutional neural network trained on different subjects can reach an AUC above 0.9 by using an appropriate architecture with spatial filtering and shift-invariant layers.

  13. Acoustic Event Detection in Multichannel Audio Using Gated Recurrent Neural Networks with High‐Resolution Spectral Features

    Directory of Open Access Journals (Sweden)

    Hyoung‐Gook Kim

    2017-12-01

    Full Text Available Recently, deep recurrent neural networks have achieved great success in various machine learning tasks, and have also been applied for sound event detection. The detection of temporally overlapping sound events in realistic environments is much more challenging than in monophonic detection problems. In this paper, we present an approach to improve the accuracy of polyphonic sound event detection in multichannel audio based on gated recurrent neural networks in combination with auditory spectral features. In the proposed method, human hearing perception‐based spatial and spectral‐domain noise‐reduced harmonic features are extracted from multichannel audio and used as high‐resolution spectral inputs to train gated recurrent neural networks. This provides a fast and stable convergence rate compared to long short‐term memory recurrent neural networks. Our evaluation reveals that the proposed method outperforms the conventional approaches.

  14. Detecting Forest Disturbance Events from MODIS and Landsat Time Series for the Conterminous United States

    Science.gov (United States)

    Zhang, G.; Ganguly, S.; Saatchi, S. S.; Hagen, S. C.; Harris, N.; Yu, Y.; Nemani, R. R.

    2013-12-01

    Spatial and temporal patterns of forest disturbance and regrowth processes are key for understanding aboveground terrestrial vegetation biomass and carbon stocks at regional-to-continental scales. The NASA Carbon Monitoring System (CMS) program seeks key input datasets, especially information related to impacts of natural/man-made disturbances in forested landscapes of the Conterminous U.S. (CONUS), that would reduce uncertainties in current carbon stock estimation and emission models. This study provides an end-to-end forest disturbance detection framework based on pixel time-series analysis of MODIS (Moderate Resolution Imaging Spectroradiometer) and Landsat surface spectral reflectance data. We applied the BFAST (Breaks for Additive Seasonal and Trend) algorithm to Normalized Difference Vegetation Index (NDVI) data for the period from 2000 to 2011. A harmonic seasonal model was implemented in BFAST to decompose the time series into seasonal and interannual trend components in order to detect abrupt changes in the magnitude and direction of these components. To apply BFAST to the whole CONUS, we built a parallel computing setup for processing massive time-series data using the high-performance computing facility of the NASA Earth Exchange (NEX). In the implementation process, we extracted the dominant deforestation events from the magnitude of abrupt changes in both seasonal and interannual components, and estimated the dates of the corresponding deforestation events. We estimated the recovery rate for deforested regions through regression models developed between NDVI values and time since disturbance for all pixels. A similar implementation of the BFAST algorithm was performed over selected Landsat scenes (all cloud-free Landsat data were used to generate NDVI from atmospherically corrected spectral reflectances) to demonstrate the spatial coherence of retrieval layers between MODIS and Landsat. In future, the application of this largely parallel disturbance
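
    The season/trend decomposition plus break detection that BFAST performs can be caricatured in a few lines of NumPy (a synthetic NDVI pixel series and a crude split-mean break locator; this is an illustration of the idea, not the actual BFAST algorithm):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    t = np.arange(12 * 23) / 23.0               # ~12 years of 16-day composites
    ndvi = 0.6 + 0.15 * np.sin(2 * np.pi * t) + rng.normal(0, 0.02, t.size)
    ndvi[150:] -= 0.3                           # abrupt disturbance (e.g., clearing)

    # Harmonic seasonal + linear trend design matrix, echoing the
    # season/trend split used by BFAST
    A = np.column_stack([np.ones_like(t), t,
                         np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    coef, *_ = np.linalg.lstsq(A, ndvi, rcond=None)
    resid = ndvi - A @ coef

    def best_break(r, margin=10):
        """Split point maximizing the jump between left/right residual means."""
        jumps = [abs(r[:i].mean() - r[i:].mean())
                 for i in range(margin, len(r) - margin)]
        return margin + int(np.argmax(jumps))

    brk = best_break(resid)   # close to the simulated break at index 150
    ```

    BFAST proper iterates this idea with formal structural-break tests and refits the season/trend model per segment; running it per pixel over CONUS is what motivates the parallel NEX setup described above.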

  15. The power to detect recent fragmentation events using genetic differentiation methods.

    Directory of Open Access Journals (Sweden)

    Michael W Lloyd

    Full Text Available Habitat loss and fragmentation are imminent threats to biological diversity worldwide and thus are fundamental issues in conservation biology. Increased isolation alone has been implicated as a driver of negative impacts in populations associated with fragmented landscapes. Genetic monitoring and the use of measures of genetic divergence have been proposed as means to detect changes in landscape connectivity. Our goal was to evaluate the sensitivity of Wright's Fst, Hedrick's G'st, Sherwin's MI, and Jost's D to recent fragmentation events across a range of population sizes and sampling regimes. We constructed an individual-based model, which used a factorial design to compare the effects of varying population size, presence or absence of overlapping generations, and presence or absence of population sub-structuring. Increases in population size, overlapping generations, and population sub-structuring each reduced Fst, G'st, MI, and D. The signal of fragmentation was detected within two generations for all metrics. However, the magnitude of the change in each was small in all cases, and when Ne was >100 individuals it was extremely small. Multi-generational sampling and population estimates are required to differentiate the signal of background divergence from changes in Fst, G'st, MI, and D associated with fragmentation. Finally, the window during which rapid change in Fst, G'st, MI, and D between generations occurs can be small, and if missed would lead to inconclusive results. For these reasons, the use of Fst, G'st, MI, or D for detecting and monitoring changes in connectivity is likely to prove difficult in real-world scenarios. We advocate the use of genetic monitoring only in conjunction with estimates of actual movement among patches, such that one could compare current movement with the genetic signature of past movement to determine whether there has been a change.
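
    For two demes and a single biallelic locus, Wright's Fst reduces to a one-line heterozygosity calculation, which makes the paper's point about weak signals from recent fragmentation easy to see (the allele frequencies below are illustrative, not from the simulations):

    ```python
    def fst(p1, p2):
        """Wright's Fst for two demes at one biallelic locus:
        (Ht - Hs) / Ht, where Ht is the expected heterozygosity of the
        pooled population and Hs the mean within-deme heterozygosity."""
        p_bar = (p1 + p2) / 2
        ht = 2 * p_bar * (1 - p_bar)
        hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
        return (ht - hs) / ht

    weak = fst(0.50, 0.55)    # shortly after fragmentation: tiny divergence
    strong = fst(0.20, 0.80)  # long-isolated demes: strong divergence
    ```

    `weak` comes out near 0.0025 while `strong` is 0.36: a just-fragmented population pair produces an Fst shift easily swamped by sampling noise, which is exactly why the authors find the metrics hard to use for rapid detection.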

  16. Detection of adverse events in general surgery using the "Trigger Tool" methodology.

    Science.gov (United States)

    Pérez Zapata, Ana Isabel; Gutiérrez Samaniego, María; Rodríguez Cuéllar, Elías; Andrés Esteban, Eva María; Gómez de la Cámara, Agustín; Ruiz López, Pedro

    2015-02-01

    Surgery is one of the high-risk areas for the occurrence of adverse events (AE). The purpose of this study is to determine the percentage of hospitalisation-related AE detected by the "Global Trigger Tool" methodology in surgical patients, their characteristics, and the validity of the tool. Retrospective, observational study of patients admitted to a general surgery department who underwent a surgical operation in a third-level hospital during the year 2012. The identification of AE was carried out by patient record review using an adaptation of the "Global Trigger Tool" methodology. Once an AE was identified, a harm category was assigned, including the degree to which the AE could have been avoided and its relation to the surgical procedure. The prevalence of AE was 36.8%. There were 0.5 AE per patient. 56.2% were deemed preventable. 69.3% were directly related to the surgical procedure. The tool had a sensitivity of 86% and a specificity of 93.6%. The positive predictive value was 89% and the negative predictive value 92%. The prevalence of AE is greater than the estimates of other studies. In most cases the AE detected were related to the surgical procedure, and more than half were also preventable. The adapted "Global Trigger Tool" methodology has been demonstrated to be highly effective and efficient for detecting AE in surgical patients, identifying all the serious AE with few false negative results. Copyright © 2014 AEC. Published by Elsevier España, S.L.U. All rights reserved.
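
    The four validity figures reported above follow directly from a 2x2 review table; a sketch with hypothetical counts chosen only to be consistent with the reported rates (the study's raw counts are not given here):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Sensitivity, specificity, positive and negative predictive values
        from the counts of a record-review confusion table."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Hypothetical counts reproducing roughly 86% / 93.6% / 89% / 92%
    m = diagnostic_metrics(tp=86, fp=11, fn=14, tn=161)
    ```

    Note that PPV and NPV, unlike sensitivity and specificity, depend on the AE prevalence in the reviewed records, so they would shift in a population with a different event rate.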

  17. FOREWORD: 3rd Symposium on Large TPCs for Low Energy Event Detection

    Science.gov (United States)

    Irastorza, Igor G.; Colas, Paul; Gorodetzky, Phillippe

    2007-05-01

    The Third International Symposium on large TPCs for low-energy rare-event detection was held at the Carré des sciences, Poincaré auditorium, 25 rue de la Montagne Ste Geneviève in Paris, on 11-12 December 2006. This prestigious location, belonging to the Ministry of Research, is housed in the former Ecole Polytechnique. The meeting, held in Paris every two years, gathers a significant community of physicists involved in rare-event detection. Its purpose is an extensive discussion of present and future projects using large TPCs for low-energy, low-background detection of rare events (low-energy neutrinos, dark matter, solar axions). The use of a new generation of Micro-Pattern Gaseous Detectors (MPGD) appears to be a promising way to reach this goal. The program this year was enriched by a new session devoted to the detection challenge of polarized gamma rays, relevant novel experimental techniques, and the impact on particle physics, astrophysics and astronomy. A very particular feature of this conference is the large variety of talks, ranging from purely theoretical to purely experimental subjects, including novel technological aspects. This allows discussion and exchange of useful information and of the new ideas that are emerging to address experimental challenges in particle physics. The scientific highlights of the Symposium came on many fronts: the status of low-energy neutrino physics and double-beta decay; new ideas on double-beta decay experiments; gamma-ray polarization measurement combining high-precision TPCs with MPGD read-out; dark matter challenges in both axion and WIMP searches, with new emerging ideas for detection improvements; and progress in gaseous and liquid TPCs for rare-event detection. Georges Charpak opened the meeting with a talk on gaseous detectors for applications in the bio-medical field. He also underlined the importance of new MPGD detectors for both physics and applications. There were about 100 registered participants at the symposium. The successful

  18. Bioluminescence Detection of Cells Having Stabilized p53 in Response to a Genotoxic Event

    Directory of Open Access Journals (Sweden)

    Alnawaz Rehemtulla

    2004-01-01

    Full Text Available Inactivation of p53 is one of the most frequent molecular events in neoplastic transformation. Approximately 60% of all human tumors have mutations in both p53 alleles. Wild-type p53 activity is regulated in large part by the proteasome-dependent degradation of p53, resulting in a short p53 half-life in unstressed and untransformed cells. Activation of p53 by a variety of stimuli, including DNA damage induced by genotoxic drugs or radiation, is accomplished by stabilization of wild-type p53. The stabilized and active p53 can result in either cell-cycle arrest or apoptosis. Surprisingly, the majority of tumor-associated, inactivating p53 mutations also result in p53 accumulation. Thus, constitutive elevation of p53 levels in cells is a reliable measure of p53 inactivation, whereas transiently increased p53 levels reflect a recent genotoxic stress. In order to facilitate noninvasive imaging of p53 accumulation, we describe here the construction of a p53-luciferase fusion protein. Induction of DNA damage in cells expressing the fusion protein resulted in a time-dependent accumulation of the fusion protein that was noninvasively detected using bioluminescence imaging and validated by Western blot analysis. The p53-Luc protein retains p53 function, because its expression in HCT116 cells lacking functional p53 resulted in activation of p21 expression as well as induction of apoptosis in response to a DNA-damaging event. Employed in a transgenic animal model, the proposed p53-reporter fusion protein will be useful for studying p53 activation in response to exposure to DNA-damaging carcinogenic agents. It could also be used to study p53 stabilization as a result of inactivating p53 mutations. Such studies will further our understanding of p53's role as the “guardian of the genome” and its function in tumorigenesis.

  19. The Monitoring, Detection, Isolation and Assessment of Information Warfare Attacks Through Multi-Level, Multi-Scale System Modeling and Model Based Technology

    National Research Council Canada - National Science Library

    Ye, Nong

    2004-01-01

    With the goal of protecting computer and networked systems from various attacks, the following intrusion detection techniques were developed and tested using the 1998 and 2000 MIT Lincoln Lab Evaluation Data...

  20. Detection of microsleep events in a car driving simulation study using electrocardiographic features

    Directory of Open Access Journals (Sweden)

    Lenis Gustavo

    2016-09-01

    Full Text Available Microsleep events (MSE) are short intrusions of sleep under the demand of sustained attention. They can pose a major threat to safety while driving a car and are considered one of the most significant causes of traffic accidents. Driver fatigue and MSE account for up to 20% of all car crashes in Europe and at least 100,000 accidents in the US every year. Unfortunately, there is no standardized test for quantifying the degree of vigilance of a driver. To address this problem, different approaches based on biosignal analysis have been studied in the past. In this paper, we investigate electrocardiographic detection of MSE using morphological and rhythmical features. 14 records from a car driving simulation study with a high incidence of MSE were analyzed, and the behavior of the ECG features before and after an MSE, relative to reference baseline values (without drowsiness), was investigated. The results show that MSE cannot be detected (or predicted) using only the ECG. However, in the presence of MSE, the rhythmical and morphological features were significantly different from those calculated for the reference signal without sleepiness. In particular, when MSE were present, the heart rate diminished while the heart rate variability increased. The time intervals between the P wave and the R peak, and between the R peak and the T wave, as well as their dispersions, also increased. This demonstrates a noticeable change in the autonomic regulation of the heart. In the future, these ECG parameters could be used as a surrogate measure of fatigue.
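
    The rhythmical features discussed above (heart rate and heart rate variability) are computed directly from RR-interval series; a sketch with synthetic "alert" and "drowsy" segments (all numbers illustrative, not from the study):

    ```python
    import numpy as np

    def hr_hrv(rr_ms):
        """Mean heart rate (bpm) and two standard HRV measures from a
        series of RR intervals given in milliseconds."""
        rr = np.asarray(rr_ms, dtype=float)
        hr = 60000.0 / rr.mean()                      # beats per minute
        sdnn = rr.std(ddof=1)                         # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))    # beat-to-beat variability
        return hr, sdnn, rmssd

    # Toy comparison: alert baseline vs. a drowsy segment, which per the
    # paper's findings is slower and more variable
    rng = np.random.default_rng(3)
    baseline = rng.normal(800, 20, 300)   # ~75 bpm
    drowsy = rng.normal(1000, 50, 300)    # ~60 bpm, larger spread
    hr_b, sdnn_b, _ = hr_hrv(baseline)
    hr_d, sdnn_d, _ = hr_hrv(drowsy)
    ```

    In the simulated drowsy segment the heart rate drops while SDNN rises, mirroring the direction of the changes the authors report around MSE.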

  1. Detecting Micro-seismicity and Long-duration Tremor-like Events from the Oklahoma Wavefield Experiment

    Science.gov (United States)

    Li, C.; Li, Z.; Peng, Z.; Zhang, C.; Nakata, N.

    2017-12-01

    Oklahoma has experienced an abrupt increase in induced seismicity over the last decade. An important way to fully understand seismic activity in Oklahoma is to obtain more complete earthquake catalogs and to detect different types of seismic events. The IRIS Community Wavefield Demonstration Experiment was deployed near Enid, Oklahoma, in the summer of 2016. The dataset from this ultra-dense array provides an excellent opportunity for detecting microseismicity in that region with wavefield approaches. Here we examine continuous waveforms recorded by 3 seismic lines using local coherence for ultra-dense arrays (Li et al., 2017), which is a measure of the cross-correlation of the waveform at each station with those of its nearby stations. So far we have detected more than 5,000 events from 06/22/2016 to 07/20/2016; the majority of them are not listed in the regional catalog of Oklahoma or in global catalogs, indicating that they are local events. We also identify 15-20 long-period long-duration events, some of them lasting for more than 500 s. Such events have been found at major plate-boundary faults (also known as deep tectonic tremor), as well as during hydraulic fracturing, at slow-moving landslides, and in glaciers. Our next step is to locate these possible tremor-like events with their relative arrival times across the array and to compare their occurrence times with solid-earth tides and injection histories to better understand their driving mechanisms.
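The local-coherence idea (cross-correlating each station's waveform with those of its neighbors) can be sketched in pure Python. The lag window, neighbor count, and test waveforms below are illustrative assumptions, not the parameters of Li et al. (2017):

```python
import math

def normalize(x):
    """Zero-mean, unit-variance version of a waveform."""
    n = len(x)
    mean = sum(x) / n
    sd = (sum((v - mean) ** 2 for v in x) / n) ** 0.5
    return [(v - mean) / sd for v in x]

def peak_xcorr(x, y, max_lag=3):
    """Peak normalized cross-correlation over a small lag window."""
    n = len(x)
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        s, cnt = 0.0, 0
        for t in range(n):
            if 0 <= t + lag < n:
                s += x[t] * y[t + lag]
                cnt += 1
        best = max(best, s / cnt)
    return best

def local_coherence(waveforms, n_neighbors=1):
    """Average peak cross-correlation of each station with its neighbors."""
    normed = [normalize(w) for w in waveforms]
    scores = []
    for i in range(len(normed)):
        js = [j for j in range(i - n_neighbors, i + n_neighbors + 1)
              if j != i and 0 <= j < len(normed)]
        scores.append(sum(peak_xcorr(normed[i], normed[j]) for j in js) / len(js))
    return scores

# A coherent arrival (same waveform at every station) scores near 1
coh = local_coherence([[math.sin(0.3 * t) for t in range(100)] for _ in range(4)])
```

Uncorrelated noise between stations would instead yield scores near zero, which is what separates local events from background.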

  2. On the feasibility of using satellite gravity observations for detecting large-scale solid mass transfer events

    Science.gov (United States)

    Peidou, Athina C.; Fotopoulos, Georgia; Pagiatakis, Spiros

    2017-10-01

    The main focus of this paper is to assess the feasibility of utilizing dedicated satellite gravity missions in order to detect large-scale solid mass transfer events (e.g. landslides). Specifically, a sensitivity analysis of Gravity Recovery and Climate Experiment (GRACE) gravity field solutions in conjunction with simulated case studies is employed to predict gravity changes due to past subaerial and submarine mass transfer events, namely the Agulhas slump in southeastern Africa and the Heart Mountain Landslide in northwestern Wyoming. The detectability of these events is evaluated by taking into account the expected noise level in the GRACE gravity field solutions and simulating their impact on the gravity field through forward modelling of the mass transfer. The spectral content of the estimated gravity changes induced by a simulated large-scale landslide event is estimated for the known spatial resolution of the GRACE observations using wavelet multiresolution analysis. The results indicate that both the Agulhas slump and the Heart Mountain Landslide could have been detected by GRACE, resulting in |0.4| and |0.18| mGal changes in the GRACE solutions, respectively. The suggested methodology is further extended to the case studies of the submarine landslide in Tohoku, Japan, and the Grand Banks landslide in Newfoundland, Canada. The detectability of these events using GRACE solutions is assessed through their impact on the gravity field.
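As a first-order intuition for why a displaced mass produces a detectable signal, the gravity change can be bounded with a point-mass model, dg = G*dm/r^2. The paper's forward modelling and wavelet analysis are far more detailed; the mass and distance below are invented for illustration, not the studies' values:

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def point_mass_gravity_mgal(mass_kg, distance_m):
    """Gravitational attraction of a point mass at a given distance, in mGal
    (1 mGal = 1e-5 m/s^2) -- a first-order bound on a mass-transfer signal."""
    return G * mass_kg / distance_m ** 2 * 1e5

# Illustrative only: ~1e15 kg of displaced rock sensed from ~100 km
dg_mgal = point_mass_gravity_mgal(1e15, 100e3)
```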

  3. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    Science.gov (United States)

    Van Strien, Jan W.; Isbell, Lynne A.

    2017-01-01

    Studies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to partially exposed snake models and to scale patterns on the snake skin. Here, we examined whether snake skin patterns and partially exposed snakes elicit a larger EPN in humans. In Task 1, we employed pictures with close-ups of snake skins, lizard skins, and bird plumage. In Task 2, we employed pictures of partially exposed snakes, lizards, and birds. Participants watched a random rapid serial visual presentation of these pictures. The EPN was scored as the mean activity (225–300 ms after picture onset) at occipital and parieto-occipital electrodes. Consistent with previous studies, and with the Snake Detection Theory, the EPN was significantly larger for snake skin pictures than for lizard skin and bird plumage pictures, and for lizard skin pictures than for bird plumage pictures. Likewise, the EPN was larger for partially exposed snakes than for partially exposed lizards and birds. The results suggest that the EPN snake effect is partly driven by snake skin scale patterns, which are otherwise rare in nature. PMID:28387376

  4. Vision-based Detection of Acoustic Timed Events: a Case Study on Clarinet Note Onsets

    Science.gov (United States)

    Bazzica, A.; van Gemert, J. C.; Liem, C. C. S.; Hanjalic, A.

    2017-05-01

    Acoustic events often have a visual counterpart. Knowledge of visual information can aid the understanding of complex auditory scenes, even when only a stereo mixdown is available in the audio domain, e.g., identifying which musicians are playing in large musical ensembles. In this paper, we consider a vision-based approach to note onset detection. As a case study we focus on challenging, real-world clarinetist videos and carry out preliminary experiments on a 3D convolutional neural network based on multiple streams and purposely avoiding temporal pooling. We release an audiovisual dataset with 4.5 hours of clarinetist videos together with cleaned annotations which include about 36,000 onsets and the coordinates for a number of salient points and regions of interest. By performing several training trials on our dataset, we learned that the problem is challenging. We found that the CNN model is highly sensitive to the optimization algorithm and hyper-parameters, and that treating the problem as binary classification may prevent the joint optimization of precision and recall. To encourage further research, we publicly share our dataset, annotations and all models and detail which issues we came across during our preliminary experiments.

  5. First Satellite-detected Perturbations of Outgoing Longwave Radiation Associated with Blowing Snow Events over Antarctica

    Science.gov (United States)

    Yang, Yuekui; Palm, Stephen P.; Marshak, Alexander; Wu, Dong L.; Yu, Hongbin; Fu, Qiang

    2014-01-01

    We present the first satellite-detected perturbations of the outgoing longwave radiation (OLR) associated with blowing snow events over the Antarctic ice sheet, using data from the Cloud-Aerosol Lidar with Orthogonal Polarization and Clouds and the Earth's Radiant Energy System. Significant cloud-free OLR differences are observed between the clear and blowing snow sky, with the sign and magnitude depending on season and time of day. During nighttime, OLRs are usually larger when blowing snow is present; the average difference in OLR between conditions without and with blowing snow over the East Antarctic Ice Sheet is about 5.2 W/m2 for the winter months of 2009. During daytime, in contrast, the OLR perturbation is usually smaller or even has the opposite sign. The observed seasonal variations and day-night differences in the OLR perturbation are consistent with theoretical calculations of the influence of blowing snow on OLR. Detailed atmospheric profiles are needed to quantify the radiative effect of blowing snow from the satellite observations.

  6. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Directory of Open Access Journals (Sweden)

    Yunzhou Fan

    Full Text Available BACKGROUND: Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, its performance on outbreak detection, particularly in cases of multi-stream surveillance, has scarcely been evaluated in rural areas. OBJECTIVE: This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. METHODS: Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behavior model. We then superimposed the converted syndromic datasets onto the observed baselines to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, frequency of over-the-counter drug purchases, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. RESULTS: In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to that of single-stream surveillance, particularly at low specificity (Sp < 90%). CONCLUSIONS: The temporal simulation model based on healthcare-seeking behaviors offers an accessible method for evaluating the performance of multi-stream surveillance.
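The Susceptible-Exposed-Infectious-Recovered dynamics used to generate the outbreak scenarios can be sketched with a forward-Euler integration. The parameter values below are illustrative, not those calibrated in the study:

```python
def seir(beta, sigma, gamma, s0, e0, i0, r0, days, dt=0.1):
    """Forward-Euler integration of a basic SEIR model; returns the
    infectious-count time series (one value per integration step)."""
    s, e, i, r = float(s0), float(e0), float(i0), float(r0)
    n = s + e + i + r
    infectious = []
    for _ in range(int(days / dt)):
        new_exposed = beta * s * i / n  # S -> E (frequency-dependent contact)
        new_infectious = sigma * e      # E -> I (end of incubation)
        new_recovered = gamma * i       # I -> R
        s -= new_exposed * dt
        e += (new_exposed - new_infectious) * dt
        i += (new_infectious - new_recovered) * dt
        r += new_recovered * dt
        infectious.append(i)
    return infectious

# Illustrative outbreak: R0 = beta/gamma = 3 in a population of 10,000
curve = seir(beta=0.6, sigma=0.2, gamma=0.2, s0=9990, e0=10, i0=0, r0=0, days=120)
```

A simulated curve like this, converted through a healthcare-seeking model, is what gets superimposed on the observed baselines.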

  7. Real Time Robot Soccer Game Event Detection Using Finite State Machines with Multiple Fuzzy Logic Probability Evaluators

    Directory of Open Access Journals (Sweden)

    Elmer P. Dadios

    2009-01-01

    Full Text Available This paper presents a new algorithm for real time event detection using Finite State Machines with multiple Fuzzy Logic Probability Evaluators (FLPEs). A machine referee for a robot soccer game is developed and used as the platform to test the proposed algorithm. A novel technique to detect collisions and other events in a microrobot soccer game under inaccurate and insufficient information is presented. The robots' collisions are used to determine goalkeeper charging and goal score events, which are crucial for the machine referee's decisions. The Main State Machine (MSM) handles the schedule of event activation. The FLPE calculates the probabilities of the true occurrence of the events. Final decisions about the occurrences of events are evaluated and compared through threshold crisp probability values. The outputs of FLPEs can be combined to calculate the probability of an event composed of subevents. Using multiple fuzzy logic systems, the FLPE utilizes a minimal number of rules and can be tuned individually. Experimental results show the accuracy and robustness of the proposed algorithm.
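The interplay between a fuzzy probability evaluator and a threshold-driven state machine can be sketched as follows. The membership ramps, threshold, and inputs are invented for illustration and are not the paper's rule base:

```python
def collision_probability(distance_m, rel_speed_mps):
    """Toy fuzzy evaluator: AND (min) of 'near' and 'fast' membership ramps."""
    near = max(0.0, min(1.0, (10.0 - distance_m) / 10.0))
    fast = max(0.0, min(1.0, rel_speed_mps / 5.0))
    return min(near, fast)

class EventFSM:
    """Minimal state machine: report an event when the fuzzy probability
    crosses a crisp threshold, and re-arm once it drops back below."""
    def __init__(self, threshold=0.7):
        self.state = "IDLE"
        self.threshold = threshold
        self.events = 0

    def step(self, distance_m, rel_speed_mps):
        p = collision_probability(distance_m, rel_speed_mps)
        if self.state == "IDLE" and p >= self.threshold:
            self.state = "EVENT"   # crisp decision: event occurred
            self.events += 1
        elif self.state == "EVENT" and p < self.threshold:
            self.state = "IDLE"    # re-arm for the next event
        return self.state

fsm = EventFSM()
# One close, fast approach among otherwise harmless frames
for d, v in [(9.0, 1.0), (4.0, 2.0), (1.0, 5.0), (8.0, 1.0)]:
    fsm.step(d, v)
```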

  8. MR-perfusion (MRP) and diffusion-weighted imaging (DWI) in prostate cancer: quantitative and model-based gadobenate dimeglumine MRP parameters in detection of prostate cancer.

    Science.gov (United States)

    Scherr, M K; Seitz, M; Müller-Lisse, U G; Ingrisch, M; Reiser, M F; Müller-Lisse, U L

    2010-12-01

    Various MR methods, including MR-spectroscopy (MRS), dynamic, contrast-enhanced MRI (DCE-MRI), and diffusion-weighted imaging (DWI) have been applied to improve test quality of standard MRI of the prostate. To determine if quantitative, model-based MR-perfusion (MRP) with gadobenate dimeglumine (Gd-BOPTA) discriminates between prostate cancer, benign tissue, and transitional zone (TZ) tissue. 27 patients (age, 65±4 years; PSA 11.0±6.1 ng/ml) with clinical suspicion of prostate cancer underwent standard MRI, 3D MR-spectroscopy (MRS), and MRP with Gd-BOPTA. Based on results of combined MRI/MRS and subsequent guided prostate biopsy alone (17/27), biopsy and radical prostatectomy (9/27), or sufficient negative follow-up (7/27), maps of model-free, deconvolution-based mean transit time (dMTT) were generated for 29 benign regions (bROIs), 14 cancer regions (cROIs), and 18 regions of transitional zone (tzROIs). Applying a 2-compartment exchange model, quantitative perfusion analysis was performed including as parameters: plasma flow (PF), plasma volume (PV), plasma mean transit time (PMTT), extraction flow (EFL), extraction fraction (EFR), interstitial volume (IV) and interstitial mean transit time (IMTT). Two-sided T-tests (significance level p < 0.05) discriminated bROIs vs. cROIs and cROIs vs. tzROIs, respectively. Besides MRI, MRS and DWI, quantitative 2-compartment MRP with Gd-BOPTA discriminates between prostate cancer and benign tissue with several parameters. However, distinction of prostate cancer and TZ does not appear to be reliable. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  9. BRCA mutation carrier detection. A model-based cost-effectiveness analysis comparing the traditional family history approach and the testing of all patients with breast cancer

    Science.gov (United States)

    Norum, Jan; Grindedal, Eli Marie; Heramb, Cecilie; Karsrud, Inga; Ariansen, Sarah Louise; Undlien, Dag Erik; Schlichting, Ellen; Mæhle, Lovise

    2018-01-01

    Background: Identification of BRCA mutation carriers among patients with breast cancer (BC) involves costs and gains. Testing has been performed according to international guidelines, focusing on family history (FH) of breast and/or ovarian cancer. An alternative is testing all patients with BC employing sequencing of the BRCA genes and Multiplex Ligation Probe Amplification (MLPA). Patients and methods: A model-based cost-effectiveness analysis, employing data from Oslo University Hospital, Ullevål (OUH-U) and a decision tree, was done. The societal and the healthcare perspectives were focused and a lifetime perspective employed. The comparators were the traditional FH approach used as standard of care at OUH-U in 2013 and the intervention (testing all patients with BC) performed in 2014 and 2015 at the same hospital. During the latter period, 535 patients with BC were offered BRCA testing with sequencing and MLPA. National 2014 data on mortality rates and costs were implemented, a 3% discount rate used and the costing year was 2015. The incremental cost-effectiveness ratio was calculated in euros (€) per life-year gained (LYG). Results: The net healthcare cost (healthcare perspective) was €40 503/LYG. Including all resource use (societal perspective), the cost was €5669/LYG. The univariate sensitivity analysis documented the unit cost of the BRCA test and the number of LYGs as the prominent parameters affecting the result. Conclusion: Diagnostic BRCA testing of all patients with BC was superior to the FH approach and cost-effective within the frequently used thresholds (healthcare perspective) in Norway (€60 000–€80 000/LYG). PMID:29682331

  10. BRCA mutation carrier detection. A model-based cost-effectiveness analysis comparing the traditional family history approach and the testing of all patients with breast cancer.

    Science.gov (United States)

    Norum, Jan; Grindedal, Eli Marie; Heramb, Cecilie; Karsrud, Inga; Ariansen, Sarah Louise; Undlien, Dag Erik; Schlichting, Ellen; Mæhle, Lovise

    2018-01-01

    Identification of BRCA mutation carriers among patients with breast cancer (BC) involves costs and gains. Testing has been performed according to international guidelines, focusing on family history (FH) of breast and/or ovarian cancer. An alternative is testing all patients with BC employing sequencing of the BRCA genes and Multiplex Ligation Probe Amplification (MLPA). A model-based cost-effectiveness analysis, employing data from Oslo University Hospital, Ullevål (OUH-U) and a decision tree, was done. The societal and the healthcare perspectives were focused and a lifetime perspective employed. The comparators were the traditional FH approach used as standard of care at OUH-U in 2013 and the intervention (testing all patients with BC) performed in 2014 and 2015 at the same hospital. During the latter period, 535 patients with BC were offered BRCA testing with sequencing and MLPA. National 2014 data on mortality rates and costs were implemented, a 3% discount rate used and the costing year was 2015. The incremental cost-effectiveness ratio was calculated in euros (€) per life-year gained (LYG). The net healthcare cost (healthcare perspective) was €40 503/LYG. Including all resource use (societal perspective), the cost was €5669/LYG. The univariate sensitivity analysis documented the unit cost of the BRCA test and the number of LYGs as the prominent parameters affecting the result. Diagnostic BRCA testing of all patients with BC was superior to the FH approach and cost-effective within the frequently used thresholds (healthcare perspective) in Norway (€60 000-€80 000/LYG).
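The decision metric in both records is the incremental cost-effectiveness ratio: the cost difference between strategies divided by the effect difference. A minimal sketch with hypothetical figures (not the study's data):

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra life-year gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical: the new strategy costs 120,000 EUR more and yields 2.5 extra LYG
ratio = icer(cost_new=520_000, effect_new=12.5, cost_old=400_000, effect_old=10.0)
# A ratio below a willingness-to-pay threshold (e.g., 60,000-80,000 EUR/LYG
# in Norway, per the abstract) would count as cost-effective
```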

  11. An assessment of thin cloud detection by applying bidirectional reflectance distribution function model-based background surface reflectance using Geostationary Ocean Color Imager (GOCI): A case study for South Korea

    Science.gov (United States)

    Kim, Hye-Won; Yeom, Jong-Min; Shin, Daegeun; Choi, Sungwon; Han, Kyung-Soo; Roujean, Jean-Louis

    2017-08-01

    In this study, a new assessment of thin cloud detection with the application of bidirectional reflectance distribution function (BRDF) model-based background surface reflectance was undertaken by interpreting surface spectra characterized using the Geostationary Ocean Color Imager (GOCI) over a land surface area. Unlike cloud detection over the ocean, the detection of cloud over land surfaces is difficult due to the complicated surface scattering characteristics, which vary among land surface types. Furthermore, in the case of thin clouds, in which the surface and cloud radiation are mixed, it is difficult to detect the clouds in both land and atmospheric fields. Therefore, to interpret the background surface reflectance, especially underneath cloud, the semiempirical BRDF model was used to simulate surface reflectance by accounting for the solar-angle-dependent geometry of the geostationary sensor. For quantitative validation, Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data were used for comparison with the proposed cloud masking result. As a result, the new cloud masking scheme achieved a high probability of detection (POD = 0.82) compared with the Moderate Resolution Imaging Spectroradiometer (MODIS) (POD = 0.808) for all cloud cases. In particular, the agreement between the CALIPSO cloud product and the new GOCI cloud mask was over 94% when detecting thin cloud (e.g., altostratus and cirrus) from January 2014 to June 2015. This result is relatively high in comparison with the result from the MODIS Collection 6 cloud mask product (MYD35).
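The probability of detection quoted above follows the usual contingency-table definition against a reference product. A sketch with hypothetical counts (the abstract reports only the final POD values):

```python
def probability_of_detection(hits, misses):
    """POD: fraction of reference (e.g., CALIPSO) cloud cases the mask flags."""
    return hits / (hits + misses)

def false_alarm_ratio(hits, false_alarms):
    """FAR: fraction of flagged cases that are clear in the reference."""
    return false_alarms / (hits + false_alarms)

# Hypothetical contingency counts yielding the abstract's POD of 0.82
pod = probability_of_detection(hits=820, misses=180)
```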

  12. MR-perfusion (MRP) and diffusion-weighted imaging (DWI) in prostate cancer: Quantitative and model-based gadobenate dimeglumine MRP parameters in detection of prostate cancer

    Energy Technology Data Exchange (ETDEWEB)

    Scherr, M.K., E-mail: michael.scherr@med.uni-muenchen.de [Institute of Clinical Radiology, University of Munich, Munich (Germany); Seitz, M. [Department of Urology, University of Munich, Munich (Germany); Mueller-Lisse, U.G. [Institute of Clinical Radiology, University of Munich, Munich (Germany); Ingrisch, M. [Josef Lissner Laboratory for Biomedical Imaging, Institute of Clinical Radiology, University of Munich, Munich (Germany); Reiser, M.F. [Institute of Clinical Radiology, University of Munich, Munich (Germany); Mueller-Lisse, U.L. [Department of Urology, University of Munich, Munich (Germany)

    2010-12-15

    Background: Various MR methods, including MR-spectroscopy (MRS), dynamic, contrast-enhanced MRI (DCE-MRI), and diffusion-weighted imaging (DWI) have been applied to improve test quality of standard MRI of the prostate. Purpose: To determine if quantitative, model-based MR-perfusion (MRP) with gadobenate dimeglumine (Gd-BOPTA) discriminates between prostate cancer, benign tissue, and transitional zone (TZ) tissue. Material and methods: 27 patients (age, 65 ± 4 years; PSA 11.0 ± 6.1 ng/ml) with clinical suspicion of prostate cancer underwent standard MRI, 3D MR-spectroscopy (MRS), and MRP with Gd-BOPTA. Based on results of combined MRI/MRS and subsequent guided prostate biopsy alone (17/27), biopsy and radical prostatectomy (9/27), or sufficient negative follow-up (7/27), maps of model-free, deconvolution-based mean transit time (dMTT) were generated for 29 benign regions (bROIs), 14 cancer regions (cROIs), and 18 regions of transitional zone (tzROIs). Applying a 2-compartment exchange model, quantitative perfusion analysis was performed including as parameters: plasma flow (PF), plasma volume (PV), plasma mean transit time (PMTT), extraction flow (EFL), extraction fraction (EFR), interstitial volume (IV) and interstitial mean transit time (IMTT). Two-sided T-tests (significance level p < 0.05) discriminated bROIs vs. cROIs and cROIs vs. tzROIs, respectively. Results: PMTT discriminated best between bROIs (11.8 ± 3.0 s) and cROIs (24.3 ± 9.6 s) (p < 0.0001), while PF, PV, PS, EFR, IV, IMTT also differed significantly (p = 0.00002–0.0136). Discrimination between cROIs and tzROIs was insignificant for all parameters except PV (14.3 ± 2.5 ml vs. 17.6 ± 2.6 ml, p < 0.05). Conclusions: Besides MRI, MRS and DWI quantitative, 2-compartment MRP with Gd-BOPTA discriminates between prostate cancer and benign tissue with several parameters. However, distinction of prostate cancer and TZ does not appear to be reliable.

  13. A new method to detect event-related potentials based on Pearson's correlation.

    Science.gov (United States)

    Giroldini, William; Pederzoli, Luciano; Bilucaglia, Marco; Melloni, Simone; Tressoldi, Patrizio

    2016-12-01

    Event-related potentials (ERPs) are widely used in brain-computer interface applications and in neuroscience. Normal EEG activity is rich in background noise, and therefore, in order to detect ERPs, it is usually necessary to take the average from multiple trials to reduce the effects of this noise. The noise produced by EEG activity itself is not correlated with the ERP waveform, so by calculating the average, the noise is decreased by a factor inversely proportional to the square root of N, where N is the number of averaged epochs. This is the simplest strategy currently used to detect ERPs: averaging all ERP waveforms, these waveforms being time- and phase-locked. In this paper, a new method called GW6 is proposed, which calculates the ERP using a mathematical method based only on Pearson's correlation. The result is a graph with the same time resolution as the classical ERP, which shows only positive peaks representing the increase (in consonance with the stimuli) in EEG signal correlation over all channels. This new method is also useful for selectively identifying and highlighting some hidden components of the ERP response that are not phase-locked and are usually hidden in the standard, simple method based on averaging all the epochs. These hidden components seem to be caused by variations (between each successive stimulus) of the ERP's inherent phase latency period (jitter), although the same stimulus produces a reasonably constant phase across all EEG channels. For this reason, this new method could be very helpful for investigating these hidden components of the ERP response and for developing applications for scientific and medical purposes. Moreover, this new method is more resistant to EEG artifacts than the standard calculation of the average and could be very useful in research and neurology. The method we are proposing can be directly used in the form of a process written in the well
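The 1/sqrt(N) noise reduction from epoch averaging mentioned above is easy to verify empirically; a small self-contained check (simulated Gaussian background noise only, no ERP signal):

```python
import math
import random

def averaged_noise_sd(n_epochs, n_samples=200, noise_sd=1.0, seed=1):
    """Standard deviation left after averaging n_epochs epochs of pure
    Gaussian background noise -- should be about noise_sd / sqrt(n_epochs)."""
    rng = random.Random(seed)
    avg = [0.0] * n_samples
    for _ in range(n_epochs):
        for t in range(n_samples):
            avg[t] += rng.gauss(0.0, noise_sd) / n_epochs
    mean = sum(avg) / n_samples
    return math.sqrt(sum((v - mean) ** 2 for v in avg) / n_samples)

residual = averaged_noise_sd(100)  # expect roughly 1.0 / sqrt(100) = 0.1
```

A time- and phase-locked ERP component, by contrast, survives the averaging unattenuated, which is why the ratio of signal to noise improves with N.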

  14. A secure distributed logistic regression protocol for the detection of rare adverse drug events.

    Science.gov (United States)

    El Emam, Khaled; Samet, Saeed; Arbuckle, Luk; Tamblyn, Robyn; Earle, Craig; Kantarcioglu, Murat

    2013-05-01

    There is limited capacity to assess the comparative risks of medications after they enter the market. For rare adverse events, the pooling of data from multiple sources is necessary to have the power and sufficient population heterogeneity to detect differences in safety and effectiveness in genetic, ethnic and clinically defined subpopulations. However, combining datasets from different data custodians or jurisdictions to perform an analysis on the pooled data creates significant privacy concerns that would need to be addressed. Existing protocols for addressing these concerns can result in reduced analysis accuracy and can allow sensitive information to leak. The objective was to develop a secure distributed multi-party computation protocol for logistic regression that provides strong privacy guarantees. We developed a secure distributed logistic regression protocol using a single analysis center with multiple sites providing data. A theoretical security analysis demonstrates that the protocol is robust to plausible collusion attacks and does not allow the parties to gain new information from the data that are exchanged among them. The computational performance and accuracy of the protocol were evaluated on simulated datasets. The computational performance scales linearly as the dataset sizes increase. The addition of sites results in an exponential growth in computation time. However, for up to five sites, the time is still short and would not affect practical applications. The model parameters are the same as the results on pooled raw data analyzed in SAS, demonstrating high model accuracy. The proposed protocol and prototype system would allow the development of logistic regression models in a secure manner without requiring the sharing of personal health information. This can alleviate one of the key barriers to the establishment of large-scale post-marketing surveillance programs. We extended the secure protocol to account for correlations among patients within sites through

  15. Enriched Encoding: Reward Motivation Organizes Cortical Networks for Hippocampal Detection of Unexpected Events

    OpenAIRE

    Murty, Vishnu P.; Adcock, R. Alison

    2013-01-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expec...

  16. The necessity of recognizing all events in X-ray detection.

    Science.gov (United States)

    Papp, T; Maxwell, J A; Papp, A T

    2010-01-01

    In our work studying the properties of inner-shell ionization, we have been troubled by the large and unexplainable scatter in the experimental data used to determine the basic parameters of X-ray physics. As we looked into the problems, we found that many of them contradict simple logic, elementary arithmetic, and even parity and angular-momentum conservation laws. We have identified that the main source of the problems, other than the human factor, is rooted in the signal-processing electronics. To overcome these problems we have developed a fully digital signal processor, which not only has excellent resolution and line shape, but also allows proper accounting of all events. This is achieved by processing all events and separating them into two or more spectra (maximum 16), where the first spectrum is the accepted or good spectrum and the second spectrum is the spectrum of all rejected events. The availability of all the events allows one to see the other part of the spectrum. To our surprise, this complete information explains many of the shortcomings and contradictions of the X-ray database. The data-processing methodology cannot be established on the partial and fractional information offered by other approaches, and comparing Monte Carlo detector modeling results with partial spectra is ambiguous. This suggests that the metrology of calibration by radioactive sources, as well as other X-ray measurements, could be improved by proper accounting of all events. It is not enough to know that an event was rejected and to increment the input counter; it is necessary to know what was rejected and why: whether it was noise or a disturbed event, a retarded event or a true event, or any pile-up combination of these. Such information is supplied by our processor, which reports the events rejected by each discriminator in separate spectra. Several industrial applications of this quality-assurance-capable signal processor are presented. Copyright 2009

  17. Detection of water-quality contamination events based on multi-sensor fusion using an extended Dempster–Shafer method

    International Nuclear Information System (INIS)

    Hou, Dibo; He, Huimei; Huang, Pingjie; Zhang, Guangxin; Loaiciga, Hugo

    2013-01-01

    This study presents a method for detecting contamination events in sources of drinking water based on the Dempster–Shafer (D-S) evidence theory. The detection method has the purpose of protecting water supply systems against accidental and intentional contamination events. This purpose is achieved by first predicting future water-quality parameters using an autoregressive (AR) model. The AR model predicts future water-quality parameters using recent measurements of these parameters made with automated (on-line) water-quality sensors. Next, a probabilistic method assigns probabilities to the time series of residuals formed by comparing predicted water-quality parameters with threshold values. Finally, the D-S fusion method searches for anomalous probabilities of the residuals and uses the result of that search to determine whether the current water quality is normal (that is, free of pollution) or contaminated. The D-S fusion method is extended and improved in this paper by weighted averaging of water-contamination evidence and by analysis of the persistence of anomalous probabilities of water-quality parameters. The extended D-S fusion method makes determinations that have a high probability of being correct concerning whether or not a source of drinking water has been contaminated. The method was tested with water-quality time series from automated (on-line) water-quality sensors, and a small-scale experimental water-pipe network was also used to test the detection of water-contamination events. The two tests demonstrated that the extended D-S fusion method achieves a low false alarm rate and high probabilities of detecting water-contamination events. (paper)
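The predict-then-score-residuals stage of such a pipeline can be sketched with a simple persistence forecast and a crisp threshold. The series, threshold, and forecast model below are invented stand-ins for the paper's AR model, probabilistic scoring, and D-S fusion:

```python
def detect_anomalies(series, threshold=3.0):
    """Flag samples whose one-step persistence-forecast residual exceeds a
    threshold -- a much-simplified stand-in for AR prediction + D-S fusion."""
    alarms = []
    for t in range(1, len(series)):
        residual = abs(series[t] - series[t - 1])  # forecast = previous value
        if residual > threshold:
            alarms.append(t)
    return alarms

# Stable baseline around 10 with an injected contamination spike at index 6;
# note the residual test fires at both the spike's onset (6) and recovery (7),
# which is one reason the paper adds persistence analysis and evidence fusion
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.0, 18.0, 10.1, 10.0]
alarms = detect_anomalies(readings)
```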

  18. Unified framework for triaxial accelerometer-based fall event detection and classification using cumulants and hierarchical decision tree classifier.

    Science.gov (United States)

    Kambhampati, Satya Samyukta; Singh, Vishal; Manikandan, M Sabarimalai; Ramkumar, Barathram

    2015-08-01

    In this Letter, the authors present a unified framework for fall event detection and classification using the cumulants extracted from the acceleration (ACC) signals acquired using a single waist-mounted triaxial accelerometer. The main objective of this Letter is to find suitable representative cumulants and classifiers for effectively detecting and classifying different types of fall and non-fall events. In the first level of the proposed hierarchical decision tree algorithm, fall detection is implemented using fifth-order cumulants and a support vector machine (SVM) classifier. In the second level, the fall event classification algorithm uses the fifth-order cumulants and SVM. Finally, human activity classification is performed using the second-order cumulants and SVM. The detection and classification results are compared with those of the decision tree, naive Bayes, multilayer perceptron and SVM classifiers with different types of time-domain features, including the second-, third-, fourth- and fifth-order cumulants and the signal magnitude vector and signal magnitude area. The experimental results demonstrate that the second- and fifth-order cumulant features and SVM classifier can achieve optimal detection and classification rates of above 95%, as well as the lowest false alarm rate of 1.03%.
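The cumulant features can be computed from central moments via the standard moment-to-cumulant relations (kappa2 = m2, kappa3 = m3, kappa4 = m4 - 3*m2^2, kappa5 = m5 - 10*m3*m2). A minimal sketch on a toy signal; how the Letter windows and normalizes the ACC signal is not specified here:

```python
def central_moments(x, max_order=5):
    """Central moments m2..m_max_order of a sequence."""
    n = len(x)
    mean = sum(x) / n
    return [sum((v - mean) ** k for v in x) / n for k in range(2, max_order + 1)]

def cumulants(x):
    """Second- to fifth-order cumulants from central moments."""
    m2, m3, m4, m5 = central_moments(x)
    return {2: m2, 3: m3, 4: m4 - 3 * m2 ** 2, 5: m5 - 10 * m3 * m2}

# A symmetric signal has zero odd-order cumulants, so asymmetric transients
# (like fall impacts) stand out in the third- and fifth-order features
c = cumulants([-2.0, -1.0, 0.0, 1.0, 2.0])
```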

  19. Assessment of low contrast detection in CT using model observers. Developing a clinically-relevant tool for characterising adaptive statistical and model-based iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Ott, Julien G.; Ba, Alexandre; Racine, Damien; Viry, Anais; Bochud, Francois O.; Verdun, Francis R. [Univ. Hospital Lausanne (Switzerland). Inst. of Radiation Physics

    2017-08-01

    This study aims to assess CT image quality in a way that meets the specific requirements of clinical practice. Physical metrics such as Fourier-transform-derived metrics have traditionally been employed for this purpose. However, assessment methods based on a detection task have also developed quite extensively lately, and we chose to rely on this modality for image quality assessment. Our goal was to develop a tool adapted for fast and reliable CT image quality assessment in order to pave the way for new CT benchmarking techniques in a clinical context. Additionally, we also used this method to estimate the benefits brought by some IR algorithms. A modified QRM chest phantom containing spheres of 5 and 8 mm at contrast levels of 10 and 20 HU at 120 kVp was used. Images of the phantom were acquired at CTDI{sub vol} of 0.8, 3.6, 8.2 and 14.5 mGy, before being reconstructed using FBP, ASIR 40 and MBIR on a GE HD 750 CT scanner. They were then assessed by eight human observers undergoing a 4-AFC test. After that, these data were compared with the results obtained from two different model observers (NPWE and CHO with DDoG channels). The study investigated the effects of the acquisition conditions as well as the reconstruction methods. The NPWE and CHO models gave coherent results and approximated human observer results well. Moreover, the reconstruction technique used to retrieve the images had a clear impact on the PC values. Both models suggest that switching from FBP to ASIR 40, and particularly to MBIR, increases low-contrast detection, provided a minimum level of exposure is reached. Our work shows that both the CHO with DDoG channels and the NPWE model approximate the trend of humans performing a detection task. Both models also suggest that the use of MBIR goes along with an increase of the PCs, indicating that further dose reduction is still possible when using those techniques. Eventually, the CHO model associated with the protocol we described in this study

  20. A Systematic Analysis of the Sensitivity of Plasma Pharmacokinetics to Detect Differences in the Pulmonary Performance of Inhaled Fluticasone Propionate Products Using a Model-Based Simulation Approach.

    Science.gov (United States)

    Weber, Benjamin; Hochhaus, Guenther

    2015-07-01

    The role of plasma pharmacokinetics (PK) for assessing bioequivalence at the target site, the lung, for orally inhaled drugs remains unclear. A validated semi-mechanistic model, considering the presence of mucociliary clearance in central lung regions, was expanded to quantify the sensitivity of PK studies in detecting differences in pulmonary performance (total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics) between test (T) and reference (R) inhaled fluticasone propionate (FP) products. PK bioequivalence trials for inhaled FP were simulated based on this PK model for a varying number of subjects and T products. The statistical power to conclude bioequivalence when T and R products are identical was demonstrated to be 90% for approximately 50 subjects. Furthermore, the simulations demonstrated that PK metrics (area under the concentration-time curve (AUC) and Cmax) are capable of detecting differences between T and R formulations of inhaled FP products when the products differ by more than 20%, 30%, and 25% for total lung deposition, central-to-peripheral lung deposition ratio, and pulmonary dissolution characteristics, respectively. These results were derived using a rather conservative risk assessment approach with an error rate of <10%. The simulations thus indicated that PK studies might be a viable alternative to clinical studies comparing pulmonary efficacy biomarkers for slowly dissolving inhaled drugs. PK trials for pulmonary efficacy equivalence testing should be complemented by in vitro studies to avoid false positive bioequivalence assessments that are theoretically possible for some specific scenarios. Moreover, a user-friendly web application for simulating such PK equivalence trials with inhaled FP is provided.

  1. A semi-automated method for rapid detection of ripple events on interictal voltage discharges in the scalp electroencephalogram.

    Science.gov (United States)

    Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A

    2017-02-01

    High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
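The first detector step described above, identifying candidate intervals of increased high-frequency activity, can be illustrated with a moving-RMS threshold. This is our own simplified sketch (window length, threshold rule, and function names are assumptions, not the authors' implementation):

```python
import numpy as np

def candidate_intervals(signal, fs, win=0.05, k=2.5):
    """Return (start, end) sample-index pairs where the moving RMS of the
    signal exceeds mean + k*std of the whole-record RMS."""
    n = max(1, int(win * fs))
    # moving average of instantaneous power, then RMS envelope
    power = np.convolve(signal ** 2, np.ones(n) / n, mode="same")
    rms = np.sqrt(power)
    thresh = rms.mean() + k * rms.std()
    above = rms > thresh
    # rising/falling edges of the boolean mask delimit candidate intervals
    edges = np.diff(above.astype(int))
    starts = list(np.where(edges == 1)[0] + 1)
    ends = list(np.where(edges == -1)[0] + 1)
    if above[0]:
        starts.insert(0, 0)
    if above[-1]:
        ends.append(len(signal))
    return list(zip(starts, ends))
```

In the paper's scheme, each candidate interval would then be tested against the seven morphological features (approximately sinusoidal ripple, at least three cycles, co-occurring large-amplitude discharge) before being stored for visual validation.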

  2. MO-DE-207A-01: Impact of Statistical Weights On Detection of Low-Contrast Details in Model-Based Iterative CT Reconstruction

    International Nuclear Information System (INIS)

    Noo, F; Guo, Z

    2016-01-01

    Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, with the aim of reducing dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detecting and localizing a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this aspect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026,0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than the noise patterns and artifacts induced by the statistical weights.

  3. MO-DE-207A-01: Impact of Statistical Weights On Detection of Low-Contrast Details in Model-Based Iterative CT Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Noo, F; Guo, Z [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

    Purpose: Penalized-weighted least-square reconstruction has become an important research topic in CT, with the aim of reducing dose without affecting image quality. Two components impact image quality in this reconstruction: the statistical weights and the use of an edge-preserving penalty term. We are interested in assessing the influence of statistical weights on their own, without the edge-preserving feature. Methods: The influence of statistical weights on image quality was assessed in terms of low-contrast detail detection using LROC analysis. The task amounted to detecting and localizing a 6-mm lesion with random contrast inside the FORBILD head phantom. A two-alternative forced-choice experiment was used with two human observers performing the task. Reconstructions without and with statistical weights were compared, both using the same quadratic penalty term. The beam energy was set to 30 keV to amplify spatial differences in attenuation and thereby the role of statistical weights. A fan-beam data acquisition geometry was used. Results: Visual inspection of images clearly showed a difference in noise between the two reconstruction methods. As expected, the reconstruction without statistical weights exhibited noise streaks. The other reconstruction appeared better in this aspect, but presented other disturbing noise patterns and artifacts induced by the weights. The LROC analysis yielded the following 95-percent confidence interval for the difference in reader-averaged AUC (reconstruction without weights minus reconstruction with weights): [0.0026,0.0599]. The mean AUC value was 0.9094. Conclusion: We have investigated the impact of statistical weights without the use of an edge-preserving penalty in penalized weighted least-square reconstruction. A decrease rather than increase in image quality was observed when using statistical weights. Thus, the observers were better able to cope with the noise streaks than the noise patterns and artifacts induced by the statistical weights.

  4. Particle Filter Based Tracking in a Detection Sparse Discrete Event Simulation Environment

    National Research Council Canada - National Science Library

    Borovies, Drew A

    2007-01-01

    .... In actuality, even extremely uncertain or incomplete detections and counter-detections of opposing entities can provide enough data for entities to make reasonably intelligent decisions on the virtual battlefield...

  5. A Data-Driven Monitoring Technique for Enhanced Fall Events Detection

    KAUST Repository

    Zerrouki, Nabil; Harrou, Fouzi; Sun, Ying; Houacine, Amrane

    2016-01-01

    Fall detection is a crucial issue in the health care of seniors. In this work, we propose an innovative method for detecting falls via simple human body descriptors. The extracted features are discriminative enough to describe human postures

  6. Accuracy and precision of equine gait event detection during walking with limb and trunk mounted inertial sensors

    DEFF Research Database (Denmark)

    Olsen, Emil; Andersen, Pia Haubro; Pfau, Thilo

    2012-01-01

    Increased variation in the timing of gait events when pathology is present is a good candidate feature for objective diagnostic tests. We hypothesised that the gait events hoof-on/off and stance can be detected accurately and precisely using features from trunk and distal limb-mounted Inertial....... Accuracy (bias) and precision (SD of bias) were calculated to compare force plate and IMU timings for gait events. Data were collected from seven horses. One hundred and twenty three (123) front limb steps were analysed; hoof-on was detected with a bias (SD) of -7 (23) ms, hoof-off with 0.7 (37) ms...... and front limb stance with -0.02 (37) ms. A total of 119 hind limb steps were analysed; hoof-on was found with a bias (SD) of -4 (25) ms, hoof-off with 6 (21) ms and hind limb stance with 0.2 (28) ms. IMUs mounted on the distal limbs and sacrum can detect gait events accurately and precisely....
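The accuracy and precision figures quoted above are the mean and standard deviation of the timing differences between the IMU and the force-plate reference. A minimal sketch of that computation (function and variable names are ours):

```python
import statistics

def bias_and_precision(imu_times, reference_times):
    """Accuracy (bias) = mean(IMU - reference); precision = SD of the
    per-step differences. Times are in the same unit (e.g. seconds)."""
    diffs = [i - r for i, r in zip(imu_times, reference_times)]
    return statistics.mean(diffs), statistics.stdev(diffs)
```

A negative bias, as reported for hoof-on, means the IMU event timing precedes the force-plate timing on average.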

  7. Validating administrative data for the detection of adverse events in older hospitalized patients

    Directory of Open Access Journals (Sweden)

    Ackroyd-Stolarz S

    2014-08-01

    Full Text Available Stacy Ackroyd-Stolarz,1,2 Susan K Bowles,3–5 Lorri Giffin6 1Performance Excellence Portfolio, Capital District Health Authority, Halifax, Nova Scotia, Canada; 2Department of Emergency Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 3Geriatric Medicine, Capital District Health Authority, Halifax, Nova Scotia, Canada; 4College of Pharmacy and Division of Geriatric Medicine, Dalhousie University, Halifax, Nova Scotia, Canada; 5Department of Pharmacy at Capital District Health Authority, Halifax, Nova Scotia, Canada; 6South Shore Family Health, Bridgewater, Nova Scotia, Canada Abstract: Older hospitalized patients are at risk of experiencing adverse events including, but not limited to, hospital-acquired pressure ulcers, fall-related injuries, and adverse drug events. A significant challenge in monitoring and managing adverse events is lack of readily accessible information on their occurrence. Purpose: The objective of this retrospective cross-sectional study was to validate diagnostic codes for pressure ulcers, fall-related injuries, and adverse drug events found in routinely collected administrative hospitalization data. Methods: All patients 65 years of age or older discharged between April 1, 2009 and March 31, 2011 from a provincial academic health sciences center in Canada were eligible for inclusion in the validation study. For each of the three types of adverse events, a random sample of 50 patients whose records were positive and 50 patients whose records were not positive for an adverse event was sought for review in the validation study (n=300 records in total). A structured health record review was performed independently by two health care providers with experience in geriatrics, both of whom were unaware of the patient's status with respect to adverse event coding. A physician reviewed 40 records (20 reviewed by each health care provider) to establish interrater agreement. Results: A total of 39 pressure ulcers, 56 fall

  8. [Comparison of the "Trigger" tool with the minimum basic data set for detecting adverse events in general surgery].

    Science.gov (United States)

    Pérez Zapata, A I; Gutiérrez Samaniego, M; Rodríguez Cuéllar, E; Gómez de la Cámara, A; Ruiz López, P

    Surgery carries a high risk of adverse events (AE). The main objective of this study is to compare the effectiveness of the Trigger tool with the Hospital National Health System registry of discharges, the minimum basic data set (MBDS), in detecting adverse events in patients admitted to General Surgery and undergoing surgery. Observational and descriptive retrospective study of patients admitted to the general surgery department of a tertiary hospital and undergoing surgery in 2012. The identification of adverse events was made by reviewing the medical records, using an adaptation of the "Global Trigger Tool" methodology, as well as the MBDS registered for the same patients. Once the AE were identified, they were classified according to damage and to the extent to which they could have been avoided. The area under the ROC curve was used to determine the discriminatory power of the tools. The Hanley and McNeil test was used to compare both tools. AE prevalence was 36.8%. The TT detected 89.9% of all AE, while the MBDS detected 28.48%. The TT provides more information on the nature and characteristics of the AE. The area under the curve was 0.89 for the TT and 0.66 for the MBDS. These differences were statistically significant (P<.001). The Trigger tool detects three times more adverse events than the MBDS registry. The prevalence of adverse events in General Surgery is higher than that estimated in other studies. Copyright © 2017 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  9. Impact of model-based iterative reconstruction on low-contrast lesion detection and image quality in abdominal CT: a 12-reader-based comparative phantom study with filtered back projection at different tube voltages

    Energy Technology Data Exchange (ETDEWEB)

    Euler, Andre; Stieltjes, Bram; Eichenberger, Reto; Reisinger, Clemens; Hirschmann, Anna; Zaehringer, Caroline; Kircher, Achim; Streif, Matthias; Bucher, Sabine; Buergler, David; D' Errico, Luigia; Kopp, Sebastien; Wilhelm, Markus [University Hospital Basel, Clinic of Radiology and Nuclear Medicine, Basel (Switzerland); Szucs-Farkas, Zsolt [Hospital Centre of Biel, Institute of Radiology, Biel (Switzerland); Schindera, Sebastian T. [University Hospital Basel, Clinic of Radiology and Nuclear Medicine, Basel (Switzerland); Cantonal Hospital Aarau, Institute of Radiology, Aarau (Switzerland)

    2017-12-15

    To evaluate the impact of model-based iterative reconstruction (MBIR) on image quality and low-contrast lesion detection compared with filtered back projection (FBP) in abdominal computed tomography (CT) of simulated medium and large patients at different tube voltages. A phantom with 45 hypoattenuating lesions was placed in two water containers and scanned at 70, 80, 100, and 120 kVp. The 120-kVp protocol served as reference, and the volume CT dose index (CTDI{sub vol}) was kept constant for all protocols. The datasets were reconstructed with MBIR and FBP. Image noise and contrast-to-noise-ratio (CNR) were assessed. Low-contrast lesion detectability was evaluated by 12 radiologists. MBIR decreased the image noise by 24% and 27%, and increased the CNR by 30% and 29% for the medium and large phantoms, respectively. Lower tube voltages increased the CNR by 58%, 46%, and 16% at 70, 80, and 100 kVp, respectively, compared with 120 kVp in the medium phantom and by 9%, 18% and 12% in the large phantom. No significant difference in lesion detection rate was observed (medium: 79-82%; large: 57-65%; P > 0.37). Although MBIR improved quantitative image quality compared with FBP, it did not result in increased low-contrast lesion detection in abdominal CT at different tube voltages in simulated medium and large patients. (orig.)

  10. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    Science.gov (United States)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safe management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for the ATLS is seismic event detection in a long-term, sustained time-series record. Because the time-varying Signal-to-Noise Ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify seismic events in the time series through change-point detection (CPD) on the seismic record. In this method, an anomalous signal (a seismic event) is treated as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because a seismic event has occurred. This means the SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform messages (such as amplitude, energy, or polarization) for signal detection, and is therefore an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of past statistics on new data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle the non-stationary time-varying long
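The discounting idea behind SDAR can be illustrated with an online AR(1) model whose statistics are updated with a forgetting rate, and whose anomaly score is the negative log-likelihood of each new sample under the current model. This is a deliberately simplified AR(1) sketch under our own assumptions, not the paper's SDAR implementation:

```python
import math

class SDAR1:
    """Online AR(1) model with exponential discounting of past statistics."""
    def __init__(self, r=0.05):
        self.r = r      # discounting (forgetting) rate
        self.mu = 0.0   # discounted running mean
        self.c0 = 1.0   # discounted variance estimate
        self.c1 = 0.0   # discounted lag-1 covariance
        self.prev = 0.0

    def update_and_score(self, x):
        # discounted updates of mean, variance, and lag-1 covariance
        self.mu = (1 - self.r) * self.mu + self.r * x
        dx, dp = x - self.mu, self.prev - self.mu
        self.c0 = (1 - self.r) * self.c0 + self.r * dx * dx
        self.c1 = (1 - self.r) * self.c1 + self.r * dx * dp
        a = self.c1 / self.c0 if self.c0 > 0 else 0.0  # AR(1) coefficient
        pred = self.mu + a * (self.prev - self.mu)
        var = max(self.c0 * (1 - a * a), 1e-12)
        # Gaussian negative log-likelihood of x under the current model
        score = (x - pred) ** 2 / (2 * var) + 0.5 * math.log(2 * math.pi * var)
        self.prev = x
        return score
```

A sudden deviation from the learned statistics (a change point, i.e. a candidate seismic event) produces a spike in the score, while the discounting keeps the model tracking slow, non-stationary drifts in the record.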

  11. Detection of rain events in radiological early warning networks with spectro-dosimetric systems

    Science.gov (United States)

    Dąbrowski, R.; Dombrowski, H.; Kessler, P.; Röttger, A.; Neumaier, S.

    2017-10-01

    Short-term pronounced increases of the ambient dose equivalent rate due to rainfall are a well-known phenomenon. Increases of the same order of magnitude, or even below, may also be caused by a nuclear or radiological event, i.e. by artificial radiation. Hence, it is important to be able to identify natural rain events in dosimetric early warning networks and to distinguish them from radiological events. Novel spectrometric systems based on scintillators may be used to differentiate between the two scenarios, because the measured gamma spectra provide significant nuclide-specific information. This paper describes three simple, automatic methods to check whether an Ḣ*(10) increase is caused by a rain event or by artificial radiation. These methods were applied to measurements of three spectrometric systems based on CeBr3, LaBr3 and SrI2 scintillation crystals, investigated and tested for their practicability at a free-field reference site of PTB.

  12. [Performance and optimisation of a trigger tool for the detection of adverse events in hospitalised adult patients].

    Science.gov (United States)

    Guzmán Ruiz, Óscar; Pérez Lázaro, Juan José; Ruiz López, Pedro

    To characterise the performance of the triggers used in the detection of adverse events (AE) in hospitalised adult patients, and to define a simplified panel of triggers to facilitate the detection of AE. Cross-sectional study of the charts of patients from an internal medicine department, detecting AE through systematic chart review and identification of triggers (clinical events often related to an AE), and determining whether an AE had occurred in the context in which the trigger appeared. Once an AE was detected, the triggers that detected it were characterised. Logistic regression was applied to select the triggers with the greatest AE detection capability. A total of 291 charts were reviewed, with a total of 562 triggers in 103 patients, of which 163 were involved in detecting an AE. The triggers that detected the most AE were "A.1. Pressure ulcer" (9.82%), "B.5. Laxative or enema" (8.59%), "A.8. Agitation" (8.59%), "A.9. Over-sedation" (7.98%), "A.7. Haemorrhage" (6.75%) and "B.4. Antipsychotic" (6.75%). A simplified model was obtained using logistic regression, which included the variable "Number of drugs" and the triggers "Over-sedation", "Urinary catheterisation", "Readmission in 30 days", "Laxative or enema" and "Abrupt medication stop". This model showed a probability of 81% of correctly classifying charts with or without AE (p<0.001; 95% confidence interval: 0.763-0.871). A high number of triggers were associated with AE. The simplified model is capable of detecting a large number of AE with a minimum of elements. Copyright © 2017 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  13. Research and development of a high-temperature helium-leak detection system (joint research). Part 1 survey on leakage events and current leak detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Sakaba, Nariaki; Nakazawa, Toshio; Kawasaki, Kozo [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment; Urakami, Masao; Saisyu, Sadanori [Japan Atomic Power Co., Tokyo (Japan)

    2003-03-01

    In High Temperature Gas-cooled Reactors (HTGR), the detection of leakage of helium at an early stage is very important for the safety and stability of operations. Since helium is a colourless gas, it is generally difficult to identify the location and the amount of leakage when very little leakage has occurred. The purpose of this R and D is to develop a helium leak detection system for the high temperature environment appropriate to the HTGR. As the first step in the development, this paper describes the result of surveying leakage events at nuclear facilities inside and outside Japan and current gas leakage detection technology to adapt optical-fibre detection technology to HTGRs. (author)

  14. A Data-Driven Monitoring Technique for Enhanced Fall Events Detection

    KAUST Repository

    Zerrouki, Nabil

    2016-07-26

    Fall detection is a crucial issue in the health care of seniors. In this work, we propose an innovative method for detecting falls via simple human body descriptors. The extracted features are discriminative enough to describe human postures and not too computationally complex to allow fast processing. Fall detection is addressed as a statistical anomaly detection problem. The proposed approach combines principal component analysis (PCA) modeling with the exponentially weighted moving average (EWMA) monitoring chart. The EWMA scheme is applied to the ignored principal components to detect the presence of falls. Using two different fall detection datasets, URFD and FDD, we have demonstrated the greater sensitivity and effectiveness of the developed method over conventional PCA-based methods.
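The PCA-plus-EWMA scheme summarised above can be sketched in a few lines: fit PCA on normal-activity feature vectors, monitor the magnitude of the projection onto the ignored (discarded) components, and flag samples where the EWMA statistic exceeds its control limit. This is an illustrative reconstruction under our own assumptions (function names, λ and k values are ours, not the authors'):

```python
import numpy as np

def fit_pca(X, n_keep):
    """Fit PCA on normal-activity features; return (mean, kept, discarded axes)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_keep], vt[n_keep:]

def residual_magnitude(x, mean, discarded):
    """Norm of the projection of x onto the ignored principal components."""
    return float(np.linalg.norm((x - mean) @ discarded.T))

def ewma_alarms(stream, mu, sigma, lam=0.2, k=3.0):
    """EWMA control chart on residual magnitudes; True where it signals.
    mu and sigma come from the normal-activity training residuals."""
    limit = mu + k * sigma * np.sqrt(lam / (2.0 - lam))
    z, flags = mu, []
    for r in stream:
        z = lam * r + (1.0 - lam) * z
        flags.append(z > limit)
    return flags
```

A fall produces feature vectors far from the normal-posture subspace, so the residual magnitude, and with it the EWMA statistic, jumps above the control limit.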

  15. INTEGRAL Detection of the First Prompt Gamma-Ray Signal Coincident with the Gravitational-wave Event GW170817

    DEFF Research Database (Denmark)

    Savchenko, V.; Ferrigno, C.; Kuulkers, E.

    2017-01-01

    We report the INTernational Gamma-ray Astrophysics Laboratory (INTEGRAL) detection of the short gamma-ray burst GRB 170817A (discovered by Fermi-GBM) with a signal-to-noise ratio of 4.6, and, for the first time, its association with the gravitational waves (GWs) from binary neutron star (BNS......) merging event GW170817 detected by the LIGO and Virgo observatories. The significance of association between the gamma-ray burst observed by INTEGRAL and GW170817 is 3.2σ, while the association between the Fermi-GBM and INTEGRAL detections is 4.2σ. GRB 170817A was detected by the SPI-ACS instrument about...

  16. One Novel Multiple-Target Plasmid Reference Molecule Targeting Eight Genetically Modified Canola Events for Genetically Modified Canola Detection.

    Science.gov (United States)

    Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao

    2017-09-27

    Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes for matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and the performance of pCAN in the analysis of various canola samples. The LODs are 15 copies for the RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R²) are all higher than 0.99. The quantification bias varied from 0.47% to 20.68%, with relative standard deviation (RSD) from 1.06% to 24.61%, in the quantification of simulated samples. Furthermore, 10 practical canola samples taken from imported shipments at the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those of assays using commercial certified materials as the calibrator. From these results, we conclude that the newly developed pCAN plasmid is a good candidate plasmid DNA reference material for the detection and quantification of the eight GM canola events in routine analysis.
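The PCR efficiency figures quoted above come from the slope of a standard curve of Ct versus log10(copy number) built from serial dilutions of the calibrator, with efficiency E = 10^(-1/slope) - 1 (E = 1 corresponds to 100%, i.e. a slope of about -3.32). A minimal sketch of that calibration arithmetic (our own code, not the authors'):

```python
def fit_standard_curve(log_copies, cts):
    """Least-squares fit of Ct vs log10(copies); return slope, intercept,
    and amplification efficiency E = 10**(-1/slope) - 1."""
    n = len(cts)
    mx = sum(log_copies) / n
    my = sum(cts) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log_copies, cts))
    sxx = sum((x - mx) ** 2 for x in log_copies)
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency
```

An unknown sample's copy number is then read off the same curve as 10**((Ct - intercept) / slope).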

  17. INTEGRAL Detection of the First Prompt Gamma-Ray Signal Coincident with the Gravitational-wave Event GW170817

    Energy Technology Data Exchange (ETDEWEB)

    Savchenko, V.; Ferrigno, C.; Bozzo, E.; Courvoisier, T. J.-L. [ISDC, Department of Astronomy, University of Geneva, Chemin d’Écogia, 16 CH-1290 Versoix (Switzerland); Kuulkers, E. [European Space Research and Technology Centre (ESA/ESTEC), Keplerlaan 1, 2201 AZ Noordwijk (Netherlands); Bazzano, A.; Natalucci, L.; Rodi, J. [INAF-Institute for Space Astrophysics and Planetology, Via Fosso del Cavaliere 100, I-00133-Rome (Italy); Brandt, S.; Chenevez, J. [DTU Space, National Space Institute Elektrovej, Building 327 DK-2800 Kongens Lyngby (Denmark); Diehl, R.; Von Kienlin, A. [Max-Planck-Institut für Extraterrestrische Physik, Garching (Germany); Domingo, A. [Centro de Astrobiología (CAB-CSIC/INTA, ESAC Campus), Camino bajo del Castillo S/N, E-28692 Villanueva de la Cañada, Madrid (Spain); Hanlon, L.; Martin-Carrillo, A. [Space Science Group, School of Physics, University College Dublin, Belfield, Dublin 4 (Ireland); Jourdain, E. [IRAP, Université de Toulouse, CNRS, UPS, CNES, 9 Av. Roche, F-31028 Toulouse (France); Laurent, P.; Lebrun, F. [APC, AstroParticule et Cosmologie, Université Paris Diderot, CNRS/IN2P3, CEA/Irfu, Observatoire de Paris Sorbonne Paris Cité, 10 rue Alice Domont et Léonie Duquet, F-75205 Paris Cedex 13 (France); Lutovinov, A. [Space Research Institute of Russian Academy of Sciences, Profsoyuznaya 84/32, 117997 Moscow (Russian Federation); Mereghetti, S. [INAF, IASF-Milano, via E.Bassini 15, I-20133 Milano (Italy); and others

    2017-10-20

    We report the INTernational Gamma-ray Astrophysics Laboratory ( INTEGRAL ) detection of the short gamma-ray burst GRB 170817A (discovered by Fermi -GBM) with a signal-to-noise ratio of 4.6, and, for the first time, its association with the gravitational waves (GWs) from binary neutron star (BNS) merging event GW170817 detected by the LIGO and Virgo observatories. The significance of association between the gamma-ray burst observed by INTEGRAL and GW170817 is 3.2σ, while the association between the Fermi -GBM and INTEGRAL detections is 4.2σ. GRB 170817A was detected by the SPI-ACS instrument about 2 s after the end of the GW event. We measure a fluence of (1.4 ± 0.4 ± 0.6) × 10{sup −7} erg cm{sup −2} (75–2000 keV), where, respectively, the statistical error is given at the 1σ confidence level, and the systematic error corresponds to the uncertainty in the spectral model and instrument response. We also report on the pointed follow-up observations carried out by INTEGRAL , starting 19.5 hr after the event, and lasting for 5.4 days. We provide a stringent upper limit on any electromagnetic signal in a very broad energy range, from 3 keV to 8 MeV, constraining the soft gamma-ray afterglow flux to <7.1 × 10{sup −11} erg cm{sup −2} s{sup −1} (80–300 keV). Exploiting the unique capabilities of INTEGRAL , we constrained the gamma-ray line emission from radioactive decays that are expected to be the principal source of the energy behind a kilonova event following a BNS coalescence. Finally, we put a stringent upper limit on any delayed bursting activity, for example, from a newly formed magnetar.

  18. Evaluation of outbreak detection performance using multi-stream syndromic surveillance for influenza-like illness in rural Hubei Province, China: a temporal simulation model based on healthcare-seeking behaviors.

    Science.gov (United States)

    Fan, Yunzhou; Wang, Ying; Jiang, Hongbo; Yang, Wenwen; Yu, Miao; Yan, Weirong; Diwan, Vinod K; Xu, Biao; Dong, Hengjin; Palm, Lars; Nie, Shaofa

    2014-01-01

    Syndromic surveillance promotes the early detection of disease outbreaks. Although syndromic surveillance has increased in developing countries, its outbreak detection performance, particularly in the case of multi-stream surveillance, has scarcely been evaluated in rural areas. This study introduces a temporal simulation model based on healthcare-seeking behaviors to evaluate the performance of multi-stream syndromic surveillance for influenza-like illness. Data were obtained in six towns of rural Hubei Province, China, from April 2012 to June 2013. A Susceptible-Exposed-Infectious-Recovered model generated 27 scenarios of simulated influenza A (H1N1) outbreaks, which were converted into corresponding simulated syndromic datasets through the healthcare-seeking behavior model. We then superimposed the converted syndromic datasets onto the observed baselines to create the testing datasets. Outbreak detection performance of single-stream surveillance of clinic visits, over-the-counter drug purchase frequency, and school absenteeism, and of multi-stream surveillance of their combinations, was evaluated using receiver operating characteristic curves and activity monitoring operation curves. In the six towns examined, clinic visit surveillance and school absenteeism surveillance exhibited better outbreak detection performance than over-the-counter drug purchase frequency surveillance; the performance of multi-stream surveillance was preferable to single-stream surveillance, particularly at low specificity (Sp performance of multi-stream surveillance.
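
    The Susceptible-Exposed-Infectious-Recovered (SEIR) model the abstract describes can be sketched with a simple deterministic simulation. The parameter values below (transmission rate, incubation and infectious periods, population size) are illustrative assumptions, not the values used in the study; the resulting daily onset curve is the kind of signal that would then be converted into syndromic data streams via a healthcare-seeking model.

```python
import numpy as np

def seir_daily_onsets(beta=0.6, sigma=1 / 3, gamma=1 / 4,
                      n=10_000, i0=5, days=120):
    """Deterministic SEIR outbreak with daily Euler steps.

    beta  - transmission rate per day (assumed)
    sigma - 1 / mean incubation period (assumed: 3 days)
    gamma - 1 / mean infectious period (assumed: 4 days)
    Returns the daily count of new symptom onsets (the E -> I flow),
    the quantity a simulated syndromic stream would be built from.
    """
    s, e, i = n - i0, 0.0, float(i0)
    onsets_per_day = []
    for _ in range(days):
        new_exposed = beta * s * i / n   # S -> E
        new_onsets = sigma * e           # E -> I
        recoveries = gamma * i           # I -> R
        s -= new_exposed
        e += new_exposed - new_onsets
        i += new_onsets - recoveries
        onsets_per_day.append(new_onsets)
    return np.array(onsets_per_day)

curve = seir_daily_onsets()
print(int(curve.argmax()))  # day of the simulated epidemic peak
```

    Superimposing such a curve onto an observed baseline, as the study does, yields a labelled testing dataset on which ROC curves can be computed.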

  19. Online surveillance of media health event reporting in Nepal: digital disease detection from a One Health perspective.

    Science.gov (United States)

    Schwind, Jessica S; Norman, Stephanie A; Karmacharya, Dibesh; Wolking, David J; Dixit, Sameer M; Rajbhandari, Rajesh M; Mekaru, Sumiko R; Brownstein, John S

    2017-09-21

    Traditional media and the internet are crucial sources of health information. Media can significantly shape public opinion, knowledge and understanding of emerging and endemic health threats. As digital communication rapidly progresses, local access and dissemination of health information contribute significantly to global disease detection and reporting. Health event reports in Nepal (October 2013-December 2014) were used to characterize Nepal's media environment from a One Health perspective using HealthMap - a global online disease surveillance and mapping tool. Event variables (location, media source type, disease or risk factor of interest, and affected species) were extracted from HealthMap. A total of 179 health reports were captured from various sources including newspapers, inter-government agency bulletins, individual reports, and trade websites, yielding 108 (60%) unique articles. Human health events were reported most often (n = 85; 79%), followed by animal health events (n = 23; 21%), with no reports focused solely on environmental health. By expanding event coverage across all of the health sectors, media in developing countries could play a crucial role in national risk communication efforts and could enhance early warning systems for disasters and disease outbreaks.

  20. Snake scales, partial exposure, and the Snake Detection Theory: A human event-related potentials study

    NARCIS (Netherlands)

    J.W. van Strien (Jan); L.A. Isbell (Lynne A.)

    2017-01-01

    Studies of event-related potentials in humans have established larger early posterior negativity (EPN) in response to pictures depicting snakes than to pictures depicting other creatures. Ethological research has recently shown that macaques and wild vervet monkeys respond strongly to

  1. Detection and monitoring of two dust storm events by multispectral modis images.

    Digital Repository Service at National Institute of Oceanography (India)

    Mehta P.S.; Kunte, P.D.

    of Oman, over the Arabian Sea to the coast of Pakistan. The dust storm lasted over the Arabian Sea until 30 March. MODIS sensors on both the Terra and Aqua satellites captured images of both events. From the difference in emissive/transmissive nature...

  2. Oil spill detection and remote sensing : an overview with focus on recent events

    International Nuclear Information System (INIS)

    Coolbaugh, T.S.

    2008-01-01

    Several offshore oil spills occurred during the period from November to December 2007 in various parts of the world, each highlighting the need to quickly detect oil spills in marine settings. Several factors must be considered in order to determine the best technical approach for successful detection and oil spill monitoring. These include the reason for detection or monitoring; the location of the spill; the scale of spatial coverage; availability of detection equipment and time to deploy; high specificity for petroleum oil; weather conditions at and above the spill site; and cost of the detection approach. This paper outlined some of the key attributes of several remote sensing options that are available today or being considered. The approaches used to enhance visualization or detection of spills include traditional electromagnetic spectrum-based approaches such as ultraviolet (UV), visible, infrared (IR), radar, and fluorescence-based systems. Analytical approaches such as chemical analysis for the detection of volatile organic compounds (VOCs) or monitoring of the electrical conductivity of the water surface may also provide a warning that hydrocarbons have been released. 12 refs., 3 tabs., 11 figs

  3. Control Surface Fault Diagnosis with Specified Detection Probability - Real Event Experiences

    DEFF Research Database (Denmark)

    Hansen, Søren; Blanke, Mogens

    2013-01-01

    desired levels of false alarms and detection probabilities. Self-tuning residual generators are employed for diagnosis and are combined with statistical change detection to form a setup for robust fault diagnosis. On-line estimation of test statistics is used to obtain a detection threshold and a desired...... false alarm probability. A data based method is used to determine the validity of the methods proposed. Verification is achieved using real data and shows that the presented diagnosis method is efficient and could have avoided incidents where faults led to loss of aircraft....

  4. Detecting Specific Health-Related Events Using an Integrated Sensor System for Vital Sign Monitoring

    Directory of Open Access Journals (Sweden)

    Mourad Adnane

    2009-09-01

    In this paper, a new method for the detection of apnea/hypopnea periods in physiological data is presented. The method is based on the intelligent combination of an integrated sensor system for long-term cardiorespiratory signal monitoring and dedicated signal-processing packages. The integrated sensors are a PVDF film and conductive fabric sheets. The signal-processing package includes dedicated respiratory cycle (RC) and QRS complex detection algorithms and a new method using the respiratory cycle variability (RCV) for detecting apnea/hypopnea periods in physiological data. Results show that our method is suitable for online analysis of long time-series data.

  5. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

    Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model-based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n subsets, which is intractable. Models compress this: if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model-based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.

  6. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    Science.gov (United States)

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. Firstly, the spectrum of each observation is transformed using a discrete wavelet with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated and its significance used to detect outliers. An alarm of a contamination event is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
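
    The outlier-detection stage described above (PCA on baseline spectra followed by Hotelling's T² on the retained components) can be sketched as follows. This is a minimal numpy-only illustration on synthetic "spectra": the wavelet preprocessing, the coiflet choice, and the sequential Bayesian alarm logic of the paper are omitted, and the threshold rule used here (maximum baseline score) is an assumption for the sketch.

```python
import numpy as np

def fit_t2_detector(baseline, n_components=3):
    """Fit PCA on baseline spectra; return a Hotelling's T^2 scorer
    and a detection threshold (the largest T^2 seen on the baseline)."""
    mean = baseline.mean(axis=0)
    _, s, vt = np.linalg.svd(baseline - mean, full_matrices=False)
    comps = vt[:n_components]                          # principal directions
    var = s[:n_components] ** 2 / (len(baseline) - 1)  # variance per component

    def t2(x):
        scores = (x - mean) @ comps.T
        return np.sum(scores ** 2 / var, axis=1)       # Hotelling's T^2

    return t2, t2(baseline).max()

# Synthetic spectra: 3 latent components over 50 wavelength bins.
rng = np.random.default_rng(0)
loadings = rng.normal(size=(3, 50))
baseline = rng.normal(size=(200, 3)) @ loadings + 0.1 * rng.normal(size=(200, 50))
contaminated = 10 * np.ones((5, 3)) @ loadings   # far outside the baseline scores

t2, thr = fit_t2_detector(baseline)
flags = (t2(np.vstack([baseline[:5], contaminated])) > thr).tolist()
print(flags)
```

    In the paper's setting, a single flagged observation would not raise the alarm; the sequential Bayesian step waits for outliers to persist over several consecutive observations.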

  7. Model-based sensor diagnosis

    International Nuclear Information System (INIS)

    Milgram, J.; Dormoy, J.L.

    1994-09-01

    Running a nuclear power plant involves monitoring data provided by the installation's sensors. Operators and computerized systems then use these data to establish a diagnostic of the plant. However, the instrumentation system is complex, and is not immune to faults and failures. This paper presents a system for detecting sensor failures using a topological description of the installation and a set of component models. This model of the plant implicitly contains relations between sensor data. These relations must always be checked if all the components are functioning correctly. The failure detection task thus consists of checking these constraints. The constraints are extracted in two stages. Firstly, a qualitative model of their existence is built using structural analysis. Secondly, the models are formally handled according to the results of the structural analysis, in order to establish the constraints on the sensor data. This work constitutes an initial step in extending model-based diagnosis, as the information on which it is based is suspect. This work will be followed by surveillance of the detection system. When the instrumentation is assumed to be sound, the unverified constraints indicate errors on the plant model. (authors). 8 refs., 4 figs

  8. Mining the IPTV Channel Change Event Stream to Discover Insight and Detect Ads

    Directory of Open Access Journals (Sweden)

    Matej Kren

    2016-01-01

    IPTV has been widely deployed throughout the world, bringing significant advantages to users in terms of the channel offering, video on demand, and interactive applications. One aspect that has been often neglected is the ability of precise and unobtrusive telemetry. TV set-top boxes that are deployed in modern IPTV systems can be thought of as capable sensor nodes that collect vast amounts of data, representing both the user activity and the quality of service delivered by the system itself. In this paper we focus on the user-generated events and analyze how the data stream of channel change events received from the entire IPTV network can be mined to obtain insight about the content. We demonstrate that it is possible to predict the occurrence of TV ads with high probability and show that the approach could be extended to model the user behavior and classify the viewership in multiple dimensions.
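
    As a simple illustration of mining a channel-change stream, the sketch below flags minutes whose aggregate change rate is a robust outlier against the baseline; synchronized "zapping" bursts of this kind are the signature an ad detector would look for. This is a hypothetical stand-in using a median/MAD rule, not the model from the paper.

```python
import numpy as np

def ad_break_candidates(changes_per_minute, z_thresh=3.0):
    """Flag minutes whose channel-change count deviates strongly from
    the baseline, using a robust z-score (median / MAD)."""
    x = np.asarray(changes_per_minute, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) or 1.0   # guard against zero spread
    z = 0.6745 * (x - med) / mad              # ~N(0,1) under a normal baseline
    return np.flatnonzero(z > z_thresh)

# A quiet baseline of ~10 changes/min with a zapping burst at minute 30.
counts = [9, 10, 11] * 20
counts[30] = 60
suspect = ad_break_candidates(counts).tolist()
print(suspect)
```

    A production system would compute the baseline per channel and per daypart rather than over one global window.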

  9. The association of incidentally detected heart valve calcification with future cardiovascular events

    OpenAIRE

    Gondrie, Martijn J. A.; van der Graaf, Yolanda; Jacobs, Peter C.; Oen, Ay L.; Mali, Willem P. Th. M.

    2010-01-01

    Objectives This study aims to investigate the prognostic value of incidental aortic valve calcification (AVC), mitral valve calcification (MVC) and mitral annular calcification (MAC) for cardiovascular events and non-rheumatic valve disease in particular on routine diagnostic chest CT. Methods The study followed a case-cohort design. 10410 patients undergoing chest CT were followed for a median period of 17 months. Patients referred for cardiovascular disease were excluded. A random sample of...

  10. Passive Visual Analytics of Social Media Data for Detection of Unusual Events

    OpenAIRE

    Rustagi, Kush; Chae, Junghoon

    2016-01-01

    Now that social media sites have gained substantial traction, huge amounts of unanalyzed, valuable data are being generated. Posts containing images and text have spatiotemporal data attached as well, and they are of immense value for increasing situational awareness of local events, providing insights for investigations, and understanding the extent of incidents, their severity and consequences, as well as their time-evolving nature. However, the large volume of unstructured social media data hinders...

  11. Enriched encoding: reward motivation organizes cortical networks for hippocampal detection of unexpected events.

    Science.gov (United States)

    Murty, Vishnu P; Adcock, R Alison

    2014-08-01

    Learning how to obtain rewards requires learning about their contexts and likely causes. How do long-term memory mechanisms balance the need to represent potential determinants of reward outcomes with the computational burden of an over-inclusive memory? One solution would be to enhance memory for salient events that occur during reward anticipation, because all such events are potential determinants of reward. We tested whether reward motivation enhances encoding of salient events like expectancy violations. During functional magnetic resonance imaging, participants performed a reaction-time task in which goal-irrelevant expectancy violations were encountered during states of high- or low-reward motivation. Motivation amplified hippocampal activation to and declarative memory for expectancy violations. Connectivity of the ventral tegmental area (VTA) with medial prefrontal, ventrolateral prefrontal, and visual cortices preceded and predicted this increase in hippocampal sensitivity. These findings elucidate a novel mechanism whereby reward motivation can enhance hippocampus-dependent memory: anticipatory VTA-cortical-hippocampal interactions. Further, the findings integrate literatures on dopaminergic neuromodulation of prefrontal function and hippocampus-dependent memory. We conclude that during reward motivation, VTA modulation induces distributed neural changes that amplify hippocampal signals and records of expectancy violations to improve predictions: a potentially unique contribution of the hippocampus to reward learning. © The Author 2013. Published by Oxford University Press. All rights reserved.

  12. Detection of explosive cough events in audio recordings by internal sound analysis.

    Science.gov (United States)

    Rocha, B M; Mendes, L; Couceiro, R; Henriques, J; Carvalho, P; Paiva, R P

    2017-07-01

    We present a new method for the discrimination of explosive cough events, which is based on a combination of spectral content descriptors and pitch-related features. After the removal of near-silent segments, a vector of event boundaries is obtained and a proposed set of 9 features is extracted for each event. Two data sets, recorded using electronic stethoscopes and comprising a total of 46 healthy subjects and 13 patients, were employed to evaluate the method. The proposed feature set is compared to three other sets of descriptors: a baseline, a combination of both sets, and an automatic selection of the best 10 features from both sets. The combined feature set yields good results on the cross-validated database, attaining a sensitivity of 92.3±2.3% and a specificity of 84.7±3.3%. Moreover, this feature set seems to generalize well when it is trained on a small data set of patients, with a variety of respiratory and cardiovascular diseases, and tested on a larger data set of mostly healthy subjects: a sensitivity of 93.4% and a specificity of 83.4% are achieved under those conditions. These results demonstrate that complementing the proposed feature set with a baseline set is a promising approach.
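
    The first step the abstract describes, removing near-silent segments to obtain a vector of event boundaries, can be sketched with a short-time energy gate. The frame length and the -40 dB silence threshold are illustrative assumptions, and the paper's 9 per-event descriptors are not reproduced here.

```python
import numpy as np

def event_boundaries(x, fs, frame_ms=25, silence_db=-40.0):
    """Return (start, end) sample indices of non-silent events, found by
    thresholding per-frame energy relative to the loudest frame."""
    hop = int(fs * frame_ms / 1000)
    n = len(x) // hop
    frames = np.asarray(x[: n * hop], dtype=float).reshape(n, hop)
    energy = (frames ** 2).mean(axis=1)
    db = 10 * np.log10(energy / (energy.max() + 1e-20) + 1e-20)
    mask = np.concatenate(([False], db > silence_db, [False]))
    edges = np.flatnonzero(np.diff(mask.astype(int)))
    # edges alternate rise/fall; convert frame indices back to samples
    return [(s * hop, e * hop) for s, e in zip(edges[::2], edges[1::2])]

# Synthetic recording: 1 s of silence with one 125 ms tone burst at 0.25 s.
fs = 16_000
x = np.zeros(fs)
x[4000:6000] = np.sin(2 * np.pi * 1000 * np.arange(2000) / fs)
events = event_boundaries(x, fs)
print(events)
```

    Each detected segment would then be passed to the feature extractor and the cough/non-cough classifier.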

  13. ¹²³I-MIBG imaging detects cardiac involvement and predicts cardiac events in Churg-Strauss syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Horiguchi, Yoriko; Morita, Yukiko [National Hospital Organization Sagamihara National Hospital, Department of Cardiology, Sagamihara City, Kanagawa (Japan); Tsurikisawa, Naomi; Akiyama, Kazuo [National Hospital Organization Sagamihara National Hospital, Clinical Research Centre for Allergy and Rheumatology, Sagamihara City, Kanagawa (Japan)

    2011-02-15

    In Churg-Strauss syndrome (CSS) it is important to detect cardiac involvement, which predicts poor prognosis. This study evaluated whether ¹²³I-metaiodobenzylguanidine (MIBG) scintigraphy could detect cardiac damage and predict cardiac events in CSS. ¹²³I-MIBG scintigraphy was performed in 28 patients with CSS, 12 of whom had cardiac involvement. The early and delayed heart to mediastinum ratio (early H/M and delayed H/M) and washout rate were calculated by using ¹²³I-MIBG scintigraphy and compared with those in control subjects. Early H/M and delayed H/M were significantly lower and the washout rate was significantly higher in patients with cardiac involvement than in those without and in controls (early H/M, p = 0.0024, p = 0.0001; delayed H/M, p = 0.0002, p = 0.0001; washout rate, p = 0.0012, p = 0.0052 vs those without and vs controls, respectively). Accuracy for detecting cardiac involvement was 86% for delayed H/M and washout rate and 79% for early H/M and B-type natriuretic peptide (BNP). Kaplan-Meier analysis showed significantly lower cardiac event-free rates in patients with early H/M ≤ 2.18 and BNP > 21.8 pg/ml than those with early H/M > 2.18 and BNP ≤ 21.8 pg/ml (log-rank test p = 0.006). Cardiac sympathetic nerve function was damaged in CSS patients with cardiac involvement. ¹²³I-MIBG scintigraphy was useful in detecting cardiac involvement and in predicting cardiac events. (orig.)

  14. ¹²³I-MIBG imaging detects cardiac involvement and predicts cardiac events in Churg-Strauss syndrome

    International Nuclear Information System (INIS)

    Horiguchi, Yoriko; Morita, Yukiko; Tsurikisawa, Naomi; Akiyama, Kazuo

    2011-01-01

    In Churg-Strauss syndrome (CSS) it is important to detect cardiac involvement, which predicts poor prognosis. This study evaluated whether ¹²³I-metaiodobenzylguanidine (MIBG) scintigraphy could detect cardiac damage and predict cardiac events in CSS. ¹²³I-MIBG scintigraphy was performed in 28 patients with CSS, 12 of whom had cardiac involvement. The early and delayed heart to mediastinum ratio (early H/M and delayed H/M) and washout rate were calculated by using ¹²³I-MIBG scintigraphy and compared with those in control subjects. Early H/M and delayed H/M were significantly lower and the washout rate was significantly higher in patients with cardiac involvement than in those without and in controls (early H/M, p = 0.0024, p = 0.0001; delayed H/M, p = 0.0002, p = 0.0001; washout rate, p = 0.0012, p = 0.0052 vs those without and vs controls, respectively). Accuracy for detecting cardiac involvement was 86% for delayed H/M and washout rate and 79% for early H/M and B-type natriuretic peptide (BNP). Kaplan-Meier analysis showed significantly lower cardiac event-free rates in patients with early H/M ≤ 2.18 and BNP > 21.8 pg/ml than those with early H/M > 2.18 and BNP ≤ 21.8 pg/ml (log-rank test p = 0.006). Cardiac sympathetic nerve function was damaged in CSS patients with cardiac involvement. ¹²³I-MIBG scintigraphy was useful in detecting cardiac involvement and in predicting cardiac events. (orig.)

  15. The Diurnal Variation of the Wimp Detection Event Rates in Directional Experiments

    CERN Document Server

    Vergados, J D

    2009-01-01

    The recent WMAP data have confirmed that exotic dark matter together with the vacuum energy (cosmological constant) dominate in the flat Universe. Modern particle theories naturally provide viable cold dark matter candidates with masses in the GeV-TeV region. Supersymmetry provides the lightest supersymmetric particle (LSP), theories in extra dimensions supply the lightest Kaluza-Klein particle (LKP), etc. The nature of dark matter can be unraveled only by its direct detection in the laboratory. All such candidates will be called WIMPs (Weakly Interacting Massive Particles). In any case the direct dark matter search, which amounts to detecting the recoiling nucleus following its collision with a WIMP, is central to particle physics and cosmology. In this work we briefly review the theoretical elements relevant to the direct dark matter detection experiments, paying particular attention to directional experiments, i.e. experiments in which not only the energy but also the direction of the recoiling nucleus is ob...

  16. Short Personality and Life Event scale for detection of suicide attempters.

    Science.gov (United States)

    Artieda-Urrutia, Paula; Delgado-Gómez, David; Ruiz-Hernández, Diego; García-Vega, Juan Manuel; Berenguer, Nuria; Oquendo, Maria A; Blasco-Fontecilla, Hilario

    2015-01-01

    To develop a brief and reliable psychometric scale to identify individuals at risk for suicidal behaviour. Case-control study. 182 individuals (61 suicide attempters, 57 psychiatric controls, and 64 psychiatrically healthy controls) aged 18 or older, admitted to the Emergency Department at Puerta de Hierro University Hospital in Madrid, Spain. All participants completed a form including their socio-demographic and clinical characteristics, and the Personality and Life Events scale (27 items). To assess Axis I diagnoses, all psychiatric patients (including suicide attempters) were administered the Mini International Neuropsychiatric Interview. Descriptive statistics were computed for the socio-demographic factors. Additionally, χ² independence tests were applied to evaluate differences in socio-demographic and clinical variables, and in the Personality and Life Events scale, between groups. A stepwise linear regression with backward variable selection was conducted to build the Short Personality Life Event (S-PLE) scale. In order to evaluate accuracy, a ROC analysis was conducted. The internal reliability was assessed using Cronbach's α, and the external reliability was evaluated using a test-retest procedure. The S-PLE scale, composed of just 6 items, showed good performance in discriminating between medical controls, psychiatric controls and suicide attempters in an independent sample. For instance, the S-PLE scale discriminated past suicide attempters from non-attempters with a sensitivity of 80% and a specificity of 75%. The area under the ROC curve was 88%. A factor analysis extracted only one factor, revealing a single dimension of the S-PLE scale. Furthermore, the S-PLE scale provides values of internal and external reliability between poor (test-retest: 0.55) and acceptable (Cronbach's α: 0.65) ranges. Administration time is about one minute. The S-PLE scale is a useful and accurate instrument for estimating the risk of suicidal behaviour in

  17. MEDUSA: Mining Events to Detect Undesirable uSer Actions in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, D.; Bolzoni, D.; Hartel, Pieter H.; Jha, Somesh; Sommer, Robin; Kreibich, Christian

    2010-01-01

    Standard approaches for detecting malicious behaviors, e.g. monitoring network traffic, cannot address process-related threats in SCADA(Supervisory Control And Data Acquisition) systems. These threats take place when an attacker gains user access rights and performs actions which look legitimate,

  18. Extranuclear detection of histones and nucleosomes in activated human lymphoblasts as an early event in apoptosis.

    NARCIS (Netherlands)

    Gabler, C.; Blank, N.; Hieronymus, T.; Schiller, M.; Berden, J.H.M.; Kalden, J.R.; Lorenz, H.M.

    2004-01-01

    OBJECTIVE: To evaluate the presence of histones and nucleosomes in cell lysates of freshly isolated peripheral blood mononuclear cells (PBMC), fully activated lymphoblasts, or lymphoblasts after induction of apoptosis. METHODS: Each histone class (H1, H2A, H2B, H3, and H4) was detected by western

  19. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    Science.gov (United States)

    2016-04-25


  20. myBlackBox: Blackbox Mobile Cloud Systems for Personalized Unusual Event Detection

    Directory of Open Access Journals (Sweden)

    Junho Ahn

    2016-05-01

    We demonstrate the feasibility of constructing a novel and practical real-world mobile cloud system, called myBlackBox, that efficiently fuses multimodal smartphone sensor data to identify and log unusual personal events in mobile users’ daily lives. The system incorporates a hybrid architectural design that combines unsupervised classification of audio, accelerometer and location data with supervised joint fusion classification to achieve high accuracy, customization, convenience and scalability. We show the feasibility of myBlackBox by implementing and evaluating this end-to-end system that combines Android smartphones with cloud servers, deployed for 15 users over a one-month period.

  1. Exploring design tradeoffs of a distributed algorithm for cosmic ray event detection

    Science.gov (United States)

    Yousaf, S.; Bakhshi, R.; van Steen, M.; Voulgaris, S.; Kelley, J. L.

    2013-03-01

    Many sensor networks, including large particle detector arrays measuring high-energy cosmic-ray air showers, traditionally rely on centralised trigger algorithms to find spatial and temporal coincidences of individual nodes. Such schemes suffer from scalability problems, especially if the nodes communicate wirelessly or have bandwidth limitations. However, nodes which instead communicate with each other can, in principle, use a distributed algorithm to find coincident events themselves without communication with a central node. We present such an algorithm and consider various design tradeoffs involved, in the context of a potential trigger for the Auger Engineering Radio Array (AERA).
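
    As a point of reference for the design tradeoffs discussed above, the sketch below implements the coincidence criterion itself in centralised form: report a shower candidate when at least `min_stations` distinct detectors trigger within a short time window. The window width and station threshold are illustrative assumptions; the paper's contribution is computing this decision among neighbouring nodes without the central merge.

```python
def find_coincidences(station_times, window=1e-6, min_stations=3):
    """Centralised reference coincidence search: merge all event
    timestamps, then report clusters in which at least `min_stations`
    distinct stations trigger within `window` seconds."""
    tagged = sorted((t, s) for s, times in enumerate(station_times)
                    for t in times)
    events, i = [], 0
    while i < len(tagged):
        j, stations = i, set()
        while j < len(tagged) and tagged[j][0] - tagged[i][0] <= window:
            stations.add(tagged[j][1])
            j += 1
        if len(stations) >= min_stations:
            events.append(tagged[i][0])   # report cluster start time
            i = j                         # consume the whole cluster
        else:
            i += 1
    return events

# Three stations: one genuine air shower near t = 1.0 s, plus noise hits.
showers = find_coincidences([[0.1, 1.0],
                             [1.0000002, 2.5],
                             [1.0000004, 3.7]])
print(showers)
```

    A distributed variant must reach the same decision with each node seeing only its neighbours' timestamps, which is where the bandwidth and scalability tradeoffs arise.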

  2. Analyzing direct dark matter detection data with unrejected background events by the AMIDAS website

    International Nuclear Information System (INIS)

    Shan, Chung-Lin

    2012-01-01

    In this talk I have presented the data analysis results of extracting properties of halo WIMPs: the mass and the (ratios between the) spin-independent and spin-dependent couplings/cross sections on nucleons by the AMIDAS website, taking into account possible unrejected background events in the analyzed data sets. Although a non-standard astronomical setup has been used to generate pseudodata sets for our analyses, it has been found that, without prior information/assumption about the local density and velocity distribution of halo Dark Matter, these WIMP properties have been reconstructed with ∼2% to ≲30% deviations from the input values.

  3. The Tendency of the Crest Factor Helps Detect Nascent Events; Electronic Circuit, Software and Applications to Signals from Diverse Fields

    Directory of Open Access Journals (Sweden)

    Núñez-Pérez Ricardo Francisco

    2014-04-01

    Among signal analysis techniques in the time domain, the crest factor (CF) is undoubtedly one of the simplest and fastest to implement using electronic circuits and/or software. For that reason it has been used reliably to monitor machinery and to evaluate power supply quality. One of the major manufacturers of instruments for these purposes, Bruel and Kjaer, defines the crest factor of a repetitive voltage or current signal as the ratio of the peak level to its rms value over a certain period of time. In this paper, we explore experimentally the potential of the CF and its trend to detect the onset and evolution of events in various fields of knowledge, either by generating it with a developed electronic circuit, or by computing it through routines implemented in the DADISP and LabVIEW programs. The results are validated by comparing the measured factors and trends against the proposed features and specifications. The results were acceptable, so the tools were applied to detect early faults in electrical machines, to identify differences in chaotic behavior between circuits with such dynamics, to detect abnormal respiratory distress or rales in patients, and to detect harmful distortions in the electrical current, all based on simulations and measurements for each of the 4 cases studied. Other original CF applications proposed are: (a) controlling chaos in electronic circuits that stir/mix industrial processes, and (b) correcting the power factor of non-linear and inductive loads. A medium-term study of a CF based on the maximum peak-to-peak signal value is contemplated, as it is thought that it can improve event detection.
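
    The crest factor defined above (peak level divided by the rms value over an observation window) is straightforward to compute. The sketch below, with an illustrative 50 Hz test signal rather than the authors' circuit or routines, shows how sharp nascent peaks raise the CF while barely moving the rms: a pure sine has CF = √2, and superimposed spikes push it well above that.

```python
import numpy as np

def crest_factor(x):
    """Crest factor: peak absolute level divided by the RMS value,
    computed over the whole record (one observation window)."""
    x = np.asarray(x, dtype=float)
    return np.abs(x).max() / np.sqrt((x ** 2).mean())

# One second of a 50 Hz sine sampled at 10 kHz.
fs = 10_000
t = np.arange(fs) / fs
sine = np.sin(2 * np.pi * 50 * t)

# Nascent fault: sparse sharp peaks raise the CF, barely moving the RMS.
spiky = sine.copy()
spiky[::500] += 5.0

cf_sine, cf_spiky = crest_factor(sine), crest_factor(spiky)
print(round(cf_sine, 3), round(cf_spiky, 3))
```

    Tracking this value over successive windows gives the trend whose growth the paper uses as an early warning.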

  4. ON THE DETECTABILITY OF A PREDICTED MESOLENSING EVENT ASSOCIATED WITH THE HIGH PROPER MOTION STAR VB 10

    International Nuclear Information System (INIS)

    Lépine, Sébastien; DiStefano, Rosanne

    2012-01-01

    Extrapolation of the astrometric motion of the nearby low-mass star VB 10 indicates that sometime in late 2011 December or during the first 2-3 months of 2012, the star will make a close approach to a background point source. Based on astrometric uncertainties, we estimate a 1 in 2 chance that the distance of closest approach ρ_min will be less than 100 mas, and a 1 in 5 chance that ρ_min will be smaller still. If VB 10 hosts a planet of roughly a Jupiter mass (M_J) on a moderately wide (≈0.18 AU–0.84 AU) orbit, there is a chance (1% to more than 10%, depending on the distance of closest approach and orbital period and inclination) that a passage of the planet closer to the background source will result in a secondary event of higher magnification. The detection of secondary events could be made possible with a several-times-per-night multi-site monitoring campaign.

  5. DETECTION OF SUPERSONIC DOWNFLOWS AND ASSOCIATED HEATING EVENTS IN THE TRANSITION REGION ABOVE SUNSPOTS

    Energy Technology Data Exchange (ETDEWEB)

    Kleint, L.; Martínez-Sykora, J. [Bay Area Environmental Research Institute, 625 2nd Street, Ste. 209, Petaluma, CA (United States); Antolin, P. [National Astronomical Observatory of Japan, 2-21-1 Osawa, Mitaka, Tokyo 181-8588 (Japan); Tian, H.; Testa, P.; Reeves, K. K.; McKillop, S.; Saar, S.; Golub, L. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Judge, P. [High Altitude Observatory/NCAR, P.O. Box 3000, Boulder, CO 80307 (United States); De Pontieu, B.; Wuelser, J. P.; Boerner, P.; Hurlburt, N.; Lemen, J.; Tarbell, T. D.; Title, A. [Lockheed Martin Solar and Astrophysics Laboratory, 3251 Hanover St., Org. ADBS, Bldg. 252, Palo Alto, CA 94304 (United States); Carlsson, M.; Hansteen, V. [Institute of Theoretical Astrophysics, University of Oslo, P.O. Box 1029, Blindern, NO-0315 Oslo (Norway); Jaeggli, S., E-mail: lucia.kleint@fhnw.ch [Department of Physics, Montana State University, Bozeman, P.O. Box 173840, Bozeman, MT 59717 (United States); and others

    2014-07-10

    Interface Region Imaging Spectrograph data allow us to study the solar transition region (TR) with an unprecedented spatial resolution of 0.''33. On 2013 August 30, we observed bursts of high Doppler shifts suggesting strong supersonic downflows of up to 200 km s⁻¹ and weaker, slightly slower upflows in the spectral lines Mg II h and k, C II 1336, and Si IV 1394 Å and 1403 Å, which are correlated with brightenings in the slitjaw images (SJIs). The bursty behavior lasts throughout the 2 hr observation, with average burst durations of about 20 s. The locations of these short-lived events appear to be the umbral and penumbral footpoints of EUV loops. Fast apparent downflows are observed along these loops in the SJIs and in the Atmospheric Imaging Assembly, suggesting that the loops are thermally unstable. We interpret the observations as cool material falling from coronal heights, and especially coronal rain produced along the thermally unstable loops, which leads to an increase of intensity at the loop footpoints, probably indicating an increase of density and temperature in the TR. The rain speeds are on the higher end of previously reported speeds for this phenomenon, and possibly higher than the free-fall velocity along the loops. On other observing days, similar bright dots are sometimes aligned into ribbons, resembling small flare ribbons. These observations provide a first insight into small-scale heating events in sunspots in the TR.

  6. NuSTAR Detection of X-Ray Heating Events in the Quiet Sun

    Science.gov (United States)

    Kuhar, Matej; Krucker, Säm; Glesener, Lindsay; Hannah, Iain G.; Grefenstette, Brian W.; Smith, David M.; Hudson, Hugh S.; White, Stephen M.

    2018-04-01

    The explanation of the coronal heating problem potentially lies in the existence of nanoflares, numerous small-scale heating events occurring across the whole solar disk. In this Letter, we present the first imaging spectroscopy X-ray observations of three quiet Sun flares during the Nuclear Spectroscopic Telescope ARray (NuSTAR) solar campaigns on 2016 July 26 and 2017 March 21, concurrent with the Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA) observations. Two of the three events showed time lags of a few minutes between peak X-ray and extreme ultraviolet emissions. Isothermal fits with rather low temperatures in the range 3.2–4.1 MK and emission measures of (0.6–15) × 10⁴⁴ cm⁻³ describe their spectra well, resulting in thermal energies in the range (2–6) × 10²⁶ erg. NuSTAR spectra did not show any signs of a nonthermal or higher temperature component. However, as the estimated upper limits of (hidden) nonthermal energy are comparable to the thermal energy estimates, the lack of a nonthermal component in the observed spectra is not a constraining result. The estimated Geostationary Operational Environmental Satellite (GOES) classes from the fitted values of temperature and emission measure fall between 1/1000 and 1/100 of the A class level, making them eight orders of magnitude fainter in soft X-ray flux than the largest solar flares.

  7. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring a substantial input of time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset that had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a 54.8% reduction in the images requiring further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
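
    The background-subtraction step can be sketched as follows; the grayscale frames and both thresholds (`diff_thresh`, `frac_thresh`) are illustrative assumptions, not the program's actual rules:

```python
import numpy as np

def categorize(frame, background, diff_thresh=25, frac_thresh=0.02):
    """Flag a camera-trap frame as a candidate event when enough pixels
    deviate from a static background model (simple background subtraction).
    Both thresholds are illustrative, not the published program's values."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    changed_frac = np.mean(diff > diff_thresh)   # fraction of changed pixels
    return "candidate" if changed_frac > frac_thresh else "non-target"

bg = np.full((120, 160), 100, dtype=np.uint8)    # synthetic flat background
empty = bg.copy()                                # nothing in front of the fence
animal = bg.copy()
animal[40:80, 60:100] = 180                      # a bright object in the scene
print(categorize(empty, bg))                     # non-target
print(categorize(animal, bg))                    # candidate
```

A real pipeline would additionally apply histogram rules over image sequences to rank the confidence of a crossing event.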

  8. Combined passive detection and ultrafast active imaging of cavitation events induced by short pulses of high-intensity ultrasound.

    Science.gov (United States)

    Gateau, Jérôme; Aubry, Jean-François; Pernot, Mathieu; Fink, Mathias; Tanter, Mickaël

    2011-03-01

    The activation of natural gas nuclei to induce larger bubbles is possible using short ultrasonic excitations of high amplitude, and is required for ultrasound cavitation therapies. However, little is known about the distribution of nuclei in tissues. Therefore, the acoustic pressure level necessary to generate bubbles in a targeted zone and their exact location are currently difficult to predict. To monitor the initiation of cavitation activity, a novel all-ultrasound technique sensitive to single nucleation events is presented here. It is based on combined passive detection and ultrafast active imaging over a large volume using the same multi-element probe. Bubble nucleation was induced using a focused transducer (660 kHz, f-number = 1) driven by a high-power electric burst (up to 300 W) of one to two cycles. Detection was performed with a linear array (4 to 7 MHz) aligned with the single-element focal point. In vitro experiments in gelatin gel and muscular tissue are presented. The synchronized passive detection enabled radio-frequency data to be recorded, comprising high-frequency coherent wave fronts as signatures of the acoustic emissions linked to the activation of the nuclei. Active change detection images were obtained by subtracting echoes collected in the unnucleated medium. These indicated the appearance of stable cavitating regions. Because of the ultrafast frame rate, active detection occurred as quickly as 330 μs after the high-amplitude excitation and the dynamics of the induced regions were studied individually.

  9. Using additional external inputs to forecast water quality with an artificial neural network for contamination event detection in source water

    Science.gov (United States)

    Schmidt, F.; Liu, S.

    2016-12-01

    Source water quality plays an important role in the safety of drinking water, and early detection of its contamination is vital for taking appropriate countermeasures. However, compared to drinking water, it is more difficult to detect contamination events in source water because its environment is less controlled and numerous natural causes contribute to a high variability of the background values. In this project, Artificial Neural Networks (ANNs) and a Contamination Event Detection Process (CED Process) were used to identify events in river water. The ANN models the response of basic water quality sensors, obtained in laboratory experiments, in an off-line learning stage and continuously forecasts future values of the time series in an on-line forecasting step. During this second stage, the CED Process compares the forecast to the measured value and classifies it as a regular background or event value, which modifies the ANN's continuous learning and influences its forecasts. In addition to this basic setup, external information is fed to the CED Process: a so-called Operator Input (OI) is provided to inform about unusual water quality levels that are unrelated to the presence of contamination, for example due to cooling water discharge from a nearby power plant. This study's primary goal is to evaluate how well the OI fits into the design of the combined forecasting ANN and CED Process and to understand its effects on the on-line forecasting stage. To test this, data from laboratory experiments conducted previously at the School of Environment, Tsinghua University, were used to perform simulations highlighting the features and drawbacks of this method. Applying the OI has been shown to have a positive influence on the ANN's ability to handle a sudden change in background values that is unrelated to contamination. However, it might also mask the presence of an event, an issue that underlines the necessity of having several instances of the algorithm running in parallel. 
Other difficulties
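
    The forecast-vs-measurement classification with an Operator Input can be sketched as follows; this is a minimal sketch in which the tolerance value and the hard OI override are illustrative assumptions, not the study's trained ANN and CED rules:

```python
def ced_classify(measured, forecast, tol, operator_input=False):
    """Classify one sensor reading as background or event by comparing it
    to the model forecast. An Operator Input (OI) marks known excursions
    unrelated to contamination (e.g., cooling-water discharge) so they
    are not flagged. Tolerance and override logic are illustrative."""
    deviation = abs(measured - forecast)
    if deviation <= tol or operator_input:
        return "background"
    return "event"

print(ced_classify(7.1, 7.0, tol=0.3))                       # background
print(ced_classify(8.2, 7.0, tol=0.3))                       # event
print(ced_classify(8.2, 7.0, tol=0.3, operator_input=True))  # background
```

The last call shows the masking risk discussed above: an OI raised during a genuine contamination event would silently suppress the alarm, which is why parallel instances of the algorithm are suggested.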

  10. Multiscale vision model for event detection and reconstruction in two-photon imaging data

    DEFF Research Database (Denmark)

    Brazhe, Alexey; Mathiesen, Claus; Lind, Barbara Lykke

    2014-01-01

    on a modified multiscale vision model, an object detection framework based on the thresholding of wavelet coefficients and hierarchical trees of significant coefficients followed by nonlinear iterative partial object reconstruction, for the analysis of two-photon calcium imaging data. The framework is discussed...... of the multiscale vision model is similar in the denoising, but provides a better segmentation of the image into meaningful objects, whereas other methods need to be combined with dedicated thresholding and segmentation utilities....
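
    As a toy stand-in for the thresholding of significant wavelet coefficients (not the full multiscale vision model with hierarchical coefficient trees and iterative reconstruction), a one-level Haar hard-threshold on a noisy event-like trace looks like:

```python
import numpy as np

def haar_denoise(x, k=3.0):
    """One-level Haar transform; hard-threshold the detail coefficients at
    k times a robust noise estimate, keep only 'significant' ones, invert."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    sigma = np.median(np.abs(detail)) / 0.6745 + 1e-12   # robust noise scale
    detail[np.abs(detail) < k * sigma] = 0.0             # drop insignificant
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(0)
signal = np.zeros(64)
signal[30:34] = 5.0                                      # a calcium-like event
noisy = signal + 0.1 * rng.standard_normal(64)
print(np.allclose(haar_denoise(noisy), signal, atol=0.5))  # True
```

The real framework works across many scales and reconstructs connected objects rather than thresholding a single level.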

  11. CISN ShakeAlert: Faster Warning Information Through Multiple Threshold Event Detection in the Virtual Seismologist (VS) Early Warning Algorithm

    Science.gov (United States)

    Cua, G. B.; Fischer, M.; Caprio, M.; Heaton, T. H.; Cisn Earthquake Early Warning Project Team

    2010-12-01

    The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of 3 EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system that could potentially be implemented in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the ongoing earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network since July 2008, and at the Northern California Seismic Network since February 2009. We discuss recent enhancements to the VS EEW algorithm that are being integrated into CISN ShakeAlert. We developed and continue to test a multiple-threshold event detection scheme, which uses different association/location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to initiate an event declaration, with the goal of reducing false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and the requirement of at least 4 picks as implemented by the Binder Earthworm phase associator) into an on-site/regional approach capable of providing a continuously evolving stream of EEW information starting from the first P-detection. 
Real-time and offline analysis on Swiss and California waveform datasets indicate that the
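
    The multiple-threshold declaration logic described above can be sketched as follows; the minimum of 4 picks follows the abstract, while the amplitude threshold value is an invented placeholder:

```python
def declare_event(picks):
    """Multiple-threshold detection sketch: a single very strong P pick
    declares an event immediately (on-site style, maximizing warning time);
    weaker picks must accumulate to at least 4 associated picks (regional
    style, reducing false alarms). Amplitude units/threshold illustrative."""
    STRONG_AMPLITUDE = 1.0     # placeholder threshold, not the VS value
    MIN_WEAK_PICKS = 4         # as in the Binder Earthworm associator
    if any(a >= STRONG_AMPLITUDE for a in picks):
        return True
    return len(picks) >= MIN_WEAK_PICKS

print(declare_event([1.7]))                  # True: one strong station
print(declare_event([0.2, 0.3]))             # False: too few weak picks
print(declare_event([0.2, 0.3, 0.1, 0.25]))  # True: 4 associated picks
```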

  12. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water

    Directory of Open Access Journals (Sweden)

    Michael D. Besmer

    2017-11-01

    Full Text Available Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing the overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit the median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources. 
This offers water utilities and public health authorities systematic ways to evaluate and optimize their
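
    The effect of sampling frequency on event detection can be illustrated with a back-of-the-envelope model (not the study's FCM-based analysis): if an event of duration d can be caught by any grab sample and samples are taken every Δ hours, the chance that at least one sample lands inside the event window is roughly d/Δ, capped at 1. The durations and intervals below are invented for illustration:

```python
def detection_probability(event_duration_h, sampling_interval_h):
    """Chance that at least one fixed-interval grab sample falls inside a
    contamination event window, assuming the event start is uniformly
    distributed in time. Purely illustrative, not the study's model."""
    return min(1.0, event_duration_h / sampling_interval_h)

week = 7 * 24.0
for label, interval in [("monthly", 30 * 24.0),
                        ("weekly", week),
                        ("twice-weekly", week / 2)]:
    p = detection_probability(48.0, interval)  # a hypothetical 2-day event
    print(f"{label:12s} sampling: P(detect) = {p:.2f}")
```

Even this crude model shows why event-triggered sampling (three targeted samples per precipitation event) beats raising the routine frequency: the probability is concentrated exactly where the peak occurs.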

  13. A new method for detecting interactions between the senses in event-related potentials

    DEFF Research Database (Denmark)

    Gondan, Matthias; Röder, B.

    2006-01-01

    Event-related potentials (ERPs) can be used in multisensory research to determine the point in time when different senses start to interact, for example, the auditory and the visual system. For this purpose, the ERP to bimodal stimuli (AV) is often compared to the sum of the ERPs to auditory (A) and visual (V) stimuli: AV - (A + V). If the result is non-zero, this is interpreted as an indicator of multisensory interactions. Using this method, several studies have demonstrated auditory-visual interactions as early as 50 ms after stimulus onset. The subtraction requires that A, V, and AV do not contain common activity: such activity would be subtracted twice from one ERP and would, therefore, contaminate the result. In the present study, ERPs to unimodal, bimodal, and trimodal auditory, visual, and tactile stimuli (T) were recorded. We demonstrate that (T + TAV) - (TA + TV) is equivalent to AV...
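
    The cancellation of common activity can be checked numerically. The toy ERPs below assume purely additive components and no true audio-visual interaction; under those assumptions the classic contrast is biased by the common activity, while the tactile-based contrast cancels it exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
common, a, v, t = rng.standard_normal((4, n))  # shared + unimodal activity

# Recorded ERPs under the additive model, with NO true A-V interaction:
A, V, AV = a + common, v + common, a + v + common
T, TA, TV = t + common, t + a + common, t + v + common
TAV = t + a + v + common

classic = AV - (A + V)             # common activity subtracted twice
corrected = (T + TAV) - (TA + TV)  # common activity cancels

print(np.allclose(classic, -common))  # True: bias equals -common
print(np.allclose(corrected, 0))      # True: unbiased null result
```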

  14. Study of the Convergence in State Estimators for LTI Systems with Event Detection

    Directory of Open Access Journals (Sweden)

    Juan C. Posada

    2016-01-01

    Full Text Available The methods frequently used to estimate the state of an LTI system require that the precise value of the output variable be known at all times, or at equidistant sampling times. In LTI systems in which the output signal is measured through binary sensors (detectors), the traditional approach to state observer design is not applicable even though the system has a complete observability matrix. This type of state observer design is known as passive. It is therefore necessary to introduce a new state estimation technique that allows the state to be reconstructed from information about the variable's crossings of a detector's action threshold (switch). This paper therefore studies the convergence of this type of estimator in finite time, allowing us to establish, theoretically, whether some family of the proposed models can be estimated in a convergent way through the use of this event-based estimation technique.
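
    A minimal scalar sketch of the idea: the observer runs the model open loop and corrects its estimate only at threshold-crossing events reported by the binary detector. The system, threshold, and initial conditions below are invented for illustration and are not from the paper:

```python
# Event-based estimation sketch for a scalar LTI system dx/dt = -a*x.
# A binary detector reports only when x crosses the threshold s; the
# observer corrects its estimate exactly at those crossing events.
a, s, dt = 0.5, 1.0, 1e-3
x, xhat = 3.0, 5.0               # true state vs. (wrong) initial estimate
for _ in range(20_000):
    x_next = x + dt * (-a * x)   # true plant (Euler step)
    xhat += dt * (-a * xhat)     # open-loop model prediction
    if (x - s) * (x_next - s) <= 0:  # detector fires: threshold crossing
        xhat = s                 # event-triggered correction
    x = x_next
print(abs(x - xhat) < 1e-2)      # True: estimate converges after the event
```

Between events the estimate can drift arbitrarily; convergence hinges on events actually occurring, which is exactly the kind of condition the paper studies.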

  15. A multiplex microplatform for the detection of multiple DNA methylation events using gold-DNA affinity.

    Science.gov (United States)

    Sina, Abu Ali Ibn; Foster, Matthew Thomas; Korbie, Darren; Carrascosa, Laura G; Shiddiky, Muhammad J A; Gao, Jing; Dey, Shuvashis; Trau, Matt

    2017-10-07

    We report a new multiplexed strategy for the electrochemical detection of regional DNA methylation across multiple regions. Using the sequence dependent affinity of bisulfite treated DNA towards gold surfaces, the method integrates the high sensitivity of a micro-fabricated multiplex device comprising a microarray of gold electrodes, with the powerful multiplexing capability of multiplex-PCR. The synergy of this combination enables the monitoring of the methylation changes across several genomic regions simultaneously from as low as 500 pg μl⁻¹ of DNA with no sequencing requirement.

  16. Fast joint detection-estimation of evoked brain activity in event-related FMRI using a variational approach

    Science.gov (United States)

    Chaari, Lotfi; Vincent, Thomas; Forbes, Florence; Dojat, Michel; Ciuciu, Philippe

    2013-01-01

    In standard within-subject analyses of event-related fMRI data, two steps are usually performed separately: detection of brain activity and estimation of the hemodynamic response. Because these two steps are inherently linked, we adopt the so-called region-based Joint Detection-Estimation (JDE) framework that addresses this joint issue using a multivariate inference for detection and estimation. JDE is built by making use of a regional bilinear generative model of the BOLD response and constraining the parameter estimation by physiological priors using temporal and spatial information in a Markovian model. In contrast to previous works that use Markov Chain Monte Carlo (MCMC) techniques to sample the resulting intractable posterior distribution, we recast the JDE into a missing data framework and derive a Variational Expectation-Maximization (VEM) algorithm for its inference. A variational approximation is used to approximate the Markovian model in the unsupervised spatially adaptive JDE inference, which allows automatic fine-tuning of spatial regularization parameters. It provides a new algorithm that exhibits interesting properties in terms of estimation error and computational cost compared to the previously used MCMC-based approach. Experiments on artificial and real data show that VEM-JDE is robust to model mis-specification and provides computational gain while maintaining good performance in terms of activation detection and hemodynamic shape recovery. PMID:23096056

  17. Microlensing events by Proxima Centauri in 2014 and 2016: Opportunities for mass determination and possible planet detection

    Energy Technology Data Exchange (ETDEWEB)

    Sahu, Kailash C.; Bond, Howard E.; Anderson, Jay [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Dominik, Martin, E-mail: ksahu@stsci.edu, E-mail: jayander@stsci.edu, E-mail: heb11@psu.edu, E-mail: md35@st-andrews.ac.uk [SUPA, School of Physics and Astronomy, University of St. Andrews, North Haugh, St. Andrews KY16 9SS (United Kingdom)

    2014-02-20

    We have found that Proxima Centauri, the star closest to our Sun, will pass close to a pair of faint background stars in the next few years. Using Hubble Space Telescope (HST) images obtained in 2012 October, we determine that the passage close to a mag 20 star will occur in 2014 October (impact parameter 1.''6), and to a mag 19.5 star in 2016 February (impact parameter 0.''5). As Proxima passes in front of these stars, the relativistic deflection of light will cause shifts in the positions of the background stars of ∼0.5 and 1.5 mas, respectively, readily detectable by HST imaging, and possibly by Gaia and ground-based facilities such as the Very Large Telescope. Measurement of these astrometric shifts offers a unique and direct method to measure the mass of Proxima. Moreover, if Proxima has a planetary system, the planets may be detectable through their additional microlensing signals, although the probability of such detections is small. With astrometric accuracies of 0.03 mas (achievable with HST spatial scanning), centroid shifts caused by Jovian planets are detectable at separations of up to 2.''0 (corresponding to 2.6 AU at the distance of Proxima), and centroid shifts by Earth-mass planets are detectable within a small band of 8 mas (corresponding to 0.01 AU) around the source trajectories. Jovian planets within a band of about 28 mas (corresponding to 0.036 AU) around the source trajectories would produce a brightening of the source by >0.01 mag and could hence be detectable. Estimated timescales of the astrometric and photometric microlensing events due to a planet range from a few hours to a few days, and both methods would provide direct measurements of the planetary mass.
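
    For wide passages, the astrometric centroid shift scales as θ_E²/θ_sep, where θ_E is the lens's Einstein radius. Using round-number values for Proxima's mass (~0.12 M_⊙) and distance (~1.3 pc), both assumptions for illustration rather than the paper's fitted values, a quick calculation reproduces the ~0.5 and ~1.5 mas shifts quoted above:

```python
import math

# Physical constants and unit conversions.
G, c = 6.674e-11, 2.998e8            # SI
M_SUN, PC = 1.989e30, 3.086e16       # kg, m
MAS = math.radians(1 / 3.6e6)        # 1 milliarcsecond in radians

def einstein_radius_mas(mass_msun, d_lens_pc):
    """theta_E for a background source effectively at infinity (1/d_S ~ 0)."""
    theta_e = math.sqrt(4 * G * mass_msun * M_SUN / c**2 / (d_lens_pc * PC))
    return theta_e / MAS

def centroid_shift_mas(theta_e_mas, sep_mas):
    """Astrometric shift in the wide-separation regime (sep >> theta_E)."""
    return theta_e_mas**2 / sep_mas

te = einstein_radius_mas(0.12, 1.30)           # ~27 mas for Proxima
print(round(centroid_shift_mas(te, 1600), 2))  # ~0.5 mas at the 1.6'' passage
print(round(centroid_shift_mas(te, 500), 2))   # ~1.5 mas at the 0.5'' passage
```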

  18. [Event-related EEG potentials associated with error detection in psychiatric disorder: literature review].

    Science.gov (United States)

    Balogh, Lívia; Czobor, Pál

    2010-01-01

    Error-related bioelectric signals constitute a special subgroup of event-related potentials. Researchers have identified two evoked potential components closely related to error processing, namely error-related negativity (ERN) and error positivity (Pe), and have linked these to specific cognitive functions. In our article, we first give a brief description of these components; then, based on the available literature, we review differences in error-related evoked potentials observed in patients across psychiatric disorders. The PubMed and Medline search engines were used to identify all relevant articles published between 2000 and 2009. For the purpose of the current paper we reviewed publications summarizing results of clinical trials. Patients suffering from schizophrenia, anorexia nervosa or borderline personality disorder exhibited a decrease in the amplitude of error negativity when compared with healthy controls, while in cases of depression and anxiety an increase in the amplitude has been observed. Some of the articles suggest that specific personality variables, such as impulsivity, perfectionism, negative emotions or sensitivity to punishment, underlie these electrophysiological differences. Research in the field of error-related electric activity has come into the focus of psychiatry only recently, thus the amount of available data is significantly limited. However, since this is a relatively new field of research, the results available at present are noteworthy and promising for future electrophysiological investigations in psychiatric disorders.

  19. A new computational method for the detection of horizontal gene transfer events.

    Science.gov (United States)

    Tsirigos, Aristotelis; Rigoutsos, Isidore

    2005-01-01

    In recent years, the increase in the amounts of available genomic data has made it easier to appreciate the extent to which organisms increase their genetic diversity through horizontally transferred genetic material. Such transfers have the potential to give rise to extremely dynamic genomes where a significant proportion of their coding DNA has been contributed by external sources. Because of the impact of these horizontal transfers on the ecological and pathogenic character of the recipient organisms, methods are continuously sought that are able to computationally determine which of the genes of a given genome are products of transfer events. In this paper, we introduce and discuss a novel computational method for identifying horizontal transfers that relies on a gene's nucleotide composition and obviates the need for knowledge of codon boundaries. In addition to being applicable to individual genes, the method can be easily extended to the case of clusters of horizontally transferred genes. With the help of an extensive and carefully designed set of experiments on 123 archaeal and bacterial genomes, we demonstrate that the new method exhibits significant improvement in sensitivity when compared to previously published approaches. In fact, it achieves an average relative improvement across genomes of between 11 and 41% compared to the Codon Adaptation Index method in distinguishing native from foreign genes. Our method's horizontal gene transfer predictions for 123 microbial genomes are available online at http://cbcsrv.watson.ibm.com/HGT/.
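
    The composition-based idea (no codon boundaries needed) can be illustrated with a simple dinucleotide-frequency distance; the scoring below is a toy stand-in for the paper's actual method, in which a gene whose k-mer composition diverges from the genome-wide background becomes a horizontal-transfer candidate:

```python
from collections import Counter
from itertools import product

def kmer_freqs(seq, k=2):
    """Normalized k-mer frequencies over the A/C/G/T alphabet."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {"".join(p): counts["".join(p)] / total
            for p in product("ACGT", repeat=k)}

def composition_distance(gene, genome, k=2):
    """L1 distance between a gene's and the genome's k-mer composition;
    codon boundaries are never used. Toy scoring, not the paper's."""
    f, g = kmer_freqs(gene, k), kmer_freqs(genome, k)
    return sum(abs(f[w] - g[w]) for w in f)

host = "ACGT" * 300          # toy genome with a 'native' composition
native_gene = "ACGT" * 25    # matches the host composition
foreign = "AAAT" * 25        # strongly AT-biased insert
print(composition_distance(native_gene, host)
      < composition_distance(foreign, host))  # True: foreign scores higher
```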

  20. Architecture design of the multi-functional wavelet-based ECG microprocessor for realtime detection of abnormal cardiac events.

    Science.gov (United States)

    Cheng, Li-Fang; Chen, Tung-Chien; Chen, Liang-Gee

    2012-01-01

    Most abnormal cardiac events, such as myocardial ischemia, acute myocardial infarction (AMI) and fatal arrhythmia, can be diagnosed through continuous electrocardiogram (ECG) analysis. According to recent clinical research, early detection of and alarming on such cardiac events can reduce the delay in reaching the hospital, and the clinical outcomes of these individuals can be greatly improved. Therefore, it would be helpful to have a long-term ECG monitoring system with the ability to identify abnormal cardiac events and provide realtime warnings for the user. The combination of a wireless body area sensor network (BASN) and an on-sensor ECG processor is a possible solution for this application. In this paper, we aim to design and implement a digital signal processor suitable for continuous ECG monitoring and alarming based on the continuous wavelet transform (CWT) through the proposed architectures, using both a programmable RISC processor and application-specific integrated circuits (ASICs) for performance optimization. According to the implementation results, the power consumption of the proposed processor integrated with an ASIC for CWT computation is only 79.4 mW. Compared with the single-RISC processor, a power reduction of about 91.6% is achieved.
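
    A single CWT scale of the kind such a processor computes can be sketched in a few lines; the Ricker kernel, scale, and synthetic trace are illustrative assumptions (the paper's design performs this in dedicated hardware):

```python
import numpy as np

def ricker(points, a):
    """Ricker ('Mexican hat') wavelet, a common CWT kernel for QRS-like
    peaks, sampled at `points` positions for width parameter `a`."""
    t = np.arange(points) - (points - 1) / 2
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

def cwt_row(x, scale):
    """One scale of a continuous wavelet transform via convolution."""
    return np.convolve(x, ricker(10 * scale + 1, scale), mode="same")

x = np.zeros(500)
x[200] = 1.0                        # a lone QRS-like impulse in a flat trace
row = cwt_row(x, scale=4)
print(int(np.argmax(np.abs(row))))  # 200: wavelet energy peaks at the beat
```

A detector would compute several such rows at scales matched to QRS, ST, and T-wave morphology and threshold the coefficient magnitudes.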

  1. Automated Detection of Obstructive Sleep Apnea Events from a Single-Lead Electrocardiogram Using a Convolutional Neural Network.

    Science.gov (United States)

    Urtnasan, Erdenebayar; Park, Jong-Uk; Joo, Eun-Yeon; Lee, Kyoung-Joung

    2018-04-23

    In this study, we propose a method for the automated detection of obstructive sleep apnea (OSA) from a single-lead electrocardiogram (ECG) using a convolutional neural network (CNN). A CNN model was designed with six optimized convolution layers including activation, pooling, and dropout layers. One-dimensional (1D) convolution, rectified linear units (ReLU), and max pooling were applied to the convolution, activation, and pooling layers, respectively. For training and evaluation of the CNN model, a single-lead ECG dataset was collected from 82 subjects with OSA and was divided into training (including data from 63 patients with 34,281 events) and testing (including data from 19 patients with 8571 events) datasets. Using this CNN model, a precision of 0.99, a recall of 0.99, and an F1-score of 0.99 were attained with the training dataset; these values were all 0.96 when the CNN was applied to the testing dataset. These results show that the proposed CNN model can be used to detect OSA accurately on the basis of a single-lead ECG. Ultimately, this CNN model may be used as a screening tool for those suspected to suffer from OSA.
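
    One 1D convolution, ReLU, and max-pooling stage of the kind stacked six deep in the model can be sketched in plain NumPy; the hand-set kernel and toy samples are illustrative, and the real model is of course trained rather than hand-tuned:

```python
import numpy as np

def conv1d(x, w):
    """Sliding dot product (the cross-correlation computed by 1D conv
    layers), with no padding ('valid' output length)."""
    return np.array([np.dot(x[i:i + len(w)], w)
                     for i in range(len(x) - len(w) + 1)])

def relu(x):
    return np.maximum(x, 0.0)

def maxpool(x, size=2):
    trimmed = x[: len(x) // size * size]      # drop the ragged tail
    return trimmed.reshape(-1, size).max(axis=1)

ecg = np.array([0.0, 0.1, 1.0, -0.5, 0.05, 0.0, 0.9, -0.4])  # toy samples
kernel = np.array([1.0, -1.0])                # crude slope detector
feat = maxpool(relu(conv1d(ecg, kernel)))
print(feat.shape)  # (3,)
```

Stacking several such stages and ending in a classifier head yields the per-segment apnea/normal decision described above.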

  2. Secondary scintillation yield of xenon with sub-percent levels of CO2 additive for rare-event detection

    Science.gov (United States)

    Henriques, C. A. O.; Freitas, E. D. C.; Azevedo, C. D. R.; González-Díaz, D.; Mano, R. D. P.; Jorge, M. R.; Fernandes, L. M. P.; Monteiro, C. M. B.; Gómez-Cadenas, J. J.; Álvarez, V.; Benlloch-Rodríguez, J. M.; Borges, F. I. G. M.; Botas, A.; Cárcel, S.; Carríon, J. V.; Cebrían, S.; Conde, C. A. N.; Díaz, J.; Diesburg, M.; Esteve, R.; Felkai, R.; Ferrario, P.; Ferreira, A. L.; Goldschmidt, A.; Gutiérrez, R. M.; Hauptman, J.; Hernandez, A. I.; Hernando Morata, J. A.; Herrero, V.; Jones, B. J. P.; Labarga, L.; Laing, A.; Lebrun, P.; Liubarsky, I.; López-March, N.; Losada, M.; Martín-Albo, J.; Martínez-Lema, G.; Martínez, A.; McDonald, A. D.; Monrabal, F.; Mora, F. J.; Moutinho, L. M.; Muñoz Vidal, J.; Musti, M.; Nebot-Guinot, M.; Novella, P.; Nygren, D. R.; Palmeiro, B.; Para, A.; Pérez, J.; Querol, M.; Renner, J.; Ripoll, L.; Rodríguez, J.; Rogers, L.; Santos, F. P.; dos Santos, J. M. F.; Simón, A.; Sofka, C.; Sorel, M.; Stiegler, T.; Toledo, J. F.; Torrent, J.; Tsamalaidze, Z.; Veloso, J. F. C. A.; Webb, R.; White, J. T.; Yahlali, N.; NEXT Collaboration

    2017-10-01

    Xe-CO2 mixtures are important alternatives to pure xenon in Time Projection Chambers (TPCs) based on secondary scintillation (electroluminescence) signal amplification, with applications in the important field of rare-event detection, such as searches for directional dark matter, double electron capture and double beta decay. The addition of CO2 to pure xenon at the level of 0.05-0.1% can significantly reduce the scale of electron diffusion, from 10 mm/√m to 2.5 mm/√m, with high impact on the discrimination efficiency of the events through pattern recognition of the topology of primary ionization trails. We have measured the electroluminescence (EL) yield of Xe-CO2 mixtures with sub-percent CO2 concentrations. We demonstrate that the EL production is still high in these mixtures, 70% and 35% relative to that produced in pure xenon, for CO2 concentrations around 0.05% and 0.1%, respectively. The contribution of the statistical fluctuations in EL production to the energy resolution increases with increasing CO2 concentration, being smaller than the contribution of the Fano factor for concentrations below 0.1% CO2.

  3. Cryogenic scintillators for rare events detection in the Edelweiss and EURECA experiments

    International Nuclear Information System (INIS)

    Verdier, M.A.

    2010-10-01

    The riddle of the dark matter in astrophysics could be solved by the detection of WIMPs (Weakly Interacting Massive Particles), particles that are predicted by supersymmetry. The direct detection of WIMPs requires a large mass of detectors, able to identify these particles against the background of natural radioactivity and cosmic rays. This thesis takes place within the framework of the EDELWEISS and the future EURECA experiments. These experiments use a technology based on two-channel cryogenic detectors (bolometers), working at a few tens of mK. They are composed of crystals in which the energy deposited by particle interactions produces a temperature increase (phonon signal), and where the ionization of the crystals results in either a charge or a photon signal, depending on their nature. In order to broaden the range of targets for scintillating bolometers, we have built a setup to study the scintillation of crystals cooled down to 3 K. It is based on a cryostat with a compact optical geometry allowing enhanced light collection. Thanks to an individual photon counting technique and a statistical treatment of the data, it allows us to measure the evolution of the light yields and the decay time components between room temperature and 3 K. This thesis thus presents the results obtained at 3 K on two crystals well known at room temperature: BGO (Bi4Ge3O12) and BaF2. We also study the luminescence properties of titanium-doped sapphire (Ti:Al2O3) under VUV excitation, cooled down to 8 K. (author)

  4. Principles of models based engineering

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, R.M.; Hefele, J.

    1996-11-01

    This report describes a Models Based Engineering (MBE) philosophy and implementation strategy that has been developed at Los Alamos National Laboratory's Center for Advanced Engineering Technology. A major theme in this discussion is that models based engineering is an information management technology enabling the development of information driven engineering. Unlike other information management technologies, models based engineering encompasses the breadth of engineering information, from design intent through product definition to consumer application.

  5. Hospital staff should use more than one method to detect adverse events and potential adverse events: incident reporting, pharmacist surveillance and local real‐time record review may all have a place

    Science.gov (United States)

    Olsen, Sisse; Neale, Graham; Schwab, Kat; Psaila, Beth; Patel, Tejal; Chapman, E Jane; Vincent, Charles

    2007-01-01

    Background Over the past five years, in most hospitals in England and Wales, incident reporting has become well established but it remains unclear how well reports match clinical adverse events. International epidemiological studies of adverse events are based on retrospective, multi‐hospital case record review. In this paper the authors describe the use of incident reporting, pharmacist surveillance and local real‐time record review for the recognition of clinical risks associated with hospital inpatient care. Methodology Data on adverse events were collected prospectively on 288 patients discharged from adult acute medical and surgical units in an NHS district general hospital using incident reports, active surveillance of prescription charts by pharmacists and record review at time of discharge. Results Record review detected 26 adverse events (AEs) and 40 potential adverse events (PAEs) occurring during the index admission. In contrast, in the same patient group, incident reporting detected 11 PAEs and no AEs. Pharmacy surveillance found 10 medication errors all of which were PAEs. There was little overlap in the nature of events detected by the three methods. Conclusion The findings suggest that incident reporting does not provide an adequate assessment of clinical adverse events and that this method needs to be supplemented with other more systematic forms of data collection. Structured record review, carried out by clinicians, provides an important component of an integrated approach to identifying risk in the context of developing a safety and quality improvement programme. PMID:17301203

  6. Detecting kinematic boundary surfaces in phase space: particle mass measurements in SUSY-like events

    Science.gov (United States)

    Debnath, Dipsikha; Gainer, James S.; Kilic, Can; Kim, Doojin; Matchev, Konstantin T.; Yang, Yuan-Pao

    2017-06-01

    We critically examine the classic endpoint method for particle mass determination, focusing on difficult corners of parameter space, where some of the measurements are not independent, while others are adversely affected by the experimental resolution. In such scenarios, mass differences can be measured relatively well, but the overall mass scale remains poorly constrained. Using the example of the standard SUSY decay chain $\tilde q\to \tilde\chi^0_2\to \tilde\ell \to \tilde\chi^0_1$, we demonstrate that sensitivity to the remaining mass scale parameter can be recovered by measuring the two-dimensional kinematical boundary in the relevant three-dimensional phase space of invariant masses squared. We develop an algorithm for detecting this boundary, which uses the geometric properties of the Voronoi tessellation of the data, and in particular, the relative standard deviation (RSD) of the volumes of the neighbors for each Voronoi cell in the tessellation. We propose a new observable, $\bar\Sigma$, which is the average RSD per unit area, calculated over the hypothesized boundary. We show that the location of the $\bar\Sigma$ maximum correlates very well with the true values of the new particle masses. Our approach represents the natural extension of the one-dimensional kinematic endpoint method to the relevant three dimensions of invariant mass phase space.
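
The Voronoi-based boundary detection described above can be sketched as follows. This is a minimal 2D illustration (the paper works in the three-dimensional space of invariant masses squared), and all function names are illustrative rather than taken from the authors' code:

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def cell_volumes(vor):
    """Volume (area in 2D) of each bounded Voronoi cell; NaN for unbounded cells."""
    vols = np.full(len(vor.points), np.nan)
    for i, reg_idx in enumerate(vor.point_region):
        region = vor.regions[reg_idx]
        if region and -1 not in region:  # -1 marks a vertex at infinity
            vols[i] = ConvexHull(vor.vertices[region]).volume
    return vols

def neighbor_rsd(vor, vols):
    """Relative standard deviation (std/mean) of the volumes of each cell's
    Voronoi neighbors -- the quantity averaged over a hypothesized boundary."""
    neighbors = {i: set() for i in range(len(vor.points))}
    for p, q in vor.ridge_points:  # cells sharing a ridge are neighbors
        neighbors[p].add(q)
        neighbors[q].add(p)
    rsd = np.full(len(vor.points), np.nan)
    for i, nbrs in neighbors.items():
        v = vols[list(nbrs)]
        v = v[np.isfinite(v)]
        if len(v) > 1 and v.mean() > 0:
            rsd[i] = v.std() / v.mean()
    return rsd

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(400, 2))
vor = Voronoi(pts)
rsd = neighbor_rsd(vor, cell_volumes(vor))
```

Cells near a density discontinuity (the kinematic boundary) have neighbors of very unequal volumes, so their RSD is enhanced relative to cells in smooth regions.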

  7. Detecting kinematic boundary surfaces in phase space and particle mass measurements in SUSY-like events

    CERN Document Server

    Debnath, Dipsikha; Kilic, Can; Kim, Doojin; Matchev, Konstantin T.; Yang, Yuan-Pao

    2017-06-19

    We critically examine the classic endpoint method for particle mass determination, focusing on difficult corners of parameter space, where some of the measurements are not independent, while others are adversely affected by the experimental resolution. In such scenarios, mass differences can be measured relatively well, but the overall mass scale remains poorly constrained. Using the example of the standard SUSY decay chain $\\tilde q\\to \\tilde\\chi^0_2\\to \\tilde \\ell \\to \\tilde \\chi^0_1$, we demonstrate that sensitivity to the remaining mass scale parameter can be recovered by measuring the two-dimensional kinematical boundary in the relevant three-dimensional phase space of invariant masses squared. We develop an algorithm for detecting this boundary, which uses the geometric properties of the Voronoi tessellation of the data, and in particular, the relative standard deviation (RSD) of the volumes of the neighbors for each Voronoi cell in the tessellation. We propose a new observable, $\\bar\\Sigma$, which is ...

  8. Multiple-Threshold Event Detection and Other Enhancements to the Virtual Seismologist (VS) Earthquake Early Warning Algorithm

    Science.gov (United States)

    Fischer, M.; Caprio, M.; Cua, G. B.; Heaton, T. H.; Clinton, J. F.; Wiemer, S.

    2009-12-01

    The Virtual Seismologist (VS) algorithm is a Bayesian approach to earthquake early warning (EEW) being implemented by the Swiss Seismological Service at ETH Zurich. The application of Bayes’ theorem in earthquake early warning states that the most probable source estimate at any given time is a combination of contributions from a likelihood function that evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS algorithm was one of three EEW algorithms involved in the California Integrated Seismic Network (CISN) real-time EEW testing and performance evaluation effort. Its compelling real-time performance in California over the last three years has led to its inclusion in the new USGS-funded effort to develop key components of CISN ShakeAlert, a prototype EEW system that could potentially be implemented in California. A significant portion of VS code development was supported by the SAFER EEW project in Europe. We discuss recent enhancements to the VS EEW algorithm. We developed and continue to test a multiple-threshold event detection scheme, which uses different association / location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to be declared an event, to reduce false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and its requirement of at least 4 picks, as implemented by the Binder Earthworm phase associator) to a hybrid on-site/regional approach capable of providing a continuously evolving stream of EEW
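
The multiple-threshold detection logic lends itself to a simple sketch. The amplitude thresholds and pick counts below are illustrative placeholders, not the tuned values used by the VS algorithm:

```python
def picks_required(peak_amplitude, high_thresh=0.1, mid_thresh=0.01):
    """Number of P picks needed before an event is declared, as a function of
    the peak amplitude of an incoming pick (illustrative thresholds)."""
    if peak_amplitude >= high_thresh:
        return 1   # large event: declare on a single station, maximizing warning time
    if peak_amplitude >= mid_thresh:
        return 2   # intermediate amplitudes: require corroboration
    return 4       # small events: traditional regional association (>= 4 picks)

def declare_event(picks):
    """Declare an event once the accumulated picks satisfy the most permissive
    amplitude-dependent threshold; `picks` is a list of peak amplitudes."""
    if not picks:
        return False
    needed = min(picks_required(a) for a in picks)
    return len(picks) >= needed
```

A single high-amplitude pick thus declares an event immediately, while a set of low-amplitude picks must accumulate before a declaration, trading warning time against false-alarm rate.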

  9. Atomic-scale nanoindentation: detection and identification of single glide events in three dimensions by force microscopy

    International Nuclear Information System (INIS)

    Egberts, P; Bennewitz, R

    2011-01-01

    Indentation experiments on the nanometre scale have been performed by means of atomic force microscopy in ultra-high vacuum on KBr(100) surfaces. The surfaces yield in the form of discrete surface displacements with a typical length scale of 1 Å. These surface displacements are detected in both the normal and lateral directions. Measurement of the lateral tip displacement requires a load-dependent calibration due to the load dependence of the effective lateral compliance. Correlation of the lateral and normal displacements for each glide event allows identification of the activated slip system. The results are discussed in terms of the resolved shear stress in indentation experiments and of typical results in atomistic simulations of nanometre-scale indentation.

  10. Symbolic Processing Combined with Model-Based Reasoning

    Science.gov (United States)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  11. Adverse Events Associated with Hospitalization or Detected through the RAI-HC Assessment among Canadian Home Care Clients

    Science.gov (United States)

    Doran, Diane; Hirdes, John P.; Blais, Régis; Baker, G. Ross; Poss, Jeff W.; Li, Xiaoqiang; Dill, Donna; Gruneir, Andrea; Heckman, George; Lacroix, Hélène; Mitchell, Lori; O'Beirne, Maeve; Foebel, Andrea; White, Nancy; Qian, Gan; Nahm, Sang-Myong; Yim, Odilia; Droppo, Lisa; McIsaac, Corrine

    2013-01-01

    Background: The occurrence of adverse events (AEs) in care settings is a patient safety concern that has significant consequences across healthcare systems. Patient safety problems have been well documented in acute care settings; however, similar data for clients in home care (HC) settings in Canada are limited. The purpose of this Canadian study was to investigate AEs in HC, specifically those associated with hospitalization or detected through the Resident Assessment Instrument for Home Care (RAI-HC). Method: A retrospective cohort design was used. The cohort consisted of HC clients from the provinces of Nova Scotia, Ontario, British Columbia and the Winnipeg Regional Health Authority. Results: The overall incidence rate of AEs associated with hospitalization ranged from 6% to 9%. The incidence rate of AEs determined from the RAI-HC was 4%. Injurious falls, injuries from causes other than falls and medication-related events were the most frequent AEs associated with hospitalization, whereas new caregiver distress was the most frequent AE identified through the RAI-HC. Conclusion: The incidence of AEs from all sources of data ranged from 4% to 9%. More resources are needed to target strategies for addressing safety risks in HC in a broader context. Tools such as the RAI-HC and its Clinical Assessment Protocols, already available in Canada, could be very useful in the assessment and management of HC clients who are at safety risk. PMID:23968676

  12. Detection and discrimination of maintenance and de novo CpG methylation events using MethylBreak.

    Science.gov (United States)

    Hsu, William; Mercado, Augustus T; Hsiao, George; Yeh, Jui-Ming; Chen, Chung-Yung

    2017-05-15

    Understanding the principles governing the establishment and maintenance activities of DNA methyltransferases (DNMTs) can help in the development of predictive biomarkers associated with genetic disorders and diseases. A detection system was developed that distinguishes and quantifies methylation events using methylation-sensitive endonucleases and molecular beacon technology. MethylBreak (MB) is a 22-mer oligonucleotide with one hemimethylated and two unmethylated CpG sites, which are also recognition sites for Sau96I and SacII, and is attached to a fluorophore and a quencher. Maintenance methylation was quantified by fluorescence emission due to the digestion of SacII when the hemimethylated CpG site is methylated, which inhibits Sau96I cleavage. The signal difference between SacII digestion of the MB substrate and of maintenance-methylated MB corresponds to the de novo methylation event. Our technology successfully discriminated and measured both methylation activities at different concentrations of MB and achieved a high correlation coefficient of R² = 0.997. Additionally, MB was effectively applied to normal and cancer cell lines and in the analysis of the enzymatic kinetics and RNA inhibition of recombinant human DNMT1.

  13. Method and device for detecting impact events on a security barrier which includes a hollow rebar allowing insertion and removal of an optical fiber

    Science.gov (United States)

    Pies, Ross E.

    2016-03-29

    A method and device for the detection of impact events on a security barrier. A hollow rebar is formed within a security barrier, whereby the hollow rebar is completely surrounded by the security barrier. An optical fiber passes through the interior of the hollow rebar. An optical transmitter and an optical receiver are both optically connected to the optical fiber and connected to optical electronics. The optical electronics are configured to provide notification upon the detection of an impact event at the security barrier, based on the detection of disturbances within the optical fiber.

  14. Gaseous time projection chambers for rare event detection: results from the T-REX project. I. Double beta decay

    Energy Technology Data Exchange (ETDEWEB)

    Irastorza, I.G.; Aznar, F.; Castel, J., E-mail: igor.irastorza@cern.ch, E-mail: faznar@unizar.es, E-mail: jfcastel@unizar.es [Grupo de Física Nuclear y Astropartículas, Departamento de Física Teórica, Universidad de Zaragoza, C/ P. Cerbuna 12, Zaragoza, 50009 (Spain); and others

    2016-01-01

    As part of the T-REX project, a number of R&D and prototyping activities have been carried out during the last years to explore the applicability of gaseous Time Projection Chambers (TPCs) with Micromesh Gas Structures (Micromegas) in rare event searches like double beta decay, axion research and low-mass WIMP searches. In both this and its companion paper, we compile the main results of the project and give an outlook of application prospects for this detection technique. While in the companion paper we focus on axions and WIMPs, in this paper we focus on the results regarding the measurement of the double beta decay (DBD) of ¹³⁶Xe in a high pressure Xe (HPXe) TPC. Micromegas of the microbulk type have been extensively studied in high pressure Xe and Xe mixtures. Particularly relevant are the results obtained in Xe + trimethylamine (TMA) mixtures, showing very promising results in terms of gain, stability of operation, and energy resolution at high pressures up to 10 bar. The addition of TMA at levels of ∼1% reduces electron diffusion by up to a factor of 10 with respect to pure Xe, improving the quality of the topological pattern, with a positive impact on the discrimination capability. Operation of a medium-size prototype of 30 cm diameter and 38 cm drift (holding about 1 kg of Xe at 10 bar in the fiducial volume, enough to contain high energy electron tracks in the detector volume) has allowed testing of the detection concept in realistic experimental conditions. Microbulk Micromegas are able to image the DBD ionization signature with high quality while, at the same time, measuring its energy deposition with a resolution of at least ∼3% FWHM at Qββ. This value was experimentally demonstrated for high-energy extended tracks at 10 bar, and is probably improvable down to the ∼1% FWHM level as extrapolated from low energy events. In addition, first results on the topological signature information (one straggling track ending in two

  15. Naive Probability: Model-based Estimates of Unique Events

    Science.gov (United States)

    2014-05-04


  16. Model-based version management system framework

    International Nuclear Information System (INIS)

    Mehmood, W.

    2016-01-01

    In this paper we present a model-based version management system. A Version Management System (VMS), a branch of software configuration management (SCM), aims to provide a controlling mechanism for the evolution of software artifacts created during the software development process. Controlling this evolution requires many activities, such as construction and creation of versions, identification of differences between versions, conflict detection and merging. Traditional VMS systems are file-based and treat software systems as sets of text files. File-based VMS systems are not adequate for performing software configuration management activities, such as version control, on software artifacts produced in earlier phases of the software life cycle. New challenges of model differencing, merging and evolution control arise when models are used as the central artifact. The goal of this work is to present a generic framework for a model-based VMS which can be used to overcome the problems of traditional file-based VMS systems and provide model versioning services. (author)
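
The conflict detection and merging activities listed above can be illustrated with a minimal three-way merge sketch, assuming models are flattened to dictionaries mapping element identifiers to properties (a strong simplification of real model differencing):

```python
def three_way_merge(base, left, right):
    """Three-way merge of models represented as {element_id: properties} dicts.
    Returns (merged, conflicts); a conflict is an element changed differently
    on both branches relative to the common base version."""
    merged, conflicts = {}, []
    for eid in set(base) | set(left) | set(right):
        b, l, r = base.get(eid), left.get(eid), right.get(eid)
        if l == r:            # same change (or unchanged/deleted) on both sides
            if l is not None:
                merged[eid] = l
        elif l == b:          # only the right branch changed this element
            if r is not None:
                merged[eid] = r
        elif r == b:          # only the left branch changed this element
            if l is not None:
                merged[eid] = l
        else:                 # both branches changed it, and differently: conflict
            conflicts.append(eid)
    return merged, conflicts
```

Real model versioning additionally has to respect containment hierarchies and cross-references between elements, which is where file-based text merging breaks down.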

  17. Glaucoma progression detection by retinal nerve fiber layer measurement using scanning laser polarimetry: event and trend analysis.

    Science.gov (United States)

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S

    2012-06-01

    To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) measurement of retinal nerve fiber layer (RNFL) thickness for the detection of glaucoma progression. Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA, linear regression analysis of the visual field index). SLP progression was defined as a reduction in RNFL thickness exceeding the predetermined repeatability coefficient in three consecutive exams, as compared with the baseline measurement (EA). The slope of RNFL thickness change over time was determined by linear regression analysis (TA). Twenty-two eyes (14.5%) progressed according to VF EA, 16 (10.5%) by VF TA, 37 (24.3%) by SLP EA and 19 (12.5%) by SLP TA. Agreement between VF and SLP progression was poor in both EA and TA (VF EA vs. SLP EA, k = 0.110; VF TA vs. SLP TA, k = 0.129). The mean (±standard deviation) progression rate of RNFL thickness as measured by SLP TA did not significantly differ between VF EA progressors and non-progressors (-0.224 ± 0.148 µm/yr vs. -0.218 ± 0.151 µm/yr, p = 0.874). SLP TA and EA showed similar levels of sensitivity when VF progression was considered as the reference standard. RNFL thickness as measured by SLP was shown to be capable of detecting glaucoma progression. Both EA and TA of SLP showed poor agreement with VF outcomes in detecting glaucoma progression.
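
The two progression criteria can be sketched as follows; the repeatability coefficient, thickness values and function names here are illustrative, not taken from the study:

```python
import numpy as np

def event_progression(baseline, followups, repeat_coeff, n_consecutive=3):
    """Event analysis (EA): progression if RNFL loss relative to baseline
    exceeds the repeatability coefficient in n_consecutive consecutive exams."""
    run = 0
    for thickness in followups:
        run = run + 1 if (baseline - thickness) > repeat_coeff else 0
        if run >= n_consecutive:
            return True
    return False

def trend_slope(years, thickness):
    """Trend analysis (TA): rate of RNFL thickness change (e.g. µm/yr)
    from a linear regression over follow-up time."""
    slope, _intercept = np.polyfit(years, thickness, 1)
    return slope
```

EA flags a sustained step change against test-retest variability, while TA summarizes the overall rate of loss; the abstract's poor EA/TA agreement reflects that these criteria capture different aspects of the same series.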

  18. Pharyngeal pH alone is not reliable for the detection of pharyngeal reflux events: A study with oesophageal and pharyngeal pH-impedance monitoring

    Science.gov (United States)

    Desjardin, Marie; Roman, Sabine; des Varannes, Stanislas Bruley; Gourcerol, Guillaume; Coffin, Benoit; Ropert, Alain; Mion, François

    2013-01-01

    Background Pharyngeal pH probes and pH-impedance catheters have been developed for the diagnosis of laryngo-pharyngeal reflux. Objective To determine the reliability of pharyngeal pH alone for the detection of pharyngeal reflux events. Methods 24-h pH-impedance recordings performed in 45 healthy subjects with a bifurcated probe for detection of pharyngeal and oesophageal reflux events were reviewed. Pharyngeal pH drops to below 4 and 5 were analysed for the simultaneous occurrence of pharyngeal reflux, gastro-oesophageal reflux, and swallows, according to impedance patterns. Results Only 7.0% of pharyngeal pH drops to below 5 identified with impedance corresponded to pharyngeal reflux, while 92.6% were related to swallows and 10.2 and 13.3% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Of pharyngeal pH drops to below 4, 13.2% were related to pharyngeal reflux, 87.5% were related to swallows, and 18.1 and 21.5% were associated with proximal and distal gastro-oesophageal reflux events, respectively. Conclusions This study demonstrates that pharyngeal pH alone is not reliable for the detection of pharyngeal reflux and that adding distal oesophageal pH analysis is not helpful. The only reliable analysis should take into account impedance patterns demonstrating the presence of pharyngeal reflux event preceded by a distal and proximal reflux event within the oesophagus. PMID:24917995

  19. A measurement of the muon number in showers using inclined events detected at the Pierre Auger Observatory

    Directory of Open Access Journals (Sweden)

    Rodriguez G.

    2013-06-01

    The average muon content of measured showers with zenith angles between 62° and 80° detected at the Pierre Auger Observatory is obtained as a function of shower energy using a reconstruction method specifically designed for inclined showers and the hybrid character of the detector. The reconstruction of inclined showers relies on a comparison between the measured signals at ground and reference patterns at ground level, from which an overall normalization factor is obtained. Since inclined showers are dominated by muons, this factor gives the relative muon size. It can be calibrated using a subsample of showers simultaneously recorded with the fluorescence detector (FD) and the surface detector (SD), which provides an independent calorimetric measurement of the energy. The muon size obtained for each shower becomes a measurement of the relative number of muons with respect to the reference distributions. The precision of the measurement is assessed using simulated events which are reconstructed using exactly the same procedure. We compare the relative number of muons versus energy thus obtained to simulations. Proton simulations with QGSJETII show a factor of 2.13 ± 0.04 (stat) ± 0.11 (sys) at 10¹⁹ eV, without significant variations in the energy range explored, between 4 × 10¹⁸ eV and 7 × 10¹⁹ eV. We find that none of the current shower models, whether for proton or iron primaries, is able to predict as many muons as are observed.
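
The normalization step, in which a reference ground pattern is scaled to the measured station signals to obtain the relative muon size, can be sketched as a weighted least-squares scale factor. This is a simplification under assumed Gaussian signal uncertainties; the actual reconstruction is more elaborate, and the function name is illustrative:

```python
import numpy as np

def relative_muon_size(measured, reference, sigma):
    """Overall normalization factor that best matches a reference ground
    pattern to the measured station signals, by weighted least squares
    (minimizing sum of ((measured - k * reference) / sigma)**2 over k)."""
    measured, reference, sigma = map(np.asarray, (measured, reference, sigma))
    w = 1.0 / sigma**2
    return np.sum(w * measured * reference) / np.sum(w * reference**2)
```

Since inclined showers are muon-dominated, this single scale factor is, to a good approximation, the relative number of muons with respect to the reference distribution.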

  20. Waveform correlation and coherence of short-period seismic noise within Gauribidanur array with implications for event detection

    International Nuclear Information System (INIS)

    Bhadauria, Y.S.; Arora, S.K.

    1995-01-01

    In continuation of our effort to model the short-period microseismic noise at the seismic array at Gauribidanur (GBA), we have examined in detail the time-correlation and spectral coherence of the noise field within the array space. This has implications for the maximum possible improvement in signal-to-noise ratio (SNR) relevant to event detection. The basis of this study is about a hundred representative wide-band noise samples collected from GBA throughout the year 1992. Both the time-structured correlation and the coherence of the noise waveforms are found to be practically independent of the inter-element distances within the array, and they exhibit strong temporal and spectral stability. It turns out that the noise is largely incoherent at frequencies ranging upwards from 2 Hz; the coherency coefficient tends to increase in the lower frequency range, attaining a maximum of 0.6 close to 0.5 Hz. While the maximum absolute cross-correlation also diminishes with increasing frequency, the zero-lag cross-correlation is found to be insensitive to frequency filtering regardless of the pass band. An extremely small value of -0.01 for the zero-lag correlation and a comparatively higher year-round average estimate of 0.15 for the maximum absolute time-lagged correlation yield an SNR improvement varying between a probable high of 4.1 and a low of 2.3 for the full 20-element array. 19 refs., 6 figs
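
The quoted SNR improvement can be related to the standard gain formula for an N-element array summing a coherent signal over noise with average pairwise correlation ρ; a minimal sketch (the paper's own estimators may differ in detail):

```python
import math

def array_snr_gain(n, rho):
    """SNR gain of an n-element delay-and-sum array when the noise has an
    average pairwise correlation rho (rho = 0 gives the familiar sqrt(n))."""
    return math.sqrt(n / (1.0 + (n - 1) * rho))

# For the 20-element GBA array, an average noise correlation of 0.15
# gives a gain of about 2.3, consistent with the low estimate quoted above.
gain = array_snr_gain(20, 0.15)
```

Fully incoherent noise (ρ = 0) would give sqrt(20) ≈ 4.5, so the residual time-lagged correlation is what limits the practical improvement.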

  1. Deficits in cue detection underlie event-based prospective memory impairment in major depression: an eye tracking study.

    Science.gov (United States)

    Chen, Siyi; Zhou, Renlai; Cui, Hong; Chen, Xinyin

    2013-10-30

    This study examined the cue detection in the non-focal event-based prospective memory (PM) of individuals with and without a major depressive disorder using behavioural and eye tracking assessments. The participants were instructed to search on each trial for a different target stimulus that could be present or absent and to make prospective responses to the cue object. PM tasks included cue only and target plus cue, whereas ongoing tasks included target only and distracter only. The results showed that a) participants with depression performed more poorly than those without depression in PM; b) participants with depression showed more fixations and longer total and average fixation durations in both ongoing and PM conditions; c) participants with depression had lower scores on accuracy in target-plus-cue trials than in cue-only trials and had a higher gaze rate of targets on hits and misses in target-plus-cue trials than did those without depression. The results indicate that the state of depression may impair top-down cognitive control function, which in turn results in particular deficits in the engagement of monitoring for PM cues.

  2. 1.3 mm WAVELENGTH VLBI OF SAGITTARIUS A*: DETECTION OF TIME-VARIABLE EMISSION ON EVENT HORIZON SCALES

    International Nuclear Information System (INIS)

    Fish, Vincent L.; Doeleman, Sheperd S.; Beaudoin, Christopher; Bolin, David E.; Rogers, Alan E. E.; Blundell, Ray; Gurwell, Mark A.; Moran, James M.; Primiani, Rurik; Bower, Geoffrey C.; Plambeck, Richard; Chamberlin, Richard; Freund, Robert; Friberg, Per; Honma, Mareki; Oyama, Tomoaki; Inoue, Makoto; Krichbaum, Thomas P.; Lamb, James; Marrone, Daniel P.

    2011-01-01

    Sagittarius A*, the ∼4 × 10⁶ M☉ black hole candidate at the Galactic center, can be studied on Schwarzschild radius scales with (sub)millimeter wavelength very long baseline interferometry (VLBI). We report on 1.3 mm wavelength observations of Sgr A* using a VLBI array consisting of the JCMT on Mauna Kea, the Arizona Radio Observatory's Submillimeter Telescope on Mt. Graham in Arizona, and two telescopes of the CARMA array at Cedar Flat in California. Both Sgr A* and the quasar calibrator 1924-292 were observed over three consecutive nights, and both sources were clearly detected on all baselines. For the first time, we are able to extract 1.3 mm VLBI interferometer phase information on Sgr A* through measurement of closure phase on the triangle of baselines. On the third night of observing, the correlated flux density of Sgr A* on all VLBI baselines increased relative to the first two nights, providing strong evidence for time-variable change on scales of a few Schwarzschild radii. These results suggest that future VLBI observations with greater sensitivity and additional baselines will play a valuable role in determining the structure of emission near the event horizon of Sgr A*.
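
The closure-phase measurement works because station-based phase errors enter the two baselines sharing a station with opposite signs and cancel in the sum around the triangle; a minimal sketch:

```python
import numpy as np

def closure_phase(v12, v23, v31):
    """Closure phase (radians) from complex visibilities on a baseline triangle.
    Per-station phase errors shift each baseline phase by (e_i - e_j) and so
    cancel in the product, leaving a robust source observable."""
    return np.angle(v12 * v23 * v31)
```

The cancellation is exact: corrupting each visibility with arbitrary per-station phase offsets leaves the closure phase unchanged, which is why a three-station array can recover phase information even when individual baseline phases are unusable.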

  3. Model-based Software Engineering

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2010-01-01

    The vision of model-based software engineering is to make models the main focus of software development and to automatically generate software from these models. Part of that idea works already today. But, there are still difficulties when it comes to behaviour. Actually, there is no lack in models...

  4. Graph Model Based Indoor Tracking

    DEFF Research Database (Denmark)

    Jensen, Christian Søndergaard; Lu, Hua; Yang, Bin

    2009-01-01

    The tracking of the locations of moving objects in large indoor spaces is important, as it enables a range of applications related to, e.g., security and indoor navigation and guidance. This paper presents a graph model based approach to indoor tracking that offers a uniform data management...

  5. Is detection of adverse events affected by record review methodology? an evaluation of the "Harvard Medical Practice Study" method and the "Global Trigger Tool".

    Science.gov (United States)

    Unbeck, Maria; Schildmeijer, Kristina; Henriksson, Peter; Jürgensen, Urban; Muren, Olav; Nilsson, Lena; Pukk Härenstam, Karin

    2013-04-15

    There has been a theoretical debate as to which retrospective record review method is the most valid, reliable, cost efficient and feasible for detecting adverse events. The aim of the present study was to evaluate the feasibility and capability of two common retrospective record review methods, the "Harvard Medical Practice Study" method and the "Global Trigger Tool", in detecting adverse events in adult orthopaedic inpatients. We performed a three-stage structured retrospective record review process in a random sample of 350 orthopaedic admissions during 2009 at a Swedish university hospital. Two teams, each comprising a registered nurse and two physicians, were assigned, one to each method. All records were primarily reviewed by registered nurses. Records containing a potential adverse event were forwarded to physicians for review in stage 2. Physicians made an independent review regarding, for example, healthcare causation, preventability and severity. In the third review stage all adverse events that were found with the two methods together were compared and all discrepancies after review stage 2 were analysed. Events that had not been identified by one of the methods in the first two review stages were reviewed by the respective physicians. Altogether, 160 different adverse events were identified in 105 (30.0%) of the 350 records with both methods combined. The "Harvard Medical Practice Study" method identified 155 of the 160 (96.9%, 95% CI: 92.9-99.0) adverse events in 104 (29.7%) records, compared with 137 (85.6%, 95% CI: 79.2-90.7) adverse events in 98 (28.0%) records using the "Global Trigger Tool". Adverse events "causing harm without permanent disability" accounted for most of the observed difference. The overall positive predictive value for criteria and triggers using the "Harvard Medical Practice Study" method and the "Global Trigger Tool" was 40.3% and 30.4%, respectively. More adverse events were identified using the "Harvard Medical Practice Study" method.

  6. Detection of Healthcare-Related Extended-Spectrum Beta-Lactamase-Producing Escherichia coli Transmission Events Using Combined Genetic and Phenotypic Epidemiology.

    Directory of Open Access Journals (Sweden)

    Anne F Voor In 't Holt

Full Text Available Since the year 2000 there has been a sharp increase in the prevalence of healthcare-related infections caused by extended-spectrum beta-lactamase (ESBL)-producing Escherichia coli. However, the high community prevalence of ESBL-producing E. coli isolates means that many E. coli typing techniques may not be suitable for detecting E. coli transmission events. Therefore, we investigated if High-throughput MultiLocus Sequence Typing (HiMLST) and/or Raman spectroscopy were suitable techniques for detecting recent E. coli transmission events. This study was conducted from January until December 2010 at Erasmus University Medical Center, Rotterdam, the Netherlands. Isolates were typed using HiMLST and Raman spectroscopy. A genetic cluster was defined as two or more patients carrying identical isolates. We used predefined definitions for epidemiological relatedness to assess healthcare-related transmission. We included 194 patients; strains of 112 patients were typed using HiMLST and strains of 194 patients were typed using Raman spectroscopy. Raman spectroscopy identified 16 clusters while HiMLST identified 10 clusters. However, no healthcare-related transmission events were detected. When combining data from both typing techniques, we identified eight clusters (n = 34 patients), as well as 78 patients with a non-cluster isolate. However, we could not detect any healthcare-related transmission in these 8 clusters. Although clusters were genetically detected using HiMLST and Raman spectroscopy, no definite epidemiological relationships could be demonstrated, which makes the possibility of healthcare-related transmission events highly unlikely. Our results suggest that typing of ESBL-producing E. coli using HiMLST and/or Raman spectroscopy is not helpful in detecting E. coli healthcare-related transmission events.

  7. Multichannel linear descriptors analysis for event-related EEG of vascular dementia patients during visual detection task.

    Science.gov (United States)

    Lou, Wutao; Xu, Jin; Sheng, Hengsong; Zhao, Songzhen

    2011-11-01

Multichannel EEG recorded in a task condition could contain more information about cognition. However, this has not been widely investigated in vascular dementia (VaD)-related studies. The purpose of this study was to explore the differences of brain functional states between VaD patients and normal controls while performing a detection task. Three multichannel linear descriptors, i.e. spatial complexity (Ω), field strength (Σ) and frequency of field changes (Φ), were applied to analyse four frequency bands (delta, theta, alpha and beta) of multichannel event-related EEG signals for 12 VaD patients (mean age ± SD: 69.25 ± 10.56 years; MMSE score ± SD: 22.58 ± 4.42) and 12 age-matched healthy subjects (mean age ± SD: 67.17 ± 5.97 years; MMSE score ± SD: 29.08 ± 0.9). The correlations between the three measures and MMSE scores were also analysed. VaD patients showed a significantly higher Ω value in the delta (p = 0.013) and theta (p = 0.021) frequency bands, a lower Σ value (p = 0.011) and a higher Φ value (p = 0.008) in the delta frequency band compared with normal controls. The MMSE scores were negatively correlated with the Ω (r = -0.52, p = 0.01) and Φ (r = -0.47, p = 0.02) values in the delta frequency band. The results indicated that the VaD patients presented a reduction of synchronization in the slow frequency band during target detection, and suggested more neurons might be activated in VaD patients compared with normal controls. The Ω and Φ measures in the delta frequency band might be used to evaluate the degree of cognitive dysfunction. The multichannel linear descriptors are promising measures to reveal the differences in brain functions between VaD patients and normal subjects, and could potentially be used to evaluate the degree of cognitive dysfunction in VaD patients. Copyright © 2011 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
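
The three descriptors have standard definitions in the multichannel-EEG literature: Σ is the root-mean-square global field strength, Φ a generalized frequency derived from the ratio of derivative power to signal power, and Ω an entropy-based measure over the eigenvalue spectrum of the spatial covariance matrix. A minimal sketch under those definitions (illustrative only; the synthetic data, channel count and sampling rate are assumptions, not the study's recordings):

```python
import numpy as np

def linear_descriptors(u, fs):
    """Wackermann-style multichannel linear descriptors.

    u  : (n_samples, n_channels) EEG segment
    fs : sampling rate in Hz
    Returns (sigma, phi, omega): field strength, frequency of
    field changes, spatial complexity.
    """
    u = u - u.mean(axis=1, keepdims=True)        # average reference
    n, k = u.shape
    m0 = np.mean(np.sum(u ** 2, axis=1)) / k     # mean squared field
    du = np.diff(u, axis=0) * fs                 # time derivative
    m1 = np.mean(np.sum(du ** 2, axis=1)) / k
    sigma = np.sqrt(m0)                          # Sigma: global field strength
    phi = np.sqrt(m1 / m0) / (2 * np.pi)         # Phi: generalized frequency (Hz)
    cov = u.T @ u / n                            # spatial covariance
    lam = np.linalg.eigvalsh(cov)
    lam = lam[lam > 1e-12] / lam.sum()           # normalised eigenvalues
    omega = np.exp(-np.sum(lam * np.log(lam)))   # Omega: spatial complexity
    return sigma, phi, omega

# A single 10 Hz source projected to all 8 channels gives Omega near 1;
# independent channel noise would push Omega toward the channel count.
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
one_source = np.outer(np.sin(2 * np.pi * 10 * t), rng.standard_normal(8))
sigma, phi, omega = linear_descriptors(one_source, fs)
```

For this rank-one field, Φ recovers the 10 Hz oscillation frequency and Ω stays near its minimum of 1, matching the interpretation of low Ω as high spatial synchronization.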

  8. Effectiveness of pacemaker tele-monitoring on quality of life, functional capacity, event detection and workload: The PONIENTE trial.

    Science.gov (United States)

    Lopez-Villegas, Antonio; Catalan-Matamoros, Daniel; Robles-Musso, Emilio; Peiro, Salvador

    2016-11-01

The purpose of the present study was to assess the effectiveness of the remote monitoring (RM) of older adults with pacemakers on health-related quality of life, functional capacity, feasibility, reliability and safety. The PONIENTE study is a controlled, non-randomized, non-blinded clinical trial, with data collection carried out during the pre-implant stage and after 12 months. Between October of 2012 and November of 2013, 82 patients were assigned to either a remote monitoring group (n = 30) or a conventional hospital monitoring (HM) group (n = 52). The EuroQol-5D (EQ-5D) and the Duke Activity Status Index were used to measure health-related quality of life and functional capacity, respectively. Baseline characteristics and number of hospital visits were also analyzed. The baseline characteristics of the two study groups were similar for both the EQ-5D (RM 0.74, HM 0.67; P = 0.404) and the Duke Activity Status Index (RM 21.42, HM 19.95; P = 0.272). At the 12-month follow-up, the EQ-5D utility score was improved for both groups (RM 0.91, HM 0.81; P = 0.154), unlike the EQ-5D Visual Analog Scale (P = 0.043). The Duke Activity Status Index score was similar to the baseline score. The number of in-hospital visits was 27% lower in the remote monitoring group (3 vs 4). Remote monitoring of pacemakers in older adults is an equivalent option to hospital monitoring, in terms of health-related quality of life and functional capacity. Furthermore, it allows for the early detection of clinical and pacemaker-related adverse events, and significantly reduces the number of in-hospital visits. Geriatr Gerontol Int 2016; 16: 1188-1195. © 2015 Japan Geriatrics Society.

  9. A comment on Farwell : brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials

    NARCIS (Netherlands)

    Meijer, E.H.; Ben-Shakhar, G.; Verschuere, B.; Donchin, E.

    2013-01-01

In a recent issue of Cognitive Neurodynamics Farwell (Cogn Neurodyn 6:115-154, 2012) published a comprehensive tutorial review of the use of Event Related Brain Potentials (ERP) in the detection of concealed information. Farwell's review covered much of his own work employing his "brain fingerprinting" technique.

  10. Automatic, ECG-based detection of autonomic arousals and their association with cortical arousals, leg movements, and respiratory events in sleep

    DEFF Research Database (Denmark)

    Olsen, Mads; Schneider, Logan Douglas; Cheung, Joseph

    2018-01-01

    The current definition of sleep arousals neglects to address the diversity of arousals and their systemic cohesion. Autonomic arousals (AA) are autonomic activations often associated with cortical arousals (CA), but they may also occur in isolation in relation to a respiratory event, a leg movement...... event or spontaneously, without any other physiological associations. AA should be acknowledged as essential events to understand and explore the systemic implications of arousals. We developed an automatic AA detection algorithm based on intelligent feature selection and advanced machine learning using...... or respiratory events. This indicates that most FP constitute autonomic activations that are indistinguishable from those with cortical cohesion. The proposed algorithm provides an automatic system trained in a clinical environment, which can be utilized to analyse the systemic and clinical impacts of arousals....

  11. Model-based security testing

    OpenAIRE

    Schieferdecker, Ina; Großmann, Jürgen; Schneider, Martin

    2012-01-01

Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security...

  12. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.
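
The model-based recipe described above — write down a generative model, then derive inference mechanically from it — can be illustrated with the smallest possible probabilistic model. This is a plain-Python sketch of the idea, not Infer.NET and not the paper's framework; it uses a conjugate Beta-Bernoulli model so the inference step is exact:

```python
from fractions import Fraction

# Model: theta ~ Beta(a, b); each observation x_i ~ Bernoulli(theta).
# Because the prior is conjugate, inference follows mechanically from
# the model: the posterior is Beta(a + heads, b + tails).
def posterior(a, b, observations):
    heads = sum(observations)
    return a + heads, b + (len(observations) - heads)

def predictive(a, b):
    # Posterior predictive probability that the next observation is 1.
    return Fraction(a, a + b)

a, b = posterior(1, 1, [1, 1, 0, 1, 1, 1])  # uniform prior + six observations
p_next = predictive(a, b)                    # Fraction(6, 8) = 3/4
```

Changing the model (say, to a Gaussian with unknown mean) changes the derived inference code rather than forcing the problem onto a fixed algorithm — which is exactly the point of the model-based view.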

  13. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment; that involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.
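
One common way to fold gait domain knowledge into frequency analysis: stride frequency in walking is known to fall roughly between 0.5 and 3 Hz, so a detector can restrict its spectral search to that band and then use the recovered period to enforce a refractory gap between successive events. The sketch below follows that generic pattern; the synthetic signal, thresholds and band limits are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

# Synthetic vertical-acceleration trace: a 1.8 Hz fundamental plus a
# harmonic and mild noise stands in for real accelerometer data.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(3)
accel = (np.sin(2 * np.pi * 1.8 * t)
         + 0.3 * np.sin(2 * np.pi * 3.6 * t)
         + 0.1 * rng.standard_normal(t.size))

# Domain knowledge: human stride frequency sits roughly in 0.5-3 Hz,
# so restrict the spectral search to that band.
spec = np.abs(np.fft.rfft(accel - accel.mean()))
f = np.fft.rfftfreq(accel.size, 1 / fs)
band = (f >= 0.5) & (f <= 3.0)
stride_hz = f[band][np.argmax(spec[band])]

# Use the estimated period to enforce a refractory gap: once an event
# fires, no new event is accepted for 60% of a stride period.
min_gap = int(0.6 * fs / stride_hz)
events, last = [], -min_gap
for i in range(1, accel.size - 1):
    is_peak = accel[i] >= accel[i - 1] and accel[i] > accel[i + 1]
    if is_peak and accel[i] > 0.8 and i - last >= min_gap:
        events.append(i)
        last = i
```

On this 10 s trace the detector recovers one event per stride cycle (about 18 events), and the refractory gap suppresses double-counting within a single peak.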

  14. Vibrotactile Detection, Identification and Directional Perception of signal-Processed Sounds from Environmental Events: A Pilot Field Evaluation in Five Cases

    Directory of Open Access Journals (Sweden)

    Parivash Ranjbar

    2008-09-01

Full Text Available Objectives: Conducting field tests of a vibrotactile aid for deaf/deafblind persons for detection, identification and directional perception of environmental sounds. Methods: Five deaf (3F/2M, 22–36 years) individuals tested the aid separately in a home environment (kitchen) and in a traffic environment. Their eyes were blindfolded, and they wore a headband and held a vibrator for sound identification. Three microphones were mounted in the headband, along with two vibrators for signalling the direction of the sound source. The sounds originated from events typical for the home environment and traffic. The subjects were inexperienced (events unknown) and experienced (events known). They identified the events in a home and traffic environment, but perceived sound source direction only in traffic. Results: The detection scores were higher than 98% both in the home and in the traffic environment. In the home environment, identification scores varied between 25%-58% when the subjects were inexperienced and between 33%-83% when they were experienced. In traffic, identification scores varied between 20%-40% when the subjects were inexperienced and between 22%-56% when they were experienced. The directional perception scores varied between 30%-60% when inexperienced and between 61%-83% when experienced. Discussion: The vibratory aid consistently improved all participants' detection, identification and directional perception ability.

  15. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making benefits from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.
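
The hybrid idea — a time-driven layer watching measured signals alongside an event-driven layer checking operation sequences against a discrete-event model — can be caricatured in a few lines. Everything here (states, event names, threshold) is hypothetical and only illustrates the two-layer structure:

```python
# Event-driven layer: allowed operation transitions as a tiny
# finite automaton (state, event) -> next state.
ALLOWED = {("idle", "load"): "loaded",
           ("loaded", "process"): "processing",
           ("processing", "unload"): "idle"}

def monitor(samples, events, limit=3.0):
    """Return anomaly alarms from both layers of a hybrid monitor."""
    alarms = []
    for i, s in enumerate(samples):          # time-driven layer:
        if abs(s) > limit:                   # residual exceeds threshold
            alarms.append(("signal", i))
    state = "idle"
    for i, e in enumerate(events):           # event-driven layer:
        nxt = ALLOWED.get((state, e))        # is this event legal here?
        if nxt is None:
            alarms.append(("sequence", i))
        else:
            state = nxt
    return alarms

alarms = monitor([0.1, 4.2, 0.3], ["load", "unload"])
# → [("signal", 1), ("sequence", 1)]
```

A signal excursion and an out-of-order operation each raise a distinct alarm, which is the benefit the abstract describes: decisions can draw on both time-series and event-sequence knowledge.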

  16. Pulse oximetry recorded from the Phone Oximeter for detection of obstructive sleep apnea events with and without oxygen desaturation in children.

    Science.gov (United States)

    Garde, Ainara; Dehkordi, Parastoo; Wensley, David; Ansermino, J Mark; Dumont, Guy A

    2015-01-01

    Obstructive sleep apnea (OSA) disrupts normal ventilation during sleep and can lead to serious health problems in children if left untreated. Polysomnography, the gold standard for OSA diagnosis, is resource intensive and requires a specialized laboratory. Thus, we proposed to use the Phone Oximeter™, a portable device integrating pulse oximetry with a smartphone, to detect OSA events. As a proportion of OSA events occur without oxygen desaturation (defined as SpO2 decreases ≥ 3%), we suggest combining SpO2 and pulse rate variability (PRV) analysis to identify all OSA events and provide a more detailed sleep analysis. We recruited 160 children and recorded pulse oximetry consisting of SpO2 and plethysmography (PPG) using the Phone Oximeter™, alongside standard polysomnography. A sleep technician visually scored all OSA events with and without oxygen desaturation from polysomnography. We divided pulse oximetry signals into 1-min signal segments and extracted several features from SpO2 and PPG analysis in the time and frequency domain. Segments with OSA, especially the ones with oxygen desaturation, presented greater SpO2 variability and modulation reflected in the spectral domain than segments without OSA. Segments with OSA also showed higher heart rate and sympathetic activity through the PRV analysis relative to segments without OSA. PRV analysis was more sensitive than SpO2 analysis for identification of OSA events without oxygen desaturation. Combining SpO2 and PRV analysis enhanced OSA event detection through a multiple logistic regression model. The area under the ROC curve increased from 81% to 87%. Thus, the Phone Oximeter™ might be useful to monitor sleep and identify OSA events with and without oxygen desaturation at home.
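
The combination step the abstract describes — feeding SpO2-variability and PRV features into a multiple logistic regression — can be sketched with synthetic data. The feature names, distributions and class separation below are invented for illustration; only the modelling pattern follows the abstract:

```python
import math
import random

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=200):
    """Plain stochastic-gradient logistic regression, no external libraries."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            g = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) - yi
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Synthetic 1-min segments: OSA segments (label 1) get higher SpO2
# variability and a higher PRV index than non-OSA segments.
rng = random.Random(42)
X, y = [], []
for _ in range(200):
    osa = rng.random() < 0.5
    X.append([rng.gauss(1.5 if osa else 0.5, 0.3),   # SpO2 variability
              rng.gauss(2.0 if osa else 1.0, 0.3)])  # PRV (e.g. LF/HF) index
    y.append(1 if osa else 0)

w, b = train_logreg(X, y)
p = [sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) for xi in X]
acc = sum((pi > 0.5) == yi for pi, yi in zip(p, y)) / len(y)
```

With two informative features the combined model separates the classes easily; in the study the analogous gain shows up as the AUC rising from 81% to 87% when SpO2 and PRV features are combined.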

  17. Event-specific qualitative and quantitative PCR detection of the GMO carnation (Dianthus caryophyllus) variety Moonlite based upon the 5'-transgene integration sequence.

    Science.gov (United States)

    Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H

    2012-04-27

To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, the event-specific primers and TaqMan probe were designed to amplify the fragments, which spanned the exogenous DNA and carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limit of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the bias between the observed and true values of three samples was lower than the acceptance criterion of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.

  18. The association of colonoscopy quality indicators with the detection of screen-relevant lesions, adverse events, and postcolonoscopy cancers in an asymptomatic Canadian colorectal cancer screening population.

    Science.gov (United States)

    Hilsden, Robert J; Dube, Catherine; Heitman, Steven J; Bridges, Ronald; McGregor, S Elizabeth; Rostom, Alaa

    2015-11-01

    Although several quality indicators of colonoscopy have been defined, quality assurance activities should be directed at the measurement of quality indicators that are predictive of key screening colonoscopy outcomes. The goal of this study was to examine the association among established quality indicators and the detection of screen-relevant lesions (SRLs), adverse events, and postcolonoscopy cancers. Historical cohort study. Canadian colorectal cancer screening center. A total of 18,456 asymptomatic men and women ages 40 to 74, at either average risk or increased risk for colorectal cancer because of a family history, who underwent a screening colonoscopy from 2008 to 2010. Using univariate and multivariate analyses, we explored the association among procedural quality indicators and 3 colonoscopy outcomes: detection of SRLs, adverse events, and postcolonoscopy cancers. The crude rates of SRLs, adverse events, and postcolonoscopy cancers were 240, 6.44, and .54 per 1000 colonoscopies, respectively. Several indicators, including endoscopist withdrawal time (OR, 1.3; 95% CI, 1.2-1.4) and cecal intubation rate (OR, 13.9; 95% CI, 1.9-96.9), were associated with the detection of SRLs. No quality indicator was associated with the risk of adverse events. Endoscopist average withdrawal time over 6 minutes (OR, .12; 95% CI, .002-.85) and SRL detection rate over 20% (OR, .17; 95% CI, .03-.74) were associated with a reduced risk of postcolonoscopy cancers. Single-center study. Quality assurance programs should prioritize the measurement of endoscopist average withdrawal time and adenoma (SRL) detection rate. Copyright © 2015 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.

  19. Reconstruction efficiency and precision for the events detected by the BIS-2 installation using the Perun pattern recognition program

    International Nuclear Information System (INIS)

    Burilkov, D.T.; Genchev, V.I.; Markov, P.K.; Likhachev, M.F.; Takhtamyshev, G.G.; Todorov, P.T.; Trayanov, P.K.

    1982-01-01

Results of studying the efficiency and accuracy of the track and event reconstruction with the Perun pattern recognition program used in the experiments carried out at the BIS-2 installation are presented. The Monte Carlo method is used for simulating the processes of neutron interactions with matter. The conclusion is made that the Perun program reconstructs complex multiparticle events correctly, with good accuracy and high efficiency [ru]

  20. Model-Based Security Testing

    Directory of Open Access Journals (Sweden)

    Ina Schieferdecker

    2012-02-01

Full Text Available Security testing aims at validating software system requirements related to security properties like confidentiality, integrity, authentication, authorization, availability, and non-repudiation. Although security testing techniques have been available for many years, there have been few approaches that allow for the specification of test cases at a higher level of abstraction, for enabling guidance on test identification and specification as well as for automated test generation. Model-based security testing (MBST) is a relatively new field and especially dedicated to the systematic and efficient specification and documentation of security test objectives, security test cases and test suites, as well as to their automated or semi-automated generation. In particular, the combination of security modelling and test generation approaches is still a challenge in research and of high interest for industrial applications. MBST includes, e.g., security functional testing, model-based fuzzing, risk- and threat-oriented testing, and the usage of security test patterns. This paper provides a survey on MBST techniques and the related models as well as samples of new methods and tools that are under development in the European ITEA2-project DIAMONDS.

  1. Detection of zero anisotropy at 5.2 AU during the November 1998 solar particle event: Ulysses Anisotropy Telescopes observations

    Directory of Open Access Journals (Sweden)

    S. Dalla

    Full Text Available For the first time during the mission, the Anisotropy Telescopes instrument on board the Ulysses spacecraft measured constant zero anisotropy of protons in the 1.3-2.2 MeV energy range, for a period lasting more than three days. This measurement was made during the energetic particle event taking place at Ulysses between 25 November and 15 December 1998, an event characterised by constant high proton fluxes within a region delimited by two interplanetary forward shocks, at a distance of 5.2 AU from the Sun and heliographic latitude of 17°S. We present the ATs results for this event and discuss their possible interpretation and their relevance to the issue of intercalibration of the two telescopes.

    Key words: Interplanetary physics (energetic particles); Solar physics, astrophysics and astronomy (energetic particles); Space plasma physics (instruments and techniques)

  2. Auto-adaptive averaging: Detecting artifacts in event-related potential data using a fully automated procedure

    NARCIS (Netherlands)

    Talsma, D.

    2008-01-01

    The auto-adaptive averaging procedure proposed here classifies artifacts in event-related potential data by optimizing the signal-to-noise ratio. This method rank orders single trials according to the impact of each trial on the ERP average. Then, the minimum residual background noise level in the
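
A toy version of the ranking idea — ordering trials by how strongly each one inflates the residual noise of the ERP average — might look like the following. The synthetic data and the simple residual-RMS impact proxy are assumptions for illustration, not the published procedure:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 40, 100
erp = np.sin(np.linspace(0, np.pi, n_samples))       # underlying ERP shape
trials = erp + rng.normal(0.0, 1.0, (n_trials, n_samples))
trials[:5] += rng.normal(0.0, 8.0, (5, n_samples))   # five artifact trials

# Cheap impact proxy: each trial's residual RMS around the grand
# average; trials that deviate most degrade the average's SNR most.
grand = trials.mean(axis=0)
impact = np.sqrt(((trials - grand) ** 2).mean(axis=1))
order = np.argsort(impact)                            # cleanest first

clean_avg = trials[order[:30]].mean(axis=0)           # drop the 10 noisiest
```

The artifact trials land at the bottom of the ranking, and the average over the retained trials sits closer to the true ERP than the grand average does.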

  4. Implementing Lumberjacks and Black Swans Into Model-Based Tools to Support Human-Automation Interaction.

    Science.gov (United States)

    Sebok, Angelia; Wickens, Christopher D

    2017-03-01

    The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.

  5. Phenotyping for patient safety: algorithm development for electronic health record based automated adverse event and medical error detection in neonatal intensive care.

    Science.gov (United States)

    Li, Qi; Melton, Kristin; Lingren, Todd; Kirkendall, Eric S; Hall, Eric; Zhai, Haijun; Ni, Yizhao; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre

    2014-01-01

    Although electronic health records (EHRs) have the potential to provide a foundation for quality and safety algorithms, few studies have measured their impact on automated adverse event (AE) and medical error (ME) detection within the neonatal intensive care unit (NICU) environment. This paper presents two phenotyping AE and ME detection algorithms (ie, IV infiltrations, narcotic medication oversedation and dosing errors) and describes manual annotation of airway management and medication/fluid AEs from NICU EHRs. From 753 NICU patient EHRs from 2011, we developed two automatic AE/ME detection algorithms, and manually annotated 11 classes of AEs in 3263 clinical notes. Performance of the automatic AE/ME detection algorithms was compared to trigger tool and voluntary incident reporting results. AEs in clinical notes were double annotated and consensus achieved under neonatologist supervision. Sensitivity, positive predictive value (PPV), and specificity are reported. Twelve severe IV infiltrates were detected. The algorithm identified one more infiltrate than the trigger tool and eight more than incident reporting. One narcotic oversedation was detected demonstrating 100% agreement with the trigger tool. Additionally, 17 narcotic medication MEs were detected, an increase of 16 cases over voluntary incident reporting. Automated AE/ME detection algorithms provide higher sensitivity and PPV than currently used trigger tools or voluntary incident-reporting systems, including identification of potential dosing and frequency errors that current methods are unequipped to detect. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Lessons derived from two high-frequency sea level events in the Atlantic: implications for coastal risk analysis and tsunami detection

    Directory of Open Access Journals (Sweden)

    Begoña Pérez-Gómez

    2016-11-01

Full Text Available The upgrade and enhancement of sea level networks worldwide for integration in sea level hazard warning systems have significantly increased the possibilities for measuring and analyzing high frequency sea level oscillations, with typical periods ranging from a few minutes to a few hours. Many tide gauges now afford 1 min or more frequent sampling and have shown such events to be a common occurrence. Their origins and spatial distribution are diverse and must be well understood in order to correctly design and interpret, for example, the automatic detection algorithms used by tsunami warning centers. Two events recorded recently in European Atlantic waters are analyzed here: possible wave-induced seiches that occurred along the North coast of Spain during the storms of January and February of 2014, and oscillations detected after an earthquake in the mid-Atlantic on the 13th of February 2015. The former caused significant flooding in towns and villages and a huge increase in wave-induced coastal damage that was reported in the media for weeks. The second was a smaller signal present in several tide gauges along the Atlantic coast that coincided with the occurrence of this earthquake, leading to a debate on the potential detection of a very small tsunami and how it might yield significant information for tsunami wave modelers and for the development of tsunami detection software. These kinds of events inform us about the limitations of automatic algorithms for tsunami warning and help to improve the information provided to tsunami warning centers, whilst also emphasizing the importance of other forcings in generating extreme sea levels and their associated potential for causing damage to infrastructure.

  7. Novel ST-MUSIC-based spectral analysis for detection of ULF geomagnetic signals anomalies associated with seismic events in Mexico

    Directory of Open Access Journals (Sweden)

    Omar Chavez

    2016-05-01

    Full Text Available Recently, the analysis of ultra-low-frequency (ULF) geomagnetic signals for the detection of seismic anomalies has been reported in several works. Although these works show promising results, detection remains difficult because the anomalies are generally very weak and embedded in high noise levels. In this work, short-time multiple signal classification (ST-MUSIC), a technique with high frequency resolution and noise immunity, is proposed for the detection of seismic anomalies in ULF geomagnetic signals. In addition, the energy (E) of the geomagnetic signals processed by ST-MUSIC is presented as a complementary parameter to measure the fluctuations between periods of seismic activity and seismic calm. The usefulness and effectiveness of the proposal are demonstrated through the analysis of a synthetic signal and five real signals associated with earthquakes. The analysed ULF geomagnetic signals were obtained using a tri-axial fluxgate magnetometer at the Juriquilla station, located in Queretaro, Mexico (geographic coordinates: longitude 100.45° W and latitude 20.70° N). The results obtained show the detection of seismic perturbations before, during, and after the main shock, making the proposal a suitable tool for detecting seismic precursors.
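For readers unfamiliar with MUSIC, the core computation inside each short-time window can be sketched as follows; the correlation-matrix order m, the snapshot construction, and the frequency grid are illustrative choices, not the authors' parameters:

```python
import numpy as np

def music_pseudospectrum(x, n_sources, freqs, fs, m=64):
    """MUSIC pseudospectrum of a 1-D signal x (a minimal sketch).

    This is the classical MUSIC step that ST-MUSIC applies within each
    short-time analysis window."""
    # Build an m x m sample autocorrelation matrix from overlapping snapshots.
    snapshots = np.lib.stride_tricks.sliding_window_view(x, m)
    R = snapshots.T @ snapshots.conj() / snapshots.shape[0]
    # Eigendecomposition: the smallest eigenvalues span the noise subspace.
    eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
    En = eigvecs[:, : m - n_sources]          # noise-subspace eigenvectors
    # Pseudospectrum is large where the steering vector is (nearly)
    # orthogonal to the noise subspace, i.e. at signal frequencies.
    p = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        a = np.exp(2j * np.pi * f / fs * np.arange(m))
        p[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return p
```

A sharp peak in the pseudospectrum marks a narrowband component even at low signal-to-noise ratio, which is the property exploited here for weak seismo-magnetic anomalies.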

  8. Detection and Modeling of a Meteotsunami in Lake Erie During a High Wind Event on May 27, 2012

    Science.gov (United States)

    Anderson, E. J.; Schwab, D. J.; Lombardy, K. A.; LaPlante, R. E.

    2012-12-01

    On May 27, 2012, a mesoscale convective system moved southeast across the central basin of Lake Erie (the shallowest of the Great Lakes), causing an increase in surface wind speed from 3 to 15 m/s over a few minutes. Although no significant pressure change was observed during this period (+1 mbar), the storm resulted in 3 reported edge waves on the southern shore (5 minutes apart), with wave heights up to 7 feet (2.13 m). Witnesses along the coast reported that the water receded before the waves hit, the only warning of the impending danger. After impact on the southern shore, several individuals were stranded in the water near Cleveland, Ohio. Fortunately, there were no fatalities or serious injuries as a result of the edge waves. The storm yielded two separate but similar squall line events that impacted the southern shore of Lake Erie several hours apart. The first event had little impact on nearshore conditions; however, the second event (moving south-eastward at 21.1 m/s or 41 knots) resulted in the 7 ft waves near Cleveland reported above. The thunderstorms generated three closely packed outflow boundaries that intersected the southern shore of Lake Erie between 1700 and 1730 UTC. The outflow boundaries were followed by a stronger outflow at 1800 UTC. Radial velocities on the WSR-88D in Cleveland, Ohio indicated the winds were stronger in the second outflow boundary. The radar indicated winds between 20.6 and 24.7 m/s (40 and 48 knots) within 240 meters (800 feet) above ground level. In order to better understand the storm event and the cause of the waves that impacted the southern shore, a three-dimensional hydrodynamic model of Lake Erie has been developed using the Finite Volume Coastal Ocean Model (FVCOM). The model is being developed as part of the Great Lakes Coastal Forecasting System (GLCFS), a set of experimental real-time pre-operational hydrodynamic models run at the NOAA Great Lakes Environmental Research Laboratory that forecast currents, waves, temperature, and
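One plausible amplification mechanism for such storm-driven waves is Proudman resonance, in which a squall line's translation speed matches the shallow-water wave speed sqrt(g*h). Connecting this mechanism to the May 27, 2012 event is our gloss, not a claim made in the abstract; the quick calculation below simply uses the reported storm speed:

```python
import math

def proudman_resonant_depth(storm_speed_ms):
    """Water depth at which a long wave travels at the storm's speed.

    Proudman resonance: amplification occurs when the storm translation
    speed U equals the shallow-water wave speed sqrt(g*h), so h = U**2/g."""
    g = 9.81  # gravitational acceleration, m/s^2
    return storm_speed_ms ** 2 / g

# The second squall line moved at 21.1 m/s:
print(round(proudman_resonant_depth(21.1), 1))  # → 45.4 (meters)
```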

  9. Issues in practical model-based diagnosis

    NARCIS (Netherlands)

    Bakker, R.R.; van den Bempt, P.C.A.; Mars, Nicolaas; Out, D.J.; van Soest, D.C.

    1993-01-01

    The model-based diagnosis project at the University of Twente has been directed at improving the practical usefulness of model-based diagnosis. In cooperation with industrial partners, the research addressed the modeling problem and the efficiency problem in model-based reasoning. Main results of

  10. Detection and copy number estimation of the transgenic nucleotide sequences in an unknown GM event of Oryza sativa

    Directory of Open Access Journals (Sweden)

    Ali M. Sajjad

    2016-12-01

    Full Text Available The present study was designed to establish a qualitative detection method based on conventional and real-time PCR assays to screen commonly grown rice varieties for the presence of the cry1Ac gene. The detection of genetically modified rice in the screening process would necessitate accurate assay development and precise qualitative PCR tests complying with established procedures for the detection and characterization of transgenes in food grains. Such an assay would enable not only the monitoring of transgene flow in the local agricultural environment but also the characterization of different plant species produced with this transgene and its regulatory components. Thus, a reliable and quick screening assay was established for the qualitative detection of the transgene along with the promoter and selectable marker gene in genetically modified rice. By conventional PCR, a fragment of 215 bp was amplified with gene-specific primers for cry1Ac. Primers for other transgenes such as gna and bar were also employed; however, no amplification was detected. The presence of the p35s, sps, and nptII genes was confirmed by qualitative real-time PCR. The specificity of the respective PCR products was checked through melt peak curve analysis. Sharp and precise melting temperatures indicated the presence of a single kind of PCR product corresponding to each of the primers used. Moreover, the copy number of cry1Ac was estimated by the ΔΔCT method. It is proposed that the primer sets and experimental conditions used in this study will be sufficient to meet the requirements for molecular detection and characterization of the cry1Ac transgene and affiliated sequences in distinguishing conventional rice varieties from genetically modified ones. It will also help to monitor the ecological flow of these transgenes and other biosafety factors.
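The 2^(−ΔΔCT) calculation behind the copy-number estimate is straightforward; the CT values below are invented for illustration, and the formula assumes roughly 100% amplification efficiency for both assays:

```python
def copy_number_ddct(ct_target_sample, ct_ref_sample,
                     ct_target_calib, ct_ref_calib, calib_copies=1):
    """Estimate relative copy number with the 2^-ΔΔCT method.

    The calibrator is a sample with a known copy number; the reference
    gene normalizes for template amount."""
    d_ct_sample = ct_target_sample - ct_ref_sample    # ΔCT of the sample
    d_ct_calib = ct_target_calib - ct_ref_calib       # ΔCT of the calibrator
    dd_ct = d_ct_sample - d_ct_calib                  # ΔΔCT
    return calib_copies * 2 ** (-dd_ct)

# Hypothetical CT values: a target amplifying one cycle earlier than in
# a single-copy calibrator corresponds to ~2 copies.
print(copy_number_ddct(24.0, 20.0, 25.0, 20.0, calib_copies=1))  # → 2.0
```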

  11. Detection of magnetic-labeled antibody specific recognition events by combined atomic force and magnetic force microscopy

    International Nuclear Information System (INIS)

    Hong Xia; Liu Yanmei; Li Jun; Guo Wei; Bai Yubai

    2009-01-01

    Atomic force microscopy (AFM) and magnetic force microscopy (MFM) were used to detect specific biomolecular interactions. Goat anti-mouse immunoglobulin (anti-IgG) was covalently attached onto a gold substrate modified by a self-assembled monolayer of thioctic acid via 1-ethyl-3-[3-(dimethylamino)propyl] carbodiimide (EDC) activation. Magnetic-labeled IgG then specifically adsorbed onto the anti-IgG surface. The morphological variation was identified by AFM. MFM proved to be a useful complementary tool for distinguishing the immunorecognized nanocomposites from impurities by detecting the magnetic signal from the magnetic-labeled IgG. This approach should enhance the understanding of the biomolecular recognition process.

  12. Detection of magnetic-labeled antibody specific recognition events by combined atomic force and magnetic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Hong Xia [Center for Advanced Optoelectronic Functional Materials Research, Key Laboratory of UV Light-Emitting Materials and Technology, Ministry of Education, Northeast Normal University, Changchun 130024 (China); College of Chemistry, Jilin University, Changchun 130023 (China)], E-mail: xiahong@nenu.edu.cn; Liu Yanmei; Li Jun; Guo Wei; Bai Yubai [College of Chemistry, Jilin University, Changchun 130023 (China)

    2009-09-15

    Atomic force microscopy (AFM) and magnetic force microscopy (MFM) were used to detect specific biomolecular interactions. Goat anti-mouse immunoglobulin (anti-IgG) was covalently attached onto a gold substrate modified by a self-assembled monolayer of thioctic acid via 1-ethyl-3-[3-(dimethylamino)propyl] carbodiimide (EDC) activation. Magnetic-labeled IgG then specifically adsorbed onto the anti-IgG surface. The morphological variation was identified by AFM. MFM proved to be a useful complementary tool for distinguishing the immunorecognized nanocomposites from impurities by detecting the magnetic signal from the magnetic-labeled IgG. This approach should enhance the understanding of the biomolecular recognition process.

  13. Detection of a Type IIn Supernova in Optical Follow-up Observations of IceCube Neutrino Events

    OpenAIRE

    Aartsen, M. G.; Abraham, K.; Ackermann, M.; Adams, J.; Aguilar, J. A.; Ahlers, M.; Ahrens, M.; Altmann, D.; Anderson, T.; Archinger, M.; Arguelles, C.; Arlen, T. C.; Auffenberg, J.; Bai, X.; Barwick, S. W.

    2015-01-01

    The IceCube neutrino observatory pursues a follow-up program selecting interesting neutrino events in real time and issuing alerts for electromagnetic follow-up observations. In 2012 March, the most significant neutrino alert during the first three years of operation was issued by IceCube. In the follow-up observations performed by the Palomar Transient Factory (PTF), a Type IIn supernova (SN IIn), PTF12csy, was found 0.2° away from the neutrino alert direction, with an error radius of 0...

  14. Structure of Exogenous Gene Integration and Event-Specific Detection in the Glyphosate-Tolerant Transgenic Cotton Line BG2-7.

    Science.gov (United States)

    Zhang, Xiaobing; Tang, Qiaoling; Wang, Xujing; Wang, Zhixing

    2016-01-01

    In this study, the flanking sequence of an inserted fragment conferring glyphosate tolerance on the transgenic cotton line BG2-7 was analyzed by thermal asymmetric interlaced polymerase chain reaction (TAIL-PCR) and standard PCR. The results showed apparent insertion of the exogenous gene into chromosome D10 of the Gossypium hirsutum L. genome, with the left and right borders of the inserted fragment at nucleotides 61,962,952 and 61,962,921 of chromosome D10, respectively. In addition, a 31-bp cotton microsatellite sequence was noted between the genome sequence and the 5' end of the exogenous gene. In total, 84 and 298 bp were deleted from the left and right borders of the exogenous gene, respectively, with 30 bp deleted from the cotton chromosome at the insertion site. According to the flanking sequence obtained, several pairs of event-specific detection primers were designed to amplify the sequence between the 5' end of the exogenous gene and the cotton genome junction region, as well as between the 3' end and the cotton genome junction region. Based on screening tests, the 5'-end primers GTCATAACGTGACTCCCTTAATTCTCC/CCTATTACACGGCTATGC and 3'-end primers TCCTTTCGCTTTCTTCCCTT/ACACTTACATGGCGTCTTCT were selected as the BG2-7 event-specific detection primers. The limit of detection of the former primer pair reached 44 copies, and that of the latter reached 88 copies. The results of this study provide useful data for assessing BG2-7 safety and for accelerating its industrialization.

  15. Utilizing NASA Earth Observations to detect factors contributing to hypoxic events in the southern Gulf of Mexico

    Science.gov (United States)

    Chapman, R.; Johansen, A.; Mitchell, Å. R.; Caraballo Álvarez, I. O.; Taggart, M.; Smith, B.

    2015-12-01

    Monitoring and analyzing harmful algal blooms (HABs) and hypoxic events in the southern coastal areas of the Gulf of Mexico (GoM) is important for watershed management and mitigation of environmental degradation. This study uncovered trends and dynamic characteristics of chlorophyll-a (Chl) concentration, sea surface temperature (SST), colored dissolved organic matter index (CDOM), and photosynthetically available radiation (PAR), as evident in 8-day standard mapped image (SMI) products from the MODIS instrument on the Aqua platform from 2002-2015, using the Clark Labs TerrSet Earth Trends Modeler (ETM). Predicted dissolved oxygen images were classified using a Multi-Layer Perceptron regression approach with in-situ data from the northern GoM. Additionally, sediment and nutrient loading values for the Grijalva-Usumacinta watershed were modeled using the ArcGIS Soil and Water Assessment Tool (SWAT). Lastly, a turbidity index was generated using Landsat 8 Operational Land Imager (OLI) scenes for 2014-2015. The results, tools, and products will assist local environmental and health authorities in revising water quality standards and mitigating the impacts of future HABs and hypoxic events in the region. This project uses NASA's Earth observations as a viable alternative for studying a region with no in-situ data.
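A minimal sketch of the Multi-Layer Perceptron regression step, assuming scikit-learn and using synthetic stand-ins for the MODIS-derived predictors (Chl, SST, CDOM, PAR) and the in-situ dissolved-oxygen targets; none of these numbers come from the study:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic per-pixel predictors standing in for Chl, SST, CDOM and PAR;
# the real study trained against in-situ dissolved-oxygen measurements.
rng = np.random.default_rng(0)
X = rng.random((200, 4))
y = 8.0 - 3.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(0.0, 0.1, 200)

# Small network; architecture and iteration count are illustrative choices.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X, y)
do_map = model.predict(X)  # predicted dissolved oxygen per "pixel"
```

In the actual workflow, the fitted model would be applied to every pixel of the MODIS grids to produce the predicted dissolved-oxygen images described above.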

  16. Detecting single-electron events in TEM using low-cost electronics and a silicon strip sensor.

    Science.gov (United States)

    Gontard, Lionel C; Moldovan, Grigore; Carmona-Galán, Ricardo; Lin, Chao; Kirkland, Angus I

    2014-04-01

    There is great interest in developing novel position-sensitive direct detectors for transmission electron microscopy (TEM) that do not rely on the conversion of electrons into photons. Direct imaging improves contrast and efficiency and allows operation of the microscope at lower energies and lower doses without loss of resolution, which is especially important for studying soft materials and biological samples. We investigate the feasibility of employing a silicon strip detector as an imaging detector for TEM. This device, routinely used in high-energy particle physics, can detect the small variations in electric current associated with the impact of a single charged particle. The main advantages of using this type of sensor for direct imaging in TEM are its intrinsic radiation hardness and large detection area. Here, we detail the design, simulation, fabrication, and in-TEM testing of the front-end electronics, developed using low-cost discrete components, and discuss the limitations and applications of this technology for TEM.

  17. Hierarchical Context Modeling for Video Event Recognition.

    Science.gov (United States)

    Wang, Xiaoyang; Ji, Qiang

    2016-10-11

    Current video event recognition research remains largely target-centered. For real-world surveillance videos, target-centered event recognition faces great challenges due to large intra-class target variation, limited image resolution, and poor detection and tracking results. To mitigate these challenges, we introduce a context-augmented video event recognition approach. Specifically, we explicitly capture different types of contexts at three levels: the image level, the semantic level, and the prior level. At the image level, we introduce two types of contextual features, appearance context features and interaction context features, to capture the appearance of context objects and their interactions with the target objects. At the semantic level, we propose a deep model based on a deep Boltzmann machine to learn event object representations and their interactions. At the prior level, we utilize two types of prior-level contexts: scene priming and dynamic cueing. Finally, we introduce a hierarchical context model that systematically integrates the contextual information at different levels. Through the hierarchical context model, contexts at different levels jointly contribute to event recognition. We evaluate the hierarchical context model for event recognition on benchmark surveillance video datasets. Results show that incorporating contexts at each level can improve event recognition performance, and jointly integrating three levels of contexts through our hierarchical model achieves the best performance.

  18. Simulation and prototyping of 2 m long resistive plate chambers for detection of fast neutrons and multi-neutron event identification

    Energy Technology Data Exchange (ETDEWEB)

    Elekes, Z., E-mail: z.elekes@hzdr.de [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Aumann, T. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Technische Universität Darmstadt, Darmstadt (Germany); Bemmerer, D. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Boretzky, K. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Caesar, C. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Technische Universität Darmstadt, Darmstadt (Germany); Cowan, T.C. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Universität Dresden, Dresden (Germany); Hehner, J.; Heil, M. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Kempe, M. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Rossi, D. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Röder, M. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Universität Dresden, Dresden (Germany); Simon, H. [GSI Helmholtzzentrum für Schwerionenforschung, Darmstadt (Germany); Sobiella, M.; Stach, D. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Reinhardt, T. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Technische Universität Dresden, Dresden (Germany); Wagner, A.; Yakorev, D. [Helmholtz-Zentrum Dresden-Rossendorf, Dresden (Germany); Zilges, A. [Universität zu Köln, Köln (Germany); Zuber, K. [Technische Universität Dresden, Dresden (Germany)

    2013-02-11

    Resistive plate chamber (RPC) prototypes of 2 m length were simulated and built. The experimental tests using a 31 MeV electron beam, discussed in detail, showed an efficiency higher than 90% and an excellent time resolution of around σ = 100 ps. Furthermore, comprehensive simulations were performed with the GEANT4 toolkit in order to study the possible use of these RPCs for fast-neutron (200 MeV–1 GeV) detection and multi-neutron event identification. The validation of the simulation parameters was carried out via a comparison to experimental data. A possible setup for invariant-mass spectroscopy of multi-neutron emission is presented and its characteristics are discussed. The results show that the setup has a high detection efficiency. Its capability of determining the momenta of the outgoing neutrons and reconstructing the relative energy between the fragments from nuclear reactions is demonstrated for different scenarios.

  19. Event-related near-infrared spectroscopy detects conflict in the motor cortex in a Stroop task.

    Science.gov (United States)

    Szűcs, Dénes; Killikelly, Clare; Cutini, Simone

    2012-10-05

    The Stroop effect is one of the most popular models of conflict processing in neuroscience and psychology. The response conflict theory of the Stroop effect explains decreased performance in the incongruent condition of Stroop tasks by assuming that the task-relevant and the task-irrelevant stimulus features elicit conflicting response tendencies. However, to date, there is not much explicit neural evidence supporting this theory. Here we used functional near-infrared spectroscopy (fNIRS) to examine whether conflict at the level of the motor cortex can be detected in the incongruent relative to the congruent condition of a Stroop task. Response conflict was determined by comparing the activity of the hemisphere ipsilateral to the response hand in the congruent and incongruent conditions. First, the results provided explicit hemodynamic evidence supporting the response conflict theory of the Stroop effect: there was greater motor cortex activation in the hemisphere ipsilateral to the response hand in the incongruent than in the congruent condition during the initial stage of the hemodynamic response. Second, as fNIRS is still a relatively novel technology, it is methodologically significant that our data show that fNIRS is able to detect a brief and transient increase in hemodynamic activity localized to the motor cortex, which in this study is related to subthreshold motor response activation. Copyright © 2012 Elsevier B.V. All rights reserved.

  20. Detection of and response to a probable volcanogenic T-wave event swarm on the western Blanco Transform Fault Zone

    Science.gov (United States)

    Dziak, R.P.; Fox, C.G.; Embley, R.W.; Lupton, J.E.; Johnson, G.C.; Chadwick, W.W.; Koski, R.A.

    1996-01-01

    The East Blanco Depression (EBD), a pull-apart basin within the western Blanco Transform Fault Zone (BTFZ), was the site of an intense earthquake T-wave swarm that began at 1317Z on January 9, 1994. Although tectonically generated earthquakes occur frequently along the BTFZ, this swarm was unusual in that it was preceded and accompanied by periodic, low-frequency, long-duration acoustic signals that originated from near the swarm epicenters. These tremor-like signals were very similar in character to the acoustic energy produced by a shallow-submarine eruption near Socorro Island, a seamount several hundred km west of Baja California. The ~69 earthquakes and ~400 tremor-like events at the EBD occurred sporadically, with two periods of peak activity between January 5-16 and 27-31. The swarm-like character of the earthquakes and the similarity of the tremor activity to the Socorro eruption indicated that the EBD was undergoing an intrusion or eruption episode. On January 27, six CTD/rosette casts were conducted at the site. Water samples from two of the stations yielded anomalous 3He concentrations, with maxima at ~2800 m depth over the main basin. In June 1994 two camera tows within the basin yielded evidence of pillow-lava volcanism and hydrothermal deposits, but no conclusive evidence of a recent seafloor eruption. In September 1994, deployments of the U.S. Navy's Advanced Tethered Vehicle resulted in the discovery of an active hydrothermal mound on the flanks of a pillow-lava volcano. The hydrothermal mound consists of Fe-rich hydrothermal precipitate and bacterial mats. Temperatures up to 60 °C were measured 30 cm below the surface. This is the first discovery of active hydrothermal vents along an oceanic fracture zone. Although no conclusive evidence of volcanic activity associated with the T-wave event swarm was found during these response efforts, the EBD has been the site of recent seafloor eruptions. Copyright 1996 by the American Geophysical Union.

  1. Fusion events

    International Nuclear Information System (INIS)

    Aboufirassi, M; Angelique, J.C.; Bizard, G.; Bougault, R.; Brou, R.; Buta, A.; Colin, J.; Cussol, D.; Durand, D.; Genoux-Lubain, A.; Horn, D.; Kerambrun, A.; Laville, J.L.; Le Brun, C.; Lecolley, J.F.; Lefebvres, F.; Lopez, O.; Louvel, M.; Meslin, C.; Metivier, V.; Nakagawa, T.; Peter, J.; Popescu, R.; Regimbart, R.; Steckmeyer, J.C.; Tamain, B.; Vient, E.; Wieloch, A.; Yuasa-Nakagawa, K.

    1998-01-01

    The fusion reactions between low-energy heavy ions have a very high cross section. First measurements at energies around 30-40 MeV/nucleon indicated no residues of either complete or incomplete fusion, thus suggesting the disappearance of this process. This is explained by the large amount of energy transferred to the nucleus, which leads to its complete break-up into light fragments and particles. Exclusive analyses have made it possible to clearly establish the presence of fusion processes in heavy systems at energies above 30-40 MeV/nucleon. Among the complete events of the Kr + Au reaction at 60 MeV/nucleon, the majority correspond to binary collisions. Nevertheless, for the largest energy losses, a class of events occurs for which the detected fragments appear to be emitted from a single source. These events correspond to an incomplete projectile-target fusion followed by multifragmentation. Such events were also singled out in the reaction Xe + Sn at 50 MeV/nucleon. For the events in which the energy dissipation was maximal, it was possible to isolate an isotropic group of events showing all the characteristics of fusion nuclei. The fusion is said to be incomplete as pre-equilibrium Z = 1 and Z = 2 particles are emitted. The cross section is of the order of 25 mb. Similar conclusions were drawn for the systems 36Ar + 27Al and 64Zn + natTi. A cross-section value of ~20 mb was determined at 55 MeV/nucleon in the first case, while the measurement of light evaporation residues in the latter system gave an upper limit of 20-30 mb for the cross section at 50 MeV/nucleon.

  2. Data-mining for detecting signals of adverse drug reactions of fluoxetine using the Korea Adverse Event Reporting System (KAERS) database.

    Science.gov (United States)

    Kim, Seonji; Park, Kyounghoon; Kim, Mi-Sook; Yang, Bo Ram; Choi, Hyun Jin; Park, Byung-Joo

    2017-10-01

    Selective serotonin reuptake inhibitors (SSRIs) have become one of the most broadly used medication classes in psychiatry. Fluoxetine is the first representative SSRI antidepressant, approved by the Food and Drug Administration (FDA) in 1987. Less safety information has been reported for fluoxetine used alone than for its combined use with other drugs, and no published papers have analyzed spontaneous adverse event reports for adverse drug reactions (ADRs) of fluoxetine. We detected signals of adverse drug reactions of fluoxetine by data mining using the Korea Adverse Event Reporting System (KAERS) database. We defined signals using the reporting odds ratio (ROR), proportional reporting ratio (PRR), and information component (IC) indices. The KAERS database included 860,224 AE reports, among which 866 reports contained fluoxetine. We compared the labels of fluoxetine among the United States, UK, Germany, France, China, and Korea. Some of the signals, including emotional lability, myositis, spinal stenosis, paradoxical drug reaction, drug dependence, extrapyramidal disorder, adrenal insufficiency, and intracranial hemorrhage, were not labeled in the six countries. In conclusion, we identified new signals that were not known at the time of market approval. However, signal evaluation should also consider factors such as the clinical significance, preventability, and causality of the detected signals. Copyright © 2017 Elsevier B.V. All rights reserved.
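The three disproportionality indices are computed from a 2×2 contingency table of reports. A simplified sketch follows; note that the IC used in pharmacovigilance practice is normally the Bayesian-shrinkage BCPNN estimate, and the maximum-likelihood form below is an illustrative simplification:

```python
import math

def disproportionality(a, b, c, d):
    """ROR, PRR and a simple (non-shrinkage) IC from a 2x2 report table.

    a: reports with the drug and the event of interest
    b: reports with the drug and any other event
    c: reports with other drugs and the event
    d: reports with other drugs and other events"""
    n = a + b + c + d
    ror = (a * d) / (b * c)                      # reporting odds ratio
    prr = (a / (a + b)) / (c / (c + d))          # proportional reporting ratio
    ic = math.log2(a * n / ((a + b) * (a + c)))  # information component (MLE)
    return ror, prr, ic
```

A combination becomes a "signal" when these indices exceed the thresholds chosen by the surveillance program (commonly ROR or PRR lower confidence bounds above 1-2, IC above 0).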

  3. Reliable Maintenance of Wireless Sensor Networks for Event-Detection Applications

    Institute of Scientific and Technical Information of China (English)

    胡四泉; 杨金阳; 王俊峰

    2011-01-01

    Reliability maintenance of a wireless sensor network is key to ensuring that alarm messages are delivered reliably and on time to the monitoring center in an event-detection application. Based on the unreliable links in wireless sensor networks and the network characteristics of event-detection applications, this paper proposes MPRRM, a multipath redundant reliability maintenance algorithm. Both analytical and simulation results show that the MPRRM algorithm is superior to previously published solutions in terms of reliability, false-positive rate, latency, and message overhead.
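The intuition behind multipath redundancy can be shown with a toy model: if each path independently delivers an alarm with probability p, then k disjoint paths deliver it with probability 1 − (1 − p)^k. This independence assumption is our simplification; MPRRM's actual mechanism and metrics are more involved.

```python
def multipath_delivery_prob(p_single, n_paths):
    """Probability that at least one of n independent paths delivers.

    p_single: end-to-end delivery probability of one path.
    Assumes path failures are independent (a toy model, not MPRRM itself)."""
    return 1 - (1 - p_single) ** n_paths

# Three paths at 70% each already push delivery above 97%:
print(round(multipath_delivery_prob(0.7, 3), 3))  # → 0.973
```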

  4. Automatic change detection in vision: Adaptation, memory mismatch, or both? II: Oddball and adaptation effects on event-related potentials.

    Science.gov (United States)

    Bodnár, Flóra; File, Domonkos; Sulykos, István; Kecskés-Kovács, Krisztina; Czigler, István

    2017-11-01

    In this study we compared the event-related potentials (ERPs) obtained in two different paradigms: a passive visual oddball paradigm and an adaptation paradigm. The aim of the study was to investigate the relation between the effects of activity decrease following an adaptor (stimulus-specific adaptation) and the effects of an infrequent stimulus within sequences of frequent ones. In Experiment 1, participants were presented with different line textures. The frequent (standard) and rare (deviant) texture elements differed in their orientation. In Experiment 2, windmill pattern stimuli were presented in which the number of vanes differentiated the deviant and standard stimuli. In Experiment 1 the ERP differences elicited between the oddball deviant and the standard were similar to the differences between the ERPs to the nonadapted and adapted stimuli in the adaptation paradigm. In both paradigms the differences appeared as a posterior negativity with a latency of 120-140 ms. This finding demonstrates that the representation of a sequential rule (successive presentation of the standard) and the violation of this rule are not necessary for deviancy effects to emerge. In Experiment 2 (windmill pattern), in the oddball paradigm the difference potentials appeared as a long-lasting negativity. In the adaptation condition, the later part of this negativity (after 200 ms) was absent. We identified the later part of the oddball difference potential as the genuine visual mismatch negativity, that is, an ERP correlate of sequence violations. The latencies of the difference potentials (deviant minus standard) and the endogenous components (P1 and N1) diverged; therefore, the adaptation of these particular ERP components cannot explain the deviancy effect. Accordingly, the sources contributing to the standard-versus-deviant modulations differed from those related to visual adaptation; that is, they generated distinct ERP components.
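The deviant-minus-standard difference potential analyzed in such studies is simply a difference of trial-averaged epochs. A minimal sketch, with hypothetical array shapes:

```python
import numpy as np

def difference_wave(deviant_epochs, standard_epochs):
    """Deviant-minus-standard ERP difference wave.

    Inputs are (n_trials, n_samples) arrays of baseline-corrected epochs;
    trials are averaged before the subtraction."""
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)
```

The visual mismatch negativity is then read off as a negativity of this wave within a latency window (e.g. 120-140 ms post-stimulus).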

  5. Development and Testing of Procedures for Carrying Out Emergency Physical Inventory Taking after Detecting Anomaly Events Concerning NM Security

    International Nuclear Information System (INIS)

    VALENTE, J.; FISHBONE, L.

    2003-01-01

    In the State Scientific Center of the Russian Federation - Institute of Physics and Power Engineering (SSC RF-IPPE, Obninsk), which is under Minatom jurisdiction, procedures for carrying out emergency physical inventory taking (EPIT) were developed and tested in cooperation with Brookhaven National Laboratory (USA). Here, emergency physical inventory taking means a PIT carried out when there are indications of possible NM loss (theft). Such a PIT often requires verification of the attributes and quantitative characteristics of all the NM items located in a specific Material Balance Area (MBA). In order to carry out the exercise, an MBA was selected where many thousands of NM items containing highly enriched uranium are used. Three clients of the computerized material accounting system (CMAS) are installed in this MBA. Labels with unique (within the IPPE site) identification numbers in the form of digit combinations and an appropriate bar code have been applied to the NM items, containers, and authorized locations. All the data to be checked during the EPIT are stored in the CMAS database. Five variants of anomalies initiating an EPIT and requiring different types of EPIT organization activities are considered. Automated workstations (AWPs) were created on the basis of the client computers in order to carry out a large number of measurements within a reasonable time. In addition to a CMAS client computer, the main components of an AWP include a bar-code reader, an electronic scale, and an enrichment meter with a NaI detector, the IMCA Inspector (manufactured by Canberra). All these devices work together with the client computer in on-line mode. Special computer code (Emergency Inventory Software, EIS) was developed. All the algorithms of interaction between the operator and the system, as well as the algorithms of data exchange during the measurements and data comparison, are implemented in this software. Registration of detected

  6. Evaluation of epidemic intelligence systems integrated in the early alerting and reporting project for the detection of A/H5N1 influenza events.

    Directory of Open Access Journals (Sweden)

    Philippe Barboza

    Full Text Available The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems' performance. Timeliness and sensitivity were rated highly, illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating the complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7-13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in systems' designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users.

  7. Evaluation of Epidemic Intelligence Systems Integrated in the Early Alerting and Reporting Project for the Detection of A/H5N1 Influenza Events

    Science.gov (United States)

    Barboza, Philippe; Vaillant, Laetitia; Mawudeku, Abla; Nelson, Noele P.; Hartley, David M.; Madoff, Lawrence C.; Linge, Jens P.; Collier, Nigel; Brownstein, John S.; Yangarber, Roman; Astagneau, Pascal; on behalf of the Early Alerting and Reporting Project of the Global Health Security Initiative

    2013-01-01

    The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems' performance. Timeliness and sensitivity were rated highly, illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating the complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7–13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in systems' designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users. PMID:23472077

  8. Evaluation of epidemic intelligence systems integrated in the early alerting and reporting project for the detection of A/H5N1 influenza events.

    Science.gov (United States)

    Barboza, Philippe; Vaillant, Laetitia; Mawudeku, Abla; Nelson, Noele P; Hartley, David M; Madoff, Lawrence C; Linge, Jens P; Collier, Nigel; Brownstein, John S; Yangarber, Roman; Astagneau, Pascal

    2013-01-01

    The objective of Web-based expert epidemic intelligence systems is to detect health threats. The Global Health Security Initiative (GHSI) Early Alerting and Reporting (EAR) project was launched to assess the feasibility and opportunity for pooling epidemic intelligence data from seven expert systems. EAR participants completed a qualitative survey to document epidemic intelligence strategies and to assess perceptions regarding the systems' performance. Timeliness and sensitivity were rated highly, illustrating the value of the systems for epidemic intelligence. Weaknesses identified included representativeness, completeness and flexibility. These findings were corroborated by the quantitative analysis performed on signals potentially related to influenza A/H5N1 events occurring in March 2010. For the six systems for which this information was available, the detection rate ranged from 31% to 38%, and increased to 72% when considering the virtual combined system. The effective positive predictive values ranged from 3% to 24% and F1-scores ranged from 6% to 27%. System sensitivity ranged from 38% to 72%. An average difference of 23% was observed between the sensitivities calculated for human cases and epizootics, underlining the difficulties in developing an efficient algorithm for a single pathology. However, the sensitivity increased to 93% when the virtual combined system was considered, clearly illustrating the complementarities between individual systems. The average delay between the detection of A/H5N1 events by the systems and their official reporting by WHO or OIE was 10.2 days (95% CI: 6.7-13.8). This work illustrates the diversity in implemented epidemic intelligence activities, differences in systems' designs, and the potential added values and opportunities for synergy between systems, between users and between systems and users.
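    The combined-system figures reported above follow from simple set arithmetic on each system's detected events plus the usual precision/recall/F1 definitions. A minimal sketch, with hypothetical counts and event sets rather than the study's data:

    ```python
    def precision_recall_f1(tp, fp, fn):
        """PPV (precision), sensitivity (recall), and F1 from event counts."""
        ppv = tp / (tp + fp) if tp + fp else 0.0
        sens = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * ppv * sens / (ppv + sens) if ppv + sens else 0.0
        return ppv, sens, f1

    def combined_detection(per_system_detected, all_events):
        """'Virtual combined system': an event counts as detected
        if at least one system found it."""
        detected = set().union(*per_system_detected)
        return len(detected & set(all_events)) / len(all_events)

    # Hypothetical example: 3 systems monitoring 10 reference events
    events = set(range(10))
    systems = [{0, 1, 2}, {2, 3, 4, 5}, {5, 6, 7}]
    print(combined_detection(systems, events))  # 0.8 -- the union covers 8 of 10
    ```

    The pooling effect seen in the abstract (individual sensitivities well below the combined one) is exactly this union behavior: systems miss different events.
    
    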

  9. Flanking sequence determination and event-specific detection of genetically modified wheat B73-6-1.

    Science.gov (United States)

    Xu, Junyi; Cao, Jijuan; Cao, Dongmei; Zhao, Tongtong; Huang, Xin; Zhang, Piqiao; Luan, Fengxia

    2013-05-01

    In order to establish a specific identification method for genetically modified (GM) wheat, the exogenous insert DNA and the flanking sequence between the exogenous fragment and the recombinant chromosome of GM wheat B73-6-1 were successfully acquired by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The newly acquired exogenous fragment covered the full-length sequence of the transformed genes, including the transformed plasmid and the corresponding functional genes: the marker uidA, the herbicide-resistance gene bar, the ubiquitin promoter, and the high-molecular-weight glutenin subunit gene. The flanking sequence adjacent to the insert DNA revealed high similarity to the Triticum turgidum A gene (GenBank: AY494981.1). A specific PCR detection method for GM wheat B73-6-1 was established on the basis of primers designed from the flanking sequence. This PCR method was validated against GM wheat, GM corn, GM soybean, GM rice, and non-GM wheat; the specifically amplified target band was observed only in GM wheat B73-6-1. The method offers high specificity, high reproducibility, rapid identification, and excellent accuracy for the identification of GM wheat B73-6-1.

  10. Gaseous time projection chambers for rare event detection: results from the T-REX project. II. Dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Irastorza, I.G.; Aznar, F.; Castel, J., E-mail: igor.irastorza@cern.ch, E-mail: faznar@unizar.es, E-mail: jfcastel@unizar.es [Grupo de Física Nuclear y Astropartículas, Departamento de Física Teórica, Universidad de Zaragoza, C/ P. Cerbuna 12, Zaragoza, 50009 Spain (Spain); and others

    2016-01-01

    As part of the T-REX project, a number of R and D and prototyping activities have been carried out during the last years to explore the applicability of gaseous Time Projection Chambers (TPCs) with Micromesh Gas Structures (Micromegas) in rare event searches like double beta decay, axion research and low-mass WIMP searches. While the companion paper focuses on double beta decay, in this paper we focus on the results regarding the search for dark matter candidates, both axions and WIMPs. Small (few cm wide) ultra-low background Micromegas detectors are used to image the axion-induced x-ray signal expected in axion helioscopes like the CERN Axion Solar Telescope (CAST) experiment. Background levels as low as 0.8 × 10{sup −6} counts keV{sup −1} cm{sup −2} s{sup −1} have already been achieved in CAST, while values down to ∼10{sup −7} counts keV{sup −1} cm{sup −2} s{sup −1} have been obtained in a test bench placed underground in the Laboratorio Subterráneo de Canfranc (LSC). Prospects to consolidate and further reduce these values down to ∼10{sup −8} counts keV{sup −1} cm{sup −2} s{sup −1} will be described. Such detectors, placed at the focal point of x-ray telescopes in the future International Axion Observatory (IAXO), would allow for a 10{sup 5} times better signal-to-noise ratio than CAST, and a search for solar axions with g{sub aγ} down to a few 10{sup −12} GeV{sup −1}, well into unexplored axion parameter space. In addition, a scaled-up version of these TPCs, properly shielded and placed underground, can be competitive in the search for low-mass WIMPs. The TREX-DM prototype, with ∼0.300 kg of Ar at 10 bar, or alternatively ∼0.160 kg of Ne at 10 bar, and an energy threshold well below 1 keV, has been built to test this concept. We describe the main technical solutions developed, as well as the results from the commissioning phase on surface. The anticipated sensitivity of this technique might reach ∼10{sup −44} cm{sup 2} for

  11. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, of which the method based on Bayesian networks is the most popular. Consequently, these methods can only predict the most probable faulty sensors, and the prediction depends on the initial probabilities defined for the failures. The method developed in this work takes a model-based approach and provides the faulty sensors (if any) that can be logically inferred from the model of the system and the sensor readings (observations). The method is also better suited to systems for which it is hard, or even impossible, to find the probability functions of the system. The method starts from a new mathematical description of the problem and develops an efficient and systematic algorithm for its solution. The method builds on the concept of analytical redundancy relations (ARRs).
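    The ARR-based inference described above can be sketched as a consistency check: each redundancy relation is a residual over a subset of sensors, and a single-fault hypothesis must explain exactly the violated relations. A toy version (the sensor names, relations, and threshold below are invented for illustration, not taken from the paper):

    ```python
    # Each ARR is (sensor names, residual function); a healthy system drives
    # every residual to (near) zero.

    def evaluate_arrs(arrs, readings, tol=1e-6):
        """Return the index set of violated ARRs."""
        return {i for i, (_, residual) in enumerate(arrs)
                if abs(residual(readings)) > tol}

    def candidate_faults(arrs, violated):
        """Single-fault candidates: a sensor must appear in every violated ARR,
        and is exonerated by any satisfied ARR it appears in."""
        if not violated:
            return set()
        cands = set.intersection(*(set(arrs[i][0]) for i in violated))
        for i, (sensors, _) in enumerate(arrs):
            if i not in violated:
                cands -= set(sensors)
        return cands

    # Toy system: s3 should equal s1 + s2, and s4 should duplicate s3.
    arrs = [
        (("s1", "s2", "s3"), lambda r: r["s1"] + r["s2"] - r["s3"]),
        (("s3", "s4"),       lambda r: r["s3"] - r["s4"]),
        (("s1", "s2", "s4"), lambda r: r["s1"] + r["s2"] - r["s4"]),
    ]
    readings = {"s1": 1.0, "s2": 2.0, "s3": 5.0, "s4": 3.0}  # s3 is biased
    print(candidate_faults(arrs, evaluate_arrs(arrs, readings)))  # {'s3'}
    ```

    The logical (rather than probabilistic) character of the method shows up here: no priors are needed, only the fault signature implied by which relations fail.
    
    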

  12. Expediting model-based optoacoustic reconstructions with tomographic symmetries

    International Nuclear Information System (INIS)

    Lutzweiler, Christian; Deán-Ben, Xosé Luís; Razansky, Daniel

    2014-01-01

    Purpose: Image quantification in optoacoustic tomography implies the use of accurate forward models of excitation, propagation, and detection of optoacoustic signals, while inversions with high spatial resolution usually involve very large matrices, leading to unreasonably long computation times. The development of fast and memory-efficient model-based approaches is therefore an important challenge for advancing the quantitative and dynamic imaging capabilities of tomographic optoacoustic imaging. Methods: Herein, a method for the simplification and acceleration of model-based inversions, relying on inherent symmetries present in common tomographic acquisition geometries, has been introduced. The method is showcased for the case of cylindrical symmetries by using polar image discretization of the time-domain optoacoustic forward model combined with efficient storage and inversion strategies. Results: The suggested methodology is shown to render fast and accurate model-based inversions in both numerical simulations and post mortem small animal experiments. In the case of a full-view detection scheme, the memory requirements are reduced by one order of magnitude while high-resolution reconstructions are achieved at video rate. Conclusions: By considering the rotational symmetry present in many tomographic optoacoustic imaging systems, the proposed methodology allows exploiting the advantages of model-based algorithms with feasible computational requirements and fast reconstruction times, so that its convenience and general applicability in optoacoustic imaging systems with tomographic symmetries is anticipated.

  13. Automatic Event Detection and Picking of P, S Seismic Phases for Earthquake Early Warning: A Case Study of the 2008 Wenchuan Earthquake

    Science.gov (United States)

    WANG, Z.; Zhao, B.

    2015-12-01

    We develop an automatic seismic phase arrival detection and picking algorithm for impending earthquakes with diverse focal mechanisms and depths. Polarization analysis of the three-component seismograms is utilized to distinguish between P and S waves through a sliding time window. When applying the short-term average/long-term average (STA/LTA) method to the polarized data, we also construct a new characteristic function that sensitively reflects changes in the signal's amplitude and frequency, providing better detection of the phase arrival. Then an improved combination of higher-order statistics and the Akaike information criterion (AIC) picker is applied to the refined signal to lock onto the arrival time with a higher degree of accuracy. We test our techniques on the aftershocks of the Ms8.0 Wenchuan earthquake, treating hundreds of three-component acceleration records with magnitudes of 4.0 to 6.4. In comparison to analyst picks, the proposed detection algorithms are shown to perform well and can be applied from a single instrument within a network of stations to large seismic events in an Earthquake Early Warning System (EEWS).
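    The two stages described above can be sketched in a few lines: an STA/LTA trigger on a characteristic function (here simply the squared signal, not the authors' amplitude-frequency function) followed by an AIC picker that refines the onset. The trace, window lengths, and threshold are illustrative:

    ```python
    import math

    def sta_lta(x, ns, nl):
        """Short-term / long-term average ratio of the squared signal."""
        e = [v * v for v in x]
        out = [0.0] * len(x)
        for i in range(nl, len(x)):
            sta = sum(e[i - ns:i]) / ns
            lta = sum(e[i - nl:i]) / nl
            out[i] = sta / lta if lta > 0 else 0.0
        return out

    def aic_pick(x):
        """Maeda-style AIC: AIC(k) = k*log(var(x[:k])) + (n-k-1)*log(var(x[k:]));
        the global minimum marks the onset of the variance change."""
        n = len(x)
        def var(seg):
            m = sum(seg) / len(seg)
            return sum((v - m) ** 2 for v in seg) / len(seg)
        best_k, best_aic = None, float("inf")
        for k in range(2, n - 2):
            v1, v2 = var(x[:k]), var(x[k:])
            if v1 <= 0 or v2 <= 0:
                continue
            aic = k * math.log(v1) + (n - k - 1) * math.log(v2)
            if aic < best_aic:
                best_k, best_aic = k, aic
        return best_k

    # Deterministic toy trace: weak background oscillation, 20x stronger arrival at 60
    trace = [0.05 * math.sin(0.7 * k) for k in range(60)] + \
            [math.sin(0.37 * k) for k in range(60, 120)]
    ratio = sta_lta(trace, ns=5, nl=30)
    trigger = next(i for i, r in enumerate(ratio) if r > 5.0)
    onset = aic_pick(trace)
    print(trigger, onset)  # both within a few samples of the true arrival at 60
    ```

    In practice the STA/LTA stage gives a cheap coarse trigger, and the AIC stage, run on a short segment around the trigger, supplies the sample-accurate pick.
    
    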

  14. Detection of prospective memory deficits in mild cognitive impairment of suspected Alzheimer's disease etiology using a novel event-based prospective memory task.

    LENUS (Irish Health Repository)

    Blanco-Campal, Alberto

    2009-01-01

    We investigated the relative discriminatory efficacy of an event-based prospective memory (PM) task, in which specificity of the instructions and perceptual salience of the PM cue were manipulated, compared with two widely used retrospective memory (RM) tests (Rivermead Paragraph Recall Test and CERAD-Word List Test), when detecting mild cognitive impairment of suspected Alzheimer's disease etiology (MCI-AD) (N = 19) from normal controls (NC) (N = 21). Statistical analyses showed high discriminatory capacity of the PM task for detecting MCI-AD. The Non-Specific-Non-Salient condition proved particularly useful in detecting MCI-AD, possibly reflecting the difficulty of the task, requiring more strategic attentional resources to monitor for the PM cue. With a cutoff score of <4/10, the Non-Specific-Non-Salient condition achieved a sensitivity = 84% and a specificity = 95%, superior to the most discriminative RM test used (CERAD-Total Learning: sensitivity = 83%; specificity = 76%). Results suggest that PM is an early sign of memory failure in MCI-AD and may be a more pronounced deficit than retrospective failure, probably reflecting the greater self-initiated retrieval demands involved in the PM task used. Limitations include the relatively small sample size and the use of a convenience sample (i.e. memory clinic attenders and healthy active volunteers), reducing the generalizability of the results, which should be regarded as preliminary. (JINS, 2009, 15, 154-159).
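    The cutoff statistics reported above are simple proportions over the two groups. With hypothetical score lists (not the study's data) the calculation looks like this:

    ```python
    # Scores below the cutoff on a 0-10 PM task are classified as impaired.
    # The score lists are invented purely to illustrate the arithmetic.

    def sens_spec(patient_scores, control_scores, cutoff):
        tp = sum(s < cutoff for s in patient_scores)   # impaired correctly flagged
        tn = sum(s >= cutoff for s in control_scores)  # controls correctly passed
        return tp / len(patient_scores), tn / len(control_scores)

    mci = [1, 2, 2, 3, 3, 5]        # hypothetical MCI-AD scores
    controls = [4, 6, 7, 8, 9, 10]  # hypothetical control scores
    print(sens_spec(mci, controls, cutoff=4))  # (0.8333..., 1.0)
    ```

    Sweeping the cutoff over all values yields the full sensitivity/specificity trade-off from which a threshold such as <4/10 is chosen.
    
    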

  15. Toward detection of supernova event near the earth based on high-resolution analysis of cosmogenic nuclide 10Be in marine sediments

    Science.gov (United States)

    Takiguchi, S.; Suganuma, Y.; Kataoka, R.; Yamaguchi, K. E.

    2017-12-01

    Cosmic rays react with substances in the Earth's atmosphere and form cosmogenic nuclides. The flux would abruptly increase with a nearby supernova or with terrestrial magnetic events such as a reversal or excursion of the geomagnetic field. The Earth may have been exposed to cosmic-ray radiation from nearby supernova activities for as long as 10 Ma (Kataoka et al., 2014). Increased and prolonged cosmic-ray activity would affect Earth's climate through the formation of greenhouse gases, and the biosphere through damage to DNA. Therefore, interest has been growing as to whether and how past supernova events have left any fingerprints on them. However, the detection of nearby supernovae is still under debate (e.g., Knie et al., 2004). To detect a long-term record of past supernova activities, we utilize the cosmogenic nuclide 10Be because of its short residence time (1-2 yr) in the atmosphere, simple transport process, and adequate half-life (1.36 kyr), which is nearly equivalent to the duration of present-day deep water circulation. Sediment samples collected from the equatorial western Pacific (706-825 kyr in age) were finely powdered and decomposed by mixed acids (HNO3, HF, and HClO4). The authigenic phase was also separated from the bulk powders by leaching with a weak acid. Because quantitative separation of Be from the samples is essential for high-quality 10Be analysis, both Be-bearing fractions were applied to optimized anion exchange chromatography for Be separation, and Be abundance was measured by atomic absorption spectrometry. The 10Be abundances (10Be/9Be ratios) were measured by accelerator mass spectrometry. The authigenic phase showed a temporal curve similar to that of the bulk samples (Suganuma et al., 2012), reflecting the influence of relative paleo-intensity and the utility of the authigenic method. An increased data set in terms of sampling interval (density) and total age range would allow us to judge whether it could detect past supernova activities and how it appears when

  16. Model-based Recursive Partitioning for Subgroup Analyses

    OpenAIRE

    Seibold, Heidi; Zeileis, Achim; Hothorn, Torsten

    2016-01-01

    The identification of patient subgroups with differential treatment effects is the first step towards individualised treatments. A current draft guideline by the EMA discusses potentials and problems in subgroup analyses and formulates challenges for the development of appropriate statistical procedures for the data-driven identification of patient subgroups. We introduce model-based recursive partitioning as a procedure for the automated detection of patient subgroups that are identifiable by...
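    The idea can be caricatured in a few lines: fit a treatment-effect model in each node and split on a partitioning covariate when the children fit markedly better. The toy version below splits on residual-sum-of-squares gain; the actual model-based recursive partitioning (MOB) algorithm instead uses parameter-instability tests, so this is a schematic stand-in with invented data:

    ```python
    def fit(rows):
        """Least-squares fit of y = a + b*t (t = treatment); returns (a, b, sse)."""
        n = len(rows)
        mt = sum(r["t"] for r in rows) / n
        my = sum(r["y"] for r in rows) / n
        stt = sum((r["t"] - mt) ** 2 for r in rows)
        sty = sum((r["t"] - mt) * (r["y"] - my) for r in rows)
        b = sty / stt if stt else 0.0
        a = my - b * mt
        sse = sum((r["y"] - a - b * r["t"]) ** 2 for r in rows)
        return a, b, sse

    def partition(rows, covariate, min_gain=1.0, min_size=4):
        _, b, sse = fit(rows)
        best = None
        for cut in sorted({r[covariate] for r in rows}):
            left = [r for r in rows if r[covariate] <= cut]
            right = [r for r in rows if r[covariate] > cut]
            if len(left) < min_size or len(right) < min_size:
                continue
            gain = sse - (fit(left)[2] + fit(right)[2])
            if best is None or gain > best[0]:
                best = (gain, cut, left, right)
        if best and best[0] > min_gain:
            return {"split": (covariate, best[1]),
                    "left": partition(best[2], covariate, min_gain, min_size),
                    "right": partition(best[3], covariate, min_gain, min_size)}
        return {"effect": b, "n": len(rows)}

    # Hypothetical trial: treatment helps (effect 2) only for patients over 50
    rows = []
    for age in (30, 40, 60, 70):
        for t in (0, 1):
            for rep in range(3):  # 0.1*rep acts as deterministic "noise"
                rows.append({"age": age, "t": t,
                             "y": (2.0 * t if age > 50 else 0.0) + 0.1 * rep})
    tree = partition(rows, "age")
    print(tree["split"])  # ('age', 40) -- the subgroup boundary is recovered
    ```

    Real MOB (e.g., `mob()` in the R partykit package) generalizes this to arbitrary parametric models and principled stopping rules.
    
    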

  17. Symbolic Model-Based SAR Feature Analysis and Change Detection

    Science.gov (United States)

    1992-02-01

    normalization factor described above in the Dempster rule of combination. Another problem is that in certain cases D-S overweights prior probabilities compared ... Beaufort Sea data set and the Peru data set. The Phoenix results are described in section 6.2.2, including a partial trace of the operation of the

  18. Probabilistic Model-Based Approach for Heart Beat Detection

    OpenAIRE

    Chen, Hugh; Erol, Yusuf; Shen, Eric; Russell, Stuart

    2015-01-01

    Nowadays, hospitals are ubiquitous and integral to modern society. Patients flow in and out of a veritable whirlwind of paperwork, consultations, and potential inpatient admissions, through an abstracted system that is not without flaws. One of the biggest flaws in the medical system is perhaps an unexpected one: the patient alarm system. One longitudinal study reported an 88.8% rate of false alarms, with other studies reporting numbers of similar magnitudes. These false alarm rates lead to a...

  19. Model based defect detection for free stator of ultrasonic motor

    DEFF Research Database (Denmark)

    Amini, Rouzbeh; Mojallali, Hamed; Izadi-Zamanabadi, Roozbeh

    2007-01-01

    In this paper, measurements of admittance magnitude and phase are used to identify the complex values of equivalent circuit model for free stator of an ultrasonic motor. The model is used to evaluate the changes in the admittance and relative changes in the values of equivalent circuit elements. ...

  20. An efficient visual saliency detection model based on Ripplet transform

    Indian Academy of Sciences (India)

    A Diana Andrushia

    human visual attention models is still not well investigated. ... Ripplet transform; visual saliency model; Receiver Operating Characteristics (ROC); ... proposed method has the same resolution as that of an input ... regions are obtained, which are independent of their sizes. ... impact than those far away from the attention.

  1. Drug Adverse Event Detection in Health Plan Data Using the Gamma Poisson Shrinker and Comparison to the Tree-based Scan Statistic

    Directory of Open Access Journals (Sweden)

    David Smith

    2013-03-01

    Full Text Available Background: Drug adverse event (AE) signal detection using the Gamma Poisson Shrinker (GPS) is commonly applied in spontaneous reporting. AE signal detection using large observational health plan databases can expand medication safety surveillance. Methods: Using data from nine health plans, we conducted a pilot study to evaluate the implementation and findings of the GPS approach for two antifungal drugs, terbinafine and itraconazole, and two diabetes drugs, pioglitazone and rosiglitazone. We evaluated 1676 diagnosis codes grouped into 183 different clinical concepts and four levels of granularity. Several signaling thresholds were assessed. GPS results were compared to findings from a companion study using the identical analytic dataset but an alternative statistical method, the tree-based scan statistic (TreeScan). Results: We identified 71 statistical signals across two signaling thresholds and two methods, including closely related signals of overlapping diagnosis definitions. Initial review found that most signals represented known adverse drug reactions or confounding. About 31% of signals met the highest signaling threshold. Conclusions: The GPS method was successfully applied to observational health plan data in a distributed data environment as a drug safety data mining method. There was substantial concordance between the GPS and TreeScan approaches. Key method implementation decisions relate to defining exposures and outcomes and informed choice of signaling thresholds.
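    The core of GPS-style disproportionality analysis is an observed-versus-expected comparison per (drug, event) pair, with the relative reporting ratio shrunk toward 1 for small counts. The actual GPS uses DuMouchel's two-component gamma-mixture prior (EBGM); the sketch below substitutes a single conjugate Gamma(a, b) prior to keep the mechanics visible, with invented totals:

    ```python
    def expected_count(drug_total, event_total, grand_total):
        """Expected report count if drug and event were independent."""
        return drug_total * event_total / grand_total

    def shrunk_ratio(observed, expected, a=0.5, b=0.5):
        """Posterior mean of the rate ratio lambda, with lambda ~ Gamma(a, b)
        and observed ~ Poisson(lambda * expected):
        lambda | observed ~ Gamma(a + observed, b + expected)."""
        return (a + observed) / (b + expected)

    # Hypothetical totals: 12 reports of the event on the drug,
    # 400 reports for the drug, 150 for the event, 10000 overall
    e = expected_count(400, 150, 10000)
    print(e, 12 / e, shrunk_ratio(12, e))  # expected 6.0, raw ratio 2.0, shrunk ~1.92
    ```

    The shrinkage matters precisely for the rare outcomes that dominate safety surveillance: a 2/0.5 pair would give a raw ratio of 4 but a far more conservative shrunk estimate.
    
    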

  2. Line impedance estimation using model based identification technique

    DEFF Research Database (Denmark)

    Ciobotaru, Mihai; Agelidis, Vassilios; Teodorescu, Remus

    2011-01-01

    The estimation of the line impedance can be used by the control of numerous grid-connected systems, such as active filters, islanding detection techniques, non-linear current controllers, detection of the on/off grid operation mode. Therefore, estimating the line impedance can add extra functions...... into the operation of the grid-connected power converters. This paper describes a quasi passive method for estimating the line impedance of the distribution electricity network. The method uses the model based identification technique to obtain the resistive and inductive parts of the line impedance. The quasi...
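    The resistive and inductive parts of the line impedance can be recovered from sampled voltage and current with an ordinary least-squares fit of the discrete model v[k] = R·i[k] + L·Δi[k]/Δt. The sketch below uses synthetic data and a closed-form 2×2 solve; it illustrates the identification idea only, not the paper's quasi-passive excitation scheme:

    ```python
    import math

    def estimate_rl(v, i, dt):
        """Least-squares estimate of (R, L) from v[k] = R*i[k] + L*(i[k]-i[k-1])/dt."""
        s_ii = s_id = s_dd = s_iv = s_dv = 0.0
        for k in range(1, len(v)):
            di = (i[k] - i[k - 1]) / dt
            s_ii += i[k] * i[k]
            s_id += i[k] * di
            s_dd += di * di
            s_iv += i[k] * v[k]
            s_dv += di * v[k]
        det = s_ii * s_dd - s_id * s_id
        r = (s_iv * s_dd - s_dv * s_id) / det
        l = (s_ii * s_dv - s_id * s_iv) / det
        return r, l

    # Synthetic two-frequency current, voltage generated from R = 0.5 ohm, L = 1 mH
    dt = 1e-4
    i = [math.sin(2 * math.pi * 50 * k * dt) + 0.3 * math.sin(2 * math.pi * 250 * k * dt)
         for k in range(200)]
    v = [0.0] + [0.5 * i[k] + 1e-3 * (i[k] - i[k - 1]) / dt for k in range(1, 200)]
    print(estimate_rl(v, i, dt))  # approximately (0.5, 0.001)
    ```

    A multi-frequency excitation is what makes the two unknowns separable: with a single sinusoid, i and its derivative span the same information and the fit is poorly conditioned.
    
    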

  3. Characterization of the exogenous insert and development of event-specific PCR detection methods for genetically modified Huanong No. 1 papaya.

    Science.gov (United States)

    Guo, Jinchao; Yang, Litao; Liu, Xin; Guan, Xiaoyan; Jiang, Lingxi; Zhang, Dabing

    2009-08-26

    Genetically modified (GM) papaya (Carica papaya L.), Huanong No. 1, was approved for commercialization in Guangdong province, China in 2006, and the development of a Huanong No. 1 papaya detection method is necessary for implementing genetically modified organism (GMO) labeling regulations. In this study, we report the characterization of the exogenous integration of GM Huanong No. 1 papaya by means of conventional polymerase chain reaction (PCR) and thermal asymmetric interlaced (TAIL)-PCR strategies. The results suggested that one intact copy of the initial construct was integrated into the papaya genome, probably resulting in a deletion (38 bp in size) of the host genomic DNA. Also, an unintended insertion of a 92 bp truncated NptII fragment was observed at the 5' end of the exogenous insert. Furthermore, we revealed the 5' and 3' flanking sequences between the insert DNA and the papaya genomic DNA, and developed event-specific qualitative and quantitative PCR assays for GM Huanong No. 1 papaya based on the 5' integration flanking sequence. The relative limit of detection (LOD) of the qualitative PCR assay was about 0.01% in 100 ng of total papaya genomic DNA, corresponding to about 25 copies of the papaya haploid genome. In the quantitative PCR, the limits of detection and quantification (LOD and LOQ) were as low as 12.5 and 25 copies of the papaya haploid genome, respectively. In practical sample quantification, the quantification biases between the test and true values of three samples ranged from 0.44% to 4.41%. Collectively, we propose that all of these results are useful for the identification and quantification of Huanong No. 1 papaya and its derivatives.

  4. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    Science.gov (United States)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-02-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.

  5. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    Science.gov (United States)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-05-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.
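    The TCA's correlation-contrast idea described above can be sketched as follows: track the correlation of the radar signal between a pair of cells over a sliding window, and compare it with the mean correlation over earlier windows of the recent past. All signals, window lengths, and the arrival time below are synthetic illustrations, not the Tofino radar data:

    ```python
    import math
    import random

    def corr(x, y):
        """Pearson correlation of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        dx = math.sqrt(sum((a - mx) ** 2 for a in x))
        dy = math.sqrt(sum((b - my) ** 2 for b in y))
        return num / (dx * dy) if dx and dy else 0.0

    def contrast(cell_a, cell_b, k, win, ref_win):
        """Correlation over the window ending at sample k, minus the mean
        correlation over earlier windows spanning the reference period."""
        cur = corr(cell_a[k - win:k], cell_b[k - win:k])
        refs = [corr(cell_a[j - win:j], cell_b[j - win:j])
                for j in range(k - ref_win, k - win, win)]
        return cur - sum(refs) / len(refs)

    # Two cells: independent clutter, plus a common oscillation arriving at t = 200
    random.seed(7)
    wave = [0.0] * 200 + [3 * math.sin(0.3 * k) for k in range(100)]
    cell_a = [random.gauss(0, 1) + w for w in wave]
    cell_b = [random.gauss(0, 1) + w for w in wave]
    print(contrast(cell_a, cell_b, 200, 50, 150),
          contrast(cell_a, cell_b, 300, 50, 150))  # near 0 before arrival, large after
    ```

    Because the tsunami-induced current is coherent across neighboring cells while clutter is not, the contrast jumps on arrival without any inversion of radial surface currents, which is the point of the TCA.
    
    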

  6. Model-based inversion for the characterization of crack-like defects detected by ultrasound in a cladded component; Etude d'une methode d'inversion basee sur la simulation pour la caracterisation de fissures detectees par ultrasons dans un composant revetu

    Energy Technology Data Exchange (ETDEWEB)

    Haiat, G

    2004-03-01

    This work deals with the inversion of ultrasonic data. The industrial context of the study is the non-destructive evaluation of the internal walls of French reactor pressure vessels. These inspections aim at detecting and characterizing cracks. The ultrasonic data correspond to echographic responses obtained with a transducer acting in pulse-echo mode. Cracks are detected by the crack-tip diffraction effect. The analysis of measured data can become difficult because of the presence of a cladding, whose surface is irregular and whose constituting material differs from that of the reactor vessel. A model-based inverse method uses simulation of the propagation and diffraction of ultrasound, taking into account the irregular properties of the cladding surface as well as the heterogeneous nature of the component. The method developed was implemented and tested on a set of representative cases. Its performance was evaluated by the analysis of experimental results. The precision obtained in the laboratory on the experimental cases treated conforms with the industrial expectations motivating this study. (author)
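    One ingredient of such a model-based characterization is the forward time-of-flight model through the cladding, which can be inverted for the crack-tip depth. The sketch below assumes a much simpler geometry than the thesis (normal incidence, flat two-layer medium); all velocities and thicknesses are illustrative:

    ```python
    def tip_depth(t_echo, clad_thickness, v_clad, v_base):
        """Invert the two-layer pulse-echo time-of-flight model
        t = 2*(clad/v_clad + (z - clad)/v_base) for the tip depth z."""
        t_in_base = t_echo - 2 * clad_thickness / v_clad
        return clad_thickness + v_base * t_in_base / 2

    # Forward-simulate an echo time for a tip at z = 20 mm, then invert it
    v_clad, v_base, clad = 5.7, 5.9, 7.0  # mm/us, mm/us, mm (illustrative)
    t = 2 * (clad / v_clad + (20.0 - clad) / v_base)
    print(round(tip_depth(t, clad, v_clad, v_base), 6))  # 20.0
    ```

    Ignoring the velocity contrast (using v_base throughout) would bias the depth estimate, which is why the cladding must appear in the forward model.
    
    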

  7. Vehicle coordinated transportation dispatching model based on multiple crisis locations

    Science.gov (United States)

    Tian, Ran; Li, Shanwei; Yang, Guoying

    2018-05-01

    Unconventional emergencies often trigger multiple disastrous events, and the resource requirements of the affected sites often differ. It is difficult for a single emergency resource center to satisfy such requirements simultaneously. Therefore, coordinating the emergency resources stored at multiple emergency resource centers and delivering them to the various disaster sites requires the coordinated transportation of emergency vehicles. In this paper, addressing the emergency logistics coordination scheduling problem and based on the relevant constraints of emergency logistics transportation, an emergency resource scheduling model based on multiple disaster sites is established.

  8. To Fill or Not to Fill: Sensitivity Analysis of the Influence of Resolution and Hole Filling on Point Cloud Surface Modeling and Individual Rockfall Event Detection

    Directory of Open Access Journals (Sweden)

    Michael J. Olsen

    2015-09-01

    Full Text Available Monitoring unstable slopes with terrestrial laser scanning (TLS) has been proven effective. However, end users still struggle immensely with the efficient processing, analysis, and interpretation of the massive and complex TLS datasets. Two recent advances described in this paper now improve the ability to work with TLS data acquired on steep slopes. The first is the improved processing of TLS data to model complex topography and fill holes. This processing step results in a continuous topographic surface model that seamlessly characterizes the rock and soil surface. The second is an advance in the automated interpretation of the surface model in such a way that a magnitude and frequency relationship of rockfall events can be quantified, which can be used to assess maintenance strategies and forecast costs. The approach is applied to unstable highway slopes in the state of Alaska, U.S.A. to evaluate its effectiveness. Further, the influence of the selected model resolution and degree of hole filling on the derived slope metrics was analyzed. In general, model resolution plays a pivotal role in the ability to detect smaller rockfall events when developing magnitude-frequency relationships. The total volume estimates are also influenced by model resolution, but were comparatively less sensitive. In contrast, hole filling had a noticeable effect on magnitude-frequency relationships, but to a lesser extent than modeling resolution. However, hole filling yielded a modest increase in overall volumetric quantity estimates. Optimal analysis results occur when high modeling resolution is balanced with an appropriate level of hole filling.
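The magnitude-frequency relationship this record refers to can be sketched as a cumulative power-law fit over detected rockfall volumes. This is an illustrative reconstruction, not the authors' implementation: the function names and the log-log regression approach are assumptions.

```python
import numpy as np

def magnitude_frequency(volumes):
    """Cumulative magnitude-frequency curve: for each event volume,
    the number of events at least that large over the monitoring period."""
    v = np.sort(np.asarray(volumes, dtype=float))
    n = np.arange(len(v), 0, -1)          # count of events with volume >= v[i]
    return v, n

def power_law_fit(volumes):
    """Exponent b of the power law N(>=V) ~ a * V**(-b), fitted in log-log space."""
    v, n = magnitude_frequency(volumes)
    slope, _ = np.polyfit(np.log10(v), np.log10(n), 1)
    return -slope
```

On a synthetic Pareto-distributed sample the fitted exponent recovers the generating value, which is the kind of check one would run before applying the fit to real rockfall inventories.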

  9. Bond graph model-based fault diagnosis of hybrid systems

    CERN Document Server

    Borutzky, Wolfgang

    2015-01-01

    This book presents a bond graph model-based approach to fault diagnosis in mechatronic systems appropriately represented by a hybrid model. The book begins with a survey of the fundamentals of fault diagnosis and failure prognosis, then recalls state-of-the-art developments with reference to the latest publications, and goes on to discuss various bond graph representations of hybrid system models, equation formulation for switched systems, and simulation of their dynamic behavior. The structured text: • focuses on bond graph model-based fault detection and isolation in hybrid systems; • addresses isolation of multiple parametric faults in hybrid systems; • considers system mode identification; • provides a number of elaborated case studies that consider fault scenarios for switched power electronic systems commonly used in a variety of applications; and • indicates that bond graph modelling can also be used for failure prognosis. In order to facilitate the understanding of fault diagnosis and the presented...

  10. Establishing the functional connectivity of the frontotemporal network in pre-attentive change detection with Transcranial Magnetic Stimulation and event-related optical signal.

    Science.gov (United States)

    Tse, Chun-Yu; Long-Yin, Yip; Lui, Troby Ka-Yan; Xiao, Xue-Zhen; Wang, Yang; Chu, Winnie Chiu Wing; Parks, Nathan Allen; Chan, Sandra Sau-Man; Neggers, Sebastiaan Franciscus Wijnandus

    2018-06-18

    Current theories of pre-attentive deviant detection postulate that before the Superior Temporal Cortex (STC) detects a change, the Inferior Frontal Cortex (IFC) engages in stimulus analysis, which is particularly critical for ambiguous deviations (e.g., deviant preceded by a short train of standards). These theories rest on the assumption that IFC and STC are functionally connected, which has only been supported by correlational brain imaging studies. We examined this functional connectivity assumption by applying Transcranial Magnetic Stimulation (TMS) to disrupt IFC function, while measuring the later STC mismatch response with the event-related optical signal (EROS). EROS can localize brain activity in both spatial and temporal dimensions via measurement of optical property changes associated with neuronal activity, and is inert to the electromagnetic interference produced by TMS. Specifically, the STC mismatch response at 120-180 ms elicited by a deviant preceded by a short standard train when IFC TMS was applied at 80 ms was compared with the STC mismatch responses in temporal control (TMS with 200 ms delay), spatial control (sham TMS at vertex), auditory control (TMS pulse noise only), and cognitive control (deviant preceded by a long standard train) conditions. The STC mismatch response to deviants preceded by the short train was abolished by TMS of the IFC at 80 ms, while the STC responses remained intact in all other control conditions. These results confirm the involvement of the IFC in the STC mismatch response and support a functional connection between IFC and STC. Copyright © 2018. Published by Elsevier Inc.

  11. Detection of Ostreid herpesvirus-1 microvariants in healthy Crassostrea gigas following disease events and their possible role as reservoirs of infection.

    Science.gov (United States)

    Evans, Olivia; Hick, Paul; Whittington, Richard J

    2017-09-01

    Ostreid herpesvirus-1 microvariants (OsHV-1) cause severe mortalities in farmed Crassostrea gigas in Europe, New Zealand and Australia. Outbreaks are seasonal, recurring in the warmer months of the year in endemic estuaries. The reference genotype and microvariant genotypes of OsHV-1 have been previously detected in the tissues of apparently healthy adult oysters naturally exposed to OsHV-1 in the field. However, the role of such oysters as reservoirs of infection for subsequent mortality outbreaks remains unclear. The aims of this study were: (1) to identify the optimal sample type to use for the detection of OsHV-1 DNA in apparently healthy C. gigas; and (2) to assess whether live C. gigas maintained on-farm after an OsHV-1 related mortality event remain infected and could act as a reservoir host for subsequent outbreaks. OsHV-1 DNA was detected in the hemolymph, gill, mantle, adductor muscle, gonad and digestive gland of apparently healthy adult oysters. The likelihood of detecting OsHV-1 DNA in hemolymph was equivalent to that in gill and mantle, but the odds of detecting OsHV-1 DNA in hemolymph and gill were more than 8 times that of adductor muscle. Gill had the highest viral loads. Compared to testing whole gill homogenates, testing snippets of the gill improved the detection of OsHV-1 DNA by about four fold. The prevalence of OsHV-1 in gill and mantle was highest after the first season of OsHV-1 exposure; it then declined to low or negligible levels in the same cohorts in subsequent seasons, despite repeated seasonal exposure in monitoring lasting up to 4 years. The hemolymph of individually identified oysters was repeatedly sampled over 15 months, and OsHV-1 prevalence declined over that time frame in the youngest cohort, which had been exposed to OsHV-1 for the first time at the start of that season. In contrast, the prevalence in two cohorts of older oysters, which had been exposed to OsHV-1 in prior seasons, was consistently low (<10%). Viral loads were
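The comparison of sample types reported above ("the odds of detecting OsHV-1 DNA in hemolymph and gill were more than 8 times that of adductor muscle") is an odds ratio, which can be computed from detection counts as follows. The counts in the example are hypothetical, chosen only to illustrate the calculation.

```python
def odds_ratio(pos_a, neg_a, pos_b, neg_b):
    """Odds of detection in sample type A relative to sample type B,
    from positive/negative counts in each type."""
    return (pos_a / neg_a) / (pos_b / neg_b)
```

For example, 40 positive and 10 negative gill samples against 10 positive and 30 negative adductor muscle samples gives an odds ratio of 12.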

  12. Traceability in Model-Based Testing

    Directory of Open Access Journals (Sweden)

    Mathew George

    2012-11-01

    Full Text Available The growing complexities of software and the demand for shorter time to market are two important challenges that face today’s IT industry. These challenges demand the increase of both productivity and quality of software. Model-based testing is a promising technique for meeting these challenges. Traceability modeling is a key issue and challenge in model-based testing. Relationships between the different models will help to navigate from one model to another, and trace back to the respective requirements and the design model when the test fails. In this paper, we present an approach for bridging the gaps between the different models in model-based testing. We propose relation definition markup language (RDML for defining the relationships between models.

  13. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    Science.gov (United States)

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades-off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
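The cost-sensitive decision policy with abstention that the abstract describes can be sketched as follows. This is a toy stand-in, not the paper's derived policy: the threshold is the standard Bayes decision boundary for asymmetric misclassification costs, and the fixed abstain margin stands in for the paper's uncertainty-based confidence criteria.

```python
def decide(p_event, cost_fp=1.0, cost_fn=5.0, margin=0.1):
    """Cost-sensitive event decision with an abstain region.

    Alert when the expected cost of ignoring the event exceeds the cost
    of alerting; abstain when the estimated probability sits too close
    to the decision boundary to commit either way.
    """
    threshold = cost_fp / (cost_fp + cost_fn)   # Bayes boundary for asymmetric costs
    if abs(p_event - threshold) < margin:
        return "abstain"
    return "alert" if p_event > threshold else "no_alert"
```

With a false negative five times as costly as a false alarm, the boundary sits at 1/6, so even moderately small event probabilities trigger an alert, and estimates near 1/6 are deferred.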

  14. Model-based testing for embedded systems

    CERN Document Server

    Zander, Justyna; Mosterman, Pieter J

    2011-01-01

    What the experts have to say about Model-Based Testing for Embedded Systems: "This book is exactly what is needed at the exact right time in this fast-growing area. From its beginnings over 10 years ago of deriving tests from UML statecharts, model-based testing has matured into a topic with both breadth and depth. Testing embedded systems is a natural application of MBT, and this book hits the nail exactly on the head. Numerous topics are presented clearly, thoroughly, and concisely in this cutting-edge book. The authors are world-class leading experts in this area and teach us well-used

  15. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem that is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, investigations are made of model-based solutions to the signal enhancement problem for internal waves.

  16. Model Based Control of Reefer Container Systems

    DEFF Research Database (Denmark)

    Sørensen, Kresten Kjær

    This thesis is concerned with the development of model based control for the Star Cool refrigerated container (reefer) with the objective of reducing energy consumption. This project has been carried out under the Danish Industrial PhD programme and has been financed by Lodam together with the Da...

  17. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time, and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (let's call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. There have been two main approaches published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is supposed to be simpler and faster, while the second is able to handle much higher event traffic. In this article we will concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event out of the chip, the rest of the neurons in the array were
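The row-then-column arbitration idea can be illustrated with a toy sequential model. This is not Boahen's asynchronous circuit, just a software sketch with a fixed-priority policy standing in for the mutual-exclusion arbiter tree; the point is that colliding events are sequenced onto the bus rather than discarded.

```python
def arbitrate(pending):
    """Sequence colliding events onto the bus one at a time.

    `pending` is a set of (row, col) requests from neurons that fired
    simultaneously. Each loop iteration grants one request: rows are
    arbitrated first, then columns within the granted row, and the
    winner's address is emitted on the (simulated) bus.
    """
    bus = []
    while pending:
        row = min(r for r, _ in pending)               # row arbitration first
        col = min(c for r, c in pending if r == row)   # then column within that row
        bus.append((row, col))                         # address word placed on the bus
        pending.remove((row, col))
    return bus
```

Three simultaneous events are thus serialized into three address words; a collision-discarding generator would instead have dropped all but one of them.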

  18. Event monitoring of parallel computations

    Directory of Open Access Journals (Sweden)

    Gruzlikov Alexander M.

    2015-06-01

    Full Text Available The paper considers the monitoring of parallel computations for the detection of abnormal events. It is assumed that computations are organized according to an event model, and monitoring is based on specific test sequences.

  19. Service creation: a model-based approach

    NARCIS (Netherlands)

    Quartel, Dick; van Sinderen, Marten J.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a model-based approach to support service creation. In this approach, services are assumed to be created from (available) software components. The creation process may involve multiple design steps in which the requested service is repeatedly decomposed into more detailed

  20. Model based development of engine control algorithms

    NARCIS (Netherlands)

    Dekker, H.J.; Sturm, W.L.

    1996-01-01

    Model based development of engine control systems has several advantages. The development time and costs are strongly reduced because much of the development and optimization work is carried out by simulating both engine and control system. After optimizing the control algorithm it can be executed

  1. An acoustical model based monitoring network

    NARCIS (Netherlands)

    Wessels, P.W.; Basten, T.G.H.; Eerden, F.J.M. van der

    2010-01-01

    In this paper the approach for an acoustical model based monitoring network is demonstrated. This network is capable of reconstructing a noise map, based on the combination of measured sound levels and an acoustic model of the area. By pre-calculating the sound attenuation within the network the

  2. Approximation Algorithms for Model-Based Diagnosis

    NARCIS (Netherlands)

    Feldman, A.B.

    2010-01-01

    Model-based diagnosis is an area of abductive inference that uses a system model, together with observations about system behavior, to isolate sets of faulty components (diagnoses) that explain the observed behavior, according to some minimality criterion. This thesis presents a greedy