WorldWideScience

Sample records for adaptive anomaly detection

  1. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei; Guyet, Thomas; Quiniou, René ; Cordier, Marie-Odile; Masseglia, Florent; Zhang, Xiangliang

    2014-01-01

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject's behaviors through dynamic clustering of the streaming data. It automatically labels the data and adapts to changes in normal behavior while identifying anomalies. Two large real HTTP traffic streams collected in our institute, as well as a set of benchmark KDD'99 data, are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loève method, static AP, and three other static anomaly detection methods, namely k-NN, PCA and SVM.
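
    The clustering-based labeling idea can be sketched with scikit-learn's AffinityPropagation. This is an illustrative assumption, not the authors' implementation: the paper applies AP to streaming HTTP data with incremental self-updating, which this static sketch omits. Points falling in very small clusters are flagged as candidate anomalies.

```python
# Illustrative sketch only: static Affinity Propagation on synthetic features;
# the paper's framework additionally self-labels and updates over a stream.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
# Synthetic "request feature" vectors: two dense normal modes plus 3 outliers.
normal = np.vstack([rng.normal(0, 0.3, (100, 2)),
                    rng.normal(5, 0.3, (100, 2))])
outliers = rng.normal(10, 0.1, (3, 2))
X = np.vstack([normal, outliers])

ap = AffinityPropagation(random_state=0, max_iter=500).fit(X)
labels = ap.labels_

# Treat members of very small clusters as candidate anomalies.
counts = np.bincount(labels)
small = np.where(counts < 0.05 * len(X))[0]
is_anomaly = np.isin(labels, small)
```

    The 5% cluster-size cutoff is a hypothetical choice for the sketch; in the paper, labels are instead assigned and revised continuously as the stream evolves.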

  2. Autonomic intrusion detection: Adaptively detecting anomalies over unlabeled audit data streams in computer networks

    KAUST Repository

    Wang, Wei

    2014-06-22

    In this work, we propose a novel framework of autonomic intrusion detection that fulfills online and adaptive intrusion detection over unlabeled HTTP traffic streams in computer networks. The framework holds potential for self-managing: self-labeling, self-updating and self-adapting. Our framework employs the Affinity Propagation (AP) algorithm to learn a subject's behaviors through dynamic clustering of the streaming data. It automatically labels the data and adapts to changes in normal behavior while identifying anomalies. Two large real HTTP traffic streams collected in our institute, as well as a set of benchmark KDD'99 data, are used to validate the framework and the method. The test results show that the autonomic model achieves better results in terms of effectiveness and efficiency compared to the adaptive Sequential Karhunen–Loève method, static AP, and three other static anomaly detection methods, namely k-NN, PCA and SVM.

  3. Profile-based adaptive anomaly detection for network security.

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Pengchu C. (Sandia National Laboratories, Albuquerque, NM); Durgin, Nancy Ann

    2005-11-01

    As information systems become increasingly complex and pervasive, they become inextricably intertwined with the critical infrastructure of national, public, and private organizations. The problem of recognizing and evaluating threats against these complex, heterogeneous networks of cyber and physical components is a difficult one, yet a solution is vital to ensuring security. In this paper we investigate profile-based anomaly detection techniques that can be used to address this problem. We focus primarily on network anomaly detection, but the approach could be extended to other problem domains. We investigate using several data analysis techniques to create profiles of network hosts and perform anomaly detection using those profiles. The "profiles" reduce multi-dimensional vectors representing "normal behavior" into fewer dimensions, thus allowing pattern and cluster discovery. New events are compared against the profiles, producing a quantitative measure of how "anomalous" the event is. Most network intrusion detection systems (IDSs) detect malicious behavior by searching for known patterns in the network traffic. This approach suffers from several weaknesses, including a lack of generalizability, an inability to detect stealthy or novel attacks, and a lack of flexibility regarding alarm thresholds. Our research focuses on enhancing current IDS capabilities by addressing some of these shortcomings. We identify and evaluate promising techniques for data mining and machine learning. The algorithms are "trained" by providing them with a series of data points from "normal" network traffic. A successful algorithm can be trained automatically and efficiently, will have a low error rate (low false alarm and miss rates), and will be able to identify anomalies in "pseudo real-time" (i.e., while the intrusion is still in progress).

  4. Detecting an atomic clock frequency anomaly using an adaptive Kalman filter algorithm

    Science.gov (United States)

    Song, Huijie; Dong, Shaowu; Wu, Wenjun; Jiang, Meng; Wang, Weixiong

    2018-06-01

    The abnormal frequencies of an atomic clock mainly include frequency jumps and frequency drift jumps. Atomic clock frequency anomaly detection is a key technique in time-keeping. The Kalman filter algorithm, as a linear optimal algorithm, has been widely used in real-time detection of abnormal frequency. To obtain an optimal state estimation, the observation model and dynamic model of the Kalman filter algorithm should satisfy Gaussian white noise conditions; the detection performance is degraded if anomalies affect either model. The adaptive Kalman filter algorithm, applied to clock frequency anomaly detection, uses the residuals given by the prediction to build an adaptive factor, and the predicted state covariance matrix is corrected in real time by this factor. The results show that the model error is reduced and the detection performance is improved. The effectiveness of the algorithm is verified on a frequency jump simulation, a frequency drift jump simulation and measured atomic clock data by using the chi-square test.
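
    The residual-driven adaptation can be illustrated with a one-state random-walk model of the clock frequency (an assumed simplification, not the authors' exact formulation): a chi-square test on the normalized innovation flags a frequency jump, and the predicted covariance is then inflated by an adaptive factor so the filter re-converges quickly.

```python
# One-state sketch of residual-driven adaptation (assumed simplification, not
# the authors' exact formulation): a chi-square test on the innovation flags a
# frequency jump and inflates the predicted covariance by an adaptive factor.
import numpy as np

rng = np.random.default_rng(1)
n = 400
freq = np.zeros(n)
freq[200:] += 5e-13                      # injected frequency jump
y = freq + rng.normal(0, 1e-13, n)       # measured fractional frequency

q, r = 1e-28, 1e-26                      # process / measurement noise variances
x, p = 0.0, 1e-25                        # state estimate and its variance
alarms = []
for k in range(n):
    p_pred = p + q                       # predict (random-walk model)
    innov = y[k] - x                     # innovation (residual)
    s = p_pred + r
    t2 = innov ** 2 / s                  # chi-square statistic, 1 dof
    if t2 > 6.63:                        # 99% quantile of chi2(1)
        alarms.append(k)
        p_pred *= t2 / 6.63              # adaptive covariance inflation
        s = p_pred + r
    g = p_pred / s                       # Kalman gain
    x += g * innov                       # update
    p = (1 - g) * p_pred
```

    The noise variances and the 99% threshold are illustrative values; inflating the covariance after an alarm is what lets the state estimate track the new frequency within a few samples instead of lagging behind it.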

  5. Adaptive cancellation of geomagnetic background noise for magnetic anomaly detection using coherence

    International Nuclear Information System (INIS)

    Liu, Dunge; Xu, Xin; Huang, Chao; Zhu, Wanhua; Liu, Xiaojun; Fang, Guangyou; Yu, Gang

    2015-01-01

    Magnetic anomaly detection (MAD) is an effective method for detecting ferromagnetic targets against background magnetic fields. Currently, the performance of MAD systems is mainly limited by background geomagnetic noise. Several techniques have been developed to detect target signatures, such as the synchronous reference subtraction (SRS) method. In this paper, we propose an adaptive coherent noise suppression (ACNS) method capable of evaluating and detecting weak anomaly signals buried in background geomagnetic noise. Tests with real-world recorded magnetic signals show that the ACNS method can suppress background geomagnetic noise by about 21 dB or more in high-background environments. Additionally, as a generalization of the SRS method, the ACNS method offers appreciable advantages over existing algorithms: compared to SRS, it eliminates false target signals and improves noise suppression by 6.4 dB. These outcomes make the method a promising candidate for application in MAD systems. (paper)
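
    The paper's ACNS algorithm is coherence-based; as a plainly labeled stand-in for the same reference-sensor idea, a standard LMS adaptive filter can learn to predict the primary sensor from a remote reference and subtract the correlated geomagnetic component, leaving the local anomaly in the residual.

```python
# Reference-sensor noise cancellation sketched with a standard LMS adaptive
# filter (a stand-in for the paper's coherence-based ACNS method): weights
# learned on a remote reference predict and subtract the correlated noise.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
t = np.arange(n)
geo = np.sin(2 * np.pi * t / 500) + 0.3 * np.sin(2 * np.pi * t / 90)
target = 0.05 * np.exp(-0.5 * ((t - 2500) / 50.0) ** 2)   # weak anomaly
primary = geo + target + 0.01 * rng.normal(size=n)        # detection sensor
reference = 0.8 * geo + 0.01 * rng.normal(size=n)         # reference sensor

m, mu = 8, 0.01                     # filter length, LMS step size
w = np.zeros(m)
out = np.zeros(n)
for k in range(m, n):
    x = reference[k - m:k][::-1]    # most recent reference samples
    e = primary[k] - w @ x          # cancellation residual
    w += 2 * mu * e * x             # LMS weight update
    out[k] = e                      # residual keeps the local anomaly
```

    The key property shared with ACNS is that only noise coherent between the two sensors is removed; a target signature seen by the primary sensor alone survives in the residual.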

  6. Adaptive hidden Markov model with anomaly states for price manipulation detection.

    Science.gov (United States)

    Cao, Yi; Li, Yuhua; Coleman, Sonya; Belatreche, Ammar; McGinnity, Thomas Martin

    2015-02-01

    Price manipulation refers to the activities of traders who use carefully designed trading behaviors to manually push up or down the underlying equity prices for making profits. With increasing volumes and frequency of trading, price manipulation can be extremely damaging to the proper functioning and integrity of capital markets. The existing literature focuses on either empirical studies of market abuse cases or analysis of particular manipulation types based on certain assumptions. Effective approaches for analyzing and detecting price manipulation in real time are yet to be developed. This paper proposes a novel approach, called adaptive hidden Markov model with anomaly states (AHMMAS), for modeling and detecting price manipulation activities. Together with wavelet transformations and gradients as the feature extraction methods, the AHMMAS model caters to price manipulation detection and basic manipulation type recognition. The evaluation experiments, conducted on tick data for seven stocks from NASDAQ and the London Stock Exchange and on 10 stock price series simulated by stochastic differential equations, show that the proposed AHMMAS model can effectively detect price manipulation patterns and outperforms the selected benchmark models.

  7. Anomaly Detection in Sequences

    Data.gov (United States)

    National Aeronautics and Space Administration — We present a set of novel algorithms which we call sequenceMiner, that detect and characterize anomalies in large sets of high-dimensional symbol sequences that...

  8. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  9. Road Anomalies Detection System Evaluation.

    Science.gov (United States)

    Silva, Nuno; Shah, Vaibhav; Soares, João; Rodrigues, Helena

    2018-06-21

    Anomalies on road pavement cause discomfort to drivers and passengers, and may cause mechanical failure or even accidents. Governments spend millions of Euros every year on road maintenance, often causing traffic jams and congestion on urban roads on a daily basis. This paper analyses the difference between deploying a road anomaly detection and identification system in a “conditioned” setup and in a real-world setup, where the system performed worse than in the “conditioned” setup. It also presents a system performance analysis based on the training data sets; on the complexity of the attributes, through the application of PCA techniques; and on the attributes in the context of each anomaly type, using acceleration standard deviation attributes to observe how different anomaly classes are distributed in the Cartesian coordinate system. Overall, we describe the main insights on road anomaly detection challenges to support the design and deployment of a new iteration of our system, towards a road anomaly detection service that provides information about road conditions to drivers and government entities.

  10. Signal anomaly detection and characterization

    International Nuclear Information System (INIS)

    Morgenstern, V.M.; Upadhyaya, B.R.; Gloeckler, O.

    1988-08-01

    As part of a comprehensive signal validation system, we have developed a signal anomaly detector that flags anomalous behavior without specifically establishing its cause. A signal recorded from process instrumentation is said to have an anomaly if, during steady-state operation, the deviation in the level of the signal, its root-mean-square (RMS) value, or its statistical distribution changes by a preset value. This deviation could be an unacceptable increase or decrease in the quantity being monitored. An anomaly in a signal may be characterized by wideband or single-frequency noise, bias error, pulse-type error, nonsymmetric behavior, or a change in the signal bandwidth. Various signatures can easily be computed from data samples and compared against specified threshold values. We want to point out that in real processes, pulses can appear with different time widths and at different rates of change of the signal. Thus, in characterizing an anomaly as pulse-type, the narrowest detectable pulse width is constrained by the signal sampling interval: for example, if a signal is sampled at 100 Hz, we will not be able to detect pulses occurring at kHz rates. Discussions with utility and Combustion Engineering personnel indicated that it is not practical to detect pulses having a narrow time width. 9 refs., 11 figs., 8 tabs
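
    The windowed signatures described above (signal level and RMS compared against preset thresholds) can be sketched as follows; the window length and threshold values are illustrative assumptions, not the report's settings.

```python
# Illustrative windowed level/RMS anomaly signatures (window length and
# thresholds are assumptions for this sketch, not the report's values).
import numpy as np

rng = np.random.default_rng(3)
baseline = rng.normal(0.0, 1.0, 2000)       # steady-state reference record
mu0, rms0 = baseline.mean(), baseline.std()

def window_signatures(x, w=200):
    """Mean level and RMS per non-overlapping window of length w."""
    x = x[:len(x) // w * w].reshape(-1, w)
    return x.mean(axis=1), x.std(axis=1)

# Test signal with a bias-type anomaly from sample 1000 onward.
sig = rng.normal(0.0, 1.0, 2000)
sig[1000:] += 0.8
means, rmss = window_signatures(sig)
level_alarm = np.abs(means - mu0) > 3 * rms0 / np.sqrt(200)  # 3-sigma on mean
rms_alarm = np.abs(rmss - rms0) > 0.5 * rms0                 # preset RMS band
```

    A bias error trips the level signature while leaving the RMS signature quiet, which is exactly how the two signatures separate anomaly types.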

  11. Anomaly detection in diurnal data

    NARCIS (Netherlands)

    Mata, F.; Zuraniewski, P.W.; Mandjes, M.; Mellia, M.

    2014-01-01

    In this paper we present methodological advances in anomaly detection tailored to discover abnormal traffic patterns under the presence of seasonal trends in data. In our setup we impose specific assumptions on the traffic type and nature; our study features VoIP call counts, for which several

  12. Automatic detection of multiple UXO-like targets using magnetic anomaly inversion and self-adaptive fuzzy c-means clustering

    Science.gov (United States)

    Yin, Gang; Zhang, Yingtang; Fan, Hongbo; Ren, Guoquan; Li, Zhining

    2017-12-01

    We have developed a method for automatically detecting UXO-like targets based on magnetic anomaly inversion and self-adaptive fuzzy c-means clustering. Magnetic anomaly inversion methods are used to estimate the initial locations of multiple UXO-like sources. Although these initial locations have some errors with respect to the real positions, they form dense clouds around the actual positions of the magnetic sources. We then use the self-adaptive fuzzy c-means clustering algorithm to cluster these initial locations. The estimated number of cluster centroids represents the number of targets, and the cluster centroids are regarded as the locations of magnetic targets. The effectiveness of the method has been demonstrated using synthetic datasets. Computational results show that the proposed method can be applied to the case of several UXO-like targets randomly scattered within a confined, shallow subsurface volume. A field test was carried out to test the validity of the proposed method, and the experimental results show that the prearranged magnets can be detected unambiguously and located precisely.
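
    The clustering stage can be sketched with plain fuzzy c-means. This is illustrative only: the paper's self-adaptive variant also chooses the number of clusters c, and the inversion stage is replaced here by synthetic noisy location estimates.

```python
# Plain fuzzy c-means on synthetic "initial location estimates" (illustrative;
# the paper's self-adaptive variant also chooses the number of clusters c).
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # random initial memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]       # weighted means
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)           # membership update
    return centers, U

rng = np.random.default_rng(4)
# Noisy location estimates clustered around two "targets" at (0,0) and (3,1).
est = np.vstack([rng.normal([0.0, 0.0], 0.2, (50, 2)),
                 rng.normal([3.0, 1.0], 0.2, (50, 2))])
centers, U = fuzzy_c_means(est, c=2)
```

    The recovered centroids play the role of estimated target locations: each dense cloud of inversion outputs collapses to one centroid.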

  13. Network anomaly detection a machine learning perspective

    CERN Document Server

    Bhattacharyya, Dhruba Kumar

    2013-01-01

    With the rapid rise in the ubiquity and sophistication of Internet technology and the accompanying growth in the number of network attacks, network intrusion detection has become increasingly important. Anomaly-based network intrusion detection refers to finding exceptional or nonconforming patterns in network traffic data compared to normal behavior. Finding these anomalies has extensive applications in areas such as cyber security, credit card and insurance fraud detection, and military surveillance for enemy activities. Network Anomaly Detection: A Machine Learning Perspective presents mach

  14. Anomaly Detection in Dynamic Networks

    Energy Technology Data Exchange (ETDEWEB)

    Turcotte, Melissa [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-10-14

    Anomaly detection in dynamic communication networks has many important security applications. These networks can be extremely large and so detecting any changes in their structure can be computationally challenging; hence, computationally fast, parallelisable methods for monitoring the network are paramount. For this reason the methods presented here use independent node and edge based models to detect locally anomalous substructures within communication networks. As a first stage, the aim is to detect changes in the data streams arising from node or edge communications. Throughout the thesis simple, conjugate Bayesian models for counting processes are used to model these data streams. A second stage of analysis can then be performed on a much reduced subset of the network comprising nodes and edges which have been identified as potentially anomalous in the first stage. The first method assumes communications in a network arise from an inhomogeneous Poisson process with piecewise constant intensity. Anomaly detection is then treated as a changepoint problem on the intensities. The changepoint model is extended to incorporate seasonal behavior inherent in communication networks. This seasonal behavior is also viewed as a changepoint problem acting on a piecewise constant Poisson process. In a static time frame, inference is made on this extended model via a Gibbs sampling strategy. In a sequential time frame, where the data arrive as a stream, a novel, fast Sequential Monte Carlo (SMC) algorithm is introduced to sample from the sequence of posterior distributions of the change points over time. A second method is considered for monitoring communications in a large scale computer network. The usage patterns in these types of networks are very bursty in nature and don’t fit a Poisson process model. For tractable inference, discrete time models are considered, where the data are aggregated into discrete time periods and probability models are fitted to the

  15. Algorithms for Anomaly Detection - Lecture 1

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The concept of statistical anomalies, or outliers, has fascinated experimentalists since the earliest attempts to interpret data. We want to know why some data points don’t seem to belong with the others: perhaps we want to eliminate spurious or unrepresentative data from our model. Or, the anomalies themselves may be what we are interested in: an outlier could represent the symptom of a disease, an attack on a computer network, a scientific discovery, or even an unfaithful partner. We start with some general considerations, such as the relationship between clustering and anomaly detection, the choice between supervised and unsupervised methods, and the difference between global and local anomalies. Then we will survey the most representative anomaly detection algorithms, highlighting what kind of data each approach is best suited to, and discussing their limitations. We will finish with a discussion of the difficulties of anomaly detection in high-dimensional data and some new directions for anomaly detec...

  16. Algorithms for Anomaly Detection - Lecture 2

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    The concept of statistical anomalies, or outliers, has fascinated experimentalists since the earliest attempts to interpret data. We want to know why some data points don’t seem to belong with the others: perhaps we want to eliminate spurious or unrepresentative data from our model. Or, the anomalies themselves may be what we are interested in: an outlier could represent the symptom of a disease, an attack on a computer network, a scientific discovery, or even an unfaithful partner. We start with some general considerations, such as the relationship between clustering and anomaly detection, the choice between supervised and unsupervised methods, and the difference between global and local anomalies. Then we will survey the most representative anomaly detection algorithms, highlighting what kind of data each approach is best suited to, and discussing their limitations. We will finish with a discussion of the difficulties of anomaly detection in high-dimensional data and some new directions for anomaly detec...

  17. Implementation of anomaly detection algorithms for detecting transmission control protocol synchronized flooding attacks

    CSIR Research Space (South Africa)

    Mkuzangwe, NNP

    2015-08-01

    Full Text Available This work implements two anomaly detection algorithms for detecting Transmission Control Protocol synchronize (TCP SYN) flooding attacks. The two algorithms are an adaptive threshold algorithm and a cumulative sum (CUSUM) based algorithm...
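
    A minimal version of the second algorithm, a one-sided CUSUM on SYN-arrival counts, can be sketched as follows; the parameter values are illustrative assumptions, and the adaptive threshold algorithm is not shown.

```python
# Minimal one-sided CUSUM on SYN-arrival counts (parameter values are
# illustrative assumptions; the adaptive threshold algorithm is not shown).
import numpy as np

rng = np.random.default_rng(5)
counts = rng.poisson(20, 600).astype(float)   # SYN packets per interval
counts[300:] += rng.poisson(60, 300)          # flood begins at interval 300

mu0, slack, h = 20.0, 10.0, 50.0              # baseline mean, slack, threshold
s, alarm = 0.0, None
for t, x in enumerate(counts):
    s = max(0.0, s + (x - mu0 - slack))       # accumulate positive deviations
    if s > h and alarm is None:
        alarm = t                             # first crossing -> attack alarm
```

    The slack term keeps the statistic pinned at zero under normal load, so only a sustained excess of SYN arrivals accumulates to the decision threshold.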

  18. Data Mining for Anomaly Detection

    Science.gov (United States)

    Biswas, Gautam; Mack, Daniel; Mylaraswamy, Dinkar; Bharadwaj, Raj

    2013-01-01

    The Vehicle Integrated Prognostics Reasoner (VIPR) program describes methods for enhanced diagnostics as well as a prognostic extension to current state-of-the-art Aircraft Diagnostic and Maintenance Systems (ADMS). VIPR introduced a new anomaly detection function for discovering previously undetected and undocumented situations, where there are clear deviations from nominal behavior. Once a baseline (nominal model of operations) is established, the detection and analysis is split between on-aircraft outlier generation and off-aircraft expert analysis to characterize and classify events that may not have been anticipated by individual system providers. Offline expert analysis is supported by data curation and data mining algorithms that can be applied in the contexts of supervised and unsupervised learning. In this report, we discuss efficient methods to implement the Kolmogorov complexity measure using compression algorithms, and run a systematic empirical analysis to determine the best compression measure. Our experiments established that the combination of the DZIP compression algorithm and the CiDM distance measure provides the best results for capturing relevant properties of time series data encountered in aircraft operations. This combination was used as the basis for developing an unsupervised learning algorithm to define "nominal" flight segments using historical flight segments.

  19. Improved prenatal detection of chromosomal anomalies

    DEFF Research Database (Denmark)

    Frøslev-Friis, Christina; Hjort-Pedersen, Karina; Henriques, Carsten U

    2011-01-01

    Prenatal screening for karyotype anomalies takes place in most European countries. In Denmark, the screening method was changed in 2005. The aim of this study was to examine the trends in prevalence and prenatal detection rates of chromosome anomalies and Down syndrome (DS) over a 22-year period....

  20. Quantum machine learning for quantum anomaly detection

    Science.gov (United States)

    Liu, Nana; Rebentrost, Patrick

    2018-04-01

    Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.
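
    The classical counterpart of one of the two algorithms, the one-class support vector machine, can be sketched with scikit-learn; this is illustrative only, since the paper's contribution is the quantum analogues of these methods.

```python
# Classical one-class SVM sketch (the paper's contribution is the quantum
# analogues; dataset and parameters here are illustrative assumptions).
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(8)
train = rng.normal(0, 1, (300, 2))            # "normal" training data only
oc = OneClassSVM(nu=0.05, gamma='scale').fit(train)

test = np.vstack([rng.normal(0, 1, (5, 2)), [[6.0, 6.0]]])
pred = oc.predict(test)                       # +1 = normal, -1 = anomaly
```

    The model is trained on normal data alone and learns a boundary around it; the quantum versions aim to do the analogous computation with resources logarithmic in the state dimension.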

  1. Detection of cardiovascular anomalies: Hybrid systems approach

    KAUST Repository

    Ledezma, Fernando; Laleg-Kirati, Taous-Meriem

    2012-01-01

    In this paper, we propose a hybrid interpretation of the cardiovascular system. Based on a model proposed by Simaan et al. (2009), we study the problem of detecting cardiovascular anomalies that can be caused by variations in some physiological

  2. Residual generator for cardiovascular anomalies detection

    KAUST Repository

    Belkhatir, Zehor; Laleg-Kirati, Taous-Meriem; Tadjine, Mohamed

    2014-01-01

    This paper discusses the possibility of using observer-based approaches for cardiovascular anomalies detection and isolation. We consider a lumped parameter model of the cardiovascular system that can be written in a form of nonlinear state

  3. FLEAD: online frequency likelihood estimation anomaly detection for mobile sensing

    NARCIS (Netherlands)

    Le Viet Duc, L Duc; Scholten, Johan; Havinga, Paul J.M.

    With the rise of smartphone platforms, adaptive sensing becomes a predominant key to overcoming intricate constraints such as a smartphone's capabilities and dynamic data. One way to do this is estimating the event probability based on anomaly detection to invoke heavy processes, such as switching on

  4. Reducing customer minutes lost by anomaly detection?

    NARCIS (Netherlands)

    Bakker, M.; Vreeburg, J.H.G.; Rietveld, L.C.; van der Roer, M.

    2012-01-01

    A method that compares measured and predicted water demands to detect anomalies was developed and tested on three data sets covering three years of water demand, in which 25 pipe bursts were reported. The method proved to be able to detect bursts where the water loss exceeds 30% of the average water

  5. Anomaly-based Network Intrusion Detection Methods

    Directory of Open Access Journals (Sweden)

    Pavel Nevlud

    2013-01-01

    Full Text Available The article deals with the detection of network anomalies. Network anomalies include everything that differs markedly from normal operation. Machine learning systems were used for the detection of anomalies. Machine learning can be considered as a support for, or a limited type of, artificial intelligence. A machine learning system usually starts with some knowledge and a corresponding knowledge organization so that it can interpret, analyse, and test the knowledge acquired. Several machine learning techniques are available; we tested decision tree learning and Bayesian networks. The open-source data-mining framework WEKA was the tool we used for testing classification, clustering and association algorithms and for visualizing our results. WEKA is a collection of machine learning algorithms for data mining tasks.

  6. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  7. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  8. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study jointly fuses images of RX applied to normalized and unnormalized imagery and has a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance, which is highly variable depending on the relative number of candidate bright and dark targets and false alarms, controlled in this study by using vegetation and street line masks. The joint Boolean OR and AND operations also generate variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between OR and AND and has good target detection performance. In addition, normalizing the correlation coefficient and least squares yields new transforms related to canonical correlation analysis (CCA) and normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. Only RX detection applied to the unnormalized difference imagery provides adequate change detection performance.
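
    The RX algorithm referenced here is, in its global form, the squared Mahalanobis distance of each pixel spectrum from the scene mean; a minimal sketch on synthetic data follows (the study's fusion and normalization steps are not reproduced).

```python
# Global RX: squared Mahalanobis distance of each pixel spectrum from the
# scene mean (synthetic cube; the study's fusion/normalization steps omitted).
import numpy as np

rng = np.random.default_rng(6)
h, w, b = 40, 40, 5                           # height, width, spectral bands
cube = rng.multivariate_normal(np.zeros(b), np.eye(b), (h, w))
cube[20, 20] += 4.0                           # implanted anomalous pixel

X = cube.reshape(-1, b)
d = X - X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
rx = np.einsum('ij,jk,ik->i', d, cov_inv, d).reshape(h, w)
```

    Under Gaussian background the RX scores follow a chi-square distribution with one degree of freedom per band, which is why the study fits detection outputs with chi-square mixtures when converting them to probabilities.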

  9. A Semiparametric Model for Hyperspectral Anomaly Detection

    Science.gov (United States)

    2012-01-01

    treeline) in the presence of natural background clutter (e.g., trees, dirt roads, grasses). Each target consists of about 7 × 4 pixels, and each pixel...vehicles near the treeline in Cube 1 (Figure 1) constitute the target set, but, since anomaly detectors are not designed to detect a particular target

  10. Detection of cardiovascular anomalies: Hybrid systems approach

    KAUST Repository

    Ledezma, Fernando

    2012-06-06

    In this paper, we propose a hybrid interpretation of the cardiovascular system. Based on a model proposed by Simaan et al. (2009), we study the problem of detecting cardiovascular anomalies that can be caused by variations in some physiological parameters, using an observer-based approach. We present the first numerical results obtained. © 2012 IFAC.

  11. Anomaly Detection using the "Isolation Forest" algorithm

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Anomaly detection can provide clues about an outlying minority class in your data: hackers in a set of network events, fraudsters in a set of credit card transactions, or exotic particles in a set of high-energy collisions. In this talk, we analyze a real dataset of breast tissue biopsies, with malignant results forming the minority class. The "Isolation Forest" algorithm finds anomalies by deliberately “overfitting” models that memorize each data point. Since outliers have more empty space around them, they take fewer steps to memorize. Intuitively, a house in the country can be identified simply as “that house out by the farm”, while a house in the city needs a longer description like “that house in Brooklyn, near Prospect Park, on Union Street, between the firehouse and the library, not far from the French restaurant”. We first use anomaly detection to find outliers in the biopsy data, then apply traditional predictive modeling to discover rules that separate anomalies from normal data...
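
    A minimal scikit-learn sketch of the approach described (synthetic data stands in for the talk's biopsy dataset): outliers sit in emptier regions, so random splits isolate them in fewer steps, and the model scores them as anomalous.

```python
# Minimal Isolation Forest sketch (synthetic data stands in for the talk's
# biopsy dataset): outliers are isolated in fewer random splits.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (500, 2)),
               [[8.0, 8.0], [-9.0, 7.0], [10.0, -10.0]]])  # three outliers

iso = IsolationForest(random_state=0).fit(X)
pred = iso.predict(X)                          # -1 = anomaly, +1 = normal
```

    As the talk notes, a second, traditional classifier can then be trained on the -1/+1 labels to extract human-readable rules separating anomalies from normal data.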

  12. System and method for anomaly detection

    Science.gov (United States)

    Scherrer, Chad

    2010-06-15

    A system and method for detecting one or more anomalies in a plurality of observations is provided. In one illustrative embodiment, the observations are real-time network observations collected from a stream of network traffic. The method includes performing a discrete decomposition of the observations, and introducing derived variables to increase storage and query efficiencies. A mathematical model, such as a conditional independence model, is then generated from the formatted data. The formatted data is also used to construct frequency tables which maintain an accurate count of specific variable occurrence as indicated by the model generation process. The formatted data is then applied to the mathematical model to generate scored data. The scored data is then analyzed to detect anomalies.

  13. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Khadraoui, Sofiane

    2016-01-01

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been

  14. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    International Nuclear Information System (INIS)

    Detwiler, R.S.; Pfund, D.M.; Myjak, M.J.; Kulisek, J.A.; Seifert, C.E.

    2015-01-01

This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in background count rate, as well as spectral changes due to increased gamma-ray scatter when detecting at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. The methodology and results for background rejection methods optimized for aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are also shown. Results indicate that realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps, such as at land–water interfaces.

  15. Anomaly detection in wide area network mesh using two machine learning anomaly detection algorithms

    OpenAIRE

    Zhang, James; Vukotic, Ilija; Gardner, Robert

    2018-01-01

    Anomaly detection is the practice of identifying items or events that do not conform to an expected behavior or do not correlate with other items in a dataset. It has previously been applied to areas such as intrusion detection, system health monitoring, and fraud detection in credit card transactions. In this paper, we describe a new method for detecting anomalous behavior over network performance data, gathered by perfSONAR, using two machine learning algorithms: Boosted Decision Trees (BDT...

  16. Residual generator for cardiovascular anomalies detection

    KAUST Repository

    Belkhatir, Zehor

    2014-06-01

This paper discusses the possibility of using observer-based approaches for cardiovascular anomaly detection and isolation. We consider a lumped-parameter model of the cardiovascular system that can be written in the form of a nonlinear state-space representation. We show that residuals that are sensitive to variations in some cardiovascular parameters and to abnormal opening and closure of the valves can be generated. Since the whole state is not easily available for measurement, we propose to associate the residual generator with a robust extended Kalman filter. Numerical results performed on synthetic data are provided.

  17. An Anomaly Detection Algorithm of Cloud Platform Based on Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2016-01-01

Virtual machines (VMs) on a Cloud platform can be influenced by a variety of factors which can lead to decreased performance and downtime, affecting the reliability of the Cloud platform. Traditional anomaly detection algorithms and strategies for Cloud platforms have some flaws in their accuracy of detection, detection speed, and adaptability. In this paper, a dynamic and adaptive anomaly detection algorithm based on Self-Organizing Maps (SOM) for virtual machines is proposed. A unified SOM-based modeling method to detect machine performance within the detection region is presented, which avoids the cost of modeling each virtual machine individually and enhances the detection speed and reliability for large-scale virtual machines on a Cloud platform. The important parameters that affect the modeling speed are optimized in the SOM process to significantly improve the accuracy of the SOM modeling and therefore the anomaly detection accuracy for the virtual machine.

  18. Suboptimal processor for anomaly detection for system surveillance and diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Ciftcioglu, Oe.; Hoogenboom, J.E.; Dam, H. van

    1989-06-01

    Anomaly detection for nuclear reactor surveillance and diagnosis is described. The residual noise obtained as a result of autoregressive (AR) modelling is essential to obtain high sensitivity for anomaly detection. By means of the method of hypothesis testing a suboptimal anomaly detection processor is devised for system surveillance and diagnosis. Experiments are carried out to investigate the performance of the processor, which is in particular of interest for on-line and real-time applications.

  19. Clustering and Recurring Anomaly Identification: Recurring Anomaly Detection System (ReADS)

    Science.gov (United States)

    McIntosh, Dawn

    2006-01-01

This viewgraph presentation reviews the Recurring Anomaly Detection System (ReADS). The Recurring Anomaly Detection System is a tool to analyze text reports, such as aviation reports and maintenance records: (1) Text clustering algorithms group large quantities of reports and documents, reducing human error and fatigue; (2) Identifies interconnected reports, automating the discovery of possible recurring anomalies; (3) Provides a visualization of the clusters and recurring anomalies. We have illustrated our techniques on data from Shuttle and ISS discrepancy reports, as well as ASRS data. ReADS has been integrated with a secure online search

  20. Unsupervised Anomaly Detection for Liquid-Fueled Rocket Prop...

    Data.gov (United States)

    National Aeronautics and Space Administration — Title: Unsupervised Anomaly Detection for Liquid-Fueled Rocket Propulsion Health Monitoring. Abstract: This article describes the results of applying four...

  1. Signal anomaly detection using modified CUSUM [cumulative sum] method

    International Nuclear Information System (INIS)

    Morgenstern, V.; Upadhyaya, B.R.; Benedetti, M.

    1988-01-01

    An important aspect of detection of anomalies in signals is the identification of changes in signal behavior caused by noise, jumps, changes in band-width, sudden pulses and signal bias. A methodology is developed to identify, isolate and characterize these anomalies using a modification of the cumulative sum (CUSUM) approach. The new algorithm performs anomaly detection at three levels and is implemented on a general purpose computer. 7 refs., 4 figs
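The record's modified three-level CUSUM is not reproduced here; the sketch below is the standard two-sided CUSUM it builds on, which accumulates deviations from a target mean and alarms when either sum crosses a threshold. The slack `k`, threshold `h`, and the synthetic signal are illustrative assumptions:

```python
# Two-sided CUSUM for detecting a shift in signal mean.
import numpy as np

def cusum(x, target, k=0.5, h=8.0):
    """Return the first index where either CUSUM statistic exceeds h, else -1."""
    s_pos = s_neg = 0.0
    for i, xi in enumerate(x):
        s_pos = max(0.0, s_pos + (xi - target) - k)   # accumulates upward drift
        s_neg = max(0.0, s_neg - (xi - target) - k)   # accumulates downward drift
        if s_pos > h or s_neg > h:
            return i
    return -1

rng = np.random.default_rng(1)
signal = np.concatenate([rng.normal(0, 1, 100),   # in-control segment
                         rng.normal(3, 1, 50)])   # mean jumps to 3 at t = 100
alarm = cusum(signal, target=0.0)
print(alarm)
```

The alarm index lands a few samples after the true change point at t = 100; lowering `h` shortens that delay at the cost of more false alarms on the in-control segment.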

  2. Anomaly detection through information sharing under different topologies

    NARCIS (Netherlands)

    Gallos, Lazaros K.; Korczynski, M.T.; Fefferman, Nina H.

    2017-01-01

    Early detection of traffic anomalies in networks increases the probability of effective intervention/mitigation actions, thereby improving the stability of system function. Centralized methods of anomaly detection are subject to inherent constraints: (1) they create a communication burden on the

  3. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.

  4. Generative adversarial networks for anomaly detection in images

    OpenAIRE

    Batiste Ros, Guillem

    2018-01-01

Anomaly detection is used to identify abnormal observations that don't follow a normal pattern. In this work, we use the power of Generative Adversarial Networks in sampling from image distributions to perform anomaly detection with images and to identify local anomalous segments within these images. Also, we explore potential applications of this method to support pathological analysis of biological tissues

  5. Learning Multimodal Deep Representations for Crowd Anomaly Event Detection

    Directory of Open Access Journals (Sweden)

    Shaonian Huang

    2018-01-01

Anomaly event detection in crowd scenes is extremely important; however, the majority of existing studies merely use hand-crafted features to detect anomalies. In this study, a novel unsupervised deep learning framework is proposed to detect anomaly events in crowded scenes. Specifically, low-level visual features, energy features, and motion map features are simultaneously extracted based on spatiotemporal energy measurements. Three convolutional restricted Boltzmann machines are trained to model the mid-level feature representation of normal patterns. Then a multimodal fusion scheme is utilized to learn the deep representation of crowd patterns. Based on the learned deep representation, a one-class support vector machine model is used to detect anomaly events. The proposed method is evaluated using two available public datasets and compared with state-of-the-art methods. The experimental results show its competitive performance for anomaly event detection in video surveillance.

  6. An Entropy-Based Network Anomaly Detection Method

    Directory of Open Access Journals (Sweden)

    Przemysław Bereziński

    2015-04-01

Data mining is an interdisciplinary subfield of computer science involving methods at the intersection of artificial intelligence, machine learning and statistics. One of the data mining tasks is anomaly detection, which is the analysis of large quantities of data to identify items, events or observations which do not conform to an expected pattern. Anomaly detection is applicable in a variety of domains, e.g., fraud detection, fault detection, system health monitoring, but this article focuses on the application of anomaly detection in the field of network intrusion detection. The main goal of the article is to prove that an entropy-based approach is suitable to detect modern botnet-like malware based on anomalous patterns in the network. This aim is achieved by realization of the following points: (i) preparation of a concept of an original entropy-based network anomaly detection method, (ii) implementation of the method, (iii) preparation of an original dataset, (iv) evaluation of the method.
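The entropy signal such methods monitor can be sketched in a few lines: compute the Shannon entropy of a feature distribution (e.g. destination ports) per time window, and watch for a collapse in entropy when traffic concentrates on one value, as botnet-like activity tends to do. The window contents below are synthetic assumptions, not the article's dataset:

```python
# Shannon entropy of a categorical feature distribution per traffic window.
import math
from collections import Counter

def shannon_entropy(items):
    counts = Counter(items)
    n = len(items)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

normal_window = [80, 443, 22, 53, 8080, 443, 80, 25]   # diverse destination ports
attack_window = [445] * 7 + [80]                       # traffic piles onto one port

h_normal = shannon_entropy(normal_window)
h_attack = shannon_entropy(attack_window)
print(round(h_normal, 3), round(h_attack, 3))
```

A detector would track this entropy over time and flag windows whose value deviates sharply from the learned baseline, in either direction.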

  7. Detection Range of Airborne Magnetometers in Magnetic Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Chengjing Li

    2015-11-01

Airborne magnetometers are utilized for the small-range search, precise positioning, and identification of the ferromagnetic properties of underwater targets. As an important performance parameter of sensors, the detection range of airborne magnetometers is commonly set as a fixed value in the literature, regardless of the influences of environment noise, target magnetic properties, and platform features in the classical model for detecting airborne magnetic anomalies. As a consequence, deviation in detection ability analysis is observed. In this study, a novel detection range model is proposed on the basis of classic detection range models of airborne magnetometers. In this model, probability distribution is applied, and the magnetic properties of targets and the environment noise properties of a moving submarine are considered. The detection range model is also constructed by considering the distribution of the moving submarine during detection. A cell-averaging greatest-of constant false alarm rate test method is also used to calculate the detection range of the model at a desired false alarm rate. The detection range model is then used to establish typical submarine search probabilistic models. Results show that the model can be used to evaluate not only the effects of ambient magnetic noise but also the moving and geomagnetic features of the target and airborne detection platform. The model can also be utilized to display the actual operating range of sensor systems.
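The cell-averaging greatest-of (CA-GO) CFAR test mentioned above can be sketched generically: the noise level for each cell under test is estimated as the greater of the means of the leading and trailing reference windows, and the cell is declared a detection if it exceeds a scaled version of that estimate. Window sizes, the scale factor, and the exponential noise model are illustrative assumptions:

```python
# Minimal CA-GO CFAR sketch over a 1-D measurement series.
import numpy as np

def cago_cfar(x, n_ref=8, n_guard=2, scale=8.0):
    hits = []
    for i in range(n_ref + n_guard, len(x) - n_ref - n_guard):
        lead = x[i - n_guard - n_ref : i - n_guard]
        trail = x[i + n_guard + 1 : i + n_guard + 1 + n_ref]
        noise = max(lead.mean(), trail.mean())   # "greatest-of" noise estimate
        if x[i] > scale * noise:
            hits.append(i)
    return hits

rng = np.random.default_rng(5)
signal = rng.exponential(1.0, 200)   # fluctuating noise floor
signal[100] += 40.0                  # strong point anomaly
hits = cago_cfar(signal)
print(hits)
```

Taking the greater of the two window means keeps the false-alarm rate controlled at clutter edges, which is why the greatest-of variant suits backgrounds that step abruptly, as in aerial surveys.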

  8. A New Anomaly Detection System for School Electricity Consumption Data

    Directory of Open Access Journals (Sweden)

    Wenqiang Cui

    2017-11-01

Anomaly detection has been widely used in a variety of research and application domains, such as network intrusion detection, insurance/credit card fraud detection, health-care informatics, industrial damage detection, image processing and novel topic detection in text mining. In this paper, we focus on remote facilities management that identifies anomalous events in buildings by detecting anomalies in building electricity consumption data. We investigated five models within electricity consumption data from different schools to detect anomalies in the data. Furthermore, we proposed a hybrid model that combines polynomial regression and Gaussian distribution, which detects anomalies in the data with zero false negatives and an average precision higher than 91%. Based on the proposed model, we developed a data detection and visualization system for a facilities management company to detect and visualize anomalies in school electricity consumption data. The system was tested and evaluated by facilities managers. According to the evaluation, our system has improved the efficiency of facilities managers in identifying anomalies in the data.
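The hybrid idea of combining polynomial regression with a Gaussian model can be sketched as: fit a polynomial trend to a daily load curve, model the residuals as Gaussian, and flag readings whose residual falls outside mean ± 3 sigma. The load curve, noise, polynomial degree, and threshold are illustrative assumptions, not the paper's tuned settings:

```python
# Polynomial-regression + Gaussian-residual anomaly sketch for a daily profile.
import numpy as np

hours = np.arange(24, dtype=float)
# Smooth daily load curve plus noise; hour 18 is corrupted with a spike.
consumption = 50 + 30 * np.sin((hours - 6) * np.pi / 12) + \
              np.random.default_rng(2).normal(0, 1.0, 24)
consumption[18] += 25.0

coeffs = np.polyfit(hours, consumption, deg=6)        # polynomial trend
residuals = consumption - np.polyval(coeffs, hours)   # deviations from trend
mu, sigma = residuals.mean(), residuals.std()
anomalies = np.where(np.abs(residuals - mu) > 3 * sigma)[0]
print(anomalies)
```

In a deployment the trend and residual statistics would be fit on historical data per school, then applied to incoming readings.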

  9. Anomaly Detection in the Bitcoin System - A Network Perspective

    OpenAIRE

    Pham, Thai; Lee, Steven

    2016-01-01

The problem of anomaly detection has been studied for a long time, and many Network Analysis techniques have been proposed as solutions. Although some results appear to be quite promising, no method is clearly superior to the rest. In this paper, we particularly consider anomaly detection in the Bitcoin transaction network. Our goal is to detect which users and transactions are the most suspicious; in this case, anomalous behavior is a proxy for suspicious behavior. To this end, we use ...

  10. EUROCAT website data on prenatal detection rates of congenital anomalies

    NARCIS (Netherlands)

    Garne, Ester; Dolk, Helen; Loane, Maria; Boyd, Patricia A.

    2010-01-01

    The EUROCAT website www.eurocat-network.eu publishes prenatal detection rates for major congenital anomalies using data from European population-based congenital anomaly registers, covering 28% of the EU population as well as non-EU countries. Data are updated annually. This information can be

  11. EUROCAT website data on prenatal detection rates of congenital anomalies

    DEFF Research Database (Denmark)

    Garne, Ester; Dolk, Helen; Loane, Maria

    2010-01-01

    The EUROCAT website www.eurocat-network.eu publishes prenatal detection rates for major congenital anomalies using data from European population-based congenital anomaly registers, covering 28% of the EU population as well as non-EU countries. Data are updated annually. This information can be us...

  12. In-Flight Diagnosis and Anomaly Detection, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In flight diagnosis and anomaly detection is a difficult challenge that requires sufficient observation and real-time processing of health information. Our approach...

  13. Detection of sinkholes or anomalies using full seismic wave fields.

    Science.gov (United States)

    2013-04-01

    This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...

  14. Anomaly Detection and Diagnosis Algorithms for Discrete Symbols

    Data.gov (United States)

    National Aeronautics and Space Administration — We present a set of novel algorithms which we call sequenceMiner that detect and characterize anomalies in large sets of high-dimensional symbol sequences that arise...

  15. Condition Parameter Modeling for Anomaly Detection in Wind Turbines

    Directory of Open Access Journals (Sweden)

    Yonglong Yan

    2014-05-01

Data collected from the supervisory control and data acquisition (SCADA) system, used widely in wind farms to obtain operational and condition information about wind turbines (WTs), is of great significance for anomaly detection in wind turbines. The paper presents a novel model for wind turbine anomaly detection mainly based on SCADA data and a back-propagation neural network (BPNN) for automatic selection of the condition parameters. The SCADA data sets are determined through analysis of the cumulative probability distribution of wind speed and the relationship between output power and wind speed. The automatic BPNN-based parameter selection reduces redundant parameters for anomaly detection in wind turbines. Through investigation of cases of WT faults, the validity of the automatic parameter selection-based model for WT anomaly detection is verified.

  16. Solving a prisoner's dilemma in distributed anomaly detection

    Data.gov (United States)

    National Aeronautics and Space Administration — Anomaly detection has recently become an important problem in many industrial and financial applications. In several instances, the data to be analyzed for possible...

  17. Detecting Anomalies by Fusing Voice and Operations Data, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Our innovation will detect, in near real-time, NAS operational anomalies by uniquely combing with analytical methods our existing Microsoft Azure based TFMData...

  18. ANOMALY DETECTION IN NETWORKING USING HYBRID ARTIFICIAL IMMUNE ALGORITHM

    Directory of Open Access Journals (Sweden)

    D. Amutha Guka

    2012-01-01

Especially in today's network scenario, when computers are interconnected through the internet, the security of an information system is a very important issue. Because no system can be absolutely secure, the timely and accurate detection of anomalies is necessary. The main aim of this research paper is to improve anomaly detection by using a Hybrid Artificial Immune Algorithm (HAIA) which is based on Artificial Immune Systems (AIS) and Genetic Algorithms (GA). In this research work, the HAIA approach is used to develop a Network Anomaly Detection System (NADS). The detector set is generated by using GA and the anomalies are identified using the Negative Selection Algorithm (NSA), which is based on AIS. The HAIA algorithm is tested with the KDD Cup 99 benchmark dataset. The detection rate is used to measure the effectiveness of the NADS. The results and consistency of the HAIA are compared with earlier approaches and the results are presented. The proposed algorithm gives the best results when compared to the earlier approaches.
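The negative-selection component of such AIS approaches can be sketched generically: generate candidate detectors at random, censor any that match a "self" (normal) sample, and flag a new sample as anomalous if a surviving detector matches it. The 2-D feature space, matching radius, and detector count are illustrative assumptions, not the paper's GA-evolved detectors:

```python
# Minimal negative-selection sketch with Euclidean matching in the unit square.
import numpy as np

rng = np.random.default_rng(6)
self_set = rng.uniform(0.4, 0.6, (200, 2))   # "self" (normal) region samples
r = 0.1                                       # matching radius

detectors = []
while len(detectors) < 300:
    d = rng.uniform(0, 1, 2)
    # Keep only detectors that match no self sample (negative selection).
    if np.min(np.linalg.norm(self_set - d, axis=1)) > r:
        detectors.append(d)
detectors = np.array(detectors)

def is_anomalous(x):
    """A sample matched by any surviving detector is flagged as non-self."""
    return bool(np.min(np.linalg.norm(detectors - x, axis=1)) <= r)

print(is_anomalous(np.array([0.5, 0.5])), is_anomalous(np.array([0.05, 0.95])))
```

The censoring step leaves detectors covering only the non-self region, so coverage of that region (detector count and radius) governs the detection rate.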

  19. Hyperspectral Imagery Target Detection Using Improved Anomaly Detection and Signature Matching Methods

    National Research Council Canada - National Science Library

    Smetek, Timothy E

    2007-01-01

    This research extends the field of hyperspectral target detection by developing autonomous anomaly detection and signature matching methodologies that reduce false alarms relative to existing benchmark detectors...

  20. Anomaly detection using magnetic flux leakage technology

    Energy Technology Data Exchange (ETDEWEB)

    Rempel, Raymond G. [BJ Pipeline Inspection Services, Alberta (Canada)

    2005-07-01

    There are many aspects to properly assessing the integrity of a pipeline. In-line-Inspection (ILI) tools, in particular those that employ the advanced use of Magnetic Flux Leakage (MFL) technology, provide a valuable means of achieving required up-to-date knowledge of a pipeline. More prevalent use of High Resolution MFL In-Line-Inspection tools is growing the knowledge base that leads to more reliable and accurate identification of anomalies in a pipeline, thus, minimizing the need for expensive verification excavations. Accurate assessment of pipeline anomalies can improve the decision making process within an Integrity Management Program and excavation programs can then focus on required repairs instead of calibration or exploratory digs. Utilizing the information from an MFL ILI inspection is not only cost effective but, as well, can also prove to be an extremely valuable building block of a Pipeline Integrity Management Program. (author)

  1. Unsupervised Ensemble Anomaly Detection Using Time-Periodic Packet Sampling

    Science.gov (United States)

    Uchida, Masato; Nawata, Shuichi; Gu, Yu; Tsuru, Masato; Oie, Yuji

    We propose an anomaly detection method for finding patterns in network traffic that do not conform to legitimate (i.e., normal) behavior. The proposed method trains a baseline model describing the normal behavior of network traffic without using manually labeled traffic data. The trained baseline model is used as the basis for comparison with the audit network traffic. This anomaly detection works in an unsupervised manner through the use of time-periodic packet sampling, which is used in a manner that differs from its intended purpose — the lossy nature of packet sampling is used to extract normal packets from the unlabeled original traffic data. Evaluation using actual traffic traces showed that the proposed method has false positive and false negative rates in the detection of anomalies regarding TCP SYN packets comparable to those of a conventional method that uses manually labeled traffic data to train the baseline model. Performance variation due to the probabilistic nature of sampled traffic data is mitigated by using ensemble anomaly detection that collectively exploits multiple baseline models in parallel. Alarm sensitivity is adjusted for the intended use by using maximum- and minimum-based anomaly detection that effectively take advantage of the performance variations among the multiple baseline models. Testing using actual traffic traces showed that the proposed anomaly detection method performs as well as one using manually labeled traffic data and better than one using randomly sampled (unlabeled) traffic data.

  2. A Survey on Anomaly Based Host Intrusion Detection System

    Science.gov (United States)

    Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi

    2018-04-01

An intrusion detection system (IDS) is hardware, software or a combination of the two, for monitoring network or system activities to detect malicious signs. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of the system is to detect intrusions and raise alerts in a timely manner when a user attempts an intrusion. In these techniques, when the IDS finds an intrusion it sends an alert message to the system administrator. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Among existing anomaly detection techniques, each technique has relative strengths and weaknesses. The current state of practice in the field of anomaly-based intrusion detection is reviewed, and recent studies are surveyed. This survey provides a study of existing anomaly detection techniques, and of how the techniques used in one area can be applied in another application domain.

  3. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang

    2010-10-01

Detecting changes is a common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and grid-running logs. Toward Autonomic Grid Computing, adaptively detecting changes in a grid system can help to flag anomalies, clean the noise, and report new patterns. In this paper, we propose an approach for self-adaptive change detection based on the Page-Hinkley statistical test. It handles non-stationary distributions without assumptions on the data distribution or empirical setting of parameters. We validate the approach on the EGEE streaming jobs, and report its better performance in achieving higher accuracy compared to other change detection methods. Meanwhile, this change detection process could help to discover device faults which were not claimed in the system logs. © 2010 IEEE.

  4. A New Method for Early Anomaly Detection of BWR Instabilities

    International Nuclear Information System (INIS)

    Ivanov, K.N.

    2005-01-01

    The objective of the performed research is to develop an early anomaly detection methodology so as to enhance safety, availability, and operational flexibility of Boiling Water Reactor (BWR) nuclear power plants. The technical approach relies on suppression of potential power oscillations in BWRs by detecting small anomalies at an early stage and taking appropriate prognostic actions based on an anticipated operation schedule. The research utilizes a model of coupled (two-phase) thermal-hydraulic and neutron flux dynamics, which is used as a generator of time series data for anomaly detection at an early stage. The model captures critical nonlinear features of coupled thermal-hydraulic and nuclear reactor dynamics and (slow time-scale) evolution of the anomalies as non-stationary parameters. The time series data derived from this nonlinear non-stationary model serves as the source of information for generating the symbolic dynamics for characterization of model parameter changes that quantitatively represent small anomalies. The major focus of the presented research activity was on developing and qualifying algorithms of pattern recognition for power instability based on anomaly detection from time series data, which later can be used to formulate real-time decision and control algorithms for suppression of power oscillations for a variety of anticipated operating conditions. The research being performed in the framework of this project is essential to make significant improvement in the capability of thermal instability analyses for enhancing safety, availability, and operational flexibility of currently operating and next generation BWRs.

  5. Structural material anomaly detection system using water chemistry data

    International Nuclear Information System (INIS)

    Asakura, Yamato; Nagase, Makoto; Uchida, Shunsuke; Ohsumi, Katsumi.

    1992-01-01

    The concept of an advanced water chemistry diagnosis system for detection of anomalies and preventive maintenance of system components is proposed and put into a concrete form. Using the analogy to a medical inspection system, analyses of water chemistry change will make it possible to detect symptoms of anomalies in system components. Then, correlations between water chemistry change and anomaly occurrence in the components of the BWR primary cooling system are analyzed theoretically. These fragmentary correlations are organized and reduced to an algorithm for the on-line diagnosis system using on-line monitoring data, pH and conductivity. By using actual plant data, the on-line diagnosis model system is verified to be applicable for early and automatic finding of the anomaly cause and for timely supply of much diagnostic information to plant operators. (author)

  6. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools, the conventional PCA-based monitoring indices Hotelling's T2 and Q and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performances of the proposed methods were compared with those of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
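The Q-EWMA combination described above can be sketched with NumPy: compute the PCA squared-prediction-error (Q) statistic against a model fit on normal data, smooth it with an EWMA, and alarm when the smoothed value crosses a control limit. The data, smoothing constant, and the crude three-sigma limit are illustrative assumptions, not the authors' tuned settings:

```python
# Minimal Q-statistic + EWMA monitoring sketch on two correlated variables.
import numpy as np

rng = np.random.default_rng(3)
t = rng.normal(0, 1, (500, 1))
X_train = np.hstack([t, 2 * t]) + rng.normal(0, 0.1, (500, 2))  # normal data

mean = X_train.mean(axis=0)
Xc = X_train - mean
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:1].T                                    # retain one principal component

def q_stat(x):
    r = (x - mean) - P @ (P.T @ (x - mean))     # residual off the PCA subspace
    return float(r @ r)

q_train = np.array([q_stat(x) for x in X_train])
limit = q_train.mean() + 3 * q_train.std()      # crude control limit (assumption)

lam = 0.2                                       # EWMA smoothing constant
stream = np.vstack([X_train[:50], [[3.0, -6.0]] * 5])  # fault breaks the correlation
ewma, alarms = 0.0, []
for i, x in enumerate(stream):
    ewma = lam * q_stat(x) + (1 - lam) * ewma
    if ewma > limit:
        alarms.append(i)
print(alarms[:1])
```

The EWMA carries memory of recent Q values, which is what lets the combined chart accumulate evidence of the small shifts that a raw Q chart misses.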

  7. An incremental anomaly detection model for virtual machines

    Science.gov (United States)

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Besides, Cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which makes the algorithm exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate the detection time by taking into account the large-scale and highly dynamic features of virtual machines on a cloud platform. To demonstrate the effectiveness, experiments on a common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on a cloud platform. PMID:29117245
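The plain SOM baseline that IISOM improves on can be sketched as: train a small grid of weight vectors on normal samples, then score new samples by quantization error (distance to the best-matching unit), with large errors indicating anomalies. Grid size, learning-rate schedule, and data are assumptions; this is the vanilla algorithm, not the paper's IISOM variant:

```python
# Minimal SOM + quantization-error anomaly sketch.
import numpy as np

rng = np.random.default_rng(4)
train = rng.normal(0, 1, (500, 2))             # "normal" operating data

# 4x4 grid of units: fixed grid coordinates, randomly initialized weights.
grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
W = rng.normal(0, 1, (16, 2))

for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)                # decaying learning rate
    radius = 2.0 * (1 - epoch / 20) + 0.5      # decaying neighborhood radius
    for x in train:
        bmu = np.argmin(((W - x) ** 2).sum(axis=1))       # best-matching unit
        d = ((grid - grid[bmu]) ** 2).sum(axis=1)         # grid distance to BMU
        h = np.exp(-d / (2 * radius ** 2))                # neighborhood function
        W += lr * h[:, None] * (x - W)                    # pull units toward x

def quant_error(x):
    return float(np.sqrt(((W - x) ** 2).sum(axis=1).min()))

normal_err = quant_error(np.array([0.1, -0.2]))
anomaly_err = quant_error(np.array([8.0, 8.0]))
print(normal_err < anomaly_err)
```

The random initialization in this sketch is exactly what the paper's heuristic initialization replaces to cut training time.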

  8. An incremental anomaly detection model for virtual machines.

    Directory of Open Access Journals (Sweden)

    Hancui Zhang

    Full Text Available Self-Organizing Map (SOM) algorithm as an unsupervised learning method has been applied in anomaly detection due to its capabilities of self-organizing and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large-scale virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing characteristics, which leaves the algorithm with low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic features of virtual machines on a cloud platform. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on a cloud platform.

  9. Anomaly Detection In Additively Manufactured Parts Using Laser Doppler Vibrometery

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, Carlos A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-09-29

    Additively manufactured parts are susceptible to non-uniform structure caused by the unique manufacturing process. This can lead to structural weakness or catastrophic failure. Using laser Doppler vibrometry and frequency response analysis, non-contact detection of anomalies in additively manufactured parts may be possible. Preliminary tests show promise for small scale detection, but more future work is necessary.

  10. Approaches in anomaly-based network intrusion detection systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, S.; Di Pietro, R.; Mancini, L.V.

    2008-01-01

    Anomaly-based network intrusion detection systems (NIDSs) can take into consideration packet headers, the payload, or a combination of both. We argue that payload-based approaches are becoming the most effective methods to detect attacks. Nowadays, attacks aim mainly to exploit vulnerabilities at

  11. Approaches in Anomaly-based Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, Sandro

    Anomaly-based network intrusion detection systems (NIDSs) can take into consideration packet headers, the payload, or a combination of both. We argue that payload-based approaches are becoming the most effective methods to detect attacks. Nowadays, attacks aim mainly to exploit vulnerabilities at

  12. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of a probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the "(ab)normality" of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.

  13. Online Anomaly Energy Consumption Detection Using Lambda Architecture

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Iftikhar, Nadeem; Nielsen, Per Sieverts

    2016-01-01

    problem, which does data mining on a large amount of parallel data streams from smart meters. In this paper, we propose a supervised learning and statistical-based anomaly detection method, and implement a Lambda system using the in-memory distributed computing framework, Spark and its extension Spark...... of the lambda detection system....

  14. A hybrid approach for efficient anomaly detection using metaheuristic methods

    Directory of Open Access Journals (Sweden)

    Tamer F. Ghanem

    2015-07-01

    Full Text Available Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative-selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1%, outperforming competing machine learning algorithms.

  15. Identifying Threats Using Graph-based Anomaly Detection

    Science.gov (United States)

    Eberle, William; Holder, Lawrence; Cook, Diane

    Much of the data collected during the monitoring of cyber and other infrastructures is structural in nature, consisting of various types of entities and relationships between them. The detection of threatening anomalies in such data is crucial to protecting these infrastructures. We present an approach to detecting anomalies in a graph-based representation of such data that explicitly represents these entities and relationships. The approach consists of first finding normative patterns in the data using graph-based data mining and then searching for small, unexpected deviations from these normative patterns, assuming illicit behavior tries to mimic legitimate, normative behavior. The approach is evaluated using several synthetic and real-world datasets. Results show that the approach has high true-positive rates, low false-positive rates, and is capable of detecting complex structural anomalies in real-world domains including email communications, cellphone calls and network traffic.

  16. Anomalies

    International Nuclear Information System (INIS)

    Bardeen, W.A.

    1985-08-01

    Anomalies have a diverse impact on many aspects of physical phenomena. The role of anomalies in determining physical structure, from the amplitude for π0 decay to the foundations of superstring theory, will be reviewed. 36 refs

  17. Anomaly Detection Based on Sensor Data in Petroleum Industry Applications

    Directory of Open Access Journals (Sweden)

    Luis Martí

    2015-01-01

    Full Text Available Anomaly detection is the problem of finding patterns in data that do not conform to an a priori expected behavior. This is related to the problem in which some samples are distant, in terms of a given metric, from the rest of the dataset, where these anomalous samples are referred to as outliers. Anomaly detection has recently attracted the attention of the research community because of its relevance in real-world applications, like intrusion detection, fraud detection, fault detection and system health monitoring, among many others. Anomalies themselves can have a positive or negative nature, depending on their context and interpretation. However, in either case, it is important for decision makers to be able to detect them in order to take appropriate actions. The petroleum industry is one of the application contexts where these problems are present. The correct detection of such types of unusual information empowers the decision maker with the capacity to act on the system in order to correctly avoid, correct or react to the situations associated with them. In that application context, heavy extraction machines for pumping and generation operations, like turbomachines, are each intensively monitored by hundreds of sensors that send measurements at high frequency for damage prevention. In this paper, we propose a combination of yet another segmentation algorithm (YASA), a novel fast and high-quality segmentation algorithm, with a one-class support vector machine approach for efficient anomaly detection in turbomachines. The proposal is meant to deal with the aforementioned task and to cope with the lack of labeled training data. We perform a series of empirical studies comparing our approach to other methods applied to benchmark problems and a real-life application related to oil platform turbomachinery anomaly detection.
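The one-class SVM ingredient (without the YASA segmentation step, which is specific to the paper) can be illustrated with scikit-learn; the data and feature layout here are entirely synthetic:

```python
# One-class SVM anomaly detection: fit on normal sensor-segment features only,
# then classify new segments as +1 (normal) or -1 (anomaly). No anomaly labels
# are needed at training time, matching the "lack of labeled data" setting.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(300, 4))     # features of normal segments
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)

test = np.vstack([rng.normal(0.0, 1.0, size=(20, 4)),
                  rng.normal(6.0, 1.0, size=(5, 4))])   # 5 far-off outliers
pred = clf.predict(test)                          # +1 = normal, -1 = anomaly
print(pred[-5:])
```

The `nu` parameter bounds the fraction of training points treated as outliers, which is the main knob for tuning the false-alarm rate.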

  18. On-line intermittent connector anomaly detection

    Data.gov (United States)

    National Aeronautics and Space Administration — This paper investigates a non-traditional use of differential current sensor and current sensor to detect intermittent disconnection problems in connectors. An...

  19. Adaptive filtering and change detection

    CERN Document Server

    Gustafsson, Fredrik

    2003-01-01

    Adaptive filtering is a classical branch of digital signal processing (DSP). Industrial interest in adaptive filtering grows continuously with the increase in computer performance that allows ever more complex algorithms to be run in real-time. Change detection is a type of adaptive filtering for non-stationary signals and is also the basic tool in fault detection and diagnosis. Often considered as separate subjects Adaptive Filtering and Change Detection bridges a gap in the literature with a unified treatment of these areas, emphasizing that change detection is a natural extension

  20. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined as either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
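The KDE ingredient can be sketched in isolation (a minimal illustration on synthetic (angle, speed) samples; the IB clustering and entropy measure from the paper are omitted, and all names are hypothetical):

```python
# Model normal (angle, speed) samples with a kernel density estimate and score
# a trajectory by the mean log-density of its samples: infrequently observed
# shape/speed combinations score low.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# normal trajectories: headings near 0 rad, speeds near 1.0
angles = rng.normal(0.0, 0.3, 2000)
speeds = rng.normal(1.0, 0.2, 2000)
kde = gaussian_kde(np.vstack([angles, speeds]))

def traj_score(angle_seq, speed_seq):
    """Mean log-density of a trajectory's (angle, speed) samples."""
    return float(np.log(kde(np.vstack([angle_seq, speed_seq])) + 1e-12).mean())

normal_traj = traj_score(rng.normal(0.0, 0.3, 50), rng.normal(1.0, 0.2, 50))
odd_traj = traj_score(rng.normal(2.5, 0.3, 50), rng.normal(4.0, 0.2, 50))  # erratic
print(normal_traj > odd_traj)  # True: the erratic trajectory scores lower
```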

  1. Unsupervised topic discovery by anomaly detection

    OpenAIRE

    Cheng, Leon

    2013-01-01

    Approved for public release; distribution is unlimited. With the vast amount of information and public comment available online, it is of increasing interest to understand what is being said and what topics are trending online. Government agencies, for example, want to know what policies concern the public without having to look through thousands of comments manually. Topic detection provides automatic identification of topics in documents based on the information content and enhances many ...

  2. Detection of data taking anomalies for the ATLAS experiment

    CERN Document Server

    De Castro Vargas Fernandes, Julio; The ATLAS collaboration; Lehmann Miotto, Giovanna

    2015-01-01

    The physics signals produced by the ATLAS detector at the Large Hadron Collider (LHC) at CERN are acquired and selected by a distributed Trigger and Data AcQuisition (TDAQ) system, comprising a large number of hardware devices and software components. In this work, we focus on the problem of online detection of anomalies during the data taking period. Anomalies, in this context, are defined as unexpected behaviour of the TDAQ system that results in a loss of data taking efficiency; the causes of those anomalies may come from the TDAQ itself or from external sources. While the TDAQ system operates, it publishes several useful quantities (trigger rates, dead times, memory usage…). Such information over time creates a set of time series that can be monitored in order to detect (and react to) problems (or anomalies). Here, we approach TDAQ operation monitoring through a data quality perspective, i.e., an anomaly is seen as a loss of quality (an outlier) and it is reported: this information can be used to rea...

  3. Monitoring water supply systems for anomaly detection and response

    NARCIS (Netherlands)

    Bakker, M.; Lapikas, T.; Tangena, B.H.; Vreeburg, J.H.G.

    2012-01-01

    Water supply systems are vulnerable to damage caused by unintended or intended human actions, or due to aging of the system. In order to minimize the damages and the inconvenience for the customers, a software tool was developed to detect anomalies at an early stage, and to support the responsible

  4. Searching for Complex Patterns Using Disjunctive Anomaly Detection

    OpenAIRE

    Sabhnani, Maheshkumar; Dubrawski, Artur; Schneider, Jeff

    2013-01-01

    Objective Disjunctive anomaly detection (DAD) algorithm [1] can efficiently search across multidimensional biosurveillance data to find multiple simultaneously occurring (in time) and overlapping (across different data dimensions) anomalous clusters. We introduce extensions of DAD to handle rich cluster interactions and diverse data distributions. Introduction Modern biosurveillance data contains thousands of unique time series defined across various categorical dimensions (zipcode, age group...

  5. Hyperspectral anomaly detection using Sony PlayStation 3

    Science.gov (United States)

    Rosario, Dalton; Romano, João; Sepulveda, Rene

    2009-05-01

    We present a proof-of-principle demonstration using Sony's IBM Cell processor-based PlayStation 3 (PS3) to run, in near real-time, a hyperspectral anomaly detection algorithm (HADA) on real hyperspectral (HS) long-wave infrared imagery. The PS3 console proved to be ideal for doing precisely the kind of heavy computational lifting HS-based algorithms require, and the fact that it is a relatively open platform makes programming scientific applications feasible. The PS3 HADA is a unique parallel, random-sampling-based anomaly detection approach that does not require prior spectra of the clutter background. The PS3 HADA is designed to handle known underlying difficulties (e.g., target shape/scale uncertainties) often ignored in the development of autonomous anomaly detection algorithms. The effort is part of an ongoing cooperative contribution between the Army Research Laboratory and the Army's Armament, Research, Development and Engineering Center, which aims at demonstrating performance of innovative algorithmic approaches for applications requiring autonomous anomaly detection using passive sensors.

  6. Anomaly detection in real-time gross payment data

    NARCIS (Netherlands)

    Triepels, Ron; Daniels, Hennie; Heijmans, R.; Camp, Olivier; Filipe, Joaquim

    2017-01-01

    We discuss how an autoencoder can detect system-level anomalies in a real-time gross settlement system by reconstructing a set of liquidity vectors. A liquidity vector is an aggregated representation of the underlying payment network of a settlement system for a particular time interval.

  7. Anomaly detection in VoIP traffic with trends

    NARCIS (Netherlands)

    Mata, F.; Zuraniewski, P.W.; Mandjes, M.; Mellia, M.

    2012-01-01

    In this paper we present methodological advances in anomaly detection, which, among other purposes, can be used to discover abnormal traffic patterns under the presence of deterministic trends in data, given that specific assumptions about the traffic type and nature are met. A performance study of

  8. Anomalous human behavior detection: An Adaptive approach

    NARCIS (Netherlands)

    Leeuwen, C. van; Halma, A.; Schutte, K.

    2013-01-01

    Detection of anomalies (outliers or abnormal instances) is an important element in a range of applications such as fault, fraud, suspicious behavior detection and knowledge discovery. In this article we propose a new method for anomaly detection and tested its ability to detect anomalous

  9. Fuzzy Kernel k-Medoids algorithm for anomaly detection problems

    Science.gov (United States)

    Rustam, Z.; Talita, A. S.

    2017-07-01

    Intrusion Detection System (IDS) is an essential part of security systems to strengthen the security of information systems. IDS can be used to detect abuse by intruders who try to get into the network system in order to access and utilize the available data sources in the system. There are two approaches to IDS, Misuse Detection and Anomaly Detection (behavior-based intrusion detection). Fuzzy clustering-based methods have been widely used to solve Anomaly Detection problems. Besides using the fuzzy membership concept to assign an object to a cluster, other approaches, such as combining fuzzy and possibilistic memberships or feature-weighted methods, are also used. We propose Fuzzy Kernel k-Medoids, which combines fuzzy and possibilistic membership, as a powerful method to solve the anomaly detection problem, since in numerical experiments it is able to classify IDS benchmark data into five different classes simultaneously. We classify the IDS benchmark KDDCup'99 data set into five different classes simultaneously; the best performance, a clustering accuracy of 90.28 percent, was achieved using 30% of the data for training.

  10. HPNAIDM: The High-Performance Network Anomaly/Intrusion Detection and Mitigation System

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yan [Northwesten University

    2013-12-05

    Identifying traffic anomalies and attacks rapidly and accurately is critical for large network operators. With the rapid growth of network bandwidth, such as the next generation DOE UltraScience Network, and the fast emergence of new attacks/viruses/worms, existing network intrusion detection systems (IDS) are insufficient because they: • Are mostly host-based and not scalable to high-performance networks; • Are mostly signature-based and unable to adaptively recognize flow-level unknown attacks; • Cannot differentiate malicious events from unintentional anomalies. To address these challenges, we proposed and developed a new paradigm called the high-performance network anomaly/intrusion detection and mitigation (HPNAIDM) system. The new paradigm is significantly different from existing IDSes, with the following features (research thrusts): • Online traffic recording and analysis on high-speed networks; • Online adaptive flow-level anomaly/intrusion detection and mitigation; • An integrated approach for false-positive reduction. Our research prototype and evaluation demonstrate that the HPNAIDM system is highly effective and economically feasible. Beyond satisfying the pre-set goals, we even exceeded them significantly (see more details in the next section). Overall, our project produced 23 publications (2 book chapters, 6 journal papers and 15 peer-reviewed conference/workshop papers). Besides, we built a website for technique dissemination, which hosts two system prototype releases for the research community. We also filed a patent application and developed strong international and domestic collaborations spanning both academia and industry.

  11. Evaluation of Anomaly Detection Techniques for SCADA Communication Resilience

    OpenAIRE

    Shirazi, Syed Noor Ul Hassan; Gouglidis, Antonios; Syeda, Kanza Noor; Simpson, Steven; Mauthe, Andreas Ulrich; Stephanakis, Ioannis M.; Hutchison, David

    2016-01-01

    Attacks on critical infrastructures’ Supervisory Control and Data Acquisition (SCADA) systems are beginning to increase. They are often initiated by highly skilled attackers, who are capable of deploying sophisticated attacks to exfiltrate data or even to cause physical damage. In this paper, we rehearse the rationale for protecting against cyber attacks and evaluate a set of Anomaly Detection (AD) techniques in detecting attacks by analysing traffic captured in a SCADA network. For this purp...

  12. Development of anomaly detection models for deep subsurface monitoring

    Science.gov (United States)

    Sun, A. Y.

    2017-12-01

    Deep subsurface repositories are used for waste disposal and carbon sequestration. Monitoring deep subsurface repositories for potential anomalies is challenging, not only because the number of sensor networks and the quality of data are often limited, but also because of the lack of labeled data needed to train and validate machine learning (ML) algorithms. Although physical simulation models may be applied to predict anomalies (or the system's nominal state, for that matter), the accuracy of such predictions may be limited by inherent conceptual and parameter uncertainties. The main objective of this study was to demonstrate the potential of data-driven models for leakage detection in carbon sequestration repositories. Monitoring data collected during an artificial CO2 release test at a carbon sequestration repository were used, which include both scalar time series (pressure) and vector time series (distributed temperature sensing). For each type of data, separate online anomaly detection algorithms were developed using the baseline experiment data (no leak) and then tested on the leak experiment data. Performance of a number of different online algorithms was compared. Results show the importance of including contextual information in the dataset to mitigate the impact of reservoir noise and reduce the false positive rate. The developed algorithms were integrated into a generic Web-based platform for real-time anomaly detection.

  13. Boiling anomaly detection by various signal characterization methods

    International Nuclear Information System (INIS)

    Sakuma, M.; Kozma, R.; Kitamura, M.; Schoonewelle, H.; Hoogenboom, J.E.

    1996-01-01

    In order to detect anomalies at an early stage in complex dynamical systems like nuclear power plants, it is important to characterize various statistical features of the data acquired under normal operating conditions. In this paper, the concept of a hierarchical anomaly monitoring method is outlined, which is based on the diversification principle. In addition to the usual time and frequency domain analyses (FFT, APDF, MAR-SPRT), other analyses (wavelet, fractal, etc.) are performed. As soon as any inconsistency arises in the results of the analysis on the upper level, a thorough analysis is initiated. A comparison among these methods is performed and the efficiency of the diversification approach has been demonstrated through simulated boiling anomalies in nuclear reactors. (authors)

  14. Tactile sensor of hardness recognition based on magnetic anomaly detection

    Science.gov (United States)

    Xue, Lingyun; Zhang, Dongfang; Chen, Qingguang; Rao, Huanle; Xu, Ping

    2018-03-01

    Hardness, as one kind of tactile sensing, plays an important role in the field of intelligent robot applications such as gripping, agricultural harvesting, prosthetic hands and so on. Recently, with the rapid development of high-performance magnetic field sensing technology, a number of magnetic sensors have been developed for intelligent applications. The tunnel magnetoresistance (TMR) element, based on the magnetoresistance principle, works as the sensitive element to detect the magnetic field, and it has proven its excellent ability in weak magnetic detection. In this paper, a new method based on magnetic anomaly detection was proposed to detect hardness in a tactile way. The sensor is composed of an elastic body, a ferrous probe, a TMR element and a permanent magnet. When the elastic body embedded with the ferrous probe touches an object under a certain force, deformation of the elastic body will occur. Correspondingly, the ferrous probe will be forced to displace and the background magnetic field will be distorted. The distorted magnetic field was detected by the TMR element and the output signal at different times can be sampled. The slope of the magnetic signal with sampling time differs for objects of different hardness. The results indicated that the magnetic anomaly sensor can recognize hardness rapidly, within 150 ms after the tactile moment. The hardness sensor based on the magnetic anomaly detection principle proposed in this paper has the advantages of simple structure, low cost and rapid response, and it has shown great application potential in the field of intelligent robots.

  15. A comparison of three time-domain anomaly detection methods

    Energy Technology Data Exchange (ETDEWEB)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E. [Delft University of Technology (Netherlands). Interfaculty Reactor Institute

    1996-01-01

    Three anomaly detection methods based on a comparison of signal values with predictions from an autoregressive model are presented. These methods are: the extremes method, the χ² method and the sequential probability ratio test. The methods are used to detect a change of the standard deviation of the residual noise obtained from applying an autoregressive model. They are fast and can be used in on-line applications. For each method some important anomaly detection parameters are determined by calculation or simulation. These parameters are: the false alarm rate, the average time to alarm and, being of minor importance, the alarm failure rate. Each method is optimized with respect to the average time to alarm for a given value of the false alarm rate. The methods are compared with each other, resulting in the sequential probability ratio test being clearly superior. (author).

  16. A comparison of three time-domain anomaly detection methods

    International Nuclear Information System (INIS)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E.

    1996-01-01

    Three anomaly detection methods based on a comparison of signal values with predictions from an autoregressive model are presented. These methods are: the extremes method, the χ² method and the sequential probability ratio test. The methods are used to detect a change of the standard deviation of the residual noise obtained from applying an autoregressive model. They are fast and can be used in on-line applications. For each method some important anomaly detection parameters are determined by calculation or simulation. These parameters are: the false alarm rate, the average time to alarm and, being of minor importance, the alarm failure rate. Each method is optimized with respect to the average time to alarm for a given value of the false alarm rate. The methods are compared with each other, resulting in the sequential probability ratio test being clearly superior. (author)
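The sequential probability ratio test on residual noise can be sketched as follows (an illustrative restart variant for continuous monitoring, with hypothetical parameter values; not the paper's exact formulation):

```python
# SPRT on autoregressive residuals: test nominal residual std sigma0 against
# anomalous std sigma1 and alarm when the cumulative log-likelihood ratio
# crosses the upper Wald bound.
import numpy as np

def sprt_alarm(residuals, sigma0=1.0, sigma1=1.5, alpha=0.01, beta=0.01):
    """Return the first index at which the SPRT declares an anomaly, else None."""
    upper = np.log((1 - beta) / alpha)        # alarm threshold (Wald bound)
    lower = np.log(beta / (1 - alpha))
    llr = 0.0
    for i, r in enumerate(residuals):
        # log-likelihood ratio of N(0, sigma1^2) vs N(0, sigma0^2) for sample r
        llr += (np.log(sigma0 / sigma1)
                + r * r / 2.0 * (1.0 / sigma0 ** 2 - 1.0 / sigma1 ** 2))
        if llr >= upper:
            return i                          # anomaly declared
        llr = max(llr, lower)                 # clamp: restart instead of accepting H0
    return None

rng = np.random.default_rng(3)
noisy = rng.normal(0.0, 2.0, 500)             # residual std doubled: anomaly
print(sprt_alarm(noisy) is not None)
```

The clamp at the lower bound turns the one-shot test into a monitoring scheme; the trade-off between false alarm rate and average time to alarm is set through `alpha` and `beta`.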

  17. Detecting anomalies in crowded scenes via locality-constrained affine subspace coding

    Science.gov (United States)

    Fan, Yaxiang; Wen, Gongjian; Qiu, Shaohua; Li, Deren

    2017-07-01

    Video anomaly event detection is the process of finding an abnormal event deviation compared with the majority of normal or usual events. The main challenges are the high structural redundancy and the dynamic changes in the scenes of surveillance videos. To address these problems, we present a framework for anomaly detection and localization in videos that is based on locality-constrained affine subspace coding (LASC) and a model updating procedure. In our algorithm, LASC attempts to reconstruct the test sample from its top-k nearest subspaces, which are obtained by segmenting the normal sample space using a clustering method. A sample with a large reconstruction cost is detected as abnormal by setting a threshold. To adapt to scene changes over time, a model updating strategy is proposed. We experiment on two public datasets: the UCSD dataset and the Avenue dataset. The results demonstrate that our method achieves competitive performance at 700 fps on a single desktop PC.
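The reconstruction-cost idea can be sketched in a simplified form (cluster the normal samples, fit an affine subspace per cluster, and score a test sample by its nearest-subspace residual; this is a loose illustration of the principle, not the paper's LASC coding):

```python
# Nearest-subspace reconstruction cost for anomaly scoring on synthetic data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
# normal data: two flat (rank-2) clusters in 5-D
A = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5))
B = 5.0 + rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5))
X = np.vstack([A, B])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
subspaces = []
for c in range(2):
    pts = X[km.labels_ == c]
    mu = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - mu, full_matrices=False)
    subspaces.append((mu, Vt[:2]))            # affine basis of rank 2

def recon_cost(x):
    """Residual of x against its best-fitting cluster subspace."""
    costs = []
    for mu, basis in subspaces:
        proj = mu + (x - mu) @ basis.T @ basis
        costs.append(np.linalg.norm(x - proj))
    return min(costs)

normal_cost = recon_cost(A[0])                             # lies on a subspace
abnormal_cost = recon_cost(rng.normal(0.0, 4.0, size=5))   # off both subspaces
print(normal_cost < abnormal_cost)
```

Thresholding `recon_cost` then separates normal from abnormal samples, as in the abstract.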

  18. Anomaly Detection for Next-Generation Space Launch Ground Operations

    Science.gov (United States)

    Spirkovska, Lilly; Iverson, David L.; Hall, David R.; Taylor, William M.; Patterson-Hine, Ann; Brown, Barbara; Ferrell, Bob A.; Waterman, Robert D.

    2010-01-01

    NASA is developing new capabilities that will enable future human exploration missions while reducing mission risk and cost. The Fault Detection, Isolation, and Recovery (FDIR) project aims to demonstrate the utility of integrated vehicle health management (IVHM) tools in the domain of ground support equipment (GSE) to be used for the next generation launch vehicles. In addition to demonstrating the utility of IVHM tools for GSE, FDIR aims to mature promising tools for use on future missions and document the level of effort - and hence cost - required to implement an application with each selected tool. One of the FDIR capabilities is anomaly detection, i.e., detecting off-nominal behavior. The tool we selected for this task uses a data-driven approach. Unlike rule-based and model-based systems that require manual extraction of system knowledge, data-driven systems take a radically different approach to reasoning. At the basic level, they start with data that represent nominal functioning of the system and automatically learn expected system behavior. The behavior is encoded in a knowledge base that represents "in-family" system operations. During real-time system monitoring or during post-flight analysis, incoming data is compared to that nominal system operating behavior knowledge base; a distance representing deviation from nominal is computed, providing a measure of how far "out of family" current behavior is. We describe the selected tool for FDIR anomaly detection - Inductive Monitoring System (IMS), how it fits into the FDIR architecture, the operations concept for the GSE anomaly monitoring, and some preliminary results of applying IMS to a Space Shuttle GSE anomaly.

  19. Occurrence and Detectability of Thermal Anomalies on Europa

    Science.gov (United States)

    Hayne, Paul O.; Christensen, Philip R.; Spencer, John R.; Abramov, Oleg; Howett, Carly; Mellon, Michael; Nimmo, Francis; Piqueux, Sylvain; Rathbun, Julie A.

    2017-10-01

    Endogenic activity is likely on Europa, given its young surface age and ongoing tidal heating by Jupiter. Temperature is a fundamental signature of activity, as witnessed on Enceladus, where plumes emanate from vents with strongly elevated temperatures. Recent observations suggest the presence of similar water plumes at Europa. Even if plumes are uncommon, resurfacing may produce elevated surface temperatures, perhaps due to near-surface liquid water. Detecting endogenic activity on Europa is one of the primary mission objectives of NASA’s planned Europa Clipper flyby mission.Here, we use a probabilistic model to assess the likelihood of detectable thermal anomalies on the surface of Europa. The Europa Thermal Emission Imaging System (E-THEMIS) investigation is designed to characterize Europa’s thermal behavior and identify any thermal anomalies due to recent or ongoing activity. We define “detectability” on the basis of expected E-THEMIS measurements, which include multi-spectral infrared emission, both day and night.Thermal anomalies on Europa may take a variety of forms, depending on the resurfacing style, frequency, and duration of events: 1) subsurface melting due to hot spots, 2) shear heating on faults, and 3) eruptions of liquid water or warm ice on the surface. We use numerical and analytical models to estimate temperatures for these features. Once activity ceases, lifetimes of thermal anomalies are estimated to be 100 - 1000 yr. On average, Europa’s 10 - 100 Myr surface age implies a resurfacing rate of ~3 - 30 km2/yr. The typical size of resurfacing features determines their frequency of occurrence. For example, if ~100 km2 chaos features dominate recent resurfacing, we expect one event every few years to decades. Smaller features, such as double-ridges, may be active much more frequently. We model each feature type as a statistically independent event, with probabilities weighted by their observed coverage of Europa’s surface. Our results

  20. Anomaly-based intrusion detection for SCADA systems

    International Nuclear Information System (INIS)

    Yang, D.; Usynin, A.; Hines, J. W.

    2006-01-01

    Most critical infrastructure, such as chemical processing plants, electrical generation and distribution networks, and gas distribution, is monitored and controlled by Supervisory Control and Data Acquisition (SCADA) systems. These systems have been the focus of increased security concern, and there are fears that they could be the target of international terrorists. With the constantly growing number of internet-related computer attacks, there is evidence that our critical infrastructure may also be vulnerable. Researchers estimate that malicious online actions may have caused $75 billion in damage in 2007. One of the interesting countermeasures for enhancing information system security is called intrusion detection. This paper will briefly discuss the history of research in intrusion detection techniques and introduce the two basic detection approaches: signature detection and anomaly detection. Finally, it presents the application of techniques developed for monitoring critical process systems, such as nuclear power plants, to anomaly intrusion detection. The method uses an auto-associative kernel regression (AAKR) model coupled with the sequential probability ratio test (SPRT), applied to a simulated SCADA system. The results show that these methods can be generally used to detect a variety of common attacks. (authors)
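    A minimal sketch of the AAKR-plus-SPRT idea described above, assuming Gaussian residuals and a unit mean-shift alternative; the kernel bandwidth, noise parameters and reset-on-decision behavior are illustrative choices, not the paper's exact configuration:

```python
import numpy as np

def aakr_predict(memory, query, h=1.0):
    """Auto-associative kernel regression: reconstruct each query vector
    as a kernel-weighted average of fault-free training (memory) vectors."""
    d = np.linalg.norm(memory[None, :, :] - query[:, None, :], axis=2)
    w = np.exp(-(d / h) ** 2)
    w = w / w.sum(axis=1, keepdims=True)
    return w @ memory

def sprt_flags(residuals, m=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Sequential probability ratio test for a mean shift of size m in the
    prediction residuals; True marks a decision for the faulty hypothesis."""
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    llr, flags = 0.0, []
    for r in residuals:
        llr += (m / sigma**2) * (r - m / 2.0)   # Gaussian log-likelihood ratio
        if llr >= upper:
            flags.append(True); llr = 0.0       # alarm, restart the test
        elif llr <= lower:
            flags.append(False); llr = 0.0      # decide normal, restart
        else:
            flags.append(False)                 # keep sampling
    return np.array(flags)
```

    In use, residuals are the difference between sensed SCADA values and their AAKR reconstructions; persistent residuals drive the log-likelihood ratio across the alarm threshold.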

  1. Detecting Anomaly in Traffic Flow from Road Similarity Analysis

    KAUST Repository

    Liu, Xinran

    2016-06-02

    Taxis equipped with GPS devices can be regarded as 24-hour moving sensors widely distributed over urban road networks. Large numbers of accurate, real-time taxi trajectories are recorded by these devices and are commonly studied for understanding traffic dynamics. This paper focuses on anomaly detection in traffic volume, especially the non-recurrent traffic anomalies caused by unexpected or transient incidents, such as traffic accidents, celebrations and disasters. It is important to detect such sharp changes of traffic status for sensing abnormal events and mitigating their impact on the smooth flow of traffic. Unlike existing anomaly detection approaches, which mainly monitor the deviation of the current traffic status from its history, the proposed method evaluates the abnormality score of traffic on one road by comparing its current traffic volume not only with its own historical data but also with its neighbors. We define the neighbors as the roads that are close in the sense of both geo-location and traffic patterns, the latter extracted by matrix factorization. Evaluation on the trajectory data of 12,286 taxis over four weeks in Beijing shows that our approach outperforms other baseline methods with higher precision and recall.
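    The scoring step described above (deviation from both history and neighbors) might look like the following; the neighbor set is assumed to be given (in the paper it comes from matrix factorization), and the min-combination is one illustrative way of requiring deviation from both references:

```python
import numpy as np

def anomaly_score(current, history, neighbor_current):
    """Score a road's current volume against its own history and the
    current volumes of its neighbor roads (similar location and pattern)."""
    z_hist = abs(current - history.mean()) / (history.std() + 1e-9)
    z_nbr = abs(current - np.median(neighbor_current)) / (np.std(neighbor_current) + 1e-9)
    # A road is anomalous only if it deviates from BOTH references, so a
    # city-wide rush hour (neighbors also elevated) is not flagged.
    return min(z_hist, z_nbr)
```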

  2. Using Physical Models for Anomaly Detection in Control Systems

    Science.gov (United States)

    Svendsen, Nils; Wolthusen, Stephen

    Supervisory control and data acquisition (SCADA) systems are increasingly used to operate critical infrastructure assets. However, the inclusion of advanced information technology and communications components, along with elaborate control strategies, increases the threat surface for external and subversion-type attacks. The problems are exacerbated by site-specific properties of SCADA environments that make subversion detection impractical, and by sensor noise and feedback characteristics that degrade conventional anomaly detection systems. Moreover, potential attack mechanisms are ill-defined and may include both physical and logical aspects.

  3. Inflight and Preflight Detection of Pitot Tube Anomalies

    Science.gov (United States)

    Mitchell, Darrell W.

    2014-01-01

    The health and integrity of aircraft sensors play a critical role in aviation safety. Inaccurate or false readings from these sensors can lead to improper decision making, resulting in serious and sometimes fatal consequences. This project demonstrated the feasibility of using advanced data analysis techniques to identify anomalies in Pitot tubes resulting from blockage such as icing, moisture, or foreign objects. The core technology used in this project is referred to as noise analysis because it relates a sensor's response time to the dynamic component (noise) found in that sensor's signal. The technique uses the existing electrical signals of Pitot tube sensors, either as measured in flight or as induced during preflight preparation, to detect anomalies in the sensor readings. Analysis and Measurement Services Corporation (AMS Corp.) has routinely used this technology to determine the health of pressure transmitters in nuclear power plants; its application to the detection of aircraft anomalies is innovative. Instead of determining the health of process monitoring at a steady-state condition, this technology can quickly inform the pilot when an air-speed indication becomes faulty under any flight condition as well as during preflight preparation.
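    The noise-analysis idea (a blocked, sluggish Pitot line attenuates the high-frequency component of the pressure signal) can be sketched as follows; the window length, the first-difference noise proxy and the 0.5 ratio are illustrative assumptions, not AMS Corp.'s actual procedure:

```python
import numpy as np

def noise_level(signal, window=50):
    """High-frequency 'noise' content per window: the standard deviation
    of first differences, which suppresses slow process variation."""
    diffs = np.diff(signal)
    n = len(diffs) // window
    return np.array([diffs[i * window:(i + 1) * window].std() for i in range(n)])

def flag_sluggish(signal, baseline_noise, window=50, ratio=0.5):
    """Flag windows whose noise falls below a fraction of the healthy
    baseline -- consistent with a blocked, slow-responding Pitot line."""
    return noise_level(signal, window) < ratio * baseline_noise
```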

  4. Towards Reliable Evaluation of Anomaly-Based Intrusion Detection Performance

    Science.gov (United States)

    Viswanathan, Arun

    2012-01-01

    This report describes the results of research into the effects of environment-induced noise on the evaluation process for anomaly detectors in the cyber security domain. This research was conducted during a 10-week summer internship program from the 19th of August, 2012 to the 23rd of August, 2012 at the Jet Propulsion Laboratory in Pasadena, California. The research performed lies within the larger context of the Los Angeles Department of Water and Power (LADWP) Smart Grid cyber security project, a Department of Energy (DoE) funded effort involving the Jet Propulsion Laboratory, California Institute of Technology and the University of Southern California/Information Sciences Institute. The results of the present effort constitute an important contribution towards building more rigorous evaluation paradigms for anomaly-based intrusion detectors in complex cyber-physical systems such as the Smart Grid. Anomaly detection is a key strategy for cyber intrusion detection; it operates by identifying deviations from profiles of nominal behavior and is thus conceptually appealing for detecting "novel" attacks. Evaluating the performance of such a detector requires assessing: (a) how well it captures the model of nominal behavior, and (b) how well it detects attacks (deviations from normality). Current evaluation methods produce results that give insufficient insight into the operation of a detector, inevitably resulting in a poor characterization of a detector's performance. In this work, we first describe a preliminary taxonomy of key evaluation constructs that are necessary for establishing rigor in the evaluation regime of an anomaly detector. We then focus on clarifying the impact of the operational environment on the manifestation of attacks in monitored data. We show how dynamic and evolving environments can introduce high variability into the data stream, perturbing detector performance. Prior research has focused on understanding the impact of this

  5. Rule-based expert system for maritime anomaly detection

    Science.gov (United States)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
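    A toy version of such a rule-based engine; the situational facts, rule labels and thresholds below are invented for illustration and are far simpler than an operational maritime rule base:

```python
# Each rule maps a predicate over a vessel "track" (a dict of situational
# facts) to an anomaly label; the names and thresholds are illustrative only.
RULES = [
    ("AIS transponder off in coastal zone",
     lambda t: not t["ais_on"] and t["dist_to_coast_km"] < 20),
    ("loitering in shipping lane",
     lambda t: t["speed_kn"] < 1 and t["in_shipping_lane"]),
    ("rendezvous at sea",
     lambda t: t["speed_kn"] < 2 and t["nearest_vessel_km"] < 0.5
               and t["dist_to_coast_km"] > 50),
]

def assess(track):
    """Return the labels of all rules that fire for this track, giving the
    operator a short list of indications and warnings."""
    return [label for label, pred in RULES if pred(track)]
```

    In a full system the facts themselves (distance to coast, lane membership, nearest vessel) would be inferred from fused sensor data rather than supplied directly.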

  6. Detection of Cardiovascular Anomalies: An Observer-Based Approach

    KAUST Repository

    Ledezma, Fernando

    2012-07-01

    In this thesis, a methodology for the detection of anomalies in the cardiovascular system is presented. The cardiovascular system is one of the most fascinating and complex physiological systems. Nowadays, cardiovascular diseases constitute one of the most important causes of mortality in the world; for instance, an estimated 17.3 million people died from cardiovascular diseases in 2008. Therefore, many studies have been devoted to modeling the cardiovascular system in order to better understand its behavior and find new, reliable diagnosis techniques. The lumped-parameter model of the cardiovascular system proposed in [1] is restructured using a hybrid systems approach in order to include a discrete input vector that represents the influence of the mitral and aortic valves in the different phases of the cardiac cycle. Starting from this model, a Taylor expansion around the nominal values of a vector of parameters is conducted. This expansion serves as the foundation for a component fault detection process that detects changes in the physiological parameters of the cardiovascular system, which could be associated with cardiovascular anomalies such as atherosclerosis, aneurysm, high blood pressure, etc. An Extended Kalman Filter is used in order to achieve a joint estimation of the state vector and the changes in the considered parameters. Finally, a bank of filters is used, as in [2], in order to detect the appearance of heart valve diseases, particularly stenosis and regurgitation. The first numerical results obtained are presented.
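    The joint state/parameter estimation idea can be sketched with a scalar stand-in for the cardiovascular model: an EKF on the augmented state [x, a], where a plays the role of a physiological parameter to be tracked; the dynamics and tuning values are illustrative, not the hemodynamic model of [1]:

```python
import numpy as np

def ekf_joint(ys, q_x=1e-3, q_a=1e-4, r=0.1, x0=0.0, a0=0.5):
    """EKF on the augmented state z = [x, a] for x_{k+1} = a*x_k + w, y = x + v.
    A sustained change in the parameter a would show up in the a-estimate
    track, which is what a component fault detector monitors."""
    z = np.array([x0, a0])
    P = np.eye(2)
    Q = np.diag([q_x, q_a])
    H = np.array([[1.0, 0.0]])
    a_hist = []
    for y in ys:
        x, a = z
        F = np.array([[a, x], [0.0, 1.0]])   # Jacobian of the augmented dynamics
        z = np.array([a * x, a])             # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + r                  # innovation covariance
        K = (P @ H.T) / S                    # Kalman gain
        z = z + (K * (y - z[0])).ravel()     # update with measurement
        P = (np.eye(2) - K @ H) @ P
        a_hist.append(z[1])
    return np.array(a_hist)
```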

  7. Detection of Anomalies in Hydrometric Data Using Artificial Intelligence Techniques

    Science.gov (United States)

    Lauzon, N.; Lence, B. J.

    2002-12-01

    This work focuses on the detection of anomalies in hydrometric data sequences, such as 1) outliers, which are individual data having statistical properties that differ from those of the overall population; 2) shifts, which are sudden changes over time in the statistical properties of the historical records of data; and 3) trends, which are systematic changes over time in the statistical properties. For the purpose of the design and management of water resources systems, it is important to be aware of these anomalies in hydrometric data, for they can induce a bias in the estimation of water quantity and quality parameters. These anomalies may be viewed as specific patterns affecting the data, and therefore pattern recognition techniques can be used for identifying them. However, the number of possible patterns is very large for each type of anomaly and consequently large computing capacities are required to account for all possibilities using the standard statistical techniques, such as cluster analysis. Artificial intelligence techniques, such as the Kohonen neural network and fuzzy c-means, are clustering techniques commonly used for pattern recognition in several areas of engineering and have recently begun to be used for the analysis of natural systems. They require much less computing capacity than the standard statistical techniques, and therefore are well suited for the identification of outliers, shifts and trends in hydrometric data. This work constitutes a preliminary study, using synthetic data representing hydrometric data that can be found in Canada. The analysis of the results obtained shows that the Kohonen neural network and fuzzy c-means are reasonably successful in identifying anomalies. This work also addresses the problem of uncertainties inherent to the calibration procedures that fit the clusters to the possible patterns for both the Kohonen neural network and fuzzy c-means. Indeed, for the same database, different sets of clusters can be

  8. Automatic metal parts inspection: Use of thermographic images and anomaly detection algorithms

    Science.gov (United States)

    Benmoussat, M. S.; Guillaume, M.; Caulier, Y.; Spinnler, K.

    2013-11-01

    A fully automatic approach based on induction thermography and detection algorithms is proposed to inspect industrial metallic parts containing different surface and sub-surface anomalies such as open cracks and open and closed notches of different sizes and depths. A practical experimental setup is developed in which lock-in and pulsed thermography (LT and PT, respectively) techniques are used to establish a dataset of thermal images for three different mockups. Data cubes are constructed by stacking up the temporal sequence of thermogram images. After the dimension of the data space is reduced by means of denoising and dimensionality reduction methods, anomaly detection algorithms are applied to the reduced data cubes. The dimensions of the reduced data spaces are calculated automatically with an arbitrary criterion. The results show that, when reduced data cubes are used, anomaly detection algorithms originally developed for hyperspectral data, namely the well-known Reed and Xiaoli Yu detector (RX) and the regularized adaptive RX (RARX), give good detection performance for both surface and sub-surface defects in a non-supervised way.
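    The global RX detector applied to such a data cube is essentially a per-pixel Mahalanobis distance from the scene statistics; a minimal sketch (the regularized adaptive RARX variant is not shown):

```python
import numpy as np

def rx_scores(cube):
    """Global RX anomaly detector: Mahalanobis distance of every pixel
    spectrum from the scene mean (cube shape: height x width x bands)."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(b)  # small ridge for stability
    inv = np.linalg.inv(cov)
    d = X - mu
    return np.einsum('ij,jk,ik->i', d, inv, d).reshape(h, w)
```

    Thresholding the score map then separates defect pixels from the background without supervision.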

  9. Detecting Traffic Anomalies in Urban Areas Using Taxi GPS Data

    Directory of Open Access Journals (Sweden)

    Weiming Kuang

    2015-01-01

    Full Text Available Large-scale GPS data contain hidden information and provide us with the opportunity to discover knowledge that may be useful for transportation systems using advanced data mining techniques. In major metropolitan cities, many taxicabs are equipped with GPS devices. Because taxis operate continuously for nearly 24 hours per day, they can be used as reliable sensors of the perceived traffic state. In this paper, the entire city was divided into subregions by roads, and taxi GPS data were transformed into traffic flow data to build a traffic flow matrix. In addition, a highly efficient anomaly detection method was proposed based on wavelet transform and PCA (principal component analysis) for detecting anomalous traffic events in urban regions. A traffic anomaly is considered to occur in a subregion when the values of the corresponding indicators deviate significantly from their expected values. This method was evaluated using a GPS dataset generated by more than 15,000 taxis over a period of half a year in Harbin, China. The results show that this detection method is effective and efficient.
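    The PCA part of such a detector can be sketched as residual-subspace analysis of the traffic-flow matrix (the wavelet pre-processing step is omitted here, and the component count is an assumption):

```python
import numpy as np

def pca_residual_energy(flow_matrix, n_components=2):
    """Project each time sample of a (time x regions) traffic-flow matrix
    onto the top principal components; anomalies concentrate in the
    energy left in the residual subspace."""
    X = flow_matrix - flow_matrix.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    normal = Vt[:n_components]               # "normal traffic" subspace
    resid = X - (X @ normal.T) @ normal      # part PCA cannot explain
    return (resid ** 2).sum(axis=1)
```

    Time steps whose residual energy spikes above its typical level are candidate anomalies.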

  10. Anomaly depth detection in trans-admittance mammography: a formula independent of anomaly size or admittivity contrast

    International Nuclear Information System (INIS)

    Zhang, Tingting; Lee, Eunjung; Seo, Jin Keun

    2014-01-01

    Trans-admittance mammography (TAM) is a bioimpedance technique for breast cancer detection. It is based on the comparison of tissue conductivity: cancerous tissue is identified by its higher conductivity in comparison with the surrounding normal tissue. In TAM, the breast is compressed between two electrical plates (in a similar architecture to x-ray mammography). The bottom plate has many sensing point electrodes that provide two-dimensional images (trans-admittance maps) that are induced by voltage differences between the two plates. Multi-frequency admittance data (Neumann data) are measured over the range 50 Hz–500 kHz. TAM aims to determine the location and size of any anomaly from the multi-frequency admittance data. Various anomaly detection algorithms can be used to process TAM data to determine the transverse positions of anomalies. However, existing methods cannot reliably determine the depth or size of an anomaly. Breast cancer detection using TAM would be improved if the depth or size of an anomaly could also be estimated, properties that are independent of the admittivity contrast. A formula is proposed here that can estimate the depth of an anomaly independent of its size and the admittivity contrast. This depth estimation can also be used to derive an estimation of the size of the anomaly. The proposed estimations are verified rigorously under a simplified model. Numerical simulation shows that the proposed method also works well in general settings. (paper)

  11. DeepAnomaly: Combining Background Subtraction and Deep Learning for Detecting Obstacles and Anomalies in an Agricultural Field

    Directory of Open Access Journals (Sweden)

    Peter Christiansen

    2016-11-01

    Full Text Available Convolutional neural network (CNN)-based systems are increasingly used in autonomous vehicles for detecting obstacles. CNN-based object detection and per-pixel classification (semantic segmentation) algorithms are trained to detect and classify a predefined set of object types. These algorithms have difficulties in detecting distant and heavily occluded objects and are, by definition, not capable of detecting unknown object types or unusual scenarios. The visual characteristics of an agricultural field are homogeneous, and obstacles, such as people and animals, occur rarely and are of distinct appearance compared to the field. This paper introduces DeepAnomaly, an algorithm combining deep learning and anomaly detection to exploit the homogeneous characteristics of a field to perform anomaly detection. We demonstrate DeepAnomaly as a fast state-of-the-art detector for obstacles that are distant, heavily occluded and unknown. DeepAnomaly is compared to state-of-the-art obstacle detectors, including “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (RCNN). In a human detector test case, we demonstrate that DeepAnomaly detects humans at longer ranges (45–90 m) than RCNN, which has similar performance only at short range (0–30 m). Moreover, DeepAnomaly has far fewer model parameters and a 7.28-times faster processing time per image (25 ms vs. 182 ms). Unlike most CNN-based methods, its high accuracy, low computation time and low memory footprint make it suitable for a real-time system running on an embedded GPU (Graphics Processing Unit).

  12. Robust Adaptable Video Copy Detection

    DEFF Research Database (Denmark)

    Assent, Ira; Kremer, Hardy

    2009-01-01

    Video copy detection should be capable of identifying video copies subject to alterations, e.g. in video contrast or frame rates. We propose a video copy detection scheme that allows for adaptable detection of videos that are altered temporally (e.g. frame rate change) and/or visually (e.g. change in contrast). Our query processing combines filtering and indexing structures for efficient multistep computation of video copies under this model. We show that our model successfully identifies altered video copies and does so more reliably than existing models.

  13. Mobile Anomaly Detection Based on Improved Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Chunyong Yin

    2017-01-01

    Full Text Available Anomaly detection has always been a focus of researchers, and the development of mobile devices raises new challenges for it. For example, mobile devices can keep an Internet connection and are rarely turned off, even at night. This means mobile devices can attack nodes or be attacked at night without being perceived by users, and they exhibit characteristics different from general Internet behaviors. The introduction of data mining has brought leaps forward in this field. Self-organizing maps (SOM), one of the best-known clustering algorithms, are sensitive to the initial weight vectors, which makes the clustering result unstable. Here, the optimal method of selecting initial clustering centers is transplanted from K-means to SOM. To evaluate the performance of the improved SOM, we use diverse datasets as well as the KDD Cup99 dataset to compare it with the traditional algorithm. The experimental results show that the improved SOM achieves a higher accuracy rate on the universal datasets and higher recall and precision rates on the KDD Cup99 dataset.
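    The transplanted seeding step might look like the following deterministic, farthest-point simplification of K-means++ initialization (the stochastic K-means++ rule samples proportionally to squared distance instead of taking the argmax), used here to pick well-spread initial SOM weight vectors:

```python
import numpy as np

def farthest_first_init(data, k, seed=0):
    """Pick k well-spread initial SOM weight vectors: start from a random
    sample, then repeatedly add the point farthest from all chosen centers
    (a deterministic variant of K-means++ seeding)."""
    rng = np.random.default_rng(seed)
    centers = [data[rng.integers(len(data))]]
    for _ in range(k - 1):
        # Squared distance of every point to its nearest chosen center.
        d2 = np.min([((data - c) ** 2).sum(axis=1) for c in centers], axis=0)
        centers.append(data[int(d2.argmax())])
    return np.array(centers)
```

    The resulting centers replace random initial weight vectors, which is what stabilizes the subsequent SOM training.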

  14. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

    Full Text Available Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels and provide a representation of uncertainty, probabilistic prediction methods (e.g., Gaussian process regression (GPR) and the relevance vector machine (RVM)) are especially well suited to anomaly detection for sensing series. Generally, one key parameter of such prediction models is the coverage probability (CP), which controls the judging threshold for a testing sample and is usually set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of the prediction interval (ROC-PI), based on the definition of the ROC curve, which depicts the trade-off between PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; by minimizing the modified index, the optimal CP is derived via the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.
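    The CP-selection idea can be sketched with the classic (unmodified) Youden index and a grid scan standing in for simulated annealing; Gaussian residuals and the candidate CP list are assumptions of this sketch:

```python
import numpy as np
from statistics import NormalDist

def youden_over_cp(residuals, labels, cps=(0.80, 0.90, 0.95, 0.99)):
    """Scan candidate coverage probabilities: each CP sets a prediction-
    interval half-width (Gaussian quantile times the residual sigma), and
    samples outside the interval are flagged. Return the CP that maximizes
    Youden's J = sensitivity + specificity - 1, together with that J."""
    sigma = residuals[~labels].std()          # spread of normal residuals
    best = None
    for cp in cps:
        half = NormalDist().inv_cdf(0.5 + cp / 2) * sigma
        flagged = np.abs(residuals) > half
        tpr = flagged[labels].mean()          # sensitivity on anomalies
        fpr = flagged[~labels].mean()         # 1 - specificity on normals
        j = tpr - fpr
        if best is None or j > best[1]:
            best = (cp, j)
    return best
```

    A wide interval (high CP) misses anomalies; a narrow one raises false alarms; the Youden index trades the two off.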

  15. Low Count Anomaly Detection at Large Standoff Distances

    Science.gov (United States)

    Pfund, David Michael; Jarman, Kenneth D.; Milbrath, Brian D.; Kiff, Scott D.; Sidor, Daniel E.

    2010-02-01

    Searching for hidden illicit sources of gamma radiation in an urban environment is difficult. Background radiation profiles are variable and cluttered with transient acquisitions from naturally occurring radioactive materials and medical isotopes. Potentially threatening sources likely will be nearly hidden in this noise and encountered at high standoff distances and low threat count rates. We discuss an anomaly detection algorithm that characterizes low count sources as threatening or non-threatening and operates well in the presence of high benign source variability. We discuss the algorithm parameters needed to reliably find sources both close to the detector and far away from it. These parameters include the cutoff frequencies of background tracking filters and the integration time of the spectrometer. This work is part of the development of the Standoff Radiation Imaging System (SORIS) as part of DNDO's Standoff Radiation Detection System Advanced Technology Demonstration (SORDS-ATD) program.

  16. Anomaly detection of microstructural defects in continuous fiber reinforced composites

    Science.gov (United States)

    Bricker, Stephen; Simmons, J. P.; Przybyla, Craig; Hardie, Russell

    2015-03-01

    Ceramic matrix composites (CMC) with continuous fiber reinforcements have the potential to enable the next generation of high-speed hypersonic vehicles and/or significant improvements in gas turbine engine performance due to their exhibited toughness when subjected to high mechanical loads at extreme temperatures (2200F+). Reinforced fiber composites (RFC) provide increased fracture toughness, crack growth resistance, and strength, though little is known about how stochastic variation and imperfections in the material affect material properties. In this work, tools are developed for quantifying anomalies within the microstructure at several scales. The detection and characterization of anomalous microstructure is a critical step in linking production techniques to properties, as well as in accurate material simulation and property prediction for the integrated computational materials engineering (ICME) of RFC-based components. It is desired to find statistical outliers for any number of material characteristics such as fibers, fiber coatings, and pores. Here, fiber orientation, or `velocity', and the `velocity' gradient are examined for anomalous behavior. Categorizing anomalous behavior in the CMC is approached by multivariate Gaussian mixture modeling. A Gaussian mixture is employed to estimate the probability density function (PDF) of the features in question, and anomalies are classified by their likelihood of belonging to the statistically normal behavior for that feature.
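    The likelihood-based scoring described above can be sketched with a single multivariate Gaussian standing in for the full mixture (one component keeps the example short); the 99% training-quantile threshold is an illustrative choice:

```python
import numpy as np

def gaussian_nll(features, train):
    """Negative log-likelihood of feature vectors under a multivariate
    Gaussian fitted to 'normal' training features (a one-component
    stand-in for a Gaussian mixture density estimate)."""
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False) + 1e-6 * np.eye(train.shape[1])
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    d = features - mu
    maha = np.einsum('ij,jk,ik->i', d, inv, d)
    return 0.5 * (maha + logdet + train.shape[1] * np.log(2 * np.pi))

def flag_anomalies(features, train, quantile=0.99):
    """Flag features whose NLL exceeds the q-quantile of the training NLL,
    i.e., features unlikely to belong to the normal population."""
    thresh = np.quantile(gaussian_nll(train, train), quantile)
    return gaussian_nll(features, train) > thresh
```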

  17. Support vector machines for TEC seismo-ionospheric anomalies detection

    Directory of Open Access Journals (Sweden)

    M. Akhoondzadeh

    2013-02-01

    Full Text Available Using time series prediction methods, it is possible to track the behavior of earthquake precursors and to announce early warnings when the difference between the predicted value and the observed value exceeds a predefined threshold. Support Vector Machines (SVMs) are widely used for classification and regression tasks due to their many advantages. This study investigates Total Electron Content (TEC) time series using an SVM to detect seismo-ionospheric anomalous variations induced by the three powerful earthquakes of Tohoku (11 March 2011), Haiti (12 January 2010) and Samoa (29 September 2009). The durations of the TEC time series datasets are 49, 46 and 71 days for the Tohoku, Haiti and Samoa earthquakes, respectively, each with a time resolution of 2 h. In the case of the Tohoku earthquake, the results show that the difference between the value predicted by the SVM method and the observed value reaches its maximum (i.e., 129.31 TECU) at earthquake time, in a period of high geomagnetic activity. The SVM method detected a considerable number of anomalous occurrences 1 and 2 days prior to the Haiti earthquake and also 1 and 5 days before the Samoa earthquake, in a period of low geomagnetic activity. In order to show that the method acts sensibly on both non-event and event TEC data, i.e., to perform null-hypothesis tests with which the method is also calibrated, the same period of data from the year preceding the Samoa earthquake date was taken into account. Furthermore, the TEC anomalies detected using the SVM method were compared to previous results (Akhoondzadeh and Saradjian, 2011; Akhoondzadeh, 2012) obtained from the mean, median, wavelet and Kalman filter methods. The SVM-detected anomalies are similar to those detected using the previous methods. It can be concluded that SVM can be a suitable learning method

  18. Freezing of Gait Detection in Parkinson's Disease: A Subject-Independent Detector Using Anomaly Scores.

    Science.gov (United States)

    Pham, Thuy T; Moore, Steven T; Lewis, Simon John Geoffrey; Nguyen, Diep N; Dutkiewicz, Eryk; Fuglevand, Andrew J; McEwan, Alistair L; Leong, Philip H W

    2017-11-01

    Freezing of gait (FoG) is common in Parkinsonian gait and strongly relates to falls. Current clinical FoG assessments are patients' self-report diaries and experts' manual video analysis. Both are subjective and yield moderate reliability. Existing detection algorithms have been predominantly designed in subject-dependent settings. In this paper, we aim to develop an automated FoG detector for subject-independent settings. After extracting highly relevant features, we apply anomaly detection techniques to detect FoG events. Specifically, feature selection is performed using correlation and clusterability metrics. From a list of 244 feature candidates, 36 were selected using saliency and robustness criteria. We develop an anomaly score detector with adaptive thresholding to identify FoG events and then, using accuracy metrics, reduce the feature list to seven candidates. Our novel multichannel freezing index was the most selective across all window sizes, achieving sensitivity (specificity) of (). On the other hand, the freezing index from the vertical axis was the best choice for a single input, achieving sensitivity (specificity) of () for the ankle and () for the back sensors. Our subject-independent method is not only significantly more accurate than those previously reported, but also uses a much smaller window (e.g., versus ) and/or lower tolerance (e.g., versus ).
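    A common form of freezing index (one of the feature families above) is the ratio of spectral power in a "freeze" band to that in the locomotor band of an accelerometer signal; the band edges below follow common FoG practice and are assumptions of this sketch, not necessarily the paper's exact values:

```python
import numpy as np

def freeze_index(accel, fs, freeze=(3.0, 8.0), locomotor=(0.5, 3.0)):
    """Freezing index for one accelerometer axis: spectral power in the
    'freeze' band divided by the power in the locomotor band."""
    spec = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    band = lambda lo, hi: spec[(freqs >= lo) & (freqs < hi)].sum()
    return band(*freeze) / (band(*locomotor) + 1e-12)
```

    Normal stepping concentrates power below ~3 Hz (low index), while the trembling of a freeze episode shifts power into the 3–8 Hz band (high index); an adaptive threshold on this score then flags FoG events.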

  19. Anomaly detection in an automated safeguards system using neural networks

    International Nuclear Information System (INIS)

    Whiteson, R.; Howell, J.A.

    1992-01-01

    An automated safeguards system must be able to detect an anomalous event, identify the nature of the event, and recommend a corrective action. Neural networks represent a new way of thinking about basic computational mechanisms for intelligent information processing. In this paper, we discuss the issues involved in applying a neural network model to the first step of this process: anomaly detection in materials accounting systems. We extend our previous model to a 3-tank problem and compare different neural network architectures and algorithms. We evaluate the computational difficulties in training neural networks and explore how certain design principles affect the problems. The issues involved in building a neural network architecture include how the information flows, how the network is trained, how the neurons in a network are connected, how the neurons process information, and how the connections between neurons are modified. Our approach is based on the demonstrated ability of neural networks to model complex, nonlinear, real-time processes. By modeling the normal behavior of the processes, we can predict how a system should be behaving and, therefore, detect when an abnormality occurs.

  20. Early Detection and Identification of Anomalies in Chemical Regime

    International Nuclear Information System (INIS)

    Figedy, Stefan; Smiesko, Ivan

    2011-01-01

    This paper briefly describes the basic features of a newly developed diagnostic system for the early detection and identification of anomalies arising in the water chemistry regime of the primary and secondary circuits of a VVER-440 reactor. This system, called SACHER (System of Analysis of CHEmical Regime), is being installed within the major modernization project at the NPP-V2 Bohunice in the Slovak Republic and has been developed entirely in the MATLAB environment. The availability of prompt information about the chemical conditions of the primary and secondary circuits is very important for preventing undue corrosion and deposit build-up. The typical chemical information systems that exist and work at NPPs give the user the values of the measured quantities together with their time trends and other derived values; it is then the experienced user's role to recognize the situation the monitored process is in, make the subsequent decisions and take measures. The SACHER system, based on computational intelligence techniques, inserts elements of intelligence into the overall chemical information system. It has a modular structure with the following most important modules: the normality module, which recognizes when the process starts to deviate from normal and serves as an early warning to the staff to take adequate measures; the fuzzy identification module, which identifies the anomaly on the basis of a set of fuzzy rules; the time-prediction module, which predicts the behavior/trend of selected chemical quantities 8 hours ahead, in 15-minute steps, from the moment of request; the validation module, which validates the measured quantities; and the trend module, which shows the trends of the acquired quantities.

  1. Sensor Anomaly Detection in Wireless Sensor Networks for Healthcare

    Science.gov (United States)

    Haque, Shah Ahsanul; Rahman, Mustafizur; Aziz, Syed Mahfuzul

    2015-01-01

    Wireless Sensor Networks (WSN) are vulnerable to various sensor faults and faulty measurements. This vulnerability hinders efficient and timely response in various WSN applications, such as healthcare. For example, faulty measurements can create false alarms which may require unnecessary intervention from healthcare personnel. Therefore, an approach to differentiate between real medical conditions and false alarms will improve remote patient monitoring systems and quality of healthcare service afforded by WSN. In this paper, a novel approach is proposed to detect sensor anomaly by analyzing collected physiological data from medical sensors. The objective of this method is to effectively distinguish false alarms from true alarms. It predicts a sensor value from historic values and compares it with the actual sensed value for a particular instance. The difference is compared against a threshold value, which is dynamically adjusted, to ascertain whether the sensor value is anomalous. The proposed approach has been applied to real healthcare datasets and compared with existing approaches. Experimental results demonstrate the effectiveness of the proposed system, providing high Detection Rate (DR) and low False Positive Rate (FPR). PMID:25884786
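The predict-then-compare loop described here can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the predictor is a plain moving average over recent samples and the dynamically adjusted threshold is an exponentially weighted estimate of the residual scale, both of which are assumptions.

```python
import numpy as np

def detect_anomalies(series, history=5, k=3.0, alpha=0.1):
    """Flag samples whose prediction error exceeds a dynamically
    adjusted threshold (k times a running residual-scale estimate)."""
    series = np.asarray(series, dtype=float)
    scale = 1.0  # initial guess for the typical residual magnitude
    flags = []
    for t in range(history, len(series)):
        predicted = series[t - history:t].mean()   # stand-in predictor
        residual = abs(series[t] - predicted)
        is_anomaly = residual > k * scale
        flags.append(is_anomaly)
        if not is_anomaly:
            # adapt the threshold on normal samples only, so anomalies
            # do not inflate it
            scale = (1 - alpha) * scale + alpha * residual
    return np.array(flags)
```

Flag index i corresponds to time step i + history; a sudden spike in an otherwise stable physiological signal is flagged, while slow drifts raise the threshold instead.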

  2. Radon anomalies: When are they possible to be detected?

    Science.gov (United States)

    Passarelli, Luigi; Woith, Heiko; Seyis, Cemil; Nikkhoo, Mehdi; Donner, Reik

    2017-04-01

    Records of the radon noble gas in different environments like soil, air, groundwater, rock, caves, and tunnels typically display cyclic variations including diurnal (S1), semidiurnal (S2), and seasonal components, but there are also cases where these cycles are absent. Interestingly, radon emission can also be affected by transient processes, which inhibit or enhance the radon carrying process at the surface. This results in transient changes in the radon emission rate, which are superimposed on the low- and high-frequency cycles. The complexity of the spectral content of the radon time-series makes any statistical analysis aiming at understanding the physical driving processes a challenging task. In the past decades there have been several attempts to relate changes in radon emission rate to physical triggering processes such as earthquake occurrence. One of the problems in this type of investigation is to objectively detect anomalies in the radon time-series. In the present work, we propose a simple and objective statistical method for detecting changes in the radon emission rate time-series. The method uses non-parametric statistical tests (e.g., Kolmogorov-Smirnov) to compare empirical distributions of the radon emission rate by sequentially applying various time windows to the time-series. The statistical test indicates whether two empirical distributions of data originate from the same distribution at a desired significance level. We test the algorithm on synthetic data in order to explore the sensitivity of the statistical test to the sample size. We successively apply the test to six radon emission rate recordings from stations located around the Marmara Sea obtained within the MARsite project (MARsite has received funding from the European Union's Seventh Programme for research, technological development and demonstration under grant agreement No 308417). We conclude that the test performs relatively well at identifying transient changes in the radon emission rate.
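The sliding-window comparison of empirical distributions can be sketched as below. The window length and significance level are illustrative choices, and the Kolmogorov-Smirnov D statistic is compared against its standard asymptotic critical value rather than an exact p-value.

```python
import numpy as np

def ks_statistic(a, b):
    # two-sample Kolmogorov-Smirnov D: the largest gap between the two
    # empirical CDFs, evaluated over the pooled sample points
    a, b = np.sort(a), np.sort(b)
    pooled = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, pooled, side="right") / len(a)
    cdf_b = np.searchsorted(b, pooled, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

def detect_changes(series, window=100, alpha=0.05):
    """Compare adjacent windows of the series and report the times at
    which their empirical distributions differ at level alpha."""
    n = m = window
    # asymptotic critical value of the two-sample KS test
    crit = np.sqrt(-0.5 * np.log(alpha / 2.0) * (n + m) / (n * m))
    return [t for t in range(window, len(series) - window)
            if ks_statistic(series[t - window:t], series[t:t + window]) > crit]
```

A sustained shift in the emission-rate distribution yields a cluster of detections around the true change time; a larger window increases sensitivity to small shifts at the cost of time resolution.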

  3. Adaptive rate selection scheme for video transmission to resolve IEEE 802.11 performance anomaly

    Science.gov (United States)

    Tang, Guijin; Zhu, Xiuchang

    2011-10-01

    Multi-rate transmission may lead to a performance anomaly in an IEEE 802.11 network, decreasing the throughput of all the higher-rate stations. This paper proposes an adaptive rate selection scheme for video service when the performance anomaly occurs. Because video is tolerant to a degree of packet loss, we actively drop several packets so as to select rates as high as possible for transmitting the remaining packets. Experiments show that our algorithm can decrease the delay and jitter of video and improve the system throughput as well.

  4. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    International Nuclear Information System (INIS)

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm² areas and ≥2% in ∼20 mm² areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified.
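The core of the PID strategy, suppressing background noise before thresholding intensity deviations, can be sketched as below. This is an illustration only: it uses a 3×3 median filter and a relative-deviation threshold, and omits the registration and gradient-scaling steps the paper describes.

```python
import numpy as np

def median3x3(img):
    # 3x3 median filter to suppress isolated noise; the one-pixel
    # border keeps its raw value in this sketch
    out = img.copy()
    stack = np.stack([img[i:img.shape[0] - 2 + i, j:img.shape[1] - 2 + j]
                      for i in range(3) for j in range(3)])
    out[1:-1, 1:-1] = np.median(stack, axis=0)
    return out

def detect_pid(reference, measured, threshold=0.02):
    # pixel intensity deviation, relative to the local reference level;
    # threshold=0.02 flags deviations above 2%
    pid = (measured - reference) / (reference + 1e-9)
    return np.abs(median3x3(pid)) > threshold
```

A small region with a 5% fluence error survives the median filter and is flagged, while uncorrelated pixel noise is suppressed below the 2% threshold.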

  5. A Comparative Evaluation of Unsupervised Anomaly Detection Algorithms for Multivariate Data

    Science.gov (United States)

    Goldstein, Markus; Uchida, Seiichi

    2016-01-01

    Anomaly detection is the process of identifying unexpected items or events in datasets, which differ from the norm. In contrast to standard classification tasks, anomaly detection is often applied on unlabeled data, taking only the internal structure of the dataset into account. This challenge is known as unsupervised anomaly detection and is addressed in many practical applications, for example in network intrusion detection and fraud detection, as well as in the life science and medical domains. Dozens of algorithms have been proposed in this area, but unfortunately the research community still lacks a comparative universal evaluation as well as common publicly available datasets. These shortcomings are addressed in this study, where 19 different unsupervised anomaly detection algorithms are evaluated on 10 different datasets from multiple application domains. By publishing the source code and the datasets, this paper aims to be a new well-founded basis for unsupervised anomaly detection research. Additionally, this evaluation reveals the strengths and weaknesses of the different approaches for the first time. Besides anomaly detection performance, the computational effort, the impact of parameter settings, and the global/local anomaly detection behavior are outlined. In conclusion, we give advice on algorithm selection for typical real-world tasks. PMID:27093601

  6. A first approach on fault detection and isolation for cardiovascular anomalies detection

    KAUST Repository

    Ledezma, Fernando

    2015-07-01

    In this paper, we use an extended version of the cardiovascular system's state space model presented by [1] and propose a fault detection and isolation methodology to study the problem of detecting cardiovascular anomalies that can originate from variations in physiological parameters and deviations in the performance of the heart's mitral and aortic valves. An observer-based approach is discussed as the basis of the method. The approach contemplates a bank of Extended Kalman Filters to achieve joint estimation of the model's states and parameters and to detect malfunctions in the valves' performance. © 2015 American Automatic Control Council.

  7. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yang Dan

    2008-12-01

    Full Text Available Distributed network traffic anomaly refers to a traffic abnormal behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting in a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing and does more harm to the network. Aiming at the similar features of distributed traffic anomaly on many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are first computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of the anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.

  8. Detection of short-term anomaly using parasitic discrete wavelet transform

    International Nuclear Information System (INIS)

    Nagamatsu, Takashi; Gofuku, Akio

    2013-01-01

    A parasitic discrete wavelet transform (P-DWT) that has a large flexibility in design of the mother wavelet (MW) and a high processing speed was applied for simulation and measured anomalies. First, we applied the P-DWT to detection of the short-term anomalies. Second, we applied the P-DWT to detection of the collision of pump using the pump diagnostic experiment equipment that was designed taking into consideration the structure of the pump used for the water-steam system of the fast breeder reactor 'Monju'. The vibration signals were measured by the vibration sensor attached to the pump when injecting four types of small objects (sphere, small sphere, cube, and rectangular parallelepiped). Anomaly detection was performed by calculating the fast wavelet instantaneous correlation using the parasitic filter that was constructed on the basis of the measured signals. The results suggested that the anomalies could be detected for all types of the supposed anomalies. (author)

  9. Anomaly Detection in Gas Turbine Fuel Systems Using a Sequential Symbolic Method

    Directory of Open Access Journals (Sweden)

    Fei Li

    2017-05-01

    Full Text Available Anomaly detection plays a significant role in helping gas turbines run reliably and economically. Considering collective anomalous data and both the sensitivity and robustness of the anomaly detection model, a sequential symbolic anomaly detection method is proposed and applied to the gas turbine fuel system. A structural Finite State Machine is used to evaluate the posterior probabilities of observing symbolic sequences and the most probable state sequences in which they may be located. Hence, an estimation-based model and a decoding-based model are used to identify anomalies in two different ways. Experimental results indicate that both models perform well overall, but the estimation-based model is more robust, whereas the decoding-based model is more accurate, particularly in a certain range of sequence lengths. Therefore, the proposed method can complement existing symbolic dynamic analysis-based anomaly detection methods, especially in the gas turbine domain.

  10. Anomaly Detection for Temporal Data using Long Short-Term Memory (LSTM)

    OpenAIRE

    Singh, Akash

    2017-01-01

    We explore the use of Long short-term memory (LSTM) for anomaly detection in temporal data. Due to the challenges in obtaining labeled anomaly datasets, an unsupervised approach is employed. We train recurrent neural networks (RNNs) with LSTM units to learn the normal time series patterns and predict future values. The resulting prediction errors are modeled to give anomaly scores. We investigate different ways of maintaining LSTM state, and the effect of using a fixed number of time steps on...
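The prediction-error scoring idea can be illustrated without an LSTM. In this hedged sketch a linear autoregressive model fit by least squares stands in for the recurrent network (an assumption, chosen to keep the example self-contained), and prediction errors are modeled as Gaussian to produce anomaly scores, mirroring the pipeline the abstract describes.

```python
import numpy as np

def fit_ar_predictor(series, order=10):
    # linear AR model as a lightweight stand-in for the LSTM predictor:
    # x[t] is predicted from the previous `order` values
    X = np.array([series[t - order:t] for t in range(order, len(series))])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def anomaly_scores(series, coef):
    # score = squared z-score of the prediction error under a Gaussian
    # error model fitted to the observed errors
    order = len(coef)
    X = np.array([series[t - order:t] for t in range(order, len(series))])
    errors = series[order:] - X @ coef
    mu, sigma = errors.mean(), errors.std() + 1e-12
    return ((errors - mu) / sigma) ** 2
```

Score index i corresponds to time i + order; an injected spike, and the few steps whose input window still contains it, receive the largest scores.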

  11. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    Science.gov (United States)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than some other state-of-the-art anomaly detection methods, and is easy to implement.

  12. A Multi-Agent Framework for Anomalies Detection on Distributed Firewalls Using Data Mining Techniques

    Science.gov (United States)

    Karoui, Kamel; Ftima, Fakher Ben; Ghezala, Henda Ben

    The integration of agents and data mining has emerged as a promising area for distributed problem solving. Applying this integration to distributed firewalls facilitates the anomaly detection process. In this chapter, we present a set of algorithms and mining techniques to analyse, manage and detect anomalies in distributed firewalls' policy rules using the multi-agent approach; first, for each firewall, a static agent executes a set of data mining techniques to generate a new set of efficient firewall policy rules. Then, a mobile agent exploits these sets of optimized rules to detect eventual anomalies on a specific firewall (intra-firewall anomalies) or between firewalls (inter-firewall anomalies). An experimental case study is presented to demonstrate the usefulness of our approach.

  13. Ensemble regression model-based anomaly detection for cyber-physical intrusion detection in smart grids

    DEFF Research Database (Denmark)

    Kosek, Anna Magdalena; Gehrke, Oliver

    2016-01-01

    The shift from centralised large production to distributed energy production has several consequences for current power system operation. The replacement of large power plants by growing numbers of distributed energy resources (DERs) increases the dependency of the power system on small-scale, distributed production. Many of these DERs can be accessed and controlled remotely, posing a cybersecurity risk. This paper investigates an intrusion detection system which evaluates the DER operation in order to discover unauthorized control actions. The proposed anomaly detection method is based...

  14. Aircraft Anomaly Detection Using Performance Models Trained on Fleet Data

    Science.gov (United States)

    Gorinevsky, Dimitry; Matthews, Bryan L.; Martin, Rodney

    2012-01-01

    This paper describes an application of data mining technology called Distributed Fleet Monitoring (DFM) to Flight Operational Quality Assurance (FOQA) data collected from a fleet of commercial aircraft. DFM transforms the data into aircraft performance models, flight-to-flight trends, and individual flight anomalies by fitting a multi-level regression model to the data. The model represents aircraft flight performance and takes into account fixed effects: flight-to-flight and vehicle-to-vehicle variability. The regression parameters include aerodynamic coefficients and other aircraft performance parameters that are usually identified by aircraft manufacturers in flight tests. Using DFM, the multi-terabyte FOQA data set with half a million flights was processed in a few hours. The anomalies found include wrong values of computed variables (e.g., aircraft weight), sensor failures and biases, and failures, biases, and trends in flight actuators. These anomalies were missed by the airline's existing exceedance-based monitoring of FOQA data.

  15. Using statistical anomaly detection models to find clinical decision support malfunctions.

    Science.gov (United States)

    Ray, Soumi; McEvoy, Dustin S; Aaron, Skye; Hickman, Thu-Trang; Wright, Adam

    2018-05-11

    Malfunctions in Clinical Decision Support (CDS) systems occur due to a multitude of reasons, and often go unnoticed, leading to potentially poor outcomes. Our goal was to identify malfunctions within CDS systems. We evaluated 6 anomaly detection models: (1) Poisson Changepoint Model, (2) Autoregressive Integrated Moving Average (ARIMA) Model, (3) Hierarchical Divisive Changepoint (HDC) Model, (4) Bayesian Changepoint Model, (5) Seasonal Hybrid Extreme Studentized Deviate (SHESD) Model, and (6) E-Divisive with Median (EDM) Model and characterized their ability to find known anomalies. We analyzed 4 CDS alerts with known malfunctions from the Longitudinal Medical Record (LMR) and Epic® (Epic Systems Corporation, Madison, WI, USA) at Brigham and Women's Hospital, Boston, MA. The 4 rules recommend lead testing in children, aspirin therapy in patients with coronary artery disease, pneumococcal vaccination in immunocompromised adults and thyroid testing in patients taking amiodarone. Poisson changepoint, ARIMA, HDC, Bayesian changepoint and the SHESD model were able to detect anomalies in an alert for lead screening in children and in an alert for pneumococcal conjugate vaccine in immunocompromised adults. EDM was able to detect anomalies in an alert for monitoring thyroid function in patients on amiodarone. Malfunctions/anomalies occur frequently in CDS alert systems. It is important to be able to detect such anomalies promptly. Anomaly detection models are useful tools to aid such detections.
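A minimal, single-changepoint version of the first model in that list can be written directly: score each candidate split of a daily alert-count series by the log-likelihood gain of a two-rate Poisson model over a single-rate one. The minimum segment length below is an illustrative assumption.

```python
import numpy as np

def _poisson_loglik(x):
    # Poisson log-likelihood at the MLE rate, dropping the
    # lambda-independent log(x!) terms
    lam = x.mean()
    if lam <= 0:
        return 0.0
    return x.sum() * np.log(lam) - len(x) * lam

def poisson_changepoint(counts, min_seg=5):
    """Return the split point maximizing the likelihood gain of a
    two-rate Poisson model over a single rate, plus the gain itself
    (a gain of 0 means no split improves on a single rate)."""
    counts = np.asarray(counts, dtype=float)
    base = _poisson_loglik(counts)
    best_t, best_gain = None, 0.0
    for t in range(min_seg, len(counts) - min_seg):
        gain = _poisson_loglik(counts[:t]) + _poisson_loglik(counts[t:]) - base
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```

A sudden drop in an alert's daily firing rate, such as after a malfunction-inducing rule edit, shows up as a large gain at the day of the change.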

  16. Adaptively detecting changes in Autonomic Grid Computing

    KAUST Repository

    Zhang, Xiangliang; Germain, Cé cile; Sebag, Michè le

    2010-01-01

    Detecting the changes is the common issue in many application fields due to the non-stationary distribution of the applicative data, e.g., sensor network signals, web logs and gridrunning logs. Toward Autonomic Grid Computing, adaptively detecting

  17. Energy detection based on undecimated discrete wavelet transform and its application in magnetic anomaly detection.

    Directory of Open Access Journals (Sweden)

    Xinhua Nie

    Full Text Available Magnetic anomaly detection (MAD) is a passive approach for detecting a ferromagnetic target, and its performance is often limited by external noises. In consideration of the fact that one major noise source is fractal noise (also called 1/f noise) with a power spectral density of 1/f^α (0 < α < 2), an energy detection method based on the undecimated discrete wavelet transform (UDWT) is proposed in this paper. Firstly, the foundations of magnetic anomaly detection and the UDWT are introduced in brief, and a possible detection system based on a giant magneto-impedance (GMI) magnetic sensor is also given. Then our proposed energy detection based on the UDWT is described in detail, and the theoretical probabilities of false alarm and detection for a given detection threshold are presented. It is noticeable that no a priori assumptions regarding the ferromagnetic target or the magnetic noise probability are necessary for our method, and unlike the discrete wavelet transform (DWT), the UDWT is shift invariant. Finally, some simulations are performed and the results show that the detection performance of our proposed detector is better than that of the conventional energy detector even in Gaussian white noise, especially when the spectral parameter α is less than 1.0. In addition, a real-world experiment was done to demonstrate the advantages of the proposed method.
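A shift-invariant Haar decomposition plus a sliding-window energy statistic gives a rough feel for this kind of detector. This sketch uses an a-trous Haar filter bank as a stand-in for the paper's UDWT filters, and the number of levels and the window length are assumptions.

```python
import numpy as np

def undecimated_haar_details(x, levels=3):
    """Shift-invariant (a-trous) Haar decomposition: at each level the
    approximation is smoothed with a 2-tap average whose support is
    dilated by 2**level, and the detail is the difference. np.roll
    wraps at the boundary, which is acceptable for a sketch."""
    a = np.asarray(x, dtype=float)
    details = []
    for j in range(levels):
        smooth = 0.5 * (a + np.roll(a, -(2 ** j)))
        details.append(a - smooth)
        a = smooth
    return details

def energy_statistic(details, window=32):
    # sliding-window energy of the detail coefficients, summed over
    # levels; a target signature raises this above the noise floor
    energy = sum(d ** 2 for d in details)
    return np.convolve(energy, np.ones(window) / window, mode="same")
```

A short transient buried in broadband noise produces a clear peak in the windowed energy near its location, which can then be compared against a threshold chosen for a desired false-alarm probability.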

  18. OceanXtremes: Oceanographic Data-Intensive Anomaly Detection and Analysis Portal

    Data.gov (United States)

    National Aeronautics and Space Administration — Anomaly detection is a process of identifying items, events or observations, which do not conform to an expected pattern in a dataset or time series. Current and...

  19. Kullback-Leibler distance-based enhanced detection of incipient anomalies

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Madakyaru, Muddu

    2016-01-01

    Accurate and effective anomaly detection and diagnosis of modern engineering systems by monitoring processes ensure reliability and safety of a product while maintaining desired quality. In this paper, an innovative method based on Kullback

  20. Anomaly detection in OECD Benchmark data using co-variance methods

    International Nuclear Information System (INIS)

    Srinivasan, G.S.; Krinizs, K.; Por, G.

    1993-02-01

    OECD Benchmark data distributed for the SMORN VI Specialists Meeting in Reactor Noise were investigated for anomaly detection in artificially generated reactor noise benchmark analysis. It was observed that statistical features extracted from covariance matrix of frequency components are very sensitive in terms of the anomaly detection level. It is possible to create well defined alarm levels. (R.P.) 5 refs.; 23 figs.; 1 tab

  1. Bio-Inspired Distributed Decision Algorithms for Anomaly Detection

    Science.gov (United States)

    2017-03-01

    Generation Services (ETG) 3. Replay of Traffic Traces (RTT) BTG creates "norm" traffic background with a pre-specified distribution, BTG takes in a...a cap on the IP counter to offset this artificial effect. For this reason, we also evaluated the dependence of DIAMoND performance on the IP counter cap. 3.3.2.10 Performance Evaluation Metrics. Given the local anomaly detector is based on TCP session negotiation protocols, it is natural to

  2. Improved Principal Component Analysis for Anomaly Detection: Application to an Emergency Department

    KAUST Repository

    Harrou, Fouzi; Kadri, Farid; Chaabane, Sondé s; Tahon, Christian; Sun, Ying

    2015-01-01

    Monitoring of production systems, such as those in hospitals, is essential for ensuring the best management and maintaining the desired product quality. Detection of emergent abnormalities allows preemptive actions that can prevent more serious consequences. The principal component analysis (PCA)-based anomaly-detection approach has been used successfully for monitoring systems with highly correlated variables. However, conventional PCA-based detection indices, such as Hotelling's T² and the Q statistics, are ill suited to detect small abnormalities because they use only information from the most recent observations. Other multivariate statistical metrics, such as the multivariate cumulative sum (MCUSUM) control scheme, are more suitable for detecting small anomalies. In this paper, a generic anomaly detection scheme based on PCA is proposed to monitor demands to an emergency department. In this framework, the MCUSUM control chart is applied to the uncorrelated residuals obtained from the PCA model. The proposed PCA-based MCUSUM anomaly detection strategy is successfully applied to practical data collected from the database of the pediatric emergency department in the Lille Regional Hospital Centre, France. The detection results show that the proposed method is more effective than conventional PCA-based anomaly-detection methods.
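The residual-monitoring pipeline can be sketched as follows. This is a simplified illustration, not the authors' implementation: the PCA residual is summarized by the squared prediction error and monitored with a univariate one-sided CUSUM, whereas the paper uses a full multivariate CUSUM chart; the k and h chart parameters below are assumptions.

```python
import numpy as np

def fit_pca(X, n_components):
    # principal directions of the training (in-control) data
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def spe(X, mu, P):
    # squared prediction error: energy left outside the PCA model plane
    R = (X - mu) - (X - mu) @ P.T @ P
    return (R ** 2).sum(axis=1)

def cusum(stat, target, k, h):
    # one-sided CUSUM: alarm once the cumulative drift of the
    # monitoring statistic above `target` exceeds h
    s, alarms = 0.0, []
    for x in stat:
        s = max(0.0, s + (x - target - k))
        alarms.append(s > h)
    return np.array(alarms)
```

A small, sustained shift off the correlation structure barely moves any single observation's residual, but the CUSUM accumulates it into an alarm within a few samples.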

  3. Improved Principal Component Analysis for Anomaly Detection: Application to an Emergency Department

    KAUST Repository

    Harrou, Fouzi

    2015-07-03

    Monitoring of production systems, such as those in hospitals, is essential for ensuring the best management and maintaining the desired product quality. Detection of emergent abnormalities allows preemptive actions that can prevent more serious consequences. The principal component analysis (PCA)-based anomaly-detection approach has been used successfully for monitoring systems with highly correlated variables. However, conventional PCA-based detection indices, such as Hotelling's T² and the Q statistics, are ill suited to detect small abnormalities because they use only information from the most recent observations. Other multivariate statistical metrics, such as the multivariate cumulative sum (MCUSUM) control scheme, are more suitable for detecting small anomalies. In this paper, a generic anomaly detection scheme based on PCA is proposed to monitor demands to an emergency department. In this framework, the MCUSUM control chart is applied to the uncorrelated residuals obtained from the PCA model. The proposed PCA-based MCUSUM anomaly detection strategy is successfully applied to practical data collected from the database of the pediatric emergency department in the Lille Regional Hospital Centre, France. The detection results show that the proposed method is more effective than conventional PCA-based anomaly-detection methods.

  4. Detection of Airway Anomalies in Pediatric Patients with Cardiovascular Anomalies with Low Dose Prospective ECG-Gated Dual-Source CT

    OpenAIRE

    Jiao, Hui; Xu, Zhuodong; Wu, Lebin; Cheng, Zhaoping; Ji, Xiaopeng; Zhong, Hai; Meng, Chen

    2013-01-01

    OBJECTIVES: To assess the feasibility of low-dose prospective ECG-gated dual-source CT (DSCT) in detecting airway anomalies in pediatric patients with cardiovascular anomalies, compared with flexible tracheobronchoscopy (FTB). METHODS: 33 pediatric patients with respiratory symptoms, in whom cardiovascular anomalies had been revealed by transthoracic echocardiography, underwent FTB and contrast material-enhanced prospective ECG-triggered CT. The study was approved by our institution review bo...

  5. Lunar magnetic anomalies detected by the Apollo substatellite magnetometers

    Science.gov (United States)

    Hood, L.L.; Coleman, P.J.; Russell, C.T.; Wilhelms, D.E.

    1979-01-01

    Properties of lunar crustal magnetization thus far deduced from Apollo subsatellite magnetometer data are reviewed using two of the most accurate presently available magnetic anomaly maps - one covering a portion of the lunar near side and the other a part of the far side. The largest single anomaly found within the region of coverage on the near-side map correlates exactly with a conspicuous, light-colored marking in western Oceanus Procellarum called Reiner Gamma. This feature is interpreted as an unusual deposit of ejecta from secondary craters of the large nearby primary impact crater Cavalerius. An age for Cavalerius (and, by implication, for Reiner Gamma) of 3.2 ± 0.2 × 10⁹ y is estimated. The main (30 × 60 km) Reiner Gamma deposit is nearly uniformly magnetized in a single direction, with a minimum mean magnetization intensity of ≈7 × 10⁻² G cm³/g (assuming a density of 3 g/cm³), or about 700 times the stable magnetization component of the most magnetic returned samples. Additional medium-amplitude anomalies exist over the Fra Mauro Formation (Imbrium basin ejecta emplaced ≈3.9 × 10⁹ y ago) where it has not been flooded by mare basalt flows, but are nearly absent over the maria and over the craters Copernicus, Kepler, and Reiner and their encircling ejecta mantles. The mean altitude of the far-side anomaly map is much higher than that of the near-side map and the surface geology is more complex, so individual anomaly sources have not yet been identified. However, it is clear that a concentration of especially strong sources exists in the vicinity of the craters Van de Graaff and Aitken. Numerical modeling of the associated fields reveals that the source locations do not correspond with the larger primary impact craters of the region and, by analogy with Reiner Gamma, may be less conspicuous secondary crater ejecta deposits. 
The reason for a special concentration of strong sources in the Van de Graaff-Aitken region is unknown, but may be indirectly

  6. Multivariate anomaly detection for Earth observations: a comparison of algorithms and feature extraction techniques

    Directory of Open Access Journals (Sweden)

    M. Flach

    2017-08-01

    Full Text Available Today, many processes at the Earth's surface are constantly monitored by multiple data streams. These observations have become central to advancing our understanding of vegetation dynamics in response to climate or land use change. Another set of important applications is monitoring the effects of extreme climatic events and other disturbances, such as fires or abrupt land transitions. One important methodological question is how to reliably detect anomalies in an automated and generic way within multivariate data streams, which typically vary seasonally and are interconnected across variables. Although many algorithms have been proposed for detecting anomalies in multivariate data, only a few have been investigated in the context of Earth system science applications. In this study, we systematically combine and compare feature extraction and anomaly detection algorithms for detecting anomalous events. Our aim is to identify suitable workflows for automatically detecting anomalous patterns in multivariate Earth system data streams. We rely on artificial data that mimic typical properties and anomalies in multivariate spatiotemporal Earth observations, like sudden changes in basic characteristics of time series such as the sample mean, the variance, changes in the cycle amplitude, and trends. This artificial experiment is needed as there is no gold standard for the identification of anomalies in real Earth observations. Our results show that a well-chosen feature extraction step (e.g., subtracting seasonal cycles, or dimensionality reduction) is more important than the choice of a particular anomaly detection algorithm. Nevertheless, we identify three detection algorithms (k-nearest neighbors mean distance, kernel density estimation, and a recurrence approach) and their combinations (ensembles) that outperform other multivariate approaches as well as univariate extreme-event detection methods. Our results therefore provide an effective workflow to
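The take-away, that the deseasonalization step matters at least as much as the detector, is easy to demonstrate with one of the three winning detectors (k-nearest-neighbors mean distance). The period and k below are illustrative assumptions.

```python
import numpy as np

def deseasonalize(X, period):
    # feature extraction: subtract the mean seasonal cycle per phase
    X = np.asarray(X, dtype=float)
    out = X.copy()
    for p in range(period):
        out[p::period] -= X[p::period].mean(axis=0)
    return out

def knn_mean_distance(X, k=10):
    # anomaly score: mean Euclidean distance to the k nearest neighbors
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1))
    np.fill_diagonal(D, np.inf)   # a point is not its own neighbor
    D.sort(axis=1)
    return D[:, :k].mean(axis=1)
```

Without the deseasonalization step, an anomalous observation can hide among seasonal extremes; after subtracting the cycle, it stands out clearly in the residual space.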

  7. Multi-Level Anomaly Detection on Time-Varying Graph Data

    Energy Technology Data Exchange (ETDEWEB)

    Bridges, Robert A [ORNL]; Collins, John P [ORNL]; Ferragut, Erik M [ORNL]; Laska, Jason A [ORNL]; Sullivan, Blair D [ORNL]

    2015-01-01

    This work presents a novel modeling and analysis framework for graph sequences which addresses the challenge of detecting and contextualizing anomalies in labelled, streaming graph data. We introduce a generalization of the BTER model of Seshadhri et al. by adding flexibility to community structure, and use this model to perform multi-scale graph anomaly detection. Specifically, probability models describing coarse subgraphs are built by aggregating probabilities at finer levels, and these closely related hierarchical models simultaneously detect deviations from expectation. This technique provides insight into a graph's structure and internal context that may shed light on a detected event. Additionally, this multi-scale analysis facilitates intuitive visualizations by allowing users to narrow focus from an anomalous graph to particular subgraphs or nodes causing the anomaly. For evaluation, two hierarchical anomaly detectors are tested against a baseline Gaussian method on a series of sampled graphs. We demonstrate that our graph statistics-based approach outperforms both a distribution-based detector and the baseline in a labeled setting with community structure, and it accurately detects anomalies in synthetic and real-world datasets at the node, subgraph, and graph levels. To illustrate the accessibility of information made possible via this technique, the anomaly detector and an associated interactive visualization tool are tested on NCAA football data, where teams and conferences that moved within the league are identified with perfect recall, and precision greater than 0.786.

  8. Implementation of a General Real-Time Visual Anomaly Detection System Via Soft Computing

    Science.gov (United States)

    Dominguez, Jesus A.; Klinko, Steve; Ferrell, Bob; Steinrock, Todd (Technical Monitor)

    2001-01-01

    The intelligent visual system detects anomalies or defects in real time under normal lighting operating conditions. The application is basically a learning machine that integrates fuzzy logic (FL), artificial neural network (ANN), and genetic algorithm (GA) schemes to process the image, run the learning process, and finally detect the anomalies or defects. The system acquires the image, performs segmentation to separate the object being tested from the background, preprocesses the image using fuzzy reasoning, performs the final segmentation using fuzzy reasoning techniques to retrieve regions with potential anomalies or defects, and finally retrieves them using a learning model built via ANN and GA techniques. FL provides a powerful framework for knowledge representation and overcomes uncertainty and vagueness typically found in image analysis. ANN provides learning capabilities, and GA leads to robust learning results. An application prototype currently runs on a regular PC under Windows NT, and preliminary work has been performed to build an embedded version with multiple image processors. The application prototype is being tested at the Kennedy Space Center (KSC), Florida, to visually detect anomalies along slide basket cables utilized by the astronauts to evacuate the NASA Shuttle launch pad in an emergency. The potential applications of this anomaly detection system in an open environment are quite wide. Another current, potentially viable application at NASA is in detecting anomalies of the NASA Space Shuttle Orbiter's radiator panels.

  9. Detection of anomaly in human retina using Laplacian Eigenmaps and vectorized matched filtering

    Science.gov (United States)

    Yacoubou Djima, Karamatou A.; Simonelli, Lucia D.; Cunningham, Denise; Czaja, Wojciech

    2015-03-01

    We present a novel method for automated anomaly detection on autofluorescent data provided by the National Institutes of Health (NIH). This is motivated by the need for new tools to improve the capability of diagnosing macular degeneration in its early stages, track the progression over time, and test the effectiveness of new treatment methods. In previous work, macular anomalies have been detected automatically through multiscale analysis procedures such as wavelet analysis or dimensionality reduction algorithms followed by a classification algorithm, e.g., Support Vector Machine. The method that we propose is a Vectorized Matched Filtering (VMF) algorithm combined with Laplacian Eigenmaps (LE), a nonlinear dimensionality reduction algorithm with locality preserving properties. By applying LE, we are able to represent the data in the form of eigenimages, some of which accentuate the visibility of anomalies. We pick significant eigenimages and proceed with the VMF algorithm, which classifies anomalies across all of these eigenimages simultaneously. To evaluate our performance, we compare our method to two other schemes: a matched filtering algorithm based on anomaly detection on single images and a combination of PCA and VMF. LE combined with the VMF algorithm performs best, yielding a high rate of accurate anomaly detection. This shows the advantage of using a nonlinear approach to represent the data and the effectiveness of VMF, which operates on the images as a data cube rather than as individual images.

  10. Effective Sensor Selection and Data Anomaly Detection for Condition Monitoring of Aircraft Engines

    Directory of Open Access Journals (Sweden)

    Liansheng Liu

    2016-04-01

    Full Text Available In a complex system, condition monitoring (CM) can collect the system's working status. The condition is mainly sensed by pre-deployed sensors in/on the system. Most existing works study how to utilize the condition information to predict upcoming anomalies, faults, or failures. There is also some research that focuses on faults or anomalies of the sensing element (i.e., the sensor) to enhance system reliability. However, existing approaches ignore the correlation between the sensor selection strategy and data anomaly detection, which can also improve system reliability. To address this issue, we study a new scheme which includes a sensor selection strategy and data anomaly detection by utilizing information theory and Gaussian Process Regression (GPR). The sensors that are more appropriate for the system CM are first selected. Then, mutual information is utilized to weight the correlation among different sensors. The anomaly detection is carried out by using the correlation of sensor data. The sensor data sets used to carry out the evaluation are provided by the National Aeronautics and Space Administration (NASA) Ames Research Center and were used as Prognostics and Health Management (PHM) challenge data in 2008. By comparing two different sensor selection strategies, the effect of the selection method on data anomaly detection is demonstrated.
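
The mutual-information weighting step described above can be sketched with a simple histogram estimator. This is an illustrative sketch, not the paper's method; the bin count, sample sizes, and variable names are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based mutual information estimate (in nats) between two
    sensor streams; the bin count is an illustrative choice."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution estimate
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    nz = pxy > 0                              # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Two correlated sensors share far more information than independent ones.
rng = np.random.default_rng(1)
s1 = rng.normal(size=5000)
s2 = s1 + 0.1 * rng.normal(size=5000)   # strongly dependent on s1
s3 = rng.normal(size=5000)              # independent of s1
print(mutual_information(s1, s2) > mutual_information(s1, s3))  # True
```

In a scheme like the one described, such pairwise estimates would weight the correlations among the selected sensors before anomaly detection.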

  11. A Hybrid Semi-Supervised Anomaly Detection Model for High-Dimensional Data

    Directory of Open Access Journals (Sweden)

    Hongchao Song

    2017-01-01

    Full Text Available Anomaly detection, which aims to identify observations that deviate from a nominal sample, is a challenging task for high-dimensional data. Traditional distance-based anomaly detection methods compute a neighborhood distance for each observation and suffer from the curse of dimensionality in high-dimensional space; for example, the distances between any pair of samples become similar, so that each sample may appear to be an outlier. In this paper, we propose a hybrid semi-supervised anomaly detection model for high-dimensional data that consists of two parts: a deep autoencoder (DAE) and an ensemble of k-nearest neighbor graph (K-NNG) based anomaly detectors. Benefiting from its ability of nonlinear mapping, the DAE is first trained to learn the intrinsic features of a high-dimensional dataset so as to represent the high-dimensional data in a more compact subspace. Several nonparametric KNN-based anomaly detectors are then built from different subsets that are randomly sampled from the whole dataset. The final prediction is made by combining all the anomaly detectors. The performance of the proposed method is evaluated on several real-life datasets, and the results confirm that the proposed hybrid model improves the detection accuracy and reduces the computational complexity.
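
The second stage of this hybrid model, an ensemble of KNN-based detectors built on random subsets, can be sketched as follows. The deep-autoencoder compression step is omitted here, and all parameter values, data, and names are illustrative assumptions.

```python
import numpy as np

def knng_ensemble_scores(train, test, k=3, n_detectors=5, subset=0.6, seed=0):
    """Average anomaly score over an ensemble of k-NN distance detectors,
    each built on a random subset of the nominal training data. The deep
    autoencoder stage is omitted; all parameters here are illustrative."""
    rng = np.random.default_rng(seed)
    train = np.asarray(train, dtype=float)
    test = np.asarray(test, dtype=float)
    m = max(int(subset * len(train)), k)
    scores = np.zeros(len(test))
    for _ in range(n_detectors):
        ref = train[rng.choice(len(train), size=m, replace=False)]
        # Distance from every test point to each reference point.
        d = np.linalg.norm(test[:, None, :] - ref[None, :, :], axis=2)
        scores += np.sort(d, axis=1)[:, :k].mean(axis=1)
    return scores / n_detectors

# Nominal cluster near the origin; a distant test point scores highest.
rng = np.random.default_rng(5)
train = rng.normal(0, 0.1, size=(100, 3))
test = np.vstack([rng.normal(0, 0.1, size=(5, 3)), [[4.0, 4.0, 4.0]]])
print(int(np.argmax(knng_ensemble_scores(train, test))))  # 5
```

In the full model, `train` and `test` would be the compact features produced by the DAE rather than raw high-dimensional data.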

  12. Theoretical and numerical investigations into the SPRT method for anomaly detection

    Energy Technology Data Exchange (ETDEWEB)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E. [Interuniversitair Reactor Inst., Delft (Netherlands)

    1995-11-01

    The sequential probability ratio test developed by Wald is a powerful method of testing an alternative hypothesis against a null hypothesis. This makes the method applicable for anomaly detection. In this paper the method is used to detect a change of the standard deviation of a Gaussian distributed white noise signal. The false alarm probability, the alarm failure probability and the average time to alarm of the method, which are important parameters for anomaly detection, are determined by simulation and compared with theoretical results. Each of the three parameters is presented as a function of the other two and of the ratio between the standard deviation of the anomalous signal and that of the normal signal. Results show that the method is very well suited for anomaly detection. It can, for example, detect a 50% change in standard deviation within 1 second with a false alarm and alarm failure rate of less than once per month. (author).
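
A minimal sketch of Wald's SPRT for the variance-change problem described above; the error probabilities, thresholds, and signal parameters are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sprt_variance(x, s0, s1, alpha=1e-3, beta=1e-3):
    """Wald's SPRT deciding between noise std s0 (H0) and s1 (H1) for a
    zero-mean Gaussian white noise signal; alpha and beta are the target
    false-alarm and alarm-failure probabilities (illustrative values).

    Returns the decision ('H0', 'H1', or 'undecided') and samples used."""
    a = np.log((1 - beta) / alpha)     # accept H1 when the LLR rises above
    b = np.log(beta / (1 - alpha))     # accept H0 when it falls below
    llr, n = 0.0, 0
    for n, xi in enumerate(np.asarray(x, dtype=float), start=1):
        # Per-sample log-likelihood ratio log f1(x)/f0(x) for Gaussians.
        llr += np.log(s0 / s1) + 0.5 * xi**2 * (1.0 / s0**2 - 1.0 / s1**2)
        if llr >= a:
            return "H1", n
        if llr <= b:
            return "H0", n
    return "undecided", n

# A signal whose standard deviation has jumped by 50% (1.0 -> 1.5).
rng = np.random.default_rng(2)
decision, n = sprt_variance(rng.normal(0, 1.5, size=2000), s0=1.0, s1=1.5)
print(decision, n)
```

The cumulative log-likelihood ratio performs a random walk whose drift changes sign with the true hypothesis, which is what lets the test stop quickly while controlling both error rates.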

  13. Theoretical and numerical investigations into the SPRT method for anomaly detection

    International Nuclear Information System (INIS)

    Schoonewelle, H.; Hagen, T.H.J.J. van der; Hoogenboom, J.E.

    1995-01-01

    The sequential probability ratio test developed by Wald is a powerful method of testing an alternative hypothesis against a null hypothesis. This makes the method applicable for anomaly detection. In this paper the method is used to detect a change of the standard deviation of a Gaussian distributed white noise signal. The false alarm probability, the alarm failure probability and the average time to alarm of the method, which are important parameters for anomaly detection, are determined by simulation and compared with theoretical results. Each of the three parameters is presented as a function of the other two and of the ratio between the standard deviation of the anomalous signal and that of the normal signal. Results show that the method is very well suited for anomaly detection. It can, for example, detect a 50% change in standard deviation within 1 second with a false alarm and alarm failure rate of less than once per month. (author)

  14. A new comparison of hyperspectral anomaly detection algorithms for real-time applications

    Science.gov (United States)

    Díaz, María.; López, Sebastián.; Sarmiento, Roberto

    2016-10-01

    Due to the high spectral resolution that remotely sensed hyperspectral images provide, there has been an increasing interest in anomaly detection. The aim of anomaly detection is to single out pixels whose spectral signature differs significantly from the background spectra. Basically, anomaly detectors mark pixels with a certain score, considering as anomalies those whose scores are higher than a threshold. Receiver Operating Characteristic (ROC) curves have been widely used as an assessment measure in order to compare the performance of different algorithms. ROC curves are graphical plots which illustrate the trade-off between false positive and true positive rates. However, they are of limited use for deeper comparisons because they discard factors relevant to real-time applications, such as run times, costs of misclassification and the ability to mark anomalies with high scores. This last property is fundamental in anomaly detection in order to distinguish anomalies easily from the background without any posterior processing. An extensive set of simulations has been carried out using different anomaly detection algorithms, comparing their performances and efficiencies using several extra metrics in order to complement ROC curve analysis. Results support our proposal and demonstrate that ROC curves by themselves do not provide a good visualization of detection performance. Moreover, a figure of merit is proposed in this paper which encompasses in a single global metric all the measures yielded by the proposed additional metrics. This figure, named Detection Efficiency (DE), takes into account several crucial types of performance assessment that ROC curves do not consider. Results demonstrate that algorithms with the best detection performances according to ROC curves do not have the highest DE values. 
    Consequently, the recommendation of using extra measures to properly evaluate performance has been supported and justified by
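
The ROC analysis discussed above reduces to sweeping a threshold over the anomaly scores and tracking true-positive against false-positive rates; a minimal sketch (with illustrative scores and labels, not data from the paper) is:

```python
import numpy as np

def roc_points(scores, labels):
    """TPR and FPR as the detection threshold sweeps over the observed
    anomaly scores (higher score = more anomalous, label 1 = anomaly)."""
    order = np.argsort(-np.asarray(scores, dtype=float))
    labels = np.asarray(labels)[order]
    tps = np.cumsum(labels)            # true positives at each cutoff
    fps = np.cumsum(1 - labels)        # false positives at each cutoff
    tpr = tps / max(labels.sum(), 1)
    fpr = fps / max((1 - labels).sum(), 1)
    return fpr, tpr

# Illustrative scores from a detector that separates classes perfectly.
scores = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]
labels = [1, 1, 1, 0, 0, 0]
fpr, tpr = roc_points(scores, labels)
# Area under the curve by the trapezoidal rule.
auc = float(np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2))
print(round(auc, 6))  # 1.0 for a perfect detector
```

As the abstract argues, this curve (and its area) says nothing about run time or how far above the threshold anomalies score, which motivates the extra metrics folded into DE.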

  15. Global Anomaly Detection in Two-Dimensional Symmetry-Protected Topological Phases

    Science.gov (United States)

    Bultinck, Nick; Vanhove, Robijn; Haegeman, Jutho; Verstraete, Frank

    2018-04-01

    Edge theories of symmetry-protected topological phases are well known to possess global symmetry anomalies. In this Letter we focus on two-dimensional bosonic phases protected by an on-site symmetry and analyze the corresponding edge anomalies in more detail. Physical interpretations of the anomaly in terms of an obstruction to orbifolding and constructing symmetry-preserving boundaries are connected to the cohomology classification of symmetry-protected phases in two dimensions. Using the tensor network and matrix product state formalism we numerically illustrate our arguments and discuss computational detection schemes to identify symmetry-protected order in a ground state wave function.

  16. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    Directory of Open Access Journals (Sweden)

    Mohd Aizaini Maarof

    2013-08-01

    Full Text Available Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed.

  17. Advancements of Data Anomaly Detection Research in Wireless Sensor Networks: A Survey and Open Issues

    Science.gov (United States)

    Rassam, Murad A.; Zainal, Anazida; Maarof, Mohd Aizaini

    2013-01-01

    Wireless Sensor Networks (WSNs) are important and necessary platforms for the future as the concept “Internet of Things” has emerged lately. They are used for monitoring, tracking, or controlling of many applications in industry, health care, habitat, and military. However, the quality of data collected by sensor nodes is affected by anomalies that occur due to various reasons, such as node failures, reading errors, unusual events, and malicious attacks. Therefore, anomaly detection is a necessary process to ensure the quality of sensor data before it is utilized for making decisions. In this review, we present the challenges of anomaly detection in WSNs and state the requirements to design efficient and effective anomaly detection models. We then review the latest advancements of data anomaly detection research in WSNs and classify current detection approaches in five main classes based on the detection methods used to design these approaches. Varieties of the state-of-the-art models for each class are covered and their limitations are highlighted to provide ideas for potential future works. Furthermore, the reviewed approaches are compared and evaluated based on how well they meet the stated requirements. Finally, the general limitations of current approaches are mentioned and further research opportunities are suggested and discussed. PMID:23966182

  18. Adaptive prediction applied to seismic event detection

    International Nuclear Information System (INIS)

    Clark, G.A.; Rodgers, P.W.

    1981-01-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data

  19. Adaptive prediction applied to seismic event detection

    Energy Technology Data Exchange (ETDEWEB)

    Clark, G.A.; Rodgers, P.W.

    1981-09-01

    Adaptive prediction was applied to the problem of detecting small seismic events in microseismic background noise. The Widrow-Hoff LMS adaptive filter used in a prediction configuration is compared with two standard seismic filters as an onset indicator. Examples demonstrate the technique's usefulness with both synthetic and actual seismic data.

  20. Anomaly Detection and Life Pattern Estimation for the Elderly Based on Categorization of Accumulated Data

    Science.gov (United States)

    Mori, Taketoshi; Ishino, Takahito; Noguchi, Hiroshi; Shimosaka, Masamichi; Sato, Tomomasa

    2011-06-01

    We propose a life pattern estimation method and an anomaly detection method for elderly people living alone. In our observation system for such people, we deploy pyroelectric sensors in the house and measure the person's activities continuously in order to grasp the person's life pattern. The data are transferred successively to the operation center and displayed to the nurses there in a precise way. The nurses then decide whether the data are anomalous or not. In the system, people whose life patterns resemble each other are categorized into the same group. Anomalies that occurred in the past are shared within the group and utilized in the anomaly detection algorithm. This algorithm is based on an "anomaly score." The "anomaly score" is computed from the activeness of the person, which is approximately proportional to the frequency of sensor responses per minute. The "anomaly score" is calculated as the difference between the present activeness and its long-term average: the score is positive if the present activeness is higher than the past average, and negative if it is lower. If the score exceeds a certain threshold, an anomalous event has occurred. Moreover, we developed an activity estimation algorithm that estimates the residents' basic activities such as getting up, going out, and so on. The estimation is shown to the nurses together with the "anomaly score" of the residents. The nurses can understand the residents' health conditions by combining these two pieces of information.
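
The "anomaly score" described here is essentially the difference between current activeness and its long-term average; a minimal sketch, with hypothetical numbers and an assumed threshold:

```python
import numpy as np

def anomaly_score(activeness_now, past_activeness):
    """Signed score: current activeness minus its long-term average.
    Positive = more active than usual, negative = less active.
    The function name and inputs are illustrative assumptions."""
    return activeness_now - float(np.mean(past_activeness))

# Hypothetical sensor-response frequencies (responses per minute).
past = [12, 15, 11, 14, 13, 12, 16]   # long-term activeness history
THRESHOLD = 5.0                        # illustrative alert threshold
score = anomaly_score(2, past)         # resident unusually inactive now
print(abs(score) > THRESHOLD)  # True -> flag an anomalous event
```

A real deployment would average activeness over a much longer window and tune the threshold per resident group, as the abstract describes.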

  1. Detection of admittivity anomaly on high-contrast heterogeneous backgrounds using frequency difference EIT.

    Science.gov (United States)

    Jang, J; Seo, J K

    2015-06-01

    This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly against a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency difference EIT method, which has so far been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolution of the output images in fdEIT is very low due to the inherent ill-posedness, numerical simulations and phantom experiments demonstrate the feasibility of the proposed method for detecting anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.

  2. Panacea: Automating Attack Classification for Anomaly-based Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, Sandro; Hartel, Pieter H.; Kirda, E.; Jha, S.; Balzarotti, D.

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  3. Panacea : Automating attack classification for anomaly-based network intrusion detection systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, S.; Hartel, P.H.; Kirda, E.; Jha, S.; Balzarotti, D.

    2009-01-01

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  4. Panacea : Automating attack classification for anomaly-based network intrusion detection systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, S.; Hartel, P.H.

    2009-01-01

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  5. Panacea: Automating Attack Classification for Anomaly-based Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, Sandro; Hartel, Pieter H.

    2009-01-01

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  6. GLRT Based Anomaly Detection for Sensor Network Monitoring

    KAUST Repository

    Harrou, Fouzi

    2015-12-07

    Proper operation of antenna arrays requires continuously monitoring their performances. When a fault occurs in an antenna array, the radiation pattern changes and can significantly deviate from the desired design performance specifications. In this paper, the problem of fault detection in linear antenna arrays is addressed within a statistical framework. Specifically, a statistical fault detection method based on the generalized likelihood ratio (GLR) principle is utilized for detecting potential faults in linear antenna arrays. The proposed method relies on detecting deviations in the radiation pattern of the monitored array with respect to a reference (fault-free) one. To assess the abilities of the GLR based fault detection method, three case studies involving different types of faults have been performed. The simulation results clearly illustrate the effectiveness of the GLR-based fault detection method in monitoring the performance of linear antenna arrays.
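
As a hedged illustration of the GLR idea, the sketch below tests for a mean shift in the residual between a measured pattern and a fault-free reference, assuming known Gaussian noise. This simplified statistic, and all names and values, are assumptions rather than the paper's exact formulation.

```python
import numpy as np

def glr_statistic(measured, reference, sigma):
    """GLR statistic for a mean shift in the Gaussian residual between a
    measured pattern and a fault-free reference with known noise std sigma.
    Under the no-fault hypothesis, T follows a chi-square(1) distribution."""
    r = np.asarray(measured, dtype=float) - np.asarray(reference, dtype=float)
    return r.size * r.mean() ** 2 / sigma**2

# Illustrative patterns: a fault appears as a constant offset on the array.
rng = np.random.default_rng(3)
reference = np.sin(np.linspace(0, np.pi, 100))       # fault-free pattern
faulty = reference + 0.2 + rng.normal(0, 0.05, 100)  # offset fault + noise
threshold = 10.83   # chi-square(1) cutoff for a 0.1% false-alarm rate
print(glr_statistic(faulty, reference, 0.05) > threshold)  # True
```

The detection threshold is set from the null distribution of the statistic, which is what ties the alarm rate to a chosen false-alarm probability.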

  7. GLRT Based Anomaly Detection for Sensor Network Monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2015-01-01

    Proper operation of antenna arrays requires continuously monitoring their performances. When a fault occurs in an antenna array, the radiation pattern changes and can significantly deviate from the desired design performance specifications. In this paper, the problem of fault detection in linear antenna arrays is addressed within a statistical framework. Specifically, a statistical fault detection method based on the generalized likelihood ratio (GLR) principle is utilized for detecting potential faults in linear antenna arrays. The proposed method relies on detecting deviations in the radiation pattern of the monitored array with respect to a reference (fault-free) one. To assess the abilities of the GLR based fault detection method, three case studies involving different types of faults have been performed. The simulation results clearly illustrate the effectiveness of the GLR-based fault detection method in monitoring the performance of linear antenna arrays.

  8. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Arup Ghosh

    2016-01-01

    Full Text Available Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT), that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.

  9. PLAT: An Automated Fault and Behavioural Anomaly Detection Tool for PLC Controlled Manufacturing Systems.

    Science.gov (United States)

    Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam

    2016-01-01

    Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.

  10. Revisiting Anomaly-based Network Intrusion Detection Systems

    NARCIS (Netherlands)

    Bolzoni, D.

    2009-01-01

    Intrusion detection systems (IDSs) are well-known and widely-deployed security tools to detect cyber-attacks and malicious activities in computer systems and networks. A signature-based IDS works similarly to anti-virus software. It employs a signature database of known attacks, and a successful match

  11. Anomaly Detection and Mitigation at Internet Scale: A Survey

    NARCIS (Netherlands)

    Steinberger, Jessica; Schehlmann, Lisa; Abt, Sebastian; Baier, Harald; Doyen, Guillaume; Waldburger, Martin; Celeda, Pavel; Sperotto, Anna; Stiller, Burkhard

    Network-based attacks pose a strong threat to the Internet landscape. There are different possibilities to encounter these threats. On the one hand, attack detection can be operated at the end-users' side; on the other hand, it can be implemented at network operators' infrastructures. An obvious

  12. Anomaly detection in random heterogeneous media Feynman-Kac formulae, stochastic homogenization and statistical inversion

    CERN Document Server

    Simon, Martin

    2015-01-01

    This monograph is concerned with the analysis and numerical solution of a stochastic inverse anomaly detection problem in electrical impedance tomography (EIT). Martin Simon studies the problem of detecting a parameterized anomaly in an isotropic, stationary and ergodic conductivity random field whose realizations are rapidly oscillating. For this purpose, he derives Feynman-Kac formulae to rigorously justify stochastic homogenization in the case of the underlying stochastic boundary value problem. The author combines techniques from the theory of partial differential equations and functional analysis with probabilistic ideas, paving the way to new mathematical theorems which may be fruitfully used in the treatment of the problem at hand. Moreover, the author proposes an efficient numerical method in the framework of Bayesian inversion for the practical solution of the stochastic inverse anomaly detection problem. Contents: Feynman-Kac formulae; Stochastic homogenization; Statistical inverse problems; Targe...

  13. Anomalies in the detection of change: When changes in sample size are mistaken for changes in proportions.

    Science.gov (United States)

    Fiedler, Klaus; Kareev, Yaakov; Avrahami, Judith; Beier, Susanne; Kutzner, Florian; Hütter, Mandy

    2016-01-01

    Detecting changes in performance, sales, markets, risks, social relations, or public opinions constitutes an important adaptive function. In a sequential paradigm devised to investigate detection of change, every trial provides a sample of binary outcomes (e.g., correct vs. incorrect student responses). Participants have to decide whether the proportion of a focal feature (e.g., correct responses) in the population from which the sample is drawn has decreased, remained constant, or increased. Strong and persistent anomalies in change detection arise when changes in proportional quantities vary orthogonally to changes in absolute sample size. Proportional increases are readily detected and nonchanges are erroneously perceived as increases when absolute sample size increases. Conversely, decreasing sample size facilitates the correct detection of proportional decreases and the erroneous perception of nonchanges as decreases. These anomalies are, however, confined to experienced samples of elementary raw events from which proportions have to be inferred inductively. They disappear when sample proportions are described as percentages in a normalized probability format. To explain these challenging findings, it is essential to understand the inductive-learning constraints imposed on decisions from experience.

  14. Apparatus and method for detecting a magnetic anomaly contiguous to remote location by SQUID gradiometer and magnetometer systems

    Science.gov (United States)

    Overton, W.C. Jr.; Steyert, W.A. Jr.

    1981-05-22

    A superconducting quantum interference device (SQUID) magnetic detection apparatus detects magnetic fields, signals, and anomalies at remote locations. Two remotely rotatable SQUID gradiometers may be housed in a cryogenic environment to search for and locate unambiguously magnetic anomalies. The SQUID magnetic detection apparatus can be used to determine the azimuth of a hydrofracture by first flooding the hydrofracture with a ferrofluid to create an artificial magnetic anomaly therein.

  15. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    Energy Technology Data Exchange (ETDEWEB)

    Solaimani, Mohiuddin [Univ. of Texas-Dallas, Richardson, TX (United States); Iftekhar, Mohammed [Univ. of Texas-Dallas, Richardson, TX (United States); Khan, Latifur [Univ. of Texas-Dallas, Richardson, TX (United States); Thuraisingham, Bhavani [Univ. of Texas-Dallas, Richardson, TX (United States); Ingram, Joey Burton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to achieve higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint, with higher accuracy, by implementing a cluster-based technique to detect both sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.

  16. Robust and Accurate Anomaly Detection in ECG Artifacts Using Time Series Motif Discovery

    Science.gov (United States)

    Sivaraks, Haemwaan

    2015-01-01

    Electrocardiogram (ECG) anomaly detection is an important technique for detecting dissimilar heartbeats which helps identify abnormal ECGs before the diagnosis process. Currently available ECG anomaly detection methods, ranging from academic research to commercial ECG machines, still suffer from a high false alarm rate because these methods are not able to differentiate ECG artifacts from real ECG signal, especially for ECG artifacts that are similar to ECG signals in terms of shape and/or frequency. The problem leads to high vigilance for physicians and misinterpretation risk for nonspecialists. Therefore, this work proposes a novel anomaly detection technique that is highly robust and accurate in the presence of ECG artifacts which can effectively reduce the false alarm rate. Expert knowledge from cardiologists and a motif discovery technique are utilized in our design. In addition, every step of the algorithm conforms to the interpretation of cardiologists. Our method can be applied to both single-lead and multilead ECGs. Our experiment results on real ECG datasets are interpreted and evaluated by cardiologists. Our proposed algorithm can mostly achieve 100% of accuracy on detection (AoD), sensitivity, specificity, and positive predictive value with 0% false alarm rate. The results demonstrate that our proposed method is highly accurate and robust to artifacts, compared with competitive anomaly detection methods. PMID:25688284

  17. Kullback-Leibler distance-based enhanced detection of incipient anomalies

    KAUST Repository

    Harrou, Fouzi

    2016-09-09

    Accurate and effective anomaly detection and diagnosis of modern engineering systems by monitoring processes ensure reliability and safety of a product while maintaining desired quality. In this paper, an innovative method based on Kullback-Leibler divergence for detecting incipient anomalies in highly correlated multivariate data is presented. We use a partial least square (PLS) method as a modeling framework and a symmetrized Kullback-Leibler distance (KLD) as an anomaly indicator, used to quantify the dissimilarity between the current PLS-based residual distribution and a reference probability distribution obtained using fault-free data. Furthermore, this paper reports the development of two monitoring charts based on the KLD. The first approach is a KLD-Shewhart chart, where the Shewhart monitoring chart with a three-sigma rule is used to monitor the KLD of the response variables residuals from the PLS model. The second approach integrates the KLD statistic into the exponentially weighted moving average monitoring chart. The performance of the PLS-based KLD anomaly-detection methods is illustrated and compared to that of conventional PLS-based anomaly detection methods. Using synthetic data and simulated distillation column data, we demonstrate the greater sensitivity and effectiveness of the developed method over the conventional PLS-based methods, especially when data are highly correlated and small anomalies are of interest. Results indicate that the proposed chart is a very promising KLD-based method because KLD-based charts are, in practice, designed to detect small shifts in process parameters. © 2016 Elsevier Ltd
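    The indicator described above — a symmetrized KLD between a fault-free reference distribution and a current residual window, monitored with a Shewhart-style three-sigma limit — can be sketched for the simplest univariate Gaussian case. The window length, control limit, and synthetic data below are illustrative assumptions, not the paper's PLS setup:

    ```python
    import numpy as np

    def sym_kld_gauss(mu0, var0, mu1, var1):
        """Symmetrized KL divergence between two univariate Gaussians."""
        kl01 = 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1 + np.log(var1 / var0))
        kl10 = 0.5 * (var1 / var0 + (mu0 - mu1) ** 2 / var0 - 1 + np.log(var0 / var1))
        return kl01 + kl10

    rng = np.random.default_rng(0)
    ref = rng.normal(0.0, 1.0, 2000)            # fault-free reference residuals
    mu0, var0 = ref.mean(), ref.var()

    # Shewhart-style control limit from the KLD of fault-free windows
    win = 100
    base = np.array([sym_kld_gauss(mu0, var0, w.mean(), w.var())
                     for w in rng.normal(0.0, 1.0, (200, win))])
    ucl = base.mean() + 3.0 * base.std()        # three-sigma rule on the indicator

    # A window carrying a small sustained mean shift (incipient anomaly)
    shifted = rng.normal(0.8, 1.0, win)
    kld = sym_kld_gauss(mu0, var0, shifted.mean(), shifted.var())
    flag = kld > ucl                            # anomaly detected
    ```

    The KLD grows with any mismatch in mean or variance, which is why a small sustained shift that barely moves a raw-residual chart still pushes the indicator past its limit.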

  18. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2016-01-01

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.
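    A minimal sketch of the idea, assuming an SVD-based PCA model and a one-sided univariate CUSUM applied to the squared prediction error. The paper's MCUSUM/MEWMA charts are multivariate; this simplification, with made-up data and tuning constants, only illustrates how a memory chart accumulates a small sustained fault:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Fault-free training data: 5 correlated variables driven by 2 latent factors
    n, p, k = 500, 5, 2
    latent = rng.normal(size=(n, k))
    load = rng.normal(size=(k, p))
    X = latent @ load + 0.1 * rng.normal(size=(n, p))

    mu, sd = X.mean(0), X.std(0)
    _, _, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
    P = Vt[:k].T                                  # retained PCA loadings

    def spe(x):
        """Squared prediction error (residual) of one sample under the PCA model."""
        xs = (x - mu) / sd
        resid = xs - (xs @ P) @ P.T
        return float(resid @ resid)

    spe0 = np.array([spe(x) for x in X])
    target, slack = spe0.mean(), spe0.std()       # CUSUM reference values
    h = 5.0 * spe0.std()                          # decision interval

    # Stream with a small mean shift injected at t = 50
    stream = latent[:100] @ load + 0.1 * rng.normal(size=(100, p))
    stream[50:] += 0.5                            # incipient fault on every variable

    s, alarm_at = 0.0, None
    for t, x in enumerate(stream):
        s = max(0.0, s + spe(x) - target - 0.5 * slack)   # one-sided CUSUM recursion
        if alarm_at is None and s > h:
            alarm_at = t
    ```

    Because the CUSUM integrates every sample's excess SPE, a shift too small to trip a Shewhart-type limit on any single sample still raises an alarm within a few steps.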

  19. A measurement-based technique for incipient anomaly detection

    KAUST Repository

    Harrou, Fouzi

    2016-06-13

    Fault detection is essential for safe operation of various engineering systems. Principal component analysis (PCA) has been widely used in monitoring highly correlated process variables. Conventional PCA-based methods, nevertheless, often fail to detect small or incipient faults. In this paper, we develop new PCA-based monitoring charts, combining PCA with multivariate memory control charts, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) monitoring schemes. The multivariate control charts with memory are sensitive to small and moderate faults in the process mean, which significantly improves the performance of PCA methods and widens their applicability in practice. Using simulated data, we demonstrate that the proposed PCA-based MEWMA and MCUSUM control charts are more effective in detecting small shifts in the mean of the multivariate process variables, and outperform the conventional PCA-based monitoring charts. © 2015 IEEE.

  20. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
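    The per-pixel reference comparison that RST-style time-series methods build on can be illustrated as follows. The synthetic image stack, z-score threshold, and hot-spot magnitude are assumptions for the sketch; the MODVOLC point operation and the hybrid algorithm's specifics are in the paper, not here:

    ```python
    import numpy as np

    def thermal_anomaly_mask(scene, ref_mean, ref_std, z_thresh=3.0):
        """Flag pixels whose brightness temperature is anomalously high relative
        to a per-pixel reference built from a stack of same-month images."""
        z = (scene - ref_mean) / np.maximum(ref_std, 1e-6)
        return z > z_thresh

    rng = np.random.default_rng(6)
    stack = 280.0 + 2.0 * rng.normal(size=(80, 16, 16))   # 80 reference images
    ref_mean, ref_std = stack.mean(axis=0), stack.std(axis=0)

    scene = 280.0 + 2.0 * rng.normal(size=(16, 16))
    scene[4, 7] += 25.0                                   # hot spot (active vent)
    mask = thermal_anomaly_mask(scene, ref_mean, ref_std)
    ```

    The requirement quoted above — at least ~80 images per calendar month — maps directly onto the size of the reference stack: too few images and `ref_mean`/`ref_std` are noisy, inflating false detections.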

  1. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity

    Directory of Open Access Journals (Sweden)

    Paolo Napoletano

    2018-01-01

    Full Text Available Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.

  2. Using new edges for anomaly detection in computer networks

    Science.gov (United States)

    Neil, Joshua Charles

    2015-05-19

    Creation of new edges in a network may be used as an indication of a potential attack on the network. Historical data of a frequency with which nodes in a network create and receive new edges may be analyzed. Baseline models of behavior among the edges in the network may be established based on the analysis of the historical data. A new edge that deviates from a respective baseline model by more than a predetermined threshold during a time window may be detected. The new edge may be flagged as potentially anomalous when the deviation from the respective baseline model is detected. Probabilities for both new and existing edges may be obtained for all edges in a path or other subgraph. The probabilities may then be combined to obtain a score for the path or other subgraph. A threshold may be obtained by calculating an empirical distribution of the scores under historical conditions.
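    The baseline-and-score idea can be illustrated with a toy model; the Laplace smoothing, per-source new-edge rates, and the `NewEdgeScorer` class are our assumptions for the sketch, not the patented method:

    ```python
    import math
    from collections import defaultdict

    class NewEdgeScorer:
        """Toy baseline model: per-source rate of creating previously unseen edges."""
        def __init__(self):
            self.edges = set()
            self.new_ct = defaultdict(int)   # new edges initiated per source node
            self.tot_ct = defaultdict(int)   # total edges observed per source node

        def train(self, history):
            for src, dst in history:
                self.tot_ct[src] += 1
                if (src, dst) not in self.edges:
                    self.new_ct[src] += 1
                    self.edges.add((src, dst))

        def edge_logp(self, src, dst):
            # Laplace-smoothed probability that src initiates a new edge
            p_new = (self.new_ct[src] + 1) / (self.tot_ct[src] + 2)
            if (src, dst) in self.edges:
                return math.log(1 - p_new + 1e-12)
            return math.log(p_new)

        def path_score(self, path):
            # Lower (more negative) log-probability = more anomalous subgraph
            return sum(self.edge_logp(s, d) for s, d in path)

    scorer = NewEdgeScorer()
    history = [("a", "b")] * 50 + [("b", "c")] * 50   # stable historical traffic
    scorer.train(history)

    normal = scorer.path_score([("a", "b"), ("b", "c")])
    attack = scorer.path_score([("a", "x"), ("x", "y")])  # chain of brand-new edges
    ```

    Combining per-edge log-probabilities along a path mirrors the abstract's point that a chain of individually plausible new edges can still yield a strikingly low combined score.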

  3. Structural material anomaly detection system using water chemistry data, (7)

    International Nuclear Information System (INIS)

    Nagase, Makoto; Uchida, Shunsuke; Asakura, Yamato; Ohsumi, Katsumi.

    1993-01-01

    A method to detect small changes in water quality and diagnose their causes by analyzing on-line conductivity and pH data was proposed. Laboratory tests showed that effective noise reduction of measured on-line data could be achieved by using a median filter to detect small changes in conductivity; a relative change of 0.001 μS/cm was distinguishable. By simulating the changes of pH and conductivity in the reactor water against a small concentration change of sodium ion or sulfate ion in the feedwater, it was found that an adequate elapsed time for the diagnosis was 4 h from the start of the concentration change. A conductivity difference of 0.001 μS/cm in the reactor water made it theoretically possible to distinguish between a sodium ion concentration change of 4.6 ppt and a sulfate ion concentration change of 9.6 ppt in the feedwater. (author)
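    The median-filter noise-reduction step works because a median suppresses impulsive outliers while preserving genuine step changes. A minimal sketch with synthetic conductivity-like values (not the reactor-water data):

    ```python
    from statistics import median

    def median_filter(signal, k=5):
        """Rolling median with window k (odd); edges use a shrunken window."""
        h = k // 2
        return [median(signal[max(0, i - h): i + h + 1]) for i in range(len(signal))]

    # Synthetic conductivity trace (uS/cm): baseline, a tiny step, impulsive noise
    trace = [0.100] * 20 + [0.101] * 20
    trace[5] = 0.140          # spike from measurement noise
    trace[30] = 0.060         # dropout
    smooth = median_filter(trace)
    ```

    After filtering, the spike and dropout are gone but the 0.001 μS/cm step at sample 20 survives — exactly the property needed to resolve changes of that magnitude.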

  4. Multivariate diagnostics and anomaly detection for nuclear safeguards

    International Nuclear Information System (INIS)

    Burr, T.

    1994-01-01

    For process control and other reasons, new and future nuclear reprocessing plants are expected to be increasingly more automated than older plants. As a consequence of this automation, the quantity of data potentially available for safeguards may be much greater in future reprocessing plants than in current plants. The authors first review recent literature that applies multivariate Shewhart and multivariate cumulative sum (Cusum) tests to detect anomalous data. These tests are used to evaluate residuals obtained from a simulated three-tank problem in which five variables (volume, density, and concentrations of uranium, plutonium, and nitric acid) in each tank are modeled and measured. They then present results from several simulations involving transfers between the tanks and between the tanks and the environment. Residuals from a no-fault problem in which the measurements and model predictions are both correct are used to develop Cusum test parameters which are then used to test for faults for several simulated anomalous situations, such as an unknown leak or diversion of material from one of the tanks. The leak can be detected by comparing measurements, which estimate the true state of the tank system, with the model predictions, which estimate the state of the tank system as it ''should'' be. The no-fault simulation compares false alarm behavior for the various tests, whereas the anomalous problems allow one to compare the power of the various tests to detect faults under possible diversion scenarios. For comparison with the multivariate tests, univariate tests are also applied to the residuals.

  5. Anomaly Detection for Internet of Vehicles: A Trust Management Scheme with Affinity Propagation

    Directory of Open Access Journals (Sweden)

    Shu Yang

    2016-01-01

    Anomaly detection is critical for intelligent vehicle (IV) collaboration. Forming clusters/platoons, IVs can work together to accomplish complex jobs that they are unable to perform individually. To improve security and efficiency of the Internet of Vehicles, IVs’ anomaly detection has been extensively studied and a number of trust-based approaches have been proposed. However, most of these proposals either pay little attention to leader-based detection algorithms or ignore the utility of networked Roadside-Units (RSUs). In this paper, we introduce a trust-based anomaly detection scheme for IVs, where some malicious or incapable vehicles exist on the roads. The proposed scheme works by allowing IVs to detect abnormal vehicles, communicate with each other, and finally converge to some trustworthy cluster heads (CHs). Periodically, the CHs take responsibility for intracluster trust management. Moreover, the scheme is enhanced with a distributed supervising mechanism and a central reputation arbitrator to assure robustness and fairness in the detecting process. The simulation results show that our scheme can achieve a low detection failure rate below 1%, demonstrating its ability to detect and filter the abnormal vehicles.

  6. Statistical methods for anomaly detection in the complex process; Methodes statistiques de detection d'anomalies de fonctionnement dans les processus complexes

    Energy Technology Data Exchange (ETDEWEB)

    Al Mouhamed, Mayez

    1977-09-15

    In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (signature) often transmit information about the state of the system that the mean value cannot predict. This study was undertaken to elaborate statistical methods of anomaly detection on the basis of signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process. Then it detects small deviations from the normal behavior. The algorithm can be implemented on a medium-sized computer for on-line application. (author)

  7. Stochastic pattern recognition techniques and artificial intelligence for nuclear power plant surveillance and anomaly detection

    Energy Technology Data Exchange (ETDEWEB)

    Kemeny, L.G

    1998-12-31

    In this paper a theoretical and system conceptual model is outlined for the instrumentation, core assessment, surveillance and anomaly detection of a nuclear power plant. The system specified is based on the statistical on-line analysis of fluctuating signals, sensed by optimally placed instrumentation, in terms of such variates as coherence, correlation function, zero-crossing and spectral density.

  8. Stochastic pattern recognition techniques and artificial intelligence for nuclear power plant surveillance and anomaly detection

    International Nuclear Information System (INIS)

    Kemeny, L.G.

    1998-01-01

    In this paper a theoretical and system conceptual model is outlined for the instrumentation, core assessment, surveillance and anomaly detection of a nuclear power plant. The system specified is based on the statistical on-line analysis of fluctuating signals, sensed by optimally placed instrumentation, in terms of such variates as coherence, correlation function, zero-crossing and spectral density.

  9. A Bayesian model for anomaly detection in SQL databases for security systems

    NARCIS (Netherlands)

    Drugan, M.M.

    2017-01-01

    We focus on automatic anomaly detection in SQL databases for security systems. Many logs of database systems, here the Townhall database, contain detailed information about users, like the SQL queries and the response of the database. A database is a list of log instances, where each log instance is

  10. A white-box anomaly-based framework for database leakage detection

    NARCIS (Netherlands)

    Costante, E.; den Hartog, J.; Petkovic, M.; Etalle, S.; Pechenizkiy, M.

    2017-01-01

    Data leakage is at the heart most of the privacy breaches worldwide. In this paper we present a white-box approach to detect potential data leakage by spotting anomalies in database transactions. We refer to our solution as white-box because it builds self explanatory profiles that are easy to

  11. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  12. An angle-based subspace anomaly detection approach to high-dimensional data: With an application to industrial fault detection

    International Nuclear Information System (INIS)

    Zhang, Liangwei; Lin, Jing; Karim, Ramin

    2015-01-01

    The accuracy of traditional anomaly detection techniques implemented on full-dimensional spaces degrades significantly as dimensionality increases, thereby hampering many real-world applications. This work proposes an approach to selecting meaningful feature subspace and conducting anomaly detection in the corresponding subspace projection. The aim is to maintain the detection accuracy in high-dimensional circumstances. The suggested approach assesses the angle between all pairs of two lines for one specific anomaly candidate: the first line is connected by the relevant data point and the center of its adjacent points; the other line is one of the axis-parallel lines. Those dimensions which have a relatively small angle with the first line are then chosen to constitute the axis-parallel subspace for the candidate. Next, a normalized Mahalanobis distance is introduced to measure the local outlier-ness of an object in the subspace projection. To comprehensively compare the proposed algorithm with several existing anomaly detection techniques, we constructed artificial datasets with various high-dimensional settings and found the algorithm displayed superior accuracy. A further experiment on an industrial dataset demonstrated the applicability of the proposed algorithm in fault detection tasks and highlighted another of its merits, namely, to provide preliminary interpretation of abnormality through feature ordering in relevant subspaces. - Highlights: • An anomaly detection approach for high-dimensional reliability data is proposed. • The approach selects relevant subspaces by assessing vectorial angles. • The novel ABSAD approach displays superior accuracy over other alternatives. • Numerical illustration approves its efficacy in fault detection applications
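    A simplified sketch of the ABSAD idea described above: select the axis-parallel dimensions whose angle to the line from a candidate point to its neighbours' centre is small, then score the point with a normalized Mahalanobis distance in that subspace. The parameter values, cosine threshold, and dataset are assumptions; the paper's exact algorithm differs in detail:

    ```python
    import numpy as np

    def absad_score(X, i, k=10, angle_cos=0.3):
        """Simplified angle-based subspace anomaly score for point i."""
        x = X[i]
        d = np.linalg.norm(X - x, axis=1)
        nn = np.argsort(d)[1:k + 1]              # k nearest neighbours (skip self)
        center = X[nn].mean(0)
        v = x - center
        vn = np.linalg.norm(v)
        if vn == 0:
            return 0.0
        cos = np.abs(v) / vn                     # |cos| of angle with each axis
        dims = np.where(cos >= angle_cos)[0]     # axes roughly aligned with v
        if dims.size == 0:
            dims = np.arange(X.shape[1])
        sub = X[nn][:, dims]
        cov = np.cov(sub, rowvar=False) + 1e-6 * np.eye(dims.size)
        diff = x[dims] - sub.mean(0)
        m2 = diff @ np.linalg.solve(cov, diff)
        return float(np.sqrt(m2) / dims.size)    # normalize by subspace size

    rng = np.random.default_rng(2)
    X = rng.normal(size=(200, 20)) * 0.1
    X[:, :2] += rng.normal(size=(200, 2))        # only 2 informative dimensions
    X[0, 0] = 6.0                                # outlier hidden in one dimension

    scores = [absad_score(X, i) for i in range(len(X))]
    ```

    The selected `dims` for the top-scoring point also deliver the interpretability benefit noted in the highlights: they name the features in which the point is abnormal.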

  13. Detecting ship targets in spaceborne infrared image based on modeling radiation anomalies

    Science.gov (United States)

    Wang, Haibo; Zou, Zhengxia; Shi, Zhenwei; Li, Bo

    2017-09-01

    Using infrared imaging sensors to detect ship target in the ocean environment has many advantages compared to other sensor modalities, such as better thermal sensitivity and all-weather detection capability. We propose a new ship detection method by modeling radiation anomalies for spaceborne infrared image. The proposed method can be decomposed into two stages, where in the first stage, a test infrared image is densely divided into a set of image patches and the radiation anomaly of each patch is estimated by a Gaussian Mixture Model (GMM), and thereby target candidates are obtained from anomaly image patches. In the second stage, target candidates are further checked by a more discriminative criterion to obtain the final detection result. The main innovation of the proposed method is inspired by the biological mechanism that human eyes are sensitive to the unusual and anomalous patches among complex background. The experimental result on short wavelength infrared band (1.560 - 2.300 μm) and long wavelength infrared band (10.30 - 12.50 μm) of Landsat-8 satellite shows the proposed method achieves a desired ship detection accuracy with higher recall than other classical ship detection methods.

  14. Adaptive skin detection based on online training

    Science.gov (United States)

    Zhang, Ming; Tang, Liang; Zhou, Jie; Rong, Gang

    2007-11-01

    Skin is a widely used cue for porn image classification. Most conventional methods are off-line training schemes. They usually use a fixed boundary to segment skin regions in the images and are effective only under restricted conditions, e.g., good lighting and a single skin tone. This paper presents an adaptive online training scheme for skin detection which can handle these tough cases. In our approach, skin detection is considered as a classification problem on a Gaussian mixture model. For each image, the human face is detected and the face color is used to establish a primary estimation of the skin color distribution. Then an adaptive online training algorithm is used to find the real boundary between skin color and background color in the current image. Experimental results on 450 images showed that the proposed method is more robust in general situations than the conventional ones.

  15. JACoW Model learning algorithms for anomaly detection in CERN control systems

    CERN Document Server

    Tilaro, Filippo; Gonzalez-Berges, Manuel; Roshchin, Mikhail; Varela, Fernando

    2018-01-01

    The CERN automation infrastructure consists of over 600 heterogeneous industrial control systems with around 45 million deployed sensors, actuators and control objects. Therefore, it is evident that the monitoring of such a huge system represents a challenging and complex task. This paper describes three different mathematical approaches that have been designed and developed to detect anomalies in any of the CERN control systems. Specifically, one of these algorithms is purely based on expert knowledge; the other two mine the historical generated data to create a simple model of the system; this model is then used to detect faulty sensor measurements. The presented methods can be categorized as dynamic unsupervised anomaly detection; “dynamic” since the behaviour of the system and the evolution of its attributes are observed and change over time. They are “unsupervised” because we are trying to predict faulty events without examples in the data history. So, the described strategies involve monitoring t...

  16. A scalable architecture for online anomaly detection of WLCG batch jobs

    Science.gov (United States)

    Kuehn, E.; Fischer, M.; Giffels, M.; Jung, C.; Petzold, A.

    2016-10-01

    For data centres it is increasingly important to monitor the network usage, and learn from network usage patterns. Especially configuration issues or misbehaving batch jobs preventing a smooth operation need to be detected as early as possible. At the GridKa data and computing centre we therefore operate a tool, BPNetMon, for monitoring traffic data and characteristics of WLCG batch jobs and pilots locally on different worker nodes. On the one hand, local information by itself is not sufficient to detect anomalies for several reasons, e.g. the underlying job distribution on a single worker node might change or there might be a local misconfiguration. On the other hand, a centralised anomaly detection approach does not scale regarding network communication as well as computational costs. We therefore propose a scalable architecture based on concepts of a super-peer network.

  17. Detection of a weak meddy-like anomaly from high-resolution satellite SST maps

    Directory of Open Access Journals (Sweden)

    Mikhail Emelianov

    2012-09-01

    Despite the considerable impact of meddies on climate through the long-distance transport of properties, a consistent observation of meddy generation and propagation in the ocean is rather elusive. Meddies propagate at about 1000 m below the ocean surface, so satellite sensors are not able to detect them directly and finding them in the open ocean is more fortuitous than intentional. However, a consistent census of meddies and their paths is required in order to gain knowledge about their role in transporting properties such as heat and salt. In this paper we propose a new methodology for processing high-resolution sea surface temperature maps in order to detect meddy-like anomalies in the open ocean on a near-real-time basis. We present an example of detection, involving an atypical meddy-like anomaly that was confirmed as such by in situ measurements.

  18. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu

    2017-02-16

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.

  19. Improved anomaly detection using multi-scale PLS and generalized likelihood ratio test

    KAUST Repository

    Madakyaru, Muddu; Harrou, Fouzi; Sun, Ying

    2017-01-01

    Process monitoring has a central role in the process industry to enhance productivity, efficiency, and safety, and to avoid expensive maintenance. In this paper, a statistical approach that exploits the advantages of multiscale PLS models (MSPLS) and those of a generalized likelihood ratio (GLR) test to better detect anomalies is proposed. Specifically, to consider the multivariate and multi-scale nature of process dynamics, a MSPLS algorithm combining PLS and wavelet analysis is used as the modeling framework. Then, GLR hypothesis testing is applied using the uncorrelated residuals obtained from the MSPLS model to improve the anomaly detection abilities of these latent variable based fault detection methods even further. Applications to simulated distillation column data are used to evaluate the proposed MSPLS-GLR algorithm.
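    The GLR half of the approach can be illustrated in its textbook form — Gaussian residuals under a possible mean shift with known noise level; the MSPLS residual generation is not reproduced here, and the threshold and shift size are illustrative assumptions:

    ```python
    import numpy as np

    def glr_mean_shift(resid, sigma):
        """GLR statistic for H0: zero-mean Gaussian residuals vs H1: unknown
        mean shift, with known noise level sigma. Under H0 it is chi-square(1)."""
        n = len(resid)
        return n * np.mean(resid) ** 2 / sigma ** 2

    rng = np.random.default_rng(3)
    sigma = 1.0
    threshold = 10.83              # chi-square(1) 0.999 quantile: rare false alarms

    clean = rng.normal(0.0, sigma, 64)
    faulty = rng.normal(0.8, sigma, 64)   # small sustained shift in the residuals

    t_clean = glr_mean_shift(clean, sigma)
    t_fault = glr_mean_shift(faulty, sigma)
    ```

    Because the statistic scales with the window length n, the same small shift becomes progressively easier to detect as more residuals accumulate — the property the paper exploits on decorrelated MSPLS residuals.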

  20. Online Detection of Anomalous Sub-trajectories: A Sliding Window Approach Based on Conformal Anomaly Detection and Local Outlier Factor

    OpenAIRE

    Laxhammar , Rikard; Falkman , Göran

    2012-01-01

    Part 4: First Conformal Prediction and Its Applications Workshop (COPA 2012); International audience; Automated detection of anomalous trajectories is an important problem in the surveillance domain. Various algorithms based on learning of normal trajectory patterns have been proposed for this problem. Yet, these algorithms suffer from one or more of the following limitations: First, they are essentially designed for offline anomaly detection in databases. Second, they are insensitive to loca...
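    The conformal half of the approach can be sketched with a sliding window of scalar features and a k-nearest-neighbour nonconformity measure; the authors' trajectory features and Local Outlier Factor measure are replaced by these simpler stand-ins:

    ```python
    import numpy as np

    def conformal_pvalue(window, x, k=3):
        """Conformal p-value of x against a sliding window of past observations,
        using the sum of k nearest-neighbour distances as nonconformity measure."""
        pts = np.asarray(window, dtype=float)

        def score(q, ref):
            return np.sort(np.abs(ref - q))[:k].sum()

        a_x = score(x, pts)
        # nonconformity of each window point with respect to the others
        alphas = [score(p, np.delete(pts, i)) for i, p in enumerate(pts)]
        ge = sum(a >= a_x for a in alphas)
        return (ge + 1) / (len(pts) + 1)

    rng = np.random.default_rng(4)
    window = rng.normal(0.0, 1.0, 50)        # recent "normal" feature values
    p_normal = conformal_pvalue(window, 0.2)
    p_anom = conformal_pvalue(window, 8.0)   # far outside anything in the window
    ```

    Flagging observations with a p-value below a chosen significance level gives the calibrated false-alarm rate that conformal anomaly detection is valued for, and the sliding window keeps the detector online.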

  1. A new approach for structural health monitoring by applying anomaly detection on strain sensor data

    Science.gov (United States)

    Trichias, Konstantinos; Pijpers, Richard; Meeuwissen, Erik

    2014-03-01

    Structural Health Monitoring (SHM) systems help to monitor critical infrastructures (bridges, tunnels, etc.) remotely and provide up-to-date information about their physical condition. In addition, they help to predict the structure's life and required maintenance in a cost-efficient way. Typically, inspection data gives insight into the structural health. The global structural behavior, and predominantly the structural loading, is generally measured with vibration and strain sensors. Acoustic emission sensors are increasingly used for measuring global crack activity near critical locations. In this paper, we present a procedure for local structural health monitoring by applying Anomaly Detection (AD) on strain sensor data for sensors that are applied in the expected crack path. Sensor data is analyzed by automatic anomaly detection in order to find crack activity at an early stage. This approach targets the monitoring of critical structural locations, such as welds, near which strain sensors can be applied during construction, and/or locations with limited inspection possibilities during structural operation. We investigate several anomaly detection techniques to detect changes in statistical properties, indicating structural degradation. The most effective one is a novel polynomial fitting technique, which tracks slow changes in sensor data. Our approach has been tested on a representative test structure (bridge deck) in a lab environment, under constant and variable amplitude fatigue loading. In both cases, the evolving cracks at the monitored locations were successfully detected, autonomously, by our AD monitoring tool.
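    The abstract names a polynomial fitting technique that tracks slow changes but does not publish its exact rule, so the following is only a hedged sketch: fit a low-order polynomial to each window of strain data and raise an alarm when the fitted level drifts beyond a tolerance. The window size, degree, and tolerance are hypothetical parameters.

```python
import numpy as np

def drift_alarms(signal, window=50, degree=2, tol=1.0):
    """Fit a low-order polynomial per window and flag slow drift.

    Flags a window when its fitted end value deviates from the first
    window's fitted end value by more than `tol`.  Hypothetical
    parameterization; the paper does not publish its exact rule.
    """
    x = np.arange(window)
    baseline = None
    alarms = []
    for start in range(0, len(signal) - window + 1, window):
        coeffs = np.polyfit(x, signal[start:start + window], degree)
        level = np.polyval(coeffs, window - 1)  # fitted value at window end
        if baseline is None:
            baseline = level
        alarms.append(abs(level - baseline) > tol)
    return alarms

rng = np.random.default_rng(1)
steady = rng.normal(0, 0.1, 300)                    # healthy strain signal
drifting = steady + np.linspace(0, 5, 300)          # slow upward drift (crack growth)
print(any(drift_alarms(steady)), any(drift_alarms(drifting)))  # False True
```

Because the polynomial fit smooths out noise, this kind of rule reacts to slow trends rather than single-sample outliers, matching the "slow changes" the paper targets.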

  2. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    Science.gov (United States)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine is extending the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Nino or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon.
The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time
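    The core anomaly operation described above is simple: subtract a chosen climatology from a daily field and test whether the regional area-average exceeds a threshold. A minimal sketch with illustrative (hypothetical) values:

```python
import numpy as np

def region_anomaly_alarm(daily, climatology, threshold):
    """Return True when the region-averaged anomaly exceeds the threshold.

    `daily` and `climatology` are 2-D (lat x lon) grids for the same region;
    the anomaly field is their difference (the "daily variable minus a
    chosen climatology" of the abstract).
    """
    anomaly = daily - climatology
    return float(np.nanmean(anomaly)) > threshold

clim = np.full((4, 4), 20.0)          # toy mean SST climatology, deg C
normal_day = clim + 0.2               # ordinary day
el_nino_day = clim + 2.5              # basin-wide warm anomaly
print(region_anomaly_alarm(normal_day, clim, 1.0),
      region_anomaly_alarm(el_nino_day, clim, 1.0))  # False True
```

In OceanXtremes this per-tile computation is what gets parallelized with Spark over the full multi-decade archive; the sketch shows only the per-region test.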

  3. Detecting errors and anomalies in computerized materials control and accountability databases

    International Nuclear Information System (INIS)

    Whiteson, R.; Hench, K.; Yarbro, T.; Baumgart, C.

    1998-01-01

    The Automated MC and A Database Assessment project is aimed at improving anomaly and error detection in materials control and accountability (MC and A) databases and increasing confidence in the data that they contain. Anomalous data resulting in poor categorization of nuclear material inventories greatly reduces the value of the database information to users. Therefore it is essential that MC and A data be assessed periodically for anomalies or errors. Anomaly detection can identify errors in databases and thus provide assurance of the integrity of data. An expert system has been developed at Los Alamos National Laboratory that examines these large databases for anomalous or erroneous data. For several years, MC and A subject matter experts at Los Alamos have been using this automated system to examine the large amounts of accountability data that the Los Alamos Plutonium Facility generates. These data are collected and managed by the Material Accountability and Safeguards System, a near-real-time computerized nuclear material accountability and safeguards system. This year they have expanded the user base, customizing the anomaly detector for the varying requirements of different groups of users. This paper describes the progress in customizing the expert systems to the needs of the users of the data and reports on their results

  4. Road Traffic Anomaly Detection via Collaborative Path Inference from GPS Snippets

    Directory of Open Access Journals (Sweden)

    Hongtao Wang

    2017-03-01

    Full Text Available Road traffic anomaly denotes a road segment that is anomalous in terms of traffic flow of vehicles. Detecting road traffic anomalies from GPS (Global Positioning System) snippet data is becoming critical in urban computing, since they often suggest underlying events. However, the noisy and sparse nature of GPS snippet data introduces multiple problems that make the detection of road traffic anomalies very challenging. To address these issues, we propose a two-stage solution which consists of two components: a Collaborative Path Inference (CPI) model and a Road Anomaly Test (RAT) model. The CPI model performs path inference, incorporating both static and dynamic features into a Conditional Random Field (CRF). Dynamic context features are learned collaboratively from large GPS snippets via a tensor decomposition technique. The RAT model then calculates the anomalous degree of each road segment from the inferred fine-grained trajectories in given time intervals. We evaluated our method using a large-scale real-world dataset, which includes one month of GPS location data from more than eight thousand taxi cabs in Beijing. The evaluation results show the advantages of our method beyond other baseline techniques.

  5. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Full Text Available Cardiovascular disease is the first cause of death around the world. In accomplishing quick and accurate diagnosis, the automatic electrocardiogram (ECG) analysis algorithm plays an important role, whose first step is QRS detection. The threshold algorithm of QRS complex detection is known for its high-speed computation and minimized memory storage. In this mobile era, the threshold algorithm can be easily ported into portable, wearable, and wireless ECG systems. However, the detection rate of the threshold algorithm still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. The main steps of this algorithm are preprocessing, peak finding, and adaptive-threshold QRS detecting. The detection rate is 99.41%, the sensitivity (Se) is 99.72%, and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison is also made with two other algorithms to demonstrate its superiority. The suspicious abnormal area is shown at the end of the algorithm, and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
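    The peak-finding and adaptive-threshold steps can be sketched as below. This is a hypothetical simplification in the style of classic Pan-Tompkins detectors (running signal-peak and noise-peak estimates blended into a threshold, plus a 200 ms refractory period), not the paper's exact algorithm, and it assumes an already-preprocessed (filtered and integrated) signal.

```python
def detect_qrs(samples, fs=360):
    """Peak finding + adaptive threshold on a preprocessed ECG signal."""
    spk = max(samples) * 0.5       # crude substitute for a learning phase
    npk = 0.0                      # running noise-peak estimate
    refractory = int(0.2 * fs)     # 200 ms refractory period
    beats, last = [], -refractory
    for i in range(1, len(samples) - 1):
        if not (samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]):
            continue                                # not a local maximum
        threshold = npk + 0.25 * (spk - npk)        # adaptive threshold
        if samples[i] > threshold and i - last >= refractory:
            spk = 0.125 * samples[i] + 0.875 * spk  # update signal estimate
            beats.append(i)
            last = i
        else:
            npk = 0.125 * samples[i] + 0.875 * npk  # update noise estimate
    return beats

# Synthetic integrated ECG: a large bump every second, small noise bumps between.
sig = [0.0] * 1080
for beat in (100, 460, 820):
    sig[beat] = 1.0
for noise in (250, 610, 970):
    sig[noise] = 0.1
print(detect_qrs(sig))  # [100, 460, 820]
```

Because the threshold adapts to both the QRS and noise peak histories, small noise bumps are rejected while the large complexes are kept.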

  6. Using an autonomous Wave Glider to detect seawater anomalies related to submarine groundwater discharge - engineering challenge

    Science.gov (United States)

    Leibold, P.; Brueckmann, W.; Schmidt, M.; Balushi, H. A.; Abri, O. A.

    2017-12-01

    Coastal aquifer systems are amongst the most precious and vulnerable water resources worldwide. While differing in lateral and vertical extent, they commonly show a complex interaction with the marine realm. Excessive groundwater extraction can cause saltwater intrusion from the sea into the aquifers, having a strongly negative impact on groundwater quality. While the reverse pathway, the discharge of groundwater into the sea, is well understood in principle, its mechanisms and quantities are not well constrained. We will present a project that combines onshore monitoring and modeling of groundwater in the coastal plain of Salalah, Oman, with an offshore autonomous robotic monitoring system, the Liquid Robotics Wave Glider. Eventually, fluxes detected by the Wave Glider system and the onshore monitoring of groundwater will be combined into a 3-D flow model of the coastal and deeper aquifers. The main tool for the offshore SGD (submarine groundwater discharge) investigation is a Wave Glider, an autonomous vehicle based on a new propulsion technology. The Wave Glider is a low-cost satellite-connected marine craft, consisting of a combination of a sea-surface and an underwater component, which is propelled by the conversion of ocean wave energy into forward thrust. While the wave energy propulsion system is purely mechanical, electrical energy for onboard computers, communication and sensors is provided by photovoltaic cells. For the project, the SGD Wave Glider is being equipped with dedicated sensors to measure temperature, conductivity, and Radon isotope (222Rn, 220Rn) activity concentration, as well as other tracers of groundwater discharge. Dedicated software using this data input will eventually allow the Wave Glider to autonomously collect information and actively adapt its search pattern to hunt for spatial and temporal anomalies.
    Our presentation will focus on the engineering and operational challenges of detecting submarine groundwater discharges with the Wave Glider system in the Bay of Salalah.

  7. Adaptive distributed outlier detection for WSNs.

    Science.gov (United States)

    De Paola, Alessandra; Gaglio, Salvatore; Lo Re, Giuseppe; Milazzo, Fabrizio; Ortolani, Marco

    2015-05-01

    The paradigm of pervasive computing is gaining more and more attention nowadays, thanks to the possibility of obtaining precise and continuous monitoring. Ease of deployment and adaptivity are typically implemented by adopting autonomous and cooperative sensory devices; however, for such systems to be of any practical use, reliability and fault tolerance must be guaranteed, for instance by detecting corrupted readings amidst the huge amount of gathered sensory data. This paper proposes an adaptive distributed Bayesian approach for detecting outliers in data collected by a wireless sensor network; our algorithm aims at optimizing classification accuracy, time complexity and communication complexity, while also considering externally imposed constraints on these conflicting goals. The performed experimental evaluation showed that our approach is able to improve the considered metrics for latency and energy consumption, with limited impact on classification accuracy.

  8. A primitive study on unsupervised anomaly detection with an autoencoder in emergency head CT volumes

    Science.gov (United States)

    Sato, Daisuke; Hanaoka, Shouhei; Nomura, Yukihiro; Takenaga, Tomomi; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Abe, Osamu

    2018-02-01

    Purpose: The target disorders of emergency head CT are wide-ranging. Therefore, people working in an emergency department desire a computer-aided detection system for general disorders. In this study, we proposed an unsupervised anomaly detection method for emergency head CT using an autoencoder and evaluated its anomaly detection performance. Methods: We used a 3D convolutional autoencoder (3D-CAE), which contains 11 layers in the convolution block and 6 layers in the deconvolution block. In the training phase, we trained the 3D-CAE using 10,000 3D patches extracted from 50 normal cases. In the test phase, we calculated the abnormality of each voxel in 38 emergency head CT volumes (22 abnormal cases and 16 normal cases) and evaluated the likelihood of lesion existence. Results: Our method achieved a sensitivity of 68% and a specificity of 88%, with an area under the receiver operating characteristic curve of 0.87. This shows that the method has moderate accuracy in distinguishing normal CT cases from abnormal ones. Conclusion: Our method shows potential for anomaly detection in emergency head CT.

  9. Capacitance probe for detection of anomalies in non-metallic plastic pipe

    Science.gov (United States)

    Mathur, Mahendra P.; Spenik, James L.; Condon, Christopher M.; Anderson, Rodney; Driscoll, Daniel J.; Fincham, Jr., William L.; Monazam, Esmail R.

    2010-11-23

    The disclosure relates to analysis of materials using a capacitive sensor to detect anomalies through comparison of measured capacitances. The capacitive sensor is used in conjunction with a capacitance measurement device, a location device, and a processor in order to generate a capacitance versus location output which may be inspected for the detection and localization of anomalies within the material under test. The components may be carried as payload on an inspection vehicle which may traverse through a pipe interior, allowing evaluation of nonmetallic or plastic pipes when the piping exterior is not accessible. In an embodiment, supporting components are solid-state devices powered by a low voltage on-board power supply, providing for use in environments where voltage levels may be restricted.

  10. Thermal anomalies detection before strong earthquakes (M > 6.0) using interquartile, wavelet and Kalman filter methods

    Directory of Open Access Journals (Sweden)

    M. Akhoondzadeh

    2011-04-01

    Full Text Available Thermal anomaly is known as a significant precursor of strong earthquakes; therefore, Land Surface Temperature (LST) time series have been analyzed in this study to locate relevant anomalous variations prior to the Bam (26 December 2003), Zarand (22 February 2005) and Borujerd (31 March 2006) earthquakes. The duration of the three datasets, which are comprised of MODIS LST images, is 44, 28 and 46 days for the Bam, Zarand and Borujerd earthquakes, respectively. In order to exclude variations of LST from seasonal temperature effects, Air Temperature (AT) data derived from the meteorological stations close to the earthquake epicenters have been taken into account. The detection of thermal anomalies has been assessed using interquartile, wavelet transform and Kalman filter methods, each presenting its own independent property in anomaly detection. The interquartile method has been used to construct the upper and lower bounds in LST data to detect disturbed states outside the bounds which might be associated with impending earthquakes. The wavelet transform method has been used to locate local maxima within each time series of LST data, identifying earthquake anomalies by a predefined threshold. Also, the prediction property of the Kalman filter has been used in the detection process of prominent LST anomalies. The results indicate that the interquartile method is capable of detecting the highest-intensity anomaly values, the wavelet transform is sensitive to sudden changes, and the Kalman filter method significantly detects the highest unpredictable variations of LST. The three methods detected anomalous occurrences during 1 to 20 days prior to the earthquakes, showing close agreement between the results of the different methods applied to the LST data in the detection of pre-seismic anomalies. The proposed methods for anomaly detection were also applied to regions irrelevant to earthquakes, for which no anomaly was detected.
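    The interquartile step is the simplest of the three and can be sketched directly: build Tukey-style fences from the quartiles and flag values outside them. The LST values and the spike index below are hypothetical, and the quartiles are taken with a simple index rule rather than interpolation.

```python
def interquartile_bounds(values, k=1.5):
    """Upper/lower bounds from the interquartile range (Tukey fences)."""
    s = sorted(values)
    n = len(s)
    q1 = s[n // 4]              # simple index-based quartiles
    q3 = s[(3 * n) // 4]
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def iqr_anomalies(series, k=1.5):
    lo, hi = interquartile_bounds(series, k)
    return [i for i, v in enumerate(series) if v < lo or v > hi]

# 30 days of LST-like values (deg C) with one pre-event spike at index 20.
lst = [21.0, 20.5, 21.3, 20.8, 21.1, 20.9, 21.2, 20.7, 21.0, 20.6,
       21.1, 20.9, 21.4, 20.8, 21.0, 21.2, 20.7, 21.1, 20.9, 21.0,
       27.5, 21.2, 20.8, 21.1, 20.9, 21.0, 21.3, 20.7, 21.1, 20.8]
print(iqr_anomalies(lst))  # [20]
```

Only observations outside the fences are reported, which is why this method excels at the highest-intensity anomalies while ignoring ordinary day-to-day variation.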

  11. Application of Kalman filter in detecting pre-earthquake ionospheric TEC anomaly

    Directory of Open Access Journals (Sweden)

    Zhu Fuying

    2011-05-01

    Full Text Available As an attempt, the Kalman filter was used to study the anomalous variations of ionospheric Total Electron Content (TEC) before and after the Wenchuan Ms8.0 earthquake; these TEC data were calculated from the GPS data observed by the Crustal Movement Observation Network of China. The result indicates that this method is reasonable and reliable in detecting TEC anomalies associated with large earthquakes.
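    The prediction-based use of the Kalman filter can be sketched with a scalar random-walk model: predict the next TEC value, and flag samples whose innovation (measurement minus prediction) exceeds a few standard deviations. The noise variances and the threshold are illustrative values, not from the paper.

```python
import math

def kalman_anomalies(series, q=0.01, r=0.25, k=3.0):
    """Flag samples whose innovation exceeds k standard deviations.

    Scalar random-walk Kalman filter: state = expected TEC level,
    process variance q, measurement variance r (illustrative values).
    """
    x, p = series[0], 1.0
    flags = []
    for z in series:
        p_pred = p + q                    # predict
        s = p_pred + r                    # innovation variance
        innovation = z - x
        flags.append(abs(innovation) > k * math.sqrt(s))
        gain = p_pred / s                 # update
        x = x + gain * innovation
        p = (1.0 - gain) * p_pred
    return flags

tec = [30.0] * 40                         # quiet-time TEC level (TECU, toy)
tec[25] = 36.0                            # sudden TEC enhancement
flags = kalman_anomalies(tec)
print([i for i, f in enumerate(flags) if f])  # [25]
```

Because the filter keeps predicting the quiet-time level, only the unpredictable excursion is flagged, matching the "prediction property" exploited in the study.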

  12. Improvement of statistical methods for detecting anomalies in climate and environmental monitoring systems

    Science.gov (United States)

    Yakunin, A. G.; Hussein, H. M.

    2018-01-01

    The article shows how known statistical methods, which are widely used in solving financial problems and in a number of other fields of science and technology, can be effectively applied, after minor modification, to solve such problems in climate and environment monitoring systems as the detection of anomalies in the form of abrupt changes in signal levels, the occurrence of positive and negative outliers, and the violation of the cycle form in periodic processes.

  13. Setup Instructions for the Applied Anomaly Detection Tool (AADT) Web Server

    Science.gov (United States)

    2016-09-01

    The tool has been developed for many platforms: Android, iOS, and Windows. The Windows version has been developed as a web server that allows the... instructional information about identifying them as groups and individually. The software has been developed for several different platforms: Android

  14. Behavior Drift Detection Based on Anomalies Identification in Home Living Quantitative Indicators

    OpenAIRE

    Fabio Veronese; Andrea Masciadri; Sara Comai; Matteo Matteucci; Fabio Salice

    2018-01-01

    Home Automation and Smart Home diffusion are providing an interesting opportunity to implement elderly monitoring. This is valid new technological support to allow in-place aging of seniors, by means of a detection system that notifies potential anomalies. Monitoring has been implemented by means of Complex Event Processing on live streams of home automation data: this allows the analysis of the behavior of the house inhabitant through quantitative indicators. Different kinds of quantitative in...

  15. Network Traffic Features for Anomaly Detection in Specific Industrial Control System Network

    Directory of Open Access Journals (Sweden)

    Matti Mantere

    2013-09-01

    Full Text Available The deterministic and restricted nature of industrial control system networks sets them apart from more open networks, such as local area networks in office environments. This improves the usability of network security monitoring approaches that would be less feasible in more open environments. One such approach is machine-learning-based anomaly detection. Without proper customization for the special requirements of the industrial control system network environment, many existing anomaly or misuse detection systems will perform sub-optimally. A machine-learning-based approach could reduce the amount of manual customization required for different industrial control system networks. In this paper we analyze a possible set of features to be used in a machine-learning-based anomaly detection system in the real-world industrial control system network environment under investigation. The network under investigation is represented by an architectural drawing and by results derived from network trace analysis. The network trace is captured from a live running industrial process control network and includes both control data and the data flowing between the control network and the office network. We limit the investigation to the IP traffic in the traces.

  16. A Model-Based Anomaly Detection Approach for Analyzing Streaming Aircraft Engine Measurement Data

    Science.gov (United States)

    Simon, Donald L.; Rinehart, Aidan Walker

    2015-01-01

    This paper presents a model-based anomaly detection architecture designed for analyzing streaming transient aircraft engine measurement data. The technique calculates and monitors residuals between sensed engine outputs and model-predicted outputs for anomaly detection purposes. Pivotal to the performance of this technique is the ability to construct a model that accurately reflects the nominal operating performance of the engine. The dynamic model applied in the architecture is a piecewise linear design comprising steady-state trim points and dynamic state space matrices. A simple curve-fitting technique for updating the model trim point information based on steady-state information extracted from available nominal engine measurement data is presented. Results from the application of the model-based approach for processing actual engine test data are shown. These include both nominal fault-free test case data and seeded fault test case data. The results indicate that the updates applied to improve the model trim point information also improve anomaly detection performance. Recommendations for follow-on enhancements to the technique are also presented and discussed.
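    The residual-monitoring core of such an architecture can be sketched as: compare each sensed output with the model prediction and declare an anomaly only after the residual persists, suppressing single-sample spikes. The signal names, limit, and persistence count below are illustrative, not from the paper.

```python
def residual_monitor(sensed, predicted, limit, persist=3):
    """Declare an anomaly when |sensed - predicted| exceeds `limit` for
    `persist` consecutive samples (a simple persistence check; the
    parameters are illustrative thresholds)."""
    run = 0
    for i, (y, yhat) in enumerate(zip(sensed, predicted)):
        run = run + 1 if abs(y - yhat) > limit else 0
        if run >= persist:
            return i            # first sample at which anomaly is declared
    return None

model = [550.0] * 20                       # model-predicted EGT, deg C (toy)
nominal = [550.0 + 0.5 * ((-1) ** i) for i in range(20)]  # small sensor noise
biased = nominal[:10] + [y + 15.0 for y in nominal[10:]]  # sensor bias fault
print(residual_monitor(nominal, model, limit=5.0),
      residual_monitor(biased, model, limit=5.0))  # None 12
```

The quality of the model prediction directly sets how tight the limit can be, which is why the paper's trim-point updates improve detection performance.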

  17. Clusters versus GPUs for Parallel Target and Anomaly Detection in Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Antonio Plaza

    2010-01-01

    Full Text Available Remotely sensed hyperspectral sensors provide image data containing rich information in both the spatial and the spectral domain, and this information can be used to address detection tasks in many applications. In many surveillance applications, the size of the objects (targets) searched for constitutes a very small fraction of the total search area and the spectral signatures associated with the targets are generally different from those of the background, hence the targets can be seen as anomalies. In hyperspectral imaging, many algorithms have been proposed for automatic target and anomaly detection. Given the dimensionality of hyperspectral scenes, these techniques can be time-consuming and difficult to apply in applications requiring real-time performance. In this paper, we develop several new parallel implementations of automatic target and anomaly detection algorithms. The proposed parallel algorithms are quantitatively evaluated using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.
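    The abstract does not name the specific detectors, so as a representative example here is the classic Reed-Xiaoli (RX) anomaly detector often used in this literature: each pixel's spectrum is scored by its Mahalanobis distance from the global background statistics. The cube dimensions and target are synthetic.

```python
import numpy as np

def rx_scores(cube):
    """Reed-Xiaoli anomaly scores for a hyperspectral cube (rows x cols x bands).

    Score = Mahalanobis distance of each pixel spectrum from the global
    background mean/covariance (standard serial formulation; the paper
    parallelizes detectors of this kind).
    """
    h, w, b = cube.shape
    pixels = cube.reshape(-1, b).astype(float)
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(b))  # regularize
    centered = pixels - mean
    scores = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)
    return scores.reshape(h, w)

rng = np.random.default_rng(2)
cube = rng.normal(0.0, 1.0, (16, 16, 5))   # synthetic background, 5 bands
cube[8, 8] += 6.0                          # implant a spectrally anomalous target
scores = rx_scores(cube)
print(np.unravel_index(scores.argmax(), scores.shape))  # (8, 8)
```

The per-pixel independence of the scoring step is exactly what makes detectors like this attractive for cluster and GPU parallelization.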

  18. Clusters versus GPUs for Parallel Target and Anomaly Detection in Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Paz Abel

    2010-01-01

    Full Text Available Remotely sensed hyperspectral sensors provide image data containing rich information in both the spatial and the spectral domain, and this information can be used to address detection tasks in many applications. In many surveillance applications, the size of the objects (targets) searched for constitutes a very small fraction of the total search area and the spectral signatures associated with the targets are generally different from those of the background, hence the targets can be seen as anomalies. In hyperspectral imaging, many algorithms have been proposed for automatic target and anomaly detection. Given the dimensionality of hyperspectral scenes, these techniques can be time-consuming and difficult to apply in applications requiring real-time performance. In this paper, we develop several new parallel implementations of automatic target and anomaly detection algorithms. The proposed parallel algorithms are quantitatively evaluated using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) system over the World Trade Center (WTC) in New York, five days after the terrorist attacks that collapsed the two main towers in the WTC complex.

  19. Adaptive sampling algorithm for detection of superpoints

    Institute of Scientific and Technical Information of China (English)

    CHENG Guang; GONG Jian; DING Wei; WU Hua; QIANG ShiQiang

    2008-01-01

    The superpoints are the sources (or the destinations) that connect with a large number of destinations (or sources) during a measurement time interval, so detecting the superpoints in real time is very important to network security and management. Previous algorithms are not able to control memory usage while delivering the desired accuracy, so it is hard to detect the superpoints on a high-speed link in real time. In this paper, we propose an adaptive sampling algorithm to detect the superpoints in real time, which uses a flow sample-and-hold module to reduce the detection of non-superpoints and to improve the measurement accuracy for the superpoints. We also design a data stream structure to maintain the flow records, which compensates for the flow hash collisions statistically. An adaptive process based on different sampling probabilities is used to maintain the recorded IP addresses in the limited memory. This algorithm is compared with the other algorithms by analyzing real network trace data. Experimental results and mathematical analysis show that this algorithm has the advantages of both a limited memory requirement and high measurement accuracy.
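    The sample-and-hold idea can be sketched as follows: a new source is admitted only with a small sampling probability (bounding memory), but once a source is held, all of its later flows are recorded, so a true superpoint's distinct-destination count is barely undercounted. The probability and threshold are illustrative, not the paper's adaptive schedule.

```python
import random

def estimate_superpoints(flows, sample_prob=0.2, threshold=30):
    """Sample-and-hold estimate of superpoints (sources with many distinct
    destinations).  Parameters are illustrative."""
    held = {}                              # src -> set of distinct dsts
    for src, dst in flows:
        if src in held:
            held[src].add(dst)             # hold: always record once sampled
        elif random.random() < sample_prob:
            held[src] = {dst}              # sample: admit a new source
    return {src for src, dsts in held.items() if len(dsts) >= threshold}

random.seed(7)
flows = [("scanner", "10.0.0.%d" % d) for d in range(200)]  # 200 distinct dsts
flows += [("client", "10.0.1.1")] * 200                     # 1 distinct dst
random.shuffle(flows)
supers = estimate_superpoints(flows)
print(supers)  # {'scanner'}
```

A source with few destinations is unlikely to accumulate enough distinct entries even when sampled, which is how the module suppresses non-superpoints while keeping memory bounded.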

  20. Improving Anomaly Detection for Text-Based Protocols by Exploiting Message Structures

    Directory of Open Access Journals (Sweden)

    Christian M. Mueller

    2010-12-01

    Full Text Available Service platforms using text-based protocols need to be protected against attacks. Machine-learning algorithms with pattern matching can be used to detect even previously unknown attacks. In this paper, we present an extension to known Support Vector Machine (SVM)-based anomaly detection algorithms for the Session Initiation Protocol (SIP). Our contribution is to extend the amount of different features used for classification (the feature space) by exploiting the structure of SIP messages, which reduces the false positive rate. Additionally, we show how combining our approach with attribute reduction significantly improves throughput.
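    "Exploiting the structure of SIP messages" means building features per message part (request line, individual headers) instead of treating the message as one blob. A hedged sketch of such a feature extractor; the specific statistics (lengths, token counts) are illustrative, not the paper's exact feature list, and the resulting vectors would feed the SVM classifier.

```python
def sip_structure_features(message):
    """Split a SIP message into per-header features instead of one blob."""
    lines = message.strip().splitlines()
    # Request line gets its own features.
    features = {"request.len": len(lines[0]),
                "request.tokens": len(lines[0].split())}
    # One group of features per header field.
    for line in lines[1:]:
        if ":" not in line:
            continue
        name, _, value = line.partition(":")
        key = name.strip().lower()
        features[key + ".len"] = len(value.strip())
        features[key + ".tokens"] = len(value.split())
    return features

invite = ("INVITE sip:bob@example.com SIP/2.0\n"
          "Via: SIP/2.0/UDP pc33.example.com\n"
          "To: Bob <sip:bob@example.com>\n"
          "Call-ID: a84b4c76e66710\n")
feats = sip_structure_features(invite)
print(feats["request.tokens"], feats["to.len"])  # 3 25
```

An anomalous value in a single header (say, an oversized To field) then perturbs only its own features, which is what sharpens the classifier and lowers the false positive rate.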

  1. Entropy Measures for Stochastic Processes with Applications in Functional Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Gabriel Martos

    2018-01-01

    Full Text Available We propose a definition of entropy for stochastic processes. We provide a reproducing kernel Hilbert space model to estimate entropy from a random sample of realizations of a stochastic process, namely functional data, and introduce two approaches to estimate minimum entropy sets. These sets are relevant to detect anomalous or outlier functional data. A numerical experiment illustrates the performance of the proposed method; in addition, we conduct an analysis of mortality rate curves as an interesting application in a real-data context to explore functional anomaly detection.

  2. An anomaly detection and isolation scheme with instance-based learning and sequential analysis

    International Nuclear Information System (INIS)

    Yoo, T. S.; Garcia, H. E.

    2006-01-01

    This paper presents an online anomaly detection and isolation technique using an instance-based learning method combined with a sequential change detection and isolation algorithm. The proposed method uses kernel density estimation techniques to build statistical models of the given empirical data (null hypothesis). The null hypothesis is associated with a set of alternative hypotheses modeling the abnormalities of the systems. A decision procedure involves a sequential change detection and isolation algorithm. Notably, the proposed method enjoys asymptotic optimality, as the applied change detection and isolation algorithm is optimal in minimizing the worst mean detection/isolation delay for a given mean time before a false alarm or a false isolation. Applicability of this methodology and its performance are illustrated with a redundant sensor data set. (authors)
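    The sequential change detection component can be illustrated with the textbook CUSUM procedure (accumulate the log-likelihood ratio of each observation and alarm when the running sum crosses a threshold); the paper's algorithm generalizes this to multiple alternative hypotheses for isolation. Means, variance, and threshold below are illustrative.

```python
def cusum(samples, mu0=0.0, mu1=1.0, sigma=1.0, h=2.0):
    """One-sided CUSUM for a shift from mean mu0 to mu1 (known variance).

    Accumulates the log-likelihood ratio of each observation and alarms
    when the running sum exceeds the threshold h."""
    g = 0.0
    for i, x in enumerate(samples):
        llr = (mu1 - mu0) / sigma ** 2 * (x - (mu0 + mu1) / 2.0)
        g = max(0.0, g + llr)   # reset at zero: only upward shifts matter
        if g > h:
            return i            # alarm time
    return None

pre = [0.1, -0.2, 0.3, -0.1, 0.0, 0.2, -0.3, 0.1]       # null behavior
post = [1.2, 0.9, 1.1, 1.3, 0.8, 1.0, 1.2, 0.9]         # shifted behavior
print(cusum(pre + post))  # 11 (alarms shortly after the change at index 8)
```

The threshold h trades the mean time to false alarm against detection delay, which is exactly the optimality trade-off the abstract describes.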

  3. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensor data in a multimodal manner. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  4. Anomaly Detection for Aviation Safety Based on an Improved KPCA Algorithm

    Directory of Open Access Journals (Sweden)

    Xiaoyu Zhang

    2017-01-01

    Full Text Available Thousands of flight datasets must be analyzed per day for a moderately sized fleet; therefore, flight datasets are very large. In this paper, an improved kernel principal component analysis (KPCA) method is proposed to search for signatures of anomalies in flight datasets through the squared prediction error statistics, in which the number of principal components and the confidence for the confidence limit are automatically determined by an OpenMP-based K-fold cross-validation algorithm, and the parameter in the radial basis function (RBF) is optimized by a GPU-based kernel learning method. Performed on an Nvidia GeForce GTX 660, the computation of the proposed GPU-based RBF parameter is 112.9 times (on average 82.6 times) faster than sequential CPU task execution. The OpenMP-based K-fold cross-validation process for training the KPCA anomaly detection model becomes 2.4 times (on average 1.5 times) faster than sequential CPU task execution. Experiments show that the proposed approach can effectively detect anomalies with an accuracy of 93.57% and a false positive alarm rate of 1.11%.

  5. Illustration, detection and prevention of sleep deprivation anomaly in mobile ad hoc networks

    International Nuclear Information System (INIS)

    Nadeem, A.; Ahsan, K.; Sarim, M.

    2017-01-01

    MANETs (Mobile Ad Hoc Networks) have applications in various walks of life, from rescue and battlefield operations to personal and commercial uses. However, routing operations in MANETs are still vulnerable to anomalies and DoS (Denial of Service) attacks such as sleep deprivation. In an SD (Sleep Deprivation) attack, a malicious node exploits a vulnerability in the route discovery function of a reactive routing protocol, for example AODV (Ad Hoc On-Demand Distance Vector). In this paper, we first illustrate the SD anomaly in MANETs and then propose an SD detection and prevention algorithm which efficiently deals with this attack. We assess the performance of our proposed approach through simulation, evaluating its effectiveness under different network scenarios. (author)
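    The basic symptom of a sleep-deprivation attack on AODV route discovery is an abnormal rate of route requests (RREQs) from one node, so a minimal detector can count RREQs per node over a sliding window. The window length and limit are illustrative thresholds, not from the paper.

```python
from collections import defaultdict

def flag_sleep_deprivers(rreq_log, window=10.0, max_rreq=5):
    """Flag nodes issuing an abnormal number of RREQs within a time window."""
    recent = defaultdict(list)             # node -> recent RREQ timestamps
    flagged = set()
    for t, node in sorted(rreq_log):
        recent[node].append(t)
        # drop timestamps that fell out of the sliding window
        recent[node] = [s for s in recent[node] if t - s <= window]
        if len(recent[node]) > max_rreq:
            flagged.add(node)
    return flagged

log = [(float(t), "n1") for t in range(0, 100, 20)]        # normal: 1 per 20 s
log += [(float(t) / 2.0, "mal") for t in range(0, 40)]     # attacker: 2 per s
print(flag_sleep_deprivers(log))  # {'mal'}
```

Flagged nodes' RREQs can then be rate-limited or ignored, preventing neighbors from being kept awake, which is the prevention half of such a scheme.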

  6. Radiation anomaly detection algorithms for field-acquired gamma energy spectra

    Science.gov (United States)

    Mukhopadhyay, Sanjoy; Maurer, Richard; Wolff, Ron; Guss, Paul; Mitchell, Stephen

    2015-08-01

    The Remote Sensing Laboratory (RSL) is developing a tactical, networked radiation detection system that will be agile, reconfigurable, and capable of rapid threat assessment with a high degree of fidelity and certainty. Our design is driven by the needs of users such as law enforcement personnel who must make decisions by evaluating threat signatures in urban settings. The most efficient tool available to identify the nature of the threat object is real-time gamma spectroscopic analysis, as it is fast and has a very low probability of producing false positive alarm conditions. Urban radiological searches are inherently challenged by the rapid and large spatial variation of background gamma radiation, the presence of benign naturally occurring radioactive materials (NORM), and shielded and/or masked threat sources. Multiple spectral anomaly detection algorithms have been developed by national laboratories and commercial vendors. For example, the Gamma Detector Response and Analysis Software (GADRAS), a one-dimensional deterministic radiation transport code capable of calculating gamma-ray spectra using physics-based detector response functions, was developed at Sandia National Laboratories. The nuisance-rejection spectral comparison ratio anomaly detection algorithm (NSCRAD), developed at Pacific Northwest National Laboratory, uses spectral comparison ratios to detect deviations from benign medical and NORM radiation sources and can work despite a strong presence of NORM and/or medical sources. RSL has developed its own wavelet-based gamma energy spectral anomaly detection algorithm called WAVRAD. Test results and the relative merits of these different algorithms will be discussed and demonstrated.

  7. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems.

    Science.gov (United States)

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to cope with information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: (1) item abnormality, in that the rating of target items is always maximal or minimal; and (2) attack promptness, in that attack profiles are injected within a very short period of time. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and generality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first partitions item-rating time series dynamically based on important points; the chi-square distribution (χ2) is then used to detect abnormal intervals. Experimental results on the MovieLens 100K and 1M datasets indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes.
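
    The interval test at the heart of such a method can be sketched as follows. This is a minimal illustration, not the authors' code: ratings in each interval are compared with the item's overall rating distribution via a chi-square goodness-of-fit statistic, and intervals exceeding the critical value are flagged (the 1-5 rating scale, the partition points and the significance level are assumptions).

```python
import numpy as np
from scipy.stats import chi2

def abnormal_intervals(ratings, boundaries, alpha=0.01):
    """Flag rating-series intervals whose rating distribution deviates
    from the item's overall distribution (chi-square goodness of fit).
    `boundaries` are the partition points, e.g. from an important-points
    segmentation; rating values are assumed to lie in {1..5}."""
    values = np.arange(1, 6)
    overall = np.array([(ratings == v).mean() for v in values])
    flags = []
    for lo, hi in zip(boundaries[:-1], boundaries[1:]):
        seg = ratings[lo:hi]
        observed = np.array([(seg == v).sum() for v in values])
        expected = overall * len(seg)
        mask = expected > 0
        stat = ((observed[mask] - expected[mask]) ** 2 / expected[mask]).sum()
        flags.append(stat > chi2.ppf(1 - alpha, df=mask.sum() - 1))
    return flags

rng = np.random.default_rng(1)
normal = rng.integers(1, 6, size=300)     # mixed genuine ratings
attack = np.full(60, 5)                   # prompt burst of maximum ratings
series = np.concatenate([normal, attack])
flags = abnormal_intervals(series, [0, 100, 200, 300, 360])
```

    The burst of maximum ratings in the last interval produces a very large chi-square statistic, so that interval is flagged while the genuine ones are not.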

  8. Seasonal ARMA-based SPC charts for anomaly detection: Application to emergency department systems

    KAUST Repository

    Kadri, Farid

    2015-10-22

    Monitoring complex production systems is essential to ensure their management, reliability and safety, as well as to maintain the desired product quality. Early detection of emergent abnormal behaviour in monitored systems allows pre-emptive action to prevent more serious consequences, improve system operations and reduce manufacturing and/or service costs. This study reports the design of a new methodology for detecting abnormal situations, based on the integration of time-series analysis models and statistical process control (SPC) tools into a joint monitoring system that helps supervise the behaviour of emergency department (ED) services. The monitoring system developed is able to provide early alerts in the event of abnormal situations. The proposed seasonal autoregressive moving average (SARMA)-based exponentially weighted moving average (EWMA) anomaly detection scheme was successfully applied to practical data collected from the database of the paediatric emergency department (PED) at Lille regional hospital centre, France. The method utilizes SARMA as the modelling framework and EWMA for anomaly detection: the EWMA control chart is applied to the uncorrelated residuals obtained from the SARMA model. The detection results of the EWMA chart are compared with two other commonly applied residual-based tests: a Shewhart individuals chart and a cumulative sum (CUSUM) control chart.
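
    Assuming residuals have already been obtained from a fitted SARMA model, the EWMA chart on those residuals can be sketched in a few lines. This is a minimal illustration with assumed parameters, not the authors' implementation; the in-control mean and standard deviation are estimated from a reference period.

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA control chart on (approximately uncorrelated) model residuals.
    mu0/sigma0 come from an in-control reference period; lam is the
    smoothing weight and L the control-limit width in sigmas."""
    z = np.empty(len(x))
    alarms = np.empty(len(x), dtype=bool)
    prev = mu0
    for t, r in enumerate(x):
        prev = lam * r + (1 - lam) * prev
        z[t] = prev
        # time-varying limit that converges to the steady-state value
        half = L * sigma0 * np.sqrt(lam / (2 - lam)
                                    * (1 - (1 - lam) ** (2 * (t + 1))))
        alarms[t] = abs(prev - mu0) > half
    return z, alarms

rng = np.random.default_rng(2)
resid = rng.normal(0, 1, 300)        # stand-in for SARMA residuals
resid[200:] += 1.5                   # sustained shift, e.g. abnormal ED demand
mu0, s0 = resid[:150].mean(), resid[:150].std(ddof=1)
z, alarms = ewma_chart(resid, mu0, s0)
```

    Because the EWMA accumulates small deviations, the 1.5-sigma shift is flagged within a few observations of its onset, which is exactly the kind of persistent change a Shewhart individuals chart reacts to more slowly.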

  9. Data-Driven Anomaly Detection Performance for the Ares I-X Ground Diagnostic Prototype

    Science.gov (United States)

    Martin, Rodney A.; Schwabacher, Mark A.; Matthews, Bryan L.

    2010-01-01

    In this paper, we will assess the performance of a data-driven anomaly detection algorithm, the Inductive Monitoring System (IMS), which can be used to detect simulated Thrust Vector Control (TVC) system failures. However, the ability of IMS to detect these failures in a true operational setting may be related to the realistic nature of how they are simulated. As such, we will investigate both a low fidelity and high fidelity approach to simulating such failures, with the latter based upon the underlying physics. Furthermore, the ability of IMS to detect anomalies that were previously unknown and not previously simulated will be studied in earnest, as well as apparent deficiencies or misapplications that result from using the data-driven paradigm. Our conclusions indicate that robust detection performance of simulated failures using IMS is not appreciably affected by the use of a high fidelity simulation. However, we have found that the inclusion of a data-driven algorithm such as IMS into a suite of deployable health management technologies does add significant value.
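
    As a rough illustration of the data-driven paradigm, IMS-style monitoring can be sketched as clustering nominal data and scoring new samples by their distance to the nearest cluster. This is a simplification: IMS's actual envelope construction differs, and all data and parameters here are assumptions.

```python
import numpy as np

class NominalEnvelope:
    """Minimal IMS-flavoured monitor (a sketch, not NASA's implementation):
    cluster nominal training vectors with plain k-means, then score new
    samples by distance to the nearest cluster centroid."""
    def __init__(self, n_clusters=8, seed=0):
        self.k = n_clusters
        self.rng = np.random.default_rng(seed)

    def fit(self, X, iters=20):
        # Lloyd's algorithm on nominal data only
        self.centroids = X[self.rng.choice(len(X), self.k, replace=False)]
        for _ in range(iters):
            d = np.linalg.norm(X[:, None] - self.centroids[None], axis=2)
            labels = d.argmin(axis=1)
            for j in range(self.k):
                if (labels == j).any():
                    self.centroids[j] = X[labels == j].mean(axis=0)
        self.threshold = np.quantile(self.score(X), 0.99)
        return self

    def score(self, X):
        d = np.linalg.norm(X[:, None] - self.centroids[None], axis=2)
        return d.min(axis=1)

rng = np.random.default_rng(3)
nominal = rng.normal(0, 1, size=(500, 4))     # nominal TVC-like telemetry (toy)
monitor = NominalEnvelope().fit(nominal)
fault = np.array([[8.0, 8.0, 8.0, 8.0]])      # simulated failure signature
is_anomaly = monitor.score(fault)[0] > monitor.threshold
```

    The key property discussed in the paper carries over: the monitor only knows what nominal looks like, so it can flag previously unseen failure modes, but it inherits whatever gaps exist in the nominal training envelope.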

  10. Seasonal ARMA-based SPC charts for anomaly detection: Application to emergency department systems

    KAUST Repository

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondè s; Sun, Ying; Tahon, Christian

    2015-01-01

    Monitoring complex production systems is essential to ensure their management, reliability and safety, as well as to maintain the desired product quality. Early detection of emergent abnormal behaviour in monitored systems allows pre-emptive action to prevent more serious consequences, improve system operations and reduce manufacturing and/or service costs. This study reports the design of a new methodology for detecting abnormal situations, based on the integration of time-series analysis models and statistical process control (SPC) tools into a joint monitoring system that helps supervise the behaviour of emergency department (ED) services. The monitoring system developed is able to provide early alerts in the event of abnormal situations. The proposed seasonal autoregressive moving average (SARMA)-based exponentially weighted moving average (EWMA) anomaly detection scheme was successfully applied to practical data collected from the database of the paediatric emergency department (PED) at Lille regional hospital centre, France. The method utilizes SARMA as the modelling framework and EWMA for anomaly detection: the EWMA control chart is applied to the uncorrelated residuals obtained from the SARMA model. The detection results of the EWMA chart are compared with two other commonly applied residual-based tests: a Shewhart individuals chart and a cumulative sum (CUSUM) control chart.

  11. Item Anomaly Detection Based on Dynamic Partition for Time Series in Recommender Systems

    Science.gov (United States)

    Gao, Min; Tian, Renli; Wen, Junhao; Xiong, Qingyu; Ling, Bin; Yang, Linda

    2015-01-01

    In recent years, recommender systems have become an effective method to cope with information overload. However, recommendation technology still suffers from many problems. One of these problems is shilling attacks: attackers inject spam user profiles to disturb the list of recommended items. All types of shilling attacks share two characteristics: (1) item abnormality, in that the rating of target items is always maximal or minimal; and (2) attack promptness, in that attack profiles are injected within a very short period of time. Some papers have proposed item anomaly detection methods based on these two characteristics, but their detection rate, false alarm rate, and generality need further improvement. To address these problems, this paper proposes an item anomaly detection method based on dynamic partitioning of time series. The method first partitions item-rating time series dynamically based on important points; the chi-square distribution (χ2) is then used to detect abnormal intervals. Experimental results on the MovieLens 100K and 1M datasets indicate that this approach has a high detection rate and a low false alarm rate and is stable across different attack models and filler sizes. PMID:26267477

  12. Caldera unrest detected with seawater temperature anomalies at Deception Island, Antarctic Peninsula

    Science.gov (United States)

    Berrocoso, M.; Prates, G.; Fernández-Ros, A.; Peci, L. M.; de Gil, A.; Rosado, B.; Páez, R.; Jigena, B.

    2018-04-01

    Increased thermal activity was detected to coincide with the onset of volcano inflation in the seawater-filled caldera at Deception Island. This thermal activity was manifested in pulses of high water temperature that coincided with ocean tide cycles. The seawater temperature anomalies were detected by a thermometric sensor attached to the tide gauge (bottom pressure sensor). This was installed where the seawater circulation and the locations of known thermal anomalies, fumaroles and thermal springs, together favor the detection of water warmed within the caldera. Detection of the increased thermal activity was also possible because sea ice, which covers the entire caldera during the austral winter months, insulates the water and thus reduces temperature exchange between seawater and atmosphere. In these conditions, the water temperature data has been shown to provide significant information about Deception volcano activity. The detected seawater temperature increase, also observed in soil temperature readings, suggests rapid and near-simultaneous increase in geothermal activity with onset of caldera inflation and an increased number of seismic events observed in the following austral summer.

  13. Small-scale anomaly detection in panoramic imaging using neural models of low-level vision

    Science.gov (United States)

    Casey, Matthew C.; Hickman, Duncan L.; Pavlou, Athanasios; Sadler, James R. E.

    2011-06-01

    Our understanding of sensory processing in animals has reached the stage where we can exploit neurobiological principles in commercial systems. In human vision, one brain structure that offers insight into how we might detect anomalies in real-time imaging is the superior colliculus (SC). The SC is a small structure that rapidly orients our eyes to a movement, sound or touch that it detects, even when the stimulus is on a small scale; think of a camouflaged movement or the rustle of leaves. This automatic orientation allows us to prioritize the use of our eyes to raise awareness of a potential threat, such as a predator approaching stealthily. In this paper we describe the application of a neural network model of the SC to the detection of anomalies in panoramic imaging. The neural approach consists of a mosaic of topographic maps, each trained using competitive Hebbian learning to rapidly detect image features of a pre-defined shape and scale. What makes this approach interesting is the ability of the competition between neurons to filter noise automatically while still generalizing over the desired shape and scale. We present the results of this technique applied to the real-time detection of obscured targets in visible-band panoramic CCTV images. Using background subtraction to highlight potential movement, the technique correctly identifies targets as little as 3 pixels wide while filtering out small-scale noise.

  14. PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION

    Data.gov (United States)

    National Aeronautics and Space Administration — PROBABILITY CALIBRATION BY THE MINIMUM AND MAXIMUM PROBABILITY SCORES IN ONE-CLASS BAYES LEARNING FOR ANOMALY DETECTION GUICHONG LI, NATHALIE JAPKOWICZ, IAN HOFFMAN,...

  15. Improved detection of incipient anomalies via multivariate memory monitoring charts: Application to an air flow heating system

    KAUST Repository

    Harrou, Fouzi; Madakyaru, Muddu; Sun, Ying; Khadraoui, Sofiane

    2016-01-01

    Detecting anomalies is important for reliable operation of several engineering systems. Multivariate statistical monitoring charts are an efficient tool for checking the quality of a process by identifying abnormalities. Principal component analysis

  16. Robust adaptive subspace detection in impulsive noise

    KAUST Repository

    Ben Atitallah, Ismail

    2016-09-13

    This paper addresses the design of the Adaptive Subspace Matched Filter (ASMF) detector in the presence of compound Gaussian clutters and a mismatch in the steering vector. In particular, we consider the case wherein the ASMF uses the regularized Tyler estimator (RTE) to estimate the clutter covariance matrix. Under this setting, a major question that needs to be addressed concerns the setting of the threshold and the regularization parameter. To answer this question, we consider the regime in which the number of observations used to estimate the RTE and their dimensions grow large together. Recent results from random matrix theory are then used in order to approximate the false alarm and detection probabilities by deterministic quantities. The latter are optimized in order to maximize an upper bound on the asymptotic detection probability while keeping the asymptotic false alarm probability at a fixed rate. © 2016 IEEE.

  17. Robust adaptive subspace detection in impulsive noise

    KAUST Repository

    Ben Atitallah, Ismail; Kammoun, Abla; Alouini, Mohamed-Slim; Al-Naffouri, Tareq Y.

    2016-01-01

    This paper addresses the design of the Adaptive Subspace Matched Filter (ASMF) detector in the presence of compound Gaussian clutters and a mismatch in the steering vector. In particular, we consider the case wherein the ASMF uses the regularized Tyler estimator (RTE) to estimate the clutter covariance matrix. Under this setting, a major question that needs to be addressed concerns the setting of the threshold and the regularization parameter. To answer this question, we consider the regime in which the number of observations used to estimate the RTE and their dimensions grow large together. Recent results from random matrix theory are then used in order to approximate the false alarm and detection probabilities by deterministic quantities. The latter are optimized in order to maximize an upper bound on the asymptotic detection probability while keeping the asymptotic false alarm probability at a fixed rate. © 2016 IEEE.

  18. A robust anomaly based change detection method for time-series remote sensing images

    Science.gov (United States)

    Shoujing, Yin; Qiao, Wang; Chuanqing, Wu; Xiaoling, Chen; Wandong, Ma; Huiqin, Mao

    2014-03-01

    Time-series remote sensing images record changes happening on the earth's surface, including not only abnormal changes like human activities and emergencies (e.g. fire, drought, insect pests), but also changes caused by vegetation phenology and climate. This creates challenges for analysing global environmental changes and their driving forces. This paper proposes a robust Anomaly Based Change Detection method (ABCD) for time-series image analysis that detects abnormal points in data sets which need not follow a normal distribution. With ABCD we can detect when and where changes occur, a prerequisite for global change studies. ABCD was first tested on 10-day SPOT VGT NDVI (Normalized Difference Vegetation Index) time series tracking land cover type changes, seasonality and noise, and then validated on real data over a large area in Jiangxi, southern China. Initial results show that ABCD can rapidly and precisely detect spatial and temporal changes in long time-series images.
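
    A distribution-free detector in this spirit can be sketched with a seasonal median/MAD rule: each observation is compared with the median of its seasonal slot, scaled by the median absolute deviation, so no normality assumption is needed. This is an illustrative stand-in, not the authors' exact algorithm; the 10-day compositing period and the threshold k are assumptions.

```python
import numpy as np

def mad_anomalies(series, period=36, k=5.0):
    """Flag observations deviating from their seasonal slot's median by
    more than k median absolute deviations (MAD). Distribution-free:
    no normal distribution is assumed."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(len(series), dtype=bool)
    for slot in range(period):
        vals = series[slot::period]
        med = np.median(vals)
        mad = np.median(np.abs(vals - med)) or 1e-9
        flags[slot::period] = np.abs(vals - med) > k * mad
    return flags

rng = np.random.default_rng(9)
t = np.arange(36 * 10)                                # ten years of 10-day NDVI
ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * t / 36) + rng.normal(0, 0.02, len(t))
ndvi[200] = -0.1                                      # abrupt drop, e.g. a burn scar
flags = mad_anomalies(ndvi)
```

    Because the comparison is per seasonal slot, the regular phenological cycle is not flagged, while the single abrupt drop is, which answers the "when and where" question the abstract poses.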

  19. Anomaly Detection in Smart Metering Infrastructure with the Use of Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tomasz Andrysiak

    2017-01-01

    Full Text Available The article presents solutions for anomaly detection in network traffic for critical smart metering infrastructure realized over a radio sensor network. The structure of the examined smart meter network and the key security aspects influencing the correct performance of an advanced metering infrastructure (the possibility of passive and active cyberattacks) are described. An effective and fast anomaly detection method is proposed. In its initial stage, Cook's distance is used to detect and eliminate outlier observations. The prepared data are then used to estimate standard statistical models based on exponential smoothing, that is, Brown's, Holt's, and Winters' models. To estimate possible fluctuations in the forecasts of the implemented models, suitably parameterized Bollinger Bands are used. Next, statistical relations between the estimated traffic model and its real variability are examined to detect abnormal behaviour which could indicate a cyberattack attempt. An update procedure for the standard models in case of significant real network traffic fluctuations is also proposed. Optimal parameter values for the statistical models are chosen by minimizing the forecast error. The results confirm the efficiency of the presented method and the accuracy of the choice of statistical model for the analyzed time series.
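
    The core of such a scheme, Holt's exponential smoothing with Bollinger-style bands on the one-step forecast errors, can be sketched as follows. This is a minimal illustration that omits the Cook's-distance pre-cleaning and the model selection step; all parameters and the toy traffic series are assumptions.

```python
import numpy as np

def holt_bollinger(series, alpha=0.5, beta=0.3, window=20, k=2.5):
    """Holt's linear exponential smoothing with Bollinger-style bands on
    the one-step-ahead forecast error; observations falling outside the
    band are flagged as anomalous."""
    level, trend = series[0], 0.0
    errors, flags = [], [False]
    for t in range(1, len(series)):
        forecast = level + trend
        err = series[t] - forecast
        if len(errors) >= window:
            band = k * np.std(errors[-window:])   # Bollinger-style width
            flags.append(abs(err) > band)
        else:
            flags.append(False)                   # warm-up period
        errors.append(err)
        prev_level = level
        level = alpha * series[t] + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return np.array(flags)

rng = np.random.default_rng(4)
traffic = 100 + 0.1 * np.arange(300) + rng.normal(0, 1, 300)  # smart-meter traffic
traffic[250] += 15                        # spike, e.g. a cyberattack attempt
flags = holt_bollinger(traffic)
```

    The slow upward trend is absorbed by the model, so only the spike violates the band; in the article the band parameters are tuned by forecast-error minimization rather than fixed.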

  20. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    Science.gov (United States)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-07-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy.
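
    The PCA-plus-Local-Outlier-Factor combination can be sketched directly with scikit-learn on toy stand-in data; the loss-map data, dimensions and parameters below are assumptions, not the LHC setup.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(5)
# rows = loss maps, columns = per-collimator loss signals (toy stand-in)
nominal = rng.normal(0, 1, size=(200, 50))
anomalous = nominal[:5] * 3.0        # e.g. inflated losses after a hierarchy change

# compress the high-dimensional loss maps, then score density-based outliers
pca = PCA(n_components=5).fit(nominal)
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(pca.transform(nominal))

labels = lof.predict(pca.transform(anomalous))   # -1 marks an anomalous map
```

    PCA removes the correlated bulk of the loss pattern so that LOF can judge each new map against the local density of validated nominal maps, which is what makes small hierarchy changes detectable.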

  1. Anomaly Detection for Beam Loss Maps in the Large Hadron Collider

    International Nuclear Information System (INIS)

    Valentino, Gianluca; Bruce, Roderik; Redaelli, Stefano; Rossi, Roberto; Theodoropoulos, Panagiotis; Jaster-Merz, Sonja

    2017-01-01

    In the LHC, beam loss maps are used to validate collimator settings for cleaning and machine protection. This is done by monitoring the loss distribution in the ring during infrequent controlled loss map campaigns, as well as in standard operation. Due to the complexity of the system, consisting of more than 50 collimators per beam, it is difficult to identify small changes in the collimation hierarchy, which may be due to setting errors or beam orbit drifts with such methods. A technique based on Principal Component Analysis and Local Outlier Factor is presented to detect anomalies in the loss maps and therefore provide an automatic check of the collimation hierarchy. (paper)

  2. Validity and efficiency of conformal anomaly detection on big distributed data

    Directory of Open Access Journals (Sweden)

    Ilia Nouretdinov

    2017-05-01

    Full Text Available Conformal Prediction is a recently developed framework for reliable confident predictions. In this work we discuss its possible application to big data coming from different, possibly heterogeneous data sources. Using the anomaly detection problem as an example, we study how the validity of Conformal Prediction can be preserved in this setting. We show that the straightforward averaging approach is invalid, while its easy alternative of maximizing is valid but not very efficient because of its conservativeness. We propose a third, compromise approach that is valid but much less conservative. It is supported by both theoretical justification and experimental results in the area of energy engineering.
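
    The merging problem can be illustrated on a single test object. The sketch below contrasts the invalid average, the conservative maximum, and one known valid compromise, min(1, 2*mean), which is valid under arbitrary dependence (Vovk and Wang's merging result); the article's own compromise rule may differ in detail.

```python
import numpy as np

def merge_p_values(p, rule="twice_mean"):
    """Merge per-source conformal p-values for one test object.
    - 'mean'      : simple average -- NOT generally valid;
    - 'max'       : always valid but very conservative;
    - 'twice_mean': min(1, 2*mean), a valid compromise under arbitrary
      dependence between the sources."""
    p = np.asarray(p, dtype=float)
    if rule == "mean":
        return p.mean()
    if rule == "max":
        return p.max()
    return min(1.0, 2.0 * p.mean())

# three data sources each report a conformal p-value for a suspect observation
p_sources = [0.02, 0.04, 0.03]
merged = merge_p_values(p_sources)   # 0.06: still small, so flag as anomaly
```

    The compromise pays a factor of two relative to the (invalid) average but remains far less conservative than the maximum when the sources agree.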

  3. Fiber Optic Bragg Grating Sensors for Thermographic Detection of Subsurface Anomalies

    Science.gov (United States)

    Allison, Sidney G.; Winfree, William P.; Wu, Meng-Chou

    2009-01-01

    Conventional thermography with an infrared imager has been shown to be an extremely viable technique for nondestructively detecting subsurface anomalies such as thickness variations due to corrosion. A recently developed technique using fiber optic sensors to measure temperature holds potential for performing similar inspections without requiring an infrared imager. The structure is heated using a heat source such as a quartz lamp with fiber Bragg grating (FBG) sensors at the surface of the structure to detect temperature. Investigated structures include a stainless steel plate with thickness variations simulated by small platelets attached to the back side using thermal grease. A relationship is shown between the FBG sensor thermal response and variations in material thickness. For comparison, finite element modeling was performed and found to agree closely with the fiber optic thermography results. This technique shows potential for applications where FBG sensors are already bonded to structures for Integrated Vehicle Health Monitoring (IVHM) strain measurements and can serve dual-use by also performing thermographic detection of subsurface anomalies.

  4. Multiple Kernel Learning for Heterogeneous Anomaly Detection: Algorithm and Aviation Safety Case Study

    Science.gov (United States)

    Das, Santanu; Srivastava, Ashok N.; Matthews, Bryan L.; Oza, Nikunj C.

    2010-01-01

    The world-wide aviation system is one of the most complex dynamical systems ever developed and is generating data at an extremely rapid rate. Most modern commercial aircraft record several hundred flight parameters including information from the guidance, navigation, and control systems, the avionics and propulsion systems, and the pilot inputs into the aircraft. These parameters may be continuous measurements or binary or categorical measurements recorded in one-second intervals for the duration of the flight. Currently, most approaches to aviation safety are reactive, meaning that they are designed to react to an aviation safety incident or accident. In this paper, we discuss a novel approach based on the theory of multiple kernel learning to detect potential safety anomalies in very large databases of discrete and continuous data from world-wide operations of commercial fleets. We pose a general anomaly detection problem which includes both discrete and continuous data streams, where we assume that the discrete streams have a causal influence on the continuous streams. We also assume that atypical sequences of events in the discrete streams can lead to off-nominal system performance. We discuss the application domain, novel algorithms, and results on real-world data sets. Our algorithm uncovers operationally significant events in high-dimensional data streams in the aviation industry which are not detectable using state-of-the-art methods.
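
    A fixed-weight version of the idea, combining a continuous-data kernel with a discrete-data kernel inside a one-class SVM, can be sketched as follows. This is an illustration only: true multiple kernel learning optimizes the kernel weights from data, and all data, kernels and parameters here are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(6)
n = 150
cont = rng.normal(0, 1, size=(n, 3))       # continuous sensor streams (toy)
disc = rng.integers(0, 4, size=(n, 2))     # discrete switch/event symbols (toy)

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None] - B[None]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def match_kernel(A, B):
    # fraction of matching symbols: a simple kernel for discrete streams
    return (A[:, None] == B[None]).mean(axis=2)

def combined(Xc, Xd, Yc, Yd, w=0.6):
    # fixed-weight combination; true MKL learns w from the data
    return w * rbf_kernel(Xc, Yc) + (1 - w) * match_kernel(Xd, Yd)

K_train = combined(cont, disc, cont, disc)
ocsvm = OneClassSVM(kernel="precomputed", nu=0.05).fit(K_train)

q_cont = np.array([[6.0, -6.0, 6.0]])      # off-nominal continuous reading
q_disc = np.array([[0, 1]])
label = ocsvm.predict(combined(q_cont, q_disc, cont, disc))  # -1 = anomaly
```

    Because the combined kernel sees both modalities at once, an observation can be flagged when its continuous and discrete evidence are only jointly atypical, which is the heterogeneity the paper targets.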

  5. Temporal Data-Driven Sleep Scheduling and Spatial Data-Driven Anomaly Detection for Clustered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Gang Li

    2016-09-01

    Full Text Available The spatial-temporal correlation is an important feature of sensor data in wireless sensor networks (WSNs). Most existing work based on the spatial-temporal correlation falls into two parts, redundancy reduction and anomaly detection, which have been pursued separately. In this work, the combination of temporal data-driven sleep scheduling (TDSS) and spatial data-driven anomaly detection is proposed, where TDSS reduces data redundancy. The TDSS model is inspired by transmission control protocol (TCP) congestion control. Based on the long, linear cluster structure of the tunnel monitoring system, cooperative TDSS and spatial data-driven anomaly detection are then proposed. To realize synchronous acquisition in the same ring and analyze the situation of every ring, TDSS is implemented cooperatively within the cluster. To maintain the precision of sensor data, spatial data-driven anomaly detection based on spatial correlation and the Kriging method is used to generate an anomaly indicator. The experimental results show that cooperative TDSS can realize non-uniform sensing effectively and reduce energy consumption. In addition, spatial data-driven anomaly detection is significant for maintaining and improving the precision of sensor data.

  6. Adaptive algorithm of magnetic heading detection

    Science.gov (United States)

    Liu, Gong-Xu; Shi, Ling-Feng

    2017-11-01

    Magnetic data obtained from a magnetic sensor usually fluctuate within a certain range, which makes it difficult to estimate the magnetic heading accurately. In fact, magnetic heading information is usually submerged in noise because of various kinds of electromagnetic interference and the diversity of the pedestrian's motion states. To solve this problem, a new adaptive algorithm, based on the (typically) right-angled corridors of buildings such as residential buildings, is put forward to process heading information. First, a 3D indoor localization platform is set up based on the MPU9250. Then, several groups of data are measured while varying the experimental environment and the pedestrian's pace. The raw data from the attached inertial measurement unit are calibrated, arranged into a time-stamped array, and written to a data file. The data file is then imported into MATLAB for processing and analysis with the proposed adaptive algorithm. Finally, the algorithm is verified by comparison with an existing algorithm. The experimental results show that the algorithm has strong robustness and good fault tolerance and can detect heading information accurately and in real time.

  7. Adaptive and accelerated tracking-learning-detection

    Science.gov (United States)

    Guo, Pengyu; Li, Xin; Ding, Shaowen; Tian, Zunhua; Zhang, Xiaohu

    2013-08-01

    An improved online long-term visual tracking algorithm, named adaptive and accelerated TLD (AA-TLD), based on the novel Tracking-Learning-Detection (TLD) framework, is introduced in this paper. The improvement focuses on two aspects. The first is adaptation, which frees the algorithm from pre-defined scanning grids by generating the scale space online. The second is efficiency, which combines algorithm-level acceleration with hardware acceleration: scale prediction employs an auto-regression and moving average (ARMA) model to learn the object motion and lessen the detector's search range, a fixed number of positive and negative samples ensures a constant retrieval time, and CPU and GPU parallelism provides the hardware speed-up. In addition, to obtain better results, some details of TLD are redesigned: results are integrated with a weight that includes both the normalized correlation coefficient and the scale size, and distance metric thresholds are adjusted online. A comparative experiment on success rate, center location error and execution time, using partial TLD datasets and Shenzhou IX return capsule image sequences, shows a performance and efficiency upgrade over the state-of-the-art TLD. The algorithm can be used in video surveillance to meet the needs of real-time video tracking.

  8. Statistical Algorithm for the Adaptation of Detection Thresholds

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    Many event detection mechanisms in spark ignition automotive engines are based on comparing engine signals to detection threshold values. Different signal qualities for new and aged engines necessitate the development of an adaptation algorithm for the detection thresholds ... remains constant regardless of engine age and changing detection threshold values. This, in turn, guarantees the same event detection performance for new and aged engines/sensors. Adaptation of the engine knock detection threshold is given as an example. Publication date: 2008
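
    The general idea, adapting the threshold so the false alarm rate stays constant as signal quality changes with engine age, can be sketched with a sliding-window quantile. This is our own illustrative sketch, not the paper's statistical algorithm, and the toy knock-sensor data are assumptions.

```python
import numpy as np

def adaptive_threshold(signal, window=200, far=0.01):
    """Detection threshold tracked online as the (1 - far) quantile of a
    sliding window of recent background samples, so the false alarm rate
    stays near `far` even as the signal statistics drift with ageing."""
    thresholds = np.empty(len(signal))
    for t in range(len(signal)):
        ref = signal[max(0, t - window):t] if t > 0 else signal[:1]
        thresholds[t] = np.quantile(ref, 1 - far)
    return thresholds

rng = np.random.default_rng(7)
new_engine = rng.normal(0.0, 1.0, 1000)     # knock-sensor background, new engine
aged_engine = rng.normal(0.0, 2.5, 1000)    # noisier background, aged engine
th_new = adaptive_threshold(new_engine)
th_aged = adaptive_threshold(aged_engine)
far_new = (new_engine[200:] > th_new[200:]).mean()
far_aged = (aged_engine[200:] > th_aged[200:]).mean()
```

    Although the aged engine's background is 2.5 times noisier, both configurations operate at roughly the same empirical false alarm rate because the threshold scales with the observed signal.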

  9. Improving Accuracy of Dempster-Shafer Theory Based Anomaly Detection Systems

    Directory of Open Access Journals (Sweden)

    Ling Zou

    2014-07-01

    Full Text Available While the Dempster-Shafer theory of evidence has been widely used in anomaly detection, it has some issues. The theory trusts all pieces of evidence equally, which does not hold in a distributed-sensor anomaly detection system (ADS). Moreover, pieces of evidence are sometimes dependent on each other, which can lead to false alerts. We propose improvements that incorporate two algorithms. A feature selection algorithm employs Gaussian graphical models to discover correlations between candidate features; a group of suitable ADSs is selected for detection, and the detection results are sent to the fusion engine. A weight estimation algorithm applies information gain to set a weight for every feature. A weighted Dempster-Shafer theory of evidence then combines the detection results to achieve better accuracy. We evaluate our detection prototype through a set of experiments conducted with the standard benchmark Wisconsin Breast Cancer Dataset and real Internet traffic. Evaluations on the Wisconsin Breast Cancer Dataset show that our prototype can find correlations among nine features and improve the detection rate without affecting the false positive rate. Evaluations on Internet traffic show that the weight estimation algorithm can improve detection performance significantly.
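
    Weighted combination in Dempster-Shafer theory is commonly realized by discounting each source by its reliability weight before applying Dempster's rule. The two-hypothesis sketch below illustrates this; the mass values and weights are assumptions, and the paper's actual weights come from information gain.

```python
def discount(m, w):
    """Shafer discounting: scale each source's committed belief by its
    reliability weight w, moving the remainder to ignorance, i.e. the
    full frame {normal, anomaly} ('NA')."""
    return {"N": w * m["N"], "A": w * m["A"],
            "NA": 1.0 - w * (m["N"] + m["A"])}

def combine(m1, m2):
    """Dempster's rule of combination on the two-hypothesis frame."""
    conflict = m1["N"] * m2["A"] + m1["A"] * m2["N"]
    k = 1.0 - conflict                     # normalization after conflict
    n = (m1["N"] * m2["N"] + m1["N"] * m2["NA"] + m1["NA"] * m2["N"]) / k
    a = (m1["A"] * m2["A"] + m1["A"] * m2["NA"] + m1["NA"] * m2["A"]) / k
    return {"N": n, "A": a, "NA": 1.0 - n - a}

# two detectors report mass on Normal / Anomaly / don't-know
d1 = {"N": 0.1, "A": 0.8, "NA": 0.1}     # reliable detector, weight 0.9
d2 = {"N": 0.6, "A": 0.2, "NA": 0.2}     # weaker detector, weight 0.4
fused = combine(discount(d1, 0.9), discount(d2, 0.4))
```

    Discounting lets the reliable detector dominate: despite the weaker detector voting Normal, the fused mass on Anomaly stays high, which is the behaviour equal-trust Dempster-Shafer cannot express.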

  10. Model-based temperature noise monitoring methods for LMFBR core anomaly detection

    International Nuclear Information System (INIS)

    Tamaoki, Tetsuo; Sonoda, Yukio; Sato, Masuo; Takahashi, Ryoichi.

    1994-01-01

    Temperature noise, measured by thermocouples mounted at each core fuel subassembly, is considered the most useful signal for detecting and locating local cooling anomalies in an LMFBR core. However, the core outlet temperature noise contains background noise due to fluctuations in operating parameters, including reactor power. It is therefore necessary to reduce this background noise for highly sensitive anomaly detection by subtracting predictable components from the measured signal. In the present study, both a physical model and an autoregressive model were applied to noise data measured in the experimental fast reactor JOYO. The results indicate that the autoregressive model predicts background noise with higher precision than the physical model. Based on these results, an 'autoregressive model modification method' is proposed, in which a temporary autoregressive model is generated by interpolation or extrapolation of reference models identified under a small number of different operating conditions. The generated autoregressive model has shown sufficient precision over a wide range of reactor power in applications to artificial noise data produced by an LMFBR noise simulator, even when the coolant flow rate was changed to keep a constant power-to-flow ratio. (author)
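
    Background prediction with an autoregressive model can be sketched as a least-squares AR fit whose one-step residuals are then monitored for anomalies. This is an illustrative sketch using simulated AR(1) "temperature noise", not the JOYO analysis; the order and data are assumptions.

```python
import numpy as np

def fit_ar(x, order=4):
    """Least-squares AR(p) fit: x[t] ~ sum_i a[i] * x[t-1-i]."""
    X = np.column_stack([x[order - 1 - i:len(x) - 1 - i] for i in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

rng = np.random.default_rng(8)
e = rng.normal(0, 1, 2000)
noise = np.empty(2000)
noise[0] = e[0]
for t in range(1, 2000):                 # correlated background temperature noise
    noise[t] = 0.8 * noise[t - 1] + e[t]

coef = fit_ar(noise)
order = len(coef)
# one-step background prediction; a cooling anomaly would appear as a
# residual far outside the prediction-error band
pred = np.array([coef @ noise[t - 1::-1][:order] for t in range(order, 2000)])
resid = noise[order:] - pred
```

    Subtracting the AR prediction shrinks the monitored signal's variance well below that of the raw noise, which is exactly the background reduction that makes small local anomalies detectable.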

  11. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    Science.gov (United States)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real time and presenting it in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gives insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. The study uses data products that would commonly be analyzed at a site and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.

  12. Anomaly detection using temporal data mining in a smart home environment.

    Science.gov (United States)

    Jakkula, V; Cook, D J

    2008-01-01

    To many people, home is a sanctuary. With the maturing of smart home technologies, many people with cognitive and physical disabilities can lead independent lives in their own homes for extended periods of time. In this paper, we investigate the design of machine learning algorithms that support this goal. We hypothesize that machine learning algorithms can be designed to automatically learn models of resident behavior in a smart home, and that the results can be used to perform automated health monitoring and to detect anomalies. Specifically, our algorithms draw upon the temporal nature of sensor data collected in a smart home to build a model of expected activities and to detect unexpected, and possibly health-critical, events in the home. We validate our algorithms using synthetic data and real activity data collected from volunteers in an automated smart environment. The results from our experiments support our hypothesis that a model can be learned from observed smart home data and used to report anomalies, as they occur, in a smart home.
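
    As a rough illustration of the idea (a deliberately simplified stand-in, not the authors' temporal-mining algorithm), a model of expected activity can be as basic as counts of (hour, activity) pairs, with events rarely seen at a given hour flagged as anomalous:

    ```python
    from collections import Counter

    def build_model(events):
        """Learn expected behavior as counts of (hour, activity) pairs from a training log."""
        return Counter(events)

    def is_anomalous(model, hour, activity, min_count=2):
        """Flag an event as unexpected if it was (almost) never observed at that hour."""
        return model[(hour, activity)] < min_count

    # Toy training log for one resident (hypothetical schedule).
    train = [(7, "coffee")] * 20 + [(8, "coffee")] * 15 + [(23, "sleep")] * 30
    model = build_model(train)
    ```

    A real smart-home deployment would model sequences and durations as well as time of day, but the report-as-they-occur detection loop has this same shape: score each incoming event against a learned model of normal behavior.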

  13. MedMon: securing medical devices through wireless monitoring and anomaly detection.

    Science.gov (United States)

    Zhang, Meng; Raghunathan, Anand; Jha, Niraj K

    2013-12-01

    Rapid advances in personal healthcare systems based on implantable and wearable medical devices promise to greatly improve the quality of diagnosis and treatment for a range of medical conditions. However, the increasing programmability and wireless connectivity of medical devices also open up opportunities for malicious attackers. Unfortunately, implantable/wearable medical devices come with extreme size and power constraints, and unique usage models, making it infeasible to simply borrow conventional security solutions such as cryptography. We propose a general framework for securing medical devices based on wireless channel monitoring and anomaly detection. Our proposal is based on a medical security monitor (MedMon) that snoops on all the radio-frequency wireless communications to/from medical devices and uses multi-layered anomaly detection to identify potentially malicious transactions. Upon detection of a malicious transaction, MedMon takes appropriate response actions, which could range from passive (notifying the user) to active (jamming the packets so that they do not reach the medical device). A key benefit of MedMon is that it is applicable to existing medical devices that are in use by patients, with no hardware or software modifications to them. Consequently, it also leads to zero power overheads on these devices. We demonstrate the feasibility of our proposal by developing a prototype implementation for an insulin delivery system using off-the-shelf components (USRP software-defined radio). We evaluate its effectiveness under several attack scenarios. Our results show that MedMon can detect virtually all naive attacks and a large fraction of more sophisticated attacks, suggesting that it is an effective approach to enhancing the security of medical devices.

  14. An Overview of Deep Learning Based Methods for Unsupervised and Semi-Supervised Anomaly Detection in Videos

    Directory of Open Access Journals (Sweden)

    B. Ravi Kiran

    2018-02-01

    Full Text Available Videos represent the primary source of information for surveillance applications. Video material is often available in large quantities but in most cases it contains little or no annotation for supervised learning. This article reviews the state-of-the-art deep learning based methods for video anomaly detection and categorizes them based on the type of model and criteria of detection. We also perform simple studies to understand the different approaches and provide the criteria of evaluation for spatio-temporal anomaly detection.

  15. A New Unified Intrusion Anomaly Detection in Identifying Unseen Web Attacks

    Directory of Open Access Journals (Sweden)

    Muhammad Hilmi Kamarudin

    2017-01-01

    Full Text Available The global usage of sophisticated web-based application systems is growing rapidly. Major uses include the storing and transporting of sensitive data over the Internet. This growth has consequently opened up a serious need for more secure network and application security protection devices. Security experts normally equip their databases with a large number of signatures to help in the detection of known web-based threats. In reality, it is almost impossible to keep the database updated with newly identified web vulnerabilities; as a result, new attacks remain invisible. This research presents a novel Intrusion Detection System (IDS) approach to detecting unknown attacks on web servers, the Unified Intrusion Anomaly Detection (UIAD) approach. The unified approach consists of three components (preprocessing, statistical analysis, and classification). Initially, the process starts with the removal of irrelevant and redundant features using a novel hybrid feature selection method. Thereafter, the process continues with the application of a statistical approach to identifying traffic abnormality. We apply the Relative Percentage Ratio (RPR) coupled with Euclidean Distance Analysis (EDA) and the Chebyshev Inequality Theorem (CIT) to calculate the normality score and generate an optimal threshold. Finally, Logitboost (LB) is employed alongside Random Forest (RF) as the weak classifier, with the aim of minimising the final false alarm rate. The experiments demonstrate that our approach successfully identified unknown attacks with greater than a 95% detection rate and less than a 1% false alarm rate on both the DARPA 1999 and ISCX 2012 datasets.
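
    The Chebyshev Inequality Theorem step can be sketched as follows: the inequality P(|X − μ| ≥ kσ) ≤ 1/k² yields a distribution-free threshold μ + kσ with k = 1/√α for tail mass α. The scores below are illustrative; the exact normality-score computation in UIAD differs:

    ```python
    import math

    def chebyshev_threshold(scores, alpha=0.01):
        """Distribution-free upper threshold mu + k*sigma: Chebyshev's inequality
        P(|X - mu| >= k*sigma) <= 1/k**2 gives k = 1/sqrt(alpha) for tail mass alpha."""
        n = len(scores)
        mu = sum(scores) / n
        sigma = math.sqrt(sum((s - mu) ** 2 for s in scores) / n)
        return mu + sigma / math.sqrt(alpha)

    # Normality scores computed from presumed-normal traffic (illustrative values).
    normal_scores = [1.0, 1.2, 0.9, 1.1, 1.0, 0.95, 1.05, 1.15]
    thr = chebyshev_threshold(normal_scores, alpha=0.01)
    ```

    A request whose normality score exceeds `thr` would be flagged as abnormal; because Chebyshev's bound holds for any distribution, no Gaussian assumption on the scores is needed.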

  16. Anomaly detection in heterogeneous media via saliency analysis of propagating wavefields

    Science.gov (United States)

    Druce, Jeffrey M.; Haupt, Jarvis D.; Gonella, Stefano

    2014-03-01

    This work investigates the problem of anomaly detection by means of an agnostic inference strategy based on the concepts of spatial saliency and data sparsity. Specifically, it addresses the implementation and experimental validation aspects of a salient feature extraction methodology that was recently proposed for laser-based diagnostics and leverages the wavefield spatial reconstruction capability offered by scanning laser vibrometers. The methodology consists of two steps. The first is a spatiotemporal windowing strategy designed to partition the structural domain into small sub-domains and replicate impinging wave conditions at each location. The second is the construction of a low-rank-plus-outlier model of the regional data set using principal component analysis. Regions are labeled salient when their behavior does not belong to a common low-dimensional subspace that successfully describes the typical behavior of the anomaly-free portion of the surrounding medium. The most attractive feature of this method is that it requires virtually no knowledge of the structural and material properties of the medium. This property makes it a powerful diagnostic tool for the inspection of media with pronounced heterogeneity or with unknown or unreliable material property distributions, e.g., as a result of severe material degradation over large portions of their domain.
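
    The low-rank-plus-outlier idea can be sketched as follows: fit a low-dimensional subspace (here rank-1, via pure-Python power iteration) to a region's vectorized wavefield data and score each observation by its distance from that subspace. This is a minimal stand-in for the full PCA model, with toy data:

    ```python
    def principal_direction(rows, iters=100):
        """Mean and leading principal direction of the rows, via power iteration."""
        d = len(rows[0])
        mean = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
        c = [[r[j] - mean[j] for j in range(d)] for r in rows]
        v = [1.0] * d
        for _ in range(iters):
            proj = [sum(ci[j] * v[j] for j in range(d)) for ci in c]              # C v
            w = [sum(proj[i] * c[i][j] for i in range(len(c))) for j in range(d)]  # C^T (C v)
            norm = sum(x * x for x in w) ** 0.5
            v = [x / norm for x in w]
        return mean, v

    def saliency(row, mean, v):
        """Distance from the rank-1 'typical behavior' subspace; large values are salient."""
        cen = [row[j] - mean[j] for j in range(len(row))]
        t = sum(cen[j] * v[j] for j in range(len(row)))
        return sum((cen[j] - t * v[j]) ** 2 for j in range(len(row))) ** 0.5

    # Regional wavefield features: inliers lie along one direction, the last row does not.
    rows = [[x, 2.0 * x] for x in (1.0, 2.0, 3.0, 4.0, 5.0)]
    mean, v = principal_direction(rows)
    scores = [saliency(r, mean, v) for r in rows + [[3.0, -6.0]]]
    ```

    No material model enters anywhere: the subspace is learned from the data of the surrounding medium itself, which is what makes the approach agnostic.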

  17. Detection of Anomalies in Citrus Leaves Using Laser-Induced Breakdown Spectroscopy (LIBS).

    Science.gov (United States)

    Sankaran, Sindhuja; Ehsani, Reza; Morgan, Kelly T

    2015-08-01

    Nutrient assessment and management are important to maintain productivity in citrus orchards. In this study, laser-induced breakdown spectroscopy (LIBS) was applied for rapid and real-time detection of citrus anomalies. Laser-induced breakdown spectroscopy spectra were collected from citrus leaves with anomalies such as diseases (Huanglongbing, citrus canker) and nutrient deficiencies (iron, manganese, magnesium, zinc), and compared with those of healthy leaves. Baseline correction, wavelet multivariate denoising, and normalization techniques were applied to the LIBS spectra before analysis. After spectral pre-processing, features were extracted using principal component analysis and classified using two models, quadratic discriminant analysis and support vector machine (SVM). The SVM resulted in a high average classification accuracy of 97.5%, with high average canker classification accuracy (96.5%). LIBS peak analysis indicated that, of the 11 peaks found in all the samples, high intensities were observed at 229.7, 247.9, 280.3, 393.5, 397.0, and 769.8 nm. Future studies using controlled experiments with variable nutrient applications are required to quantify foliar nutrients using LIBS-based sensing.

  18. Early detection and identification of anomalies in chemical regime based on computational intelligence techniques

    International Nuclear Information System (INIS)

    Figedy, Stefan; Smiesko, Ivan

    2012-01-01

    This article provides brief information about the fundamental features of a newly developed diagnostic system for early detection and identification of anomalies generated in the water chemistry regime of the primary and secondary circuits of the VVER-440 reactor. This system, called SACHER (System of Analysis of CHEmical Regime), was installed within the major modernization project at the NPP-V2 Bohunice in the Slovak Republic. The SACHER system was fully developed in the MATLAB environment. It is based on computational intelligence techniques and inserts various intelligent data-processing modules (for clustering, diagnosis, prediction, signal validation, etc.) into the overall chemical information system. SACHER essentially assists chemists in identifying the current situation regarding anomalies generated in the primary- and secondary-circuit water chemistry. The system is to be used for diagnostics and data handling; however, it is not intended to replace the presence of experienced chemists, who decide upon corrective actions. (author)

  19. Fast clustering using adaptive density peak detection.

    Science.gov (United States)

    Wang, Xiao-Feng; Xu, Yifan

    2017-12-01

    Common limitations of clustering methods include slow algorithm convergence, instability with respect to the pre-specification of a number of intrinsic parameters, and lack of robustness to outliers. A recent clustering approach proposed a fast search algorithm for cluster centers based on their local densities. However, the selection of the key intrinsic parameters in the algorithm was not systematically investigated. It is relatively difficult to estimate the "optimal" parameters since the original definition of the local density in the algorithm is based on a truncated counting measure. In this paper, we propose a clustering procedure with adaptive density peak detection, where the local density is estimated through nonparametric multivariate kernel estimation. The model parameter can then be calculated from equations with statistical theoretical justification. We also develop an automatic cluster centroid selection method that maximizes an average silhouette index. The advantages and flexibility of the proposed method are demonstrated through simulation studies and the analysis of a few benchmark gene expression data sets. The method performs in a single step without any iteration; it is therefore fast and has great potential for big data analysis. A user-friendly R package, ADPclust, has been developed for public use.
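
    The two quantities at the heart of density-peak clustering can be sketched with a Gaussian kernel density in place of the truncated counting measure. The bandwidth h is fixed by hand here, whereas the paper's contribution is precisely to choose such parameters adaptively:

    ```python
    import math

    def adp_scores(points, h):
        """Per point: Gaussian-kernel density estimate and 'delta', the distance to the
        nearest point of strictly higher density. Cluster centers score high on both."""
        n = len(points)
        dist = [[math.dist(p, q) for q in points] for p in points]
        dens = [sum(math.exp(-(dist[i][j] / h) ** 2) for j in range(n)) for i in range(n)]
        delta = []
        for i in range(n):
            higher = [dist[i][j] for j in range(n) if dens[j] > dens[i]]
            delta.append(min(higher) if higher else max(dist[i]))
        return dens, delta

    # Two well-separated blobs: one point per blob should stand out with a large delta.
    pts = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
    dens, delta = adp_scores(pts, h=0.5)
    ```

    Points with both large density and large delta are the candidate centroids; the paper automates this selection by maximizing an average silhouette index instead of reading it off a decision graph.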

  20. Improved detection of incipient anomalies via multivariate memory monitoring charts: Application to an air flow heating system

    KAUST Repository

    Harrou, Fouzi

    2016-08-11

    Detecting anomalies is important for the reliable operation of several engineering systems. Multivariate statistical monitoring charts are an efficient tool for checking the quality of a process by identifying abnormalities. Principal component analysis (PCA) has been shown to be effective in monitoring processes with highly correlated data. Traditional PCA-based methods, nevertheless, are often relatively inefficient at detecting incipient anomalies. Here, we propose a statistical approach that combines the advantages of PCA with those of multivariate memory monitoring schemes, such as the multivariate cumulative sum (MCUSUM) and multivariate exponentially weighted moving average (MEWMA) schemes, to better detect incipient anomalies. Memory monitoring charts are sensitive to incipient shifts in the process mean, which significantly improves the performance of the PCA method and broadens its applicability. The performance of the PCA-based MEWMA and MCUSUM control techniques is demonstrated and compared with that of traditional PCA-based monitoring methods. Using practical data gathered from a heating air-flow system, we demonstrate the greater sensitivity and efficiency of the developed method over traditional PCA-based methods. Results indicate that the proposed techniques have potential for detecting incipient anomalies in multivariate data. © 2016 Elsevier Ltd
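
    A minimal sketch of a memory-based chart, here a univariate EWMA applied to a single monitored score (the full method uses multivariate MEWMA/MCUSUM statistics on PCA scores), shows how the memory makes a small persistent shift detectable even when each individual sample stays inside conventional limits:

    ```python
    def ewma_chart(scores, lam=0.2, L=3.0, baseline=20):
        """EWMA recursion z_t = lam*x_t + (1-lam)*z_{t-1}; alarm when z_t exceeds
        mu + L*sigma_z, with sigma_z the asymptotic EWMA standard deviation."""
        mu = sum(scores[:baseline]) / baseline            # in-control prefix sets the limits
        var = sum((s - mu) ** 2 for s in scores[:baseline]) / baseline
        sigma_z = (var * lam / (2 - lam)) ** 0.5
        z, alarms = mu, []
        for t, x in enumerate(scores):
            z = lam * x + (1 - lam) * z
            if z > mu + L * sigma_z:
                alarms.append(t)
        return alarms

    # In-control oscillation followed by a small persistent mean shift (incipient anomaly).
    stream = [0.5 * (-1) ** i for i in range(40)] + [2.0] * 20
    alarms = ewma_chart(stream)
    ```

    The smoothing constant lam trades memory length against responsiveness: small lam accumulates evidence over many samples, which is exactly what favors detection of incipient rather than abrupt faults.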

  1. Real-time progressive hyperspectral image processing endmember finding and anomaly detection

    CERN Document Server

    Chang, Chein-I

    2016-01-01

    The book covers the most crucial parts of real-time hyperspectral image processing: causality and real-time capability. Recently, two new concepts of real-time hyperspectral image processing have been introduced: Progressive Hyperspectral Imaging (PHSI) and Recursive Hyperspectral Imaging (RHSI). Both can be used to design algorithms and form an integral part of real-time hyperspectral image processing. This book focuses on the progressive nature of these algorithms and their real-time, causal processing implementation in two major applications, endmember finding and anomaly detection, both of which are fundamental tasks in hyperspectral imaging but are generally not encountered in multispectral imaging. This book is written particularly to address PHSI in real-time processing, while the book Recursive Hyperspectral Sample and Band Processing: Algorithm Architecture and Implementation (Springer 2016) can be considered its companion. Includes preliminary background which is essential to those who work in hyperspectral ima...

  2. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    Science.gov (United States)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved. Subsequently, the validity of the bootstrap based prediction intervals is illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
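
    The residual-bootstrap construction can be sketched as follows. A simple linear model stands in for the nonparametric regressor of the paper; an observed output outside the returned interval would be flagged as anomalous:

    ```python
    import random

    def bootstrap_interval(x_tr, y_tr, x0, B=2000, alpha=0.05, seed=0):
        """Percentile bootstrap prediction interval at x0: fit a simple linear model,
        then resample its residuals with no assumption on the noise distribution."""
        n = len(x_tr)
        xm, ym = sum(x_tr) / n, sum(y_tr) / n
        b = sum((x - xm) * (y - ym) for x, y in zip(x_tr, y_tr)) / \
            sum((x - xm) ** 2 for x in x_tr)
        a = ym - b * xm
        resid = [y - (a + b * x) for x, y in zip(x_tr, y_tr)]
        rng = random.Random(seed)
        sims = sorted(a + b * x0 + rng.choice(resid) for _ in range(B))
        return sims[int(alpha / 2 * B)], sims[int((1 - alpha / 2) * B) - 1]

    # Noisy line y = 2x + u with non-Gaussian (uniform) noise.
    data_rng = random.Random(42)
    xs = [i / 10 for i in range(100)]
    ys = [2 * x + data_rng.uniform(-0.5, 0.5) for x in xs]
    lo, hi = bootstrap_interval(xs, ys, x0=5.0)
    ```

    Because the interval is built from empirical residual quantiles rather than a Gaussian noise model, it adapts to skewed or heavy-tailed noise, which is the property the paper exploits for anomaly detection on aviation data.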

  3. Stochastic anomaly detection in eye-tracking data for quantification of motor symptoms in Parkinson's disease

    Science.gov (United States)

    Jansson, Daniel; Medvedev, Alexander; Axelson, Hans; Nyholm, Dag

    2013-10-01

    Two methods for distinguishing between healthy controls and patients diagnosed with Parkinson's disease by means of recorded smooth pursuit eye movements are presented and evaluated. Both methods are based on the principles of stochastic anomaly detection and make use of orthogonal series approximation for probability distribution estimation. The first method relies on the identification of a Wiener-type model of the smooth pursuit system and attempts to find statistically significant differences between the estimated parameters in healthy controls and patients with Parkinson's disease. The second method applies the same statistical method to distinguish between the gaze trajectories of healthy and Parkinson subjects attempting to track visual stimuli. Both methods show promising results, where healthy controls and patients with Parkinson's disease are effectively separated in terms of the considered metric. The results are preliminary because of the small number of participating test subjects, but they are indicative of the potential of the presented methods as diagnosing or staging tools for Parkinson's disease.

  4. Clustering and Support Vector Regression for Water Demand Forecasting and Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Antonio Candelieri

    2017-03-01

    Full Text Available This paper presents a completely data-driven and machine-learning-based approach, in two stages, to first characterize and then forecast hourly water demand in the short term with applications of two different data sources: urban water demand (SCADA data and individual customer water consumption (AMR data. In the first case, reliable forecasting can be used to optimize operations, particularly the pumping schedule, in order to reduce energy-related costs, while in the second case, the comparison between forecast and actual values may support the online detection of anomalies, such as smart meter faults, fraud or possible cyber-physical attacks. Results are presented for a real case: the water distribution network in Milan.

  5. Detecting and modeling persistent self-potential anomalies from underground nuclear explosions at the Nevada Test Site

    International Nuclear Information System (INIS)

    McKague, H.L.; Kansa, E.; Kasameyer, P.W.

    1992-01-01

    Self-potential anomalies are naturally occurring, nearly stationary electric fields that are detected by measuring the potential difference between two points on (or in) the ground. SP anomalies arise from a number of causes: principally electrochemical reactions, and heat and fluid flows. SP is routinely used to locate mineral deposits, geothermal systems, and zones of seepage. This paper is a progress report on our work toward detecting explosion-related SP signals at the Nevada Test Site (NTS) and in understanding the physics of these anomalies that persist and continue changing over periods of time that range from months to years. As background, we also include a brief description of how SP signals arise, and we mention their use in other areas such as exploring for geothermal resources and locating seepage through dams. Between the years 1988 and 1991, we surveyed the areas around seven underground nuclear tests for persistent SP anomalies. We not only detected anomalies, but we also found that various phenomena could be contributing to them and that we did not know which of these were actually occurring. We analyzed our new data with existing steady state codes and with a newly developed time-dependent thermal modeling code. Our results with the new code showed that the conductive decay of the thermal pulse from an underground nuclear test could produce many of the observed signals, and that others are probably caused by movement of fluid induced by the explosion. 25 refs

  6. The Use of Hidden Markov Models for Anomaly Detection in Nuclear Core Condition Monitoring

    Science.gov (United States)

    Stephen, Bruce; West, Graeme M.; Galloway, Stuart; McArthur, Stephen D. J.; McDonald, James R.; Towle, Dave

    2009-04-01

    Unplanned outages can be especially costly for generation companies operating nuclear facilities. Early detection of deviations from expected performance through condition monitoring can allow a more proactive and managed approach to dealing with ageing plant. This paper proposes an anomaly detection framework incorporating the use of the Hidden Markov Model (HMM) to support the analysis of nuclear reactor core condition monitoring data. Fuel Grab Load Trace (FGLT) data gathered within the UK during routine refuelling operations has been seen to provide information relating to the condition of the graphite bricks that comprise the core. Although manual analysis of this data is time consuming and requires considerable expertise, this paper demonstrates how techniques such as the HMM can provide analysis support by providing a benchmark model of expected behaviour against which future refuelling events may be compared. The presence of anomalous behaviour in candidate traces is inferred through the underlying statistical foundation of the HMM, which gives an observation likelihood averaged along the length of the input sequence. Using this likelihood measure, the engineer can be alerted to anomalous behaviour, indicating data which might require further detailed examination. It is proposed that this data analysis technique be used in conjunction with other intelligent analysis techniques currently employed to analyse FGLT to provide a greater confidence measure in detecting anomalous behaviour from FGLT data.
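
    The benchmark-model idea can be sketched with the scaled forward algorithm: the HMM yields an observation log-likelihood, averaged along the sequence, and unusually low values flag candidate anomalous traces. The two-state model below is a toy, not the FGLT model itself:

    ```python
    import math

    def forward_loglik(obs, pi, A, B):
        """Average per-symbol log-likelihood of a discrete sequence under an HMM,
        computed with the scaled forward algorithm to avoid numerical underflow."""
        n = len(pi)
        alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
        log_l = 0.0
        for t in range(len(obs)):
            if t > 0:
                alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                         for i in range(n)]
            s = sum(alpha)          # scaling factor; log-likelihood is the sum of their logs
            log_l += math.log(s)
            alpha = [a / s for a in alpha]
        return log_l / len(obs)

    # Toy two-state benchmark model of 'normal' traces (sticky states, matched emissions).
    pi = [0.5, 0.5]
    A = [[0.9, 0.1], [0.1, 0.9]]
    B = [[0.8, 0.2], [0.2, 0.8]]
    score_normal = forward_loglik([0, 0, 0, 1, 1, 1, 0, 0], pi, A, B)
    score_weird = forward_loglik([0, 1, 0, 1, 0, 1, 0, 1], pi, A, B)
    ```

    Averaging by sequence length, as the paper describes, makes traces of different durations comparable against a single alerting threshold.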

  7. RFID-Based Human Behavior Modeling and Anomaly Detection for Elderly Care

    Directory of Open Access Journals (Sweden)

    Hui-Huang Hsu

    2010-01-01

    Full Text Available This research aimed at building an intelligent system that can detect abnormal behavior for the elderly at home. Active RFID tags can be deployed at home to help collect daily movement data of an elderly person who carries an RFID reader. When the reader detects the signals from the tags, RSSI values that represent signal strength are obtained. The RSSI values are inversely related to the distance between the tags and the reader, and they are recorded following the movement of the user. The movement patterns, not the exact locations, of the user are the major concern. With the movement data (RSSI values), the clustering technique is then used to build a personalized model of normal behavior. After the model is built, any incoming datum outside the model can be viewed as abnormal and an alarm can be raised by the system. In this paper, we present the system architecture for RFID data collection and preprocessing, clustering for anomaly detection, and experimental results. The results show that this novel approach is promising.
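
    The clustering step can be sketched with plain k-means on RSSI feature vectors: centroids learned from normal movement define the model, and a sample far from every centroid raises an alarm. The 2-D vectors below are toy values; the paper's feature construction is richer:

    ```python
    import math

    def kmeans(points, k=2, iters=20):
        """Plain k-means; returns centroids of the learned 'normal' movement clusters."""
        cents = [tuple(p) for p in points[:k]]
        for _ in range(iters):
            groups = [[] for _ in range(k)]
            for p in points:
                nearest = min(range(k), key=lambda i: math.dist(p, cents[i]))
                groups[nearest].append(p)
            cents = [tuple(sum(x) / len(g) for x in zip(*g)) if g else cents[i]
                     for i, g in enumerate(groups)]
        return cents

    def is_abnormal(sample, cents, radius):
        """A movement sample far from every learned cluster is treated as abnormal."""
        return min(math.dist(sample, c) for c in cents) > radius

    # Toy RSSI feature vectors for two recurring movement patterns.
    normal = [(10 + dx, 50 + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)] + \
             [(40 + dx, 20 + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    cents = kmeans(normal, k=2)
    ```

    The `radius` parameter is the sensitivity knob: a tight radius catches subtle deviations at the cost of more false alarms, which in elderly care is usually the preferred trade-off.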

  8. Adaptive Detection and ISI Mitigation for Mobile Molecular Communication.

    Science.gov (United States)

    Chang, Ge; Lin, Lin; Yan, Hao

    2018-03-01

    Current studies on modulation and detection schemes in molecular communication mainly focus on the scenarios with static transmitters and receivers. However, mobile molecular communication is needed in many envisioned applications, such as target tracking and drug delivery. Until now, investigations about mobile molecular communication have been limited. In this paper, a static transmitter and a mobile bacterium-based receiver performing random walk are considered. In this mobile scenario, the channel impulse response changes due to the dynamic change of the distance between the transmitter and the receiver. Detection schemes based on fixed distance fail in signal detection in such a scenario. Furthermore, the intersymbol interference (ISI) effect becomes more complex due to the dynamic character of the signal which makes the estimation and mitigation of the ISI even more difficult. In this paper, an adaptive ISI mitigation method and two adaptive detection schemes are proposed for this mobile scenario. In the proposed scheme, adaptive ISI mitigation, estimation of dynamic distance, and the corresponding impulse response reconstruction are performed in each symbol interval. Based on the dynamic channel impulse response in each interval, two adaptive detection schemes, concentration-based adaptive threshold detection and peak-time-based adaptive detection, are proposed for signal detection. Simulations demonstrate that the ISI effect is significantly reduced and the adaptive detection schemes are reliable and robust for mobile molecular communication.
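
    The distance-dependent part of such a scheme can be sketched from the standard 3-D diffusion impulse response c(d,t) = N/(4πDt)^{3/2}·exp(−d²/(4Dt)), whose peak occurs at t = d²/(6D); re-deriving the threshold from the current distance estimate in each symbol interval adapts detection to receiver motion. Parameter values below are arbitrary, and the paper's detector details differ:

    ```python
    import math

    def peak_concentration(N, D, d):
        """Peak of the 3-D diffusion impulse response: the maximum over t of
        c(d, t) = N / (4*pi*D*t)**1.5 * exp(-d**2 / (4*D*t)) occurs at t = d**2 / (6*D)."""
        t_peak = d ** 2 / (6 * D)
        return N / (4 * math.pi * D * t_peak) ** 1.5 * math.exp(-d ** 2 / (4 * D * t_peak))

    def adaptive_threshold(N, D, d, frac=0.5):
        """Detection threshold tied to the current distance estimate d (re-done per symbol)."""
        return frac * peak_concentration(N, D, d)

    # Arbitrary illustrative parameters: N molecules released, diffusion coefficient D (m^2/s).
    near = adaptive_threshold(N=1e4, D=1e-10, d=1e-5)
    far = adaptive_threshold(N=1e4, D=1e-10, d=2e-5)
    ```

    Since the peak concentration falls off as 1/d³, a threshold fixed for the initial distance quickly becomes useless as the receiver drifts away, which is why fixed-distance detection schemes fail in the mobile scenario.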

  9. Anomaly Detection in Log Data using Graph Databases and Machine Learning to Defend Advanced Persistent Threats

    OpenAIRE

    Schindler, Timo

    2018-01-01

    Advanced Persistent Threats (APTs) are a major menace to the cyber security of computer networks. In 2015, a successful breach remained undetected for 146 days on average, as reported by [Fi16]. With our work we demonstrate a feasible and fast way to analyse real-world log data to detect breaches or breach attempts. By adapting well-known kill-chain mechanisms and combining a time-series database with an abstracted graph approach, it is possible to create flexible attack profiles. Using this approach...

  10. Paternal psychological response after ultrasonographic detection of structural fetal anomalies with a comparison to maternal response: a cohort study.

    Science.gov (United States)

    Kaasen, Anne; Helbig, Anne; Malt, Ulrik Fredrik; Naes, Tormod; Skari, Hans; Haugen, Guttorm Nils

    2013-07-12

    In Norway almost all pregnant women attend one routine ultrasound examination. Detection of fetal structural anomalies triggers psychological stress responses in the women affected. Despite the frequent use of ultrasound examination in pregnancy, little attention has been devoted to the psychological response of the expectant father following the detection of fetal anomalies. This is important for later fatherhood and the psychological interaction within the couple. We aimed to describe paternal psychological responses shortly after detection of structural fetal anomalies by ultrasonography, and to compare paternal and maternal responses within the same couple. A prospective observational study was performed at a tertiary referral centre for fetal medicine. Pregnant women with a structural fetal anomaly detected by ultrasound and their partners (study group, n=155) and 100 with normal ultrasound findings (comparison group) were included shortly after the sonographic examination (inclusion period: May 2006-February 2009). Gestational age was >12 weeks. We used psychometric questionnaires to assess self-reported social dysfunction, health perception, and psychological distress (intrusion, avoidance, arousal, anxiety, and depression): the Impact of Event Scale, General Health Questionnaire, and Edinburgh Postnatal Depression Scale. Fetal anomalies were classified according to severity and to diagnostic or prognostic ambiguity at the time of assessment. Median (range) gestational age at inclusion in the study and comparison groups was 19 (12-38) and 19 (13-22) weeks, respectively. Men and women in the study group had significantly higher levels of psychological distress than men and women in the comparison group on all psychometric endpoints. The lowest level of distress in the study group was associated with the least severe anomalies with no diagnostic or prognostic ambiguity (p < 0.033). Men had lower scores than women on all psychometric outcome variables. The correlation in

  11. GNSS reflectometry aboard the International Space Station: phase-altimetry simulation to detect ocean topography anomalies

    Science.gov (United States)

    Semmling, Maximilian; Leister, Vera; Saynisch, Jan; Zus, Florian; Wickert, Jens

    2016-04-01

    An ocean altimetry experiment using Earth-reflected GNSS signals has been proposed to the European Space Agency (ESA). It is part of the GNSS Reflectometry Radio Occultation Scatterometry (GEROS) mission that is planned aboard the International Space Station (ISS). Altimetric simulations are presented that examine the detection of ocean topography anomalies assuming GNSS phase delay observations. Such delay measurements are well established for positioning and are possible due to a sufficient synchronization of GNSS receiver and transmitter. For altimetric purposes, delays of Earth-reflected GNSS signals can be observed similarly to radar altimeter signals. The advantage of GNSS is the synchronized separation of transmitter and receiver, which allows a significantly increased number of observations per receiver, with more than 70 GNSS transmitters currently in orbit. The altimetric concept has already been applied successfully to flight data recorded over the Mediterranean Sea. The presented altimetric simulation considers anomalies in the Agulhas current region which are obtained from the Regional Ocean Modeling System (ROMS). Suitable reflection events in an elevation range between 3° and 30° last about 10 min, with ground-track lengths >3000 km. Typical along-track footprints (1 s signal integration time) have a length of about 5 km. The reflection's Fresnel zone limits the footprint of coherent observations to a major-axis extension between 1 and 6 km, depending on the elevation. The altimetric performance depends on the signal-to-noise ratio (SNR) of the reflection. Simulation results show that precision is better than 10 cm for an SNR of 30 dB, whereas it is worse than 0.5 m if the SNR goes down to 10 dB. Precision, in general, improves towards higher elevation angles. Critical biases are introduced by atmospheric and ionospheric refraction. Corresponding correction strategies are still under investigation.

  12. Value of Ultrasound in Detecting Urinary Tract Anomalies After First Febrile Urinary Tract Infection in Children.

    Science.gov (United States)

    Ghobrial, Emad E; Abdelaziz, Doaa M; Sheba, Maha F; Abdel-Azeem, Yasser S

    2016-05-01

    Background Urinary tract infection (UTI) is an infection that affects part of the urinary tract. Ultrasound is a noninvasive test that can demonstrate the size and shape of the kidneys, the presence of dilatation of the ureters, and the existence of anatomic abnormalities. The aim of the study is to estimate the value of ultrasound in detecting urinary tract anomalies after a first attack of UTI. Methods This study was conducted at the Nephrology Clinic, New Children's Hospital, Faculty of Medicine, Cairo University, from August 2012 to March 2013, and included 30 children who presented with a first attack of acute febrile UTI. All patients were subjected to urine analysis, urine culture and sensitivity, serum creatinine, complete blood count, and imaging in the form of renal ultrasound, voiding cysto-urethrography, and renal scan. Results All the patients had fever with a mean of 38.96°C ± 0.44°C, and the mean duration of illness was 6.23 ± 5.64 days. Nineteen patients (63.3%) had an ultrasound abnormality; the commonest abnormality was kidney stones (15.8%). Only 2 patients who had abnormal ultrasound also had vesicoureteric reflux on cystourethrography. Sensitivity of ultrasound was 66.7%, specificity was 37.5%, positive predictive value was 21.1%, negative predictive value was 81.8%, and total accuracy was 43.33%. Conclusion Ultrasound alone was of limited value in diagnosing and planning management of a first attack of febrile UTI; combined investigations are the best way to confirm the diagnosis of urinary tract anomalies. © The Author(s) 2015.

  13. Airborne detection of magnetic anomalies associated with soils on the Oak Ridge Reservation, Tennessee

    International Nuclear Information System (INIS)

    Doll, W.E.; Beard, L.P.; Helm, J.M.

    1995-01-01

    Reconnaissance airborne geophysical data acquired over the 35,000-acre Oak Ridge Reservation (ORR), TN, show several magnetic anomalies over undisturbed areas mapped as Copper Ridge Dolomite (CRD). The anomalies of interest are most apparent in magnetic gradient maps, where they exceed 0.06 nT/m and in some cases exceed 0.5 nT/m. Anomalies as large as 25 nT are seen on maps. Some of the anomalies correlate with known or suspected karst, or with apparent conductivity anomalies calculated from electromagnetic data acquired contemporaneously with the magnetic data. Some of the anomalies have a strong correlation with topographic lows or closed depressions. Surface magnetic data have been acquired over some of these sites and have confirmed the existence of the anomalies. Ground inspections in the vicinity of several of the anomalies have not led to any discoveries of manmade surface materials of sufficient size to generate the observed anomalies. One would expect an anomaly of approximately 1 nT for a pickup truck at 200 ft altitude. Typical residual magnetic anomalies have magnitudes of 5-10 nT, and some are as large as 25 nT. The absence of roads or other indications of culture (past or present) near the anomalies, together with the modeling of anomalies in data acquired with surface instruments, indicates that man-made metallic objects are unlikely to be responsible for the anomalies. The authors show that the observed anomalies in the CRD can reasonably be associated with thickening of the soil layer. The occurrence of the anomalies in areas where evidence of karstification is seen would follow, because sediment deposition would occur in topographic lows. Linear groups of anomalies on the maps may be associated with fracture zones which were eroded more than adjacent rocks and were subsequently covered with a thicker blanket of sediment. This study indicates that airborne magnetic data may be of use at other sites where fracture zones or buried collapse structures are of interest.
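
    The "~1 nT for a pickup truck at 200 ft" figure can be sanity-checked with the equatorial field of a magnetic dipole; the truck's effective moment of 1000 A·m² below is an illustrative assumption, not a value from the survey:

```python
# Order-of-magnitude check of the pickup-truck anomaly claim, modeling
# the truck as a magnetic dipole observed at sensor altitude.
MU0_OVER_4PI = 1e-7        # T*m/A
moment = 1000.0            # A*m^2, assumed effective dipole moment of a truck
r = 200 * 0.3048           # 200 ft sensor altitude in metres

b_tesla = MU0_OVER_4PI * moment / r**3    # equatorial dipole field magnitude
b_nanotesla = b_tesla * 1e9
print(f"{b_nanotesla:.2f} nT")            # a few tenths of a nT: same order as 1 nT
```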

  14. An Analysis of Mechanical Constraints when Using Superconducting Gravimeters for Far-Field Pre-Seismic Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Shyh-Chin Lan

    2011-01-01

    Pre-seismic gravity anomalies from records obtained at a 1 Hz sampling rate from superconducting gravimeters (SG) around East Asia are analyzed. A comparison of gravity anomalies to the source parameters of associated earthquakes shows that the detection of pre-seismic gravity anomalies is constrained by several mechanical conditions of the seismic fault plane. The constraints on the far-field pre-seismic gravity amplitude perturbation were examined, and the critical spatial relationship between the SG station and the epicenter for detection of the precursory signal was determined. The results show that: (1) the pre-seismic amplitude perturbation of gravity is inversely proportional to distance; (2) a transfer path from the epicenter to the SG station that crosses a tectonic boundary yields a relatively low pre-seismic gravity anomaly amplitude; (3) the pre-seismic gravity perturbation amplitude is also affected by the orientation between the location of an SG station and the strike of the ruptured fault plane. The removal of typhoon effects and the selection of SG stations within a certain intersection angle to the strike of the fault plane are essential for obtaining reliable pre-seismic gravity anomaly results.

  15. Hypergraph-based anomaly detection of high-dimensional co-occurrences.

    Science.gov (United States)

    Silva, Jorge; Willett, Rebecca

    2009-03-01

    This paper addresses the problem of detecting anomalous multivariate co-occurrences using a limited number of unlabeled training observations. A novel method based on a hypergraph representation of the data is proposed to deal with this very high-dimensional problem. Hypergraphs constitute an important extension of graphs which allow edges to connect more than two vertices simultaneously. A variational Expectation-Maximization algorithm for detecting anomalies directly on the hypergraph domain, without any feature selection or dimensionality reduction, is presented. The resulting estimate can be used to calculate a measure of anomalousness based on the False Discovery Rate. The algorithm has O(np) computational complexity, where n is the number of training observations and p is the number of potential participants in each co-occurrence event. This efficiency makes the method ideally suited for very high-dimensional settings, and it requires no tuning, bandwidth or regularization parameters. The proposed approach is validated on both high-dimensional synthetic data and the Enron email database, where p > 75,000, and it is shown to outperform other state-of-the-art methods.
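
    The False-Discovery-Rate-based anomalousness measure can be illustrated with the standard Benjamini-Hochberg procedure; this is a generic sketch of FDR control, not the authors' exact estimator:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of discoveries controlling the FDR at level q."""
    p = np.asarray(pvals)
    n = len(p)
    order = np.argsort(p)
    # Step-up thresholds: reject p_(1..k) for the largest k with p_(k) <= k*q/n
    thresh = q * np.arange(1, n + 1) / n
    below = p[order] <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(n, dtype=bool)
    mask[order[:k]] = True
    return mask
```

    Applied to p-values per co-occurrence event, the mask marks events declared anomalous at the chosen FDR level.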

  16. Behavior Drift Detection Based on Anomalies Identification in Home Living Quantitative Indicators

    Directory of Open Access Journals (Sweden)

    Fabio Veronese

    2018-01-01

    Home automation and smart home diffusion provide an interesting opportunity to implement elderly monitoring, a valid new technological support that allows in-place aging of seniors by means of a detection system that notifies potential anomalies. Monitoring has been implemented by means of Complex Event Processing on live streams of home automation data: this allows the analysis of the behavior of the house inhabitant through quantitative indicators. Different kinds of quantitative indicators for monitoring and behavior drift detection have been identified and implemented using the Esper complex event processing engine. The chosen solution permits us not only to exploit the queries when run "online", but also enables "offline" (re-)execution for testing and a posteriori analysis. Indicators were developed on both real-world data and realistic simulations. Tests were made on a dataset of 180 days: the obtained results prove that it is possible to detect behavior changes for an evaluation of a person's condition.

  17. Design of a Fuzzy Logic based Framework for Comprehensive Anomaly Detection in Real-World Energy Consumption Data

    NARCIS (Netherlands)

    Hol, M.; Bilgin, A.; Bosse, T.; Bredeweg, B.

    2017-01-01

    Due to the rapid growth of energy consumption worldwide, it has become a necessity that the energy waste caused by buildings is explicated by the aid of automated systems that can identify anomalous behaviour. Comprehensible anomaly detection, however, is a challenging task considering the lack of

  19. Detecting geothermal anomalies and evaluating LST geothermal component by combining thermal remote sensing time series and land surface model data

    NARCIS (Netherlands)

    Romaguera, M.; Vaughan, R. G.; Ettema, J.; Izquierdo-Verdiguier, E.; Hecker, C. A.; van der Meer, F. D.

    2017-01-01

    This paper explores for the first time the possibilities to use two land surface temperature (LST) time series of different origins (geostationary Meteosat Second Generation satellite data and Noah land surface modelling, LSM), to detect geothermal anomalies and extract the geothermal component of

  20. Possibility of detecting triple gluon coupling and Adler-Bell-Jackiw anomaly in polarized deep inelastic scattering

    International Nuclear Information System (INIS)

    Lam, C.S.; Li, B.A.

    1980-05-01

    A way to detect experimentally the existence of triple gluon coupling and the Adler-Bell-Jackiw anomaly is to measure the Q²-dependence of polarized deep inelastic scattering. These effects lead to a ln ln Q² term which we calculate by introducing a new gluon operator in the Wilson expansion.

  1. Estimation of the Potential Detection of Diatom Assemblages Based on Ocean Color Radiance Anomalies in the North Sea

    Directory of Open Access Journals (Sweden)

    Anne-Hélène Rêve-Lamarche

    2017-12-01

    Over the past years, a large number of new approaches in the domain of ocean color have been developed, leading to a variety of innovative descriptors for phytoplankton communities. One of these methods, named PHYSAT, currently allows for the qualitative detection of five main phytoplankton groups from ocean-color measurements. Even though PHYSAT products are widely used in various applications and projects, the approach is limited by the fact that it identifies only dominant phytoplankton groups. This limitation is due to the use of biomarker pigment ratios for establishing empirical relationships between in-situ information and specific ocean-color radiance anomalies in open ocean waters. However, the theoretical explanation of PHYSAT suggests that it could be possible to detect more than dominance cases and move toward phytoplankton assemblage detection. Thus, to evaluate the potential of PHYSAT for the detection of phytoplankton assemblages, we took advantage of the Continuous Plankton Recorder (CPR) survey, collected in both the English Channel and the North Sea. The available CPR dataset contains information on diatom abundance in two large areas of the North Sea for the period 1998-2010. Using this unique dataset, recurrent diatom assemblages were retrieved based on classification of CPR samples. Six diatom assemblages were identified in situ, each having indicator taxa or species. Once this first step was completed, the in-situ analysis was used to empirically associate the diatom assemblages with specific PHYSAT spectral anomalies. This step was facilitated by the use of previous classifications of regional radiance anomalies in terms of shape and amplitude, coupled with phenological tools. Through a matchup exercise, three CPR assemblages were associated with specific radiance anomalies. The maps of detection of these specific radiance anomalies are in close agreement with current in-situ ecological knowledge.

  2. On-line Flagging of Anomalies and Adaptive Sequential Hypothesis Testing for Fine-feature Characterization of Geosynchronous Satellites

    Science.gov (United States)

    Chaudhary, A.; Payne, T.; Kinateder, K.; Dao, P.; Beecher, E.; Boone, D.; Elliott, B.

    The objective of on-line flagging in this paper is to perform interactive assessment of geosynchronous satellite anomalies, such as cross-tagging of satellites in a cluster, solar panel offset changes, etc. This assessment utilizes a Bayesian belief propagation procedure and includes automated update of baseline signature data for the satellite while accounting for seasonal changes. Its purpose is to enable an ongoing, automated assessment of satellite behavior through its life cycle using the photometry data collected during the synoptic search performed by a ground or space-based sensor as part of its metrics mission. Changes in the satellite features are reported along with the probabilities of Type I and Type II errors. The objective of adaptive sequential hypothesis testing in this paper is to define future sensor tasking for the purpose of characterizing fine features of the satellite. The tasking is designed to maximize new information with the least number of photometry data points to be collected during the synoptic search by a ground or space-based sensor. Its calculation is based on information entropy techniques. The tasking is defined by considering a sequence of hypotheses in regard to the fine features of the satellite. The optimal observation conditions are then ordered so as to maximize new information about a chosen fine feature. The combined objective of on-line flagging and adaptive sequential hypothesis testing is to progressively discover new information about the features of geosynchronous satellites by leveraging the regular but sparse cadence of data collection during the synoptic search performed by a ground or space-based sensor. Automated Algorithm to Detect Changes in Geostationary Satellite's Configuration and Cross-Tagging (Phan Dao, Air Force Research Laboratory/RVB): By characterizing geostationary satellites based on photometry and color photometry, analysts can

  3. Vibration-Based Adaptive Novelty Detection Method for Monitoring Faults in a Kinematic Chain

    Directory of Open Access Journals (Sweden)

    Jesus Adolfo Cariño-Corrales

    2016-01-01

    This paper presents an adaptive novelty detection methodology applied to a kinematic chain for the monitoring of faults. The proposed approach starts from the premise that only information about the healthy operation of the machine is initially available and that fault scenarios will eventually develop. This approach aims to cover some of the challenges presented when condition monitoring is applied under a continuous learning framework. The structure of the method is divided into two recursive stages: first, an offline stage for initialization and retraining of the feature reduction and novelty detection modules and, second, an online monitoring stage to continuously assess the condition of the machine. Contrary to classical static feature reduction approaches, the proposed method reformulates the features by employing first a Laplacian Score ranking and then a Fisher Score ranking for retraining. The proposed methodology is validated experimentally by monitoring the vibration measurements of a kinematic chain driven by an induction motor. Two faults are induced in the motor to validate the method's performance in detecting anomalies and adapting the feature reduction and novelty detection modules to the new information. The obtained results show the advantages of employing an adaptive approach for novelty detection and feature reduction, making the proposed method suitable for industrial machinery diagnosis applications.
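
    The Fisher Score ranking used in the retraining stage follows a standard definition (between-class scatter over within-class scatter per feature); a minimal sketch on assumed synthetic data:

```python
import numpy as np

def fisher_score(X, y):
    """Fisher score per feature: between-class variance over within-class variance."""
    scores = np.zeros(X.shape[1])
    mu = X.mean(axis=0)
    for j in range(X.shape[1]):
        num = den = 0.0
        for c in np.unique(y):
            xc = X[y == c, j]
            num += len(xc) * (xc.mean() - mu[j]) ** 2   # between-class scatter
            den += len(xc) * xc.var()                   # within-class scatter
        scores[j] = num / den
    return scores
```

    Features are then re-ranked by descending score each time labeled fault data become available.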

  4. Confabulation Based Real-time Anomaly Detection for Wide-area Surveillance Using Heterogeneous High Performance Computing Architecture

    Science.gov (United States)

    2015-06-01

    Report on confabulation-based real-time anomaly detection for wide-area surveillance using a heterogeneous high-performance computing architecture (Syracuse; contract number FA8750-12-1-0251). [...] processors including graphic processor units (GPUs) and Intel Xeon Phi processors. Experimental results showed significant speedups, which can enable

  5. Adaptive gaze control for object detection

    NARCIS (Netherlands)

    De Croon, G.C.H.E.; Postma, E.O.; Van den Herik, H.J.

    2011-01-01

    We propose a novel gaze-control model for detecting objects in images. The model, named act-detect, uses the information from local image samples in order to shift its gaze towards object locations. The model makes two main contributions. The first contribution is that the model's setup makes

  6. Application of Ground-Penetrating Radar for Detecting Internal Anomalies in Tree Trunks with Irregular Contours.

    Science.gov (United States)

    Li, Weilin; Wen, Jian; Xiao, Zhongliang; Xu, Shengxia

    2018-02-22

    To assess the health condition of tree trunks, it is necessary to estimate the layers and anomalies of their internal structure. The main objective of this paper is to investigate the internal part of tree trunks considering their irregular contour. In this respect, we used ground-penetrating radar (GPR) for non-invasive detection of defects and deterioration in living tree trunks. The Hilbert transform algorithm and the reflection amplitudes were used to estimate the relative dielectric constant. The point cloud data technique was applied as well to extract the irregular contours of trunks. The feasibility and accuracy of the methods were examined through numerical simulations and laboratory and field measurements. The results demonstrated that the applied methodology allowed for accurate characterization of the internal inhomogeneity. Furthermore, the point cloud technique resolved the trunk well by providing high-precision coordinate information. This study also demonstrated that cross-section tomography provided images with high resolution and accuracy. These integrated techniques thus proved to be promising for observing tree trunks and other cylindrical objects. The applied approaches offer great promise for future 3D reconstruction of tomographic images with radar waves.
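
    The Hilbert-transform step (extracting the instantaneous amplitude of a trace, from which reflection amplitudes are read off) can be sketched with SciPy; the decaying wavelet below is illustrative, not GPR field data:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
# Synthetic "trace": a 50 Hz oscillation whose amplitude decays with time
envelope_true = np.exp(-3 * t)
trace = envelope_true * np.sin(2 * np.pi * 50 * t)

analytic = hilbert(trace)       # trace + i * (Hilbert transform of trace)
envelope = np.abs(analytic)     # instantaneous amplitude of the trace
```

    Away from the record edges, the recovered envelope closely tracks the true decay, which is what makes it a convenient amplitude estimate for reflections.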

  7. A Comparative Study of Anomaly Detection Techniques for Smart City Wireless Sensor Networks.

    Science.gov (United States)

    Garcia-Font, Victor; Garrigues, Carles; Rifà-Pous, Helena

    2016-06-13

    In many countries around the world, smart cities are becoming a reality. These cities contribute to improving citizens' quality of life by providing services that are normally based on data extracted from wireless sensor networks (WSN) and other elements of the Internet of Things. Additionally, public administration uses these smart city data to increase its efficiency, to reduce costs and to provide additional services. However, the information received at smart city data centers is not always accurate, because WSNs are sometimes prone to error and are exposed to physical and computer attacks. In this article, we use real data from the smart city of Barcelona to simulate WSNs and implement typical attacks. Then, we compare frequently used anomaly detection techniques to disclose these attacks. We evaluate the algorithms under different requirements on the available network status information. As a result of this study, we conclude that one-class Support Vector Machines is the most appropriate technique. We achieve a true positive rate at least 56% higher than the rates achieved with the other compared techniques in a scenario with a maximum false positive rate of 5%, and at least 26% higher in a scenario with a false positive rate of 15%.
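
    A minimal one-class SVM sketch in scikit-learn, with synthetic "sensor readings" standing in for the Barcelona data (the cluster location and `nu` value are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)
# Normal sensor readings: two features (e.g. temperature, humidity)
normal = rng.normal(loc=[20.0, 50.0], scale=[1.0, 2.0], size=(500, 2))

# Train only on normal traffic; nu bounds the fraction of rejected training data
clf = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(normal)

typical = clf.predict([[20.0, 50.0]])[0]    # +1: inlier
tampered = clf.predict([[35.0, 90.0]])[0]   # -1: outlier / possible attack
```

    The detector never sees attack examples at training time, which matches the WSN setting where attacks are rare and unlabeled.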

  8. Using Low Resolution Satellite Imagery for Yield Prediction and Yield Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Oscar Rojas

    2013-04-01

    Low resolution satellite imagery has been extensively used for crop monitoring and yield forecasting for over 30 years and plays an important role in a growing number of operational systems. The combination of their high temporal frequency with their extended geographical coverage, generally associated with low costs per area unit, makes these images a convenient choice at both national and regional scales. Several qualitative and quantitative approaches can be clearly distinguished, going from the use of low resolution satellite imagery as the main predictor of final crop yield to complex crop growth models where remote sensing-derived indicators play different roles, depending on the nature of the model and on the availability of data measured on the ground. Vegetation performance anomaly detection with low resolution images continues to be a fundamental component of early warning and drought monitoring systems at the regional scale. For applications at more detailed scales, the limitations created by the mixed nature of low resolution pixels are being progressively reduced by the higher resolution offered by new sensors, while the continuity of existing systems remains crucial for ensuring the availability of the long time series needed by the majority of the yield prediction methods used today.

  9. Visualizing histopathologic deep learning classification and anomaly detection using nonlinear feature space dimensionality reduction.

    Science.gov (United States)

    Faust, Kevin; Xie, Quin; Han, Dominick; Goyle, Kartikay; Volynskaya, Zoya; Djuric, Ugljesa; Diamandis, Phedias

    2018-05-16

    There is growing interest in utilizing artificial intelligence, and particularly deep learning, for computer vision in histopathology. While accumulating studies highlight expert-level performance of convolutional neural networks (CNNs) on focused classification tasks, most studies rely on probability distribution scores with empirically defined cutoff values based on post-hoc analysis. More generalizable tools that allow humans to visualize histology-based deep learning inferences and decision making are scarce. Here, we leverage t-distributed Stochastic Neighbor Embedding (t-SNE) to reduce dimensionality and depict how CNNs organize histomorphologic information. Unique to our workflow, we develop a quantitative and transparent approach to visualizing classification decisions prior to softmax compression. By discretizing the relationships between classes on the t-SNE plot, we show we can super-impose randomly sampled regions of test images and use their distribution to render statistically-driven classifications. Therefore, in addition to providing intuitive outputs for human review, this visual approach can carry out automated and objective multi-class classifications similar to more traditional and less-transparent categorical probability distribution scores. Importantly, this novel classification approach is driven by a priori statistically defined cutoffs. It therefore serves as a generalizable classification and anomaly detection tool less reliant on post-hoc tuning. Routine incorporation of this convenient approach for quantitative visualization and error reduction in histopathology aims to accelerate early adoption of CNNs into generalized real-world applications where unanticipated and previously untrained classes are often encountered.
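
    The dimensionality-reduction step can be sketched with scikit-learn's t-SNE; the random vectors below stand in for penultimate-layer CNN activations of two tissue classes (sizes and separations are assumptions for illustration):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Stand-ins for 64-D CNN feature vectors of two histologic classes
class_a = rng.normal(0.0, 1.0, size=(40, 64))
class_b = rng.normal(4.0, 1.0, size=(40, 64))
features = np.vstack([class_a, class_b])

# Embed into 2-D for visualization; perplexity must be < number of samples
embedding = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(features)
print(embedding.shape)   # (80, 2)
```

    Regions sampled from a test image can then be projected into this plane and classified by where their distribution falls relative to the discretized class regions.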

  11. Edge detection of magnetic anomalies using analytic signal of tilt angle (ASTA)

    Science.gov (United States)

    Alamdar, K.; Ansari, A. H.; Ghorbani, A.

    2009-04-01

    Magnetics is a commonly used geophysical technique to identify and image potential subsurface targets. Interpretation of magnetic anomalies is a complex process due to the superposition of multiple magnetic sources, the presence of geologic and cultural noise, and acquisition and positioning errors. Both the vertical and horizontal derivatives of potential field data are useful: the horizontal derivative enhances edges, whereas the vertical derivative narrows the width of an anomaly and so locates source bodies more accurately. We can combine the vertical and horizontal derivatives of the magnetic field to obtain the analytic signal, which is independent of the body magnetization direction and whose maximum value lies directly over the edges of the body. The tilt angle filter is a phase-based filter, defined as the angle between the vertical derivative and the total horizontal derivative. The tilt angle ranges from +90° to -90°, and its zero value lies over the body edge. One disadvantage of this filter is that for deep sources the detected edge is blurred. To overcome this problem, many authors have introduced new filters, such as the total horizontal derivative of the tilt angle or the vertical derivative of the tilt angle; because these filters use high-order derivatives, their results may be too noisy. If we combine the analytic signal and the tilt angle, a new filter termed ASTA is produced, whose maximum value lies directly over the body edge; it delineates the body edge more easily than the tilt angle and avoids its complexity. In this work the new filter has been demonstrated on magnetic data from an area in the Sar-Cheshmeh region in Iran. This area is located at 55° longitude and 32° latitude and is a copper-potential region. The main formations in this area are andesite and trachyandesite. Magnetic surveying was employed to separate the boundaries of the andesite and trachyandesite from the adjacent area. In this regard a variety of filters such as the analytic signal, the tilt angle and the ASTA filter have been applied, which
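
    A minimal numpy sketch of the tilt angle and the ASTA filter, assuming a gridded total-field anomaly and an FFT-based vertical derivative (a common approximation, not the authors' exact processing chain):

```python
import numpy as np

def vertical_derivative(grid, dx=1.0):
    """First vertical derivative via multiplication by |k| in the wavenumber domain."""
    ny, nx = grid.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dx)
    KX, KY = np.meshgrid(kx, ky)
    return np.real(np.fft.ifft2(np.fft.fft2(grid) * np.hypot(KX, KY)))

def tilt_angle(T, dx=1.0):
    """Tilt angle: arctan of vertical derivative over total horizontal derivative."""
    gy, gx = np.gradient(T, dx)
    return np.arctan2(vertical_derivative(T, dx), np.hypot(gx, gy))

def asta(T, dx=1.0):
    """ASTA: analytic-signal amplitude of the tilt angle grid."""
    t = tilt_angle(T, dx)
    gy, gx = np.gradient(t, dx)
    return np.sqrt(gx**2 + gy**2 + vertical_derivative(t, dx)**2)
```

    The tilt angle stays bounded in [-90°, +90°] regardless of anomaly amplitude, which is why taking its analytic signal equalizes shallow and deep edges.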

  12. Anomaly detection for medical images based on a one-class classification

    Science.gov (United States)

    Wei, Qi; Ren, Yinhao; Hou, Rui; Shi, Bibo; Lo, Joseph Y.; Carin, Lawrence

    2018-02-01

    Detecting an anomaly such as a malignant tumor or a nodule from medical images, including mammography, CT or PET images, is still an ongoing research problem drawing a lot of attention, with applications in medical diagnosis. A conventional way to address this is to learn a discriminative model using training datasets of negative and positive samples. The learned model can be used to classify a testing sample into a positive or negative class. However, in medical applications, the high imbalance between negative and positive samples poses a difficulty for learning algorithms, as they will be biased towards the majority group, i.e., the negative one. To address this imbalanced data issue as well as leverage the huge amount of negative samples, i.e., normal medical images, we propose to learn an unsupervised model to characterize the negative class. To make the learned model more flexible and extendable for medical images of different scales, we have designed an autoencoder based on a deep neural network to characterize the negative patches decomposed from large medical images. A testing image is decomposed into patches and then fed into the learned autoencoder to reconstruct these patches themselves. The reconstruction error of one patch is used to classify this patch into a binary class, i.e., a positive or a negative one, leading to a one-class classifier. The positive patches highlight the suspicious areas containing anomalies in a large medical image. The proposed method has been tested on the INbreast dataset and achieves an AUC of 0.84. The main contribution of our work can be summarized as follows. 1) The proposed one-class learning requires only data from one class, i.e., the negative data; 2) The patch-based learning makes the proposed method scalable to images of different sizes and helps avoid the large scale problem for medical images; 3) The training of the proposed deep convolutional neural network (DCNN) based autoencoder is fast and stable.
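
    A toy sketch of the patch-level reconstruction-error scoring, using an sklearn regressor as a stand-in autoencoder; the random vectors standing in for patches, the bottleneck size and the 99th-percentile threshold are all illustrative assumptions, not the paper's DCNN:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# 16-D random vectors stand in for flattened "negative" (normal) patches
normal = rng.normal(0.0, 1.0, size=(500, 16))

# Bottleneck regressor trained to reproduce its own input
ae = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
ae.fit(normal, normal)

def anomaly_score(X):
    """Per-patch mean squared reconstruction error."""
    return np.mean((ae.predict(X) - X) ** 2, axis=1)

# Threshold chosen from negative data only: the one-class decision rule
tau = np.percentile(anomaly_score(normal), 99)
suspicious = rng.normal(5.0, 1.0, size=(1, 16))   # patch unlike the training data
flagged = anomaly_score(suspicious)[0] > tau
```

    Patches whose error exceeds the threshold are flagged positive, which is what localizes suspicious areas within a large image.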

  13. Novel ST-MUSIC-based spectral analysis for detection of ULF geomagnetic signals anomalies associated with seismic events in Mexico

    Directory of Open Access Journals (Sweden)

    Omar Chavez

    2016-05-01

    Recently, the analysis of ultra-low-frequency (ULF) geomagnetic signals to detect seismic anomalies has been reported in several works. Yet, although these methods show promising results, detection remains difficult because the anomalies are generally very weak and embedded in high noise levels. In this work, short-time multiple signal classification (ST-MUSIC), a technique with high frequency resolution and noise immunity, is proposed for the detection of seismic anomalies in ULF geomagnetic signals. Besides, the energy (E) of the geomagnetic signals processed by ST-MUSIC is also presented as a complementary parameter to measure the fluctuations between periods of seismic activity and seismic calm. The usefulness and effectiveness of the proposal are demonstrated through the analysis of a synthetic signal and five real signals with earthquakes. The analysed ULF geomagnetic signals have been obtained using a tri-axial fluxgate magnetometer at the Juriquilla station, located in Queretaro, Mexico (geographic coordinates: longitude 100.45° W and latitude 20.70° N). The results obtained show the detection of seismic perturbations before, during, and after the main shock, making the proposal a suitable tool for detecting seismic precursors.

  14. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    Science.gov (United States)

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
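
    A plain 3-sigma Shewhart chart of the kind underlying SPC surveillance can be sketched as follows; the monthly counts are illustrative, and real count data are often charted with Poisson-based c/u limits instead of individuals limits:

```python
import numpy as np

def shewhart_limits(baseline):
    """3-sigma control limits estimated from an in-control baseline period."""
    center = np.mean(baseline)
    sigma = np.std(baseline, ddof=1)
    return center - 3 * sigma, center + 3 * sigma

# Monthly infection counts: stable baseline, then a possible outbreak month
baseline = np.array([4, 5, 3, 6, 4, 5, 4, 3, 5, 4, 6, 5], dtype=float)
lcl, ucl = shewhart_limits(baseline)

new_months = np.array([5.0, 4.0, 12.0])
out_of_control = new_months > ucl        # flag points above the upper limit
```

    Seasonal variation and autocorrelation violate the independence assumption behind these limits, which is the weakness the ABD comparison in the study probes.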

  15. Self-adaptive change detection in streaming data with non-stationary distribution

    KAUST Repository

    Zhang, Xiangliang

    2010-01-01

    Non-stationary distribution, in which the data distribution evolves over time, is a common issue in many application fields, e.g., intrusion detection and grid computing. Detecting changes in massive streaming data with a non-stationary distribution helps to flag anomalies, to clean noise, and to report new patterns. In this paper, we employ a novel approach for detecting changes in streaming data with the purpose of improving the quality of modeling the data streams. By observing outliers, this change detection approach uses a weighted standard deviation to monitor the evolution of the distribution of data streams. A cumulative statistical test, Page-Hinkley, is employed to collect evidence of changes in distribution. The parameter used for reporting the changes is self-adaptively adjusted according to the distribution of the data streams, rather than set to a fixed empirical value. The self-adaptability of the novel approach enhances the effectiveness of modeling data streams by promptly catching changes in the distributions. We validated the approach on an online clustering framework with the benchmark KDD Cup 1999 intrusion detection data set as well as with a real-world grid data set. The validation results demonstrate its better performance in achieving higher accuracy and a lower percentage of outliers compared to other change detection approaches. © 2010 Springer-Verlag.
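
    The Page-Hinkley test itself is compact; a sketch for detecting an upward mean shift, with delta and lam as the usual tolerance and alarm parameters (fixed values here for illustration, whereas the paper's point is precisely to self-adapt the alarm parameter):

```python
def page_hinkley(stream, delta=0.005, lam=5.0):
    """Page-Hinkley test for an upward mean shift; returns alarm indices."""
    mean = 0.0
    m = 0.0        # cumulative deviation from the running mean
    M = 0.0        # running minimum of m
    alarms = []
    for t, x in enumerate(stream, 1):
        mean += (x - mean) / t          # incremental running mean
        m += x - mean - delta
        M = min(M, m)
        if m - M > lam:                 # evidence of change exceeds the threshold
            alarms.append(t - 1)
    return alarms
```

    On a stream whose level jumps, the statistic m pulls away from its historical minimum M and an alarm fires a few samples after the shift.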

  16. Detection of sinkholes or anomalies using full seismic wave fields : phase II.

    Science.gov (United States)

    2016-08-01

    A new 2-D Full Waveform Inversion (FWI) software code was developed to characterize layering and anomalies beneath the ground surface using seismic testing. The software is capable of assessing the shear and compression wave velocities (Vs and Vp) fo...

  17. Adaptive search and detection of laser radiation

    International Nuclear Information System (INIS)

    Efendiev, F.A.; Kasimova, F.I.

    2008-01-01

    Establishing space optical communication links involves solving several difficult problems, among which spatial search, detection and target tracking stand out. Indeed, the main advantage of optical-band systems, the high directivity of the radiation, makes entering into communication a challenging task, which consists in the mutual pointing of the receiving and transmitting antennas. The detection algorithm, obtained by solving the corresponding statistical problem of optimal detector synthesis, determines the structure of the detector; the quality of its operation depends on the average characteristics of the signal, the background radiation and the thermal noise, and requires full a priori certainty about the observation conditions. The algorithm of the optimal detector of laser light intensity-modulated at a subcarrier frequency assumes a priori known intensities of the background radiation and of the internal noise power of the photodetector.

  18. An Adaptive Ship Detection Scheme for Spaceborne SAR Imagery

    Directory of Open Access Journals (Sweden)

    Xiangguang Leng

    2016-08-01

    Full Text Available With the rapid development of spaceborne synthetic aperture radar (SAR and the increasing need of ship detection, research on adaptive ship detection in spaceborne SAR imagery is of great importance. Focusing on practical problems of ship detection, this paper presents a highly adaptive ship detection scheme for spaceborne SAR imagery. It is able to process a wide range of sensors, imaging modes and resolutions. Two main stages are identified in this paper, namely: ship candidate detection and ship discrimination. Firstly, this paper proposes an adaptive land masking method using ship size and pixel size. Secondly, taking into account the imaging mode, incidence angle, and polarization channel of SAR imagery, it implements adaptive ship candidate detection in spaceborne SAR imagery by applying different strategies to different resolution SAR images. Finally, aiming at different types of typical false alarms, this paper proposes a comprehensive ship discrimination method in spaceborne SAR imagery based on confidence level and complexity analysis. Experimental results based on RADARSAT-1, RADARSAT-2, TerraSAR-X, RS-1, and RS-3 images demonstrate that the adaptive scheme proposed in this paper is able to detect ship targets in a fast, efficient and robust way.
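    Ship candidate detection in SAR imagery is commonly built on constant-false-alarm-rate (CFAR) style local thresholding. The following 1-D two-parameter CFAR sweep is a generic sketch of that building block under made-up intensities, not the paper's multi-resolution, multi-sensor pipeline:

```python
from statistics import mean, pstdev

def cfar_candidates(row, guard=2, train=4, k=3.0):
    """Two-parameter CFAR sweep over one line of SAR intensities.

    For each cell, background statistics come from training cells outside
    a guard band; the cell is a candidate when it exceeds
    mean + k * std of that local background."""
    hits = []
    for i in range(len(row)):
        bg = []
        for j in range(i - guard - train, i - guard):
            if 0 <= j < len(row):
                bg.append(row[j])
        for j in range(i + guard + 1, i + guard + train + 1):
            if 0 <= j < len(row):
                bg.append(row[j])
        if len(bg) < train:          # too close to the edge
            continue
        if row[i] > mean(bg) + k * pstdev(bg):
            hits.append(i)
    return hits

# One bright scatterer in homogeneous sea clutter:
row = [1.0] * 10 + [9.0] + [1.0] * 10
print(cfar_candidates(row))  # [10]
```

    The guard band keeps the bright target itself out of its own background estimate, which is what makes the threshold locally adaptive rather than global.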

  19. Toward Bulk Synchronous Parallel-Based Machine Learning Techniques for Anomaly Detection in High-Speed Big Data Networks

    Directory of Open Access Journals (Sweden)

    Kamran Siddique

    2017-09-01

    Full Text Available Anomaly detection systems, also known as intrusion detection systems (IDSs, continuously monitor network traffic aiming to identify malicious actions. Extensive research has been conducted to build efficient IDSs emphasizing two essential characteristics. The first is concerned with finding optimal feature selection, while the second deals with employing robust classification schemes. However, the advent of big data concepts in the anomaly detection domain and the appearance of sophisticated network attacks in the modern era require some fundamental methodological revisions to develop IDSs. Therefore, we first identify two more significant characteristics in addition to the ones mentioned above. These refer to the need for employing specialized big data processing frameworks and utilizing appropriate datasets for validating a system’s performance, which is largely overlooked in existing studies. Afterwards, we set out to develop an anomaly detection system that comprehensively follows these four identified characteristics, i.e., the proposed system (i performs feature ranking and selection using information gain and automated branch-and-bound algorithms respectively; (ii employs logistic regression and extreme gradient boosting techniques for classification; (iii introduces bulk synchronous parallel processing to cater to the computational requirements of high-speed big data networks; and (iv uses the Information Security Centre of Excellence of the University of New Brunswick real-time contemporary dataset for performance evaluation. We present experimental results that verify the efficacy of the proposed system.
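    Information-gain feature ranking, the first step named above, scores each feature by how much it reduces the entropy of the class label. A toy sketch on made-up traffic features (the feature and label names are purely illustrative):

```python
from math import log2
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """H(labels) - H(labels | feature) for a discrete feature."""
    total, n, cond = entropy(labels), len(labels), 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        cond += len(subset) / n * entropy(subset)
    return total - cond

# Rank two toy features against an attack/normal label.
labels  = ["attack", "attack", "normal", "normal"]
f_port  = ["high", "high", "low", "low"]    # perfectly predictive
f_proto = ["tcp", "udp", "tcp", "udp"]      # uninformative
print(information_gain(f_port, labels))   # 1.0
print(information_gain(f_proto, labels))  # 0.0
```

    Features are then ranked by this score and the top ones passed on to the classifier stage.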

  20. An Adaptive Database Intrusion Detection System

    Science.gov (United States)

    Barrios, Rita M.

    2011-01-01

    Intrusion detection is difficult to accomplish when attempting to employ current methodologies when considering the database and the authorized entity. It is a common understanding that current methodologies focus on the network architecture rather than the database, which is not an adequate solution when considering the insider threat. Recent…

  1. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    Science.gov (United States)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. 
We also found
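    Steps (2)-(3) of MODVOLC2 — building monthly reference and variability images, then flagging departures from the envelope of normal behavior — reduce, for a single pixel, to a per-calendar-month z-score test. A simplified sketch with illustrative data and threshold:

```python
from collections import defaultdict
from statistics import mean, pstdev

def monthly_anomalies(series, months, k=2.0):
    """Flag values exceeding their calendar-month reference (mean)
    by more than k times the monthly variability (std)."""
    by_month = defaultdict(list)
    for x, m in zip(series, months):
        by_month[m].append(x)
    return [x > mean(by_month[m]) + k * pstdev(by_month[m])
            for x, m in zip(series, months)]

# Alternating Jan/Feb radiances for one pixel; one January value runs hot.
months = [1, 2] * 6
series = [1, 10, 1, 10, 1, 10, 1, 10, 1, 10, 9, 10]
flags = monthly_anomalies(series, months)
print([i for i, f in enumerate(flags) if f])  # [10]
```

    Normalizing by each month's own variability is what lets the test tolerate strong seasonal cycles: the hot January sample is anomalous even though ordinary February values are numerically larger.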

  2. Detection of geothermal anomalies in Tengchong, Yunnan Province, China from MODIS multi-temporal night LST imagery

    Science.gov (United States)

    Li, H.; Kusky, T. M.; Peng, S.; Zhu, M.

    2012-12-01

    Thermal infrared (TIR) remote sensing is an important technique in the exploration of geothermal resources. In this study, a geothermal survey is conducted in Tengchong area of Yunnan province in China using multi-temporal MODIS LST (Land Surface Temperature). The monthly night MODIS LST data from Mar. 2000 to Mar. 2011 of the study area were collected and analyzed. The 132 month average LST map was derived and three geothermal anomalies were identified. The findings of this study agree well with the results from relative geothermal gradient measurements. Finally, we conclude that TIR remote sensing is a cost-effective technique to detect geothermal anomalies. Combining TIR remote sensing with geological analysis and the understanding of geothermal mechanism is an accurate and efficient approach to geothermal area detection.

  3. Research on Healthy Anomaly Detection Model Based on Deep Learning from Multiple Time-Series Physiological Signals

    Directory of Open Access Journals (Sweden)

    Kai Wang

    2016-01-01

    Full Text Available Health is vital to every human being. To further improve its already considerable medical technology, the medical community is transitioning towards a proactive approach that anticipates and mitigates risks before illness occurs. This approach requires measuring a person's physiological signals and analyzing these data at regular intervals. In this paper, we present a novel approach that applies deep learning to physiological signal analysis to allow doctors to identify latent risks. However, extracting high-level information from physiological time-series data is a hard problem faced by the machine learning community. Therefore, in this approach, we apply a model based on a convolutional neural network that automatically learns features from raw physiological signals in an unsupervised manner, and then use a multivariate Gaussian distribution anomaly detection method on the learned features to detect anomalous data. Our experiments show significant performance in physiological signal anomaly detection, so the approach is a promising tool for doctors to identify early signs of illness even if the criteria are unknown a priori.
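    The Gaussian anomaly-detection stage described above can be sketched with a diagonal-covariance model fitted to "healthy" feature vectors — a simplification of the full multivariate Gaussian, with made-up feature values and an illustrative density cutoff:

```python
from math import exp, pi, sqrt

def fit_gaussian(rows):
    """Per-feature mean and variance (diagonal covariance)."""
    n, d = len(rows), len(rows[0])
    mu = [sum(r[j] for r in rows) / n for j in range(d)]
    var = [sum((r[j] - mu[j]) ** 2 for r in rows) / n for j in range(d)]
    return mu, var

def density(x, mu, var):
    """Product of independent Gaussian densities, one per feature."""
    p = 1.0
    for xi, m, v in zip(x, mu, var):
        p *= exp(-(xi - m) ** 2 / (2 * v)) / sqrt(2 * pi * v)
    return p

# Fit on "healthy" feature vectors; flag low-density new samples.
train = [[1.0, 0.9], [1.1, 1.0], [0.9, 1.1], [1.0, 1.0], [1.05, 0.95]]
mu, var = fit_gaussian(train)
eps = 1e-3
normal_sample, odd_sample = [1.0, 1.0], [3.0, -1.0]
print(density(normal_sample, mu, var) > eps)  # True  -> healthy
print(density(odd_sample, mu, var) > eps)     # False -> anomaly
```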

  4. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    Science.gov (United States)

    Perrone, Loredana; De Santis, Angelo; Abbattista, Cristoforo; Alfonsi, Lucilla; Amoruso, Leonardo; Carbone, Marianna; Cesaroni, Claudio; Cianchini, Gianfranco; De Franceschi, Giorgiana; De Santis, Anna; Di Giovambattista, Rita; Marchetti, Dedalo; Pavòn-Carrasco, Francisco J.; Piscini, Alessandro; Spogli, Luca; Santoro, Francesca

    2018-03-01

    Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003-2015 period were examined to check if the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified on the observed variations of the sporadic E-layer parameters (h'Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and the epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the ionospheric anomalies occurrence may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  5. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2018-03-01

    Full Text Available Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003–2015 period were examined to check if the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified on the observed variations of the sporadic E-layer parameters (h′Es, foEs and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and the epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the ionospheric anomalies occurrence may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  6. Development of references of anomalies detection on P91 material using Self-Magnetic Leakage Field (SMLF) technique

    Science.gov (United States)

    Husin, Shuib; Afiq Pauzi, Ahmad; Yunus, Salmi Mohd; Ghafar, Mohd Hafiz Abdul; Adilin Sekari, Saiful

    2017-10-01

    This technical paper demonstrates the successful application of the self-magnetic leakage field (SMLF) technique in detecting anomalies in the weldment of a thick P91 material joint (1 inch thick). Boiler components such as boiler tubes and penthouse stubs, and energy piping such as the hot reheat pipe (HRP) and the H-balance energy piping to the turbine, are made of P91 material. P91 is a ferromagnetic material, so the SMLF technique is applicable to detecting anomalies within the material (internal defects). The technique is categorized as non-destructive testing (NDT). It is the second passive method after acoustic emission (AE), in which information on a structure's radiation (magnetic field and energy waves) is used. The measured magnetic leakage field of a product or component is the magnetic leakage field occurring on the component’s surface in zones of stable dislocation slip bands under the influence of operational (in-service) or residual stresses, or in zones of maximum inhomogeneity of the metal structure in new products or components. Inter-granular and trans-granular cracks, inclusions, voids, cavities and corrosion are types of inhomogeneity and discontinuity in the material for which the technique will clearly show a magnetic leakage field output. The technique does not require surface preparation of the component to be inspected. It is a contact-type inspection, meaning the sensor has to touch, or be in contact with, the component’s surface during inspection. The results of applying the SMLF technique to the developed P91 reference blocks demonstrate that the technique is practical for anomaly inspection and detection as well as for identifying anomaly locations. The evaluation of this passive self-magnetic leakage field (SMLF) technique has been verified by other conventional non-destructive tests (NDTs) on the reference blocks where simulated

  7. Detection of network attacks based on adaptive resonance theory

    Science.gov (United States)

    Bukhanov, D. G.; Polyakov, V. M.

    2018-05-01

    The paper considers an approach to intrusion detection systems using a neural network based on adaptive resonance theory. It suggests the structure of an intrusion detection system consisting of two types of program modules. The first module manages connections of user applications, preventing undesirable ones. The second analyzes the incoming network traffic parameters to check for potential network attacks. After detecting an attack, it notifies the required stations using a secure transmission channel. The paper describes an experiment on the detection and recognition of network attacks using the test selection, and compares the obtained results with similar experiments carried out by other authors. It gives findings and conclusions on the sufficiency of the proposed approach. The obtained information confirms the sufficiency of applying neural networks based on adaptive resonance theory to analyze network traffic within an intrusion detection system.

  8. Temporal anomaly detection: an artificial immune approach based on T cell activation, clonal size regulation and homeostasis.

    Science.gov (United States)

    Antunes, Mário J; Correia, Manuel E

    2010-01-01

    This paper presents an artificial immune system (AIS) based on Grossman's tunable activation threshold (TAT) for temporal anomaly detection. We describe the generic AIS framework and the TAT model adopted for simulating T cell behaviour, emphasizing two novel important features: the temporal dynamic adjustment of T cell clonal size and its associated homeostasis mechanism. We also present some promising results obtained with artificially generated data sets, aiming to test the appropriateness of using TAT in dynamically changing environments to distinguish new unseen patterns as part of what should be detected as normal or as anomalous. We conclude by discussing the results obtained thus far with artificially generated data sets.

  9. Using Generalized Entropies and OC-SVM with Mahalanobis Kernel for Detection and Classification of Anomalies in Network Traffic

    Directory of Open Access Journals (Sweden)

    Jayro Santiago-Paz

    2015-09-01

    Full Text Available Network anomaly detection and classification is an important open issue in network security. Several approaches and systems based on different mathematical tools have been studied and developed, among them, the Anomaly-Network Intrusion Detection System (A-NIDS, which monitors network traffic and compares it against an established baseline of a “normal” traffic profile. Then, it is necessary to characterize the “normal” Internet traffic. This paper presents an approach for anomaly detection and classification based on Shannon, Rényi and Tsallis entropies of selected features, and the construction of regions from entropy data employing the Mahalanobis distance (MD, and One Class Support Vector Machine (OC-SVM with different kernels (Radial Basis Function (RBF and Mahalanobis Kernel (MK for “normal” and abnormal traffic. Regular and non-regular regions built from “normal” traffic profiles allow anomaly detection, while the classification is performed under the assumption that regions corresponding to the attack classes have been previously characterized. Although this approach allows the use of as many features as required, only four well-known significant features were selected in our case. In order to evaluate our approach, two different data sets were used: one set of real traffic obtained from an Academic Local Area Network (LAN, and the other a subset of the 1998 MIT-DARPA set. For these data sets, a true positive rate of up to 99.35%, a true negative rate of up to 99.83% and a false negative rate of about 0.16% were obtained. Experimental results show that certain q-values of the generalized entropies and the use of OC-SVM with RBF kernel improve the detection rate in the detection stage, while the novel inclusion of MK kernel in OC-SVM and k-temporal nearest neighbors improve accuracy in classification. In addition, the results show that using the Box-Cox transformation, the Mahalanobis distance yielded high detection rates with
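    The three generalized entropies named above are all simple functions of the empirical feature distribution. For example, computed over destination ports (illustrative data), a port-scan window is far more dispersed, and hence higher-entropy, than a normal window:

```python
from math import log
from collections import Counter

def probs(xs):
    n = len(xs)
    return [c / n for c in Counter(xs).values()]

def shannon(xs):
    return -sum(p * log(p) for p in probs(xs))

def renyi(xs, q):
    assert q != 1  # q = 1 recovers Shannon entropy as a limit
    return log(sum(p ** q for p in probs(xs))) / (1 - q)

def tsallis(xs, q):
    assert q != 1
    return (1 - sum(p ** q for p in probs(xs))) / (q - 1)

# Destination-port samples in two traffic windows:
normal_ports = [80, 80, 443, 80, 443, 80]   # concentrated on few ports
scan_ports   = [21, 22, 23, 25, 53, 80]     # spread over many ports
print(shannon(normal_ports) < shannon(scan_ports))  # True
print(renyi(scan_ports, 2), tsallis(scan_ports, 2))
```

    In the paper's pipeline, vectors of such entropy values (one per feature) are what get mapped into "normal" regions via the Mahalanobis distance or OC-SVM.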

  10. Passive Sonar Target Detection Using Statistical Classifier and Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Hamed Komari Alaie

    2018-01-01

    Full Text Available This paper presents the results of an experimental investigation of target detection with passive sonar in the Persian Gulf. Detecting propagated sounds in the water is one of the basic challenges for researchers in the sonar field. This challenge becomes harder in shallow water (like the Persian Gulf) and with quiet vessels. Generally, in passive sonar, targets are detected by the sonar equation (with a constant threshold), which increases the detection error in shallow water. The purpose of this study is to propose a new method for detecting targets in passive sonars using an adaptive threshold. In this method, the target signal (sound) is processed in the time and frequency domains. For classification, a Bayesian classifier is used and the posterior distribution is estimated by the Maximum Likelihood Estimation algorithm. Finally, the target is detected by combining the detection points in both domains using a Least Mean Square (LMS) adaptive filter. The results show that the proposed method improves the true detection rate by about 24% compared with the best of the other detection methods.
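    The core idea — replacing the sonar equation's constant threshold with one that tracks the ambient level — can be illustrated with exponentially updated noise statistics. This is a simplification with made-up level values; the paper's actual scheme combines a Bayesian classifier with an LMS adaptive filter.

```python
def adaptive_detect(samples, k=3.0, alpha=0.05):
    """Flag samples above an exponentially tracked mean + k * std.

    Detections are excluded from the update so the threshold does not
    adapt to the targets themselves."""
    mean, var = samples[0], 1.0
    hits = []
    for i, x in enumerate(samples):
        if x > mean + k * var ** 0.5:
            hits.append(i)
        else:
            diff = x - mean
            mean += alpha * diff
            var = (1 - alpha) * (var + alpha * diff * diff)
    return hits

# Background noise drifts upward; only the loud contact is flagged.
levels = [1.0] * 30 + [1.2] * 30 + [10.0]
print(adaptive_detect(levels))  # [60]
```

    A fixed threshold tuned for the quiet phase would start false-alarming as soon as the ambient level rises; the tracked threshold absorbs the slow drift.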

  11. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    International Nuclear Information System (INIS)

    Chang, X; Liu, S; Kalet, A; Yang, D

    2016-01-01

    Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and techniques. The disease information includes disease site, and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and in data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies – improper concurrence of the values of all five plan parameters, of the values of any two out of five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) to detect concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. To detect the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect a plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and
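    For a single dependency, Bayesian learning with a Dirichlet prior reduces to add-alpha smoothed conditional frequencies. A toy sketch of scoring P(dose | disease site) so that rare parameter combinations get low probability — all names, values, and the two-variable restriction are illustrative, not the paper's full Bayesian network:

```python
from collections import Counter

def plan_probability(plans, plan, alpha=1.0, n_doses=3):
    """Add-alpha (Dirichlet prior) estimate of P(dose | disease site).

    `plans` is the error-free history as (site, dose) tuples; `n_doses`
    is the assumed number of possible dose values for smoothing."""
    joint = Counter(plans)
    site_count = Counter(site for site, _ in plans)
    return (joint[plan] + alpha) / (site_count[plan[0]] + alpha * n_doses)

# Hypothetical history: breast plans almost always prescribe 50 Gy.
plans = [("breast", 50)] * 8 + [("breast", 60)] * 2
usual = plan_probability(plans, ("breast", 50))
odd = plan_probability(plans, ("breast", 25))
print(round(usual, 3), round(odd, 3))  # the unseen dose scores far lower
```

    A plan whose conditional probability falls below a chosen cutoff would raise the anomaly flag; the prior keeps unseen-but-legitimate combinations at a small nonzero probability rather than zero.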

  12. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    Energy Technology Data Exchange (ETDEWEB)

    Chang, X; Liu, S [Washington University in St. Louis, St. Louis, MO (United States); Kalet, A [University of Washington Medical Center, Seattle, WA (United States); Yang, D [Washington University in St Louis, St Louis, MO (United States)

    2016-06-15

    Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and techniques. The disease information includes disease site, and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and in data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies – improper concurrence of the values of all five plan parameters, of the values of any two out of five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) to detect concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. To detect the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect a plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and

  13. Detection and analysis of anomalies in the brightness temperature difference field using MSG rapid scan data

    Czech Academy of Sciences Publication Activity Database

    Šťástka, J.; Radová, Michaela

    2013-01-01

    Roč. 123, SI (2013), s. 354-359 ISSN 0169-8095 R&D Projects: GA ČR GA205/07/0905 Institutional support: RVO:68378289 Keywords : brightness temperature difference (BTD) * BTD anomaly * cloud-top brightness temperature (BT) * convective storm * MSG Subject RIV: DG - Athmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences Impact factor: 2.421, year: 2013 https://www.sciencedirect.com/science/article/pii/S0169809512001548

  14. Properties of the geoelectric structure that promote the detection of electrotelluric anomalies. The case of Ioannina, Greece

    Energy Technology Data Exchange (ETDEWEB)

    Makris, J. P. [Technological Educational Institute of Crete, Dept. of Electronics (Branch of Chania), Chalepa, Chania, Crete (Greece)

    2001-04-01

    The reliable detection and identification of electrotelluric anomalies that could be considered precursory phenomena of earthquakes are fundamental aspects of earthquake prediction research. Special arrangements, on a local and/or regional scale, of the geoelectric structure beneath the measuring point may act as natural real-time filters on the ULF electrotelluric data, considerably improving the signal to magnetotelluric-noise ratio of anomalies originating from probably non-magnetotelluric sources. Linear polarization, i.e. local channelling of the electric field on the surface, is expected in cases where 3D local inhomogeneities, producing strong shear distortion, are present in the vicinity of the monitoring site and/or where a 2D regional geoelectrical setting exhibits high anisotropy. By assuming different generation mechanisms and modes of propagation for the electrotelluric anomalies that could be considered earthquake precursory phenomena, a rotationally originated residual electrotelluric field results, eliminating background magnetotelluric noise and revealing hidden transient variations that could be associated with earthquakes. The suggested method is applicable to real-time data collection, thus simplifying and accelerating the tedious task of identifying suspicious signals. As an indicative example, the case of Ioannina (located in Northwestern Greece) is presented. The local polarization of the electrotelluric field varies dramatically even at neighboring points, although the regional geoelectric strike direction does not change.

  15. Detection of Anomalies and Changes of Rainfall in the Yellow River Basin, China, through Two Graphical Methods

    Directory of Open Access Journals (Sweden)

    Hao Wu

    2017-12-01

    Full Text Available This study aims to reveal rainfall anomalies and changes over the Yellow River Basin due to the fragile ecosystem and rainfall-related disasters. Common trend analyses relate to overall trends in mean values. Therefore, we used two graphical methods: the quantile perturbation method (QPM was used to investigate anomalies over time in extreme rainfall, and the partial trend method (PTM was used to analyze rainfall changes at different intensities. A nonparametric bootstrap procedure is proposed in order to identify significant PTM indices. The QPM indicated prevailing positive anomalies in extreme daily rainfall 50 years ago and in the middle reaches during the 1970s and 1980s. The PTM detected significant decreases in annual rainfall mainly in the latter half of the middle reaches, two-thirds of which occurred in high and heavy rainfall. Most stations in the middle and lower reaches showed significant decreases in rainy days. Daily rainfall intensity had a significant increase at 13 stations, where rainy days were generally decreasing. The combined effect of these opposing changes explains the prevailing absence of change in annual rainfall, and the observed decreases in annual rainfall can be attributed to the decreasing number of rainy days. The changes in rainy days and rainfall intensity were dominated by the wet season and dry season, respectively.

  16. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    Science.gov (United States)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography in which an asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders), providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors including noise and a priori uncertainty of the forward (data) simulation were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less than or equal to half of the background value, which is relevant in the recovery of internal cavities.

  17. Locally adaptive decision in detection of clustered microcalcifications in mammograms

    Science.gov (United States)

    Sainz de Cea, María V.; Nishikawa, Robert M.; Yang, Yongyi

    2018-02-01

    In computer-aided detection or diagnosis of clustered microcalcifications (MCs) in mammograms, the performance often suffers not only from the presence of false positives (FPs) among the detected individual MCs but also from large variability in detection accuracy among different cases. To address this issue, we investigate a locally adaptive decision scheme in MC detection by exploiting the noise characteristics in a lesion area. Instead of developing a new MC detector, we propose a decision scheme on how best to decide whether a detected object is an MC or not in the detector output. We model the individual MCs as statistical outliers compared to the many noisy detections in a lesion area so as to account for the local image characteristics. To identify the MCs, we first consider a parametric method for outlier detection, the Mahalanobis distance detector, which is based on a multi-dimensional Gaussian distribution over the noisy detections. We also consider a non-parametric method based on a stochastic neighbor graph model of the detected objects. We demonstrated the proposed decision approach with two existing MC detectors on a set of 188 full-field digital mammograms (95 cases). The results, evaluated using free-response operating characteristic (FROC) analysis, showed a significant improvement in detection accuracy by the proposed outlier decision approach over traditional thresholding (the partial area under the FROC curve increased from 3.95 to 4.25) and a reduction in the number of FPs at a given sensitivity level. The proposed adaptive decision approach could not only reduce the number of FPs in detected MCs but also improve case-to-case consistency in detection.
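    The parametric branch, the Mahalanobis distance detector, scores each detected object against the Gaussian cloud of noisy detections; for two features the distance has a closed form. The data below are illustrative; in practice a chi-square cutoff (e.g. ~5.99 for 2 degrees of freedom at 95%) would separate outliers from the cloud:

```python
from statistics import mean

def mahalanobis2d(points, x):
    """Squared Mahalanobis distance of 2-D point x from the sample cloud."""
    mx = mean(p[0] for p in points)
    my = mean(p[1] for p in points)
    n = len(points)
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    det = sxx * syy - sxy ** 2
    dx, dy = x[0] - mx, x[1] - my
    # inverse of the 2x2 covariance applied to (dx, dy)
    return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

# Noisy detections cluster near the origin; a genuine MC sits apart.
noise = [(0, 0), (1, 0), (0, 1), (1, 1), (0.5, 0.5)]
print(mahalanobis2d(noise, (0.5, 0.5)), mahalanobis2d(noise, (4, 4)))
```

    Because the distance is normalized by the local covariance, the same cutoff adapts from case to case, which is the point of the locally adaptive decision scheme.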

  18. Time series of GNSS-derived ionospheric maps to detect anomalies as possible precursors of high magnitude earthquakes

    Science.gov (United States)

    Barbarella, M.; De Giglio, M.; Galeandro, A.; Mancini, F.

    2012-04-01

    The modification of some atmospheric physical properties prior to a high-magnitude earthquake has recently been debated within the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, the ionization of air at the upper level of the atmosphere, the ionosphere, is investigated in this work. Such ionization events could be caused by possible leaking of gases from the Earth's crust, and their presence has been detected around the time of high-magnitude earthquakes by several authors. However, the spatial scale and temporal domain over which such disturbances become evident are still controversial. Even though ionospheric activity can be investigated by different methodologies (satellite or terrestrial measurements), we selected the production of ionospheric maps from GNSS (Global Navigation Satellite System) data as a possible way to detect anomalies prior to a seismic event over a wide area around the epicentre. It is well known in the GNSS sciences that ionospheric activity can be probed by analyzing the refraction phenomena affecting the dual-frequency signals along the satellite-to-receiver path. The analysis of refraction phenomena affecting data acquired by GNSS permanent trackers can produce daily to hourly maps representing the spatial distribution of the ionospheric Total Electron Content (TEC) as an index of the ionization degree in the upper atmosphere. The presence of large ionospheric anomalies could therefore be interpreted within the LAI Coupling model as a precursor signal of a strong earthquake, especially when other precursors (thermal anomalies and/or gas fluxes) are also detected. In this work, we analyze a six-month series of ionospheric maps produced from GNSS data collected by a network of 49 GPS permanent stations distributed within an area around the city of L'Aquila (Abruzzi, Italy), where an earthquake (M = 6.3) occurred on April 6, 2009.
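The dual-frequency TEC estimation underlying such maps rests on the standard "geometry-free" combination of the two carrier pseudoranges; the sketch below uses the textbook relation with GPS L1/L2 frequencies, and the pseudorange values are illustrative, not data from the study.

```python
# Slant TEC from the dual-frequency geometry-free combination of GPS
# pseudoranges. Constants are the GPS L1/L2 carrier frequencies.

F1 = 1575.42e6  # GPS L1 frequency, Hz
F2 = 1227.60e6  # GPS L2 frequency, Hz

def slant_tec(p1, p2):
    """Slant TEC in TEC units (1 TECU = 1e16 electrons/m^2) from
    pseudoranges p1, p2 in metres measured on L1 and L2."""
    k = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))  # electrons/m^2 per metre
    return k * (p2 - p1) / 1e16

# a 5 m ionospheric delay difference between L2 and L1 (toy values)
tec = slant_tec(20_000_000.0, 20_000_005.0)
```

A 5 m inter-frequency delay corresponds to roughly 48 TECU, a plausible daytime slant value.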

  19. ADAPTIVE ANT COLONY OPTIMIZATION BASED GRADIENT FOR EDGE DETECTION

    Directory of Open Access Journals (Sweden)

    Febri Liantoni

    2014-08-01

    Full Text Available Ant Colony Optimization (ACO) is a nature-inspired optimization algorithm motivated by the foraging behavior of ants. Owing to its favorable properties, ACO has been widely used to solve several NP-hard problems, including edge detection. Since ACO initially distributes ants at random, it may cause an imbalanced ant distribution, which later affects the path-discovery process. In this paper an adaptive ACO is proposed to optimize edge detection by distributing ants according to gradient analysis: ants are distributed according to the gradient ratio of each image region, so a region with a larger gradient ratio receives a larger number of ants. Experiments are conducted using images from various datasets. Precision and recall are used to quantitatively evaluate the performance of the proposed algorithm. The precision and recall of the adaptive ACO reach 76.98% and 96.8%, whereas the highest precision and recall for standard ACO are 69.74% and 74.85%. Experimental results show that the adaptive ACO outperforms standard ACO, which distributes ants randomly.
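The adaptive initialisation can be sketched as sampling ant positions with probability proportional to gradient magnitude; the per-pixel (rather than per-region) sampling and the toy image below are simplifying assumptions, not the paper's exact region partitioning.

```python
import numpy as np

def distribute_ants(gradient, n_ants, rng=None):
    """Place ants on pixels with probability proportional to the image
    gradient magnitude, so high-gradient (likely edge) regions receive
    more ants than flat regions."""
    rng = rng or np.random.default_rng(0)
    p = gradient.ravel().astype(float)
    p /= p.sum()                       # gradient ratio per pixel
    idx = rng.choice(p.size, size=n_ants, p=p)
    # return (row, col) coordinates of the sampled ant positions
    return np.column_stack(np.unravel_index(idx, gradient.shape))

# toy gradient image: one strong vertical edge in column 4
grad = np.zeros((8, 8))
grad[:, 4] = 10.0
grad += 0.1                            # small background gradient
ants = distribute_ants(grad, 100)
```

Most ants land on the high-gradient column, mirroring the paper's claim that regions with a larger gradient ratio get more ants.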

  20. Adaptive DSP Algorithms for UMTS: Blind Adaptive MMSE and PIC Multiuser Detection

    NARCIS (Netherlands)

    Potman, J.

    2003-01-01

    A study of the application of blind adaptive Minimum Mean Square Error (MMSE) and Parallel Interference Cancellation (PIC) multiuser detection techniques to Wideband Code Division Multiple Access (WCDMA), the physical layer of Universal Mobile Telecommunication System (UMTS), has been performed as

  1. A new segmentation strategy for processing magnetic anomaly detection data of shallow depth ferromagnetic pipeline

    Science.gov (United States)

    Feng, Shuo; Liu, Dejun; Cheng, Xing; Fang, Huafeng; Li, Caifang

    2017-04-01

    Magnetic anomalies produced by underground ferromagnetic pipelines, through polarization by the Earth's magnetic field, are used to obtain information on the location, burial depth and other parameters of the pipelines. In order to achieve fast inversion and interpretation of measured data, a fast and stable forward method must be developed. Magnetic dipole reconstruction (MDR), a kind of integral numerical method, is well suited to simulating a thin pipeline anomaly. In MDR the pipeline model must be divided into small magnetic dipoles through different segmentation methods. The segmentation method affects the stability and speed of the forward calculation. Rapid and accurate simulation of deep-buried pipelines has been achieved with the existing segmentation method. In practical measurement, however, the depth of an underground pipe is uncertain, and for a shallow-buried pipeline the existing segmentation may generate significant errors. This paper addresses this problem in three stages. First, the cause of the inaccuracy is analyzed by simulation experiments. Second, a new variable-interval section segmentation is proposed based on the existing segmentation; it allows the MDR method to obtain simulation results quickly while ensuring the accuracy of models at different depths. Finally, measured data are inverted based on the new segmentation method. The result proves that inversion based on the new segmentation can achieve fast and accurate recovery of the depth parameters of underground pipes without being limited by pipeline depth.
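The dipole superposition at the heart of MDR can be sketched with the standard point-dipole field formula; the uniform discretization and the moment value below are toy assumptions and do not reproduce the paper's segmentation scheme.

```python
import numpy as np

def dipole_field(r_obs, r_dip, m):
    """Magnetic flux density (tesla) at r_obs due to a point dipole of
    moment m (A*m^2) located at r_dip, via the standard dipole formula."""
    mu0 = 4e-7 * np.pi
    r = np.asarray(r_obs, float) - np.asarray(r_dip, float)
    d = np.linalg.norm(r)
    rhat = r / d
    return mu0 / (4 * np.pi * d**3) * (3 * np.dot(m, rhat) * rhat - m)

# a "pipeline" modelled as a row of dipoles 2 m below the sensor,
# each carrying the same induced moment (toy value)
m = np.array([0.0, 0.0, 50.0])
dipoles = [np.array([x, 0.0, -2.0]) for x in np.linspace(-5, 5, 21)]
# total anomaly at a sensor directly above the pipeline's midpoint
B = sum(dipole_field([0.0, 0.0, 0.0], rd, m) for rd in dipoles)
```

Summing such contributions over every segment is the forward model whose speed and accuracy depend on how the pipeline is cut into dipoles.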

  2. Prenatal detection of structural cardiac defects and presence of associated anomalies: a retrospective observational study of 1262 fetal echocardiograms.

    Science.gov (United States)

    Mone, Fionnuala; Walsh, Colin; Mulcahy, Cecelia; McMahon, Colin J; Farrell, Sinead; MacTiernan, Aoife; Segurado, Ricardo; Mahony, Rhona; Higgins, Shane; Carroll, Stephen; McParland, Peter; McAuliffe, Fionnuala M

    2015-06-01

    The aim of this study is to document the detection of fetal congenital heart defect (CHD) in relation to the following: (1) indication for referral, (2) chromosomal and (3) extracardiac abnormalities. All fetal echocardiograms performed in our institution from 2007 to 2011 were reviewed retrospectively. Indication for referral, cardiac diagnosis based on the World Health Organization International Classification of Diseases tenth revision criteria and the presence of chromosomal and extracardiac defects were recorded. Of 1262 echocardiograms, 287 (22.7%) had CHD. Abnormal anatomy scan in pregnancies originally considered to be at low risk of CHD was the best indicator for detecting CHD (91.2% of positive cardiac diagnoses), compared with other indications of family history (5.6%) or maternal medical disorder (3.1%). Congenital anomalies of the cardiac septa comprised the largest category (n = 89), within which atrioventricular septal defects were the most common anomaly (n = 36). Invasive prenatal testing was performed for 126 of 287 cases, of which 44% (n = 55) had a chromosomal abnormality. Of 232 fetuses without chromosomal abnormalities, 31% had an extracardiac defect (n = 76). Most CHDs occur in pregnancies regarded to be at low risk, highlighting the importance of a routine midtrimester fetal anatomy scan. Frequent association of fetal CHD and chromosomal and extracardiac pathology emphasises the importance of thorough evaluation of any fetus with CHD. © 2015 John Wiley & Sons, Ltd.

  3. A Review of Anomaly Detection Techniques and Distributed Denial of Service (DDoS) on Software Defined Network (SDN)

    Directory of Open Access Journals (Sweden)

    M. H. H. Khairi

    2018-04-01

    Full Text Available Software defined network (SDN) is a network architecture in which network traffic may be operated and managed dynamically according to user requirements and demands. Security is one of the big challenges of SDN, because different attacks may affect performance, and these attacks can be classified into different types. One of the best-known attacks is distributed denial of service (DDoS). SDN is a new networking approach introduced with the goal of simplifying network management by separating the data and control planes. However, this separation leads to the emergence of new types of DDoS attacks on SDN networks. The centralized role of the controller in SDN makes it a perfect target for attackers: such attacks can easily bring down the entire network by bringing down the controller. This research explains DDoS attacks and anomaly detection as one of the well-known detection techniques for intelligent networks.

  4. Fuzzy Logic Based Anomaly Detection for Embedded Network Security Cyber Sensor

    Energy Technology Data Exchange (ETDEWEB)

    Ondrej Linda; Todd Vollmer; Jason Wright; Milos Manic

    2011-04-01

    Resiliency and security in critical infrastructure control systems in the modern world of cyber terrorism constitute a relevant concern. Developing a network security system specifically tailored to the requirements of such critical assets is of primary importance. This paper proposes a novel learning algorithm for an anomaly-based network security cyber sensor together with its hardware implementation. The presented learning algorithm constructs a fuzzy logic rule-based model of normal network behavior. Individual fuzzy rules are extracted directly from the stream of incoming packets using an online clustering algorithm. This learning algorithm was specifically developed to comply with the constrained computational requirements of low-cost embedded network security cyber sensors. The performance of the system was evaluated on a set of network data recorded from an experimental test-bed mimicking the environment of a critical infrastructure control system.
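The online clustering step can be sketched as incremental nearest-centre assignment over packet feature vectors, where each cluster's centre and population would later seed a fuzzy membership function; the radius, features, and update rule are illustrative assumptions, not the sensor's actual design.

```python
import math

class OnlineClusterer:
    """Incrementally cluster feature vectors from a packet stream.
    Each cluster's centre (and count) can seed one Gaussian fuzzy rule
    describing normal behaviour."""
    def __init__(self, radius=2.0):
        self.radius = radius
        self.centres, self.counts = [], []

    def update(self, x):
        # assign to the nearest centre within `radius`, else open a cluster
        best, bestd = None, float("inf")
        for i, c in enumerate(self.centres):
            d = math.dist(x, c)
            if d < bestd:
                best, bestd = i, d
        if best is None or bestd > self.radius:
            self.centres.append(list(x))
            self.counts.append(1)
            return len(self.centres) - 1
        # running-mean update of the matched centre
        n = self.counts[best] + 1
        self.centres[best] = [c + (xi - c) / n
                              for c, xi in zip(self.centres[best], x)]
        self.counts[best] = n
        return best

clu = OnlineClusterer(radius=2.0)
for pkt in [(0, 0), (0.5, 0.5), (10, 10), (10.2, 9.8)]:
    clu.update(pkt)
```

Two well-separated packet groups yield two clusters; a packet falling far from every centre would open a new, sparsely populated cluster, a natural anomaly cue.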

  5. Forward looking anomaly detection via fusion of infrared and color imagery

    Science.gov (United States)

    Stone, K.; Keller, J. M.; Popescu, M.; Havens, T. C.; Ho, K. C.

    2010-04-01

    This paper develops algorithms for the detection of interesting and abnormal objects in color and infrared imagery taken from cameras mounted on a moving vehicle, observing a fixed scene. The primary purpose of detection is to cue a human-in-the-loop detection system. Algorithms for direct detection and change detection are investigated, as well as fusion of the two. Both methods use temporal information to reduce the number of false alarms. The direct detection algorithm uses image self-similarity computed between local neighborhoods to determine interesting, or unique, parts of an image. Neighborhood similarity is computed using Euclidean distance in CIELAB color space for the color imagery, and Euclidean distance between grey levels in the infrared imagery. The change detection algorithm uses the affine scale-invariant feature transform (ASIFT) to transform multiple background frames into the current image space. Each transformed image is then compared to the current image, and the multiple outputs are fused to produce a single difference image. Changes in lighting and contrast between the background run and the current run are adjusted for in both color and infrared imagery. Frame-to-frame motion is modeled using a perspective transformation, the parameters of which are computed using scale-invariant feature transform (SIFT) keypoint correspondences. This information is used to perform temporal accumulation of single frame detections for both the direct detection and change detection algorithms. Performance of the proposed algorithms is evaluated on multiple lanes from a data collection at a US Army test site.
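The direct-detection idea above, scoring pixels by how dissimilar their neighborhood is from the rest of the image, can be sketched with a brute-force patch comparison; the grayscale toy image and the all-pairs comparison are simplifications (the paper works in CIELAB for color imagery and restricts comparisons to local neighborhoods).

```python
import numpy as np

def most_unique_patch(img, patch=3):
    """Score each interior pixel by the mean Euclidean distance between
    its patch and every other patch; a high score marks a locally unique
    (anomalous) region. Returns the (row, col) of the top-scoring pixel."""
    h, w = img.shape
    r = patch // 2
    patches, coords = [], []
    for i in range(r, h - r):
        for j in range(r, w - r):
            patches.append(img[i-r:i+r+1, j-r:j+r+1].ravel())
            coords.append((i, j))
    P = np.array(patches)
    # pairwise Euclidean distances between all patches
    d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    scores = d.mean(axis=1)
    return coords[int(scores.argmax())]

img = np.zeros((10, 10))
img[5, 5] = 5.0                      # one anomalous bright pixel
peak = most_unique_patch(img)
```

The top-scoring patch centres on (or immediately adjacent to) the anomalous pixel, since every patch containing it differs from the uniform background.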

  6. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study.

    Science.gov (United States)

    Garib, Daniela Gamba; Lancia, Melissa; Kato, Renata Mayumi; Oliveira, Thais Marchini; Neves, Lucimara Teixeira das

    2016-01-01

    To estimate the risk of palatally displaced canine (PDC) occurrence in children with dental anomalies identified early during the mixed dentition. The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD = 1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) comprised 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test. Children with dental anomalies diagnosed during the early mixed dentition have an approximately two-and-a-half-fold increased risk of developing PDC during the late mixed dentition compared with children without dental anomalies.

  7. Evaluating the SEVIRI Fire Thermal Anomaly Detection Algorithm across the Central African Republic Using the MODIS Active Fire Product

    Directory of Open Access Journals (Sweden)

    Patrick H. Freeborn

    2014-02-01

    Full Text Available Satellite-based remote sensing of active fires is the only practical way to consistently and continuously monitor diurnal fluctuations in biomass burning from regional, to continental, to global scales. Failure to understand, quantify, and communicate the performance of an active fire detection algorithm, however, can lead to improper interpretations of the spatiotemporal distribution of biomass burning, and flawed estimates of fuel consumption and trace gas and aerosol emissions. This work evaluates the performance of the Spinning Enhanced Visible and Infrared Imager (SEVIRI) Fire Thermal Anomaly (FTA) detection algorithm using seven months of active fire pixels detected by the Moderate Resolution Imaging Spectroradiometer (MODIS) across the Central African Republic (CAR). Results indicate that the omission rate of the SEVIRI FTA detection algorithm relative to MODIS varies spatially across the CAR, ranging from 25% in the south to 74% in the east. In the absence of confounding artifacts such as sunglint, uncertainties in the background thermal characterization, and cloud cover, the regional variation in SEVIRI's omission rate can be attributed to a coupling between SEVIRI's low-spatial-resolution detection bias (i.e., the inability to detect fires below a certain size and intensity) and a strong geographic gradient in active fire characteristics across the CAR. SEVIRI's commission rate relative to MODIS increases from 9% when evaluated near MODIS nadir to 53% near the MODIS scene edges, indicating that SEVIRI errors of commission at the MODIS scene edges may not be false alarms but rather true fires that MODIS failed to detect as a result of larger pixel sizes at extreme MODIS scan angles. Results from this work are expected to facilitate (i) future improvements to the SEVIRI FTA detection algorithm; (ii) the assimilation of the SEVIRI and MODIS active fire products; and (iii) the potential inclusion of SEVIRI into a network of geostationary sensors.

  8. Adaptive multi-resolution Modularity for detecting communities in networks

    Science.gov (United States)

    Chen, Shi; Wang, Zhi-Zhong; Bao, Mei-Hua; Tang, Liang; Zhou, Ji; Xiang, Ju; Li, Jian-Ming; Yi, Chen-He

    2018-02-01

    Community structure is a common topological property of complex networks that has attracted much attention from various fields. Optimizing quality functions for community structures is a popular strategy for community detection, e.g., Modularity optimization. Here, we introduce a general definition of Modularity, from which several classical (multi-resolution) variants of Modularity can be derived, and then propose a kind of adaptive (multi-resolution) Modularity that combines the advantages of the different variants. By applying this Modularity to various synthetic and real-world networks, we study the behaviors of the methods, showing the validity and advantages of multi-resolution Modularity in community detection. The adaptive Modularity, as a multi-resolution method, can naturally overcome the first-type limit of Modularity and detect communities at different scales; it can quicken the disconnecting of communities and delay the breakup of communities in heterogeneous networks; thus it is expected to generate stable community structures in networks more effectively and to have stronger tolerance against the second-type limit of Modularity.
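The multi-resolution idea can be illustrated with the standard resolution-parameterized Newman Modularity, a textbook form rather than the authors' specific generalization; varying the resolution parameter gamma scans community scales.

```python
import numpy as np

def modularity(adj, communities, gamma=1.0):
    """Multi-resolution Newman Modularity:
    Q = (1/2m) * sum_ij [A_ij - gamma * k_i * k_j / (2m)] * delta(c_i, c_j).
    gamma = 1 recovers classical Modularity; gamma > 1 favours smaller
    communities, gamma < 1 larger ones."""
    A = np.asarray(adj, float)
    k = A.sum(axis=1)                      # node degrees
    two_m = k.sum()                        # 2m = total degree
    c = np.asarray(communities)
    same = (c[:, None] == c[None, :])      # delta(c_i, c_j)
    return ((A - gamma * np.outer(k, k) / two_m) * same).sum() / two_m

# two triangles joined by a single bridge edge
A = np.zeros((6, 6), int)
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
q = modularity(A, [0, 0, 0, 1, 1, 1])      # one community per triangle
```

For this graph the triangle partition gives Q = 5/14 at gamma = 1, while placing all nodes in one community gives Q = 0.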

  9. Adaptive 4D PSI-Based Change Detection

    Science.gov (United States)

    Yang, Chia-Hsiang; Soergel, Uwe

    2018-04-01

    In a previous work, we proposed a PSI-based 4D change detection to detect disappearing and emerging PS points (3D) along with their occurrence dates (1D). Such change points are usually caused by anthropic events, e.g., building construction in cities. This method first divides an entire SAR image stack into several subsets by a set of break dates. The PS points, which are selected based on their temporal coherences before or after a break date, are regarded as change candidates. Change points are then extracted from these candidates according to their change indices, which are modelled from the temporal coherences of the divided image subsets. Finally, we check the evolution of the change indices for each change point to detect the break date at which the change occurred. The experiment validated both the feasibility and the applicability of our method. However, two questions remain. First, selection of the temporal coherence threshold involves a trade-off between quality and quantity of PS points; this selection also affects the number of change points in a more complex way. Second, heuristic selection of change-index thresholds is fragile and causes loss of change points. In this study, we adapt our approach to identify change points based on the statistical characteristics of the change indices rather than on thresholding. The experiment validates this adaptive approach and shows an increase in change points compared with the previous version. In addition, we explore and discuss the optimal selection of the temporal coherence threshold.

  10. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    Science.gov (United States)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges, perhaps most notably those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work remains to be done in this area. The latter two challenges, however, demand special attention. Station availability is impacted by weather, equipment failure, and the addition or removal of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower ones. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  11. DEVELOPMENT AND TESTING OF PROCEDURES FOR CARRYING OUT EMERGENCY PHYSICAL INVENTORY TAKING AFTER DETECTING ANOMALY EVENTS CONCERNING NM SECURITY

    International Nuclear Information System (INIS)

    VALENTE, J.; FISHBONE, L.

    2003-01-01

    In the State Scientific Center of the Russian Federation - Institute of Physics and Power Engineering (SSC RF-IPPE, Obninsk), which is under Minatom jurisdiction, procedures for carrying out emergency physical inventory taking (EPIT) were developed and tested in cooperation with Brookhaven National Laboratory (USA). Here, emergency physical inventory taking means a PIT carried out when symptoms indicate a possible loss (theft) of NM. Such a PIT often requires verification of the attributes and quantitative characteristics of all the NM items located in a specific Material Balance Area (MBA). To carry out the exercise, an MBA was selected where many thousands of NM items containing highly enriched uranium are used. Three clients of the computerized material accounting system (CMAS) are installed in this MBA. Labels with unique (within the IPPE site) identification numbers, in the form of digit combinations and an appropriate bar code, have been applied to the NM items, containers and authorized locations. All the data to be checked during the EPIT are stored in the CMAS database. Five variants of anomalies initiating an EPIT, each requiring different types of organizational activities, are considered. Automated workplaces (AWPs) were created on the basis of the client computers in order to carry out a large number of measurements within a reasonable time. In addition to a CMAS client computer, the main components of an AWP include a bar-code reader, an electronic scale and an enrichment meter with a NaI detector, the MCA Inspector (manufactured by Canberra). All these devices work together with a client computer in on-line mode. Special computer code (Emergency Inventory Software, EIS) was developed. All the algorithms of interaction between the operator and the system, as well as algorithms of data exchange during the measurements and data comparison, are implemented in this software. Registration of detected

  12. Neural communication patterns underlying conflict detection, resolution, and adaptation.

    Science.gov (United States)

    Oehrn, Carina R; Hanslmayr, Simon; Fell, Juergen; Deuker, Lorena; Kremers, Nico A; Do Lam, Anne T; Elger, Christian E; Axmacher, Nikolai

    2014-07-30

    In an ever-changing environment, selecting appropriate responses in conflicting situations is essential for biological survival and social success and requires cognitive control, which is mediated by the dorsomedial prefrontal cortex (DMPFC) and dorsolateral prefrontal cortex (DLPFC). How these brain regions communicate during conflict processing (detection, resolution, and adaptation), however, is still unknown. The Stroop task provides a well-established paradigm to investigate the cognitive mechanisms mediating such response conflict. Here, we explore the oscillatory patterns within and between the DMPFC and DLPFC in human epilepsy patients with intracranial EEG electrodes during an auditory Stroop experiment. Data from the DLPFC were obtained from 12 patients; of these, four patients had additional DMPFC electrodes available for interaction analyses. Our results show that an early θ-modulated (4-8 Hz) enhancement of DLPFC γ-band (30-100 Hz) activity constituted a prerequisite for later successful conflict processing. Subsequent conflict detection was reflected in a DMPFC θ power increase that causally entrained DLPFC θ activity (DMPFC to DLPFC). Conflict resolution was thereafter completed by coupling of DLPFC γ power to DMPFC θ oscillations. Finally, conflict adaptation was related to increased postresponse DLPFC γ-band activity and to θ coupling in the reverse direction (DLPFC to DMPFC). These results draw a detailed picture of how two regions in the prefrontal cortex communicate to resolve cognitive conflicts. In conclusion, our data show that conflict detection, control, and adaptation are supported by a sequence of processes that use the interplay of θ and γ oscillations within and between DMPFC and DLPFC. Copyright © 2014 the authors 0270-6474/14/3410438-15$15.00/0.

  13. Enhanced Anomaly Detection Via PLS Regression Models and Information Entropy Theory

    KAUST Repository

    Harrou, Fouzi

    2015-12-07

    Accurate and effective fault detection and diagnosis of modern engineering systems is crucial for ensuring reliability and safety and maintaining the desired product quality. In this work, we propose an innovative method for detecting small faults in highly correlated multivariate data. The developed method utilizes the partial least squares (PLS) method as a modelling framework and the symmetrized Kullback-Leibler divergence (KLD) as a monitoring index, where it is used to quantify the dissimilarity between the probability distribution of the current PLS-based residual and a reference one obtained using fault-free data. The performance of the PLS-based KLD fault detection algorithm is illustrated and compared to conventional PLS-based fault detection methods. Using synthetic data, we demonstrate the greater sensitivity and effectiveness of the developed method over the conventional methods, especially when data are highly correlated and small faults are of interest.

  14. Enhanced Anomaly Detection Via PLS Regression Models and Information Entropy Theory

    KAUST Repository

    Harrou, Fouzi; Sun, Ying

    2015-01-01

    Accurate and effective fault detection and diagnosis of modern engineering systems is crucial for ensuring reliability and safety and maintaining the desired product quality. In this work, we propose an innovative method for detecting small faults in highly correlated multivariate data. The developed method utilizes the partial least squares (PLS) method as a modelling framework and the symmetrized Kullback-Leibler divergence (KLD) as a monitoring index, where it is used to quantify the dissimilarity between the probability distribution of the current PLS-based residual and a reference one obtained using fault-free data. The performance of the PLS-based KLD fault detection algorithm is illustrated and compared to conventional PLS-based fault detection methods. Using synthetic data, we demonstrate the greater sensitivity and effectiveness of the developed method over the conventional methods, especially when data are highly correlated and small faults are of interest.
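The symmetrized KLD monitoring index can be sketched for univariate Gaussian residuals using the closed-form divergence between two normal distributions; the residual parameters below are illustrative values, not the paper's synthetic data.

```python
import numpy as np

def symmetric_kld(mu0, s0, mu1, s1):
    """Symmetrized Kullback-Leibler divergence between two univariate
    Gaussians N(mu0, s0^2) and N(mu1, s1^2), i.e. KL(p||q) + KL(q||p)."""
    kl01 = np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5
    kl10 = np.log(s0 / s1) + (s1**2 + (mu1 - mu0)**2) / (2 * s0**2) - 0.5
    return kl01 + kl10

# fault-free reference residuals vs. residuals with a small mean shift
d_ok = symmetric_kld(0.0, 1.0, 0.0, 1.0)     # identical distributions
d_fault = symmetric_kld(0.0, 1.0, 0.4, 1.0)  # small fault-induced shift
```

The index is exactly zero for identical residual distributions and grows with even a small mean shift, which is what makes it sensitive to small faults.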

  15. Detection of Congenital Mullerian Anomalies Using Real-Time 3D Sonography

    Directory of Open Access Journals (Sweden)

    Firoozeh Ahmadi

    2011-01-01

    Full Text Available A 35-year-old woman was referred to Royan Institute (Reproductive Biomedicine Research Center) for infertility treatment. She had an eleven-year history of primary infertility with a normal abdominal ultrasound. Hysterosalpingography (HSG) was obtained one month prior to referral in another center (Fig A). The HSG finding of an apparent unicornuate uterus followed by a normal vaginal ultrasound led us to perform a three-dimensional vaginal ultrasound before resorting to hysteroscopy. Results of the three-dimensional vaginal ultrasound revealed a normal uterus (Fig B, C). Accurate characterization of congenital Mullerian anomalies (MDAs), such as an arcuate, unicornuate, didelphys, bicornuate or septate uterus, is challenging. While HSG has been the standard test in the diagnosis of MDAs, some limitations may favor the use of three-dimensional ultrasound. The most difficult part of HSG is interpreting the two-dimensional radiographic image into a complex, three-dimensional living organ (1). A variety of technical problems may occur while performing HSG. In this case, only an oblique view could lead to a correct interpretation. It is advisable for the interpreter to perform the procedure rather than to inspect only the finished radiographic images (2). One of the most useful scan planes obtained on three-dimensional ultrasound is the coronal view of the uterus. This view is known to be a valuable problem-solving tool that assists in differentiating between various types of MDAs due to the high level of agreement between three-dimensional ultrasound and HSG (3, 4). Recently, three-dimensional ultrasound has become the sole mandatory step in the initial investigation of MDAs due to its superiority to other techniques that have been used for the same purpose (5).

  16. Analysis of a SCADA System Anomaly Detection Model Based on Information Entropy

    Science.gov (United States)

    2014-03-27

    The NTSB report lists alarm management as one of the top five areas for improvement in pipeline SCADA systems (Gerard, 2005). A concept derived from information theory, applied by Zhang, Qin, Wang, and Liang for leak detection in a SCADA-run pipeline system, improved leak detection (Journal of Loss Prevention in the Process Industries, 22(6), 981-989).

  17. Rapid Anomaly Detection and Tracking via Compressive Time-Spectra Measurement

    Science.gov (United States)

    2016-02-12

    Anomaly detection was performed on data in the compressed domain. Detection performance was analyzed using receiver operating characteristic and precision-recall curves.

  18. MEASURE/ANOMTEST. Anomaly detection software package for the Dodewaard power plant facility. Supplement 1. Extension of measurement analysis part, addition of plot package

    International Nuclear Information System (INIS)

    Schoonewelle, H.

    1995-01-01

    The anomaly detection software package installed at the Dodewaard nuclear power plant has been revised with respect to the measurement-analysis part, and a plot package has been added. Signals in which an anomaly has been detected are automatically plotted, including the uncertainty margins of the signals. This report describes the revised measurement-analysis part and the plot package. Each new routine of the plot package is described briefly, and the new input and output files are given. (orig.)

  19. Demand pattern analysis of taxi trip data for anomalies detection and explanation

    DEFF Research Database (Denmark)

    Markou, Ioulia; Rodrigues, Filipe; Pereira, Francisco Camara

    2017-01-01

    Due to environmental and economic stress, strong investment now exists towards adaptive transport systems that can efficiently utilize capacity, minimizing costs and environmental impacts. The common vision is a system that dynamically changes itself (the supply) to anticipate traveler needs (the demand). On some occasions, unexpected and unwanted demand patterns are noticed in the traffic network that lead to system failures and cost implications. Significantly low speeds or excessively low flows at an unforeseeable time are only some of the phenomena that are often noticed and need...

  20. Domain Adaptation for Pedestrian Detection Based on Prediction Consistency

    Directory of Open Access Journals (Sweden)

    Yu Li-ping

    2014-01-01

    Full Text Available Pedestrian detection is an active area of research in computer vision. It remains a quite challenging problem in many applications where many factors cause a mismatch between the source dataset used to train the pedestrian detector and samples in the target scene. In this paper, we propose a novel domain adaptation model for merging plentiful source-domain samples with scarce target-domain samples to create a scene-specific pedestrian detector that performs as well as if rich target-domain samples were present. Our approach combines a boosting-based learning algorithm with an entropy-based transferability measure, derived from the consistency of predictions with the source classifications, to selectively choose the source-domain samples showing positive transferability to the target domain. Experimental results show that our approach can improve the detection rate, especially with insufficient labeled data in the target scene.
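An entropy-based transferability score can be sketched as one minus the binary entropy of a source classifier's prediction, so confident (consistent) predictions score near 1 and uncertain ones near 0; this is a hedged illustration of the selection idea, not the paper's exact weighting.

```python
import math

def transferability(p):
    """Entropy-based transferability score for a sample whose source
    classifier outputs probability `p` for the positive class:
    1 minus the binary entropy H(p)."""
    if p in (0.0, 1.0):
        return 1.0
    h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - h

# maximally uncertain, fairly confident, and very confident predictions
weights = [transferability(p) for p in (0.5, 0.9, 0.99)]
```

Samples with scores near zero would be down-weighted or discarded when merging source-domain data into the scene-specific detector.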

  1. Adaptive filtering for hidden node detection and tracking in networks.

    Science.gov (United States)

    Hamilton, Franz; Setzer, Beverly; Chavez, Sergio; Tran, Hien; Lloyd, Alun L

    2017-07-01

The identification of network connectivity from noisy time series is of great interest in the study of network dynamics. This connectivity estimation problem becomes more complicated when we consider the possibility of hidden nodes within the network. These hidden nodes act as unknown drivers on our network and their presence can lead to the identification of false connections, resulting in incorrect network inference. Detecting the parts of the network they are acting on is thus critical. Here, we propose a novel method for hidden node detection based on an adaptive filtering framework with specific application to neuronal networks. We treat the hidden node as a missing-variables problem during model fitting and show that the estimated system noise covariance provided by the adaptive filter can be used to localize the influence of the hidden nodes and distinguish the effects of different hidden nodes. Additionally, we show that the sequential nature of our algorithm allows for tracking changes in the hidden node influence over time.
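The core idea, that an unmodeled hidden driver inflates the filter's estimated system noise covariance, can be illustrated with a toy sketch (not the authors' implementation): a scalar random-walk Kalman filter whose process-noise estimate q is adapted from the innovations. All parameter values here are illustrative assumptions.

```python
import numpy as np

def adaptive_kalman(z, q0=1e-3, r=0.1, alpha=0.05):
    """Scalar random-walk Kalman filter with an innovation-driven
    process-noise estimate q; a persistent rise in q flags an
    unmodeled (hidden) driver acting on the observed node."""
    x, p, q = z[0], 1.0, q0
    xs, qs = [], []
    for zk in z[1:]:
        p = p + q                      # predict
        innov = zk - x                 # innovation before the update
        s = p + r                      # innovation variance
        k = p / s
        x = x + k * innov              # update
        p = (1 - k) * p
        # attribute excess squared innovation to process noise
        q = (1 - alpha) * q + alpha * max(innov ** 2 - s, 0.0)
        xs.append(x)
        qs.append(q)
    return np.array(xs), np.array(qs)
```

Feeding this filter a series that is suddenly pushed by an unmodeled driver makes q jump at the onset, which is the localization signal the abstract describes.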

  2. Implementation Issues of Adaptive Energy Detection in Heterogeneous Wireless Networks

    Science.gov (United States)

    Sobron, Iker; Eizmendi, Iñaki; Martins, Wallace A.; Diniz, Paulo S. R.; Ordiales, Juan Luis; Velez, Manuel

    2017-01-01

Spectrum sensing (SS) enables the coexistence of non-coordinated heterogeneous wireless systems operating in the same band. Due to its computational simplicity, the energy detection (ED) technique has been widely employed in SS applications; nonetheless, the conventional ED may be unreliable under environmental impairments, justifying the use of ED-based variants. Assessing ED algorithms from theoretical and simulation viewpoints relies on several assumptions and simplifications which, eventually, lead to conclusions that do not necessarily meet the requirements imposed by real propagation environments. This work addresses those problems by dealing with practical implementation issues of adaptive least mean square (LMS)-based ED algorithms. The paper proposes a new adaptive ED algorithm that uses a variable step size guaranteeing the LMS convergence in time-varying environments. Several implementation guidelines are provided and, additionally, an empirical assessment and validation with software-defined-radio-based hardware is carried out. Experimental results show good performance in terms of probabilities of detection (Pd>0.9) and false alarm (Pf∼0.05) in a range of low signal-to-noise ratios around [-4,1] dB, in both single-node and cooperative modes. The proposed sensing methodology enables a seamless monitoring of the radio electromagnetic spectrum in order to provide band occupancy information for an efficient usage among several wireless communications systems. PMID:28441751
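The flavor of a variable-step-size, LMS-tracked energy detector can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the function name, window length, step-size bounds and threshold are all assumptions; the step size simply grows with a smoothed squared error, and the noise-floor estimate is frozen while a detection is declared.

```python
import numpy as np

def vss_lms_energy_detector(x, win=64, mu_min=0.01, mu_max=0.5,
                            gamma=0.9, thresh=2.0):
    """Flag windows whose energy exceeds an adaptively tracked noise floor."""
    n_win = len(x) // win
    energies = (np.abs(x[:n_win * win]) ** 2).reshape(n_win, win).mean(axis=1)
    est = energies[0]          # initial noise-floor estimate
    p = 0.0                    # smoothed squared error drives the step size
    flags = []
    for e in energies:
        flag = e > thresh * est
        if not flag:           # only adapt the floor on noise-only windows
            err = e - est
            p = gamma * p + (1.0 - gamma) * err ** 2
            mu = min(mu_max, max(mu_min, p))   # variable step size
            est += mu * err
        flags.append(bool(flag))
    return np.array(flags)
```

On a stream whose power jumps from the noise floor to a signal level, the detector keeps quiet on the noise-only windows and flags the occupied ones.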

  3. An investigation of scalable anomaly detection techniques for a large network of Wi-Fi hotspots

    CSIR Research Space (South Africa)

    Machaka, P

    2015-01-01

    Full Text Available . The Neural Networks, Bayesian Networks and Artificial Immune Systems were used for this experiment. Using a set of data extracted from a live network of Wi-Fi hotspots managed by an ISP; we integrated algorithms into a data collection system to detect...

  4. Anomaly based intrusion detection for a biometric identification system using neural networks

    CSIR Research Space (South Africa)

    Mgabile, T

    2012-10-01

    Full Text Available detection technique that analyses the fingerprint biometric network traffic for evidence of intrusion. The neural network algorithm that imitates the way a human brain works is used in this study to classify normal traffic and learn the correct traffic...

  5. Multi-level anomaly detection: Relevance of big data analytics in ...

    Indian Academy of Sciences (India)

    The Internet has become a vital source of information; internal and exter- .... (iii) DDos detection: Distributed Denial of Service (DDoS) is a common malicious ...... Guirguis M, Bestavros A, Matta I and Zhang Y 2005a Reduction of quality (roq) ...

  6. USBeSafe: Applying One Class SVM for Effective USB Event Anomaly Detection

    Science.gov (United States)

    2016-04-25

    2012. [5] Phil Muncaster. Indian navy computers stormed by malware-ridden USBs. 2012. [6] Ponemon. 2011 Second Annual Cost of Cyber Crime Study...Zhang, and Shanshan Sun. “A mixed unsu- pervised clustering-based intrusion detection model”. In: Genetic and Evolutionary Computing, 2009. WGEC’09

  7. The adaptive value of primate color vision for predator detection.

    Science.gov (United States)

    Pessoa, Daniel Marques Almeida; Maia, Rafael; de Albuquerque Ajuz, Rafael Cavalcanti; De Moraes, Pedro Zurvaino Palmeira Melo Rosa; Spyrides, Maria Helena Constantino; Pessoa, Valdir Filgueiras

    2014-08-01

The complex evolution of primate color vision has puzzled biologists for decades. Primates are the only eutherian mammals that evolved an enhanced capacity for discriminating colors in the green-red part of the spectrum (trichromatism). However, while Old World primates present three types of cone pigments and are routinely trichromatic, most New World primates exhibit a color vision polymorphism, characterized by the occurrence of trichromatic and dichromatic females and obligatory dichromatic males. Even though this has stimulated a prolific line of inquiry, the selective forces and relative benefits influencing color vision evolution in primates are still under debate, with current explanations focusing almost exclusively on the advantages in finding food and detecting socio-sexual signals. Here, we evaluate a previously untested possibility, the adaptive value of primate color vision for predator detection. By combining color vision modeling data on New World and Old World primates, as well as behavioral information from human subjects, we demonstrate that primates exhibiting better color discrimination (trichromats) outperform those with poorer color vision (dichromats) at detecting carnivoran predators against the green foliage background. The distribution of color vision found in extant anthropoid primates agrees with our results, and may be explained by the advantages of trichromats and dichromats in detecting predators and insects, respectively. © 2014 Wiley Periodicals, Inc.

  8. Adaptive Road Crack Detection System by Pavement Classification

    Directory of Open Access Journals (Sweden)

    Alejandro Amírola

    2011-10-01

Full Text Available This paper presents a road distress detection system involving the phases needed to properly deal with fully automatic road distress assessment. A vehicle equipped with line scan cameras, laser illumination and acquisition HW-SW is used to store the digital images that will be further processed to identify road cracks. Pre-processing is firstly carried out to both smooth the texture and enhance the linear features. Non-crack feature detection is then applied to mask areas of the images with joints, sealed cracks and white painting, which usually generate false positive cracking. A seed-based approach is proposed to deal with road crack detection, combining Multiple Directional Non-Minimum Suppression (MDNMS) with a symmetry check. Seeds are linked by computing the paths with the lowest cost that meet the symmetry restrictions. The whole detection process involves the use of several parameters. A correct setting becomes essential to get optimal results without manual intervention. A fully automatic approach by means of a linear SVM-based classifier ensemble able to distinguish between up to 10 different types of pavement that appear on Spanish roads is proposed. The optimal feature vector includes different texture-based features. The parameters are then tuned depending on the output provided by the classifier. Regarding non-crack feature detection, results show that the introduction of such a module reduces the impact of false positives due to non-crack features by up to a factor of 2. In addition, the observed performance of the crack detection system is significantly boosted by adapting the parameters to the type of pavement.

  9. Big Data Analytics for Flow-based Anomaly Detection in High-Speed Networks

    OpenAIRE

    Garofalo, Mauro

    2017-01-01

The Cisco VNI Complete Forecast Highlights clearly states that Internet traffic is growing in three different directions, Volume, Velocity, and Variety, bringing computer networks into the big data era. At the same time, sophisticated network attacks are growing exponentially. Such growth is making the existing signature-based security tools, like firewalls and traditional intrusion detection systems, ineffective against new kinds of attacks or variations of known attacks. In this dissertati...

  10. Anomaly-based online intrusion detection system as a sensor for cyber security situational awareness system

    OpenAIRE

    Kokkonen, Tero

    2016-01-01

    Almost all the organisations and even individuals rely on complex structures of data networks and networked computer systems. That complex data ensemble, the cyber domain, provides great opportunities, but at the same time it offers many possible attack vectors that can be abused for cyber vandalism, cyber crime, cyber espionage or cyber terrorism. Those threats produce requirements for cyber security situational awareness and intrusion detection capability. This dissertation conc...

  11. Anomaly detection in forward looking infrared imaging using one-class classifiers

    Science.gov (United States)

    Popescu, Mihail; Stone, Kevin; Havens, Timothy; Ho, Dominic; Keller, James

    2010-04-01

In this paper we describe a method for generating cues of possible abnormal objects present in the field of view of an infrared (IR) camera installed on a moving vehicle. The proposed method has two steps. In the first step, for each frame, we generate a set of possible points of interest using a corner detection algorithm. In the second step, the points related to the background are discarded from the point set using a one-class classifier (OCC) trained on features extracted from a local neighborhood of each point. The advantage of using an OCC is that we do not need examples from the "abnormal object" class to train the classifier. Instead, the OCC is trained using corner points from images known to be free of abnormal objects, i.e., that contain only background scenes. To further reduce the number of false alarms we use a temporal fusion procedure: a region has to be detected as "interesting" in m out of n consecutive frames. Several OCCs are compared using a set of about 900 background point neighborhoods for training and 400 for testing. The best performing OCC is then used to detect abnormal objects in a set of IR video sequences obtained on a 1 mile long country road.
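The temporal-fusion rule described (confirm a region only when it is flagged in m of the last n frames) can be sketched in a few lines; m and n here are arbitrary illustrative values.

```python
from collections import deque

def m_of_n_fusion(frame_hits, m=3, n=5):
    """Confirm a detection only when it fires in >= m of the last n frames."""
    window = deque(maxlen=n)   # sliding window of recent per-frame hits
    confirmed = []
    for hit in frame_hits:
        window.append(bool(hit))
        confirmed.append(sum(window) >= m)
    return confirmed
```

Isolated single-frame hits are suppressed, while persistent detections pass through once they accumulate m hits in the window.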

  12. Early detection and diagnosis of plant anomalies using parallel simulation and knowledge engineering techniques

    International Nuclear Information System (INIS)

    Ness, E.; Berg, Oe.; Soerenssen, A.

    1990-01-01

A conventional alarm system in a nuclear power plant surveys pressures, temperatures and similar physical quantities and triggers if they get too high or too low. To avoid false alarms for a dynamic process, the alarm limits should be rather wide. This means that a disturbance may develop quite a bit before it is detected. In order to get warnings sufficiently early to avoid taking drastic countermeasures in restoring normal plant conditions, the alarm limits should be put close to the desired operating points. As an extension of several years' activity on alarm reduction methods, the OECD Halden Reactor Project started in 1985 to develop a fault detection system based on the application of reference models for process sections. The system looks at groups of variables rather than single variables. In this way each variable within the group may have a legal value, but the group as a whole may indicate that something is wrong. Therefore faults are detected earlier than by conventional alarm systems, even in dynamic situations. Reference models for the feedwater system of a PWR nuclear power plant have been successfully evaluated with real process data. A test installation is now running on the Loviisa NPP, Finland

  13. Knowledge-based verification of clinical guidelines by detection of anomalies.

    Science.gov (United States)

    Duftschmid, G; Miksch, S

    2001-04-01

As shown in numerous studies, a significant part of published clinical guidelines is tainted with different types of semantic errors that interfere with their practical application. The adaptation of generic guidelines, necessitated by circumstances such as resource limitations within the applying organization or unexpected events arising in the course of patient care, further promotes the introduction of defects. Still, most current approaches for the automation of clinical guidelines lack mechanisms that check the overall correctness of their output. In the domain of software engineering in general and in the domain of knowledge-based systems (KBS) in particular, a common strategy to examine a system for potential defects consists in its verification. The focus of this work is to present an approach, which helps to ensure the semantic correctness of clinical guidelines in a three-step process. We use a particular guideline specification language called Asbru to demonstrate our verification mechanism. A scenario-based evaluation of our method is provided based on a guideline for the artificial ventilation of newborn infants. The described approach is kept sufficiently general in order to allow its application to several other guideline representation formats.

  14. Gravitational anomalies

    Energy Technology Data Exchange (ETDEWEB)

    Leutwyler, H; Mallik, S

    1986-12-01

The effective action for fermions moving in external gravitational and gauge fields is analyzed in terms of the corresponding external field propagator. The central object in our approach is the covariant energy-momentum tensor which is extracted from the regular part of the propagator at short distances. It is shown that the Lorentz anomaly, the conformal anomaly and the gauge anomaly can be expressed in terms of the local polynomials which determine the singular part of the propagator. (There are no coordinate anomalies). Except for the conformal anomaly, for which we give explicit representations only in d ≤ 4, we consider an arbitrary number of dimensions.

  15. Accuracy Analysis Comparison of Supervised Classification Methods for Anomaly Detection on Levees Using SAR Imagery

    Directory of Open Access Journals (Sweden)

    Ramakalavathi Marapareddy

    2017-10-01

Full Text Available This paper analyzes the use of synthetic aperture radar (SAR) imagery to support levee condition assessment by detecting potential slide areas in an efficient and cost-effective manner. Levees are prone to failure in the form of internal erosion within the earthen structure and landslides (also called slough or slump slides. If not repaired, slough slides may lead to levee failures. In this paper, we compare the accuracy of the supervised classification methods minimum distance (MD using Euclidean and Mahalanobis distance, support vector machine (SVM, and maximum likelihood (ML, using SAR technology to detect slough slides on earthen levees. In this work, the effectiveness of the algorithms was demonstrated using quad-polarimetric L-band SAR imagery from the NASA Jet Propulsion Laboratory’s (JPL’s uninhabited aerial vehicle synthetic aperture radar (UAVSAR. The study area is a section of the lower Mississippi River valley in the Southern USA, where earthen flood control levees are maintained by the US Army Corps of Engineers.
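The minimum-distance classifier compared above, in both its Euclidean and Mahalanobis variants, is simple enough to sketch directly. This is a generic textbook formulation, not the paper's processing chain; the Mahalanobis variant here assumes a single shared covariance matrix.

```python
import numpy as np

def md_classify(X, class_means, cov=None):
    """Minimum-distance classifier over pixel vectors X.

    Uses Euclidean distance when cov is None, otherwise the
    Mahalanobis distance with a shared covariance matrix.
    """
    diff = X[:, None, :] - class_means[None, :, :]   # (n, classes, bands)
    if cov is None:
        d2 = (diff ** 2).sum(axis=2)
    else:
        inv = np.linalg.inv(cov)
        d2 = np.einsum('nci,ij,ncj->nc', diff, inv, diff)
    return d2.argmin(axis=1)                         # nearest class per pixel
```

With an identity covariance, the two variants coincide; a non-diagonal covariance reweights band differences, which is what distinguishes the MD-Mahalanobis results from MD-Euclidean in the accuracy comparison.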

  16. Robust online tracking via adaptive samples selection with saliency detection

    Science.gov (United States)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has shown to be successful in tracking of previously unknown objects. However, there are two important factors which lead to drift problem of online tracking, the one is how to select the exact labeled samples even when the target locations are inaccurate, and the other is how to handle the confusors which have similar features with the target. In this article, we propose a robust online tracking algorithm with adaptive samples selection based on saliency detection to overcome the drift problem. To deal with the problem of degrading the classifiers using mis-aligned samples, we introduce the saliency detection method to our tracking problem. Saliency maps and the strong classifiers are combined to extract the most correct positive samples. Our approach employs a simple yet saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using the random patches as the negative samples, we propose a reasonable selection criterion, in which both the saliency confidence and similarity are considered with the benefits that confusors in the surrounding background are incorporated into the classifiers update process before the drift occurs. The tracking task is formulated as a binary classification via online boosting framework. Experiment results in several challenging video sequences demonstrate the accuracy and stability of our tracker.

  17. Fraud Detection in Credit Card Transactions; Using Parallel Processing of Anomalies in Big Data

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Taghva

    2016-10-01

Full Text Available In parallel to the increasing use of electronic cards, especially in the banking industry, the volume of transactions using these cards has grown rapidly. Moreover, the financial nature of these cards has made fraud in this area attractive. The present study applied the Kohonen neural network model, with a MapReduce approach and parallel processing, to detect abnormalities in bank card transactions. For this purpose, all transactions were first classified as fraudulent or legitimate, which showed better performance compared with other methods. In the next step, we transformed the Kohonen model into a parallel task, which demonstrated appropriate performance in terms of time and, as expected, scales to transactions under Big Data assumptions.
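A Kohonen network (self-organizing map) scores anomalies by the quantization error: the distance from a transaction's feature vector to its best-matching unit. The serial sketch below illustrates that idea only; it is not the study's MapReduce implementation, and the map size, learning-rate schedule and neighborhood width are illustrative assumptions.

```python
import numpy as np

def train_som(X, n_units=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D Kohonen map on the rows of X (serial sketch)."""
    rng = np.random.default_rng(seed)
    # initialize units from random training vectors
    W = X[rng.choice(len(X), size=n_units, replace=False)].astype(float).copy()
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)                 # decaying learning rate
        sigma = max(sigma0 * (1.0 - t / epochs), 0.5) # shrinking neighborhood
        for x in X[rng.permutation(len(X))]:
            bmu = int(np.argmin(((W - x) ** 2).sum(axis=1)))
            dist = np.abs(np.arange(n_units) - bmu)   # grid distance to BMU
            h = np.exp(-dist ** 2 / (2.0 * sigma ** 2))
            W += lr * h[:, None] * (x - W)            # pull units toward x
    return W

def anomaly_score(W, x):
    """Quantization error: distance to the best-matching unit."""
    return float(np.sqrt(((W - x) ** 2).sum(axis=1).min()))
```

Transactions far from every learned prototype receive high scores; the MapReduce step in the study parallelizes this scoring over transaction shards.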

  18. ORP and pH measurements to detect redox and acid-base anomalies from hydrothermal activity

    Science.gov (United States)

    Santana-Casiano, J. M.; González-Dávila, M.; Fraile-Nuez, E.

    2017-12-01

The Tagoro submarine volcano is located 1.8 km south of the Island of El Hierro at 350 m depth and rises up to 88 m below sea level. It was erupting molten material for five months, from October 2011 to March 2012, drastically changing the physical-chemical properties of the water column in the area. After this eruption, the system evolved into a hydrothermal system. The reduced and acidic character of the hydrothermal emissions at the Tagoro submarine volcano allowed us to detect anomalies related to changes in the chemical potential and the proton concentration using ORP and pH sensors, respectively. Tow-yos using a CTD-rosette with these two sensors provided the locations of the emissions by plotting δ(ORP)/δt and ΔpH versus latitude or longitude. The ORP sensor responds very quickly to the presence of reduced chemicals in the water column. Changes in potential are proportional to the amount of reduced chemical species present in the water. The magnitude of these changes is examined via the time derivative of ORP, δ(ORP)/δt. To detect changes in the pH, the mean pH for each depth at a reference station in an area not affected by the vent emission is subtracted from each point measured near the volcanic edifice, defining in this way ΔpH. Detailed surveys of the volcanic edifice were carried out between 2014 and 2016 using several CTD-pH-ORP tow-yo studies, localizing the ORP and pH changes, which were used to obtain surface maps of anomalies. Moreover, meridional tow-yos were used to calculate the amount of volcanic CO2 added to the water column. The inputs of CO2 along multiple sections combined with measurements of oceanic currents produced an estimated volcanic CO2 flux of 6.0 × 10⁵ ± 1.1 × 10⁵ kg d⁻¹, which increases the acidity above the volcano by 20%. Sites like the Tagoro submarine volcano, in its degasification stage, provide an excellent opportunity to study the carbonate system in a high CO2 world, the volcanic contribution to the global
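The two anomaly quantities defined above, the time derivative δ(ORP)/δt and ΔpH relative to a reference station, reduce to a few array operations. The sketch below is illustrative; the threshold values are assumptions, not the survey's calibrated limits.

```python
import numpy as np

def orp_ph_anomalies(t, orp, ph, ph_ref, dorp_thresh=-0.5, dph_thresh=-0.02):
    """Flag samples with a rapid ORP drop or an acidity excess vs. reference.

    t, orp, ph: along-track time series; ph_ref: reference pH profile
    (same depths) from a station unaffected by the vent emissions.
    """
    dorp_dt = np.gradient(orp, t)   # time derivative of ORP
    delta_ph = ph - ph_ref          # negative where the water is more acidic
    return (dorp_dt < dorp_thresh) | (delta_ph < dph_thresh)
```

A sharp ORP drop (reduced species) or a negative ΔpH (acid emission) flags the sample, matching the tow-yo mapping described in the abstract.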

  19. Detecting geothermal anomalies and evaluating LST geothermal component by combining thermal remote sensing time series and land surface model data

    Science.gov (United States)

    Romaguera, Mireia; Vaughan, R. Greg; Ettema, J.; Izquierdo-Verdiguier, E.; Hecker, C. A.; van der Meer, F.D.

    2018-01-01

This paper explores for the first time the possibilities to use two land surface temperature (LST) time series of different origins (geostationary Meteosat Second Generation satellite data and Noah land surface modelling, LSM), to detect geothermal anomalies and extract the geothermal component of LST, the LSTgt. We hypothesize that in geothermal areas the LSM time series will underestimate the LST as compared to the remote sensing data, since the former does not account for the geothermal component in its model. In order to extract LSTgt, two approaches of different nature (physical based and data mining) were developed and tested in an area of about 560 × 560 km² centered at the Kenyan Rift. Pre-dawn data in the study area during the first 45 days of 2012 were analyzed. The results show consistent spatial and temporal LSTgt patterns between the two approaches, and systematic differences of about 2 K. A geothermal area map from surface studies was used to assess LSTgt inside and outside the geothermal boundaries. Spatial means were found to be higher inside the geothermal limits, as well as the relative frequency of occurrence of high LSTgt. Results further show that areas with strong topography can result in anomalously high LSTgt values (false positives), which suggests the need for a slope and aspect correction in the inputs to achieve realistic results in those areas. The uncertainty analysis indicates that large uncertainties of the input parameters may limit detection of LSTgt anomalies. To validate the approaches, higher spatial resolution images from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data over the Olkaria geothermal field were used. An established method to estimate radiant geothermal flux was applied providing values between 9 and 24 W/m² in the geothermal area, which coincides with the LSTgt flux rates obtained with the proposed approaches. The proposed approaches are a first step in estimating LSTgt

  20. Sensitivity of susceptibility-weighted imaging in detecting developmental venous anomalies and associated cavernomas and microhemorrhages in children

    International Nuclear Information System (INIS)

    Young, Allen; Bosemani, Thangamadhan; Goel, Reema; Huisman, Thierry A.G.M.; Poretti, Andrea

    2017-01-01

Developmental venous anomalies (DVA) are common neuroimaging abnormalities that are traditionally diagnosed by contrast-enhanced T1-weighted images as the gold standard. We aimed to evaluate the sensitivity of SWI in detecting DVA and associated cavernous malformations (CM) and microhemorrhages in children in order to determine if SWI may replace contrast-enhanced MRI sequences. Contrast-enhanced T1-weighted images were used as diagnostic gold standard for DVA. The presence of DVA was qualitatively assessed on axial SWI and T2-weighted images by an experienced pediatric neuroradiologist. In addition, the presence of CM and microhemorrhages was evaluated on SWI and contrast-enhanced T1-weighted images. Fifty-seven children with DVA (34 males, mean age at neuroimaging 11.2 years, range 1 month to 17.9 years) were included in this study. Forty-nine out of 57 DVA were identified on SWI (sensitivity of 86%) and 16 out of 57 DVA were detected on T2-weighted images (sensitivity of 28.1%). General anesthesia-related changes in brain hemodynamics and oxygenation were most likely responsible for the majority of SWI false negatives. CM were detected in 12 patients on axial SWI, but only in six on contrast-enhanced T1-weighted images. Associated microhemorrhages could be identified in four patients on both axial SWI and contrast-enhanced T1-weighted images, although more numerous and conspicuous on SWI. SWI can identify DVA and associated cavernous malformations and microhemorrhages with high sensitivity, obviating the need for contrast-enhanced MRI sequences. (orig.)
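The reported sensitivities follow directly from the detection counts in the abstract (sensitivity = detected cases / all true cases):

```python
def sensitivity(detected, total):
    # sensitivity = true positives / all positives
    return detected / total

print(round(100 * sensitivity(49, 57), 1))  # SWI: 49 of 57 DVA -> 86.0%
print(round(100 * sensitivity(16, 57), 1))  # T2:  16 of 57 DVA -> 28.1%
```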

  1. Sensitivity of susceptibility-weighted imaging in detecting developmental venous anomalies and associated cavernomas and microhemorrhages in children

    Energy Technology Data Exchange (ETDEWEB)

    Young, Allen; Bosemani, Thangamadhan; Goel, Reema; Huisman, Thierry A.G.M. [The Johns Hopkins School of Medicine, Charlotte R. Bloomberg Children' s Center, Division of Pediatric Radiology and Pediatric Neuroradiology, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Poretti, Andrea [The Johns Hopkins School of Medicine, Charlotte R. Bloomberg Children' s Center, Division of Pediatric Radiology and Pediatric Neuroradiology, The Russell H. Morgan Department of Radiology and Radiological Science, Baltimore, MD (United States); Kennedy Krieger Institute, Department of Neurogenetics, Baltimore, MD (United States)

    2017-08-15

Developmental venous anomalies (DVA) are common neuroimaging abnormalities that are traditionally diagnosed by contrast-enhanced T1-weighted images as the gold standard. We aimed to evaluate the sensitivity of SWI in detecting DVA and associated cavernous malformations (CM) and microhemorrhages in children in order to determine if SWI may replace contrast-enhanced MRI sequences. Contrast-enhanced T1-weighted images were used as diagnostic gold standard for DVA. The presence of DVA was qualitatively assessed on axial SWI and T2-weighted images by an experienced pediatric neuroradiologist. In addition, the presence of CM and microhemorrhages was evaluated on SWI and contrast-enhanced T1-weighted images. Fifty-seven children with DVA (34 males, mean age at neuroimaging 11.2 years, range 1 month to 17.9 years) were included in this study. Forty-nine out of 57 DVA were identified on SWI (sensitivity of 86%) and 16 out of 57 DVA were detected on T2-weighted images (sensitivity of 28.1%). General anesthesia-related changes in brain hemodynamics and oxygenation were most likely responsible for the majority of SWI false negatives. CM were detected in 12 patients on axial SWI, but only in six on contrast-enhanced T1-weighted images. Associated microhemorrhages could be identified in four patients on both axial SWI and contrast-enhanced T1-weighted images, although more numerous and conspicuous on SWI. SWI can identify DVA and associated cavernous malformations and microhemorrhages with high sensitivity, obviating the need for contrast-enhanced MRI sequences. (orig.)

  2. Anomaly detection using simulated MTI data cubes derived from HYDICE data

    International Nuclear Information System (INIS)

    Moya, M.M.; Taylor, J.G.; Stallard, B.R.; Motomatsu, S.E.

    1998-01-01

The US Department of Energy is funding the development of the Multi-spectral Thermal Imager (MTI), a satellite-based multi-spectral (MS) thermal imaging sensor scheduled for launch in October 1999. MTI is a research and development (R and D) platform to test the applicability of multispectral and thermal imaging technology for detecting and monitoring signs of proliferation of weapons of mass destruction. During its three-year mission, MTI will periodically record images of participating government, industrial and natural sites in fifteen visible and infrared spectral bands to provide a variety of image data associated with weapons production activities. The MTI satellite will have spatial resolution in the visible bands that is five times better than LANDSAT TM in each dimension and will have five thermal bands. In this work, the authors quantify the separability between specific materials and the natural background by applying Receiver Operating Characteristic (ROC) analysis to the residual errors from a linear unmixing. The authors apply the ROC analysis to quantify performance of the MTI. They describe the MTI imager and simulate its data by filtering HYDICE hyperspectral imagery both spatially and spectrally and by introducing atmospheric effects corresponding to the MTI satellite altitude. They compare and contrast the individual effects on performance of spectral resolution, spatial resolution, atmospheric corrections, and varying atmospheric conditions.

  3. Multi-scale structure and topological anomaly detection via a new network statistic: The onion decomposition.

    Science.gov (United States)

    Hébert-Dufresne, Laurent; Grochow, Joshua A; Allard, Antoine

    2016-08-18

    We introduce a network statistic that measures structural properties at the micro-, meso-, and macroscopic scales, while still being easy to compute and interpretable at a glance. Our statistic, the onion spectrum, is based on the onion decomposition, which refines the k-core decomposition, a standard network fingerprinting method. The onion spectrum is exactly as easy to compute as the k-cores: It is based on the stages at which each vertex gets removed from a graph in the standard algorithm for computing the k-cores. Yet, the onion spectrum reveals much more information about a network, and at multiple scales; for example, it can be used to quantify node heterogeneity, degree correlations, centrality, and tree- or lattice-likeness. Furthermore, unlike the k-core decomposition, the combined degree-onion spectrum immediately gives a clear local picture of the network around each node which allows the detection of interesting subgraphs whose topological structure differs from the global network organization. This local description can also be leveraged to easily generate samples from the ensemble of networks with a given joint degree-onion distribution. We demonstrate the utility of the onion spectrum for understanding both static and dynamic properties on several standard graph models and on many real-world networks.
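Because the onion spectrum is computed by the same peeling process as the k-cores, it can be sketched compactly: repeatedly remove every remaining vertex whose degree is at most the current core value, one sweep per onion layer. The function name and the adjacency-dict representation below are illustrative choices.

```python
def onion_decomposition(adj):
    """Return (core, layer): k-core number and onion layer for each node.

    adj maps each node to the set of its neighbours (undirected graph).
    """
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    remaining = set(adj)
    core, layer = {}, {}
    current_layer = 0
    while remaining:
        k = min(deg[v] for v in remaining)        # next core value
        while True:
            peel = [v for v in remaining if deg[v] <= k]
            if not peel:
                break
            current_layer += 1                    # one onion layer per sweep
            for v in peel:
                core[v] = k
                layer[v] = current_layer
                remaining.discard(v)
            for v in peel:                        # update surviving degrees
                for u in adj[v]:
                    if u in remaining:
                        deg[u] -= 1
    return core, layer
```

The histogram of `layer` values within each core is the onion spectrum; nodes removed in late sweeps of their core sit deeper in the "onion" than their core number alone reveals.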

  4. Trench Parallel Bouguer Anomaly (TPBA): A robust measure for statically detecting asperities along the forearc of subduction zones

    Science.gov (United States)

    Raeesi, M.

    2009-05-01

During the 1970s some researchers noticed that large earthquakes occur repeatedly at the same locations. These observations led to the asperity hypothesis. At the same time, other researchers noticed a relationship between the location of great interplate earthquakes and submarine structures, basins in particular, over the rupture area in the forearc regions. Despite these observations there was no comprehensive and reliable hypothesis explaining the relationship, and there were numerous pros and cons to the various hypotheses given in this regard. In their pioneering study, Song and Simons (2003) approached the problem using gravity data. This was a turning point in seismology. Although their approach was correct, an appropriate gravity anomaly had to be used in order to reveal the location and extent of the asperities. Following the method of Song and Simons (2003) but using the Bouguer gravity anomaly, which we call the "Trench Parallel Bouguer Anomaly", TPBA, we found a strong, logical, and convincing relation between the TPBA-derived asperities and the slip distribution as well as the earthquake distribution, foreshocks and aftershocks in particular. Various parameters with different levels of importance are known to affect the contact between the subducting and the overriding plates. We found that the TPBA can show which of these factors are important. Because the TPBA-derived asperities are based on static physical properties (gravity and elevation), they do not suffer from instabilities due to trade-offs, as happens for asperities derived in dynamic studies such as waveform inversion. Comparison of the TPBA-derived asperities with the rupture processes of well-studied great earthquakes reveals the high level of accuracy of the TPBA. This new measure opens a forensic viewpoint on the rupture process along subduction zones. The TPBA reveals the reason behind 9+ earthquakes and it explains where and why they occur. The TPBA reveals the areas that can
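The essence of a trench-parallel anomaly, in the spirit of Song and Simons (2003), is to subtract the along-trench average at each trench-normal distance so that along-strike deviations stand out. The sketch below is a strong simplification of any real processing chain; it assumes the grid has already been rotated so that axis 1 runs parallel to the trench.

```python
import numpy as np

def trench_parallel_anomaly(bouguer_grid):
    """Subtract the along-trench mean at each trench-normal distance.

    Assumes axis 1 of the 2D grid runs parallel to the trench; negative
    residuals then mark candidate asperities (along-strike gravity lows).
    """
    return bouguer_grid - bouguer_grid.mean(axis=1, keepdims=True)
```

By construction each row of the residual averages to zero, so only along-strike structure, the signal the TPBA exploits, survives.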

  5. An Unsupervised Deep Hyperspectral Anomaly Detector

    Directory of Open Access Journals (Sweden)

    Ning Ma

    2018-02-01

    Full Text Available Hyperspectral image (HSI) based detection has attracted considerable attention recently in agriculture, environmental protection and military applications, as different wavelengths of light can be used advantageously to discriminate different types of objects. Unfortunately, estimating the background distribution and detecting interesting local objects is not straightforward, and anomaly detectors may give false alarms. In this paper, a Deep Belief Network (DBN) based anomaly detector is proposed. The high-level features and reconstruction errors are learned through the network in a manner which is not affected by prior background distribution assumptions. To reduce contamination by local anomalies, adaptive weights are constructed from reconstruction errors and statistical information. By using the code image which is generated during the inference of the DBN and modified by adaptively updated weights, a local Euclidean distance between the pixels under test and their neighboring pixels is used to determine the anomaly targets. Experimental results on synthetic and recorded HSI datasets show that the proposed method outperforms the classic global Reed-Xiaoli detector (RXD), the local RX detector (LRXD) and the state-of-the-art Collaborative Representation detector (CRD).
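
    The global Reed-Xiaoli detector (RXD) used as the baseline above can be sketched in a few lines: each pixel spectrum is scored by its Mahalanobis distance from the global background statistics. This is a generic illustration of the baseline, not the proposed DBN detector; the function name is hypothetical.

```python
import numpy as np

def rx_detector(hsi):
    """Global Reed-Xiaoli (RXD) anomaly scores for an HSI cube (rows, cols, bands).

    Each pixel's score is the Mahalanobis distance of its spectrum from the
    global background mean; large scores flag spectral anomalies.
    """
    rows, cols, bands = hsi.shape
    x = hsi.reshape(-1, bands).astype(float)
    mu = x.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse guards against singular cov
    d = x - mu
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(rows, cols)
```

A pixel whose spectrum is shifted away from the background distribution receives the largest score, which is exactly the behavior local and collaborative variants refine.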

  6. Holonomy anomalies

    International Nuclear Information System (INIS)

    Bagger, J.; Nemeschansky, D.; Yankielowicz, S.

    1985-05-01

    A new type of anomaly is discussed that afflicts certain non-linear sigma models with fermions. This anomaly is similar to the ordinary gauge and gravitational anomalies in that it reflects a topological obstruction to the reparametrization invariance of the quantum effective action. Nonlinear sigma models are constructed based on homogeneous spaces G/H. Anomalies arising when the fermions are chiral are shown to be sometimes cancelled by Chern-Simons terms. Nonlinear sigma models based on general Riemannian manifolds are also considered. 9 refs

  7. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study

    Directory of Open Access Journals (Sweden)

    Daniela Gamba GARIB

    Full Text Available ABSTRACT The early recognition of risk factors for the occurrence of palatally displaced canines (PDC) can increase the possibility of impaction prevention. Objective To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. Material and Methods The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) was composed of 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test (p<0.01), relative risk assessments (RR), and positive and negative predictive values (PPV and NPV). Results PDC frequency was 16.35% and 6.2% in the DA and NDA groups, respectively. A statistically significant difference was observed between groups (p<0.01), with greater risk of PDC development in the DA group (RR=2.63). The PPV and NPV were 16% and 93%, respectively. Small maxillary lateral incisors, deciduous molar infraocclusion, and mandibular second premolar distoangulation were associated with PDC. Conclusion Children with dental anomalies diagnosed during early mixed dentition have an approximately two and a half fold increased risk of developing PDC during late mixed dentition compared with children without dental anomalies.
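
    The reported statistics follow from a standard 2x2 cohort table. The counts below (43 PDC cases of 263 in the DA group, 29 of 467 in the NDA group) are reconstructed from the reported percentages and are therefore approximate; the function name is illustrative.

```python
def risk_stats(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk, PPV and NPV for a 2x2 cohort table.

    'Exposed' = early dental anomaly (DA group); outcome = PDC.
    """
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    rr = risk_exposed / risk_unexposed
    ppv = risk_exposed                                           # P(PDC | anomaly present)
    npv = (unexposed_total - unexposed_cases) / unexposed_total  # P(no PDC | no anomaly)
    return rr, ppv, npv

# Counts reconstructed from the reported percentages (43/263 = 16.35%, 29/467 = 6.2%)
rr, ppv, npv = risk_stats(43, 263, 29, 467)
```

With these reconstructed counts the function reproduces the paper's RR of 2.63, PPV of 16% and NPV of about 93%.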

  8. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study

    OpenAIRE

    GARIB, Daniela Gamba; LANCIA, Melissa; KATO, Renata Mayumi; OLIVEIRA, Thais Marchini; NEVES, Lucimara Teixeira das

    2016-01-01

    ABSTRACT The early recognition of risk factors for the occurrence of palatally displaced canines (PDC) can increase the possibility of impaction prevention. Objective To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. Material and Methods The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263...

  9. Risk of developing palatally displaced canines in patients with early detectable dental anomalies: a retrospective cohort study

    Science.gov (United States)

    GARIB, Daniela Gamba; LANCIA, Melissa; KATO, Renata Mayumi; OLIVEIRA, Thais Marchini; NEVES, Lucimara Teixeira das

    2016-01-01

    ABSTRACT The early recognition of risk factors for the occurrence of palatally displaced canines (PDC) can increase the possibility of impaction prevention. Objective To estimate the risk of PDC occurrence in children with dental anomalies identified early during mixed dentition. Material and Methods The sample comprised 730 longitudinal orthodontic records from children (448 females and 282 males) with an initial mean age of 8.3 years (SD=1.36). The dental anomaly group (DA) included 263 records of patients with at least one dental anomaly identified in the initial or middle mixed dentition. The non-dental anomaly group (NDA) was composed of 467 records of patients with no dental anomalies. The occurrence of PDC in both groups was diagnosed using panoramic and periapical radiographs taken in the late mixed dentition or early permanent dentition. The prevalence of PDC in patients with and without early diagnosed dental anomalies was compared using the chi-square test (p<0.01), relative risk assessments (RR), and positive and negative predictive values (PPV and NPV). Results PDC frequency was 16.35% and 6.2% in DA and NDA groups, respectively. A statistically significant difference was observed between groups (p<0.01), with greater risk of PDC development in the DA group (RR=2.63). The PPV and NPV were 16% and 93%, respectively. Small maxillary lateral incisors, deciduous molar infraocclusion, and mandibular second premolar distoangulation were associated with PDC. Conclusion Children with dental anomalies diagnosed during early mixed dentition have an approximately two and a half fold increased risk of developing PDC during late mixed dentition compared with children without dental anomalies. PMID:28076458

  10. Tribal Odisha Eye Disease Study (TOES # 2) Rayagada school screening program: efficacy of multistage screening of school teachers in detection of impaired vision and other ocular anomalies

    Directory of Open Access Journals (Sweden)

    Panda L

    2018-06-01

    Full Text Available Lapam Panda,1 Taraprasad Das,1 Suryasmita Nayak,1 Umasankar Barik,2 Bikash C Mohanta,1 Jachin Williams,3 Vivekanand Warkad,4 Guha Poonam Tapas Kumar,5 Rohit C Khanna3 1Indian Oil Center for Rural Eye Health, GPR ICARE, L V Prasad Eye Institute, MTC Campus, Bhubaneswar, India; 2Naraindas Morbai Budhrani Eye Centre, L V Prasad Eye Institute, Rayagada, India; 3Gullapalli Pratibha Rao International Center for Advancement of Rural Eye Care, L V Prasad Eye Institute, KAR Campus, Hyderabad, India; 4Miriam Hyman Children Eye Care Center, L V Prasad Eye Institute, MTC Campus, Bhubaneswar, India; 5District Administration, Government of Odisha, Rayagada, India Purpose: To describe program planning and effectiveness of multistage school eye screening, and to assess the accuracy of teachers in vision screening and detection of other ocular anomalies in the Rayagada District School Sight Program, Odisha, India. Methods: This multistage screening of students included the following: stage I: screening for vision and other ocular anomalies by school teachers in the school; stage II: photorefraction, subjective correction and confirmation of other ocular anomalies by optometrists in the school; stage III: comprehensive ophthalmologist examination in a secondary eye center; and stage IV: pediatric ophthalmologist examination in a tertiary eye center. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of teachers for vision screening and detection of other ocular anomalies were calculated vis-à-vis the optometrists (gold standard). Results: In the study, 216 teachers examined 153,107 (95.7%) of enrolled students aged 5–15 years. Teachers referred 8,363 (5.4%) of examined students, and 5,990 (71.6%) of those referred were examined in stage II. After prescribing spectacles to 443, optometrists referred 883 students to stage III. The sensitivity (80.51%) and PPV (93.05%) of teachers for vision screening were high, but specificity (53.29%) and NPV (26.02%) were low. The
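
    The teacher-versus-optometrist comparison reduces to standard confusion-matrix arithmetic against the gold standard. The sketch below shows the generic formulas; the counts in the usage line are made up for illustration, since the study's raw true/false positive counts are not given in the abstract.

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV of a screener vs a gold standard.

    tp/fp/fn/tn are counts from the 2x2 agreement table between the
    screener (here, teachers) and the reference examiner (optometrists).
    """
    sens = tp / (tp + fn)   # fraction of true cases the screener catches
    spec = tn / (tn + fp)   # fraction of non-cases correctly passed
    ppv = tp / (tp + fp)    # P(true case | screener referred)
    npv = tn / (tn + fn)    # P(no case  | screener passed)
    return sens, spec, ppv, npv

# Illustrative counts only, not the study's data:
sens, spec, ppv, npv = screening_metrics(tp=8, fp=2, fn=2, tn=8)
```

The study's pattern (high sensitivity and PPV, low specificity and NPV) corresponds to teachers over-referring relative to the optometrists' findings.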

  11. [The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies--will a global change start in Israel?].

    Science.gov (United States)

    Bronshtein, Moshe; Solt, Ido; Blumenfeld, Zeev

    2014-06-01

    Despite more than three decades of universal popularity of fetal sonography as an integral part of pregnancy evaluation, there is still no unequivocal agreement regarding the optimal timing of fetal sonographic screening or the type of ultrasound (transvaginal vs abdominal). We propose transvaginal systematic sonography at 14-17 weeks for fetal organ screening. The evaluation of over 72,000 early (14-17 weeks) and late (18-24 weeks) fetal ultrasonographic systematic organ screenings revealed that 96% of malformations are detectable in the early screening, with an incidence of 1:50 gestations. Only 4% of fetal anomalies are diagnosed later in pregnancy. Over 99% of fetal cardiac anomalies are detectable in the early screening, and most of them appear in low risk gestations. Therefore, we suggest a new platform of fetal sonographic evaluation and follow-up: the extensive systematic fetal organ screening should be performed at 14-17 weeks gestation by an expert sonographer trained in the detection of fetal malformations. This examination should also include fetal cardiac echography. Three additional ultrasound examinations are suggested during pregnancy: the first, performed by the patient's obstetrician at 6-7 weeks for the exclusion of ectopic pregnancy, confirmation of fetal viability, dating, assessment of chorionicity in multiple gestations, and visualization of the maternal adnexae. The other two, at 22-26 and 32-34 weeks, require less training and should be performed by an obstetrician qualified in the sonographic detection of fetal anomalies. The advantages of early midtrimester targeted fetal systematic organ screening for the detection of fetal anomalies may dictate a global change.

  12. Detecting Patterns of Anomalies

    Science.gov (United States)

    2009-03-01


  13. Spectral analysis of geological materials in the Central Volcanic Range of Costa Rica and its relationship to the remote detection of anomalies

    OpenAIRE

    Rejas, J. G.; Martínez-Frías, J.; Martínez, R.; Bonatti, J.

    2014-01-01

    The aim of this work is the comparative study of methods for calculating spectral anomalies from imaging spectrometry in several test areas of the Central Volcanic Range (CVR) of Costa Rica. In the detection of anomalous responses, no prior knowledge of the targets is assumed, so that pixels are automatically separated according to their spectral information, significantly differentiated with respect to a background to be estimated, either globally for the full scene, or locally by i...

  14. An adaptive demodulation approach for bearing fault detection based on adaptive wavelet filtering and spectral subtraction

    Science.gov (United States)

    Zhang, Yan; Tang, Baoping; Liu, Ziran; Chen, Rengxiang

    2016-02-01

    Fault diagnosis of rolling element bearings is important for improving mechanical system reliability and performance. Vibration signals contain a wealth of complex information useful for state monitoring and fault diagnosis. However, any fault-related impulses in the original signal are often severely tainted by various noises and the interfering vibrations caused by other machine elements. Narrow-band amplitude demodulation has been an effective technique to detect bearing faults by identifying bearing fault characteristic frequencies. To achieve this, the key step is to remove the corrupting noise and interference, and to enhance the weak signatures of the bearing fault. In this paper, a new method based on adaptive wavelet filtering and spectral subtraction is proposed for fault diagnosis in bearings. First, to eliminate the frequency associated with interfering vibrations, the vibration signal is bandpass filtered with a Morlet wavelet filter whose parameters (i.e. center frequency and bandwidth) are selected in separate steps. An alternative and efficient method of determining the center frequency is proposed that utilizes the statistical information contained in the production functions (PFs). The bandwidth parameter is optimized using a local ‘greedy’ scheme along with Shannon wavelet entropy criterion. Then, to further reduce the residual in-band noise in the filtered signal, a spectral subtraction procedure is elaborated after wavelet filtering. Instead of resorting to a reference signal as in the majority of papers in the literature, the new method estimates the power spectral density of the in-band noise from the associated PF. The effectiveness of the proposed method is validated using simulated data, test rig data, and vibration data recorded from the transmission system of a helicopter. The experimental results and comparisons with other methods indicate that the proposed method is an effective approach to detecting the fault-related impulses
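
    The narrow-band demodulation step described above can be sketched as follows. As a simplification, the band is selected with a hard FFT mask rather than the paper's optimized Morlet wavelet filter, and no spectral subtraction is applied; the function name is hypothetical.

```python
import numpy as np

def envelope_spectrum(signal, fs, f_lo, f_hi):
    """Band-pass the signal, take its envelope, and return the envelope spectrum.

    A crude stand-in for the paper's adaptive wavelet filtering: the band
    [f_lo, f_hi] is kept with a hard FFT mask, and the envelope is obtained
    from the analytic signal (numpy-only Hilbert transform).
    """
    n = len(signal)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0      # hard band-pass mask
    band = np.fft.irfft(spec, n)
    # analytic signal: zero out negative frequencies in the full FFT
    f = np.fft.fft(band)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    envelope = np.abs(np.fft.ifft(f * h))
    env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
    return freqs, env_spec
```

For a bearing signal, peaks of the envelope spectrum at the fault characteristic frequency (and its harmonics) indicate the fault-related impulses the paper targets.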

  15. An adaptive demodulation approach for bearing fault detection based on adaptive wavelet filtering and spectral subtraction

    International Nuclear Information System (INIS)

    Zhang, Yan; Tang, Baoping; Chen, Rengxiang; Liu, Ziran

    2016-01-01

    Fault diagnosis of rolling element bearings is important for improving mechanical system reliability and performance. Vibration signals contain a wealth of complex information useful for state monitoring and fault diagnosis. However, any fault-related impulses in the original signal are often severely tainted by various noises and the interfering vibrations caused by other machine elements. Narrow-band amplitude demodulation has been an effective technique to detect bearing faults by identifying bearing fault characteristic frequencies. To achieve this, the key step is to remove the corrupting noise and interference, and to enhance the weak signatures of the bearing fault. In this paper, a new method based on adaptive wavelet filtering and spectral subtraction is proposed for fault diagnosis in bearings. First, to eliminate the frequency associated with interfering vibrations, the vibration signal is bandpass filtered with a Morlet wavelet filter whose parameters (i.e. center frequency and bandwidth) are selected in separate steps. An alternative and efficient method of determining the center frequency is proposed that utilizes the statistical information contained in the production functions (PFs). The bandwidth parameter is optimized using a local ‘greedy’ scheme along with Shannon wavelet entropy criterion. Then, to further reduce the residual in-band noise in the filtered signal, a spectral subtraction procedure is elaborated after wavelet filtering. Instead of resorting to a reference signal as in the majority of papers in the literature, the new method estimates the power spectral density of the in-band noise from the associated PF. The effectiveness of the proposed method is validated using simulated data, test rig data, and vibration data recorded from the transmission system of a helicopter. The experimental results and comparisons with other methods indicate that the proposed method is an effective approach to detecting the fault-related impulses

  16. Development of early core anomaly detection system by using in-sodium microphone in JOYO. Fundamental characteristics test of in-sodium microphone in water and examination of improvement of detection accuracy

    International Nuclear Information System (INIS)

    Komai, Masafumi

    2001-07-01

    Fast reactor core anomalies can be detected in near real-time with acoustic sensors. An acoustic detection system senses an in-core anomaly immediately from the fast acoustic signals that propagate through the sodium coolant. One example of a detectable anomaly is sodium boiling due to local blockage in a sub-assembly; the slight change in background acoustic signals can be detected. A key advantage of the acoustic detector is that it can be located outside the core. The location of the anomaly in the core can be determined by correlating multiple acoustic signals. This report describes the testing and fundamental characteristics of a microphone suitable for use in the sodium coolant and examines methods to improve the system's S/N ratio. Testing in water confirmed that the in-sodium microphone has good impulse and wide band frequency responses. These tests used impulse and white noise signals that imitate acoustic signals from boiling sodium. Correlation processing of multiple microphone signals to improve S/N ratio is also described. (author)
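
    The correlation processing of multiple microphone signals mentioned above rests on estimating inter-sensor time delays. A minimal sketch, assuming discrete-time signals and a single dominant acoustic source; the function name is illustrative.

```python
import numpy as np

def delay_samples(sig_a, sig_b):
    """Estimate how many samples sig_b lags sig_a via cross-correlation.

    With multiple in-sodium microphones, such pairwise delays (times the
    sound speed in sodium) constrain the location of an acoustic source,
    e.g. boiling in a blocked sub-assembly.
    """
    corr = np.correlate(sig_b, sig_a, mode='full')
    # index len(sig_a)-1 of the full correlation corresponds to zero lag
    return int(np.argmax(corr)) - (len(sig_a) - 1)
```

Averaging the correlation over many windows is what improves the S/N ratio: uncorrelated background noise cancels while the source contribution accumulates.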

  17. Large-Scale Topic Detection and Language Model Adaptation

    National Research Council Canada - National Science Library

    Seymore, Kristie

    1997-01-01

    .... We have developed a language model adaptation scheme that takes a piece of text, chooses the most similar topic clusters from a set of over 5000 elemental topics, and uses topic specific language...

  18. A comparison of classical and intelligent methods to detect potential thermal anomalies before the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4)

    Science.gov (United States)

    Akhoondzadeh, M.

    2013-04-01

    In this paper, a number of classical and intelligent methods, including interquartile, autoregressive integrated moving average (ARIMA), artificial neural network (ANN) and support vector machine (SVM), have been proposed to quantify potential thermal anomalies around the time of the 11 August 2012 Varzeghan, Iran, earthquake (Mw = 6.4). The duration of the data set, which is comprised of Aqua-MODIS land surface temperature (LST) night-time snapshot images, is 62 days. In order to quantify variations of LST data obtained from satellite images, the air temperature (AT) data derived from the meteorological station close to the earthquake epicenter have been taken into account. For the models examined here, results indicate the following: (i) ARIMA models, which are the most widely used in the time series community for short-term forecasting, are quickly and easily implemented, and can act efficiently through linear solutions. (ii) A multilayer perceptron (MLP) feed-forward neural network can be a suitable non-parametric method to detect the anomalous changes of a non-linear time series such as variations of LST. (iii) Since SVMs are often used due to their many advantages for classification and regression tasks, it can be shown that, if the difference between the predicted value using the SVM method and the observed value exceeds the pre-defined threshold value, then the observed value could be regarded as an anomaly. (iv) ANN and SVM methods can be powerful tools in modeling complex phenomena such as earthquake precursor time series, where the underlying data generating process may be unknown. There is good agreement in the results obtained from the different methods for quantifying potential anomalies in a given LST time series. This paper indicates that the detection of potential thermal anomalies derives credibility from the overall efficiency and potential of the four integrated methods.
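
    The thresholding rule described in (iii) is the same regardless of which model (ARIMA, MLP or SVM) produces the predictions. A minimal sketch, assuming the threshold is expressed as a multiple k of the residual standard deviation (the abstract does not specify the paper's exact threshold definition):

```python
import numpy as np

def residual_anomalies(observed, predicted, k=2.0):
    """Flag observations whose prediction residual exceeds k standard deviations.

    'predicted' may come from any fitted model; the anomaly rule is just a
    z-score test on the residual series.
    """
    resid = np.asarray(observed, float) - np.asarray(predicted, float)
    thresh = k * resid.std()
    return np.abs(resid - resid.mean()) > thresh
```

A pre-seismic thermal anomaly would then show up as one or more flagged days in the LST residual series.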

  19. Focal skin defect, limb anomalies and microphthalmia.

    NARCIS (Netherlands)

    Jackson, K.E.; Andersson, H.C.

    2004-01-01

    We describe two unrelated female patients with congenital single focal skin defects, unilateral microphthalmia and limb anomalies. Growth and psychomotor development were normal and no brain malformation was detected. Although eye and limb anomalies are commonly associated, clinical anophthalmia and

  20. Potential tank waste material anomalies located near the liquid observation wells: Model predicted responses of a neutron moisture detection system

    International Nuclear Information System (INIS)

    Finfrock, S.H.; Toffer, H.; Watson, W.T.

    1994-09-01

    Extensive analyses have been completed to demonstrate that a neutron moisture probe can be used to recognize anomalies in materials and geometry surrounding the liquid observation wells (LOWs). Furthermore, techniques can be developed that will permit the interpretation of detector readings, perturbed by the presence of anomalies, as more accurate moisture concentrations. This analysis effort extends the usefulness of a neutron moisture probe system significantly, especially in the complicated geometries and material conditions that may be encountered in the waste tanks. Both static-source and pulsed-source neutron probes were considered in the analyses. Four different detector configurations were investigated: Thermal and epithermal neutron detectors located in both the near and far field

  1. Enzyme leaching of surficial geochemical samples for detecting hydromorphic trace-element anomalies associated with precious-metal mineralized bedrock buried beneath glacial overburden in northern Minnesota

    Science.gov (United States)

    Clark, Robert J.; Meier, A.L.; Riddle, G.; ,

    1990-01-01

    One objective of the International Falls and Roseau, Minnesota, CUSMAP projects was to develop a means of conducting regional-scale geochemical surveys in areas where bedrock is buried beneath complex glacially derived overburden. Partial analysis of B-horizon soils offered hope for detecting subtle hydromorphic trace-element dispersion patterns. An enzyme-based partial leach selectively removes metals from oxide coatings on the surfaces of soil materials without attacking their matrix. Most trace-element concentrations in the resulting solutions are in the part-per-trillion to low part-per-billion range, necessitating determinations by inductively coupled plasma/mass spectrometry. The resulting data show greater contrasts for many trace elements than other techniques tested. Spatially, many trace metal anomalies are locally discontinuous, but anomalous trends within larger areas are apparent. In many instances, the source for an anomaly seems to be either basal till or bedrock. Ground water flow is probably the most important mechanism for transporting metals toward the surface, although ionic diffusion, electrochemical gradients, and capillary action may play a role in anomaly dispersal. Sample sites near the Rainy Lake-Seine River fault zone, a regional shear zone, often have anomalous concentrations of a variety of metals, commonly including Zn and/or one or more metals which substitute for Zn in sphalerite (Cd, Ge, Ga, and Sn). Shifts in background concentrations of Bi, Sb, and As show a trend across the area, indicating a possible regional zoning of lode-Au mineralization. Soil anomalies of Ag, Co, and Tl parallel basement structures, suggesting areas that may have potential for Cobalt/Thunder Bay-type silver veins. An area around Baudette, Minnesota, which is underlain by quartz-chlorite-carbonate-altered shear zones, is anomalous in Ag, As, Bi, Co, Mo, Te, Tl, and W. Anomalies of Ag, As, Bi, Te, and W tend to follow the fault zones, suggesting potential

  2. Time series analysis of precipitation and vegetation to detect food production anomalies in the Horn of Africa. The case of Lower Shabelle (Somalia

    Directory of Open Access Journals (Sweden)

    M. A. Belenguer-Plomer

    2016-12-01

    Full Text Available The Horn of Africa is one of the most food-insecure regions in the world, due to the continuous growth of its population and the practice of subsistence agriculture. As a result, much of the population cannot meet the minimum nutritional needs for a healthy life. Moreover, this food vulnerability may be seriously aggravated in the coming years by the effects of climate change. The aim of this work is to combine the information about the state of the vegetation offered by the NDVI with rainfall data to detect negative anomalies in food production. This work used the monthly NDVI product MOD13A3 of MODIS and the rainfall estimation product TAMSAT, both for the period 2001-2015. With these products we calculated the average over the entire selected time period and identified the years whose NDVI values departed furthest from the average: 2010, 2011 and 2014. Once the years with the largest NDVI anomalies were detected, a month-by-month analysis of those years was carried out, in which we analysed the relationship between monthly NDVI and monthly rainfall, obtaining a direct relationship between the two values. A crop calendar was also used to focus the analysis on the months of agricultural production, finding that the main cause of the vegetation anomalies is a decrease in recorded rainfall during the months of agricultural production. This explains the origin of the food shortages in 2010 and 2011 that generated an enormous humanitarian crisis in this area.
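
    The comparison of each year against the multi-year average described above can be sketched as a per-month z-score test. This is a generic illustration, not the authors' exact procedure; the array layout and the threshold k are assumptions.

```python
import numpy as np

def monthly_ndvi_anomalies(ndvi, k=1.5):
    """Flag negative NDVI anomalies in a (years, 12) array of monthly means.

    Each month is compared against its own multi-year mean and standard
    deviation (a simple monthly climatology); entries more than k standard
    deviations below normal are flagged as vegetation deficits.
    """
    ndvi = np.asarray(ndvi, float)
    clim_mean = ndvi.mean(axis=0)
    clim_std = ndvi.std(axis=0)
    z = (ndvi - clim_mean) / np.where(clim_std == 0, 1.0, clim_std)
    return z < -k          # negative anomalies only: vegetation below normal
```

Restricting the flagged months to those in the crop calendar would then isolate anomalies relevant to food production, as the study does.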

  3. DOWN'S ANOMALY.

    Science.gov (United States)

    PENROSE, L.S.; SMITH, G.F.

    Both clinical and pathological aspects and mathematical elaborations of Down's anomaly, known also as mongolism, are presented in this reference manual for professional personnel. Information provided concerns (1) historical studies, (2) physical signs, (3) bones and muscles, (4) mental development, (5) dermatoglyphs, (6) hematology, (7)…

  4. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from endogenous taxon reference genes, from GMO constructs, screening targets, construct-specific and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  5. Detecting content adaptive scaling of images for forensic applications

    Science.gov (United States)

    Fillion, Claude; Sharma, Gaurav

    2010-01-01

    Content-aware resizing methods have recently been developed, among which, seam-carving has achieved the most widespread use. Seam-carving's versatility enables deliberate object removal and benign image resizing, in which perceptually important content is preserved. Both types of modifications compromise the utility and validity of the modified images as evidence in legal and journalistic applications. It is therefore desirable that image forensic techniques detect the presence of seam-carving. In this paper we address detection of seam-carving for forensic purposes. As in other forensic applications, we pose the problem of seam-carving detection as the problem of classifying a test image in either of two classes: a) seam-carved or b) non-seam-carved. We adopt a pattern recognition approach in which a set of features is extracted from the test image and then a Support Vector Machine based classifier, trained over a set of images, is utilized to estimate which of the two classes the test image lies in. Based on our study of the seam-carving algorithm, we propose a set of intuitively motivated features for the detection of seam-carving. Our methodology for detection of seam-carving is then evaluated over a test database of images. We demonstrate that the proposed method provides the capability for detecting seam-carving with high accuracy. For images which have been reduced 30% by benign seam-carving, our method provides a classification accuracy of 91%.

  6. Adapting Local Features for Face Detection in Thermal Image

    Directory of Open Access Journals (Sweden)

    Chao Ma

    2017-11-01

    Full Text Available A thermal camera captures the temperature distribution of a scene as a thermal image. In thermal images, the facial appearances of different people under different lighting conditions are similar, because facial temperature distribution is generally constant and not affected by lighting conditions. This similarity in face appearance is advantageous for face detection. To detect faces in thermal images, cascade classifiers with Haar-like features are generally used; however, there are few studies exploring local features for face detection in thermal images. In this paper, we introduce two approaches relying on local features for face detection in thermal images. First, we create new feature types by extending Multi-Block LBP. We consider a margin around the reference and the generally constant distribution of facial temperature. In this way, we make the features more robust to image noise and more effective for face detection in thermal images. Second, we propose an AdaBoost-based training method to obtain cascade classifiers with multiple types of local features. These feature types have different advantages, and combining them enhances the descriptive power of the local features. We conducted a hold-out validation experiment and a field experiment. In the hold-out validation experiment, we captured a dataset from 20 participants (14 males and 6 females). For each participant, we captured 420 images with 10 variations in camera distance, 21 poses, and 2 appearances (with/without glasses). We compared the performance of cascade classifiers trained with different sets of features. The results showed that the proposed approaches effectively improve the performance of face detection in thermal images. In the field experiment, we compared face detection performance in realistic scenes using thermal and RGB images, and discussed the results.
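
    The plain Multi-Block LBP that the paper extends compares mean intensities over a 3x3 grid of blocks. A minimal sketch of the unextended operator (the margin-based extension described above is not reproduced, and the function name is illustrative):

```python
import numpy as np

def mb_lbp_code(img, top, left, block):
    """Multi-Block LBP code for a 3x3 grid of square blocks anchored at (top, left).

    Each of the 8 neighbour blocks contributes one bit: 1 if its mean
    intensity is >= the centre block's mean. This is plain MB-LBP.
    """
    means = np.empty((3, 3))
    for r in range(3):
        for c in range(3):
            y, x = top + r * block, left + c * block
            means[r, c] = img[y:y + block, x:x + block].mean()
    # clockwise neighbour order starting at the top-left block
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (r, c) in enumerate(order):
        if means[r, c] >= means[1, 1]:
            code |= 1 << bit
    return code
```

Because the code depends only on intensity orderings between blocks, it is robust to the global offsets typical of thermal imagery, which is what makes this family of features attractive here.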

  7. Detecting users handedness for ergonomic adaptation of mobile user interfaces

    DEFF Research Database (Denmark)

    Löchtefeld, Markus; Schardt, Phillip; Krüger, Antonio

    2015-01-01

    ) for users with average hand sizes. One solution is to offer adaptive user interfaces for such one-handed interactions. These modes have to be triggered manually and thus induce a critical overhead. They are further designed to bring all content closer, regardless of whether the phone is operated...... with the left or right hand. In this paper, we present an algorithm that allows determining the users' interacting hand from their unlocking behavior. Our algorithm correctly distinguishes one- and two-handed usage as well as left- and right-handed unlocking in 98.51% of all cases. This is achieved through a k...

  8. Adaptive Pseudo Dilation for Gestalt Edge Grouping and Contour Detection

    NARCIS (Netherlands)

    Papari, Giuseppe; Petkov, Nicolai

    2008-01-01

    We consider the problem of detecting object contours in natural images. In many cases, local luminance changes turn out to be stronger in textured areas than on object contours. Therefore, local edge features, which only look at a small neighborhood of each pixel, cannot be reliable indicators of

  9. Fetal renal anomalies: diagnosis, management, and outcome

    NARCIS (Netherlands)

    Damen-Elias, Henrica Antonia Maria

    2004-01-01

    In two to three percent of fetuses structural anomalies can be found with prenatal ultrasound investigation. Anomalies of the urinary tract account for 15 to 20% of these anomalies, with a detection rate of approximately 90%. In Chapters 2, 3 and 4 we present reference curves for size and growth

  10. Adapting Parameterized Motions using Iterative Learning and Online Collision Detection

    DEFF Research Database (Denmark)

    Laursen, Johan Sund; Sørensen, Lars Carøe; Schultz, Ulrik Pagh

    2018-01-01

    utilizing Gaussian Process learning. This allows for motion parameters to be optimized using real world trials which incorporate all uncertainties inherent in the assembly process without requiring advanced robot and sensor setups. The result is a simple and straightforward system which helps the user...... automatically find robust and uncertainty-tolerant motions. We present experiments for an assembly case showing both detection and learning in the real world and how these combine to a robust robot system....

  11. ANOMALY DETECTION AND COMPARATIVE ANALYSIS OF HYDROTHERMAL ALTERATION MATERIALS THROUGH HYPERSPECTRAL MULTISENSOR DATA IN THE TURRIALBA VOLCANO

    Directory of Open Access Journals (Sweden)

    J. G. Rejas

    2012-07-01

    Full Text Available The aim of this work is a comparative study of the presence of hydrothermal alteration materials in the Turrialba volcano (Costa Rica) in relation to spectral anomalies computed from multitemporal and multisensor data acquired in the visible (VIS), short wave infrared (SWIR) and thermal infrared (TIR) spectral ranges. For this purpose we used hyperspectral and multispectral images from the HyMAP and MASTER airborne sensors, and ASTER and Hyperion scenes acquired between 2002 and 2010. Field radiometry was applied in order to remove the atmospheric contribution using an empirical line method. HyMAP and MASTER images were georeferenced directly thanks to positioning and orientation data measured during the acquisition campaign by an inertial system based on GPS/IMU. These two important steps allowed the identification of spectral bands diagnostic of hydrothermal alteration minerals and accurate spatial correlation. The environmental impact of the volcanic activity has been studied through different vegetation indices and soil patterns. Hydrothermal materials have been mapped in the crater of the volcano (currently active) and its surroundings by carrying out a principal component analysis differentiated for high- and low-absorption bands, characterizing accumulations of kaolinite, illite, alunite and kaolinite+smectite and delimiting zones where these minerals are present. Spectral anomalies have been calculated in a comparative study of pixel and subpixel methods focused on thermal bands fused with high-resolution images. Results are presented as an expert-based approach whose main interest lies in the automated identification of patterns of hydrothermally altered materials without prior knowledge of, or with poor information on, the area.

  12. Anomaly Detection and Comparative Analysis of Hydrothermal Alteration Materials Through Hyperspectral Multisensor Data in the Turrialba Volcano

    Science.gov (United States)

    Rejas, J. G.; Martínez-Frías, J.; Bonatti, J.; Martínez, R.; Marchamalo, M.

    2012-07-01

    The aim of this work is a comparative study of the presence of hydrothermal alteration materials in the Turrialba volcano (Costa Rica) in relation to spectral anomalies computed from multitemporal and multisensor data acquired in the visible (VIS), short wave infrared (SWIR) and thermal infrared (TIR) spectral ranges. For this purpose we used hyperspectral and multispectral images from the HyMAP and MASTER airborne sensors, and ASTER and Hyperion scenes acquired between 2002 and 2010. Field radiometry was applied in order to remove the atmospheric contribution using an empirical line method. HyMAP and MASTER images were georeferenced directly thanks to positioning and orientation data measured during the acquisition campaign by an inertial system based on GPS/IMU. These two important steps allowed the identification of spectral bands diagnostic of hydrothermal alteration minerals and accurate spatial correlation. The environmental impact of the volcanic activity has been studied through different vegetation indices and soil patterns. Hydrothermal materials have been mapped in the crater of the volcano (currently active) and its surroundings by carrying out a principal component analysis differentiated for high- and low-absorption bands, characterizing accumulations of kaolinite, illite, alunite and kaolinite+smectite and delimiting zones where these minerals are present. Spectral anomalies have been calculated in a comparative study of pixel and subpixel methods focused on thermal bands fused with high-resolution images. Results are presented as an expert-based approach whose main interest lies in the automated identification of patterns of hydrothermally altered materials without prior knowledge of, or with poor information on, the area.

  13. Comparative study of adaptive-noise-cancellation algorithms for intrusion detection systems

    International Nuclear Information System (INIS)

    Claassen, J.P.; Patterson, M.M.

    1981-01-01

    Some intrusion detection systems are susceptible to nonstationary noise, resulting in frequent nuisance alarms and poor detection when the noise is present. Adaptive inverse filtering for single-channel systems and adaptive noise cancellation for two-channel systems have both demonstrated good potential in removing correlated noise components prior to detection. For such noise-susceptible systems, the suitability of a noise reduction algorithm must be established in a trade-off study weighing algorithm complexity against performance. The performance characteristics of several distinct classes of algorithms are established through comparative computer studies using real signals. The relative merits of the different algorithms are discussed in light of the nature of intruder and noise signals.
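
    A minimal sketch of the two-channel adaptive noise cancellation compared in this study, using the classic LMS update (tap count, step size, and the synthetic noise path are illustrative choices, not those of the study):

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=4, mu=0.005):
    """Two-channel adaptive noise cancellation with the LMS algorithm.

    The reference channel observes noise correlated with the noise in the
    primary channel; the adaptive FIR filter estimates that component and
    the error signal is the cleaned primary channel."""
    w = np.zeros(n_taps)
    out = np.zeros(len(primary))
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # current and past samples
        e = primary[n] - w @ x                     # error = cleaned sample
        w += 2 * mu * e * x                        # LMS weight update
        out[n] = e
    return out
```

    Because the intruder signal is uncorrelated with the reference noise, the filter converges to the noise path and the error retains the signal of interest.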

  14. Adapting detection sensitivity based on evidence of irregular sinus arrhythmia to improve atrial fibrillation detection in insertable cardiac monitors.

    Science.gov (United States)

    Pürerfellner, Helmut; Sanders, Prashanthan; Sarkar, Shantanu; Reisfeld, Erin; Reiland, Jerry; Koehler, Jodi; Pokushalov, Evgeny; Urban, Luboš; Dekker, Lukas R C

    2017-10-03

    Intermittent change in P-wave discernibility during periods of ectopy and sinus arrhythmia is a cause of inappropriate atrial fibrillation (AF) detection in insertable cardiac monitors (ICM). To address this, we developed and validated an enhanced AF detection algorithm. Atrial fibrillation detection in the Reveal LINQ ICM uses patterns of incoherence in RR intervals and absence of P-wave evidence over a 2-min period. The enhanced algorithm includes P-wave evidence during RR irregularity as evidence of sinus arrhythmia or ectopy to adaptively optimize sensitivity for AF detection. The algorithm was developed and validated using Holter data from the XPECT and LINQ Usability studies, which collected surface electrocardiogram (ECG) and continuous ICM ECG over a 24-48 h period. The algorithm detections were compared with Holter annotations, performed by multiple reviewers, to compute episode and duration detection performance. The validation dataset comprised 3187 h of valid Holter and LINQ recordings from 138 patients, with true AF in 37 patients yielding 108 true AF episodes ≥2-min and 449 h of AF. The enhanced algorithm reduced inappropriately detected episodes by 49% and duration by 66%; adapting sensitivity for AF detection thus reduced inappropriately detected episodes and duration with minimal reduction in sensitivity. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Cardiology
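
    The detection principle can be caricatured in a few lines: RR-interval incoherence drives the AF call, and P-wave evidence during irregularity vetoes it as sinus arrhythmia or ectopy. All names and thresholds below are hypothetical illustrations, not device parameters:

```python
def classify_window(rr_ms, p_wave_frac, irregular_ms=100.0,
                    af_score=0.6, p_wave_veto=0.5):
    """Classify one 2-min window of RR intervals (toy version).

    `rr_ms` are successive RR intervals in milliseconds; `p_wave_frac` is
    the fraction of irregular beats with a discernible P wave. Incoherence
    is the fraction of successive-interval changes above `irregular_ms`;
    strong P-wave evidence during irregularity vetoes the AF call."""
    diffs = [abs(b - a) for a, b in zip(rr_ms, rr_ms[1:])]
    incoherence = sum(d > irregular_ms for d in diffs) / len(diffs)
    if incoherence < af_score:
        return "sinus"
    return "sinus arrhythmia" if p_wave_frac >= p_wave_veto else "AF"
```

    The veto branch is what reduces inappropriate detections during ectopy without lowering sensitivity to genuinely irregular, P-wave-free rhythms.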

  15. An adaptive prediction and detection algorithm for multistream syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Magruder Steve F

    2005-10-01

    Full Text Available Abstract Background Surveillance of Over-the-Counter (OTC) pharmaceutical sales as a potential early indicator of developing public health conditions, in particular in cases of interest to biosurveillance, has been suggested in the literature. This paper is a continuation of a previous study in which we formulated the problem of estimating clinical data from OTC sales in terms of optimal LMS linear and Finite Impulse Response (FIR) filters. In this paper we extend our results to predict clinical data multiple steps ahead using OTC sales as well as the clinical data itself. Methods The OTC data are grouped into a few categories and we predict the clinical data using a multichannel filter that encompasses all the past OTC categories as well as the past clinical data itself. The prediction is performed using FIR filters and the recursive least squares method in order to adapt rapidly to nonstationary behaviour. In addition, we inject simulated events into both clinical and OTC data streams to evaluate the predictions by computing the Receiver Operating Characteristic curves of a threshold detector based on the predicted outputs. Results We present prediction results showing the effectiveness of the combined filtering operation. In addition, we compute and present the performance of a detector using the prediction output. Conclusion Multichannel adaptive FIR least squares filtering provides a viable method of predicting public health conditions, as represented by clinical data, from OTC sales and/or the clinical data itself. The potential value to a biosurveillance system cannot, however, be determined without studying this approach in the presence of transient events (nonstationary events of relatively short duration and fast rise times). Our simulated events superimposed on actual OTC and clinical data allow us to provide an upper bound on that potential value under some restricted conditions.
Based on our ROC curves we argue that a
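
    The multichannel FIR filter structure described in the Methods can be sketched with a batch least-squares fit (the paper adapts the weights recursively with RLS to track nonstationarity; channel counts and lag depths here are illustrative):

```python
import numpy as np

def fir_predict(clinical, otc_channels, taps=3, horizon=1):
    """Fit a multichannel FIR predictor by batch least squares:
    clinical[t + horizon] ~ weighted sum of the last `taps` samples of
    the clinical series and of every OTC category series."""
    channels = [np.asarray(clinical)] + [np.asarray(c) for c in otc_channels]
    rows, targets = [], []
    for t in range(taps - 1, len(clinical) - horizon):
        rows.append(np.concatenate([c[t - taps + 1:t + 1] for c in channels]))
        targets.append(clinical[t + horizon])
    X, y = np.array(rows), np.array(targets)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ w, w
```

    Replacing the batch solve with a recursive least squares update would give the rapid adaptation to nonstationary behaviour that the abstract emphasizes.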

  16. Improved pulsed photoacoustic detection by means of an adapted filter

    Science.gov (United States)

    González, M.; Santiago, G.; Peuriot, A.; Slezak, V.; Mosquera, C.

    2005-06-01

    We present a numerical and experimental study of two adapted filters devised for the quantitative analysis of weak photoacoustic signals. The first is a simple convolution-type filter and the other is based on neural networks of the multilayer perceptron type. The theoretical signal used as one of the inputs to both filters is derived from the solution of the transient response of the acoustic cell, modeled with a simple transmission-line analogue. The filters were tested numerically by using the theoretical signal corrupted with white noise. After 500 iterations it was possible to define an average error for the returned value of each filter. Since the neural network outperformed the convolution-type filter, we assessed its performance by measuring SF6 traces diluted in N2 and excited by a tuned TEA CO2 laser. The results show that the neural network filter allows recovery of a signal with poor signal-to-noise ratio without resorting to extensive averaging, thus reducing the acquisition time while improving the precision of the measurement.
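
    The convolution-type adapted filter amounts to projecting the measured trace onto the theoretical cell response, which is the least-squares amplitude estimate when the noise is white; the damped-sine template below is an arbitrary stand-in for the transmission-line model's output:

```python
import numpy as np

def matched_amplitude(measured, template):
    """Estimate the amplitude of a known waveform in a noisy trace by
    projecting onto the theoretical template (least-squares optimal for
    additive white noise)."""
    template = np.asarray(template, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(measured @ template) / float(template @ template)
```

    The estimate's variance falls with the template's energy, which is why a matched projection can replace extensive averaging for weak signals.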

  17. COMPARISON OF BACKGROUND SUBTRACTION, SOBEL, ADAPTIVE MOTION DETECTION, FRAME DIFFERENCES, AND ACCUMULATIVE DIFFERENCES IMAGES ON MOTION DETECTION

    Directory of Open Access Journals (Sweden)

    Dara Incam Ramadhan

    2018-02-01

    Full Text Available Nowadays, digital image processing is used not only to recognize motionless objects but also to recognize moving objects in video. One application of moving-object recognition in video is motion detection, which can be used in security cameras. Various motion detection methods have been developed, so in this research we compared several of them, namely Background Subtraction, Adaptive Motion Detection, Sobel, Frame Differences and Accumulative Differences Images (ADI). Each method has a different level of accuracy. The Background Subtraction method obtained 86.1% accuracy indoors and 88.3% outdoors. In the Sobel method, the result of motion detection depends on the lighting conditions of the room being supervised: when the room is bright the accuracy of the system decreases, and when the room is dark the accuracy increases, reaching 80%. In the Adaptive Motion Detection method, motion can be detected provided that no easily moved object is in the camera's field of view. In the Frame Differences method, testing on RGB images using average computation with a threshold of 35 gives the best result. In the ADI method, motion detection accuracy reached 95.12%.
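
    The frame-differences method with the threshold of 35 reported above can be sketched as follows (the minimum moving-pixel fraction is an illustrative choice, not a value from the study):

```python
import numpy as np

def detect_motion(prev_frame, frame, threshold=35, min_fraction=0.01):
    """Frame-difference motion detection: mark pixels whose absolute
    grey-level change exceeds `threshold`, and declare motion when the
    moving fraction of the frame reaches `min_fraction`."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > threshold
    return bool(moving.mean() >= min_fraction), moving
```

    Casting to a signed type before subtracting avoids the wrap-around that unsigned 8-bit pixel arithmetic would otherwise produce.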

  18. Brain shaving: adaptive detection for brain PET data

    International Nuclear Information System (INIS)

    Grecchi, Elisabetta; Doyle, Orla M; Turkheimer, Federico E; Bertoldo, Alessandra; Pavese, Nicola

    2014-01-01

    The intricacy of brain biology is such that the variation of imaging end-points in health and disease exhibits an unpredictable range of spatial distributions from the extremely localized to the very diffuse. This represents a challenge for the two standard approaches to analysis, the mass univariate and the multivariate that exhibit either strong specificity but not as good sensitivity (the former) or poor specificity and comparatively better sensitivity (the latter). In this work, we develop an analytical methodology for positron emission tomography that operates an extraction (‘shaving’) of coherent patterns of signal variation while maintaining control of the type I error. The methodology operates two rotations on the image data, one local using the wavelet transform and one global using the singular value decomposition. The control of specificity is obtained by using the gap statistic that selects, within each eigenvector, a subset of significantly coherent elements. Face-validity of the algorithm is demonstrated using a paradigmatic data-set with two radiotracers, [11C]-raclopride and [11C]-(R)-PK11195, measured on the same Huntington's disease patients, a disorder with a genetic based diagnosis. The algorithm is able to detect the two well-known separate but connected processes of dopamine neuronal loss (localized in the basal ganglia) and neuroinflammation (diffusive around the whole brain). These processes are at the two extremes of the distributional envelope, one being very sparse and the latter being perfectly Gaussian and they are not adequately detected by the univariate and the multivariate approaches. (paper)
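
    The global rotation-and-selection step can be loosely sketched as follows; the largest-gap rule here is a crude stand-in for the gap statistic actually used, and all of it is an illustration rather than the authors' implementation:

```python
import numpy as np

def shave_component(X):
    """Rotate the data with an SVD and keep, within the leading right
    singular vector, only the coherent elements - selected here by the
    largest gap in the sorted absolute loadings (a crude stand-in for
    the gap statistic). Returns a boolean mask over the columns of X."""
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    loadings = np.abs(Vt[0])
    order = np.argsort(loadings)[::-1]      # strongest loadings first
    gaps = loadings[order][:-1] - loadings[order][1:]
    cut = int(np.argmax(gaps)) + 1          # keep elements before the gap
    keep = np.zeros(X.shape[1], dtype=bool)
    keep[order[:cut]] = True
    return keep
```

    Selecting only the coherent subset of each eigenvector is what lets a multivariate rotation keep univariate-style control over false positives.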

  19. Brain shaving: adaptive detection for brain PET data

    Science.gov (United States)

    Grecchi, Elisabetta; Doyle, Orla M.; Bertoldo, Alessandra; Pavese, Nicola; Turkheimer, Federico E.

    2014-05-01

    The intricacy of brain biology is such that the variation of imaging end-points in health and disease exhibits an unpredictable range of spatial distributions from the extremely localized to the very diffuse. This represents a challenge for the two standard approaches to analysis, the mass univariate and the multivariate that exhibit either strong specificity but not as good sensitivity (the former) or poor specificity and comparatively better sensitivity (the latter). In this work, we develop an analytical methodology for positron emission tomography that operates an extraction (‘shaving’) of coherent patterns of signal variation while maintaining control of the type I error. The methodology operates two rotations on the image data, one local using the wavelet transform and one global using the singular value decomposition. The control of specificity is obtained by using the gap statistic that selects, within each eigenvector, a subset of significantly coherent elements. Face-validity of the algorithm is demonstrated using a paradigmatic data-set with two radiotracers, [11C]-raclopride and [11C]-(R)-PK11195, measured on the same Huntington's disease patients, a disorder with a genetic based diagnosis. The algorithm is able to detect the two well-known separate but connected processes of dopamine neuronal loss (localized in the basal ganglia) and neuroinflammation (diffusive around the whole brain). These processes are at the two extremes of the distributional envelope, one being very sparse and the latter being perfectly Gaussian and they are not adequately detected by the univariate and the multivariate approaches.

  20. Research Progress of Space-Time Adaptive Detection for Airborne Radar

    Directory of Open Access Journals (Sweden)

    Wang Yong-liang

    2014-04-01

    Full Text Available Compared with Space-Time Adaptive Processing (STAP), Space-Time Adaptive Detection (STAD) employs the data in the cell under test together with the training data to form a reasonable detection statistic and consequently decide whether the target exists or not. STAD has a concise processing procedure and a flexible design. Furthermore, the detection statistic usually possesses the Constant False Alarm Rate (CFAR) property, and hence needs no additional CFAR processing. More importantly, STAD usually exhibits better detection performance than conventional processing, which first suppresses the clutter and then applies a separate detection strategy. In this paper, we first summarize the key strengths of STAD, then classify STAD methods, and finally outline some future research directions.
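
    A classic statistic in this family, given here as a hedged illustration rather than any specific method from the survey, is the Adaptive Matched Filter: the clutter covariance is estimated from target-free training snapshots and the cell under test is matched against the space-time steering vector:

```python
import numpy as np

def amf_statistic(x, steering, training):
    """Adaptive Matched Filter statistic: estimate the clutter covariance
    from the training snapshots (columns of `training`), whiten, and
    match the cell under test `x` against the steering vector. Compared
    to a fixed threshold, the test is CFAR w.r.t. the clutter level."""
    R = training @ training.conj().T / training.shape[1]  # sample covariance
    Rinv = np.linalg.inv(R)
    num = np.abs(steering.conj() @ Rinv @ x) ** 2
    den = np.real(steering.conj() @ Rinv @ steering)
    return float(num / den)
```

    The normalization by the whitened steering-vector energy is what folds clutter suppression and detection into a single statistic, rather than two sequential stages.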

  1. Adaptation to the Birth of a Child with a Congenital Anomaly: A Prospective Longitudinal Study of Maternal Well-Being and Psychological Distress

    Science.gov (United States)

    Nes, Ragnhild B.; Røysamb, Espen; Hauge, Lars J.; Kornstad, Tom; Landolt, Markus A.; Irgens, Lorentz M.; Eskedal, Leif; Kristensen, Petter; Vollrath, Margarete E.

    2014-01-01

    This study explores the stability and change in maternal life satisfaction and psychological distress following the birth of a child with a congenital anomaly using 5 assessments from the Norwegian Mother and Child Cohort Study collected from Pregnancy Week 17 to 36 months postpartum. Participating mothers were divided into those having infants…

  2. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making benefits from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time-series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved, because decision making can benefit not only from known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The proposed approach is then applied to an illustrative monitored system based on pyroprocessing and the results are discussed.
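
    The hybrid idea of combining a time-driven check with an event-driven check can be caricatured in a few lines; the state names, transition table, and residual limit are all hypothetical illustrations, not elements of the paper's framework:

```python
def system_state(residual, events, allowed, residual_limit=3.0):
    """Combine a time-driven test (standardized residual of the measured
    signals within limits) with an event-driven test (every observed
    event transition must be allowed) into one system-centric verdict."""
    time_ok = abs(residual) <= residual_limit
    event_ok = all(b in allowed.get(a, ()) for a, b in zip(events, events[1:]))
    if time_ok and event_ok:
        return "normal"
    return "anomaly: " + ("sequence" if time_ok else "signal")
```

    Even this toy shows the benefit the abstract describes: an operation whose signals look normal can still be flagged because its event sequence is impossible.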

  3. Deep Exemplar 2D-3D Detection by Adapting from Real to Rendered Views

    OpenAIRE

    Massa, Francisco; Russell, Bryan; Aubry, Mathieu

    2015-01-01

    This paper presents an end-to-end convolutional neural network (CNN) for 2D-3D exemplar detection. We demonstrate that the ability to adapt the features of natural images to better align with those of CAD rendered views is critical to the success of our technique. We show that the adaptation can be learned by compositing rendered views of textured object models on natural images. Our approach can be naturally incorporated into a CNN detection pipeline and extends the accuracy and speed benefi...

  4. Dyonic anomalies

    International Nuclear Information System (INIS)

    Henningson, Mans; Johansson, Erik P.G.

    2005-01-01

    We consider the problem of coupling a dyonic p-brane in d=2p+4 space-time dimensions to a prescribed (p+2)-form field strength. This is particularly subtle when p is odd. For the case p=1, we explicitly construct a coupling functional, which is a sum of two terms: one which is linear in the prescribed field strength, and one which describes the coupling of the brane to its self-field and takes the form of a Wess-Zumino term depending only on the embedding of the brane world-volume into space-time. We then show that this functional is well-defined only modulo a certain anomaly, related to the Euler class of the normal bundle of the brane world-volume

  5. Ship detection for high resolution optical imagery with adaptive target filter

    Science.gov (United States)

    Ju, Hongbin

    2015-10-01

    Ship detection is important due to both its civil and military use. In this paper, we propose a novel ship detection method, the Adaptive Target Filter (ATF), for high resolution optical imagery. The proposed framework can be grouped into two stages: in the first stage, a test image is densely divided into different detection windows and each window is transformed to a feature vector in its feature space. The Histogram of Oriented Gradients (HOG) is accumulated as a basic feature descriptor. In the second stage, the proposed ATF highlights all the ship regions and suppresses the undesired backgrounds adaptively. Each detection window is assigned a score, which represents the degree to which the window belongs to a certain ship category. The ATF can be adaptively obtained by weighted Logistic Regression (WLR) according to the distribution of backgrounds and targets in the input image. The main innovation of our method is that we only need to collect positive training samples to build the filter, while the negative training samples are adaptively generated from the input image. This differs from other classification methods such as the Support Vector Machine (SVM) and Logistic Regression (LR), which need both positive and negative training samples. The experimental results on 1-m high resolution optical images show that the proposed method achieves the desired ship detection performance with higher quality and robustness than other methods, e.g., SVM and LR.
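
    The adaptive-negative idea can be sketched with a plain weighted logistic regression; the down-weighting of image-derived negatives and all hyperparameters below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def train_wlr(pos_feats, image_feats, neg_weight=0.5, lr=0.1, iters=500):
    """Weighted logistic regression in the spirit of the ATF: positives
    are ship feature vectors collected offline, negatives are windows
    drawn from the input image itself, down-weighted because a few of
    them may actually contain ships."""
    X = np.vstack([pos_feats, image_feats])
    y = np.concatenate([np.ones(len(pos_feats)), np.zeros(len(image_feats))])
    sw = np.concatenate([np.ones(len(pos_feats)),
                         np.full(len(image_feats), neg_weight)])
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = sw * (p - y)                      # per-sample weighted gradient
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b
```

    Because the vast majority of densely sampled windows are background, the image's own windows serve as noisy negatives, and the sample weights absorb the contamination from the few windows that do contain ships.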

  6. Use of Taxi-Trip Data in Analysis of Demand Patterns for Detection and Explanation of Anomalies

    DEFF Research Database (Denmark)

    Markou, Ioulia; Rodrigues, Filipe; Pereira, Francisco Camara

    2017-01-01

    Because of environmental and economic stress, current strong investment in adaptive transport systems can efficiently use capacity, minimizing costs and environmental impacts. The common vision is of a system that dynamically changes itself (the supply) to anticipate the needs of travelers (the...... demand). On some occasions, unexpected and unwanted demand patterns are noticed in the traffic network; these patterns lead to system failures and cost implications. Significantly low speeds or excessively low flows at an unforeseeable time are only some of the phenomena that are often noticed and need...

  7. Stochastic adaptation and fold-change detection: from single-cell to population behavior

    Directory of Open Access Journals (Sweden)

    Leier André

    2011-02-01

    Full Text Available Abstract Background In cell signaling terminology, adaptation refers to a system's capability of returning to its equilibrium upon a transient response. To achieve this, a network has to be both sensitive and precise. Namely, the system must display a significant output response upon stimulation, and later on return to pre-stimulation levels. If the system settles at the exact same equilibrium, adaptation is said to be 'perfect'. Examples of adaptation mechanisms include temperature regulation, calcium regulation and bacterial chemotaxis. Results We present models of the simplest adaptation architecture, a two-state protein system, in a stochastic setting. Furthermore, we consider differences between individual and collective adaptive behavior, and show how our system displays fold-change detection properties. Our analysis and simulations highlight why adaptation needs to be understood in terms of probability, and not in strict numbers of molecules. Most importantly, selection of appropriate parameters in this simple linear setting may yield populations of cells displaying adaptation, while single cells do not. Conclusions Single cell behavior cannot be inferred from population measurements and, sometimes, collective behavior cannot be determined from the individuals. As a consequence, adaptation can often be considered a purely emergent property of the collective system. This is a clear example where biological ergodicity cannot be assumed, just as is also the case when cell replication rates are not homogeneous, or depend on the cell state. Our analysis shows, for the first time, how ergodicity cannot be taken for granted in simple linear examples either. The latter holds even when cells are considered isolated and devoid of replication capabilities (cell-cycle arrested).
We also show how a simple linear adaptation scheme displays fold-change detection properties, and how rupture of ergodicity prevails in scenarios where transitions between

  8. Contextual anomaly detection for cyber-physical security in Smart Grids based on an artificial neural network model

    DEFF Research Database (Denmark)

    Kosek, Anna Magdalena

    2016-01-01

    control. An intrusion detection system observes distributed energy resource’s behaviour, control actions and the power system impact, and is tested together with an ongoing voltage control attack in a co-simulation set-up. The simulation results obtained with a real photovoltaic rooftop power plant data...

  9. A Motion-Adaptive Deinterlacer via Hybrid Motion Detection and Edge-Pattern Recognition

    Directory of Open Access Journals (Sweden)

    He-Yuan Lin

    2008-03-01

    Full Text Available A novel motion-adaptive deinterlacing algorithm with edge-pattern recognition and hybrid motion detection is introduced. The great variety of video contents makes the processing of assorted motion, edges, textures, and the combination of them very difficult with a single algorithm. The edge-pattern recognition algorithm introduced in this paper exhibits the flexibility in processing both textures and edges which need to be separately accomplished by line average and edge-based line average before. Moreover, predicting the neighboring pixels for pattern analysis and interpolation further enhances the adaptability of the edge-pattern recognition unit when motion detection is incorporated. Our hybrid motion detection features accurate detection of fast and slow motion in interlaced video and also the motion with edges. Using only three fields for detection also renders higher temporal correlation for interpolation. The better performance of our deinterlacing algorithm with higher content-adaptability and less memory cost than the state-of-the-art 4-field motion detection algorithms can be seen from the subjective and objective experimental results of the CIF and PAL video sequences.
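
    The edge-based line average mentioned above, one of the two interpolators that the edge-pattern recognition unit subsumes, can be sketched for a single missing line (the three-direction candidate set is the classic ELA formulation; the border fallback is an illustrative choice):

```python
import numpy as np

def ela_interpolate(above, below):
    """Edge-based Line Average: reconstruct a missing interlaced line by
    averaging along the direction (left-diagonal, vertical, or
    right-diagonal) with the smallest absolute difference between the
    lines above and below. Border pixels fall back to the vertical mean."""
    n = len(above)
    out = np.empty(n)
    for i in range(n):
        cands = [(abs(above[i] - below[i]), (above[i] + below[i]) / 2.0)]
        if 0 < i < n - 1:
            cands.append((abs(above[i - 1] - below[i + 1]),
                          (above[i - 1] + below[i + 1]) / 2.0))
            cands.append((abs(above[i + 1] - below[i - 1]),
                          (above[i + 1] + below[i - 1]) / 2.0))
        out[i] = min(cands)[1]   # smallest difference wins
    return out
```

    Following the direction of least difference preserves diagonal edges that a plain vertical line average would blur.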

  10. A Motion-Adaptive Deinterlacer via Hybrid Motion Detection and Edge-Pattern Recognition

    Directory of Open Access Journals (Sweden)

    Li Hsin-Te

    2008-01-01

    Full Text Available Abstract A novel motion-adaptive deinterlacing algorithm with edge-pattern recognition and hybrid motion detection is introduced. The great variety of video contents makes the processing of assorted motion, edges, textures, and the combination of them very difficult with a single algorithm. The edge-pattern recognition algorithm introduced in this paper exhibits the flexibility in processing both textures and edges which need to be separately accomplished by line average and edge-based line average before. Moreover, predicting the neighboring pixels for pattern analysis and interpolation further enhances the adaptability of the edge-pattern recognition unit when motion detection is incorporated. Our hybrid motion detection features accurate detection of fast and slow motion in interlaced video and also the motion with edges. Using only three fields for detection also renders higher temporal correlation for interpolation. The better performance of our deinterlacing algorithm with higher content-adaptability and less memory cost than the state-of-the-art 4-field motion detection algorithms can be seen from the subjective and objective experimental results of the CIF and PAL video sequences.

  11. Limb anomalies

    DEFF Research Database (Denmark)

    Gurrieri, Fiorella; Kjær, Klaus Wilbrandt; Sangiorgi, Eugenio

    2002-01-01

    of limb development has been conserved for more than 300 million years, with all the necessary adaptive modifications occurring throughout evolution, we also take into consideration the evolutionary aspects of limb development in terms of genetic repertoire, molecular pathways, and morphogenetic events....

  12. Detection of User Independent Single Trial ERPs in Brain Computer Interfaces: An Adaptive Spatial Filtering Approach

    DEFF Research Database (Denmark)

    Leza, Cristina; Puthusserypady, Sadasivan

    2017-01-01

    Brain Computer Interfaces (BCIs) use brain signals to communicate with the external world. The main challenges to address are speed, accuracy and adaptability. Here, a novel algorithm for P300 based BCI spelling system is presented, specifically suited for single-trial detection of Event...

  13. Detection of person misfit in computerized adaptive tests with polytomous items

    NARCIS (Netherlands)

    van Krimpen-Stoop, Edith; Meijer, R.R.

    2000-01-01

    Item scores that do not fit an assumed item response theory model may cause the latent trait value to be estimated inaccurately. For computerized adaptive tests (CAT) with dichotomous items, several person-fit statistics for detecting nonfitting item score patterns have been proposed. Both for

  14. Prenatal Detection of Cardiac Anomalies in Fetuses with Single Umbilical Artery: Diagnostic Accuracy Comparison of Maternal-Fetal-Medicine and Pediatric Cardiologist

    Directory of Open Access Journals (Sweden)

    Ilir Tasha

    2014-01-01

Full Text Available Aim. To determine agreement on cardiac anomalies between maternal-fetal medicine (MFM) physicians and pediatric cardiologists (PC) in fetuses with single umbilical artery (SUA). Methods. A retrospective review of all fetuses with SUA between 1999 and 2008. Subjects were studied by MFM and PC, delivered at our institution, and had confirmation of SUA and cardiac anomaly by antenatal and neonatal PC follow-up. Subjects were divided into four groups: isolated SUA, SUA and isolated cardiac anomaly, SUA and multiple anomalies without heart anomalies, and SUA and multiple malformations including cardiac anomaly. Results. 39,942 cases were studied between 1999 and 2008. In 376 of 39,942 cases (0.94%), SUA was diagnosed. Only 182 (48.4%) met inclusion criteria. Cardiac anomalies were found in 21% (38/182). Agreement between MFM physicians and PC in all groups combined was 94% (171/182) (95% CI [89.2, 96.8]). MFM physicians overdiagnosed cardiac anomalies in 4.4% (8/182). MFM physicians and PC failed to antenatally diagnose cardiac anomaly in the same two cases. Conclusions. Good agreement was noted between MFM physicians and PC in our institution. Studies performed antenatally by MFM physicians and PC are less likely to uncover the entire spectrum of cardiac abnormalities and thus neonatal follow-up is suggested.

  15. Adaptation.

    Science.gov (United States)

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  16. Adaptive Change Detection for Long-Term Machinery Monitoring Using Incremental Sliding-Window

    Science.gov (United States)

    Wang, Teng; Lu, Guo-Liang; Liu, Jie; Yan, Peng

    2017-11-01

Detection of structural changes in an operational process is a major goal in machine condition monitoring. Existing methods for this purpose are mainly based on retrospective analysis, resulting in a large detection delay that limits their usage in real applications. This paper presents a new adaptive real-time change detection algorithm, an extension of recent research combined with an incremental sliding-window strategy, to handle multi-change detection in long-term monitoring of machine operations. In particular, in the framework, Hilbert space embedding of distributions is used to map the original data into the Reproducing Kernel Hilbert Space (RKHS) for change detection; then, a new adaptive threshold strategy is developed for making the change decision, in which a global factor (used to control the coarse-to-fine level of detection) is introduced to replace the fixed threshold value. Through experiments on a range of real testing data collected from an experimental rotating machinery system, the excellent detection performance of the algorithm for engineering applications was demonstrated. Compared with state-of-the-art methods, the proposed algorithm is more suitable for long-term machinery condition monitoring without any manual re-calibration, and is thus promising in modern industries.
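
The sliding-window, kernel-embedding change score described in this record can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel bandwidth, the window size, and the use of a running median of past scores as the baseline that the global factor multiplies are all assumptions.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF kernel matrix between two 1-D sample arrays."""
    d = x[:, None] - y[None, :]
    return np.exp(-gamma * d ** 2)

def mmd2(x, y, gamma=1.0):
    """Biased estimate of the squared Maximum Mean Discrepancy, i.e. the
    distance between the kernel embeddings of x and y in the RKHS."""
    kxx = rbf_kernel(x, x, gamma).mean()
    kyy = rbf_kernel(y, y, gamma).mean()
    kxy = rbf_kernel(x, y, gamma).mean()
    return kxx + kyy - 2.0 * kxy

def detect_changes(signal, win=50, factor=3.0):
    """Slide a reference/test window pair over the signal and raise an
    alarm when the embedding distance exceeds factor * (running median
    of past scores); `factor` plays the role of the global
    coarse-to-fine knob mentioned in the abstract."""
    scores, alarms = [], []
    for t in range(2 * win, len(signal)):
        ref = signal[t - 2 * win:t - win]
        test = signal[t - win:t]
        s = mmd2(ref, test)
        scores.append(s)
        med = np.median(scores)
        if len(scores) > win and s > factor * max(med, 1e-12):
            alarms.append(t)
    return alarms
```

With a mean shift injected halfway through a noisy signal, the alarms cluster shortly after the true change point.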

  17. High-Level Synthesis of DSP Applications Using Adaptive Negative Cycle Detection

    Directory of Open Access Journals (Sweden)

    Nitin Chandrachoodan

    2002-09-01

Full Text Available The problem of detecting negative weight cycles in a graph is examined in the context of the dynamic graph structures that arise in the process of high-level synthesis (HLS). The concept of adaptive negative cycle detection is introduced, in which a graph changes over time and negative cycle detection needs to be done periodically, but not necessarily after every individual change. We present an algorithm for this problem, based on a novel extension of the well-known Bellman-Ford algorithm that allows us to adapt existing cycle information to the modified graph, and show by experiments that our algorithm significantly outperforms previous incremental approaches for dynamic graphs. In terms of applications, the adaptive technique leads to a very fast implementation of Lawler's algorithm for the computation of the maximum cycle mean (MCM) of a graph, especially for a certain form of sparse graph. Such sparseness often occurs in practical circuits and systems, as demonstrated, for example, by the ISCAS 89/93 benchmarks. The application of the adaptive technique to design-space exploration (synthesis) is also demonstrated by developing automated search techniques for scheduling iterative data-flow graphs.
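
The paper's contribution is the adaptive, incremental extension; the underlying single-shot test it extends is the classic Bellman-Ford negative-cycle check, which can be sketched as:

```python
def has_negative_cycle(n, edges):
    """Standard Bellman-Ford negative-cycle test on a directed graph
    with n vertices and weighted edges (u, v, w).  The adaptive variant
    in the paper reuses distance labels across graph edits; this sketch
    shows only the baseline test it builds on."""
    dist = [0.0] * n                      # implicit virtual source reaching all nodes
    for _ in range(n - 1):
        updated = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                updated = True
        if not updated:                   # labels stable: no negative cycle
            return False
    # A further successful relaxation is possible iff a negative cycle exists.
    return any(dist[u] + w < dist[v] for u, v, w in edges)
```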

  18. A spectrally efficient detect-and-forward scheme with two-tier adaptive cooperation

    KAUST Repository

    Benjillali, Mustapha

    2011-09-01

We propose a simple relay-based adaptive cooperation scheme to improve the spectral efficiency of "Detect-and-Forward" (DetF) half-duplex relaying in fading channels. In a new common framework, we show that the proposed scheme offers considerable gains in terms of the achievable information rates compared to conventional DetF relaying schemes for both orthogonal and non-orthogonal source/relay transmissions. The analysis leads on to a general adaptive cooperation strategy based on the maximization of information rates at the destination, which needs to observe only the average signal-to-noise ratios of the links. © 2006 IEEE.

  19. Mobile gamma-ray scanning system for detecting radiation anomalies associated with 226Ra-bearing materials

    International Nuclear Information System (INIS)

    Myrick, T.E.; Blair, M.S.; Doane, R.W.; Goldsmith, W.A.

    1982-11-01

A mobile gamma-ray scanning system has been developed by Oak Ridge National Laboratory for use in the Department of Energy's remedial action survey programs. The unit consists of a NaI(Tl) detection system housed in a specially-equipped van. The system is operator controlled through an on-board mini-computer, with data output provided on the computer video screen, strip chart recorders, and an on-line printer. Data storage is provided by a floppy disk system. Multichannel analysis capabilities are included for qualitative radionuclide identification. A 226Ra-specific algorithm is employed to identify locations containing residual radium-bearing materials. This report presents the details of the system description, software development, and scanning methods utilized with the ORNL system. Laboratory calibration and field testing have established the system sensitivity, field of view, and other performance characteristics, the results of which are also presented. Documentation of the instrumentation and computer programs is included

  20. Coronary anomalies: what the radiologist should know*

    Science.gov (United States)

    Neves, Priscilla Ornellas; Andrade, Joalbo; Monção, Henry

    2015-01-01

Coronary anomalies comprise a diverse group of malformations, some of them asymptomatic with a benign course, and others related to symptoms such as chest pain and sudden death. Such anomalies may be classified as follows: 1) anomalies of origination and course; 2) anomalies of intrinsic coronary arterial anatomy; 3) anomalies of coronary termination. The origin and the proximal course of anomalous coronary arteries are the main prognostic factors, and an interarterial course of a coronary artery is considered malignant due to its association with increased risk of sudden death. Coronary computed tomography angiography has become the reference method for such an assessment as it detects not only anomalies in the origination of these arteries, but also their course in relation to other mediastinal structures, which plays a relevant role in the definition of the therapeutic management. Finally, it is essential for radiologists to recognize and characterize such anomalies. PMID:26379322

  1. Adaptation

    International Development Research Centre (IDRC) Digital Library (Canada)

    building skills, knowledge or networks on adaptation, ... the African partners leading the AfricaAdapt network, together with the UK-based Institute of Development Studies; and ... UNCCD Secretariat, Regional Coordination Unit for Africa, Tunis, Tunisia .... 26 Rural–urban Cooperation on Water Management in the Context of.

  2. Change Detection of High-Resolution Remote Sensing Images Based on Adaptive Fusion of Multiple Features

    Science.gov (United States)

    Wang, G. H.; Wang, H. B.; Fan, W. F.; Liu, Y.; Chen, C.

    2018-04-01

Traditional change detection algorithms depend mainly on the spectral information of image patches and fail to effectively mine and fuse multiple image features. Borrowing ideas from object-oriented analysis, this article proposes a multi-feature-fusion change detection algorithm for high-resolution remote sensing images. First, image objects are obtained by multi-scale segmentation; then a color histogram and a linear-gradient (edge line) histogram are calculated for each object. Using the Earth Mover's Distance (EMD) statistical operator between corresponding objects in the two periods, a color feature distance and an edge line feature distance are obtained, and their adaptively weighted combination is used to construct the object heterogeneity. Finally, histogram analysis of the heterogeneity values yields the patch-level change detection results. The experimental results show that the method can fully fuse the color and edge line features, thus improving the accuracy of the change detection.

  3. A Negative Selection Algorithm Based on Hierarchical Clustering of Self Set and its Application in Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Wen Chen

    2011-08-01

Full Text Available A negative selection algorithm based on hierarchical clustering of the self set, HC-RNSA, is introduced in this paper. Several strategies are applied to improve the algorithm's performance. First, the self data set is replaced by the self cluster centers for comparison with the detector candidates at each cluster level. As the number of self clusters is much smaller than the self set size, detector generation efficiency is improved. Second, during the detector generation process, the detector candidates are restricted to the low-coverage space to reduce detector redundancy. In the article, the problem that distances between antigens converge to a constant value in high-dimensional space is analyzed; accordingly, the Principal Component Analysis (PCA) method is used to reduce the data dimension, and a fractional distance function is employed to enhance the distinctiveness between self and non-self antigens. The detector generation procedure is terminated when the expected non-self coverage is reached. The theoretical analysis and experimental results demonstrate that the detection rate of HC-RNSA is higher than that of traditional negative selection algorithms while the false alarm rate and time cost are reduced.

  4. Using exceedance probabilities to detect anomalies in routinely recorded animal health data, with particular reference to foot-and-mouth disease in Viet Nam.

    Science.gov (United States)

    Richards, K K; Hazelton, M L; Stevenson, M A; Lockhart, C Y; Pinto, J; Nguyen, L

    2014-10-01

The widespread availability of computer hardware and software for recording and storing disease event information means that, in theory, we have the necessary information to carry out detailed analyses of factors influencing the spatial distribution of disease in animal populations. However, the reliability of such analyses depends on data quality, with anomalous records having the potential to introduce significant bias and lead to inappropriate decision making. In this paper we promote the use of exceedance probabilities as a tool for detecting anomalies when applying hierarchical spatio-temporal models to animal health data. We illustrate this methodology through a case study of data on outbreaks of foot-and-mouth disease (FMD) in Viet Nam for the period 2006-2008. A flexible binomial logistic regression was employed to model the number of FMD-infected communes within each province of the country. Standard analyses of the residuals from this model failed to identify problems, but exceedance probabilities identified provinces in which the number of reported FMD outbreaks was unexpectedly low. This finding is interesting given that these provinces are on major cattle movement pathways through Viet Nam. Copyright © 2014 Elsevier Ltd. All rights reserved.
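
The exceedance idea can be sketched with a parametric bootstrap. Note the simplifications: the paper fits a hierarchical spatio-temporal model, whereas the pooled binomial model, the simulation count, and the 0.05 flagging cut-off below are stand-in assumptions for illustration.

```python
import numpy as np

def exceedance_probabilities(y, n, n_sim=5000, seed=1):
    """For province-level counts y of infected communes out of n, fit a
    pooled binomial rate and compute P(y_rep <= y_obs) over simulated
    replicates.  Values near 0 mark provinces reporting unexpectedly FEW
    outbreaks (the anomaly of interest in the case study); values near 1
    mark unexpectedly many."""
    rng = np.random.default_rng(seed)
    p_hat = y.sum() / n.sum()                       # pooled infection rate
    y_rep = rng.binomial(n[None, :], p_hat, size=(n_sim, len(y)))
    return (y_rep <= y[None, :]).mean(axis=0)
```

A province reporting 1 infected commune when its peers report about 20 out of 100 receives an exceedance probability near zero and is flagged for data-quality review.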

  5. Addition of Adapted Optics towards obtaining a quantitative detection of diabetic retinopathy

    Science.gov (United States)

    Yust, Brian; Obregon, Isidro; Tsin, Andrew; Sardar, Dhiraj

    2009-04-01

An adaptive optics system was assembled for correcting the aberrated wavefront of light reflected from the retina. The adaptive optics setup includes a superluminescent diode light source, a Hartmann-Shack wavefront sensor, a deformable mirror, and an imaging CCD camera. Aberrations in the reflected wavefront are caused by changes in the index of refraction along the light path as the beam travels through the cornea, lens, and vitreous humour. The Hartmann-Shack sensor allows detection of aberrations in the wavefront, which may then be corrected with the deformable mirror. It has been shown that certain diseases, such as diabetic retinopathy, change the polarization of light reflected from neovascularizations in the retina. The adaptive optics system was assembled towards the goal of obtaining a quantitative measure of the onset and progression of this ailment, as one does not currently exist. The study shows that the addition of adaptive optics results in more accurate detection of neovascularization in the retina by measuring the expected changes in polarization of the corrected wavefront of reflected light.

  6. An Improved Semisupervised Outlier Detection Algorithm Based on Adaptive Feature Weighted Clustering

    Directory of Open Access Journals (Sweden)

    Tingquan Deng

    2016-01-01

    Full Text Available There exist already various approaches to outlier detection, in which semisupervised methods achieve encouraging superiority due to the introduction of prior knowledge. In this paper, an adaptive feature weighted clustering-based semisupervised outlier detection strategy is proposed. This method maximizes the membership degree of a labeled normal object to the cluster it belongs to and minimizes the membership degrees of a labeled outlier to all clusters. In consideration of distinct significance of features or components in a dataset in determining an object being an inlier or outlier, each feature is adaptively assigned different weights according to the deviation degrees between this feature of all objects and that of a certain cluster prototype. A series of experiments on a synthetic dataset and several real-world datasets are implemented to verify the effectiveness and efficiency of the proposal.

  7. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using the features of the video itself to perform automatic identification and retrieval. This method involves a key technology, i.e., shot segmentation. In this paper, a method of automatic video shot boundary detection with K-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, frames with significant change and frames with no significant change. Then, for the classification results, the improved adaptive dual-threshold comparison method is utilized to determine the abrupt as well as gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
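
The first stage of this pipeline can be sketched as below: classify inter-frame histogram differences into "significant" vs "no significant" change with a two-cluster k-means. The feature choice (grayscale histogram with L1 distance) and the omission of the dual-threshold refinement pass are simplifying assumptions.

```python
import numpy as np

def two_means_1d(x, iters=50):
    """Minimal 1-D k-means with k=2; returns a boolean mask selecting
    the high-valued cluster."""
    c = np.array([x.min(), x.max()], dtype=float)
    for _ in range(iters):
        hi = np.abs(x - c[1]) < np.abs(x - c[0])
        for k, mask in enumerate([~hi, hi]):
            if mask.any():
                c[k] = x[mask].mean()
    return hi

def shot_boundaries(frames, bins=16):
    """Candidate abrupt cuts: frames whose histogram difference from the
    previous frame falls in the 'significant change' k-means cluster.
    The paper's adaptive dual-threshold comparison, which also handles
    gradual transitions, is omitted from this sketch."""
    hists = np.array([np.histogram(f, bins=bins, range=(0, 256))[0]
                      for f in frames], dtype=float)
    hists /= hists.sum(axis=1, keepdims=True)
    diffs = np.abs(np.diff(hists, axis=0)).sum(axis=1)   # L1 histogram distance
    hi = two_means_1d(diffs)
    return [int(i) + 1 for i in np.nonzero(hi)[0]]       # first frame of each new shot
```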

  8. Detection of Human Impacts by an Adaptive Energy-Based Anisotropic Algorithm

    Directory of Open Access Journals (Sweden)

    Manuel Prado-Velasco

    2013-10-01

Full Text Available Boosted by the health consequences and cost of falls in the elderly, this work develops and tests a novel algorithm and methodology to detect human impacts that will act as triggers of a two-layer fall monitor. The two main requirements demanded by socio-healthcare providers—unobtrusiveness and reliability—defined the objectives of the research. We have demonstrated that a very agile, adaptive, and energy-based anisotropic algorithm can provide 100% sensitivity and 78% specificity in the task of detecting impacts under demanding laboratory conditions. The algorithm works together with an unsupervised real-time learning technique that provides the adaptive capability, which is also presented. The work demonstrates the robustness and reliability of our new algorithm, which will be the basis of a smart fall monitor.

  9. AN AMELIORATED DETECTION STATISTICS FOR ADAPTIVE MASK MEDIAN FILTRATION OF HEAVILY NOISED DIGITAL IMAGES

    Directory of Open Access Journals (Sweden)

    Geeta Hanji

    2016-11-01

Full Text Available Noise reduction is an important area of research in image processing applications. The performance of a digital image noise filtering method depends primarily upon the accuracy of its noise detection scheme. This paper presents an effective detector-based, adaptive-mask, median filtration of heavily noised digital images affected by fixed-value (salt-and-pepper) impulse noise. The proposed filter presents a novel approach: an ameliorated Rank-Ordered Absolute Deviation (ROAD) statistic to judge whether the input pixel is noised or noise-free. If a pixel is detected as corrupted, it is subjected to adaptive-mask median filtration; otherwise, it is kept unchanged. Extensive experimental results and comparative performance evaluations demonstrate that the proposed filter outperforms existing decision-type, median-based filters with powerful noise detectors in terms of objective performance measures and visual restoration accuracy.
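
The detect-then-filter pattern the abstract describes can be sketched with the classic ROAD statistic. The fixed 3x3 mask, the threshold value, and m=4 are assumptions; the paper's contribution is an ameliorated statistic and an adaptive mask, neither reproduced here.

```python
import numpy as np

def road(window, m=4):
    """Rank-ordered absolute differences of the centre pixel: the sum of
    the m smallest absolute differences to its neighbours.  An impulse
    yields a large value; pixels on edges or texture stay small because
    a few like-valued neighbours remain."""
    centre = window[window.shape[0] // 2, window.shape[1] // 2]
    diffs = np.abs(window - centre).ravel()
    diffs = np.delete(diffs, diffs.size // 2)        # drop the centre itself
    return np.sort(diffs)[:m].sum()

def filter_image(img, threshold=80.0, m=4):
    """Decision-based filtering: only pixels whose ROAD statistic exceeds
    the threshold are replaced by the window median; pixels judged clean
    pass through unchanged (borders left untouched for brevity)."""
    out = img.astype(float).copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            win = img[i - 1:i + 2, j - 1:j + 2].astype(float)
            if road(win, m) > threshold:
                out[i, j] = np.median(win)
    return out
```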

  10. Path scanning for the detection of anomalous subgraphs and use of DNS requests and host agents for anomaly/change detection and network situational awareness

    Science.gov (United States)

    Neil, Joshua Charles; Fisk, Michael Edward; Brugh, Alexander William; Hash, Curtis Lee; Storlie, Curtis Byron; Uphoff, Benjamin; Kent, Alexander

    2017-11-21

    A system, apparatus, computer-readable medium, and computer-implemented method are provided for detecting anomalous behavior in a network. Historical parameters of the network are determined in order to determine normal activity levels. A plurality of paths in the network are enumerated as part of a graph representing the network, where each computing system in the network may be a node in the graph and the sequence of connections between two computing systems may be a directed edge in the graph. A statistical model is applied to the plurality of paths in the graph on a sliding window basis to detect anomalous behavior. Data collected by a Unified Host Collection Agent ("UHCA") may also be used to detect anomalous behavior.

  11. Early pack-off diagnosis in drilling using an adaptive observer and statistical change detection

    DEFF Research Database (Denmark)

    Willersrud, Anders; Imsland, Lars; Blanke, Mogens

    2015-01-01

    in the well. A model-based adaptive observer is used to estimate these friction parameters as well as flow rates. Detecting changes to these estimates can then be used for pack-off diagnosis, which due to measurement noise is done using statistical change detection. Isolation of incident type and location...... is done using a multivariate generalized likelihood ratio test, determining the change direction of the estimated mean values. The method is tested on simulated data from the commercial high-fidelity multi-phase simulator OLGA, where three different pack-offs at different locations and with different...

  12. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2001-01-01

Based on the analysis of the auto-correlation function, the notion of the distance between auto-correlation functions was introduced, and the characterization of noise and of signal-plus-noise was discussed using this distance. Then, a method of auto-adaptive endpoint detection of seismic signals based on auto-correlated similarity was summarized. The steps of implementation and the determination of the thresholds were presented in detail. Experimental results, compared with artificial (manual) detection methods, show that this method has higher sensitivity even at a low signal-to-noise ratio
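
The distance-between-autocorrelation-functions idea can be sketched as follows. The Euclidean distance, the window and lag counts, and the factor-times-baseline threshold are all assumptions; the papers determine their thresholds differently.

```python
import numpy as np

def acf(x, nlags=20):
    """Normalised auto-correlation function of a window."""
    x = x - x.mean()
    denom = np.dot(x, x) + 1e-12
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(nlags + 1)])

def detect_onset(signal, win=100, nlags=20, noise_win=3, factor=3.0):
    """Characterise pure noise by the mean ACF of the first few windows,
    then flag the first later window whose ACF lies unusually far
    (Euclidean distance beyond factor * baseline) from that reference."""
    windows = [signal[i:i + win] for i in range(0, len(signal) - win, win)]
    ref = np.mean([acf(w, nlags) for w in windows[:noise_win]], axis=0)
    dists = [np.linalg.norm(acf(w, nlags) - ref) for w in windows]
    base = max(np.mean(dists[:noise_win]), 1e-12)
    for i, d in enumerate(dists):
        if i >= noise_win and d > factor * base:
            return i * win               # sample index of the onset window
    return None
```

Noise has a near-delta ACF, while an arriving oscillatory signal produces a strongly structured ACF, so the distance jumps at the onset window.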

  13. Aeromagnetic anomalies over faulted strata

    Science.gov (United States)

    Grauch, V.J.S.; Hudson, Mark R.

    2011-01-01

    High-resolution aeromagnetic surveys are now an industry standard and they commonly detect anomalies that are attributed to faults within sedimentary basins. However, detailed studies identifying geologic sources of magnetic anomalies in sedimentary environments are rare in the literature. Opportunities to study these sources have come from well-exposed sedimentary basins of the Rio Grande rift in New Mexico and Colorado. High-resolution aeromagnetic data from these areas reveal numerous, curvilinear, low-amplitude (2–15 nT at 100-m terrain clearance) anomalies that consistently correspond to intrasedimentary normal faults (Figure 1). Detailed geophysical and rock-property studies provide evidence for the magnetic sources at several exposures of these faults in the central Rio Grande rift (summarized in Grauch and Hudson, 2007, and Hudson et al., 2008). A key result is that the aeromagnetic anomalies arise from the juxtaposition of magnetically differing strata at the faults as opposed to chemical processes acting at the fault zone. The studies also provide (1) guidelines for understanding and estimating the geophysical parameters controlling aeromagnetic anomalies at faulted strata (Grauch and Hudson), and (2) observations on key geologic factors that are favorable for developing similar sedimentary sources of aeromagnetic anomalies elsewhere (Hudson et al.).

  14. Adaptive endpoint detection of seismic signal based on auto-correlated function

    International Nuclear Information System (INIS)

    Fan Wanchun; Shi Ren

    2000-01-01

Endpoint detection by time-waveform envelope and/or by checking the travel-time table (both labelled as artificial detection methods) has certain shortcomings. Based on the analysis of the auto-correlation function, the notion of the distance between auto-correlation functions was introduced, and the characterizations of noise and of signal-plus-noise were discussed using this distance. Then, the method of auto-adaptive endpoint detection of seismic signals based on auto-correlated similarity was summarized. The steps of implementation and the determination of the thresholds were presented in detail. Experimental results, compared with the artificial detection methods, show that this method has higher sensitivity even in a low-SNR environment

  15. Adaptive Fourier decomposition based R-peak detection for noisy ECG Signals.

    Science.gov (United States)

    Ze Wang; Chi Man Wong; Feng Wan

    2017-07-01

An adaptive Fourier decomposition (AFD) based R-peak detection method is proposed for noisy ECG signals. Although many QRS detection methods have been proposed in the literature, most require high signal quality. The proposed method extracts the R waves from the energy domain using the AFD and determines the R-peak locations based on the key decomposition parameters, achieving denoising and R-peak detection at the same time. Validated on clinical ECG signals from the MIT-BIH Arrhythmia Database, the proposed method shows better performance than the Pan-Tompkins (PT) algorithm, both for the native PT and for the PT combined with a denoising process.

  16. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems existing in practical and complex dynamic processes. A just-in-time (JIT) detection method and a k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that whether the current data are normal or not can be identified online. Note that the control limit obtained changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on the performance of the detection process, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
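
The KNN-rule SPC core of such a scheme can be sketched as below. The k value and the quantile-based control limit are assumptions, and the JIT element (Mahalanobis-screened online updating of the training set) is omitted.

```python
import numpy as np

def knn_distance(x, data, k=5):
    """Average Euclidean distance from x to its k nearest neighbours."""
    d = np.linalg.norm(data - x, axis=1)
    return np.sort(d)[:k].mean()

def fit_control_limit(train, k=5, alpha=0.99):
    """Control limit in the KNN-rule SPC style: the alpha-quantile of
    each training sample's average distance to its k nearest OTHER
    training samples (leave-one-out)."""
    stats = []
    for i in range(len(train)):
        rest = np.delete(train, i, axis=0)
        stats.append(knn_distance(train[i], rest, k))
    return float(np.quantile(stats, alpha))

def is_fault(x, train, limit, k=5):
    """Flag x as faulty when its KNN distance exceeds the control limit."""
    return knn_distance(x, train, k) > limit
```

Because the limit is recomputed whenever the training database is updated, the detector tracks slow drift in normal operation, which is the adaptive behaviour the abstract emphasizes.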

  17. Chiral anomalies and differential geometry

    International Nuclear Information System (INIS)

    Zumino, B.

    1983-10-01

    Some properties of chiral anomalies are described from a geometric point of view. Topics include chiral anomalies and differential forms, transformation properties of the anomalies, identification and use of the anomalies, and normalization of the anomalies. 22 references

  18. Detection of plant adaptation responses to saline environment in rhizosphere using microwave sensing

    International Nuclear Information System (INIS)

    Shimomachi, T.; Kobashikawa, C.; Tanigawa, H.; Omoda, E.

    2008-01-01

The physiological adaptation responses of plants to environmental stress, such as water stress and salt stress, induce changes in the physicochemical conditions of the plant, since osmoregulatory substances are formed during these adaptation responses. Strong electrolytes, amino acids, proteins and saccharides are well known as osmoregulatory substances. Since these substances are ionic conductors and their molecules are electrically dipolar, they can be expected to change the dielectric properties of the plant, which can be detected by microwave sensing. The dielectric properties (0.3 to 3 GHz), water content and water potential of plant leaves, which reflect the physiological condition of the plant under salt stress, were measured and analyzed. The experimental results showed the potential of microwave sensing as a method for monitoring adaptation responses in plants under a saline environment and suggested that the saline environment in the rhizosphere can be detected noninvasively and quantitatively by microwave sensing, which detects the changes in the complex dielectric properties of the plant

  19. Adaptive Energy-Efficient Target Detection Based on Mobile Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Tengyue Zou

    2017-05-01

    Full Text Available Target detection is a widely used application for area surveillance, elder care, and fire alarms; its purpose is to find a particular object or event in a region of interest. Usually, fixed observing stations or static sensor nodes are arranged uniformly in the field. However, each part of the field has a different probability of being intruded upon; if an object suddenly enters an area with few guardian devices, a loss of detection will occur, and the stations in the safe areas will waste their energy for a long time without any discovery. Thus, mobile wireless sensor networks may benefit from adaptation and pertinence in detection. Sensor nodes equipped with wheels are able to move towards the risk area via an adaptive learning procedure based on Bayesian networks. Furthermore, a clustering algorithm based on k-means++ and an energy control mechanism is used to reduce the energy consumption of nodes. The extended Kalman filter and a voting data fusion method are employed to raise the localization accuracy of the target. The simulation and experimental results indicate that this new system with adaptive energy-efficient methods is able to achieve better performance than the traditional ones.

  20. Computing Adaptive Feature Weights with PSO to Improve Android Malware Detection

    Directory of Open Access Journals (Sweden)

    Yanping Xu

    2017-01-01

Full Text Available Android malware detection is a complex and crucial issue. In this paper, we propose a malware detection model using a support vector machine (SVM) method based on feature weights that are computed by information gain (IG) and particle swarm optimization (PSO) algorithms. The IG weights are evaluated based on the relevance between features and class labels, and the PSO weights are adaptively calculated to result in the best fitness (the performance of the SVM classification model). Moreover, to overcome the defects of basic PSO, we propose a new adaptive inertia weight method called fitness-based and chaotic adaptive inertia weight-PSO (FCAIW-PSO) that improves on basic PSO and is based on the fitness and a chaotic term. The goal is to assign suitable weights to the features to ensure the best Android malware detection performance. The results of experiments indicate that the IG weights and PSO weights both improve the performance of SVM and that the performance of the PSO weights is better than that of the IG weights.

  1. Adaptive Energy-Efficient Target Detection Based on Mobile Wireless Sensor Networks.

    Science.gov (United States)

    Zou, Tengyue; Li, Zhenjia; Li, Shuyuan; Lin, Shouying

    2017-05-04

    Target detection is a widely used application for area surveillance, elder care, and fire alarms; its purpose is to find a particular object or event in a region of interest. Usually, fixed observing stations or static sensor nodes are arranged uniformly in the field. However, each part of the field has a different probability of being intruded upon; if an object suddenly enters an area with few guardian devices, a loss of detection will occur, and the stations in the safe areas will waste their energy for a long time without any discovery. Thus, mobile wireless sensor networks may benefit from adaptation and pertinence in detection. Sensor nodes equipped with wheels are able to move towards the risk area via an adaptive learning procedure based on Bayesian networks. Furthermore, a clustering algorithm based on k-means++ and an energy control mechanism are used to reduce the energy consumption of nodes. The extended Kalman filter and a voting data fusion method are employed to raise the localization accuracy of the target. The simulation and experimental results indicate that this new system with adaptive energy-efficient methods is able to achieve better performance than the traditional ones.
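
The k-means++ seeding used for node clustering can be sketched as follows; the 1-D points and seeded RNG below are illustrative assumptions, not the paper's setup.

```python
# Sketch of k-means++ seeding: each subsequent seed is drawn with probability
# proportional to its squared distance to the nearest seed chosen so far.
# Pure Python, 1-D points for brevity; deterministic via a seeded RNG.
import random

def kmeans_pp_seeds(points, k, rng):
    seeds = [rng.choice(points)]
    while len(seeds) < k:
        # squared distance from each point to its nearest existing seed
        d2 = [min((p - s) ** 2 for s in seeds) for p in points]
        total = sum(d2)
        r = rng.uniform(0, total)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                seeds.append(p)
                break
    return seeds

rng = random.Random(0)
points = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
seeds = kmeans_pp_seeds(points, 2, rng)
```

Because the second seed is drawn with D²-proportional probability, it almost always lands in the cluster far from the first seed, which is what makes k-means++ initialization well spread.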

  2. Graph anomalies in cyber communications

    Energy Technology Data Exchange (ETDEWEB)

    Vander Wiel, Scott A [Los Alamos National Laboratory; Storlie, Curtis B [Los Alamos National Laboratory; Sandine, Gary [Los Alamos National Laboratory; Hagberg, Aric A [Los Alamos National Laboratory; Fisk, Michael [Los Alamos National Laboratory

    2011-01-11

    Enterprises monitor cyber traffic for viruses, intruders and stolen information. Detection methods look for known signatures of malicious traffic or search for anomalies with respect to a nominal reference model. Traditional anomaly detection focuses on aggregate traffic at central nodes or on user-level monitoring. More recently, however, traffic is being viewed more holistically as a dynamic communication graph. Attention to the graph nature of the traffic has expanded the types of anomalies that are being sought. We give an overview of several cyber data streams collected at Los Alamos National Laboratory and discuss current work in modeling the graph dynamics of traffic over the network. We consider global properties and local properties within the communication graph. A method for monitoring relative entropy on multiple correlated properties is discussed in detail.
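
Monitoring relative entropy on a graph property can be sketched as a KL-divergence check between a nominal histogram and the current empirical one. The bucket frequencies and alarm level below are toy values for illustration, not Los Alamos data.

```python
# Sketch: relative entropy (KL divergence) between a nominal distribution of
# some graph property (e.g. degree buckets) and the currently observed one.
from math import log

def relative_entropy(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions over the same bins."""
    return sum(pi * log(pi / max(qi, eps)) for pi, qi in zip(p, q) if pi > 0)

nominal = [0.7, 0.2, 0.1]        # degree-bucket frequencies under normal traffic
current_ok = [0.68, 0.22, 0.10]  # small fluctuation
current_bad = [0.2, 0.2, 0.6]    # anomalous shift toward high-degree nodes
alarm_level = 0.1                # illustrative alarm threshold

d_ok = relative_entropy(current_ok, nominal)
d_bad = relative_entropy(current_bad, nominal)
```

The small fluctuation yields a divergence near zero while the shifted distribution exceeds the alarm level, so thresholding this quantity over time flags distributional anomalies.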

  3. Calibration model maintenance in melamine resin production: Integrating drift detection, smart sample selection and model adaptation.

    Science.gov (United States)

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Cernuda, Carlos; Reischer, Thomas; Kantner, Wolfgang; Pawliczek, Marcin; Brandstetter, Markus

    2018-07-12

    The physico-chemical properties of Melamine Formaldehyde (MF) based thermosets are largely influenced by the degree of polymerization (DP) in the underlying resin. On-line supervision of the turbidity point by means of vibrational spectroscopy has recently emerged as a promising technique to monitor the DP of MF resins. However, spectroscopic determination of the DP relies on chemometric models, which are usually sensitive to drifts caused by instrumental and/or sample-associated changes occurring over time. In order to detect the time point when drifts start causing prediction bias, we here explore a universal drift detector based on a faded version of the Page-Hinkley (PH) statistic, which we test in three data streams from an industrial MF resin production process. We employ committee disagreement (CD), computed as the variance of model predictions from an ensemble of partial least squares (PLS) models, as a measure for sample-wise prediction uncertainty and use the PH statistic to detect changes in this quantity. We further explore supervised and unsupervised strategies for (semi-)automatic model adaptation upon detection of a drift. For the former, manual reference measurements are requested whenever statistical thresholds on Hotelling's T² and/or Q-Residuals are violated. Models are subsequently re-calibrated using weighted partial least squares in order to increase the influence of newer samples, which increases the flexibility when adapting to new (drifted) states. Unsupervised model adaptation is carried out exploiting the dual antecedent-consequent structure of a recently developed fuzzy systems variant of PLS termed FLEXFIS-PLS. In particular, antecedent parts are updated while maintaining the internal structure of the local linear predictors (i.e. the consequents). We found improved drift detection capability of the CD compared to Hotelling's T² and Q-Residuals when used in combination with the proposed PH test. Furthermore, we found that active
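
The Page-Hinkley test at the core of the drift detector can be sketched as follows. This is the plain (unfaded) PH statistic in a minimal form; the delta and threshold values are illustrative, and the toy stream stands in for the committee-disagreement signal.

```python
# Minimal sketch of a Page-Hinkley (PH) change detector: it accumulates
# deviations of each observation from the running mean (minus a tolerance
# delta) and raises an alarm when the cumulative sum rises far above its
# historical minimum.
class PageHinkley:
    def __init__(self, delta=0.05, threshold=1.0):
        self.delta = delta          # tolerated drift magnitude
        self.threshold = threshold  # alarm threshold (lambda)
        self.mean = 0.0
        self.n = 0
        self.cum = 0.0              # cumulative deviation m_t
        self.min_cum = 0.0          # running minimum M_t

    def update(self, x):
        """Feed one observation; return True when a drift alarm fires."""
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum += x - self.mean - self.delta
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.threshold

# A stream that jumps from ~0 to ~5: the detector should fire right after
# the jump at index 50.
ph = PageHinkley()
stream = [0.0] * 50 + [5.0] * 50
alarm_at = next((i for i, x in enumerate(stream) if ph.update(x)), None)
```

Applied to the committee-disagreement stream instead of this toy one, an alarm marks the time point at which model re-calibration would be triggered.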

  4. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  5. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography

    Science.gov (United States)

    Treiber, O.; Wanninger, F.; Führ, H.; Panzer, W.; Regulla, D.; Winkler, G.

    2003-02-01

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.

  6. An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography

    International Nuclear Information System (INIS)

    Treiber, O; Wanninger, F; Fuehr, H; Panzer, W; Regulla, D; Winkler, G

    2003-01-01

    This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography

  7. Anomaly Detection with Text Mining

    Data.gov (United States)

    National Aeronautics and Space Administration — Many existing complex space systems have a significant amount of historical maintenance and problem data bases that are stored in unstructured text forms. The...

  8. Anomaly detection using process mining

    NARCIS (Netherlands)

    Bezerra, F.; Wainer, J.; Aalst, van der W.M.P.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Ukor, R.

    2009-01-01

    Recently, several large companies have been involved in financial scandals related to mismanagement, resulting in financial damages for their stockholders. In response, certifications and manuals for best practices of governance were developed, and in some cases, tougher federal laws were

  9. Anomaly Detection for Complex Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — In performance maintenance in large, complex systems, sensor information from sub-components tends to be readily available, and can be used to make predictions about...

  10. Domain Adaptation Methods for Improving Lab-to-field Generalization of Cocaine Detection using Wearable ECG

    Science.gov (United States)

    Natarajan, Annamalai; Angarita, Gustavo; Gaiser, Edward; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin M.

    2016-01-01

    Mobile health research on illicit drug use detection typically involves a two-stage study design where data to learn detectors is first collected in lab-based trials, followed by a deployment to subjects in a free-living environment to assess detector performance. While recent work has demonstrated the feasibility of wearable sensors for illicit drug use detection in the lab setting, several key problems can limit lab-to-field generalization performance. For example, lab-based data collection often has low ecological validity, the ground-truth event labels collected in the lab may not be available at the same level of temporal granularity in the field, and there can be significant variability between subjects. In this paper, we present domain adaptation methods for assessing and mitigating potential sources of performance loss in lab-to-field generalization and apply them to the problem of cocaine use detection from wearable electrocardiogram sensor data. PMID:28090605

  11. Extraction of ECG signal with adaptive filter for heart abnormalities detection

    Science.gov (United States)

    Turnip, Mardi; Saragih, Rijois. I. E.; Dharma, Abdi; Esti Kusumandari, Dwi; Turnip, Arjon; Sitanggang, Delima; Aisyah, Siti

    2018-04-01

    This paper demonstrates an adaptive filter method for extraction of electrocardiogram (ECG) features in heart abnormality detection. In particular, an electrocardiogram (ECG) is a recording of the heart's electrical activity, captured by tracing the cardiac electrical impulse as it moves from the atrium to the ventricles. The applied algorithm evaluates and analyzes ECG signals for abnormality detection based on the P, Q, R and S peaks. In the first phase, the real-time ECG data is acquired and pre-processed. In the second phase, the procured ECG signal is subjected to a feature extraction process. The extracted features detect abnormal peaks present in the waveform. Thus normal and abnormal ECG signals can be differentiated based on the features extracted.

  12. Failure detection by adaptive lattice modelling using Kalman filtering methodology : application to NPP

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1991-03-01

    Detection of failure in the operational status of a NPP is described. The method uses a lattice form of signal modelling established by means of Kalman filtering methodology. In this approach each lattice parameter is considered to be a state, and the minimum variance estimate of the states is performed adaptively by optimal parameter estimation together with fast convergence and favourable statistical properties. In particular, the state covariance is also the covariance of the error committed by that estimate of the state value, and the Mahalanobis distance formed for pattern comparison follows a χ² distribution for normally distributed signals. The failure detection is performed after a decision-making process by probabilistic assessments based on the statistical information provided. The failure detection system is implemented in the multi-channel signal environment of the Borssele NPP and its favourable features are demonstrated. (author). 29 refs.; 7 figs
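
The Mahalanobis-distance check at the heart of such a failure detector can be illustrated as follows. A diagonal covariance and toy channel statistics are assumed for brevity; the paper estimates these quantities adaptively via Kalman filtering.

```python
# Illustrative sketch (not the paper's Kalman-lattice code): a squared
# Mahalanobis distance against nominal signal statistics, compared with a
# chi-square threshold to flag a failure.
def mahalanobis_diag(x, mean, var):
    """Squared Mahalanobis distance for independent (diagonal-covariance) components."""
    return sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mean, var))

# Nominal channel statistics (toy numbers) and a chi-square-style threshold.
mean, var = [0.0, 0.0], [1.0, 1.0]
threshold = 9.21  # ~chi-square(2 dof) quantile at the 1% significance level

normal_sample = [0.5, -0.3]
faulty_sample = [4.0, -3.5]
d_ok = mahalanobis_diag(normal_sample, mean, var)
d_bad = mahalanobis_diag(faulty_sample, mean, var)
```

The nominal sample falls well below the chi-square threshold while the deviant one exceeds it, which is the probabilistic decision rule the abstract describes.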

  13. An Adaptive Cultural Algorithm with Improved Quantum-behaved Particle Swarm Optimization for Sonar Image Detection.

    Science.gov (United States)

    Wang, Xingmei; Hao, Wenqian; Li, Qiming

    2017-12-18

    This paper proposes an adaptive cultural algorithm with improved quantum-behaved particle swarm optimization (ACA-IQPSO) to detect the underwater sonar image. In the population space, to improve searching ability of particles, iterative times and the fitness value of particles are regarded as factors to adaptively adjust the contraction-expansion coefficient of the quantum-behaved particle swarm optimization algorithm (QPSO). The improved quantum-behaved particle swarm optimization algorithm (IQPSO) can make particles adjust their behaviours according to their quality. In the belief space, a new update strategy is adopted to update cultural individuals according to the idea of the update strategy in the shuffled frog leaping algorithm (SFLA). Moreover, to enhance the utilization of information in the population space and belief space, the accept function and influence function are redesigned in the new communication protocol. The experimental results show that ACA-IQPSO can obtain good clustering centres according to the grey distribution information of underwater sonar images, and accurately complete underwater object detection. Compared with other algorithms, the proposed ACA-IQPSO has good effectiveness, excellent adaptability, a powerful searching ability and high convergence efficiency. Meanwhile, the experimental results of the benchmark functions can further demonstrate that the proposed ACA-IQPSO has better searching ability, convergence efficiency and stability.

  14. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    Science.gov (United States)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other purposes where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For detection of defects, the use of gradients is very popular in highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed-value threshold leads to unsatisfactory results. As defects can range from very small to large in size, segmentation of a gradient image based on percentile thresholding can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above some specific values of gray level in the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of their size. The developed method performs better than the Otsu thresholding method and an adaptive thresholding method based on local properties.
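
The adaptive-percentile idea above can be sketched in a few lines: the percentile used for thresholding is lowered when many pixels exceed a reference gray level (a large defect), and kept high otherwise to suppress noise. The cut-off values and toy gradient images below are illustrative assumptions, not the paper's parameters.

```python
# Hedged sketch of global adaptive percentile thresholding of a gradient image.
def adaptive_percentile_threshold(gradient, ref_level=50, high_pct=99.0, low_pct=95.0):
    pixels = sorted(p for row in gradient for p in row)
    n = len(pixels)
    frac_bright = sum(p > ref_level for p in pixels) / n
    # Large defects light up many pixels: use a lower percentile so the whole
    # region survives segmentation; otherwise use a high percentile.
    pct = low_pct if frac_bright > 0.05 else high_pct
    k = min(n - 1, int(n * pct / 100))
    return pixels[k]

# Toy 10x10 gradient images: one with a single bright pixel (small defect),
# one with a large bright blob covering 40% of the image.
small_defect = [[0] * 10 for _ in range(10)]
small_defect[5][5] = 200
large_defect = [[0] * 10 for _ in range(10)]
for i in range(4):
    for j in range(10):
        large_defect[i][j] = 120

t_small = adaptive_percentile_threshold(small_defect)
t_large = adaptive_percentile_threshold(large_defect)
```

On the sparse image the 99th-percentile threshold lands at the bright pixel value (200); on the blob image the lower 95th percentile lands inside the defect gray level (120), so the whole region is retained instead of being clipped.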

  15. Renal anomalies in congenital heart disease

    International Nuclear Information System (INIS)

    Lee, Byung Hee; Kim, In One; Yeon, Kyung Mo; Yoon, Yong Soo

    1987-01-01

    In general, the incidence of urinary tract anomalies in congenital heart disease is higher than that in general population. So authors performed abdominal cineradiography in 1045 infants and children undergoing cineangiographic examinations for congenital heart disease, as a screening method for the detection, the incidence, and the nature of associated urinary tract anomalies. The results were as follows: 1. The incidence of urinary tract anomaly associated with congenital heart disease was 4.1% (<2% in general population). 2. Incidence of urinary tract anomalies was 4.62% in 671 acyanotic heart diseases, 3.20% in 374 cyanotic heart diseases. 3. There was no constant relationship between the type of cardiac anomaly and the type of urinary tract anomaly

  16. Dim small targets detection based on self-adaptive caliber temporal-spatial filtering

    Science.gov (United States)

    Fan, Xiangsuo; Xu, Zhiyong; Zhang, Jianlin; Huang, Yongmei; Peng, Zhenming

    2017-09-01

    To boost the detectability of dim small targets, this paper began by using improved anisotropy for background prediction (IABP), followed by target enhancement using improved high-order cumulants (HQS). Finally, on the basis of this image pre-processing, to address the problem of missed and false detections caused by the fixed caliber of traditional pipeline filtering, this paper used the targets' multi-frame movement correlation in the time-space domain, combined with scale-space theory, to propose a temporal-spatial filtering algorithm that allows the caliber to adapt to changes in the targets' scale, effectively solving the detection issues brought by an unchanged caliber as targets shrink or grow in size. Experiments showed that the improved anisotropic background prediction could stay loyal to the true background of the original image to the maximum extent, presenting an overall performance superior to other background prediction methods; the improved HQS significantly increased the signal-to-noise ratio of images; and when the signal-to-noise ratio was lower than 2.6 dB, this detection algorithm could effectively eliminate noise and detect targets. For the algorithm, the lowest signal-to-noise ratio of a detectable target is 0.37.

  17. Adaptive Ridge Point Refinement for Seeds Detection in X-Ray Coronary Angiogram

    Directory of Open Access Journals (Sweden)

    Ruoxiu Xiao

    2015-01-01

    Full Text Available A seed point is a prerequisite for tracking-based methods that extract centerlines or vascular structures from an angiogram. In this paper, a novel seed point detection method for coronary artery segmentation is proposed. Vessels in the image are first enhanced according to the distribution of Hessian eigenvalues in multiscale space; consequently, the centerlines of tubular vessels are also enhanced. Ridge points are extracted as candidate seed points, which are then refined according to their mathematical definition. The theoretical feasibility of this method is also proven. Finally, all the detected ridge points are checked using a self-adaptive threshold to improve the robustness of the results. Clinical angiograms are used to evaluate the performance of the proposed algorithm, and the results show that the proposed algorithm can detect a large set of true seed points located on most branches of vessels. Compared with traditional seed point detection algorithms, the proposed method can detect a larger number of seed points with higher precision. Considering that the proposed method can achieve accurate seed detection without any human interaction, it can be utilized for several clinical applications, such as vessel segmentation, centerline extraction, and topological identification.

  18. An integration time adaptive control method for atmospheric composition detection of occultation

    Science.gov (United States)

    Ding, Lin; Hou, Shuai; Yu, Fei; Liu, Cheng; Li, Chao; Zhe, Lin

    2018-01-01

    When the sun is used as the light source for atmospheric composition detection, it is necessary to image the sun for accurate identification and stable tracking. Over the course of the 180 seconds of an occultation, the intensity of sunlight passing through the atmosphere changes greatly: the illumination varies by a factor of nearly 1100 between maximum and minimum atmospheric attenuation, and the change is so rapid that the intensity can vary by a factor of 2.9 per second. It is therefore difficult to control the integration time of the sun-imaging camera. In this paper, a novel adaptive integration time control method for occultation is presented. The method takes the distribution of gray values in the image as the reference variable and applies speed-integral PID control concepts to the adaptive integration time control problem of high-frequency imaging. Large-dynamic-range automatic control of the integration time during the occultation can thus be achieved.

  19. Adaptive Sensor Tuning for Seismic Event Detection in Environment with Electromagnetic Noise

    Science.gov (United States)

    Ziegler, Abra E.

    The goal of this research is to detect possible microseismic events at a carbon sequestration site. Data recorded on a continuous downhole microseismic array in the Farnsworth Field, an oil field in Northern Texas that hosts an ongoing carbon capture, utilization, and storage project, were evaluated using machine learning and reinforcement learning techniques to determine their effectiveness at seismic event detection on a dataset with electromagnetic noise. The data were recorded from a passive vertical monitoring array consisting of 16 levels of 3-component 15 Hz geophones installed in the field and continuously recording since January 2014. Electromagnetic and other noise recorded on the array has significantly impacted the utility of the data and it was necessary to characterize and filter the noise in order to attempt event detection. Traditional detection methods using short-term average/long-term average (STA/LTA) algorithms were evaluated and determined to be ineffective because of changing noise levels. To improve the performance of event detection and automatically and dynamically detect seismic events using effective data processing parameters, an adaptive sensor tuning (AST) algorithm developed by Sandia National Laboratories was utilized. AST exploits neuro-dynamic programming (reinforcement learning) trained with historic event data to automatically self-tune and determine optimal detection parameter settings. The key metric that guides the AST algorithm is consistency of each sensor with its nearest neighbors: parameters are automatically adjusted on a per station basis to be more or less sensitive to produce consistent agreement of detections in its neighborhood. The effects that changes in neighborhood configuration have on signal detection were explored, as it was determined that neighborhood-based detections significantly reduce the number of both missed and false detections in ground-truthed data. The performance of the AST algorithm was
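
The STA/LTA baseline that the adaptive sensor tuning approach was compared against can be sketched in a minimal windowed form; the window lengths, trigger ratio, and toy trace below are illustrative, not the study's settings.

```python
# Minimal windowed STA/LTA trigger: flag samples where the short-term average
# amplitude exceeds the long-term average by a trigger ratio.
def sta_lta(trace, nsta=5, nlta=20, trigger=3.0):
    """Return sample indices where STA/LTA exceeds the trigger ratio."""
    triggers = []
    for i in range(nlta, len(trace)):
        sta = sum(abs(x) for x in trace[i - nsta:i]) / nsta
        lta = sum(abs(x) for x in trace[i - nlta:i]) / nlta
        if lta > 0 and sta / lta > trigger:
            triggers.append(i)
    return triggers

# Quiet background with a 5-sample burst starting at sample 60.
trace = [0.1] * 100
for i in range(60, 65):
    trace[i] = 5.0
hits = sta_lta(trace)
```

The detector fires one sample after the burst enters the short-term window. The abstract's point is that fixed parameters like these fail under changing noise levels, which is exactly what the neighborhood-consistency tuning of AST adjusts per station.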

  20. Adapting astronomical source detection software to help detect animals in thermal images obtained by unmanned aerial systems

    Science.gov (United States)

    Longmore, S. N.; Collins, R. P.; Pfeifer, S.; Fox, S. E.; Mulero-Pazmany, M.; Bezombes, F.; Goodwind, A.; de Juan Ovelar, M.; Knapen, J. H.; Wich, S. A.

    2017-02-01

    In this paper we describe an unmanned aerial system equipped with a thermal-infrared camera and software pipeline that we have developed to monitor animal populations for conservation purposes. Taking a multi-disciplinary approach to tackle this problem, we use freely available astronomical source detection software and the associated expertise of astronomers, to efficiently and reliably detect humans and animals in aerial thermal-infrared footage. Combining this astronomical detection software with existing machine learning algorithms into a single, automated, end-to-end pipeline, we test the software using aerial video footage taken in a controlled, field-like environment. We demonstrate that the pipeline works reliably and describe how it can be used to estimate the completeness of different observational datasets to objects of a given type as a function of height, observing conditions etc. - a crucial step in converting video footage to scientifically useful information such as the spatial distribution and density of different animal species. Finally, having demonstrated the potential utility of the system, we describe the steps we are taking to adapt the system for work in the field, in particular systematic monitoring of endangered species at National Parks around the world.

  1. Detecting consistent patterns of directional adaptation using differential selection codon models.

    Science.gov (United States)

    Parto, Sahar; Lartillot, Nicolas

    2017-06-23

    Phylogenetic codon models are often used to characterize the selective regimes acting on protein-coding sequences. Recent methodological developments have led to models explicitly accounting for the interplay between mutation and selection, by modeling the amino acid fitness landscape along the sequence. However, thus far, most of these models have assumed that the fitness landscape is constant over time. Fluctuations of the fitness landscape may often be random or depend on complex and unknown factors. However, some organisms may be subject to systematic changes in selective pressure, resulting in reproducible molecular adaptations across independent lineages subject to similar conditions. Here, we introduce a codon-based differential selection model, which aims to detect and quantify the fine-grained consistent patterns of adaptation at the protein-coding level, as a function of external conditions experienced by the organism under investigation. The model parameterizes the global mutational pressure, as well as the site- and condition-specific amino acid selective preferences. This phylogenetic model is implemented in a Bayesian MCMC framework. After validation with simulations, we applied our method to a dataset of HIV sequences from patients with known HLA genetic background. Our differential selection model detects and characterizes differentially selected coding positions specifically associated with two different HLA alleles. Our differential selection model is able to identify consistent molecular adaptations as a function of repeated changes in the environment of the organism. These models can be applied to many other problems, ranging from viral adaptation to evolution of life-history strategies in plants or animals.

  2. A Fiber Bragg Grating Interrogation System with Self-Adaption Threshold Peak Detection Algorithm.

    Science.gov (United States)

    Zhang, Weifang; Li, Yingwu; Jin, Bo; Ren, Feifei; Wang, Hongxun; Dai, Wei

    2018-04-08

    A Fiber Bragg Grating (FBG) interrogation system with a self-adaption threshold peak detection algorithm is proposed and experimentally demonstrated in this study. This system is composed of a field programmable gate array (FPGA) and advanced RISC machine (ARM) platform, tunable Fabry-Perot (F-P) filter and optical switch. To improve system resolution, the F-P filter was employed. As this filter is non-linear, this causes the shifting of central wavelengths with the deviation compensated by the parts of the circuit. Time-division multiplexing (TDM) of FBG sensors is achieved by an optical switch, with the system able to realize the combination of 256 FBG sensors. The wavelength scanning speed of 800 Hz can be achieved by a FPGA+ARM platform. In addition, a peak detection algorithm based on a self-adaption threshold is designed and the peak recognition rate is 100%. Experiments with different temperatures were conducted to demonstrate the effectiveness of the system. Four FBG sensors were examined in the thermal chamber without stress. When the temperature changed from 0 °C to 100 °C, the degree of linearity between central wavelengths and temperature was about 0.999 with the temperature sensitivity being 10 pm/°C. The static interrogation precision was able to reach 0.5 pm. Through the comparison of different peak detection algorithms and interrogation approaches, the system was verified to have an optimum comprehensive performance in terms of precision, capacity and speed.
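
A self-adaptive threshold peak detector of the kind described can be sketched as follows. The mean-plus-k-standard-deviations rule, the value of k, and the toy spectrum are assumptions for illustration, not the paper's exact algorithm.

```python
# Sketch: peak detection on an FBG reflection spectrum with a threshold that
# adapts to each scan's own statistics (mean + k * std), so it tracks changes
# in the noise floor between scans.
from statistics import mean, pstdev

def detect_peaks(spectrum, k=3.0):
    thr = mean(spectrum) + k * pstdev(spectrum)
    peaks = [i for i in range(1, len(spectrum) - 1)
             if spectrum[i] > thr
             and spectrum[i] >= spectrum[i - 1]
             and spectrum[i] >= spectrum[i + 1]]
    return peaks, thr

# Toy spectrum: flat noise floor with two Bragg peaks at samples 30 and 70.
spectrum = [1.0] * 100
spectrum[29], spectrum[30], spectrum[31] = 5.0, 10.0, 5.0
spectrum[69], spectrum[70], spectrum[71] = 6.0, 12.0, 6.0
peaks, thr = detect_peaks(spectrum)
```

Both Bragg peaks are recovered while the shoulders and the noise floor stay below the adaptive threshold; in a real interrogator the peak index would then be mapped to a central wavelength via the F-P filter calibration.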

  3. Sanshool on The Fingertip Interferes with Vibration Detection in a Rapidly-Adapting (RA) Tactile Channel.

    Directory of Open Access Journals (Sweden)

    Scinob Kuroki

    Full Text Available An Asian spice, Szechuan pepper (sanshool), is well known for the tingling sensation it induces on the mouth and the lips. Electrophysiological studies have revealed that its active ingredient can induce firing of mechanoreceptor fibres that typically respond to mechanical vibration. Moreover, a human behavioral study has reported that the perceived frequency of sanshool-induced tingling matches the preferred frequency range of the tactile rapidly adapting (RA) channel, suggesting the contribution of sanshool-induced RA channel firing to its unique perceptual experience. However, since the RA channel may not be the only channel activated by sanshool, the sanshool tingling percept may be caused in whole or in part by other sensory channels. Here, by using a perceptual interference paradigm, we show that the sanshool-induced RA input indeed contributes to human tactile processing. The absolute detection thresholds for vibrotactile input were measured with and without sanshool application on the fingertip. Sanshool significantly impaired detection of vibrations at 30 Hz (the RA-channel-dominant frequency), but did not impair detection of higher-frequency vibrations at 240 Hz (the Pacinian corpuscle (PC) channel-dominant frequency) or lower-frequency vibrations at 1 Hz (the slowly adapting 1 (SA1) channel-dominant frequency). These results show that sanshool induces a peripheral RA channel activation that is relevant for tactile perception. This anomalous activation of RA channels may contribute to the unique tingling experience of sanshool.

  4. A Fiber Bragg Grating Interrogation System with Self-Adaption Threshold Peak Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Weifang Zhang

    2018-04-01

    Full Text Available A Fiber Bragg Grating (FBG) interrogation system with a self-adaption threshold peak detection algorithm is proposed and experimentally demonstrated in this study. The system is composed of a field programmable gate array (FPGA) and advanced RISC machine (ARM) platform, a tunable Fabry–Perot (F–P) filter and an optical switch. The F–P filter was employed to improve system resolution. Because the filter's response is non-linear, the central wavelengths shift; this deviation is compensated by parts of the circuit. Time-division multiplexing (TDM) of FBG sensors is achieved by an optical switch, allowing the system to interrogate a combination of 256 FBG sensors. A wavelength scanning speed of 800 Hz is achieved by an FPGA+ARM platform. In addition, a peak detection algorithm based on a self-adaption threshold is designed, and its peak recognition rate is 100%. Experiments at different temperatures were conducted to demonstrate the effectiveness of the system. Four FBG sensors were examined in a thermal chamber without stress. When the temperature changed from 0 °C to 100 °C, the linearity between central wavelengths and temperature was about 0.999, with a temperature sensitivity of 10 pm/°C. The static interrogation precision reached 0.5 pm. Comparison with different peak detection algorithms and interrogation approaches verified that the system has the best overall performance in terms of precision, capacity and speed.
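
The self-adapting threshold idea can be illustrated with a minimal sketch (a toy illustration, not the authors' algorithm: the statistic mean + k·std, the constant k = 3 and the synthetic two-peak trace are all assumptions):

```python
import numpy as np

def detect_peaks(spectrum, k=3.0):
    """Local maxima above a threshold derived from the scan itself,
    so the threshold tracks baseline and noise changes between scans."""
    threshold = spectrum.mean() + k * spectrum.std()
    return [i for i in range(1, len(spectrum) - 1)
            if spectrum[i] > threshold
            and spectrum[i] >= spectrum[i - 1]
            and spectrum[i] > spectrum[i + 1]]

# Synthetic FBG reflection trace: two Gaussian peaks on a noisy baseline
x = np.arange(1000)
trace = (np.exp(-0.5 * ((x - 300) / 5) ** 2)
         + np.exp(-0.5 * ((x - 700) / 5) ** 2)
         + 0.01 * np.random.default_rng(0).standard_normal(1000))
peaks = detect_peaks(trace)
print(peaks)  # indices clustered near 300 and 700
```

Because the threshold is recomputed from each scan, no manual recalibration is needed when the source power or baseline drifts between scans.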

  5. Detecting the presence of a magnetic field under Gaussian and non-Gaussian noise by adaptive measurement

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yuan-Mei; Li, Jun-Gang, E-mail: jungl@bit.edu.cn; Zou, Jian

    2017-06-15

    Highlights: • An adaptive measurement strategy is used to detect the presence of a magnetic field. • Gaussian Ornstein–Uhlenbeck noise and non-Gaussian noise are considered. • Weaker magnetic fields may be more easily detected than some stronger ones. - Abstract: Using the adaptive measurement method, we study how to detect whether a weak magnetic field is actually present under Gaussian noise and non-Gaussian noise. We find that the adaptive measurement method can effectively improve the detection accuracy. For Gaussian noise, the stronger the magnetic field, the easier it is to detect. Counterintuitively, for non-Gaussian noise, some weaker magnetic fields are more likely to be detected than some stronger ones. Finally, we give a reasonable physical interpretation.

  6. Kohn anomalies in superconductors

    International Nuclear Information System (INIS)

    Flatte, M.E.

    1994-01-01

    The detailed behavior of phonon dispersion curves near momenta which span the electronic Fermi sea in a superconductor is presented. An anomaly, similar to the metallic Kohn anomaly, exists in a superconductor's dispersion curves when the frequency of the phonon spanning the Fermi sea exceeds twice the superconducting energy gap. This anomaly occurs at approximately the same momentum but is stronger than the normal-state Kohn anomaly. It also survives at finite temperature, unlike the metallic anomaly. Determination of Fermi-surface diameters from the location of these anomalies may therefore be more successful in the superconducting phase than in the normal state. However, the superconductor's anomaly fades rapidly with increased phonon frequency and becomes unobservable when the phonon frequency greatly exceeds the gap. This constraint makes these anomalies useful only in high-temperature superconductors such as La1.85Sr0.15CuO4.

  7. Detection of circuit-board components with an adaptive multiclass correlation filter

    Science.gov (United States)

    Diaz-Ramirez, Victor H.; Kober, Vitaly

    2008-08-01

    A new method for reliable detection of circuit-board components is proposed. The method is based on an adaptive multiclass composite correlation filter. The filter is designed with the help of an iterative algorithm using complex synthetic discriminant functions. The impulse response of the filter contains information needed to localize and classify geometrically distorted circuit-board components belonging to different classes. Computer simulation results obtained with the proposed method are provided and compared with those of known multiclass correlation-based techniques in terms of performance criteria for recognition and classification of objects.
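
The core operation underlying such correlation filters, locating a target as the peak of an FFT-based cross-correlation, can be sketched as follows (a toy single-class example; the iterative multiclass synthetic-discriminant-function design itself is not reproduced, and the scene is synthetic):

```python
import numpy as np

def correlate_detect(scene, template):
    """Return the (row, col) of the cross-correlation peak, i.e. the
    top-left corner of the best match of `template` in `scene`."""
    # Correlation theorem: corr = IFFT( FFT(scene) * conj(FFT(template)) )
    H = np.conj(np.fft.fft2(template, s=scene.shape))
    corr = np.real(np.fft.ifft2(np.fft.fft2(scene) * H))
    return np.unravel_index(np.argmax(corr), corr.shape)

rng = np.random.default_rng(9)
scene = rng.normal(0.0, 0.1, (64, 64))   # cluttered board image (synthetic)
scene[20:25, 30:35] += 1.0               # a bright 5x5 "component"
template = np.ones((5, 5))
loc = correlate_detect(scene, template)
print(loc)  # (20, 30)
```

A composite filter replaces the single template's spectrum H with one trained over distorted views of several component classes, but the detection step stays this same correlation-and-peak-search.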

  8. Improved spectral kurtosis with adaptive redundant multiwavelet packet and its applications for rotating machinery fault detection

    International Nuclear Information System (INIS)

    Chen, Jinglong; Zi, Yanyang; He, Zhengjia; Yuan, Jing

    2012-01-01

    Detecting faults in rotating machinery is significant for effectively avoiding serious accidents and huge economic losses. However, because the vibration signal is non-stationary and nonlinear, detecting and extracting the fault feature is a challenging task. Therefore, a novel method called improved spectral kurtosis (ISK) with adaptive redundant multiwavelet packet (ARMP) is proposed for this task. Spectral kurtosis (SK) has been proved to be a powerful tool to detect and characterize non-stationary signals. To overcome the filter limitation of SK, enhance the resolution of spectral analysis and match the fault feature optimally, the ARMP is introduced into the SK. Moreover, since kurtosis does not reflect the actual trend of periodic impulses, the SK is improved by incorporating an evaluation index called envelope spectrum entropy as a supplement. The proposed method is applied to rolling element bearing and gear fault detection to validate its reliability and effectiveness. Compared with the conventional frequency spectrum, envelope spectrum, original SK and some single-wavelet methods, the results indicate that it improves the accuracy of frequency-band selection and enhances the ability of rotating machinery fault detection. (paper)
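
The baseline spectral-kurtosis statistic that the paper builds on can be sketched as follows (a plain STFT-based SK with assumed window and hop sizes, not the ISK/ARMP method itself); a repetitive impulsive fault raises SK well above its near-zero value for stationary Gaussian noise:

```python
import numpy as np

def spectral_kurtosis(x, nperseg=128):
    """Kurtosis-like statistic per frequency bin across STFT frames:
    SK = <|X|^4> / <|X|^2>^2 - 2, which is ~0 for stationary Gaussian noise."""
    win = np.hanning(nperseg)
    frames = [x[i:i + nperseg] * win
              for i in range(0, len(x) - nperseg + 1, nperseg // 2)]
    P = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # power, frame x bin
    return (P ** 2).mean(axis=0) / P.mean(axis=0) ** 2 - 2.0

rng = np.random.default_rng(1)
noise = rng.standard_normal(8192)
faulty = noise.copy()
faulty[::512] += 20.0   # periodic impacts, as from a bearing defect
sk_noise = spectral_kurtosis(noise)
sk_fault = spectral_kurtosis(faulty)
print(sk_noise.mean(), sk_fault.mean())
```

The frequency bands where SK is largest are the ones a kurtogram-style method would select for envelope analysis.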

  9. Tracheobronchial Branching Anomalies

    International Nuclear Information System (INIS)

    Hong, Min Ji; Kim, Young Tong; Jou, Sung Shick; Park, A Young

    2010-01-01

    There are various congenital anomalies with respect to the number, length, diameter, and location of tracheobronchial branching patterns. The tracheobronchial anomalies are classified into two groups. The first one, anomalies of division, includes tracheal bronchus, cardiac bronchus, tracheal diverticulum, pulmonary isomerism, and minor variations. The second one, dysmorphic lung, includes lung agenesis-hypoplasia complex and lobar agenesis-aplasia complex.

  10. Tracheobronchial Branching Anomalies

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Min Ji; Kim, Young Tong; Jou, Sung Shick [Soonchunhyang University, Cheonan Hospital, Cheonan (Korea, Republic of); Park, A Young [Soonchunhyang University College of Medicine, Asan (Korea, Republic of)

    2010-04-15

    There are various congenital anomalies with respect to the number, length, diameter, and location of tracheobronchial branching patterns. The tracheobronchial anomalies are classified into two groups. The first one, anomalies of division, includes tracheal bronchus, cardiac bronchus, tracheal diverticulum, pulmonary isomerism, and minor variations. The second one, dysmorphic lung, includes lung agenesis-hypoplasia complex and lobar agenesis-aplasia complex.

  11. A Modified Adaptive Stochastic Resonance for Detecting Faint Signal in Sensors

    Directory of Open Access Journals (Sweden)

    Hengwei Li

    2007-02-01

    Full Text Available In this paper, an approach is presented to detect faint signals with strong noise in sensors by stochastic resonance (SR). We adopt the power spectrum, obtained by the fast Fourier transform (FFT), as the evaluation tool for SR. Furthermore, we introduce an adaptive filtering scheme to carry out the signal processing automatically. The key to the scheme is how to adjust the barrier height to satisfy the optimal condition of SR in the presence of any input. For the given input signal, we present an operable procedure to execute the adjustment scheme. An example utilizing one audio sensor to detect fault information from the power supply is given. Simulation results show that th
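
A generic bistable stochastic-resonance loop of this kind can be sketched as follows (all parameter values, and the use of a simple parameter scan in place of the paper's adjustment procedure, are assumptions); the power-spectrum bin at the signal frequency serves as the evaluation tool, as in the abstract:

```python
import numpy as np

def sr_output_power(a, b=1.0, f0=0.01, amp=0.1, noise=0.3, dt=0.05, n=20000, seed=8):
    """Euler-integrate the bistable system dx/dt = a*x - b*x**3 + signal + noise
    and return the FFT power at the signal frequency f0.  The potential
    barrier height a**2 / (4*b) is the quantity being tuned via `a`."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    for i in range(1, n):
        t = i * dt
        drift = a * x[i - 1] - b * x[i - 1] ** 3 + amp * np.sin(2 * np.pi * f0 * t)
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
    spec = np.abs(np.fft.rfft(x)) ** 2 / n
    freqs = np.fft.rfftfreq(n, dt)
    return spec[np.argmin(np.abs(freqs - f0))]

# Crude "adaptive" step: scan several barrier settings, keep the strongest response
candidates = [0.25, 0.5, 1.0, 2.0]
powers = {a: sr_output_power(a) for a in candidates}
best = max(powers, key=powers.get)
print(powers, best)
```

An adaptive scheme would replace the fixed scan with an update rule that nudges the barrier parameter toward larger output power at f0.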

  12. Culture- and molecular-based detection of swine-adapted Salmonella shed by avian scavengers.

    Science.gov (United States)

    Blanco, Guillermo; Díaz de Tuesta, Juan A

    2018-04-13

    Salmonella can play an important role as a disease agent in wildlife, which can then act as carriers and reservoirs of sanitary importance at the livestock-human interface. Transmission from livestock to avian scavengers can occur when these species consume contaminated carcasses and meat remains in supplementary feeding stations and rubbish dumps. We compared the performance of PCR-based detection with conventional culture-based methods to detect Salmonella in the faeces of red kites (Milvus milvus) and griffon vultures (Gyps fulvus) in central Spain. The occurrence of culturable Salmonella was intermediate in red kites (1.9%, n=52) and high in griffon vultures (26.3%, n=99). These proportions were clearly higher with PCR-based detection (13.5% and 40.4%, respectively). Confirmation cultures failed to grow Salmonella in all faecal samples positive by the molecular assay but negative by the initial conventional culture in both scavenger species, indicating the occurrence of false (non-culturable) positives by PCR-based detection. This suggests that the molecular assay is highly sensitive to detecting viable Salmonella in cultures, but also partial genomes and dead or unviable bacteria from past infections or contamination. Thus, the actual occurrence of Salmonella in a particular sampling time period can be underestimated when using only culture detection. The serovars found in the scavenger faeces were among the most frequently isolated in pigs from Spain and other EU countries, especially those generally recognized as swine-adapted monophasic variants of S. Typhimurium. Because the studied species obtain much of their food from pig carcasses, this livestock may be the primary source of Salmonella via direct ingestion of infected carcasses and indirectly via contamination due to the unsanitary conditions found in supplementary feeding stations established for scavenger conservation. Combining culture- and molecular-based detection is encouraged to understand the

  13. An Anomaly Detector Based on Multi-aperture Mapping for Hyperspectral Data

    Directory of Open Access Journals (Sweden)

    LI Min

    2016-10-01

    Full Text Available Because the spectral content of anomalies is correlated with that of the clutter background, inaccurate selection of background pixels introduces estimation error into the background model. In order to solve this problem, a multi-aperture-mapping-based anomaly detector is proposed in this paper. Firstly, unlike a background model, which focuses on extracting features of the background only, the multi-aperture mapping characterizes the whole hyperspectral data set. Based on the constructed basis set of the multi-aperture mapping, an anomaly salience index is proposed for every test pixel to measure its relative statistical difference. Secondly, in order to analyze moderately salient anomalies precisely, a membership value is constructed, based on fuzzy logic theory, to grade the anomaly salience of test pixels continuously. At the same time, a weighted iterative estimation of the multi-aperture mapping is expected to converge adaptively, with the membership value as the weight. Thirdly, classical defuzzification is proposed to fuse the different detection results. Hyperspectral data were used in the experiments, and the robustness of the proposed detector and its sensitivity to anomalies of lower salience were tested.

  14. A novel ship CFAR detection algorithm based on adaptive parameter enhancement and wake-aided detection in SAR images

    Science.gov (United States)

    Meng, Siqi; Ren, Kan; Lu, Dongming; Gu, Guohua; Chen, Qian; Lu, Guojun

    2018-03-01

    Synthetic aperture radar (SAR) is an indispensable and useful method for marine monitoring. As SAR sensors increase in number and capability, high-resolution images can be acquired that contain more target structure information, such as finer spatial details. This paper presents a novel adaptive parameter transform (APT) domain constant false alarm rate (CFAR) detector to highlight targets. The whole method is based on the APT domain value. Firstly, the image is mapped to the new transform domain by the algorithm. Secondly, false candidate target pixels are screened out by the CFAR detector to highlight the target ships. Thirdly, the ship pixels are replaced by homogeneous sea pixels, and the enhanced image is then processed by the Niblack algorithm to obtain a binary wake image. Finally, the normalized Hough transform (NHT) is used to detect wakes in the binary image, as verification of the presence of the ships. Experiments on real SAR images validate that the proposed transform does enhance the target structure and improve the contrast of the image. The algorithm performs well in both ship and ship-wake detection.
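
For reference, the CFAR stage at the heart of such detectors can be sketched as a one-dimensional cell-averaging CFAR (a textbook CA-CFAR over exponential sea clutter; the paper's APT-domain mapping and wake verification are not reproduced, and all numbers are illustrative):

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, pfa=1e-3):
    """Flag cells whose power exceeds a threshold scaled from the mean of
    the surrounding training cells (guard cells excluded)."""
    n = 2 * train
    alpha = n * (pfa ** (-1.0 / n) - 1.0)   # scaling for exponential clutter
    hits = []
    for i in range(train + guard, len(power) - train - guard):
        left = power[i - train - guard:i - guard]
        right = power[i + guard + 1:i + guard + 1 + train]
        local_noise = (left.sum() + right.sum()) / n
        if power[i] > alpha * local_noise:
            hits.append(i)
    return hits

rng = np.random.default_rng(2)
clutter = rng.exponential(1.0, 500)   # sea clutter (exponential power)
clutter[250] = 80.0                   # one bright ship pixel
hits = ca_cfar(clutter)
print(hits)
```

Because the threshold adapts to the local clutter estimate, the false alarm rate stays near the design value even when the background level varies along the image.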

  15. Using principal component analysis for selecting network behavioral anomaly metrics

    Science.gov (United States)

    Gregorio-de Souza, Ian; Berk, Vincent; Barsamian, Alex

    2010-04-01

    This work addresses new approaches to behavioral analysis of networks and hosts for the purposes of security monitoring and anomaly detection. Most commonly used approaches simply implement anomaly detectors for one, or a few, simple metrics, and those metrics can exhibit unacceptable false alarm rates. For instance, if the anomaly score of network communication is defined as the reciprocal of the likelihood that a given host uses a particular protocol (or destination), this definition may result in an unrealistically high threshold for alerting in order to avoid being flooded by false positives. We demonstrate that selecting and adapting the metrics and thresholds on a host-by-host or protocol-by-protocol basis can be done by established multivariate analyses such as PCA. We show how to determine one or more metrics, for each network host, that record the highest available amount of information regarding the baseline behavior and show relevant deviances reliably. We describe the methodology used to pick from a large selection of available metrics, and illustrate a method for comparing the resulting classifiers. Using our approach we are able to reduce the resources required to properly identify misbehaving hosts, protocols, or networks, by dedicating system resources to only those metrics that actually matter in detecting network deviations.
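
As a toy illustration of ranking candidate metrics by their PCA loadings (the metric names and all numbers below are invented; the point is only the mechanics of picking the metric that carries the most baseline variance):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200
# Columns: candidate per-host metrics.  The raw spread is deliberately kept
# (no standardization), because spread is exactly what the ranking should see.
metrics = np.column_stack([
    rng.normal(100, 30, n),   # bytes/s: highly variable baseline behaviour
    rng.normal(10, 3, n),     # connections/s
    rng.normal(5, 0.1, n),    # distinct protocols: nearly constant
])
centered = metrics - metrics.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                     # direction of most behavioural variance
ranked = list(np.argsort(-np.abs(pc1)))  # metrics ranked by |PC1 loading|
print(ranked)
```

Repeating this per host yields a per-host shortlist of metrics worth monitoring, which is the resource-saving effect the abstract describes.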

  16. Phenotypic- and Genotypic-Resistance Detection for Adaptive Resistance Management in Tetranychus urticae Koch.

    Directory of Open Access Journals (Sweden)

    Deok Ho Kwon

    Full Text Available Rapid resistance detection is necessary for the adaptive management of acaricide-resistant populations of Tetranychus urticae. Detection of phenotypic and genotypic resistance was conducted by employing residual contact vial bioassay (RCV) and quantitative sequencing (QS) methods, respectively. RCV was useful for detecting the acaricide resistance levels of T. urticae, particularly for on-site resistance detection; however, it was only applicable to rapid-acting acaricides (12 out of 19 tested acaricides). QS was effective for determining the frequencies of resistance alleles on a population basis, which corresponded to 12 nonsynonymous point mutations associated with target-site resistance to five types of acaricides [organophosphates (monocrotophos, pirimiphos-methyl, dimethoate and chlorpyrifos), pyrethroids (fenpropathrin and bifenthrin), abamectin, bifenazate and etoxazole]. Most field-collected mites exhibited high levels of multiple resistance, as determined by RCV and QS data, suggesting the seriousness of their current acaricide resistance status in rose cultivation areas in Korea. The correlation analyses revealed moderate to high levels of positive relationships between the resistance allele frequencies and the actual resistance levels in only five of the acaricides evaluated, which limits the general application of allele frequency as a direct indicator for estimating actual resistance levels. Nevertheless, the resistance allele frequency data alone allowed for the evaluation of the genetic resistance potential and background of the test mite populations. The combined use of RCV and QS provides basic information on resistance levels, which is essential for choosing appropriate acaricides for the management of resistant T. urticae.

  17. SuBSENSE: a universal change detection method with local adaptive sensitivity.

    Science.gov (United States)

    St-Charles, Pierre-Luc; Bilodeau, Guillaume-Alexandre; Bergevin, Robert

    2015-01-01

    Foreground/background segmentation via change detection in video sequences is often used as a stepping stone in high-level analytics and applications. Despite the wide variety of methods that have been proposed for this problem, none has been able to fully address the complex nature of dynamic scenes in real surveillance tasks. In this paper, we present a universal pixel-level segmentation method that relies on spatiotemporal binary features as well as color information to detect changes. This allows camouflaged foreground objects to be detected more easily while most illumination variations are ignored. Moreover, instead of using manually set, frame-wide constants to dictate model sensitivity and adaptation speed, we use pixel-level feedback loops to dynamically adjust our method's internal parameters without user intervention. These adjustments are based on the continuous monitoring of model fidelity and local segmentation noise levels. This new approach enables us to outperform all 32 previously tested state-of-the-art methods on the 2012 and 2014 versions of the ChangeDetection.net dataset in terms of overall F-Measure. The use of local binary image descriptors for pixel-level modeling also facilitates high-speed parallel implementations: our own version, which used no low-level or architecture-specific instructions, reached real-time processing speed on a midlevel desktop CPU. A complete C++ implementation based on OpenCV is available online.

  18. Adaptive Kalman Filter Based on Adjustable Sampling Interval in Burst Detection for Water Distribution System

    Directory of Open Access Journals (Sweden)

    Doo Yong Choi

    2016-04-01

    Full Text Available Rapid detection of bursts and leaks in water distribution systems (WDSs) can reduce the social and economic costs incurred through direct loss of water into the ground, additional energy demand for water supply, and service interruptions. Many real-time burst detection models have been developed in accordance with the use of supervisory control and data acquisition (SCADA) systems and the establishment of district meter areas (DMAs). Nonetheless, no consideration has been given to how frequently a flow meter measures and transmits data for predicting breaks and leaks in pipes. This paper analyzes the effect of sampling interval when an adaptive Kalman filter is used for detecting bursts in a WDS. A new sampling algorithm is presented that adjusts the sampling interval depending on the normalized residuals of flow after filtering. The proposed algorithm is applied to a virtual sinusoidal flow curve and real DMA flow data obtained from Jeongeup city in South Korea. The simulation results prove that the self-adjusting algorithm for determining the sampling interval is efficient and maintains reasonable accuracy in burst detection. The proposed sampling method has a significant potential for water utilities to build and operate real-time DMA monitoring systems combined with smart customer metering systems.
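
The filtering-and-residual core of such a burst detector can be sketched as a scalar Kalman filter over DMA inflow (a minimal sketch under a random-walk flow model; the noise covariances, the 3-sigma alarm rule and the flow numbers are assumptions, and the paper's interval-adjustment logic is not reproduced):

```python
import numpy as np

def kalman_burst_detect(flows, q=0.05, r=4.0, z_alarm=3.0):
    """Scalar Kalman filter; a burst is flagged when the normalized
    innovation (residual / its predicted std) exceeds z_alarm."""
    x, p = flows[0], 1.0
    alarms = []
    for t in range(1, len(flows)):
        p += q                          # predict: random-walk process noise
        s = p + r                       # innovation variance
        nu = flows[t] - x               # innovation (flow residual)
        if abs(nu) / np.sqrt(s) > z_alarm:
            alarms.append(t)
        k = p / s                       # Kalman gain
        x += k * nu                     # update state estimate
        p *= (1.0 - k)                  # update estimate variance
    return alarms

rng = np.random.default_rng(4)
flow = 50 + rng.normal(0, 2, 300)       # L/s, ordinary DMA inflow
flow[200:] += 15.0                      # sudden burst raises inflow
alarms = kalman_burst_detect(flow)
print(alarms[:5])
```

The paper's adaptive-sampling idea would shorten the meter's reporting interval whenever the normalized residual grows, and lengthen it again during quiet periods.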

  19. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    Science.gov (United States)

    2016-04-05

    Applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the... Performance analysis of hierarchical group key management integrated with adaptive intrusion detection in mobile ad hoc networks (www.elsevier.com/locate/peva). Accepted 19 September 2010; available online 26 September 2010. Keywords: Mobile ad hoc networks; Intrusion detection; Group communication systems; Group

  20. Automatic change detection in vision: Adaptation, memory mismatch, or both? II: Oddball and adaptation effects on event-related potentials.

    Science.gov (United States)

    Bodnár, Flóra; File, Domonkos; Sulykos, István; Kecskés-Kovács, Krisztina; Czigler, István

    2017-11-01

    In this study we compared the event-related potentials (ERPs) obtained in two different paradigms: a passive visual oddball paradigm and an adaptation paradigm. The aim of the study was to investigate the relation between the effects of activity decrease following an adaptor (stimulus-specific adaptation) and the effects of an infrequent stimulus within sequences of frequent ones. In Experiment 1, participants were presented with different line textures. The frequent (standard) and rare (deviant) texture elements differed in their orientation. In Experiment 2, windmill pattern stimuli were presented in which the number of vanes differentiated the deviant and standard stimuli. In Experiment 1 the ERP differences elicited between the oddball deviant and the standard were similar to the differences between the ERPs to the nonadapted and adapted stimuli in the adaptation paradigm. In both paradigms the differences appeared as a posterior negativity with a latency of 120-140 ms. This finding demonstrates that the representation of a sequential rule (successive presentation of the standard) and the violation of this rule are not necessary for deviancy effects to emerge. In Experiment 2 (windmill pattern), in the oddball paradigm the difference potentials appeared as a long-lasting negativity. In the adaptation condition, the later part of this negativity (after 200 ms) was absent. We identified the later part of the oddball difference potential as the genuine visual mismatch negativity, that is, an ERP correlate of sequence violations. The latencies of the difference potentials (deviant minus standard) and the endogenous components (P1 and N1) diverged; therefore, the adaptation of these particular ERP components cannot explain the deviancy effect. Accordingly, the sources contributing to the standard-versus-deviant modulations differed from those related to visual adaptation; that is, they generated distinct ERP components.

  1. Branchial anomalies in children.

    Science.gov (United States)

    Bajaj, Y; Ifeacho, S; Tweedie, D; Jephson, C G; Albert, D M; Cochrane, L A; Wyatt, M E; Jonas, N; Hartley, B E J

    2011-08-01

    Branchial cleft anomalies are the second most common head and neck congenital lesions seen in children. Amongst the branchial cleft malformations, second cleft lesions account for 95% of the branchial anomalies. This article analyzes all the cases of branchial cleft anomalies operated on at Great Ormond Street Hospital over the past 10 years. All children who underwent surgery for branchial cleft sinus or fistula from January 2000 to December 2010 were included in this study. In this series, we had 80 patients (38 female and 42 male). The age at the time of operation varied from 1 year to 14 years. Amongst this group, 15 patients had a first branchial cleft anomaly, 62 had a second branchial cleft anomaly and 3 had a fourth branchial pouch anomaly. All the first cleft cases were operated on by a superficial parotidectomy approach with facial nerve identification. Complete excision was achieved in all these first cleft cases. In this series of first cleft anomalies, we had one complication (temporary marginal mandibular nerve weakness). In the 62 children with second branchial cleft anomalies, 50 were unilateral and 12 were bilateral. In the vast majority, the tract extended through the carotid bifurcation up to the pharyngeal constrictor muscles. The majority of these cases were operated on through an elliptical incision around the external opening. Complete excision was achieved in all second cleft cases except one, which required a repeat excision. In this subgroup, we had two complications: one patient developed a seroma and one had an incomplete excision. The three patients with a fourth pouch anomaly were treated with endoscopically assisted monopolar diathermy to the sinus opening, with good outcome. Branchial anomalies are relatively common in children. There are three distinct types: first cleft, second cleft and fourth pouch anomaly. Correct diagnosis is essential to avoid inadequate surgery and multiple procedures. The surgical approach needs to be tailored to the type

  2. Aircraft Abnormal Conditions Detection, Identification, and Evaluation Using Innate and Adaptive Immune Systems Interaction

    Science.gov (United States)

    Al Azzawi, Dia

    Abnormal flight conditions play a major role in aircraft accidents frequently causing loss of control. To ensure aircraft operation safety in all situations, intelligent system monitoring and adaptation must rely on accurately detecting the presence of abnormal conditions as soon as they take place, identifying their root cause(s), estimating their nature and severity, and predicting their impact on the flight envelope. Due to the complexity and multidimensionality of the aircraft system under abnormal conditions, these requirements are extremely difficult to satisfy using existing analytical and/or statistical approaches. Moreover, current methodologies have addressed only isolated classes of abnormal conditions and a reduced number of aircraft dynamic parameters within a limited region of the flight envelope. This research effort aims at developing an integrated and comprehensive framework for the aircraft abnormal conditions detection, identification, and evaluation based on the artificial immune systems paradigm, which has the capability to address the complexity and multidimensionality issues related to aircraft systems. Within the proposed framework, a novel algorithm was developed for the abnormal conditions detection problem and extended to the abnormal conditions identification and evaluation. The algorithm and its extensions were inspired from the functionality of the biological dendritic cells (an important part of the innate immune system) and their interaction with the different components of the adaptive immune system. Immunity-based methodologies for re-assessing the flight envelope at post-failure and predicting the impact of the abnormal conditions on the performance and handling qualities are also proposed and investigated in this study. The generality of the approach makes it applicable to any system. Data for artificial immune system development were collected from flight tests of a supersonic research aircraft within a motion-based flight

  3. Integrating age in the detection and mapping of incongruous patches in coffee (Coffea arabica) plantations using multi-temporal Landsat 8 NDVI anomalies

    Science.gov (United States)

    Chemura, Abel; Mutanga, Onisimo; Dube, Timothy

    2017-05-01

    The development of cost-effective, reliable and easy to implement crop condition monitoring methods is urgently required for perennial tree crops such as coffee (Coffea arabica), as they are grown over large areas and represent long term and higher levels of investment. These monitoring methods are useful in identifying farm areas that experience poor crop growth, pest infestation, diseases outbreaks and/or to monitor response to management interventions. This study compares field level coffee mean NDVI and LSWI anomalies and age-adjusted coffee mean NDVI and LSWI anomalies in identifying and mapping incongruous patches across perennial coffee plantations. To achieve this objective, we first derived deviation of coffee pixels from the global coffee mean NDVI and LSWI values of nine sequential Landsat 8 OLI image scenes. We then evaluated the influence of coffee age class (young, mature and old) on Landsat-scale NDVI and LSWI values using a one-way ANOVA and since results showed significant differences, we adjusted NDVI and LSWI anomalies for age-class. We then used the cumulative inverse distribution function (α ≤ 0.05) to identify fields and within field areas with excessive deviation of NDVI and LSWI from the global and the age-expected mean for each of the Landsat 8 OLI scene dates spanning three seasons. Results from accuracy assessment indicated that it was possible to separate incongruous and healthy patches using these anomalies and that using NDVI performed better than using LSWI for both global and age-adjusted mean anomalies. Using the age-adjusted anomalies performed better in separating incongruous and healthy patches than using the global mean for both NDVI (Overall accuracy = 80.9% and 68.1% respectively) and for LSWI (Overall accuracy = 68.1% and 48.9% respectively). When applied to other Landsat 8 OLI scenes, the results showed that the proportions of coffee fields that were modelled incongruent decreased with time for the young age category and
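
The age-adjustment step can be illustrated with a toy calculation (the class means, noise level and field values below are invented): a healthy young field deviates strongly from the global coffee mean NDVI, yet is unremarkable once compared against its own age-class mean, which is why the age-adjusted anomalies separate incongruous patches better:

```python
import numpy as np

rng = np.random.default_rng(6)
ages = np.repeat(["young", "mature", "old"], 100)
typical = {"young": 0.45, "mature": 0.70, "old": 0.60}   # assumed class means
ndvi = np.array([typical[a] for a in ages]) + rng.normal(0, 0.03, 300)

# Global anomaly: deviation of each field from the all-field mean NDVI
global_anom = ndvi - ndvi.mean()

# Age-adjusted anomaly: deviation from the field's own age-class mean
class_mean = {a: ndvi[ages == a].mean() for a in np.unique(ages)}
age_anom = ndvi - np.array([class_mean[a] for a in ages])

# Field 0 is a healthy young field: it sits well below the global mean
# (and would be flagged), but is unremarkable within its own age class.
print(round(float(global_anom[0]), 3), round(float(age_anom[0]), 3))
```

Thresholding the age-adjusted anomalies (for example via the cumulative inverse distribution at alpha = 0.05, as in the study) then flags only fields that are poor for their age, not merely young.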

  4. REFERENCE-LESS DETECTION, ASTROMETRY, AND PHOTOMETRY OF FAINT COMPANIONS WITH ADAPTIVE OPTICS

    International Nuclear Information System (INIS)

    Gladysz, Szymon; Christou, Julian C.

    2009-01-01

    We propose a complete framework for the detection, astrometry, and photometry of faint companions from a sequence of adaptive optics (AO) corrected short exposures. The algorithms exploit the difference in statistics between the on-axis and off-axis intensity of the AO point-spread function (PSF) to differentiate real sources from speckles. We validate the new approach and illustrate its performance using moderate Strehl ratio data obtained with the natural guide star AO system on the Lick Observatory's 3 m Shane Telescope. We obtain almost a 2 mag gain in achievable contrast by using our detection method compared to 5σ detectability in long exposures. We also present a first guide to expected accuracy of differential photometry and astrometry with the new techniques. Our approach performs better than PSF-fitting in general and especially so for close companions, which are located within the uncompensated seeing (speckle) halo. All three proposed algorithms are self-calibrating, i.e., they do not require observation of a calibration star. One of the advantages of this approach is improved observing efficiency.

  5. Adapting plant measurement data to improve hardware fault detection performance in pressurised water reactors

    International Nuclear Information System (INIS)

    Cilliers, A.C.; Mulder, E.J.

    2012-01-01

    Highlights: ► The aim was to use available resources at a nuclear plant in a value-added fashion. ► These include plant measurement data and the plant training and engineering simulator capabilities. ► The fault-masking effect of the distributed control systems in the plant is solved. ► The effect of inaccuracies in the plant models used in the simulators is modelled. ► The combination of the above resulted in the development of a deterministic fault identification system. -- Abstract: With the fairly recent adoption of digital control and instrumentation systems in the nuclear industry, much research now focuses on the development of expert fault identification systems. Fault identification systems enable early detection of fault causes, which allows maintenance planning for the equipment showing signs of deterioration or failure. This includes valve leaks and small cracks in steam generator tubes usually detected by means of ultrasonic inspection. Detecting faults early during transient operation in NPPs is problematic due to the absence of a reliable reference against which to compare plant measurements during transients. The distributed application of control systems operating independently to keep the plant operating within the safe operating boundaries complicates the problem, since the control systems act to reduce the effect not only of transient disturbances but of fault disturbances as well. This paper provides a method to adapt the plant measurements that isolates the control actions on the fault and re-introduces it into the measurement data, thereby improving plant diagnostic performance.

  6. A System based on Adaptive Background Subtraction Approach for Moving Object Detection and Tracking in Videos

    Directory of Open Access Journals (Sweden)

    Bahadır KARASULU

    2013-04-01

    Full Text Available Video surveillance systems are based on video and image processing research areas within computer science. Video processing covers the various methods used to track changes in the scene of a given video. Nowadays, video processing is one of the important areas of computer science. Two-dimensional video is used for segmentation and for object detection and tracking in applications such as multimedia content-based indexing, information retrieval, visual and distributed cross-camera surveillance systems, people tracking, traffic tracking and similar applications. The background subtraction (BS) approach is a frequently used method for moving object detection and tracking, and similar methods exist in the literature. This research study proposes a more efficient method as an addition to the existing ones. Based on a model produced using adaptive background subtraction (ABS), object detection and tracking software is implemented. The performance of the developed system is tested via experimental work with related video datasets, and the experimental results and discussion are given in the study.
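
    The running-average form of adaptive background subtraction that underlies approaches like this can be sketched in a few lines; the update rate `alpha` and difference threshold below are illustrative values, not the paper's parameters:

```python
import numpy as np

def update_background(background, frame, alpha=0.05):
    """Running-average background update: B <- (1 - alpha) * B + alpha * F."""
    return (1.0 - alpha) * background + alpha * frame

def detect_foreground(background, frame, threshold=25.0):
    """Mark pixels that differ from the background by more than the threshold."""
    return np.abs(frame - background) > threshold

# Synthetic example: a static background with one bright "moving" block.
bg = np.full((8, 8), 10.0)
frame = bg.copy()
frame[2:4, 2:4] = 200.0          # the moving object
mask = detect_foreground(bg, frame)
bg = update_background(bg, frame)  # background slowly absorbs scene changes
```

    Because the background keeps adapting, slow illumination changes are absorbed while fast-moving objects keep standing out.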

  7. A P2P Botnet detection scheme based on decision tree and adaptive multilayer neural networks.

    Science.gov (United States)

    Alauthaman, Mohammad; Aslam, Nauman; Zhang, Li; Alasem, Rafe; Hossain, M A

    2018-01-01

    In recent years, Botnets have been adopted as a popular method to carry and spread malicious code on the Internet. This malicious code paves the way for many fraudulent activities, including spam mail, distributed denial-of-service attacks and click fraud. While many Botnets are set up using a centralized communication architecture, peer-to-peer (P2P) Botnets can adopt a decentralized architecture using an overlay network for exchanging command-and-control data, making their detection even more difficult. This work presents a method of P2P Bot detection based on an adaptive multilayer feed-forward neural network in cooperation with decision trees. A classification and regression tree is applied as a feature selection technique to select relevant features. With these features, a multilayer feed-forward neural network training model is created using a resilient back-propagation learning algorithm. A comparison of feature set selection based on the decision tree, principal component analysis and the ReliefF algorithm indicated that the neural network model with feature selection based on the decision tree has better identification accuracy along with lower false positive rates. The usefulness of the proposed approach is demonstrated by conducting experiments on real network traffic datasets. In these experiments, an average detection rate of 99.08% with a false positive rate of 0.75% was observed.
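
    The paper's CART feature selection and resilient back-propagation training are not reproduced here, but the core idea of ranking traffic features by decision-tree impurity reduction can be sketched with a single Gini decision stump (the feature values and labels below are invented toy data):

```python
import numpy as np

def stump_score(x, y):
    """Gini impurity reduction of the best single-threshold split on feature x."""
    def gini(labels):
        if len(labels) == 0:
            return 0.0
        p = np.mean(labels)
        return 2.0 * p * (1.0 - p)
    parent = gini(y)
    best = 0.0
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        w = len(left) / len(y)
        gain = parent - (w * gini(left) + (1 - w) * gini(right))
        best = max(best, gain)
    return best

# Toy traffic features: column 0 separates bot (1) from normal (0); column 1 is noise.
X = np.array([[0.1, 5.0], [0.2, 1.0], [0.9, 4.0], [0.8, 2.0]])
y = np.array([0, 0, 1, 1])
scores = [stump_score(X[:, j], y) for j in range(X.shape[1])]
selected = int(np.argmax(scores))   # index of the most discriminative feature
```

    The selected features would then feed a neural network classifier, as in the paper's pipeline.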

  8. Edge Detection on Images of Pseudoimpedance Section Supported by Context and Adaptive Transformation Model Images

    Directory of Open Access Journals (Sweden)

    Kawalec-Latała Ewa

    2014-03-01

    Full Text Available Most underground hydrocarbon storage sites are located in depleted natural gas reservoirs. Seismic survey is the most economical source of detailed subsurface information. Inversion of a seismic section to obtain a pseudoacoustic impedance section makes it possible to extract detailed subsurface information. The seismic wavelet parameters and noise strongly influence the resolution: low signal parameters, especially a long signal duration time, and the presence of noise decrease the pseudoimpedance resolution. Approximating the distribution of acoustic pseudoimpedance from measured or modelled seismic data leads to visualisations and images useful for identifying stratum homogeneity. In this paper, the resolution of the geologic section image is improved by applying the minimum entropy deconvolution method before inversion. The author proposes context and adaptive transformation of the images and edge detection methods as a way to increase the effectiveness of correct interpretation of the simulated images. Edge detection algorithms using the Sobel, Prewitt, Roberts and Canny operators, as well as the Laplacian of Gaussian method, are emphasised. Wiener filtering of the transformed images improves the interpretation of rock section structure, mapping the pseudoimpedance matrix onto the acoustic pseudoimpedance values corresponding to a selected geologic stratum. The goal of the study is to develop applications of image transformation tools for inhomogeneity detection in salt deposits.
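
    As an illustration of the edge detection step, here is a minimal Sobel operator applied to a synthetic two-layer impedance section (the section values and threshold are invented for the example, not taken from the paper):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve2d(image, kernel):
    """Valid-mode 2-D convolution (kernel flipped, as in true convolution)."""
    k = np.flipud(np.fliplr(kernel))
    h, w = k.shape
    out = np.zeros((image.shape[0] - h + 1, image.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + h, j:j + w] * k)
    return out

def sobel_edges(image, threshold=1.0):
    """Gradient magnitude from Sobel responses, thresholded to a binary edge map."""
    gx = convolve2d(image, SOBEL_X)
    gy = convolve2d(image, SOBEL_Y)
    return np.hypot(gx, gy) > threshold

# Synthetic impedance section: two homogeneous strata with a sharp interface.
section = np.vstack([np.full((4, 6), 1.0), np.full((4, 6), 5.0)])
edges = sobel_edges(section)
```

    On this toy section, only the rows straddling the interface between the two strata are flagged as edges.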

  9. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Adal, Kedir M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sidebe, Desire [Univ. of Burgundy, Dijon (France); Ali, Sharib [Univ. of Burgundy, Dijon (France); Chaum, Edward [Univ. of Tennessee, Knoxville, TN (United States); Karnowski, Thomas Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Meriaudeau, Fabrice [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-07

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images remains an open issue, due to the subtle appearance of MAs against the surrounding tissue. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs in an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised learning approach, which requires few manually annotated learning examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques, as well as the applicability of the proposed features to analyzing fundus images.
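
    The semi-supervised component can be illustrated with a generic self-training loop on a one-dimensional blob descriptor; this is a schematic stand-in for the paper's classifier, and all values below are invented:

```python
import numpy as np

def self_train(x_lab, y_lab, x_unlab, rounds=3, margin=0.25):
    """Self-training with a 1-D threshold classifier: fit on the labelled
    descriptors, pseudo-label unlabelled points far from the decision
    boundary, then refit including them."""
    x_lab, y_lab = x_lab.copy(), y_lab.copy()
    for _ in range(rounds):
        # Midpoint between class means acts as the decision threshold.
        t = 0.5 * (x_lab[y_lab == 0].mean() + x_lab[y_lab == 1].mean())
        conf = np.abs(x_unlab - t) > margin        # confidently classified points
        if not conf.any():
            break
        x_lab = np.concatenate([x_lab, x_unlab[conf]])
        y_lab = np.concatenate([y_lab, (x_unlab[conf] > t).astype(int)])
        x_unlab = x_unlab[~conf]
    return t

# Few labelled examples (descriptor = blob contrast), many unlabelled ones.
x_l = np.array([0.1, 0.9])
y_l = np.array([0, 1])
x_u = np.array([0.05, 0.15, 0.85, 0.95, 0.5])
threshold = self_train(x_l, y_l, x_u)
```

    The ambiguous point near the boundary is never pseudo-labelled, which is what keeps self-training from reinforcing its own mistakes.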

  10. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in the reduction of the mortality rate. However, in some cases, screening for masses is difficult for the radiologist, due to variation in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation for detection using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images with 90.9 and 91% True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives per Image (FP/I) from two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique improves diagnosis in early breast cancer detection.
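
    A minimal sketch of the adaptive (local-mean) thresholding idea behind the segmentation stage, on a toy image; the window size and offset are illustrative, not the paper's parameters:

```python
import numpy as np

def adaptive_threshold(image, window=7, offset=5.0):
    """Local-mean adaptive threshold: a pixel is foreground when it exceeds
    the mean of its (window x window) neighbourhood by more than `offset`."""
    pad = window // 2
    padded = np.pad(image, pad, mode="edge")
    out = np.zeros(image.shape, dtype=bool)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            local_mean = padded[i:i + window, j:j + window].mean()
            out[i, j] = image[i, j] > local_mean + offset
    return out

# Toy "mammogram": a dim background with a brighter 3x3 mass in the centre.
img = np.full((7, 7), 20.0)
img[2:5, 2:5] = 60.0
mask = adaptive_threshold(img, window=7, offset=5.0)
```

    Because the threshold follows the local mean, the same rule works under slowly varying background intensity, which a single global threshold would not; the window must be at least comparable to the object size, or only object borders are detected.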

  11. An Adaptive and Time-Efficient ECG R-Peak Detection Algorithm.

    Science.gov (United States)

    Qin, Qin; Li, Jianqing; Yue, Yinggao; Liu, Chengyu

    2017-01-01

    R-peak detection is crucial in electrocardiogram (ECG) signal analysis. This study proposed an adaptive and time-efficient R-peak detection algorithm for ECG processing. First, wavelet multiresolution analysis was applied to enhance the ECG signal representation. Then, the ECG was mirrored to convert large negative R-peaks into positive ones. After that, local maxima were calculated by the first-order forward differential approach and were truncated by amplitude and time-interval thresholds to locate the R-peaks. The algorithm's performance, including detection accuracy and time consumption, was tested on the MIT-BIH arrhythmia database and the QT database. Experimental results showed that the proposed algorithm achieved a mean sensitivity of 99.39%, positive predictivity of 99.49%, and accuracy of 98.89% on the MIT-BIH arrhythmia database, and 99.83%, 99.90%, and 99.73%, respectively, on the QT database. The mean time to process one ECG record was 0.872 s for the MIT-BIH arrhythmia database and 0.763 s for the QT database, a 30.6% and 32.9% time reduction, respectively, compared to the traditional Pan-Tompkins method.
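
    Omitting the wavelet enhancement stage, the mirroring, first-order forward difference, and amplitude/interval thresholding steps can be sketched as follows (the sampling rate, threshold fractions, and synthetic trace are illustrative, not the study's settings):

```python
import numpy as np

def detect_r_peaks(ecg, fs, amp_frac=0.6, min_rr=0.3):
    """Locate R-peaks: mirror the signal about its baseline so inverted peaks
    become positive, find local maxima via a first-order forward difference,
    then keep maxima above an amplitude threshold and a minimum R-R apart."""
    sig = np.abs(ecg - np.median(ecg))          # "mirroring" about the baseline
    d = np.diff(sig)                            # first-order forward difference
    maxima = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1
    threshold = amp_frac * sig[maxima].max()    # adaptive amplitude threshold
    peaks, last = [], -np.inf
    for m in maxima:
        if sig[m] >= threshold and (m - last) / fs >= min_rr:
            peaks.append(m)
            last = m
    return np.array(peaks)

# Synthetic ECG-like trace: three sharp R-peaks on a flat baseline, 250 Hz.
fs = 250
ecg = np.zeros(3 * fs)
for p in (100, 350, 600):
    ecg[p] = 1.0
    ecg[p - 1] = ecg[p + 1] = 0.4               # narrow QRS-like spike
peaks = detect_r_peaks(ecg, fs)
```

    The interval threshold (`min_rr`) suppresses double detections within a physiologically impossible R-R spacing.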

  12. Automated detection of microaneurysms using scale-adapted blob analysis and semi-supervised learning.

    Science.gov (United States)

    Adal, Kedir M; Sidibé, Désiré; Ali, Sharib; Chaum, Edward; Karnowski, Thomas P; Mériaudeau, Fabrice

    2014-04-01

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images remains an open issue, due to the subtle appearance of MAs against the surrounding tissue. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs in an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are introduced to characterize these blob regions. A semi-supervised learning approach, which requires few manually annotated learning examples, is also proposed to train a classifier that can detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques, as well as the applicability of the proposed features to analyzing fundus images. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  14. Global gravitational anomalies

    International Nuclear Information System (INIS)

    Witten, E.

    1985-01-01

    A general formula for global gauge and gravitational anomalies is derived. It is used to show that the anomaly-free supergravity and superstring theories in ten dimensions are all free of global anomalies that might have ruined their consistency. However, it is shown that global anomalies lead to some restrictions on allowed compactifications of these theories. For example, in the case of O(32) superstring theory, it is shown that a global anomaly related to π_7(O(32)) leads to a Dirac-like quantization condition for the field strength of the antisymmetric tensor field. Related to global anomalies is the question of the number of fermion zero modes in an instanton field. It is argued that the relevant gravitational instantons are exotic spheres. It is shown that the number of fermion zero modes in an instanton field is always even in ten-dimensional supergravity. (orig.)

  15. Eriophorum angustifolium and Lolium perenne metabolic adaptations to metals- and metalloids-induced anomalies in the vicinity of a chemical industrial complex.

    Science.gov (United States)

    Anjum, Naser A; Ahmad, Iqbal; Rodrigues, Sónia M; Henriques, Bruno; Cruz, Nuno; Coelho, Cláudia; Pacheco, Mário; Duarte, Armando C; Pereira, Eduarda

    2013-01-01

    As plants constitute the foundation of the food chain, concerns have been raised about the possibility of toxic concentrations of metals and metalloids being transported from plants to the higher food chain strata. In this perspective, the use of important phytotoxicity endpoints may be of utmost significance in assessing the hazardous nature of metals and metalloids and also in developing ecological soil screening levels. The current study aimed to investigate the role of glutathione (GSH) and its associated enzymes in the metabolic adaptation of two grass species, namely Eriophorum angustifolium Honck. and Lolium perenne L., to metals and metalloids stress in the vicinity of a chemical industrial complex (Estarreja, Portugal). Soil and plant samples were collected from contaminated (C) and non-contaminated (reference, R) sites, respectively, near and away from the Estarreja Chemical Complex, Portugal. Soils (from 0 to 10 and 10 to 20 cm depths) were analyzed for pH, organic carbon, and metals and metalloids concentrations. Plant samples were processed fresh for physiological and biochemical estimations, while oven-dried plant samples were used for metals and metalloids determinations following standard methodologies. Both soils and plants from the industrial area exhibited differential concentrations of major metals and metalloids including As, Cu, Hg, Pb, and Zn. In particular, L. perenne shoot displayed significantly higher and lower concentrations of Pb and As, respectively, at the contaminated site (vs. E. angustifolium). Irrespective of sites, L. perenne shoot exhibited significantly higher total GSH pool, oxidized glutathione (GSSG) and oxidized protein (vs. E. angustifolium). Additionally, severe damages to photosynthetic pigments, proteins, cellular membrane integrity (in terms of electrolyte leakage), and lipid peroxidation were also perceptible in L. perenne shoot. Contrarily, irrespective of the sites, activities of catalase and GSH-regenerating enzyme, GSH

  16. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    Science.gov (United States)

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS) automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploits backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground-truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.
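
    A schematic version of the ATLAS idea: select the LoG scale with the strongest blob response, then threshold that LoG image from its own statistics. Here a simple mean + k·std rule stands in for the paper's PFA-derived local threshold, and all parameters and the synthetic frame are illustrative:

```python
import numpy as np

def log_kernel(sigma):
    """Scale-normalized LoG kernel, sign-flipped so bright blobs give positive
    responses, and zero-meaned so flat regions respond (almost) zero."""
    radius = int(3 * sigma)
    y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r2 = x ** 2 + y ** 2
    log = (2.0 - r2 / sigma ** 2) / sigma ** 2 * np.exp(-r2 / (2 * sigma ** 2))
    return log - log.mean()

def filter2d(image, kernel):
    """Same-size correlation with zero padding."""
    pad = kernel.shape[0] // 2
    padded = np.pad(image, pad)
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kernel.shape[0],
                                      j:j + kernel.shape[1]] * kernel)
    return out

def detect_spots(image, sigmas=(1.0, 2.0, 3.0), k=3.0):
    """Auto-select the scale with the strongest blob response, then threshold
    that LoG image at mean + k * std of its values."""
    responses = [filter2d(image, log_kernel(s)) for s in sigmas]
    best = int(np.argmax([r.max() for r in responses]))
    r = responses[best]
    return sigmas[best], r > r.mean() + k * r.std()

# Synthetic frame: one Gaussian spot of sigma = 2 pixels at the centre.
yy, xx = np.mgrid[0:15, 0:15]
img = np.exp(-((xx - 7) ** 2 + (yy - 7) ** 2) / (2 * 2.0 ** 2))
scale, mask = detect_spots(img)
```

    Because the LoG response is scale-normalized, the response is maximal when the kernel scale matches the spot size, which is what makes automatic scale selection possible.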

  17. The score statistic of the LD-lod analysis: detecting linkage adaptive to linkage disequilibrium.

    Science.gov (United States)

    Huang, J; Jiang, Y

    2001-01-01

    We study the properties of a modified lod score method for testing linkage that incorporates linkage disequilibrium (LD-lod). By examination of its score statistic, we show that the LD-lod score method adaptively combines two sources of information: (a) the IBD sharing score which is informative for linkage regardless of the existence of LD and (b) the contrast between allele-specific IBD sharing scores which is informative for linkage only in the presence of LD. We also consider the connection between the LD-lod score method and the transmission-disequilibrium test (TDT) for triad data and the mean test for affected sib pair (ASP) data. We show that, for triad data, the recessive LD-lod test is asymptotically equivalent to the TDT; and for ASP data, it is an adaptive combination of the TDT and the ASP mean test. We demonstrate that the LD-lod score method has relatively good statistical efficiency in comparison with the ASP mean test and the TDT for a broad range of LD and the genetic models considered in this report. Therefore, the LD-lod score method is an interesting approach for detecting linkage when the extent of LD is unknown, such as in a genome-wide screen with a dense set of genetic markers. Copyright 2001 S. Karger AG, Basel

  18. Rate adaptive multilevel coded modulation with high coding gain in intensity modulation direct detection optical communication

    Science.gov (United States)

    Xiao, Fei; Liu, Bo; Zhang, Lijia; Xin, Xiangjun; Zhang, Qi; Tian, Qinghua; Tian, Feng; Wang, Yongjun; Rao, Lan; Ullah, Rahat; Zhao, Feng; Li, Deng'ao

    2018-02-01

    A rate-adaptive multilevel coded modulation (RA-MLC) scheme based on a fixed code length, together with a corresponding decoding scheme, is proposed. The RA-MLC scheme combines multilevel coding and modulation technology with binary linear block codes at the transmitter. Bit division, coding, optional interleaving, and modulation are carried out according to a preset rule, then transmitted through a standard single-mode fiber span of 100 km. The receiver improves decoding accuracy by passing soft information through the different layers, which enhances performance. Simulations are carried out in an intensity modulation-direct detection optical communication system using MATLAB®. Results show that the RA-MLC scheme can achieve a bit error rate (BER) of 1E-5 when the optical signal-to-noise ratio is 20.7 dB. It also reduced the number of decoders by 72% and realized 22 rate adaptations without significantly increasing the computing time. The coding gain is increased by 7.3 dB at BER = 1E-3.

  19. Adaptive Near-Optimal Multiuser Detection Using a Stochastic and Hysteretic Hopfield Net Receiver

    Directory of Open Access Journals (Sweden)

    Gábor Jeney

    2003-01-01

    Full Text Available This paper proposes a novel adaptive MUD algorithm for a wide variety (practically any kind) of interference-limited systems, for example, code division multiple access (CDMA). The algorithm is based on recently developed neural network techniques and can perform near-optimal detection in the case of unknown channel characteristics. The proposed algorithm consists of two main blocks: one estimates the symbols sent by the transmitters, the other identifies each channel of the corresponding communication links. The estimation of symbols is carried out either by a stochastic Hopfield net (SHN), by a hysteretic neural network (HyNN), or by both. The channel identification is based on either the self-organizing feature map (SOM) or learning vector quantization (LVQ). The combination of these two blocks yields a powerful real-time detector with near-optimal performance. The performance is analyzed by extensive simulations.
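
    The Hopfield-style symbol-estimation block can be sketched for synchronous CDMA as asynchronous sign updates descending the maximum-likelihood energy; this is a generic sketch, not the paper's SHN/HyNN receiver, and the two-user correlation matrix below is invented:

```python
import numpy as np

def hopfield_mud(y, R, iters=10, rng=None):
    """Asynchronous sign updates descending the ML energy
    E(b) = b^T R b - 2 y^T b, for bipolar symbols b in {-1, +1}^K."""
    rng = np.random.default_rng(rng)
    K = len(y)
    b = np.sign(y).astype(float)          # matched-filter initialisation
    b[b == 0] = 1.0
    for _ in range(iters):
        for k in rng.permutation(K):      # random update order (stochastic flavour)
            # Local field excludes the diagonal self-coupling term.
            field = y[k] - R[k] @ b + R[k, k] * b[k]
            b[k] = 1.0 if field > 0 else -1.0
    return b

# Two-user toy example: correlated spreading codes, no noise.
R = np.array([[1.0, 0.7], [0.7, 1.0]])    # code cross-correlation matrix
b_true = np.array([1.0, -1.0])
y = R @ b_true                            # matched-filter outputs
b_hat = hopfield_mud(y, R, rng=0)
```

    Each sign update can only lower the energy, so the network settles in a local minimum of the ML cost; the stochastic update order (and, in the paper, hysteresis) helps it avoid poor minima.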

  20. Anomaly-free models for flavour anomalies

    Science.gov (United States)

    Ellis, John; Fairbairn, Malcolm; Tunney, Patrick

    2018-03-01

    We explore the constraints imposed by the cancellation of triangle anomalies on models in which the flavour anomalies reported by LHCb and other experiments are due to an extra U(1)′ gauge boson Z′. We assume universal and rational U(1)′ charges for the first two generations of left-handed quarks and of right-handed up-type quarks, but allow different charges for their third-generation counterparts. If the right-handed charges vanish, cancellation of the triangle anomalies requires all the quark U(1)′ charges to vanish, if there are either no exotic fermions or only one Standard Model singlet dark matter (DM) fermion. There are non-trivial anomaly-free models with more than one such 'dark' fermion, or with a single DM fermion if right-handed up-type quarks have non-zero U(1)′ charges. In some of the latter models the U(1)′ couplings of the first- and second-generation quarks all vanish, weakening the LHC Z′ constraint, and in some other models the DM particle has purely axial couplings, weakening the direct DM scattering constraint. We also consider models in which anomalies are cancelled via extra vector-like leptons, showing how the prospective LHC Z′ constraint may be weakened because the Z′ → μ⁺μ⁻ branching ratio is suppressed relative to other decay modes.
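
    For reference, the triangle-anomaly cancellation conditions such models must satisfy take the schematic form below, with z_Q, z_L the U(1)′ charges of left-handed quark and lepton doublets, z_u, z_d (z_f generically) those of right-handed fields, and sums weighted by colour and weak multiplicity with right-handed fields entering with opposite sign. This is the standard textbook set of conditions, not a reproduction of the paper's equations:

```latex
\begin{aligned}
&[SU(3)]^2\,U(1)':\quad \textstyle\sum_{\text{gen}} \left(2 z_Q - z_u - z_d\right) = 0,\\
&[SU(2)]^2\,U(1)':\quad \textstyle\sum_{\text{gen}} \left(3 z_Q + z_L\right) = 0,\\
&[U(1)_Y]^2\,U(1)':\quad \textstyle\sum_f Y_f^2\, z_f = 0,\qquad
U(1)_Y\,[U(1)']^2:\quad \textstyle\sum_f Y_f\, z_f^2 = 0,\\
&[U(1)']^3:\quad \textstyle\sum_f z_f^3 = 0,\qquad
\mathrm{grav}^2\,U(1)':\quad \textstyle\sum_f z_f = 0.
\end{aligned}
```

    Any exotic fermions (the 'dark' fermions or vector-like leptons discussed above) contribute additional terms to these sums, which is what opens up the non-trivial anomaly-free solutions.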