WorldWideScience

Sample records for arima-based time series

  1. ARIMA-Based Time Series Model of Stochastic Wind Power Generation

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Pedersen, Troels; Bak-Jensen, Birgitte

    2010-01-01

    This paper proposes a stochastic wind power model based on an autoregressive integrated moving average (ARIMA) process. The model takes into account the nonstationarity and physical limits of stochastic wind power generation. The model is constructed based on one year of wind power measurements from the Nysted offshore wind farm in Denmark. The proposed limited-ARIMA (LARIMA) model introduces a limiter and characterizes the stochastic wind power generation by mean level, temporal correlation and driving noise. The model is validated against the measurement in terms of temporal correlation and probability distribution. The LARIMA model outperforms a first-order transition-matrix-based discrete Markov model in terms of temporal correlation, probability distribution and number of model parameters. The proposed LARIMA model is further extended to include the monthly variation of the stochastic wind power…
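    A minimal sketch of the limiter idea described above, assuming a first-order autoregressive driving process; the parameter values (mean level, AR coefficient, noise scale, rated power) are illustrative placeholders, not those estimated from the Nysted data.

```python
import numpy as np

def simulate_limited_ar1(n, mean_level, phi, noise_std, p_rated, seed=0):
    """Simulate wind power with an AR(1) driver and a hard limiter on [0, p_rated]."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = mean_level
    for t in range(1, n):
        # unconstrained AR(1) step around the mean level
        raw = mean_level + phi * (x[t - 1] - mean_level) + rng.normal(0.0, noise_std)
        # limiter: wind power is physically bounded by zero and the rated power
        x[t] = min(max(raw, 0.0), p_rated)
    return x

# illustrative parameters only (one year of hourly values)
power = simulate_limited_ar1(n=8760, mean_level=60.0, phi=0.98, noise_std=5.0, p_rated=165.0)
```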

  2. Time Series

    OpenAIRE

    Gil-Alana, L.A.; Moreno, A; Pérez-de-Gracia, F. (Fernando)

    2011-01-01

    The last 20 years have witnessed a considerable increase in the use of time series techniques in econometrics. The articles in this important set have been chosen to illustrate the main themes in time series work as it relates to econometrics. The editor has written a new concise introduction to accompany the articles. Sections covered include: Ad Hoc Forecasting Procedures, ARIMA Modelling, Structural Time Series Models, Unit Roots, Detrending and Non-stationarity, Seasonality, Seasonal Adju...

  3. Forecasting inflation in Montenegro using univariate time series models

    Directory of Open Access Journals (Sweden)

    Milena Lipovina-Božović

    2015-04-01

    Full Text Available The analysis of price trends and their prognosis is one of the key tasks of the economic authorities in each country. Because Montenegro is a small, open economy that uses the euro as its currency, forecasting inflation is particularly specific, and it is made more difficult by the low quality of the data. This paper analyzes the utility and applicability of univariate time series models for forecasting the price index in Montenegro. Data analysis of key macroeconomic movements in previous decades indicates the presence of many possible determinants that could influence the forecasting result. This paper concludes that forecasting models (ARIMA) based only on the series' own previous values cannot adequately cover the key factors that determine the price level in the future, probably because of the existence of numerous external factors that influence price movements in Montenegro.
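    For reference, a generic univariate ARIMA forecasting workflow of the kind evaluated in such studies might look as follows; the input file name, the (1, 1, 1) order and the 12-month horizon are placeholders, and statsmodels is assumed to be available.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# cpi is assumed to be a 1-D array of monthly price-index values; the file name is hypothetical
cpi = np.loadtxt("montenegro_cpi.csv")

model = ARIMA(cpi, order=(1, 1, 1))     # illustrative (p, d, q); select via AIC/BIC in practice
result = model.fit()
forecast = result.forecast(steps=12)    # 12-month-ahead point forecasts

print(result.summary())
print(forecast)
```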

  4. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers…

  5. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  6. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.;

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the…

  7. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  8. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    ""In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book.""-Journa

  9. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may…

  10. Causality between time series

    CERN Document Server

    Liang, X San

    2014-01-01

    Given two time series, can one tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely information flow, we arrive at a concise formula and give this challenging question, which is of wide concern in different disciplines, a positive answer. Here causality is measured by the time rate of change of information flowing from one series, say, X2, to another, X1. The measure is asymmetric between the two parties and, in particular, if the process underlying X1 does not depend on X2, then the resulting causality from X2 to X1 vanishes. The formula is tight in form, involving only commonly used statistics, namely sample covariances. It has been validated with touchstone series purportedly generated with one-way causality. It has also been applied to the investigation of real-world problems; an example presented here is the cause-effect relation between two climate modes, El Niño and the Indian Ocean Dipole, which have been linked to the hazards in f...
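    A sketch of a covariance-based estimator in the spirit of the formula described above. The forward-difference derivative, the normalization and the algebraic form (commonly attributed to Liang, 2014) are assumptions that should be verified against the paper before use.

```python
import numpy as np

def liang_information_flow(x1, x2, dt=1.0):
    """Sketch of a covariance-based estimate of the information flow from x2 to x1."""
    dx1 = (x1[1:] - x1[:-1]) / dt            # forward-difference derivative of x1 (an assumption)
    a, b = x1[:-1], x2[:-1]

    def cov(u, v):
        return np.mean((u - u.mean()) * (v - v.mean()))

    c11, c22, c12 = cov(a, a), cov(b, b), cov(a, b)
    c1d1, c2d1 = cov(a, dx1), cov(b, dx1)
    # algebraic form commonly attributed to Liang (2014); verify against the paper before use
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

rng = np.random.default_rng(0)
x2 = rng.normal(size=5000)
x1 = np.zeros(5000)
for t in range(1, 5000):                     # x1 is driven by x2, not vice versa
    x1[t] = 0.8 * x1[t - 1] + 0.5 * x2[t - 1] + 0.1 * rng.normal()
print(liang_information_flow(x1, x2))        # flow x2 -> x1: clearly nonzero
print(liang_information_flow(x2, x1))        # flow x1 -> x2: near zero
```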

  11. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  12. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning raw time series; variations in time series analysis and post-processing are driven by different users. JPL Global Time Series/Velocities serve researchers studying the reference frame, combining with VLBI/SLR/DORIS. JPL/SOPAC Combined Time Series/Velocities support crustal deformation studies for tectonic, volcanic, and ground water applications. ARIA Time Series/Coseismic Data Products are focused on hazard monitoring and response; the ARIA data system is designed to integrate GPS and InSAR, with GPS tropospheric delay used for correcting InSAR and Caltech's GIANT time series analysis using GPS to correct orbital errors in InSAR. (Zhen Liu is talking tomorrow on InSAR time series analysis.)

  13. Predicting Nonlinear Time Series

    Science.gov (United States)

    1993-12-01

    The response of node j becomes R_j(k) = f(Σ_i W_ij y_i(k)) (Eq. 2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer, with interconnections for each of these previous nodes. (Figure 5: Delay block for the ATNN.) Node j receives the computed values a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays are updated after each input.

  14. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  15. Fractal and Multifractal Time Series

    CERN Document Server

    Kantelhardt, Jan W

    2008-01-01

    Data series generated by complex systems exhibit fluctuations on many time scales and/or broad distributions of the values. In both equilibrium and non-equilibrium situations, the natural fluctuations are often found to follow a scaling relation over several orders of magnitude, allowing for a characterisation of the data and the generating complex system by fractal (or multifractal) scaling exponents. In addition, fractal and multifractal approaches can be used for modelling time series and deriving predictions regarding extreme events. This review article describes and exemplifies several methods originating from Statistical Physics and Applied Mathematics, which have been used for fractal and multifractal time series analysis.
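    A compact sketch of monofractal detrended fluctuation analysis (DFA), one of the methods this kind of review covers; the window sizes and the linear detrending order are illustrative choices.

```python
import numpy as np

def dfa(x, scales):
    """Return the fluctuation function F(s) for each window size s (DFA1, linear detrending)."""
    y = np.cumsum(x - np.mean(x))             # profile of the demeaned series
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

x = np.random.default_rng(1).normal(size=4096)
scales = np.unique(np.logspace(1, 3, 20).astype(int))
F = dfa(x, scales)
hurst = np.polyfit(np.log(scales), np.log(F), 1)[0]   # slope ~ 0.5 for white noise
```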

  16. Time Series with Tailored Nonlinearities

    CERN Document Server

    Raeth, C

    2015-01-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions, being typical for e.g. turbulence and financial data, can thus be explained in terms of phase correlations.
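    The phase-manipulation idea can be illustrated with the standard Fourier-surrogate template below: the amplitudes of a Gaussian series are kept while the phases are replaced under some constraint. The default constraint used here (fully random phases) is the simplest linear-surrogate baseline, not the tailored constraints developed in the paper.

```python
import numpy as np

def phase_surrogate(x, phase_rule=None, seed=0):
    """Build a surrogate series with the same Fourier amplitudes but modified phases."""
    rng = np.random.default_rng(seed)
    X = np.fft.rfft(x)
    amp = np.abs(X)
    if phase_rule is None:
        # simplest constraint: i.i.d. uniform phases (linear-surrogate baseline)
        phases = rng.uniform(0.0, 2.0 * np.pi, size=X.size)
    else:
        phases = phase_rule(np.angle(X))   # user-supplied constraint applied to the original phases
    phases[0] = 0.0                        # keep the mean real
    return np.fft.irfft(amp * np.exp(1j * phases), n=len(x))

x = np.random.default_rng(2).normal(size=1024)
surrogate = phase_surrogate(x)
```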

  17. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.

  18. Random time series in astronomy.

    Science.gov (United States)

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.

  19. Random time series in Astronomy

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle, and over time (usually called light curves by astronomers). In the time domain we see transient events such as supernovae, gamma-ray bursts, and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars, and pulsations of stars in nearby galaxies; and persistent aperiodic variations (`noise') from powerful systems like accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of Time Domain Astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher-order properties of accreting black holes, and time delays and correlations in multivariate time series.

  20. ARIMA based time variation model for beneath the chassis UWB channel

    OpenAIRE

    Ergen, Sinem Çöleri; Demir, Utku

    2016-01-01

    An intra-vehicular wireless sensor network (IVWSN) enables the integration of wireless sensor network technology into the vehicle architecture, either by eliminating the wires between the existing sensors and the corresponding electronic control units (ECUs) or by empowering new sensor technologies that are not currently implemented due to technical limitations. Ultra-wideband (UWB) has been determined to be the most appropriate technology for IVWSNs since it provides energy efficiency t...

  1. Event Discovery in Time Series

    CERN Document Server

    Preston, Dan; Brodley, Carla

    2009-01-01

    The discovery of events in time series can have important implications, such as identifying microlensing events in astronomical surveys, or changes in a patient's electrocardiogram. Current methods for identifying events require a sliding window of a fixed size, which is not ideal for all applications and could overlook important events. In this work, we develop probability models for calculating the significance of an arbitrary-sized sliding window and use these probabilities to find areas of significance. Because a brute force search of all sliding windows and all window sizes would be computationally intractable, we introduce a method for quickly approximating the results. We apply our method to over 100,000 astronomical time series from the MACHO survey, in which 56 different sections of the sky are considered, each with one or more known events. Our method was able to recover 100% of these events in the top 1% of the results, essentially pruning 99% of the data. Interestingly, our method was able to iden...

  2. Detecting chaos from time series

    Science.gov (United States)

    Xiaofeng, Gong; Lai, C. H.

    2000-02-01

    In this paper, an entirely data-based method to detect chaos from a time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems, there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_{p,ε}, and the time step p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.

  3. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    Li Aiguo; Zhao Cai; Li Zhanhuai

    2007-01-01

    To predict the trend of chaotic time series in time series analysis and time series data mining, a novel algorithm for predicting the trend of a chaotic time series is presented, and an on-line segmenting algorithm is proposed to convert a time series into a binary string according to the ascending or descending trend of each subsequence. The on-line segmenting algorithm is independent of prior knowledge about the time series. The naive Bayesian algorithm is then employed to predict the trend of the chaotic time series from the binary string. Experimental results on three chaotic time series demonstrate that the proposed method predicts the ascending or descending trend of chaotic time series with few errors.
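    A hedged sketch of the pipeline described above: the series is converted to an up/down binary string and a naive Bayes classifier predicts the next symbol from the previous k symbols. The fixed-length segmentation and the window length are simplifications of the paper's on-line segmenting algorithm, chosen here only for illustration.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

def to_trend_string(x, seg_len=5):
    """1 if a segment ends higher than it starts (ascending trend), else 0."""
    n_seg = len(x) // seg_len
    return np.array([int(x[(i + 1) * seg_len - 1] > x[i * seg_len]) for i in range(n_seg)])

def next_trend_probability(bits, k=4):
    """Train naive Bayes on sliding windows of k past symbols and predict the next one."""
    X = np.array([bits[i:i + k] for i in range(len(bits) - k)])
    y = bits[k:]
    clf = BernoulliNB().fit(X, y)
    return clf.predict_proba(bits[-k:].reshape(1, -1))[0]

# logistic map as a toy chaotic series
x = np.empty(2000); x[0] = 0.4
for t in range(1, 2000):
    x[t] = 3.9 * x[t - 1] * (1.0 - x[t - 1])
print(next_trend_probability(to_trend_string(x)))
```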

  4. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  5. Description of complex time series by multipoles

    DEFF Research Database (Denmark)

    Lewkowicz, M.; Levitan, J.; Puzanov, N.;

    2002-01-01

    We present a new method to describe time series with a highly complex time evolution. The time series is projected onto a two-dimensional phase-space plot which is quantified in terms of a multipole expansion where every data point is assigned a unit mass. The multipoles provide an efficient characterization of the original time series.

  6. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have a significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting… performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.

  7. Kolmogorov space in time series data

    Science.gov (United States)

    Kanjamapornkul, Kabin; Pinčák, Richard

    2016-10-01

    We provide a proof that the space of time series data is a Kolmogorov space with the T0-separation axiom, using the loop space of time series data. In our approach we define a cyclic coordinate of the intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of data around the price and time axes, obtained by defining a new extra dimension for the time series data. We show that there exist eight hidden dimensions in the Kolmogorov space for time series data. Our concept is realized as the algorithm of empirical mode decomposition and intrinsic time scale decomposition and it is subsequently used for preliminary analysis of real time series data.

  8. Regenerating time series from ordinal networks

    Science.gov (United States)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
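    A minimal sketch of the construction described above: map the series to ordinal patterns, count pattern-to-pattern transitions, and regenerate a symbol sequence by a random walk on the resulting network. The embedding dimension, walk length and toy input series are illustrative, and the dead-end handling is an assumption.

```python
import numpy as np
from collections import defaultdict

def ordinal_patterns(x, m=3):
    """Symbol = rank ordering (permutation) of each length-m window."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def transition_counts(symbols):
    counts = defaultdict(lambda: defaultdict(int))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a][b] += 1
    return counts

def random_walk(counts, start, steps, seed=0):
    rng = np.random.default_rng(seed)
    walk, node = [start], start
    for _ in range(steps):
        nbrs = list(counts[node].keys())
        if not nbrs:                        # dead end: restart from the initial node
            node = start
            continue
        w = np.array([counts[node][n] for n in nbrs], dtype=float)
        node = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        walk.append(node)
    return walk

x = np.sin(0.3 * np.arange(3000)) + 0.1 * np.random.default_rng(3).normal(size=3000)
syms = ordinal_patterns(x)
regenerated = random_walk(transition_counts(syms), syms[0], steps=500)
```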

  9. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  10. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

    Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.

  11. Outliers Mining in Time Series Data Sets

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, we present a cluster-based algorithm for time series outlier mining. We use the discrete Fourier transform (DFT) to transform time series from the time domain to the frequency domain, so that time series can be mapped as points in a k-dimensional space. For these points, a cluster-based algorithm is developed to mine the outliers. The algorithm first partitions the input points into disjoint clusters and then prunes the clusters that are judged not to contain outliers. Our algorithm has been run on the electrical load time series of a steel enterprise and proved to be effective.
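    A rough sketch of the pipeline described above, assuming fixed-length windows, the first few DFT magnitudes as features, and k-means in place of the paper's specific clustering and pruning; windows assigned to very small clusters are flagged as outlier candidates.

```python
import numpy as np
from sklearn.cluster import KMeans

def dft_features(x, win=64, k=8):
    """Slice the series into non-overlapping windows and keep the first k DFT magnitudes."""
    wins = np.array([x[i:i + win] for i in range(0, len(x) - win + 1, win)])
    return np.abs(np.fft.rfft(wins, axis=1))[:, :k]

def flag_outlier_windows(x, n_clusters=5, min_cluster_size=3):
    feats = dft_features(x)
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(feats)
    sizes = np.bincount(labels, minlength=n_clusters)
    return [i for i, lab in enumerate(labels) if sizes[lab] < min_cluster_size]

load = np.tile(np.sin(np.linspace(0, 2 * np.pi, 64)), 50)   # toy periodic "load" series
load[1300:1340] += 5.0                                       # inject an anomalous stretch
print(flag_outlier_windows(load))
```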

  12. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  13. Coupling between time series: a network view

    CERN Document Server

    Mehraban, Saeed; Zamani, Maryam; Jafari, Gholamreza

    2013-01-01

    Recently, the visibility graph has been introduced as a novel framework for analyzing time series by mapping them to complex networks. In this paper, we introduce a new visibility algorithm, "cross-visibility", which reveals the conjugation of two coupled time series. The correspondence between the two time series is mapped to a network, "the cross-visibility graph", to demonstrate the correlation between them. We applied the algorithm to several correlated and uncorrelated time series generated by the linear stationary ARFIMA process. The results demonstrate that the cross-visibility graph associated with correlated time series with power-law auto-correlation is scale-free. If the time series are uncorrelated, the degree distribution of their cross-visibility network deviates from a power law. To further clarify the process, we applied the algorithm to real-world data from the financial trades of two companies, and observed significant small-scale coupling in their dynamics.

  14. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
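    For orientation, a heavily simplified first-order fuzzy time series forecaster in the style popularized by this line of work (closer to Chen's later simplification than to the original Song-Chissom max-min relation matrices): the universe is split into equal intervals, each observation is fuzzified to its interval, first-order transition rules are collected, and the forecast is the mean midpoint of the intervals reachable from the current one. The enrollment figures below are toy values for illustration, not the actual university data.

```python
import numpy as np
from collections import defaultdict

def fuzzy_ts_forecast(y, n_intervals=7):
    lo, hi = min(y) - 1, max(y) + 1
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    states = np.clip(np.digitize(y, edges) - 1, 0, n_intervals - 1)  # fuzzify to interval index

    rules = defaultdict(set)                  # first-order rules: state(t) -> {state(t+1)}
    for a, b in zip(states[:-1], states[1:]):
        rules[a].add(b)

    current = states[-1]
    targets = rules.get(current, {current})
    return float(np.mean([mids[s] for s in targets]))   # defuzzify by averaging interval midpoints

enrollment = [13000, 13500, 13900, 14700, 15500, 15300, 15600, 15900,
              16800, 16900, 16400, 15400, 15500, 15100, 15200, 16000]  # toy values
print(fuzzy_ts_forecast(enrollment))
```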

  15. Hurst Exponent Analysis of Financial Time Series

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Statistical properties of stock market time series and the implications of their Hurst exponents are discussed. Hurst exponents of DJIA (Dow Jones Industrial Average) components are tested using rescaled range analysis. In addition to the original stock return series, the linear prediction errors of the daily returns are also tested. Numerical results show that the Hurst exponent analysis can provide some information about the statistical properties of the financial time series.
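    A bare-bones rescaled-range (R/S) estimate of the Hurst exponent, the kind of calculation referred to above; the window sizes and the synthetic return series are illustrative.

```python
import numpy as np

def hurst_rs(x, window_sizes):
    """Estimate the Hurst exponent by fitting log(R/S) against log(window size)."""
    rs = []
    for w in window_sizes:
        vals = []
        for i in range(0, len(x) - w + 1, w):
            seg = x[i:i + w]
            dev = np.cumsum(seg - seg.mean())      # cumulative deviations from the mean
            r = dev.max() - dev.min()              # range
            s = seg.std(ddof=1)                    # standard deviation
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

# synthetic price path and its log returns
price = np.cumsum(np.random.default_rng(4).normal(size=5000)) + 1e4
returns = np.diff(np.log(price))
print(hurst_rs(returns, window_sizes=[16, 32, 64, 128, 256]))   # ~0.5 for uncorrelated returns
```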

  16. Statistical criteria for characterizing irradiance time series.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.

  17. Reconstruction of time-delay systems from chaotic time series.

    Science.gov (United States)

    Bezruchko, B P; Karavaev, A S; Ponomarenko, V I; Prokhorov, M D

    2001-11-01

    We propose a method that allows one to estimate the parameters of model scalar time-delay differential equations from time series. The method is based on a statistical analysis of time intervals between extrema in the time series. We verify our method by using it for the reconstruction of time-delay differential equations from their chaotic solutions and for modeling experimental systems with delay-induced dynamics from their chaotic time series.

  18. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  19. On reconstruction of time series in climatology

    Directory of Open Access Journals (Sweden)

    V. Privalsky

    2015-10-01

    Full Text Available The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series which contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter one into the past. The method presents a further development of previous efforts taken by a number of authors, starting from A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI, 1979–2014) and sunspot numbers (SSN, 1749–2014) to restore the TSI data over 1749–1978. The results of the reconstruction are in statistical agreement with observations.

  20. A radar image time series

    Science.gov (United States)

    Leberl, F.; Fuchs, H.; Ford, J. P.

    1981-01-01

    A set of ten side-looking radar images of a mining area in Arizona that were acquired over a period of 14 yr are studied to demonstrate the photogrammetric differential-rectification technique applied to radar images and to examine changes that occurred in the area over time. Five of the images are rectified by using ground control points and a digital height model taken from a map. Residual coordinate errors in ground control are reduced from several hundred meters in all cases to ±19 to 70 m. The contents of the radar images are compared with a Landsat image and with aerial photographs. Effects of radar system parameters on radar images are briefly reviewed.

  1. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl...

  2. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  3. A Simple Fuzzy Time Series Forecasting Model

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2016-01-01

    In this paper we describe a new first-order fuzzy time series forecasting model. We show that our automatic fuzzy partitioning method provides an accurate approximation to the time series that, when combined with rule forecasting and an OWA operator, improves forecasting accuracy. Our model does not attempt to provide the best results in comparison with other forecasting methods but to show how to improve first-order models using simple techniques. However, we show that our first-order model is still capable of outperforming some more complex higher-order fuzzy time series models.

  4. DATA MINING IN CANADIAN LYNX TIME SERIES

    Directory of Open Access Journals (Sweden)

    R.Karnaboopathy

    2012-01-01

    Full Text Available This paper sums up the applications of statistical models, such as ARIMA-family time series models, to the analysis of the Canadian lynx time series, and introduces a data mining method combined with statistical knowledge to analyze the Canadian lynx data series.

  5. Visibility Graph Based Time Series Analysis

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
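    The basic natural-visibility construction that this family of methods builds on can be sketched as follows (a direct O(n²) implementation; the paper's segment-wise network-of-networks step is not reproduced here).

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph: nodes i<j are linked if no intermediate point blocks the line of sight."""
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = True
            for k in range(i + 1, j):
                # point k blocks (i, j) if it lies on or above the straight line between them
                if y[k] >= y[i] + (y[j] - y[i]) * (k - i) / (j - i):
                    visible = False
                    break
            if visible:
                edges.append((i, j))
    return edges

y = np.random.default_rng(5).normal(size=200)
print(len(visibility_edges(y)))
```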

  6. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS docume

  7. Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models

    NARCIS (Netherlands)

    Koopman, Siem Jan; Ooms, Marius

    2004-01-01

    We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the stand

  8. Measuring nonlinear behavior in time series data

    Science.gov (United States)

    Wai, Phoong Seuk; Ismail, Mohd Tahir

    2014-12-01

    A stationarity test is an important tool for examining time series behavior, since financial and economic data series often have missing data, structural changes, and jumps or breaks. Moreover, a nonlinear time series variable can be made stationary through a difference-stationary or trend-stationary process. Two types of stationarity hypothesis tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of the time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and the Lagrange multiplier is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of the oil price, the gold price and the Malaysian stock market. Moreover, the Quandt-Andrews, Bai-Perron and Chow tests are also used to detect the existence of breaks in the data series. The monthly index data range from December 1989 until May 2012. The results show that these three series exhibit nonlinear properties but can be transformed into stationary series after first differencing.
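    The two tests discussed above can be run directly with statsmodels; a minimal sketch, in which the data file name is a placeholder.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, kpss

x = np.loadtxt("gold_price.csv")            # hypothetical monthly index series

adf_stat, adf_p, *_ = adfuller(x)           # H0: unit root (non-stationary)
kpss_stat, kpss_p, *_ = kpss(x, regression="c", nlags="auto")   # H0: stationary around a constant

print(f"ADF  p-value: {adf_p:.3f}")
print(f"KPSS p-value: {kpss_p:.3f}")

# if both tests indicate non-stationarity, first differencing is the usual next step
dx = np.diff(x)
```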

  9. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  10. Spectra: Time series power spectrum calculator

    Science.gov (United States)

    Gallardo, Tabaré

    2017-01-01

    Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86 (4), 619). It is very efficient for the detection of low frequencies.

  11. Improving Intercomparability of Marine Biogeochemical Time Series

    Science.gov (United States)

    Benway, Heather M.; Telszewski, Maciej; Lorenzoni, Laura

    2013-04-01

    Shipboard biogeochemical time series represent one of the most valuable tools scientists have to quantify marine elemental fluxes and associated biogeochemical processes and to understand their links to changing climate. They provide the long, temporally resolved data sets needed to characterize ocean climate, biogeochemistry, and ecosystem variability and change. However, to monitor and differentiate natural cycles and human-driven changes in the global oceans, time series methodologies must be transparent and intercomparable when possible. To review current shipboard biogeochemical time series sampling and analytical methods, the International Ocean Carbon Coordination Project (IOCCP; http://www.ioccp.org/) and the Ocean Carbon and Biogeochemistry Program (http://www.us-ocb.org/) convened an international ocean time series workshop at the Bermuda Institute for Ocean Sciences.

  12. Combination prediction method of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    ZHAO DongHua; RUAN Jiong; CAI ZhiJie

    2007-01-01

    In the present paper, we propose an approach of combination prediction of chaotic time series. The method is based on the adding-weight one-rank local-region method of chaotic time series. The method allows us to define an interval containing a future value with a given probability, which is obtained by studying the prediction error distribution. Its effectiveness is shown with data generated by Logistic map.

  13. Pseudotime estimation: deconfounding single cell time series

    OpenAIRE

    John E Reid; Wernisch, Lorenz

    2016-01-01

    Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions o...

  14. FATS: Feature Analysis for Time Series

    CERN Document Server

    Nun, Isadora; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim

    2015-01-01

    In this paper, we present the FATS (Feature Analysis for Time Series) library. FATS is a Python library which facilitates and standardizes feature extraction for time series data. In particular, we focus on one application: feature extraction for astronomical light curve data, although the library is generalizable for other uses. We detail the methods and features implemented for light curve analysis, and present examples for its usage.

  15. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  16. Time Series Forecasting with Missing Values

    Directory of Open Access Journals (Sweden)

    Shin-Fu Wu

    2015-11-01

    Full Text Available Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, on the other hand, may alter the original time series. In this study, we propose a novel forecasting method based on least squares support vector machine (LSSVM. We employ the input patterns with the temporal information which is defined as local time index (LTI. Time series data as well as local time indexes are fed to LSSVM for doing forecasting without imputation. We compare the forecasting performance of our method with other imputation methods. Experimental results show that the proposed method is promising and is worth further investigations.

  17. Feature Matching in Time Series Modelling

    CERN Document Server

    Xia, Yingcun

    2011-01-01

    Using a time series model to mimic an observed time series has a long history. However, with regard to this objective, conventional estimation methods for discrete-time dynamical models are frequently found to be wanting. In the absence of a true model, we prefer an alternative approach to conventional model fitting that typically involves one-step-ahead prediction errors. Our primary aim is to match the joint probability distribution of the observable time series, including long-term features of the dynamics that underpin the data, such as cycles, long memory and others, rather than short-term prediction. For want of a better name, we call this specific aim feature matching. The challenges of model mis-specification, measurement errors and the scarcity of data are forever present in real time series modelling. In this paper, by synthesizing earlier attempts into an extended likelihood, we develop a systematic approach to empirical time series analysis to address these challenges and to aim at achieving...

  18. Predicting road accidents: Structural time series approach

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 is developed, and the number of road accidents is predicted using the structural time series approach. The models are developed using a stepwise method and the residuals of each step are analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE) and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that a local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on the conventional time series method.
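    A local linear trend model of the kind selected above can be fitted with statsmodels' unobserved-components implementation; the data file and the forecast horizon below are placeholders.

```python
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents

accidents = np.loadtxt("malaysia_accidents.csv")   # hypothetical yearly counts, 1970-2010

# local linear trend: both the level and the slope are allowed to vary over time
model = UnobservedComponents(accidents, level="local linear trend")
result = model.fit(disp=False)

print(result.summary())
print(result.forecast(steps=5))                    # five-year-ahead predictions
print(result.aic)                                  # for comparison with alternative specifications
```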

  19. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang;

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regard to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume… Existing approaches are either costly or require continuous maintenance. In this paper we propose an approach for approximate OLAP querying of time series that offers constant latency and is maintenance-free. To achieve this, we identify similarities between aggregation cuboids and propose algorithms that eliminate…

  20. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition ""…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics."" -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts.    Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  1. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 forecasting competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and non-ensemble methods.

  2. Complex network analysis of time series

    Science.gov (United States)

    Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen

    2016-12-01

    Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which allow to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, the complex network theory has been incorporated into the analysis of time series and fruitful achievements have been obtained. Complex network analysis of time series opens up new venues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain functions, ECG dynamics, economics and traffic systems.

  3. Improving the prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    李克平; 高自友; 陈天仑

    2003-01-01

    One of the features of deterministic chaos is sensitivity to initial conditions. This feature limits the prediction horizons of many chaotic systems. In this paper, we propose a new prediction technique for chaotic time series. In our method, neighbouring points of the predicted point for which the corresponding local Lyapunov exponent is particularly large are discarded when estimating the local dynamics, and thus the error accumulated by the prediction algorithm is reduced. The model is tested on the convection amplitude of the Lorenz system. The simulation results indicate that the prediction technique can improve the prediction of chaotic time series.

  4. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have now been extended and improved (together with Jan Holst) and are used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, …

  5. Dynamical networks reconstructed from time series

    CERN Document Server

    Levnajić, Zoran

    2012-01-01

    A novel method of reconstructing dynamical networks from empirically measured time series is proposed. By statistically examining the correlations between motions displayed by network nodes, we derive a simple equation that directly yields the adjacency matrix, assuming the intra-network interaction functions to be known. We illustrate the method's implementation on a simple example and discuss the dependence of the reconstruction precision on the properties of the time series. Our method is applicable to any network, allowing reconstruction precision to be maximized and errors to be estimated.

  6. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  7. Multifractal Analysis of Polyalanines Time Series

    CERN Document Server

    Figueirêdo, P H; Moret, M A; Coutinho, Sérgio; 10.1016/j.physa.2009.11.045

    2010-01-01

    Multifractal properties of the energy time series of short α-helix structures, specifically from a polyalanine family, are investigated through the MF-DFA technique (multifractal detrended fluctuation analysis). Estimates for the generalized Hurst exponent h(q) and its associated multifractal exponents τ(q) are obtained for several series generated by numerical simulations of molecular dynamics in different systems from distinct initial conformations. All simulations were performed using the GROMOS force field, implemented in the program THOR. The main results show that all series exhibit multifractal behavior depending on the number of residues and temperature. Moreover, the multifractal spectra reveal important aspects of the time evolution of the system and suggest that the nucleation process of the secondary structures during the visits on the energy hyper-surface is an essential feature of the folding process.

  8. Time Series Rule Discovery: Tough, not Meaningless

    NARCIS (Netherlands)

    Struzik, Z.R.

    2003-01-01

    `Model free' rule discovery from data has recently been subject to considerable criticism, which has cast a shadow over the emerging discipline of time series data mining. However, other than in data mining, rule discovery has long been the subject of research in statistical physics of complex pheno

  9. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    textabstractThis paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  10. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  11. Nonlinear time series modelling: an introduction

    OpenAIRE

    Simon M. Potter

    1999-01-01

    Recent developments in nonlinear time series modelling are reviewed. Three main types of nonlinear models are discussed: Markov Switching, Threshold Autoregression and Smooth Transition Autoregression. Classical and Bayesian estimation techniques are described for each model. Parametric tests for nonlinearity are reviewed with examples from the three types of models. Finally, forecasting and impulse response analysis is developed.

  12. Common large innovations across nonlinear time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    2002-01-01

    textabstractWe propose a multivariate nonlinear econometric time series model, which can be used to examine if there is common nonlinearity across economic variables. The model is a multivariate censored latent effects autoregression. The key feature of this model is that nonlinearity appears as sep

  13. Designer networks for time series processing

    DEFF Research Database (Denmark)

    Svarer, C; Hansen, Lars Kai; Larsen, Jan;

    1993-01-01

    The conventional tapped-delay neural net may be analyzed using statistical methods and the results of such analysis can be applied to model optimization. The authors review and extend efforts to demonstrate the power of this strategy within time series processing. They attempt to design compact...

  14. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to perform effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history to adapt our trading system behaviour based on them.

  15. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing application

  16. Remote Sensing Time Series Product Tool

    Science.gov (United States)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; however, the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high temporal revisit time (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced

  17. Time Series Forecasting: A Nonlinear Dynamics Approach

    OpenAIRE

    Sello, Stefano

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun's magnetic cy...

  18. Delay Differential Analysis of Time Series

    Science.gov (United States)

    Lainscsek, Claudia; Sejnowski, Terrence J.

    2015-01-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  19. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  20. Nonlinear Analysis of Physiological Time Series

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-fang; PENG Yu-hua; XUE Yu-li; HAN Min

    2007-01-01

    The heart rate variability could be explained by a low-dimensional governing mechanism. There has been increasing interest in verifying and understanding the coupling between the respiration and the heart rate. In this paper we use a nonlinear detection method to detect the nonlinear deterministic component in physiological time series, using a single-variable series and a two-variable series respectively, and we use the conditional information entropy to analyze the correlation between the heart rate, the respiration and the blood oxygen concentration. The conclusions are that there is a nonlinear deterministic component in the heart rate data and respiration data, and that the heart rate and the respiration are two variables originating from the same underlying dynamics.

  1. TIME SERIES FORECASTING USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2013-05-01

    Recent studies have shown the classification and prediction power of neural networks. It has been demonstrated that a NN can approximate any continuous function. Neural networks have been successfully used for forecasting financial data series. The classical methods used for time series prediction, like Box-Jenkins or ARIMA, assume that there is a linear relationship between inputs and outputs. Neural networks have the advantage that they can approximate nonlinear functions. In this paper we compared the performances of different feed-forward and recurrent neural networks and training algorithms for predicting the EUR/RON and USD/RON exchange rates. We used data series with daily exchange rates starting from 2005 until 2013.
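
    A minimal sketch of the kind of comparison described above is given below, using statsmodels for the linear ARIMA benchmark and scikit-learn for a small feed-forward network on lagged values. The synthetic random-walk series, lag window, and model orders are assumptions; the actual EUR/RON and USD/RON data and the recurrent architectures of the paper are not reproduced.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from sklearn.neural_network import MLPRegressor

      # Synthetic random-walk "exchange rate" (not the EUR/RON data): compare a
      # linear ARIMA forecast with a small feed-forward net trained on lags.
      rng = np.random.default_rng(42)
      rate = 4.0 + np.cumsum(0.002 * rng.standard_normal(1000))
      train, test = rate[:-50], rate[-50:]

      # Linear benchmark: ARIMA(1,1,1), forecasting the next 50 observations.
      arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=50)

      # Nonlinear benchmark: MLP on 5 lagged rates, iterated one step at a time.
      lags = 5
      X = np.array([train[i:i + lags] for i in range(len(train) - lags)])
      y = train[lags:]
      mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                         random_state=0).fit(X, y)

      history = list(train[-lags:])
      mlp_fc = []
      for _ in range(50):
          nxt = mlp.predict(np.array(history[-lags:]).reshape(1, -1))[0]
          mlp_fc.append(nxt)
          history.append(nxt)

      rmse = lambda f: np.sqrt(np.mean((np.asarray(f) - test) ** 2))
      print("ARIMA RMSE:", rmse(arima_fc), " MLP RMSE:", rmse(mlp_fc))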

  2. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...

  3. Visibility graphlet approach to chaotic time series.

    Science.gov (United States)

    Mutua, Stephen; Gu, Changgui; Yang, Huijie

    2016-05-01

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.

  4. Univariate time series forecasting algorithm validation

    Science.gov (United States)

    Ismail, Suzilah; Zakaria, Rohaiza; Muda, Tuan Zalizam Tuan

    2014-12-01

    Forecasting is a complex process which requires expert tacit knowledge in producing accurate forecast values. This complexity contributes to the gaps between end users and experts. Automating this process by using an algorithm can act as a bridge between them. An algorithm is a well-defined rule for solving a problem. In this study a univariate time series forecasting algorithm was developed in JAVA and validated using SPSS and Excel. Two sets of simulated data (yearly and non-yearly), several univariate forecasting techniques (i.e. Moving Average, Decomposition, Exponential Smoothing, Time Series Regressions and ARIMA) and a recent forecasting process (such as data partition, several error measures, recursive evaluation, etc.) were employed. The results of the algorithm successfully tally with the results of SPSS and Excel. This algorithm will not just benefit forecasters but also end users that lack in-depth knowledge of the forecasting process.
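
    The validation idea can be illustrated with a simplified Python analogue: run two basic univariate techniques on a holdout split and report standard error measures. The moving-average window, smoothing constant, and simulated series are assumptions; this is not the JAVA/SPSS/Excel pipeline of the study.

      import numpy as np

      def moving_average_forecast(history, window, horizon):
          # Flat forecast equal to the mean of the last `window` observations.
          return np.repeat(np.mean(history[-window:]), horizon)

      def ses_forecast(history, alpha, horizon):
          # Simple exponential smoothing; flat forecast at the last smoothed level.
          level = history[0]
          for obs in history[1:]:
              level = alpha * obs + (1 - alpha) * level
          return np.repeat(level, horizon)

      def mape(actual, forecast):
          return 100.0 * np.mean(np.abs((actual - forecast) / actual))

      rng = np.random.default_rng(7)
      series = 100 + np.cumsum(rng.standard_normal(120))   # simulated yearly data
      train, test = series[:-12], series[-12:]

      for name, fc in [("Moving average", moving_average_forecast(train, 4, 12)),
                       ("Exp. smoothing", ses_forecast(train, 0.3, 12))]:
          print(f"{name}: MAPE = {mape(test, fc):.2f}%, "
                f"MAE = {np.mean(np.abs(test - fc)):.2f}")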

  5. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, C; Toft, P; Rostrup, E

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate whether sets of voxels are activated in a similar way or in different ways. Typically, delays between two activated signals are not identified. In this article, we use clustering methods to detect similarities in activation between voxels. We employ a novel metric that measures the similarity between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.
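
    An illustrative sketch of clustering voxel time series (using plain k-means on z-scored voxels rather than the paper's novel similarity metric and dedicated algorithms) might look as follows; the block-design stimulus, voxel counts, and cluster number are assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      # Illustration only (not the paper's algorithms): cluster voxel time series
      # after z-scoring each voxel, so that Euclidean distance is monotonically
      # related to correlation.
      rng = np.random.default_rng(3)
      n_voxels, n_scans = 200, 120
      stimulus = (np.arange(n_scans) % 20 < 10).astype(float)   # block paradigm

      voxels = rng.standard_normal((n_voxels, n_scans))
      voxels[:50] += 2.0 * stimulus                 # "activated" voxels, no delay
      voxels[50:80] += 2.0 * np.roll(stimulus, 3)   # activated with a 3-scan delay

      z = (voxels - voxels.mean(axis=1, keepdims=True)) / voxels.std(axis=1, keepdims=True)
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
      for k in range(3):
          print(f"cluster {k}: {np.sum(labels == k)} voxels")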

  6. Revisiting algorithms for generating surrogate time series

    CERN Document Server

    Raeth, C; Papadakis, I E; Brinkmann, W

    2011-01-01

    The method of surrogates is one of the key concepts of nonlinear data analysis. Here, we demonstrate that commonly used algorithms for generating surrogates often fail to generate truly linear time series. Rather, they create surrogate realizations with Fourier phase correlations leading to non-detections of nonlinearities. We argue that reliable surrogates can only be generated, if one tests separately for static and dynamic nonlinearities.
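
    For reference, the standard phase-randomization algorithm that such studies start from can be sketched in a few lines of Python: keep the Fourier amplitudes, randomize the phases, and invert the transform. The test signal is an assumption, and the subtle phase-correlation pitfalls the paper criticizes are not addressed by this sketch.

      import numpy as np

      # Standard phase-randomized Fourier surrogate: preserves the periodogram
      # of the data while randomizing the Fourier phases.
      def fourier_surrogate(x, rng=None):
          x = np.asarray(x, dtype=float)
          rng = np.random.default_rng() if rng is None else rng
          spectrum = np.fft.rfft(x)
          phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.size)
          phases[0] = 0.0                      # keep the DC component real
          if x.size % 2 == 0:
              phases[-1] = 0.0                 # keep the Nyquist component real
          return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size)

      rng = np.random.default_rng(11)
      x = np.sin(np.linspace(0, 40 * np.pi, 1024)) + 0.3 * rng.standard_normal(1024)
      s = fourier_surrogate(x, rng)
      # The power spectra agree; the time-domain waveforms do not.
      print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))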

  7. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    genetic algorithms can generate a sequence of events to maximize some functions or the likelihood to achieve the assumed goals. With reference...Reinforcement learning is not the same as relational time-series learning mainly because its main focus is to learn a set of policies to maximize the...scope blending, and has been applied to machine poetry generation [48] and the generation of animation characters [49]. Tan and Kowk [50] applied the

  8. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it will be demonstrated how various time series problems can be met using Proc Varmax. The procedure is rather new, and hence new features like cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box & Jenkins is performed in a more modern way, using the computer resources which are now available.

  9. Normalizing the causality between time series

    CERN Document Server

    Liang, X San

    2015-01-01

    Recently, a rigorous yet concise formula has been derived to evaluate the information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing three types of fundamental mechanisms that govern the marginal entropy change of the flow recipient. A normalized or relative flow measures its importance relative to other mechanisms. In analyzing realistic series, both absolute and relative information flows need to be taken into account, since the normalizers for a pair of reverse flows belong to two different entropy balances; it is quite normal that two identical flows may differ a lot in relative importance in their respective balances. We have reproduced these results with several autoregressive models. We have also shown applications to a climate change problem and a financial analysis problem. For the former, reconfirmed is the role of the Indian Ocean Dipole as ...

  10. Argos: An Optimized Time-Series Photometer

    Indian Academy of Sciences (India)

    Anjum S. Mukadam; R. E. Nather

    2005-06-01

    We designed a prime focus CCD photometer, Argos, optimized for high speed time-series measurements of blue variables (Nather & Mukadam 2004) for the 2.1 m telescope at McDonald Observatory. Lack of any intervening optics between the primary mirror and the CCD makes the instrument highly efficient. We measure an improvement in sensitivity by a factor of nine over the 3-channel PMT photometers used on the same telescope and for the same exposure time. The CCD frame transfer operation triggered by GPS-synchronized pulses serves as an electronic shutter for the photometer. This minimizes the dead time between exposures, but more importantly, allows a precise control of the start and duration of the exposure. We expect the uncertainty in our timing to be less than 100 μs.

  11. Directed networks with underlying time structures from multivariate time series

    CERN Document Server

    Tanizawa, Toshihiro; Taya, Fumihiko

    2014-01-01

    In this paper we propose a method of constructing directed networks of time-dependent phenomena from multivariate time series. As the construction method is based on the linear model, the network fully reflects dynamical features of the system such as the time structures of periodicities. Furthermore, this method can construct networks even if these time series show no similarity: situations in which common methods fail. We explicitly introduce a case where common methods do not work. This fact indicates the importance of constructing networks from a dynamical perspective when we consider time-dependent phenomena. We apply the method to multichannel electroencephalography (EEG) data and the result reveals underlying interdependency among the components in the brain system.

  12. Fractal fluctuations in cardiac time series

    Science.gov (United States)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systematic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
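
    The aggregation procedure behind the relative-dispersion analysis can be sketched as follows; the synthetic, uncorrelated "RR interval" series and the block sizes are assumptions, so the fitted exponent here simply illustrates the uncorrelated baseline of about -0.5 rather than the cardiac result.

      import numpy as np

      # Aggregate the series at increasing block sizes, compute the relative
      # dispersion (std/mean) at each scale, and fit a power law.
      def relative_dispersion(x, block_sizes):
          rd = []
          for m in block_sizes:
              n = len(x) // m
              agg = np.asarray(x[:n * m]).reshape(n, m).mean(axis=1)
              rd.append(agg.std() / agg.mean())
          return np.array(rd)

      rng = np.random.default_rng(5)
      series = 0.8 + 0.05 * rng.standard_normal(2 ** 14)   # uncorrelated toy "RR intervals"
      blocks = np.array([1, 2, 4, 8, 16, 32, 64])
      rd = relative_dispersion(series, blocks)

      # Slope of log(RD) vs log(block size): about -0.5 for uncorrelated noise,
      # shallower (closer to 0) for long-memory, fractal-like fluctuations.
      slope = np.polyfit(np.log(blocks), np.log(rd), 1)[0]
      print(f"scaling exponent: {slope:.2f}")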

  13. Time Series Forecasting A Nonlinear Dynamics Approach

    CERN Document Server

    Sello, S

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun's magnetic cycle. Starting from a previous recent work, we checked the reliability and accuracy of a forecasting model based on concepts of nonlinear dynamical systems applied to experimental time series, such as embedding phase space, Lyapunov spectrum, and chaotic behaviour. The model is based on a local hypothesis of the behaviour in the embedding space, utilizing an optimal number k of neighbour vectors to predict the future evolution of the current point with the set of characteristic parameters determined by several previous paramet...

  14. Time Series Photometry of KZ Lacertae

    Science.gov (United States)

    Joner, Michael D.

    2016-01-01

    We present BVRI time series photometry of the high amplitude delta Scuti star KZ Lacertae secured using the 0.9-meter telescope located at the Brigham Young University West Mountain Observatory. In addition to the multicolor light curves that are presented, the V data from the last six years of observations are used to plot an O-C diagram in order to determine the ephemeris and evaluate evidence for period change. We wish to thank the Brigham Young University College of Physical and Mathematical Sciences as well as the Department of Physics and Astronomy for their continued support of the research activities at the West Mountain Observatory.

  15. Time series modeling for automatic target recognition

    Science.gov (United States)

    Sokolnikov, Andre

    2012-05-01

    Time series modeling is proposed for the identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke and other factors obscuring and distorting the image. The complex of library data (of images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while the partial image features, distorted parts, irrelevant pieces and absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that helps the prediction process for image creation or reconstruction. The results are provided.

  16. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    The general-to-specific approach to the detection of structural change, currently implemented in Autometrics via indicator saturation, has proven to be both practical and effective in the context of stationary dynamic regression models and unit-root autoregressions. By focusing on impulse- and step-indicator saturation, we investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality...

  17. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis. Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  18. Modeling noisy time series Physiological tremor

    CERN Document Server

    Timmer, J

    1998-01-01

    Empirical time series often contain observational noise. We investigate the effect of this noise on the estimated parameters of models fitted to the data. For data of physiological tremor, i.e. a small amplitude oscillation of the outstretched hand of healthy subjects, we compare the results for a linear model that explicitly includes additional observational noise to one that ignores this noise. We discuss problems and possible solutions for nonlinear deterministic as well as nonlinear stochastic processes. In particular, we discuss the state space model, applicable for modeling noisy stochastic systems, and Bock's algorithm, capable of modeling noisy deterministic systems.

  19. Time Series Analysis of SOLSTICE Measurements

    Science.gov (United States)

    Wen, G.; Cahalan, R. F.

    2003-12-01

    Solar radiation is the major energy source for the Earth's biosphere and atmospheric and ocean circulations. Variations of solar irradiance have been a major concern of scientists both in solar physics and atmospheric sciences. A number of missions have been carried out to monitor changes in total solar irradiance (TSI) [see Fröhlich and Lean, 1998 for review] and spectral solar irradiance (SSI) [e.g., SOLSTICE on UARS and VIRGO on SOHO]. Observations over a long time period reveal the connection between variations in solar irradiance and surface magnetic fields of the Sun [Lean1997]. This connection provides a guide to scientists in modeling solar irradiances [e.g., Fontenla et al., 1999; Krivova et al., 2003]. Solar spectral observations have now been made over a relatively long time period, allowing statistical analysis. This paper focuses on predictability of solar spectral irradiance using observed SSI from SOLSTICE . Analysis of predictability is based on nonlinear dynamics using an artificial neural network in a reconstructed phase space [Abarbanel et al., 1993]. In the analysis, we first examine the average mutual information of the observed time series and a delayed time series. The time delay that gives local minimum of mutual information is chosen as the time-delay for phase space reconstruction [Fraser and Swinney, 1986]. The embedding dimension of the reconstructed phase space is determined using the false neighbors and false strands method [Kennel and Abarbanel, 2002]. Subsequently, we use a multi-layer feed-forward network with back propagation scheme [e.g., Haykin, 1994] to model the time series. The predictability of solar irradiance as a function of wavelength is considered. References Abarbanel, H. D. I., R. Brown, J. J. Sidorowich, and L. Sh. Tsimring, Rev. Mod. Phys. 65, 1331, 1993. Fraser, A. M. and H. L. Swinney, Phys. Rev. 33A, 1134, 1986. Fontenla, J., O. R. White, P. Fox, E. H. Avrett and R. L. Kurucz, The Astrophysical Journal, 518, 480

  20. An introduction to state space time series analysis.

    NARCIS (Netherlands)

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor wi

  1. Nonlinear Time Series Analysis Since 1990: Some Personal Reflections

    Institute of Scientific and Technical Information of China (English)

    Howel Tong

    2002-01-01

    I reflect upon the development of nonlinear time series analysis since 1990 by focusing on five major areas of development. These areas include the interface between nonlinear time series analysis and chaos, the nonparametric/semiparametric approach, nonlinear state space modelling, financial time series and nonlinear modelling of panels of time series.

  2. Ensemble vs. time averages in financial time series analysis

    Science.gov (United States)

    Seemann, Lars; Hua, Jia-Chen; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2012-12-01

    Empirical analysis of financial time series suggests that the underlying stochastic dynamics are not only non-stationary, but also exhibit non-stationary increments. However, financial time series are commonly analyzed using the sliding interval technique that assumes stationary increments. We propose an alternative approach that is based on an ensemble over trading days. To determine the effects of time averaging techniques on analysis outcomes, we create an intraday activity model that exhibits periodic variable diffusion dynamics and we assess the model data using both ensemble and time averaging techniques. We find that ensemble averaging techniques detect the underlying dynamics correctly, whereas sliding intervals approaches fail. As many traded assets exhibit characteristic intraday volatility patterns, our work implies that ensemble averages approaches will yield new insight into the study of financial markets’ dynamics.

  3. Partial spectral analysis of hydrological time series

    Science.gov (United States)

    Jukić, D.; Denić-Jukić, V.

    2011-03-01

    Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. It implies that an ambiguity with respect to the processes encoded in spectral and cross-spectral density functions exists. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions represent the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. The mathematical background for partial spectral density, partial amplitude and partial phase functions is presented. The proposed functions yield the estimates of spectral density, amplitude and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished considering the input event as a referent point. The method is used for analyses of the relations between the rainfall, air temperature and relative humidity, as well as the influences of air temperature and relative humidity on the discharge from a karst spring. Time series are collected in the catchment of the Jadro Spring located in the Dinaric karst area of Croatia.

  4. Forecasting the Time Series of Sunspot Numbers

    Science.gov (United States)

    Aguirre, L. A.; Letellier, C.; Maquet, J.

    2008-05-01

    Forecasting the solar cycle is of great importance for weather prediction and environmental monitoring, and also constitutes a difficult scientific benchmark in nonlinear dynamical modeling. This paper describes the identification of a model and its use in forecasting the time series of Wolf’s sunspot numbers. A key feature of this procedure is that the original time series is first transformed into a symmetrical space where the dynamics of the solar dynamo are unfolded in a better way, thus improving the model. The nonlinear model obtained is parsimonious and has both deterministic and stochastic parts. Monte Carlo simulation of the whole model produces results that are very consistent with the deterministic part of the model but allows for the determination of confidence bands. The obtained model was used to predict cycles 24 and 25, although the forecast of the latter is seen as a crude approximation, given the long prediction horizon required. As for the 24th cycle, two estimates were obtained with peaks of 65±16 and of 87±13 units of sunspot numbers. The simulated results suggest that the 24th cycle will be shorter and less active than the preceding one.

  5. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as negligence may lead to inaccurate model predictions. This paper derives generally applicable recommendations, no matter whether a change in persistence occurs or not. Seven different forecasting strategies based on a bias-corrected estimator are compared by means of a large-scale Monte Carlo study. The results for decreasing and increasing persistence are highly asymmetric and new to the literature. Its good predictive ability and its balanced performance among different settings strongly advocate the use of forecasting strategies based on the Bai-Perron procedure.

  6. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    We present the architecture of a “useful pattern” mining system that is capable of detecting thousands of different candlestick sequence patterns at the tick or any higher granularity levels. The system architecture is highly distributed and performs most of its highly compute-intensive aggregation calculations as complex but efficient distributed SQL queries on the relational databases that store the time-series. We present initial results from mining all frequent candlestick sequences with the characteristic property that when they occur then, with an average at least 60% probability, they signal a 2% or higher increase (or, alternatively, decrease) in a chosen property of the stock (e.g. close-value) within a given time-window (e.g. 5 days). Initial results from a first prototype implementation of the architecture show that after training on a large set of stocks, the system is capable of finding...

  7. Learning with Latent Factors in Time Series

    CERN Document Server

    Jalali, Ali

    2011-01-01

    This paper considers the problem of learning, from samples, the dependency structure of a system of linear stochastic differential equations, when some of the variables are latent. In particular, we observe the time evolution of some variables, and never observe other variables; from this, we would like to find the dependency structure between the observed variables -- separating out the spurious interactions caused by the (marginalizing out of the) latent variables' time series. We develop a new method, based on convex optimization, to do so in the case when the number of latent variables is smaller than the number of observed ones. For the case when the dependency structure between the observed variables is sparse, we theoretically establish a high-dimensional scaling result for structure recovery. We verify our theoretical result with both synthetic and real data (from the stock market).

  8. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
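
    A hedged sketch of the Holt-Winters forecasting step on a synthetic daily count series with a weekly pattern is shown below, using statsmodels' ExponentialSmoothing and the MedAPE criterion from the comparison; the simulated counts, holdout length, and additive trend/seasonal specification are assumptions, not the study's syndromic data streams.

      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # Holt-Winters forecast on a synthetic daily count series; in a surveillance
      # setting, residuals = observation - forecast would feed the detection step.
      rng = np.random.default_rng(2)
      days = np.arange(8 * 56)                                  # about 16 months of daily counts
      dow = 1.0 + 0.4 * np.sin(2 * np.pi * days / 7.0)          # weekly (day-of-week) pattern
      counts = 50 * dow + 5 * rng.standard_normal(days.size)

      train, test = counts[:-28], counts[-28:]
      model = ExponentialSmoothing(train, trend="add", seasonal="add",
                                   seasonal_periods=7).fit()
      forecast = model.forecast(28)

      medape = 100.0 * np.median(np.abs((test - forecast) / test))
      print(f"MedAPE over the 28-day holdout: {medape:.1f}%")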

  9. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Trend prediction of chaotic time series is an interesting problem in time series analysis and time series data mining (TSDM) fields [1]. TSDM-based methods can successfully characterize and predict complex, irregular, and chaotic time series. Some methods have been proposed to predict the trend of chaotic time series. To our knowledge, these methods can be classified into two categories as follows. The first category is based on the embedded space [2-3], where raw time series data is mapped to a reconstructed phase spac...

  10. Periodograms for multiband astronomical time series

    Science.gov (United States)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data, developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data.

  11. Correcting and combining time series forecasters.

    Science.gov (United States)

    Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E

    2014-02-01

    Combined forecasters have been in the vanguard of stochastic time series modeling. In this context it has been usual to suppose that each single model generates a residual or prediction error resembling white noise. However, mostly because of disturbances not captured by each model, it is still possible that this supposition is violated. The present paper introduces a two-step method for correcting and combining forecasting models. Firstly, the stochastic process underlying the bias of each predictive model is built according to a recursive ARIMA algorithm in order to achieve white noise behavior. At each iteration of the algorithm the best ARIMA adjustment is determined according to a given information criterion (e.g. Akaike). Then, in the light of the corrected predictions, a maximum likelihood combined estimator is considered. Applications involving single ARIMA and artificial neural network models for the Dow Jones Industrial Average Index, S&P500 Index, Google Stock Value, and Nasdaq Index series illustrate the usefulness of the proposed framework.
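
    The correct-then-combine idea can be sketched as follows: fit an ARIMA model to each base forecaster's in-sample residuals, add its forecast back as a correction, and combine the corrected forecasts. The two toy base forecasters, the fixed ARIMA(1,0,1) order, and the inverse-variance weighting (used here as a stand-in for the paper's maximum likelihood combiner and recursive order selection) are assumptions.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(4)
      y = np.cumsum(rng.standard_normal(500)) + 0.05 * np.arange(500)
      train, test = y[:-20], y[-20:]

      def naive_forecast(h, hist):        # base model 1: last value carried forward
          return np.repeat(hist[-1], h)

      def drift_forecast(h, hist):        # base model 2: random walk with drift
          slope = (hist[-1] - hist[0]) / (len(hist) - 1)
          return hist[-1] + slope * np.arange(1, h + 1)

      corrected, weights = [], []
      for fc in (naive_forecast, drift_forecast):
          # In-sample one-step residuals of the base forecaster.
          resid = np.array([train[t] - fc(1, train[:t])[0]
                            for t in range(50, len(train))])
          # Model the residual (bias) process and forecast it forward.
          correction = ARIMA(resid, order=(1, 0, 1)).fit().forecast(steps=20)
          corrected.append(fc(20, train) + correction)
          weights.append(1.0 / np.var(resid))

      weights = np.array(weights) / np.sum(weights)
      combined = weights[0] * corrected[0] + weights[1] * corrected[1]
      print("RMSE combined:", np.sqrt(np.mean((combined - test) ** 2)))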

  12. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate whether sets of voxels are activated in a similar way or in different ways. Typically, delays between two activated signals are not identified. In this article, we use clustering methods to detect similarities in activation between voxels. We employ a novel metric that measures the similarity between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.

  13. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  14. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian;

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength and even causality direction in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise injections in intermediate-to-strongly coupled systems could enable more accurate causal inferences. Given the inherent noisy nature of real-world systems, our findings enable a more accurate evaluation of CCM applicability and advance suggestions on how to overcome its weaknesses.

  15. Highly comparative, feature-based time-series classification

    CERN Document Server

    Fulcher, Ben D

    2014-01-01

    A highly comparative, feature-based approach to time series classification is introduced that uses an extensive database of algorithms to extract thousands of interpretable features from time series. These features are derived from across the scientific time-series analysis literature, and include summaries of time series in terms of their correlation structure, distribution, entropy, stationarity, scaling properties, and fits to a range of time-series models. After computing thousands of features for each time series in a training set, those that are most informative of the class structure are selected using greedy forward feature selection with a linear classifier. The resulting feature-based classifiers automatically learn the differences between classes using a reduced number of time-series properties, and circumvent the need to calculate distances between time series. Representing time series in this way results in orders of magnitude of dimensionality reduction, allowing the method to perform well on ve...
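
    A toy version of the feature-based pipeline (a handful of features and a plain linear classifier, without the thousands of features and the greedy forward selection of the paper) can be written as follows; the two synthetic classes and the chosen features are assumptions.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Summarize each series by a few interpretable statistics, then classify.
      def features(x):
          x = np.asarray(x, dtype=float)
          ac1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # lag-1 autocorrelation
          return np.array([x.mean(), x.std(), ac1,
                           np.mean(np.abs(np.diff(x)))])  # mean absolute increment

      rng = np.random.default_rng(8)
      n, length = 200, 300
      noise = [rng.standard_normal(length) for _ in range(n)]        # class 0: white noise
      ar1 = []
      for _ in range(n):                                             # class 1: AR(1) series
          x = np.zeros(length)
          for t in range(1, length):
              x[t] = 0.8 * x[t - 1] + rng.standard_normal()
          ar1.append(x)

      X = np.array([features(s) for s in noise + ar1])
      y = np.array([0] * n + [1] * n)
      Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
      print("holdout accuracy:", clf.score(Xte, yte))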

  16. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    VanderPlas, Jacob T. [eScience Institute, University of Washington, Seattle, WA (United States); Ivezic, Željko [Department of Astronomy, University of Washington, Seattle, WA (United States)

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.

  17. Timing calibration and spectral cleaning of LOFAR time series data

    Science.gov (United States)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.

  18. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Background: Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods: Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results: Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions: Time series methods applied to historical ED utilization data are an important tool

  19. Timing calibration and spectral cleaning of LOFAR time series data

    CERN Document Server

    Corstanje, A; Enriquez, J E; Falcke, H; Hörandel, J R; Krause, M; Nelles, A; Rachen, J P; Schellart, P; Scholten, O; ter Veen, S; Thoudam, S; Trinh, T N G

    2016-01-01

    We describe a method for spectral cleaning and timing calibration of short voltage time series data from individual radio interferometer receivers. It makes use of the phase differences in Fast Fourier Transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw tim...

  20. Time series models of symptoms in schizophrenia.

    Science.gov (United States)

    Tschacher, Wolfgang; Kupper, Zeno

    2002-12-15

    The symptom courses of 84 schizophrenia patients (mean age: 24.4 years; mean previous admissions: 1.3; 64% males) of a community-based acute ward were examined to identify dynamic patterns of symptoms and to investigate the relation between these patterns and treatment outcome. The symptoms were monitored by systematic daily staff ratings using a scale composed of three factors: psychoticity, excitement, and withdrawal. Patients showed moderate to high symptomatic improvement documented by effect size measures. Each of the 84 symptom trajectories was analyzed by time series methods using vector autoregression (VAR) that models the day-to-day interrelations between symptom factors. Multiple and stepwise regression analyses were then performed on the basis of the VAR models. Two VAR parameters were found to be associated significantly with favorable outcome in this exploratory study: 'withdrawal preceding a reduction of psychoticity' as well as 'excitement preceding an increase of withdrawal'. The findings were interpreted as generating hypotheses about how patients cope with psychotic episodes.
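
    The day-to-day VAR modelling step can be sketched on simulated ratings of the three symptom factors; the cross-lagged coefficient matrix, noise level, and series length below are assumptions, and the outcome-related parameters discussed in the paper are not derived here.

      import numpy as np
      from statsmodels.tsa.api import VAR

      # Simulate daily ratings of three symptom factors (psychoticity, excitement,
      # withdrawal) from a hypothetical lag-1 structure, then fit a VAR(1).
      rng = np.random.default_rng(6)
      days = 60
      A = np.array([[0.6, 0.1, -0.2],     # hypothetical cross-lagged coefficients
                    [0.0, 0.5,  0.1],
                    [0.2, 0.0,  0.6]])
      x = np.zeros((days, 3))
      for t in range(1, days):
          x[t] = A @ x[t - 1] + 0.5 * rng.standard_normal(3)

      results = VAR(x).fit(maxlags=1)
      print(results.coefs[0].round(2))    # estimated lag-1 coefficient matrix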

  1. Testing whether a time series is Gaussian

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    1991-01-01

    The author first tests whether a stationary linear process with mean 0 is Gaussian. For invertible processes, he considers the empirical process based on the residuals as the basis of a test procedure. By applying the results of Boldin (1983) and Kreiss (1988), he shows that the process behaves asymptotically like the one based on the true errors. For non-invertible processes, on the other hand, Lee uses the empirical process based on the data themselves rather than the one based on residuals. Here, the time series is assumed to be a strongly mixing process with a suitable mixing order. Then, the asymptotic behavior of the empirical process in each case is studied under a sequence of contiguous alternatives, and quadratic functionals of the empirical process are employed for AR(∞) processes in order to compare efficiencies between these two procedures. The rest of the thesis is devoted to extending Boldin's results to nonstationary processes such as unstable AR(p) processes and explosive AR(1) processes, analyzed by means of a general stochastic regression model.

  2. Spectral Estimation of Non-Gaussian Time Series

    OpenAIRE

    Fabián, Z. (Zdeněk)

    2010-01-01

    Based on the concept of the scalar score of a probability distribution, we introduce a concept of a scalar score of time series and propose to characterize a non-Gaussian time series by spectral density of its scalar score.

  3. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  4. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  5. An introduction to state space time series analysis.

    OpenAIRE

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor with state space methods. The only background required in order to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is...

  6. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research by presenting a simple and efficient computational scheme: a linear regression pattern transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.

  7. Seasonal Time Series Analysis Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Pattern discovery from seasonal time series is important. Traditionally, most algorithms for pattern discovery in time series are similar. A novel time-series model is proposed which integrates the Genetic Algorithm (GA) for this problem. Experiments on electric power yield sequence models show that the algorithm is practicable and effective.

  8. Generalized Framework for Similarity Measure of Time Series

    Directory of Open Access Journals (Sweden)

    Hongsheng Yin

    2014-01-01

    Currently, there is no definitive and uniform description for the similarity of time series, which results in difficulties for relevant research on this topic. In this paper, we propose a generalized framework to measure the similarity of time series. In this generalized framework, whether the time series is univariable or multivariable, and linearly or nonlinearly transformed, the similarity of time series is uniformly defined using norms of vectors or matrices. The definitions of the similarity of time series in the original space and the transformed space are proved to be equivalent. Furthermore, we also extend the theory on similarity of univariable time series to multivariable time series. We present some experimental results on published time series datasets tested with the proposed similarity measure function of time series. Through the proofs and experiments, it can be claimed that the similarity measure functions of linear multivariable time series based on the norm distance of the covariance matrix and of nonlinear multivariable time series based on kernel functions are reasonable and practical.

  9. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  10. Time and ensemble averaging in time series analysis

    CERN Document Server

    Latka, Miroslaw; Jernajczyk, Wojciech; West, Bruce J

    2010-01-01

    In many applications expectation values are calculated by partitioning a single experimental time series into an ensemble of data segments of equal length. Such a single trajectory ensemble (STE) is a counterpart to a multiple trajectory ensemble (MTE) used whenever independent measurements or realizations of a stochastic process are available. The equivalence of STE and MTE for stationary systems was postulated by Wang and Uhlenbeck in their classic paper on Brownian motion (Rev. Mod. Phys. 17, 323 (1945)) but surprisingly has not yet been proved. Using the stationary and ergodic paradigm of statistical physics -- the Ornstein-Uhlenbeck (OU) Langevin equation -- we revisit Wang and Uhlenbeck's postulate. In particular, we find that the variance of the solution of this equation is different for these two ensembles. While the variance calculated using the MTE quantifies the spreading of independent trajectories originating from the same initial point, the variance for STE measures the spreading of two correlated r...

  11. Scale-dependent intrinsic entropies of complex time series.

    Science.gov (United States)

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
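
    A compact Python sketch of the coarse-graining and sample-entropy steps that underlie MSE. This is a generic illustration, not the authors' EMD-based variant; the tolerance r, the template length m and the set of scales are arbitrary choices.

        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            """SampEn: negative log of the ratio of (m+1)- to m-length template matches."""
            x = np.asarray(x, float)
            tol = r * x.std()

            def matches(length):
                t = np.array([x[i:i + length] for i in range(len(x) - length)])
                d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
                return np.sum(d <= tol) - len(t)  # drop self-matches

            return -np.log(matches(m + 1) / matches(m))

        def multiscale_entropy(x, scales=(1, 2, 4, 8)):
            """Coarse-grain x by non-overlapping means at each scale, then compute SampEn."""
            out = {}
            for s in scales:
                n = len(x) // s
                out[s] = sample_entropy(np.asarray(x[:n * s]).reshape(n, s).mean(axis=1))
            return out

        rng = np.random.default_rng(2)
        print(multiscale_entropy(rng.normal(size=1000)))  # white noise: entropy drops with scale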

  12. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
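
    A toy dynamic-programming sketch of the segmentation idea, under stated assumptions: the measure function is taken to be the union of the item sets in a segment, the segment difference is a symmetric-difference count, and the series is invented. It is not claimed to match the paper's exact definitions or its efficient algorithms.

        from functools import lru_cache

        # Item-set time series: one set of discrete items per time point (toy data)
        series = [{"a"}, {"a", "b"}, {"a", "b"}, {"c"}, {"c", "d"}, {"c", "d", "e"}]

        def segment_diff(i, j):
            """Difference between segment [i, j) and its time points, using the union
            of the segment's item sets as an (illustrative) measure function."""
            seg_set = set().union(*series[i:j])
            return sum(len(seg_set ^ s) for s in series[i:j])

        def optimal_segmentation(k):
            """Split the series into k segments minimising the total segment difference."""
            n = len(series)

            @lru_cache(maxsize=None)
            def best(i, parts):              # best cost of covering series[i:] with `parts` segments
                if parts == 1:
                    return segment_diff(i, n), [(i, n)]
                options = []
                for j in range(i + 1, n - parts + 2):   # leave room for the remaining segments
                    cost, segs = best(j, parts - 1)
                    options.append((segment_diff(i, j) + cost, [(i, j)] + segs))
                return min(options, key=lambda t: t[0])

            return best(0, k)

        print(optimal_segmentation(2))        # expect a split between the "a/b" and "c/d/e" regimes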

  13. Sparse Representation for Time-Series Classification

    Science.gov (United States)

    2015-02-08


  14. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that we have a finite set of methods. First of all, this is a model for the transformation of time series values, which prepares data used by different sets of methods based on the same transformation model in a domain of problem space. The REFII model gives a new approach to time series analysis based on a unique model of transformation, which is a base for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  15. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time series, with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time series using type-2 fuzzy sets for prediction of the time series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time series and uses a stochastic automaton to predict the most probable structure at a given partition of the time series. Such predictions help in determining probabilistic moves in a stock index time series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  16. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model in which the claim process is described by a linear time series model. Time is discretized stochastically using the instants at which claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds for the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.

  17. On correlations and fractal characteristics of time series

    CERN Document Server

    Vitanov, Nikolay K.; Sakai, Kenshi; Yankulova, Elka D.

    2005-01-01

    Correlation analysis is a convenient and frequently used tool for the investigation of time series from complex systems. Recently new methods such as the multifractal detrended fluctuation analysis (MFDFA) and the wavelet transform modulus maxima method (WTMM) have been developed. By means of these methods (i) we can investigate long-range correlations in time series and (ii) we can calculate fractal spectra of these time series. But in contrast to the classical tool for correlation analysis - the autocorrelation function - the newly developed tools are not applicable to all kinds of time series. Inappropriate application of MFDFA or WTMM leads to wrong results and conclusions. In this article we discuss the opportunities and risks connected to the application of the MFDFA method to time series from a random number generator and to experimentally measured time series (i) for accelerations of an agricultural tractor and (ii) for the heartbeat activity of Drosophila melanogaster. Our main goal is to emphasize ...

  18. Clustering Time Series Data Stream - A Literature Survey

    CERN Document Server

    Kavitha, V

    2010-01-01

    Mining time series data has attracted tremendous interest in recent years. To provide an indication, various implementations are studied and summarized to identify the different problems in existing applications. Clustering time series is a problem that has applications in a wide assortment of fields and has recently attracted a large amount of research. Time series data are frequently large and may contain outliers. In addition, time series are a special type of data set in which elements have a temporal ordering. Therefore clustering of such data streams is an important issue in the data mining process. Numerous techniques and clustering algorithms have been proposed to assist the clustering of time series data streams. The clustering algorithms and their effectiveness on various applications are compared in order to develop a new method to solve the existing problem. This paper presents a survey of the various clustering algorithms available for time series datasets. Moreover, the distinctiveness and restriction ...

  19. Non-parametric causal inference for bivariate time series

    CERN Document Server

    McCracken, James M

    2015-01-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  20. Predicting Chaotic Time Series Using Recurrent Neural Network

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jia-Shu; XIAO Xian-Ci

    2000-01-01

    A newly proposed method, the recurrent neural network (RNN), is introduced to predict chaotic time series. The effectiveness of using an RNN for making one-step and multi-step predictions is tested on remarkably few data points from computer-generated chaotic time series. Numerical results show that the RNN proposed here is a very powerful tool for the prediction of chaotic time series.

  1. Information distance and its application in time series

    Directory of Open Access Journals (Sweden)

    B. Mirza

    2008-03-01

    In this paper a new method is introduced for studying the time series of complex systems. The method is based on the concepts of entropy and the Jensen-Shannon divergence. It is applied here to time series of a billiard system and to heart signals. With this method we can distinguish healthy from unhealthy hearts, and chaotic from non-chaotic billiards. The method can also be applied to other time series.

  2. Intrusion Detection Forecasting Using Time Series for Improving Cyber Defence

    OpenAIRE

    Abdullah, Azween Bin; Pillai, Thulasyammal Ramiah; Cai, Long Zheng

    2015-01-01

    The strength of time series modeling is generally not exploited in current intrusion detection and prevention systems. With time series models, system administrators will be able to better plan resource allocation and system readiness to defend against malicious activities. In this paper, we address this knowledge gap by investigating the possible inclusion of statistically based time series modeling that can be seamlessly integrated into existing cyber defense systems. Cyber-attack ...

  3. Forecasting the underlying potential governing climatic time series

    CERN Document Server

    Livina, V N; Mudelsee, M; Lenton, T M

    2012-01-01

    We introduce a technique of time series analysis, potential forecasting, which is based on dynamical propagation of the probability density of time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for `potential analysis' of climatic tipping points which altogether serves anticipating, detecting and forecasting climate transitions and bifurcations using several independent techniques of time series analysis.

  4. Using neural networks for dynamic light scattering time series processing

    Science.gov (United States)

    Chicea, Dan

    2017-04-01

    A basic experiment to record dynamic light scattering (DLS) time series was assembled using basic components. The DLS time series processing using the Lorentzian function fit was considered as reference. A Neural Network was designed and trained using simulated frequency spectra for spherical particles in the range 0–350 nm, assumed to be scattering centers, and the neural network design and training procedure are described in detail. The neural network output accuracy was tested both on simulated and on experimental time series. The match with the DLS results, considered as reference, was good serving as a proof of concept for using neural networks in fast DLS time series processing.

  5. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    K P Harikrishnan; R Misra; G Ambika

    2009-02-01

    The correlation dimension D2 and correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, use of D2 has been more common compared to K2 as a discriminating measure. One reason for this is that D2 is a static measure and can be easily evaluated from a time series. However, in many cases, especially those involving coloured noise, K2 is regarded as a more useful measure. Here we present an efficient algorithmic scheme to compute K2 directly from a time series and show that K2 can be used as a more effective measure compared to D2 for analysing practical time series involving coloured noise.
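
    A small numpy illustration of the standard correlation-sum (Grassberger-Procaccia-type) route to K2, not the algorithmic scheme proposed in the paper; the embedding dimension, delay, radius and the logistic-map test signal are arbitrary choices.

        import numpy as np

        def correlation_sum(x, m, r, delay=1):
            """Fraction of pairs of m-dimensional delay vectors closer than r (max norm)."""
            n = len(x) - (m - 1) * delay
            emb = np.column_stack([x[i * delay:i * delay + n] for i in range(m)])
            d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            iu = np.triu_indices(n, k=1)
            return np.mean(d[iu] < r)

        def k2_estimate(x, m=3, r=0.3, delay=1):
            """K2 approximated by ln(C_m(r) / C_{m+1}(r)) per time step."""
            return np.log(correlation_sum(x, m, r, delay) / correlation_sum(x, m + 1, r, delay))

        # Chaotic logistic map as a cheap deterministic test signal
        x = np.empty(1200)
        x[0] = 0.4
        for t in range(1, 1200):
            x[t] = 3.9 * x[t - 1] * (1.0 - x[t - 1])
        print("K2 estimate (nats per step):", round(k2_estimate(x - x.mean()), 3))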

  6. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.

  7. gatspy: General tools for Astronomical Time Series in Python

    Science.gov (United States)

    VanderPlas, Jake

    2016-10-01

    Gatspy contains efficient, well-documented implementations of several common routines for Astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.

  8. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  9. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from the perspectives of both time and frequency, with applications in many different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, in this book we present various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, less studied till now. Part of the results are accompanied by their R code.

  10. How to analyse irregularly sampled geophysical time series?

    Science.gov (United States)

    Eroglu, Deniz; Ozken, Ibrahim; Stemler, Thomas; Marwan, Norbert; Wyrwoll, Karl-Heinz; Kurths, Juergen

    2015-04-01

    One of the challenges of time series analysis is to detect changes in the dynamics of the underlying system. There are numerous methods that can be used to detect such regime changes in regularly sampled time series. Here we present a new approach that can be applied when the time series is irregularly sampled. Such data sets occur frequently in real-world applications, as in paleoclimate proxy records. The basic idea follows Victor and Purpura [1] and considers segments of the time series. For each segment we compute the cost of transforming the segment into the following one. If the time series comes from one dynamical regime, the cost of transformation should be similar for each segment of the data. Dramatic changes in the cost time series indicate a change in the underlying dynamics. Any kind of analysis can be applied to the cost time series, since it is a regularly sampled time series. While recurrence plots are not the best choice for irregularly sampled data with some measurement noise component, we show that a recurrence plot analysis based on the cost time series can successfully identify the changes in the dynamics of the system. We tested this method using synthetically created time series and use these results to highlight the performance of our method. Furthermore, we present our analysis of a suite of calcite and aragonite stalagmites located in the eastern Kimberley region of tropical Western Australia. This oxygen isotope data is a proxy for monsoon activity over the last 8,000 years. In this time series our method picks up several so far undetected changes from wet to dry in the monsoon system and therefore enables us to get a better understanding of the monsoon dynamics in the north-east of Australia over the last few thousand years. [1] J. D. Victor and K. P. Purpura, Network: Computation in Neural Systems 8, 127 (1997)

  11. A Method for Determining Periods in Time Series.

    Science.gov (United States)

    1981-04-01

    Keywords: univariate time series; spectral density function; Newton's method. The method is applied to a series of hormone level data. Let {Y(t), t in Z}, with Z the set of integers, be a zero-mean covariance stationary time series with autocovariance function R(v) = E[Y(t)Y(t+v)], v in Z, and spectral density function f.

  12. Distance measure with improved lower bound for multivariate time series

    Science.gov (United States)

    Li, Hailin

    2017-02-01

    Lower bound functions are one of the important techniques used to speed up search and indexing of time series data. Multivariate time series have two aspects of high dimensionality: the time-based dimension and the variable-based dimension. Because of the influence of the variable-based dimension, a novel method is proposed to deal with the lower bound distance computation for multivariate time series. The proposed method, like the traditional ones, first reduces the dimensionality of the time series and thus does not apply the lower bound function directly to the multivariate time series. In this dimensionality reduction, the multivariate time series is reduced to a univariate time series, denoted a center sequence, according to the principle of piecewise aggregate approximation. In addition, an extended lower bound function is designed to obtain good tightness and to measure the distance between any two center sequences quickly. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
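
    A rough numpy sketch of the reduce-then-filter idea: piecewise aggregate approximation over time plus an average over the variables yields a short "center sequence", and a cheap distance between center sequences ranks candidates before any exact multivariate distance is computed. The frame count, the data and the particular distance are illustrative and are not the paper's extended lower bound function.

        import numpy as np

        def center_sequence(mts, n_frames):
            """PAA over time, then a mean over variables: (time x vars) -> length n_frames."""
            t = (len(mts) // n_frames) * n_frames
            return mts[:t].reshape(n_frames, -1, mts.shape[1]).mean(axis=(1, 2))

        def center_distance(a, b):
            """Cheap Euclidean distance between center sequences, used only as a pre-filter."""
            return np.sqrt(np.sum((a - b) ** 2))

        rng = np.random.default_rng(3)
        query = rng.normal(size=(128, 4))                  # 128 time points, 4 variables
        database = [rng.normal(size=(128, 4)) for _ in range(50)]

        cq = center_sequence(query, 16)
        order = np.argsort([center_distance(cq, center_sequence(s, 16)) for s in database])
        print("most promising candidates:", order[:5])     # only these need the exact distance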

  13. Recovery of the Time-Evolution Equation of Time-Delay Systems from Time Series

    CERN Document Server

    Bünner, M J; Kittel, A; Parisi, J; Meyer, Th.

    1997-01-01

    We present a method for time series analysis of both scalar and nonscalar time-delay systems. If the dynamics of the system investigated is governed by a time-delay induced instability, the method allows the delay time to be determined. In a second step, the time-delay differential equation can be recovered from the time series. The method is a generalization of our recently proposed method suitable for time series analysis of scalar time-delay systems. The dynamics is not required to have settled on its attractor, which also makes transient motion accessible to the analysis. If the motion actually takes place on a chaotic attractor, the applicability of the method does not depend on the dimensionality of the chaotic attractor - one main advantage over all time series analysis methods known until now. For demonstration, we analyze time series obtained from the numerical integration of a two-dimensional time-delay differential equation. After having determined the delay time, we recover...

  14. Time series prediction using wavelet process neural network

    Institute of Scientific and Technical Information of China (English)

    Ding Gang; Zhong Shi-Sheng; Li Yang

    2008-01-01

    In the real world, the inputs of many complicated systems are time-varying functions or processes. In order to predict the outputs of these systems with high speed and accuracy, this paper proposes a time series prediction model based on the wavelet process neural network, and develops the corresponding learning algorithm based on the expansion of orthogonal basis functions. The effectiveness of the proposed time series prediction model and its learning algorithm is demonstrated on the Mackey-Glass time series prediction problem, and the comparative prediction results indicate that the proposed model performs well and appears suitable as a tool for predicting highly complex nonlinear time series.

  15. Fixed Points in Self-Similar Analysis of Time Series

    OpenAIRE

    Gluzman, S.; Yukalov, V. I.

    1998-01-01

    Two possible definitions of fixed points in the self-similar analysis of time series are considered. One definition is based on the minimal-difference condition and another, on a simple averaging. From studying stock market time series, one may conclude that these two definitions are practically equivalent. A forecast is made for the stock market indices for the end of March 1998.

  16. Robust Forecasting of Non-Stationary Time Series

    NARCIS (Netherlands)

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable foreca

  17. Mean shifts, unit roots and forecasting seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard); H. Hoek (Henk)

    1997-01-01

    textabstractExamples of descriptive models for changing seasonal patterns in economic time series are autoregressive models with seasonal unit roots or with deterministic seasonal mean shifts. In this paper we show through a forecasting comparison for three macroeconomic time series (for which tests

  18. Stata: The language of choice for time series analysis?

    OpenAIRE

    Baum, Christopher F

    2004-01-01

    This paper discusses the use of Stata for the analysis of time series and panel data. The evolution of time-series capabilities in Stata is reviewed. Facilities for data management, graphics, and econometric analysis from both official Stata and the user community are discussed. A new routine to provide moving-window regression estimates, rollreg, is described, and its use illustrated.

  19. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  20. Time series analysis : Smoothed correlation integrals, autocovariances, and power spectra

    NARCIS (Netherlands)

    Takens, F; Dumortier, F; Broer, H; Mawhin, J; Vanderbauwhede, A; Lunel, SV

    2005-01-01

    In this paper we relate notions from linear time series analysis, like autocovariances and power spectra, to notions from nonlinear time series analysis, like (smoothed) correlation integrals and the corresponding dimensions and entropies. The complete proofs of the results announced in this pape

  1. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    Bikas K Chakrabarti; Arnab Chatterjee; Pratip Bhattacharyya

    2008-08-01

    We find prominent similarities between the features of the time series for the overlap of two Cantor sets (a model of earthquakes) when one set moves with uniform relative velocity over the other, and the time series of stock prices. An anticipation method for some of the crashes is proposed here, based on these observations.

  2. Comparison of New and Old Sunspot Number Time Series

    Science.gov (United States)

    Cliver, E. W.

    2016-11-01

    Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten (Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. (Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling (Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten (Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number (RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre (Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. (Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.

  3. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  4. Time series analysis and inverse theory for geophysicists

    Institute of Scientific and Technical Information of China (English)

    Junzo Kasahara

    2006-01-01

    Thanks to advances in geophysical measurement technologies, most geophysical data are now recorded in digital form. But to extract the 'Earth's nature' from observed data, it is necessary to apply signal-processing methods to the time-series data, seismograms and geomagnetic records being the most common. The processing of time-series data is one of the major subjects of this book. By the processing of time series data, numerical values such as travel times are obtained. The first stage of data analysis is forward modeling, but the more advanced step is the inversion method. This is the second subject of this book.

  5. Performance of multifractal detrended fluctuation analysis on short time series

    CERN Document Server

    Lopez, Juan Luis

    2013-01-01

    The performance of multifractal detrended fluctuation analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application, the series of the daily exchange rate between the U.S. dollar and the euro is studied.

  6. Cross recurrence plot based synchronization of time series

    OpenAIRE

    N. Marwan; Thiel, M.; Nowaczyk, N. R.

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP) which, among others, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so ...

  7. Solving Nonlinear Time Delay Control Systems by Fourier series

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Farahi

    2014-06-01

    In this paper we present a method to find the solution of time-delay optimal control systems using Fourier series. The method is based upon expanding various time functions in the system as their truncated Fourier series. Operational matrices of integration and delay are presented and are utilized to reduce the solution of time-delay control systems to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and applicability of the technique.

  8. Modeling Persistence In Hydrological Time Series Using Fractional Differencing

    Science.gov (United States)

    Hosking, J. R. M.

    1984-12-01

    The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
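
    As a small generic illustration of the fractional-differencing operation itself (not Hosking's implementation): expand (1 - B)^d into its binomial weights and apply the truncated filter to a series; the value d = 0.45 and the random-walk input are arbitrary.

        import numpy as np

        def fracdiff_weights(d, n):
            """Coefficients of (1 - B)^d up to lag n: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
            w = np.empty(n + 1)
            w[0] = 1.0
            for k in range(1, n + 1):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return w

        def fracdiff(x, d):
            """Apply (1 - B)^d to x, truncating the expansion at the available history."""
            w = fracdiff_weights(d, len(x) - 1)
            return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

        rng = np.random.default_rng(4)
        x = np.cumsum(rng.normal(size=500))   # a random walk; d = 1 would difference it fully
        y = fracdiff(x, d=0.45)               # persistent, long-memory style filtering
        print(np.round(fracdiff_weights(0.45, 5), 5))   # 1, -0.45, -0.12375, ...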

  9. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.

  10. Algorithms for Linear Time Series Analysis: With R Package

    Directory of Open Access Journals (Sweden)

    A. Ian McLeod

    2007-11-01

    Our ltsa package implements the Durbin-Levinson and Trench algorithms and provides a general approach to the problems of fitting, forecasting and simulating linear time series models, as well as fitting regression models with linear time series errors. For computational efficiency both algorithms are implemented in C and interfaced to R. Examples are given which illustrate the efficiency and accuracy of the algorithms. We provide a second package, FGN, which illustrates the use of the ltsa package with fractional Gaussian noise (FGN). It is hoped that ltsa will provide a base for further time series software.
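
    The package itself is written in R; purely to illustrate the Durbin-Levinson recursion it builds on, here is a small Python sketch that turns an autocovariance sequence into AR coefficients and the one-step prediction-error variance. The AR(1) autocovariances used to check it are a textbook example, not taken from the package.

        import numpy as np

        def durbin_levinson(acvf):
            """Durbin-Levinson recursion: AR coefficients and innovation variance
            from an autocovariance sequence acvf[0..p]."""
            p = len(acvf) - 1
            phi = np.zeros((p + 1, p + 1))
            v = np.zeros(p + 1)
            v[0] = acvf[0]
            for k in range(1, p + 1):
                phi[k, k] = (acvf[k] - np.dot(phi[k - 1, 1:k], acvf[k - 1:0:-1])) / v[k - 1]
                phi[k, 1:k] = phi[k - 1, 1:k] - phi[k, k] * phi[k - 1, k - 1:0:-1]
                v[k] = v[k - 1] * (1.0 - phi[k, k] ** 2)
            return phi[p, 1:], v[p]

        # AR(1) with coefficient 0.7 and unit innovation variance: gamma(h) = 0.7**h / (1 - 0.49)
        acvf = 0.7 ** np.arange(4) / (1 - 0.49)
        coef, var = durbin_levinson(acvf)
        print(np.round(coef, 3), round(var, 3))   # approximately [0.7, 0, 0] and 1.0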

  11. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  12. On the detection of superdiffusive behaviour in time series

    CERN Document Server

    Gottwald, Georg A

    2016-01-01

    We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data and relies on one realisation (i.e. one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. alpha-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement as well as the p-variation method.

  13. A vector of quarters representation for bivariate time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1995-01-01

    textabstractIn this paper it is shown that several models for a bivariate nonstationary quarterly time series are nested in a vector autoregression with cointegration restrictions for the eight annual series of quarterly observations. Or, the Granger Representation Theorem is extended to incorporate

  14. A multivariate approach to modeling univariate seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1994-01-01

    textabstractA seasonal time series can be represented by a vector autoregressive model for the annual series containing the seasonal observations. This model allows for periodically varying coefficients. When the vector elements are integrated, the maximum likelihood cointegration method can be used

  15. Seasonality, nonstationarity and the forecasting of monthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1991-01-01

    textabstractWe focus on two forecasting models for a monthly time series. The first model requires that the variable is first order and seasonally differenced. The second model considers the series only in its first differences, while seasonality is modeled with a constant and seasonal dummies. A me

  16. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  17. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  18. Multi-dimensional sparse time series: feature extraction

    CERN Document Server

    Franciosi, Marco

    2008-01-01

    We show an analysis of multi-dimensional time series via entropy and statistical linguistic techniques. We define three markers encoding the behavior of the series, after it has been translated into a multi-dimensional symbolic sequence. The leading component and the trend of the series with respect to a mobile window analysis result from the entropy analysis and label the dynamical evolution of the series. The diversification formalizes the differentiation in the use of recurrent patterns, from a Zipf law point of view. These markers are the starting point of further analysis such as classification or clustering of large databases of multi-dimensional time series, prediction of future behavior and attribution of new data. We also present an application to economic data: measurements of the money invested by some business companies in the advertising market for different media sources.

  19. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui;

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our...

  20. Distinguishing chaotic time series from noise: A random matrix approach

    Science.gov (United States)

    Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong

    2017-03-01

    Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
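
    A small numpy sketch of the general recipe (not the authors' construction): cut a series into windows, compute the windows' correlation matrix, and compare its largest eigenvalue with the Marchenko-Pastur upper edge expected for independent noise. The window count and the periodic contrast signal are illustrative choices.

        import numpy as np

        def correlation_eigenvalues(x, n_rows=40):
            """Slice x into n_rows windows, z-score each window, and return the eigenvalues
            of the windows' correlation matrix (descending) plus the window length."""
            t = len(x) // n_rows
            m = np.asarray(x[:n_rows * t], float).reshape(n_rows, t)
            m = (m - m.mean(axis=1, keepdims=True)) / m.std(axis=1, keepdims=True)
            return np.sort(np.linalg.eigvalsh(m @ m.T / t))[::-1], t

        rng = np.random.default_rng(5)
        noise = rng.normal(size=8000)
        signal = np.sin(2 * np.pi * 0.05 * np.arange(8000)) + 0.3 * rng.normal(size=8000)

        for name, series in [("white noise", noise), ("periodic + noise", signal)]:
            ev, t = correlation_eigenvalues(series)
            mp_edge = (1 + np.sqrt(40 / t)) ** 2   # Marchenko-Pastur upper edge for i.i.d. data
            print(f"{name}: largest eigenvalue {ev[0]:.2f}, MP edge {mp_edge:.2f}")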

  1. On robust forecasting of autoregressive time series under censoring

    OpenAIRE

    Kharin, Y.; Badziahin, I.

    2009-01-01

    Problems of robust statistical forecasting are considered for autoregressive time series observed under distortions generated by interval censoring. Three types of robust forecasting statistics are developed; the mean-square risk is evaluated for the developed forecasting statistics. Numerical results are given.

  2. A probability distribution approach to synthetic turbulence time series

    Science.gov (United States)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach on generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
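
    A toy sketch of the "draw random velocities from a measured distribution" step: an accept/reject sampler against a histogram density estimate. The Student-t stand-in for measured increments and the unconditional (rather than three-point conditional) target are simplifications, not the authors' procedure.

        import numpy as np

        rng = np.random.default_rng(8)
        data = rng.standard_t(df=5, size=200_000)   # stand-in for measured velocity increments

        def sample_from_histogram(values, n_samples, bins=200):
            """Accept/reject draws whose density follows a histogram estimate of `values`."""
            hist, edges = np.histogram(values, bins=bins, density=True)
            fmax = hist.max()
            out = []
            while len(out) < n_samples:
                x = rng.uniform(edges[0], edges[-1], size=n_samples)   # uniform proposals
                u = rng.uniform(0.0, fmax, size=n_samples)
                idx = np.clip(np.searchsorted(edges, x) - 1, 0, bins - 1)
                out.extend(x[u <= hist[idx]])                          # keep accepted proposals
            return np.array(out[:n_samples])

        synthetic = sample_from_histogram(data, 10_000)
        print("target std:", round(data.std(), 3), " synthetic std:", round(synthetic.std(), 3))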

  3. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  4. Phenotyping of Clinical Time Series with LSTM Recurrent Neural Networks

    OpenAIRE

    Lipton, Zachary C.; Kale, David C.; Wetzell, Randall C.

    2015-01-01

    We present a novel application of LSTM recurrent neural networks to multilabel classification of diagnoses given variable-length time series of clinical measurements. Our method outperforms a strong baseline on a variety of metrics.

  5. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  6. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  7. Lagrangian Time Series Models for Ocean Surface Drifter Trajectories

    CERN Document Server

    Sykulski, Adam M; Lilly, Jonathan M; Danioux, Eric

    2016-01-01

    This paper proposes stochastic models for the analysis of ocean surface trajectories obtained from freely-drifting satellite-tracked instruments. The proposed time series models are used to summarise large multivariate datasets and infer important physical parameters of inertial oscillations and other ocean processes. Nonstationary time series methods are employed to account for the spatiotemporal variability of each trajectory. Because the datasets are large, we construct computationally efficient methods through the use of frequency-domain modelling and estimation, with the data expressed as complex-valued time series. We detail how practical issues related to sampling and model misspecification may be addressed using semi-parametric techniques for time series, and we demonstrate the effectiveness of our stochastic models through application to both real-world data and to numerical model output.

  8. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one normal multilayer perceptron with bipolar sigmoid activation functions, and another with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  9. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for the structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for .... In this paper the generalizations are applied to some simulated data sets and to the Canadian lynx data. The generalizations seem to perform well and the measure of the departure from linearity proves to be an important additional tool....
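
    As a crude illustrative analogue of a smoothing-based lag-dependence measure alongside the classical SACF (not the authors' generalization), the sketch below uses the R-squared of a Nadaraya-Watson regression of x_t on x_{t-k}; the nonlinear test series and the bandwidth are arbitrary.

        import numpy as np

        def sacf(x, max_lag=10):
            """Classical sample autocorrelation function for lags 1..max_lag."""
            x = np.asarray(x, float) - np.mean(x)
            denom = np.dot(x, x)
            return np.array([np.dot(x[:len(x) - k], x[k:]) / denom for k in range(1, max_lag + 1)])

        def smoothed_lag_dependence(x, lag, bandwidth=0.5):
            """Nonlinear analogue: R^2 of a Nadaraya-Watson regression of x_t on x_{t-lag}."""
            x = np.asarray(x, float)
            xt, xlag = x[lag:], x[:-lag]
            w = np.exp(-0.5 * ((xlag[None, :] - xlag[:, None]) / bandwidth) ** 2)
            fitted = (w @ xt) / w.sum(axis=1)
            return 1.0 - np.sum((xt - fitted) ** 2) / np.sum((xt - xt.mean()) ** 2)

        rng = np.random.default_rng(6)
        e = rng.normal(size=1000)
        x = np.empty(1000)
        x[0] = e[0]
        for t in range(1, 1000):
            x[t] = 0.8 * abs(x[t - 1]) - 0.5 * x[t - 1] + e[t]   # a simple nonlinear AR(1)

        print("SACF at lag 1:", round(sacf(x, 1)[0], 3))
        print("smoothed lag-1 dependence:", round(smoothed_lag_dependence(x, 1), 3))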

  10. Prediction and interpolation of time series by state space models

    OpenAIRE

    Helske, Jouni

    2015-01-01

    A large amount of data collected today is in the form of time series. In order to make realistic inferences based on time series forecasts, prediction intervals or other measures of uncertainty should be presented in addition to point predictions. Multiple sources of uncertainty are often ignored due to the complexities involved in accounting for them correctly. In this dissertation, some of these problems are reviewed and some new solutions are presented. A state space approach...

  11. The use of synthetic input sequences in time series modeling

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Dair Jose de [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Letellier, Christophe [CORIA/CNRS UMR 6614, Universite et INSA de Rouen, Av. de l' Universite, BP 12, F-76801 Saint-Etienne du Rouvray cedex (France); Gomes, Murilo E.D. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Aguirre, Luis A. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil)], E-mail: aguirre@cpdee.ufmg.br

    2008-08-04

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  12. The use of synthetic input sequences in time series modeling

    Science.gov (United States)

    de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.

    2008-08-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  13. Extracting Chaos Control Parameters from Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, R B B [Centro Universitario da FEI, Avenida Humberto de Alencar Castelo Branco 3972, 09850-901, Sao Bernardo do Campo, SP (Brazil); Graves, J C, E-mail: rsantos@fei.edu.br [Instituto Tecnologico de Aeronautica, Praca Marechal Eduardo Gomes 50, 12228-900, Sao Jose dos Campos, SP (Brazil)

    2011-03-01

    We present a simple method to analyze time series and estimate the parameters needed to control chaos in dynamical systems. Application of the method to a system described by the logistic map is also shown. Analyzing only two 100-point time series, we achieved results within 2% of the analytical ones. With these estimates, we show that the OGY control method successfully stabilized a period-1 unstable periodic orbit embedded in the chaotic attractor.
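
    As a rough illustration of the kind of estimate involved, the sketch below (Python, not the authors' code) recovers the logistic-map parameter from a short series by least squares; the map, series length and parameter value are assumptions chosen for the example.

        # Hypothetical sketch: estimate the logistic-map parameter r from a short
        # time series by least squares on x_{k+1} = r * x_k * (1 - x_k).
        import numpy as np

        def logistic_series(r, x0, n):
            x = np.empty(n)
            x[0] = x0
            for k in range(n - 1):
                x[k + 1] = r * x[k] * (1.0 - x[k])
            return x

        def estimate_r(x):
            u = x[:-1] * (1.0 - x[:-1])           # regressor x_k * (1 - x_k)
            return float(np.dot(u, x[1:]) / np.dot(u, u))

        series = logistic_series(r=3.9, x0=0.4, n=100)    # a 100-point chaotic series
        print(f"estimated r = {estimate_r(series):.4f}")  # recovers 3.9 for noise-free data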

  14. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    FEI WanChun; BAI Lun

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TVPAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  15. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TVPAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  16. A method for detecting changes in long time series

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.

    1995-09-01

    Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e. based on data from earlier times, while the other is a "backcast", i.e. based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
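
    The following Python sketch is a deliberately simplified version of the forecast/backcast idea: each prediction is a Gaussian fitted to a fixed window of earlier or later data, and the screening statistic is the absolute log-likelihood ratio at the observed point. The report models a full stationary Gaussian process; the window Gaussians, window length and test series here are assumptions made to keep the example short.

        # Simplified forecast/backcast change screening (illustrative only).
        import numpy as np
        from scipy.stats import norm

        def change_statistic(x, w=50):
            """|log-likelihood ratio| between forecast and backcast densities at each point."""
            stats = np.full(len(x), np.nan)
            for t in range(w, len(x) - w):
                before, after = x[t - w:t], x[t + 1:t + 1 + w]
                ll_fore = norm.logpdf(x[t], before.mean(), before.std(ddof=1))  # "forecast"
                ll_back = norm.logpdf(x[t], after.mean(), after.std(ddof=1))    # "backcast"
                stats[t] = abs(ll_fore - ll_back)
            return stats

        rng = np.random.default_rng(0)
        series = np.concatenate([rng.normal(0, 1, 500), rng.normal(3, 1, 500)])  # level shift at t=500
        print(int(np.nanargmax(change_statistic(series))))                       # index near 500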

  17. LEGENDRE SERIES SOLUTIONS FOR TIME-VARIATION DYNAMICS

    Institute of Scientific and Technical Information of China (English)

    Cao Zhiyuan; Zou Guiping; Tang Shougao

    2000-01-01

    In this paper, a new approach to the analysis of time-variation dynamics is proposed based on Legendre series expansion and the Legendre integral operator matrix. The theoretical basis for the effective solution of time-variation dynamics is thereby established, which is beneficial to further research in time-variation science.

  18. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    Combined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time-varying method. The time-varying method allows for a locally (non)linear ...

  19. Similarity estimators for irregular and age uncertain time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2013-09-01

    Full Text Available Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011 and mutual information (gMI, Rehfeld et al., 2013 against their interpolation-based counterparts and the new event synchronization function (ESF. We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60–55% (in the linear case to 53–42% (for the nonlinear processes of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time

  20. Similarity estimators for irregular and age-uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  1. Similarity estimators for irregular and age uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
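
    A minimal Gaussian-kernel cross-correlation for irregularly sampled series, in the spirit of gXCF, might look like the Python sketch below; the kernel width, the normalization and the synthetic data are assumptions of this sketch rather than the exact estimator of Rehfeld et al.

        # Gaussian-kernel-weighted correlation for irregular time axes (illustrative sketch).
        import numpy as np

        def gaussian_xcf(tx, x, ty, y, lag, h):
            """Kernel-weighted correlation of x(t) and y(t + lag) on irregular time axes."""
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            dt = ty[None, :] - tx[:, None] - lag            # pairwise time mismatches
            w = np.exp(-0.5 * (dt / h) ** 2)                # Gaussian kernel weights
            return float(np.sum(w * x[:, None] * y[None, :]) / np.sum(w))

        rng = np.random.default_rng(1)
        t = np.sort(rng.uniform(0, 100, 200))               # irregular sampling times
        sig = np.sin(2 * np.pi * t / 20) + 0.2 * rng.normal(size=t.size)
        lags = np.arange(-10, 11)
        xcf = [gaussian_xcf(t, sig, t, sig, lag, h=1.0) for lag in lags]
        print(lags[int(np.argmax(xcf))])                    # 0 in this autocorrelation case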

  2. Comparison of time series using entropy and mutual correlation

    Science.gov (United States)

    Madonna, Fabio; Rosoldi, Marco

    2015-04-01

    The potential for redundant time series to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. Moreover, comparisons among time series of in situ and ground-based remote sensing measurements have been performed using several methods, but quite often relying on linear models. In this work, the concepts of entropy (H) and mutual correlation (MC), defined in the frame of information theory, are applied to the study of essential climate variables with the aim of characterizing the uncertainty of a time series and the redundancy of collocated measurements provided by different surface-based techniques. In particular, integrated water vapor (IWV) and water vapour mixing ratio time series obtained at five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations with several sensors (e.g. radiosondes, GPS, microwave and infrared radiometers, Raman lidar), in the period 2010-2012, are analyzed in terms of H and MC. The comparison between the probability density functions of the time series shows that caution in using linear assumptions is needed and that the use of statistics, like entropy, that are robust to outliers is recommended to investigate measurement time series. Results reveal that the random uncertainties of the IWV measured with radiosondes, the global positioning system, microwave and infrared radiometers, and Raman lidar differed by less than 8% over the considered time period. Comparisons of the time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by 60% by constraining the measurements with those from
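
    As a toy illustration of the information-theoretic quantities involved, the Python sketch below computes histogram-based entropies and a mutual-information-based redundancy score for two collocated series; the bin count, the scaling of MI by min(H), and the synthetic IWV-like data are assumptions of the sketch, not GRUAN processing choices.

        # Histogram-based entropy and mutual-information redundancy (illustrative sketch).
        import numpy as np

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        def entropy_and_mc(x, y, bins=20):
            cxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = cxy / cxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            hx, hy, hxy = entropy(px), entropy(py), entropy(pxy.ravel())
            mi = hx + hy - hxy                       # mutual information in bits
            mc = mi / min(hx, hy)                    # one way to scale MI into [0, 1]
            return hx, hy, mc

        rng = np.random.default_rng(2)
        iwv_a = rng.gamma(4.0, 5.0, 3000)                  # synthetic IWV-like series
        iwv_b = iwv_a + rng.normal(0, 1.5, iwv_a.size)     # collocated, noisier sensor
        print(entropy_and_mc(iwv_a, iwv_b))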

  3. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    Science.gov (United States)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

    Twice daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that the detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data. The practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for use in detecting and adjusting time series radiosonde data. Accurate station histories are very desirable. Station histories can confirm that detected inhomogeneities are related to instrument or procedural changes. Adjustments can then be made to the data with some confidence.

  4. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalization. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider as a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found ≃15 years.
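
    The mapping from equal-time correlations to distances can be sketched as follows in Python; the metric d = sqrt(2(1 - r)) is a common correlation distance and is only assumed here, and the toy increments stand in for the 21-country GDP data.

        # Correlation-coefficient-based distance matrix between increment series (sketch).
        import numpy as np

        rng = np.random.default_rng(3)
        common = rng.normal(size=55)                              # shared "global" factor
        increments = common + 0.8 * rng.normal(size=(6, 55))      # 6 toy countries, 55 years

        corr = np.corrcoef(increments)                            # equal-time correlations
        dist = np.sqrt(2.0 * (1.0 - corr))                        # correlation distance (assumed metric)
        print(dist.round(2))
        mean_offdiag = dist[np.triu_indices_from(dist, k=1)].mean()
        print(f"mean pairwise distance: {mean_offdiag:.2f}")      # shrinks as coupling grows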

  5. Exploratory Causal Analysis in Bivariate Time Series Data

    Science.gov (United States)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data

  6. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that, by means of the proposed method, one can precisely estimate the Shannon entropy from limited records.

  7. Wavelet matrix transform for time-series similarity measurement

    Institute of Scientific and Technical Information of China (English)

    HU Zhi-kun; XU Fei; GUI Wei-hua; YANG Chun-hua

    2009-01-01

    A time-series similarity measurement method based on wavelet and matrix transform was proposed, and its anti-noise ability, sensitivity and accuracy were discussed. The time-series sequences were compressed into wavelet subspace, and the sample feature vector and orthogonal bases of the sample time-series sequences were obtained by K-L transform. Then the inner product transform was carried out to project the analyzed time-series sequence onto the orthogonal bases to gain the analyzed feature vectors. The similarity was calculated between the sample feature vector and the analyzed feature vector by the Euclidean distance. Taking fault waves of power electronic devices for example, the experimental results show that the proposed method has a low dimension of feature vector, the anti-noise ability of the proposed method is 30 times as large as that of the plain wavelet method, the sensitivity of the proposed method is 1/3 as large as that of the plain wavelet method, and the accuracy of the proposed method is higher than that of the wavelet singular value decomposition method. The proposed method can be applied in similarity matching and indexing for large time series databases.

  8. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
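
    A compact detrended fluctuation analysis (DFA) routine, of the kind used in the study, is sketched below in Python; the box sizes and the linear detrending order are choices made for the illustration, and white noise and a random walk replace the dengue counts.

        # Detrended fluctuation analysis (DFA) sketch.
        import numpy as np

        def dfa(x, scales):
            """DFA scaling exponent alpha of series x over the given box sizes."""
            y = np.cumsum(x - np.mean(x))                 # integrated profile
            flucts = []
            for n in scales:
                nseg = len(y) // n
                segs = y[:nseg * n].reshape(nseg, n)
                t = np.arange(n)
                rms = []
                for seg in segs:
                    coef = np.polyfit(t, seg, 1)          # local linear trend
                    rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                flucts.append(np.sqrt(np.mean(rms)))
            alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
            return alpha

        rng = np.random.default_rng(4)
        white = rng.normal(size=4096)
        scales = np.array([16, 32, 64, 128, 256, 512])
        print(f"alpha (white noise) ~ {dfa(white, scales):.2f}")             # near 0.5
        print(f"alpha (random walk) ~ {dfa(np.cumsum(white), scales):.2f}")  # near 1.5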

  9. Stationary Time Series Analysis Using Information and Spectral Analysis

    Science.gov (United States)

    1992-09-01

    ... spectral density function of the time series. The spectral density function f(w), 0 < w < 1, is defined as the Fourier transform of ... An important result of Pinsker [(1964), p. 196] can be interpreted as providing a formula for asymptotic ...

  10. Gaussian semiparametric estimation of non-stationary time series

    OpenAIRE

    Velasco, Carlos

    1998-01-01

    Generalizing the definition of the memory parameter d in terms of the differentiated series, we showed in Velasco (Non-stationary log-periodogram regression, Forthcoming J. Economet., 1997) that it is possible to estimate consistently the memory of non-stationary processes using methods designed for stationary long-range-dependent time series. In this paper we consider the Gaussian semiparametric estimate analysed by Robinson (Gaussian semiparametric estimation of long range dependence. Ann. ...

  11. Moderate Growth Time Series for Dynamic Combinatorics Modelisation

    CERN Document Server

    Jaff, Luaï; Kacem, Hatem Hadj; Bertelle, Cyrille

    2007-01-01

    Here, we present a family of time series with a simple growth constraint. This family can be the basis of a model to apply to emerging computation in business and micro-economics, where global functions can be expressed from local rules. We make explicit a double statistic on these series, which allows us to establish a one-to-one correspondence with three other ballot-like structures.

  12. Image-Based Learning Approach Applied to Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    J. C. Chimal-Eguía

    2012-06-01

    Full Text Available In this paper, a new learning approach based on time-series image information is presented. In order to implement this new learning technique, a novel time-series input data representation is also defined. This input data representation is based on information obtained by dividing the image axes into boxes. The difference between this new input data representation and the classical one is that this technique is not time-dependent. This new information is implemented in the new Image-Based Learning Approach (IBLA) and, by means of a probabilistic mechanism, this learning technique is applied to the interesting problem of time series forecasting. The experimental results indicate that by using the methodology proposed in this article, it is possible to obtain better results than with classical techniques such as artificial neural networks and support vector machines.

  13. A refined fuzzy time series model for stock market forecasting

    Science.gov (United States)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

    Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and enrollments of the University of Alabama. It is shown that the proposed method works with higher accuracy compared to other fuzzy time series methods developed for forecasting the TAIEX and enrollments of the University of Alabama.

  14. Weighted statistical parameters for irregularly sampled time series

    CERN Document Server

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time, or corrupt measurements, for example, or they may be inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. This paper aims at improving the accuracy of common statistical parameters for the characterization of irregularly sampled signals. The uneven representation of time series, often including clumps of measurements and gaps with no data, can severely disrupt the values of estimators. A weighting scheme adapting to the sampling density and noise level of the signal is formulated. Its application to time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the sugg...

  15. First time-series optical photometry from Antarctica

    CERN Document Server

    Strassmeier, K G; Granzer, T; Tosti, G; DiVarano, I; Savanov, I; Bagaglia, M; Castellini, S; Mancini, A; Nucciarelli, G; Straniero, O; Distefano, E; Messina, S; Cutispoto, G

    2008-01-01

    Beating the Earth's day-night cycle is mandatory for long and continuous time-series photometry and has been achieved either with large ground-based networks of observatories at different geographic longitudes or with observations conducted from space. A third possibility is offered by a polar location with astronomically-qualified site characteristics. Aims. In this paper, we present the first scientific stellar time-series optical photometry from Dome C in Antarctica and analyze approximately 13,000 CCD frames taken in July 2007. We conclude that high-precision CCD photometry with exceptional time coverage and cadence can be obtained at Dome C in Antarctica and be successfully used for time-series astrophysics.

  16. Time series analysis of the response of measurement instruments

    CERN Document Server

    Georgakaki, Dimitra; Polatoglou, Hariton

    2012-01-01

    In this work the significance of treating a set of measurements as a time series is explored. Time Series Analysis (TSA) techniques, part of the Exploratory Data Analysis (EDA) approach, can provide much insight regarding the stochastic correlations that are induced on the outcome of an experiment by the measurement system and can provide criteria for the limited use of the classical variance in metrology. Specifically, techniques such as the Lag Plots, Autocorrelation Function, Power Spectral Density and Allan Variance are used to analyze series of sequential measurements, collected at equal time intervals from an electromechanical transducer. These techniques are used in conjunction with power law models of stochastic noise in order to characterize time or frequency regimes for which the usually assumed white noise model is adequate for the description of the measurement system response. However, through the detection of colored noise, usually referred to as flicker noise, which is expected to appear ...
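
    Of the tools listed, the Allan variance is the least standard outside metrology; a minimal non-overlapping estimator is sketched below in Python, with synthetic white sensor noise as an assumed stand-in for the transducer readings.

        # Non-overlapping Allan variance for equally spaced readings (illustrative sketch).
        import numpy as np

        def allan_variance(y, m):
            """Allan variance for averaging length m (in samples)."""
            nblocks = len(y) // m
            means = y[:nblocks * m].reshape(nblocks, m).mean(axis=1)   # block averages
            return 0.5 * np.mean(np.diff(means) ** 2)

        rng = np.random.default_rng(11)
        white = rng.normal(0, 1e-3, 100000)                 # white sensor noise
        for m in (1, 10, 100, 1000):
            print(m, allan_variance(white, m))              # falls roughly as 1/m for white noise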

  17. Periodicity Estimation in Mechanical Acoustic Time-Series Data

    Directory of Open Access Journals (Sweden)

    Zhu Yongbo

    2015-01-01

    Full Text Available Periodicity estimation in mechanical acoustic time-series data is a well-established problem in data mining, as it is applicable in a variety of disciplines, either for anomaly detection or for prediction purposes in industry. In this paper, we develop a new approach for capturing and characterizing periodic patterns in time-series data by virtue of dynamic time warping (DTW). We have conducted extensive experiments to evaluate the proposed approach with synthetic data and with data collected in practice. Experimental results demonstrate its effectiveness and robustness for periodicity detection in highly noisy data.
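
    Only the underlying DTW distance is sketched below (textbook dynamic programming in Python); the periodicity estimator the authors build on top of it is not reproduced, and the warped sine data are an assumption of the example.

        # Classic dynamic time warping distance with absolute-difference local cost.
        import numpy as np

        def dtw_distance(a, b):
            """O(len(a)*len(b)) DTW distance between two 1-D sequences."""
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        t = np.linspace(0, 4 * np.pi, 200)
        clean = np.sin(t)
        warped = np.sin(t + 0.3 * np.sin(t))             # locally stretched copy
        print(f"DTW(clean, warped) = {dtw_distance(clean, warped):.2f}")
        print(f"DTW(clean, -clean) = {dtw_distance(clean, -clean):.2f}")   # much larger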

  18. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge...... and mutation operations for this problem, we conduct extensive experiments to determine good choices for the parameters and operators of the genetic algorithm. One surprising observation is that use of uniform and one-point crossover together gave significantly better results than using either crossover...

  19. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most existing research, which focuses on a single futures contract and lacks comparisons across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the most recent three years. Besides the basic statistical analysis, the paper uses the GARCH and EGARCH models to describe the time series that show the ARCH effect and analyzes the persistence of volatility shocks and the leverage effect. The results show that, compared with a normal distribution, the wheat futures reward series are non-normal, with leptokurtic and heavy-tailed distributions. The study also finds that two of the reward series have no autocorrelation. Among the six correlated series, three present the ARCH effect. Using the autoregressive distributed lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect on the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; on the other hand, they reflect some shortcomings of the Chinese futures market, such as its immaturity and over-regulation by the government.

  20. The Application of Kernel Smoothing to Time Series Data

    Institute of Scientific and Technical Information of China (English)

    Zhao-jun Wang; Yi Zhao; Chun-jie Wu; Yan-ting Li

    2006-01-01

    There are already many models for fitting stationary time series, such as AR, MA, and ARMA models. For non-stationary data, an ARIMA or seasonal ARIMA model can be used to fit the given data. Moreover, many statistical software packages, such as SAS, SPLUS, etc., can be used to build a stationary or non-stationary time series model for a given set of time series data. However, some of these packages do not work well for small samples with or without missing data, especially for small time series data with a seasonal trend. A nonparametric smoothing technique for building a forecasting model for a given small seasonal time series dataset is developed in this paper. Both the method provided in this paper and that of the SAS package are then applied to the modeling of the international airline passengers data, and the two methods are compared. The results of the comparison show that the method provided in this paper outperforms the SAS method.
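
    A Nadaraya-Watson kernel smoother is one simple instance of the nonparametric smoothing idea; the Python sketch below uses a Gaussian kernel and a fixed bandwidth, both assumptions of this illustration rather than the paper's exact technique, on an airline-passengers-like toy series.

        # Gaussian-kernel Nadaraya-Watson smoother for a small seasonal series (sketch).
        import numpy as np

        def nw_smooth(t, y, t_eval, bandwidth):
            """Kernel-weighted local average of y evaluated at the points t_eval."""
            w = np.exp(-0.5 * ((t_eval[:, None] - t[None, :]) / bandwidth) ** 2)
            return (w @ y) / w.sum(axis=1)

        months = np.arange(48)                                         # 4 years of monthly data
        series = 100 + 2.0 * months + 15 * np.sin(2 * np.pi * months / 12)  # trend + seasonality
        fitted = nw_smooth(months, series, months, bandwidth=2.0)
        print(np.round(fitted[:12], 1))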

  1. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    2013-01-01

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of co

  2. Nonlinear projective filtering; 1, Application to real time series

    CERN Document Server

    Schreiber, T

    1998-01-01

    We discuss applications of nonlinear filtering of time series by locally linear phase space projections. Noise can be reduced whenever the error due to the manifold approximation is smaller than the noise in the system. Examples include the real time extraction of the fetal electrocardiogram from abdominal recordings.

  3. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit, applied to prediction of discrete time...

  4. Mining approximate periodic pattern in hydrological time series

    Science.gov (United States)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

    Long hydrological time series contain a lot of information about the hidden laws of natural evolution and the influence of human activities on the Earth's surface. Data mining technology can help find those hidden laws, such as flood frequency and abrupt changes, which is useful for decision support in hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of droughts and floods and for hydraulic engineering planning. In hydrology, full-period analysis of hydrological time series has attracted a lot of attention, with methods such as the discrete periodogram, the simple partial wave method, Fourier analysis, maximum entropy spectral analysis and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and by stochastic ones. For example, the tidal level is also affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, some kind of approximate period is hidden in hydrological time series, which is sometimes also called the cryptic period. Recently, partial period mining, which originated in the data mining domain, has become a remedy for the traditional period analysis methods in hydrology, since it places loose requirements on data integrity and continuity and can find partial periods in the time series. This paper is focused on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with a suffix tree, this paper proposes to mine multi-event asynchronous periodic patterns based on a modified suffix tree representation and traversal, and introduces a dynamic method for adjusting candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  5. Real Time Clustering of Time Series Using Triangular Potentials

    Directory of Open Access Journals (Sweden)

    Aldo Pacchiano

    2015-01-01

    Full Text Available Motivated by the problem of computing investment portfolio weightings we investigate various methods of clustering as alternatives to traditional mean-variance approaches. Such methods can have significant benefits from a practical point of view since they remove the need to invert a sample covariance matrix, which can suffer from estimation error and will almost certainly be non-stationary. The general idea is to find groups of assets which share similar return characteristics over time and treat each group as a single composite asset. We then apply inverse volatility weightings to these new composite assets. In the course of our investigation we devise a method of clustering based on triangular potentials and we present associated theoretical results as well as various examples based on synthetic data.

  6. A Platform for Processing Expression of Short Time Series (PESTS

    Directory of Open Access Journals (Sweden)

    Markatou Marianthi

    2011-01-01

    Full Text Available Abstract Background Time course microarray profiles examine the expression of genes over a time domain. They are necessary in order to determine the complete set of genes that are dynamically expressed under given conditions, and to determine the interaction between these genes. Because of cost and resource issues, most time series datasets contain less than 9 points and there are few tools available geared towards the analysis of this type of data. Results To this end, we introduce a platform for Processing Expression of Short Time Series (PESTS. It was designed with a focus on usability and interpretability of analyses for the researcher. As such, it implements several standard techniques for comparability as well as visualization functions. However, it is designed specifically for the unique methods we have developed for significance analysis, multiple test correction and clustering of short time series data. The central tenet of these methods is the use of biologically relevant features for analysis. Features summarize short gene expression profiles, inherently incorporate dependence across time, and allow for both full description of the examined curve and missing data points. Conclusions PESTS is fully generalizable to other types of time series analyses. PESTS implements novel methods as well as several standard techniques for comparability and visualization functions. These features and functionality make PESTS a valuable resource for a researcher's toolkit. PESTS is available to download for free to academic and non-profit users at http://www.mailman.columbia.edu/academic-departments/biostatistics/research-service/software-development.

  7. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

    Full Text Available In order to detect outliers in hydrological time series data for improving data quality and decision-making quality related to design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. The use of the PCI as a threshold is based mainly on the fact that it considers the uncertainty in the data series parameters of the forecasting model, addressing the problem of selecting a suitable threshold. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
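
    A stripped-down version of the sliding-window scheme is sketched below in Python: each point is predicted from the preceding window and flagged if it falls outside the prediction confidence interval. The window-mean predictor and the z-sigma interval are assumptions of this sketch; the paper derives the PCI from its own forecasting model.

        # Sliding-window prediction with a confidence-interval threshold (illustrative sketch).
        import numpy as np

        def detect_outliers(x, window=30, z=3.0):
            flags = np.zeros(len(x), dtype=bool)
            for t in range(window, len(x)):
                hist = x[t - window:t]
                pred, spread = hist.mean(), hist.std(ddof=1)
                lo, hi = pred - z * spread, pred + z * spread   # prediction confidence interval
                flags[t] = not (lo <= x[t] <= hi)
            return flags

        rng = np.random.default_rng(5)
        level = 10 + 0.5 * np.sin(np.arange(1000) / 50) + rng.normal(0, 0.2, 1000)
        level[400] += 5.0                                       # injected anomalous reading
        print(np.where(detect_outliers(level))[0])              # contains index 400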

  8. Increment entropy as a measure of complexity for time series

    CERN Document Server

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series in which each increment is mapped into a word of two letters, one letter corresponding to direction and the other corresponding to magnitude. The Shannon entropy of the words is termed the increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals have demonstrated its ability to detect abrupt changes, whether energetic (e.g. spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.
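
    Following the abstract's description, a minimal increment-entropy computation could look like the Python sketch below; the magnitude quantizer (resolution q) and the comparison signals are assumptions of this illustration.

        # Increment entropy: Shannon entropy of (sign, quantized magnitude) words (sketch).
        import numpy as np
        from collections import Counter

        def increment_entropy(x, q=4):
            d = np.diff(x)
            signs = np.sign(d).astype(int)
            scale = np.std(d) if np.std(d) > 0 else 1.0
            mags = np.minimum(np.floor(np.abs(d) / scale * q), q).astype(int)  # levels 0..q
            words = list(zip(signs, mags))
            counts = np.array(list(Counter(words).values()), dtype=float)
            p = counts / counts.sum()
            return float(-np.sum(p * np.log2(p)))

        rng = np.random.default_rng(6)
        noise = rng.normal(size=2000)
        sine = np.sin(np.linspace(0, 40 * np.pi, 2000))
        print(f"IncrEn(sine)  = {increment_entropy(sine):.2f}")   # typically lower: regular increments
        print(f"IncrEn(noise) = {increment_entropy(noise):.2f}")  # higher: irregular increments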

  9. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the implicit relationships within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both the long-term and the short-term results show that the models are effective and can achieve the expected accuracy.
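
    The base GM(1,1) grey model (before the paper's adaptive improvement and error correction) is sketched below in Python; the toy track-irregularity values are hypothetical.

        # Standard GM(1,1) grey model forecast (illustrative sketch).
        import numpy as np

        def gm11_forecast(x0, steps):
            """Fit GM(1,1) to the positive series x0 and forecast `steps` further values."""
            x1 = np.cumsum(x0)                                  # accumulated generating operation
            z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
            B = np.column_stack([-z1, np.ones(len(z1))])
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # develop coefficient, grey input
            k = np.arange(len(x0) + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # whitened-equation solution
            x0_hat = np.r_[x1_hat[0], np.diff(x1_hat)]          # back to the original scale
            return x0_hat[len(x0):]

        irregularity = np.array([2.10, 2.18, 2.31, 2.40, 2.55, 2.63, 2.78])  # hypothetical values
        print(gm11_forecast(irregularity, steps=3).round(2))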

  10. Feature-preserving interpolation and filtering of environmental time series

    CERN Document Server

    Mariethoz, Gregoire; Jougnot, Damien; Rezaee, Hassan

    2015-01-01

    We propose a method for filling gaps and removing interferences in time series for applications involving continuous monitoring of environmental variables. The approach is non-parametric and based on an iterative pattern-matching between the affected and the valid parts of the time series. It considers several variables jointly in the pattern matching process and allows preserving linear or non-linear dependences between variables. The uncertainty in the reconstructed time series is quantified through multiple realizations. The method is tested on self-potential data that are affected by strong interferences as well as data gaps, and the results show that our approach allows reproducing the spectral features of the original signal. Even in the presence of intense signal perturbations, it significantly improves the signal and corrects bias introduced by asymmetrical interferences. Potential applications are wide-ranging, including geophysics, meteorology and hydrology.

  11. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  12. GAS DETECTING AND FORECASTING VIA TIME SERIES METHOD

    Institute of Scientific and Technical Information of China (English)

    黄养光

    1990-01-01

    The importance and urgency of gas detection and forecasting in underground coal mining are self-evident. Unfortunately, this problem has not yet been solved thoroughly. In this paper, the author suggests that the time series analysis method be adopted for processing the stochastic gas data. The time series method is superior to conventional Fourier analysis in some aspects; in particular, the time series method possesses a forecasting (or prediction) capability, which is highly valuable for gas monitoring. An example of a set of gas data sampled from a certain foul coal mine is investigated and an AR(3) model is established. The fitting result and the forecasting error are satisfactory. At the end of this paper several remarks are presented for further discussion.
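
    Fitting an AR(3) model of the kind established in the paper can be sketched in Python with ordinary least squares; the synthetic gas-like series and its coefficients are assumptions of this example.

        # Least-squares AR(3) fit and one-step forecast (illustrative sketch).
        import numpy as np

        def fit_ar(x, p):
            """AR(p) coefficients [a1..ap] plus intercept, by ordinary least squares."""
            X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
            X = np.column_stack([X, np.ones(len(X))])
            coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
            return coef

        def forecast_next(x, coef):
            p = len(coef) - 1
            return float(np.dot(coef[:p], x[-1:-p - 1:-1]) + coef[-1])

        rng = np.random.default_rng(12)
        gas = np.zeros(500)
        for t in range(3, 500):                               # synthetic AR(3)-like gas signal
            gas[t] = 0.5 * gas[t-1] + 0.2 * gas[t-2] + 0.1 * gas[t-3] + rng.normal(0, 0.1)
        coef = fit_ar(gas, p=3)
        print(coef.round(2), round(forecast_next(gas, coef), 3))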

  13. The time series forecasting: from the aspect of network

    CERN Document Server

    Chen, S; Hu, Y; Liu, Q; Deng, Y

    2014-01-01

    Forecasting can estimate the state of events from historical data, and it is considerably important in many disciplines. At present, time series models have been utilized to solve forecasting problems in various domains. In general, researchers use curve fitting and parameter estimation methods (moment estimation, maximum likelihood estimation and the least squares method) to forecast. In this paper, a new perspective on forecasting is given and a completely different method is proposed to forecast time series. Inspired by the visibility graph and link prediction, this letter converts the time series into a network and then finds the nodes that are most likely to link with the predicted node. Finally, the predicted value is obtained according to the state of the link. The TAIEX data set is used in the case study to illustrate that the proposed method is effective. Compared with an ARIMA model, the proposed method shows good forecasting performance when there is a small amount of data.

  14. Parameter-Free Search of Time-Series Discord

    Institute of Scientific and Technical Information of China (English)

    Wei Luo; Marcus Gallagher; Janet Wiles

    2013-01-01

    Time-series discord is widely used in data mining applications to characterize anomalous subsequences in time series. Compared to some other discord search algorithms, the direct search algorithm based on the recurrence plot shows the advantage of being fast and parameter free. The direct search algorithm, however, relies on quasi-periodicity in input time series, an assumption that limits the algorithm's applicability. In this paper, we eliminate the periodicity assumption from the direct search algorithm by proposing a reference function for subsequences and a new sampling strategy based on the reference function. These measures result in a new algorithm with improved efficiency and robustness, as evidenced by our empirical evaluation.
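
    For contrast with the recurrence-plot-based direct search, the plain brute-force discord definition is sketched below in Python: the discord is the subsequence whose distance to its nearest non-overlapping match is largest. The synthetic quasi-periodic series and the window length are assumptions of this sketch.

        # Brute-force time-series discord search (illustrative sketch, O(n^2)).
        import numpy as np

        def find_discord(x, m):
            """Return (start index, nearest-neighbour distance) of the length-m discord."""
            n = len(x) - m + 1
            subs = np.lib.stride_tricks.sliding_window_view(x, m)
            best_idx, best_dist = -1, -np.inf
            for i in range(n):
                d = np.sqrt(((subs - subs[i]) ** 2).sum(axis=1))
                d[max(0, i - m + 1):i + m] = np.inf          # exclude trivial self-matches
                nn = d.min()
                if nn > best_dist:
                    best_idx, best_dist = i, nn
            return best_idx, best_dist

        rng = np.random.default_rng(7)
        t = np.arange(2000)
        beat_like = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
        beat_like[1200:1250] *= 0.2                          # one distorted cycle
        print(find_discord(beat_like, m=50))                 # start index near 1200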

  15. Time series characterization via horizontal visibility graph and Information Theory

    Science.gov (United States)

    Gonçalves, Bruna Amin; Carpi, Laura; Rosso, Osvaldo A.; Ravetti, Martín G.

    2016-12-01

    Complex network theory has gained wider applicability since methods for the transformation of time series to networks were proposed and successfully tested. In the last few years, the horizontal visibility graph has become a popular method due to its simplicity and good results when applied to natural and artificially generated data. In this work, we explore different ways of extracting information from the network constructed from the horizontal visibility graph and evaluated by Information Theory quantifiers. Most works use the degree distribution of the network; however, we find alternative probability distributions that are more efficient than the degree distribution in characterizing dynamical systems. In particular, we find that, when using distributions based on distances and amplitude values, significantly shorter time series are required. We analyze fractional Brownian motion time series and a paleoclimatic proxy record of ENSO from the Pallcacocha Lake to study dynamical changes during the Holocene.
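
    A direct O(n^2) horizontal visibility graph construction is sketched below in Python; only the degree sequence is computed here, whereas the paper goes on to use alternative distributions based on distances and amplitudes.

        # Horizontal visibility graph degrees: i and j are linked if every value strictly
        # between them is lower than min(x[i], x[j]).
        import numpy as np

        def hvg_degrees(x):
            n = len(x)
            deg = np.zeros(n, dtype=int)
            for i in range(n):
                for j in range(i + 1, n):
                    if i + 1 == j or np.max(x[i + 1:j]) < min(x[i], x[j]):
                        deg[i] += 1
                        deg[j] += 1
                    if x[j] >= x[i]:      # nothing beyond j can see i any more
                        break
            return deg

        rng = np.random.default_rng(8)
        print(f"mean degree, white noise  : {hvg_degrees(rng.normal(size=2000)).mean():.2f}")  # approaches 4
        print(f"mean degree, monotone trend: {hvg_degrees(np.arange(2000.0)).mean():.2f}")      # chain, close to 2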

  16. Asymptotics for Nonlinear Transformations of Fractionally Integrated Time Series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The asymptotic theory for nonlinear transformations of fractionally integrated time series is developed. By use of the fractional occupation times formula, various nonlinear functions of fractionally integrated series such as ARFIMA time series are studied, and the asymptotic distributions of the sample moments of such functions are obtained and analyzed. The transformations considered in this paper include a variety of functions, such as regular functions, integrable functions and asymptotically homogeneous functions, that are often used in practical nonlinear econometric analysis. It is shown that the asymptotic theory of nonlinear transformations of original and normalized fractionally integrated processes is different from that of fractionally integrated processes, but is similar to the asymptotic theory of nonlinear transformations of integrated processes.

  17. Complex Network Approach to the Fractional Time Series

    CERN Document Server

    Manshour, Pouya

    2015-01-01

    In order to extract the correlation information inherent in a stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map the fractional processes onto complex networks. The parabolic exponential functions are found to fit the corresponding degree distributions, with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for the antipersistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between the node's degree and its corresp...

  18. On the detection of superdiffusive behaviour in time series

    Science.gov (United States)

    Gottwald, G. A.; Melbourne, I.

    2016-12-01

    We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data (with no prior knowledge required of the nature of the data) and relies on one realisation (i.e. one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. α-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement as well as the p-variation method, maximum likelihood, quantile matching and linear regression of the empirical characteristic function.
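
    The standard baseline the authors compare against, fitting the growth rate of the mean-square displacement from a single sample path, is sketched below in Python; the Brownian and flight-like test series are assumptions of the example, and no drift removal is attempted.

        # Mean-square-displacement growth-rate estimate from one realisation (baseline sketch).
        import numpy as np

        def msd_exponent(x, max_lag):
            """Estimate gamma in MSD(tau) ~ tau^gamma from a single path x."""
            lags = np.arange(1, max_lag + 1)
            msd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
            gamma, _ = np.polyfit(np.log(lags), np.log(msd), 1)
            return gamma

        rng = np.random.default_rng(9)
        brownian = np.cumsum(rng.normal(size=20000))                     # ordinary diffusion
        flights = np.cumsum(np.repeat(rng.choice([-1, 1], 200), 100))    # long persistent flights
        print(f"gamma (Brownian)      ~ {msd_exponent(brownian, 100):.2f}")  # near 1
        print(f"gamma (superdiffusive)~ {msd_exponent(flights, 100):.2f}")   # clearly above 1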

  19. Increment Entropy as a Measure of Complexity for Time Series

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    2016-01-01

    Full Text Available Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.

  20. Time series, correlation matrices and random matrix models

    Energy Technology Data Exchange (ETDEWEB)

    Vinayak [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca (Mexico); Seligman, Thomas H. [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca, México and Centro Internacional de Ciencias, C.P. 62210 Cuernavaca (Mexico)

    2014-01-08

    In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum-information hypothesis for the description of a quantum system or subsystem. In the former case, various forms of correlation matrices of time series associated with the classical observables of some system are used. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. As a consequence, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case, random matrices are used to describe high-temperature environments, uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.

  1. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.

  2. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J;

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both...... the internal dynamics of the systems, and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series......, and if this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations...

  3. Extracting unstable periodic orbits from chaotic time series data

    Energy Technology Data Exchange (ETDEWEB)

    So, P.; Schiff, S.; Gluckman, B.J., [Center for Neuroscience, Childrens Research Institute, Childrens National Medical Center and the George Washington University, NW, Washington, D.C. 20010 (United States); So, P.; Ott, E.; Grebogi, C., [Institute for Plasma Research, University of Maryland, College Park, Maryland 20742 (United States); Sauer, T., [Department of Mathematics, The George Mason University, Fairfax, Virginia 22030 (United States); Gluckman, B.J., [Naval Surface Warfare Center, Carderock Division, Bethesda, Maryland 20054-5000 (United States)

    1997-05-01

    A general nonlinear method to extract unstable periodic orbits from chaotic time series is proposed. By utilizing the estimated local dynamics along a trajectory, we devise a transformation of the time series data such that the transformed data are concentrated on the periodic orbits. Thus, one can extract unstable periodic orbits from a chaotic time series by simply looking for peaks in a finite grid approximation of the distribution function of the transformed data. Our method is demonstrated using data from both numerical and experimental examples, including neuronal ensemble data from mammalian brain slices. The statistical significance of the results in the presence of noise is assessed using surrogate data. © 1997 The American Physical Society

  4. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    Science.gov (United States)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields `objective' results. In the present paper it is argued that most of the classical nonlinear techniques don't satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we can show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.

  5. General expression for linear and nonlinear time series models

    Institute of Scientific and Technical Information of China (English)

    Ren HUANG; Feiyun XU; Ruwen CHEN

    2009-01-01

    The typical time series models such as ARMA, AR, and MA are founded on the normality and stationarity of a system and expressed by a linear difference equation; therefore, they are strictly limited to linear systems. However, nonlinear factors are present in practical systems; thus, it is difficult to fit real systems with the above models. This paper proposes a general expression for linear and nonlinear auto-regressive time series models (GNAR). With the gradient optimization method and a modified AIC information criterion integrated with the prediction error, the parameter estimation and order determination are achieved. Model simulations and experiments show that the GNAR model can accurately approximate the dynamic characteristics of most nonlinear models applied in academia and engineering. The modeling and prediction accuracy of the GNAR model is superior to that of the classical time series models. The proposed GNAR model is flexible and effective.

  6. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  7. Chaotic time series prediction and additive white Gaussian noise

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Teck Por [Department of Mathematics, 6M30 Huxley, Imperial College London, 180 Queen's Gate, London, SW7 2BZ (United Kingdom)]. E-mail: teckpor@gmail.com; Puthusserypady, Sadasivan [Department of Electrical and Computer Engineering, National University of Singapore, 4 Engineering Drive 3, Singapore 117576 (Singapore)]. E-mail: elespk@nus.edu.sg

    2007-06-04

    Takens' delay embedding theorem states that a pseudo state-space can be reconstructed from a time series consisting of observations of a chaotic process. However, experimental observations are inevitably corrupted by measurement noise, which can be modelled as Additive White Gaussian Noise (AWGN). This Letter analyses time series prediction in the presence of AWGN using the triangle inequality and the mean of the Nakagami distribution. It is shown that using more delay coordinates than those used by a typical delay embedding can improve prediction accuracy, when the mean magnitude of the input vector dominates the mean magnitude of AWGN.
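
    For reference, the delay-coordinate construction discussed here is a few lines of numpy; the noisy logistic-map test signal and the embedding parameters below are illustrative assumptions, and the Letter's Nakagami-based analysis of AWGN is not reproduced.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-coordinate matrix: row t is [x(t), x(t-tau), ..., x(t-(dim-1)*tau)]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[(dim - 1 - k) * tau:(dim - 1 - k) * tau + n] for k in range(dim)])

# noisy chaotic test signal (logistic map plus additive white Gaussian noise)
x = np.empty(2000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
x_noisy = x + 0.05 * np.random.randn(len(x))

# deliberately use more coordinates than a minimal embedding would require
E = delay_embed(x_noisy, dim=6, tau=1)
print(E.shape)   # (1995, 6)
```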

  8. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
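
    A hedged numpy sketch of the central idea follows: a thresholded recurrence matrix of a logistic-map series is identified with the adjacency matrix of a network, on which standard network measures can then be computed. The threshold and map parameters are illustrative choices, not those of the paper.

```python
import numpy as np

def recurrence_adjacency(x, eps):
    """Adjacency matrix of the eps-recurrence network of a scalar series (no self-loops)."""
    D = np.abs(np.subtract.outer(x, x))     # pairwise distances (1-D phase space here)
    A = (D <= eps).astype(int)
    np.fill_diagonal(A, 0)
    return A

# logistic map in the chaotic regime
n = 1000
x = np.empty(n)
x[0] = 0.3
for i in range(1, n):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

A = recurrence_adjacency(x, eps=0.05)
degree = A.sum(axis=1)                       # a first network measure on the recurrence matrix
density = A.sum() / (n * (n - 1))            # recurrence rate / link density
print("mean degree %.1f, link density %.3f" % (degree.mean(), density))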

  9. Mining Rules from Electrical Load Time Series Data Set

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The mining of rules from electrical load time series data collected from the EMS (Energy Management System) is discussed. The data from the EMS are too voluminous and complex to be understood and used directly by power system engineers, yet useful information is hidden in the electrical load data. The authors discuss the use of fuzzy linguistic summaries as a data mining method to induce rules from the electrical load time series. Data preprocessing techniques are also discussed in the paper.

  10. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version......-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  11. Nonlinear Time Series Forecast Using Radial Basis Function Neural Networks

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xin; CHEN Tian-Lun

    2003-01-01

    In research on using Radial Basis Function Neural Networks (RBF NN) to forecast nonlinear time series, we investigate how different clusterings affect the process of learning and forecasting. We find that k-means clustering is very suitable. In order to increase the precision, we introduce a nonlinear feedback term to escape from the local minima of the energy; we then use the model to forecast nonlinear time series produced by the Mackey-Glass equation and by stocks. By selecting k-means clustering and a suitable feedback term, much better forecasting results are obtained.
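
    A rough reconstruction of this forecasting setup (k-means centres feeding a Gaussian RBF layer fitted by least squares) is sketched below on a crude Euler discretisation of a Mackey-Glass-type series. The nonlinear feedback term of the paper is omitted, and all parameter values, the width heuristic, and the test signal are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# crude Euler discretisation of a Mackey-Glass-type delay equation (illustrative only)
n, tau = 1500, 17
x = 1.2 * np.ones(n)
for t in range(tau, n - 1):
    x[t + 1] = x[t] + 0.2 * x[t - tau] / (1 + x[t - tau] ** 10) - 0.1 * x[t]

# one-step-ahead forecasting: k-means centres feeding a Gaussian RBF layer fitted by least squares
d = 6                                                        # length of the input (delay) vector
X = np.column_stack([x[i:i + len(x) - d] for i in range(d)])
y = x[d:]
split = 1000
centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X[:split]).cluster_centers_
width = np.mean(np.linalg.norm(centres[:, None] - centres[None, :], axis=-1))  # crude width heuristic

def design(Z):
    return np.exp(-np.linalg.norm(Z[:, None] - centres[None, :], axis=-1) ** 2 / (2 * width ** 2))

w = np.linalg.lstsq(design(X[:split]), y[:split], rcond=None)[0]
pred = design(X[split:]) @ w
print("out-of-sample RMSE: %.4f" % np.sqrt(np.mean((pred - y[split:]) ** 2)))
```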

  12. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...

  13. Testing for intracycle determinism in pseudoperiodic time series

    Science.gov (United States)

    Coelho, Mara C. S.; Mendes, Eduardo M. A. M.; Aguirre, Luis A.

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  14. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    , which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time......-series compression models. Due to the bandwidth limits arising from the potentially sheer speed of the data, it is inevitable to compress and re-compress data along the dissemination paths according to the subscription level of each node. Compression would cause a loss of data accuracy; thus we devise several algorithms...

  15. Bootstrap Power of Time Series Goodness of fit tests

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2013-10-01

    Full Text Available In this article, we look at the power of various versions of the Box-Pierce statistic and the Cramer-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms have been provided for the power calculations, and a comparison has also been made between the semi-parametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation reverses for the Cramer-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed design bootstrap method.
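
    For reference, the portmanteau statistics compared in the article can be computed in a few lines. The sketch below shows only the Box-Pierce and Ljung-Box forms on a set of residuals; it does not reproduce the article's bootstrap power algorithms, and the lag count and test data are illustrative.

```python
import numpy as np
from scipy.stats import chi2

def portmanteau(resid, m, n_params=0, ljung_box=True):
    """Box-Pierce (or Ljung-Box) statistic on residuals using m lags; df = m - n_params."""
    e = np.asarray(resid, dtype=float) - np.mean(resid)
    n = len(e)
    denom = np.dot(e, e)
    r = np.array([np.dot(e[:n - k], e[k:]) / denom for k in range(1, m + 1)])
    if ljung_box:
        stat = n * (n + 2) * np.sum(r ** 2 / (n - np.arange(1, m + 1)))
    else:
        stat = n * np.sum(r ** 2)                # original Box-Pierce form
    return stat, chi2.sf(stat, m - n_params)

# white-noise residuals: the null of no remaining autocorrelation should not be rejected
e = np.random.randn(500)
stat, pval = portmanteau(e, m=10)
print("Q = %.2f, p-value = %.3f" % (stat, pval))
```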

  16. Kālī: Time series data modeler

    Science.gov (United States)

    Kasliwal, Vishal P.

    2016-07-01

    The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte-Carlo (MCMC) for inferencing a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power and also as an acronym for KArma LIbrary.

  17. Fractal dimension of electroencephalographic time series and underlying brain processes.

    Science.gov (United States)

    Lutzenberger, W; Preissl, H; Pulvermüller, F

    1995-10-01

    Fractal dimension has been proposed as a useful measure for the characterization of electrophysiological time series. This paper investigates what the pointwise dimension of electroencephalographic (EEG) time series can reveal about underlying neuronal generators. The following theoretical assumptions concerning brain function were made: (i) within the cortex, strongly coupled neural assemblies exist which oscillate at certain frequencies when they are active, (ii) several such assemblies can oscillate at a time, and (iii) activity flow between assemblies is minimal. If these assumptions are made, cortical activity can be considered as the weighted sum of a finite number of oscillations (plus noise). It is shown that the correlation dimension of finite time series generated by multiple oscillators increases monotonically with the number of oscillators. Furthermore, it is shown that a reliable estimate of the pointwise dimension of the raw EEG signal can be calculated from a time series as short as a few seconds. These results indicate that (i) the pointwise dimension of the EEG allows conclusions regarding the number of independently oscillating networks in the cortex, and (ii) a reliable estimate of the pointwise dimension of the EEG is possible on the basis of short raw signals.
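
    The claim that the correlation dimension grows with the number of independent oscillations can be illustrated with a Grassberger-Procaccia correlation sum on delay-embedded sinusoids, as sketched below. The embedding parameters, record length, radii, and random subsampling are illustrative assumptions, not the paper's pointwise-dimension estimator.

```python
import numpy as np

def correlation_dimension(x, m=5, tau=5, n_points=700, radii=None, seed=0):
    """Grassberger-Procaccia slope estimate on a delay-embedded scalar series."""
    rng = np.random.default_rng(seed)
    N = len(x) - (m - 1) * tau
    E = np.column_stack([x[i * tau:i * tau + N] for i in range(m)])
    E = E[rng.choice(N, size=n_points, replace=False)]       # random subsample keeps O(n^2) small
    d = np.linalg.norm(E[:, None, :] - E[None, :, :], axis=-1)[np.triu_indices(n_points, k=1)]
    radii = np.logspace(-0.9, 0.0, 6) if radii is None else radii
    C = np.array([np.mean(d < r) for r in radii])             # correlation sum C(r)
    return np.polyfit(np.log(radii), np.log(C), 1)[0]         # slope ~ correlation dimension

t = np.arange(0, 400, 0.02)
one_osc = np.sin(2 * np.pi * t)
two_osc = one_osc + np.sin(2 * np.pi * np.sqrt(2) * t)        # add an incommensurate oscillator
print("1 oscillator:  D2 ≈ %.2f (expect roughly 1)" % correlation_dimension(one_osc))
print("2 oscillators: D2 ≈ %.2f (expect roughly 2)" % correlation_dimension(two_osc))
```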

  18. FTSPlot: fast time series visualization for large datasets.

    Science.gov (United States)

    Riss, Michael

    2014-01-01

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n x log(N)); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free, making it a suitable visualization method for long-term electrophysiological experiments.

  19. LEARNING GRANGER CAUSALITY GRAPHS FOR MULTIVARIATE NONLINEAR TIME SERIES

    Institute of Scientific and Technical Information of China (English)

    Wei GAO; Zheng TIAN

    2009-01-01

    An information theory method is proposed to test Granger causality and contemporaneous conditional independence in Granger causality graph models. In the graphs, the vertex set denotes the component series of the multivariate time series, and the directed edges denote causal dependence, while the undirected edges reflect instantaneous dependence. The presence of the edges is measured by a statistic based on conditional mutual information and tested by a permutation procedure. Furthermore, for the existing relations, a statistic based on the difference between general conditional mutual information and linear conditional mutual information is proposed to test for nonlinearity. The significance of the nonlinear test statistic is determined by a bootstrap method based on surrogate data. We investigate the finite-sample behavior of the procedure through simulated time series with different dependence structures, including linear and nonlinear relations.

  20. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    CERN Document Server

    Scargle, Jeffrey D; Jackson, Brad; Chiang, James

    2012-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks (Scargle 1998) - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multi-variate time series data, analysis of vari...
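
    A minimal usage sketch follows, assuming astropy is available; its astropy.stats.bayesian_blocks routine implements a Bayesian Blocks segmentation of this kind. The event times and false-alarm parameter below are made up for illustration.

```python
import numpy as np
from astropy.stats import bayesian_blocks   # assumes astropy is installed

# event times with a piecewise-constant rate: change points near t = 30 and t = 60
rng = np.random.default_rng(1)
t = np.sort(np.concatenate([rng.uniform(0, 30, 150),
                            rng.uniform(30, 60, 600),
                            rng.uniform(60, 100, 200)]))
edges = bayesian_blocks(t, fitness='events', p0=0.01)
print("recovered block edges:", np.round(edges, 1))   # should bracket ~30 and ~60
```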

  1. Recovery of delay time from time series based on the nearest neighbor method

    Science.gov (United States)

    Prokhorov, M. D.; Ponomarenko, V. I.; Khorev, V. S.

    2013-12-01

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.
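
    The sketch below illustrates the general idea of delay recovery with nearest neighbours, not the authors' algorithm: for each trial delay d, a nearest-neighbour predictor of the increment is built in the coordinates (x(t), x(t-d)), and the prediction error is smallest at the true delay. The toy delay system and all parameter values are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

# toy time-delay system (crude Euler discretisation of a Mackey-Glass-type equation,
# true delay = 30 samples) with a little dynamical noise
rng = np.random.default_rng(4)
true_tau, n = 30, 4000
x = 1.2 * np.ones(n)
for t in range(true_tau, n - 1):
    x[t + 1] = (0.9 * x[t] + 0.2 * x[t - true_tau] / (1 + x[t - true_tau] ** 10)
                + 0.002 * rng.normal())

def nn_prediction_error(x, d):
    """Error of a nearest-neighbour predictor of the increment from the pair (x(t), x(t-d))."""
    z = np.column_stack([x[d:-1], x[:-d - 1]])
    dx = np.diff(x)[d:]
    _, idx = cKDTree(z).query(z, k=2)          # idx[:, 1]: nearest neighbour other than self
    return np.mean((dx - dx[idx[:, 1]]) ** 2)

delays = np.arange(5, 60)
errors = [nn_prediction_error(x, d) for d in delays]
print("recovered delay:", delays[int(np.argmin(errors))])   # expected: 30
```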

  2. Recovery of delay time from time series based on the nearest neighbor method

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, M.D., E-mail: mdprokhorov@yandex.ru [Saratov Branch of Kotel'nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Ponomarenko, V.I. [Saratov Branch of Kotel'nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation); Khorev, V.S. [Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation)

    2013-12-09

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.

  3. Change detection in a time series of polarimetric SAR images

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    can be used to detect at which points changes occur in the time series. [1] T. W. Anderson, An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third edition, 2003. [2] K. Conradsen, A. A. Nielsen, J. Schou, and H. Skriver, “A test statistic in the complex Wishart distribution...

  4. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  5. Wavelet methods in (financial) time-series processing

    NARCIS (Netherlands)

    Struzik, Z.R.

    2000-01-01

    We briefly describe the major advantages of using the wavelet transform for the processing of financial time series on the example of the S&P index. In particular, we show how to uncover the local scaling (correlation) characteristics of the S&P index with the wavelet-based effective Hölder exponent

  6. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  7. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  8. Deriving dynamic marketing effectiveness from econometric time series models

    NARCIS (Netherlands)

    C. Horváth (Csilla); Ph.H.B.F. Franses (Philip Hans)

    2003-01-01

    textabstractTo understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the

  9. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...

  10. Publicly Verifiable Private Aggregation of Time-Series Data

    NARCIS (Netherlands)

    Bakondi, B.G.; Peter, A.; Everts, M.H.; Hartel, P.H.; Jonker, W.

    2015-01-01

    Aggregation of time-series data offers the possibility to learn certain statistics over data periodically uploaded by different sources. In case of privacy sensitive data, it is desired to hide every data provider's individual values from the other participants (including the data aggregator). Exist

  11. Noise in multivariate GPS position time-series

    NARCIS (Netherlands)

    Amiri-Simkooei, A.R.

    2008-01-01

    A methodology is developed to analyze a multivariate linear model, which occurs in many geodetic and geophysical applications. Proper analysis of multivariate GPS coordinate time-series is considered to be an application. General, special, and more practical stochastic models are adopted to assess t

  12. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    textabstractWe advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons simultane

  13. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regimeswitching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...

  14. A test of conditional heteroscedasticity in time series

    Institute of Scientific and Technical Information of China (English)

    陈敏; 安鸿志

    1999-01-01

    A new test of conditional heteroscedasticity for time series is proposed. The testing method is based on a goodness-of-fit type test statistic and a Cramer-von Mises type test statistic. The asymptotic properties of the new test statistic are established. The results demonstrate that such a test is consistent.

  15. What Makes a Coursebook Series Stand the Test of Time?

    Science.gov (United States)

    Illes, Eva

    2009-01-01

    Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…

  16. Irreversibility of financial time series: A graph-theoretical approach

    Science.gov (United States)

    Flanagan, Ryan; Lacasa, Lucas

    2016-04-01

    The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify, in graph-theoretical terms, time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years, based on time irreversibility features, clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.
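
    A simplified sketch in the spirit of this visibility approach is given below: build the directed horizontal visibility graph and compare the in- and out-degree distributions with a Kullback-Leibler divergence. The test signals and degree binning are illustrative, and this is not the authors' exact pipeline for the financial indices.

```python
import numpy as np

def hvg_degrees(x):
    """In/out degrees of the directed (time-forward) horizontal visibility graph."""
    n = len(x)
    kout, kin = np.zeros(n, dtype=int), np.zeros(n, dtype=int)
    for i in range(n):
        top = -np.inf
        for j in range(i + 1, n):
            if top < min(x[i], x[j]):      # all intermediate values lie below both endpoints
                kout[i] += 1
                kin[j] += 1
            top = max(top, x[j])
            if x[j] >= x[i]:               # nothing beyond a higher value is visible from i
                break
    return kout, kin

def kld(p, q, eps=1e-12):
    p, q = p + eps, q + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
logistic = np.empty(3000)
logistic[0] = 0.31
for i in range(1, len(logistic)):
    logistic[i] = 4.0 * logistic[i - 1] * (1 - logistic[i - 1])

for name, x in [("white noise (reversible)", rng.normal(size=3000)),
                ("chaotic logistic map (irreversible)", logistic)]:
    kout, kin = hvg_degrees(x)
    ks = np.arange(1, 15)
    irrev = kld(np.array([(kout == k).mean() for k in ks]),
                np.array([(kin == k).mean() for k in ks]))
    print("%-38s irreversibility (KLD) = %.3f" % (name, irrev))
```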

  17. Metagenomics meets time series analysis: unraveling microbial community dynamics.

    Science.gov (United States)

    Faust, Karoline; Lahti, Leo; Gonze, Didier; de Vos, Willem M; Raes, Jeroen

    2015-06-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic patterns, help to build predictive models or, on the contrary, quantify irregularities that make community behavior unpredictable. Microbial communities can change abruptly in response to small perturbations, linked to changing conditions or the presence of multiple stable states. With sufficient samples or time points, such alternative states can be detected. In addition, temporal variation of microbial interactions can be captured with time-varying networks. Here, we apply these techniques on multiple longitudinal datasets to illustrate their potential for microbiome research.

  18. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets. Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  19. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  20. Modelling, simulation and inference for multivariate time series of counts

    OpenAIRE

    Veraart, Almut E. D.

    2016-01-01

    This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise - called a trawl process - where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...

  1. Statistical Analysis of Time Series Data (STATS). Users Manual (Preliminary)

    Science.gov (United States)

    1987-05-01

    [OCR fragments of the manual's input-record layout; only partial details are recoverable: analysis periods of 15, 30, 60, 90, 120, and 183 days are presently used; described fields include NPRDS (actual number of periods for the event, following on IN records until the next ID, BF, or LI record), JEND (order number of the last period in the time series to select for analysis; if blank, the last period is assumed) and JPPF (plotting); the IN record carries the time series data.]

  2. Improving predictability of time series using maximum entropy methods

    Science.gov (United States)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  3. Autoregression of Quasi-Stationary Time Series (Invited)

    Science.gov (United States)

    Meier, T. M.; Küperkoch, L.

    2009-12-01

    Autoregression is a model based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we briefly review methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison with results obtained by Fourier-transform-based methods and the visualization of the time-dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseism and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio. Results are compared to manual readings.
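
    A toy version of the moving-window idea is sketched below: fit an AR(2) model by least squares in each window and read the dominant frequency off the complex pole pair. The window length, model order, and the chirp-like test signal are assumptions, not the authors' algorithm or data.

```python
import numpy as np

def ar2_dominant_freq(w, fs):
    """Fit AR(2) by least squares and return the frequency of its complex pole pair (Hz)."""
    X = np.column_stack([w[1:-1], w[:-2]])
    a1, a2 = np.linalg.lstsq(X, w[2:], rcond=None)[0]
    roots = np.roots([1.0, -a1, -a2])
    return abs(np.angle(roots[0])) * fs / (2 * np.pi)

# chirp-like signal: dominant frequency drifts slowly from 5 Hz to 12 Hz
fs, T = 100.0, 20.0
t = np.arange(0, T, 1 / fs)
f_inst = 5 + (12 - 5) * t / T
x = np.sin(2 * np.pi * np.cumsum(f_inst) / fs) + 0.3 * np.random.randn(len(t))

win = 200                                     # 2-second moving window
for start in range(0, len(x) - win + 1, 400):
    w = x[start:start + win] - np.mean(x[start:start + win])
    print("t = %4.1f s   dominant f ≈ %.1f Hz" % (t[start], ar2_dominant_freq(w, fs)))
```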

  4. Wavelet analysis on paleomagnetic (and computer simulated) VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    Full Text Available We present Continuous Wavelet Transform (CWT) data analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic and geodynamo simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra, and the local analysis helped disclose their statistical significance. Even if this feature could be an indication of orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of the excursions in the series. The comparison among the paleomagnetic and simulated spectra shows a similarity in the high frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to have a guide in understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normality of the series and its inherent noise. The resulting relative chronology of the paleomagnetic data reversals was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of polarity stability intervals are found similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.

  5. Minimum Entropy Density Method for the Time Series Analysis

    CERN Document Server

    Jo, Hang-Hyun; Lee, Jeong Won; Moon, Hie-Tae; Park, Joongwoo Brian; Yang, Jae-Suk

    2006-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the most correlated time interval of a given time series and define the effective delay of information (EDI) as the correlation length that minimizes the entropy density in relation to the velocity of information flow. The MEDM is applied to the financial time series of the Standard and Poor's 500 (S&P500) index from February 1983 to April 2006. It is found that the EDI of the S&P500 index has decreased over the last twenty years, which suggests that the U.S. market dynamics have moved closer to the efficient market hypothesis.

  6. Adaptively Sharing Time-Series with Differential Privacy

    CERN Document Server

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestions. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confir...

  7. Cross Recurrence Plot Based Synchronization of Time Series

    CERN Document Server

    Marwan, N; Nowaczyk, N R

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP), which among others enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of it is e.g. compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.
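
    A compact sketch of a cross recurrence plot between two records of the same signal on distorted time axes is given below. The greedy line-of-synchronization tracker at the end is a crude stand-in for the paper's non-parametric LOS fit, and the test signal and tolerance are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 500)
warp = t + 0.8 * np.sin(0.4 * t)                    # second record lives on a distorted time axis
x = np.sin(2 * np.pi * 0.7 * t) + 0.05 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * 0.7 * warp) + 0.05 * rng.normal(size=t.size)

# cross recurrence plot: R[i, j] = 1 when the two series take similar values
R = np.abs(x[:, None] - y[None, :]) <= 0.1

# greedy line-of-synchronization tracker (not the paper's non-parametric fit)
los = np.zeros(len(t), dtype=int)
for j in range(1, len(t)):
    rows = np.flatnonzero(R[:, j])
    los[j] = rows[np.argmin(np.abs(rows - los[j - 1]))] if rows.size else los[j - 1]

print("recurrence rate: %.3f" % R.mean())
print("mean |recovered - true| time distortion: %.3f (warp amplitude 0.8)"
      % np.mean(np.abs(t[los] - warp)))
```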

  8. A multivariate heuristic model for fuzzy time-series forecasting.

    Science.gov (United States)

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

    Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  9. Cross recurrence plot based synchronization of time series

    Directory of Open Access Journals (Sweden)

    N. Marwan

    2002-01-01

    Full Text Available The method of recurrence plots is extended to the cross recurrence plots (CRP) which, among others, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.

  10. Time series data mining for the Gaia variability analysis

    CERN Document Server

    Nienartowicz, Krzysztof; Guy, Leanne; Holl, Berry; Lecoeur-Taïbi, Isabelle; Mowlavi, Nami; Rimoldini, Lorenzo; Ruiz, Idoia; Süveges, Maria; Eyer, Laurent

    2014-01-01

    Gaia is an ESA cornerstone mission, which was successfully launched December 2013 and commenced operations in July 2014. Within the Gaia Data Processing and Analysis consortium, Coordination Unit 7 (CU7) is responsible for the variability analysis of over a billion celestial sources and nearly 4 billion associated time series (photometric, spectrophotometric, and spectroscopic), encoding information in over 800 billion observations during the 5 years of the mission, resulting in a petabyte scale analytical problem. In this article, we briefly describe the solutions we developed to address the challenges of time series variability analysis: from the structure for a distributed data-oriented scientific collaboration to architectural choices and specific components used. Our approach is based on Open Source components with a distributed, partitioned database as the core to handle incrementally: ingestion, distributed processing, analysis, results and export in a constrained time window.

  11. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  12. A comprehensive characterization of recurrences in time series

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    Study of recurrences in earthquakes, climate, financial time-series, etc. is crucial to better forecast disasters and limit their consequences. However, almost all the previous phenomenological studies involved only a long-ranged autocorrelation function, or disregarded the multi-scaling properties induced by potential higher order dependencies. Consequently, they missed the fact that non-linear dependences do impact both the statistics and dynamics of recurrence times, and that scaling arguments for the unconditional distribution may not be applicable. We argue that copulas are the correct model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system specific, and cannot actually benefit from universal features, in contrast to the previous claims. This has important implications in epilepsy prognosis and financial risk management applications.

  13. Measures of Analysis of Time Series (MATS): A MATLAB Toolkit for Computation of Multiple Measures on Time Series Data Bases

    Directory of Open Access Journals (Sweden)

    Dimitris Kugiumtzis

    2010-02-01

    Full Text Available In many applications, such as physiology and finance, large time series data bases are to be analyzed, requiring the computation of linear, nonlinear and other measures. Such measures have been developed and implemented in commercial and freeware software packages rather selectively and independently. The Measures of Analysis of Time Series (MATS) MATLAB toolkit is designed to handle an arbitrarily large set of scalar time series and compute a large variety of measures on them, allowing for the specification of varying measure parameters as well. The variety of options with added facilities for visualization of the results support different settings of time series analysis, such as the detection of dynamics changes in long data records, resampling (surrogate or bootstrap) tests for independence and linearity with various test statistics, and discrimination power of different measures and for different combinations of their parameters. The basic features of MATS are presented and the implemented measures are briefly described. The usefulness of MATS is illustrated on some empirical examples along with screenshots.

  14. Building Real-Time Network Intrusion Detection System Based on Parallel Time-Series Mining Techniques

    Institute of Scientific and Technical Information of China (English)

    Zhao Feng; Li Qinghua

    2005-01-01

    A new real-time model based on parallel time-series mining is proposed to improve the accuracy and efficiency of network intrusion detection systems. In this model, a multidimensional dataset is constructed to describe network events, and a sliding window updating algorithm is used to maintain the network stream. Moreover, parallel frequent patterns and frequent episodes mining algorithms are applied to implement a parallel time-series mining engine which can intelligently generate rules to distinguish intrusions from normal activities. Analysis and study on the basis of DAWNING 3000 indicate that this parallel time-series mining-based model provides a more accurate and efficient way of building a real-time NIDS.

  15. Reconstruction of ensembles of coupled time-delay systems from time series

    Science.gov (United States)

    Sysoev, I. V.; Prokhorov, M. D.; Ponomarenko, V. I.; Bezruchko, B. P.

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  16. Exploring large scale time-series data using nested timelines

    Science.gov (United States)

    Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.

    2013-01-01

    When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding the changes of multivariate patterns, such as linear trends and clusters, on time-series data. In this paper, we describe a set of history views to fill this gap. This technique works under two modes: merge and non-merge. For the merge mode, merge algorithms were applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging with the tradeoff between more details in the data and less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or one defined by domain experts to represent timelines. This can help users navigate across long time periods. Gridbased views were designed to provide a compact overview for the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished assigned tasks with a high accuracy and relatively fast response time.

  17. Assessing spatial covariance among time series of abundance.

    Science.gov (United States)

    Jorgensen, Jeffrey C; Ward, Eric J; Scheuerell, Mark D; Zabel, Richard W

    2016-04-01

    For species of conservation concern, an essential part of the recovery planning process is identifying discrete population units and their location with respect to one another. A common feature among geographically proximate populations is that the number of organisms tends to covary through time as a consequence of similar responses to exogenous influences. In turn, high covariation among populations can threaten the persistence of the larger metapopulation. Historically, explorations of the covariance in population size of species with many (>10) time series have been computationally difficult. Here, we illustrate how dynamic factor analysis (DFA) can be used to characterize diversity among time series of population abundances and the degree to which all populations can be represented by a few common signals. Our application focuses on anadromous Chinook salmon (Oncorhynchus tshawytscha), a species listed under the US Endangered Species Act, that is impacted by a variety of natural and anthropogenic factors. Specifically, we fit DFA models to 24 time series of population abundance and used model selection to identify the minimum number of latent variables that explained the most temporal variation after accounting for the effects of environmental covariates. We found support for grouping the time series according to 5 common latent variables. The top model included two covariates: the Pacific Decadal Oscillation in spring and summer. The assignment of populations to the latent variables matched the currently established population structure at a broad spatial scale. At a finer scale, there was more population grouping complexity. Some relatively distant populations were grouped together, and some relatively close populations - considered to be more aligned with each other - were more associated with populations further away. These coarse- and fine-grained examinations of spatial structure are important because they reveal different structural patterns not evident

  18. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, during 10 years. There are 458 successive time steps. We have selected 5 different regions of coastal waters for the present study. They are Vietnam coastal waters, the Brahmaputra region, St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large scale river run off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method. The normalised energy of each mode in each region is computed relative to the total energy, where the total energy is obtained from all the modes extracted by the EMD method.

  19. West Africa land use and land cover time series

    Science.gov (United States)

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  20. Copulas and time series with long-ranged dependences

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    We review ideas on temporal dependences and recurrences in discrete time series from several areas of natural and social sciences. We revisit existing studies and redefine the relevant observables in the language of copulas (joint laws of the ranks). We propose that copulas provide an appropriate mathematical framework to study non-linear time dependences and related concepts - like aftershocks, Omori law, recurrences, waiting times. We also critically argue using this global approach that previous phenomenological attempts involving only a long-ranged autocorrelation function lacked complexity in that they were essentially mono-scale.

  1. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Full Text Available Diagnostic checking is the most important stage of time series model building. In this paper, several suggested portmanteau diagnostic tests are compared using simulated time series data.
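
    As a hedged example of a portmanteau diagnostic check, the sketch below applies the Ljung-Box test (via statsmodels) to the residuals of an ARMA fit on simulated data; the AR(1) process and the lag choice are illustrative assumptions, not the paper's simulation design.

```python
# Illustrative Ljung-Box portmanteau test on residuals of a fitted model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
y = np.zeros(300)
for t in range(1, 300):                       # simulate an AR(1) series
    y[t] = 0.6 * y[t - 1] + rng.normal()

res = ARIMA(y, order=(1, 0, 0)).fit()
lb = acorr_ljungbox(res.resid, lags=[10])     # test statistic and p-value at lag 10
print(lb)   # a large p-value suggests no remaining residual autocorrelation
```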

  2. Simple Patterns in Fluctuations of Time Series of Economic Interest

    Science.gov (United States)

    Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.

    Time series corresponding to nominal exchange rates between the US dollar and the currencies of Argentina, Brazil and the European Economic Community; different financial indexes such as the Industrial Dow Jones, the British Footsie, the German DAX Composite, the Australian Share Price and the Nikkei Cash; and also different Argentine local tax revenues are analyzed, looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the corresponding variance of increments at a given lag. The detrended fluctuation analysis of the data, in terms of the corresponding exponent in the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared.
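
    The detrended fluctuation analysis mentioned above can be sketched as follows; the scaling exponent is the slope of the fluctuation function in log-log coordinates. The window sizes and the white-noise test series are illustrative choices, not the financial data analyzed in the study.

```python
# Sketch of detrended fluctuation analysis (DFA) for a return-like series.
import numpy as np

def dfa_exponent(x, scales):
    y = np.cumsum(x - np.mean(x))             # integrated profile
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) vs log s is the DFA exponent
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(2)
returns = rng.normal(size=4096)                 # white noise -> exponent near 0.5
print(dfa_exponent(returns, scales=[16, 32, 64, 128, 256]))
```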

  3. Time series analysis of physiological response during ICU visitation.

    Science.gov (United States)

    Hepworth, J T; Hendrickson, S G; Lopez, J

    1994-12-01

    Time series analysis (TSA) is an important statistical procedure for clinical nursing research. The current paucity of nursing research reports using TSA may be due to unfamiliarity with this technique. In this article, TSA is compared with the ordinary least squares regression model; validity concerns of time series designs are discussed; and concomitant and interrupted TSA of data collected on the effects of family visitation on intracranial pressure (ICP), heart rate, and blood pressure of patients in ICUs are presented. The concomitant TSA of the effect of family on ICP suggested that family presence tended to be associated with decreased ICP. Interrupted TSA indicated the effect of family on heart rate and blood pressure was not as consistent: The overall effect on blood pressure appeared to be negligible, and heart rate may increase overall. Restrictive visiting policies, once typical of intensive care units, should be reconsidered.

  4. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  5. Model of a synthetic wind speed time series generator

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;

    2008-01-01

    Wind energy has assumed a great relevance in the operation and planning of today's power systems due to the exponential increase of installations in the last 10 years. For this reason, many studies have looked at suitable representations of wind generation for power system analysis. One of the main elements to consider for this purpose is the model of the wind speed, which is usually required as input. Wind speed measurements may represent a solution to this problem, but, for techniques such as sequential Monte Carlo simulation, they have to be long enough to describe a wide range of possible wind conditions. If this information is not available, synthetic wind speed time series may be a useful tool as well, but their generator must preserve the statistical and stochastic features of the phenomenon. This paper deals with this issue: a generator for synthetic wind speed time series...
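
    A minimal sketch of the general idea of an ARMA-based synthetic wind speed generator is shown below, with the output clipped at zero to respect physical limits; the coefficients, mean level and noise scale are made-up values, not the model identified in the paper.

```python
# Illustrative synthetic wind-speed generator built on an ARMA process.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.85])            # AR lag polynomial: AR(1) coefficient 0.85
ma = np.array([1, 0.3])              # MA lag polynomial: MA(1) coefficient 0.3
proc = ArmaProcess(ar, ma)

n_hours = 8760                       # one year of hourly values
series = proc.generate_sample(nsample=n_hours, scale=1.2)
wind_speed = np.clip(8.0 + series, 0.0, None)   # add a mean level, forbid negatives
print(wind_speed.mean(), wind_speed.std())
```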

  6. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contrast to this very slowly vanishing generalization error, the next-output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  7. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing, are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  8. Model-Coupled Autoencoder for Time Series Visualisation

    CERN Document Server

    Gianniotis, Nikolaos; Tiňo, Peter; Polsterer, Kai L

    2016-01-01

    We present an approach for the visualisation of a set of time series that combines an echo state network with an autoencoder. For each time series in the dataset we train an echo state network, using a common and fixed reservoir of hidden neurons, and use the optimised readout weights as the new representation. Dimensionality reduction is then performed via an autoencoder on the readout weight representations. The crux of the work is to equip the autoencoder with a loss function that correctly interprets the reconstructed readout weights by associating them with a reconstruction error measured in the data space of sequences. This essentially amounts to measuring the predictive performance that the reconstructed readout weights exhibit on their corresponding sequences when plugged back into the echo state network with the same fixed reservoir. We demonstrate that the proposed visualisation framework can deal both with real valued sequences as well as binary sequences. We derive magnification factors in order t...

  9. Radial basis function network design for chaotic time series prediction

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Chang Yong; Kim, Taek Soo; Park, Sang Hui [Yonsei University, Seoul (Korea, Republic of); Choi, Yoon Ho [Kyonggi University, Suwon (Korea, Republic of)

    1996-04-01

    In this paper, radial basis function networks with two hidden layers, which employ the K-means clustering method and hierarchical training, are proposed for improving the short-term predictability of chaotic time series. Furthermore, a recursive training method for the radial basis function network, using the recursive modified Gram-Schmidt algorithm, is proposed for this purpose. In addition, the radial basis function networks trained by the proposed training methods are compared with the model of X. D. He and A. Lapedes and with the radial basis function network trained by the non-recursive method. Through this comparison, an improved radial basis function network for predicting chaotic time series is presented. (author). 17 refs., 8 figs., 3 tabs.

  10. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  11. TESTING FOR OUTLIERS IN TIME SERIES USING WAVELETS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tong; ZHANG Xibin; ZHANG Shiying

    2003-01-01

    One remarkable feature of wavelet decomposition is that the wavelet coefficients are localized, and any singularity in the input signals can only affect the wavelet coefficients at the point near the singularity. The localized property of the wavelet coefficients allows us to identify the singularities in the input signals by studying the wavelet coefficients at different resolution levels. This paper considers wavelet-based approaches for the detection of outliers in time series. Outliers are high-frequency phenomena which are associated with the wavelet coefficients with large absolute values at different resolution levels. On the basis of the first-level wavelet coefficients, this paper presents a diagnostic to identify outliers in a time series. Under the null hypothesis that there is no outlier, the proposed diagnostic is distributed as a χ² with one degree of freedom (χ₁²). Empirical examples are presented to demonstrate the application of the proposed diagnostic.
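
    A hedged sketch of the underlying idea, flagging outliers from unusually large first-level wavelet detail coefficients, is given below using the PyWavelets package; the Haar wavelet, the robust scale estimate and the threshold are illustrative choices rather than the paper's exact χ² diagnostic.

```python
# Flag candidate outliers from large first-level wavelet detail coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(3)
x = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * rng.normal(size=512)
x[200] += 3.0                                   # inject an additive outlier

cA, cD = pywt.dwt(x, "haar")                    # first-level decomposition
sigma = np.median(np.abs(cD)) / 0.6745          # robust noise-scale estimate
suspects = np.where(np.abs(cD) > 4 * sigma)[0]  # unusually large detail coefficients
print("candidate outlier locations (approx.):", suspects * 2)   # map back to signal index
```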

  12. Chaotic time series; 2, system identification and prediction

    CERN Document Server

    Lillekjendlie, B

    1994-01-01

    This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series. Sampled data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multi layer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  13. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.
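
    A rough two-step sketch of an ARIMA(1,1,5) mean equation with an ARCH(1) variance equation is shown below, using statsmodels and the arch package on a simulated index; the two-step estimation and all settings are simplifying assumptions, not the study's fit to the actual PSEi data.

```python
# Two-step ARIMA mean equation + ARCH(1) variance equation on the residuals.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(4)
index = 7000 + np.cumsum(rng.normal(scale=30, size=1000))   # stand-in for index levels

mean_fit = ARIMA(index, order=(1, 1, 5)).fit()
resid = mean_fit.resid[1:]                                   # drop the differencing artifact

# In the arch package, GARCH with p=1, q=0 is an ARCH(1) variance equation.
vol_fit = arch_model(resid, mean="Zero", vol="GARCH", p=1, q=0).fit(disp="off")
print(vol_fit.summary())
```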

  14. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  15. DEM error retrieval by analyzing time series of differential interferograms

    OpenAIRE

    Bombrun, Lionel; Gay, Michel; Trouvé, Emmanuel; Vasile, Gabriel; Mars, Jerome,

    2009-01-01

    2-pass Differential Synthetic Aperture Radar Interferometry (D-InSAR) processing has been successfully used by the scientific community to derive velocity fields. Nevertheless, a precise Digital Elevation Model (DEM) is necessary to remove the topographic component from the interferograms. This letter presents a novel method to detect and retrieve DEM errors by analyzing time series of differential interferograms. The principle of the method is based on the comparison...

  16. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN, the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  17. On Clustering Time Series Using Euclidean Distance and Pearson Correlation

    OpenAIRE

    Berthold, Michael R.; Höppner, Frank

    2016-01-01

    For time series comparisons, it has often been observed that z-score normalized Euclidean distances far outperform the unnormalized variant. In this paper we show that a z-score normalized, squared Euclidean distance is, in fact, equal to a distance based on Pearson correlation. This has a profound impact on many distance-based classification or clustering methods. In addition to this theoretically sound result we also show that the often used k-Means algorithm formally needs a modification to...
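
    The stated identity can be checked numerically: for series z-normalized with the population standard deviation, the squared Euclidean distance equals 2n(1 - r), with r the Pearson correlation. The data below are illustrative.

```python
# Numerical check: squared Euclidean distance of z-normalized series vs 2n(1 - r).
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
y = 0.7 * x + rng.normal(scale=0.5, size=n)

zx = (x - x.mean()) / x.std()          # population (ddof=0) standard deviation
zy = (y - y.mean()) / y.std()

sq_euclid = np.sum((zx - zy) ** 2)
r = np.corrcoef(x, y)[0, 1]
print(sq_euclid, 2 * n * (1 - r))      # the two numbers agree up to float error
```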

  18. Analyzing the Dynamics of Nonlinear Multivariate Time Series Models

    Institute of Scientific and Technical Information of China (English)

    Denghua Zhong; Zhengfeng Zhang; Donghai Liu; Stefan Mittnik

    2004-01-01

    This paper analyzes the dynamics of nonlinear multivariate time series models as represented by generalized impulse response functions and asymmetric functions. We illustrate the measures of shock persistence and the asymmetric effects of shocks derived from the generalized impulse response functions and asymmetric functions in bivariate smooth transition regression models. The empirical work investigates a bivariate smooth transition model of US GDP and the unemployment rate.

  19. Time-series properties of state-level public expenditure.

    OpenAIRE

    Rajaraman, Indira; Mukhopadhyaya, Hiranya; Rao, Kavita R.

    2001-01-01

    Public expenditure reform must be underpinned by some understanding of the time-series properties of public expenditure. This paper examines the univariate properties of aggregate revenue expenditure at the level of State governments in India over the period 1974-98 for three states: Punjab, Haryana and Maharashtra. The empirical exercise is performed on the logarithmic transformation of aggregate revenue expenditure in terms of nominal (rather than ex post real) expenditure, not normalised t...

  20. Note---New Confidence Interval Estimators Using Standardized Time Series

    OpenAIRE

    David Goldsman; Lee Schruben

    1990-01-01

    We develop new asymptotically valid confidence interval estimators (CIE's) for the underlying mean of a stationary simulation process. The new estimators are weighted generalizations of Schruben's standardized time series area CIE. We show that the weighted CIE's have the same asymptotic expected length and variance of the length as the area CIE; but in the small sample environment, the new CIE's exhibit performance characteristics which are different from those of the area CIE.

  1. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
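
    A minimal sketch of a Gaussian-kernel autocorrelation estimator for irregularly sampled series, in the spirit of the kernel methods recommended above, is given below; the bandwidth, lags and synthetic data are illustrative assumptions rather than the benchmarked implementation.

```python
# Gaussian-kernel ACF estimate for irregular sampling: pairwise products are
# weighted by how close their time separation is to the requested lag.
import numpy as np

def kernel_acf(t, x, lags, h):
    x = (x - x.mean()) / x.std()
    dt = t[:, None] - t[None, :]                 # all pairwise time differences
    prod = x[:, None] * x[None, :]
    acf = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2) # Gaussian weights around the lag
        np.fill_diagonal(w, 0.0)                 # exclude zero-lag self pairs
        acf.append(np.sum(w * prod) / np.sum(w))
    return np.array(acf)

rng = np.random.default_rng(6)
t = np.sort(rng.uniform(0, 100, size=400))       # irregular sampling times
x = np.sin(2 * np.pi * t / 20) + 0.3 * rng.normal(size=400)
print(kernel_acf(t, x, lags=[1.0, 5.0, 10.0], h=1.0))
```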

  2. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract. Background: Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed-effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results: The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been, or had not been, preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values ... Conclusion: We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures

  3. Deconvolution of mixing time series on a graph

    CERN Document Server

    Blocker, Alexander W

    2011-01-01

    In many applications we are interested in making inference on latent time series from indirect measurements, which are often low-dimensional projections resulting from mixing or aggregation. Positron emission tomography, super-resolution, and network traffic monitoring are some examples. Inference in such settings requires solving a sequence of ill-posed inverse problems, y_t = A x_t, where the projection mechanism provides information on A. We consider problems in which A specifies mixing on a graph of time series that are bursty and sparse. We develop a multilevel state-space model for mixing time series and an efficient approach to inference. A simple model is used to calibrate regularization parameters that lead to efficient inference in the multilevel state-space model. We apply this method to the problem of estimating point-to-point traffic flows on a network from aggregate measurements. Our solution outperforms existing methods for this problem, and our two-stage approach suggests an efficient inferen...

  4. Toward automatic time-series forecasting using neural networks.

    Science.gov (United States)

    Yan, Weizhong

    2012-07-01

    Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc; this does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.
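
    A minimal GRNN sketch is given below: predictions are kernel-weighted averages of training targets, with the bandwidth sigma as the single design parameter highlighted in the abstract. The lag embedding, data and sigma value are illustrative assumptions, not the NN3 competition model.

```python
# Minimal generalized regression neural network (GRNN): a Gaussian-kernel
# weighted average of training targets.
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    # squared Euclidean distances between query and training points
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

# one-step-ahead forecasting with lagged values as inputs
rng = np.random.default_rng(7)
series = np.sin(np.linspace(0, 20, 300)) + 0.1 * rng.normal(size=300)
lags = 3
X = np.column_stack([series[i:-(lags - i)] for i in range(lags)])
y = series[lags:]
pred = grnn_predict(X[:-50], y[:-50], X[-50:], sigma=0.3)
print(np.mean((pred - y[-50:]) ** 2))           # out-of-sample MSE
```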

  5. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. A running example based on online job search behavior is used throughout the paper to illustrate these methods, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  6. Fractal Characteristic of Rock Cutting Load Time Series

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available A test-bed was developed to perform rock cutting experiments under different cutting conditions. Fractal theory was adopted to investigate the fractal characteristics of the cutting load time series and of the fragment size distribution in rock cutting. The box-counting dimension for the cutting load time series was consistent with the fractal dimension of the corresponding fragment size distribution, which indicated that there are inherent relations between rock fragmentation and the cutting load. Furthermore, the box-counting dimension was used to describe the fractal characteristic of the cutting load time series under different conditions. The results show that the rock compressive strength, cutting depth, cutting angle, and assisted water-jet types all have no significant effect on the fractal characteristic of the cutting load. The box-counting dimension can be used as an evaluation index to assess the extent of rock crushing or cutting. The rock fracture mechanism would not be changed by a water-jet in front of or behind the cutter, but it would be changed when the water-jet was in the cutter.
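
    A crude sketch of a box-counting dimension estimate for a sampled load-time curve is shown below; the series is scaled into the unit square and covered with grids of shrinking box size, and the dimension is the slope of log N(eps) versus log(1/eps). The synthetic test signal and grid levels are assumptions, and counting only sampled points gives an approximation of the graph's true dimension.

```python
# Crude box-counting dimension estimate for a sampled time series graph.
import numpy as np

def box_counting_dimension(x, max_level=7):
    t = np.linspace(0.0, 1.0, len(x))
    y = (x - x.min()) / (x.max() - x.min())        # normalise into [0, 1]
    sizes, counts = [], []
    for level in range(1, max_level + 1):
        eps = 1.0 / 2 ** level
        cells = {(int(ti / eps), int(yi / eps)) for ti, yi in zip(t, y)}
        sizes.append(eps)
        counts.append(len(cells))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(8)
load = np.cumsum(rng.normal(size=4096))            # rough Brownian-like test signal
print(box_counting_dimension(load))                # theory gives ~1.5 for Brownian graphs
```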

  7. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  8. LS-SVR and AGO Based Time Series Prediction Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shou-peng; LIU Shan; CHAI Wang-xu; ZHANG Jia-qi; GUO Yang-ming

    2016-01-01

    Recently, fault or health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for complex systems, and the time series properties often have to be incorporated for the prediction in practice. Currently, LS-SVR is widely adopted for prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, an accumulated generating operation (AGO) is carried out to improve the data quality and regularity of the raw time series data based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in improving the accuracy of LS-SVR prediction, a modified Gaussian radial basis function (RBF) is proposed. The requirements of distance-function-based kernel functions are satisfied, which ensures fast damping in the vicinity of the test point and moderate damping at infinity. The presented model is applied to the analysis of benchmarks. As indicated by the results, the proposed method is an effective prediction method with good precision.
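
    The grey-system preprocessing described above can be sketched very simply: the accumulated generating operation (AGO) is a cumulative sum, and the inverse operation (IAGO) is a first difference that maps predictions back to the original scale. In the sketch below the LS-SVR predictor is replaced by a trivial linear trend, purely to show where AGO and IAGO sit in the pipeline; the data are illustrative.

```python
# AGO/IAGO preprocessing around a stand-in predictor.
import numpy as np

raw = np.array([4.1, 4.6, 4.3, 5.0, 5.4, 5.1, 5.9, 6.2])   # illustrative data
ago = np.cumsum(raw)                    # 1-AGO sequence (monotone, smoother)

# stand-in predictor on the AGO scale: extrapolate a straight-line trend
t = np.arange(len(ago))
coef = np.polyfit(t, ago, 1)
ago_forecast = np.polyval(coef, np.arange(len(ago), len(ago) + 3))

# IAGO: difference the extended AGO sequence to recover original-scale values
extended = np.concatenate([ago, ago_forecast])
forecast = np.diff(extended)[-3:]
print(forecast)
```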

  9. Learning restricted Boolean network model by time-series data.

    Science.gov (United States)

    Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.

  10. Genetic programming and serial processing for time series classification.

    Science.gov (United States)

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested on three different problems. Two of them are real-world problems whose data were gathered for online or conference competitions. As there are published results for these two problems, this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  11. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and are therefore accessible to a wide range of researchers.
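
    A hedged sketch of the continuous-variable part of this idea is given below: one Gaussian HMM per trajectory (hmmlearn), a symmetrised log-likelihood-based distance matrix, and hierarchical clustering with scipy. The distance definition and all settings are illustrative choices, and the handling of categorical variables described in the paper is not reproduced here.

```python
# Cluster trajectories by fitting one HMM per series and clustering a
# log-likelihood-based distance matrix.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(9)
# toy data set: 10 univariate trajectories of length 80, two regimes
trajs = [np.cumsum(rng.normal(loc=(0.2 if i < 5 else -0.2), size=(80, 1)), axis=0)
         for i in range(10)]

models = []
for X in trajs:
    m = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50, random_state=0)
    m.fit(X)
    models.append(m)

n = len(trajs)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # symmetrised per-sample log-likelihood gap between the two models
        d = 0.5 * ((models[i].score(trajs[i]) - models[j].score(trajs[i])) / len(trajs[i])
                   + (models[j].score(trajs[j]) - models[i].score(trajs[j])) / len(trajs[j]))
        D[i, j] = D[j, i] = max(d, 0.0)

labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
print(labels)
```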

  12. Learning restricted Boolean network model by time-series data

    Science.gov (United States)

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data. PMID:25093019

  13. Complexity analysis of the UV radiation dose time series

    CERN Document Server

    Mihailovic, Dragutin T

    2013-01-01

    We have used the Lempel-Ziv and sample entropy measures to assess the complexity of the UV radiation activity in the Vojvodina region (Serbia) for the period 1990-2007. In particular, we have examined the reconstructed daily sum (dose) of the UV-B time series from seven representative places in this region and calculated the Lempel-Ziv Complexity (LZC) and Sample Entropy (SE) values for each time series. The results indicate that the LZC values in some places are close to each other while in others they differ. We have divided the period 1990-2007 into two subintervals: (a) 1990-1998 and (b) 1999-2007 and calculated LZC and SE values for the various time series in these subintervals. It is found that during the period 1999-2007, there is a decrease in their complexities, and corresponding changes in the SE, in comparison to the period 1990-1998. This complexity loss may be attributed to increased (i) human intervention in the post civil war period (land and crop use and urbanization) and military activities i...
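
    The Lempel-Ziv complexity used above can be sketched as follows: a daily dose series is binarised around its median and the classic Kaspar-Schuster counting scheme is applied. The gamma-distributed test series is a placeholder for the reconstructed UV-B doses, and the normalisation shown is one common convention.

```python
# Lempel-Ziv complexity (LZ76) of a median-binarised series.
import numpy as np

def lempel_ziv_complexity(bits):
    """Number of distinct patterns in a binary sequence (Lempel-Ziv 1976)."""
    s = "".join(map(str, bits))
    n = len(s)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

rng = np.random.default_rng(10)
uv_dose = rng.gamma(shape=2.0, scale=1.5, size=6574)       # ~18 years of daily values
bits = (uv_dose > np.median(uv_dose)).astype(int)
lzc = lempel_ziv_complexity(bits)
print(lzc, lzc / (len(bits) / np.log2(len(bits))))          # raw and normalised LZC
```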

  14. Hydroxyl time series and recirculation in turbulent nonpremixed swirling flames

    Energy Technology Data Exchange (ETDEWEB)

    Guttenfelder, Walter A.; Laurendeau, Normand M.; Ji, Jun; King, Galen B.; Gore, Jay P. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-1288 (United States); Renfro, Michael W. [Department of Mechanical Engineering, University of Connecticut, Storrs, CT 06269-3139 (United States)

    2006-10-15

    Time-series measurements of OH, as related to accompanying flow structures, are reported using picosecond time-resolved laser-induced fluorescence (PITLIF) and particle-imaging velocimetry (PIV) for turbulent, swirling, nonpremixed methane-air flames. The [OH] data portray a primary reaction zone surrounding the internal recirculation zone, with residual OH in the recirculation zone approaching chemical equilibrium. Modeling of the OH electronic quenching environment, when compared to fluorescence lifetime measurements, offers additional evidence that the reaction zone burns as a partially premixed flame. A time-series analysis affirms the presence of thin flamelet-like regions based on the relation between swirl-induced turbulence and fluctuations of [OH] in the reaction and recirculation zones. The OH integral time-scales are found to correspond qualitatively to local mean velocities. Furthermore, quantitative dependencies can be established with respect to axial position, Reynolds number, and global equivalence ratio. Given these relationships, the OH time-scales, and thus the primary reaction zone, appear to be dominated by convection-driven fluctuations. Surprisingly, the OH time-scales for these nominally swirling flames demonstrate significant similarities to previous PITLIF results in nonpremixed jet flames. (author)

  15. Estimating the Lyapunov spectrum of time delay feedback systems from scalar time series.

    Science.gov (United States)

    Hegger, R

    1999-08-01

    On the basis of a recently developed method for modeling time delay systems, we propose a procedure to estimate the spectrum of Lyapunov exponents from a scalar time series. It turns out that the spectrum is approximated very well and allows for good estimates of the Lyapunov dimension even if the sampling rate of the time series is so low that the infinite dimensional tangent space is spanned quite sparsely.

  16. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  17. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use the generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to nonparametric tests aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...

  18. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.

  19. Linear and nonlinear dynamic systems in financial time series prediction

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2012-10-01

    Full Text Available Autoregressive moving average (ARMA) processes and dynamic neural networks, namely the nonlinear autoregressive moving average with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series; for instance the S&P500 returns. Two classes of ARMA are considered. The first one is the standard ARMA model, which is a linear static system. The second one uses a Kalman filter (KF) to estimate and predict the ARMA coefficients. This model is a linear dynamic system. The forecasting ability of each system is evaluated by means of mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone. Thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component found in the S&P500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P500 time series prediction.

  20. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of land use change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  1. Nonlinear time-series-based adaptive control applications

    Science.gov (United States)

    Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.

    1991-01-01

    A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.

  2. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks [Scargle 1998] - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
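
    For a hedged illustration, the Bayesian Blocks implementation shipped with astropy (astropy.stats.bayesian_blocks), which follows this line of work, can be applied to point measurements with known errors as sketched below; the simulated step-like series and all settings are illustrative.

```python
# Optimal segmentation of noisy point measurements with Bayesian Blocks.
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(11)
t = np.sort(rng.uniform(0, 100, 500))
signal = np.where((t > 40) & (t < 60), 2.0, 0.0)      # a step-like variation
x = signal + rng.normal(scale=0.5, size=t.size)

# 'measures' fitness is the point-measurement mode; sigma is the known error
edges = bayesian_blocks(t, x, sigma=0.5, fitness="measures")
print(edges)                                          # optimal change-point locations
```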

  3. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  4. FAULT IDENTIFICATION IN HETEROGENEOUS NETWORKS USING TIME SERIES ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    孙钦东; 张德运; 孙朝晖

    2004-01-01

    Fault management is crucial to provide quality of service guarantees for future networks, and fault identification is an essential part of it. A novel fault identification algorithm is proposed in this paper, which focuses on the anomaly detection of network traffic. Since the fault identification is achieved using statistical information in the management information base, the algorithm is compatible with the existing simple network management protocol framework. The network traffic time series is verified to be non-stationary. By fitting the adaptive autoregressive model, the series is transformed into a multidimensional vector. The training samples and identifiers are acquired from a network simulation. A k-nearest neighbor classifier identifies the system faults after being trained. The experiment results are consistent with the given fault scenarios, which proves the accuracy of the algorithm. The identification errors are discussed to illustrate that the novel fault identification algorithm is adaptive in fault scenarios with network traffic change.

  5. Research on time series mining based on shape concept time warping

    Institute of Scientific and Technical Information of China (English)

    翁颖钧; 朱仲英

    2004-01-01

    Time series is an important kind of complex data, and growing attention has recently been paid to mining time series knowledge. Typically, the Euclidean distance measure is used for comparing time series. However, it may be a brittle distance measure because of its limited robustness. Dynamic time warping is a pattern matching algorithm based on a nonlinear dynamic programming technique; however, it is computationally expensive and suffers from local shape variance. A modified algorithm, named shape DTW, is presented, which uses the linguistic variable concept to describe the slope feature of a time series. The concept tree is developed using cloud models theory, which integrates randomness and the probability of uncertainty, so that it allows conversion between qualitative and quantitative knowledge. Experiments on cluster analysis on the basis of this algorithm, compared with the Euclidean measure, are implemented on synthetic control chart time series. The results show that this method has strong robustness to the loss of feature data due to piecewise segment preprocessing. Moreover, after the construction of the shape concept tree, we can discover knowledge of time series at different time granularities.
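
    For reference, the baseline dynamic time warping distance that the shape DTW modification builds on can be computed by the standard dynamic program sketched below; the two short test series are illustrative.

```python
# Baseline DTW distance by dynamic programming.
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # best of insertion, deletion, match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.sin(np.linspace(0, 2 * np.pi, 70) - 0.3)   # same shape, shifted and resampled
print(dtw_distance(a, b))                         # small despite the misalignment
```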

  6. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being the synthetic rubber derived from petroleum. Similarly to what happens with countless other products, the forecast of future prices of natural rubber has been the object of many studies. The use of univariate time-series forecasting models stands out as the more accurate and useful way to reduce the uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg) in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The studied models were the ones belonging to the ARIMA family. The main results were: the domestic market of natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA (1,1,1) model provided the best adjustment of the time series of prices of natural rubber (R$/kg); the prognosis accomplished for the series supplied statistically adequate fittings.
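
    A minimal sketch of fitting an ARIMA(1,1,1) to a monthly price series and producing a 12-month-ahead forecast with statsmodels is given below; the simulated prices merely stand in for the R$/kg natural rubber series used in the study.

```python
# ARIMA(1,1,1) fit and 12-month-ahead forecast on a simulated monthly series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(12)
dates = pd.date_range("1999-01-01", periods=90, freq="MS")          # Jan/99 - Jun/2006
prices = pd.Series(2.0 + np.cumsum(rng.normal(scale=0.05, size=90)), index=dates)

res = ARIMA(prices, order=(1, 1, 1)).fit()
forecast = res.forecast(steps=12)                                    # Jul/2006 - Jun/2007
print(res.aic)
print(forecast.head())
```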

  7. Characterizing weak chaos using time series of Lyapunov exponents.

    Science.gov (United States)

    da Silva, R M; Manchein, C; Beims, M W; Altmann, E G

    2015-06-01

    We investigate chaos in mixed-phase-space Hamiltonian systems using time series of the finite-time Lyapunov exponents. The methodology we propose uses the number of Lyapunov exponents close to zero to define regimes of ordered (stickiness), semiordered (or semichaotic), and strongly chaotic motion. The dynamics is then investigated looking at the consecutive time spent in each regime, the transition between different regimes, and the regions in the phase space associated to them. Applying our methodology to a chain of coupled standard maps we obtain (i) that it allows for an improved numerical characterization of stickiness in high-dimensional Hamiltonian systems, when compared to the previous analyses based on the distribution of recurrence times; (ii) that the transition probabilities between different regimes are determined by the phase-space volume associated to the corresponding regions; and (iii) the dependence of the Lyapunov exponents with the coupling strength.

  8. Removing atmosphere loading effect from GPS time series

    Science.gov (United States)

    Tiampo, K. F.; Samadi Alinia, H.; Samsonov, S. V.; Gonzalez, P. J.

    2015-12-01

    The GPS time series of site position are contaminated by various sources of noise; in particular, the ionospheric and tropospheric path delays are significant [Gray et al., 2000; Meyer et al., 2006]. The GPS path delay in the ionosphere is largely dependent on the wave frequency, whereas the delay in the troposphere is dependent on the length of the travel path and therefore on site elevation. The various approaches available for compensating the ionospheric path delay cannot be used for removal of the tropospheric component. Quantifying the tropospheric delay plays an important role in determining the precision of the vertical GPS component, as tropospheric parameters over a large distance have very little correlation with each other. Several methods have been proposed for eliminating the tropospheric signal from GPS vertical time series. Here we utilize surface temperature fluctuations and seasonal variations in water vapour and air pressure data for various spatial and temporal profiles in order to more accurately remove the atmospheric path delay [Samsonov et al., 2014]. In this paper, we model the atmospheric path delay of the vertical position time series by analyzing the signal in the frequency domain and study its dependence on topography in eastern Ontario for the period from January 2008 to December 2012. The systematic dependence of the atmospheric path delay amplitude on height, together with its temporal variations, supports the development of a new, physics-based model relating tropospheric/atmospheric effects to topography, which can help in determining the most accurate GPS positions.

  9. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  10. Improvement in global forecast for chaotic time series

    Science.gov (United States)

    Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2016-10-01

    In the Polynomial Global Approach to Time Series Analysis, the most costly (computationally speaking) step is finding the fitting polynomial. Here we present two routines that improve the forecasting. In the first, an algorithm that greatly improves this situation is introduced and implemented. The heart of this procedure is a specific routine which performs the mapping with great efficiency. In comparison with the similar procedure of the TimeS package developed by Carli et al. (2014), an enormous gain in efficiency and an increase in accuracy are obtained. Another development in this work is the establishment of a level of confidence in the global prediction, with a statistical test for evaluating whether the minimization performed is suitable or not. The other program presented in this article applies the Shapiro-Wilk test for checking the normality of the distribution of errors and calculates the expected deviation. The development is applied to observed and simulated time series to illustrate the performance obtained.

  11. Estimation of coupling between time-delay systems from time series.

    Science.gov (United States)

    Prokhorov, M D; Ponomarenko, V I

    2005-07-01

    We propose a method for estimation of coupling between the systems governed by scalar time-delay differential equations of the Mackey-Glass type from the observed time series data. The method allows one to detect the presence of certain types of linear coupling between two time-delay systems, to define the type, strength, and direction of coupling, and to recover the model equations of coupled time-delay systems from chaotic time series corrupted by noise. We verify our method using both numerical and experimental data.

  12. A Tool to Recover Scalar Time-Delay Systems from Experimental Time Series

    CERN Document Server

    Bünner, M J; Meyer, T; Kittel, A; Parisi, J; Meyer, Th.

    1996-01-01

    We propose a method that is able to analyze chaotic time series gained from experimental data. The method allows one to identify scalar time-delay systems. If the dynamics of the system under investigation is governed by a scalar time-delay differential equation of the form $dy(t)/dt = h(y(t),y(t-\tau_0))$, the delay time $\tau_0$ and the function $h$ can be recovered. There are no restrictions on the dimensionality of the chaotic attractor. The method turns out to be insensitive to noise. We successfully apply the method to various time series taken from a computer experiment and two different electronic oscillators.

  13. Mapping Brazilian savanna vegetation gradients with Landsat time series

    Science.gov (United States)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado, which has an extent of around 2 million km² and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. We therefore explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, although inaccuracies occurred, especially between similar classes and in data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important

  14. Characterizability of metabolic pathway systems from time series data.

    Science.gov (United States)

    Voit, Eberhard O

    2013-12-01

    Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE.

  15. Assemblage time series reveal biodiversity change but not systematic loss.

    Science.gov (United States)

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  16. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks.

    Science.gov (United States)

    Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli

    2016-01-01

    In recent years, financial market dynamics forecasting has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combines Elman recurrent neural networks with a stochastic time effective function. By analyzing the proposed model with linear regression, complexity invariant distance (CID), and multiscale CID (MCID) analysis methods, and comparing the model with different models such as the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, empirical research is performed to test the predictive effects on SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values of the stock market indices.

  17. VARTOOLS: A Program for Analyzing Astronomical Time-Series Data

    CERN Document Server

    Hartman, Joel D

    2016-01-01

    This paper describes the VARTOOLS program, which is an open-source command-line utility, written in C, for analyzing astronomical time-series data, especially light curves. The program provides a general-purpose set of tools for processing light curves including signal identification, filtering, light curve manipulation, time conversions, and modeling and simulating light curves. Some of the routines implemented include the Generalized Lomb-Scargle periodogram, the Box-Least Squares transit search routine, the Analysis of Variance periodogram, the Discrete Fourier Transform including the CLEAN algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and non-linear optimization of analytic functions including support for Markov Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or simulating time-correlated noise, and the TFA and SYSREM filtering algorithms, among others. A mechanism is also provided for incorporating a user's own compiled processing routines into th...

  18. Scaling in Non-stationary time series I

    CERN Document Server

    Ignaccolo, M; Grigolini, P; Hamilton, P; West, B J

    2003-01-01

    Most data processing techniques, applied to biomedical and sociological time series, are only valid for random fluctuations that are stationary in time. Unfortunately, these data are often non-stationary, and the use of techniques of analysis resting on the stationarity assumption can produce wrong information on the scaling, and thus on the complexity, of the process under study. Herein, we test and compare two techniques for removing the non-stationary influences from computer-generated time series consisting of the superposition of a slow signal and a random fluctuation. The former is based on the method of wavelet decomposition, and the latter is a proposal of this paper, denoted by us as the step detrending technique. We focus our attention on two cases, when the slow signal is a periodic function mimicking the influence of seasons, and when it is an aperiodic signal mimicking the influence of a population change (increase or decrease). For the purpose of computational simplicity the random fluctuation is taken...

  19. Artificial neural networks applied to forecasting time series.

    Science.gov (United States)

    Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar

    2011-04-01

    This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved to be useful in time series forecasting, and also a standard procedure for the practical application of ANN in this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by the four neural network models analyzed is less than 10%. In accordance with the interpretation criteria of this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model is the one with the worst performance. Finally, we analyze the advantages and limitations of ANN, the possible solutions to these limitations, and provide an orientation towards future research.

  20. GPS time series at Campi Flegrei caldera (2000-2013

    Directory of Open Access Journals (Sweden)

    Prospero De Martino

    2014-05-01

    Full Text Available The Campi Flegrei caldera is an active volcanic system associated with a high volcanic risk, and represents a well known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods followed by subsidence phases with some episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation of the area. Seven main uplift episodes were detected, and during each uplift period the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurred. The complete archive of GPS time series of the Campi Flegrei area is reported in the Supplementary materials. These data can be useful to the scientific community for improving research on Campi Flegrei caldera dynamics and hazard assessment.

  1. Time series prediction of mining subsidence based on a SVM

    Institute of Scientific and Technical Information of China (English)

    Li Peixian; Tan Zhixiang; Yah Lili; Deng Kazhong

    2011-01-01

    In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time-series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data and subjected to tests of stationarity, zero mean and normality. Then the data were used to train the SVM model. A time series model was established to predict mining subsidence by rational choice of the embedding dimension and the SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy of the model and its generalization performance. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigate the dynamics of surface movements.
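
    The prediction scheme described in this record (embed the subsidence series as lagged input vectors and train a support vector model on them) can be sketched with scikit-learn's SVR as below. The embedding dimension, kernel and SVM parameters are illustrative assumptions rather than the values used in the study, and the synthetic series merely stands in for the observation-station data.

```python
import numpy as np
from sklearn.svm import SVR

def embed(series, dim):
    """Turn a 1-D series into (lagged inputs, next value) training pairs."""
    X = np.array([series[i:i + dim] for i in range(len(series) - dim)])
    return X, series[dim:]

# Synthetic subsidence-like series (smooth settling trend plus noise)
t = np.linspace(0, 10, 200)
series = -50 * (1 - np.exp(-0.3 * t)) + np.random.default_rng(1).normal(0, 0.5, t.size)

dim = 5                                   # illustrative embedding dimension
X, y = embed(series, dim)
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[:-20], y[:-20])

pred = model.predict(X[-20:])             # one-step-ahead predictions on held-out points
mape = np.mean(np.abs((y[-20:] - pred) / y[-20:])) * 100
print(f"MAPE on held-out points: {mape:.2f}%")
```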

  2. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, but ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
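
    The pole interpretation described here can be illustrated numerically: fit an AR model to a damped sinusoid and read the pole magnitudes and angles off the roots of the AR characteristic polynomial. The sketch below is a hedged illustration using statsmodels' AutoReg; the lag order and test signal are arbitrary choices, not taken from the record.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

# Synthetic series: one damped sinusoid (a single complex-harmonic component) plus noise
rng = np.random.default_rng(2)
t = np.arange(500)
x = np.exp(-0.005 * t) * np.sin(2 * np.pi * 0.05 * t) + 0.1 * rng.normal(size=t.size)

fit = AutoReg(x, lags=4).fit()            # AR(4) fit; the lag order is an arbitrary choice
const, ar_coefs = fit.params[0], fit.params[1:]

# Poles: roots of z^p - a1*z^(p-1) - ... - ap, i.e. the z-plane pole configuration
poles = np.roots(np.r_[1.0, -ar_coefs])
print("pole magnitudes:", np.abs(poles))
print("pole frequencies (cycles/sample):", np.angle(poles) / (2 * np.pi))
```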

  3. Reconstruction of Ordinary Differential Equations From Time Series Data

    CERN Document Server

    Mai, Manuel; O'Hern, Corey S

    2016-01-01

    We develop a numerical method to reconstruct systems of ordinary differential equations (ODEs) from time series data without {\it a priori} knowledge of the underlying ODEs using sparse basis learning and sparse function reconstruction. We show that employing sparse representations provides more accurate ODE reconstruction compared to least-squares reconstruction techniques for a given amount of time series data. We test and validate the ODE reconstruction method on known 1D, 2D, and 3D systems of ODEs. The 1D system possesses two stable fixed points; the 2D system possesses an oscillatory fixed point with closed orbits; and the 3D system displays chaotic dynamics on a strange attractor. We determine the amount of data required to achieve an error in the reconstructed functions of less than $0.1\%$. For the reconstructed 1D and 2D systems, we are able to match the trajectories from the original ODEs even at long times. For the 3D system with chaotic dynamics, as expected, the trajectories from the original an...

  4. Monthly hail time series analysis related to agricultural insurance

    Science.gov (United States)

    Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.

    2010-05-01

    Hail is one of the most important crop insurance risks in Spain, accounting for more than 50% of the total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of days of annual hail (from 1981 to 2007). At the same time, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) in order to perform an analysis of monthly time series of the number of hail days (HD). The results of the study show that the relation between the agricultural insurance ratios and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether it is possible to determine a change in trend in the HD time series.

  5. Kernel canonical-correlation Granger causality for multiple time series

    Science.gov (United States)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  6. Time series analysis using semiparametric regression on oil palm production

    Science.gov (United States)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

    This paper presents a semiparametric kernel regression method, which has shown its flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis. First, we assume that the parameters exist; then we combine this with nonparametric estimation, which is why the approach is called semiparametric. The optimum bandwidth is selected by considering an approximation of the Mean Integrated Squared Error (MISE).

  7. Signatures of discrete scale invariance in Dst time series

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Anastasiadis, Anastasios; Athanasopoulou, Labrini; Eftaxias, Konstantinos

    2011-07-01

    Self-similar systems are characterized by continuous scale invariance and, as a consequence, by the existence of power laws. However, a significant number of systems exhibit discrete scale invariance (DSI), which in turn leads to log-periodic corrections to scaling that decorate the pure power law. Here, we present the results of a search for log-periodic corrections to scaling in the squares of Dst index increments, which are taken as proxies of the energy dissipation rate in the magnetosphere. We show that Dst time series exhibit DSI and discuss the consequences of this feature, as well as the possible implications of Dst DSI for space weather forecasting efforts.

  8. Nonlinear analysis and prediction of time series in multiphase reactors

    CERN Document Server

    Liu, Mingyan

    2014-01-01

    This book reports on important nonlinear aspects or deterministic chaos issues in the systems of multi-phase reactors. The reactors treated in the book include gas-liquid bubble columns, gas-liquid-solid fluidized beds and gas-liquid-solid magnetized fluidized beds. The authors take pressure fluctuations in the bubble columns  as time series for nonlinear analysis, modeling and forecasting. They present qualitative and quantitative non-linear analysis tools which include attractor phase plane plot, correlation dimension, Kolmogorov entropy and largest Lyapunov exponent calculations and local non-linear short-term prediction.

  9. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....

  10. Nonlinear Time Series Forecast Using Radial Basis Function Neural Networks

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xin; CHEN Tian-Lun

    2003-01-01

    In the research of using Radial Basis Function Neural Networks (RBF NN) for forecasting nonlinear time series, we investigate how different clusterings affect the processes of learning and forecasting. We find that k-means clustering is very suitable. In order to increase the precision, we introduce a nonlinear feedback term to escape from local minima of the energy; we then use the model to forecast the nonlinear time series produced by the Mackey-Glass equation and by stocks. By selecting k-means clustering and a suitable feedback term, much better forecasting results are obtained.
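
    A plain RBF-network forecaster of the kind described here (k-means to place the radial basis centres, Gaussian activations, and a linear output layer) can be sketched as below. This is a hedged illustration: the feedback term proposed in the record is not implemented, a logistic-map series stands in for Mackey-Glass data, and the embedding dimension, number of centres and kernel width are arbitrary assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

def embed(series, dim):
    X = np.array([series[i:i + dim] for i in range(len(series) - dim)])
    return X, series[dim:]

def rbf_features(X, centers, width):
    # Gaussian activation of each sample with respect to each centre
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Simple nonlinear series (logistic map) as a stand-in for Mackey-Glass data
x = np.empty(600); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 3.8 * x[i - 1] * (1.0 - x[i - 1])

X, y = embed(x, dim=3)
km = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X)   # k-means picks the RBF centres
Phi = rbf_features(X, km.cluster_centers_, width=0.2)
out_layer = Ridge(alpha=1e-6).fit(Phi[:-50], y[:-50])          # linear output-layer weights

pred = out_layer.predict(rbf_features(X[-50:], km.cluster_centers_, width=0.2))
print("RMSE on the last 50 points:", np.sqrt(np.mean((pred - y[-50:]) ** 2)))
```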

  11. Nonlinear Time Series Prediction Using Chaotic Neural Networks

    Institute of Scientific and Technical Information of China (English)

    LI KePing; CHEN TianLun

    2001-01-01

    A nonlinear feedback term is introduced into the weight-update equation of the backpropagation algorithm for a neural network, so that the network becomes a chaotic one. In order to investigate how different feedback terms affect the processes of learning and forecasting, we use the model to forecast the nonlinear time series produced by the Mackey-Glass equation. By selecting a suitable feedback term, the system can escape from local minima and converge to the global minimum or its approximate solutions, and the forecasting results are better than those of the backpropagation algorithm.

  12. Ensemble Deep Learning for Biomedical Time Series Classification

    Directory of Open Access Journals (Sweden)

    Lin-peng Jin

    2016-01-01

    Full Text Available Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  13. Ensemble Deep Learning for Biomedical Time Series Classification.

    Science.gov (United States)

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  14. Finding recurrence networks' threshold adaptively for a specific time series

    Science.gov (United States)

    Eroglu, D.; Marwan, N.; Prasad, S.; Kurths, J.

    2014-11-01

    Recurrence-plot-based recurrence networks are an approach to analyzing time series using complex network theory. In both approaches, recurrence plots and recurrence networks, a threshold to identify recurrent states is required. The selection of the threshold is important in order to avoid bias in the recurrence network results. In this paper, we propose a novel method to choose a recurrence threshold adaptively. We show a comparison between the constant-threshold and adaptive-threshold cases in studying period-chaos and even period-period transitions in the dynamics of a prototypical model system. This novel method is then used to identify climate transitions from a lake sediment record.

  15. Phase space reconstruction using input-output time series data

    Science.gov (United States)

    Walker, David M.; Tufillaro, Nicholas B.

    1999-10-01

    In this paper we suggest that an extension of a procedure recently proposed by Wayland et al. [Phys. Rev. Lett. 70, 580 (1993)] for recognizing determinism in an autonomous time series can also be used as a diagnostic for determining an appropriate embedding dimension for driven (``input-output'') systems. We compare the results of this extension to the results produced by the extensions to the method of false nearest neighbors put forward by Rhodes and Morari [Proceedings of the American Control Conference, Seattle, edited by The American Automatic Control Council (IEEE, Piscataway, 1995)] and the method of averaged false nearest neighbors by Cao et al. [Int. J. Bifurcation Chaos 8, 1491 (1998)].
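
    The embedding-dimension diagnostics compared in this record build on the false-nearest-neighbours idea. The sketch below shows only the standard autonomous-series version of that test, not the input-output extensions discussed in the paper; the threshold, delay and test signal are arbitrary assumptions.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Rows are delay vectors [x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def false_nearest_fraction(x, dim, tau, rtol=15.0):
    """Fraction of nearest neighbours in dimension dim that separate in dim + 1."""
    emb = delay_embed(x, dim, tau)
    emb_next = delay_embed(x, dim + 1, tau)
    m = len(emb_next)              # keep only points that also exist in the larger embedding
    emb = emb[:m]
    false = 0
    for i in range(m):
        d = np.linalg.norm(emb - emb[i], axis=1)
        d[i] = np.inf
        j = int(np.argmin(d))      # nearest neighbour in dimension dim
        extra = abs(emb_next[i, -1] - emb_next[j, -1])
        if d[j] > 0 and extra / d[j] > rtol:
            false += 1
    return false / m

# Noisy sine as a simple test signal (the paper's driven systems are not reproduced here)
t = np.linspace(0, 40 * np.pi, 2000)
x = np.sin(t) + 0.01 * np.random.default_rng(3).normal(size=t.size)
for dim in (1, 2, 3):
    print(dim, false_nearest_fraction(x, dim, tau=10))
```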

  16. Ensemble Deep Learning for Biomedical Time Series Classification

    Science.gov (United States)

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  17. A Suspicious Action Detection System Considering Time Series

    Science.gov (United States)

    Kozuka, Noriaki; Kimura, Koji; Hagiwara, Masafumi

    The paper proposes a new system, based on image processing, that can detect suspicious actions, such as a car break-in and its surroundings, in an open-space parking lot. The proposed system focuses on three aspects of human actions: “order”, “time”, and “location”. The proposed system has the following features: it 1) deals with time series data flow, 2) estimates human actions and their location, 3) extracts suspicious action detection rules automatically, and 4) detects suspicious actions using a suspicion score. We carried out experiments using real image sequences. As a result, we obtained an estimation rate about 7.8% higher than that of the conventional system.

  18. A series expansion for the time autocorrelation of dynamical variables

    CERN Document Server

    Maiocchi, A M; Giorgilli, A

    2011-01-01

    We present here a general iterative formula which gives a (formal) series expansion for the time autocorrelation of smooth dynamical variables, for all Hamiltonian systems endowed with an invariant measure. We add some criteria, theoretical in nature, which enable one to decide whether the decay of the correlations is exponentially fast or not. One of these criteria is implemented numerically for the case of the Fermi-Pasta-Ulam system, and we find indications which might suggest a sub-exponential decay for such a system.

  19. Disease management with ARIMA model in time series.

    Science.gov (United States)

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of time series analysis. In this study, we expect to measure the results of prevention and intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly because of the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.

  20. Estimation of dynamic flux profiles from metabolic time series data

    Directory of Open Access Journals (Sweden)

    Chou I-Chun

    2012-07-01

    Full Text Available Abstract Background Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Special among them is the recent method of dynamic flux estimation (DFE, which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes and detailed kinetic information must be available, in addition to the time series data. Results The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format of

  1. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    1981-01-01

    This paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with a simple transfer model of the ARMA-type (autoregressive moving average) applied to a storm sewer system. The output of the simulation is the frequency distribution of the peak flow, overflow volume etc. from the overflow or the retention storage. The parameters in the transfer model are found either from rainfall/runoff measurements in the catchment or from one or more simulations with an advanced hydraulic computer model.

  2. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    The paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with a simple transfer model of the ARMA-type (autoregressive moving average model) as the model of the storm sewer system. The output of the simulation is the frequency distribution of the peak flow, overflow volume etc. from the overflow or retention storage. The parameters in the transfer model are found either from rainfall/runoff measurements in the catchment or from one or a few simulations with an advanced hydraulic computer model.

  3. Almost Periodically Correlated Time Series in Business Fluctuations Analysis

    CERN Document Server

    Lenart, Lukasz

    2012-01-01

    We propose a non-standard subsampling procedure to make formal statistical inference about the business cycle, one of the most important unobserved features characterising fluctuations of economic growth. We show that some characteristics of the business cycle can be modelled in a non-parametric way by the discrete spectrum of an Almost Periodically Correlated (APC) time series. On the basis of the estimated characteristics of this spectrum, the business cycle is extracted by filtering. As an illustration we characterise the main properties of business cycles in the industrial production index for the Polish economy.

  4. Time series analysis for minority game simulations of financial markets

    CERN Document Server

    Ferreira, F F; Machado, B S; Muruganandam, P

    2003-01-01

    The minority game model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, we draw conclusions about the generating mechanism for this kind of evolution. The trajectories of the model are found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.

  5. Visualizing trends and clusters in ranked time-series data

    Science.gov (United States)

    Gousie, Michael B.; Grady, John; Branagan, Melissa

    2013-12-01

    There are many systems that provide visualizations for time-oriented data. Of those, few provide the means of finding patterns in time-series data in which rankings are also important. Fewer still have the fine granularity necessary to visually follow individual data points through time. We propose the Ranking Timeline, a novel visualization method for modestly-sized multivariate data sets that include the top ten rankings over time. The system includes two main visualization components: a ranking over time and a cluster analysis. The ranking visualization, loosely based on line plots, allows the user to track individual data points so as to facilitate comparisons within a given time frame. Glyphs represent additional attributes within the framework of the overall system. The user has control over many aspects of the visualization, including viewing a subset of the data and/or focusing on a desired time frame. The cluster analysis tool shows the relative importance of individual items in conjunction with a visualization showing the connection(s) to other, similar items, while maintaining the aforementioned glyphs and user interaction. The user controls the clustering according to a similarity threshold. The system has been implemented as a Web application, and has been tested with data showing the top ten actors/actresses from 1929-2010. The experiments have revealed patterns in the data heretofore not explored.

  6. Local polynomial method for ensemble forecast of time series

    Directory of Open Access Journals (Sweden)

    S. Regonda

    2005-01-01

    Full Text Available We present a nonparametric approach based on local polynomial regression for ensemble forecast of time series. The state space is first reconstructed by embedding the univariate time series of the response variable in a space of dimension (D) with a delay time (τ). To obtain a forecast from a given time point t, three steps are involved: (i) the current state of the system is mapped on to the state space, known as the feature vector, (ii) a small number (K = α*n, α = fraction in (0,1] of the data, n = data length) of neighbors of the feature vector (and their future evolution) are identified in the state space, and (iii) a polynomial of order p is fitted to the identified neighbors, which is then used for prediction. A suite of parameter combinations (D, τ, α, p) is selected based on an objective criterion, called the Generalized Cross Validation (GCV). All of the selected parameter combinations are then used to issue a T-step iterated forecast starting from the current time t, thus generating an ensemble forecast which can be used to obtain the forecast probability density function (PDF). The ensemble approach improves upon the traditional method of providing a single mean forecast by providing the forecast uncertainty. Further, for short noisy data it can provide better forecasts. We demonstrate the utility of this approach on two synthetic (Henon and Lorenz attractors) and two real data sets (Great Salt Lake bi-weekly volume and NINO3 index). This framework can also be used to forecast a vector of response variables based on a vector of predictors.
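
    A single ensemble member of the procedure described above (embed the series, find the K = α*n nearest neighbours of the current feature vector, fit a local polynomial, iterate the forecast) can be sketched as follows, restricted here to a local linear fit (p = 1). The parameter grid, test series and neighbour counts are illustrative assumptions, and the GCV-based parameter selection of the paper is not implemented.

```python
import numpy as np

def local_linear_forecast(series, D=3, tau=1, alpha=0.1, steps=12):
    """One ensemble member: local linear (p = 1) fit to the K nearest neighbours."""
    x = np.asarray(series, dtype=float).copy()
    preds = []
    for _ in range(steps):
        n = len(x)
        idx = np.arange((D - 1) * tau, n - 1)                    # times with a known successor
        feats = np.column_stack([x[idx - (D - 1 - j) * tau] for j in range(D)])
        targets = x[idx + 1]
        query = np.array([x[(n - 1) - (D - 1 - j) * tau] for j in range(D)])

        K = max(D + 2, int(alpha * len(idx)))                    # number of neighbours
        nbrs = np.argsort(np.linalg.norm(feats - query, axis=1))[:K]
        A = np.column_stack([np.ones(K), feats[nbrs]])           # design matrix with intercept
        coef, *_ = np.linalg.lstsq(A, targets[nbrs], rcond=None)

        nxt = float(np.r_[1.0, query] @ coef)                    # one-step-ahead prediction
        preds.append(nxt)
        x = np.append(x, nxt)                                    # iterate the forecast
    return np.array(preds)

# Illustrative ensemble over a small grid of (D, tau, alpha) combinations
rng = np.random.default_rng(4)
t = np.arange(300)
series = np.sin(2 * np.pi * t / 25) + 0.1 * rng.normal(size=t.size)
ensemble = [local_linear_forecast(series, D=D, tau=tau, alpha=a)
            for D in (2, 3, 4) for tau in (1, 2) for a in (0.1, 0.2)]
print(np.mean(ensemble, axis=0))   # ensemble-mean 12-step forecast
```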

  7. Detection of intermittent events in atmospheric time series

    Science.gov (United States)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    associated with the occurrence of critical events in the atmospheric dynamics. The critical events are associated with transitions between meta-stable configurations. Consequently, this approach could be of some help in the study of Extreme Events in meteorology and climatology and in weather classification schemes. The renewal approach could also help in the modelling of non-Gaussian closures for turbulent fluxes [3]. In the proposed approach the main features that need to be estimated are: (a) the distribution of life-times of a given atmospheric meta-stable structure (Waiting Times between two critical events); (b) the statistical distribution of fluctuations; (c) the presence of memory in the time series. These features are related to the evaluation of memory content and scaling from the time series. In order to analyze these features, in recent years some novel statistical techniques have been developed. In particular, the analysis of Diffusion Entropy [4] was shown to be a robust method for the determination of the dynamical scaling. This property is related to the power-law behaviour of the life-time statistics and to the memory properties of the time series. The analysis of Renewal Aging [5], based on renewal theory [2], allows one to estimate the content of memory in a time series, which is related to the amount of critical events in the time series itself. After a brief review of the statistical techniques (Diffusion Entropy and Renewal Aging), an application to experimental atmospheric time series will be illustrated. References [1] Weiss G.H., Rubin R.J., Random Walks: theory and selected applications, Advances in Chemical Physics, 52, 363-505 (1983). [2] D.R. Cox, Renewal Theory, Methuen, London (1962). [3] P. Paradisi, R. Cesari, F. Mainardi, F. Tampieri: The fractional Fick's law for non-local transport processes, Physica A, 293, p. 130-142 (2001). [4] P. Grigolini, L. Palatella, G. Raffaelli, Fractals 9 (2001) 439. [5] P. Allegrini, F. Barbi, P

  8. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in an improvement in the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
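
    The statistical component named here, an additive Holt-Winters model, can be fitted in a few lines with statsmodels. The sketch below is only illustrative: the synthetic periodic signal stands in for the residual "missing dynamics" of an analytical propagator, and the seasonal period is an arbitrary assumption.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic periodic signal standing in for propagator residuals (trend + seasonality + noise)
rng = np.random.default_rng(5)
t = np.arange(240)
resid = 0.5 * t / 240 + np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
series = pd.Series(resid)

# Additive trend and additive seasonality, i.e. an additive Holt-Winters model
fit = ExponentialSmoothing(series, trend="add", seasonal="add",
                           seasonal_periods=24).fit()
print(fit.forecast(24))   # predict the next seasonal cycle of corrections
```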

  9. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.

  10. TIME SERIES FORECASTING WITH MULTIPLE CANDIDATE MODELS: SELECTING OR COMBINING?

    Institute of Scientific and Technical Information of China (English)

    YU Lean; WANG Shouyang; K. K. Lai; Y.Nakamori

    2005-01-01

    Various mathematical models have been commonly used in time series analysis and forecasting. In these processes, academic researchers and business practitioners often come up against two important problems. One is whether, for different or dissimilar modeling approaches, to select a single appropriate approach for prediction purposes or to combine the different individual approaches into a single forecast. Another is whether, for the same or similar modeling approaches, to select the best candidate model for forecasting or to mix the various candidate models with different parameters into a new forecast. In this study, we propose a set of computational procedures to solve the above two issues via two judgmental criteria. Meanwhile, in view of the problems presented in the literature, a novel modeling technique is also proposed to overcome the drawbacks of existing combined forecasting methods. To verify the efficiency and reliability of the proposed procedure and modeling technique, simulations and real data examples are conducted in this study. The results obtained reveal that the proposed procedure and modeling technique can be used as a feasible solution for time series forecasting with multiple candidate models.

  11. Time series modelling and forecasting of emergency department overcrowding.

    Science.gov (United States)

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.

  12. A New Hybrid Methodology for Nonlinear Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Mehdi Khashei

    2011-01-01

    Full Text Available Artificial neural networks (ANNs) are flexible computing frameworks and universal approximators that can be applied to a wide range of forecasting problems with a high degree of accuracy. However, using ANNs to model linear problems has yielded mixed results, and hence it is not wise to apply them blindly to any type of data. This is the reason that hybrid methodologies combining linear models such as ARIMA and nonlinear models such as ANNs have been proposed in the time series forecasting literature. Despite all the advantages of the traditional methodologies for combining ARIMA and ANNs, they rely on assumptions that degrade their performance when the opposite situation occurs. In this paper, a new methodology is proposed to combine ANNs with ARIMA in order to overcome the limitations of traditional hybrid methodologies and yield more general and more accurate hybrid models. Empirical results with the Canadian lynx data set indicate that the proposed methodology can be a more effective way to combine linear and nonlinear models than traditional hybrid methodologies. Therefore, it can be applied as an appropriate alternative methodology for hybridization in the time series forecasting field, especially when higher forecasting accuracy is needed.
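
    For context, the traditional hybrid that this record contrasts itself with is usually built in two stages: fit ARIMA for the linear part and model its residuals with an ANN. The sketch below is a hedged illustration of that baseline idea only, not the paper's new methodology; the model orders, network size and synthetic data are all assumptions.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

def lag_matrix(x, lags):
    X = np.array([x[i:i + lags] for i in range(len(x) - lags)])
    return X, x[lags:]

# Synthetic series with both linear and nonlinear structure
rng = np.random.default_rng(6)
n = 300
e = rng.normal(0, 0.2, n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] + 0.3 * np.sin(y[t - 2]) + e[t]

train, test = y[:-20], y[-20:]

# Step 1: linear component with ARIMA
arima = ARIMA(train, order=(2, 0, 1)).fit()
linear_fc = arima.forecast(steps=20)
resid = arima.resid

# Step 2: nonlinear component, an MLP trained on the ARIMA residuals
Xr, yr = lag_matrix(resid, lags=3)
mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(Xr, yr)

# Roll the residual correction forward naively for this sketch
last = resid[-3:].copy()
nonlinear_fc = []
for _ in range(20):
    nxt = mlp.predict(last.reshape(1, -1))[0]
    nonlinear_fc.append(nxt)
    last = np.r_[last[1:], nxt]

hybrid = linear_fc + np.array(nonlinear_fc)
print("hybrid RMSE:", np.sqrt(np.mean((hybrid - test) ** 2)))
```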

  13. Intermittency and multifractional Brownian character of geomagnetic time series

    Directory of Open Access Journals (Sweden)

    G. Consolini

    2013-07-01

    Full Text Available The Earth's magnetosphere exhibits a complex behavior in response to the solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of the geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in the geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, which is related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  14. Exponential smoothing for financial time series data forecasting

    Directory of Open Access Journals (Sweden)

    Kuzhda, Tetyana Ivanivna

    2014-05-01

    Full Text Available The article begins with the formulation of a predictive learning approach called exponential smoothing forecasting. Exponential smoothing is commonly applied to financial markets such as the stock, bond, foreign exchange, insurance, credit, and primary and secondary markets. Exponential smoothing models are useful in providing valuable decision information for investors. Simple and double exponential smoothing models are the two basic types of exponential smoothing method. The simple exponential smoothing method is suitable for forecasting a financial time series over a specified time period; it weights past observations with exponentially decreasing weights to forecast future values. Double exponential smoothing is a refinement of the simple exponential smoothing model that adds another component to take into account any trend in the data. Measurement of forecast accuracy is described in the article. Finally, the quantitative value of a price-per-common-share forecast using simple exponential smoothing is calculated, and applied recommendations concerning determination of the price-per-common-share forecast using double exponential smoothing are given.
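
    The two schemes summarised above follow standard recursions: simple exponential smoothing updates a single level, while double (Holt) exponential smoothing also updates a trend term. A minimal sketch of both, with illustrative smoothing constants and a made-up price series rather than the data used in the article:

```python
import numpy as np

def simple_exponential_smoothing(x, alpha):
    """s_t = alpha * x_t + (1 - alpha) * s_{t-1}; the final s_t is the one-step forecast."""
    s = x[0]
    for value in x[1:]:
        s = alpha * value + (1 - alpha) * s
    return s

def double_exponential_smoothing(x, alpha, beta, horizon=1):
    """Holt's linear method: maintains a level l_t and a trend b_t."""
    level, trend = x[0], x[1] - x[0]
    for value in x[1:]:
        prev_level = level
        level = alpha * value + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    return level + horizon * trend

prices = np.array([10.2, 10.4, 10.3, 10.8, 11.0, 11.1, 11.5, 11.4, 11.9, 12.1])
print(simple_exponential_smoothing(prices, alpha=0.3))
print(double_exponential_smoothing(prices, alpha=0.3, beta=0.2, horizon=3))
```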

  15. A new complexity measure for time series analysis and classification

    Science.gov (United States)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure ETC and define it as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence to a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
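
    A hedged sketch of the ETC idea follows: repeatedly substitute the most frequent pair of adjacent symbols with a new symbol and count the iterations until the sequence becomes constant. Tie-breaking and other implementation details are my own convention and may differ from the authors' code.

    ```python
    # "Effort To Compress" via Non-Sequential Recursive Pair Substitution:
    # count the pair-substitution passes needed to reach a constant sequence.
    # Tie-breaking (first pair encountered wins) is an assumption of this sketch.
    from collections import Counter

    def etc(sequence):
        seq = list(sequence)                      # sequence of integer symbols
        next_symbol = max(seq) + 1 if seq else 0
        steps = 0
        while len(set(seq)) > 1:
            pairs = Counter(zip(seq, seq[1:]))
            target = pairs.most_common(1)[0][0]   # most frequent adjacent pair
            new_seq, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == target:
                    new_seq.append(next_symbol)   # substitute the pair
                    i += 2
                else:
                    new_seq.append(seq[i])
                    i += 1
            seq = new_seq
            next_symbol += 1
            steps += 1
        return steps

    print(etc([0, 1, 0, 1, 0, 1, 0, 1]))  # very regular sequence: low effort
    print(etc([0, 1, 1, 0, 0, 0, 1, 0]))  # less structured sequence: higher effort
    ```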

  16. Financial time series prediction using spiking neural networks.

    Science.gov (United States)

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, to financial time series prediction is presented. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded neural networks (a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrates the applicability of the Polychronous Spiking Network to financial data forecasting and, in turn, indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  17. Financial time series prediction using spiking neural networks.

    Directory of Open Access Journals (Sweden)

    David Reid

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, to financial time series prediction is presented. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded neural networks (a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrates the applicability of the Polychronous Spiking Network to financial data forecasting and, in turn, indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  18. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medical applications. In Malaysia, gold mining is limited to several areas, namely Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia; the model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitted model, with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
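
    The Box-Jenkins workflow summarized above can be reproduced in outline with statsmodels, as in the sketch below; the production figures are placeholders rather than the actual Malaysian data, and the ARIMA(3,1,1) order is taken from the abstract.

    ```python
    # Sketch of the Box-Jenkins workflow with statsmodels. The production
    # values are hypothetical placeholders, not Malaysia's real gold output.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    production = np.array([2.1, 2.4, 2.6, 2.3, 2.8, 3.1, 3.4, 3.2, 3.6, 4.0,
                           4.3, 4.1, 4.6, 4.9, 5.2])        # hypothetical tonnes/year
    train, test = production[:-3], production[-3:]

    fit = ARIMA(train, order=(3, 1, 1)).fit()               # ARIMA(3,1,1) as in the study
    forecast = fit.forecast(steps=len(test))

    mape = np.mean(np.abs((test - forecast) / test)) * 100  # accuracy check via MAPE
    print("forecast:", np.round(forecast, 2), " MAPE = %.2f%%" % mape)
    ```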

  19. Land surface phenology from SPOT VEGETATION time series

    Directory of Open Access Journals (Sweden)

    A. Verger

    2016-12-01

    Land surface phenology derived from time series of satellite data is expected to contribute to improving the representation of vegetation phenology in earth system models. We characterized the baseline phenology of the vegetation at the global scale from GEOCLIM-LAI, a global climatology of leaf area index (LAI) derived from 1-km SPOT VEGETATION time series for 1999-2010. Calibration with ground measurements showed that the start and end of season were best identified using 30% and 40% thresholds of the LAI amplitude, respectively. The satellite-derived phenology was spatially consistent with the global distributions of climatic drivers and biome land cover. The accuracy of the derived phenological metrics, evaluated using available ground observations for birch forests in Europe, cherry trees in Asia and lilac shrubs in North America, showed an overall root mean square error lower than 19 days for the start, end and length of season, and good agreement between the latitudinal gradients of VEGETATION LAI phenology and ground data.
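
    The threshold rule reported above (start and end of season at 30% and 40% of the LAI amplitude) can be illustrated on a synthetic annual LAI profile as follows; the detection convention used here (first and last crossings) is an assumption made for the sketch.

    ```python
    # Threshold-based start/end of season from an annual LAI profile, using the
    # 30%/40% amplitude thresholds reported above. The daily LAI curve is synthetic.
    import numpy as np

    doy = np.arange(1, 366)
    lai = 0.5 + 2.5 * np.exp(-((doy - 190) / 60.0) ** 2)        # toy seasonal LAI curve

    amplitude = lai.max() - lai.min()
    sos_level = lai.min() + 0.30 * amplitude                    # start-of-season threshold
    eos_level = lai.min() + 0.40 * amplitude                    # end-of-season threshold

    start_of_season = doy[np.argmax(lai >= sos_level)]          # first upward crossing
    end_of_season = doy[len(lai) - 1 - np.argmax(lai[::-1] >= eos_level)]  # last crossing

    print("SOS (day of year):", start_of_season, " EOS (day of year):", end_of_season)
    ```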

  20. Blind source separation problem in GPS time series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
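
    The contrast between PCA and ICA for blind source separation can be illustrated on toy data with scikit-learn, as below. This sketch uses standard FastICA rather than the vbICA method tested in the record, and the "GPS-like" signals are synthetic.

    ```python
    # Toy blind source separation on synthetic "GPS-like" signals, comparing PCA
    # with standard FastICA (not the vbICA method discussed in the record).
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(1)
    t = np.linspace(0, 8, 2000)
    sources = np.column_stack([
        np.sin(2 * np.pi * t),                 # seasonal-like oscillation
        np.sign(np.sin(2 * np.pi * 0.3 * t)),  # step-like (co-seismic-like) signal
        0.2 * rng.standard_normal(len(t)),     # noise source
    ])
    mixing = rng.uniform(-1, 1, size=(3, 3))
    observations = sources @ mixing.T          # what the "stations" record

    pca_components = PCA(n_components=3).fit_transform(observations)
    ica_components = FastICA(n_components=3, random_state=0).fit_transform(observations)

    # Correlate recovered components with the true sources: ICA typically matches
    # them far better than PCA, which only enforces decorrelation.
    for name, comp in [("PCA", pca_components), ("ICA", ica_components)]:
        best = [np.max(np.abs(np.corrcoef(comp[:, i], sources.T)[0, 1:])) for i in range(3)]
        print(name, "best |corr| per component:", np.round(best, 2))
    ```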

  1. Dependency Structures in Differentially Coded Cardiovascular Time Series

    Directory of Open Access Journals (Sweden)

    Tatjana Tasic

    2017-01-01

    Objectives. This paper analyses temporal dependency in the time series recorded from aging rats, the healthy ones and those with early developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationship along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter ones were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneous hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ=0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ=1 and 2. Conclusion. Dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of systolic blood pressure signal, compared to unconditional, shows an increased level of discrepancy, except for a time lag 1, where the equality is preserved in spite of the memory of differential coder. The antiparallel streams play an important role at single beat time lag.

  2. Dependency Structures in Differentially Coded Cardiovascular Time Series

    Science.gov (United States)

    Tasic, Tatjana; Jovanovic, Sladjana; Mohamoud, Omer; Skoric, Tamara; Japundzic-Zigon, Nina

    2017-01-01

    Objectives. This paper analyses temporal dependency in the time series recorded from aging rats, the healthy ones and those with early developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationship along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter ones were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneous hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ = 0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ = 1 and 2. Conclusion. Dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of systolic blood pressure signal, compared to unconditional, shows an increased level of discrepancy, except for a time lag 1, where the equality is preserved in spite of the memory of differential coder. The antiparallel streams play an important role at single beat time lag. PMID:28127384

  3. Time series analysis of the interdependence among air pollutants

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, K.-J. (National Taiwan University, Taipei (Taiwan). Dept. of Atmospheric Sciences)

    1992-12-01

    A statistical time series analysis was applied to study the interdependence between the primary and secondary pollutants in the Taipei area. Estimations using the vector autoregression model (VAR) indicate that 2 and 4 h time lags are sufficient to represent the observed values at two stations studied. The impulse response functions and variance decompositions of NO, NO2 and O3 were derived using the vector moving average representations to examine the significance of one species on others. Influences of photochemistry and transport processes on these air pollutants at different locations were evaluated from the results. This technique may provide a simple tool for preliminary assessment of pollution problems. 14 refs., 6 figs., 6 tabs.
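
    A minimal version of this workflow (lag selection, impulse responses and variance decomposition for a small pollutant system) can be run with the statsmodels VAR implementation, as sketched below on synthetic NO/NO2/O3 series rather than the Taipei measurements.

    ```python
    # VAR workflow sketch with statsmodels on synthetic pollutant series
    # (not the Taipei data): lag selection, impulse responses, variance decomposition.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(2)
    n = 500
    no = rng.normal(20, 5, n)
    no2 = 0.6 * np.roll(no, 2) + rng.normal(10, 2, n)   # NO2 responds to NO with a lag
    o3 = 40 - 0.5 * np.roll(no2, 2) + rng.normal(0, 3, n)
    data = pd.DataFrame({"NO": no, "NO2": no2, "O3": o3}).iloc[4:]  # drop wrap-around rows

    results = VAR(data).fit(maxlags=4, ic="aic")        # data-driven lag selection
    print("selected lag order:", results.k_ar)

    irf = results.irf(12)                               # impulse responses over 12 steps
    print("IRF array shape (steps+1, response, impulse):", irf.irfs.shape)
    results.fevd(12).summary()                          # forecast error variance decomposition
    ```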

  4. Behavior of road accidents: Structural time series approach

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir; Arsad, Zainudin

    2014-12-01

    Road accidents have become a major issue, contributing to an increasing number of deaths. Some researchers suggest that road accidents occur due to road structure and road condition, which may differ according to the area and the traffic volume of the location. Therefore, this paper examines the behavior of road accidents in the four main regions of Peninsular Malaysia by employing a structural time series (STS) approach. STS offers the possibility of modelling unobserved components, such as the trend and seasonal components, and allows them to vary over time. The results show that the number of road accidents in each region is described by a different model. The results imply that the government, and policy makers in particular, should consider implementing different approaches in each region to curb the increasing number of road accidents.
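
    A basic structural time series model of the kind used here (stochastic level and slope plus a seasonal component) can be fitted with statsmodels, as in the sketch below; the monthly accident counts are synthetic placeholders, not the Malaysian regional data.

    ```python
    # Structural (unobserved components) time series sketch with statsmodels,
    # fitted to a synthetic monthly accident-count series.
    import numpy as np
    from statsmodels.tsa.statespace.structural import UnobservedComponents

    rng = np.random.default_rng(3)
    months = 120
    trend = np.linspace(400, 550, months)
    season = 40 * np.sin(2 * np.pi * np.arange(months) / 12)
    accidents = trend + season + rng.normal(0, 15, months)

    model = UnobservedComponents(accidents,
                                 level="local linear trend",  # stochastic level + slope
                                 seasonal=12)                  # monthly seasonal component
    results = model.fit(disp=False)
    print(results.summary().tables[1])
    print("12-month forecast:", np.round(results.forecast(steps=12), 1))
    ```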

  5. Vegetation Dynamics of NW Mexico using MODIS time series data

    Science.gov (United States)

    Valdes, M.; Bonifaz, R.; Pelaez, G.; Leyva Contreras, A.

    2010-12-01

    Northwestern Mexico is an area subject to a combination of marine and continental climatic influences, which produces highly variable vegetation dynamics over time. Using Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation index data (NDVI and EVI) from 2001 to 2008, mean and standard deviation images of the time series were calculated. Using these data, annual vegetation dynamics were characterized for the different vegetation types. Annual mean values were compared, and interannual variations or anomalies were analyzed by calculating departures from the mean; a value was considered anomalous if it was more than two standard deviations above or below the mean. Using this procedure it was possible to determine spatio-temporal patterns over the study area and relate them to climatic conditions.
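
    The two-standard-deviation anomaly criterion described above is straightforward to express in code; the per-year NDVI means below are invented for illustration.

    ```python
    # Flag years whose mean NDVI departs from the multi-year mean by more than
    # two standard deviations. The annual NDVI means are made-up illustrative values.
    import numpy as np

    years = np.arange(2001, 2009)
    ndvi_annual_mean = np.array([0.52, 0.55, 0.53, 0.35, 0.54, 0.56, 0.52, 0.66])

    mu, sigma = ndvi_annual_mean.mean(), ndvi_annual_mean.std(ddof=1)
    departure = ndvi_annual_mean - mu
    anomalous = np.abs(departure) > 2 * sigma

    for y, d, flag in zip(years, departure, anomalous):
        print(y, "departure = %+.3f" % d, "ANOMALY" if flag else "")
    ```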

  6. Time series power flow analysis for distribution connected PV generation.

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  7. Seasonal signals in the reprocessed GPS coordinate time series

    Science.gov (United States)

    Kenyeres, A.; van Dam, T.; Figurski, M.; Szafranek, K.

    2008-12-01

    The global (IGS) and regional (EPN) CGPS time series have already been studied in detail by several authors to analyze the periodic signals and noise present in the long-term displacement series. The comparisons indicated that the amplitude and phase of the CGPS-derived seasonal signals mostly disagree with surface mass redistribution models. The CGPS results greatly overestimate the seasonal term: only about 40% of the observed annual amplitude can be explained by the joint contribution of the geophysical models (Dong et al. 2002). Additionally, the estimated amplitudes and phases are poorly coherent with the models, especially at sites close to coastal areas (van Dam et al., 2007). The conclusion of these studies was that the GPS results are distorted by analysis artifacts (e.g. ocean tide loading, aliasing of unmodeled short-period tidal signals, antenna PCV models), monument thermal effects and multipath. Additionally, the GPS series available so far are inhomogeneous in terms of processing strategy, applied models and reference frames. The introduction of absolute phase center variation (PCV) models for the satellite and ground antennae in 2006, and the related reprocessing of the GPS precise orbits, provided a strong argument for the complete re-analysis of GPS observations from the global to the local network level. This enormous work is in progress within the IGS, and a pilot analysis has already been performed on the complete EPN observations from 1996 to 2007 by the MUT group (Military University of Warsaw). A first analysis of the results confirmed expectations and the superiority of the reprocessed data. The noise level (weekly coordinate repeatability) was greatly reduced, laying the groundwork for later analysis at the daily solution level. We also observed a significant decrease of the seasonal term in the residual coordinate time series, which prompted us to repeat the comparison of the GPS derived annual periodicity

  8. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  9. Time dependent intrinsic correlation analysis of temperature and dissolved oxygen time series using empirical mode decomposition

    CERN Document Server

    Huang, Y X

    2014-01-01

    In the marine environment, many fields have fluctuations over a large range of different spatial and temporal scales. These quantities can be nonlinear and non-stationary, and often interact with each other. A good method to study the multiple scale dynamics of such time series, and their correlations, is needed. In this paper an application of an empirical mode decomposition based time dependent intrinsic correlation of two coastal oceanic time series, temperature and dissolved oxygen (saturation percentage) is presented. The two time series are recorded every 20 minutes for 7 years, from 2004 to 2011. The application of the Empirical Mode Decomposition on such time series is illustrated, and the power spectra of the time series are estimated using the Hilbert transform (Hilbert spectral analysis). Power-law regimes are found with slopes of 1.33 for dissolved oxygen and 1.68 for temperature at high frequencies (between 1.2 and 12 hours) with both close to 1.9 for lower frequencies (t...

  10. Multi-Granular Trend Detection for Time-Series Analysis.

    Science.gov (United States)

    van Goethem, Arthur; Staals, Frank; Löffler, Maarten; Dykes, Jason; Speckmann, Bettina

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.

  11. Time Series Analysis of the Blazar OJ 287

    Science.gov (United States)

    Gamel, Ellen; Ryle, W. T.; Carini, M. T.

    2013-06-01

    Blazars are a subset of active galactic nuclei (AGN) where the light is viewed along the jet of radiation produced by the central supermassive black hole. These very luminous objects vary in brightness and are associated with the cores of distant galaxies. The blazar, OJ 287, has been monitored and its brightness tracked over time. From these light curves the relationship between the characteristic “break frequency” and black hole mass can be determined through the use of power density spectra. In order to obtain a well-sampled light curve, this blazar will be observed at a wide range of timescales. Long time scales will be obtained using archived light curves from published literature. Medium time scales were obtained through a combination of data provided by Western Kentucky University and data collected at The Bank of Kentucky Observatory. Short time scales were achieved via a single night of observation at the 72” Perkins Telescope at Lowell Observatory in Flagstaff, AZ. Using time series analysis, we present a revised mass estimate for the super massive black hole of OJ 287. This object is of particular interest because it may harbor a binary black hole at its center.

  12. United States Forest Disturbance Trends Observed Using Landsat Time Series

    Science.gov (United States)

    Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha/yr of forests were disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rate estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with an annual time step for improved accuracy.

  13. Modeling Glacier Elevation Change from DEM Time Series

    Directory of Open Access Journals (Sweden)

    Di Wang

    2015-08-01

    In this study, a methodology for glacier elevation reconstruction from Digital Elevation Model (DEM) time series (tDEM) is described for modeling the evolution of glacier elevation and estimating related volume change, with focus on medium-resolution and noisy satellite DEMs. The method is robust with respect to outliers in individual DEM products. Fox Glacier and Franz Josef Glacier in New Zealand are used as test cases based on 31 Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEMs and the Shuttle Radar Topography Mission (SRTM) DEM. We obtained a mean surface elevation lowering rate of −0.51 ± 0.02 m·a−1 and −0.09 ± 0.02 m·a−1 between 2000 and 2014 for Fox and Franz Josef Glacier, respectively. The specific volume difference between 2000 and 2014 was estimated as −0.77 ± 0.13 m·a−1 and −0.33 ± 0.06 m·a−1 by our tDEM method. The comparably moderate thinning rates are mainly due to volume gains after 2013 that compensate larger thinning rates earlier in the series. Terminus thickening prevailed between 2002 and 2007.

  14. VARTOOLS: A program for analyzing astronomical time-series data

    Science.gov (United States)

    Hartman, J. D.; Bakos, G. Á.

    2016-10-01

    This paper describes the VARTOOLS program, which is an open-source command-line utility, written in C, for analyzing astronomical time-series data, especially light curves. The program provides a general-purpose set of tools for processing light curves including signal identification, filtering, light curve manipulation, time conversions, and modeling and simulating light curves. Some of the routines implemented include the Generalized Lomb-Scargle periodogram, the Box-Least Squares transit search routine, the Analysis of Variance periodogram, the Discrete Fourier Transform including the CLEAN algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and non-linear optimization of analytic functions including support for Markov Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or simulating time-correlated noise, and the TFA and SYSREM filtering algorithms, among others. A mechanism is also provided for incorporating a user's own compiled processing routines into the program. VARTOOLS is designed especially for batch processing of light curves, including built-in support for parallel processing, making it useful for large time-domain surveys such as searches for transiting planets. Several examples are provided to illustrate the use of the program.
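
    VARTOOLS itself is a compiled C command-line program; purely as an illustration of one of the routines it implements, the sketch below runs a generalized Lomb-Scargle period search on a synthetic, unevenly sampled light curve using astropy. It is not VARTOOLS code.

    ```python
    # Generalized Lomb-Scargle period search on an irregularly sampled synthetic
    # light curve, as an illustration of one routine VARTOOLS provides.
    # This uses astropy, not VARTOOLS itself.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(4)
    t = np.sort(rng.uniform(0, 30, 300))                 # irregular observation times (days)
    true_period = 2.7
    mag = 12.0 + 0.05 * np.sin(2 * np.pi * t / true_period) + 0.01 * rng.standard_normal(t.size)

    frequency, power = LombScargle(t, mag).autopower()
    best_period = 1.0 / frequency[np.argmax(power)]
    print("recovered period: %.3f d (true %.1f d)" % (best_period, true_period))
    ```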

  15. Time-series analysis of Campylobacter incidence in Switzerland.

    Science.gov (United States)

    Wei, W; Schüpbach, G; Held, L

    2015-07-01

    Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.

  16. Crop Yield Forecasted Model Based on Time Series Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming

    2012-01-01

    Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. A new concept of crop yield under average climate conditions, which is affected by the advancement of science and technology, was defined in this paper. Based on this new concept of crop yield, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using average grain yields of Liaoning Province in China from 1949 to 2005. The testing combined dynamic n-choosing and micro tendency rectification, and the average forecasting error was 1.24%. A turning point may occur in the trend line of yield change; in that case, an inflexion model was used to handle the yield turning point.

  17. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  18. SPITZER IRAC PHOTOMETRY FOR TIME SERIES IN CROWDED FIELDS

    Energy Technology Data Exchange (ETDEWEB)

    Novati, S. Calchi; Beichman, C. [NASA Exoplanet Science Institute, MS 100-22, California Institute of Technology, Pasadena, CA 91125 (United States); Gould, A.; Fausnaugh, M.; Gaudi, B. S.; Pogge, R. W.; Wibking, B.; Zhu, W.; Poleski, R. [Department of Astronomy, Ohio State University, 140 W. 18th Ave., Columbus, OH 43210 (United States); Yee, J. C. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Bryden, G.; Henderson, C. B.; Shvartzvald, Y. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States); Carey, S. [Spitzer, Science Center, MS 220-6, California Institute of Technology, Pasadena, CA (United States); Udalski, A.; Pawlak, M.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Collaboration: Spitzer team; OGLE group; and others

    2015-12-01

    We develop a new photometry algorithm that is optimized for the Infrared Array Camera (IRAC) Spitzer time series in crowded fields and that is particularly adapted to faint or heavily blended targets. We apply this to the 170 targets from the 2015 Spitzer microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of Spitzer's instrumental characteristics that drive the specific features of this algorithm are shared by Kepler and WFIRST, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.

  19. Spitzer IRAC Photometry for Time Series in Crowded Fields

    CERN Document Server

    Novati, S Calchi; Yee, J C; Beichman, C; Bryden, G; Carey, S; Fausnaugh, M; Gaudi, B S; Henderson, C B; Pogge, R W; Shvartzvald, Y; Wibking, B; Zhu, W; Udalski, A; Poleski, R; Pawlak, M; Szymański, M K; Skowron, J; Mróz, P; Kozłowski, S; Wyrzykowski, Ł; Pietrukowicz, P; Pietrzyński, G; Soszyński, I; Ulaczyk, K

    2015-01-01

    We develop a new photometry algorithm that is optimized for $Spitzer$ time series in crowded fields and that is particularly adapted to faint and/or heavily blended targets. We apply this to the 170 targets from the 2015 $Spitzer$ microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of $Spitzer$'s instrumental characteristics that drive the specific features of this algorithm are shared by $Kepler$ and $WFIRST$, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.

  20. Optimal estimation of recurrence structures from time series

    Science.gov (United States)

    beim Graben, Peter; Sellers, Kristin K.; Fröhlich, Flavio; Hutt, Axel

    2016-05-01

    Recurrent temporal dynamics is a phenomenon observed frequently in high-dimensional complex systems and its detection is a challenging task. Recurrence quantification analysis utilizing recurrence plots may extract such dynamics; however, it still encounters an unsolved pertinent problem: the optimal selection of distance thresholds for estimating the recurrence structure of dynamical systems. The present work proposes a stochastic Markov model for the recurrent dynamics that allows for the analytical derivation of a criterion for the optimal distance threshold. The goodness of fit is assessed by a utility function which assumes a local maximum for that threshold reflecting the optimal estimate of the system's recurrence structure. We validate our approach by means of the nonlinear Lorenz system and its linearized stochastic surrogates. The final application to neurophysiological time series obtained from anesthetized animals illustrates the method and reveals novel dynamic features of the underlying system. We propose the number of optimal recurrence domains as a statistic for classifying an animal's state of consciousness.
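
    For readers unfamiliar with the objects involved, the sketch below builds a recurrence plot from a time-delay embedding of a scalar series and shows how the recurrence rate depends on the distance threshold, the quantity whose optimal choice is the subject of this record. The optimality criterion itself is not reproduced here, and the embedding parameters are arbitrary.

    ```python
    # Recurrence matrix from a time-delay embedding, and the recurrence rate as a
    # function of the distance threshold (the threshold-selection criterion of the
    # record above is not implemented here).
    import numpy as np

    def embed(x, dim=3, delay=2):
        n = len(x) - (dim - 1) * delay
        return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

    rng = np.random.default_rng(5)
    x = np.sin(0.3 * np.arange(400)) + 0.1 * rng.standard_normal(400)
    points = embed(x)

    # Pairwise distances and recurrence rates for a few candidate thresholds.
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for eps in [0.1, 0.3, 0.6]:
        recurrence = dists < eps
        print("threshold %.1f -> recurrence rate %.3f" % (eps, recurrence.mean()))
    ```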

  1. On The Fourier And Wavelet Analysis Of Coronal Time Series

    CERN Document Server

    Auchère, F; Bocchialini, K; Buchlin, E; Solomon, J

    2016-01-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default c...

  2. The Abysmal State of Abyssal Time Series: An Acoustic Challenge

    Science.gov (United States)

    Munk, W. H.; Worcester, P. F.; Dushaw, B. D.; Howe, B. M.; Spindel, R. C.

    2001-12-01

    The 20th century rise in global sea level by 18 cm has not been explained. The rise has been continuous and linear since the previous century. It cannot be predominantly the result of thermal expansion. Global ocean warming (as recently compiled by Levitus and his collaborators) started too late, is too non-linear and too weak to account for the recorded rise. It is not impossible that the global warming has been underestimated for lack of adequate observations in the southern hemisphere, and at abyssal depths. Time series of abyssal temperatures are badly lacking. Tomographic methods have the required precision, vertical resolution and horizontal integration to accomplish this task. A more likely explanation is to attribute most of the sea level rise to melting of polar ice sheets. There are two difficulties: the required melting is considerably larger than has generally been estimated, and there are serious restrictions imposed by astronomic measurements of the Earth's rotation.

  3. MODELLING GASOLINE DEMAND IN GHANA: A STRUCTURAL TIME SERIES ANALYSIS

    Directory of Open Access Journals (Sweden)

    Ishmael Ackah

    2014-01-01

    Concerns about the role of energy consumption in global warming have led to policy designs that seek to reduce fossil fuel consumption or find less polluting alternatives, especially for the transport sector. This study seeks to estimate the elasticities of price, income, education and technology for gasoline demand in the transport sector in Ghana. The structural time series model reports short-run price and income elasticities of -0.0088 and 0.713, respectively. The elasticity for total factor productivity is -0.408, whilst the elasticity for education is 2.33. In the long run, the reported price and income elasticities are -0.065 and 5.129, respectively. The long-run elasticity for productivity is -2.935. The study recommends that, in order to enhance efficiency in gasoline consumption in the transport sector, there should be investment in productivity.

  4. Hybrid Perturbation methods based on Statistical Time Series models

    CERN Document Server

    San-Juan, Juan Félix; Pérez, Iván; López, Rosario

    2016-01-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of a...

  5. Detecting Dynamical States from Noisy Time Series using Bicoherence

    CERN Document Server

    George, Sandip V; Misra, R

    2016-01-01

    Deriving meaningful information from observational data is often restricted by many limiting factors, the most important of which is the presence of noise. In this work, we present the use of the bicoherence function to extract information about the underlying nonlinearity from noisy time series. We show that a system evolving in the presence of noise, whose dynamical state is concealed from quantifiers like the power spectrum and the correlation dimension D2, can be revealed using the bicoherence function. We define an index, called the main peak bicoherence function, as the bicoherence associated with the maximal power spectral peak. We show that this index is extremely useful when dealing with quasi-periodic data, as it can distinguish strange non-chaos from quasi-periodicity even with added noise. We demonstrate this in a real-world scenario by taking the bicoherence of variable stars showing period doubling and strange non-chaotic behavior. Our results indicate that bicoherence analysis can also bypass the me...

  6. Time series analysis for minority game simulations of financial markets

    Science.gov (United States)

    Ferreira, Fernando F.; Francisco, Gerson; Machado, Birajara S.; Muruganandam, Paulsamy

    2003-04-01

    The minority game (MG) model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in the financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, several conclusions are obtained about the generating mechanism for this kind of evolution. The motion is deterministic, driven by occasional random external perturbation. When the interval between two successive perturbations is sufficiently large, one can find low dimensional chaos in this regime. However, the full motion of the MG model is found to be similar to that of the first differences of the SP500 index: stochastic, nonlinear and (unit root) stationary.

  7. Multivariate time series with linear state space structure

    CERN Document Server

    Gómez, Víctor

    2016-01-01

    This book presents a comprehensive study of multivariate time series with linear state space structure. The emphasis is put on both the clarity of the theoretical concepts and on efficient algorithms for implementing the theory. In particular, it investigates the relationship between VARMA and state space models, including canonical forms. It also highlights the relationship between Wiener-Kolmogorov and Kalman filtering both with an infinite and a finite sample. The strength of the book also lies in the numerous algorithms included for state space models that take advantage of the recursive nature of the models. Many of these algorithms can be made robust, fast, reliable and efficient. The book is accompanied by a MATLAB package called SSMMATLAB and a webpage presenting implemented algorithms with many examples and case studies. Though it lays a solid theoretical foundation, the book also focuses on practical application, and includes exercises in each chapter. It is intended for researchers and students wor...

  8. Traffic time series analysis by using multiscale time irreversibility and entropy

    Science.gov (United States)

    Wang, Xuejiao; Shang, Pengjian; Fang, Jintang

    2014-09-01

    Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in such time series and have led to differing results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which is helpful for extracting information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of the Beijing Ring 2, 3 and 4 roads on workdays and weekends, covering August 18, 2012 to October 26, 2012, we also compare the results of this new method with those of the well-known multiscale entropy method. The results show that the higher the asymmetry index, the higher the traffic congestion level, and they accord with those obtained by multiscale entropy.

  9. Traffic time series analysis by using multiscale time irreversibility and entropy.

    Science.gov (United States)

    Wang, Xuejiao; Shang, Pengjian; Fang, Jintang

    2014-09-01

    Traffic systems, especially urban traffic systems, are regulated by different kinds of interacting mechanisms which operate across multiple spatial and temporal scales. Traditional approaches, such as the empirical probability distribution function and detrended fluctuation analysis, fail to account for the multiple time scales inherent in such time series and have led to differing results. The role of multiscale analytical methods in traffic time series is a frontier area of investigation. In this paper, our main purpose is to introduce a new method, multiscale time irreversibility, which is helpful for extracting information from the traffic time series we studied. In addition, to analyse the complexity of traffic volume time series of the Beijing Ring 2, 3 and 4 roads on workdays and weekends, covering August 18, 2012 to October 26, 2012, we also compare the results of this new method with those of the well-known multiscale entropy method. The results show that the higher the asymmetry index, the higher the traffic congestion level, and they accord with those obtained by multiscale entropy.

  10. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  11. Beyond multi-fractals: surrogate time series and fields

    Science.gov (United States)

    Venema, V.; Simmer, C.

    2007-12-01

    Most natural complex systems are characterised by variability on a large range of temporal and spatial scales. The two main methodologies to generate such structures are Fourier/FARIMA based algorithms and multifractal methods. The former is restricted to Gaussian data, whereas the latter requires the structure to be self-similar. This work will present so-called surrogate data as an alternative that works with any (empirical) distribution and power spectrum. The best-known surrogate algorithm is the iterative amplitude adjusted Fourier transform (IAAFT) algorithm. We have studied six different geophysical time series (two clouds, runoff of a small and a large river, temperature and rain) and their surrogates. The power spectra, and consequently the 2nd order structure functions, were replicated accurately. Even the fourth-order structure function was more accurately reproduced by the surrogates than would be possible with a fractal method, because the measured structure deviated too strongly from fractal scaling. Only in the case of the daily rain sums could a fractal method have been more accurate. Just as Fourier and multifractal methods, the current surrogates are not able to model the asymmetric increment distributions observed for runoff, i.e., they cannot reproduce nonlinear dynamical processes that are asymmetric in time. Furthermore, we have found differences in the structure functions on small scales. Surrogate methods are especially valuable for empirical studies, because the time series and fields that are generated are able to mimic measured variables accurately. Our main application is radiative transfer through structured clouds. Like many geophysical fields, clouds can only be sampled sparsely, e.g. with in-situ airborne instruments. However, for radiative transfer calculations we need full 3-dimensional cloud fields. A first study relating the measured properties of the cloud droplets and the radiative properties of the cloud field by generating surrogate cloud
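
    The IAAFT algorithm mentioned above can be sketched in a few lines: the surrogate is iteratively forced to share the original series' power spectrum and amplitude distribution. The fixed iteration count used here as a convergence control is an assumption of the sketch.

    ```python
    # Iterative amplitude adjusted Fourier transform (IAAFT) surrogate sketch:
    # alternately impose the original power spectrum and the original amplitude
    # distribution. A fixed iteration count stands in for a convergence test.
    import numpy as np

    def iaaft(x, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        sorted_values = np.sort(x)                  # target amplitude distribution
        target_amplitudes = np.abs(np.fft.rfft(x))  # target power spectrum
        surrogate = rng.permutation(x)              # start from a random shuffle
        for _ in range(n_iter):
            # impose the spectrum, keeping the current phases
            phases = np.angle(np.fft.rfft(surrogate))
            surrogate = np.fft.irfft(target_amplitudes * np.exp(1j * phases), n=len(x))
            # impose the amplitude distribution by rank-ordering
            ranks = np.argsort(np.argsort(surrogate))
            surrogate = sorted_values[ranks]
        return surrogate

    x = np.cumsum(np.random.default_rng(6).standard_normal(512))  # toy "runoff-like" series
    s = iaaft(x)
    print("means match:", np.isclose(x.mean(), s.mean()))
    print("spectral amplitude correlation: %.3f"
          % np.corrcoef(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s)))[0, 1])
    ```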

  12. TEMPORAL SIGNATURES OF AIR QUALITY OBSERVATIONS AND MODEL OUTPUTS: DO TIME SERIES DECOMPOSITION METHODS CAPTURE RELEVANT TIME SCALES?

    Science.gov (United States)

    Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...

  13. A unified nonlinear stochastic time series analysis for climate science

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S.

    2017-01-01

    Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability. PMID:28287128

  14. Forecasting incidence of dengue in Rajasthan, using time series analyses

    Directory of Open Access Journals (Sweden)

    Sunil Bhatnagar

    2012-01-01

    Aim: To develop a prediction model for dengue fever/dengue haemorrhagic fever (DF/DHF) using time series data over the past decade in Rajasthan and to forecast monthly DF/DHF incidence for 2011. Materials and Methods: A seasonal autoregressive integrated moving average (SARIMA) model was used for statistical modeling. Results: During January 2001 to December 2010, the reported DF/DHF cases showed a cyclical pattern with seasonal variation. The SARIMA (0,0,1)(0,1,1)12 model had the lowest normalized Bayesian information criterion (BIC) of 9.426 and a mean absolute percentage error (MAPE) of 263.361, and appeared to be the best model. The proportion of variance explained by the model was 54.3%. Adequacy of the model was established through the Ljung-Box test (Q statistic 4.910 and P-value 0.996), which showed no significant correlation between residuals at different lag times. The forecast for the year 2011 showed a seasonal peak in the month of October with an estimated 546 cases. Conclusion: Application of the SARIMA model may be useful for forecasting cases and impending outbreaks of DF/DHF and other infectious diseases which exhibit a seasonal pattern.
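
    The reported specification, SARIMA(0,0,1)(0,1,1)12, can be fitted with statsmodels as sketched below; the monthly case counts are synthetic placeholders, not the Rajasthan surveillance data.

    ```python
    # Fitting the reported SARIMA(0,0,1)(0,1,1)12 specification with statsmodels.
    # The monthly case counts are synthetic placeholders, not the surveillance data.
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(7)
    months = 120                                            # Jan 2001 - Dec 2010
    seasonal_peak = 200 * np.exp(-0.5 * ((np.arange(months) % 12 - 9) / 1.5) ** 2)
    cases = np.maximum(0, seasonal_peak + rng.normal(0, 20, months)).round()

    model = SARIMAX(cases, order=(0, 0, 1), seasonal_order=(0, 1, 1, 12))
    results = model.fit(disp=False)
    print("AIC:", round(results.aic, 1))

    forecast_2011 = results.forecast(steps=12)              # monthly forecast for the next year
    print("forecast peak month (1=Jan):", int(np.argmax(forecast_2011)) + 1)
    ```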

  15. A unified nonlinear stochastic time series analysis for climate science

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S.

    2017-03-01

    Earth’s orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  16. Auto-Regressive Models of Non-Stationary Time Series with Finite Length

    Institute of Scientific and Technical Information of China (English)

    FEI Wanchun; BAI Lun

    2005-01-01

    To analyze and simulate non-stationary time series with finite length, the statistical characteristics and auto-regressive (AR) models of non-stationary time series with finite length are discussed and studied. A new AR model, called the time-varying parameter AR model, is proposed for non-stationary time series with finite length. The auto-covariances of time series simulated by means of several AR models are analyzed. The results show that the new AR model can be used to simulate and generate a new time series with the same auto-covariance as the original time series. The size curves of cocoon filaments, regarded as non-stationary time series with finite length, are experimentally simulated. The simulation results are significantly better than those obtained so far, and illustrate the applicability of the time-varying parameter AR model. The results are useful for analyzing and simulating non-stationary time series with finite length.

  17. GPS coordinate time series measurements in Ontario and Quebec, Canada

    Science.gov (United States)

    Samadi Alinia, Hadis; Tiampo, Kristy F.; James, Thomas S.

    2017-01-01

    New precise network solutions for continuous GPS (cGPS) stations distributed in eastern Ontario and western Québec provide constraints on the regional three-dimensional crustal velocity field. Five years of continuous observations at fourteen cGPS sites were analyzed using Bernese GPS processing software. Several different sub-networks were chosen from these stations, and the data were processed and compared in order to select the optimal configuration to accurately estimate the vertical and horizontal station velocities and minimize the associated errors. The coordinate time series were then compared to the crustal motions from global solutions, and the optimized solution is presented here. A noise model combining power-law and white noise, which best describes the noise characteristics of all three coordinate components, was employed for the GPS time series analysis. The linear trend, associated uncertainties, and the spectral index of the power-law noise were calculated using a maximum likelihood estimation approach. The residual horizontal velocities, after removal of rigid plate motion, have a magnitude consistent with expected glacial isostatic adjustment (GIA). The vertical velocities increase from subsidence of almost 1.9 mm/year south of the Great Lakes to uplift near Hudson Bay, where the highest rate is approximately 10.9 mm/year. The residual horizontal velocities range from approximately 0.5 mm/year, oriented south-southeastward, at the Great Lakes to nearly 1.5 mm/year directed toward the interior of Hudson Bay at stations adjacent to its shoreline. Here, the velocity uncertainties are estimated at less than 0.6 mm/year for the horizontal component and 1.1 mm/year for the vertical component. A comparison between the observed velocities and GIA model predictions, for a limited range of Earth models, shows a better fit to the observations for the Earth model with the smallest upper mantle viscosity and the largest lower mantle viscosity. However, the

  18. Markov chain modeling of precipitation time series: Modeling waiting times between tipping bucket rain gauge tips

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Madsen, Henrik; Arnbjerg-Nielsen, Karsten

    2011-01-01

    A very fine temporal and volumetric resolution precipitation time series is modeled using Markov models. Both 1st and 2nd order Markov models as well as seasonal and diurnal models are investigated and evaluated using likelihood based techniques. The 2nd order Markov model is found to be insignif...
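    A minimal sketch of the basic ingredient, assuming a binary tip/no-tip series: maximum-likelihood transition probabilities for 1st- and 2nd-order Markov chains, compared by in-sample log-likelihood (the paper's seasonal and diurnal extensions are not reproduced).

        # Sketch: fit 1st- and 2nd-order Markov models to a binary (tip / no-tip) series
        # and compare them by log-likelihood.
        import numpy as np

        def fit_markov(x, order):
            counts = {}
            for t in range(order, len(x)):
                key = tuple(x[t - order:t])
                counts.setdefault(key, np.zeros(2))[x[t]] += 1
            probs = {k: c / c.sum() for k, c in counts.items()}
            loglik = sum(np.log(probs[tuple(x[t - order:t])][x[t]])
                         for t in range(order, len(x)))
            return probs, loglik

        rng = np.random.default_rng(0)
        series = (rng.random(5000) < 0.1).astype(int)   # toy stand-in for tip occurrences
        for k in (1, 2):
            _, ll = fit_markov(series, k)
            print(f"order {k}: log-likelihood {ll:.1f}")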

  19. Estimation of vegetation cover resilience from satellite time series

    Directory of Open Access Journals (Sweden)

    T. Simoniello

    2008-07-01

    Full Text Available Resilience is a fundamental concept for understanding vegetation as a dynamic component of the climate system. It expresses the ability of ecosystems to tolerate disturbances and to recover their initial state. Recovery times are basic parameters of the vegetation's response to forcing and, therefore, are essential for describing realistic vegetation within dynamical models. Healthy vegetation tends to recover rapidly from shocks and to persist in growth and expansion. On the contrary, climatic and anthropic stress can reduce resilience, thus favouring a persistent decrease in vegetation activity.

    In order to characterize resilience, we analyzed the 1982–2003 time series of 8 km GIMMS AVHRR-NDVI maps of the Italian territory. Persistence probability of negative and positive trends was estimated according to vegetation cover class, altitude, and climate. Generally, mean recovery times from negative trends were shorter than those estimated for positive trends, as expected for healthy vegetation. Some signatures of inefficient resilience were found in high-elevation mountainous areas and in the Mediterranean sub-tropical ones. This analysis was refined by aggregating pixels according to phenology. This multitemporal clustering synthesized information on vegetation cover, climate, and orography rather well. The consequent persistence estimates confirmed and refined the indications obtained from the previous analyses. Under the same climatic regime, different vegetation resilience levels were found. In particular, within the Mediterranean sub-tropical climate, clustering was able to identify features with different persistence levels in areas subject to different levels of anthropic pressure. Moreover, it was capable of highlighting reduced vegetation resilience also in the southern areas under Warm Temperate sub-continental climate. The general consistency of the obtained results showed that, with the help of suited analysis

  20. Inverse method for estimating respiration rates from decay time series

    Directory of Open Access Journals (Sweden)

    D. C. Forney

    2012-03-01

    Full Text Available Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.
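    A toy numerical sketch of a regularized inversion of this kind, assuming the multi-exponential decay model g(t) = Σ_j p_j exp(−k_j t); the rate grid and regularization weight are illustrative and this is not the authors' Matlab code.

        # Sketch: Tikhonov-regularized, non-negative inversion for a distribution of decay
        # rates from a mass-remaining series g(t) = sum_j p_j * exp(-k_j * t).
        import numpy as np
        from scipy.optimize import nnls

        t = np.linspace(0, 10, 40)                          # observation times
        g_obs = 0.6 * np.exp(-0.2 * t) + 0.4 * np.exp(-2.0 * t)
        g_obs = g_obs + np.random.default_rng(1).normal(0, 0.01, t.size)   # noisy data

        k_grid = np.logspace(-2, 1, 60)                     # candidate decay rates
        A = np.exp(-np.outer(t, k_grid))                    # design matrix
        lam = 0.1                                           # regularization weight (illustrative)

        # nnls on the augmented system minimizes ||A p - g||^2 + lam^2 ||p||^2 with p >= 0
        A_aug = np.vstack([A, lam * np.eye(k_grid.size)])
        b_aug = np.concatenate([g_obs, np.zeros(k_grid.size)])
        p, _ = nnls(A_aug, b_aug)
        print("recovered mass fractions sum to", round(p.sum(), 3))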

  1. Inverse method for estimating respiration rates from decay time series

    Directory of Open Access Journals (Sweden)

    D. C. Forney

    2012-09-01

    Full Text Available Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.

  2. River flow time series using least squares support vector machines

    Science.gov (United States)

    Samsudin, R.; Saad, P.; Shabri, A.

    2011-06-01

    This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables, which serve as the inputs for time series forecasting with the LSSVM model. Monthly river flow data from two stations, the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia, were used in the development of this hybrid model. The performance of this model was compared with conventional artificial neural network (ANN) models, the Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models, using long-term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performances. In both cases, the new hybrid model has been found to provide more accurate flow forecasts than the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.

  3. Enhancing time-series detection algorithms for automated biosurveillance.

    Science.gov (United States)

    Tokars, Jerome I; Burkom, Howard; Xing, Jian; English, Roseanne; Bloom, Steven; Cox, Kenneth; Pavlin, Julie A

    2009-04-01

    BioSense is a US national system that uses data from health information systems for automated disease surveillance. We studied 4 time-series algorithm modifications designed to improve sensitivity for detecting artificially added data. To test these modified algorithms, we used reports of daily syndrome visits from 308 Department of Defense (DoD) facilities and 340 hospital emergency departments (EDs). At a constant alert rate of 1%, sensitivity was improved for both datasets by using a minimum standard deviation (SD) of 1.0, a 14-28 day baseline duration for calculating mean and SD, and an adjustment for total clinic visits as a surrogate denominator. Stratifying baseline days into weekdays versus weekends to account for day-of-week effects increased sensitivity for the DoD data but not for the ED data. These enhanced methods may increase sensitivity without increasing the alert rate and may improve the ability to detect outbreaks by using automated surveillance system data.
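    A simplified sketch of the kind of adjustments described (not the BioSense implementation): a trailing 28-day baseline, a standard-deviation floor, and a total-visit denominator; here the floor is applied to visit proportions purely for illustration, whereas the paper's floor of 1.0 applies to raw counts.

        # Sketch: flag days whose syndrome proportion exceeds the trailing-baseline mean
        # plus z standard deviations, with an SD floor and a surrogate denominator.
        import numpy as np

        def alerts(syndrome, total_visits, baseline=28, z=3.0, min_sd=0.001):
            rate = syndrome / np.maximum(total_visits, 1)    # denominator adjustment
            flags = np.zeros(len(rate), dtype=bool)
            for t in range(baseline, len(rate)):
                base = rate[t - baseline:t]
                sd = max(base.std(ddof=1), min_sd)           # SD floor avoids zero-variance baselines
                flags[t] = rate[t] > base.mean() + z * sd
            return flags

        rng = np.random.default_rng(0)
        visits = rng.poisson(500, 365)
        syndrome = rng.poisson(10, 365)
        syndrome[200] += 40                                  # injected signal on day 200
        print(np.where(alerts(syndrome, visits))[0])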

  4. Nonlinear Time Series Analysis in Earth Sciences - Potentials and Pitfalls

    Science.gov (United States)

    Kurths, Jürgen; Donges, Jonathan F.; Donner, Reik V.; Marwan, Norbert; Zou, Yong

    2010-05-01

    The application of methods of nonlinear time series analysis has a rich tradition in Earth sciences and has enabled substantially new insights into various complex processes there. However, some approaches and findings have been controversially discussed over the last decades. One reason is that they are often based on strong restrictions, and their violation may lead to pitfalls and misinterpretations. Here, we discuss three general concepts of nonlinear dynamics and statistical physics: synchronization, recurrence, and complex networks, and explain how to use them for data analysis. We show that the corresponding methods can be applied even to rather short and non-stationary data which are typical in Earth sciences. References: Marwan, N., Romano, M., Thiel, M., Kurths, J.: Recurrence plots for the analysis of complex systems, Physics Reports 438, 237-329 (2007); Arenas, A., Diaz-Guilera, A., Kurths, J., Moreno, Y., Zhou, C.: Synchronization in complex networks, Physics Reports 469, 93-153 (2008); Marwan, N., Donges, J.F., Zou, Y., Donner, R. and Kurths, J., Phys. Lett. A 373, 4246 (2009); Donges, J.F., Zou, Y., Marwan, N. and Kurths, J., Europhys. Lett. 87, 48007 (2009); Donner, R., Zou, Y., Donges, J.F., Marwan, N. and Kurths, J., Phys. Rev. E 81, 015101(R) (2010)
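    Of the three concepts named, recurrence is the most compact to illustrate; a minimal sketch, assuming a scalar series and fixed embedding parameters, of the thresholded distance matrix that underlies recurrence plots.

        # Sketch: recurrence matrix R(i, j) = 1 if ||X_i - X_j|| < eps for a
        # time-delay embedded scalar series (the basis of recurrence plots).
        import numpy as np

        def embed(x, dim=3, tau=5):
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

        def recurrence_matrix(x, dim=3, tau=5, eps=0.2):
            X = embed(np.asarray(x, float), dim, tau)
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            return (d < eps).astype(int)

        x = np.sin(0.2 * np.arange(500)) + 0.05 * np.random.default_rng(2).normal(size=500)
        R = recurrence_matrix(x)
        print("recurrence rate:", round(R.mean(), 3))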

  5. Forecasting Financial Time-Series using Artificial Market Models

    CERN Document Server

    Gupta, N; Johnson, N F; Gupta, Nachi; Hauser, Raphael; Johnson, Neil F.

    2005-01-01

    We discuss the theoretical machinery involved in predicting financial market movements using an artificial market model which has been trained on real financial data. This approach to market prediction - in particular, forecasting financial time-series by training a third-party or 'black box' game on the financial data itself -- was discussed by Johnson et al. in cond-mat/0105303 and cond-mat/0105258 and was based on some encouraging preliminary investigations of the dollar-yen exchange rate, various individual stocks, and stock market indices. However, the initial attempts lacked a clear formal methodology. Here we present a detailed methodology, using optimization techniques to build an estimate of the strategy distribution across the multi-trader population. In contrast to earlier attempts, we are able to present a systematic method for identifying 'pockets of predictability' in real-world markets. We find that as each pocket closes up, the black-box system needs to be 'reset' - which is equivalent to sayi...

  6. Analysis of surface atrial signals: time series with missing data?

    Science.gov (United States)

    Sassi, Roberto; Corino, Valentina D A; Mainardi, Luca T

    2009-10-01

    Uncovering of the atrial signal for patients undergoing episodes of atrial fibrillation is usually obtained from the surface ECG by removing waves induced by ventricular activity. Once the atrial signal is obtained, the detection of the dominant fibrillation frequency is often the main (and only) goal. In this work we verified whether subtraction of the ventricular activity might be avoided by performing spectral analysis on those ECG segments where ventricular activity is absent (i.e., the T-Q intervals). While the approach might seem crude, here the question was recast as a problem of missing data in a long time series, and proper methods were applied: the Lomb periodogram and the iterative Singular Spectrum Analysis. The two methods were tested on both simulated signals and "realistic" atrial signals constructed using the ECG recordings provided by the 2004 Computers in Cardiology competition. The results obtained showed that both techniques were able to provide a reliable quantification of the dominant oscillation, with a slightly superior performance of the iterative Singular Spectrum Analysis. Absolute errors larger than 1.0 Hz were unlikely (p < 0.05) up to 130-140 bpm. Such a level of agreement is consistent with similar comparative works where techniques for separating the atrial signal from ventricular waves were considered.
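    A hedged sketch of the Lomb periodogram step with SciPy, where the retained samples stand in for T-Q interval data and the gap pattern and frequencies are illustrative.

        # Sketch: Lomb-Scargle periodogram of an unevenly sampled signal, standing in for
        # spectral analysis restricted to T-Q segments (samples inside QRS-T discarded).
        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(3)
        fs, f_atrial = 250.0, 6.0                        # Hz; ~6 Hz dominant atrial rate
        t_full = np.arange(0, 10, 1 / fs)
        keep = rng.random(t_full.size) > 0.4             # pretend ~40% of samples are unusable
        t = t_full[keep]
        x = np.sin(2 * np.pi * f_atrial * t) + 0.2 * rng.normal(size=t.size)

        freqs = np.linspace(0.5, 15, 2000)               # Hz
        pgram = lombscargle(t, x - x.mean(), 2 * np.pi * freqs)   # expects angular frequencies
        print("dominant frequency ~", round(freqs[np.argmax(pgram)], 2), "Hz")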

  7. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Meng Li

    2015-01-01

    Full Text Available This paper puts forward a prediction model for chaotic time series based on a membrane computing optimization algorithm; the model simultaneously optimizes the parameters of the phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) by using the membrane computing optimization algorithm. Accurate prediction of the trends of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt an optimal action. The model presented in this paper is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, it is compared with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best according to three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE).
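    A minimal sketch of the phase space reconstruction step for given (τ, m); the membrane computing optimization of (τ, m) and of the LS-SVM hyperparameters (γ, σ) is not reproduced here, and the sine series is only a placeholder for band occupancy data.

        # Sketch: time-delay (tau, m) phase-space reconstruction producing the
        # input vectors and one-step-ahead targets for a chaotic series predictor.
        import numpy as np

        def reconstruct(x, m, tau):
            """Embedded vectors X[t] = (x[t], x[t+tau], ..., x[t+(m-1)tau]) and targets y[t]."""
            n = len(x) - (m - 1) * tau - 1
            X = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
            y = x[(m - 1) * tau + 1:(m - 1) * tau + 1 + n]
            return X, y

        x = np.sin(0.3 * np.arange(1000))       # placeholder for a measured series
        X, y = reconstruct(x, m=4, tau=3)
        print(X.shape, y.shape)                 # (n_samples, m), (n_samples,)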

  8. Linear Detrending Subsequence Matching in Time-Series Databases

    CERN Document Server

    Gil, Myeong-Seon; Kim, Bum-Soo

    2010-01-01

    Each time series has its own linear trend, the directionality of the series, and removing this linear trend is crucial for obtaining more intuitive matching results. Supporting linear detrending in subsequence matching is a challenging problem due to the huge number of possible subsequences. In this paper we define this problem as linear detrending subsequence matching and propose an efficient index-based solution. To this end, we first present the notion of LD-windows (LD means linear detrending), which are obtained as follows: we eliminate the linear trend from a subsequence rather than from each window itself, and obtain LD-windows by dividing the subsequence into windows. Using the LD-windows we then present a lower bounding theorem for the index-based matching solution and formally prove its correctness. Based on the lower bounding theorem, we next propose the index building and subsequence matching algorithms for linear detrending subsequence matching. We finally show the superiority of our index-based solution...

  9. Imputation of missing data in time series for air pollutants

    Science.gov (United States)

    Junger, W. L.; Ponce de Leon, A.

    2015-02-01

    Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, which uses the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to deteriorate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations yielded valid results, even under a missing-not-at-random mechanism. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
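    The paper's method is distributed as the R package mtsdi; as a rough Python analogue (not the same EM algorithm, and without the temporal filtering step), scikit-learn's iterative imputer can fill gaps in a small synthetic pollutant matrix.

        # Sketch: model-based imputation of missing values in a multivariate series,
        # using scikit-learn's IterativeImputer as a stand-in for the EM-based method.
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(4)
        n = 365
        pm10 = 30 + 5 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
        no2 = 0.8 * pm10 + rng.normal(0, 2, n)
        data = np.column_stack([pm10, no2])
        data[rng.random(data.shape) < 0.10] = np.nan        # ~10% missing at random

        filled = IterativeImputer(max_iter=20, random_state=0).fit_transform(data)
        print("remaining NaNs:", int(np.isnan(filled).sum()))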

  10. On the Fourier and Wavelet Analysis of Coronal Time Series

    Science.gov (United States)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-07-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence & Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence & Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  11. Chaotic time series analysis of vision evoked EEG

    Science.gov (United States)

    Zhang, Ningning; Wang, Hong

    2010-01-01

    To investigate human brain activity during aesthetic processing, a beautiful woman face picture and an ugly buffoon face picture were presented. Twelve subjects were assigned the aesthetic processing task while the electroencephalogram (EEG) was recorded. Event-related potentials (ERPs) were acquired from the 32 scalp electrodes, and the ugly buffoon picture produced larger amplitudes for the N1, P2, N2, and late slow wave components. Averaged ERPs from the ugly buffoon picture were larger than those from the beautiful woman picture. The ERP signals show that the ugly buffoon elicited stronger emotional waves than the beautiful woman face, because of the expression on the buffoon's face. Then, chaotic time series analysis was carried out to calculate the largest Lyapunov exponent using the small data set method and the correlation dimension using the G-P algorithm. The results show that the largest Lyapunov exponents of the ERP signals are greater than zero, which indicates that the ERP signals may be chaotic. The correlation dimensions obtained from the beautiful woman picture are larger than those from the ugly buffoon picture. The comparison of the correlation dimensions suggests that the beautiful face can excite the brain's nerve cells. The research in this paper provides persuasive support for the view that cerebral activity is chaotic under certain picture stimuli.
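    A compact sketch of the Grassberger-Procaccia (G-P) correlation sum on a delay-embedded series; the embedding parameters and test signal are illustrative, and the small-data-set Lyapunov estimator is omitted for brevity.

        # Sketch: Grassberger-Procaccia correlation sum C(r); the correlation dimension
        # is the slope of log C(r) versus log r over the scaling region.
        import numpy as np

        def correlation_sum(x, dim=5, tau=2, n_radii=20):
            n = len(x) - (dim - 1) * tau
            X = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
            d = d[np.triu_indices(n, k=1)]                   # distinct pairs only
            radii = np.logspace(-2, 0, n_radii) * d.max()
            C = np.array([(d < r).mean() for r in radii])
            return radii, C

        x = np.random.default_rng(5).standard_normal(800).cumsum()   # toy signal
        r, C = correlation_sum(x)
        mask = C > 0
        slope = np.polyfit(np.log(r[mask]), np.log(C[mask]), 1)[0]
        print("correlation dimension estimate ~", round(slope, 2))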

  12. The PRIMAP-hist national historical emissions time series

    Science.gov (United States)

    Gütschow, Johannes; Jeffery, M. Louise; Gieseke, Robert; Gebel, Ronja; Stevens, David; Krapp, Mario; Rocha, Marcia

    2016-11-01

    To assess the history of greenhouse gas emissions and individual countries' contributions to emissions and climate change, detailed historical data are needed. We combine several published datasets to create a comprehensive set of emissions pathways for each country and Kyoto gas, covering the years 1850 to 2014 with yearly values, for all UNFCCC member states and most non-UNFCCC territories. The sectoral resolution is that of the main IPCC 1996 categories. Additional time series of CO2 are available for energy and industry subsectors. Country-resolved data are combined from different sources and supplemented using year-to-year growth rates from regionally resolved sources and numerical extrapolations to complete the dataset. Regional deforestation emissions are downscaled to country level using estimates of the deforested area obtained from potential vegetation and simulations of agricultural land. In this paper, we discuss the data sources and methods used and present the resulting dataset, including its limitations and uncertainties. The dataset is available from doi:10.5880/PIK.2016.003 and can be viewed on the website accompanying this paper (http://www.pik-potsdam.de/primap-live/primap-hist/).

  13. Optimal model-free prediction from multivariate time series.

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors, which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced, utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers, making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.

  14. Optimizing the search for transiting planets in long time series

    CERN Document Server

    Ofir, Aviv

    2013-01-01

    Context: Transit surveys, both ground- and space- based, have already accumulated a large number of light curves that span several years. Aims: The search for transiting planets in these long time series is computationally intensive. We wish to optimize the search for both detection and computational efficiencies. Methods: We assume that the searched systems can be well described by Keplerian orbits. We then propagate the effects of different system parameters to the detection parameters. Results: We show that the frequency information content of the light curve is primarily determined by the duty cycle of the transit signal, and thus the optimal frequency sampling is found to be cubic and not linear. Further optimization is achieved by considering duty-cycle dependent binning of the phased light curve. By using the (standard) BLS one is either rather insensitive to long-period planets, or less sensitive to short-period planets and computationally slower by a significant factor of ~330 (for a 3yr long dataset...

  15. Innovative techniques to analyze time series of geomagnetic activity indices

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Potirakis, Stelios M.; Eftaxias, Konstantinos

    2016-04-01

    Magnetic storms are undoubtedly among the most important phenomena in space physics and also a central subject of space weather. The non-extensive Tsallis entropy has recently been introduced as an effective complexity measure for the analysis of the geomagnetic activity Dst index. The Tsallis entropy sensitively shows the complexity dissimilarity among different "physiological" (normal) and "pathological" states (intense magnetic storms). More precisely, the Tsallis entropy implies the emergence of two distinct patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by a higher degree of organization, and (ii) a pattern associated with normal periods, which is characterized by a lower degree of organization. Other entropy measures such as Block Entropy, T-Complexity, Approximate Entropy, Sample Entropy and Fuzzy Entropy verify the above-mentioned result. Importantly, the wavelet spectral analysis in terms of the Hurst exponent, H, also shows the existence of two different patterns: (i) a pattern associated with the intense magnetic storms, which is characterized by fractional Brownian persistent behavior, and (ii) a pattern associated with normal periods, which is characterized by fractional Brownian anti-persistent behavior. Finally, we observe universality in the magnetic storm and earthquake dynamics, on the basis of a modified form of the Gutenberg-Richter law for the Tsallis statistics. This finding suggests a common approach to the interpretation of both phenomena in terms of the same driving physical mechanism. Signatures of discrete scale invariance in Dst time series further support the aforementioned proposal.
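    A short sketch of the central quantity, the non-extensive Tsallis entropy S_q = (1 − Σ_i p_i^q)/(q − 1), computed over sliding windows of a synthetic index; the entropic index, window length, and data are all illustrative.

        # Sketch: Tsallis entropy of sliding windows of an activity index,
        # S_q = (1 - sum_i p_i^q) / (q - 1), with p_i from a histogram of the window.
        import numpy as np

        def tsallis_entropy(window, q=1.8, bins=16):
            counts, _ = np.histogram(window, bins=bins)
            p = counts[counts > 0] / counts.sum()
            return (1.0 - np.sum(p ** q)) / (q - 1.0)

        rng = np.random.default_rng(6)
        index = np.concatenate([rng.normal(-15, 10, 2000),     # quiet-time-like values
                                rng.normal(-150, 60, 500)])    # storm-like excursion
        w = 256
        S = [tsallis_entropy(index[i:i + w]) for i in range(0, len(index) - w, w)]
        print(np.round(S, 3))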

  16. Landslide monitoring using airphotos time series and GIS

    Science.gov (United States)

    Kavoura, Katerina; Nikolakopoulos, Konstantinos G.; Sabatakakis, Nikolaos

    2014-10-01

    Western Greece suffers from landslides. The term landslide includes a wide range of ground movement, such as slides, falls, flows, etc., driven mainly by gravity with the aid of many conditioning and triggering factors. Landslides provoke enormous changes to the natural and artificial relief. The annual cost of repairing the damage amounts to millions of euros. In this paper a combined use of air-photo time series, high-resolution remote sensing data and GIS for landslide monitoring is presented. Analog and digital air-photos used covered a period of almost 70 years, from 1945 until 2012. Classical analog air-photos covered the period from 1945 to 2000, while digital air-photos and satellite images covered the 2008-2012 period. The air photos have been orthorectified using the Leica Photogrammetry Suite. Ground control points and a high-accuracy DSM were used for the orthorectification of the air photos. The 2008 digital air photo mosaic from the Greek Cadastral, with a spatial resolution of 25 cm, and the respective DSM were used as the base map for all the other data sets. The RMS error was less than 0.5 pixel. Changes to the artificial constructions provoked by the landslides were digitized and then implemented in an ARCGIS database. The results are presented in this paper.

  17. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.

  18. Task-Driven Evaluation of Aggregation in Time Series Visualization.

    Science.gov (United States)

    Albers, Danielle; Correll, Michael; Gleicher, Michael

    2014-01-01

    Many visualization tasks require the viewer to make judgments about aggregate properties of data. Recent work has shown that viewers can perform such tasks effectively, for example to efficiently compare the maximums or means over ranges of data. However, this work also shows that such effectiveness depends on the designs of the displays. In this paper, we explore this relationship between aggregation task and visualization design to provide guidance on matching tasks with designs. We combine prior results from perceptual science and graphical perception to suggest a set of design variables that influence performance on various aggregate comparison tasks. We describe how choices in these variables can lead to designs that are matched to particular tasks. We use these variables to assess a set of eight different designs, predicting how they will support a set of six aggregate time series comparison tasks. A crowd-sourced evaluation confirms these predictions. These results not only provide evidence for how the specific visualizations support various tasks, but also suggest using the identified design variables as a tool for designing visualizations well suited for various types of tasks.

  19. Detection of a sudden change of the field time series based on the Lorenz system

    Science.gov (United States)

    Li, Fang; Shen, BingLu; Yan, PengCheng; Song, Jian; Ma, DeShan

    2017-01-01

    We conducted an exploratory study of the detection of a sudden change of the field time series based on the numerical solution of the Lorenz system. First, the time when the Lorenz path jumped between the regions on the left and right of the equilibrium point of the Lorenz system was quantitatively marked, and the sudden change time of the Lorenz system was obtained. Second, the numerical solution of the Lorenz system was regarded as a vector; thus, this solution could be considered as a vector time series. We transformed the vector time series into a scalar time series using the vector inner product, considering the geometric and topological features of the Lorenz system path. Third, the sudden change of the resulting time series was detected using the sliding t-test method. Comparing the test results with the quantitatively marked time indicated that the method could detect every sudden change of the Lorenz path, showing that the method is effective. Finally, we used the method to detect the sudden change of the pressure field time series and temperature field time series, and obtained good results for both series, which indicates that the method can be applied to high-dimensional vector time series. Mathematically, there is no essential difference between the field time series and vector time series; thus, we provide a new method for the detection of the sudden change of the field time series. PMID:28141832
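    A minimal sketch of the sliding t-test stage, assuming the field or vector series has already been reduced to a scalar series (e.g., via the inner product); window length and data are illustrative.

        # Sketch: sliding t-test for abrupt-change detection in a scalar series;
        # a large |t| between the two adjacent windows marks a candidate change point.
        import numpy as np
        from scipy import stats

        def sliding_t(series, window=30):
            t_vals = np.full(len(series), np.nan)
            for i in range(window, len(series) - window):
                left, right = series[i - window:i], series[i:i + window]
                t_vals[i] = stats.ttest_ind(left, right, equal_var=False).statistic
            return t_vals

        rng = np.random.default_rng(7)
        x = np.concatenate([rng.normal(0, 1, 300), rng.normal(2, 1, 300)])
        t_vals = sliding_t(x)
        print("largest |t| near index", int(np.nanargmax(np.abs(t_vals))))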

  20. Reconstructing Ocean Circulation using Coral Δ14C Time Series

    Energy Technology Data Exchange (ETDEWEB)

    Kashgarian, M; Guilderson, T P

    2001-02-23

    the invasion of fossil fuel CO2 and bomb 14C into the atmosphere and surface oceans. Therefore the Δ14C data that are produced in this study can be used to validate the ocean uptake of fossil fuel CO2 in coupled ocean-atmosphere models. This study takes advantage of the quasi-conservative nature of 14C as a water mass tracer by using Δ14C time series in corals to identify changes in the shallow circulation of the Pacific. Although the data themselves provide fundamental information on surface water mass movement, the true strength lies in a combined approach that is greater than the individual parts: the data help uncover deficiencies in ocean circulation models, and the model results place long Δ14C time series in a dynamic framework that helps to identify those locations where additional observations are most needed.

  1. A time-series approach to dynamical systems from classical and quantum worlds

    Science.gov (United States)

    Fossion, Ruben

    2014-01-01

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  2. A time-series approach to dynamical systems from classical and quantum worlds

    Energy Technology Data Exchange (ETDEWEB)

    Fossion, Ruben [Instituto Nacional de Geriatría, Periférico Sur No. 2767, Col. San Jerónimo Lídice, Del. Magdalena Contreras, 10200 México D.F., Mexico and Centro de Ciencias de la Complejidad (C3), Universidad Nacional Autó (Mexico)

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  3. Implementing a real-time data stream for time-series stellar photometry

    Science.gov (United States)

    Bogosavljevic, M.; Ioannou, Z.

    2016-07-01

    We present a new automated photometric pipeline optimized for time-series photometry that includes a real-time data streaming service. An observer using this resource can automatically stream photometric data over the Internet. Other observers can then monitor the data stream in real time and make an informed decision about whether to perform complementary observations of a transient event in progress. Our pipeline uses a modular design so that it can be easily implemented or customized as a real-time robotic telescope pipeline on any observatory. The pipeline is controlled through the user-friendly SAOImage DS9 package.

  4. Nonlinear Time Series Analysis of White Dwarf Light Curves

    Science.gov (United States)

    Jevtic, N.; Zelechoski, S.; Feldman, H.; Peterson, C.; Schweitzer, J.

    2001-12-01

    We use nonlinear time series analysis methods to examine the light intensity curves of white dwarf PG1351+489 obtained by the Whole Earth Telescope (WET). Though these methods were originally introduced to study chaotic systems, when a clear signature of determinism is found for the process generating an observable and it couples the active degrees of freedom of the system, then the notion of phase space provides a framework for exploring the system dynamics of nonlinear systems in general. With a pronounced single frequency, its harmonics and other frequencies of lower amplitude on a broadband background, the PG1351 light curve lends itself to the use of time delay coordinates. Our phase space reconstruction yields a triangular, toroidal three-dimensional shape. This differs from earlier results of a circular toroidal representation. We find a morphological similarity to a magnetic dynamo model developed for fast rotators that yields a union of both results: the circular phase space structure for the ascending portion of the cycle, and the triangular structure for the declining portion. The rise and fall of the dynamo cycle yield both different phase space representations and different correlation dimensions. Since PG1351 is known to have no significant fields, these results may stimulate the observation of light curves of known magnetic white dwarfs for comparison. Using other data obtained by the WET, we compare the phase space reconstruction of DB white dwarf PG1351 with that of GD 358 which has a more complex power spectrum. We also compare these results with those for PG1159. There is some general similarity between the results of the phase space reconstruction for the DB white dwarfs. As expected, the difference between the results for the DB white dwarfs and PG1159 is great.

  5. Staircase baker's map generates flaring-type time series

    Directory of Open Access Journals (Sweden)

    G. Radons

    2000-01-01

    Full Text Available The baker’s map, invented by Eberhard Hopf in 1937, is an intuitively accessible, two-dimensional chaos-generating discrete dynamical system. This map, which describes the transformation of an idealized two-dimensional dough by stretching, cutting and piling, is non-dissipative. Nevertheless the “x” variable is identical with the dissipative, one-dimensional Bernoulli-shift-generating map. The generalization proposed here takes up ideas of Yaacov Sinai in a modified form. It has a staircase-like shape, with every next step half as high as the preceding one. Each pair of neighboring elements exchanges an equal volume (area) during every iteration step in a scaled manner. Since the density of iterated points is constant, the thin tail (to the right, say) is visited only exponentially rarely. This observation already explains the map's main qualitative behavior: the “x” variable shows “flares”. The time series of this variable is closely analogous to that of a flaring-type dissipative dynamical system – like those recently described in an abstract economic model. An initial point starting its journey in the tail (or “antenna”, if we tilt the map upwards by 90 degrees) is predictably attracted by the broad left-hand (bottom) part, in order to only very rarely venture out again to the tip. Yet whenever it does so, it thereby creates, with the top of a flare, a new “far-from-equilibrium” initial condition, in this reversible system. The system therefore qualifies as a discrete analogue to a far-from-equilibrium multiparticle Hamiltonian system. The height of the flare hereby corresponds to the momentary height of the H function of a gas. An observable which is even more closely related to the momentary negative entropy was recently described. Depending on the numerical accuracy chosen, “Poincaré cycles” of two different types (periodic and nonperiodic) can be observed for the first time.

  6. Optimizing the search for transiting planets in long time series

    Science.gov (United States)

    Ofir, Aviv

    2014-01-01

    Context. Transit surveys, both ground- and space-based, have already accumulated a large number of light curves that span several years. Aims: The search for transiting planets in these long time series is computationally intensive. We wish to optimize the search for both detection and computational efficiencies. Methods: We assume that the searched systems can be described well by Keplerian orbits. We then propagate the effects of different system parameters to the detection parameters. Results: We show that the frequency information content of the light curve is primarily determined by the duty cycle of the transit signal, and thus the optimal frequency sampling is found to be cubic and not linear. Further optimization is achieved by considering duty-cycle dependent binning of the phased light curve. By using the (standard) BLS, one is either fairly insensitive to long-period planets or less sensitive to short-period planets and computationally slower by a significant factor of ~330 (for a 3 yr long dataset). We also show how the physical system parameters, such as the host star's size and mass, directly affect transit detection. This understanding can then be used to optimize the search for every star individually. Conclusions: By considering Keplerian dynamics explicitly rather than implicitly one can optimally search the BLS parameter space. The presented Optimal BLS enhances the detectability of both very short and very long period planets, while allowing such searches to be done with much reduced resources and time. The Matlab/Octave source code for Optimal BLS is made available. The MATLAB code is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/561/A138

  7. 3D City Transformations by Time Series of Aerial Images

    Science.gov (United States)

    Adami, A.

    2015-02-01

    Recent photogrammetric applications, based on dense image matching algorithms, make it possible not only to use images acquired by digital cameras, amateur or not, but also to recover the vast heritage of analogue photographs. This possibility opens up many opportunities for the use and enhancement of the existing photographic heritage. The search for the original appearance of old buildings, the virtual reconstruction of disappeared architectures, and the study of urban development are some of the application areas that exploit the great cultural heritage of photography. Nevertheless, there are some restrictions on the use of historical images for the automatic reconstruction of buildings, such as image quality, availability of camera parameters, and ineffective geometry of image acquisition. These constraints are very hard to overcome, and for the above reasons it is difficult to find good datasets for terrestrial close-range photogrammetry. Even the photographic archives of museums and superintendencies, while retaining a wealth of documentation, offer no datasets suited to a dense image matching approach. Compared to the vast collection of historical photos, the class of aerial photos meets both criteria stated above. In this paper historical aerial photographs are used with dense image matching algorithms to build 3D models of a city in different years. The models can be used to study the urban development of the city and its changes through time. The application relates to the city centre of Verona, for which several time series of aerial photographs have been retrieved. The models obtained in this way allowed, right away, observation of the urban development of the city, the places of expansion, and new urban areas. A more interesting aspect emerged from the analytical comparison between models: the difference, as the Euclidean distance, between two models gives information about new buildings or demolitions. Regarding accuracy, it is necessary to point out that the quality of final

  8. Downscaled TRMM Rainfall Time-Series for Catchment Hydrology Applications

    Science.gov (United States)

    Tarnavsky, E.; Mulligan, M.

    2009-04-01

    Hydrology in semi-arid regions is controlled, to a large extent, by the spatial and temporal distribution of rainfall defined in terms of rainfall depth and intensity. Thus, appropriate representation of the space-time variability of rainfall is essential for catchment-scale hydrological models applied in semi-arid regions. While spaceborne platforms equipped with remote sensing instruments provide information on a range of variables for hydrological modelling, including rainfall, the necessary spatial and temporal detail is rarely obtained from a single dataset. This paper presents a new dynamic model of dryland hydrology, DryMOD, which makes best use of free, public-domain remote sensing data for representation of key variables with a particular focus on (a) simulation of spatial rainfall fields and (b) the hydrological response to rainfall, particularly in terms of rainfall-runoff partitioning. In DryMOD, rainfall is simulated using a novel approach combining 1-km spatial detail from a climatology derived from the TRMM 2B31 dataset (mean monthly rainfall) and 3-hourly temporal detail from time-series derived from the 0.25-degree gridded TRMM 3B42 dataset (rainfall intensity). This allows for rainfall simulation at the hourly time step, as well as accumulation of infiltration, recharge, and runoff at the monthly time step. In combination with temperature, topography, and soil data, rainfall-runoff and soil moisture dynamics are simulated over large dryland regions. In order to investigate the hydrological response to rainfall and variable catchment characteristics, the model is applied to two very different catchments in the drylands of North and West Africa. The results of the study demonstrate the use of remote sensing-based estimates of precipitation intensity and volume for the simulation of critical hydrological parameters. The model allows for better spatial planning of water harvesting activities, as well as for optimisation of agricultural activities

  9. Testing coefficients of AR and bilinear time series models by a graphical approach

    Institute of Scientific and Technical Information of China (English)

    IP WaiCheung

    2008-01-01

    AR and bilinear time series models are expressed as time series chain graphical models, based on which it is shown that the coefficients of AR and bilinear models are the conditional correlation coefficients conditioned on the other components of the time series. Then a graphically based procedure is proposed to test the significance of the coefficients of AR and bilinear time series. Simulations show that our procedure performs well in terms of both size and power.

  10. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...
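    A hedged sketch of the key quantity: estimating the cepstral coefficients as the cosine-Fourier coefficients of the log periodogram, under the convention log f(ω) = c_0 + 2 Σ_{k≥1} c_k cos(kω); the series and the number of coefficients are illustrative.

        # Sketch: empirical cepstrum of a series, i.e. the cosine coefficients of the
        # log periodogram, which parameterize the exponential spectral model.
        import numpy as np

        def cepstrum(x, n_coef=10):
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            freqs = np.arange(1, n // 2) * 2 * np.pi / n       # Fourier frequencies in (0, pi)
            periodogram = np.abs(np.fft.fft(x)[1:n // 2]) ** 2 / n
            log_i = np.log(periodogram) + np.euler_gamma       # bias correction of log-periodogram
            # c_k ~ (1/pi) * integral_0^pi log f(w) cos(k w) dw, approximated by an average
            return [float(np.mean(log_i * np.cos(k * freqs))) for k in range(1, n_coef + 1)]

        x = np.random.default_rng(9).standard_normal(1024)
        x = np.convolve(x, [1.0, 0.6], mode="same")            # toy series with MA(1)-like spectrum
        print(np.round(cepstrum(x, 5), 3))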

  11. Error Sources in Deforestation Detection Using BFAST Monitor on Landsat Time Series Across Three Tropical Sites

    NARCIS (Netherlands)

    Schultz, Michael; Verbesselt, Jan; Avitabile, Valerio; Souza, Carlos; Herold, Martin

    2016-01-01

    Accurate tropical deforestation monitoring using time series requires methods which can capture gradual to abrupt changes and can account for site-specific properties of the environment and the available data. The generic time series algorithm BFAST Monitor was tested using Landsat time series at thre

  12. PROGRAMMING AND ANALYSIS FOR DIGITAL TIME SERIES DATA,

    Science.gov (United States)

    Contents: Preprocessing of data; Digital filtering; Fourier series and Fourier transform computations; Correlation function computations; Spectral density function computations; Frequency response function and coherence function computations; Probability density function computations; Nonstationary processes; and Test case and examples.

  13. The Time Series Data Server (TSDS) for Standards-Compliant, Convenient, and Efficient Access to Time Series Data

    Science.gov (United States)

    Lindholm, D. M.; Weigel, R. S.; Wilson, A.; Ware Dewolfe, A.

    2009-12-01

    Data analysis in the physical sciences is often plagued by the difficulty of acquiring the desired data. A great deal of work has been done in the area of metadata and data discovery; however, many such discoveries simply provide links that lead directly to a data file. Often these files are impractically large, containing more time samples or variables than desired, and are slow to access. Once these files are downloaded, format issues further complicate using the data. Some data servers have begun to address these problems by improving data virtualization and ease of use. However, these services often don't scale to large datasets. Also, the generic nature of the data models used by these servers, while providing greater flexibility, may complicate setting up such a service for data providers and limit the semantics that would otherwise simplify use for clients, machine or human. The Time Series Data Server (TSDS) aims to address these problems within the limited, yet common, domain of time series data. With the simplifying assumption that all data products served are a function of time, the server can optimize data access for time subsets, a common use case. The server also supports requests for specific variables, which can be of type scalar, structure, or sequence. It also supports data types with higher level semantics, such as "spectrum." The TSDS is implemented using Java Servlet technology and can be dropped into any servlet container and customized for a data provider's needs. The interface is based on OPeNDAP (http://opendap.org) and conforms to the Data Access Protocol (DAP) 2.0, a NASA standard (ESDS-RFC-004), which defines a simple HTTP request and response paradigm. Thus a TSDS server instance is a compliant OPeNDAP server that can be accessed by any OPeNDAP client or directly via RESTful web service requests. The TSDS reads the data that it serves into a common data model via the NetCDF Markup Language (NcML, http

  14. Granger Causality in Multivariate Time Series Using a Time-Ordered Restricted Vector Autoregressive Model

    Science.gov (United States)

    Siggiridou, Elsa; Kugiumtzis, Dimitris

    2016-04-01

    Granger causality has been used for the investigation of the interdependence structure of the underlying systems of multivariate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables, and the CGCI is combined with BTS. Further, the proposed approach compares favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low- and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations is compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during and after epileptiform discharges, which was not possible using the full VAR representation.
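    A compact sketch of the conditional Granger causality index itself (ordinary least squares on an unrestricted lag set); the BTS, top-down, bottom-up, and LASSO restriction schemes compared in the paper are not reproduced, and the toy system is illustrative.

        # Sketch: CGCI(X -> Y | Z) as the log ratio of residual variances of the
        # restricted (no lags of X) and full autoregressions on Y.
        import numpy as np

        def lagged_design(data, p):
            n = data.shape[0] - p
            cols = [data[p - k: p - k + n, j]
                    for k in range(1, p + 1) for j in range(data.shape[1])]
            return np.column_stack([np.ones(n)] + cols)

        def resid_var(target, data, p):
            R = lagged_design(data, p)
            beta, *_ = np.linalg.lstsq(R, target, rcond=None)
            return np.var(target - R @ beta)

        def cgci(y, x, z, p=2):
            target = y[p:]
            var_full = resid_var(target, np.column_stack([y, x, z]), p)
            var_restricted = resid_var(target, np.column_stack([y, z]), p)
            return np.log(var_restricted / var_full)

        rng = np.random.default_rng(10)
        x, z, y = rng.standard_normal(1000), rng.standard_normal(1000), np.zeros(1000)
        for t in range(1, 1000):
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.1 * rng.standard_normal()
        print("CGCI(x -> y | z):", round(cgci(y, x, z), 3))   # clearly positive
        print("CGCI(z -> y | x):", round(cgci(y, z, x), 3))   # near zero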

  15. Detection of cavity migration risks using radar interferometric time series

    Science.gov (United States)

    Chang, L.; Hanssen, R. F.

    2012-12-01

    , ERS-2, Envisat, and Radarsat-2, to investigate the dynamics (deformation) of the area. In particular we show, for the first time, shear-stress change distribution patterns within the structure of a building, over a period of close to 20 years. Time series analysis shows that deformation rates of ~4 mm/a could be detected for about 18 years, followed by a dramatic increase of up to 20 mm/a in the last period. These results imply that the driving mechanisms of the 2011 catastrophe have a very long lead time and are therefore likely due to a long-lasting gradual motion, such as the upward migration of a cavity. The analysis shows the collocation of the deformation location with relatively shallow near-horizontal mine shafts, suggesting that cavity migration has a high likelihood to be the driving mechanism of the collapse-sinkhole.

  16. A Novel Approach for Nonstationary Time Series Analysis with Time-Invariant Correlation Coefficient

    Directory of Open Access Journals (Sweden)

    Chengrui Liu

    2014-01-01

    Full Text Available We will concentrate on the modeling and analysis of a class of nonstationary time series, called correlation coefficient stationary series, which commonly exists in practical engineering. First, the concept and scope of correlation coefficient stationary series are discussed to provide a better understanding. Second, a theorem is proposed to determine the standard deviation function of a correlation coefficient stationary series. Third, we propose a moving multiple-point average method to determine the functional forms of the mean and standard deviation, which can help to improve the analysis precision, especially in the context of limited sample size. Fourth, the conditional likelihood approach is utilized to estimate the model parameters. In addition, we discuss the correlation coefficient stationarity test method, which can contribute to the verification of modeling validity. A Monte Carlo simulation study illustrates the correctness of the theorem and the validity of the established method. An empirical study shows that the approach can satisfactorily explain the nonstationary behavior of many practical data sets, including stock returns, maximum power load, China money supply, and foreign currency exchange rates. The effectiveness of these models is assessed through their forecasting performance.
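    A small sketch of the moving multiple-point average idea: estimating slowly varying mean and standard deviation functions with a centered rolling window and standardizing the series; the window length and the simulated series are illustrative, and this is not the authors' procedure in detail.

        # Sketch: estimate time-varying mean and standard deviation with a centered
        # moving average, then standardize the nonstationary series.
        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(11)
        t = np.arange(1000)
        x = 0.01 * t + (1 + 0.001 * t) * rng.standard_normal(1000)   # drifting mean and spread

        s = pd.Series(x)
        mu = s.rolling(window=51, center=True, min_periods=2).mean()
        sigma = s.rolling(window=51, center=True, min_periods=2).std()
        standardized = ((s - mu) / sigma).dropna()
        print(round(standardized.std(), 3))      # close to 1 after removing the nonstationarity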

  17. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze short-range and long-range characteristics of financial time series, and then applies this method to five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of the MFDFA of five properties of four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and on subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in studies that use only one property of the series.
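
    As a reminder of the entropy ingredient of 3MPAR, the following minimal sketch estimates the Rényi entropy of order q from a histogram of a synthetic, heavy-tailed return series; the bin count and q grid are arbitrary illustrative choices rather than the paper's settings.

```python
import numpy as np

def renyi_entropy(x, q, bins=50):
    """Rényi entropy of order q from a histogram estimate of the distribution.
    For q == 1 the Shannon entropy (the q -> 1 limit) is returned."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

rng = np.random.default_rng(2)
returns = rng.standard_t(df=3, size=5000)      # heavy-tailed stand-in for log-returns
qs = np.linspace(0.5, 5.0, 10)
spectrum = [renyi_entropy(returns, q) for q in qs]
print(np.round(spectrum, 3))                   # non-increasing in q for a non-uniform distribution
```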

  18. INDUSTRIAL PRODUCTION IN GERMANY AND AUSTRIA: A CASE STUDY IN STRUCTURAL TIME SERIES MODELLING

    Institute of Scientific and Technical Information of China (English)

    Gerhard THURY

    2003-01-01

    Industrial production series are volatile and often cyclical. Time series models can be used to establish certain stylized facts, such as trends and cycles, which may be present in these series. In certain situations, it is also possible that common factors, which may have an interesting interpretation, can be detected in production series. Series from two neighboring countries with close economic relationships, such as Germany and Austria, are especially likely to exhibit such joint stylized facts.

  19. HIGH ORDER FUZZY TIME SERIES MODEL AND ITS APPLICATION TO IMKB

    Directory of Open Access Journals (Sweden)

    Çağdaş Hakan ALADAĞ

    2010-12-01

    Full Text Available The observations of some real time series, such as temperature and stock market series, can take different values within a day. Instead of representing the observations of these time series by real numbers, employing linguistic values or fuzzy sets can be more appropriate. In recent years, many approaches have been introduced to analyze time series whose observations are fuzzy sets; such time series are called fuzzy time series. In this study, a novel approach is proposed to analyze a high order fuzzy time series model. The proposed method is applied to IMKB data and the obtained results are discussed. The IMKB data are also analyzed using other fuzzy time series methods available in the literature, and their results are compared with those of the proposed method. The comparison shows that the proposed method produces accurate forecasts.
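
    For orientation, a bare-bones first-order fuzzy time series forecaster in the classical Song-Chissom/Chen style is sketched below (equal-width interval partition, fuzzification, fuzzy logical relationship groups, midpoint defuzzification); the paper's high order model and its IMKB application are considerably more elaborate, and the interval count here is an arbitrary choice.

```python
import numpy as np

def fuzzy_ts_forecast(series, n_intervals=7):
    """First-order fuzzy time series forecast (Chen-style):
    1) partition the universe of discourse into equal intervals,
    2) fuzzify each observation to the interval containing it,
    3) group fuzzy logical relationships A_i -> {A_j, ...},
    4) forecast t+1 as the mean midpoint of the group reached from A(t)."""
    lo, hi = series.min(), series.max()
    edges = np.linspace(lo, hi, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1, 0, n_intervals - 1)

    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(a, set()).add(b)

    last = labels[-1]
    rhs = groups.get(last, {last})
    return mids[list(rhs)].mean()

rng = np.random.default_rng(3)
prices = np.cumsum(rng.standard_normal(300)) + 100
print(round(fuzzy_ts_forecast(prices), 2))
```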

  20. Time Granularity Transformation of Time Series Data for Failure Prediction of Overhead Line

    Science.gov (United States)

    Ma, Yan; Zhu, Wenbing; Yao, Jinxia; Gu, Chao; Bai, Demeng; Wang, Kun

    2017-01-01

    In this paper, we give an approach for transforming time series data with different time granularities onto the same plane, which is the basis of further association analysis. We focus on the application of overhead line tripping. First, all state variables related to line tripping are collected into our big data platform: line account, line fault, lightning, power load and meteorological data. Second, we pre-process the five kinds of data to guarantee data integrity and simplify the analysis. We use a representation that combines aggregation and trend extraction, which considers both the short-term variation and the long-term trend of a time sequence. Last, extensive experiments demonstrate that the proposed time granularity transformation not only allows multiple variables to be analysed on the same plane, but also achieves high prediction accuracy and low running time for both SVM and logistic regression.
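
    The granularity-alignment step can be pictured with a small pandas sketch: two signals recorded at different granularities are aggregated to a common one, and a rolling-mean trend column is kept alongside the aggregated value. Column names, frequencies and window lengths are illustrative assumptions, not those of the authors' platform.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)

# Two signals recorded at different granularities: 1-minute load and hourly temperature.
load = pd.Series(rng.normal(50, 5, 60 * 24 * 7),
                 index=pd.date_range("2017-01-01", periods=60 * 24 * 7, freq="min"))
temp = pd.Series(rng.normal(10, 3, 24 * 7),
                 index=pd.date_range("2017-01-01", periods=24 * 7, freq="h"))

def to_common_granularity(s, rule="h", trend_window=24):
    """Aggregate to a common granularity and add a rolling-mean trend column,
    so that short-term variation and long-term trend are both preserved."""
    agg = s.resample(rule).mean()
    trend = agg.rolling(trend_window, min_periods=1).mean()
    return pd.DataFrame({"value": agg, "trend": trend})

features = to_common_granularity(load).join(to_common_granularity(temp),
                                            lsuffix="_load", rsuffix="_temp")
print(features.dropna().head())
```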

  1. A Bayesian approach to combine Landsat and ALOS PALSAR time series for near real-time deforestation detection

    NARCIS (Netherlands)

    Reiche, J.; Bruin, de S.; Hoekman, D.H.; Verbesselt, J.; Herold, M.

    2015-01-01

    To address the need for timely information on newly deforested areas at medium resolution scale, we introduce a Bayesian approach to combine SAR and optical time series for near real-time deforestation detection. Once a new image of either of the input time series is available, the conditional proba

  2. West Africa Land Use Land Cover Time Series

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This series of three-period land use land cover (LULC) datasets (1975, 2000, and 2013) aids in monitoring change in West Africa’s land resources (exception is...

  3. Indirect inference with time series observed with error

    DEFF Research Database (Denmark)

    Rossi, Eduardo; Santucci de Magistris, Paolo

    We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect...

  4. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Full Text Available Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks that utilize legitimate HTTP requests to overwhelm victim resources are particularly hard to detect; consequently, neither intrusion detection systems (IDS) nor the victim server can readily identify the malicious packets. In this paper, a novel approach to detecting application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating an adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify attacks. Experiments with several datasets show that this approach can detect application-layer DDoS attacks effectively.
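
    The HRPI quantity itself is straightforward to compute; the sketch below derives the per-window entropy of GET requests per source IP from toy traffic. The AAR transformation and the SVM classifier of the paper are omitted, and all addresses and window contents are made up for illustration.

```python
import math
from collections import Counter

def hrpi_entropy(source_ips):
    """Shannon entropy of the distribution of GET requests per source IP
    within one observation window; entropy drops when a few sources dominate."""
    counts = Counter(source_ips)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def hrpi_series(windows):
    """Turn a list of per-window IP lists into an HRPI time series."""
    return [hrpi_entropy(w) for w in windows if w]

# Toy traffic: a balanced window versus one dominated by two attacking sources.
normal = ["10.0.0.%d" % (i % 50) for i in range(1000)]
attack = ["10.0.0.1"] * 600 + ["10.0.0.2"] * 300 + normal[:100]
print(hrpi_series([normal, attack]))   # entropy is clearly lower for the attack window
```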

  5. Tsunami arrival time detection system applicable to discontinuous time series data with outliers

    Science.gov (United States)

    Lee, Jun-Whan; Park, Sun-Cheon; Lee, Duk Kee; Lee, Jong Ho

    2016-12-01

    Timely detection of tsunamis with water level records is a critical but logistically challenging task because of outliers and gaps. Since tsunami detection algorithms require several hours of past data, outliers could cause false alarms, and gaps can stop the tsunami detection algorithm even after the recording is restarted. In order to avoid such false alarms and time delays, we propose the Tsunami Arrival time Detection System (TADS), which can be applied to discontinuous time series data with outliers. TADS consists of three algorithms, outlier removal, gap filling, and tsunami detection, which are designed to update whenever new data are acquired. After calibrating the thresholds and parameters for the Ulleung-do surge gauge located in the East Sea (Sea of Japan), Korea, the performance of TADS was discussed based on a 1-year dataset with historical tsunamis and synthetic tsunamis. The results show that the overall performance of TADS is effective in detecting a tsunami signal superimposed on both outliers and gaps.
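
    The first two stages of such a system (outlier removal and gap filling) can be sketched generically as below, using a robust running-median rule and time interpolation; thresholds and window lengths are illustrative assumptions, and the actual TADS detection stage is not reproduced.

```python
import numpy as np
import pandas as pd

def remove_outliers(level, window=31, n_mad=5.0):
    """Flag points that deviate from a running median by more than n_mad
    median absolute deviations and replace them with NaN."""
    med = level.rolling(window, center=True, min_periods=1).median()
    mad = (level - med).abs().rolling(window, center=True, min_periods=1).median()
    mask = (level - med).abs() > n_mad * (1.4826 * mad + 1e-9)
    return level.mask(mask)

def fill_gaps(level, limit=120):
    """Fill short gaps by time interpolation; longer gaps are left as NaN."""
    return level.interpolate(method="time", limit=limit)

idx = pd.date_range("2016-01-01", periods=2000, freq="min")
rng = np.random.default_rng(5)
level = pd.Series(np.sin(np.arange(2000) / 80.0) + 0.05 * rng.standard_normal(2000), index=idx)
level.iloc[500] += 3.0           # isolated spike / outlier
level.iloc[800:830] = np.nan     # recording gap
clean = fill_gaps(remove_outliers(level))
print(clean.isna().sum())        # only gaps longer than the limit would remain
```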

  6. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computer, the time-delay reservoir, constructed from samples of the solution of a time-delay differential equation, and show its good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed from intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality of individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs.
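
    Time-delay reservoirs are a hardware-oriented variant of reservoir computing; as a software illustration of the same recipe (a fixed random recurrent map plus a linear readout fitted by ridge regression), a minimal echo state network on a toy one-step-ahead forecasting task is sketched below. It is not the authors' time-delay architecture, nor their VEC-GARCH or realized-volatility setup, and all sizes and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy scalar series; the task is one-step-ahead prediction from the reservoir state.
n = 2000
t = np.arange(n)
u = np.sin(0.1 * t) * np.cos(0.017 * t) + 0.05 * rng.standard_normal(n)

# Random recurrent reservoir: r_{k+1} = tanh(W r_k + w_in u_k), spectral radius < 1.
N = 200
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.5, 0.5, N)

R = np.zeros((n, N))          # R[k] is the state after seeing inputs u[0..k-1]
r = np.zeros(N)
for k in range(n - 1):
    r = np.tanh(W @ r + w_in * u[k])
    R[k + 1] = r

# Linear readout fitted by ridge regression on a training prefix (100-step washout).
train, lam = 1500, 1e-6
X, y = R[100:train], u[100:train]
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

pred = R[train:] @ w_out      # one-step-ahead predictions on the held-out tail
print(np.sqrt(np.mean((pred - u[train:]) ** 2)))
```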

  7. A generalized exponential time series regression model for electricity prices

    DEFF Research Database (Denmark)

    Haldrup, Niels; Knapik, Oskar; Proietti, Tomasso

    We consider the issue of modeling and forecasting daily electricity spot prices on the Nord Pool Elspot power market. We propose a method that can handle seasonal and non-seasonal persistence by modelling the price series as a generalized exponential process. As the presence of spikes can distort the estimation of the dynamic structure of the series, we consider an iterative estimation strategy which, conditional on a set of parameter estimates, clears the spikes using a data cleaning algorithm, and re-estimates the parameters using the cleaned data so as to robustify the estimates. Conditional on the estimated model, the best linear predictor is constructed. Our modeling approach provides a good fit within sample and outperforms competing benchmark predictors in terms of forecasting accuracy. We also find that building separate models for each hour of the day and averaging the forecasts is a better...

  8. Evaluating the uncertainty of predicting future climate time series at the hourly time scale

    Science.gov (United States)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2011-12-01

    A stochastic downscaling methodology is developed to generate hourly, point-scale time series of several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects both the uncertainty of future climate projections and the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic statements about the ability to reproduce future climate for the periods 2000-2009, 2046-2065 and 2081-2100, using 1962-1992 as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period 2000-2009 are tested against observations, permitting an assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
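
    The factor-of-change mechanism can be sketched as follows: change factors are sampled from assumed distributions and applied to baseline statistics, producing an ensemble of perturbed statistics, each of which would parameterize one run of the weather generator. The distributions below are illustrative placeholders, not the Bayesian multi-model weights of the paper, and AWE-GEN itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

# Baseline (historic) statistics of, e.g., monthly precipitation and temperature.
baseline = {"precip_mean_mm": 68.0, "temp_mean_C": 14.2}

# Factors of change described probabilistically (illustrative lognormal/normal choices);
# in the paper these distributions come from Bayesian weighting of multiple GCMs.
def sample_factors():
    return {
        "precip_mean_mm": rng.lognormal(mean=np.log(0.95), sigma=0.10),  # multiplicative
        "temp_mean_C": rng.normal(loc=2.1, scale=0.6),                   # additive, in degC
    }

def perturbed_statistics():
    f = sample_factors()
    return {
        "precip_mean_mm": baseline["precip_mean_mm"] * f["precip_mean_mm"],
        "temp_mean_C": baseline["temp_mean_C"] + f["temp_mean_C"],
    }

# Ensemble of perturbed statistics; each member would parameterize one run
# of the hourly weather generator to produce a candidate future time series.
ensemble = [perturbed_statistics() for _ in range(1000)]
precip = np.array([m["precip_mean_mm"] for m in ensemble])
print(np.percentile(precip, [5, 50, 95]).round(1))
```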

  9. Change detection in a time series of polarimetric SAR data

    DEFF Research Database (Denmark)

    Conradsen, Knut; Nielsen, Allan Aasbjerg; Skriver, Henning

    2014-01-01

    A test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution, together with an associated probability of finding a smaller value of the test statistic, is introduced. Unlike tests based on pairwise comparisons between all temporally consecutive acquisitions, the new omnibus test statistic and the associated probability measure successfully detect change in two short series of L- and C-band polarimetric EMISAR data.

  10. Nonlinear time series theory, methods and applications with R examples

    CERN Document Server

    Douc, Randal; Stoffer, David

    2014-01-01

    FOUNDATIONS: Linear Models; Stochastic Processes; The Covariance World; Linear Processes; The Multivariate Cases; Numerical Examples; Exercises. Linear Gaussian State Space Models: Model Basics; Filtering, Smoothing, and Forecasting; Maximum Likelihood Estimation; Smoothing Splines and the Kalman Smoother; Asymptotic Distribution of the MLE; Missing Data Modifications; Structural Component Models; State-Space Models with Correlated Errors; Exercises. Beyond Linear Models: Nonlinear Non-Gaussian Data; Volterra Series Expansion; Cumulants and Higher-Order Spectra; Bilinear Models; Conditionally Heteroscedastic Models; Thre...

  11. Time Series Forecasting of Airlift Sustainment Cargo Demand

    Science.gov (United States)

    2012-06-01

    are better. Schwarz's Bayesian Criterion (SBC) is mathematically defined as SBC = -2 log(likelihood) + m log N, where m is the number of parameters in the model and N is the number of observations. Like AIC = -2 log(likelihood) + 2m, SBC adds a penalty for each additional parameter; lower SBC values indicate a better model. RSquare measures the percent of variation in a series accounted for by a model.
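
    A small worked check of the two criteria as reconstructed above (the log-likelihoods and parameter counts are made-up numbers for illustration):

```python
import numpy as np

def aic(loglik, m):
    """Akaike information criterion: -2 log L + 2 m."""
    return -2.0 * loglik + 2.0 * m

def sbc(loglik, m, n):
    """Schwarz's Bayesian criterion: -2 log L + m log N."""
    return -2.0 * loglik + m * np.log(n)

# Two candidate ARIMA-type fits to the same N = 120 monthly observations:
# the extra parameters of the second model must "buy" enough likelihood
# to lower the criteria, otherwise the smaller model is preferred.
print(aic(-512.3, m=3), sbc(-512.3, m=3, n=120))
print(aic(-509.8, m=6), sbc(-509.8, m=6, n=120))
```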

  12. Incomplete phase-space method to reveal time delay from scalar time series

    Science.gov (United States)

    Zhu, Shengli; Gan, Lu

    2016-11-01

    A computationally quick and conceptually simple method to recover the time delay of a chaotic system from a scalar time series is developed in this paper. We show that the orbits in the incomplete two-dimensional reconstructed phase space exhibit a local clustering phenomenon after the component reordering procedure proposed in this work. We find that the information captured by the incomplete two-dimensional reconstructed phase space is related to the time delay τ0 present in the system and is transferred to the reordered component by the reordering procedure. We then propose the segmented mean variance (SMV) of the reordered component to identify the time delay τ0 of the system. The proposed SMV shows a clear maximum when the embedding delay τ of the incomplete reconstruction matches the time delay τ0 of the chaotic system. Numerical data generated by a time-delay system based on the Mackey-Glass equation operating in the chaotic regime are used to illustrate the effectiveness of the proposed SMV. Experimental results show that the proposed SMV is robust to additive observational noise and is able to recover the time delay of the chaotic system even when the amount of data is relatively small and the feedback strength is weak. Moreover, the time complexity of the proposed method is quite low.
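
    The SMV statistic is specific to this paper; as an illustration of the same goal (recovering the feedback delay of a time-delay system from a scalar record), the sketch below uses the standard alternative of scanning candidate delays and picking the one that minimizes the one-step prediction error of a simple model x[t+1] ≈ f(x[t], x[t-τ]). The toy delayed map and the polynomial features are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy time-delay system with feedback delay tau0: only when the candidate
# delay matches tau0 can a simple model of x[t+1] explain the delayed term.
tau0, n = 12, 4000
x = 0.1 * rng.standard_normal(n)
for t in range(tau0, n - 1):
    x[t + 1] = 0.4 * x[t] + 0.9 * np.sin(x[t - tau0]) + 0.1 * rng.standard_normal()

def prediction_error(x, tau, skip=200):
    """One-step prediction error of a polynomial model x[t+1] ~ f(x[t], x[t-tau])."""
    t = np.arange(max(tau, skip), len(x) - 1)
    A = np.column_stack([np.ones_like(t, dtype=float), x[t], x[t - tau],
                         x[t - tau] ** 2, x[t - tau] ** 3])
    coef, *_ = np.linalg.lstsq(A, x[t + 1], rcond=None)
    return np.mean((x[t + 1] - A @ coef) ** 2)

errors = {tau: prediction_error(x, tau) for tau in range(1, 31)}
print(min(errors, key=errors.get))   # recovers tau0 = 12
```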

  13. Granger Causality in Multi-variate Time Series using a Time Ordered Restricted Vector Autoregressive Model

    CERN Document Server

    Siggiridou, Elsa

    2015-01-01

    Granger causality has been used for the investigation of the inter-dependence structure of the underlying systems of multi-variate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low and high-dimensional systems and different t...

  14. Study in the natural time domain of the entropy of dichotomic geoelectrical and chaotic time series

    Science.gov (United States)

    Ramírez-Rojas, A.; Telesca, L.; Angulo-Brown, F.

    2010-12-01

    The so-called seismo-electric signals (SES) have been considered as precursors of great earthquakes. To characterize possible SES activities, the Natural Time Domain (NTD) (Varotsos et al., 2001) was proposed as an adequate methodology. In this work we analyze two geoelectric time series measured in a very seismically active area of the South Pacific Mexican coast, and a chaotic time series obtained from the Liebovitch and Thot (LT) chaotic map. The two analyzed geoelectric signals display possible SES activities associated with the earthquakes that occurred on October 24, 1993 (M6.6, epicenter at 16.54N, 98.98W) and on September 14, 1995 (M7.4, epicenter at 16.31N, 98.88W). Our monitoring station was located at (16.50N, 99.47W), close to Acapulco city, and the experimental set-up was based on the VAN methodology. We found that the correlation degree of the SES geoelectric signals increases before the occurrence of the seismic events, with the power spectrum and entropy calculated in NTD in good agreement with analogous studies in the field of earthquake-related phenomena. Such SES activity, analysed in NTD, can be discriminated from the LT chaotic map and from artificial noise. Varotsos P.A., Sarlis N.V., Skordas E.S., Practica of Athens Academy 76 (2001) 294; Liebovitch S.L. and Thot T.I., J. Theor. Biol. 148 (1991) 243-267.

  15. Time-Scale and Time-Frequency Analyses of Irregularly Sampled Astronomical Time Series

    Directory of Open Access Journals (Sweden)

    S. Roques

    2005-09-01

    Full Text Available We evaluate the quality of spectral restoration in the case of irregularly sampled signals in astronomy. We study in detail a time-scale method leading to a global wavelet spectrum comparable to the Fourier period, and a time-frequency matching pursuit allowing us to identify the frequencies and to control the error propagation. In both cases, the signals are first resampled with a linear interpolation. Both results are compared with those obtained using Lomb's periodogram and using the weighted wavelet Z-transform developed in astronomy for observations of unevenly sampled variable stars. These approaches are applied to simulations and to the light variations of four variable stars. This leads to the conclusion that the matching pursuit is more efficient for recovering the spectral content of a pulsating star, even with a preliminary resampling. In particular, the results are almost independent of the quality of the initial irregular sampling.
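
    For the irregular-sampling problem discussed here, the sketch below contrasts a Lomb periodogram evaluated directly on the uneven samples with the interpolate-then-FFT route; the wavelet and matching-pursuit analyses of the paper are beyond a short example, and the synthetic light curve is illustrative.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(9)

# Irregularly sampled "light curve": one oscillation at 0.8 cycles/day plus noise.
t = np.sort(rng.uniform(0.0, 60.0, 300))            # observation epochs (days)
y = np.sin(2 * np.pi * 0.8 * t) + 0.3 * rng.standard_normal(t.size)

# Lomb periodogram evaluated directly on the uneven samples (angular frequencies).
freqs = np.linspace(0.05, 2.0, 2000)                # cycles/day
power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)
print(freqs[np.argmax(power)])                      # close to 0.8

# The interpolation route: resample to a regular grid, then use an ordinary FFT.
tg = np.linspace(t.min(), t.max(), 1024)
yg = np.interp(tg, t, y)
spec = np.abs(np.fft.rfft(yg - yg.mean())) ** 2
fft_freqs = np.fft.rfftfreq(tg.size, d=tg[1] - tg[0])
print(fft_freqs[np.argmax(spec)])                   # also near 0.8, but interpolation biases the spectrum
```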

  16. On the long-term correlations and multifractal properties of electric arc furnace time series

    CERN Document Server

    Livi, Lorenzo; Rizzi, Antonello; Sadeghian, Alireza

    2015-01-01

    In this paper, we study long-term correlations and multifractal properties derived from time series of three-phase current signals from an industrial electric arc furnace plant. Implicit sinusoidal trends are detected in the scaling of the fluctuation function of these time series. The time series are therefore first filtered via a Fourier-based analysis, removing such strong periodicities. In the filtered time series we detect long-term, positive correlations. The presence of persistent correlations is in agreement with the typical V-I characteristic (hysteresis) of the electric arc furnace, thus justifying the memory effects found in the current time series. The multifractal signature in the filtered time series is strong enough for them to be effectively classified as multifractal.
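
    The MFDFA ingredient used to establish the multifractal signature can be sketched compactly: compute the profile, detrend it segment-wise with a polynomial, form the q-th order fluctuation function, and read the generalized Hurst exponents h(q) off log-log slopes. The scales, q values and white-noise test signal below are illustrative choices, not the furnace data of the paper.

```python
import numpy as np

def mfdfa_hurst(x, q_list, scales, order=1):
    """Generalized Hurst exponents h(q) via multifractal detrended
    fluctuation analysis (MFDFA) with polynomial detrending of given order."""
    y = np.cumsum(x - np.mean(x))                 # profile
    logF = {q: [] for q in q_list}
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for v in range(n_seg):
            seg = y[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)
            f2.append(np.mean((seg - trend) ** 2))
        f2 = np.asarray(f2)
        for q in q_list:
            if q == 0:
                logF[q].append(0.5 * np.mean(np.log(f2)))      # q -> 0 limit
            else:
                logF[q].append(np.log(np.mean(f2 ** (q / 2.0))) / q)
    # h(q) is the slope of log F_q(s) versus log s.
    return {q: np.polyfit(np.log(scales), logF[q], 1)[0] for q in q_list}

rng = np.random.default_rng(10)
x = rng.standard_normal(20000)                    # white noise: h(q) should stay near 0.5
scales = np.unique(np.logspace(4, 10, 12, base=2).astype(int))
print({q: round(v, 2) for q, v in mfdfa_hurst(x, [-3, 0, 3], scales).items()})
```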

  17. An Overview on R Packages for Seasonal Analysis of Time Series

    Directory of Open Access Journals (Sweden)

    Haibin Qiu

    2014-05-01

    Full Text Available Time series analysis consists of approaches for analysing time series data so that important information and other features can be extracted from the data. Time series forecasting is the use of a model to predict future values on the basis of previously observed values. Statisticians generally use R, a free and popular programming language and software environment for statistical computing and graphics, for developing statistical software and for data analysis. Plenty of time series display cyclic variation known as seasonality, periodic variation, or periodic fluctuations. This study introduces functions in the R packages TSA, marls, deseasonalize and season for analyzing seasonal processes in time series. Note that the R packages marls, deseasonalize and season are included in the Comprehensive R Archive Network (CRAN) task view TimeSeries.

  18. Travel cost inference from sparse, spatio-temporally correlated time series using markov models

    DEFF Research Database (Denmark)

    Yang, B.; Guo, C.; Jensen, C.S.

    2013-01-01

    The monitoring of a system can yield a set of measurements that can be modeled as a collection of time series. These time series are often sparse, due to missing measurements, and spatiotemporally correlated, meaning that spatially close time series exhibit temporal correlation. The analysis of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending...

  19. Correlated errors in geodetic time series: Implications for time-dependent deformation

    Science.gov (United States)

    Langbein, J.; Johnson, H.

    1997-01-01

    Analysis of frequent trilateration observations from the two-color electronic distance measuring networks in California demonstrates that the noise power spectra are dominated by white noise at higher frequencies and by power-law behavior at lower frequencies. In contrast, Earth scientists typically have assumed that only white noise is present in a geodetic time series, since a combination of infrequent measurements and low precision usually precludes identifying the time-correlated signature in such data. After removing a linear trend from the two-color data, it becomes evident that there are primarily two recognizable types of time-correlated noise present in the residuals. The first type is a seasonal variation in displacement, which is probably a result of measuring to shallow surface monuments installed in clayey soil that responds to seasonally occurring rainfall; this noise is significant only for a small fraction of the sites analyzed. The second type of correlated noise becomes evident only after spectral analysis of line length changes and shows a power-law relation at long periods between power and frequency, P(f) ∝ f^(-α), where f is frequency and α ≈ 2. With α = 2, this type of correlated noise is termed random-walk noise, and its source is mainly thought to be small random motions of geodetic monuments with respect to the Earth's crust, though other sources are possible. Because the line length changes in the two-color networks are measured at irregular intervals, power spectral techniques cannot reliably estimate the level of the power-law noise. Rather, we also use here a maximum likelihood estimation technique which assumes that there are only two sources of noise in the residual time series (white noise and random-walk noise) and estimates the amount of each. From this analysis we find that the random-walk noise level averages about 1.3 mm/√yr and that our estimates of the white noise component confirm theoretical limitations of the measurement technique. In

  20. From time series to complex networks: The phase space coarse graining

    Science.gov (United States)

    Wang, Minggang; Tian, Lixin

    2016-11-01

    In this paper, we present a simple and fast computational method, the phase space coarse graining algorithm, that converts a time series into a directed and weighted complex network. The constructed network inherits several properties of the series in its structure: periodic series convert into regular networks, random series into random networks, and chaotic series into scale-free networks. It is shown that the phase space coarse graining algorithm allows us to distinguish, identify and describe various time series in detail. Finally, we apply the algorithm to a practical observational series, the international regular gasoline spot price series, and identify its dynamic characteristics.
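
    The coarse-graining construction can be sketched directly: partition the two-dimensional delay phase space into grid cells, treat occupied cells as nodes, and weight directed edges by transition counts. The grid size and delay below are illustrative assumptions; the paper's algorithm may differ in the details of the partition.

```python
import numpy as np
from collections import Counter

def coarse_grained_network(x, n_cells=10, delay=1):
    """Convert a scalar series into a directed, weighted network:
    the 2-D delay phase space (x_t, x_{t+delay}) is partitioned into a grid,
    each occupied cell becomes a node, and consecutive points falling in
    cells (a, b) add weight to the directed edge a -> b."""
    lo, hi = x.min(), x.max()
    grid = np.linspace(lo, hi, n_cells + 1)
    cell = np.clip(np.searchsorted(grid, x, side="right") - 1, 0, n_cells - 1)
    # symbol of each phase-space point (pair of grid indices)
    symbols = list(zip(cell[:-delay], cell[delay:]))
    weights = Counter(zip(symbols[:-1], symbols[1:]))      # directed transition counts
    return weights

rng = np.random.default_rng(11)
periodic = np.sin(0.2 * np.arange(5000))
random_series = rng.standard_normal(5000)
for name, series in [("periodic", periodic), ("random", random_series)]:
    w = coarse_grained_network(series)
    nodes = {s for edge in w for s in edge}
    print(name, "nodes:", len(nodes), "edges:", len(w))     # periodic -> few, random -> many
```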