WorldWideScience

Sample records for volatility time series

  1. Short-term volatility versus long-term growth: evidence in US macroeconomic time series

    NARCIS (Netherlands)

    M. Sensier (Marianne); D.J.C. van Dijk (Dick)

    2001-01-01

We test for a change in the volatility of 215 US macroeconomic time series over the period 1960-1996. We find that about 90% of these series have experienced a break in volatility during this period. This result is robust to controlling for instability in the mean and business cycle

  2. Long Run Estimations for the Volatility of Time Series in the Brazilian Financial Market

    Directory of Open Access Journals (Sweden)

    Alex Sandro Monteiro de Moraes

    2014-03-01

Full Text Available The models of the GARCH family, normally used for volatility estimates over longer periods, keep unchanged the relative weights assigned to old and new observations, regardless of the volatility forecast horizon. The purpose of this article is to verify whether increasing the relative weights assigned to earlier observations as the forecast horizon grows results in better estimates of volatility. Through the use of seven volatility forecasting models and return series of financial market assets, the estimates obtained in the sample (in-sample) were compared with observations outside the sample (out-of-sample). Based on this comparison, it was found that the best estimates of expected volatility were obtained by the modified EGARCH model and the ARLS model. We conclude that the use of traditional volatility forecasting models, which keep unchanged the relative weights assigned to both old and new observations, was inappropriate.
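
The weighting issue described in this abstract can be illustrated with a small sketch: a GARCH(1,1) recursion, whose weights on past squared returns are fixed by its parameters regardless of horizon, versus a RiskMetrics-style exponentially weighted estimate. This is an illustration only, assuming standard textbook forms and arbitrary parameter values; it does not reproduce the paper's modified EGARCH or ARLS models.

```python
# Illustrative sketch (not the paper's models): contrast how a GARCH(1,1)
# recursion and an EWMA estimate weight old vs. new observations when producing
# an h-step-ahead variance forecast. Parameter values and data are assumed.
import numpy as np

def garch11_forecast(returns, omega=1e-6, alpha=0.08, beta=0.90, horizon=20):
    """Filter conditional variances with fixed GARCH(1,1) parameters, then
    iterate the recursion forward to forecast variance 'horizon' steps ahead."""
    r2 = returns ** 2
    sigma2 = np.empty_like(r2)
    sigma2[0] = r2.mean()                      # initialize at the sample variance
    for t in range(1, len(r2)):
        sigma2[t] = omega + alpha * r2[t - 1] + beta * sigma2[t - 1]
    # multi-step forecast: shocks mean-revert toward the unconditional variance
    long_run = omega / (1.0 - alpha - beta)
    f = sigma2[-1]
    for _ in range(horizon):
        f = omega + (alpha + beta) * f
    return f, long_run

def ewma_forecast(returns, lam=0.94):
    """Exponentially weighted variance: weights decay geometrically with age,
    and the resulting forecast is flat across all horizons."""
    weights = lam ** np.arange(len(returns))[::-1]
    weights /= weights.sum()
    return np.sum(weights * returns ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r = rng.normal(0.0, 0.01, size=1000)       # placeholder return series
    print(garch11_forecast(r), ewma_forecast(r))
```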

  3. Testing for a Common Volatility Process and Information Spillovers in Bivariate Financial Time Series Models

    NARCIS (Netherlands)

    J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)

    2016-01-01

The paper considers the problem as to whether financial returns have a common volatility process in the framework of stochastic volatility models that were suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993),

  4. Time Series

    OpenAIRE

    Gil-Alana, L.A.; Moreno, A; Pérez-de-Gracia, F. (Fernando)

    2011-01-01

    The last 20 years have witnessed a considerable increase in the use of time series techniques in econometrics. The articles in this important set have been chosen to illustrate the main themes in time series work as it relates to econometrics. The editor has written a new concise introduction to accompany the articles. Sections covered include: Ad Hoc Forecasting Procedures, ARIMA Modelling, Structural Time Series Models, Unit Roots, Detrending and Non-stationarity, Seasonality, Seasonal Adju...

  5. Time series analysis.

    NARCIS (Netherlands)

    2013-01-01

    Time series analysis can be used to quantitatively monitor, describe, explain, and predict road safety developments. Time series analysis techniques offer the possibility of quantitatively modelling road safety developments in such a way that the dependencies between the observations of time series

  6. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers.

  7. Periodic Time Series Models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    2004-01-01

This book considers periodic time series models for seasonal data, characterized by parameters that differ across the seasons, and focuses on their usefulness for out-of-sample forecasting. Providing an up-to-date survey of the recent developments in periodic time series, the book

  8. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  9. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.;

Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  10. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  11. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    ""In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book.""-Journa

  12. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  13. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may...

  14. Causality between time series

    CERN Document Server

    Liang, X San

    2014-01-01

Given two time series, can one tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely information flow, we arrive at a concise formula and give this challenging question, which is of wide concern in different disciplines, a positive answer. Here causality is measured by the time rate of change of information flowing from one series, say, X2, to another, X1. The measure is asymmetric between the two parties and, particularly, if the process underlying X1 does not depend on X2, then the resulting causality from X2 to X1 vanishes. The formula is tight in form, involving only the commonly used statistics, sample covariances. It has been validated with touchstone series purportedly generated with one-way causality. It has also been applied to the investigation of real world problems; an example presented here is the cause-effect relation between two climate modes, El Niño and the Indian Ocean Dipole, which have been linked to the hazards in f...

  15. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  16. Market volatility modeling for short time window

    Science.gov (United States)

    de Mattos Neto, Paulo S. G.; Silva, David A.; Ferreira, Tiago A. E.; Cavalcanti, George D. C.

    2011-10-01

The gain or loss of an investment can be defined by the movement of the market. This movement can be estimated by the difference between the magnitudes of two stock prices in distinct periods, and this difference can be used to calculate the volatility of the markets. The volatility characterizes the sensitivity of a market change in the world economy. Traditionally, the probability density function (pdf) of the movement of the markets is analyzed by using power laws. The contributions of this work are twofold: (i) an analysis of the volatility dynamics of the world market indexes is performed by using a two-year time window of data. In this case, the experiments show that the pdf of the volatility is better fitted by an exponential function than by power laws, over the whole range of the pdf; (ii) after that, we investigate a relationship between the volatility of the markets and the coefficient of the exponential function based on the Maxwell-Boltzmann ideal gas theory. The results show an inverse relationship between the volatility and the coefficient of the exponential function. This information can be used, for example, to predict the future behavior of the markets or to cluster the markets in order to analyze economic patterns.
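
A minimal sketch of the fitting comparison described above, assuming absolute returns as the volatility proxy and a simulated heavy-tailed return series in place of the world market index data: the empirical pdf is fitted once under an exponential hypothesis (log-density linear in x) and once under a power-law hypothesis (log-density linear in log x), and the residuals are compared.

```python
# Minimal sketch of the comparison described above: fit the empirical pdf of a
# volatility proxy (absolute returns) with an exponential law and with a power
# law, and compare the fit quality. The simulated data are a stand-in for the
# two-year windows of market-index data used in the paper.
import numpy as np

rng = np.random.default_rng(1)
returns = rng.standard_t(df=4, size=20_000) * 0.01   # heavy-tailed toy returns
vol = np.abs(returns)                                # simple volatility proxy

# empirical pdf
counts, edges = np.histogram(vol, bins=60, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
x, p = centers[mask], counts[mask]

# exponential hypothesis: log p is linear in x
exp_coef = np.polyfit(x, np.log(p), 1)
exp_resid = np.log(p) - np.polyval(exp_coef, x)

# power-law hypothesis: log p is linear in log x
pow_coef = np.polyfit(np.log(x), np.log(p), 1)
pow_resid = np.log(p) - np.polyval(pow_coef, np.log(x))

print("exponential fit, decay coefficient:", exp_coef[0], "SSE:", np.sum(exp_resid**2))
print("power-law fit, exponent:", pow_coef[0], "SSE:", np.sum(pow_resid**2))
```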

  17. A Discrimination Framework for Financial Time Series Indices Based on an Inflection Point Set: An Idiosyncratic Volatility Case

    Institute of Scientific and Technical Information of China (English)

    汤胤; 毛景慧

    2016-01-01

This paper presents a new method, TBUD, to partition the inflection points of a stock price time series into collections. By analyzing the relations among the collections, the paper builds a discrimination framework, which is applied to idiosyncratic volatility as a case study. The result suggests that idiosyncratic volatility cannot divide the inflection point set and is therefore unable to make an accurate prediction of the future trend of the stock price. The main idea of the inflection-point-set-based TBUD method is to analyze the relations among collections of inflection points and to partition them in a high-dimensional space, thereby building a discrimination model; the framework is applied to idiosyncratic volatility and several other indicators, and conclusions are drawn from empirical data. Applying the TBUD discrimination framework shows that indicators such as idiosyncratic volatility cannot clearly partition the inflection point sets and therefore have no predictive power.

  18. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; the variations in time series analysis/post-processing are driven by different users. JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS. JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies. ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused. The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  19. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; the variations in time series analysis/post-processing are driven by different users. JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS. JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies. ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused. The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  20. Predicting Nonlinear Time Series

    Science.gov (United States)

    1993-12-01

the response becomes R_j(k) = f(Σ_i W_ij V_i(k)) (2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer, with interconnections for each of these previous nodes. [Figure 5: Delay block for ATNN [9]] Thus, node j receives the computed values a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays update after each input

  1. Advances in time series forecasting

    CERN Document Server

    Cagdas, Hakan Aladag

    2012-01-01

    Readers will learn how these methods work and how these approaches can be used to forecast real life time series. The hybrid forecasting model is also explained. Data presented in this e-book is problem based and is taken from real life situations. It is a valuable resource for students, statisticians and working professionals interested in advanced time series analysis.

  2. Time Series with Tailored Nonlinearities

    CERN Document Server

    Raeth, C

    2015-01-01

It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for e.g. turbulence and financial data can thus be explained in terms of phase correlations.

  3. Fractal and Multifractal Time Series

    CERN Document Server

    Kantelhardt, Jan W

    2008-01-01

    Data series generated by complex systems exhibit fluctuations on many time scales and/or broad distributions of the values. In both equilibrium and non-equilibrium situations, the natural fluctuations are often found to follow a scaling relation over several orders of magnitude, allowing for a characterisation of the data and the generating complex system by fractal (or multifractal) scaling exponents. In addition, fractal and multifractal approaches can be used for modelling time series and deriving predictions regarding extreme events. This review article describes and exemplifies several methods originating from Statistical Physics and Applied Mathematics, which have been used for fractal and multifractal time series analysis.

  4. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  5. Time series prediction in agroecosystems

    Science.gov (United States)

    Cortina-Januchs, M. G.; Quintanilla-Dominguez, J.; Vega-Corona, A.; Andina, D.

    2012-04-01

    This work proposes a novel model to predict time series such as frost, precipitation, temperature, solar radiation, all of them important variables for the agriculture process. In the proposed model, Artificial Neural Networks (ANN) combined with clustering algorithms and sensor data fusion are used. The real time series are obtained from different sensors. The clustering algorithms find relationships between variables, clustering involves the task of dividing data sets, which assigns the same label to members who belong to the same group, so that each group is homogeneous and distinct from the others. Those relationships provide information to the ANN in order to obtain the time series prediction. The most important issue of ANN in time series prediction is generalization, which refers to their ability to produce reasonable predictions on data sets other than those used for the estimation of the model parameters.

  6. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology. Discusses a wide variety of diverse applications and recent developments. Contributors are internationally renowned experts in their respect...

  7. Indirect inference with time series observed with error

    DEFF Research Database (Denmark)

    Rossi, Eduardo; Santucci de Magistris, Paolo

We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect … to estimate the parameters of continuous-time stochastic volatility models with auxiliary specifications based on realized volatility measures. Monte Carlo simulations show the bias reduction of the indirect estimates obtained when the microstructure noise is explicitly modeled. Finally, an empirical...

  8. Ensemble vs. time averages in financial time series analysis

    Science.gov (United States)

    Seemann, Lars; Hua, Jia-Chen; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2012-12-01

    Empirical analysis of financial time series suggests that the underlying stochastic dynamics are not only non-stationary, but also exhibit non-stationary increments. However, financial time series are commonly analyzed using the sliding interval technique that assumes stationary increments. We propose an alternative approach that is based on an ensemble over trading days. To determine the effects of time averaging techniques on analysis outcomes, we create an intraday activity model that exhibits periodic variable diffusion dynamics and we assess the model data using both ensemble and time averaging techniques. We find that ensemble averaging techniques detect the underlying dynamics correctly, whereas sliding intervals approaches fail. As many traded assets exhibit characteristic intraday volatility patterns, our work implies that ensemble averages approaches will yield new insight into the study of financial markets’ dynamics.

  9. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.

  10. Random time series in Astronomy

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle, and over time (usually called light curves by astronomers). In the time domain we see transient events such as supernovae, gamma-ray bursts, and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars, and pulsations of stars in nearby galaxies; and persistent aperiodic variations (`noise') from powerful systems like accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of Time Domain Astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher-order properties of accreting black holes, and time delays and correlations in multivariate time series.

  11. Random time series in astronomy.

    Science.gov (United States)

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.

  12. Do Time-Varying Covariances, Volatility Comovement and Spillover Matter?

    OpenAIRE

    Lakshmi Balasubramanyan

    2005-01-01

Financial markets and their respective assets are so intertwined that analyzing any single market in isolation ignores important information. We investigate whether time-varying volatility comovement and spillover impact the true variance-covariance matrix under a time-varying correlation set-up. Statistically significant volatility spillover and comovement between the US, UK and Japan is found. To demonstrate the importance of modelling volatility comovement and spillover, we look at a simple portfo...

  13. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

    Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
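
A hedged sketch of the forecasting step only: the weighted self-constructing clustering of the paper is omitted, and the series is simply divided into fixed-length training patterns from which the k nearest neighbours of the current input pattern produce the estimate for time t+1. The window length, k, and the toy series are assumptions.

```python
# Hedged sketch of the "divide the series into training patterns and estimate
# the next value from the k nearest neighbours" step described above; the
# weighted self-constructing clustering itself is not reproduced.
import numpy as np

def knn_forecast(series, window=5, k=3):
    """Forecast the value at time t+1 from the k most similar historical windows."""
    series = np.asarray(series, dtype=float)
    # training patterns: (window values -> next value)
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    query = series[-window:]                       # the current input pattern
    dists = np.linalg.norm(X - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()                       # simple (unweighted) combination

if __name__ == "__main__":
    t = np.arange(300)
    demo = np.sin(0.2 * t) + 0.05 * np.random.default_rng(2).normal(size=t.size)
    print("one-step-ahead estimate:", knn_forecast(demo))
```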

  14. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  15. Event Discovery in Time Series

    CERN Document Server

    Preston, Dan; Brodley, Carla

    2009-01-01

    The discovery of events in time series can have important implications, such as identifying microlensing events in astronomical surveys, or changes in a patient's electrocardiogram. Current methods for identifying events require a sliding window of a fixed size, which is not ideal for all applications and could overlook important events. In this work, we develop probability models for calculating the significance of an arbitrary-sized sliding window and use these probabilities to find areas of significance. Because a brute force search of all sliding windows and all window sizes would be computationally intractable, we introduce a method for quickly approximating the results. We apply our method to over 100,000 astronomical time series from the MACHO survey, in which 56 different sections of the sky are considered, each with one or more known events. Our method was able to recover 100% of these events in the top 1% of the results, essentially pruning 99% of the data. Interestingly, our method was able to iden...

  16. Detecting chaos from time series

    Science.gov (United States)

    Xiaofeng, Gong; Lai, C. H.

    2000-02-01

In this paper, an entirely data-based method to detect chaos from the time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems, there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_p(ε), and the time step, p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.
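
A sketch of one plausible reading of the construction above, under the assumption that an ε_p-neighbour pair is a pair of trajectory points that remain within ε of each other for p consecutive steps (closely related to correlation-entropy estimators). The logistic map at r = 4, whose KS entropy is ln 2, serves as an assumed test series; ε and the pair-sampling scheme are also assumptions.

```python
# Hedged sketch: count, for each time step p, the average fraction of sampled
# pairs that stay within epsilon of each other for p consecutive steps, and
# check whether ln n_p decays roughly linearly in p.
import numpy as np

def logistic_series(n=3000, r=4.0, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def avg_neighbours(x, eps, p_max=6, max_pairs=200_000):
    """Return ln of the average number of epsilon_p-neighbours for p = 0..p_max."""
    n = len(x) - p_max
    rng = np.random.default_rng(0)
    i = rng.integers(0, n, size=max_pairs)
    j = rng.integers(0, n, size=max_pairs)
    keep = i != j
    i, j = i[keep], j[keep]
    counts = []
    close = np.ones(i.shape, dtype=bool)
    for p in range(p_max + 1):
        close &= np.abs(x[i + p] - x[j + p]) < eps   # still neighbours after p steps
        counts.append(close.mean())
    return np.log(np.array(counts))

if __name__ == "__main__":
    x = logistic_series()
    ln_np = avg_neighbours(x, eps=0.05)
    slope = np.polyfit(np.arange(len(ln_np)), ln_np, 1)[0]
    print("slope of ln n_p vs p (related to the KS entropy):", -slope)
```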

  17. Effects of daylight savings time changes on stock market volatility.

    Science.gov (United States)

    Berument, M Hakan; Dogan, Nukhet; Onar, Bahar

    2010-04-01

    The presence of daylight savings time effects on stock returns and on stock volatility was investigated using an EGARCH specification to model the conditional variance. The evidence gathered from the major United States stock markets for the period between 1967 and 2007 did not support the existence of the daylight savings time effect on stock returns or on volatility. Returns on the first business day following daylight savings time changes were not lower nor was the volatility higher, as would be expected if there were an effect.

  18. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    Li Aiguo; Zhao Cai; Li Zhanhuai

    2007-01-01

To predict the trend of chaotic time series in time series analysis and time series data mining fields, a novel algorithm for predicting the trend of chaotic time series is presented, and an on-line segmenting algorithm is proposed to convert a time series into a binary string according to the ascending or descending trend of each subsequence. The on-line segmenting algorithm is independent of prior knowledge about the time series. The naive Bayesian algorithm is then employed to predict the trend of the chaotic time series from the binary string. The experimental results on three chaotic time series demonstrate that the proposed method predicts the ascending or descending trend of chaotic time series with few errors.
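
A hedged sketch of the pipeline described above, with two simplifications: the on-line segmentation is reduced to the per-step sign of the first difference (the paper's subsequence-based segmenting algorithm is not reproduced), and the naive Bayes classifier uses the previous m trend symbols as features. The value of m and the toy series are assumptions.

```python
# Hedged sketch: convert a series to a binary up/down string, then predict the
# next trend symbol with a naive Bayes classifier over the previous m symbols.
import numpy as np
from collections import defaultdict

def to_trend_string(x):
    return (np.diff(x) > 0).astype(int)            # 1 = ascending, 0 = descending

def train_naive_bayes(bits, m=3):
    """Estimate P(next | previous m bits) assuming conditional independence."""
    prior = defaultdict(int)
    cond = defaultdict(lambda: defaultdict(int))   # cond[(pos, bit)][next] = count
    for t in range(m, len(bits)):
        y = bits[t]
        prior[y] += 1
        for pos in range(m):
            cond[(pos, bits[t - m + pos])][y] += 1
    return prior, cond, m

def predict(model, recent):
    prior, cond, m = model
    total = sum(prior.values())
    scores = {}
    for y in (0, 1):
        score = np.log((prior[y] + 1) / (total + 2))        # Laplace smoothing
        for pos in range(m):
            c = cond[(pos, recent[pos])]
            score += np.log((c[y] + 1) / (prior[y] + 2))
        scores[y] = score
    return max(scores, key=scores.get)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    series = np.cumsum(np.sin(0.3 * np.arange(500)) + 0.2 * rng.normal(size=500))
    bits = to_trend_string(series)
    model = train_naive_bayes(bits[:-1])
    print("predicted next trend (1 = up):", predict(model, bits[-4:-1].tolist()))
```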

  19. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  20. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

Different from most research, which focuses on a single futures contract and lacks comparison across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Besides the basic statistical analysis, the paper used the GARCH and EGARCH models to describe the time series which had the ARCH effect, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, with leptokurtic and thick-tailed distributions. The study also found that two parts of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. By using the Auto-regressive Distributed Lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect on the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; but on the other hand, they reflect some shortcomings, such as the immaturity of, and the over-control by the government in, the Chinese futures market.

  1. Description of complex time series by multipoles

    DEFF Research Database (Denmark)

    Lewkowicz, M.; Levitan, J.; Puzanov, N.

    2002-01-01

We present a new method to describe time series with a highly complex time evolution. The time series is projected onto a two-dimensional phase-space plot which is quantified in terms of a multipole expansion where every data point is assigned a unit mass. The multipoles provide an efficient characterization of the original time series.

  2. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, and therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting … performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.

  3. Regenerating time series from ordinal networks

    Science.gov (United States)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.

  4. Regenerating time series from ordinal networks.

    Science.gov (United States)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
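
A hedged sketch of the two construction steps named above, for an assumed embedding dimension and lag: embedded windows are mapped to ordinal (permutation) patterns, the transition frequencies between successive patterns define the ordinal network, and a random walk on that network regenerates a surrogate symbol sequence. Converting the surrogate symbols back into an amplitude time series, and the papers' quantitative comparisons, are not reproduced here.

```python
# Hedged sketch: build an ordinal network from permutation patterns of embedded
# windows, then regenerate a surrogate symbol sequence by a random walk.
import numpy as np

def ordinal_symbols(x, dim=3, lag=1):
    """Map each embedded window to its ordinal (permutation) pattern."""
    n = len(x) - (dim - 1) * lag
    windows = np.array([x[i:i + dim * lag:lag] for i in range(n)])
    return [tuple(np.argsort(w)) for w in windows]

def build_ordinal_network(symbols):
    """Transition-probability matrix between successive ordinal patterns."""
    states = sorted(set(symbols))
    index = {s: k for k, s in enumerate(states)}
    T = np.zeros((len(states), len(states)))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[index[a], index[b]] += 1
    T /= np.maximum(T.sum(axis=1, keepdims=True), 1)
    return states, T

def random_walk(states, T, length=1000, seed=0):
    rng = np.random.default_rng(seed)
    k = rng.integers(len(states))
    walk = []
    for _ in range(length):
        walk.append(states[k])
        if T[k].sum() == 0:                        # dead end: restart at a random node
            k = rng.integers(len(states))
        else:
            k = rng.choice(len(states), p=T[k])
    return walk

if __name__ == "__main__":
    # chaotic logistic-map series as a toy stand-in for a sampled continuous flow
    x = np.empty(5000)
    x[0] = 0.3
    for i in range(1, len(x)):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    states, T = build_ordinal_network(ordinal_symbols(x))
    surrogate = random_walk(states, T)
    print("number of ordinal states:", len(states), "first symbols:", surrogate[:5])
```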

  5. INDUSTRIAL PRODUCTION IN GERMANY AND AUSTRIA: A CASE STUDY IN STRUCTURAL TIME SERIES MODELLING

    Institute of Scientific and Technical Information of China (English)

    Gerhard THURY

    2003-01-01

    Industrial production series are volatile and often cyclical. Time series models can be used to establish certain stylized facts, such as trends and cycles, which may be present in these series. In certain situations, it is also possible that common factors, which may have an interesting interpretation, can be detected in production series. Series from two neighboring countries with close economic relationships, such as Germany and Austria, are especially likely to exhibit such joint stylized facts.

  6. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  7. Duality between Time Series and Networks

    Science.gov (United States)

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093

  8. A Review of Subsequence Time Series Clustering

    Science.gov (United States)

    Teh, Ying Wah

    2014-01-01

Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: the preproof, interproof, and postproof periods. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  9. Testing for time-varying long-range dependence in volatility for emerging markets

    Science.gov (United States)

    Cajueiro, Daniel O.; Tabak, Benjamin M.

    2005-02-01

    This paper tests whether volatility for equity returns for emerging markets possesses long-range dependence. Furthermore, the assertion of whether long-range dependence is time-varying is checked through a rolling sample approach. The empirical results suggest that there exists long-range dependence in emerging equity returns' volatility and also that it is time-varying. This assertion also holds true for Japan and the US, which are considered more developed markets. Moreover, these results are robust to “shuffling” the data to eliminate short-term autocorrelation. Therefore, they suggest that the class of GARCH processes, which are currently employed to analyze volatility of financial time series, is misspecified.

  10. Machine News and Volatility: The Dow Jones Industrial Average and the TRNA Sentiment Series

    NARCIS (Netherlands)

    D.E. Allen (David); A.K. Singh (Abhay)

    2014-01-01

This paper features an analysis of the relationship between the volatility of the Dow Jones Industrial Average (DJIA) Index and a sentiment news series using daily data obtained from the Thomson Reuters News Analytics (TRNA) provided by SIRCA (The Securities Industry Re

  11. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.

  12. Outliers Mining in Time Series Data Sets

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

In this paper, we present a cluster-based algorithm for time series outlier mining. We use the discrete Fourier transformation (DFT) to transform time series from the time domain to the frequency domain. Time series can thus be mapped as points in k-dimensional space. For these points, a cluster-based algorithm is developed to mine the outliers. The algorithm first partitions the input points into disjoint clusters and then prunes the clusters that, by this judgment, cannot contain outliers. Our algorithm has been run on the electrical load time series of a steel enterprise and proved to be effective.
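
A hedged sketch of the general approach: each series is mapped to a low-dimensional point via its first few DFT magnitudes, the points are grouped with a very small k-means, and the series farthest from every cluster centre is flagged as an outlier candidate. The pruning step of the paper's algorithm is not reproduced; the number of Fourier features, the number of clusters, and the synthetic data are assumptions.

```python
# Hedged sketch: DFT features + a tiny k-means, with outlier candidates scored
# by their distance to the nearest cluster centre.
import numpy as np

def dft_features(series, k=4):
    """First k Fourier magnitudes (excluding the DC term) as a k-dimensional point."""
    spec = np.abs(np.fft.rfft(series - series.mean()))
    return spec[1:k + 1]

def kmeans(points, n_clusters=2, iters=50):
    """Very small k-means (deterministic initialisation from the first points)."""
    centers = points[:n_clusters].copy()
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(n_clusters):
            if np.any(labels == c):
                centers[c] = points[labels == c].mean(axis=0)
    return centers

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    t = np.linspace(0, 1, 128)
    normal = [np.sin(2 * np.pi * (3 + rng.normal(0, 0.2)) * t) + 0.1 * rng.normal(size=t.size)
              for _ in range(30)]
    outlier = [np.sin(2 * np.pi * 17 * t) + 0.1 * rng.normal(size=t.size)]  # different band
    points = np.array([dft_features(s) for s in normal + outlier])
    centers = kmeans(points)
    scores = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2).min(axis=1)
    print("most outlying series index (the planted outlier is index 30):", int(scores.argmax()))
```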

  13. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  14. Coupling between time series: a network view

    CERN Document Server

    Mehraban, Saeed; Zamani, Maryam; Jafari, Gholamreza

    2013-01-01

Recently, the visibility graph has been introduced as a novel view for analyzing time series, mapping a series to a complex network. In this paper, we introduce a new visibility algorithm, "cross-visibility", which reveals the conjugation of two coupled time series. The correspondence between the two time series is mapped to a network, "the cross-visibility graph", to demonstrate the correlation between them. We applied the algorithm to several correlated and uncorrelated time series, generated by the linear stationary ARFIMA process. The results demonstrate that the cross-visibility graph associated with correlated time series with power-law auto-correlation is scale-free. If the time series are uncorrelated, the degree distribution of their cross-visibility network deviates from a power law. To further clarify the process, we applied the algorithm to real-world data from the financial trades of two companies, and observed significant small-scale coupling in their dynamics.
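
For orientation, a sketch of the standard natural visibility graph for a single series, which is the building block that the cross-visibility idea extends to a pair of coupled series; the cross-visibility construction itself and the ARFIMA test signals are not reproduced, and the random-walk toy series is an assumption.

```python
# Hedged sketch: natural visibility graph of a single series.
import numpy as np

def visibility_edges(x):
    """Return edges (i, j) of the natural visibility graph of series x:
    i and j are connected if the straight line between (i, x[i]) and (j, x[j])
    passes above every intermediate sample."""
    x = np.asarray(x, dtype=float)
    edges = []
    n = len(x)
    for i in range(n - 1):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = x[j] + (x[i] - x[j]) * (j - ks) / (j - i)
            if np.all(x[ks] < line):
                edges.append((i, j))
    return edges

def degree_distribution(edges, n):
    deg = np.zeros(n, dtype=int)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return np.bincount(deg)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    series = np.cumsum(rng.normal(size=300))      # toy random-walk series
    e = visibility_edges(series)
    print("edges:", len(e), "degree histogram:", degree_distribution(e, len(series)))
```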

  15. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  16. Detecting macroeconomic phases in the Dow Jones Industrial Average time series

    Science.gov (United States)

    Wong, Jian Cheng; Lian, Heng; Cheong, Siew Ann

    2009-11-01

    In this paper, we perform statistical segmentation and clustering analysis of the Dow Jones Industrial Average (DJI) time series between January 1997 and August 2008. Modeling the index movements and log-index movements as stationary Gaussian processes, we find a total of 116 and 119 statistically stationary segments respectively. These can then be grouped into between five and seven clusters, each representing a different macroeconomic phase. The macroeconomic phases are distinguished primarily by their volatilities. We find that the US economy, as measured by the DJI, spends most of its time in a low-volatility phase and a high-volatility phase. The former can be roughly associated with economic expansion, while the latter contains the economic contraction phase in the standard economic cycle. Both phases are interrupted by a moderate-volatility market correction phase, but extremely-high-volatility market crashes are found mostly within the high-volatility phase. From the temporal distribution of various phases, we see a high-volatility phase from mid-1998 to mid-2003, and another starting mid-2007 (the current global financial crisis). Transitions from the low-volatility phase to the high-volatility phase are preceded by a series of precursor shocks, whereas the transition from the high-volatility phase to the low-volatility phase is preceded by a series of inverted shocks. The time scale for both types of transitions is about a year. We also identify the July 1997 Asian Financial Crisis to be the trigger for the mid-1998 transition, and an unnamed May 2006 market event related to corrections in the Chinese markets to be the trigger for the mid-2007 transition.

  17. Hurst Exponent Analysis of Financial Time Series

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

Statistical properties of stock market time series and the implication of their Hurst exponents are discussed. Hurst exponents of DJIA (Dow Jones Industrial Average) components are tested using re-scaled range analysis. In addition to the original stock return series, the linear prediction errors of the daily returns are also tested. Numerical results show that the Hurst exponent analysis can provide some information about the statistical properties of the financial time series.
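
A short sketch of the re-scaled range (R/S) analysis mentioned above: compute R/S over non-overlapping windows of several sizes and estimate the Hurst exponent as the slope of log(R/S) against log(window size). The window sizes are assumptions, and a white-noise series (expected H near 0.5) stands in for the DJIA component returns studied in the paper.

```python
# Hedged sketch of re-scaled range (R/S) analysis for the Hurst exponent.
import numpy as np

def rescaled_range(x):
    """R/S statistic of one window: range of the cumulative mean-adjusted sum
    divided by the sample standard deviation."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())
    r = z.max() - z.min()
    s = x.std(ddof=1)
    return r / s if s > 0 else np.nan

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Average R/S over non-overlapping windows of each size; fit the log-log slope."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs = [rescaled_range(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]
        rs = [v for v in rs if np.isfinite(v)]
        if rs:
            log_n.append(np.log(n))
            log_rs.append(np.log(np.mean(rs)))
    return np.polyfit(log_n, log_rs, 1)[0]

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    white = rng.normal(size=4096)                 # uncorrelated returns: H should be near 0.5
    print("estimated Hurst exponent:", hurst_rs(white))
```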

  18. Transfer entropy between multivariate time series

    Science.gov (United States)

    Mao, Xuegeng; Shang, Pengjian

    2017-06-01

It is a crucial topic to identify the direction and strength of the interdependence between time series in multivariate systems. In this paper, we propose a method of transfer entropy based on the theory of time-delay reconstruction of a phase space, which is a model-free approach to detect causalities in multivariate time series. This method overcomes the limitation that the original transfer entropy can only capture which system drives the transition probabilities of another in scalar time series. Using artificial time series, we show that the driving character is clearly reflected as the coupling strength between two signals increases, and we confirm the effectiveness of the method with noise added. Furthermore, we apply it to real-world data, namely financial time series, in order to characterize the information flow among different stocks.
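
A hedged sketch of plain first-order transfer entropy on discretised scalar series, which is the quantity the paper generalises; the time-delay phase-space reconstruction for multivariate series is not reproduced. The number of bins and the linearly coupled toy system are assumptions.

```python
# Hedged sketch of scalar, first-order transfer entropy from discretised data:
# TE(Y -> X) = sum p(x_{t+1}, x_t, y_t) * log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ].
import numpy as np
from collections import Counter

def discretize(x, bins=4):
    edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    return np.digitize(x, edges)

def transfer_entropy(x, y, bins=4):
    """Estimate TE from y to x (how much y's past helps predict x's next value)."""
    xs, ys = discretize(x, bins), discretize(y, bins)
    triples = Counter(zip(xs[1:], xs[:-1], ys[:-1]))    # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(xs[:-1], ys[:-1]))           # (x_t, y_t)
    pairs_xx = Counter(zip(xs[1:], xs[:-1]))            # (x_{t+1}, x_t)
    singles = Counter(xs[:-1])                          # x_t
    n = len(xs) - 1
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x0, y0)]
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]
        te += p_joint * np.log(p_cond_xy / p_cond_x)
    return te

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n = 20_000
    y = rng.normal(size=n)
    x = np.empty(n)
    x[0] = 0.0
    for t in range(1, n):                               # x is driven by y's past
        x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + 0.1 * rng.normal()
    print("TE(y -> x):", transfer_entropy(x, y))
    print("TE(x -> y):", transfer_entropy(y, x))
```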

  19. Statistical criteria for characterizing irradiance time series.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.

  20. Reconstruction of time-delay systems from chaotic time series.

    Science.gov (United States)

    Bezruchko, B P; Karavaev, A S; Ponomarenko, V I; Prokhorov, M D

    2001-11-01

    We propose a method that allows one to estimate the parameters of model scalar time-delay differential equations from time series. The method is based on a statistical analysis of time intervals between extrema in the time series. We verify our method by using it for the reconstruction of time-delay differential equations from their chaotic solutions and for modeling experimental systems with delay-induced dynamics from their chaotic time series.

  1. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  2. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  3. Nonlinear transformation on the transfer entropy of financial time series

    Science.gov (United States)

    Wu, Zhenyu; Shang, Pengjian

    2017-09-01

Transfer entropy (TE) is now widely used in data mining and in the economic field. However, TE itself demands that the time series be stationary and meet the Markov condition. Naturally, we are interested in investigating the effect of nonlinear transformations of the two series on the TE. Therefore, this paper studies the TE of five nonlinear "volatile" transformations based on data generated by linear modeling and by logistic map modeling, as well as on datasets from financial markets. Only one of the nonlinear transformations yields a TE that fluctuates around the TE of the original series; the TE of the others all increases to different degrees.

  4. On reconstruction of time series in climatology

    Directory of Open Access Journals (Sweden)

    V. Privalsky

    2015-10-01

Full Text Available The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series which contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter one into the past. The method presents a further development of previous efforts taken by a number of authors, starting from A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI), 1979–2014, and sunspot numbers (SSN), 1749–2014, to restore the TSI data over 1749–1978. The results of the reconstruction are in statistical agreement with observations.

  5. A radar image time series

    Science.gov (United States)

    Leberl, F.; Fuchs, H.; Ford, J. P.

    1981-01-01

A set of ten side-looking radar images of a mining area in Arizona that were acquired over a period of 14 yr are studied to demonstrate the photogrammetric differential-rectification technique applied to radar images and to examine changes that occurred in the area over time. Five of the images are rectified by using ground control points and a digital height model taken from a map. Residual coordinate errors in ground control are reduced from several hundred meters in all cases to + or - 19 to 70 m. The contents of the radar images are compared with a Landsat image and with aerial photographs. Effects of radar system parameters on radar images are briefly reviewed.

  6. Biclustering of time series microarray data.

    Science.gov (United States)

    Meng, Jia; Huang, Yufei

    2012-01-01

    Clustering is a popular data exploration technique widely used in microarray data analysis. In this chapter, we review ideas and algorithms of bicluster and its applications in time series microarray analysis. We introduce first the concept and importance of biclustering and its different variations. We then focus our discussion on the popular iterative signature algorithm (ISA) for searching biclusters in microarray dataset. Next, we discuss in detail the enrichment constraint time-dependent ISA (ECTDISA) for identifying biologically meaningful temporal transcription modules from time series microarray dataset. In the end, we provide an example of ECTDISA application to time series microarray data of Kaposi's Sarcoma-associated Herpesvirus (KSHV) infection.

  7. Portfolio Optimization under Local-Stochastic Volatility: Coefficient Taylor Series Approximations & Implied Sharpe Ratio

    OpenAIRE

    Matthew Lorig; Ronnie Sircar

    2015-01-01

    We study the finite horizon Merton portfolio optimization problem in a general local-stochastic volatility setting. Using model coefficient expansion techniques, we derive approximations for both the value function and the optimal investment strategy. We also analyze the `implied Sharpe ratio' and derive a series approximation for this quantity. The zeroth-order approximation of the value function and optimal investment strategy correspond to those obtained by Merton (1969) when the risky...

  8. Developing consistent time series landsat data products

    Science.gov (United States)

    The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There is increasing demand for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...

  9. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  10. DATA MINING IN CANADIAN LYNX TIME SERIES

    Directory of Open Access Journals (Sweden)

    R.Karnaboopathy

    2012-01-01

    Full Text Available This paper sums up the application of statistical models such as the ARIMA family of time series models to the Canadian lynx data and introduces a data mining method, combined with statistical knowledge, for analysing the Canadian lynx series.
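
    A minimal illustration of fitting an ARIMA-family model to the lynx series with statsmodels; the file name, column names and AR(2)-on-log10 specification are assumptions for the sketch, not the paper's exact model.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Hypothetical CSV holding the annual Canadian lynx trappings (1821-1934).
    lynx = pd.read_csv("lynx.csv", index_col="year")["trappings"]

    # Classical practice is to model the log10-transformed counts.
    log_lynx = np.log10(lynx)

    model = ARIMA(log_lynx, order=(2, 0, 0)).fit()   # AR(2), a common benchmark
    print(model.summary())
    print(model.forecast(steps=10))                   # ten-year-ahead forecast
    ```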

  11. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  12. A Simple Fuzzy Time Series Forecasting Model

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2016-01-01

    In this paper we describe a new first order fuzzy time series forecasting model. We show that our automatic fuzzy partitioning method provides an accurate approximation to the time series that when combined with rule forecasting and an OWA operator improves forecasting accuracy. Our model does...... not attempt to provide the best results in comparison with other forecasting methods but to show how to improve first order models using simple techniques. However, we show that our first order model is still capable of outperforming some more complex higher order fuzzy time series models....

  13. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  14. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
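
    A hedged sketch of a related specification using the arch package: an AR(1) conditional mean with ARCH(1) errors as a stand-in for the reported ARIMA(1,1,5)-ARCH(1) model; the input file is hypothetical.

    ```python
    import numpy as np
    from arch import arch_model

    # Hypothetical daily closing values of the index; percentage log returns
    # (the arch package is numerically better behaved with percent returns).
    close = np.loadtxt("psei_close.txt")
    returns = 100 * np.diff(np.log(close))

    # AR(1) mean with ARCH(1) conditional variance.
    am = arch_model(returns, mean="AR", lags=1, vol="ARCH", p=1)
    res = am.fit(disp="off")
    print(res.summary())
    print(res.forecast(horizon=5).variance.iloc[-1])  # 5-step-ahead variance forecast
    ```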

  15. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS docume

  16. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS docume

  17. Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models

    NARCIS (Netherlands)

    Koopman, Siem Jan; Ooms, Marius

    2004-01-01

    We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the stand

  18. Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models

    NARCIS (Netherlands)

    Koopman, Siem Jan; Ooms, Marius

    2004-01-01

    We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the

  19. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS

  20. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  1. Visibility Graph Based Time Series Analysis

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
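
    A small sketch of the natural visibility graph construction that underlies this kind of analysis (a single series mapped to one graph, not the full network-of-networks procedure described above); the random series is only for illustration.

    ```python
    import numpy as np
    import networkx as nx

    def natural_visibility_graph(y):
        """Nodes i < j are linked if every sample between them lies strictly
        below the straight line joining (i, y[i]) and (j, y[j])."""
        n = len(y)
        g = nx.Graph()
        g.add_nodes_from(range(n))
        for i in range(n - 1):
            for j in range(i + 1, n):
                k = np.arange(i + 1, j)
                if k.size == 0 or np.all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)):
                    g.add_edge(i, j)
        return g

    rng = np.random.default_rng(1)
    g = natural_visibility_graph(rng.standard_normal(200))
    print(g.number_of_edges(), nx.degree_histogram(g)[:10])
    ```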

  2. Global Financial Crises and Time-varying Volatility Comovement in World Equity Markets

    OpenAIRE

    Andrew Stuart Duncan; Alain Kabundi

    2011-01-01

    This paper studies volatility comovement in world equity markets between 1994 and 2008. Global volatility factors are extracted from a panel of monthly volatility proxies relating to 25 developed and 20 emerging stock markets. A dynamic factor model (FM) is estimated using two-year rolling window regressions. The FM's time-varying variance shares of global factors map variations in volatility comovement over time and across countries. The results indicate that global volatility linkages are ...

  3. Estimating the Persistence and the Autocorrelation Function of a Time Series that is Measured with Error

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV......) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized (volatility) measures, such as the realized...... variance, that are imperfect estimates of actual volatility. In an empirical analysis using realized measures for the DJIA stocks we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our...

  4. Estimating the Persistence and the Autocorrelation Function of a Time Series that is Measured with Error

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    2014-01-01

    An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV...... of actual volatility. In an empirical analysis using realized measures for the Dow Jones industrial average stocks, we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our application despite......) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized volatility measures that are imperfect estimates...

  5. Measuring nonlinear behavior in time series data

    Science.gov (United States)

    Wai, Phoong Seuk; Ismail, Mohd Tahir

    2014-12-01

    The stationarity test is an important test for detecting time series behavior, since financial and economic data series often have missing data and structural changes, as well as jumps or breaks in the data set. Moreover, a nonstationary time series variable can be made stationary by taking a difference-stationary or trend-stationary process. Two different types of stationarity tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of the time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and the Lagrange multiplier is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of the oil price, gold price and Malaysian stock market. Moreover, the Quandt-Andrews, Bai-Perron and Chow tests are also used to detect the existence of breaks in the data series. The monthly index data range from December 1989 until May 2012. The results show that these three series exhibit nonlinear properties but can be transformed into stationary series after taking the first difference.
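
    A brief sketch of running both tests with statsmodels on a level series and its first difference; the simulated random-walk series stands in for the oil, gold and stock index data.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import adfuller, kpss

    rng = np.random.default_rng(2)
    price = np.cumsum(rng.standard_normal(270)) + 100   # random-walk-like level

    for name, series in [("level", price), ("first difference", np.diff(price))]:
        adf_stat, adf_p = adfuller(series, autolag="AIC")[:2]
        kpss_stat, kpss_p = kpss(series, regression="c", nlags="auto")[:2]
        # ADF: H0 = unit root; KPSS: H0 = stationarity, so the hypotheses are reversed.
        print(f"{name}: ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")
    ```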

  6. Modelling Time-Varying Volatility in Financial Returns

    DEFF Research Database (Denmark)

    Amado, Cristina; Laakkonen, Helinä

    2014-01-01

    The “unusually uncertain” phase in the global financial markets has inspired many researchers to study the effects of ambiguity (or “Knightian uncertainty”) on the decisions made by investors and their implications for the capital markets. We contribute to this literature by using a modified...... version of the time-varying GARCH model of Amado and Teräsvirta (2013) to analyze whether the increasing uncertainty has caused excess volatility in the US and European government bond markets. In our model, volatility is multiplicatively decomposed into two time-varying conditional components: the first...... being captured by a stable GARCH(1,1) process and the second driven by the level of uncertainty in the financial market....

  7. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  8. Spectra: Time series power spectrum calculator

    Science.gov (United States)

    Gallardo, Tabaré

    2017-01-01

    Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86 (4), 619). It is very efficient for the detection of low frequencies.
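
    As a loose illustration of spectrum estimation for unevenly spaced samples, the sketch below uses SciPy's Lomb-Scargle periodogram rather than the Spectral Correlation Coefficient on which Spectra is actually based; the signal is synthetic.

    ```python
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0, 100, 400))                     # uneven sampling times
    y = np.sin(2 * np.pi * 0.5 * t) + 0.5 * rng.standard_normal(t.size)

    freqs = np.linspace(0.01, 2.0, 2000)                      # trial frequencies (Hz)
    power = lombscargle(t, y - y.mean(), 2 * np.pi * freqs, normalize=True)
    print(freqs[np.argmax(power)])                            # ~0.5 Hz peak expected
    ```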

  9. Improving Intercomparability of Marine Biogeochemical Time Series

    Science.gov (United States)

    Benway, Heather M.; Telszewski, Maciej; Lorenzoni, Laura

    2013-04-01

    Shipboard biogeochemical time series represent one of the most valuable tools scientists have to quantify marine elemental fluxes and associated biogeochemical processes and to understand their links to changing climate. They provide the long, temporally resolved data sets needed to characterize ocean climate, biogeochemistry, and ecosystem variability and change. However, to monitor and differentiate natural cycles and human-driven changes in the global oceans, time series methodologies must be transparent and intercomparable when possible. To review current shipboard biogeochemical time series sampling and analytical methods, the International Ocean Carbon Coordination Project (IOCCP; http://www.ioccp.org/) and the Ocean Carbon and Biogeochemistry Program (http://www.us-ocb.org/) convened an international ocean time series workshop at the Bermuda Institute for Ocean Sciences.

  10. FATS: Feature Analysis for Time Series

    CERN Document Server

    Nun, Isadora; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim

    2015-01-01

    In this paper, we present the FATS (Feature Analysis for Time Series) library. FATS is a Python library which facilitates and standardizes feature extraction for time series data. In particular, we focus on one application: feature extraction for astronomical light curve data, although the library is generalizable for other uses. We detail the methods and features implemented for light curve analysis, and present examples for its usage.

  11. Combination prediction method of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    ZHAO DongHua; RUAN Jiong; CAI ZhiJie

    2007-01-01

    In the present paper, we propose an approach for combination prediction of chaotic time series. The method is based on the adding-weight one-rank local-region method for chaotic time series. It allows us to define an interval containing a future value with a given probability, obtained by studying the prediction error distribution. Its effectiveness is shown with data generated by the logistic map.

  12. Pseudotime estimation: deconfounding single cell time series

    OpenAIRE

    John E Reid; Wernisch, Lorenz

    2016-01-01

    Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions o...

  13. Multivariate Time Series Decomposition into Oscillation Components.

    Science.gov (United States)

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-08-01

    Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017 ). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.

  14. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  15. Time Series Forecasting with Missing Values

    Directory of Open Access Journals (Sweden)

    Shin-Fu Wu

    2015-11-01

    Full Text Available Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, on the other hand, may alter the original time series. In this study, we propose a novel forecasting method based on the least squares support vector machine (LSSVM). We employ input patterns with temporal information, defined as the local time index (LTI). Time series data as well as local time indexes are fed to the LSSVM for forecasting without imputation. We compare the forecasting performance of our method with other imputation methods. Experimental results show that the proposed method is promising and worth further investigation.
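
    A hedged sketch of the local-time-index idea; scikit-learn's SVR is used here as a stand-in for LSSVM (which scikit-learn does not provide), and the sine series with randomly deleted samples is purely illustrative.

    ```python
    import numpy as np
    from sklearn.svm import SVR

    def make_patterns(y, t, lag=5):
        """Each pattern uses the last `lag` available (non-missing) samples
        before the target plus their local time indexes (offsets from the target),
        so gaps need no imputation."""
        obs = np.flatnonzero(~np.isnan(y))            # indices of available samples
        X, target = [], []
        for k in range(lag, len(obs)):
            i = obs[k]                                 # index of the target sample
            past = obs[k - lag:k]                      # last `lag` available samples
            X.append(np.concatenate([y[past], t[past] - t[i]]))
            target.append(y[i])
        return np.array(X), np.array(target)

    rng = np.random.default_rng(4)
    t = np.arange(500.0)
    y = np.sin(t / 20) + 0.1 * rng.standard_normal(500)
    y[rng.choice(500, 60, replace=False)] = np.nan     # knock out 12% of the values

    X, target = make_patterns(y, t)
    model = SVR(C=10.0, gamma="scale").fit(X, target)
    print(model.score(X, target))
    ```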

  16. Formation Mechanism of the Accumulative Magnification Effect in a Financial Time Series

    Institute of Scientific and Technical Information of China (English)

    DUAN Wen-Qi

    2012-01-01

    Structural information contained in financial time series can be magnified effectively by constructing the accumulative return. In order to make the magnification effects of different financial time series comparable, we first propose a standard method to characterize the strength of the accumulative magnification effect. Then, we employ a decomposed-randomized technique to uncover the formation mechanism of the accumulative magnification effect. Our results show that (1) the standard deviation pattern is determined by volatility dependence, (2) the Hurst exponent pattern is induced by sign dependence, and (3) the approximate entropy pattern is caused by the combined effect of sign dependence and volatility dependence.

  17. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
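
    A minimal sketch of the time-averaged MSD for a simulated geometric-Brownian-motion-like log-price series; the drift and volatility parameters are illustrative, not those of the Dow Jones data.

    ```python
    import numpy as np

    def time_averaged_msd(x, lags):
        """Time-averaged mean squared displacement: for each lag d, average
        (x(t+d) - x(t))**2 over all start times t in the record."""
        return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

    rng = np.random.default_rng(5)
    log_price = np.cumsum(0.0005 + 0.01 * rng.standard_normal(10_000))

    lags = np.unique(np.logspace(0, 3, 30).astype(int))
    msd = time_averaged_msd(log_price, lags)
    # For Brownian-type dynamics the TAMSD grows roughly linearly with the lag.
    slope = np.polyfit(np.log(lags), np.log(msd), 1)[0]
    print(f"log-log slope ~ {slope:.2f}")
    ```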

  18. Feature Matching in Time Series Modelling

    CERN Document Server

    Xia, Yingcun

    2011-01-01

    Using a time series model to mimic an observed time series has a long history. However, with regard to this objective, conventional estimation methods for discrete-time dynamical models are frequently found to be wanting. In the absence of a true model, we prefer an alternative approach to conventional model fitting that typically involves one-step-ahead prediction errors. Our primary aim is to match the joint probability distribution of the observable time series, including long-term features of the dynamics that underpin the data, such as cycles, long memory and others, rather than short-term prediction. For want of a better name, we call this specific aim feature matching. The challenges of model mis-specification, measurement errors and the scarcity of data are forever present in real time series modelling. In this paper, by synthesizing earlier attempts into an extended-likelihood, we develop a systematic approach to empirical time series analysis to address these challenges and to aim at achieving...

  19. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  20. Predicting road accidents: Structural time series approach

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between the years 1970 and 2010 was developed, and with this model the number of road accidents has been predicted using the structural time series approach. The models are developed using a stepwise method and the residual of each step has been analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE) and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving the conventional time series method.
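
    A short sketch of fitting a local linear trend model with statsmodels' UnobservedComponents; the simulated annual accident counts are placeholders for the Malaysian data.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical annual road-accident counts, 1970-2010 (41 observations).
    rng = np.random.default_rng(6)
    accidents = 50_000 + np.cumsum(rng.normal(2_000, 3_000, 41))

    # Local linear trend: both the level and the slope evolve as random walks.
    model = sm.tsa.UnobservedComponents(accidents, level="local linear trend")
    res = model.fit(disp=False)
    print(res.summary())
    print(res.forecast(steps=5))     # five-year-ahead predictions
    ```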

  1. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting......, there is so far no systematic research to study and compare their performance. How to select effective techniques of feature preprocessing in a forecasting model remains a problem. In this paper, the authors conduct a comprehensive study of existing feature preprocessing techniques to evaluate their empirical...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...

  2. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  3. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from an incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from the observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may fail intermittently for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a random missing-value pattern to the original (complete) time series, is utilized. There are two main performance measures used in this work: (1) error measures between the actual
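
    A compact sketch of the standard procedure on complete data: time-delay phase-space reconstruction followed by a zeroth-order local-neighbour prediction. The synthetic "surge" series and the embedding parameters are assumptions, and the gap-handling variants described in the abstract are not shown.

    ```python
    import numpy as np

    def delay_embed(x, dim=3, tau=1):
        """Time-delay phase-space reconstruction of a scalar series."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    def local_predict(x, dim=3, tau=1, k=10):
        """One-step prediction from the k nearest dynamical neighbours of the
        current state (a simple zeroth-order local model)."""
        emb = delay_embed(x, dim, tau)
        current, history = emb[-1], emb[:-1]
        targets = x[(dim - 1) * tau + 1:]          # value following each past state
        dists = np.linalg.norm(history - current, axis=1)
        return targets[np.argsort(dists)[:k]].mean()

    rng = np.random.default_rng(7)
    t = np.arange(2000)
    surge = np.sin(t / 12.4) + 0.3 * np.sin(t / 6.2) + 0.1 * rng.standard_normal(2000)
    print(local_predict(surge, dim=4, tau=2, k=15))
    ```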

  4. Volatile and non-volatile compounds in green tea affected in harvesting time and their correlation to consumer preference.

    Science.gov (United States)

    Kim, Youngmok; Lee, Kwang-Geun; Kim, Mina K

    2016-10-01

    The current study was designed to find out how tea harvesting time affects the volatile and non-volatile compound profiles of green tea. In addition, the correlation of instrumental volatile and non-volatile compound analyses to consumer perception was analyzed. Overall, earlier harvested green tea had stronger antioxidant capacity (~61.0%) due to the polyphenolic compounds from catechin (23,164 mg/L), in comparison to later harvested green teas (11,961 mg/L). However, high catechin content influenced consumer liking of green tea negatively, due to high bitterness (27.6%) and astringency (13.4%). Volatile compounds that drive consumer liking of green tea products were also identified, including linalool, 2,3-methyl butanal, 2-heptanone and (E,E)-3,5-Octadien-2-one. Findings from the current study are useful for the green tea industry as they provide the differences in physicochemical properties of green tea harvested at different intervals.

  5. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and non-ensemble methods.

  6. Complex network analysis of time series

    Science.gov (United States)

    Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen

    2016-12-01

    Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest, and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which allow one to characterize many types of systems in nature and technology that contain a large number of components interacting with each other in a complicated manner. Recently, complex network theory has been incorporated into the analysis of time series, and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain function, ECG dynamics, economics and traffic systems.

  7. Time-varying volatility in Malaysian stock exchange: An empirical study using multiple-volatility-shift fractionally integrated model

    Science.gov (United States)

    Cheong, Chin Wen

    2008-02-01

    This article investigated the influence of structural breaks on the fractionally integrated time-varying volatility model in the Malaysian stock market, covering the Kuala Lumpur Composite Index and four major sectoral indices. A fractionally integrated time-varying volatility model combined with sudden changes is developed to study the possibility of structural change in the empirical data sets. Our empirical results showed a substantial reduction in the fractional differencing parameters after the inclusion of structural change during the Asian financial and currency crises. Moreover, the fractionally integrated model with sudden changes in volatility performed better in the estimation and specification evaluations.

  8. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are currently using the Self-Organizing Map (SOM) with an unsupervised learning algorithm for clustering time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009), it seems that the whole concept of the clustering algorithm is correct, but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose from a need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we are searching for recordings with a similar development of information density. This can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of feature vectors and of the SOM itself. We describe time series in a simplistic way, evaluating standard deviations for separate parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies, varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are also discussed, and finally conclusions for further research are presented. We also present an overview of the related current literature and projects.

  9. Dynamical networks reconstructed from time series

    CERN Document Server

    Levnajić, Zoran

    2012-01-01

    A novel method of reconstructing dynamical networks from empirically measured time series is proposed. By statistically examining the correlations between motions displayed by network nodes, we derive a simple equation that directly yields the adjacency matrix, assuming the intra-network interaction functions to be known. We illustrate the method's implementation on a simple example and discuss the dependence of the reconstruction precision on the properties of the time series. Our method is applicable to any network, allowing the reconstruction precision to be maximized and errors to be estimated.
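
    A much-simplified illustration of the general idea: build an adjacency matrix by thresholding correlations between node time series. The paper's method additionally uses the known interaction functions, which this sketch does not.

    ```python
    import numpy as np

    def reconstruct_adjacency(series, threshold=0.5):
        """series is an (n_nodes, n_samples) array of node time series;
        link two nodes when their motions are strongly correlated."""
        corr = np.corrcoef(series)
        adj = (np.abs(corr) >= threshold).astype(int)
        np.fill_diagonal(adj, 0)
        return adj

    # Toy example: three coupled noisy oscillators plus one independent node.
    rng = np.random.default_rng(8)
    t = np.linspace(0, 40, 2000)
    base = np.sin(t)
    series = np.vstack([
        base + 0.2 * rng.standard_normal(t.size),
        base + 0.2 * rng.standard_normal(t.size),
        base + 0.2 * rng.standard_normal(t.size),
        rng.standard_normal(t.size),
    ])
    print(reconstruct_adjacency(series))
    ```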

  10. Improving the prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    李克平; 高自友; 陈天仑

    2003-01-01

    One of the features of deterministic chaos is sensitivity to initial conditions. This feature limits the prediction horizons of many chaotic systems. In this paper, we propose a new prediction technique for chaotic time series. In our method, neighbouring points of the predicted point for which the corresponding local Lyapunov exponent is particularly large are discarded when estimating the local dynamics, and thus the error accumulated by the prediction algorithm is reduced. The model is tested on the convection amplitude of the Lorenz system. The simulation results indicate that the technique can improve the prediction of chaotic time series.

  11. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of this notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ......A first version of this notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  12. Xenon isotopic constraints on the timing of atmospheric volatile recycling

    Science.gov (United States)

    Parai, R.; Mukhopadhyay, S.

    2015-12-01

    Constraints on the recycling of atmospheric volatiles into the deep Earth provide important insights into mantle temperature, cooling rate, structure and style of convection over Earth history. Studies of ancient atmospheric gases trapped in Archean cherts show that the Xe isotopic composition of the atmosphere at ~3.5 Ga differed from the modern atmosphere [1]. This suggests the atmosphere evolved in isotopic composition until it reached its present-day composition at some time after 3.5 Ga. The evolution of the atmospheric Xe isotopic composition presents an opportunity to constrain the timing of Xe recycling into the Earth's mantle. Xe isotopes measured in mid-ocean ridge basalts [MORBs; 2,3] and plume-related basalts [4,5] indicate that both the upper mantle and plume source Xe isotopic compositions are dominated by recycled Xe [e.g., 3]. We find that the mantle source Xe isotopic compositions cannot be explained by recycling ancient atmospheric Xe alone; rather, subduction and incorporation of material bearing the modern atmospheric Xe composition must dominate. We note that our findings are consistent with a number of physical reasons that recently-subducted volatiles should be more prevalent than ancient subducted volatiles. First, a higher Archean mantle potential temperature should inhibit early Xe recycling to the deep Earth. Second, since the mantle turnover time scale is estimated to be between a few hundreds of Myr and 1 Gyr, the mantle recycled atmospheric Xe budget should be primarily composed of Xe subducted after ~2.5 Ga, at which point the atmosphere approaches the modern Xe composition [1]. Therefore, even if ancient atmospheric Xe were recycled efficiently to the mantle early in Earth history, the recycled atmospheric Xe budget of the mantle should still be dominated by the modern atmospheric Xe composition. [1] Pujol et al., 2011, EPSL; [2] Tucker et al., 2012, EPSL; [3] Parai and Mukhopadhyay, 2015, G-cubed; [4] Mukhopadhyay, 2012, Nature; [5

  13. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  14. Multifractal Analysis of Polyalanines Time Series

    CERN Document Server

    Figueirêdo, P H; Moret, M A; Coutinho, Sérgio; 10.1016/j.physa.2009.11.045

    2010-01-01

    Multifractal properties of the energy time series of short α-helix structures, specifically from a polyalanine family, are investigated through the MF-DFA technique (multifractal detrended fluctuation analysis). Estimates for the generalized Hurst exponent h(q) and its associated multifractal exponents τ(q) are obtained for several series generated by numerical simulations of molecular dynamics in different systems from distinct initial conformations. All simulations were performed using the GROMOS force field, implemented in the program THOR. The main results show that all series exhibit multifractal behavior depending on the number of residues and temperature. Moreover, the multifractal spectra reveal important aspects of the time evolution of the system and suggest that the nucleation process of the secondary structures during the visits on the energy hyper-surface is an essential feature of the folding process.

  15. Estimating High-Dimensional Time Series Models

    DEFF Research Database (Denmark)

    Medeiros, Marcelo C.; Mendes, Eduardo F.

    We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume both the number of covariates in the model and candidate variables can increase with the number of observations and the number of candidate variables is, possibly...

  16. Time series tapering for short data samples

    DEFF Research Database (Denmark)

    Kaimal, J.C.; Kristensen, L.

    1991-01-01

    We explore the effect of applying tapered windows on atmospheric data to eliminate overestimation inherent in spectra computed from short time series. Some windows are more effective than others in correcting this distortion. The Hamming window gave the best results with experimental data...
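
    A small sketch of applying a Hamming taper before computing a periodogram from a short record, with a simple compensation for the window's power loss; the sampling rate and series are illustrative.

    ```python
    import numpy as np

    def tapered_periodogram(x, fs=1.0):
        """Periodogram of a short record after applying a Hamming taper;
        the window power is compensated so variance is approximately preserved."""
        x = x - x.mean()
        w = np.hamming(len(x))
        xw = x * w / np.sqrt(np.mean(w ** 2))       # compensate the taper's power loss
        spec = np.abs(np.fft.rfft(xw)) ** 2 / (fs * len(x))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return freqs, spec

    rng = np.random.default_rng(9)
    x = np.cumsum(rng.standard_normal(512))          # red-noise-like short record
    freqs, spec = tapered_periodogram(x, fs=20.0)    # e.g. 20 Hz sonic-anemometer data
    print(freqs[np.argmax(spec[1:]) + 1])            # dominant non-zero frequency
    ```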

  17. Designer networks for time series processing

    DEFF Research Database (Denmark)

    Svarer, C; Hansen, Lars Kai; Larsen, Jan

    1993-01-01

    The conventional tapped-delay neural net may be analyzed using statistical methods and the results of such analysis can be applied to model optimization. The authors review and extend efforts to demonstrate the power of this strategy within time series processing. They attempt to design compact...

  18. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of this notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  19. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do...

  20. Optimal transformations for categorical autoregressive time series

    NARCIS (Netherlands)

    Buuren, S. van

    1996-01-01

    This paper describes a method for finding optimal transformations for analyzing time series by autoregressive models. 'Optimal' implies that the agreement between the autoregressive model and the transformed data is maximal. Such transformations help 1) to increase the model fit, and 2) to analyze c

  1. Nonlinear time series modelling: an introduction

    OpenAIRE

    Simon M. Potter

    1999-01-01

    Recent developments in nonlinear time series modelling are reviewed. Three main types of nonlinear models are discussed: Markov Switching, Threshold Autoregression and Smooth Transition Autoregression. Classical and Bayesian estimation techniques are described for each model. Parametric tests for nonlinearity are reviewed with examples from the three types of models. Finally, forecasting and impulse response analysis is developed.

  2. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    textabstractThis paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  3. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  4. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to achieve effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system's behaviour based on them.

  5. Time Series Rule Discovery: Tough, not Meaningless

    NARCIS (Netherlands)

    Struzik, Z.R.

    2003-01-01

    `Model free' rule discovery from data has recently been subject to considerable criticism, which has cast a shadow over the emerging discipline of time series data mining. However, other than in data mining, rule discovery has long been the subject of research in statistical physics of complex pheno

  6. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  7. Parsimonious Linear Fingerprinting for Time Series

    Science.gov (United States)

    2010-09-01

    …like to detect such groups of harmonics. Fig. 1(d) gives a quick preview of the visualization and effectiveness of the proposed PLiF method: for the … coefficients of each individual frequency. As we find harmonic frequency sets in music, in real time series like motions we will expect to usually find …

  8. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    textabstractThis paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  9. On forecasting cointegrated seasonal time series

    NARCIS (Netherlands)

    M. Löf (Marten); Ph.H.B.F. Franses (Philip Hans)

    2000-01-01

    textabstractWe analyze periodic and seasonal cointegration models for bivariate quarterly observed time series in an empirical forecasting study. We include both single equation and multiple equation methods. A VAR model in first differences with and without cointegration restrictions is also

  10. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP...

  11. Common large innovations across nonlinear time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    2002-01-01

    textabstractWe propose a multivariate nonlinear econometric time series model, which can be used to examine if there is common nonlinearity across economic variables. The model is a multivariate censored latent effects autoregression. The key feature of this model is that nonlinearity appears as sep

  12. Offset detection in GPS coordinate time series

    Science.gov (United States)

    Gazeaux, J.; King, M. A.; Williams, S. D.

    2013-12-01

    Global Positioning System (GPS) time series are commonly affected by offsets of unknown magnitude, and the large volume of data globally warrants investigation of automated detection approaches. The Detection of Offsets in GPS Experiment (DOGEx) showed that the accuracy of GPS time series can be significantly improved by applying statistical offset detection methods (see Gazeaux et al. (2013)). However, the best of these approaches did not perform as well as manual detection by expert analysts. Many of the features of GPS coordinate time series have not yet been fully taken into account in existing methods. Here, we apply Bayesian theory in order to make use of prior knowledge of the site noise characteristics and metadata in an attempt to make offset detection more accurate. In the past decades, Bayesian theory has shown relevant results for a wide range of applications, but it has not yet been applied to GPS coordinate time series. Such methods incorporate different inputs such as a dynamic model (linear trend, periodic signal, ...) and a priori information in a process that provides the best estimate of parameters (velocity, phase and amplitude of periodic signals, ...) based on all the available information. We test the new method on the DOGEx simulated dataset and compare it to previous solutions, and to a Monte Carlo method, to test the accuracy of the procedure. We make a preliminary extension of the DOGEx dataset to introduce metadata information, allowing us to test the value of this data type in detecting offsets. The flexibility, robustness and limitations of the new approach are discussed. Gazeaux, J., Williams, S., King, M., Bos, M., Dach, R., Deo, M., Moore, A. W., Ostini, L., Petrie, E., Roggero, M., Teferle, F. N., Olivares, G., Webb, F. H. (2013). Detecting offsets in GPS time series: First results from the detection of offsets in GPS experiment. Journal of Geophysical Research: Solid Earth, 118(5). Keywords: GPS

  13. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing application

  16. Remote Sensing Time Series Product Tool

    Science.gov (United States)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; the TSPT, however, makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high revisit frequency (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced

  17. Delay differential analysis of time series.

    Science.gov (United States)

    Lainscsek, Claudia; Sejnowski, Terrence J

    2015-03-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  19. Time Series Forecasting: A Nonlinear Dynamics Approach

    OpenAIRE

    Sello, Stefano

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun magnetic cy...

  20. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  1. Time-varying correlation and common structures in volatility

    NARCIS (Netherlands)

    Liu, Yang

    2016-01-01

    This thesis studies time series properties of the covariance structure of multivariate asset returns. First, the time-varying feature of correlation is investigated at the intraday level with a new correlation model incorporating the intraday correlation dynamics. Second, the thesis develops a

  2. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. We investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality and a stationary component. Further, we apply both kinds of indicator saturation to detect additive outliers and level shifts in the industrial production series in five European countries.

  3. Nonlinear Analysis of Physiological Time Series

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-fang; PENG Yu-hua; XUE Yu-li; HAN Min

    2007-01-01

    Heart rate variability could be explained by a low-dimensional governing mechanism. There has been increasing interest in verifying and understanding the coupling between respiration and heart rate. In this paper we use a nonlinear detection method to detect the nonlinear deterministic component in physiological time series from a single-variable series and from a two-variable series, respectively, and use conditional information entropy to analyze the correlation between heart rate, respiration and blood oxygen concentration. The conclusions are that there is a nonlinear deterministic component in the heart rate data and respiration data, and that heart rate and respiration are two variables originating from the same underlying dynamics.

  4. TIME SERIES FORECASTING USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2013-05-01

    Full Text Available Recent studies have shown the classification and prediction power of neural networks. It has been demonstrated that a NN can approximate any continuous function. Neural networks have been used successfully for forecasting financial data series. The classical methods used for time series prediction, like Box-Jenkins or ARIMA, assume that there is a linear relationship between inputs and outputs. Neural networks have the advantage that they can approximate nonlinear functions. In this paper we compare the performance of different feed-forward and recurrent neural networks and training algorithms for predicting the EUR/RON and USD/RON exchange rates. We used data series with daily exchange rates starting from 2005 until 2013.
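
    A minimal sketch of this kind of neural-network forecasting, using scikit-learn's MLPRegressor on a synthetic random-walk series standing in for a daily exchange rate; the lag length, network size and the helper lag_matrix are illustrative choices, not the authors' configuration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def lag_matrix(series, n_lags):
            """Build (X, y) pairs where X holds the previous n_lags values and y the next one."""
            X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
            y = series[n_lags:]
            return X, y

        # Synthetic "exchange-rate" series standing in for EUR/RON daily quotes
        rng = np.random.default_rng(1)
        rate = 4.0 + np.cumsum(rng.normal(0, 0.005, 1500))

        X, y = lag_matrix(rate, n_lags=5)
        split = int(0.8 * len(y))
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
        model.fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
        print(f"out-of-sample RMSE: {rmse:.5f}")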

  5. Algorithm for Compressing Time-Series Data

    Science.gov (United States)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
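
    The essence of the scheme, fitting a Chebyshev series over each fitting interval and transmitting only the coefficients, can be sketched with NumPy's Chebyshev utilities. Note that chebfit is a least-squares fit rather than a true minimax approximation, and the block size and degree below are arbitrary illustrative values, not those used on any spacecraft instrument.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        def compress(block, degree):
            """Fit a Chebyshev series of the given degree over one fitting interval."""
            x = np.linspace(-1.0, 1.0, len(block))     # map the interval onto [-1, 1]
            return C.chebfit(x, block, degree)          # (degree + 1) coefficients

        def decompress(coeffs, n_samples):
            x = np.linspace(-1.0, 1.0, n_samples)
            return C.chebval(x, coeffs)

        # Smooth synthetic telemetry: 256 samples per block, degree-8 fit -> ~28x fewer numbers
        rng = np.random.default_rng(2)
        t = np.linspace(0, 10, 2048)
        signal = np.sin(t) + 0.2 * np.sin(7 * t)

        block_size, degree = 256, 8
        recon = np.concatenate([
            decompress(compress(signal[i:i + block_size], degree), block_size)
            for i in range(0, len(signal), block_size)
        ])
        print("max abs error:", np.max(np.abs(recon - signal)))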

  6. Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models

    DEFF Research Database (Denmark)

    Hillebrand, Eric Tobias; Medeiros, Marcelo C.

    We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building...

  7. Pseudotime estimation: deconfounding single cell time series.

    Science.gov (United States)

    Reid, John E; Wernisch, Lorenz

    2016-10-01

    Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions on cell progression. We present a principled probabilistic model with a Bayesian inference scheme to analyse such data. We demonstrate our method's utility on public microarray, nCounter and RNA-seq datasets from three organisms. Our method almost perfectly recovers withheld capture times in an Arabidopsis dataset, it accurately estimates cell cycle peak times in a human prostate cancer cell line and it correctly identifies two precocious cells in a study of paracrine signalling in mouse dendritic cells. Furthermore, our method compares favourably with Monocle, a state-of-the-art technique. We also show using held-out data that uncertainty in the temporal dimension is a common confounder and should be accounted for in analyses of repeated cross-sectional time series. Our method is available on CRAN in the DeLorean package. john.reid@mrc-bsu.cam.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  8. Hurst exponents for short time series

    Science.gov (United States)

    Qi, Jingchao; Yang, Huijie

    2011-12-01

    A concept called balanced estimator of diffusion entropy is proposed to detect quantitatively scalings in short time series. The effectiveness is verified by detecting successfully scaling properties for a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series with length ~10^2. It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger compared with that for the stocks included in the catalog. Selecting a window with size 650, the evolution of scaling for the Shanghai Stock Index is obtained by the window's sliding along the series. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing the patterns with the important event list in the history of the considered stock market, the evolution of scaling is matched with the stock index series. We can find that the important events fit very well with global transitions of the scaling behaviors.
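
    For comparison, a classical rescaled-range (R/S) estimate of the Hurst exponent is easy to sketch in Python; this is the standard textbook estimator, not the balanced diffusion-entropy estimator proposed in this record, and the window sizes below are arbitrary.

        import numpy as np

        def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
            """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
            rs_means = []
            for w in window_sizes:
                rs = []
                for start in range(0, len(x) - w + 1, w):
                    seg = x[start:start + w]
                    dev = np.cumsum(seg - seg.mean())
                    r = dev.max() - dev.min()
                    s = seg.std(ddof=1)
                    if s > 0:
                        rs.append(r / s)
                rs_means.append(np.mean(rs))
            slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
            return slope

        rng = np.random.default_rng(3)
        increments = rng.normal(size=4096)            # white noise -> H should be near 0.5
        print(f"H ~ {hurst_rs(increments):.2f}")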

  9. Recovery of time-dependent volatility in option pricing model

    Science.gov (United States)

    Deng, Zui-Cha; Hon, Y. C.; Isakov, V.

    2016-11-01

    In this paper we investigate an inverse problem of determining the time-dependent volatility from observed market prices of options with different strikes. Due to the nonlinearity and sparsity of observations, an analytical solution to the problem is generally not available. Numerical approximation is also difficult to obtain using most of the existing numerical algorithms. Based on our recent theoretical results, we apply the linearisation technique to convert the problem into an inverse source problem from which recovery of the unknown volatility function can be achieved. Two kinds of strategies, namely, the integral equation method and the Landweber iterations, are adopted to obtain the stable numerical solution to the inverse problem. Both theoretical analysis and numerical examples confirm that the proposed approaches are effective. The work described in this paper was partially supported by a grant from the Research Grant Council of the Hong Kong Special Administrative Region (Project No. CityU 101112) and grants from the NNSF of China (Nos. 11261029, 11461039), and NSF grants DMS 10-08902 and 15-14886 and by Emylou Keith and Betty Dutcher Distinguished Professorship at the Wichita State University (USA).
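
    The Landweber iteration mentioned above is, in its basic linear form, a damped gradient step on the residual, x_{k+1} = x_k + omega * A^T (b - A x_k). The sketch below applies it to a generic ill-conditioned linear system rather than to the option-pricing inverse problem itself, and the step size and iteration count are illustrative; for noisy data the iteration count acts as the regularisation parameter.

        import numpy as np

        def landweber(A, b, n_iter=500, omega=None):
            """Landweber iteration x_{k+1} = x_k + omega * A^T (b - A x_k) for A x = b.

            omega must satisfy 0 < omega < 2 / ||A||^2 for convergence.
            """
            if omega is None:
                omega = 1.0 / np.linalg.norm(A, 2) ** 2
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                x = x + omega * A.T @ (b - A @ x)
            return x

        # Mildly ill-conditioned toy problem with noisy data
        rng = np.random.default_rng(4)
        A = np.vander(np.linspace(0, 1, 40), 8, increasing=True)
        x_true = rng.normal(size=8)
        b = A @ x_true + rng.normal(0, 1e-3, 40)
        x_hat = landweber(A, b, n_iter=20000)
        print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))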

  10. Incomplete Continuous-Time Securities Markets with Stochastic Income Volatility

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Larsen, Kasper

    In an incomplete continuous-time securities market governed by Brownian motions, we derive closed-form solutions for the equilibrium risk-free rate and equity premium processes. The economy has a finite number of heterogeneous exponential utility investors, who receive partially unspanned income and can trade continuously on a finite time interval in a money market account and a single risky security. Besides establishing the existence of an equilibrium, our main result shows that if the investors' unspanned income has stochastic counter-cyclical volatility, the resulting equilibrium can display both lower risk-free rates and higher risk premia relative to the Pareto efficient equilibrium in an otherwise identical complete market. Consequently, our model can simultaneously help explain the risk-free rate and equity premium puzzles.

  12. Sliced Inverse Regression for Time Series Analysis

    Science.gov (United States)

    Chen, Li-Sue

    1995-11-01

    In this thesis, general nonlinear models for time series data are considered. A basic form is x_t = f(beta_1^T X_{t-1}, beta_2^T X_{t-1}, ..., beta_k^T X_{t-1}, epsilon_t), where x_t is the observed time series, X_t is the vector of the first d time lags (x_t, x_{t-1}, ..., x_{t-d+1}), f is an unknown function, the beta_i are unknown vectors, and the epsilon_t are independently distributed. Special cases include AR and TAR models. We investigate the feasibility of applying SIR/PHD (Li 1990, 1991) (the sliced inverse regression and principal Hessian methods) in estimating the beta_i. PCA (principal component analysis) is brought in to check one critical condition for SIR/PHD. Through simulation and a study on three well-known data sets of Canadian lynx, U.S. unemployment rate and sunspot numbers, we demonstrate how SIR/PHD can effectively retrieve the interesting low-dimensional structures for time series data.
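
    A compact sketch of basic sliced inverse regression applied to a lagged design, in the spirit of the approach described above; the helper sir_directions, the slice count and the synthetic threshold-like series are all illustrative, and the PHD and PCA diagnostics are omitted.

        import numpy as np

        def sir_directions(X, y, n_slices=10, n_directions=2):
            """Basic sliced inverse regression: estimate e.d.r. directions beta_i."""
            n, p = X.shape
            mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
            # Whiten the predictors
            evals, evecs = np.linalg.eigh(cov)
            cov_inv_sqrt = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
            Z = (X - mu) @ cov_inv_sqrt
            # Slice the response and average the whitened predictors within each slice
            order = np.argsort(y)
            M = np.zeros((p, p))
            for idx in np.array_split(order, n_slices):
                m = Z[idx].mean(axis=0)
                M += (len(idx) / n) * np.outer(m, m)
            # Leading eigenvectors of M, mapped back to the original coordinates
            w, v = np.linalg.eigh(M)
            top = v[:, np.argsort(w)[::-1][:n_directions]]
            return cov_inv_sqrt @ top           # columns are the estimated directions

        # Lagged design for a nonlinear AR-type series driven by a single linear index
        rng = np.random.default_rng(5)
        x = np.zeros(3000)
        for t in range(5, 3000):
            s = 0.8 * x[t - 1] - 0.3 * x[t - 2]
            x[t] = np.tanh(s) + 0.1 * rng.normal()
        X = np.column_stack([x[4:-1], x[3:-2], x[2:-3], x[1:-4], x[:-5]])   # lags 1..5
        beta_hat = sir_directions(X, x[5:], n_directions=1)
        print(beta_hat.ravel() / np.linalg.norm(beta_hat))   # loads mainly on lags 1 and 2 (up to sign)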

  13. Understanding Financial Market Volatility

    NARCIS (Netherlands)

    A. Opschoor (Anne)

    2014-01-01

    markdownabstract__Abstract__ Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. Loosely speaking, volatility is defined as the average magnitude of fluctuations observed in some phenomenon over time. Wi

  14. Univariate time series forecasting algorithm validation

    Science.gov (United States)

    Ismail, Suzilah; Zakaria, Rohaiza; Muda, Tuan Zalizam Tuan

    2014-12-01

    Forecasting is a complex process which requires expert tacit knowledge to produce accurate forecast values. This complexity contributes to the gap between end users and experts. Automating this process by using an algorithm can act as a bridge between them. An algorithm is a well-defined rule for solving a problem. In this study a univariate time series forecasting algorithm was developed in JAVA and validated using SPSS and Excel. Two sets of simulated data (yearly and non-yearly), several univariate forecasting techniques (i.e. Moving Average, Decomposition, Exponential Smoothing, Time Series Regressions and ARIMA) and a recent forecasting process (data partitioning, several error measures, recursive evaluation, etc.) were employed. The results of the algorithm successfully tally with the results of SPSS and Excel. This algorithm will benefit not just forecasters but also end users who lack in-depth knowledge of the forecasting process.

  15. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  16. Multivariate Voronoi Outlier Detection for Time Series.

    Science.gov (United States)

    Zwilling, Chris E; Wang, Michelle Yongmei

    2014-10-01

    Outlier detection is a primary step in many data mining and analysis applications, including healthcare and medical research. This paper presents a general method to identify outliers in multivariate time series based on a Voronoi diagram, which we call Multivariate Voronoi Outlier Detection (MVOD). The approach copes with outliers in a multivariate framework, via designing and extracting effective attributes or features from the data that can take parametric or nonparametric forms. Voronoi diagrams allow for automatic configuration of the neighborhood relationship of the data points, which facilitates the differentiation of outliers and non-outliers. Experimental evaluation demonstrates that our MVOD is an accurate, sensitive, and robust method for detecting outliers in multivariate time series data.

  17. Visibility graphlet approach to chaotic time series

    Energy Technology Data Exchange (ETDEWEB)

    Mutua, Stephen [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega (Kenya); Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn; Yang, Huijie, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2016-05-15

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
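
    The building block of such time-series-to-network mappings, the natural visibility graph of Lacasa et al. (2008), is straightforward to construct; the sketch below is that standard graph rather than the graphlet extension of this record, applied to the logistic map in a periodic and a chaotic regime.

        import numpy as np

        def natural_visibility_graph(y):
            """Adjacency matrix of the natural visibility graph of a scalar series.

            Samples (i, y_i) and (j, y_j) are linked when every intermediate sample lies
            strictly below the straight line joining them. O(n^2) reference implementation.
            """
            n = len(y)
            A = np.zeros((n, n), dtype=int)
            for i in range(n - 1):
                for j in range(i + 1, n):
                    k = np.arange(i + 1, j)
                    if np.all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)):
                        A[i, j] = A[j, i] = 1
            return A

        def logistic(r, n, x0=0.4):
            x = np.empty(n)
            x[0] = x0
            for t in range(1, n):
                x[t] = r * x[t - 1] * (1 - x[t - 1])
            return x

        # Periodic regime (r = 3.2) versus chaotic regime (r = 4.0)
        for r in (3.2, 4.0):
            A = natural_visibility_graph(logistic(r, 300))
            print(f"r = {r}: mean degree = {A.sum(axis=1).mean():.2f}")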

  18. Visibility graphlet approach to chaotic time series.

    Science.gov (United States)

    Mutua, Stephen; Gu, Changgui; Yang, Huijie

    2016-05-01

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.

  19. Inductorless Chua's Circuit: Experimental Time Series Analysis

    Directory of Open Access Journals (Sweden)

    R. M. Rubinger

    2007-01-01

    Full Text Available We have implemented an operational amplifier inductorless realization of the Chua's circuit. We have registered time series from its dynamical variables with the resistor R as the control parameter and varying from 1300 Ω to 2000 Ω. Experimental time series at fixed R were used to reconstruct attractors by the delay vector technique. The flow attractors and their Poincaré maps considering parameters such as the Lyapunov spectrum, its subproduct the Kaplan-Yorke dimension, and the information dimension are also analyzed here. The results for a typical double scroll attractor indicate a chaotic behavior characterized by a positive Lyapunov exponent and with a Kaplan-Yorke dimension of 2.14. The occurrence of chaos was also investigated through numerical simulations of the Chua's circuit set of differential equations.

  20. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, C; Toft, P; Rostrup, E

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate the relationship between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.

  1. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    genetic algorithms can generate a sequence of events to maximize some functions or the likelihood to achieve the assumed goals. With reference...Reinforcement learning is not the same as relational time-series learning mainly because its main focus is to learn a set of policies to maximize the...scope blending, and has been applied to machine poetry generation [48] and the generation of animation characters [49]. Tan and Kowk [50] applied the

  2. Revisiting algorithms for generating surrogate time series

    CERN Document Server

    Raeth, C; Papadakis, I E; Brinkmann, W

    2011-01-01

    The method of surrogates is one of the key concepts of nonlinear data analysis. Here, we demonstrate that commonly used algorithms for generating surrogates often fail to generate truly linear time series. Rather, they create surrogate realizations with Fourier phase correlations leading to non-detections of nonlinearities. We argue that reliable surrogates can only be generated, if one tests separately for static and dynamic nonlinearities.
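
    A common starting point is the plain phase-randomised (Fourier transform) surrogate sketched below: it preserves the amplitude spectrum, and hence the linear autocorrelation, while scrambling phases. It is shown only to fix ideas; the record's point is precisely that naive constructions of this kind can retain spurious phase correlations, so refined schemes (e.g. iterative amplitude-adjusted surrogates) are used in practice.

        import numpy as np

        def phase_randomized_surrogate(x, rng=None):
            """Fourier-transform surrogate: keep the amplitude spectrum, randomise the phases."""
            rng = np.random.default_rng() if rng is None else rng
            n = len(x)
            spec = np.fft.rfft(x)
            phases = rng.uniform(0, 2 * np.pi, len(spec))
            phases[0] = 0.0                      # keep the mean real
            if n % 2 == 0:
                phases[-1] = 0.0                 # keep the Nyquist component real
            return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=n)

        rng = np.random.default_rng(6)
        x = np.sin(np.linspace(0, 40 * np.pi, 2048)) + 0.3 * rng.normal(size=2048)
        s = phase_randomized_surrogate(x, rng)
        # The periodogram is (nearly) preserved, higher-order structure is not
        print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s)), atol=1e-8))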

  3. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it will be demonstrated how various time series problems can be handled using Proc Varmax. The procedure is rather new, and hence new features like cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling, as outlined by Box & Jenkins, is performed in a more modern way using the computer resources which are now available...

  4. Multivariate Option Pricing with Time Varying Volatility and Correlations

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars Peter

    In recent years multivariate models for asset returns have received much attention; in particular this is the case for models with time-varying volatility. In this paper we consider models of this class and examine their potential when it comes to option pricing. Specifically, we derive the risk-neutral dynamics for a general class of multivariate heteroskedastic models, and we provide a feasible way to price options in this framework. Our framework can be used irrespective of the assumed underlying distribution and dynamics, and it nests several important special cases. We provide an application to options on the minimum of two indices. Our results show that not only is correlation important for these options but so is allowing this correlation to be dynamic. Moreover, we show that for the general model exposure to correlation risk carries an important premium, and when this is neglected option...

  5. Normalizing the causality between time series

    CERN Document Server

    Liang, X San

    2015-01-01

    Recently, a rigorous yet concise formula has been derived to evaluate the information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing three types of fundamental mechanisms that govern the marginal entropy change of the flow recipient. A normalized or relative flow measures its importance relative to other mechanisms. In analyzing realistic series, both absolute and relative information flows need to be taken into account, since the normalizers for a pair of reverse flows belong to two different entropy balances; it is quite normal that two identical flows may differ a lot in relative importance in their respective balances. We have reproduced these results with several autoregressive models. We have also shown applications to a climate change problem and a financial analysis problem. For the former, reconfirmed is the role of the Indian Ocean Dipole as ...

  6. Argos: An Optimized Time-Series Photometer

    Indian Academy of Sciences (India)

    Anjum S. Mukadam; R. E. Nather

    2005-06-01

    We designed a prime focus CCD photometer, Argos, optimized for high speed time-series measurements of blue variables (Nather & Mukadam 2004) for the 2.1 m telescope at McDonald Observatory. Lack of any intervening optics between the primary mirror and the CCD makes the instrument highly efficient. We measure an improvement in sensitivity by a factor of nine over the 3-channel PMT photometers used on the same telescope and for the same exposure time. The CCD frame transfer operation triggered by GPS-synchronized pulses serves as an electronic shutter for the photometer. This minimizes the dead time between exposures, but more importantly, allows a precise control of the start and duration of the exposure. We expect the uncertainty in our timing to be less than 100 s.

  7. Directed networks with underlying time structures from multivariate time series

    CERN Document Server

    Tanizawa, Toshihiro; Taya, Fumihiko

    2014-01-01

    In this paper we propose a method of constructing directed networks of time-dependent phenomena from multivariate time series. As the construction method is based on the linear model, the network fully reflects dynamical features of the system such as time structures of periodicities. Furthermore, this method can construct networks even if these time series show no similarity: situations in which common methods fail. We explicitly introduce a case where common methods do not work. This fact indicates the importance of constructing networks based on a dynamical perspective when we consider time-dependent phenomena. We apply the method to multichannel electroencephalography (EEG) data and the result reveals underlying interdependency among the components in the brain system.

  8. Fractal fluctuations in cardiac time series

    Science.gov (United States)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systematic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
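
    The aggregation idea can be sketched directly: form non-overlapping block means at increasing block sizes and track the relative dispersion. An uncorrelated series gives a log-log slope of about -1/2, while long-time memory gives a shallower decay. The function below is an illustrative simplification on synthetic data, not the authors' analysis of cardiac records.

        import numpy as np

        def relative_dispersion_exponent(x, block_sizes=(1, 2, 4, 8, 16, 32, 64)):
            """Aggregate into non-overlapping block means and track SD/mean versus block size.

            Returns an estimated exponent H; H = 0.5 corresponds to an uncorrelated series,
            H > 0.5 to long-time (fractal) memory. Requires a series with non-zero mean.
            """
            rd = []
            for m in block_sizes:
                n_blocks = len(x) // m
                blocks = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
                rd.append(blocks.std(ddof=1) / blocks.mean())
            slope, _ = np.polyfit(np.log(block_sizes), np.log(rd), 1)
            return slope + 1.0        # estimated H

        # Surrogate "heart rate" series: positive mean, no long-time memory
        rng = np.random.default_rng(7)
        hr = 60 + rng.normal(0, 3, 8192)
        print(f"H ~ {relative_dispersion_exponent(hr):.2f}")   # near 0.5 for white noise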

  9. Time Series Forecasting A Nonlinear Dynamics Approach

    CERN Document Server

    Sello, S

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecast the amplitude and phase of one of the most common solar indicator activity, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a succesful quantitative theoretical model of the Sun magnetic cycle. Starting from a previous recent work, we checked the reliability and accuracy of a forecasting model based on concepts of nonlinear dynamical systems applied to experimental time series, such as embedding phase space,Lyapunov spectrum,chaotic behaviour. The model is based on a locally hypothesis of the behaviour on the embedding space, utilizing an optimal number k of neighbour vectors to predict the future evolution of the current point with the set of characteristic parameters determined by several previous paramet...

  10. Time Series Photometry of KZ Lacertae

    Science.gov (United States)

    Joner, Michael D.

    2016-01-01

    We present BVRI time series photometry of the high amplitude delta Scuti star KZ Lacertae secured using the 0.9-meter telescope located at the Brigham Young University West Mountain Observatory. In addition to the multicolor light curves that are presented, the V data from the last six years of observations are used to plot an O-C diagram in order to determine the ephemeris and evaluate evidence for period change. We wish to thank the Brigham Young University College of Physical and Mathematical Sciences as well as the Department of Physics and Astronomy for their continued support of the research activities at the West Mountain Observatory.

  11. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  12. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...

  13. Modeling noisy time series Physiological tremor

    CERN Document Server

    Timmer, J

    1998-01-01

    Empirical time series often contain observational noise. We investigate the effect of this noise on the estimated parameters of models fitted to the data. For data of physiological tremor, i.e. a small amplitude oscillation of the outstretched hand of healthy subjects, we compare the results for a linear model that explicitly includes additional observational noise to one that ignores this noise. We discuss problems and possible solutions for nonlinear deterministic as well as nonlinear stochastic processes. In particular, we discuss the state space model, applicable for modeling noisy stochastic systems, and Bock's algorithm, capable of modeling noisy deterministic systems.

  14. Time series modeling for automatic target recognition

    Science.gov (United States)

    Sokolnikov, Andre

    2012-05-01

    Time series modeling is proposed for identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke and other factors obscuring and distorting the image. The complex of library data (of images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while the partial image features, distorted parts, irrelevant pieces and absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that helps the prediction process for image creation or reconstruction. The results are provided.

  15. Time Series Analysis of SOLSTICE Measurements

    Science.gov (United States)

    Wen, G.; Cahalan, R. F.

    2003-12-01

    Solar radiation is the major energy source for the Earth's biosphere and atmospheric and ocean circulations. Variations of solar irradiance have been a major concern of scientists both in solar physics and atmospheric sciences. A number of missions have been carried out to monitor changes in total solar irradiance (TSI) [see Fröhlich and Lean, 1998 for review] and spectral solar irradiance (SSI) [e.g., SOLSTICE on UARS and VIRGO on SOHO]. Observations over a long time period reveal the connection between variations in solar irradiance and surface magnetic fields of the Sun [Lean1997]. This connection provides a guide to scientists in modeling solar irradiances [e.g., Fontenla et al., 1999; Krivova et al., 2003]. Solar spectral observations have now been made over a relatively long time period, allowing statistical analysis. This paper focuses on predictability of solar spectral irradiance using observed SSI from SOLSTICE . Analysis of predictability is based on nonlinear dynamics using an artificial neural network in a reconstructed phase space [Abarbanel et al., 1993]. In the analysis, we first examine the average mutual information of the observed time series and a delayed time series. The time delay that gives local minimum of mutual information is chosen as the time-delay for phase space reconstruction [Fraser and Swinney, 1986]. The embedding dimension of the reconstructed phase space is determined using the false neighbors and false strands method [Kennel and Abarbanel, 2002]. Subsequently, we use a multi-layer feed-forward network with back propagation scheme [e.g., Haykin, 1994] to model the time series. The predictability of solar irradiance as a function of wavelength is considered. References Abarbanel, H. D. I., R. Brown, J. J. Sidorowich, and L. Sh. Tsimring, Rev. Mod. Phys. 65, 1331, 1993. Fraser, A. M. and H. L. Swinney, Phys. Rev. 33A, 1134, 1986. Fontenla, J., O. R. White, P. Fox, E. H. Avrett and R. L. Kurucz, The Astrophysical Journal, 518, 480

  16. Real Time Clustering of Time Series Using Triangular Potentials

    Directory of Open Access Journals (Sweden)

    Aldo Pacchiano

    2015-01-01

    Full Text Available Motivated by the problem of computing investment portfolio weightings we investigate various methods of clustering as alternatives to traditional mean-variance approaches. Such methods can have significant benefits from a practical point of view since they remove the need to invert a sample covariance matrix, which can suffer from estimation error and will almost certainly be non-stationary. The general idea is to find groups of assets which share similar return characteristics over time and treat each group as a single composite asset. We then apply inverse volatility weightings to these new composite assets. In the course of our investigation we devise a method of clustering based on triangular potentials and we present associated theoretical results as well as various examples based on synthetic data.
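
    The general idea (cluster the assets first, then weight the composites by inverse volatility) can be sketched with ordinary hierarchical clustering standing in for the triangular-potential method; every name, parameter and data value below is illustrative.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def cluster_inverse_vol_weights(returns, n_clusters=3):
            """Group assets by return similarity, then weight composites by inverse volatility."""
            corr = np.corrcoef(returns, rowvar=False)
            dist = np.sqrt(np.clip(0.5 * (1 - corr), 0, None))   # correlation distance
            iu = np.triu_indices_from(dist, k=1)                  # condensed form for scipy
            labels = fcluster(linkage(dist[iu], method="average"),
                              t=n_clusters, criterion="maxclust")
            weights = np.zeros(returns.shape[1])
            cluster_w = 1.0 / n_clusters                          # equal budget per composite asset
            for c in np.unique(labels):
                members = np.where(labels == c)[0]
                inv_vol = 1.0 / returns[:, members].std(axis=0, ddof=1)
                weights[members] = cluster_w * inv_vol / inv_vol.sum()
            return weights

        # Synthetic daily returns for 12 assets driven by three latent "sectors"
        rng = np.random.default_rng(8)
        common = rng.normal(0, 0.01, (500, 3))
        rets = np.repeat(common, 4, axis=1) + rng.normal(0, 0.005, (500, 12))
        w = cluster_inverse_vol_weights(rets, n_clusters=3)
        print(w.round(3), w.sum())                                # weights sum to 1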

  17. Emerging Equity Market Volatility

    OpenAIRE

    Geert Bekaert; Harvey, Campbell R.

    1995-01-01

    Returns in emerging capital markets are very different from returns in developed markets. While most previous research has focused on average returns, we analyze the volatility of the returns in emerging equity markets. We characterize the time-series of volatility in emerging markets and explore the distributional foundations of the variance process. Of particular interest is evidence of asymmetries in volatility and the evolution of the variance process after periods of capital market refor...

  18. An introduction to state space time series analysis.

    NARCIS (Netherlands)

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor wi

  19. Nonlinear Time Series Analysis Since 1990:Some Personal Reflections

    Institute of Scientific and Technical Information of China (English)

    Howel Tong

    2002-01-01

    I reflect upon the development of nonlinear time series analysis since 1990 by focusing on five major areas of development. These areas include the interface between nonlinear time series analysis and chaos, the nonparametric/semiparametric approach, nonlinear state space modelling, financial time series and nonlinear modelling of panels of time series.

  20. Understanding Financial Market Volatility

    NARCIS (Netherlands)

    A. Opschoor (Anne)

    2014-01-01

    markdownabstract__Abstract__ Volatility has been one of the most active and successful areas of research in time series econometrics and economic forecasting in recent decades. Loosely speaking, volatility is defined as the average magnitude of fluctuations observed in some phenomenon over

  1. Forecasting the Time Series of Sunspot Numbers

    Science.gov (United States)

    Aguirre, L. A.; Letellier, C.; Maquet, J.

    2008-05-01

    Forecasting the solar cycle is of great importance for weather prediction and environmental monitoring, and also constitutes a difficult scientific benchmark in nonlinear dynamical modeling. This paper describes the identification of a model and its use in forecasting the time series comprised of Wolf’s sunspot numbers. A key feature of this procedure is that the original time series is first transformed into a symmetrical space where the dynamics of the solar dynamo are unfolded in a better way, thus improving the model. The nonlinear model obtained is parsimonious and has both deterministic and stochastic parts. Monte Carlo simulation of the whole model produces very consistent results with the deterministic part of the model but allows for the determination of confidence bands. The obtained model was used to predict cycles 24 and 25, although the forecast of the latter is seen as a crude approximation, given the long prediction horizon required. As for the 24th cycle, two estimates were obtained with peaks of 65±16 and 87±13 units of sunspot numbers. The simulated results suggest that the 24th cycle will be shorter and less active than the preceding one.

  2. Partial spectral analysis of hydrological time series

    Science.gov (United States)

    Jukić, D.; Denić-Jukić, V.

    2011-03-01

    Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. This implies that an ambiguity with respect to the processes encoded in spectral and cross-spectral density functions exists. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions represent the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. The mathematical background for partial spectral density, partial amplitude and partial phase functions is presented. The proposed functions yield the estimates of spectral density, amplitude and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished considering the input event as a referent point. The method is used for analyses of the relations between the rainfall, air temperature and relative humidity, as well as the influences of air temperature and relative humidity on the discharge from a karst spring. Time series are collected in the catchment of the Jadro Spring located in the Dinaric karst area of Croatia.

  3. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performances in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs that are constructed out of the sampling of the solution of a time-delay differential equation and show their good performance in the forecasting of the conditional covariances associated to multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type as well as in the prediction of factual daily market realized volatilities computed with intraday quotes, using as training input daily log-return series of moderate size. We tackle some problems associated to the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as negligence may lead to inaccurate model predictions. This paper derives generally applicable recommendations, no matter whether a change in persistence occurs or not. Seven different forecasting strategies based on a bias-corrected estimator are compared by means of a large-scale Monte Carlo study. The results for decreasing and increasing persistence are highly asymmetric and new to the literature. Its good predictive ability and its balanced performance among different settings strongly advocate the use of forecasting strategies based on the Bai-Perron procedure.

  5. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    We present the architecture of a “useful pattern” mining system that is capable of detecting thousands of different candlestick sequence patterns at the tick or any higher granularity levels. The system architecture is highly distributed and performs most of its highly compute-intensive aggregation calculations as complex but efficient distributed SQL queries on the relational databases that store the time-series. We present initial results from mining all frequent candlestick sequences with the characteristic property that when they occur then, with an average at least 60% probability, they signal a 2% or higher increase (or, alternatively, decrease) in a chosen property of the stock (e.g. close-value) within a given time-window (e.g. 5 days). Initial results from a first prototype implementation of the architecture show that after training on a large set of stocks, the system is capable of finding

  6. Learning with Latent Factors in Time Series

    CERN Document Server

    Jalali, Ali

    2011-01-01

    This paper considers the problem of learning, from samples, the dependency structure of a system of linear stochastic differential equations, when some of the variables are latent. In particular, we observe the time evolution of some variables, and never observe other variables; from this, we would like to find the dependency structure between the observed variables -- separating out the spurious interactions caused by the (marginalizing out of the) latent variables' time series. We develop a new method, based on convex optimization, to do so in the case when the number of latent variables is smaller than the number of observed ones. For the case when the dependency structure between the observed variables is sparse, we theoretically establish a high-dimensional scaling result for structure recovery. We verify our theoretical result with both synthetic and real data (from the stock market).

  7. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
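
    A hedged sketch of the Holt-Winters step on synthetic daily counts, using statsmodels' ExponentialSmoothing rather than the authors' implementation; the seasonal period of 7 mimics a day-of-week effect, all data values are made up, and the residuals are what would feed a control-chart detector.

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        # Synthetic syndromic counts: weekly pattern + slow trend + noise
        rng = np.random.default_rng(9)
        days = np.arange(730)
        counts = (50 + 0.02 * days
                  + 10 * np.sin(2 * np.pi * days / 7)
                  + rng.normal(0, 4, len(days)))

        train, test = counts[:700], counts[700:]
        fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                                   seasonal_periods=7).fit()
        forecast = fit.forecast(len(test))

        residuals = train - fit.fittedvalues          # residuals fed to the control chart
        mape = np.median(np.abs((forecast - test) / test)) * 100
        print(f"median absolute percent error: {mape:.1f}%")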

  8. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Trend prediction of chaotic time series is an interesting problem in time series analysis and time series data mining (TSDM) fields [1]. TSDM-based methods can successfully characterize and predict complex, irregular, and chaotic time series. Some methods have been proposed to predict the trend of chaotic time series. To our knowledge, these methods can be classified into two categories as follows. The first category is based on the embedded space [2-3], where raw time series data is mapped to a reconstructed phase spac...

  9. A New SBUV Ozone Profile Time Series

    Science.gov (United States)

    McPeters, Richard

    2011-01-01

    Under NASA's MEaSUREs program for creating long term multi-instrument data sets, our group at Goddard has re-processed ozone profile data from a series of SBUV instruments. We have processed data from the Nimbus 7 SBUV instrument (1979-1990) and data from SBUV/2 instruments on NOAA-9 (1985-1998), NOAA-11 (1989-1995), NOAA-16 (2001-2010), NOAA-17 (2002-2010), and NOAA-18 (2005-2010). This reprocessing uses the version 8 ozone profile algorithm but now uses the Brion, Daumont, and Malicet (BMD) ozone cross sections instead of the Bass and Paur cross sections. The new cross sections have much better resolution, an extended wavelength range, and a more consistent temperature dependence. The re-processing also uses an improved cloud height climatology based on the Raman cloud retrievals of OMI. Finally, the instrument-to-instrument calibration is set using matched scenes so that ozone diurnal variation in the upper stratosphere does not alias into the ozone trends. Where there is no instrument overlap, SAGE and MLS are used to estimate calibration offsets. Preliminary analysis shows a more coherent time series as a function of altitude. The net effect on profile total column ozone is on average an absolute reduction of about one percent. Comparisons with ground-based systems are significantly better at high latitudes.

  10. Correcting and combining time series forecasters.

    Science.gov (United States)

    Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E

    2014-02-01

    Combined forecasters have been in the vanguard of stochastic time series modeling. In this way it has been usual to suppose that each single model generates a residual or prediction error like a white noise. However, mostly because of disturbances not captured by each model, it is yet possible that such supposition is violated. The present paper introduces a two-step method for correcting and combining forecasting models. Firstly, the stochastic process underlying the bias of each predictive model is built according to a recursive ARIMA algorithm in order to achieve a white noise behavior. At each iteration of the algorithm the best ARIMA adjustment is determined according to a given information criterion (e.g. Akaike). Then, in the light of the corrected predictions, it is considered a maximum likelihood combined estimator. Applications involving single ARIMA and artificial neural networks models for Dow Jones Industrial Average Index, S&P500 Index, Google Stock Value, and Nasdaq Index series illustrate the usefulness of the proposed framework.
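
    A minimal, hedged sketch of the two-step idea described above, under the assumption that predictions from each single model are already available as arrays: the bias of each forecaster is modeled by an ARIMA process chosen by AIC, and the corrected forecasts are combined with inverse-variance weights as a simple stand-in for the maximum likelihood combination; names and the order grid are illustrative, not the paper's code.

      # Hedged sketch: whiten each forecaster's error with an AIC-selected ARIMA
      # model, then combine the corrected forecasts.
      import itertools
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      def correct_forecast(y, pred, max_order=2):
          err = np.asarray(y, float) - np.asarray(pred, float)
          best_aic, best_fit = np.inf, None
          for p, d, q in itertools.product(range(max_order + 1), repeat=3):
              try:
                  fit = ARIMA(err, order=(p, d, q)).fit()
              except Exception:
                  continue
              if fit.aic < best_aic:
                  best_aic, best_fit = fit.aic, fit
          # Corrected forecast = original prediction + modeled bias.
          return np.asarray(pred, float) + best_fit.fittedvalues

      def combine(corrected_list, y):
          # Inverse-variance weights as a simple combination rule.
          var = np.array([np.var(np.asarray(y) - c) for c in corrected_list])
          w = (1.0 / var) / np.sum(1.0 / var)
          return w @ np.vstack(corrected_list)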

  11. Periodograms for multiband astronomical time series

    Science.gov (United States)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time- domain data developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.

  12. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  13. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian;

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength...... and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...... injections in intermediate-to-strongly coupled systems could enable more accurate causal inferences. Given the inherent noisy nature of real-world systems, our findings enable a more accurate evaluation of CCM applicability and advance suggestions on how to overcome its weaknesses....

  14. Highly comparative, feature-based time-series classification

    CERN Document Server

    Fulcher, Ben D

    2014-01-01

    A highly comparative, feature-based approach to time series classification is introduced that uses an extensive database of algorithms to extract thousands of interpretable features from time series. These features are derived from across the scientific time-series analysis literature, and include summaries of time series in terms of their correlation structure, distribution, entropy, stationarity, scaling properties, and fits to a range of time-series models. After computing thousands of features for each time series in a training set, those that are most informative of the class structure are selected using greedy forward feature selection with a linear classifier. The resulting feature-based classifiers automatically learn the differences between classes using a reduced number of time-series properties, and circumvent the need to calculate distances between time series. Representing time series in this way results in orders of magnitude of dimensionality reduction, allowing the method to perform well on ve...
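
    A toy, hedged sketch of the pipeline this record describes: compute a handful of interpretable features per series and run greedy forward selection with a linear classifier. The four features used here are only a stand-in for the paper's library of thousands; all names are illustrative.

      # Hedged sketch of feature-based time-series classification.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      def features(x):
          x = np.asarray(x, dtype=float)
          ac1 = np.corrcoef(x[:-1], x[1:])[0, 1]                 # lag-1 autocorrelation
          return np.array([x.mean(), x.std(), ac1, np.abs(np.diff(x)).mean()])

      def greedy_select(X, y, n_keep=2):
          chosen, remaining = [], list(range(X.shape[1]))
          while remaining and len(chosen) < n_keep:
              scores = [(cross_val_score(LogisticRegression(max_iter=1000),
                                         X[:, chosen + [j]], y, cv=5).mean(), j)
                        for j in remaining]
              _, best_j = max(scores)
              chosen.append(best_j)
              remaining.remove(best_j)
          return chosen

      # Toy two-class problem: noisy sinusoids versus white noise.
      rng = np.random.default_rng(0)
      series = [np.sin(np.linspace(0, 20, 200)) + 0.5 * rng.standard_normal(200) for _ in range(50)]
      series += [rng.standard_normal(200) for _ in range(50)]
      X = np.vstack([features(s) for s in series])
      y = np.array([0] * 50 + [1] * 50)
      print("selected feature indices:", greedy_select(X, y))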

  15. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    VanderPlas, Jacob T. [eScience Institute, University of Washington, Seattle, WA (United States); Ivezic, Željko [Department of Astronomy, University of Washington, Seattle, WA (United States)

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
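
    The paper states that a Python implementation is available; the gatspy package by the same author exposes this model as LombScargleMultiband. The following hedged usage sketch generates a small synthetic multiband light curve; the array names, band labels, and the period search range are illustrative assumptions.

      # Hedged usage sketch of the multiband periodogram via gatspy.
      import numpy as np
      from gatspy.periodic import LombScargleMultiband

      rng = np.random.default_rng(1)
      t = np.sort(rng.uniform(0, 100, 300))                      # irregular observation times
      filts = rng.choice(["g", "r", "i"], size=300)              # band of each observation
      true_period = 0.6
      mag = 15 + 0.3 * np.sin(2 * np.pi * t / true_period) + 0.05 * rng.standard_normal(300)
      dmag = np.full(300, 0.05)

      # Shared base model across bands, per-band offsets only (Nterms_band=0).
      model = LombScargleMultiband(Nterms_base=1, Nterms_band=0)
      model.optimizer.period_range = (0.1, 1.0)
      model.fit(t, mag, dmag, filts)

      periods = np.linspace(0.1, 1.0, 10000)
      power = model.periodogram(periods)
      print("best period:", periods[np.argmax(power)])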

  16. Timing calibration and spectral cleaning of LOFAR time series data

    Science.gov (United States)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.

  17. Timing calibration and spectral cleaning of LOFAR time series data

    CERN Document Server

    Corstanje, A; Enriquez, J E; Falcke, H; Hörandel, J R; Krause, M; Nelles, A; Rachen, J P; Schellart, P; Scholten, O; ter Veen, S; Thoudam, S; Trinh, T N G

    2016-01-01

    We describe a method for spectral cleaning and timing calibration of short voltage time series data from individual radio interferometer receivers. It makes use of the phase differences in Fast Fourier Transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw tim...

  18. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of (ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
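
    As a hedged sketch of the modeling strategy summarized above (not the authors' code), the snippet below forms a trimmed-mean day-of-week expectation for daily visit counts and fits an ARIMA model to the residuals to capture recent trends; the synthetic series, the trimming fraction, and the ARIMA order are illustrative assumptions.

      # Hedged sketch: trimmed-mean seasonal expectation plus ARIMA residual model.
      import numpy as np
      import pandas as pd
      from scipy.stats import trim_mean
      from statsmodels.tsa.arima.model import ARIMA

      def expected_visits(visits, trim=0.1):
          # Trimmed mean per day-of-week as the seasonal expectation.
          dow_mean = visits.groupby(visits.index.dayofweek).agg(lambda s: trim_mean(s, trim))
          return pd.Series([dow_mean[d] for d in visits.index.dayofweek], index=visits.index)

      def expected_with_trend(visits, seasonal):
          resid = visits - seasonal
          fit = ARIMA(resid, order=(1, 0, 1)).fit()     # small illustrative order
          return seasonal + fit.fittedvalues            # expectation including recent trend

      # Synthetic daily counts standing in for ED visit data.
      rng = np.random.default_rng(0)
      idx = pd.date_range("2020-01-01", periods=730, freq="D")
      visits = pd.Series(rng.poisson(120 + 15 * (idx.dayofweek < 5)), index=idx)
      seasonal = expected_visits(visits)
      expected = expected_with_trend(visits, seasonal)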

  19. Multifractals in Western Major STOCK Markets Historical Volatilities in Times of Financial Crisis

    Science.gov (United States)

    Lahmiri, Salim

    In this paper, the generalized Hurst exponent is used to investigate multifractal properties of historical volatility (HV) in stock market price and return series before, during and after the 2008 financial crisis. Empirical results from NASDAQ, S&P500, TSE, CAC40, DAX, and FTSE stock market data show that there is strong evidence of multifractal patterns in HV of both price and return series. In addition, the financial crisis deeply affected the behavior and degree of multifractality in volatility of Western financial markets at price and return levels.
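
    For readers who want to reproduce the flavour of this analysis, the following hedged sketch estimates the generalized Hurst exponent H(q) with a simple structure-function method (one common choice; the paper's exact estimator may differ). A q-dependent H(q) signals multifractality. The data, lags, and q values are illustrative.

      # Hedged sketch: structure-function estimate of the generalized Hurst exponent H(q).
      import numpy as np

      def generalized_hurst(x, qs=(1, 2, 3, 4), max_tau=20):
          x = np.asarray(x, dtype=float)
          taus = np.arange(1, max_tau + 1)
          hq = {}
          for q in qs:
              # q-th order structure function S_q(tau) = <|x(t+tau) - x(t)|^q> ~ tau^(q*H(q))
              sq = np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus])
              slope = np.polyfit(np.log(taus), np.log(sq), 1)[0]
              hq[q] = slope / q
          return hq

      # Random-walk toy series (expected H(q) close to 0.5 for all q).
      rng = np.random.default_rng(0)
      walk = np.cumsum(rng.standard_normal(5000))
      print(generalized_hurst(walk))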

  20. Time series models of symptoms in schizophrenia.

    Science.gov (United States)

    Tschacher, Wolfgang; Kupper, Zeno

    2002-12-15

    The symptom courses of 84 schizophrenia patients (mean age: 24.4 years; mean previous admissions: 1.3; 64% males) of a community-based acute ward were examined to identify dynamic patterns of symptoms and to investigate the relation between these patterns and treatment outcome. The symptoms were monitored by systematic daily staff ratings using a scale composed of three factors: psychoticity, excitement, and withdrawal. Patients showed moderate to high symptomatic improvement documented by effect size measures. Each of the 84 symptom trajectories was analyzed by time series methods using vector autoregression (VAR) that models the day-to-day interrelations between symptom factors. Multiple and stepwise regression analyses were then performed on the basis of the VAR models. Two VAR parameters were found to be associated significantly with favorable outcome in this exploratory study: 'withdrawal preceding a reduction of psychoticity' as well as 'excitement preceding an increase of withdrawal'. The findings were interpreted as generating hypotheses about how patients cope with psychotic episodes.
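
    A hedged sketch of the kind of VAR analysis described above, using statsmodels on synthetic daily ratings of the three symptom factors; the data, the VAR order of one, and the causality check are illustrative assumptions rather than the authors' procedure.

      # Hedged sketch: VAR(1) on three daily symptom-factor ratings, inspecting
      # the lagged coefficients that encode day-to-day interrelations.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(0)
      n = 60                                                     # days of daily ratings
      psychoticity = np.cumsum(rng.normal(0, 0.3, n)) + 5
      withdrawal = -0.2 * np.roll(psychoticity, 1) + rng.normal(0, 0.3, n) + 4
      excitement = rng.normal(3, 0.5, n)
      df = pd.DataFrame({"psychoticity": psychoticity,
                         "excitement": excitement,
                         "withdrawal": withdrawal})

      results = VAR(df).fit(1)                                   # one-day lags
      print(results.params)                                      # lagged coefficients per equation
      print(results.test_causality("psychoticity", ["withdrawal"]).summary())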

  1. Testing whether a time series is Gaussian

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    1991-01-01

    The author first tests whether a stationary linear process with mean 0 is Gaussian. For invertible processes, he considers the empirical process based on the residuals as the basis of a test procedure. By applying the results of Boldin (1983) and Kreiss (1988), he shows that the process behaves asymptotically like the one based on the true errors. For non-invertible processes, on the other hand, Lee uses the empirical process based on the data themselves rather than the one based on residuals. Here, the time series is assumed to be a strongly mixing process with a suitable mixing order. Then, the asymptotic behavior of the empirical process in each case is studied under a sequence of contiguous alternatives, and quadratic functionals of the empirical process are employed for AR(∞) processes in order to compare efficiencies between these two procedures. The rest of the thesis is devoted to extending Boldin's results to nonstationary processes such as unstable AR(p) processes and explosive AR(1) processes, analyzed by means of a general stochastic regression model.

  2. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  3. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  4. Spectral Estimation of Non-Gaussian Time Series

    OpenAIRE

    Fabián, Z. (Zdeněk)

    2010-01-01

    Based on the concept of the scalar score of a probability distribution, we introduce a concept of a scalar score of time series and propose to characterize a non-Gaussian time series by spectral density of its scalar score.

  5. An introduction to state space time series analysis.

    OpenAIRE

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor with state space methods. The only background required in order to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is...

  6. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
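
    A hedged sketch of the algorithm's core loop, assuming two aligned series x and y: each sliding window is labelled by the sign and significance of its regression slope, and transitions between consecutive labels become weighted directed edges in a networkx graph; the coarse binning of patterns used here is purely illustrative.

      # Hedged sketch: sliding-window regression patterns as nodes, transitions as
      # weighted directed edges. Pattern binning and names are illustrative.
      import numpy as np
      import networkx as nx
      from scipy import stats

      def pattern(xw, yw, alpha=0.05):
          slope, _, _, pvalue, _ = stats.linregress(xw, yw)
          return f"{'pos' if slope >= 0 else 'neg'}-{'sig' if pvalue < alpha else 'ns'}"

      def transmission_network(x, y, window=30):
          labels = [pattern(x[i:i + window], y[i:i + window])
                    for i in range(len(x) - window + 1)]
          g = nx.DiGraph()
          for a, b in zip(labels[:-1], labels[1:]):
              w = g[a][b]["weight"] + 1 if g.has_edge(a, b) else 1
              g.add_edge(a, b, weight=w)          # edge weight = transition frequency
          return g

      rng = np.random.default_rng(0)
      x = np.cumsum(rng.standard_normal(500))
      y = 0.5 * x + np.cumsum(rng.standard_normal(500))
      print(nx.to_pandas_adjacency(transmission_network(x, y), weight="weight"))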

  7. Seasonal Time Series Analysis Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Pattern discovery from the seasonal time-series is of importance. Traditionally, most of the algorithms of pattern discovery in time series are similar. A novel model of time series is proposed which integrates the Genetic Algorithm (GA) for the actual problem. The experiments on the electric power yield sequence models show that this algorithm is practicable and effective.

  8. Influence of lambda-carrageenan on the release of systematic series of volatile flavor compounds from viscous food model systems

    DEFF Research Database (Denmark)

    Bylaite, Egle; Ilgunaite, Z.; Meyer, Anne Boye Strunge

    2004-01-01

    The effect of lambda-carrageenan addition level (0.1, 0.25, 0.4, and 0.5% w/w) and viscosity on the release of systematic series of aroma compounds (aldehydes, esters, ketones, and alcohols) was studied in thickened viscous solutions containing lambda-carrageenan and 10 wt % of sucrose. Air...... range was assessed by dynamic headspace gas chromatography. K(37 °C) increased as the carbon chain increased within each homologous series. Esters exhibited the highest volatility, followed by aldehydes, ketones, and alcohols. Under equilibrium, no overall effect of lambda-carrageenan was found...

  9. Generalized Framework for Similarity Measure of Time Series

    Directory of Open Access Journals (Sweden)

    Hongsheng Yin

    2014-01-01

    Full Text Available Currently, there is no definitive and uniform description for the similarity of time series, which results in difficulties for relevant research on this topic. In this paper, we propose a generalized framework to measure the similarity of time series. In this generalized framework, whether the time series is univariable or multivariable, and linear transformed or nonlinear transformed, the similarity of time series is uniformly defined using norms of vectors or matrices. The definitions of the similarity of time series in the original space and the transformed space are proved to be equivalent. Furthermore, we also extend the theory on similarity of univariable time series to multivariable time series. We present some experimental results on published time series datasets tested with the proposed similarity measure function of time series. Through the proofs and experiments, it can be claimed that the similarity measure functions of linear multivariable time series based on the norm distance of covariance matrix and nonlinear multivariable time series based on kernel function are reasonable and practical.

  10. Discrete and continuous methods for modeling the probability density of the stochastic volatility of financial series returns

    Directory of Open Access Journals (Sweden)

    Carlos Alexánder Grajales Correa

    2007-07-01

    Full Text Available This work considers the daily returns of a financial asset in order to model and compare the probability density of the stochastic volatility of the returns. To this end, ARCH models and their extensions, which are formulated in discrete time, are proposed, as well as an empirical stochastic volatility model developed by Paul Wilmott. For the discrete case, models are presented that allow the heteroscedastic conditional volatility to be estimated at a time instant t, t ∈ [1,T]. In the continuous case, an Itô diffusion process is associated with the stochastic volatility of the financial series, which makes it possible to discretize this process and simulate it in order to obtain empirical probability densities of the volatility. Finally, the results obtained with these methodologies are illustrated and compared for the S&P 500 series of the USA, the Index of Prices and Quotations of the Mexican Stock Exchange (IPC), and the IGBC of Colombia.
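
    As a hedged illustration of the discrete-time side of this comparison (not the authors' code), the sketch below fits a GARCH(1,1) model to a return series with the arch package and inspects the empirical distribution of the estimated conditional volatility; the synthetic returns and the model order are illustrative assumptions.

      # Hedged sketch: GARCH(1,1) conditional volatility and its empirical density.
      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(0)
      returns = rng.standard_normal(2000)              # stand-in for daily percentage returns

      res = arch_model(returns, vol="GARCH", p=1, q=1, mean="Constant").fit(disp="off")
      cond_vol = res.conditional_volatility            # estimated sigma_t path
      density, edges = np.histogram(cond_vol, bins=50, density=True)   # empirical volatility density
      print(res.params)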

  11. ANCOVA Procedures in Time-Series Experiments: An Illustrative Example.

    Science.gov (United States)

    Willson, Victor L.

    A statistical model for analysis of multiple time-series observation is briefly outlined. The model incorporates a change parameter corresponding to intervention or interruption of the dependent series. The additional time-series are included in the model as covariates. The practical application of the procedure is illustrated with traffic…

  12. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  13. Time and ensemble averaging in time series analysis

    CERN Document Server

    Latka, Miroslaw; Jernajczyk, Wojciech; West, Bruce J

    2010-01-01

    In many applications expectation values are calculated by partitioning a single experimental time series into an ensemble of data segments of equal length. Such single trajectory ensemble (STE) is a counterpart to a multiple trajectory ensemble (MTE) used whenever independent measurements or realizations of a stochastic process are available. The equivalence of STE and MTE for stationary systems was postulated by Wang and Uhlenbeck in their classic paper on Brownian motion (Rev. Mod. Phys. 17, 323 (1945)) but surprisingly has not yet been proved. Using the stationary and ergodic paradigm of statistical physics -- the Ornstein-Uhlenbeck (OU) Langevin equation, we revisit Wang and Uhlenbeck's postulate. In particular, we find that the variance of the solution of this equation is different for these two ensembles. While the variance calculated using the MTE quantifies the spreading of independent trajectories originating from the same initial point, the variance for STE measures the spreading of two correlated r...

  14. EGARCH: an asymmetric model for estimating the volatility of financial series

    Directory of Open Access Journals (Sweden)

    Horacio Fernández Castaño

    2010-01-01

    Full Text Available In modeling volatility with sudden changes, it is essential to use models that describe and analyze the dynamics of volatility, since investors may be interested, among other things, in estimating the rate of return and the volatility of a financial instrument or other derivatives only during the holding period. This article, the first of two parts, evaluates the asymmetric EGARCH model, which proves very useful for studying the dynamics of the General Index of the Colombian Stock Exchange (IGBC) and its volatility. It begins with a brief review of the GARCH model, highlighting its importance in modeling financial time series and identifying its weaknesses regarding its symmetry property for heavy-tailed distributions, which can generate forecast errors. It then shows the importance of the EGARCH model for modeling stylized facts of the Colombian market that GARCH models fail to capture, which is why asymmetric models are emphasized here as the preferred way to estimate the volatility of daily returns of the IGBC.
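
    A hedged sketch of the kind of asymmetric specification discussed above: the arch package provides an EGARCH volatility model with a leverage (o) term that can be estimated from daily index returns. The synthetic return series and distribution choice are illustrative assumptions, not an analysis of the IGBC.

      # Hedged sketch: asymmetric EGARCH(1,1) with a leverage term via the arch package.
      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(1)
      returns = rng.standard_normal(2500)              # stand-in for daily index returns (%)

      egarch = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist="t")
      res = egarch.fit(disp="off")
      print(res.summary())
      # The sign and size of the asymmetry coefficient indicate whether negative
      # shocks move volatility more than positive shocks of the same magnitude.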

  15. Estimation in continuous-time stochastic volatility models using nonlinear filters

    DEFF Research Database (Denmark)

    Nielsen, Jan Nygaard; Vestergaard, M.; Madsen, Henrik

    2000-01-01

    Presents a correction to the authorship of the article 'Estimation in Continuous-Time Stochastic Volatility Models Using Nonlinear Filters,' published in the periodical 'International Journal of Theoretical and Applied Finance,' Vol. 3, No. 2, pp. 279-308.

  16. Scale-dependent intrinsic entropies of complex time series.

    Science.gov (United States)

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
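
    A hedged sketch of the coarse-graining multiscale entropy computation that this record builds on; the EMD-based detrending step described above is omitted for brevity, and the sample-entropy parameters (m = 2, r = 0.2 of the standard deviation) are conventional but illustrative choices.

      # Hedged sketch: multiscale entropy via coarse-graining and sample entropy.
      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          x = np.asarray(x, dtype=float)
          r = r_factor * x.std()
          def matches(length):
              tpl = np.array([x[i:i + length] for i in range(len(x) - length)])
              d = np.max(np.abs(tpl[:, None, :] - tpl[None, :, :]), axis=2)   # Chebyshev distance
              return (np.sum(d <= r) - len(tpl)) / 2                          # exclude self-matches
          b, a = matches(m), matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      def multiscale_entropy(x, max_scale=5):
          x = np.asarray(x, dtype=float)
          out = []
          for s in range(1, max_scale + 1):
              n = (len(x) // s) * s
              coarse = x[:n].reshape(-1, s).mean(axis=1)       # non-overlapping averages
              out.append(sample_entropy(coarse))
          return out

      rng = np.random.default_rng(0)
      print(multiscale_entropy(rng.standard_normal(1000)))     # white noise: entropy falls with scale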

  17. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.

  18. Fluctuation complexity of agent-based financial time series model by stochastic Potts system

    Science.gov (United States)

    Hong, Weijia; Wang, Jun

    2015-03-01

    Financial market is a complex evolved dynamic system with high volatilities and noises, and the modeling and analyzing of financial time series are regarded as the rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on the computer simulation in conjunction with the statistical analysis and the nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to get a robust conclusion, we consider the daily returns of Shanghai Composite Index and Shenzhen Component Index, and the comparison analysis of return behaviors between the simulation data and the actual data is exhibited.

  19. Sparse Representation for Time-Series Classification

    Science.gov (United States)

    2015-02-08

  20. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series, and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that it is not limited to a fixed set of methods. At its core, this is a model for transformation of the values of a time series, which prepares data used by different sets of methods based on the same model of transformation in a domain of problem space. The REFII model gives a new approach to time series analysis based on a unique model of transformation, which is a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  1. Bernstein polynomials for evolutionary algebraic prediction of short time series

    Science.gov (United States)

    Lukoseviciute, Kristina; Howard, Daniel; Ragulskis, Minvydas

    2017-07-01

    A short time series prediction technique based on Bernstein polynomials is presented in this paper. Firstly, the straightforward Bernstein polynomial extrapolation scheme is improved by extending the interval of approximation. Secondly, the forecasting scheme is designed in an evolutionary computational setup which is based on the conciliation between the coarseness of the algebraic prediction and the smoothness of the time average prediction. Computational experiments with the test time series suggest that this time series prediction technique could be applicable for various forecasting applications.
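
    A hedged sketch of the algebraic part only: a Bernstein polynomial is fitted to a short series mapped onto [0, 1] by least squares and evaluated one step beyond the observed interval. The evolutionary conciliation with the time-average prediction described above is not reproduced; the degree and data are illustrative.

      # Hedged sketch: least-squares Bernstein polynomial fit and naive extrapolation.
      import numpy as np
      from scipy.special import comb

      def bernstein_basis(t, degree):
          t = np.asarray(t, dtype=float)[:, None]
          i = np.arange(degree + 1)[None, :]
          return comb(degree, i) * t ** i * (1 - t) ** (degree - i)

      def bernstein_predict(y, degree=4, horizon=1):
          y = np.asarray(y, dtype=float)
          t_obs = np.linspace(0, 1, len(y))
          coeffs, *_ = np.linalg.lstsq(bernstein_basis(t_obs, degree), y, rcond=None)
          t_new = 1 + horizon / (len(y) - 1)            # one step past the observed interval
          return (bernstein_basis(np.array([t_new]), degree) @ coeffs)[0]

      y = [10.0, 10.4, 10.9, 11.2, 11.8, 12.1, 12.7, 13.0]
      print(bernstein_predict(y))                        # one-step-ahead extrapolation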

  2. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses a stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  3. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous time risk model with a linear model used to model the claim process. The time is discretized stochastically using the times when claims occur, using Doob's stopping time theorem and martingale inequalities to obtain expressions for the ruin probability as well as both exponential and non-exponential upper bounds for the ruin probability for an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.

  4. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    Science.gov (United States)

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
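
    A hedged sketch of one of the nonstationary tools mentioned above: a continuous wavelet transform (PyWavelets) tracking how the dominant period of a rhythm drifts over time. The sampling rate, scale range, and synthetic chirp-like signal are illustrative assumptions.

      # Hedged sketch: continuous wavelet transform to track a drifting period.
      import numpy as np
      import pywt

      dt = 1 / 24.0                                     # hourly sampling, expressed in days
      t = np.arange(0, 20, dt)                          # 20 days of data
      period = 1.0 + 0.01 * t                           # period drifts upward from 24 h
      signal = np.sin(2 * np.pi * np.cumsum(dt / period))

      scales = np.arange(8, 128)
      coefs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=dt)
      dominant_period = 1 / freqs[np.argmax(np.abs(coefs), axis=0)]   # per time point, in days
      print(dominant_period[::100])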

  5. Clustering Time Series Data Stream - A Literature Survey

    CERN Document Server

    Kavitha, V

    2010-01-01

    Mining time series data has seen a tremendous growth of interest in recent years. To provide an indication, various implementations are studied and summarized to identify the different problems in existing applications. Clustering time series is a problem that has applications in an extensive assortment of fields and has recently attracted a large amount of research. Time series data are frequently large and may contain outliers. In addition, time series are a special type of data set where elements have a temporal ordering. Therefore, clustering of such data streams is an important issue in the data mining process. Numerous techniques and clustering algorithms have been proposed to assist clustering of time series data streams. The clustering algorithms and their effectiveness on various applications are compared to develop a new method to solve the existing problems. This paper presents a survey on various clustering algorithms available for time series datasets. Moreover, the distinctiveness and restriction ...

  6. On correlations and fractal characteristics of time series

    CERN Document Server

    Vitanov, Nikolay K.; Sakai, Kenshi; Yankulova, Elka D.

    2005-01-01

    Correlation analysis is a convenient and frequently used tool for investigation of time series from complex systems. Recently new methods such as the multifractal detrended fluctuation analysis (MFDFA) and the wavelet transform modulus maxima method (WTMM) have been developed. By means of these methods (i) we can investigate long-range correlations in time series and (ii) we can calculate fractal spectra of these time series. But in contrast to the classical tool for correlation analysis - the autocorrelation function - the newly developed tools are not applicable to all kinds of time series. The inappropriate application of MFDFA or WTMM leads to wrong results and conclusions. In this article we discuss the opportunities and risks connected to the application of the MFDFA method to time series from a random number generator and to experimentally measured time series (i) for accelerations of an agricultural tractor and (ii) for the heartbeat activity of Drosophila melanogaster. Our main goal is to emphasize ...

  7. A novel weight determination method for time series data aggregation

    Science.gov (United States)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to address the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
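
    A hedged sketch of the visibility-graph side of this weighting scheme: a natural visibility graph is built from the series and normalized node degrees serve as aggregation weights. The linear combination with IOWA (time-decay) weights described above would simply mix these with a second weight vector; all names here are illustrative.

      # Hedged sketch: visibility-graph-based aggregation weights.
      import numpy as np

      def visibility_degrees(x):
          x = np.asarray(x, dtype=float)
          n = len(x)
          deg = np.zeros(n, dtype=int)
          for i in range(n):
              for j in range(i + 1, n):
                  k = np.arange(i + 1, j)
                  # (i, j) are connected if no intermediate point blocks the line of sight.
                  line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                  if k.size == 0 or np.all(x[k] < line):
                      deg[i] += 1
                      deg[j] += 1
          return deg

      def vga_weights(x):
          deg = visibility_degrees(x)
          return deg / deg.sum()

      x = [5.0, 2.0, 7.0, 4.0, 6.0, 3.0, 8.0]
      w = vga_weights(x)
      print(w, "aggregated value:", float(np.dot(w, x)))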

  8. How volatilities nonlocal in time affect the price dynamics in complex financial systems.

    Science.gov (United States)

    Tan, Lei; Zheng, Bo; Chen, Jun-Jie; Jiang, Xiong-Fei

    2015-01-01

    What is the dominating mechanism of the price dynamics in financial systems is of great interest to scientists. The problem whether and how volatilities affect the price movement draws much attention. Although many efforts have been made, it remains challenging. Physicists usually apply the concepts and methods in statistical physics, such as temporal correlation functions, to study financial dynamics. However, the usual volatility-return correlation function, which is local in time, typically fluctuates around zero. Here we construct dynamic observables nonlocal in time to explore the volatility-return correlation, based on the empirical data of hundreds of individual stocks and 25 stock market indices in different countries. Strikingly, the correlation is discovered to be non-zero, with an amplitude of a few percent and a duration of over two weeks. This result provides compelling evidence that past volatilities nonlocal in time affect future returns. Further, we introduce an agent-based model with a novel mechanism, that is, the asymmetric trading preference in volatile and stable markets, to understand the microscopic origin of the volatility-return correlation nonlocal in time.

  9. How volatilities nonlocal in time affect the price dynamics in complex financial systems.

    Directory of Open Access Journals (Sweden)

    Lei Tan

    Full Text Available What is the dominating mechanism of the price dynamics in financial systems is of great interest to scientists. The problem whether and how volatilities affect the price movement draws much attention. Although many efforts have been made, it remains challenging. Physicists usually apply the concepts and methods in statistical physics, such as temporal correlation functions, to study financial dynamics. However, the usual volatility-return correlation function, which is local in time, typically fluctuates around zero. Here we construct dynamic observables nonlocal in time to explore the volatility-return correlation, based on the empirical data of hundreds of individual stocks and 25 stock market indices in different countries. Strikingly, the correlation is discovered to be non-zero, with an amplitude of a few percent and a duration of over two weeks. This result provides compelling evidence that past volatilities nonlocal in time affect future returns. Further, we introduce an agent-based model with a novel mechanism, that is, the asymmetric trading preference in volatile and stable markets, to understand the microscopic origin of the volatility-return correlation nonlocal in time.

  10. Non-parametric causal inference for bivariate time series

    CERN Document Server

    McCracken, James M

    2015-01-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  11. Intrusion Detection Forecasting Using Time Series for Improving Cyber Defence

    OpenAIRE

    Abdullah, Azween Bin; Pillai, Thulasyammal Ramiah; Cai, Long Zheng

    2015-01-01

    The strength of time series modeling is generally not exploited in current intrusion detection and prevention systems. By having time series models, system administrators will be able to better plan resource allocation and system readiness to defend against malicious activities. In this paper, we address the knowledge gap by investigating the possible inclusion of a statistical based time series modeling that can be seamlessly integrated into existing cyber defense system. Cyber-attack ...

  12. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable...... for structural identification of non-linear time series models. A similar generalization of the sample cross correlation function is discussed. Furthermore, a measure of the departure from linearity is suggested. It is shown how bootstrapping can be applied to construct confidence intervals under independence......

  13. Genetic programming-based chaotic time series modeling

    Institute of Scientific and Technical Information of China (English)

    张伟; 吴智铭; 杨根科

    2004-01-01

    This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm on chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and takes them as the criteria of established models. Experiments showed the effectiveness of such improvements on chaotic time series modeling.

  14. Information distance and its application in time series

    Directory of Open Access Journals (Sweden)

    B. Mirza

    2008-03-01

    Full Text Available In this paper a new method is introduced for studying time series of complex systems. The method is based on the concepts of entropy and Jensen-Shannon divergence. It is applied to time series of a billiard system and to heart signals. With this method we can distinguish healthy from unhealthy hearts, and chaotic billiards from non-chaotic systems. The method can also be applied to other time series.
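
    One simple, hedged instantiation of the idea (not necessarily the authors' exact construction): compute the Jensen-Shannon distance between the empirical value distributions of two series. The number of histogram bins and the toy signals are illustrative assumptions.

      # Hedged sketch: Jensen-Shannon distance between value distributions of two series.
      import numpy as np
      from scipy.spatial.distance import jensenshannon

      def js_distance(x, y, bins=30):
          lo = min(np.min(x), np.min(y))
          hi = max(np.max(x), np.max(y))
          p, _ = np.histogram(x, bins=bins, range=(lo, hi))
          q, _ = np.histogram(y, bins=bins, range=(lo, hi))
          return jensenshannon(p, q, base=2)     # 0 = identical distributions, 1 = maximally different

      rng = np.random.default_rng(0)
      periodic_like = np.sin(np.linspace(0, 50, 2000)) + 0.2 * rng.standard_normal(2000)
      noise_like = rng.standard_normal(2000)
      print(js_distance(periodic_like, noise_like))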

  15. Predicting Chaotic Time Series Using Recurrent Neural Network

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jia-Shu; XIAO Xian-Ci

    2000-01-01

    A new proposed method, i.e. the recurrent neural network (RNN), is introduced to predict chaotic time series. The effectiveness of using an RNN for making one-step and multi-step predictions is tested on remarkably few data points of computer-generated chaotic time series. Numerical results show that the RNN proposed here is a very powerful tool for making predictions of chaotic time series.

  16. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
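
    A hedged sketch of the differencing strategy this record finds effective: first-difference the trend series, train a small neural network on lagged differences, and undo the differencing to obtain the level forecast. sklearn's MLPRegressor stands in for the networks used in the paper; the lag count and layer size are illustrative.

      # Hedged sketch: differencing + neural network for trend time series forecasting.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def forecast_with_differencing(y, n_lags=4):
          y = np.asarray(y, dtype=float)
          d = np.diff(y)                                           # remove the trend by differencing
          X = np.array([d[i:i + n_lags] for i in range(len(d) - n_lags)])
          target = d[n_lags:]
          model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
          model.fit(X, target)
          next_diff = model.predict(d[-n_lags:].reshape(1, -1))[0]
          return y[-1] + next_diff                                 # invert the differencing

      rng = np.random.default_rng(0)
      t = np.arange(300)
      y = 0.05 * t + np.sin(t / 10) + 0.1 * rng.standard_normal(300)   # trending toy series
      print(forecast_with_differencing(y))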

  17. gatspy: General tools for Astronomical Time Series in Python

    Science.gov (United States)

    VanderPlas, Jake

    2016-10-01

    Gatspy contains efficient, well-documented implementations of several common routines for Astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.

  18. Using neural networks for dynamic light scattering time series processing

    Science.gov (United States)

    Chicea, Dan

    2017-04-01

    A basic experiment to record dynamic light scattering (DLS) time series was assembled using basic components. The DLS time series processing using the Lorentzian function fit was considered as reference. A Neural Network was designed and trained using simulated frequency spectra for spherical particles in the range 0–350 nm, assumed to be scattering centers, and the neural network design and training procedure are described in detail. The neural network output accuracy was tested both on simulated and on experimental time series. The match with the DLS results, considered as reference, was good, serving as a proof of concept for using neural networks in fast DLS time series processing.

  19. Forecasting the underlying potential governing climatic time series

    CERN Document Server

    Livina, V N; Mudelsee, M; Lenton, T M

    2012-01-01

    We introduce a technique of time series analysis, potential forecasting, which is based on dynamical propagation of the probability density of time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for `potential analysis' of climatic tipping points which altogether serves anticipating, detecting and forecasting climate transitions and bifurcations using several independent techniques of time series analysis.

  20. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    K P Harikrishnan; R Misra; G Ambika

    2009-02-01

    The correlation dimension D2 and correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, use of D2 has been more common compared to K2 as a discriminating measure. One reason for this is that D2 is a static measure and can be easily evaluated from a time series. However, in many cases, especially those involving coloured noise, K2 is regarded as a more useful measure. Here we present an efficient algorithmic scheme to compute K2 directly from time series data and show that K2 can be used as a more effective measure compared to D2 for analysing practical time series involving coloured noise.

  1. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re...

  2. Interpretable Early Classification of Multivariate Time Series

    Science.gov (United States)

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  3. Exponential Smoothing, Long Memory and Volatility Prediction

    DEFF Research Database (Denmark)

    Proietti, Tommaso

    Extracting and forecasting the volatility of financial markets is an important empirical problem. The paper provides a time series characterization of the volatility components arising when the volatility process is fractionally integrated, and proposes a new predictor that can be seen as extensi...... methods for forecasting realized volatility, and that the estimated model confidence sets include the newly proposed fractional lag predictor in all occurrences....

  4. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, involving approaches in both the time and frequency domains, with applications in many disciplines. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, this book presents various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania and little studied until now. Part of the results are accompanied by their R code.

  5. Simulation of Ground Winds Time Series

    Science.gov (United States)

    Adelfang, S. I.

    2008-01-01

    A simulation process has been developed for generation of the longitudinal and lateral components of ground wind atmospheric turbulence as a function of mean wind speed, elevation, temporal frequency range and distance between locations. The distance between locations influences the spectral coherence between the simulated series at adjacent locations. Short distances reduce correlation only at high frequencies; as distances increase, correlation is reduced over a wider range of frequencies. The choice of values for the constants d1 and d3 in the PSD model is the subject of work in progress. An improved knowledge of the values for z0 as a function of wind direction at the ARES-1 launch pads is necessary for definition of d1. Results of other studies at other locations may be helpful, as summarized in Fichtl's recent correspondence. Ideally, further research is needed based on measurements of ground wind turbulence with high-resolution anemometers at a number of altitudes at a new KSC tower located closer to the ARES-1 launch pad. The proposed research would be based on turbulence measurements that may be influenced by surface terrain roughness significantly different from the roughness prior to 1970 in Fichtl's measurements. Significant improvements in instrumentation, data storage and processing will greatly enhance the capability to model ground wind profiles and ground wind turbulence.

  6. How to analyse irregularly sampled geophysical time series?

    Science.gov (United States)

    Eroglu, Deniz; Ozken, Ibrahim; Stemler, Thomas; Marwan, Norbert; Wyrwoll, Karl-Heinz; Kurths, Juergen

    2015-04-01

    One of the challenges of time series analysis is to detect changes in the dynamics of the underlying system. There are numerous methods that can be used to detect such regime changes in regularly sampled time series. Here we present a new approach that can be applied when the time series is irregularly sampled. Such data sets occur frequently in real-world applications, as in paleoclimate proxy records. The basic idea follows Victor and Purpura [1] and considers segments of the time series. For each segment we compute the cost of transforming the segment into the following one. If the time series is from one dynamical regime, the cost of transformation should be similar for each segment of the data. Dramatic changes in the cost time series indicate a change in the underlying dynamics. Any kind of analysis can be applied to the cost time series, since it is a regularly sampled time series. While recurrence plots are not the best choice for irregularly sampled data with some measurement noise component, we show that a recurrence plot analysis based on the cost time series can successfully identify the changes in the dynamics of the system. We tested this method using synthetically created time series and use these results to highlight the performance of our method. Furthermore, we present our analysis of a suite of calcite and aragonite stalagmites located in the eastern Kimberley region of tropical Western Australia. This oxygen isotope data is a proxy for monsoon activity over the last 8,000 years. In this time series our method picks up several so-far-undetected changes from wet to dry in the monsoon system and therefore enables us to get a better understanding of the monsoon dynamics in the north-east of Australia over the last couple of thousand years. [1] J. D. Victor and K. P. Purpura, Network: Computation in Neural Systems 8, 127 (1997)

  7. A Method for Determining Periods in Time Series.

    Science.gov (United States)

    1981-04-01

    Key words: univariate time series; spectral density function; Newton's method. The method is applied to a series of hormone level data. Let {Y(t), t in Z}, Z the set of integers, be a zero-mean covariance-stationary time series with autocovariance function R(v) = E(Y(t)Y(t+v)), v in Z, and spectral density function f.

  8. Distance measure with improved lower bound for multivariate time series

    Science.gov (United States)

    Li, Hailin

    2017-02-01

    The lower bound function is an important technique for fast search and indexing of time series data. A multivariate time series is high-dimensional in two respects: the time-based dimension and the variable-based dimension. To account for the variable-based dimension, a novel method is proposed for lower-bound distance computation on multivariate time series. Like traditional approaches, the proposed method first reduces the dimensionality of the time series rather than applying the lower bound function directly to the multivariate series. In this reduction, the multivariate time series is collapsed to a univariate time series, called a center sequence, according to the principle of piecewise aggregate approximation. In addition, an extended lower bound function is designed to obtain good tightness and to measure the distance between any two center sequences quickly. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
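
    As a rough illustration of the kind of reduction described above, the sketch below collapses a multivariate series to a univariate "center" sequence with piecewise aggregate approximation and compares two such sequences with a cheap Euclidean distance. The function names and random test data are hypothetical, and whether this particular distance is a tight (or even valid) lower bound for the paper's full multivariate distance depends on definitions not reproduced here.

        import numpy as np

        def center_sequence(mts, n_segments):
            # mts: array of shape (time, variables); average across variables,
            # then apply piecewise aggregate approximation over the time axis
            center = mts.mean(axis=1)
            segments = np.array_split(center, n_segments)
            return np.array([seg.mean() for seg in segments])

        def candidate_lb(a, b):
            # cheap distance between center sequences, used to prune candidates
            # before computing the full multivariate distance
            return np.sqrt(np.sum((a - b) ** 2))

        q = center_sequence(np.random.randn(128, 6), 16)
        c = center_sequence(np.random.randn(128, 6), 16)
        print(candidate_lb(q, c))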

  9. Recovery of the Time-Evolution Equation of Time-Delay Systems from Time Series

    CERN Document Server

    Bünner, M J; Kittel, A; Parisi, J; Meyer, Th.

    1997-01-01

    We present a method for time series analysis of both scalar and nonscalar time-delay systems. If the dynamics of the system investigated is governed by a time-delay induced instability, the method allows one to determine the delay time. In a second step, the time-delay differential equation can be recovered from the time series. The method is a generalization of our recently proposed method suitable for time series analysis of scalar time-delay systems. The dynamics is not required to have settled on its attractor, which also makes transient motion accessible to the analysis. If the motion actually takes place on a chaotic attractor, the applicability of the method does not depend on the dimensionality of the chaotic attractor - one main advantage over all time series analysis methods known until now. For demonstration, we analyze time series obtained by numerical integration of a two-dimensional time-delay differential equation. After having determined the delay time, we recover...

  10. Multiscale structure of time series revealed by the monotony spectrum.

    Science.gov (United States)

    Vamoş, Călin

    2017-03-01

    Observation of complex systems produces time series with specific dynamics at different time scales. The majority of the existing numerical methods for multiscale analysis first decompose the time series into several simpler components and the multiscale structure is given by the properties of their components. We present a numerical method which describes the multiscale structure of arbitrary time series without decomposing them. It is based on the monotony spectrum defined as the variation of the mean amplitude of the monotonic segments with respect to the mean local time scale during successive averagings of the time series, the local time scales being the durations of the monotonic segments. The maxima of the monotony spectrum indicate the time scales which dominate the variations of the time series. We show that the monotony spectrum can correctly analyze a diversity of artificial time series and can discriminate the existence of deterministic variations at large time scales from the random fluctuations. As an application we analyze the multifractal structure of some hydrological time series.
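
    A bare-bones reading of that definition can be sketched as follows: split the series at its local extrema, record the mean amplitude and mean duration of the monotonic segments, and repeat after each additional smoothing pass. This is only an interpretation of the description above; the window length, number of averagings and test series are arbitrary.

        import numpy as np

        def monotonic_segments(x):
            # amplitudes and durations of the monotonic segments of x
            dx = np.sign(np.diff(x))
            turning = np.where(dx[1:] * dx[:-1] < 0)[0] + 1
            bounds = np.concatenate(([0], turning, [len(x) - 1]))
            amps = np.abs(x[bounds[1:]] - x[bounds[:-1]])
            durs = np.diff(bounds).astype(float)
            return amps, durs

        def monotony_spectrum(x, n_avg=20):
            # mean segment amplitude vs. mean local time scale under
            # successive moving-average smoothings of the series
            points = []
            for _ in range(n_avg):
                amps, durs = monotonic_segments(x)
                points.append((durs.mean(), amps.mean()))
                x = np.convolve(x, np.ones(3) / 3, mode="valid")
            return np.array(points)

        spectrum = monotony_spectrum(np.cumsum(np.random.randn(5000)))
        print(spectrum[:5])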

  11. Time series prediction using wavelet process neural network

    Institute of Scientific and Technical Information of China (English)

    Ding Gang; Zhong Shi-Sheng; Li Yang

    2008-01-01

    In the real world, the inputs of many complicated systems are time-varying functions or processes. In order to predict the outputs of these systems with high speed and accuracy, this paper proposes a time series prediction model based on the wavelet process neural network, and develops the corresponding learning algorithm based on the expansion of the orthogonal basis functions. The effectiveness of the proposed time series prediction model and its learning algorithm is proved by the Mackey-Glass time series prediction, and the comparative prediction results indicate that the proposed time series prediction model based on the wavelet process neural network seems to perform well and appears suitable for using as a good tool to predict the highly complex nonlinear time series.

  12. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  13. Measurements of spatial population synchrony: influence of time series transformations.

    Science.gov (United States)

    Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël

    2015-09-01

    Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
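
    For readers unfamiliar with the transformations involved, the sketch below shows the two most common TSTs, linear detrending and AR(1) prewhitening, applied before a simple Pearson-correlation synchrony measure. It is a generic illustration with synthetic data, not the study's exact processing pipeline.

        import numpy as np

        def detrend(x):
            # remove a linear trend fitted by ordinary least squares
            t = np.arange(len(x))
            slope, intercept = np.polyfit(t, x, 1)
            return x - (slope * t + intercept)

        def prewhiten(x):
            # remove lag-1 autocorrelation using a fitted AR(1) filter
            x = x - x.mean()
            phi = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)
            return x[1:] - phi * x[:-1]

        def synchrony(x, y):
            # Pearson correlation between two (transformed) abundance series
            return np.corrcoef(x, y)[0, 1]

        a, b = np.random.randn(40), np.random.randn(40)
        print(synchrony(detrend(a), detrend(b)))
        print(synchrony(prewhiten(a), prewhiten(b)))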

  14. Transition Icons for Time Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd D; Tighe, Patrick J; Rashidi, Parisa

    2017-05-16

    The modern healthcare landscape has seen the rapid emergence of techniques and devices which temporally monitor and record physiological signals. The prevalence of time series data within the healthcare field necessitates the development of methods which can analyze the data in order to draw meaningful conclusions. Time series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call Transition Icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition Icons are adept at detecting and displaying subtle differences and similarities e.g. between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods which collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from Symbolic Aggregate approXimation (SAX) representations, and compiles transition frequencies into a Bag of Patterns (BoP) constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the Transition Icon technique for two time series data sets - postoperative pain scores, and hip-worn accelerometer activity counts. We believe Transition Icons can be an important tool for researchers approaching time series data, as they give rich and intuitive information about collective time series behaviors.
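
    The core SAX-and-transition-counting step can be sketched as below; the alphabet size, segment count and use of scipy's normal quantiles for the breakpoints are assumptions, and the icon rendering itself is not reproduced.

        import numpy as np
        from collections import Counter
        from scipy.stats import norm

        def sax_symbols(x, n_segments=32, alphabet_size=4):
            # SAX: z-normalise, piecewise aggregate approximation, then map
            # segment means to symbols via equiprobable Gaussian breakpoints
            x = (x - x.mean()) / (x.std() + 1e-12)
            paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
            breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
            return np.digitize(paa, breakpoints)

        def transition_bag(x, **kw):
            # Bag of Patterns of symbol-to-symbol transition frequencies,
            # one cell per ordered symbol pair (the raw material of an icon)
            s = sax_symbols(x, **kw)
            counts = Counter(zip(s[:-1], s[1:]))
            total = sum(counts.values())
            return {pair: c / total for pair, c in counts.items()}

        print(transition_bag(np.random.randn(500)))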

  15. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our...... results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series....

  16. Model of a synthetic wind speed time series generator

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2008-01-01

    of possible wind conditions. If this information is not available, synthetic wind speed time series may be a useful tool as well, but their generator must preserve the statistical and stochastic features of the phenomenon. This paper deals with this issue: a generator for synthetic wind speed time series...

  17. Evaluation Applications of Regression Analysis with Time-Series Data.

    Science.gov (United States)

    Veney, James E.

    1993-01-01

    The application of time series analysis is described, focusing on the use of regression analysis for analyzing time series in a way that may make it more readily available to an evaluation practice audience. Practical guidelines are suggested for decision makers in government, health, and social welfare agencies. (SLD)

  18. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  19. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    Bikas K Chakrabarti; Arnab Chatterjee; Pratip Bhattacharyya

    2008-08-01

    We find prominent similarities in the features of the time series for the overlap of two Cantor sets (model earthquakes) when one set moves with uniform relative velocity over the other, and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  20. Robust Forecasting of Non-Stationary Time Series

    NARCIS (Netherlands)

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable foreca

  1. Mean shifts, unit roots and forecasting seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard); H. Hoek (Henk)

    1997-01-01

    Examples of descriptive models for changing seasonal patterns in economic time series are autoregressive models with seasonal unit roots or with deterministic seasonal mean shifts. In this paper we show through a forecasting comparison for three macroeconomic time series (for which tests

  2. Stata: The language of choice for time series analysis?

    OpenAIRE

    Baum, Christopher F

    2004-01-01

    This paper discusses the use of Stata for the analysis of time series and panel data. The evolution of time-series capabilities in Stata is reviewed. Facilities for data management, graphics, and econometric analysis from both official Stata and the user community are discussed. A new routine to provide moving-window regression estimates, rollreg, is described, and its use illustrated.

  3. Fixed Points in Self-Similar Analysis of Time Series

    OpenAIRE

    Gluzman, S.; Yukalov, V. I.

    1998-01-01

    Two possible definitions of fixed points in the self-similar analysis of time series are considered. One definition is based on the minimal-difference condition and another, on a simple averaging. From studying stock market time series, one may conclude that these two definitions are practically equivalent. A forecast is made for the stock market indices for the end of March 1998.

  4. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...

  5. Time Series Econometrics for the 21st Century

    Science.gov (United States)

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  7. Time series analysis : Smoothed correlation integrals, autocovariances, and power spectra

    NARCIS (Netherlands)

    Takens, F; Dumortier, F; Broer, H; Mawhin, J; Vanderbauwhede, A; Lunel, SV

    2005-01-01

    In this paper we relate notions from linear time series analysis, like autocovariances and power spectra, with notions from nonlinear time series analysis, like (smoothed) correlation integrals and the corresponding dimensions and entropies. The complete proofs of the results announced in this pape

  8. Measuring information interactions on the ordinal pattern of stock time series.

    Science.gov (United States)

    Zhao, Xiaojun; Shang, Pengjian; Wang, Jing

    2013-02-01

    The interactions among time series as individual components of complex systems can be quantified by measuring to what extent they exchange information among each other. In many applications, one focuses not on the original series but on its ordinal pattern. In such cases, trivial noises appear more likely to be filtered and the abrupt influence of extreme values can be weakened. Cross-sample entropy and inner composition alignment have been introduced as prominent methods to estimate the information interactions of complex systems. In this paper, we modify both methods to detect the interactions among the ordinal pattern of stock return and volatility series, and we try to uncover the information exchanges across sectors in Chinese stock markets.
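
    The ordinal-pattern encoding on which such measures operate is simple to illustrate; the sketch below maps a synthetic return series to order-3 permutation patterns. The interaction measures themselves (cross-sample entropy, inner composition alignment) are not reproduced, and the pattern order and test data are illustrative.

        import numpy as np
        from collections import Counter

        def ordinal_patterns(x, order=3):
            # replace each window of length `order` by the permutation that
            # sorts its values (its ordinal pattern)
            windows = np.lib.stride_tricks.sliding_window_view(x, order)
            return [tuple(np.argsort(w)) for w in windows]

        prices = 100 * np.exp(np.cumsum(0.01 * np.random.randn(1000)))
        returns = np.diff(np.log(prices))
        patterns = ordinal_patterns(returns, order=3)
        print(Counter(patterns).most_common(3))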

  9. Fluctuation behaviors of financial return volatility duration

    Science.gov (United States)

    Niu, Hongli; Wang, Jun; Lu, Yunfan

    2016-04-01

    It is crucially important to understand the return volatility of financial markets, because it helps to quantify investment risk, optimize portfolios, and provides a key input to option pricing models. The characteristics of isolated high-volatility events above a certain threshold in price fluctuations, and the distributions of return intervals between these events, arouse great interest in financial research. In the present work, we introduce a new concept of daily return volatility duration, which is defined as the shortest passage time when the future volatility intensity is above or below the current volatility intensity (without predefining a threshold). The statistical properties of the daily return volatility durations for seven representative stock indices from the world financial markets are investigated. Some useful and interesting empirical results about the probability distributions, memory effects and multifractal properties of these volatility duration series are obtained. These results also show that the proposed stock volatility series analysis is a meaningful and beneficial trial.
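
    One plausible reading of that passage-time definition is sketched below on a simulated price path, using absolute log-returns as the volatility proxy; the proxy, the direction ("above") and the simulated data are assumptions for illustration only.

        import numpy as np

        def passage_time(vol, t, direction="above"):
            # shortest k >= 1 such that vol[t + k] is above (or below) vol[t]
            future = vol[t + 1:]
            if direction == "above":
                hits = np.where(future > vol[t])[0]
            else:
                hits = np.where(future < vol[t])[0]
            return int(hits[0]) + 1 if hits.size else np.nan

        prices = 100 * np.exp(np.cumsum(0.01 * np.random.randn(2000)))
        vol = np.abs(np.diff(np.log(prices)))      # daily volatility proxy
        durations = np.array([passage_time(vol, t) for t in range(len(vol) - 1)])
        durations = durations[~np.isnan(durations)]
        print(durations.mean())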

  10. Scale Invariance in Rain Time Series

    Science.gov (United States)

    Deluca, A.; Corral, A.

    2009-09-01

    In the last few years there have been pieces of evidence that rain events can be considered analogous to other nonequilibrium relaxation processes in Nature such as earthquakes, solar flares and avalanches. In this work we compare the probability densities of rain event size, duration, and recurrence times (i.e., drought periods) between one Mediterranean site and different sites worldwide. We test the existence of scale invariance in these distributions and the possibility of a universal scaling exponent, despite the different climatic characteristics of the different places.

  11. Comparison of New and Old Sunspot Number Time Series

    Science.gov (United States)

    Cliver, E. W.

    2016-11-01

    Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten ( Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. ( Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling ( Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten ( Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number (RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre ( Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. ( Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.

  12. Testing time series reversibility using complex network methods

    CERN Document Server

    Donges, Jonathan F; Kurths, Jürgen

    2012-01-01

    The absence of time-reversal symmetry is a fundamental property of many nonlinear time series. Here, we propose a set of novel statistical tests for time series reversibility based on standard and horizontal visibility graphs. Specifically, we statistically compare the distributions of time-directed variants of the common graph-theoretical measures degree and local clustering coefficient. Unlike other tests for reversibility, our approach does not require constructing surrogate data and can be applied to relatively short time series. We demonstrate its performance for realisations of paradigmatic model systems with known time-reversal properties, as well as picking up signatures of nonlinearity in some well-studied real-world neuro-physiological time series.

  13. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme, is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  14. Time series analysis and inverse theory for geophysicists

    Institute of Scientific and Technical Information of China (English)

    Junzo Kasahara

    2006-01-01

    Thanks to the advances in geophysical measurement technologies, most geophysical data are now recorded in digital form. But to extract the 'Earth's nature' from observed data, it is necessary to apply signal-processing methods to the time-series data, seismograms and geomagnetic records being the most common. The processing of time-series data is one of the major subjects of this book. By the processing of time series data, numerical values such as travel-times are obtained. The first stage of data analysis is forward modeling, but the more advanced step is the inversion method. This is the second subject of this book.

  15. Chaotic Time Series Forecasting Using Higher Order Neural Networks

    Directory of Open Access Journals (Sweden)

    Waddah Waheeb

    2016-10-01

    This study presents a novel application and comparison of higher order neural networks (HONNs) to forecast benchmark chaotic time series. Two models of HONNs were implemented, namely the functional link neural network (FLNN) and the pi-sigma neural network (PSNN). These models were tested on two benchmark time series: the monthly smoothed sunspot numbers and the Mackey-Glass time-delay differential equation time series. The forecasting performance of the HONNs is compared against the performance of different models previously used in the literature, such as fuzzy and neural network models. Simulation results showed that FLNN and PSNN offer good performance compared to many previously used hybrid models.

  16. Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.

    Science.gov (United States)

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  17. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  18. Effective multifractal features of high-frequency price fluctuations time series and l-variability diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Jeferson de [Laboratorio de Analise de Bacias e Petrofisica, Departamento de Geologia, Universidade Federal do Parana, Centro Politecnico - Jardim das Americas, Caixa Postal 19001, 81531-990 Curitiba-PR (Brazil); Centro Brasileiro de Pesquisas Fisicas, Rua Dr. Xavier Sigaud 150, 22290-180 Rio de Janeiro-RJ (Brazil)], E-mail: jdesouza@ufpr.br; Duarte Queiros, Silvio M. [Centro Brasileiro de Pesquisas Fisicas, Rua Dr. Xavier Sigaud 150, 22290-180 Rio de Janeiro-RJ (Brazil)], E-mail: sdqueiro@googlemail.com

    2009-11-30

    In this manuscript we present a comprehensive study of the multifractal properties of high-frequency price fluctuations and instantaneous volatility of the equities that compose the Dow Jones Industrial Average. The analysis consists of quantifying the influence of dependence and non-Gaussianity on the multifractal character of financial quantities. Our results point to an equally important role of dependence and non-Gaussianity in the multifractality of the time series. Moreover, we analyse l-diagrams of price fluctuations. In the latter case, we show that the fractal dimension of these maps is basically independent of the lag between price fluctuations that we assume.

  19. Financial Time Series: Stylized Facts for the Mexican Stock Exchange Index Compared to Developed Markets

    OpenAIRE

    Omar Rojas; Carlos Trejo-Pech

    2014-01-01

    We present some stylized facts exhibited by the time series of returns of the Mexican Stock Exchange Index (IPC) and compare them to a sample of both developed (USA, UK and Japan) and emerging markets (Brazil and India). The period of study is 1997-2011. The stylized facts are related mostly to the probability distribution function and the autocorrelation function (e.g. fat tails, non-normality, volatility clustering, among others). We find that positive skewness for returns in Mexico and...

  20. Performance of multifractal detrended fluctuation analysis on short time series

    CERN Document Server

    Lopez, Juan Luis

    2013-01-01

    The performance of the multifractal detrended analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application the series of the daily exchange rate between the U.S. dollar and the euro is studied.
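
    For reference, a compact (and deliberately simplified) version of the MF-DFA procedure whose short-series behaviour is evaluated in the paper looks roughly as follows; the scales, q values and order-1 detrending are illustrative choices.

        import numpy as np

        def mfdfa_hurst(x, q_values=(-2, 2), scales=(16, 32, 64, 128)):
            # generalised Hurst exponents h(q) from the scaling of the
            # q-th order fluctuation function with segment size
            profile = np.cumsum(x - np.mean(x))
            h = {}
            for q in q_values:                      # q = 0 needs special handling
                log_F = []
                for s in scales:
                    n_seg = len(profile) // s
                    ms_res = []
                    for i in range(n_seg):
                        seg = profile[i * s:(i + 1) * s]
                        t = np.arange(s)
                        trend = np.polyval(np.polyfit(t, seg, 1), t)
                        ms_res.append(np.mean((seg - trend) ** 2))
                    F = np.mean(np.array(ms_res) ** (q / 2)) ** (1 / q)
                    log_F.append(np.log(F))
                h[q] = np.polyfit(np.log(scales), log_F, 1)[0]
            return h

        print(mfdfa_hurst(np.random.randn(4096)))   # ~0.5 for white noise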

  1. Time Series Decomposition into Oscillation Components and Phase Estimation.

    Science.gov (United States)

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model like the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes' method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.

  2. Outliers detection in multivariate time series by independent component analysis.

    Science.gov (United States)

    Baragona, Roberto; Battaglia, Francesco

    2007-07-01

    In multivariate time series, outlying data that do not fit the common pattern may often be observed. Occurrences of outliers are unpredictable events that may severely distort the analysis of the multivariate time series. For instance, model building, seasonality assessment, and forecasting may be seriously affected by undetected outliers. The dependence structure of the multivariate time series gives rise to the well-known smearing and masking phenomena that prevent using most outlier identification techniques. It may be noticed, however, that a convenient way of representing multiple outliers consists of superimposing a deterministic disturbance on a gaussian multivariate time series. Then outliers may be modeled as nongaussian time series components. Independent component analysis is a recently developed tool that is likely to be able to extract possible outlier patterns. In practice, independent component analysis may be used to analyze multivariate observable time series and separate regular and outlying unobservable components. In the factor models framework too, it is shown that independent component analysis is a useful tool for detection of outliers in multivariate time series. Some algorithms that perform independent component analysis are compared. It has been found that all algorithms are effective in detecting various types of outliers, such as patches, level shifts, and isolated outliers, even at the beginning or the end of the stretch of observations. Also, there is no appreciable difference in the ability of different algorithms to display the outlying observations pattern.
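
    A minimal sketch of the general idea, assuming scikit-learn's FastICA and a crude robust-z flagging rule of our own choosing (the paper's algorithms and decision rules are not reproduced):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 4))        # regular multivariate series
        X[250, :] += 8.0                      # injected additive outlier

        ica = FastICA(n_components=4, random_state=0)
        S = ica.fit_transform(X)              # estimated independent components

        # flag time points where any component is extreme (MAD-based z-score)
        med = np.median(S, axis=0)
        mad = np.median(np.abs(S - med), axis=0)
        z = np.abs(S - med) / (1.4826 * mad + 1e-12)
        print(np.where(z.max(axis=1) > 5)[0])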

  3. Solving Nonlinear Time Delay Control Systems by Fourier series

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Farahi

    2014-06-01

    In this paper we present a method to find the solution of time-delay optimal control systems using Fourier series. The method is based upon expanding various time functions in the system as their truncated Fourier series. Operational matrices of integration and delay are presented and are utilized to reduce the solution of time-delay control systems to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and applicability of the technique.

  4. Cross recurrence plot based synchronization of time series

    OpenAIRE

    N. Marwan; Thiel, M.; Nowaczyk, N. R.

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP) which, among others, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so ...

  5. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Clinical time series prediction: towards a hierarchical dynamical system framework

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive

  7. Modeling Persistence In Hydrological Time Series Using Fractional Differencing

    Science.gov (United States)

    Hosking, J. R. M.

    1984-12-01

    The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
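
    The fractional differencing operator itself is easy to apply through the truncated binomial expansion of (1 - B)^d, as in the sketch below; the truncation length and the test series are illustrative.

        import numpy as np

        def frac_diff(x, d, n_weights=100):
            # apply (1 - B)^d via the recursion for its binomial weights
            w = np.zeros(n_weights)
            w[0] = 1.0
            for k in range(1, n_weights):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return np.convolve(x, w, mode="full")[:len(x)]

        # 0 < d < 0.5 gives a stationary but persistent (long-memory) series
        x = np.cumsum(np.random.randn(1000))
        y = frac_diff(x, d=0.4)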

  8. Seasonality, nonstationarity and the forecasting of monthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1991-01-01

    We focus on two forecasting models for a monthly time series. The first model requires that the variable is first order and seasonally differenced. The second model considers the series only in its first differences, while seasonality is modeled with a constant and seasonal dummies. A me

  10. A vector of quarters representation for bivariate time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1995-01-01

    In this paper it is shown that several models for a bivariate nonstationary quarterly time series are nested in a vector autoregression with cointegration restrictions for the eight annual series of quarterly observations. Or, the Granger Representation Theorem is extended to incorporate

  11. A multivariate approach to modeling univariate seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1994-01-01

    A seasonal time series can be represented by a vector autoregressive model for the annual series containing the seasonal observations. This model allows for periodically varying coefficients. When the vector elements are integrated, the maximum likelihood cointegration method can be used

  13. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  14. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber......, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time-series...

  15. On the detection of superdiffusive behaviour in time series

    CERN Document Server

    Gottwald, Georg A

    2016-01-01

    We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data and relies on one realisation (i.e., one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. α-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement as well as the p-variation method.
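
    The standard comparison method mentioned above, estimating the growth exponent of the mean-square displacement, can be sketched as follows; this is the baseline, not the authors' new method, and the lag range and test data are arbitrary.

        import numpy as np

        def msd_exponent(x, max_lag=200):
            # slope gamma of log MSD(tau) vs log tau; gamma > 1 suggests
            # superdiffusive behaviour, gamma = 1 ordinary diffusion
            lags = np.arange(1, max_lag)
            msd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
            return np.polyfit(np.log(lags), np.log(msd), 1)[0]

        walk = np.cumsum(np.random.randn(20000))   # Brownian-like walk
        print(msd_exponent(walk))                  # expect a value near 1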

  16. Genetic programming-based chaotic time series modeling

    Institute of Scientific and Technical Information of China (English)

    张伟; 吴智铭; 杨根科

    2004-01-01

    This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm for chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and takes them as the criteria of established models. Experiments showed the effectiveness of such improvements on chaotic time series modeling.

  17. Algorithms for Linear Time Series Analysis: With R Package

    Directory of Open Access Journals (Sweden)

    A. Ian McLeod

    2007-11-01

    Our ltsa package implements the Durbin-Levinson and Trench algorithms and provides a general approach to the problems of fitting, forecasting and simulating linear time series models as well as fitting regression models with linear time series errors. For computational efficiency both algorithms are implemented in C and interfaced to R. Examples are given which illustrate the efficiency and accuracy of the algorithms. We provide a second package FGN which illustrates the use of the ltsa package with fractional Gaussian noise (FGN). It is hoped that the ltsa will provide a base for further time series software.
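
    The ltsa package itself is in R; purely for illustration, a Python version of the Durbin-Levinson recursion it implements might look like the sketch below, checked against the exact autocovariances of an AR(1) process.

        import numpy as np

        def durbin_levinson(acvf):
            # from autocovariances acvf[0..p], compute AR(p) coefficients
            # and the innovation variance via the Durbin-Levinson recursion
            p = len(acvf) - 1
            phi = np.zeros((p + 1, p + 1))
            v = np.zeros(p + 1)
            v[0] = acvf[0]
            for n in range(1, p + 1):
                acc = acvf[n] - np.dot(phi[n - 1, 1:n], acvf[1:n][::-1])
                phi[n, n] = acc / v[n - 1]
                phi[n, 1:n] = phi[n - 1, 1:n] - phi[n, n] * phi[n - 1, 1:n][::-1]
                v[n] = v[n - 1] * (1.0 - phi[n, n] ** 2)
            return phi[p, 1:], v[p]

        # AR(1) with coefficient 0.6: acvf(h) = 0.6**h / (1 - 0.36)
        acvf = 0.6 ** np.arange(4) / (1 - 0.36)
        coef, var = durbin_levinson(acvf)
        print(coef)        # close to [0.6, 0.0, 0.0]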

  18. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
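
    A local level plus seasonal specification of this kind can be fitted, for example, with statsmodels' UnobservedComponents class; the sketch below uses synthetic monthly counts because the Malaysian accident data are not reproduced here, and the specific call is one reasonable implementation rather than the authors' code.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # hypothetical monthly counts standing in for 2001-2011 accident data
        idx = pd.date_range("2001-01", periods=132, freq="MS")
        y = pd.Series(1000 + 50 * np.sin(2 * np.pi * np.arange(132) / 12)
                      + 20 * np.random.randn(132), index=idx)

        # local level with a stochastic seasonal component of period 12
        model = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
        res = model.fit(disp=False)
        print(res.aic)
        print(res.forecast(steps=12))   # out-of-sample check on the next year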

  19. INTERVAL TIME SERIES ANALYSIS WITH AN APPLICATION TO THE STERLING-DOLLAR EXCHANGE RATE

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Traditional econometrics has long employed "points" to measure time series data. In real-life situations, however, it suffers the loss of volatility information, since many variables are bounded by intervals in a given period. To address this issue, this paper provides a new methodology for interval time series analysis. The concept of "interval stochastic process" is formally defined as a counterpart of "stochastic process" in point-based econometrics. The authors introduce the concepts of interval stationarity and interval statistics (including interval mean, interval variance, etc.) and propose an interval linear model to investigate the dynamic relationships between interval processes. A new interval-based optimization approach for estimation is proposed, and corresponding evaluation criteria are derived. To demonstrate that the new interval method provides valid results, an empirical example on the sterling-dollar exchange rate is presented.

  20. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  1. Quantifying memory in complex physiological time-series.

    Science.gov (United States)

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  2. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  3. Analysis of temperature time series based on Hilbert-Huang Transform

    Institute of Scientific and Technical Information of China (English)

    马皓; 邱翔; 罗剑平; 顾品强; 刘宇陆

    2015-01-01

    In this paper, with consideration of the nonlinear and non-stationary properties of the temperature time series, we employ the Hilbert-Huang Transform, based on the empirical mode decomposition (EMD), to analyze the temperature time series from 1959 to 2012 in the Fengxian district of Shanghai, obtained from a certain monitoring station. The oscillating modes are drawn from the data, and the characteristics of the time series are investigated. The results show that the intrinsic modes 1, 2 and 6 represent the periodic properties of 1 year, 2.5 years, and 27 years. The mean temperature shows periodic variations, but the main trend of this fluctuation is the rising of the temperature in the recent 50 years. The analysis of the reconstructed modes with the wave pattern shows that the variations are quite large from 1963 to 1964, from 1977 to 1982 and from 2003 to 2006, which indicates that the temperature rises and falls dramatically in these periods. The volatility from 1993 to 1994 is far more dramatic than in other periods, and is the most remarkable in the recent 50 years. The log-linear plots of the mean time scales T and M show that each mode is associated with a time scale almost twice as large as that of the preceding mode. The Hilbert spectrum shows that the energy is concentrated in the range of low frequencies from 0.05 to 0.1 Hz, and a very small amount of energy is distributed in the range of higher frequencies over 0.1 Hz. In conclusion, the HHT is better than other traditional signal analysis methods in processing nonlinear signals to obtain the periodic variation and volatility properties at different time scales.

  4. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
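
    A minimal sketch of a wrap-around plot in the spirit of WATS Plots is given below, using a plain matplotlib polar axis rather than the authors' RRose/WATS software; the monthly counts and the mid-series "interruption" are synthetic.

```python
# Sketch of a wrap-around (circular) time series plot: each revolution is one year,
# with years before and after an artificial interruption drawn in different colours.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
months = np.arange(120)                               # 10 years of monthly data
counts = 100 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, months.size)
counts[60:] -= 8                                      # an "interruption" halfway through

theta = 2 * np.pi * np.arange(12) / 12                # angular position within the year
ax = plt.subplot(projection="polar")
for year in range(10):
    r = counts[year * 12:(year + 1) * 12]
    ax.plot(np.r_[theta, theta[:1]], np.r_[r, r[:1]], lw=0.8,
            color="tab:red" if year >= 5 else "tab:blue")
ax.set_xticks(theta)
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
ax.set_title("Wrap-around plot: years before (blue) and after (red) the interruption")
plt.show()
```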

  5. Multi-dimensional sparse time series: feature extraction

    CERN Document Server

    Franciosi, Marco

    2008-01-01

    We show an analysis of multi-dimensional time series via entropy and statistical linguistic techniques. We define three markers encoding the behavior of the series, after it has been translated into a multi-dimensional symbolic sequence. The leading component and the trend of the series with respect to a mobile window analysis result from the entropy analysis and label the dynamical evolution of the series. The diversification formalizes the differentiation in the use of recurrent patterns, from a Zipf law point of view. These markers are the starting point of further analysis such as classification or clustering of large databases of multi-dimensional time series, prediction of future behavior and attribution of new data. We also present an application to economic data. We deal with measurements of money investments of some business companies in the advertising market for different media sources.

  6. Incomplete Continuous-time Securities Markets with Stochastic Income Volatility

    DEFF Research Database (Denmark)

    Christensen, Peter Ove; Larsen, Kasper

    2014-01-01

    We derive closed-form solutions for the equilibrium interest rate and market price of risk processes in an incomplete continuous-time market with uncertainty generated by Brownian motions. The economy has a finite number of heterogeneous exponential utility investors, who receive partially...... equilibrium displays both lower interest rates and higher risk premia compared to the equilibrium in an otherwise identical complete market....

  7. A probability distribution approach to synthetic turbulence time series

    Science.gov (United States)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach on generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.
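
    The sketch below illustrates the general PDF-based idea in a deliberately reduced form: a one-point conditional distribution p(u_{t+1} | u_t) is estimated from a surrogate AR(1) series with a 2-D histogram and a synthetic series is sampled from it. The paper itself works with three-point conditional PDFs measured from hot-wire data; everything here is an illustrative assumption.

```python
# Minimal one-point-conditional sketch of PDF-based synthetic series generation.
import numpy as np

rng = np.random.default_rng(2)
u = np.zeros(20000)
for t in range(1, u.size):                           # surrogate "measured" velocity series (AR(1))
    u[t] = 0.9 * u[t - 1] + rng.normal(0.0, 0.5)

nbins = 50
edges = np.linspace(u.min(), u.max(), nbins + 1)
centers = 0.5 * (edges[:-1] + edges[1:])
H, _, _ = np.histogram2d(u[:-1], u[1:], bins=[edges, edges])   # joint counts of (u_t, u_{t+1})

synthetic = np.empty(5000)
synthetic[0] = u[0]
for t in range(1, synthetic.size):
    i = int(np.clip(np.searchsorted(edges, synthetic[t - 1]) - 1, 0, nbins - 1))
    row = H[i]                                       # empirical p(u_{t+1} | current bin), unnormalised
    if row.sum() == 0:                               # unvisited bin: fall back to the marginal
        row = H.sum(axis=0)
    synthetic[t] = rng.choice(centers, p=row / row.sum())

print("data std:", round(u.std(), 3), " synthetic std:", round(synthetic.std(), 3))
print("data lag-1 corr:", round(np.corrcoef(u[:-1], u[1:])[0, 1], 3),
      " synthetic lag-1 corr:", round(np.corrcoef(synthetic[:-1], synthetic[1:])[0, 1], 3))
```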

  8. On robust forecasting of autoregressive time series under censoring

    OpenAIRE

    Kharin, Y.; Badziahin, I.

    2009-01-01

    Problems of robust statistical forecasting are considered for autoregressive time series observed under distortions generated by interval censoring. Three types of robust forecasting statistics are developed; the mean-square risk is evaluated for the developed forecasting statistics. Numerical results are given.

  9. Lagrangian Time Series Models for Ocean Surface Drifter Trajectories

    CERN Document Server

    Sykulski, Adam M; Lilly, Jonathan M; Danioux, Eric

    2016-01-01

    This paper proposes stochastic models for the analysis of ocean surface trajectories obtained from freely-drifting satellite-tracked instruments. The proposed time series models are used to summarise large multivariate datasets and infer important physical parameters of inertial oscillations and other ocean processes. Nonstationary time series methods are employed to account for the spatiotemporal variability of each trajectory. Because the datasets are large, we construct computationally efficient methods through the use of frequency-domain modelling and estimation, with the data expressed as complex-valued time series. We detail how practical issues related to sampling and model misspecification may be addressed using semi-parametric techniques for time series, and we demonstrate the effectiveness of our stochastic models through application to both real-world data and to numerical model output.

  10. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  11. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956-2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  13. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    FIRST LADY

    Keywords: production model, inventory management, multivariate time series ... regard when companies overstock raw materials inventory as a result of .... Error Analysis for Forecasts of 2008-2014 to Establish Model out of Control.

  14. Phenotyping of Clinical Time Series with LSTM Recurrent Neural Networks

    OpenAIRE

    Lipton, Zachary C.; Kale, David C.; Wetzell, Randall C.

    2015-01-01

    We present a novel application of LSTM recurrent neural networks to multilabel classification of diagnoses given variable-length time series of clinical measurements. Our method outperforms a strong baseline on a variety of metrics.

  15. Distinguishing chaotic time series from noise: A random matrix approach

    Science.gov (United States)

    Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong

    2017-03-01

    Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.

  16. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  17. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956-2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  18. Assessing Day-to-Day Volatility: Does the Trading Time Matter?

    Directory of Open Access Journals (Sweden)

    José Valentim Machado Vicente

    2014-06-01

    Full Text Available The aim of this study is to examine whether investors who trade daily but at different times have distinct perceptions about the risk of an asset. In order to capture the uncertainty faced by these investors, we define the volatility perceived by investors as the distribution of standard deviations of daily returns calculated from intraday prices collected randomly. We find that this distribution has a high degree of dispersion. This means that different investors may not share the same opinion regarding the variability of returns of the same asset. Moreover, the close-to-close volatility is often less than the median of the volatility distribution perceived by investors, while the open-to-open volatility is greater than that statistic. From a practical point of view, our results indicate that volatilities estimated using traditional samples of daily returns (i.e., close-to-close and open-to-open returns) may not do a good job when used as inputs in financial models, since they may not properly capture the risk investors are exposed to.
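
    The sketch below reproduces the procedure (not the empirical finding) under simple assumptions: intraday prices are simulated as a homogeneous geometric Brownian motion, and the distribution of daily-return standard deviations is computed for investors who sample a random time each day. With such a featureless simulated process the dispersion is small; in real data, intraday patterns and overnight effects widen it considerably.

```python
# Sketch: "perceived volatility" from randomly timed daily sampling of simulated intraday prices.
import numpy as np

rng = np.random.default_rng(4)
n_days, ticks = 500, 390                                 # e.g. one price per minute of a trading day
sigma_daily = 0.012                                      # assumed daily log-return volatility
log_p = np.cumsum(rng.normal(0, sigma_daily / np.sqrt(ticks), size=n_days * ticks))
prices = 100 * np.exp(log_p).reshape(n_days, ticks)

vol_close = np.diff(np.log(prices[:, -1])).std()         # close-to-close volatility

perceived = []
for _ in range(1000):                                    # an investor trading at a random time each day
    k = rng.integers(0, ticks, size=n_days)
    series = prices[np.arange(n_days), k]
    perceived.append(np.diff(np.log(series)).std())
perceived = np.array(perceived)

print(f"close-to-close vol: {vol_close:.4f}")
print(f"perceived vol: median {np.median(perceived):.4f}, "
      f"5-95% range [{np.percentile(perceived, 5):.4f}, {np.percentile(perceived, 95):.4f}]")
```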

  19. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for ....... In this paper the generalizations are applied to some simulated data sets and to the Canadian lynx data. The generalizations seem to perform well and the measure of the departure from linearity proves to be an important additional tool....

  20. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator...... theory involves normal distribution results and Poisson distribution results. The theory is applied to a time series data set....

  1. The use of synthetic input sequences in time series modeling

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Dair Jose de [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Letellier, Christophe [CORIA/CNRS UMR 6614, Universite et INSA de Rouen, Av. de l' Universite, BP 12, F-76801 Saint-Etienne du Rouvray cedex (France); Gomes, Murilo E.D. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Aguirre, Luis A. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil)], E-mail: aguirre@cpdee.ufmg.br

    2008-08-04

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  2. The use of synthetic input sequences in time series modeling

    Science.gov (United States)

    de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.

    2008-08-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  3. Prediction and interpolation of time series by state space models

    OpenAIRE

    Helske, Jouni

    2015-01-01

    A large amount of data collected today is in the form of a time series. In order to make realistic inferences based on time series forecasts, in addition to point predictions, prediction intervals or other measures of uncertainty should be presented. Multiple sources of uncertainty are often ignored due to the complexities involved in accounting for them correctly. In this dissertation, some of these problems are reviewed and some new solutions are presented. A state space approach...

  4. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network containing two multilayer perceptrons: one with bipolar sigmoid activation functions and the other with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  5. Mean shifts, unit roots and forecasting seasonal time series

    OpenAIRE

    Franses, Philip Hans; Paap, Richard; Hoek, Henk

    1997-01-01

    textabstractExamples of descriptive models for changing seasonal patterns in economic time series are autoregressive models with seasonal unit roots or with deterministic seasonal mean shifts. In this paper we show through a forecasting comparison for three macroeconomic time series (for which tests indicate the presence of seasonal unit roots) that allowing for possible seasonal mean shifts can improve forecast performance. Next, by means of simulation we demonstrate the impact of imposing a...

  6. Extracting Chaos Control Parameters from Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, R B B [Centro Universitario da FEI, Avenida Humberto de Alencar Castelo Branco 3972, 09850-901, Sao Bernardo do Campo, SP (Brazil); Graves, J C, E-mail: rsantos@fei.edu.br [Instituto Tecnologico de Aeronautica, Praca Marechal Eduardo Gomes 50, 12228-900, Sao Jose dos Campos, SP (Brazil)

    2011-03-01

    We present a simple method to analyze time series, and estimate the parameters needed to control chaos in dynamical systems. Application of the method to a system described by the logistic map is also shown. Analyzing only two 100-point time series, we achieved results within 2% of the analytical ones. With these estimates, we show that the OGY control method successfully stabilized a period-1 unstable periodic orbit embedded in the chaotic attractor.
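
    A sketch of the estimation step is given below for the logistic map: the period-1 unstable fixed point and the local map slope, the two ingredients an OGY-type controller needs, are read off from a linear fit to points of the measured series that lie close to the diagonal x_{t+1} = x_t. A longer series than the paper's 100 points is used here so the fit has enough near-diagonal points; the setup is assumed, not taken from the paper.

```python
# Sketch: estimate the unstable fixed point and local slope of the logistic map from data.
import numpy as np

rng = np.random.default_rng(5)
r = 3.9
x = np.empty(2000)
x[0] = rng.random()
for t in range(1, x.size):
    x[t] = r * x[t - 1] * (1.0 - x[t - 1])           # "measured" chaotic series

xt, xt1 = x[:-1], x[1:]
near = np.abs(xt1 - xt) < 0.05                       # points close to the diagonal x_{t+1} = x_t
b, a = np.polyfit(xt[near], xt1[near], 1)            # local linear fit x_{t+1} ~ a + b * x_t
x_fixed = a / (1.0 - b)                              # estimated unstable fixed point
print(f"estimated fixed point {x_fixed:.4f} (exact {1 - 1/r:.4f}), "
      f"local slope {b:.2f} (exact {2 - r:.2f})")
```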

  7. Effects of Daylight Saving Time changes on stock market volatility: a reply.

    Science.gov (United States)

    Berument, Hakan; Dogan, Nukhet

    2011-12-01

    There is a rich array of evidence that suggests that changes in sleeping patterns affect an individual's decision-making processes. A nationwide sleeping-pattern change happens twice a year when the Daylight Saving Time (DST) change occurs. Kamstra, Kramer, and Levi argued in 2000 that a DST change lowers stock market returns. This study presents evidence that DST changes affect the relationship between stock market return and volatility. Empirical evidence suggests that the positive relationship between return and volatility becomes negative on the Mondays following DST changes.

  8. Modelling time-varying volatility in the Indian stock returns: Some empirical evidence

    Directory of Open Access Journals (Sweden)

    Trilochan Tripathy

    2015-12-01

    Full Text Available This paper models time-varying volatility in one of the main Indian stock markets, namely, the National Stock Exchange (NSE) located in Mumbai, investigating whether it has been affected by the recent global financial crisis. A Chow test indicates the presence of a structural break. Both symmetric and asymmetric GARCH models suggest that the volatility of NSE returns is persistent and asymmetric and has increased as a result of the crisis. The model under the Generalized Error Distribution appears to be the most suitable one. However, its out-of-sample forecasting performance is relatively poor.
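
    For readers who want to reproduce this kind of analysis, the sketch below fits a symmetric GARCH(1,1) and an asymmetric (GJR-type) variant with generalized-error-distributed innovations using the third-party arch package; the return series is a simulated placeholder, not NSE data.

```python
# Sketch with the "arch" package (assumed installed): symmetric vs asymmetric GARCH fits.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(6)
returns = rng.standard_t(df=6, size=2000)            # placeholder daily returns (per cent)

garch = arch_model(returns, vol="GARCH", p=1, q=1, dist="ged").fit(disp="off")
gjr = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="ged").fit(disp="off")

print(garch.params)   # omega, alpha[1], beta[1]: alpha + beta near 1 means persistent volatility
print(gjr.params)     # the extra asymmetry term (o=1) captures leverage-type effects
print("AIC symmetric vs asymmetric:", garch.aic, gjr.aic)
```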

  9. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  10. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    FEI WanChun; BAI Lun

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TVPAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  11. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TV-PAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  12. Real-Time Measurement of Volatile Chemicals Released by Bed Bugs during Mating Activities

    DEFF Research Database (Denmark)

    Kilpinen, Ole Østerlund; Liu, Dezhao; Adamsen, Anders Peter

    2012-01-01

    In recent years, bed bug (Hemiptera: Cimicidae) problems have increased dramatically in many parts of the world, leading to a renewed interest in their chemical ecology. Most studies of bed bug semiochemicals have been based on the collection of volatiles over a period of time followed by chemical...

  13. Volatile compound profile of sous-vide cooked lamb loins at different temperature-time combinations.

    Science.gov (United States)

    Roldán, Mar; Ruiz, Jorge; Del Pulgar, José Sánchez; Pérez-Palacios, Trinidad; Antequera, Teresa

    2015-02-01

    Lamb loins were subjected to sous-vide cooking at different combinations of temperature (60 and 80°C) and time (6 and 24h) to assess the effect on the volatile compound profile. Major chemical families in cooked samples were aliphatic hydrocarbons and aldehydes. The volatile compound profile in sous-vide cooked lamb loin was affected by the cooking temperature and time. Volatile compounds arising from lipid oxidation presented a high abundance in samples cooked at low or moderate cooking conditions (60°C for 6 and 24h, 80°C for 6h), while a more intense time and temperature combination (80°C for 24h) resulted in a higher concentration of volatile compounds arising from Strecker degradation of amino acids, such as 2-methylpropanal and 3-methylbutanal. Therefore, sous-vide cooking at moderately high temperatures for long times would result in the formation of a stronger meaty flavor and roast notes in lamb meat. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. A method for detecting changes in long time series

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.

    1995-09-01

    Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e. based on data from earlier times, while the other is a "backcast", i.e. based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".
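
    A minimal sketch of the forecast-versus-backcast idea is shown below under simplifying assumptions: each point is predicted from a window of earlier data and from a window of later data using local Gaussian models, and the absolute log-likelihood ratio of the two predictions is used as the screening statistic. This is not the authors' exact algorithm.

```python
# Sketch: flag candidate change points by comparing a "forecast" and a "backcast" prediction.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
x = rng.normal(0, 1, 2000)
x[1000:] += 1.5                                       # an artificial mean shift at t = 1000

w = 100                                               # window length for both predictions
stat = np.zeros(x.size)
for t in range(w, x.size - w):
    past, future = x[t - w:t], x[t + 1:t + 1 + w]
    ll_forecast = norm.logpdf(x[t], past.mean(), past.std(ddof=1))
    ll_backcast = norm.logpdf(x[t], future.mean(), future.std(ddof=1))
    stat[t] = abs(ll_forecast - ll_backcast)

print("most suspicious change point near t =", int(np.argmax(stat)))
```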

  15. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally (non)line

  16. LEGENDRE SERIES SOLUTIONS FOR TIME-VARIATION DYNAMICS

    Institute of Scientific and Technical Information of China (English)

    Cao Zhiyuan; Zou Guiping; Tang Shougao

    2000-01-01

    In this topic, a new approach to the analysis of time-variation dynamics is proposed by use of Legendre series expansion and Legendre integral operator matrix. The theoretical basis for effective solution of time-variation dynamics is therefore established, which is beneficial to further research of time-variation science.

  17. Similarity estimators for irregular and age-uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
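
    The sketch below implements a simplified Gaussian-kernel cross-correlation of the gXCF type for two irregularly sampled series: pairs of observations are weighted by how closely their time difference matches the trial lag. The kernel width h and the synthetic data are illustrative choices, not those of the paper.

```python
# Sketch: kernel-weighted cross-correlation for irregularly sampled series.
import numpy as np

def gaussian_kernel_xcf(tx, x, ty, y, lags, h):
    """Kernel-weighted cross-correlation of two standardized, irregularly sampled series."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]                 # all pairwise time differences
    out = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)   # weight pairs whose lag matches
        out.append(np.sum(w * np.outer(x, y)) / np.sum(w))
    return np.array(out)

rng = np.random.default_rng(8)
tx = np.sort(rng.uniform(0, 500, 300))             # irregular sampling times
ty = np.sort(rng.uniform(0, 500, 300))
signal = lambda t: np.sin(2 * np.pi * t / 50)
x = signal(tx) + 0.3 * rng.normal(size=tx.size)
y = signal(ty - 7) + 0.3 * rng.normal(size=ty.size)   # y lags x by ~7 time units

lags = np.arange(-20, 21)
xcf = gaussian_kernel_xcf(tx, x, ty, y, lags, h=2.0)
print("estimated coupling lag:", lags[np.argmax(xcf)])
```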

  18. Similarity estimators for irregular and age uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  19. Comparison of time series using entropy and mutual correlation

    Science.gov (United States)

    Madonna, Fabio; Rosoldi, Marco

    2015-04-01

    The potential for redundant time series to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. Moreover, comparisons among time series of in situ and ground-based remote sensing measurements have been performed using several methods, but quite often relying on linear models. In this work, the concepts of entropy (H) and mutual correlation (MC), defined in the frame of information theory, are applied to the study of essential climate variables with the aim of characterizing the uncertainty of a time series and the redundancy of collocated measurements provided by different surface-based techniques. In particular, integrated water vapor (IWV) and water vapour mixing ratio time series obtained at five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations with several sensors (e.g. radiosondes, GPS, microwave and infrared radiometers, Raman lidar), in the period 2010-2012, are analyzed in terms of H and MC. The comparison between the probability density functions of the time series shows that caution in using linear assumptions is needed and the use of statistics, like entropy, that are robust to outliers, is recommended to investigate measurement time series. Results reveal that the random uncertainties on the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar measurements differed by less than 8% over the considered time period. Comparisons of the time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by 60% by constraining the measurements with those from

  20. Similarity estimators for irregular and age uncertain time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2013-09-01

    Full Text Available Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011 and mutual information (gMI, Rehfeld et al., 2013 against their interpolation-based counterparts and the new event synchronization function (ESF. We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60–55% (in the linear case to 53–42% (for the nonlinear processes of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time

  1. Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect

    Directory of Open Access Journals (Sweden)

    Yanhui Xi

    2016-01-01

    Full Text Available The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumed a negative correlation in the errors between the price/return innovation and the volatility innovation. With the new representations, a theoretical explanation of the leverage effect is provided. Simulated data and daily stock market indices (Shanghai composite index, Shenzhen component index, and Standard and Poor's 500 Composite index) via the Bayesian Markov Chain Monte Carlo (MCMC) method are used to estimate the leverage market microstructure model. The results verify the effectiveness of the model and its estimation approach proposed in the paper and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV) model in terms of DIC (Deviance Information Criterion), the leverage market microstructure model fits the data better.
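
    The mechanism can be illustrated with a simple discrete-time stochastic-volatility-style simulation, sketched below, in which the return innovation and the log-volatility innovation are drawn with negative correlation; this is an assumed toy specification, not the paper's market microstructure model or its MCMC estimation.

```python
# Toy simulation of the leverage effect: negatively correlated return and volatility innovations.
import numpy as np

rng = np.random.default_rng(9)
n, rho = 20000, -0.8                                   # rho < 0 encodes the leverage effect
eps, eta = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n).T

log_vol = np.full(n, -2.0)
returns = np.zeros(n)
for t in range(1, n):
    log_vol[t] = -0.1 + 0.95 * log_vol[t - 1] + 0.3 * eta[t]   # persistent log-volatility
    returns[t] = np.exp(log_vol[t] / 2.0) * eps[t]

# Leverage: negative returns tend to be followed by higher volatility.
lev = np.corrcoef(returns[:-1], np.abs(returns[1:]))[0, 1]
print(f"corr(return_t, |return_(t+1)|) = {lev:.3f}")
```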

  2. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    Science.gov (United States)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

    Twice daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that the detected bias was significant, a t test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data. The practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for use in detecting and adjusting time series radiosonde data. Accurate station histories are very desirable. Station histories can confirm that detected inhomogeneities are related to instrument or procedural changes. Adjustments can then be made to the data with some confidence.

  3. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity data, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time-domain and frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
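
    A minimal multiresolution sketch using the PyWavelets package (assumed available) is given below; it decomposes a simulated return series rather than Bursa Malaysia KLCI data, and uses the plain DWT rather than the MODWT.

```python
# Sketch: discrete wavelet multiresolution decomposition of a return series with PyWavelets.
import numpy as np
import pywt

rng = np.random.default_rng(10)
returns = rng.normal(0, 1, 1024)                     # placeholder daily return series

coeffs = pywt.wavedec(returns, wavelet="db4", level=4)   # [cA4, cD4, cD3, cD2, cD1]
for name, c in zip(["A4", "D4", "D3", "D2", "D1"], coeffs):
    print(f"{name}: {len(c)} coefficients, energy {np.sum(c ** 2):.1f}")

# Reconstruct the "smooth" (low-frequency) part by zeroing all detail coefficients.
smooth = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], "db4")
print("reconstruction from the approximation only has length", len(smooth))
```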

  4. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be about 15 years.

  5. Exploratory Causal Analysis in Bivariate Time Series Data

    Science.gov (United States)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data

  6. Evaluation of scaling invariance embedded in short time series.

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2,0.3,...,0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  7. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2,0.3,...,0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with ignorable bias (≤0.03) and sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  8. Statistical modelling of agrometeorological time series by exponential smoothing

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Hoffmann, Holger; Baranowski, Piotr

    2016-01-01

    Meteorological time series are used in modelling agrophysical processes of the soil-plant-atmosphere system which determine plant growth and yield. Additionally, long-term meteorological series are used in climate change scenarios. Such studies often require forecasting or projection of meteorological variables, e.g. the projection of the occurrence of extreme events. The aim of the article was to determine the most suitable exponential smoothing models to generate forecasts using data on air temperature, wind speed, and precipitation time series in Jokioinen (Finland), Dikopshof (Germany), Lleida (Spain), and Lublin (Poland). These series exhibit regular additive seasonality or non-seasonality without any trend, which is confirmed by their autocorrelation functions and partial autocorrelation functions. The most suitable models were indicated by the smallest mean absolute error and the smallest root mean squared error.
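
    The sketch below shows the kind of exponential smoothing fit and error measures such a comparison relies on, using the Holt-Winters implementation in statsmodels on a synthetic monthly temperature-like series; the additive-seasonal, no-trend configuration is an assumption for the example, not the study's selected model.

```python
# Sketch: Holt-Winters exponential smoothing fit and out-of-sample error measures.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(11)
idx = pd.date_range("1990-01-01", periods=300, freq="MS")
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(300) / 12) + rng.normal(0, 1.5, 300)
series = pd.Series(temp, index=idx)

train, test = series[:-24], series[-24:]
fit = ExponentialSmoothing(train, trend=None, seasonal="add", seasonal_periods=12).fit()
forecast = fit.forecast(24)

mae = np.mean(np.abs(forecast - test))
rmse = np.sqrt(np.mean((forecast - test) ** 2))
print(f"MAE = {mae:.2f}, RMSE = {rmse:.2f}")
```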

  9. Discovering shared and individual latent structure in multiple time series

    CERN Document Server

    Saria, Suchi; Penn, Anna

    2010-01-01

    This paper proposes a nonparametric Bayesian method for exploratory data analysis and feature construction in continuous time series. Our method focuses on understanding shared features in a set of time series that exhibit significant individual variability. Our method builds on the framework of latent Diricihlet allocation (LDA) and its extension to hierarchical Dirichlet processes, which allows us to characterize each series as switching between latent ``topics'', where each topic is characterized as a distribution over ``words'' that specify the series dynamics. However, unlike standard applications of LDA, we discover the words as we learn the model. We apply this model to the task of tracking the physiological signals of premature infants; our model obtains clinically significant insights as well as useful features for supervised learning tasks.

  10. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
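
    A compact DFA sketch is given below; it estimates the scaling exponent alpha from the slope of log F(s) versus log s for synthetic white and persistent noise, not for the dengue series.

```python
# Minimal detrended fluctuation analysis (DFA) to estimate a scaling exponent alpha.
import numpy as np

def dfa_alpha(x, scales):
    y = np.cumsum(x - np.mean(x))                    # integrated (profile) series
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    # alpha is the slope of log F(s) versus log s
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

rng = np.random.default_rng(12)
white = rng.normal(size=4000)
persistent = np.convolve(rng.normal(size=4100), np.ones(100) / 100, mode="valid")[:4000]

scales = np.unique(np.logspace(1, 3, 15).astype(int))
print("alpha (white noise, expect ~0.5):", round(dfa_alpha(white, scales), 2))
print("alpha (smoothed/persistent noise, expect > 0.5):", round(dfa_alpha(persistent, scales), 2))
```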

  11. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify the driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
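
    A much-simplified sketch of the pipeline is shown below: slope features are extracted from fixed windows of two simulated driving signals (a stand-in for the paper's bottom-up piecewise linear segmentation) and fed to a support vector machine; all data and parameter choices are illustrative assumptions.

```python
# Sketch: slope features from fixed windows of two driving signals, classified with an SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(13)

def simulate_trial(impaired, n=600):
    """Toy lateral-position and steering-angle traces; 'impaired' driving drifts more."""
    drift = 0.05 if impaired else 0.02
    lateral = np.cumsum(rng.normal(0, drift, n))
    steering = -0.5 * np.gradient(lateral) + rng.normal(0, 0.01, n)
    return lateral, steering

def slope_features(sig, win=50):
    """Slope of a local linear fit in consecutive windows (a crude piecewise-linear representation)."""
    t = np.arange(win)
    return [np.polyfit(t, sig[i:i + win], 1)[0] for i in range(0, len(sig) - win + 1, win)]

X, y = [], []
for label in (0, 1):
    for _ in range(40):
        lat, steer = simulate_trial(bool(label))
        X.append(np.abs(slope_features(lat) + slope_features(steer)))   # list concatenation, then |.|
        y.append(label)

scores = cross_val_score(SVC(kernel="rbf"), np.array(X), np.array(y), cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```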

  12. Wavelet matrix transform for time-series similarity measurement

    Institute of Scientific and Technical Information of China (English)

    HU Zhi-kun; XU Fei; GUI Wei-hua; YANG Chun-hua

    2009-01-01

    A time-series similarity measurement method based on wavelet and matrix transforms was proposed, and its anti-noise ability, sensitivity and accuracy were discussed. The time-series sequences were compressed into a wavelet subspace, and the sample feature vector and orthogonal basis of the sample time-series sequences were obtained by the K-L transform. The inner product transform was then carried out to project the analyzed time-series sequence onto the orthogonal basis and obtain the analyzed feature vectors. The similarity between the sample feature vector and the analyzed feature vector was calculated by the Euclidean distance. Taking fault waveforms of power electronic devices as an example, the experimental results show that the proposed method has a low-dimensional feature vector, its anti-noise ability is 30 times that of the plain wavelet method, its sensitivity is 1/3 that of the plain wavelet method, and its accuracy is higher than that of the wavelet singular value decomposition method. The proposed method can be applied in similarity matching and indexing for larger time series databases.

  13. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
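
    The alternating renewal idea can be sketched as below: dry spells and rectangular-pulse storms of exponentially distributed duration alternate, with a constant intensity drawn for each storm. All distributional choices are illustrative assumptions, not the calibrated Campania model.

```python
# Sketch: alternating-renewal (DRIP-style) hourly rainfall generator with rectangular pulses.
import numpy as np

rng = np.random.default_rng(14)
mean_dry, mean_wet, mean_intensity = 60.0, 8.0, 1.5      # hours, hours, mm/h (assumed values)

rain = []
while len(rain) < 24 * 365:                              # build one synthetic year, hour by hour
    dry = int(round(rng.exponential(mean_dry)))          # dry spell duration
    wet = max(1, int(round(rng.exponential(mean_wet))))  # storm duration
    intensity = rng.exponential(mean_intensity)          # constant intensity during the storm
    rain.extend([0.0] * dry + [intensity] * wet)

rain = np.array(rain[:24 * 365])
print(f"annual total {rain.sum():.0f} mm, wet-hour fraction {np.mean(rain > 0):.2f}")
```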

  14. Stationary Time Series Analysis Using Information and Spectral Analysis

    Science.gov (United States)

    1992-09-01

    ...the spectral density function of the time series. The spectral density function f(w), 0 < w < 1, is defined as the Fourier transform of ... An important result of Pinsker [(1964), p. 196] can be interpreted as providing a formula for asymptotic...

  15. Gaussian semiparametric estimation of non-stationary time series

    OpenAIRE

    Velasco, Carlos

    1998-01-01

    Generalizing the definition of the memory parameter d in terms of the differentiated series, we showed in Velasco (Non-stationary log-periodogram regression, Forthcoming J. Economet., 1997) that it is possible to estimate consistently the memory of non-stationary processes using methods designed for stationary long-range-dependent time series. In this paper we consider the Gaussian semiparametric estimate analysed by Robinson (Gaussian semiparametric estimation of long range dependence. Ann. ...

  16. Moderate Growth Time Series for Dynamic Combinatorics Modelisation

    CERN Document Server

    Jaff, Luaï; Kacem, Hatem Hadj; Bertelle, Cyrille

    2007-01-01

    Here, we present a family of time series with a simple growth constraint. This family can be the basis of a model to apply to emerging computation in business and micro-economy, where global functions can be expressed from local rules. We make explicit a double statistics on these series which allows us to establish a one-to-one correspondence with three other ballot-like structures.

  17. Ozonolysis of a series of biogenic organic volatile compounds and secondary organic aerosol formation

    Science.gov (United States)

    Bernard, François; Quilgars, Alain; Cazaunau, Mathieu; Grosselin, Benoît.; Daele, Véronique; Mellouki, Abdelwahid; Winterhalter, Richard; Moortgat, Geert K.

    2010-05-01

    Secondary organic aerosols are formed via nucleation of atmospheric organic vapours on pre-existing particles observed in various rural environments where the organic fraction represents the major part of the observed nano-particles (Kavouras and Stephanou, 2002; Kulmala et al., 2004a). However, nucleation of organic vapors appears to be unlikely thermodynamically in relevant atmospheric conditions (Kulmala et al., 2004b). In this work, a systematic study has been conducted to investigate aerosol formation through the ozonolysis of a series of monoterpenes using a newly developed aerosol flow reactor and the ICARE indoor simulation chamber. The nucleation thresholds have been determined for SOA formed through the reaction of ozone with α-pinene, sabinene, myrcene and limonene in the absence of any observable existing particles. The measurements were performed using the flow reactor combined with a particle counter (CPC 3022). Number concentrations of SOA have been measured for different concentrations of consumed monoterpenes. The data obtained allow us to estimate the nucleation threshold for a range of 0.2 - 45 ppb of consumed monoterpenes. The nucleation threshold values obtained here (≤ 1 ppb of the consumed monoterpenes) have been found to be lower than the previously reported ones (Berndt et al., 2003; Bonn and Moortgat, 2003; Koch et al., 2000; Lee and Kamens, 2005). The ICARE simulation chamber has been used to study the mechanism of the reaction of ozone with various acyclic terpenes (myrcene, ocimene, linalool and α-farnesene) and to derive the SOA mass formation yield. The time-concentration profiles of reactants and products in the gas phase were obtained using in-situ Fourier Transform Infrared Spectroscopy. In addition, the number and mass concentrations of SOA have been monitored with a Scanning Mobility Particle Sizer. The chemical composition of the SOA formed has been tentatively characterised using Liquid Chromatography - Mass Spectrometry. The results

  18. First time-series optical photometry from Antarctica

    CERN Document Server

    Strassmeier, K G; Granzer, T; Tosti, G; DiVarano, I; Savanov, I; Bagaglia, M; Castellini, S; Mancini, A; Nucciarelli, G; Straniero, O; Distefano, E; Messina, S; Cutispoto, G

    2008-01-01

Beating the Earth's day-night cycle is mandatory for long and continuous time-series photometry and has been achieved with either large ground-based networks of observatories at different geographic longitudes or observations conducted from space. A third possibility is offered by a polar location with astronomically-qualified site characteristics. Aims. In this paper, we present the first scientific stellar time-series optical photometry from Dome C in Antarctica and analyze approximately 13,000 CCD frames taken in July 2007. We conclude that high-precision CCD photometry with exceptional time coverage and cadence can be obtained at Dome C in Antarctica and be successfully used for time-series astrophysics.

  19. Time series analysis of the response of measurement instruments

    CERN Document Server

    Georgakaki, Dimitra; Polatoglou, Hariton

    2012-01-01

    In this work the significance of treating a set of measurements as a time series is being explored. Time Series Analysis (TSA) techniques, part of the Exploratory Data Analysis (EDA) approach, can provide much insight regarding the stochastic correlations that are induced on the outcome of an experiment by the measurement system and can provide criteria for the limited use of the classical variance in metrology. Specifically, techniques such as the Lag Plots, Autocorrelation Function, Power Spectral Density and Allan Variance are used to analyze series of sequential measurements, collected at equal time intervals from an electromechanical transducer. These techniques are used in conjunction with power law models of stochastic noise in order to characterize time or frequency regimes for which the usually assumed white noise model is adequate for the description of the measurement system response. However, through the detection of colored noise, usually referred to as flicker noise, which is expected to appear ...

  20. Weighted statistical parameters for irregularly sampled time series

    CERN Document Server

    Rimoldini, Lorenzo

    2014-01-01

Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or may be inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. This paper aims at improving the accuracy of common statistical parameters for the characterization of irregularly sampled signals. The uneven representation of time series, often including clumps of measurements and gaps with no data, can severely disrupt the values of estimators. A weighting scheme adapting to the sampling density and noise level of the signal is formulated. Its application to time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the sugg...

  1. A refined fuzzy time series model for stock market forecasting

    Science.gov (United States)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and enrollments of the University of Alabama. It is shown that the proposed method works with higher accuracy than other fuzzy time series methods developed for forecasting the TAIEX and enrollments of the University of Alabama.

  2. Image-Based Learning Approach Applied to Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    J. C. Chimal-Eguía

    2012-06-01

Full Text Available In this paper, a new learning approach based on time-series image information is presented. In order to implement this new learning technique, a novel time-series input data representation is also defined. This input data representation is based on information obtained by dividing the image axes into boxes. The difference between this new input data representation and the classical one is that this technique is not time-dependent. This new information is implemented in the new Image-Based Learning Approach (IBLA), and by means of a probabilistic mechanism this learning technique is applied to the interesting problem of time series forecasting. The experimental results indicate that by using the methodology proposed in this article, it is possible to obtain better results than with classical techniques such as artificial neural networks and support vector machines.

  3. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  4. Periodicity Estimation in Mechanical Acoustic Time-Series Data

    Directory of Open Access Journals (Sweden)

    Zhu Yongbo

    2015-01-01

Full Text Available Periodicity estimation in mechanical acoustic time-series data is a well-established problem in data mining, as it is applicable in a variety of disciplines, either for anomaly detection or for prediction purposes in industry. In this paper, we develop a new approach for capturing and characterizing periodic patterns in time-series data by means of dynamic time warping (DTW). We have conducted extensive experiments to evaluate the proposed approach with synthetic data and with data collected in practice. Experimental results demonstrate its effectiveness and robustness for periodicity detection in highly noisy data.
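The abstract does not spell out how DTW scores are turned into a period estimate, so the following Python sketch is only illustrative: it implements the textbook dynamic-programming DTW distance together with a hypothetical scan (best_period, the window length and the candidate set are assumptions) that matches a reference window against period-shifted windows and keeps the shift with the lowest warping cost.

```python
import numpy as np

def dtw_distance(a, b):
    """Textbook dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def best_period(x, window, candidates):
    """Hypothetical periodicity scan: smallest DTW cost over candidate period shifts."""
    ref = x[:window]
    scores = {p: dtw_distance(ref, x[p:p + window])
              for p in candidates if p + window <= len(x)}
    return min(scores, key=scores.get)
```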

  5. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge...... and mutation operations for this problem, we conduct extensive experiments to determine good choices for the parameters and operators of the genetic algorithm. One surprising observation is that use of uniform and one-point crossover together gave significantly better results than using either crossover...

  6. Bayesian dynamic modeling of time series of dengue disease case counts.

    Directory of Open Access Journals (Sweden)

    Daniel Adyro Martínez-Bello

    2017-07-01

Full Text Available The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov Chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model includes first-order random walk time-varying coefficients for both the calendar trend and the meteorological variables. Besides the computational challenges, interpreting the results requires a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small mean absolute percentage errors for one- and two-week out-of-sample predictions at most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease
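As a rough, non-authoritative illustration of the model class (not the authors' implementation), the Python sketch below forward-simulates weekly counts from a Poisson log-link model whose intercept and meteorological coefficient follow first-order random walks; the covariate, the random-walk variances and the naive benchmark forecast are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 120                                                  # weeks
temp = 25 + 3 * np.sin(2 * np.pi * np.arange(T) / 52)   # hypothetical meteorological covariate

# First-order random-walk evolution of the intercept and the covariate coefficient
alpha = 1.0 + np.cumsum(rng.normal(0, 0.05, T))
beta = 0.05 + np.cumsum(rng.normal(0, 0.01, T))

log_mu = alpha + beta * (temp - temp.mean())             # Poisson log link
cases = rng.poisson(np.exp(log_mu))

# Mean absolute percentage error of a naive one-step-ahead forecast, for reference
mape = 100 * np.mean(np.abs(cases[1:] - cases[:-1]) / np.maximum(cases[1:], 1))
```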

  7. The Application of Kernel Smoothing to Time Series Data

    Institute of Scientific and Technical Information of China (English)

    Zhao-jun Wang; Yi Zhao; Chun-jie Wu; Yan-ting Li

    2006-01-01

There are already many models for fitting stationary time series, such as AR, MA, and ARMA models. For non-stationary data, an ARIMA or seasonal ARIMA model can be used to fit the given data. Moreover, many statistical software packages can be used to build a stationary or non-stationary time series model for a given set of time series data, such as SAS, SPLUS, etc. However, some statistical packages do not work well for small samples with or without missing data, especially for small time series data with a seasonal trend. A nonparametric smoothing technique for building a forecasting model for a given small seasonal time series is developed in this paper. Both the method provided in this paper and that in the SAS package are then applied to modeling the international airline passengers data, and the two methods are compared. The results of the comparison show that the method provided in this paper is superior to SAS's method.
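The paper's exact smoother is not reproduced in the record, so the sketch below uses a generic Gaussian-kernel (Nadaraya-Watson) regression as a stand-in for a nonparametric smoothing technique; the bandwidth and the synthetic seasonal series are assumptions made for illustration.

```python
import numpy as np

def nadaraya_watson(t, y, t_new, bandwidth):
    """Gaussian-kernel regression estimate of y evaluated at the points t_new."""
    t, y, t_new = map(np.asarray, (t, y, t_new))
    w = np.exp(-0.5 * ((t_new[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

# Example: smooth a short monthly series with trend and seasonality
t = np.arange(48)
y = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 2, 48)
fitted = nadaraya_watson(t, y, t, bandwidth=2.0)
```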

  8. Recurrent Neural Network Applications for Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2017-06-01

The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize to irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution. We circumvent the difficulty of tuning by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning procedure.
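As a hedged illustration of one architecture mentioned above, the sketch below implements a minimal Echo State Network in Python: a fixed random reservoir and a ridge-regression readout. Reservoir size, spectral radius and the ridge constant are hand-picked here, not tuned by the Bayesian optimization described in the talk, and the noisy sine is only a stand-in for a light curve.

```python
import numpy as np

class EchoStateNetwork:
    """Minimal ESN: fixed random reservoir, trained linear readout."""
    def __init__(self, n_reservoir=200, spectral_radius=0.9, ridge=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.uniform(-0.5, 0.5, (n_reservoir, 1))
        W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
        self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
        self.ridge = ridge

    def _states(self, u):
        x = np.zeros(self.W.shape[0])
        states = []
        for ut in u:
            x = np.tanh(self.W_in @ [ut] + self.W @ x)
            states.append(x.copy())
        return np.array(states)

    def fit(self, u, y, washout=50):
        X, Y = self._states(u)[washout:], np.asarray(y)[washout:]
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.W_out = np.linalg.solve(A, X.T @ Y)
        return self

    def predict(self, u):
        return self._states(u) @ self.W_out

# One-step-ahead prediction of a noisy sine (stand-in for a light curve)
t = np.linspace(0, 20 * np.pi, 2000)
s = np.sin(t) + 0.05 * np.random.randn(len(t))
esn = EchoStateNetwork().fit(s[:-1], s[1:])
pred = esn.predict(s[:-1])
```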

  9. Characterizing time series via complexity-entropy curves

    Science.gov (United States)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.

  10. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
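A minimal Python sketch of the idea, assuming that two adjacent, non-overlapping windows are compared at each position: the Mann-Whitney U statistic is computed from ranks (no tie correction) and converted to a Z score with the usual large-sample mean and variance. The record's exact windowing and Monte Carlo normalization may differ.

```python
import numpy as np

def mann_whitney_z(x, y):
    """Normal approximation of the Mann-Whitney U statistic as a Z score (ties ignored)."""
    n1, n2 = len(x), len(y)
    ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
    u = ranks[:n1].sum() - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    return (u - mu) / sigma

def running_mw_z(series, window):
    """Z statistic comparing two adjacent windows sliding through the series."""
    series = np.asarray(series, float)
    return np.array([mann_whitney_z(series[i:i + window],
                                    series[i + window:i + 2 * window])
                     for i in range(len(series) - 2 * window + 1)])
```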

  11. Nonlinear projective filtering; 1, Application to real time series

    CERN Document Server

    Schreiber, T

    1998-01-01

    We discuss applications of nonlinear filtering of time series by locally linear phase space projections. Noise can be reduced whenever the error due to the manifold approximation is smaller than the noise in the system. Examples include the real time extraction of the fetal electrocardiogram from abdominal recordings.

  12. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    2013-01-01

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of co

  13. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time...

  14. Mining approximate periodic pattern in hydrological time series

    Science.gov (United States)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

There is a lot of information about the hidden laws of natural evolution and the influence of human activities on the Earth's surface in long sequences of hydrological time series. Data mining technology can help find those hidden laws, such as flood frequency and abrupt changes, which is useful for decision support in hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of drought and flood and for hydraulic engineering planning. In hydrology, full-period analysis of hydrological time series has attracted a lot of attention, using methods such as the discrete periodogram, the simple partial wave method, Fourier analysis, maximum entropy spectral analysis and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and by stochastic ones. For example, the tidal level is affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, there is some kind of approximate period hidden in hydrological time series, which is sometimes called the cryptic period. Recently, partial period mining, which originated in the data mining domain, has emerged as a remedy for the traditional period analysis methods in hydrology, as it places loose requirements on data integrity and continuity. It can find partial periods in the time series. This paper is focused on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with suffix trees, this paper proposes to mine multi-event asynchronous periodic patterns based on a modified suffix tree representation and traversal, and introduces a dynamic method for adjusting candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  15. A Platform for Processing Expression of Short Time Series (PESTS

    Directory of Open Access Journals (Sweden)

    Markatou Marianthi

    2011-01-01

Full Text Available Abstract Background Time course microarray profiles examine the expression of genes over a time domain. They are necessary in order to determine the complete set of genes that are dynamically expressed under given conditions, and to determine the interaction between these genes. Because of cost and resource issues, most time series datasets contain fewer than 9 points and there are few tools available geared towards the analysis of this type of data. Results To this end, we introduce a platform for Processing Expression of Short Time Series (PESTS). It was designed with a focus on usability and interpretability of analyses for the researcher. As such, it implements several standard techniques for comparability as well as visualization functions. However, it is designed specifically for the unique methods we have developed for significance analysis, multiple test correction and clustering of short time series data. The central tenet of these methods is the use of biologically relevant features for analysis. Features summarize short gene expression profiles, inherently incorporate dependence across time, and allow for both full description of the examined curve and missing data points. Conclusions PESTS is fully generalizable to other types of time series analyses. PESTS implements novel methods as well as several standard techniques for comparability and visualization functions. These features and functionality make PESTS a valuable resource for a researcher's toolkit. PESTS is available to download for free to academic and non-profit users at http://www.mailman.columbia.edu/academic-departments/biostatistics/research-service/software-development.

  16. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

Full Text Available In order to detect outliers in hydrological time series data for improving data quality and decision-making quality related to the design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. The use of the PCI as a threshold is mainly based on the fact that it accounts for the uncertainty in the data series parameters of the forecasting model, addressing the problem of suitable threshold selection. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
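A simplified Python sketch of the PCI idea, with an assumed moving-window mean standing in for the paper's forecasting model: a point is flagged when it falls outside the predicted value plus or minus z times the window standard deviation.

```python
import numpy as np

def detect_outliers(series, window=24, z=1.96):
    """Flag points falling outside a prediction confidence interval (PCI)
    built from a simple moving-window forecaster (here, the window mean)."""
    series = np.asarray(series, float)
    flags = np.zeros(len(series), dtype=bool)
    for t in range(window, len(series)):
        hist = series[t - window:t]
        pred = hist.mean()                    # stand-in forecasting model
        half_width = z * hist.std(ddof=1)     # PCI half-width from the window spread
        flags[t] = abs(series[t] - pred) > half_width
    return flags
```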

  17. Time series hyperspectral chemical imaging data: challenges, solutions and applications.

    Science.gov (United States)

    Gowen, A A; Marini, F; Esquerre, C; O'Donnell, C; Downey, G; Burger, J

    2011-10-31

Hyperspectral chemical imaging (HCI) integrates imaging and spectroscopy, resulting in three-dimensional data structures, hypercubes, with two spatial and one wavelength dimension. Each spatial image pixel in a hypercube contains a spectrum with >100 datapoints. While HCI facilitates enhanced monitoring of multi-component systems, time series HCI offers the possibility of a more comprehensive understanding of the dynamics of such systems and processes. This implies a need for modeling strategies that can cope with the large multivariate data structures generated in time series HCI experiments. The challenges posed by such data include dimensionality reduction, temporal morphological variation of samples and instrumental drift. This article presents potential solutions to these challenges, including multiway analysis, object tracking, multivariate curve resolution and non-linear regression. Several real-world examples of time series HCI data are presented to illustrate the proposed solutions. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Complex Network Approach to the Fractional Time Series

    CERN Document Server

    Manshour, Pouya

    2015-01-01

In order to extract the correlation information inherent in a stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. Parabolic exponential functions are found to fit the corresponding degree distributions, with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for antipersistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between the node's degree and its corresp...

  19. General expression for linear and nonlinear time series models

    Institute of Scientific and Technical Information of China (English)

    Ren HUANG; Feiyun XU; Ruwen CHEN

    2009-01-01

The typical time series models such as ARMA, AR, and MA are founded on the normality and stationarity of a system and are expressed by a linear difference equation; therefore, they are strictly limited to linear systems. However, practical systems often contain nonlinear factors, so it is difficult to fit real systems with the above models. This paper proposes a general expression for linear and nonlinear auto-regressive time series models (GNAR). With the gradient optimization method and a modified AIC information criterion integrated with the prediction error, parameter estimation and order determination are achieved. Model simulations and experiments show that the GNAR model can accurately approximate the dynamic characteristics of most nonlinear models used in academia and engineering. The modeling and prediction accuracy of the GNAR model is superior to that of the classical time series models. The proposed GNAR model is flexible and effective.

  20. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.
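The Maximum Entropy spectrum is the all-pole spectrum of a fitted autoregressive model. As a minimal stand-in for the report's FORTRAN package (which would typically use the Burg recursion), the Python sketch below fits the AR coefficients with the Yule-Walker equations and evaluates P(f) = sigma^2 / |1 - sum_k a_k exp(-i 2 pi f k)|^2 at normalized frequencies.

```python
import numpy as np

def yule_walker(x, order):
    """AR coefficients and innovation variance from the Yule-Walker equations."""
    x = np.asarray(x, float) - np.mean(x)
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])
    sigma2 = r[0] - a @ r[1:order + 1]
    return a, sigma2

def mem_spectrum(x, order, freqs):
    """Maximum-entropy (all-pole) spectral estimate at normalized frequencies in (0, 0.5]."""
    a, sigma2 = yule_walker(x, order)
    k = np.arange(1, order + 1)
    denom = np.abs(1 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return sigma2 / denom

# Example: spectrum of a simulated AR(2) process with a spectral peak
rng = np.random.default_rng(1)
x = np.zeros(2048)
for t in range(2, len(x)):
    x[t] = 1.5 * x[t - 1] - 0.9 * x[t - 2] + rng.normal()
spec = mem_spectrum(x, order=10, freqs=np.linspace(0.01, 0.5, 200))
```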

  1. Extracting unstable periodic orbits from chaotic time series data

    Energy Technology Data Exchange (ETDEWEB)

    So, P.; Schiff, S.; Gluckman, B.J., [Center for Neuroscience, Childrens Research Institute, Childrens National Medical Center and the George Washington University, NW, Washington, D.C. 20010 (United States); So, P.; Ott, E.; Grebogi, C., [Institute for Plasma Research, University of Maryland, College Park, Maryland 20742 (United States); Sauer, T., [Department of Mathematics, The George Mason University, Fairfax, Virginia 22030 (United States); Gluckman, B.J., [Naval Surface Warfare Center, Carderock Division, Bethesda, Maryland 20054-5000 (United States)

    1997-05-01

A general nonlinear method to extract unstable periodic orbits from chaotic time series is proposed. By utilizing the estimated local dynamics along a trajectory, we devise a transformation of the time series data such that the transformed data are concentrated on the periodic orbits. Thus, one can extract unstable periodic orbits from a chaotic time series by simply looking for peaks in a finite grid approximation of the distribution function of the transformed data. Our method is demonstrated using data from both numerical and experimental examples, including neuronal ensemble data from mammalian brain slices. The statistical significance of the results in the presence of noise is assessed using surrogate data. © 1997 The American Physical Society

  2. Parameter-Free Search of Time-Series Discord

    Institute of Scientific and Technical Information of China (English)

    Wei Luo; Marcus Gallagher; Janet Wiles

    2013-01-01

Time-series discord is widely used in data mining applications to characterize anomalous subsequences in time series. Compared to some other discord search algorithms, the direct search algorithm based on the recurrence plot shows the advantage of being fast and parameter free. The direct search algorithm, however, relies on quasi-periodicity in input time series, an assumption that limits the algorithm's applicability. In this paper, we eliminate the periodicity assumption from the direct search algorithm by proposing a reference function for subsequences and a new sampling strategy based on the reference function. These measures result in a new algorithm with improved efficiency and robustness, as evidenced by our empirical evaluation.

  3. GAS DETECTING AND FORECASTING VIA TIME SERIES METHOD

    Institute of Scientific and Technical Information of China (English)

    黄养光

    1990-01-01

The importance and urgency of gas detection and forecasting in underground coal mining are self-evident. Unfortunately, this problem has not yet been solved thoroughly. In this paper, the author suggests that the time series analysis method be adopted for processing stochastic gas data. The time series method is superior to conventional Fourier analysis in some respects; in particular, it possesses a forecasting (prediction) capability that is highly valuable for gas monitoring. An example of a set of gas data sampled from a certain coal mine is investigated and an AR(3) model is established. The fitting result and the forecasting error are satisfactory. At the end of this paper several remarks are presented for further discussion.
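A minimal Python sketch of fitting an AR(3) model by ordinary least squares and producing a one-step-ahead forecast; the paper's estimation procedure and gas data are not available here, so only the general AR(3) mechanics are shown.

```python
import numpy as np

def fit_ar(x, p=3):
    """Least-squares fit of x_t = c + a1*x_{t-1} + ... + ap*x_{t-p}."""
    x = np.asarray(x, float)
    X = np.column_stack([np.ones(len(x) - p)] +
                        [x[p - k - 1:len(x) - k - 1] for k in range(p)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef                                  # [c, a1, ..., ap]

def forecast_next(x, coef):
    """One-step-ahead forecast from the last p observed values."""
    p = len(coef) - 1
    return coef[0] + np.dot(coef[1:], np.asarray(x, float)[-1:-p - 1:-1])
```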

  4. The time series forecasting: from the aspect of network

    CERN Document Server

    Chen, S; Hu, Y; Liu, Q; Deng, Y

    2014-01-01

Forecasting estimates the state of events from historical data and is considerably important in many disciplines. At present, time series models have been utilized to solve forecasting problems in various domains. In general, researchers use curve fitting and parameter estimation methods (moment estimation, maximum likelihood estimation and the least squares method) to forecast. In this paper, a new perspective on forecasting is given and a completely different method is proposed to forecast time series. Inspired by the visibility graph and link prediction, this letter converts the time series into a network and then finds the nodes that are most likely to link with the predicted node. Finally, the predicted value is obtained according to the state of the link. The TAIEX data set is used in a case study to illustrate that the proposed method is effective. Compared with the ARIMA model, the proposed method shows good forecasting performance when there is a small amount of data.
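The letter's link-prediction step is not reproduced here, but the first step, turning a series into a network, can be sketched with the natural visibility criterion (assumed to be the construction intended): two points are connected when the straight line between them passes above every intermediate point.

```python
import numpy as np

def visibility_graph(x):
    """Adjacency matrix of the natural visibility graph of a 1-D series."""
    x = np.asarray(x, float)
    n = len(x)
    A = np.zeros((n, n), dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # connected if every intermediate point lies strictly below the chord i-j
            visible = all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                          for k in range(i + 1, j))
            if visible:
                A[i, j] = A[j, i] = 1
    return A
```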

  5. Feature-preserving interpolation and filtering of environmental time series

    CERN Document Server

    Mariethoz, Gregoire; Jougnot, Damien; Rezaee, Hassan

    2015-01-01

    We propose a method for filling gaps and removing interferences in time series for applications involving continuous monitoring of environmental variables. The approach is non-parametric and based on an iterative pattern-matching between the affected and the valid parts of the time series. It considers several variables jointly in the pattern matching process and allows preserving linear or non-linear dependences between variables. The uncertainty in the reconstructed time series is quantified through multiple realizations. The method is tested on self-potential data that are affected by strong interferences as well as data gaps, and the results show that our approach allows reproducing the spectral features of the original signal. Even in the presence of intense signal perturbations, it significantly improves the signal and corrects bias introduced by asymmetrical interferences. Potential applications are wide-ranging, including geophysics, meteorology and hydrology.

  6. Causal analysis of time series from hydrological systems

    Science.gov (United States)

    Selle, Benny; Aufgebauer, Britta; Knorr, Klaus-Holger

    2017-04-01

It is often difficult to infer cause and effect in hydrological systems for which time series of system inputs, outputs and state variables are observed. A recently published technique called Convergent Cross Mapping could be a promising tool to detect causality between time series. A response variable Y may be causally related to a forcing variable X if the so-called cross mapping of X using Y improves with the amount of data included. The idea is that a response variable contains information on the history of its driving variable, whereas the reverse may not be true. We propose an alternative approach based on similar ideas using neural networks. Our approach is first compared to Convergent Cross Mapping using a synthetic time series of precipitation and streamflow generated by a rainfall-runoff model. Second, measured concentrations of dissolved organic carbon and dissolved iron from a mountainous stream in Germany, which were previously hypothesised to be causally linked, are tested.

  7. On the detection of superdiffusive behaviour in time series

    Science.gov (United States)

    Gottwald, G. A.; Melbourne, I.

    2016-12-01

We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data (with no prior knowledge required of the nature of the data) and relies on one realisation (i.e., one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. α-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement as well as the p-variation method, maximum likelihood, quantile matching and linear regression of the empirical characteristic function.

  8. Increment entropy as a measure of complexity for time series

    CERN Document Server

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series, in which each increment is mapped onto a word of two letters, one letter corresponding to the direction and the other to the magnitude. The Shannon entropy of the words is termed increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals have demonstrated its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn does not make any assumptions about the time series, and it is applicable to arbitrary real-world data.
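A rough Python sketch of the IncrEn idea: each increment is encoded as a (sign, quantized magnitude) letter pair and the Shannon entropy of short words of such pairs is computed. The quantization relative to the increment standard deviation and the default word length are assumptions, not necessarily the authors' exact scheme.

```python
import numpy as np
from collections import Counter

def increment_entropy(x, resolution=4, word_length=2):
    """Shannon entropy of words built from signed, quantized increments (IncrEn-style)."""
    inc = np.diff(np.asarray(x, float))
    sign = np.sign(inc).astype(int)
    scale = np.std(inc) or 1.0                    # assumed quantization scale
    mag = np.minimum(np.abs(inc) / scale * resolution, resolution).astype(int)
    letters = list(zip(sign, mag))
    words = [tuple(letters[i:i + word_length])
             for i in range(len(letters) - word_length + 1)]
    counts = np.array(list(Counter(words).values()), float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))
```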

  9. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  10. Time Series Prediction based on Hybrid Neural Networks

    Directory of Open Access Journals (Sweden)

    S. A. Yarushev

    2016-01-01

Full Text Available In this paper, we suggest using a hybrid approach to the time series forecasting problem. In the first part of the paper, we review the literature on time series forecasting methods based on hybrid neural networks and neuro-fuzzy approaches. Hybrid neural networks are especially effective for specific types of applications, such as forecasting or classification problems, in contrast to traditional monolithic neural networks. These classes of problems include problems with different characteristics in different modules. The main part of the paper provides a detailed overview of the benefits of hybrid networks, their architectures, and their performance compared with traditional neural networks. Hybrid neural network models for time series forecasting are discussed in the paper. Experiments with modular neural networks are given.

  11. Time series, correlation matrices and random matrix models

    Energy Technology Data Exchange (ETDEWEB)

    Vinayak [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca (Mexico); Seligman, Thomas H. [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca, México and Centro Internacional de Ciencias, C.P. 62210 Cuernavaca (Mexico)

    2014-01-08

In this set of five lectures the authors present techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum-information hypothesis for the description of a quantum system or subsystem. In the former case, we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. Consequently, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case, we use random matrices to describe high-temperature environments or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.

  12. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the implicit relationships within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit-section level. Results for both long-term and short-term changes show that the models are effective and can achieve the expected accuracy.

  13. Increment Entropy as a Measure of Complexity for Time Series

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    2016-01-01

Full Text Available Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, whether energetic (e.g., spikes or bursts) or structural. The computation of IncrEn does not make any assumptions about the time series, and it is applicable to arbitrary real-world data.

  14. Time series characterization via horizontal visibility graph and Information Theory

    Science.gov (United States)

    Gonçalves, Bruna Amin; Carpi, Laura; Rosso, Osvaldo A.; Ravetti, Martín G.

    2016-12-01

Complex network theory has gained wider applicability since methods for the transformation of time series into networks were proposed and successfully tested. In the last few years, the horizontal visibility graph has become a popular method due to its simplicity and good results when applied to natural and artificially generated data. In this work, we explore different ways of extracting information from the network constructed by the horizontal visibility graph, evaluated with Information Theory quantifiers. Most works use the degree distribution of the network; however, we find alternative probability distributions that are more efficient than the degree distribution in characterizing dynamical systems. In particular, we find that, when using distributions based on distances and amplitude values, significantly shorter time series are required. We analyze fractional Brownian motion time series and a paleoclimatic proxy record of ENSO from the Pallcacocha Lake to study dynamical changes during the Holocene.
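For concreteness, the Python sketch below builds the horizontal visibility graph (two points are linked when all intermediate values lie strictly below both) and evaluates one simple Information Theory quantifier, the Shannon entropy of the degree distribution; the distance- and amplitude-based distributions favoured in the paper are not reproduced here.

```python
import numpy as np
from collections import Counter

def hvg_degrees(x):
    """Degree sequence of the horizontal visibility graph of the series x."""
    x = np.asarray(x, float)
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def degree_entropy(x):
    """Shannon entropy of the HVG degree distribution (one possible quantifier)."""
    counts = np.array(list(Counter(hvg_degrees(x)).values()), float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))
```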

  15. Asymptotics for Nonlinear Transformations of Fractionally Integrated Time Series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

The asymptotic theory for nonlinear transformations of fractionally integrated time series is developed. By use of the fractional occupation times formula, various nonlinear functions of fractionally integrated series such as ARFIMA time series are studied, and the asymptotic distributions of the sample moments of such functions are obtained and analyzed. The transformations considered in this paper include a variety of functions, such as regular functions, integrable functions and asymptotically homogeneous functions, that are often used in practical nonlinear econometric analysis. It is shown that the asymptotic theory of nonlinear transformations of original and normalized fractionally integrated processes is different from that of fractionally integrated processes, but is similar to the asymptotic theory of nonlinear transformations of integrated processes.

  16. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

Artificial neural networks (ANN) have an advantage in time series forecasting as they have the potential to solve complex forecasting problems. This is because ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and of a classical time series forecasting method, namely seasonal autoregressive integrated moving average models, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used for data preprocessing.

  17. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    Science.gov (United States)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields `objective' results. In the present paper it is argued that most of the classical nonlinear techniques don't satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we can show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.

  18. Earnings and Income Volatility in America: Evidence from Matched CPS. Discussion Paper Series. DP 2010-05

    Science.gov (United States)

    Ziliak, James P.; Hardy, Bradley; Bollinger, Christopher

    2010-01-01

In this paper we offer new evidence on earnings and income volatility in the United States over the past four decades by using matched data from the March Current Population Survey. We find that between 1973 and 2008 family income volatility rose by 38 percent, primarily as a result of higher volatility of husbands' earnings and non-means-tested…

  19. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the sampling period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization step makes it possible to perform operations, such as querying and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running program through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which make it possible to query different time series over a specified time range or to follow real-time signal acquisition, according to the users' data access policy.

  20. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

Link prediction is a main social network challenge that uses the network structure to predict future links. The common link prediction approaches use a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair, sorts the links based on their similarity metrics, and labels the links with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of the social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences are used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, there is one learning automaton for each link that must be predicted, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.

  1. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J

    1997-01-01

    , and if this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations...... the internal dynamics of the systems, and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series...

  2. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version......-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  3. Dominant Skyline Query Processing over Multiple Time Series

    Institute of Scientific and Technical Information of China (English)

    Hao Wang; Chao-Kun Wang; Ya-Jun Xu; Yuan-Chi Ning

    2013-01-01

Multiple time series (MTS), which describes an object in multiple dimensions, is based on single time series and has been proved to be useful. In this paper, a new analytical method called α/β-Dominant-Skyline on MTS and a formal definition of the α/β-dominant skyline MTS are given. Also, three algorithms, called NL, BC and MFB, are proposed to address α/β-dominant skyline queries over MTS. Finally, experimental results on both synthetic and real data verify the correctness and effectiveness of the proposed method and algorithms.

  4. Testing for intracycle determinism in pseudoperiodic time series

    Science.gov (United States)

    Coelho, Mara C. S.; Mendes, Eduardo M. A. M.; Aguirre, Luis A.

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  5. Mining Rules from Electrical Load Time Series Data Set

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

The mining of rules from electrical load time series data collected from an EMS (Energy Management System) is discussed. The data from the EMS are too large and complex to be directly understood and used by power system engineers, yet useful information is hidden in the electrical load data. The authors discuss the use of fuzzy linguistic summaries as a data mining method to induce rules from the electrical load time series. The data preprocessing techniques are also discussed in the paper.

  6. Nonlinear Time Series Forecast Using Radial Basis Function Neural Networks

    Institute of Scientific and Technical Information of China (English)

ZHENG Xin; CHEN Tian-Lun

    2003-01-01

In research on using Radial Basis Function Neural Networks (RBF NN) to forecast nonlinear time series, we investigate how different clusterings affect the processes of learning and forecasting. We find that k-means clustering is very suitable. In order to increase precision, we introduce a nonlinear feedback term to escape from local minima of the energy, and then we use the model to forecast nonlinear time series produced by the Mackey-Glass equation and by stock data. By selecting k-means clustering and a suitable feedback term, much better forecasting results are obtained.
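A minimal Python sketch of the baseline architecture described above (k-means centres, Gaussian hidden units, output weights fitted by least squares); the nonlinear feedback term proposed in the paper is not included, and all hyperparameters and the toy series below are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain k-means used to place the RBF centres."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers

def fit_rbf(X, y, k=10, width=1.0):
    """RBF network: Gaussian hidden layer on k-means centres, linear output by least squares."""
    centers = kmeans(X, k)
    Phi = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1) / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, w

def make_lagged(x, p=4):
    """Lagged-value inputs for one-step-ahead prediction."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

# Toy usage on a noisy sine (stand-in for Mackey-Glass or stock data)
series = np.sin(np.linspace(0, 8 * np.pi, 400)) + 0.05 * np.random.randn(400)
X, y = make_lagged(series)
centers, w = fit_rbf(X, y, k=12, width=0.5)
```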

  7. Bootstrap Power of Time Series Goodness of fit tests

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2013-10-01

Full Text Available In this article, we look at the power of various versions of the Box-Pierce statistic and the Cramér-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison is also made between the semi-parametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation reverses for the Cramér-von Mises test. Moreover, we find that the dynamic bootstrap method is better than the fixed-design bootstrap method.

  8. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
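A minimal Python sketch of the construction: delay-embed the series, threshold pairwise distances to get the recurrence matrix, treat it as an adjacency matrix (diagonal removed), and read off a simple network measure. The embedding dimension, delay and threshold below are arbitrary choices for the logistic-map example mentioned in the abstract.

```python
import numpy as np

def recurrence_network(x, dim=2, delay=1, eps=0.05):
    """Adjacency matrix from thresholding the recurrence matrix of a delay embedding."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    A = (dist < eps).astype(int)
    np.fill_diagonal(A, 0)                        # no self-loops
    return A

# Example: logistic map, as used in the record to illustrate dynamical transitions
r, x = 3.9, [0.4]
for _ in range(999):
    x.append(r * x[-1] * (1 - x[-1]))
A = recurrence_network(np.array(x))
degree = A.sum(axis=1)
```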

  9. Chaotic time series prediction and additive white Gaussian noise

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Teck Por [Department of Mathematics, 6M30 Huxley, Imperial College London, 180 Queen' s Gate, London, SW7 2BZ (United Kingdom)]. E-mail: teckpor@gmail.com; Puthusserypady, Sadasivan [Department of Electrical and Computer Engineering, National University of Singapore, 4 Engineering Drive 3, Singapore 117576 (Singapore)]. E-mail: elespk@nus.edu.sg

    2007-06-04

    Takens' delay embedding theorem states that a pseudo state-space can be reconstructed from a time series consisting of observations of a chaotic process. However, experimental observations are inevitably corrupted by measurement noise, which can be modelled as Additive White Gaussian Noise (AWGN). This Letter analyses time series prediction in the presence of AWGN using the triangle inequality and the mean of the Nakagami distribution. It is shown that using more delay coordinates than those used by a typical delay embedding can improve prediction accuracy, when the mean magnitude of the input vector dominates the mean magnitude of AWGN.
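
    The numerical experiment sketched below conveys the qualitative point only: a nearest-neighbour predictor is run on a noisy logistic-map series for several embedding dimensions, so one can check whether extra delay coordinates help at a given noise level. The map, noise level and predictor are assumptions; the Letter's triangle-inequality/Nakagami analysis is not reproduced.

```python
# Sketch: one-step nearest-neighbour prediction of a chaotic series corrupted by AWGN,
# repeated for an increasing number of delay coordinates (all choices illustrative).
import numpy as np

rng = np.random.default_rng(1)
n = 4000
x = np.empty(n); x[0] = 0.3
for t in range(n - 1):
    x[t + 1] = 4.0 * x[t] * (1 - x[t])            # fully chaotic logistic map
y = x + 0.05 * rng.standard_normal(n)             # observations with additive white Gaussian noise

def nn_rmse(series, dim, n_train=3000):
    vecs = np.array([series[i:i + dim] for i in range(len(series) - dim)])
    targets = series[dim:]
    tr_v, tr_t = vecs[:n_train], targets[:n_train]
    errs = []
    for v, truth in zip(vecs[n_train:], targets[n_train:]):
        j = np.argmin(np.sum((tr_v - v) ** 2, axis=1))   # nearest training neighbour
        errs.append((tr_t[j] - truth) ** 2)
    return np.sqrt(np.mean(errs))

for dim in (1, 2, 4, 8):
    print(f"embedding dimension {dim}: prediction RMSE {nn_rmse(y, dim):.4f}")
```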

  10. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...

  11. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  12. Microbial oceanography and the Hawaii Ocean Time-series programme.

    Science.gov (United States)

    Karl, David M; Church, Matthew J

    2014-10-01

    The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.

  13. Kālī: Time series data modeler

    Science.gov (United States)

    Kasliwal, Vishal P.

    2016-07-01

    The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte-Carlo (MCMC) for inference on a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power and also as an acronym for KArma LIbrary.

  14. Analysis of volatile organic compounds in compost samples: A potential tool to determine appropriate composting time.

    Science.gov (United States)

    Zhu, Fengxiang; Pan, Zaifa; Hong, Chunlai; Wang, Weiping; Chen, Xiaoyang; Xue, Zhiyong; Yao, Yanlai

    2016-12-01

    Changes in volatile organic compound contents in compost samples during pig manure composting were studied using a headspace, solid-phase micro-extraction method (HS-SPME) followed by gas chromatography with mass spectrometric detection (GC/MS). Parameters affecting the SPME procedure were optimized as follows: the coating was carbon molecular sieve/polydimethylsiloxane (CAR/PDMS) fiber, the temperature was 60°C and the time was 30 min. Under these conditions, 87 compounds were identified from 17 composting samples. Most of the volatile components could only be detected before day 22. However, benzenes, alkanes and alkenes increased and eventually stabilized after day 22. Phenol and acid substances, which are important factors for compost quality, were almost undetectable on day 39 in natural compost (NC) samples and on day 13 in maggot-treated compost (MC) samples. Our results indicate that the approach can be effectively used to determine the composting times by analysis of volatile substances in compost samples. An appropriate composting time not only ensures the quality of compost and reduces the loss of composting material but also reduces the generation of hazardous substances. The appropriate composting times for MC and NC were approximately 22 days and 40 days, respectively, during the summer in Zhejiang.

  15. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-08-09

    This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity, thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets.

  16. Fractal dimension of electroencephalographic time series and underlying brain processes.

    Science.gov (United States)

    Lutzenberger, W; Preissl, H; Pulvermüller, F

    1995-10-01

    Fractal dimension has been proposed as a useful measure for the characterization of electrophysiological time series. This paper investigates what the pointwise dimension of electroencephalographic (EEG) time series can reveal about underlying neuronal generators. The following theoretical assumptions concerning brain function were made: (i) within the cortex, strongly coupled neural assemblies exist which oscillate at certain frequencies when they are active, (ii) several such assemblies can oscillate at a time, and (iii) activity flow between assemblies is minimal. If these assumptions are made, cortical activity can be considered as the weighted sum of a finite number of oscillations (plus noise). It is shown that the correlation dimension of finite time series generated by multiple oscillators increases monotonically with the number of oscillators. Furthermore, it is shown that a reliable estimate of the pointwise dimension of the raw EEG signal can be calculated from a time series as short as a few seconds. These results indicate that (i) the pointwise dimension of the EEG allows conclusions regarding the number of independently oscillating networks in the cortex, and (ii) a reliable estimate of the pointwise dimension of the EEG is possible on the basis of short raw signals.
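
    The monotonic growth of the estimated dimension with the number of oscillators can be checked with a crude Grassberger-Procaccia-style correlation-sum estimate, as sketched below; the frequencies, embedding parameters and the two-scale slope estimate are illustrative assumptions, not the pointwise-dimension procedure of the paper.

```python
# Crude correlation-dimension estimate for signals that are sums of independent oscillators.
# A two-point slope of log C(r) vs log r is used instead of a full scaling-region fit.
import numpy as np

def correlation_dimension(x, dim=10, delay=5, n_points=800, r_fracs=(0.05, 0.2)):
    vecs = np.array([x[i + delay * np.arange(dim)] for i in range(n_points)])
    D = np.sqrt(((vecs[:, None, :] - vecs[None, :, :]) ** 2).sum(-1))
    d = D[np.triu_indices(n_points, k=1)]            # pairwise distances
    r1, r2 = np.quantile(d, r_fracs)                 # two scales inside the scaling region
    c1, c2 = np.mean(d < r1), np.mean(d < r2)        # correlation sums C(r1), C(r2)
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))

t = np.arange(20000) * 0.01
freqs = [1.0, np.sqrt(2.0), np.e, np.pi, 4.7]        # incommensurate frequencies
for k in range(1, 6):
    x = sum(np.sin(2 * np.pi * f * t) for f in freqs[:k])
    print(f"{k} oscillator(s): estimated correlation dimension {correlation_dimension(x):.2f}")
```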

  17. FTSPlot: fast time series visualization for large datasets.

    Science.gov (United States)

    Riss, Michael

    2014-01-01

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n log N); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free, making it a suitable visualization method for long-term electrophysiological experiments.
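
    The basic level-of-detail idea can be conveyed with a small min/max decimation sketch: per-block envelopes are precomputed so that any zoom level touches a bounded number of points. This is only a simplified illustration under assumed parameters, not the FTSPlot implementation or its file format.

```python
# Sketch: min/max block decimation for fast plotting of long time series.
# Simplified illustration; not the FTSPlot code or data format.
import numpy as np

def build_minmax_pyramid(x, factor=16):
    """Per-block min/max envelopes, each level `factor` times coarser (remainder dropped)."""
    levels = []
    mins, maxs = x.copy(), x.copy()
    while len(mins) > factor:
        n = (len(mins) // factor) * factor
        mins = mins[:n].reshape(-1, factor).min(axis=1)
        maxs = maxs[:n].reshape(-1, factor).max(axis=1)
        levels.append((mins, maxs))
    return levels

def envelope_for_view(x, levels, start, stop, max_points=2000, factor=16):
    """Pick the coarsest level that still gives >= max_points samples in [start, stop)."""
    span = stop - start
    level, scale = None, 1
    for i, lv in enumerate(levels):
        if span // (factor ** (i + 1)) < max_points:
            break
        level, scale = lv, factor ** (i + 1)
    if level is None:
        seg = x[start:stop]
        return seg, seg                    # raw data is already small enough to draw
    lo, hi = start // scale, stop // scale
    return level[0][lo:hi], level[1][lo:hi]

x = np.cumsum(np.random.default_rng(0).standard_normal(5_000_000))   # long synthetic recording
levels = build_minmax_pyramid(x)
mins, maxs = envelope_for_view(x, levels, 0, len(x))
print(len(mins), "envelope points to draw for the full-series view")
```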

  18. Power computations in time series analyses for traffic safety interventions.

    Science.gov (United States)

    McLeod, A Ian; Vingilis, E R

    2008-05-01

    The evaluation of traffic safety interventions or other policies that can affect road safety often requires the collection of administrative time series data, such as monthly motor vehicle collision data that may be difficult and/or expensive to collect. Furthermore, since policy decisions may be based on the results found from the intervention analysis of the policy, it is important to ensure that the statistical tests have enough power, that is, that we have collected enough time series data both before and after the intervention so that a meaningful change in the series will likely be detected. In this short paper, we present a simple methodology for doing this. It is expected that the methodology presented will be useful for sample size determination in a wide variety of traffic safety intervention analysis applications. Our method is illustrated with a proposed traffic safety study that was funded by NIH.
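
    A generic simulation-based version of such a power calculation is sketched below: a step intervention is added to an AR(1) monthly series and the rejection rate of an OLS test with HAC (Newey-West) standard errors is estimated. The effect size, noise model, test and sample sizes are assumptions, not the methodology of the paper.

```python
# Sketch: simulation-based power for detecting a step intervention in a monthly series
# with AR(1) noise. All parameter and test choices are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def simulate_power(n_before, n_after, effect=-0.2, phi=0.5, sigma=0.1, n_sim=500):
    n = n_before + n_after
    step = np.r_[np.zeros(n_before), np.ones(n_after)]
    X = sm.add_constant(step)
    rejections = 0
    for _ in range(n_sim):
        e = rng.standard_normal(n) * sigma
        noise = np.empty(n); noise[0] = e[0]
        for t in range(1, n):
            noise[t] = phi * noise[t - 1] + e[t]
        y = 5.0 + effect * step + noise          # e.g. log monthly collision counts
        # OLS with HAC (Newey-West) errors to account for residual serial correlation
        fit = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})
        rejections += fit.pvalues[1] < 0.05
    return rejections / n_sim

for months_after in (12, 24, 48):
    print(months_after, "months of post-intervention data:", simulate_power(60, months_after))
```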

  19. LEARNING GRANGER CAUSALITY GRAPHS FOR MULTIVARIATE NONLINEAR TIME SERIES

    Institute of Scientific and Technical Information of China (English)

    Wei GAO; Zheng TIAN

    2009-01-01

    An information-theoretic method is proposed to test Granger causality and contemporaneous conditional independence in Granger causality graph models. In the graphs, the vertex set denotes the component series of the multivariate time series, the directed edges denote causal dependence, and the undirected edges reflect instantaneous dependence. The presence of an edge is measured by a statistic based on conditional mutual information and tested by a permutation procedure. Furthermore, for the detected relations, a statistic based on the difference between general conditional mutual information and linear conditional mutual information is proposed to test for nonlinearity. The significance of the nonlinear test statistic is determined by a bootstrap method based on surrogate data. We investigate the finite-sample behavior of the procedure through simulated time series with different dependence structures, including linear and nonlinear relations.
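
    The flavour of such permutation tests can be conveyed with a much simpler, unconditional sketch: a histogram estimate of the mutual information between a lagged driver and a response, with a permutation null. The histogram estimator, lag and sample sizes are assumptions; the paper's statistics are conditional and its surrogate procedures are more elaborate.

```python
# Simplified, unconditional illustration: permutation test of lagged dependence via a
# histogram mutual-information estimate (not the paper's conditional MI statistics).
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(a, b, bins=8):
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

def permutation_test(x, y, lag=1, n_perm=500):
    a, b = x[:-lag], y[lag:]                   # does x lead y by `lag` steps?
    stat = mutual_information(a, b)
    null = [mutual_information(rng.permutation(a), b) for _ in range(n_perm)]
    return stat, np.mean(np.array(null) >= stat)

n = 2000
x = rng.standard_normal(n)
y = np.zeros(n)
y[1:] = 0.8 * x[:-1] ** 2 + 0.3 * rng.standard_normal(n - 1)   # nonlinear, lag-1 driving
stat, p = permutation_test(x, y, lag=1)
print(f"MI = {stat:.3f}, permutation p-value = {p:.3f}")
```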

  20. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    CERN Document Server

    Scargle, Jeffrey D; Jackson, Brad; Chiang, James

    2012-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks (Scargle 1998) - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multi-variate time series data, analysis of vari...
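
    An implementation of the Bayesian Blocks algorithm is available in astropy (astropy.stats.bayesian_blocks); assuming that interface, the sketch below segments a synthetic piecewise-constant signal measured at arbitrary times with known Gaussian errors. The data, error level and fitness choice are illustrative.

```python
# Sketch: optimal piecewise-constant segmentation of noisy point measurements using the
# Bayesian Blocks implementation assumed to be available as astropy.stats.bayesian_blocks.
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 400))
true_level = np.where(t < 40, 1.0, np.where(t < 60, 3.0, 0.5))   # three true segments
x = true_level + 0.3 * rng.standard_normal(t.size)

# "measures" fitness: data points with known Gaussian errors at arbitrary times
edges = bayesian_blocks(t, x, sigma=0.3, fitness="measures")
print("change points found near:", np.round(edges[1:-1], 1))
```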

  1. Recovery of delay time from time series based on the nearest neighbor method

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, M.D., E-mail: mdprokhorov@yandex.ru [Saratov Branch of Kotel' nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Ponomarenko, V.I. [Saratov Branch of Kotel' nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation); Khorev, V.S. [Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation)

    2013-12-09

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.

  2. Recovery of delay time from time series based on the nearest neighbor method

    Science.gov (United States)

    Prokhorov, M. D.; Ponomarenko, V. I.; Khorev, V. S.

    2013-12-01

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.

  3. Fluctuation behaviors of financial time series by a stochastic Ising system on a Sierpinski carpet lattice

    Science.gov (United States)

    Fang, Wen; Wang, Jun

    2013-09-01

    We develop a financial market model using an Ising spin system on a Sierpinski carpet lattice that breaks the equal status of each spin. To study the fluctuation behavior of the financial model, we present numerical research based on Monte Carlo simulation in conjunction with the statistical analysis and multifractal analysis of the financial time series. We extract the multifractal spectra by selecting various lattice size values of the Sierpinski carpet, and the inverse temperature of the Ising dynamic system. We also investigate the statistical fluctuation behavior, the time-varying volatility clustering, and the multifractality of returns for the indices SSE, SZSE, DJIA, IXIC, S&P500, HSI, N225, and for the simulation data derived from the Ising model on the Sierpinski carpet lattice. A numerical study of the model’s dynamical properties reveals that this financial model reproduces important features of the empirical data.

  4. A Data Mining Framework for Time Series Estimation

    Science.gov (United States)

    Hu, Xiao; Xu, Peng; Wu, Shaozhi; Asgari, Shadnaz; Bergsneider, Marvin

    2009-01-01

    Time series estimation techniques are usually employed in biomedical research to derive variables less accessible from a set of related and more accessible variables. These techniques are traditionally built from systems modeling approaches including simulation, blind deconvolution, and state estimation. In this work, we define target time series (TTS) and its related time series (RTS) as the output and input of a time series estimation process, respectively. We then propose a novel data mining framework for time series estimation when TTS and RTS represent different sets of observed variables from the same dynamic system. This is made possible by mining a database of instances of TTS, its simultaneously recorded RTS, and the input/output dynamic models between them. The key mining strategy is to formulate a mapping function for each TTS-RTS pair in the database that translates a feature vector extracted from RTS to the dissimilarity between true TTS and its estimate from the dynamic model associated with the same TTS-RTS pair. At run time, a feature vector is extracted from an inquiry RTS and supplied to the mapping function associated with each TTS-RTS pair to calculate a dissimilarity measure. An optimal TTS-RTS pair is then selected by analyzing these dissimilarity measures. The associated input/output model of the selected TTS-RTS pair is then used to simulate the TTS given the inquiry RTS as an input. An exemplary implementation was built to address a biomedical problem of noninvasive intracranial pressure assessment. The performance of the proposed method was superior to that of a simple training-free approach of finding the optimal TTS-RTS pair by a conventional similarity-based search on RTS features. PMID:19900575

  5. Change detection in a time series of polarimetric SAR images

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    can be used to detect at which points changes occur in the time series. [1] T. W. Anderson, An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third edition, 2003. [2] K. Conradsen, A. A. Nielsen, J. Schou, and H. Skriver, “A test statistic in the complex Wishart distribution...

  6. Deriving dynamic marketing effectiveness from econometric time series models

    NARCIS (Netherlands)

    C. Horváth (Csilla); Ph.H.B.F. Franses (Philip Hans)

    2003-01-01

    textabstractTo understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the

  7. United States forest disturbance trends observed with landsat time series

    Science.gov (United States)

    Jeffrey G. Masek; Samuel N. Goward; Robert E. Kennedy; Warren B. Cohen; Gretchen G. Moisen; Karen Schleweiss; Chengquan. Huang

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing US land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest...

  8. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    1981-01-01

    This paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with a simple transfer model of the ARMA type (autoregressive moving average) applied...

  9. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    The paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with the application of a simple transfer model of the ARMA type (autoregressive moving average model...

  10. Daily time series evapotranspiration maps for Oklahoma and Texas panhandle

    Science.gov (United States)

    Evapotranspiration (ET) is an important process in ecosystems’ water budget and closely linked to its productivity. Therefore, regional scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...

  11. Analysis of Complex Intervention Effects in Time-Series Experiments.

    Science.gov (United States)

    Bower, Cathleen

    An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…

  12. Noise in multivariate GPS position time-series

    NARCIS (Netherlands)

    Amiri-Simkooei, A.R.

    2008-01-01

    A methodology is developed to analyze a multivariate linear model, which occurs in many geodetic and geophysical applications. Proper analysis of multivariate GPS coordinate time-series is considered to be an application. General, special, and more practical stochastic models are adopted to assess t

  13. Application of modern time series analysis to high stability oscillators

    Science.gov (United States)

    Farrell, B. F.; Mattison, W. M.; Vessot, R. F. C.

    1980-01-01

    Techniques of modern time series analysis useful for investigating the characteristics of high-stability oscillators and identifying systematic perturbations are discussed with reference to an experiment in which the frequencies of superconducting cavity-stabilized oscillators and hydrogen masers were compared. The techniques examined include transformation to stationarity, autocorrelation and cross-correlation, superresolution, and transfer function determination.

  14. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  15. Wavelet methods in (financial) time-series processing

    NARCIS (Netherlands)

    Struzik, Z.R.

    2000-01-01

    We briefly describe the major advantages of using the wavelet transform for the processing of financial time series on the example of the S&P index. In particular, we show how to uncover the local scaling (correlation) characteristics of the S&P index with the wavelet-based effective Hölder exponent
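
    As a starting point for such analyses, the sketch below computes a continuous wavelet transform of a synthetic log-price series (a stand-in for the S&P index) with PyWavelets and reads off a crude global scaling slope; the wavelet, scales and slope heuristic are assumptions, and the effective Hölder exponent computation of the paper is not reproduced.

```python
# Sketch: continuous wavelet transform of a synthetic log-price series with PyWavelets.
# The local Hoelder-exponent analysis of the paper would build on these coefficients
# across scales; only a crude global scaling slope is computed here.
import numpy as np
import pywt

rng = np.random.default_rng(0)
log_price = np.cumsum(0.01 * rng.standard_normal(2048))      # synthetic random-walk log-prices

scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(log_price, scales, "morl")          # real Morlet wavelet
mean_abs = np.abs(coeffs).mean(axis=1)                       # mean |coefficient| per scale
slope = np.polyfit(np.log(scales), np.log(mean_abs), 1)[0]   # crude global scaling indicator
print("log-log slope of |CWT| vs scale:", round(slope, 3))
```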

  16. Noise in multivariate GPS position time-series

    NARCIS (Netherlands)

    Amiri-Simkooei, A.R.

    2008-01-01

    A methodology is developed to analyze a multivariate linear model, which occurs in many geodetic and geophysical applications. Proper analysis of multivariate GPS coordinate time-series is considered to be an application. General, special, and more practical stochastic models are adopted to assess t

  17. Time Series Data Visualization in World Wide Telescope

    Science.gov (United States)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of interactive desktop tools for immersive visualization and HTML5 web-based controls that can be embedded in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.

  18. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series (light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  19. Risk bounds for time series without strong mixing

    CERN Document Server

    McDonald, Daniel J; Schervish, Mark

    2011-01-01

    We show how to control the generalization error of time series models wherein past values of the outcome are used to predict future values. The results are based on a generalization of standard IID concentration inequalities to dependent data. We show how these concentration inequalities behave under different versions of dependence to provide some intuition for our methods.

  20. Publicly Verifiable Private Aggregation of Time-Series Data

    NARCIS (Netherlands)

    Bakondi, B.G.; Peter, A.; Everts, M.H.; Hartel, P.H.; Jonker, W.

    2015-01-01

    Aggregation of time-series data offers the possibility to learn certain statistics over data periodically uploaded by different sources. In case of privacy sensitive data, it is desired to hide every data provider's individual values from the other participants (including the data aggregator). Exist

  1. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    textabstractWe advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  2. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    textabstractWe advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons simultane

  3. TAIL INDEX ESTIMATION FOR A FILTERED DEPENDENT TIME SERIES

    National Research Council Canada - National Science Library

    Jonathan B. Hill

    2015-01-01

    We prove Hill's (1975) tail index estimator is asymptotically normal when the employed data are generated by a stationary parametric time series {xt(θ0) : t ∈ ℤ} and θ0 is an unknown k × 1 vector. We assume xt(θ0...
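
    A plain NumPy sketch of the Hill (1975) estimator itself is given below, applied to simulated Pareto data rather than to filtered model residuals; the sample size and the choice of k are illustrative.

```python
# Sketch: the Hill (1975) tail-index estimator, computed from the k largest observations.
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index alpha from the k largest observations of x."""
    order = np.sort(x)[::-1]                 # descending order statistics
    logs = np.log(order[:k]) - np.log(order[k])
    return 1.0 / np.mean(logs)

rng = np.random.default_rng(0)
pareto = (1.0 - rng.uniform(size=50_000)) ** (-1.0 / 3.0)    # exact tail index alpha = 3
print("Hill estimate:", round(hill_estimator(pareto, k=1000), 2))
```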

  4. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  5. The Haar Wavelet Transform in the Time Series Similarity Paradigm

    NARCIS (Netherlands)

    Z.R. Struzik; A.P.J.M. Siebes (Arno)

    1999-01-01

    textabstractSimilarity measures play an important role in many data mining algorithms. To allow the use of such algorithms on non-standard databases, such as databases of financial time series, their similarity measure has to be defined. We present a simple and powerful technique which allows for
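
    The sketch below shows the kind of building block such a similarity measure rests on: an orthonormal Haar transform and a distance restricted to the coarsest coefficients. The specific construction in the paper differs; the series, truncation level and distance here are illustrative assumptions.

```python
# Sketch: Haar wavelet transform and a coefficient-truncated distance as a coarse
# time-series similarity measure (illustrative, not the paper's exact construction).
import numpy as np

def haar_transform(x):
    """Orthonormal Haar transform of a series whose length is a power of two."""
    x = np.asarray(x, dtype=float).copy()
    out = np.empty_like(x)
    n = len(x)
    while n > 1:
        half = n // 2
        avg = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)
        out[half:n] = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)   # detail coefficients at this level
        x[:half] = avg
        n = half
    out[0] = x[0]                                          # overall (coarsest) coefficient
    return out

def haar_distance(a, b, n_coeffs=16):
    """Euclidean distance restricted to the first (coarsest) Haar coefficients."""
    return np.linalg.norm(haar_transform(a)[:n_coeffs] - haar_transform(b)[:n_coeffs])

t = np.linspace(0, 1, 256)
s1 = np.sin(2 * np.pi * 3 * t)
s2 = np.sin(2 * np.pi * 3 * t + 0.2) + 0.05 * np.random.default_rng(0).standard_normal(256)
s3 = np.cumsum(np.random.default_rng(1).standard_normal(256)) * 0.1
print("similar pair:", round(haar_distance(s1, s2), 3),
      " dissimilar pair:", round(haar_distance(s1, s3), 3))
```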

  6. ISO 9000 Series Certification Over Time: what have we learnt?

    NARCIS (Netherlands)

    A. van der Wiele (Ton); A.M. Brown (Alan)

    2002-01-01

    textabstractThe ISO 9000 experiences of the same sample of organisations over a five year time period is examined in this paper. The responses to a questionnaire sent out at the end of 1999 to companies which had a reasonably long term experience with the ISO 9000 series quality system are analysed.

  7. What Makes a Coursebook Series Stand the Test of Time?

    Science.gov (United States)

    Illes, Eva

    2009-01-01

    Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…

  8. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regimeswitching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...

  9. A test of conditional heteroscedasticity in time series

    Institute of Scientific and Technical Information of China (English)

    陈敏; 安鸿志

    1999-01-01

    A new test of conditional heteroscedasticity for time series is proposed. The new testing method is based on a goodness-of-fit-type test statistic and a Cramer-von Mises-type test statistic. The asymptotic properties of the new test statistic are established. The results demonstrate that such a test is consistent.
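
    As a baseline for comparison only (it is not the Cramer-von Mises-type statistic proposed in the paper), the sketch below implements the classical Engle LM test for ARCH effects and applies it to a simulated ARCH(1) series and to i.i.d. noise; all parameter choices are illustrative.

```python
# Baseline illustration: the classical Engle LM test for conditional heteroscedasticity
# (ARCH effects). Not the statistic proposed in the paper.
import numpy as np
from scipy import stats

def arch_lm_test(e, q=5):
    """Regress e_t^2 on q of its own lags; LM = n * R^2 ~ chi2(q) under the null."""
    u = e ** 2
    y = u[q:]
    X = np.column_stack([np.ones(len(y))] + [u[q - j - 1:-j - 1] for j in range(q)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    lm = len(y) * (1 - resid.var() / y.var())
    return lm, stats.chi2.sf(lm, q)

rng = np.random.default_rng(0)
n = 2000
e = np.zeros(n)
z = rng.standard_normal(n)
for t in range(1, n):
    e[t] = z[t] * np.sqrt(0.2 + 0.6 * e[t - 1] ** 2)   # ARCH(1) process
print("ARCH(1) series (LM, p-value):", arch_lm_test(e))
print("i.i.d. noise    (LM, p-value):", arch_lm_test(rng.standard_normal(n)))
```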

  10. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  11. Metagenomics meets time series analysis: unraveling microbial community dynamics.

    Science.gov (United States)

    Faust, Karoline; Lahti, Leo; Gonze, Didier; de Vos, Willem M; Raes, Jeroen

    2015-06-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic patterns, help to build predictive models or, on the contrary, quantify irregularities that make community behavior unpredictable. Microbial communities can change abruptly in response to small perturbations, linked to changing conditions or the presence of multiple stable states. With sufficient samples or time points, such alternative states can be detected. In addition, temporal variation of microbial interactions can be captured with time-varying networks. Here, we apply these techniques on multiple longitudinal datasets to illustrate their potential for microbiome research.

  12. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets. Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  13. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  14. Time-series analysis of Music: Perceptual and Information Dynamics

    OpenAIRE

    Pearce, Marcus T.

    2011-01-01

    Dean and Bailes (2010) provide a tutorial on the use of time-series analysis in research on music perception and a study of the influence of acoustic factors on real-time perception of music. They illustrate their approach with a detailed case study of an electroacoustic composition by Trevor Wishart. In this commentary, I discuss four aspects of Dean and Bailes’ presentation: first, the importance of focusing on dynamic changes in musical structure; second, the benefits of computer-generated...

  15. Modelling, simulation and inference for multivariate time series of counts

    OpenAIRE

    Veraart, Almut E. D.

    2016-01-01

    This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise - called a trawl process - where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...

  16. Statistical Analysis of Time Series Data (STATS). Users Manual (Preliminary)

    Science.gov (United States)

    1987-05-01

    (The available abstract is a garbled scan of the manual's fixed-format input description. The recoverable fragments describe input-record fields such as JEND, the order number of the last period in the time series to select for analysis (the last period is assumed if blank), NPRDS, the actual number of periods for the event on the following IN records, and JPPF, a plotting option; analysis periods of 15, 30, 60, 90, 120, and 183 days are presently used.)

  17. Improving predictability of time series using maximum entropy methods

    Science.gov (United States)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  18. Fast computation of recurrences in long time series

    Science.gov (United States)

    Rawald, Tobias; Sips, Mike; Marwan, Norbert; Dransch, Doris

    2014-05-01

    The quadratic time complexity of calculating basic RQA measures (doubling the size of the input time series quadruples the number of operations) impairs the fast computation of RQA in many application scenarios. As an example, we analyze the Potsdamer Reihe, an ongoing non-interrupted hourly temperature profile since 1893, consisting of 1,043,112 data points. Using an optimized single-threaded CPU implementation this analysis requires about six hours. Our approach conducts RQA for the Potsdamer Reihe in five minutes. We automatically split a long time series into smaller chunks (Divide) and distribute the computation of RQA measures across multiple GPU devices. To guarantee valid RQA results, we employ carryover buffers that allow sharing information between pairs of chunks (Recombine). We demonstrate the capabilities of our Divide and Recombine approach to process long time series by comparing the runtime of our implementation to existing RQA tools. We support a variety of platforms by employing the computing framework OpenCL. Our current implementation supports the computation of standard RQA measures (recurrence rate, determinism, laminarity, ratio, average diagonal line length, trapping time, longest diagonal line, longest vertical line, divergence, entropy, trend) and also calculates recurrence times. To utilize the potential of our approach for a number of applications, we plan to release our implementation under an Open Source software license. It will be available at http://www.gfz-potsdam.de/fast-rqa/. Since our approach allows RQA measures for a long time series to be computed quickly, we plan to extend our implementation to support multi-scale RQA.
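
    The divide step can be conveyed with a single-threaded NumPy sketch that computes only the recurrence rate chunk by chunk, never materialising the full recurrence matrix; the chunk size, threshold and synthetic series are assumptions, and the carryover buffers needed for line-based measures (as well as the GPU/OpenCL distribution) are not reproduced.

```python
# Sketch of the "divide" idea for the recurrence rate only: pairs of chunks are compared
# one block at a time, so the full recurrence matrix is never held in memory.
import numpy as np

def recurrence_rate_chunked(x, eps, chunk=2000):
    n, recurrent_pairs = len(x), 0
    for i in range(0, n, chunk):
        a = x[i:i + chunk]
        for j in range(0, n, chunk):
            b = x[j:j + chunk]
            recurrent_pairs += np.count_nonzero(np.abs(a[:, None] - b[None, :]) < eps)
    return recurrent_pairs / (n * n)

rng = np.random.default_rng(0)
temperature_like = np.cumsum(rng.standard_normal(20_000)) * 0.05   # long synthetic series
print("recurrence rate:", round(recurrence_rate_chunked(temperature_like, eps=0.5), 4))
```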

  19. Wavelet analysis on paleomagnetic (and computer simulated) VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    Full Text Available We present Continuous Wavelet Transform (CWT) data analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic and geodynamo simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra and the local analysis helped disclose their statistical significance. Even if this feature could be an indication of orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of the excursions in the series. The comparison among the paleomagnetic and simulated spectra shows a similarity in the high frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to have a guide in understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normality of the series and its inherent noise. The relative chronology found for the paleomagnetic data reversals was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of polarity stability intervals are found similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.

  20. Autoregression of Quasi-Stationary Time Series (Invited)

    Science.gov (United States)

    Meier, T. M.; Küperkoch, L.

    2009-12-01

    Autoregression is a model-based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we briefly review methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison to results obtained by Fourier transform based methods and the visualization of the time dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseisms and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio. Results are compared to manual readings.
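
    A toy version of the moving-window idea is sketched below: an AR(2) model is fitted by the Yule-Walker equations in each window of a noisy chirp and the dominant frequency is read off the pole angle. Window length, model order and the test signal are assumptions; onset detection and the multi-component extension are not reproduced.

```python
# Sketch: time-dependent autoregression in a moving window. AR(2) fitted by Yule-Walker,
# dominant frequency taken from the pole angle (all choices illustrative).
import numpy as np

def ar2_yule_walker(x):
    x = x - x.mean()
    r = np.array([np.sum(x[k:] * x[:len(x) - k]) for k in range(3)]) / len(x)
    return np.linalg.solve([[r[0], r[1]], [r[1], r[0]]], [r[1], r[2]])   # AR coefficients

def dominant_frequency(a, fs=1.0):
    roots = np.roots([1.0, -a[0], -a[1]])               # poles of the AR(2) model
    return np.abs(np.angle(roots[0])) * fs / (2 * np.pi)

fs, n = 100.0, 3000
t = np.arange(n) / fs
chirp = np.sin(2 * np.pi * (5 * t + 2.5 * t ** 2 / t[-1]))   # frequency drifts from 5 to 10 Hz
signal = chirp + 0.2 * np.random.default_rng(0).standard_normal(n)

window, step = 300, 150
for start in range(0, n - window + 1, step):
    seg = signal[start:start + window]
    f = dominant_frequency(ar2_yule_walker(seg), fs)
    print(f"t = {start / fs:5.1f} s   dominant frequency = {f:5.2f} Hz")
```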

  1. Near Real Time Prospecting for Lunar Volatiles: Demonstrating RESOLVE Science in the Field

    Science.gov (United States)

    Elphic, Richard; Colaprete, Anthony; Heldmann, Jennifer; Mattes, Gregory W.; Ennico, Kimberly; Sanders, Gerald; Quinn, Jacqueline; Tegnerud, Erin Leigh; Marinova, Margarita; Larson, William E.; Picard, Martin; Morse, Stephanie

    2012-01-01

    The Regolith and Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) project aims to demonstrate the utility of "in situ resource utilization". In situ resource utilization (ISRU) is a way to rebalance the economics of spaceflight by reducing or eliminating materials that must be brought up from Earth and placed on the surface of the Moon for human use. RESOLVE is developing a rover-borne payload that (1) can locate near subsurface volatiles, (2) excavate and analyze samples of the volatile-bearing regolith, and (3) demonstrate the form, extractability and usefulness of the materials. Such investigations are important not only for ISRU but are also critically important for understanding the scientific nature of these intriguing lunar polar volatile deposits. Temperature models and orbital data suggest near surface volatile concentrations may exist at briefly lit lunar polar locations outside persistently shadowed regions. A lunar rover could be remotely operated at some of these locations for the 4-7 days of expected sunlight at relatively low cost. In July 2012 the RESOLVE project conducted a full-scale field demonstration. In particular, the ability to perform the real-time measurement analysis necessary to search for volatiles and the ability to combine the various measurement techniques to meet the mission measurement and science goals. With help from the Pacific International Space Center for Exploration Systems (PISCES), a lunar rover prototype (provided by the Canadian Space Agency) was equipped with prospecting instruments (neutron spectrometer and near-infrared spectrometer), subsurface access and sampling tools, including both an auger and coring drill (provided by CSA) and subsurface sample analysis instrumentation, including a sample oven system, the Oxygen and Volatile Extraction Node (OVEN), and Gas Chromatograph / Mass Spectrometer system, the Lunar Advanced Volatile Analysis (LAVA) system. Given the relatively short time period this

  2. Near Real-Time Prospecting for Lunar Volatiles: Demonstrating RESOLVE Science in the Field

    Science.gov (United States)

    Elphic, R. C.; Colaprete, A.; Heldmann, J. L.; Mattes, G.; Ennico, K.; Sanders, G. B.; Quinn, J.; Fritzler, E.; Marinova, M.; Roush, T. L.; Stoker, C.; Larson, W.; Picard, M.; McMurray, R.; Morse, S.

    2012-12-01

    The Regolith and Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) project aims to demonstrate the utility of "in situ resource utilization". In situ resource utilization (ISRU) is a way to rebalance the economics of spaceflight by reducing or eliminating materials that must be brought up from Earth and placed on the surface of the Moon for human use. RESOLVE is developing a rover-borne payload that (1) can locate near subsurface volatiles, (2) excavate and analyze samples of the volatile-bearing regolith, and (3) demonstrate the form, extractability and usefulness of the materials. Such investigations are important not only for ISRU but are also critically important for understanding the scientific nature of these intriguing lunar polar volatile deposits. Temperature models and orbital data suggest near surface volatile concentrations may exist at briefly lit lunar polar locations outside persistently shadowed regions. A lunar rover could be remotely operated at some of these locations for the 4-7 days of expected sunlight at relatively low cost. In July 2012 the RESOLVE project conducted a full-scale field demonstration. In particular, the ability to perform the real-time measurement analysis necessary to search for volatiles and the ability to combine the various measurement techniques to meet the mission measurement and science goals. With help from the Pacific International Space Center for Exploration Systems (PISCES), a lunar rover prototype (provided by the Canadian Space Agency) was equipped with prospecting instruments (neutron spectrometer and near-infrared spectrometer), subsurface access and sampling tools, including both an auger and coring drill (provided by CSA) and subsurface sample analysis instrumentation, including a sample oven system, the Oxygen and Volatile Extraction Node (OVEN), and Gas Chromatograph / Mass Spectrometer system, the Lunar Advanced Volatile Analysis (LAVA) system. Given the relatively short time period this

  3. Time series data mining for the Gaia variability analysis

    CERN Document Server

    Nienartowicz, Krzysztof; Guy, Leanne; Holl, Berry; Lecoeur-Taïbi, Isabelle; Mowlavi, Nami; Rimoldini, Lorenzo; Ruiz, Idoia; Süveges, Maria; Eyer, Laurent

    2014-01-01

    Gaia is an ESA cornerstone mission, which was successfully launched December 2013 and commenced operations in July 2014. Within the Gaia Data Processing and Analysis consortium, Coordination Unit 7 (CU7) is responsible for the variability analysis of over a billion celestial sources and nearly 4 billion associated time series (photometric, spectrophotometric, and spectroscopic), encoding information in over 800 billion observations during the 5 years of the mission, resulting in a petabyte scale analytical problem. In this article, we briefly describe the solutions we developed to address the challenges of time series variability analysis: from the structure for a distributed data-oriented scientific collaboration to architectural choices and specific components used. Our approach is based on Open Source components with a distributed, partitioned database as the core to handle incrementally: ingestion, distributed processing, analysis, results and export in a constrained time window.

  4. Cross recurrence plot based synchronization of time series

    Directory of Open Access Journals (Sweden)

    N. Marwan

    2002-01-01

    Full Text Available The method of recurrence plots is extended to the cross recurrence plots (CRP), which, among other things, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametric fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.
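
    A minimal sketch of the construction is given below: the cross recurrence matrix of a series and a time-distorted copy of it, with a crude trace of the line of synchronization (LOS). The signals, threshold and nearest-point trace are assumptions; the non-parametric LOS fit of the paper is not reproduced.

```python
# Sketch: cross recurrence plot (CRP) of a series and a time-distorted copy of it, with a
# crude trace of the line of synchronization (not the paper's non-parametric LOS fit).
import numpy as np

rng = np.random.default_rng(0)
n = 1000
t = np.linspace(0.0, 1.0, n)
x = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(n)
y = np.interp(t ** 1.4, t, x)                     # same signal sampled on a distorted time axis

eps = 0.2
crp = np.abs(x[:, None] - y[None, :]) < eps       # cross recurrence matrix

los = np.full(n, -1)
for i in range(n):                                # crude LOS: nearest recurrent column per row
    cols = np.flatnonzero(crp[i])
    if cols.size:
        los[i] = cols[np.argmin(np.abs(cols - i))]
valid = los >= 0
offset = np.abs(los[valid] - np.arange(n)[valid])
print("mean |LOS offset from the main diagonal| in samples:", round(offset.mean(), 1))
```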

  5. A comprehensive characterization of recurrences in time series

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    Study of recurrences in earthquakes, climate, financial time-series, etc. is crucial to better forecast disasters and limit their consequences. However, almost all the previous phenomenological studies involved only a long-ranged autocorrelation function, or disregarded the multi-scaling properties induced by potential higher order dependencies. Consequently, they missed the facts that non-linear dependences do impact both the statistics and dynamics of recurrence times, and that scaling arguments for the unconditional distribution may not be applicable. We argue that copulas are the correct model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system specific, and cannot actually benefit from universal features, in contrast to previous claims. This has important implications in epilepsy prognosis and financial risk management applications.

  6. A multivariate heuristic model for fuzzy time-series forecasting.

    Science.gov (United States)

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

    Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  7. Cross Recurrence Plot Based Synchronization of Time Series

    CERN Document Server

    Marwan, N; Nowaczyk, N R

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP), which, among other things, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametric fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is, e.g., compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.

  8. Adaptively Sharing Time-Series with Differential Privacy

    CERN Document Server

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestions. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confir...

  9. Minimum Entropy Density Method for the Time Series Analysis

    CERN Document Server

    Lee, J W; Moon, H T; Park, J B; Yang, J S; Jo, Hang-Hyun; Lee, Jeong Won; Moon, Hie-Tae; Park, Joongwoo Brian; Yang, Jae-Suk

    2006-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the most correlated time interval of a given time series and define the effective delay of information (EDI) as the correlation length that minimizes the entropy density in relation to the velocity of information flow. The MEDM is applied to the financial time series of the Standard and Poor's 500 (S&P500) index from February 1983 to April 2006. It is found that the EDI of the S&P500 index has decreased over the last twenty years, which suggests that U.S. market dynamics have moved closer to the efficient market hypothesis.

  10. Time-series analysis of Music: Perceptual and Information Dynamics

    Directory of Open Access Journals (Sweden)

    Marcus T. Pearce

    2011-12-01

    Full Text Available Dean and Bailes (2010) provide a tutorial on the use of time-series analysis in research on music perception and a study of the influence of acoustic factors on real-time perception of music. They illustrate their approach with a detailed case study of an electroacoustic composition by Trevor Wishart. In this commentary, I discuss four aspects of Dean and Bailes’ presentation: first, the importance of focusing on dynamic changes in musical structure; second, the benefits of computer-generated music for research on music perception; third, the need for caution in averaging responses from multiple listeners; and finally, the role of time-series analysis in understanding computational information-dynamic models of music cognition.

  11. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  12. Learning dynamics from nonstationary time series: Analysis of electroencephalograms

    Science.gov (United States)

    Gribkov, Dmitrii; Gribkova, Valentina

    2000-06-01

    We propose an empirical modeling technique for nonstationary time series analysis. The proposed methods include the construction of a high-dimensional (N>3) dynamical model in the form of delay differential equations, a nonparametric method for calculating the respective time delays, the detection of quasistationary regions of the process by recurrence analysis in the space of model coefficients, and the final fitting of the model to quasistationary segments of the observed time series. We also demonstrate the effectiveness of our approach for nonstationary signal classification in the space of model coefficients. Applying the empirical modeling technique to electroencephalogram (EEG) records, we find evidence of high-dimensional nonlinear dynamics in quasistationary EEG segments. Recurrence analysis of model parameters reveals long-term correlations in nonstationary EEG records. Using the dynamical model as a nonlinear filter, we find that different emotional states of subjects can be clearly distinguished in the space of model coefficients.

  13. Reconstruction of ensembles of coupled time-delay systems from time series.

    Science.gov (United States)

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  14. Building Real-Time Network Intrusion Detection System Based on Parallel Time-Series Mining Techniques

    Institute of Scientific and Technical Information of China (English)

    Zhao Feng; Li Qinghua

    2005-01-01

    A new real-time model based on parallel time-series mining is proposed to improve the accuracy and efficiency of network intrusion detection systems. In this model, a multidimensional dataset is constructed to describe network events, and a sliding-window updating algorithm is used to maintain the network stream. Moreover, parallel frequent-pattern and frequent-episode mining algorithms are applied to implement a parallel time-series mining engine which can intelligently generate rules to distinguish intrusions from normal activities. Analysis and study on the basis of DAWNING 3000 indicate that this parallel time-series mining-based model provides a more accurate and efficient way to build real-time NIDS.

  15. Reconstruction of ensembles of coupled time-delay systems from time series

    Science.gov (United States)

    Sysoev, I. V.; Prokhorov, M. D.; Ponomarenko, V. I.; Bezruchko, B. P.

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  16. Measures of Analysis of Time Series (MATS): A MATLAB Toolkit for Computation of Multiple Measures on Time Series Data Bases

    Directory of Open Access Journals (Sweden)

    Dimitris Kugiumtzis

    2010-02-01

    In many applications, such as physiology and finance, large time series databases have to be analyzed, requiring the computation of linear, nonlinear and other measures. Such measures have been developed and implemented in commercial and freeware software rather selectively and independently. The Measures of Analysis of Time Series (MATS) MATLAB toolkit is designed to handle an arbitrarily large set of scalar time series and to compute a large variety of measures on them, allowing for the specification of varying measure parameters as well. The variety of options, with added facilities for visualization of the results, supports different settings of time series analysis, such as the detection of dynamics changes in long data records, resampling (surrogate or bootstrap) tests for independence and linearity with various test statistics, and the discrimination power of different measures for different combinations of their parameters. The basic features of MATS are presented and the implemented measures are briefly described. The usefulness of MATS is illustrated on some empirical examples along with screenshots.

  17. Interpolation based consensus clustering for gene expression time series.

    Science.gov (United States)

    Chiu, Tai-Yu; Hsu, Ting-Chieh; Yen, Chia-Cheng; Wang, Jia-Shung

    2015-04-16

    Unsupervised analyses such as clustering are the essential tools required to interpret time-series expression data from microarrays. Several clustering algorithms have been developed to analyze gene expression data. Early methods such as k-means, hierarchical clustering, and self-organizing maps are popular for their simplicity. However, because of noise and uncertainty of measurement, these common algorithms have low accuracy. Moreover, because gene expression is a temporal process, the relationship between successive time points should be considered in the analyses. In addition, biological processes are generally continuous; therefore, the datasets collected from time series experiments are often found to have an insufficient number of data points and, as a result, compensation for missing data can also be an issue. An affinity propagation-based clustering algorithm for time-series gene expression data is proposed. The algorithm explores the relationship between genes using a sliding-window mechanism to extract a large number of features. In addition, the time-course datasets are resampled with spline interpolation to predict the unobserved values. Finally, a consensus process is applied to enhance the robustness of the method. Some real gene expression datasets were analyzed to demonstrate the accuracy and efficiency of the algorithm. The proposed algorithm has benefitted from the use of cubic B-splines interpolation, sliding-window, affinity propagation, gene relativity graph, and a consensus process, and, as a result, provides both appropriate and effective clustering of time-series gene expression data. The proposed method was tested with gene expression data from the Yeast galactose dataset, the Yeast cell-cycle dataset (Y5), and the Yeast sporulation dataset, and the results illustrated the relationships between the expressed genes, which may give some insights into the biological processes involved.
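
    A minimal sketch of the interpolation-plus-clustering core, assuming scipy and scikit-learn and using synthetic expression profiles; the published pipeline additionally uses sliding-window features, a gene relativity graph, and a consensus step, none of which are shown here.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from sklearn.cluster import AffinityPropagation

    rng = np.random.default_rng(0)
    timepoints = np.array([0.0, 2, 4, 8, 12, 24])            # hours, few and unevenly spaced
    template_a = np.sin(timepoints / 24 * np.pi)              # two synthetic expression shapes
    template_b = np.exp(-timepoints / 8)
    expr = np.vstack([t + 0.15 * rng.normal(size=timepoints.size)
                      for t in [template_a] * 25 + [template_b] * 25])   # 50 genes x 6 points

    # resample each gene's profile onto a dense grid with cubic-spline interpolation
    dense_t = np.linspace(timepoints[0], timepoints[-1], 49)
    dense = np.vstack([CubicSpline(timepoints, g)(dense_t) for g in expr])

    # cluster the interpolated profiles; affinity propagation picks the number of clusters itself
    labels = AffinityPropagation(damping=0.9, random_state=0).fit_predict(dense)
    ```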

  18. Exploring large scale time-series data using nested timelines

    Science.gov (United States)

    Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.

    2013-01-01

    When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding changes of multivariate patterns, such as linear trends and clusters, in time-series data. In this paper, we describe a set of history views to fill this gap. This technique works in two modes: merge and non-merge. In the merge mode, merge algorithms were applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging, trading off more detail in the data against less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or ones defined by domain experts to represent timelines. This can help users navigate across long time periods. Grid-based views were designed to provide a compact overview of the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished the assigned tasks with high accuracy and relatively fast response times.

  19. Assessing spatial covariance among time series of abundance.

    Science.gov (United States)

    Jorgensen, Jeffrey C; Ward, Eric J; Scheuerell, Mark D; Zabel, Richard W

    2016-04-01

    For species of conservation concern, an essential part of the recovery planning process is identifying discrete population units and their location with respect to one another. A common feature among geographically proximate populations is that the number of organisms tends to covary through time as a consequence of similar responses to exogenous influences. In turn, high covariation among populations can threaten the persistence of the larger metapopulation. Historically, explorations of the covariance in population size of species with many (>10) time series have been computationally difficult. Here, we illustrate how dynamic factor analysis (DFA) can be used to characterize diversity among time series of population abundances and the degree to which all populations can be represented by a few common signals. Our application focuses on anadromous Chinook salmon (Oncorhynchus tshawytscha), a species listed under the US Endangered Species Act, that is impacted by a variety of natural and anthropogenic factors. Specifically, we fit DFA models to 24 time series of population abundance and used model selection to identify the minimum number of latent variables that explained the most temporal variation after accounting for the effects of environmental covariates. We found support for grouping the time series according to 5 common latent variables. The top model included two covariates: the Pacific Decadal Oscillation in spring and summer. The assignment of populations to the latent variables matched the currently established population structure at a broad spatial scale. At a finer scale, there was more population grouping complexity. Some relatively distant populations were grouped together, and some relatively close populations - considered to be more aligned with each other - were more associated with populations further away. These coarse- and fine-grained examinations of spatial structure are important because they reveal different structural patterns not evident

  20. FTSPlot: fast time series visualization for large datasets.

    Directory of Open Access Journals (Sweden)

    Michael Riss

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, a visualization concept for large-scale time series datasets using techniques from the field of high-performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n·log(N)); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, events, and interval annotations lag-free with < 20 ms delay. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes; on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes / 1 TiB or 1.3 x 10^11 double-precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
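
    FTSPlot itself is a C++/Qt component with out-of-core storage; the following numpy sketch only illustrates the hierarchic level-of-detail idea behind it, precomputing per-block (min, max) summaries so that drawing a zoomed-out view touches a few hundred values instead of millions.

    ```python
    import numpy as np

    def minmax_summary(x, block):
        """Per-block (min, max) pairs: the classic level-of-detail trick for plotting
        huge series, since a block's vertical extent is all a screen pixel can show."""
        n = (len(x) // block) * block
        b = np.asarray(x[:n], dtype=float).reshape(-1, block)
        return np.stack([b.min(axis=1), b.max(axis=1)], axis=1)

    signal = np.random.default_rng(0).normal(size=1_000_000)
    # precompute a pyramid of levels once; pick the level matching the zoom at draw time
    pyramid = {block: minmax_summary(signal, block) for block in (16, 256, 4096, 65536)}
    overview = pyramid[4096]   # ~244 (min, max) pairs stand in for a million samples
    ```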

  1. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, during 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region, and the energy of each mode is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method. For each region, the energy of each mode is normalised by the total energy over all the modes extracted by EMD.
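
    A pixel-level sketch of the workflow on a synthetic weekly series, assuming the third-party PyEMD package provides the EMD implementation (an assumption, not something the authors state); Hilbert spectral analysis is omitted and the per-mode energy is computed directly from the modes.

    ```python
    import numpy as np
    from PyEMD import EMD      # assumption: the PyEMD ("EMD-signal") package is installed

    rng = np.random.default_rng(0)
    t = np.arange(458) / 52.0                      # roughly 10 years of weekly time steps
    spm = 5 + 0.3 * t + np.sin(2 * np.pi * t) + 0.5 * rng.normal(size=t.size)  # synthetic pixel

    imfs = EMD().emd(spm, t)                       # intrinsic mode functions, fastest first
    trend = imfs[-1]                               # slowest mode/residue, commonly read as the trend
    energy = np.array([np.sum(m ** 2) for m in imfs])
    norm_energy = energy / energy.sum()            # per-mode share of the total energy
    ```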

  2. Inverse problem for multivariate time series using dynamical latent variables

    Science.gov (United States)

    Zamparo, M.; Stramaglia, S.; Banavar, J. R.; Maritan, A.

    2012-06-01

    Factor analysis is a well known statistical method to describe the variability among observed variables in terms of a smaller number of unobserved latent variables called factors. While dealing with multivariate time series, the temporal correlation structure of data may be modeled by including correlations in latent factors, but a crucial choice is the covariance function to be implemented. We show that analyzing multivariate time series in terms of latent Gaussian processes, which are mutually independent but with each of them being characterized by exponentially decaying temporal correlations, leads to an efficient implementation of the expectation-maximization algorithm for the maximum likelihood estimation of parameters, due to the properties of block-tridiagonal matrices. The proposed approach solves an ambiguity known as the identifiability problem, which renders the solution of factor analysis determined only up to an orthogonal transformation. Samples with just two temporal points are sufficient for the parameter estimation: hence the proposed approach may be applied even in the absence of prior information about the correlation structure of latent variables by fitting the model to pairs of points with varying time delay. Our modeling allows one to make predictions of the future values of time series and we illustrate our method by applying it to an analysis of published gene expression data from cell culture HeLa.

  3. Copulas and time series with long-ranged dependences

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    We review ideas on temporal dependences and recurrences in discrete time series from several areas of natural and social sciences. We revisit existing studies and redefine the relevant observables in the language of copulas (joint laws of the ranks). We propose that copulas provide an appropriate mathematical framework to study non-linear time dependences and related concepts - like aftershocks, the Omori law, recurrences, and waiting times. Using this global approach, we also argue critically that previous phenomenological attempts involving only a long-ranged autocorrelation function lacked complexity, in that they were essentially mono-scale.

  4. West Africa land use and land cover time series

    Science.gov (United States)

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  5. Modeling Large Time Series for Efficient Approximate Query Processing

    DEFF Research Database (Denmark)

    Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang

    2015-01-01

    Evolving customer requirements and increasing competition force business organizations to store increasing amounts of data and query them for information at any given time. Due to the current growth of data volumes, timely extraction of relevant information becomes more and more difficult...... these issues, compression techniques have been introduced in many areas of data processing. In this paper, we outline a new system that does not query complete datasets but instead utilizes models to extract the requested information. For time series data we use Fourier and Cosine transformations and piece...
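
    A rough illustration of the model-based idea with a plain Fourier model, assuming numpy: keep only the k largest-magnitude DFT coefficients as the stored model and answer a range aggregate from the reconstruction rather than from the raw data. The actual system also uses Cosine transforms and piecewise models inside a query engine.

    ```python
    import numpy as np

    def fourier_model(x, k):
        """Keep only the k largest-magnitude DFT coefficients as a compact model of x."""
        coeffs = np.fft.rfft(x)
        keep = np.argsort(np.abs(coeffs))[-k:]
        model = np.zeros_like(coeffs)
        model[keep] = coeffs[keep]
        return model

    def reconstruct(model, n):
        return np.fft.irfft(model, n=n)

    rng = np.random.default_rng(0)
    series = np.sin(np.linspace(0, 20 * np.pi, 2000)) + 0.1 * rng.normal(size=2000)
    model = fourier_model(series, k=12)            # 12 complex numbers instead of 2000 samples

    # answer an aggregate query (mean over a range) from the model, not the raw data
    approx = reconstruct(model, len(series))
    est_mean = approx[500:800].mean()
    true_mean = series[500:800].mean()
    ```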

  6. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
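
    A minimal sketch of the most common portmanteau diagnostic, the Ljung-Box test, applied to the residuals of a fitted AR(1) model, assuming the statsmodels package; the paper itself compares several portmanteau statistics by simulation.

    ```python
    import numpy as np
    from statsmodels.stats.diagnostic import acorr_ljungbox
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    y = np.zeros(500)
    for t in range(1, 500):                         # simulate an AR(1) process
        y[t] = 0.6 * y[t - 1] + rng.normal()

    resid = ARIMA(y, order=(1, 0, 0)).fit().resid   # residuals of the fitted AR(1) model
    # Ljung-Box portmanteau statistic at several lags; large p-values indicate
    # no remaining autocorrelation, i.e. the model passes the diagnostic check
    print(acorr_ljungbox(resid, lags=[5, 10, 20]))
    ```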

  7. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise in time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.

  8. Analyzing single-molecule time series via nonparametric Bayesian inference.

    Science.gov (United States)

    Hines, Keegan E; Bankston, John R; Aldrich, Richard W

    2015-02-03

    The ability to measure the properties of proteins at the single-molecule level offers an unparalleled glimpse into biological systems at the molecular scale. The interpretation of single-molecule time series has often been rooted in statistical mechanics and the theory of Markov processes. While existing analysis methods have been useful, they are not without significant limitations including problems of model selection and parameter nonidentifiability. To address these challenges, we introduce the use of nonparametric Bayesian inference for the analysis of single-molecule time series. These methods provide a flexible way to extract structure from data instead of assuming models beforehand. We demonstrate these methods with applications to several diverse settings in single-molecule biophysics. This approach provides a well-constrained and rigorously grounded method for determining the number of biophysical states underlying single-molecule data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  9. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  10. Radial basis function network design for chaotic time series prediction

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Chang Yong; Kim, Taek Soo; Park, Sang Hui [Yonsei University, Seoul (Korea, Republic of); Choi, Yoon Ho [Kyonggi University, Suwon (Korea, Republic of)

    1996-04-01

    In this paper, radial basis function networks with two hidden layers, which employ the K-means clustering method and hierarchical training, are proposed for improving the short-term predictability of chaotic time series. Furthermore, a recursive training method for the radial basis function network, using the recursive modified Gram-Schmidt algorithm, is proposed for this purpose. In addition, the radial basis function networks trained by the proposed training methods are compared with the model of X. D. He and A. Lapedes and with the radial basis function network trained by the non-recursive method. Through this comparison, an improved radial basis function network for predicting chaotic time series is presented. (author). 17 refs., 8 figs., 3 tabs.
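
    A simplified single-hidden-layer sketch (the paper's networks have two hidden layers and a recursive Gram-Schmidt readout, not reproduced here): K-means picks the RBF centres and a ridge regression fits the output weights on a delay embedding of a chaotic logistic-map series. Package choices and parameters are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import Ridge

    # chaotic logistic-map series as a stand-in for the benchmark data
    x = np.empty(1200)
    x[0] = 0.4
    for t in range(1199):
        x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

    d = 3                                               # embedding dimension
    X = np.column_stack([x[i:len(x) - d + i] for i in range(d)])
    y = x[d:]                                           # one-step-ahead targets

    # hidden layer: K-means picks the RBF centres, a common width from the centre spread
    centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X[:1000]).cluster_centers_
    width = np.mean(np.linalg.norm(centres[:, None] - centres[None, :], axis=-1)) + 1e-9

    def rbf_features(Z):
        dist = np.linalg.norm(Z[:, None, :] - centres[None, :, :], axis=-1)
        return np.exp(-(dist / width) ** 2)

    # linear readout fitted by ridge regression (a substitute for the recursive training)
    readout = Ridge(alpha=1e-6).fit(rbf_features(X[:1000]), y[:1000])
    pred = readout.predict(rbf_features(X[1000:]))
    rmse = np.sqrt(np.mean((pred - y[1000:]) ** 2))
    ```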

  11. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society, as we are faced with streams of data coming from numerous sensors, data feeds, and recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge of developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  12. Model-Coupled Autoencoder for Time Series Visualisation

    CERN Document Server

    Gianniotis, Nikolaos; Tiňo, Peter; Polsterer, Kai L

    2016-01-01

    We present an approach for the visualisation of a set of time series that combines an echo state network with an autoencoder. For each time series in the dataset we train an echo state network, using a common and fixed reservoir of hidden neurons, and use the optimised readout weights as the new representation. Dimensionality reduction is then performed via an autoencoder on the readout weight representations. The crux of the work is to equip the autoencoder with a loss function that correctly interprets the reconstructed readout weights by associating them with a reconstruction error measured in the data space of sequences. This essentially amounts to measuring the predictive performance that the reconstructed readout weights exhibit on their corresponding sequences when plugged back into the echo state network with the same fixed reservoir. We demonstrate that the proposed visualisation framework can deal both with real valued sequences as well as binary sequences. We derive magnification factors in order t...

  13. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
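
    A rough Python analogue of the three model families on synthetic panel data, assuming pandas and statsmodels; the paper's own estimation uses GLS with cross-section weights and panel-corrected standard errors in EViews 8, which is not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    # synthetic stand-in: 40 countries x 56 years of a rate and one covariate
    rows = []
    for c in [f"c{i}" for i in range(40)]:
        effect = rng.normal()                       # country-specific intercept shift
        for year in range(1950, 2006):
            x = rng.normal()
            rows.append({"country": c, "year": year, "x": x,
                         "rate": 2 + effect + 0.5 * x + rng.normal()})
    df = pd.DataFrame(rows)

    pooled = smf.ols("rate ~ x", data=df).fit()                    # (1) completely pooled
    fixed = smf.ols("rate ~ x + C(country)", data=df).fit()        # (2) unit fixed effects
    # (3) a multi-level alternative: random intercepts per country
    mixed = smf.mixedlm("rate ~ x", data=df, groups=df["country"]).fit()
    ```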

  14. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contrast to this very slowly vanishing generalization error, the next-output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  15. FORECASTING INFLATION RATES WITH HIGH ORDER FUZZY TIME SERIES APPROACH

    Directory of Open Access Journals (Sweden)

    VEDİDE REZAN USLU

    2013-06-01

    Obtaining inflation forecasts is an important economic issue: the more accurate the forecasts, the more precise the decisions based on them. The Central Bank reports inflation rates at certain periods of every year, and these reports present the results of an inflation expectation survey. In this study we use an approach in which the relationship in a high-order fuzzy time series model is determined by an artificial neural network. The time series of the consumer price index is estimated both by the artificial neural network based method and by some fuzzy approaches that are common in the literature. The results are compared, in terms of forecast accuracy, with the results of the inflation expectation survey conducted by the Central Bank of the Republic of Turkey.

  16. TESTING FOR OUTLIERS IN TIME SERIES USING WAVELETS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tong; ZHANG Xibin; ZHANG Shiying

    2003-01-01

    One remarkable feature of the wavelet decomposition is that the wavelet coefficients are localized: any singularity in the input signal can only affect the wavelet coefficients near the singularity. This localization property allows us to identify singularities in the input signal by studying the wavelet coefficients at different resolution levels. This paper considers wavelet-based approaches for the detection of outliers in time series. Outliers are high-frequency phenomena and are associated with wavelet coefficients of large absolute value at different resolution levels. On the basis of the first-level wavelet coefficients, this paper presents a diagnostic to identify outliers in a time series. Under the null hypothesis that there is no outlier, the proposed diagnostic is distributed as a χ² with one degree of freedom. Empirical examples are presented to demonstrate the application of the proposed diagnostic.
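
    A small sketch of the underlying idea, assuming the PyWavelets package: compute the first-level Haar detail coefficients and flag those with unusually large absolute value. The threshold used here is a simple robust cut-off, not the paper's χ² diagnostic.

    ```python
    import numpy as np
    import pywt   # assumption: the PyWavelets package is installed

    rng = np.random.default_rng(0)
    x = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.1 * rng.normal(size=512)
    x[200] += 3.0                                   # inject an additive outlier

    cA, cD = pywt.dwt(x, "haar")                    # first-level detail coefficients
    sigma = np.median(np.abs(cD)) / 0.6745          # robust scale estimate of the details
    flagged = np.flatnonzero(np.abs(cD) > 4 * sigma)
    outlier_positions = 2 * flagged                 # each detail coefficient covers two samples
    ```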

  17. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sampled data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.
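
    A compact sketch of one of the semi-local methods described, nearest-neighbour prediction in a delay embedding, using only numpy and the Hénon map as a toy chaotic system; the embedding dimension, delay and neighbour count are illustrative choices.

    ```python
    import numpy as np

    def local_predict(x, d=3, tau=1, k=5, horizon=1):
        """Embed the series, find the k nearest past neighbours of the current delay
        vector, and average where those neighbours went `horizon` steps later."""
        x = np.asarray(x, dtype=float)
        idx = np.arange(d) * tau
        last = len(x) - 1 - horizon - idx[-1]
        vectors = np.array([x[i + idx] for i in range(last + 1)])   # past delay vectors
        query = x[len(x) - 1 - idx[::-1]]                           # vector ending at the last sample
        nearest = np.argsort(np.linalg.norm(vectors - query, axis=1))[:k]
        return np.mean(x[nearest + idx[-1] + horizon])

    # Hénon map x-coordinate as a toy chaotic series
    a, b = 1.4, 0.3
    xs = [0.1, 0.1]
    for _ in range(2000):
        xs.append(1 - a * xs[-1] ** 2 + b * xs[-2])
    series = np.array(xs)

    print(local_predict(series[:-1]), series[-1])    # forecast vs actual next value
    ```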

  18. Chaotic time series; 2, system identification and prediction

    CERN Document Server

    Lillekjendlie, B

    1994-01-01

    This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series. Sampled data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multi layer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  19. Simple Patterns in Fluctuations of Time Series of Economic Interest

    Science.gov (United States)

    Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.

    Time series of nominal exchange rates between the US dollar and the currencies of Argentina, Brazil and the European Economic Community; different financial indexes such as the Industrial Dow Jones, the British Footsie, the German DAX Composite, the Australian Share Price and the Nikkei Cash; and also different Argentine local tax revenues are analyzed, looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the variance of increments at a given lag. A detrended fluctuation analysis of the data, in terms of the exponent of the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared.
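
    A compact numpy sketch of first-order detrended fluctuation analysis on a synthetic increment series (the study applies it to exchange-rate, index and tax-revenue data); the estimated exponent is the slope of log F(s) versus log s.

    ```python
    import numpy as np

    def dfa_exponent(x, scales=(8, 16, 32, 64, 128, 256)):
        """DFA-1: RMS deviation F(s) of the integrated profile from per-window linear
        trends, for several window sizes s; the scaling exponent is the log-log slope."""
        profile = np.cumsum(x - np.mean(x))
        fluct = []
        for s in scales:
            f = []
            for w in range(len(profile) // s):
                seg = profile[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                f.append(np.mean((seg - trend) ** 2))
            fluct.append(np.sqrt(np.mean(f)))
        slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
        return slope

    rng = np.random.default_rng(0)
    increments = rng.normal(size=4096)        # stand-in for log-return increments
    print(dfa_exponent(increments))           # ~0.5 for uncorrelated noise
    ```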

  20. Time series analysis of physiological response during ICU visitation.

    Science.gov (United States)

    Hepworth, J T; Hendrickson, S G; Lopez, J

    1994-12-01

    Time series analysis (TSA) is an important statistical procedure for clinical nursing research. The current paucity of nursing research reports using TSA may be due to unfamiliarity with this technique. In this article, TSA is compared with the ordinary least squares regression model; validity concerns of time series designs are discussed; and concomitant and interrupted TSA of data collected on the effects of family visitation on intracranial pressure (ICP), heart rate, and blood pressure of patients in ICUs are presented. The concomitant TSA of the effect of family on ICP suggested that family presence tended to be associated with decreased ICP. Interrupted TSA indicated the effect of family on heart rate and blood pressure was not as consistent: The overall effect on blood pressure appeared to be negligible, and heart rate may increase overall. Restrictive visiting policies, once typical of intensive care units, should be reconsidered.