WorldWideScience

Sample records for autoregressive time series

  1. Boosting Nonlinear Additive Autoregressive Time Series

    OpenAIRE

    Shafik, Nivien; Tutz, Gerhard

    2007-01-01

    In recent years, several methods for the analysis of nonlinear autoregressive time series have been proposed. As in linear autoregressive models, the main problems are model identification, estimation and prediction. A boosting method is proposed that performs model identification and estimation simultaneously within the framework of nonlinear autoregressive time series. The method allows one to select influential terms from a large number of potential lags and exogenous variables. The influence...
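
    As a rough illustration of the componentwise boosting idea sketched in this abstract, the fragment below repeatedly fits a simple cubic base learner to the current residuals, one candidate lag at a time, and keeps the lag that reduces the residual sum of squares most. The simulated series, the cubic base learner, the learning rate and the iteration count are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

# Componentwise boosting over candidate lags (illustrative sketch only).
rng = np.random.default_rng(1)
n, max_lag, nu, n_iter = 800, 10, 0.1, 200
x = np.zeros(n)
for t in range(2, n):
    # toy nonlinear AR(2) data-generating process
    x[t] = 0.6 * np.sin(x[t - 1]) - 0.4 * x[t - 2] + rng.normal(0, 0.3)

p = max_lag
lagmat = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])  # lag-k columns
y = x[p:]

fit = np.zeros_like(y)
selected = []
for _ in range(n_iter):
    resid = y - fit
    best = None
    for k in range(p):                                # try each candidate lag
        design = np.vander(lagmat[:, k], 4)           # cubic polynomial base learner
        coef, *_ = np.linalg.lstsq(design, resid, rcond=None)
        pred = design @ coef
        sse = np.sum((resid - pred) ** 2)
        if best is None or sse < best[0]:
            best = (sse, k, pred)
    fit += nu * best[2]                               # shrunken update
    selected.append(best[1] + 1)                      # record the chosen lag

print("selection counts for lags 1..10:", np.bincount(selected, minlength=p + 1)[1:])
```

    Lags 1 and 2 should dominate the selection counts here, mirroring how boosting picks influential terms from a larger candidate set.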

  2. vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series … showed that vector bilinear autoregressive (BIVAR) models provide better estimates than the long-embraced linear models … order moving average (MA) polynomials in the backward shift operator B …

  3. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3). The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  4. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as negligence may lead to inaccurate model predictions. This paper derives generally applicable … recommendations, no matter whether a change in persistence occurs or not. Seven different forecasting strategies based on a bias-corrected estimator are compared by means of a large-scale Monte Carlo study. The results for decreasing and increasing persistence are highly asymmetric and new to the literature. Its...

  5. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  6. SEASONAL AUTOREGRESSIVE INTEGRATED MOVING AVERAGE MODEL FOR PRECIPITATION TIME SERIES

    OpenAIRE

    Yan Wang; Meng Gao; Xinghua Chang; Xiyong Hou

    2012-01-01

    Predicting the trend of precipitation is a difficult task in meteorology and environmental sciences. Statistical approaches from time series analysis provide an alternative way for precipitation prediction. An ARIMA model incorporating seasonal characteristics, referred to as a seasonal ARIMA (SARIMA) model, was presented. The time series data are the monthly precipitation records for Yantai, China, covering the period from 1961 to 2011. The model was denoted as SARIMA (1, 0, 1)(0, 1, 1)12 in this stu...
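
    For readers who want to reproduce the modelling step described here, a minimal sketch with statsmodels is given below; the synthetic monthly series is only a stand-in for the Yantai precipitation data, and the SARIMA(1,0,1)(0,1,1)12 specification is taken from the abstract.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for the 1961-2011 monthly precipitation series.
rng = np.random.default_rng(0)
idx = pd.date_range("1961-01", periods=612, freq="MS")
seasonal = 60 + 40 * np.sin(2 * np.pi * np.arange(612) / 12)
precip = pd.Series(seasonal + rng.normal(0, 15, 612), index=idx)

# SARIMA(1, 0, 1)(0, 1, 1)_12, the specification reported above.
res = ARIMA(precip, order=(1, 0, 1), seasonal_order=(0, 1, 1, 12)).fit()
print(res.summary())
print(res.forecast(steps=12))   # forecast the next 12 months
```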

  7. Multivariate autoregressive modelling of sea level time series from TOPEX/Poseidon satellite altimetry

    Directory of Open Access Journals (Sweden)

    S. M. Barbosa

    2006-01-01

    Full Text Available This work addresses the autoregressive modelling of sea level time series from the TOPEX/Poseidon satellite altimetry mission. Datasets from remote sensing applications are typically very large and correlated both in time and space. Multivariate analysis methods are useful tools to summarise and extract information from such large space-time datasets. Multivariate autoregressive analysis is a generalisation of Principal Oscillation Pattern (POP) analysis, widely used in the geosciences for the extraction of dynamical modes by eigen-decomposition of a first-order autoregressive model fitted to the multivariate dataset of observations. The extension of the POP methodology to autoregressions of higher order, although increasing the difficulties in estimation, allows one to model a larger class of complex systems. Here, sea level variability in the North Atlantic is modelled by a third-order multivariate autoregressive model estimated by stepwise least squares. Eigen-decomposition of the fitted model yields physically interpretable seasonal modes. The leading autoregressive mode is an annual oscillation and exhibits a very homogeneous spatial structure in terms of amplitude, reflecting the large-scale coherent behaviour of the annual pattern in the Northern hemisphere. The phase structure reflects the seesaw pattern between the western and eastern regions in the tropical North Atlantic associated with the trade-winds regime. The second mode is close to a semi-annual oscillation. Multivariate autoregressive models provide a useful framework for the description of time-varying fields while enclosing a predictive potential.
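
    The core computation described here can be sketched under simplified assumptions: a third-order vector autoregression is fitted (with statsmodels rather than stepwise least squares), its companion matrix is eigen-decomposed, and periods and damping rates are read off the eigenvalues. The simulated four-variable field is only a placeholder for the gridded sea level data.

```python
import numpy as np
from statsmodels.tsa.api import VAR

# Toy space-time field with an annual cycle (placeholder for altimetry data).
rng = np.random.default_rng(1)
n_t, n_x, p = 240, 4, 3
months = np.arange(n_t)
field = (np.sin(2 * np.pi * months / 12)[:, None] * rng.normal(1.0, 0.2, n_x)
         + 0.1 * rng.standard_normal((n_t, n_x)))

res = VAR(field).fit(p)                       # third-order multivariate AR

# Companion (stacked) form; its eigen-decomposition generalises POP analysis.
A = np.zeros((n_x * p, n_x * p))
A[:n_x, :] = np.hstack(res.coefs)             # res.coefs has shape (p, n_x, n_x)
A[n_x:, :-n_x] = np.eye(n_x * (p - 1))

eigvals, _ = np.linalg.eig(A)
ang = np.abs(np.angle(eigvals))
periods = np.full(ang.shape, np.inf)
periods[ang > 1e-6] = 2 * np.pi / ang[ang > 1e-6]    # period in months
damping = np.abs(eigvals)                             # |lambda| < 1: damped mode

order = np.argsort(-damping)
print("leading mode periods (months):", np.round(periods[order][:4], 1))
```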

  8. Self-organising mixture autoregressive model for non-stationary time series modelling.

    Science.gov (United States)

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks the time series into underlying segments and, at the same time, fits local linear regressive models to the clusters of segments. In this way, a globally non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used to give the mixture model a more flexible structure. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented, and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  9. A time series model: First-order integer-valued autoregressive (INAR(1))

    Science.gov (United States)

    Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.

    2017-07-01

    Nonnegative integer-valued time series arise in many applications. The first-order integer-valued autoregressive model, INAR(1), is constructed with the binomial thinning operator to model such series. INAR(1) depends on the process one period before. The parameters of the model can be estimated by conditional least squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses the median or a Bayesian forecasting methodology. The median forecasting methodology finds the smallest integer s whose cumulative distribution function (CDF) value is at least 0.5. The Bayesian forecasting methodology produces h-step-ahead forecasts by generating the model parameter and the innovation parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), and then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, North Jakarta, from January 2008 to April 2016.
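
    A small simulation of the INAR(1) model and of the conditional-least-squares and median-forecast steps described here is sketched below; the parameter values and series length are illustrative.

```python
import numpy as np
from scipy.stats import binom, poisson

# INAR(1): X_t = alpha o X_{t-1} + eps_t, "o" = binomial thinning, eps ~ Poisson(lam).
rng = np.random.default_rng(42)
alpha, lam, n = 0.6, 2.0, 500

x = np.zeros(n, dtype=int)
x[0] = rng.poisson(lam / (1 - alpha))            # start near the stationary mean
for t in range(1, n):
    survived = rng.binomial(x[t - 1], alpha)     # thinning of the previous count
    x[t] = survived + rng.poisson(lam)           # plus new Poisson arrivals

# CLS: E[X_t | X_{t-1}] = alpha * X_{t-1} + lam, so a simple regression suffices.
alpha_hat, lam_hat = np.polyfit(x[:-1], x[1:], 1)
print(f"alpha_hat = {alpha_hat:.3f}, lam_hat = {lam_hat:.3f}")

# Median forecast: smallest s with P(X_{n+1} <= s | X_n) >= 0.5.
support = np.arange(x[-1] + 60)
pmf = np.convolve(binom.pmf(np.arange(x[-1] + 1), x[-1], alpha_hat),
                  poisson.pmf(support, lam_hat))[:support.size]
print("median forecast:", int(np.searchsorted(np.cumsum(pmf), 0.5)))
```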

  10. Autoregressive Prediction with Rolling Mechanism for Time Series Forecasting with Small Sample Size

    Directory of Open Access Journals (Sweden)

    Zhihua Wang

    2014-01-01

    Full Text Available Reasonable prediction is of significant practical value for the analysis of stochastic and unstable time series with small or limited sample sizes. Motivated by the rolling idea in grey theory and the practical relevance of very short-term forecasting or 1-step-ahead prediction, a novel autoregressive (AR) prediction approach with a rolling mechanism is proposed. In the modelling procedure, a newly developed AR equation, which can be used to model nonstationary time series, is constructed in each prediction step. Meanwhile, the data window for the next step-ahead forecast rolls on by adding the most recent prediction result while deleting the first value of the previously used sample data set. This rolling mechanism is efficient owing to its improved forecasting accuracy, applicability to limited and unstable data, and low computational cost. The general performance, the influence of sample size, the nonlinear dynamic mechanism, the significance of observed trends, and the innovation variance are illustrated and verified with Monte Carlo simulations. The proposed methodology is then applied to several practical data sets, including multiple building-settlement sequences and two economic series.
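
    A hedged sketch of the rolling mechanism described here is given below: at each step an AR model is re-fit on a fixed-length window, a 1-step-ahead forecast is produced, and the window rolls forward by appending that forecast and dropping the oldest value. The toy series, window length and AR order are assumptions for illustration.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(7)
series = np.cumsum(rng.normal(0.2, 1.0, 40))      # short, trending toy sample

window, order, horizon = 20, 2, 5
buf = list(series[-window:])
forecasts = []
for _ in range(horizon):
    fit = AutoReg(np.asarray(buf), lags=order).fit()
    yhat = float(fit.forecast(steps=1)[0])        # 1-step-ahead prediction
    forecasts.append(yhat)
    buf = buf[1:] + [yhat]                        # roll: drop oldest, append forecast

print(np.round(forecasts, 2))
```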

  11. Beyond long memory in heart rate variability: An approach based on fractionally integrated autoregressive moving average time series models with conditional heteroscedasticity

    Science.gov (United States)

    Leite, Argentina; Paula Rocha, Ana; Eduarda Silva, Maria

    2013-06-01

    Heart Rate Variability (HRV) series exhibit long memory and time-varying conditional variance. This work considers Fractionally Integrated AutoRegressive Moving Average (ARFIMA) models with Generalized AutoRegressive Conditional Heteroscedastic (GARCH) errors. ARFIMA-GARCH models may be used to capture and remove long memory and to estimate the conditional volatility in 24-hour HRV recordings. The ARFIMA-GARCH approach is applied to fifteen long-term HRV series available at Physionet, leading to discrimination among normal individuals, heart failure patients, and patients with atrial fibrillation.
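
    A rough two-stage sketch of this idea (not the authors' estimation procedure) is shown below: long memory is removed by fractional differencing with an assumed d, and a GARCH(1,1) with an AR(1) mean is then fitted to the filtered series using the arch package. The simulated series and d = 0.3 are placeholders for a real HRV recording.

```python
import numpy as np
from arch import arch_model

def frac_diff(x, d, n_weights=200):
    """Apply the fractional differencing filter (1 - B)^d via its binomial weights."""
    w = [1.0]
    for k in range(1, n_weights):
        w.append(-w[-1] * (d - k + 1) / k)
    out = np.convolve(x, np.array(w), mode="full")[: len(x)]
    return out[n_weights:]                        # drop the filter warm-up

# Toy persistent series standing in for an RR-interval (HRV) recording.
rng = np.random.default_rng(3)
rr = np.zeros(2000)
for t in range(1, 2000):
    rr[t] = 0.9 * rr[t - 1] + rng.normal(0, 1.0)

filtered = frac_diff(rr - rr.mean(), d=0.3)       # assumed long-memory parameter
res = arch_model(filtered, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
print(res.params)                                 # conditional-volatility estimates
```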

  12. Estimation of pure autoregressive vector models for revenue series ...

    African Journals Online (AJOL)

    This paper aims at applying a multivariate approach to Box-Jenkins univariate time series modelling to three vector series. General autoregressive vector models with time-varying coefficients are estimated. The first vector is a response vector, while the others are predictor vectors. By matrix expansion each vector, whether ...

  13. Detecting harmonic signals in a noisy time-series: the z-domain Autoregressive (AR-z) spectrum

    Science.gov (United States)

    Ding, Hao; Chao, Benjamin F.

    2015-06-01

    We develop a new method referred to as the AR-z spectrum for detecting harmonic signals with exponential decay/growth contained in a noisy time-series by extending the autoregressive (AR) method of Chao & Gilbert. The method consists of (i) 'blindly' forcing one 2nd-order AR fit to the signal content in the frequency domain for any chosen frequency whether or not there is truly a signal; (ii) finding the corresponding AR (complex-conjugate pair of) poles in the complex z-domain; (iii) converting the pole locations into the corresponding complex frequencies of the harmonic signals via Prony's relation and (iv) constructing the Lorentzian power spectrum in the z-domain, conceptually constituting the analytical continuation of the spectrum from the (real) frequency domain to the complex z-domain, where a true harmonic signal is manifested as a Lorentzian peak. The AR-z spectrum can be further enhanced by forming the product spectrum from multiple records as available. We apply the AR-z spectral method to detect and to estimate the complex frequencies of the Earth's normal-modes of free oscillation using superconducting gravimeter records after recent large earthquakes. Specifically we show examples of detection and precise estimation of the frequencies and Q values of the split singlets of the spheroidal modes 0S2, 2S1, 1S2 and 0S0, and report the mode couplings manifested by the gravimeter recording of the toroidal modes 0T2, 0T3 and 0T4. The AR-z spectrum proves to be highly sensitive to harmonic signals of decaying sinusoids in comparison with the conventional Fourier-based spectrum, particularly when the signal in question is weak and where high spectral resolution is desired.
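
    The pole-to-frequency step (ii)-(iii) described here can be illustrated numerically as below: an AR(2) model is fitted by least squares to a noisy decaying sinusoid, and the frequency and Q are read off the pole via Prony's relation z = exp[(-sigma + i 2 pi f) dt]. All signal parameters are toy values, and this is only a building block, not the full AR-z product spectrum.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, f0, q0, n = 1.0, 0.05, 300.0, 4000
sigma0 = np.pi * f0 / q0                               # decay rate implied by Q
t = np.arange(n) * dt
x = np.exp(-sigma0 * t) * np.cos(2 * np.pi * f0 * t) + 0.02 * rng.standard_normal(n)

# Least-squares AR(2) fit: x_t = a1*x_{t-1} + a2*x_{t-2} + e_t
X = np.column_stack([x[1:-1], x[:-2]])
a1, a2 = np.linalg.lstsq(X, x[2:], rcond=None)[0]

# Poles are the roots of z^2 - a1*z - a2 = 0 (a complex-conjugate pair here).
poles = np.roots([1.0, -a1, -a2])
z = poles[np.argmax(poles.imag)]                       # pole in the upper half plane
f_hat = np.angle(z) / (2 * np.pi * dt)
sigma_hat = -np.log(np.abs(z)) / dt
print(f"f_hat = {f_hat:.4f}, Q_hat = {np.pi * f_hat / sigma_hat:.0f}")
```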

  14. Study on homogenization of synthetic GNSS-retrieved IWV time series and its impact on trend estimates with autoregressive noise

    Science.gov (United States)

    Klos, Anna; Pottiaux, Eric; Van Malderen, Roeland; Bock, Olivier; Bogusz, Janusz

    2017-04-01

    A synthetic benchmark dataset of Integrated Water Vapour (IWV) was created within the "Data homogenisation" activity of sub-working group WG3 of the COST ES1206 Action. The benchmark dataset was created based on the analysis of IWV differences retrieved at Global Positioning System (GPS) International GNSS Service (IGS) stations using European Centre for Medium-Range Weather Forecasts (ECMWF) reanalysis data (ERA-Interim). Having analysed a set of 120 series of IWV differences (ERAI-GPS) derived for IGS stations, we characterised the number of gaps and breaks for each station. Moreover, we estimated values of trends, significant seasonal signals, and the character of the residuals once the deterministic model was removed. We tested five different noise models and found that a combination of white noise and a first-order autoregressive process describes the stochastic part with good accuracy. Based on this analysis, we performed Monte Carlo simulations of 25-year-long data with two different types of noise: white noise alone, as well as a combination of white noise and an autoregressive process. We also added a few strictly defined offsets, creating three variants of the synthetic dataset: easy, less-complicated and fully-complicated. The 'Easy' dataset included seasonal signals (annual, semi-annual, 3- and 4-month if present for a particular station), offsets and white noise. The 'Less-complicated' dataset included the above, as well as the combination of white noise and a first-order autoregressive process (AR(1)+WH). The 'Fully-complicated' dataset additionally included a trend and gaps. In this research, we show the impact of manual homogenisation on the estimates of the trend and its error. We also cross-compare the results for the three above-mentioned datasets, as the synthesized noise type might have a significant influence on manual homogenisation and might therefore mostly affect the trend values and their uncertainties when inappropriately handled. In the future, the synthetic dataset...
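
    A toy re-creation of the "less-complicated" synthetic variant described here might look as follows; the amplitudes, offset epochs and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 25 * 365                                     # 25 years of daily epochs
t = np.arange(n) / 365.25

signal = 2.0 * np.sin(2 * np.pi * t) + 0.8 * np.sin(4 * np.pi * t)  # annual + semi-annual

offsets = np.zeros(n)
for epoch, size in [(int(0.3 * n), 1.5), (int(0.7 * n), -1.0)]:     # strictly defined breaks
    offsets[epoch:] += size

phi, ar1 = 0.6, np.zeros(n)
for i in range(1, n):
    ar1[i] = phi * ar1[i - 1] + rng.normal(0, 0.5)                  # AR(1) component

series = signal + offsets + ar1 + rng.normal(0, 0.3, n)             # plus white noise
print("synthetic IWV-difference series:", series[:5].round(2), "...")
```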

  15. Autoregressive Generalized Linear Mixed Effect Models with Crossed Random Effects: An Application to Intensive Binary Time Series Eye-Tracking Data.

    Science.gov (United States)

    Cho, Sun-Joo; Brown-Schmidt, Sarah; Lee, Woo-Yeol

    2018-02-07

    As a method to ascertain person and item effects in psycholinguistics, the generalized linear mixed effect model (GLMM) with crossed random effects has met limitations in handling serial dependence across persons and items. This paper presents an autoregressive GLMM with crossed random effects that accounts for variability in lag effects across persons and items. The model is shown to be applicable to intensive binary time series eye-tracking data when researchers are interested in detecting experimental condition effects while controlling for previous responses. In addition, a simulation study shows that ignoring lag effects can lead to biased estimates and underestimated standard errors for the experimental condition effects.

  16. Neural networks prediction and fault diagnosis applied to stationary and non stationary ARMA (Autoregressive moving average) modeled time series

    International Nuclear Information System (INIS)

    Marseguerra, M.; Minoggio, S.; Rossi, A.; Zio, E.

    1992-01-01

    The correlated noise affecting many industrial plants under stationary or cyclo-stationary conditions - nuclear reactors included - has been successfully modelled by autoregressive moving average (ARMA) models due to the versatility of this technique. The relatively recent neural network methods have similar features, and much effort is being devoted to exploring their usefulness in forecasting and control. Identifying a signal by means of an ARMA model gives rise to the problem of selecting its correct order. Similar difficulties must be faced when applying neural network methods and, specifically, particular care must be given to setting up the appropriate network topology, the data normalization procedure and the learning code. In the present paper the capability of some neural networks to learn ARMA and seasonal ARMA processes is investigated. The results of the tested cases look promising since they indicate that the neural networks learn the underlying process with relative ease, so that their forecasting capability may represent a convenient fault diagnosis tool. (Author)

  17. Dangers and uses of cross-correlation in analyzing time series in perception, performance, movement, and neuroscience: The importance of constructing transfer function autoregressive models.

    Science.gov (United States)

    Dean, Roger T; Dunsmuir, William T M

    2016-06-01

    Many articles on perception, performance, psychophysiology, and neuroscience seek to relate pairs of time series through assessments of their cross-correlations. Most such series are individually autocorrelated: they do not comprise independent values. Given this situation, an unfounded reliance is often placed on cross-correlation as an indicator of relationships (e.g., referent vs. response, leading vs. following). Such cross-correlations can indicate spurious relationships, because of autocorrelation. Given these dangers, we here simulated how and why such spurious conclusions can arise, to provide an approach to resolving them. We show that when multiple pairs of series are aggregated in several different ways for a cross-correlation analysis, problems remain. Finally, even a genuine cross-correlation function does not answer key motivating questions, such as whether there are likely causal relationships between the series. Thus, we illustrate how to obtain a transfer function describing such relationships, informed by any genuine cross-correlations. We illustrate the confounds and the meaningful transfer functions by two concrete examples, one each in perception and performance, together with key elements of the R software code needed. The approach involves autocorrelation functions, the establishment of stationarity, prewhitening, the determination of cross-correlation functions, the assessment of Granger causality, and autoregressive model development. Autocorrelation also limits the interpretability of other measures of possible relationships between pairs of time series, such as mutual information. We emphasize that further complexity may be required as the appropriate analysis is pursued fully, and that causal intervention experiments will likely also be needed.
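
    The confound the article warns about is easy to reproduce: two independent AR(1) series can show sizeable cross-correlations that largely disappear once each series is filtered by its own fitted AR model. The sketch below uses statsmodels; residual-on-residual correlation is used as a simple stand-in for full prewhitening, and all values are toy.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.tsa.stattools import ccf

def ar1(n, phi, rng):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

rng = np.random.default_rng(9)
x, y = ar1(300, 0.9, rng), ar1(300, 0.9, rng)        # independent by construction

print("raw         ccf, lags 0-3:", np.round(ccf(x, y)[:4], 2))

ex = AutoReg(x, lags=1).fit().resid                   # residuals after AR(1) fits
ey = AutoReg(y, lags=1).fit().resid
print("prewhitened ccf, lags 0-3:", np.round(ccf(ex, ey)[:4], 2))
```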

  18. Vector Autoregressive Models and Granger Causality in Time Series Analysis in Nursing Research: Dynamic Changes Among Vital Signs Prior to Cardiorespiratory Instability Events as an Example.

    Science.gov (United States)

    Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M

    Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display interrelated vital sign changes during situations of physiological stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. The purpose of this article is to illustrate the development of patient-specific VAR models using vital sign time series data in a sample of acutely ill, monitored, step-down unit patients and determine their Granger causal dynamics prior to onset of an incident CRI. CRI was defined as vital signs beyond stipulated normality thresholds (HR = 40-140/minute, RR = 8-36/minute, SpO2 below its stipulated threshold). Change in RR caused change in HR (21%; i.e., RR changed before HR changed) more often than change in HR caused change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%). For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing changes in HR occurred with equal frequency (18%). Within this sample of acutely ill patients who experienced a CRI event, VAR modeling indicated that RR changes tend to occur before changes in HR and SpO2. These findings suggest that contextual assessment of RR changes as the earliest sign of CRI is warranted. Use of VAR modeling may be helpful in other nursing research applications based on time series data.
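
    A hedged sketch of this style of analysis is shown below: a VAR is fitted to simulated HR/RR/SpO2 samples in which RR leads the other two signals by construction, and pairwise Granger-causality tests are run with statsmodels. Variable names, lag order and coefficients are illustrative, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(2)
n = 500
rr, hr, spo2 = np.zeros(n), np.zeros(n), np.zeros(n)
for t in range(1, n):
    rr[t] = 0.8 * rr[t - 1] + rng.normal(0, 1)
    hr[t] = 0.7 * hr[t - 1] + 0.3 * rr[t - 1] + rng.normal(0, 1)      # RR leads HR
    spo2[t] = 0.9 * spo2[t - 1] - 0.2 * rr[t - 1] + rng.normal(0, 1)  # RR leads SpO2

vitals = pd.DataFrame({"HR": hr, "RR": rr, "SpO2": spo2})
res = VAR(vitals).fit(maxlags=5, ic="aic")

for caused in ["HR", "SpO2"]:
    test = res.test_causality(caused, causing=["RR"], kind="f")
    print(f"Granger test RR -> {caused}: p = {test.pvalue:.4f}")
```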

  19. Advantage of make-to-stock strategy based on linear mixed-effect model: a comparison with regression, autoregressive, time series, and exponential smoothing models

    Directory of Open Access Journals (Sweden)

    Yu-Pin Liao

    2017-11-01

    Full Text Available In the past few decades, demand forecasting has become relatively difficult due to rapid changes in the global environment. This research illustrates the use of the make-to-stock (MTS) production strategy in order to explain how forecasting plays an essential role in business management. The linear mixed-effect (LME) model has been extensively developed and is widely applied in various fields. However, no study has used the LME model for business forecasting. We suggest that the LME model be used as a tool for prediction and to cope with environmental complexity. The data analysis is based on real data from an international display company, where the company needs accurate demand forecasting before adopting an MTS strategy. The forecasting result from the LME model is compared to commonly used approaches, including the regression model, autoregressive model, time series model, and exponential smoothing model, with the results revealing that the prediction performance provided by the LME model is more stable than that of the other methods. Furthermore, product types in the data are regarded as a random effect in the LME model, hence the demands of all types can be predicted simultaneously using a single LME model. In contrast, some approaches require splitting the data into different type categories and then predicting the demand of each type by establishing a model for each type. This feature also demonstrates the practicability of the LME model in real business operations.

  20. Estimating Time Series Soil Moisture by Applying Recurrent Nonlinear Autoregressive Neural Networks to Passive Microwave Data over the Heihe River Basin, China

    Directory of Open Access Journals (Sweden)

    Zheng Lu

    2017-06-01

    Full Text Available A method using a nonlinear auto-regressive neural network with exogenous input (NARXnn) to retrieve time series soil moisture (SM) that is spatially and temporally continuous and of high quality over the Heihe River Basin (HRB) in China was investigated in this study. The input training data consisted of the X-band dual-polarization brightness temperature (TB) and the Ka-band V-polarization TB from the Advanced Microwave Scanning Radiometer II (AMSR2), the Global Land Satellite product (GLASS) Leaf Area Index (LAI), precipitation from the Tropical Rainfall Measuring Mission (TRMM) and the Global Precipitation Measurement (GPM), and a global 30 arc-second elevation model (GTOPO-30). The output training data were generated from fused SM products of the Japan Aerospace Exploration Agency (JAXA) and the Land Surface Parameter Model (LPRM). The reprocessed fused SM from two years (2013 and 2014) was input into the NARXnn for training; subsequently, SM during a third year (2015) was estimated. Direct and indirect validations were then performed for 2015 by comparing with in situ measurements, SM from JAXA, LPRM and the Global Land Data Assimilation System (GLDAS), as well as precipitation data from TRMM and GPM. The results showed that the SM predictions from the NARXnn performed best, as indicated by their higher correlation coefficients (R ≥ 0.85 for the whole year of 2015), lower bias values (absolute value of bias ≤ 0.02) and root mean square error values (RMSE ≤ 0.06), and their improved response to precipitation. This method is being used to produce the NARXnn SM product over the HRB in China.

  1. Estimation of Time Varying Autoregressive Symmetric Alpha Stable

    Data.gov (United States)

    National Aeronautics and Space Administration — In this work, we present a novel method for modeling time-varying autoregressive impulsive signals driven by symmetric alpha stable distributions. The proposed...

  2. Modeling non-Gaussian time-varying vector autoregressive process

    Data.gov (United States)

    National Aeronautics and Space Administration — We present a novel and general methodology for modeling time-varying vector autoregressive processes which are widely used in many areas such as modeling of chemical...

  3. Time to burn: Modeling wildland arson as an autoregressive crime function

    Science.gov (United States)

    Jeffrey P. Prestemon; David T. Butry

    2005-01-01

    Six Poisson autoregressive models of order p [PAR(p)] of daily wildland arson ignition counts are estimated for five locations in Florida (1994-2001). In addition, a fixed effects time-series Poisson model of annual arson counts is estimated for all Florida counties (1995-2001). PAR(p) model estimates reveal highly significant arson ignition autocorrelation, lasting up...

  4. Dimensionality reduction for time series data

    OpenAIRE

    Vidaurre, Diego; Rezek, Iead; Harrison, Samuel L.; Smith, Stephen S.; Woolrich, Mark

    2014-01-01

    Despite the fact that they do not consider the temporal nature of data, classic dimensionality reduction techniques, such as PCA, are widely applied to time series data. In this paper, we introduce a factor decomposition specific for time series that builds upon the Bayesian multivariate autoregressive model and hence evades the assumption that data points are mutually independent. The key is to find a low-rank estimation of the autoregressive matrices. As in the probabilistic version of othe...

  5. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models, both based on GARCH models. In general, clustering of financial time series, owing to their peculiar features, requires the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning-around-medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning-around-medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.

  6. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    This paper considers geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional variance, implying an interpretation as an integer-valued GARCH process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and a nonlinear function of past observations. As a particular example an exponential autoregressive Poisson model for time series is considered. Under geometric ergodicity the maximum likelihood estimators of the parameters are shown to be asymptotically Gaussian in the linear model. In addition we provide a consistent estimator of the asymptotic covariance, which is used in the simulations and the analysis of some...
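
    The linear Poisson autoregression (integer-valued GARCH) described here can be simulated in a few lines; the parameter values below are illustrative and chosen so that a + b < 1.

```python
import numpy as np

# lambda_t = d + a*lambda_{t-1} + b*y_{t-1};  y_t | past ~ Poisson(lambda_t)
rng = np.random.default_rng(0)
d, a, b, n = 0.5, 0.4, 0.3, 1000

lam = np.zeros(n)
y = np.zeros(n, dtype=int)
lam[0] = d / (1 - a - b)                  # stationary mean as the starting value
y[0] = rng.poisson(lam[0])
for t in range(1, n):
    lam[t] = d + a * lam[t - 1] + b * y[t - 1]
    y[t] = rng.poisson(lam[t])

print("sample mean:", y.mean(), " theoretical mean:", d / (1 - a - b))
```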

  7. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data. The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater...

  8. Probabilistic forecasting of wind power at the minute time-scale with Markov-switching autoregressive models

    DEFF Research Database (Denmark)

    Pinson, Pierre; Madsen, Henrik

    2008-01-01

    Better modelling and forecasting of very short-term power fluctuations at large offshore wind farms may significantly enhance control and management strategies of their power output. The paper introduces a new methodology for modelling and forecasting such very short-term fluctuations. The proposed methodology is based on a Markov-switching autoregressive model with time-varying coefficients. An advantage of the method is that one can easily derive full predictive densities. The quality of this methodology is demonstrated from the test case of 2 large offshore wind farms in Denmark. The exercise consists in a 1-step-ahead forecasting exercise on time series of wind generation with a time resolution of 10 minutes. The quality of the introduced forecasting methodology and its interest for better understanding power fluctuations are finally discussed.

  9. BSMART: A Matlab/C toolbox for analysis of multichannel neural time series

    OpenAIRE

    Cui, Jie; Xu, Lei; Bressler, Steven L.; Ding, Mingzhou; Liang, Hualou

    2008-01-01

    We have developed a Matlab/C toolbox, Brain-SMART (System for Multivariate AutoRegressive Time series, or BSMART), for spectral analysis of continuous neural time series data recorded simultaneously from multiple sensors. Available functions include time series data importing/exporting, preprocessing (normalization and trend removal), AutoRegressive (AR) modeling (multivariate/bivariate model estimation and validation), spectral quantity estimation (auto power, coherence and Granger causality...

  10. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of

  11. Improved Subset Autoregression: With R Package

    Directory of Open Access Journals (Sweden)

    A. I. McLeod

    2008-07-01

    Full Text Available The FitAR R (R Development Core Team 2008) package, available on the Comprehensive R Archive Network, is described. This package provides a comprehensive approach to fitting autoregressive and subset autoregressive time series. For long time series with complicated autocorrelation behavior, such as the monthly sunspot numbers, subset autoregression may prove more feasible and/or parsimonious than using AR or ARMA models. The two principal functions in this package are SelectModel and FitAR, for automatic model selection and model fitting respectively. In addition to the regular autoregressive model and the usual subset autoregressive models (Tong 1977), these functions implement a new family of models. This new family of subset autoregressive models is obtained by using the partial autocorrelations as parameters and then selecting a subset of these parameters. Further properties and results for these models are discussed in McLeod and Zhang (2006). The advantages of this approach are that not only is an efficient algorithm for exact maximum likelihood implemented, but also that efficient methods are derived for selecting high-order subset models that may occur in massive datasets containing long time series. A new improved extended BIC criterion, UBIC, developed by Chen and Chen (2008), is implemented for subset model selection. A complete suite of model-building functions for each of the three types of autoregressive models described above is included in the package. The package includes functions for time series plots, diagnostic testing and plotting, bootstrapping, simulation, forecasting, Box-Cox analysis, spectral density estimation and other useful time series procedures. As well as methods for standard generic functions including print, plot, predict and others, some new generic functions and methods are supplied that make it easier to work with the output from FitAR for bootstrapping, simulation, spectral density estimation and Box...
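
    An illustrative Python counterpart of a subset autoregression (not the FitAR package itself, which is written in R) is sketched below: only a chosen subset of lags enters the model, and the coefficients are estimated by least squares. The simulated process and the candidate lag set are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n, lags = 600, [1, 2, 9, 11]                  # candidate subset of lags
x = np.zeros(n)
for t in range(max(lags), n):                 # toy process that really uses lags 1, 2, 9
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + 0.2 * x[t - 9] + rng.normal()

p = max(lags)
X = np.column_stack([x[p - k:n - k] for k in lags])
coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
print({k: round(c, 3) for k, c in zip(lags, coef)})   # lag-11 weight should be near 0
```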

  12. Real time damage detection using recursive principal components and time varying auto-regressive modeling

    Science.gov (United States)

    Krishnan, M.; Bhowmik, B.; Hazra, B.; Pakrashi, V.

    2018-02-01

    In this paper, a novel baseline-free approach for continuous online damage detection of multi-degree-of-freedom vibrating structures using Recursive Principal Component Analysis (RPCA) in conjunction with Time Varying Auto-Regressive Modeling (TVAR) is proposed. In this method, the acceleration data are used to obtain recursive proper orthogonal components online using the rank-one perturbation method, followed by TVAR modeling of the first transformed response, to detect the change in the dynamic behavior of the vibrating system from its pristine state to contiguous linear/nonlinear states that indicate damage. Most of the works available in the literature deal with algorithms that require windowing of the gathered data owing to their data-driven nature, which renders them ineffective for online implementation. Algorithms focused on mathematically consistent recursive techniques within a rigorous theoretical framework for structural damage detection are missing, which motivates the development of the present framework, which is amenable to online implementation and can be utilized along with a suite of experimental and numerical investigations. The RPCA algorithm iterates the eigenvector and eigenvalue estimates for the sample covariance matrices and the new data point at each successive time instant, using the rank-one perturbation method. TVAR modeling on the principal component explaining the maximum variance is utilized, and damage is identified by tracking the TVAR coefficients. This eliminates the need for offline post-processing and facilitates online damage detection, especially when applied to streaming data, without requiring any baseline data. Numerical simulations performed on a 5-dof nonlinear system under white noise excitation and El Centro (also known as the 1940 Imperial Valley earthquake) excitation, for different damage scenarios, demonstrate the robustness of the proposed algorithm. The method is further validated on results obtained from case studies involving...

  13. a model for nonlinear innovation in time series

    African Journals Online (AJOL)

    heteroscedastic errors are common in financial and econometric time series. The conditional variance may be specified as nonlinear autoregressive conditional heteroscedasticity ...

  14. forecasting with nonlinear time series model: a monte-carlo ...

    African Journals Online (AJOL)

    with nonlinear time series models by comparing the RMSE with the traditional bootstrap and Monte Carlo methods of forecasting. We use the logistic smooth transition autoregressive (LSTAR) model as a case study. We first consider a linear model called the AR(p) model of order p, which satisfies the following linear ...

  15. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2017-11-01

    In this study, we extended the application of linear and nonlinear time models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with linear AR model. Unlike previous studies that typically consider the threshold model specifications by using internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variables specification produce more accurate forecasts, indicating that specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.

  16. Nonlinear time series modeling and forecasting the seismic data of the Hindu Kush region

    Science.gov (United States)

    Khan, Muhammad Yousaf; Mittnik, Stefan

    2018-01-01

    In this study, we extended the application of linear and nonlinear time models in the field of earthquake seismology and examined the out-of-sample forecast accuracy of linear Autoregressive (AR), Autoregressive Conditional Duration (ACD), Self-Exciting Threshold Autoregressive (SETAR), Threshold Autoregressive (TAR), Logistic Smooth Transition Autoregressive (LSTAR), Additive Autoregressive (AAR), and Artificial Neural Network (ANN) models for seismic data of the Hindu Kush region. We also extended the previous studies by using Vector Autoregressive (VAR) and Threshold Vector Autoregressive (TVAR) models and compared their forecasting accuracy with linear AR model. Unlike previous studies that typically consider the threshold model specifications by using internal threshold variable, we specified these models with external transition variables and compared their out-of-sample forecasting performance with the linear benchmark AR model. The modeling results show that time series models used in the present study are capable of capturing the dynamic structure present in the seismic data. The point forecast results indicate that the AR model generally outperforms the nonlinear models. However, in some cases, threshold models with external threshold variables specification produce more accurate forecasts, indicating that specification of threshold time series models is of crucial importance. For raw seismic data, the ACD model does not show an improved out-of-sample forecasting performance over the linear AR model. The results indicate that the AR model is the best forecasting device to model and forecast the raw seismic data of the Hindu Kush region.

  17. Order-disorder transitions in time-discrete mean field systems with memory: a novel approach via nonlinear autoregressive models

    International Nuclear Information System (INIS)

    Frank, T D; Mongkolsakulvong, S

    2015-01-01

    In a previous study strongly nonlinear autoregressive (SNAR) models have been introduced as a generalization of the widely-used time-discrete autoregressive models that are known to apply both to Markov and non-Markovian systems. In contrast to conventional autoregressive models, SNAR models depend on process mean values. So far, only linear dependences have been studied. We consider the case in which process mean values can have a nonlinear impact on the processes under consideration. It is shown that such models describe Markov and non-Markovian many-body systems with mean field forces that exhibit a nonlinear impact on single subsystems. We exemplify that such nonlinear dependences can describe order-disorder phase transitions of time-discrete Markovian and non-Markovian many-body systems. The relevant order parameter equations are derived and issues of stability and stationarity are studied. (paper)

  18. The Shifting Seasonal Mean Autoregressive Model and Seasonality in the Central England Monthly Temperature Series, 1772-2016

    DEFF Research Database (Denmark)

    He, Changli; Kang, Jian; Terasvirta, Timo

    In this paper we introduce an autoregressive model with seasonal dummy variables in which the coefficients of the seasonal dummies vary smoothly and deterministically over time. The error variance of the model is seasonally heteroskedastic and multiplicatively decomposed, the decomposition being similar to that in well-known ARCH and GARCH models. This variance is also allowed to be smoothly and deterministically time-varying. Under regularity conditions, consistency and asymptotic normality of the maximum likelihood estimators of the parameters of this model are proved. A test of constancy of the seasonal coefficients is derived. The test is generalised to specifying the parametric structure of the model. A test of constancy over time of the heteroskedastic error variance is presented. The purpose of building this model is to use it for describing changing seasonality in the well-known monthly central England...

  19. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers...

  20. Detecting method for crude oil price fluctuation mechanism under different periodic time series

    International Nuclear Information System (INIS)

    Gao, Xiangyun; Fang, Wei; An, Feng; Wang, Yue

    2017-01-01

    Highlights: • We proposed the concept of autoregressive modes to indicate fluctuation patterns. • We constructed transmission networks for studying the fluctuation mechanism. • There are different fluctuation mechanisms under different periodic time series. • Only a few types of autoregressive modes control the fluctuations in the crude oil price. • There are cluster effects in the fluctuation mechanism of autoregressive modes. - Abstract: The existing literature can characterize the long-term fluctuation of crude oil price time series; however, it is difficult to detect the fluctuation mechanism specifically over the short term, because each fluctuation pattern for a short period contained in a long-term crude oil price time series has diverse dynamic characteristics. In other words, various fluctuation patterns appear in different short periods and transmit to each other, which reflects the reputedly complicated and chaotic oil market. Thus, we proposed an integrated method to detect the fluctuation mechanism, namely the evolution of the different fluctuation patterns over time, from a complex network perspective. We divided the crude oil price time series into segments using sliding time windows, and defined autoregressive modes based on regression models to indicate the fluctuation pattern of each segment. Hence, the transmissions between different types of autoregressive modes over time form a transmission network that contains rich dynamic information. We then capture the transmission characteristics of autoregressive modes under different periodic time series through the structural features of the transmission networks. The results indicate that there are various autoregressive modes with significantly different statistical characteristics under different periodic time series. However, only a few types of autoregressive modes and transmission patterns play a major role in the fluctuation mechanism of the crude oil price, and these...

  1. Analysing Stable Time Series

    National Research Council Canada - National Science Library

    Adler, Robert

    1997-01-01

    We describe how to take a stable, ARMA, time series through the various stages of model identification, parameter estimation, and diagnostic checking, and accompany the discussion with a goodly number...

  2. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  3. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  4. Modeling Nonstationary Emotion Dynamics in Dyads using a Time-Varying Vector-Autoregressive Model.

    Science.gov (United States)

    Bringmann, Laura F; Ferrer, Emilio; Hamaker, Ellen L; Borsboom, Denny; Tuerlinckx, Francis

    2018-03-05

    Emotion dynamics are likely to arise in an interpersonal context. Standard methods to study emotions in interpersonal interaction are limited because stationarity is assumed. This means that the dynamics, for example time-lagged relations, are invariant across time periods. However, this is generally an unrealistic assumption. Whether caused by an external (e.g., divorce) or an internal (e.g., rumination) event, emotion dynamics are prone to change. The semi-parametric time-varying vector-autoregressive (TV-VAR) model is based on well-studied generalized additive models, implemented in the software R. The TV-VAR can explicitly model changes in temporal dependency without pre-existing knowledge about the nature of change. A simulation study is presented, showing that the TV-VAR model is superior to the standard time-invariant VAR model when the dynamics change over time. The TV-VAR model is applied to empirical data on daily feelings of positive affect (PA) from a single couple. Our analyses indicate reliable changes in the male's emotion dynamics over time, but not in the female's, which were not predicted by her own affect or that of her partner. This application illustrates the usefulness of using a TV-VAR model to detect changes in the dynamics in a system.
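
    A crude illustration of the time-varying idea (using sliding-window VAR fits rather than the paper's penalised-spline TV-VAR) is sketched below with simulated dyadic positive-affect data; the column names, window length and coefficient drift are invented.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(8)
n = 400
male, female = np.zeros(n), np.zeros(n)
for t in range(1, n):
    phi = 0.2 + 0.5 * t / n                       # male's inertia increases over time
    male[t] = phi * male[t - 1] + 0.1 * female[t - 1] + rng.normal(0, 1)
    female[t] = 0.4 * female[t - 1] + rng.normal(0, 1)

data = pd.DataFrame({"male_PA": male, "female_PA": female})
window = 100
for start in range(0, n - window + 1, 50):
    res = VAR(data.iloc[start:start + window]).fit(1)
    phi_hat = res.coefs[0, 0, 0]                  # male_PA(t-1) -> male_PA(t)
    print(f"window [{start}, {start + window}): phi_hat = {phi_hat:.2f}")
```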

  5. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look up the appropriate commands, our application is select-and-click-driven. It allows one to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering...

  6. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  7. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    ""In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book.""-Journa

  8. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow
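
    The delay-embedding and local-approximation idea described here can be sketched as follows; the logistic map is used for simplicity in place of the Mackey-Glass data, and the embedding dimension and neighbour count are arbitrary choices.

```python
import numpy as np

# Chaotic toy series: the logistic map.
n = 2000
x = np.empty(n)
x[0] = 0.4
for t in range(1, n):
    x[t] = 3.9 * x[t - 1] * (1 - x[t - 1])

m, k = 3, 10                                               # embedding dim, neighbours
emb = np.column_stack([x[i:n - m + i] for i in range(m)])  # emb[j] = (x_j, ..., x_{j+m-1})
targets = x[m:]                                            # successor of each delay vector

query = x[-m:]                                             # current state (last m values)
dists = np.linalg.norm(emb - query, axis=1)
nn = np.argsort(dists)[:k]                                 # nearest neighbours in state space
prediction = targets[nn].mean()                            # local (zeroth-order) approximation
print("prediction:", round(prediction, 4),
      "true next value:", round(3.9 * x[-1] * (1 - x[-1]), 4))
```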

  9. Residual Analysis of Generalized Autoregressive Integrated Moving ...

    African Journals Online (AJOL)

    In this study, analysis of the residuals of a generalized autoregressive integrated moving average bilinear time series model was considered. The adequacy of this model was based on testing the estimated residuals for whiteness. The Jarque-Bera statistic and squared-residual autocorrelations were used to test the estimated ...

  10. MODELLING OF ORDINAL TIME SERIES BY PROPORTIONAL ODDS MODEL

    Directory of Open Access Journals (Sweden)

    Serpil AKTAŞ ALTUNAY

    2013-06-01

    Full Text Available Categorical time series data with random time-dependent covariates often arise when the study variables are recorded on a categorical scale. Several models have been proposed in the literature for the analysis of categorical time series; for example, Markov chain models, integer autoregressive processes and discrete ARMA models can be utilized for modelling categorical time series. In general, the choice of model depends on the measurement level of the study variables: nominal, ordinal or interval. Regression theory, based on generalized linear models and partial likelihood inference, is a successful approach for categorical time series. One of the models for ordinal time series in regression theory is the proportional odds model. In this study, the proportional odds model approach to ordinal categorical time series is investigated using a real air pollution data set, and the results are discussed.

  11. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  12. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbek, Anders Christian; Tjøstheim, Dag

    2009-01-01

    In this article we consider geometric ergodicity and likelihood-based inference for linear and nonlinear Poisson autoregression. In the linear case, the conditional mean is linked linearly to its past values, as well as to the observed values of the Poisson process. This also applies to the conditional variance, making possible an interpretation as an integer-valued generalized autoregressive conditional heteroscedasticity process. In a nonlinear conditional Poisson model, the conditional mean is a nonlinear function of its past values and past observations. As a particular example, we consider an exponential autoregressive Poisson model for time series. … ergodicity proceeds via Markov theory and irreducibility. Finding transparent conditions for proving ergodicity turns out to be a delicate problem in the original model formulation. This problem is circumvented by allowing a perturbation of the model. We show that as the perturbations can be chosen...

  13. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)

  14. Multivariate Exponential Autoregressive and Autoregressive Moving ...

    African Journals Online (AJOL)

    Autoregressive (AR) and autoregressive moving average (ARMA) processes with multivariate exponential (ME) distribution are presented and discussed. The theory of positive dependence is used to show that in many cases, multivariate exponential autoregressive (MEAR) and multivariate autoregressive moving average ...

  15. Modeling corporate defaults: Poisson autoregressions with exogenous covariates (PARX)

    DEFF Research Database (Denmark)

    Agosto, Arianna; Cavaliere, Guiseppe; Kristensen, Dennis

    We develop a class of Poisson autoregressive models with additional covariates (PARX) that can be used to model and forecast time series of counts. We establish the time series properties of the models, including conditions for stationarity and existence of moments. These results are in turn used...

  16. Fundamental State Space Time Series Models for JEPX Electricity Prices

    Science.gov (United States)

    Ofuji, Kenta; Kanemoto, Shigeru

Time series models are popular in attempts to model and forecast price dynamics in various markets. In this paper, we have formulated two state space models and tested their applicability to power price modeling and forecasting using JEPX (Japan Electric Power eXchange) data. State space models generally have a high degree of flexibility owing to their time-dependent state transition matrix and system equation configurations. Based on empirical data analysis and past literature, we used calculation assumptions to a) extract a stochastic trend component to capture non-stationarity, and b) detect structural changes underlying the market. The stepwise calculation algorithm followed that of the Kalman filter. We then evaluated the two models' forecasting capabilities in comparison with ordinary AR (autoregressive) and ARCH (autoregressive conditional heteroskedasticity) models. By choosing proper explanatory variables, the latter state space model yielded as good a forecasting capability as that of the AR and ARCH models for a short forecasting horizon.
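A sketch of a Kalman-filter-based structural model in the spirit described here, assuming a synthetic price series in place of JEPX data; the local linear trend specification is an illustrative choice, not the authors' exact formulation.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical hourly price series; in practice this would be JEPX spot prices.
rng = np.random.default_rng(1)
prices = np.cumsum(rng.normal(0, 0.5, 300)) + 10 + rng.normal(0, 1, 300)

# A local linear trend model: the Kalman filter extracts a stochastic trend
# component, which is one way to capture the non-stationarity mentioned above.
model = sm.tsa.UnobservedComponents(prices, level="local linear trend")
fit = model.fit(disp=False)

print(fit.summary())
forecast = fit.forecast(steps=24)   # one-day-ahead (24 h) point forecast
```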

  17. Poisson Autoregression

    DEFF Research Database (Denmark)

    Fokianos, Konstantinos; Rahbæk, Anders; Tjøstheim, Dag

    This paper considers geometric ergodicity and likelihood based inference for linear and nonlinear Poisson autoregressions. In the linear case the conditional mean is linked linearly to its past values as well as the observed values of the Poisson process. This also applies to the conditional...... proceeds via Markov theory and irreducibility. Finding transparent conditions for proving ergodicity turns out to be a delicate problem in the original model formulation. This problem is circumvented by allowing a perturbation of the model. We show that as the perturbations can be chosen to be arbitrarily...

  18. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  19. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

Achievement of the planned operational regime in next-generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, Degeling [1] proposed the hypothesis that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, this means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  20. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

Different flavors of GPS time series analysis at JPL - Use same GPS Precise Point Positioning Analysis raw time series - Variations in time series analysis/post-processing driven by different users. • JPL Global Time Series/Velocities - researchers studying reference frame, combining with VLBI/SLR/DORIS • JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, ground water studies • ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused • ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR - Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR - Zhen Liu's talk tomorrow covers InSAR time series analysis

  1. Time-varying coefficient vector autoregressions model based on dynamic correlation with an application to crude oil and stock markets.

    Science.gov (United States)

    Lu, Fengbin; Qiao, Han; Wang, Shouyang; Lai, Kin Keung; Li, Yuze

    2017-01-01

This paper proposes a new time-varying coefficient vector autoregression (VAR) model, in which the coefficient is a linear function of dynamic lagged correlation. The proposed model allows for flexibility in the choice of dynamic correlation model (e.g. dynamic conditional correlation generalized autoregressive conditional heteroskedasticity (GARCH) models, Markov-switching GARCH models and multivariate stochastic volatility models), which means that it can describe many types of time-varying causal effects. Time-varying causal relations between West Texas Intermediate (WTI) crude oil and the US Standard and Poor's 500 (S&P 500) stock markets are examined with the proposed model. The empirical results show that their causal relations evolve with time and display complex characteristics. Both positive and negative causal effects of the WTI on the S&P 500 in the subperiods have been found and confirmed by the traditional VAR models. Similar results have been obtained for the causal effects of the S&P 500 on WTI. In addition, the proposed model outperforms the traditional VAR model.

  2. Interpolation in Time Series : An Introductive Overview of Existing Methods, Their Performance Criteria and Uncertainty Assessment

    NARCIS (Netherlands)

    Lepot, M.J.; Aubin, Jean Baptiste; Clemens, F.H.L.R.

    2017-01-01

    A thorough review has been performed on interpolation methods to fill gaps in time-series, efficiency criteria, and uncertainty quantifications. On one hand, there are numerous available methods: interpolation, regression, autoregressive, machine learning methods, etc. On the other hand, there are

  3. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important...... consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....

  4. Testing and modelling autoregressive conditional heteroskedasticity of streamflow processes

    Directory of Open Access Journals (Sweden)

    W. Wang

    2005-01-01

Conventional streamflow models operate under the assumption of constant variance or season-dependent variances (e.g. ARMA (AutoRegressive Moving Average) models for deseasonalized streamflow series and PARMA (Periodic AutoRegressive Moving Average) models for seasonal streamflow series). However, with the McLeod-Li test and Engle's Lagrange multiplier test, clear evidence is found for the existence of autoregressive conditional heteroskedasticity (i.e. the ARCH (AutoRegressive Conditional Heteroskedasticity) effect), a nonlinear phenomenon of the variance behaviour, in the residual series from linear models fitted to daily and monthly streamflow processes of the upper Yellow River, China. It is shown that the major cause of the ARCH effect is the seasonal variation in variance of the residual series. However, while the seasonal variation in variance can fully explain the ARCH effect for monthly streamflow, it is only a partial explanation for daily flow. It is also shown that while the periodic autoregressive moving average model is adequate in modelling monthly flows, no model is adequate in modelling daily streamflow processes because none of the conventional time series models takes the seasonal variation in variance, as well as the ARCH effect in the residuals, into account. Therefore, an ARMA-GARCH (Generalized AutoRegressive Conditional Heteroscedasticity) error model is proposed to capture the ARCH effect present in daily streamflow series, as well as to preserve seasonal variation in variance in the residuals. The ARMA-GARCH error model combines an ARMA model for modelling the mean behaviour and a GARCH model for modelling the variance behaviour of the residuals from the ARMA model. Since the GARCH model is not widely used in statistical hydrology, this work can be a useful addition in terms of statistical modelling of daily streamflow processes for the hydrological community.
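A rough sketch of the ARMA-GARCH error idea under stated assumptions: the daily flow series here is synthetic, the ARMA mean is reduced to an AR(1) mean for brevity, and the `arch` package (an assumption, not something named in the paper) supplies the GARCH variance model.

```python
import numpy as np
from arch import arch_model

# Synthetic stand-in for a deseasonalized daily streamflow series.
rng = np.random.default_rng(42)
flow = np.empty(2000)
flow[0] = 0.0
for t in range(1, 2000):                       # AR(1) mean with mildly ARCH-like noise
    scale = 1.0 + 0.5 * abs(flow[t - 1])
    flow[t] = 0.7 * flow[t - 1] + rng.normal(0, 0.1 * scale)

# AR(1) mean equation + GARCH(1,1) variance equation for the residuals,
# i.e. a simplified version of the ARMA-GARCH error model described above.
am = arch_model(flow, mean="ARX", lags=1, vol="GARCH", p=1, q=1, dist="normal")
res = am.fit(disp="off")
print(res.summary())
```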

  5. Linear and nonlinear dynamic systems in financial time series prediction

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2012-10-01

Autoregressive moving average (ARMA) processes and dynamic neural networks, namely the nonlinear autoregressive moving average with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series, for instance the S&P 500 returns. Two classes of ARMA are considered. The first is the standard ARMA model, which is a linear static system. The second uses the Kalman filter (KF) to estimate and predict ARMA coefficients; this model is a linear dynamic system. The forecasting ability of each system is evaluated by means of mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone. Thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component found in the S&P 500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P 500 time series prediction.

  6. Forecasting electricity spot-prices using linear univariate time-series models

    International Nuclear Information System (INIS)

    Cuaresma, Jesus Crespo; Hlouskova, Jaroslava; Kossmeier, Stephan; Obersteiner, Michael

    2004-01-01

This paper studies the forecasting abilities of a battery of univariate models on hourly electricity spot prices, using data from the Leipzig Power Exchange. The specifications studied include autoregressive models, autoregressive-moving average models and unobserved component models. The results show that specifications where each hour of the day is modelled separately present uniformly better forecasting properties than specifications for the whole time series, and that the inclusion of simple probabilistic processes for the arrival of extreme price events can lead to improvements in the forecasting abilities of univariate models for electricity spot prices. (Author)

  7. Forecasting Cryptocurrencies Financial Time Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely...

  8. Time series with tailored nonlinearities

    Science.gov (United States)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
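An illustrative sketch of inducing phase correlations in a linear Gaussian series, assuming a deliberately simple constraint (each phase is nudged toward its neighbour); the specific constraint and parameters are illustrative, not the ones used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4096
x = rng.normal(size=n)                 # originally linear, uncorrelated Gaussian series

spec = np.fft.rfft(x)
amps = np.abs(spec)
phases = np.angle(spec)

# Illustrative phase constraint: pull each phase toward the preceding one,
# creating correlations between adjacent Fourier phases.
alpha = 0.8
for k in range(2, len(phases)):
    phases[k] = alpha * phases[k - 1] + (1 - alpha) * phases[k]

y = np.fft.irfft(amps * np.exp(1j * phases), n=n)   # same amplitudes, constrained phases

# The linear (second-order) properties are essentially preserved by keeping the amplitudes,
# while higher-order (nonlinear) statistics of y now differ from those of x.
print(np.var(x), np.var(y))
```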

  9. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowened experts in their respect...

  10. Modeling Autoregressive Processes with Moving-Quantiles-Implied Nonlinearity

    Directory of Open Access Journals (Sweden)

    Isao Ishida

    2015-01-01

We introduce and investigate some properties of a class of nonlinear time series models based on moving sample quantiles in the autoregressive data-generating process. We derive a test to detect this type of nonlinearity. Using daily realized volatility data for the Standard & Poor's 500 (S&P 500) and several other indices, we obtained good performance using these models in an out-of-sample forecasting exercise compared with forecasts obtained from the usual linear heterogeneous autoregressive and other models of realized volatility.

  11. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  12. SU-D-BRA-02: An Extended Time-Variant Seasonal Autoregressive Model-Based Prediction for Irregular Breathing Motion Tracking.

    Science.gov (United States)

    Ichiji, K; Homma, N; Sakai, M; Narita, Y; Takai, Y; Yoshizawa, M

    2012-06-01

Real-time tumor position/shape measurement and dynamic beam tracking techniques allow accurate and continuous irradiation of a moving tumor, but there can be a delay of several hundred milliseconds between observation and irradiation. A time-variant seasonal autoregressive (TVSAR) model has been proposed to compensate for this delay by predicting respiratory tumor motion with sub-millimeter accuracy at a one-second latency, and it is the state-of-the-art model for almost-regular breathing prediction. In this study, we propose an extended prediction method based on TVSAR that is usable for various breathing patterns, by predicting the residual component obtained from the conventional TVSAR. The essential core of the method is to take into account the residual component that is not predictable by TVSAR alone; this component involves baseline shift, amplitude variation, and so on. In this study, the time series of the residual obtained for every new sample is predicted by using an autoregressive (AR) model. The order and parameters of the AR model are adaptively determined for each residual component by using an information criterion. Eleven data sets of 3-D lung tumor motion, observed at Georgetown University Hospital using the Cyberknife Synchrony system, were used to evaluate the prediction performance. Experimental results indicated that the proposed method is superior to conventional and state-of-the-art methods for 0 to 1 s ahead prediction. The average prediction error of the proposed method was 0.920 ± 0.348 mm for 0.5 s forward prediction. We have developed a new prediction method based on the TVSAR model with adaptive residual prediction. The new method can predict various respiratory motions, including not only regular but also a variety of irregular breathing patterns, and thus can compensate for the adverse effect of the delay in dynamic irradiation systems for moving tumor tracking. A part of this work has been financially supported by Varian
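A small sketch of the residual step described above, under the assumption that the TVSAR prediction residual is already available as a 1-D array; statsmodels' `ar_select_order` stands in for the paper's information-criterion-based order choice, and the data and sampling rate are invented.

```python
import numpy as np
from statsmodels.tsa.ar_model import ar_select_order

# Hypothetical residual left over after a (not shown) TVSAR prediction of tumour motion.
rng = np.random.default_rng(3)
residual = np.convolve(rng.normal(0, 0.3, 600), [1.0, 0.6, 0.3], mode="same")

# Choose the AR order by an information criterion (AIC), as the abstract describes,
# then fit and extrapolate the residual a few samples ahead.
sel = ar_select_order(residual, maxlag=10, ic="aic")
fit = sel.model.fit()
horizon = 15                          # e.g. roughly 0.5 s ahead at 30 Hz sampling
pred = fit.predict(start=len(residual), end=len(residual) + horizon - 1)
print(sel.ar_lags, pred[:5])
```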

  13. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.

  14. Temporal aggregation in first order cointegrated vector autoregressive models

    DEFF Research Database (Denmark)

    La Cour, Lisbeth Funding; Milhøj, Anders

We study aggregation - or sample frequencies - of time series, e.g. aggregation from weekly to monthly or quarterly time series. Aggregation usually gives shorter time series but spurious phenomena, in e.g. daily observations, can on the other hand be avoided. An important issue is the effect of aggregation on the adjustment coefficient in cointegrated systems. We study only first order vector autoregressive processes for n dimensional time series Xt, and we illustrate the theory by a two dimensional and a four dimensional model for prices of various grades of gasoline

  15. Temporal aggregation in first order cointegrated vector autoregressive

    DEFF Research Database (Denmark)

    la Cour, Lisbeth Funding; Milhøj, Anders

    2006-01-01

We study aggregation - or sample frequencies - of time series, e.g. aggregation from weekly to monthly or quarterly time series. Aggregation usually gives shorter time series but spurious phenomena, in e.g. daily observations, can on the other hand be avoided. An important issue is the effect of aggregation on the adjustment coefficient in cointegrated systems. We study only first order vector autoregressive processes for n dimensional time series Xt, and we illustrate the theory by a two dimensional and a four dimensional model for prices of various grades of gasoline....

  16. Model reduction methods for vector autoregressive processes

    CERN Document Server

    Brüggemann, Ralf

    2004-01-01

1.1 Objective of the Study Vector autoregressive (VAR) models have become one of the dominant research tools in the analysis of macroeconomic time series during the last two decades. The great success of this modeling class started with Sims' (1980) critique of the traditional simultaneous equation models (SEM). Sims criticized the use of 'too many incredible restrictions' based on 'supposed a priori knowledge' in large scale macroeconometric models which were popular at that time. Therefore, he advocated largely unrestricted reduced form multivariate time series models, unrestricted VAR models in particular. Ever since his influential paper these models have been employed extensively to characterize the underlying dynamics in systems of time series. In particular, tools to summarize the dynamic interaction between the system variables, such as impulse response analysis or forecast error variance decompositions, have been developed over the years. The econometrics of VAR models and related quantities i...

  17. Linearity Testing in Time-Varying Smooth Transition Autoregressive Models under Unknown Degree of Persistency

    DEFF Research Database (Denmark)

    Sandberg, Rickard; Kruse, Robinson

    Building upon the work of Vogelsang (1998) and Harvey and Leybourne (2007) we derive tests that are invariant to the order of integration when the null hypothesis of linearity is tested in time-varying smooth transition models. As heteroscedasticity may lead to spurious rejections of the null...

  18. Vector Autoregressions with Parsimoniously Time Varying Parameters and an Application to Monetary Policy

    DEFF Research Database (Denmark)

    Callot, Laurent; Kristensen, Johannes Tang

the monetary policy response to inflation and business cycle fluctuations in the US by estimating a parsimoniously time varying parameter Taylor rule. We document substantial changes in the policy response of the Fed in the 1970s and 1980s, and since 2007, but also document the stability of this response...

  19. Analysis and generation of groundwater concentration time series

    Science.gov (United States)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.

  20. Incorporating measurement error in n = 1 psychological autoregressive modeling

    Science.gov (United States)

    Schuurman, Noémi K.; Houtveen, Jan H.; Hamaker, Ellen L.

    2015-01-01

    Measurement error is omnipresent in psychological data. However, the vast majority of applications of autoregressive time series analyses in psychology do not take measurement error into account. Disregarding measurement error when it is present in the data results in a bias of the autoregressive parameters. We discuss two models that take measurement error into account: An autoregressive model with a white noise term (AR+WN), and an autoregressive moving average (ARMA) model. In a simulation study we compare the parameter recovery performance of these models, and compare this performance for both a Bayesian and frequentist approach. We find that overall, the AR+WN model performs better. Furthermore, we find that for realistic (i.e., small) sample sizes, psychological research would benefit from a Bayesian approach in fitting these models. Finally, we illustrate the effect of disregarding measurement error in an AR(1) model by means of an empirical application on mood data in women. We find that, depending on the person, approximately 30–50% of the total variance was due to measurement error, and that disregarding this measurement error results in a substantial underestimation of the autoregressive parameters. PMID:26283988
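A toy simulation of the bias the authors describe, under obvious assumptions (true AR(1) with known parameters, white measurement noise); it only illustrates the attenuation of the naive AR estimate, not the AR+WN or Bayesian estimators from the paper.

```python
import numpy as np

rng = np.random.default_rng(11)
n, phi = 1000, 0.6                      # true autoregressive parameter

# Latent AR(1) process plus white measurement noise of similar magnitude.
latent = np.zeros(n)
for t in range(1, n):
    latent[t] = phi * latent[t - 1] + rng.normal(0, 1.0)
observed = latent + rng.normal(0, 1.0, n)     # roughly half of the variance is measurement error

def ar1_estimate(x):
    """Naive AR(1) estimate via lag-1 regression through the origin."""
    return np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)

print("phi estimated from latent series:  ", round(ar1_estimate(latent), 3))
print("phi estimated from observed series:", round(ar1_estimate(observed), 3))
# The second estimate is substantially attenuated toward zero, as discussed above.
```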

  1. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...... to the log-spectrum. We then propose two extensions. The first deals with replacing the logarithmic link with a more general Box-Cox link, which encompasses also the identity and the inverse links: this enables nesting alternative spectral estimation methods (autoregressive, exponential, etc.) under the same...
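A quick sketch of the cepstral idea mentioned here, under the assumption that the log-periodogram is an acceptable stand-in for the log spectral density; the series and the number of coefficients are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1024
# Simple autocorrelated series whose spectrum we want to summarize.
x = np.convolve(rng.normal(size=n), [1.0, 0.8, 0.64, 0.51], mode="same")

# Periodogram on the Fourier frequencies, then its logarithm.
pgram = np.abs(np.fft.rfft(x - x.mean())) ** 2 / n
log_pgram = np.log(pgram[1:-1])              # drop the endpoints to avoid log(0) issues

# Cepstral coefficients = Fourier (cosine) coefficients of the log-spectrum.
m = len(log_pgram)
k = np.arange(1, 6)[:, None]                 # first five cepstral coefficients
omegas = np.pi * np.arange(1, m + 1) / m
cepstrum = (2.0 / m) * (np.cos(k * omegas) @ log_pgram)
print(cepstrum)
```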

  2. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

Background: Emergency department (ED)-based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods: Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results: Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions: Time series methods applied to historical ED utilization data are an important tool

  3. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

In 2010, continuous radon measurement was established at the Mladeč Caves in the Czech Republic using a continuous radon monitor, RADIM3A. In order to model the radon time series over the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes which occurred during the RC measurement. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor.
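A compact sketch of a SARIMAX fit of the kind described here, assuming an hourly radon series and a lagged atmospheric covariate; the synthetic data, the (5,1,3) order and the daily seasonal period are placeholders, not the calibrated values from the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(2)
n = 24 * 60                               # 60 days of hourly observations
hours = pd.date_range("2010-01-01", periods=n, freq="h")

pressure = 1013 + 5 * np.sin(2 * np.pi * np.arange(n) / (24 * 14)) + rng.normal(0, 1, n)
radon = (200 + 30 * np.sin(2 * np.pi * np.arange(n) / 24)     # diurnal cycle
         - 2 * (pressure - 1013) + np.cumsum(rng.normal(0, 1, n)))

y = pd.Series(radon, index=hours)
exog = pd.Series(pressure, index=hours).shift(3).bfill()      # delayed atmospheric driver

# Seasonal ARIMA with an exogenous regressor; the orders mirror regARIMA(5,1,3)
# plus a daily seasonal term, purely as an illustration.
model = SARIMAX(y, exog=exog, order=(5, 1, 3), seasonal_order=(1, 0, 1, 24))
fit = model.fit(disp=False)
print(fit.summary().tables[1])
```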

  4. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

Background: The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods: In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results: The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion: Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  5. Kumaraswamy autoregressive moving average models for double bounded environmental data

    Science.gov (United States)

    Bayer, Fábio Mariano; Bayer, Débora Missio; Pumi, Guilherme

    2017-12-01

In this paper we introduce the Kumaraswamy autoregressive moving average (KARMA) models, a dynamic class of models for time series taking values in the double-bounded interval (a,b) and following the Kumaraswamy distribution. The Kumaraswamy family of distributions is widely applied in many areas, especially hydrology and related fields. Classical examples are time series representing rates and proportions observed over time. In the proposed KARMA model, the median is modeled by a dynamic structure containing autoregressive and moving average terms, time-varying regressors, unknown parameters and a link function. We introduce the new class of models and discuss conditional maximum likelihood estimation, hypothesis testing inference, diagnostic analysis and forecasting. In particular, we provide closed-form expressions for the conditional score vector and the conditional Fisher information matrix. An application to real environmental data is presented and discussed.

  6. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to any existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
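A bare-bones sketch of the final prediction step (k nearest lag-patterns vote on the next value); it omits the weighted self-constructing clustering itself, and the window length, k, and series are all arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(9)
series = np.sin(np.arange(300) * 0.2) + rng.normal(0, 0.1, 300)

window, k = 8, 5
# Training patterns: sliding windows of length `window`, each paired with the next value.
patterns = np.array([series[i:i + window] for i in range(len(series) - window)])
targets = series[window:]

# Input pattern: the most recent `window` observations (predicting time t + 1).
query = series[-window:]
dists = np.linalg.norm(patterns[:-1] - query, axis=1)   # exclude the most recent pattern
nearest = np.argsort(dists)[:k]

# Simple distance-weighted average of the k neighbours' next values.
weights = 1.0 / (dists[nearest] + 1e-9)
prediction = np.sum(weights * targets[nearest]) / np.sum(weights)
print(prediction)
```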

  7. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  8. Conventional and advanced time series estimation: application to the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database, 1993-2006.

    Science.gov (United States)

    Moran, John L; Solomon, Patricia J

    2011-02-01

Time series analysis has seen limited application in the biomedical literature. The utility of conventional and advanced time series estimators was explored for intensive care unit (ICU) outcome series. Monthly mean time series, 1993-2006, for hospital mortality, severity-of-illness score (APACHE III), ventilation fraction and patient type (medical and surgical), were generated from the Australia and New Zealand Intensive Care Society adult patient database. Analyses encompassed geographical seasonal mortality patterns, series structural time changes, mortality series volatility using autoregressive moving average and Generalized Autoregressive Conditional Heteroscedasticity models in which predicted variances are updated adaptively, and bivariate and multivariate (vector error correction models) cointegrating relationships between series. The mortality series exhibited marked seasonality, declining mortality trend and substantial autocorrelation beyond 24 lags. Mortality increased in winter months (July-August); the medical series featured annual cycling, whereas the surgical demonstrated long and short (3-4 months) cycling. Series structural breaks were apparent in January 1995 and December 2002. The covariance stationary first-differenced mortality series was consistent with a seasonal autoregressive moving average process; the observed conditional-variance volatility (1993-1995) and residual Autoregressive Conditional Heteroscedasticity effects entailed a Generalized Autoregressive Conditional Heteroscedasticity model, preferred by information criterion and mean model forecast performance. Bivariate cointegration, indicating long-term equilibrium relationships, was established between mortality and severity-of-illness scores at the database level and for categories of ICUs. Multivariate cointegration was demonstrated for {log APACHE III score, log ICU length of stay, ICU mortality and ventilation fraction}. A system approach to understanding series time
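For the cointegration step, a sketch using the Johansen test from statsmodels; the monthly series here are simulated stand-ins for the ICU variables named above (mortality, log APACHE III, ventilation fraction), so the lag order and deterministic-term choices are illustrative only.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(8)
n = 168                                   # 14 years of monthly observations
common_trend = np.cumsum(rng.normal(0, 1, n))

# Three I(1) series sharing one stochastic trend, i.e. cointegrated by construction.
data = pd.DataFrame({
    "mortality":     0.5 * common_trend + rng.normal(0, 0.5, n),
    "log_apache":    0.8 * common_trend + rng.normal(0, 0.5, n),
    "vent_fraction": 0.3 * common_trend + rng.normal(0, 0.5, n),
})

result = coint_johansen(data, det_order=0, k_ar_diff=1)
print("trace statistics:  ", np.round(result.lr1, 2))
print("5% critical values:", np.round(result.cvt[:, 1], 2))
```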

  9. Discretization of time series data.

    Science.gov (United States)

    Dimitrova, Elena S; Licona, M Paola Vera; McGee, John; Laubenbacher, Reinhard

    2010-06-01

An increasing number of algorithms for biochemical network inference from experimental data require discrete data as input. For example, dynamic Bayesian network methods and methods that use the framework of finite dynamical systems, such as Boolean networks, all take discrete input. Experimental data, however, are typically continuous and represented by computer floating point numbers. The translation from continuous to discrete data is crucial in preserving the variable dependencies and thus has a significant impact on the performance of the network inference algorithms. We compare the performance of two such algorithms that use discrete data using several different discretization algorithms. One of the inference methods uses a dynamic Bayesian network framework, the other a time- and state-discrete dynamical system framework. The discretization algorithms are quantile, interval discretization, and a new algorithm introduced in this article, SSD. SSD is especially designed for short time series data and is capable of determining the optimal number of discretization states. The experiments show that both inference methods perform better with SSD than with the other methods. In addition, SSD is demonstrated to preserve the dynamic features of the time series, as well as to be robust to noise in the experimental data. A C++ implementation of SSD is available from the authors at http://polymath.vbi.vt.edu/discretization.
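The two baseline discretizations mentioned (quantile and interval) are easy to sketch with pandas; SSD itself is not reproduced here, and the three-state choice is arbitrary.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
expression = pd.Series(np.cumsum(rng.normal(0, 1, 40)))   # short time series, e.g. one gene

states = 3
quantile_disc = pd.qcut(expression, q=states, labels=False)   # equal-frequency bins
interval_disc = pd.cut(expression, bins=states, labels=False) # equal-width bins

print(pd.DataFrame({"value": expression.round(2),
                    "quantile": quantile_disc,
                    "interval": interval_disc}).head(10))
```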

  10. Identification of the time series interrelationships with reference to ...

    African Journals Online (AJOL)

    In this study, the model of interest is that of a rational distributed lag function Y on X plus an independent Autoregressive Moving Average (ARMA) model. To investigate the model structure relating X and Y we considered the inverse cross correlation function for the observed and residual series in the presence of outliers.

  11. Detection of chaotic determinism in time series from randomly forced maps

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J

    1997-01-01

    Time series from biological system often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". Despite this effort, it has been difficult to establish the presence of chaos...... a method that appears to be useful in deciding whether determinism is present in a time series, and if this determinism has chaotic attributes, i.e., a positive characteristic exponent that leads to sensitivity to initial conditions. The method relies on fitting a nonlinear autoregressive model to the time...

  12. Stable Parameter Estimation for Autoregressive Equations with Random Coefficients

    Directory of Open Access Journals (Sweden)

    V. B. Goryainov

    2014-01-01

In recent years there has been a growing interest in nonlinear time series models. They are more flexible than traditional linear models and allow a more adequate description of real data. Among these models, the autoregressive model with random coefficients plays an important role. It is widely used in various fields of science and technology, for example in physics, biology, economics and finance. The model parameters are the mean values of the autoregressive coefficients, and their evaluation is the main task of model identification. The basic method of estimation is still the least squares method, which gives good results for Gaussian time series but is quite sensitive to even small disturbances in the assumption of Gaussian observations. In this paper we propose estimates which generalize the least squares estimate in the sense that the quadratic objective function is replaced by an arbitrary convex and even function. A reasonable choice of objective function allows one to keep the benefits of the least squares estimate and eliminate its shortcomings. In particular, the estimates can be made almost as efficient as the least squares estimate in the Gaussian case while losing almost nothing in accuracy under small deviations of the probability distribution of the observations from the Gaussian distribution. The main result is the proof of consistency and asymptotic normality of the proposed estimates in the particular case of the one-parameter model describing a stationary process with finite variance. Another important result is the derivation of the asymptotic relative efficiency of the proposed estimates with respect to the least squares estimate. This allows the two estimates to be compared, depending on the probability distribution of the innovation process and of the autoregressive coefficients. The results can be used to identify an autoregressive process, especially one of non-Gaussian nature, and/or autoregressive processes observed with gross
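A small sketch of the estimation idea (replace the squared loss with another convex, even function) for a first-order model; the Huber-type loss, the simulated data and the optimizer choice are all illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(6)
n, beta = 800, 0.5                     # beta = mean of the random AR coefficient

# Random-coefficient AR(1): x_t = (beta + u_t) * x_{t-1} + e_t, heavy-tailed innovations.
x = np.zeros(n)
for t in range(1, n):
    x[t] = (beta + rng.normal(0, 0.1)) * x[t - 1] + rng.standard_t(df=3)

def huber(r, c=1.345):
    """Convex, even objective: quadratic near zero, linear in the tails."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r**2, c * (a - 0.5 * c))

def objective(b):
    return np.sum(huber(x[1:] - b * x[:-1]))

ls_estimate = np.sum(x[1:] * x[:-1]) / np.sum(x[:-1] ** 2)     # ordinary least squares
m_estimate = minimize_scalar(objective, bounds=(-1, 1), method="bounded").x

print("least squares:", round(ls_estimate, 3), "  M-estimate:", round(m_estimate, 3))
```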

  13. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

The occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and the spatial and temporal variability of rainfall. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlations between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows larger lead times to be gained. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models give the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of the Campania Region civil
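A toy sketch of the alternating-renewal idea, assuming exponential storm and dry-spell durations and gamma-distributed hourly intensities; the distributions and parameters are placeholders, not the ones calibrated on the Campania gauges.

```python
import numpy as np

rng = np.random.default_rng(10)
hours, series, t = 24 * 90, [], 0          # simulate roughly three months of hourly rainfall

while t < hours:
    dry = int(rng.exponential(scale=30)) + 1      # dry spell duration (h)
    wet = int(rng.exponential(scale=6)) + 1       # storm duration (h)
    series.extend([0.0] * dry)                    # dry interval
    series.extend(rng.gamma(shape=0.8, scale=3.0, size=wet))   # storm intensities (mm/h)
    t += dry + wet

rain = np.array(series[:hours])
print("wet fraction:", round(np.mean(rain > 0), 3),
      "total rainfall (mm):", round(rain.sum(), 1))
```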

  14. Nonlinear Time Series and Neural-Network Models of Exchange Rates between the US Dollar and Major Currencies

    Directory of Open Access Journals (Sweden)

    David E. Allen

    2016-03-01

This paper features an analysis of major currency exchange rate movements in relation to the US dollar, as constituted in US dollar terms. The euro, British pound, Chinese yuan, and Japanese yen are modelled using a variety of non-linear models, including smooth transition regression models, logistic smooth transition regression models, threshold autoregressive models, nonlinear autoregressive models, and additive nonlinear autoregressive models, plus neural network models. The models are evaluated on the basis of error metrics for twenty-day out-of-sample forecasts using the mean average percentage errors (MAPE). The results suggest that there is no dominating class of time series models, and the different currency pairs' relationships with the US dollar are captured best by neural net regression models, over the ten-year sample of daily exchange rate returns data, from August 2005 to August 2015.

  15. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are particularly hard to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can detect the malicious packets. In this paper, a novel approach to detect application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify the attacks. Experiments with several databases were performed and the results show that this approach can detect application-layer DDoS attacks effectively.
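A schematic sketch of the detection pipeline described (entropy series to AR feature vectors to SVM), with synthetic entropy values and scikit-learn standing in for the paper's trained classifier; window length, AR order and labels are invented for illustration.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(12)

def ar_features(window, order=3):
    """Least-squares AR coefficients of one window, used as a feature vector."""
    X = np.column_stack([window[i:len(window) - order + i] for i in range(order)])
    y = window[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Synthetic HRPI (entropy) series: normal traffic is high-entropy, attack windows
# collapse toward low entropy because requests concentrate on few patterns.
normal = [rng.normal(3.0, 0.2, 60) for _ in range(80)]
attack = [rng.normal(1.5, 0.2, 60) + 0.5 * np.sin(np.arange(60)) for _ in range(80)]

X = np.array([ar_features(w) for w in normal + attack])
y = np.array([0] * len(normal) + [1] * len(attack))

clf = SVC(kernel="rbf").fit(X[::2], y[::2])        # train on half, test on the rest
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))
```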

  16. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  17. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  18. Application of time-lagged ensemble approach with auto-regressive processors to reduce uncertainties in peak discharge and timing

    Directory of Open Access Journals (Sweden)

    Kyung-Jin Kim

    2017-02-01

    An accuracy evaluation using observations from 2002 to 2009 found that the time-lagged ensemble approach alone produced significant bias but the AR processor reduced the relative error percentage of the peak discharge from 60% to 10% and also decreased the peak timing error from more than 10 h to less than 3 h, on average. The proposed methodology is easy and inexpensive to implement with the existing products and models and thus can be immediately activated until a new product for forecasted meteorological ensembles is officially issued in Korea.

  19. Global Population Density Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...

  20. Global Population Count Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Count Grid Time Series Estimates provide a back-cast time series of population grids based on the year 2000 population grid from SEDAC's Global...

  1. Time series modeling for analysis and control advanced autopilot and monitoring systems

    CERN Document Server

    Ohtsu, Kohei; Kitagawa, Genshiro

    2015-01-01

This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships' autopilot systems are considered throughout the entire book, the methods set forth can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system based on theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX) and the radial basis function net-type coefficient ARX model. The noise contribution analysis can then be performed through the estimated autoregressive (AR) model, and various types of autopilot systems can be designed through the state-space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...

  2. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated by "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context.
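A minimal sketch of one modification mentioned above (lagged log case counts as a regressor to absorb contagion-driven autocorrelation), assuming weekly counts and a single weather covariate; the data, lag, and quasi-Poisson-style scale handling are illustrative choices, not the article's exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(13)
n = 260                                       # five years of weekly counts
temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 1, n)

cases = np.zeros(n)
cases[0] = 10
for t in range(1, n):                         # counts depend on last week's cases and temperature
    mu = np.exp(0.8 * np.log(cases[t - 1] + 1) + 0.03 * temp[t] + 0.3)
    cases[t] = rng.poisson(mu)

df = pd.DataFrame({"cases": cases, "temp": temp})
df["log_lag_cases"] = np.log(df["cases"].shift(1) + 1)   # proxy for true contagion
df = df.dropna()

X = sm.add_constant(df[["temp", "log_lag_cases"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit(scale="X2")  # quasi-Poisson-style scale
print(fit.summary())
```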

  3. Memory and betweenness preference in temporal networks induced from time series

    Science.gov (United States)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Zheng, Rui; Hui, Pan

    2017-02-01

    We construct temporal networks from time series via unfolding the temporal information into an additional topological dimension of the networks. Thus, we are able to introduce memory entropy analysis to unravel the memory effect within the considered signal. We find distinct patterns in the entropy growth rate of the aggregate network at different memory scales for time series with different dynamics ranging from white noise, 1/f noise, autoregressive process, periodic to chaotic dynamics. Interestingly, for a chaotic time series, an exponential scaling emerges in the memory entropy analysis. We demonstrate that the memory exponent can successfully characterize bifurcation phenomenon, and differentiate the human cardiac system in healthy and pathological states. Moreover, we show that the betweenness preference analysis of these temporal networks can further characterize dynamical systems and separate distinct electrocardiogram recordings. Our work explores the memory effect and betweenness preference in temporal networks constructed from time series data, providing a new perspective to understand the underlying dynamical systems.

  4. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis.

    Science.gov (United States)

    Lutaif, N A; Palazzo, R; Gontijo, J A R

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  5. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lutaif, N.A. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Palazzo, R. Jr [Departamento de Telemática, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, Campinas, SP (Brazil); Gontijo, J.A.R. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil)

    2014-01-17

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  6. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Directory of Open Access Journals (Sweden)

    N.A. Lutaif

    2014-01-01

    Full Text Available Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  7. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    International Nuclear Information System (INIS)

    Lutaif, N.A.; Palazzo, R. Jr; Gontijo, J.A.R.

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  8. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  9. A review of subsequence time series clustering.

    Science.gov (United States)

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  10. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured in the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that wind speed measured at 70 ft was the most variable. Further, the turbulence persisted longer at the 70-ft measurement height than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence to an 'own' shock and induced the greatest persistence in the responses of the other wind series. (author)
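
    A minimal sketch of this kind of analysis, assuming synthetic wind-speed series at four heights and the VAR implementation in statsmodels; note that statsmodels provides orthogonalized impulse responses rather than the generalized impulse responses used in the paper, and the series names and lag choices are illustrative.

        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(1)
        n = 2000
        # Synthetic stand-ins for wind speed at four heights; real data would be read from file.
        base = np.cumsum(rng.normal(0, 0.1, n)) + 10.0
        data = pd.DataFrame({
            "h13ft": base + rng.normal(0, 0.5, n),
            "h33ft": base + rng.normal(0, 0.6, n),
            "h70ft": base + rng.normal(0, 0.9, n),
            "h160ft": base + rng.normal(0, 0.7, n),
        })

        model = VAR(data.diff().dropna())      # difference to work with (approximately) stationary series
        results = model.fit(maxlags=10, ic="aic")
        print(results.summary())

        irf = results.irf(20)                  # impulse responses over 20 steps
        irf.plot(orth=True)                    # orthogonalized IRFs (generalized IRFs are not built in)
        plt.show()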

  11. Volatility Behaviors of Financial Time Series by Percolation System on Sierpinski Carpet Lattice

    Science.gov (United States)

    Pei, Anqi; Wang, Jun

    2015-01-01

    The financial time series is simulated and investigated by the percolation system on the Sierpinski carpet lattice, where percolation is usually employed to describe the behavior of connected clusters in a random graph, and the Sierpinski carpet lattice is a graph corresponding to the fractal Sierpinski carpet. To study the fluctuation behavior of returns for the financial model and the Shanghai Composite Index, we establish a daily volatility measure, the multifractal volatility (MFV) measure, to obtain MFV series, which have long-range cross-correlations with squared daily return series. The autoregressive fractionally integrated moving average (ARFIMA) model is used to analyze the MFV series, which performs better when compared with other volatility series. In a comparative study of the multifractality and volatility of the data, the simulation data of the proposed model exhibit behavior very similar to that of the real stock index, which indicates a degree of rationality of the model for market applications.

  12. Time-varying surrogate data to assess nonlinearity in nonstationary time series: application to heart rate variability.

    Science.gov (United States)

    Faes, Luca; Zhao, He; Chon, Ki H; Nollo, Giandomenico

    2009-03-01

    We propose a method to extend to time-varying (TV) systems the procedure for generating typical surrogate time series, in order to test the presence of nonlinear dynamics in potentially nonstationary signals. The method is based on fitting a TV autoregressive (AR) model to the original series and then regressing the model coefficients with random replacements of the model residuals to generate TV AR surrogate series. The proposed surrogate series were used in combination with a TV sample entropy (SE) discriminating statistic to assess nonlinearity in both simulated and experimental time series, in comparison with traditional time-invariant (TIV) surrogates combined with the TIV SE discriminating statistic. Analysis of simulated time series showed that using TIV surrogates, linear nonstationary time series may be erroneously regarded as nonlinear and weak TV nonlinearities may remain unrevealed, while the use of TV AR surrogates markedly increases the probability of a correct interpretation. Application to short (500 beats) heart rate variability (HRV) time series recorded at rest (R), after head-up tilt (T), and during paced breathing (PB) showed: 1) modifications of the SE statistic that were well interpretable with the known cardiovascular physiology; 2) significant contribution of nonlinear dynamics to HRV in all conditions, with significant increase during PB at 0.2 Hz respiration rate; and 3) a disagreement between TV AR surrogates and TIV surrogates in about a quarter of the series, suggesting that nonstationarity may affect HRV recordings and bias the outcome of the traditional surrogate-based nonlinearity test.
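
    The simpler time-invariant variant of this surrogate construction can be sketched as follows: fit an AR model, shuffle its residuals, and refilter them through the fitted coefficients. The time-varying version in the paper refits the coefficients over time; this sketch, with an arbitrary AR order and a toy AR(1) signal standing in for an HRV series, only illustrates the baseline procedure.

        import numpy as np
        from statsmodels.tsa.ar_model import AutoReg

        def ar_surrogate(x, lags=8, seed=None):
            """Generate one time-invariant AR surrogate by refiltering shuffled residuals."""
            rng = np.random.default_rng(seed)
            fit = AutoReg(x, lags=lags).fit()
            shuffled = rng.permutation(fit.resid)
            const, coefs = fit.params[0], fit.params[1:]   # intercept, then lag-1..lag-p coefficients
            surr = np.array(x[:lags], dtype=float)         # seed the recursion with the original start
            for e in shuffled:
                nxt = const + coefs @ surr[-lags:][::-1] + e
                surr = np.append(surr, nxt)
            return surr

        rng = np.random.default_rng(2)
        x = np.zeros(600)
        for t in range(1, 600):                            # toy AR(1) signal
            x[t] = 0.8 * x[t - 1] + rng.normal()
        s = ar_surrogate(x, lags=8, seed=3)
        print(len(s), np.std(x), np.std(s))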

  13. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...

  14. MCMC methods for financial time series

    OpenAIRE

    Tritová, Hana

    2016-01-01

    This thesis focuses on estimating parameters of appropriate model for daily returns using the Markov Chain Monte Carlo method (MCMC) and Bayesian statistics. We describe MCMC methods, such as Gibbs sampling and Metropolis- Hastings algorithm and their basic properties. After that, we introduce different financial models. Particularly we focus on the lognormal autoregressive model. Later we theoretically apply Gibbs sampling to lognormal autoregressive model using principles of Bayesian statis...
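
    For intuition, a conjugate Gibbs sampler for a plain Gaussian AR(1) model (not the lognormal autoregressive model of the thesis) can be written in a few lines; the priors and simulated data below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(4)

        # Simulate a toy AR(1) series standing in for daily returns.
        n, phi_true, sigma_true = 500, 0.6, 0.2
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi_true * x[t - 1] + rng.normal(0, sigma_true)

        y, ylag = x[1:], x[:-1]
        a0, b0, tau2 = 2.0, 0.1, 1.0          # weakly informative priors (illustrative choices)
        phi, sigma2 = 0.0, 1.0
        draws = []
        for it in range(5000):
            # phi | sigma2, data  (conjugate normal update)
            v = 1.0 / (ylag @ ylag / sigma2 + 1.0 / tau2)
            m = v * (ylag @ y) / sigma2
            phi = rng.normal(m, np.sqrt(v))
            # sigma2 | phi, data  (conjugate inverse-gamma update)
            resid = y - phi * ylag
            a_post = a0 + 0.5 * len(y)
            b_post = b0 + 0.5 * resid @ resid
            sigma2 = 1.0 / rng.gamma(a_post, 1.0 / b_post)
            draws.append((phi, sigma2))

        post = np.array(draws[1000:])          # discard burn-in
        print("posterior mean phi %.3f, sigma %.3f" % (post[:, 0].mean(), np.sqrt(post[:, 1].mean())))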

  15. Compact and accurate linear and nonlinear autoregressive moving average model parameter estimation using laguerre functions

    DEFF Research Database (Denmark)

    Chon, K H; Cohen, R J; Holstein-Rathlou, N H

    1997-01-01

    A linear and nonlinear autoregressive moving average (ARMA) identification algorithm is developed for modeling time series data. The algorithm uses Laguerre expansion of kernels (LEK) to estimate Volterra-Wiener kernels. However, instead of estimating linear and nonlinear system dynamics via movi...

  16. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  17. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives, so being able to analyse them is an important task. Most time series we can think of are quite noisy, and this noise is one of the main obstacles to extracting information from them. In this work we use Trend Filtering techniques to try to remove this noise from a series and understand the underlying trend of the series, which gives us information about the behaviour of the series aside from the particular...

  18. Univariate Time Series Prediction of Solar Power Using a Hybrid Wavelet-ARMA-NARX Prediction Method

    Energy Technology Data Exchange (ETDEWEB)

    Nazaripouya, Hamidreza; Wang, Yubo; Chu, Chi-Cheng; Pota, Hemanshu; Gadh, Rajit

    2016-05-02

    This paper proposes a new hybrid method for super short-term solar power prediction. Solar output power usually has a complex, nonstationary, and nonlinear characteristic due to the intermittent and time-varying behavior of solar radiance. In addition, solar power dynamics are fast and essentially without inertia. An accurate super short-term prediction is required to compensate for the fluctuations and reduce the impact of solar power penetration on the power system. The objective is to predict one-step-ahead solar power generation based only on historical solar power time series data. The proposed method incorporates discrete wavelet transform (DWT), Auto-Regressive Moving Average (ARMA) models, and Recurrent Neural Networks (RNN), while the RNN architecture is based on Nonlinear Auto-Regressive models with eXogenous inputs (NARX). The wavelet transform is utilized to decompose the solar power time series into a set of better-behaved constituent series for prediction. The ARMA model is employed as a linear predictor while NARX is used as a nonlinear pattern recognition tool to estimate and compensate for the error of the wavelet-ARMA prediction. The proposed method is applied to the data captured from UCLA solar PV panels and the results are compared with some of the common and most recent solar power prediction methods. The results validate the effectiveness of the proposed approach and show a considerable improvement in the prediction precision.
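
    A rough sketch of the wavelet-plus-ARMA part of such a pipeline (the NARX error-correction stage is omitted) is shown below, assuming PyWavelets and statsmodels and a toy series in place of the UCLA PV data; the wavelet choice, decomposition level and ARMA orders are arbitrary illustrative settings.

        import numpy as np
        import pywt
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(5)
        t = np.arange(512)
        # Toy stand-in for a solar power series: clipped diurnal shape plus noise.
        power = np.clip(np.sin(2 * np.pi * t / 96), 0, None) + 0.1 * rng.normal(size=t.size)

        # 1) Wavelet decomposition into an approximation and detail bands.
        wavelet, level = "db4", 3
        coeffs = pywt.wavedec(power, wavelet, level=level)

        # 2) Rebuild each band as a full-length time-domain component (other bands zeroed out).
        components = []
        for i in range(len(coeffs)):
            kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            components.append(pywt.waverec(kept, wavelet)[: len(power)])

        # 3) Fit a small ARMA model to each component and sum the one-step-ahead forecasts.
        forecast = sum(ARIMA(comp, order=(2, 0, 1)).fit().forecast(1)[0] for comp in components)
        print("one-step-ahead wavelet-ARMA forecast:", forecast)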

  19. Generalizing smooth transition autoregressions

    DEFF Research Database (Denmark)

    Chini, Emilio Zanetti

    We introduce a variant of the smooth transition autoregression - the GSTAR model - capable of parametrizing the asymmetry in the tails of the transition equation by using a particular generalization of the logistic function. A General-to-Specific modelling strategy is discussed in detail...

  20. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  1. Research on power grid loss prediction model based on Granger causality property of time series

    Energy Technology Data Exchange (ETDEWEB)

    Wang, J. [North China Electric Power Univ., Beijing (China); State Grid Corp., Beijing (China); Yan, W.P.; Yuan, J. [North China Electric Power Univ., Beijing (China); Xu, H.M.; Wang, X.L. [State Grid Information and Telecommunications Corp., Beijing (China)

    2009-03-11

    This paper described a method of predicting power transmission line losses using the Granger causality property of time series. The stationarity of the time series was investigated using unit root tests. The Granger causality relationship between line losses and other variables was then determined. Granger-caused time series were then used to create the following 3 prediction models: (1) a model based on line loss binomials that used electricity sales as the predictor variable, (2) a model that considered both power sales and grid capacity, and (3) a model based on autoregressive distributed lag (ARDL) approaches that incorporated both power sales and the square of power sales as variables. A case study of data from China's electric power grid between 1980 and 2008 was used to evaluate model performance. Results of the study showed that the model error rates ranged between 2.7 and 3.9 percent. 6 refs., 3 tabs., 1 fig.
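
    The unit-root and Granger-causality steps of such a workflow can be sketched with statsmodels as follows; the synthetic sales and loss series, lag orders and column names are illustrative assumptions, not the paper's data.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import adfuller, grangercausalitytests

        rng = np.random.default_rng(6)
        n = 200
        sales = np.cumsum(rng.normal(1.0, 0.5, n)) + 100                 # synthetic power-sales series
        losses = 0.03 * np.roll(sales, 2) + rng.normal(0, 0.2, n)        # losses driven by lagged sales
        df = pd.DataFrame({"losses": losses, "sales": sales}).iloc[5:]   # trim the wrap-around from np.roll

        # Step 1: unit-root (stationarity) check; difference if the test fails to reject.
        for col in df:
            stat, pval = adfuller(df[col])[:2]
            print(col, "ADF p-value %.3f" % pval)
        d = df.diff().dropna()

        # Step 2: does 'sales' Granger-cause 'losses'? (second column is the candidate cause)
        grangercausalitytests(d[["losses", "sales"]], maxlag=4)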

  2. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    Science.gov (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 reflected the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
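
    The reported ARIMA(1,1,1)×(0,1,1)12 specification can be reproduced in outline with the SARIMAX class in statsmodels; the monthly counts below are synthetic stand-ins for the ISSO accident data, and the MAPE computation is only a crude in-sample version.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(7)
        months = pd.date_range("2000-01", periods=132, freq="MS")
        # Synthetic monthly accident counts with a mild trend and yearly seasonality (illustration only).
        y = pd.Series(1476 + 2 * np.arange(132) + 200 * np.sin(2 * np.pi * np.arange(132) / 12)
                      + rng.normal(0, 120, 132), index=months)

        # The (1,1,1)x(0,1,1)_12 specification reported in the abstract.
        model = SARIMAX(y, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
        fit = model.fit(disp=False)
        forecast = fit.forecast(steps=12)
        mape = np.mean(np.abs(fit.resid.iloc[13:] / y.iloc[13:])) * 100   # crude in-sample MAPE
        print(fit.summary())
        print("12-month forecast:\n", forecast.round(0))
        print("in-sample MAPE: %.1f%%" % mape)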

  3. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    Science.gov (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q)(P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of first-order autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. Conclusions The final model showed that time series analysis with ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 reflected the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  4. BSMART: a Matlab/C toolbox for analysis of multichannel neural time series.

    Science.gov (United States)

    Cui, Jie; Xu, Lei; Bressler, Steven L; Ding, Mingzhou; Liang, Hualou

    2008-10-01

    We have developed a Matlab/C toolbox, Brain-SMART (System for Multivariate AutoRegressive Time series, or BSMART), for spectral analysis of continuous neural time series data recorded simultaneously from multiple sensors. Available functions include time series data importing/exporting, preprocessing (normalization and trend removal), AutoRegressive (AR) modeling (multivariate/bivariate model estimation and validation), spectral quantity estimation (auto power, coherence and Granger causality spectra), network analysis (including coherence and causality networks) and visualization (including data, power, coherence and causality views). The tools for investigating causal network structures in respect of frequency bands are unique functions provided by this toolbox. All functionality has been integrated into a simple and user-friendly graphical user interface (GUI) environment designed for easy accessibility. Although we have tested the toolbox only on Windows and Linux operating systems, BSMART itself is system independent. This toolbox is freely available (http://www.brain-smart.org) under the GNU public license for open source development.

  5. Adaptive spline autoregression threshold method in forecasting Mitsubishi car sales volume at PT Srikandi Diamond Motors

    Science.gov (United States)

    Susanti, D.; Hartini, E.; Permana, A.

    2017-01-01

    Growing competition in vehicle sales and purchases among companies in Indonesia means that every company needs proper planning in order to win the competition with other companies. One way to support such planning is to forecast car sales for the next few periods, so that the inventory of cars to be sold is proportional to the number of cars needed. One of the methods that can be used to obtain accurate forecasts is Adaptive Spline Threshold Autoregression (ASTAR). This paper therefore focuses on the use of the ASTAR method in forecasting the volume of car sales at PT Srikandi Diamond Motors using time series data. In this study, forecasting with the ASTAR method produced reasonably accurate results.

  6. Seismic assessment of a site using the time series method

    International Nuclear Information System (INIS)

    Krutzik, N.J.; Rotaru, I.; Bobei, M.; Mingiuc, C.; Serban, V.; Androne, M.

    1997-01-01

    To increase the safety of an NPP located on a seismic site, the seismic acceleration level to which the NPP is qualified must be as representative as possible of that site, with a conservative but not exaggerated degree of safety. Treating the seismic events affecting the site as independent events and using statistical methods to define safety levels with a very low annual occurrence probability (10⁻⁴) may exaggerate the seismic safety level. Using very high seismic acceleration values imposed by the safety levels required by the hazard analysis may lead to very costly technical solutions that make plant operation more difficult and increase maintenance costs. Treating seismic events as a time series with dependence among the events may lead to a more representative assessment of the seismic activity of an NPP site and, consequently, to a prognosis of the seismic levels for which the NPP should be ensured throughout its life-span. That prognosis should consider the actual seismic activity (including small earthquakes in real time) of the foci that affect the plant site. The paper proposes the application of autoregressive time series models to produce a prognosis of the seismic activity of a focus, and presents an analysis, by this method, of the Vrancea focus that affects the NPP Cernavoda site. The paper also presents the manner in which the focus activity is analysed under the new approach, and assesses the maximum seismic acceleration that may affect NPP Cernavoda throughout its life-span (∼30 years). Development and application of new mathematical analysis methods, for both long and short time intervals, may make important contributions to forecasting future seismic events. (authors)

  7. Interpolation in Time Series: An Introductive Overview of Existing Methods, Their Performance Criteria and Uncertainty Assessment

    Directory of Open Access Journals (Sweden)

    Mathieu Lepot

    2017-10-01

    Full Text Available A thorough review has been performed on interpolation methods to fill gaps in time-series, efficiency criteria, and uncertainty quantifications. On one hand, there are numerous available methods: interpolation, regression, autoregressive, machine learning methods, etc. On the other hand, there are many methods and criteria to estimate efficiencies of these methods, but uncertainties on the interpolated values are rarely calculated. Furthermore, while they are estimated according to standard methods, the prediction uncertainty is not taken into account: a discussion is thus presented on the uncertainty estimation of interpolated/extrapolated data. Finally, some suggestions for further research and a new method are proposed.
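
    A small example of gap filling with several interpolators and a simple efficiency criterion, assuming pandas (with SciPy for the spline) and an artificially introduced gap so that the true values are known; the method choices are illustrative and not a recommendation from the review.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(8)
        idx = pd.date_range("2020-01-01", periods=200, freq="h")
        x = pd.Series(np.sin(np.arange(200) / 10) + 0.1 * rng.normal(size=200), index=idx)
        x.iloc[60:75] = np.nan                                      # introduce an artificial gap

        filled_linear = x.interpolate(method="linear")
        filled_spline = x.interpolate(method="spline", order=3)     # requires SciPy
        filled_time = x.interpolate(method="time")

        # Compare the fillings on the gap with a simple efficiency criterion (RMSE vs. the
        # noise-free truth, available here only because the gap was introduced artificially).
        truth = np.sin(np.arange(60, 75) / 10)
        for name, f in [("linear", filled_linear), ("spline", filled_spline), ("time", filled_time)]:
            rmse = np.sqrt(np.mean((f.iloc[60:75].to_numpy() - truth) ** 2))
            print(name, "RMSE %.4f" % rmse)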

  8. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  9. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  10. Time-frequency analysis of econometric time series

    Science.gov (United States)

    Corinaldi, Sharif; Cohen, Leon

    2007-06-01

    We review the basic concepts of time-frequency analysis, methods that indicate not only which frequencies are present in a time series but also when they existed. A number of examples are given to illustrate the possible application of these methods to econometric series. The methods are applied to the Beveridge Wheat Price Series.

  11. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatigue during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the time dependence of the Shannon entropy. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self-organization) at about 0.01 s.
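
    The windowed Shannon-entropy computation described here can be sketched as follows, with a toy EMG-like signal whose variability decays to mimic fatigue; the window length, bin count and fatigue model are illustrative assumptions.

        import numpy as np

        def shannon_entropy(window, bins=32):
            """Shannon entropy (in bits) of the amplitude distribution in one window."""
            hist, _ = np.histogram(window, bins=bins)
            p = hist / hist.sum()
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        rng = np.random.default_rng(9)
        fs = 1000                                  # 1 kHz sampling, i.e. one sample per millisecond
        t = np.arange(60 * fs) / fs                # one minute -> 60,000 samples
        # Toy EMG-like signal whose variability decays over time to mimic fatigue.
        signal = rng.normal(size=t.size) * (1.0 - 0.5 * t / t.max())

        win = 2 * fs                               # 2-second windows
        entropies = [shannon_entropy(signal[i:i + win]) for i in range(0, len(signal) - win, win)]
        print(["%.2f" % e for e in entropies])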

  12. A Note on the Properties of Generalised Separable Spatial Autoregressive Process

    Directory of Open Access Journals (Sweden)

    Mahendran Shitan

    2009-01-01

    Full Text Available Spatial modelling has its applications in many fields like geology, agriculture, meteorology, geography, and so forth. In time series a class of models known as Generalised Autoregressive (GAR) has been introduced by Peiris (2003) that includes an index parameter δ. It has been shown that the inclusion of this additional parameter aids in modelling and forecasting many real data sets. This paper studies the properties of a new class of spatial autoregressive process of order 1 with an index. We will call this a Generalised Separable Spatial Autoregressive (GENSSAR) Model. The spectral density function (SDF), the autocovariance function (ACVF), and the autocorrelation function (ACF) are derived. The theoretical ACF and SDF plots are presented as three-dimensional figures.

  13. Efficient Bayesian inference for natural time series using ARFIMA processes

    Science.gov (United States)

    Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas

    2016-04-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
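
    One way to approximate ARFIMA-style modelling without a dedicated package is to apply a truncated fractional-differencing filter and then fit a short-memory ARMA model to the filtered series; this sketch is not the Bayesian procedure of the abstract, and the memory parameter d, truncation length and toy data are assumptions.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def frac_diff(x, d, n_weights=100):
            """Truncated fractional differencing filter (1-B)^d via its binomial expansion."""
            w = [1.0]
            for k in range(1, n_weights):
                w.append(-w[-1] * (d - k + 1) / k)
            w = np.array(w)
            return np.array([w @ x[i - len(w) + 1:i + 1][::-1] for i in range(len(w) - 1, len(x))])

        rng = np.random.default_rng(10)
        # Toy persistent series standing in for a long-memory record such as the Nile levels.
        x = np.cumsum(rng.normal(size=2000)) * 0.05 + rng.normal(size=2000)

        d = 0.3                                    # assumed memory parameter for the sketch
        y = frac_diff(x, d)
        fit = ARIMA(y, order=(1, 0, 1)).fit()      # short-memory ARMA(1,1) on the filtered series
        print(fit.summary())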

  14. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provide a high-dimensional data vector that challenges the learning…

  15. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  16. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in the recent years. By mapping mono/multivariate time series into networks, one can investigate both it's microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.
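
    A direct (quadratic-time) construction of the natural visibility graph can be sketched with networkx; the random series stands in for one segment of a market record, and the implementation is a generic textbook version, not the authors' code.

        import numpy as np
        import networkx as nx

        def visibility_graph(x):
            """Natural visibility graph: nodes are time points, edges join mutually 'visible' values."""
            n = len(x)
            g = nx.Graph()
            g.add_nodes_from(range(n))
            for a in range(n):
                for b in range(a + 1, n):
                    # (a, b) are connected if every intermediate point lies below the line joining them.
                    visible = all(
                        x[c] < x[a] + (x[b] - x[a]) * (c - a) / (b - a)
                        for c in range(a + 1, b)
                    )
                    if visible:
                        g.add_edge(a, b)
            return g

        rng = np.random.default_rng(11)
        series = rng.normal(size=200)
        g = visibility_graph(series)
        degrees = np.array([d for _, d in g.degree()])
        print("nodes", g.number_of_nodes(), "edges", g.number_of_edges(), "mean degree %.2f" % degrees.mean())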

  17. How to statistically analyze nano exposure measurement results: using an ARIMA time series approach

    International Nuclear Information System (INIS)

    Klein Entink, Rinke H.; Fransman, Wouter; Brouwer, Derk H.

    2011-01-01

    Measurement strategies for exposure to nano-sized particles differ from traditional integrated sampling methods for exposure assessment by the use of real-time instruments. The resulting measurement series is a time series, where typically the sequential measurements are not independent from each other but show a pattern of autocorrelation. This article addresses the statistical difficulties when analyzing real-time measurements for exposure assessment to manufactured nano objects. To account for autocorrelation patterns, Autoregressive Integrated Moving Average (ARIMA) models are proposed. A simulation study shows the pitfalls of using a standard t-test and the application of ARIMA models is illustrated with three real-data examples. Some practical suggestions for the data analysis of real-time exposure measurements conclude this article.

  18. Methods comparison by time series analysis

    International Nuclear Information System (INIS)

    Giovino, J.

    1986-01-01

    One role of the U.S. Environmental Protection Agency (EPA) is that of monitor for laboratories under contract to perform chemical analyses. In general this program involves periodic analyses and reporting of unknown radionuclides in water. The radiochemistry data for the years 1980-1984 have been summarized. They represent several radionuclides and various methods used by numerous laboratories. Any series of measurements taken at successive time points is a time series, and is thus a candidate for time series analysis. The purpose of such an analysis is to see what changes take place over time in the event being observed, to see if the performance is better or worse than it was expected to be, and to predict future behavior. To illustrate the step-by-step process of a time series analysis, the radionuclide ²²⁶Ra was selected. The available data were generated by two methods: total radium alpha and ²²²Rn emanation. The results of the analysis are presented.

  19. Data Mining Smart Energy Time Series

    Directory of Open Access Journals (Sweden)

    Janina POPEANGA

    2015-07-01

    Full Text Available With the advent of smart metering technology, the amount of energy data will increase significantly, and the utilities industry will have to face another big challenge: finding relationships within time-series data and, even more, analyzing huge numbers of time series to find useful patterns and trends with fast or even real-time response. This study provides a small review of the literature in the field, aiming to demonstrate how essential the application of data mining techniques to time series is for making the best use of this large quantity of data, despite all the difficulties. The most important Time Series Data Mining techniques are also presented, highlighting their applicability in the energy domain.

  20. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non- stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.

  1. Kepler AutoRegressive Planet Search

    Science.gov (United States)

    Caceres, Gabriel Antonio; Feigelson, Eric

    2016-01-01

    The Kepler AutoRegressive Planet Search (KARPS) project uses statistical methodology associated with autoregressive (AR) processes to model Kepler lightcurves in order to improve exoplanet transit detection in systems with high stellar variability. We also introduce a planet-search algorithm to detect transits in time-series residuals after application of the AR models. One of the main obstacles in detecting faint planetary transits is the intrinsic stellar variability of the host star. The variability displayed by many stars may have autoregressive properties, wherein later flux values are correlated with previous ones in some manner. Our analysis procedure consists of three steps: pre-processing of the data to remove discontinuities, gaps and outliers; AR-type model selection and fitting; and transit signal search of the residuals using a new Transit Comb Filter (TCF) that replaces traditional box-finding algorithms. The analysis procedures of the project are applied to a portion of the publicly available Kepler light curve data for the full 4-year mission duration. Tests of the methods have been made on a subset of Kepler Objects of Interest (KOI) systems, classified both as planetary `candidates' and `false positives' by the Kepler Team, as well as a random sample of unclassified systems. We find that the ARMA-type modeling successfully reduces the stellar variability, by a factor of 10 or more in active stars and by smaller factors in more quiescent stars. A typical quiescent Kepler star has an interquartile range (IQR) of ~10 e-/sec, which may improve slightly after modeling, while those with IQR ranging from 20 to 50 e-/sec have improvements from 20% up to 70%. High activity stars (IQR exceeding 100) markedly improve. A periodogram based on the TCF is constructed to concentrate the signal of these periodic spikes. When a periodic transit is found, the model is displayed on a standard period-folded averaged light curve. Our findings to date on real

  2. Measuring multiscaling in financial time-series

    International Nuclear Information System (INIS)

    Buonocore, R.J.; Aste, T.; Di Matteo, T.

    2016-01-01

    We discuss the origin of multiscaling in financial time-series and investigate how to best quantify it. Our methodology consists in separating the different sources of measured multifractality by analyzing the multi/uni-scaling behavior of synthetic time-series with known properties. We use the results from the synthetic time-series to interpret the measure of multifractality of real log-returns time-series. The main finding is that the aggregation horizon of the returns can introduce a strong bias effect on the measure of multifractality. This effect can become especially important when returns distributions have power law tails with exponents in the range (2, 5). We discuss the right aggregation horizon to mitigate this bias.

  3. Detecting nonlinear structure in time series

    International Nuclear Information System (INIS)

    Theiler, J.

    1991-01-01

    We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of ''surrogate'' data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs

  4. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  5. Simulating multivariate time series using flocking

    OpenAIRE

    Schruben, Lee W.; Singham, Dashi I.

    2010-01-01

    Refereed Conference Paper Notions from agent based modeling (ABM) can be used to simulate multivariate time series. An example is given using the ABM concept of flocking, which models the behaviors of birds (called boids) in a flock. A multivariate time series is mapped into the coordinates of a bounded orthotope. This represents the flight path of a boid. Other boids are generated that flock around this data boid. The coordinates of these new boids are mapped back to simulate replicates o...

  6. DROP: Dimensionality Reduction Optimization for Time Series

    OpenAIRE

    Suri, Sahaana; Bailis, Peter

    2017-01-01

    Dimensionality reduction is critical in analyzing increasingly high-volume, high-dimensional time series. In this paper, we revisit a now-classic study of time series dimensionality reduction operators and find that for a given quality constraint, Principal Component Analysis (PCA) uncovers representations that are over 2x smaller than those obtained via alternative techniques favored in the literature. However, as classically implemented via Singular Value Decomposition (SVD), PCA is incredi...

  7. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  8. for nonlinear autoregressive

    African Journals Online (AJOL)

    2017-09-10

    This paper presents a comparison between the Binary Artificial Bee Colony (BABC) and Binary Particle Swarm Optimization (BPSO) algorithms for structure selection of a Nonlinear AutoRegressive Model (NAR) of the chaotic Mackey-Glass time-series data. Both algorithms are swarm-based in nature, with BABC mimicking bee colonies and BPSO the swarming behavior of birds.

  9. Application of semi parametric modelling to times series forecasting: case of the electricity consumption

    International Nuclear Information System (INIS)

    Lefieux, V.

    2007-10-01

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electricity transmission grid, needs an accurate forecast of the power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, which is commonly called the curse of dimensionality. Recently, semi-parametric methods which improve on the pure nonparametric approach have been proposed to estimate a regression function. Based on the concept of 'dimension reduction', one of those methods (called MAVE: Moving Average -conditional- Variance Estimate) can be applied to time series. We study empirically its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semi-parametric model, based on the MAVE method, which allows us to take into account simultaneously the autoregressive aspect of the problem and the exogenous variables. The proposed estimation procedure is practically efficient. (author)

  10. Parametric time series analysis of geoelectrical signals: an application to earthquake forecasting in Southern Italy

    Directory of Open Access Journals (Sweden)

    V. Tramutoli

    1996-06-01

    Full Text Available An autoregressive model was selected to describe geoelectrical time series. An objective technique was subsequently applied to analyze and discriminate values above (below) an a priori fixed threshold possibly related to seismic events. A complete check of the model and the main guidelines to estimate the occurrence probability of extreme events are reported. A first application of the proposed technique is discussed through the analysis of the experimental data recorded by an automatic station located in Tito, a small town on the Apennine chain in Southern Italy. This region was hit by the November 1980 Irpinia-Basilicata earthquake and it is one of the most active areas of the Mediterranean region. After a preliminary filtering procedure to reduce the influence of external parameters (i.e. the meteo-climatic effects), it was demonstrated that the geoelectrical residual time series are well described by means of a second order autoregressive model. Our findings outline a statistical methodology to evaluate the efficiency of electrical seismic precursors.

  11. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
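
    The time-averaged MSD of a log-price series can be computed directly; the geometric Brownian motion below is a stand-in for a daily index level, and the drift and volatility values are arbitrary illustrative choices.

        import numpy as np

        def time_averaged_msd(x, max_lag):
            """Time-averaged mean squared displacement over lag times 1..max_lag."""
            return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in range(1, max_lag + 1)])

        rng = np.random.default_rng(12)
        n, mu, sigma = 5000, 0.05 / 252, 0.2 / np.sqrt(252)
        # Geometric Brownian motion as a stand-in for a daily index level.
        log_price = np.cumsum(mu + sigma * rng.normal(size=n))
        price = 100 * np.exp(log_price)

        msd = time_averaged_msd(log_price, max_lag=50)
        for lag in (1, 10, 50):
            print("lag", lag, "time-averaged MSD %.6f" % msd[lag - 1])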

  12. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP...... queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regards to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume...... are either costly or require continuous maintenance. In this paper we propose an approach for approximate OLAP querying of time series that offers constant latency and is maintenance-free. To achieve this, we identify similarities between aggregation cuboids and propose algorithms that eliminate...

  13. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition ""…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics."" -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts.    Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  14. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from observed time series and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven models depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail for various reasons. We propose two main solutions dealing with incomplete time series: using imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland as the entrance of Rotterdam Port. The hourly surge time series is available for the duration of 1990-1996. For measuring the performance of the proposed methods, a synthetic time series is utilized, with missing values generated by applying a particular random process to the original (complete) time series. There exist two main performance measures used in this work: (1) error measures between the actual
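
    The phase-space reconstruction and local-neighbour prediction steps can be sketched as follows, using a logistic-map series in place of surge observations; the embedding dimension, delay and neighbour count are illustrative assumptions, and the imputation strategies discussed in the abstract are not shown.

        import numpy as np

        def embed(x, dim, tau):
            """Time-delay embedding: rows are reconstructed phase-space vectors."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

        def local_predict(x, dim=3, tau=2, k=5):
            """Predict the next value from the average next-step of the k nearest dynamical neighbors."""
            vecs = embed(x, dim, tau)
            query = vecs[-1]
            history, targets = vecs[:-1], x[(dim - 1) * tau + 1:]   # next value after each history vector
            dists = np.linalg.norm(history - query, axis=1)
            nearest = np.argsort(dists)[:k]
            return targets[nearest].mean()

        # Toy chaotic series (logistic map) standing in for surge observations.
        x = np.empty(1000)
        x[0] = 0.4
        for t in range(1, 1000):
            x[t] = 3.9 * x[t - 1] * (1 - x[t - 1])

        print("predicted %.4f, true-map value %.4f" % (local_predict(x), 3.9 * x[-1] * (1 - x[-1])))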

  15. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p(AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  16. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network competitions. It has also been tested on several standard benchmark time series data sets. In terms of forecasting accuracy, our experimental results clearly reveal that LEA is better than other ensemble and nonensemble methods.

  17. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are currently using the Self-organizing map (SOM) with the unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009) it seems that the whole concept of the clustering algorithm is correct but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose from the need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we are searching for recordings with a similar development of information density. This can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of the feature vectors and the SOM itself. We describe time series in a simple way, evaluating standard deviations for separate parts of the recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are also discussed, and finally conclusions for further research are presented. We also present an overview of related current literature and projects.

  18. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic......In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model...

  19. Non-Gaussian Autoregressive Processes with Tukey g-and-h Transformations

    KAUST Repository

    Yan, Yuan

    2017-11-20

    When performing a time series analysis of continuous data, for example from climate or environmental problems, the assumption that the process is Gaussian is often violated. Therefore, we introduce two non-Gaussian autoregressive time series models that are able to fit skewed and heavy-tailed time series data. Our two models are based on the Tukey g-and-h transformation. We discuss parameter estimation, order selection, and forecasting procedures for our models and examine their performances in a simulation study. We demonstrate the usefulness of our models by applying them to two sets of wind speed data.
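
    As an illustration of the transformation named above, the short Python sketch below generates a skewed, heavy-tailed autoregressive series by passing a Gaussian AR(1) process through a Tukey g-and-h transform. The library (numpy), the parameter values and the data are illustrative assumptions, not taken from the record.

import numpy as np

def tukey_gh(z, g=0.5, h=0.1):
    # Tukey g-and-h transform of a standard normal variate z:
    # g controls skewness, h (>= 0) controls tail heaviness.
    gz = (np.exp(g * z) - 1.0) / g if g != 0 else z
    return gz * np.exp(h * z ** 2 / 2.0)

rng = np.random.default_rng(0)
n, phi = 1000, 0.7                                      # illustrative length and AR(1) coefficient
z = np.zeros(n)
eps = rng.normal(scale=np.sqrt(1 - phi ** 2), size=n)   # unit-variance stationary innovations
for t in range(1, n):
    z[t] = phi * z[t - 1] + eps[t]

y = tukey_gh(z)                                          # skewed, heavy-tailed autoregressive series
print(round(float(y.min()), 2), round(float(y.max()), 2))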

  20. Methodology for the AutoRegressive Planet Search (ARPS) Project

    Science.gov (United States)

    Feigelson, Eric; Caceres, Gabriel; ARPS Collaboration

    2018-01-01

    The detection of periodic signals of transiting exoplanets is often impeded by the presence of aperiodic photometric variations. This variability is intrinsic to the host star in space-based observations (typically arising from magnetic activity) and arises from observing conditions in ground-based observations. The most common statistical procedures to remove stellar variations are nonparametric, such as wavelet decomposition or Gaussian Processes regression. However, many stars display variability with autoregressive properties, wherein later flux values are correlated with previous ones. Provided the time series is evenly spaced, parametric autoregressive models can prove very effective. Here we present the methodology of the Autoregressive Planet Search (ARPS) project which uses Autoregressive Integrated Moving Average (ARIMA) models to treat a wide variety of stochastic short-memory processes, as well as nonstationarity. Additionally, we introduce a planet-search algorithm to detect periodic transits in the time-series residuals after application of ARIMA models. Our matched-filter algorithm, the Transit Comb Filter (TCF), replaces the traditional box-fitting step. We construct a periodogram based on the TCF to concentrate the signal of these periodic spikes. Various features of the original light curves, the ARIMA fits, the TCF periodograms, and folded light curves at peaks of the TCF periodogram can then be collected to provide constraints for planet detection. These features provide input into a multivariate classifier when a training set is available. The ARPS procedure has been applied to NASA's Kepler mission observations of ~200,000 stars (Caceres, Dissertation Talk, this meeting) and will be applied in the future to other datasets.
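
    The pre-whitening step described above (fitting an ARIMA model and passing its residuals on to a transit search) can be sketched as follows. The statsmodels library, the synthetic evenly spaced light curve and the (2,0,1) order are assumptions for illustration; the Transit Comb Filter itself is not reproduced here.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic, evenly spaced "light curve": AR(1)-like stellar variability plus noise.
rng = np.random.default_rng(1)
n = 500
flux = np.zeros(n)
for t in range(1, n):
    flux[t] = 0.8 * flux[t - 1] + rng.normal(scale=0.01)

# Fit a low-order ARIMA model to absorb the autocorrelated variability ...
res = ARIMA(flux, order=(2, 0, 1)).fit()
# ... and keep the residuals as input for a subsequent periodic-transit search.
residuals = res.resid
print("residual standard deviation:", round(float(residuals.std()), 4))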

  1. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  2. Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.

    Science.gov (United States)

    Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L

    2013-12-01

    We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.

  3. TimeSeer: Scagnostics for high-dimensional time series.

    Science.gov (United States)

    Dang, Tuan Nhon; Anand, Anushka; Wilkinson, Leland

    2013-03-01

    We introduce a method (Scagnostic time series) and an application (TimeSeer) for organizing multivariate time series and for guiding interactive exploration through high-dimensional data. The method is based on nine characterizations of the 2D distributions of orthogonal pairwise projections on a set of points in multidimensional Euclidean space. These characterizations include measures such as density, skewness, shape, outliers, and texture. Working directly with these Scagnostic measures, we can locate anomalous or interesting subseries for further analysis. Our application is designed to handle the types of doubly multivariate data series that are often found in security, financial, social, and other sectors.

  4. Complex dynamic in ecological time series

    Science.gov (United States)

    Peter Turchin; Andrew D. Taylor

    1992-01-01

    Although the possibility of complex dynamical behaviors-limit cycles, quasiperiodic oscillations, and aperiodic chaos-has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...

  5. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indi...

  6. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of this notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  7. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    chance – a much weaker null hypothesis than when trying to ensure that the observed value of a test statistic .... for short time series and performs better than existing methods. The details are discussed in the .... seen to perform well in a significant number of combinations, although without any discernible relation to the.

  8. Argos: An Optimized Time-Series Photometer

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... We designed a prime focus CCD photometer, Argos, optimized for high speed time-series measurements of blue variables (Nather & Mukadam 2004) for the 2.1 m telescope at McDonald Observatory. Lack of any intervening optics between the primary mirror and the CCD makes the instrument highly ...

  9. Markov Trends in Macroeconomic Time Series

    NARCIS (Netherlands)

    R. Paap (Richard)

    1997-01-01

    textabstractMany macroeconomic time series are characterised by long periods of positive growth, expansion periods, and short periods of negative growth, recessions. A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the

  10. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to perform effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, in order to adapt our trading system's behaviour based on them.

  11. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    underlying structural difference in their overall economies, as well as their agricultural sectors. Keywords. Interdependence; correlation; inner composition alignment; time series ..... variables – sharing common properties within a climate zone – and socio-economic indicators, where information is aggregated only on a ...

  12. Recent Advances in Energy Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2017-06-01

    Full Text Available This editorial summarizes the performance of the special issue entitled Energy Time Series Forecasting, which was published in MDPI’s Energies journal. The special issue took place in 2016 and accepted a total of 21 papers from twelve different countries. Electrical, solar, or wind energy forecasting were the most analyzed topics, introducing brand new methods with very sound results.

  13. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study, the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series in the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In the present study, in all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration in the five synoptic stations and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
    Table 1. The geographical location and climate conditions of the synoptic stations
    Station   Longitude (E)  Latitude (N)  Altitude (m)  Mean annual air temp. (°C)  Min.-max. air temp. (°C)  Mean precipitation (mm)  Climate (De Martonne classification)
    Esfahan   51° 40'        32° 37'       1550.4        16.36                       9.4-23.3                  122                      Arid
    Semnan    53° 33'        35° 35'       1130.8        18.0                        12.4-23.8                 140                      Arid
    Shiraz    52° 36'        29° 32'       1484          18.0                        10.2-25.9                 324                      Semi-arid
    Kerman    56° 58'        30° 15'       1753.8        15.6                        6.7-24.6                  142                      Arid
    Yazd      54° 17'        31° 54'       1237.2        19.2                        11.8-26.0                 61                       Arid
    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
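
    A minimal sketch of the kind of seasonal ARIMA fit described above, applied to a synthetic monthly series with a 12-month season; the statsmodels library, the (1,0,1)(0,1,1)12 order and the data are illustrative assumptions, not the station data of the study.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Synthetic monthly reference-evapotranspiration-like series: annual cycle + noise.
rng = np.random.default_rng(2)
months = np.arange(41 * 12)                      # 41 years of monthly values
et0 = 5 + 3 * np.sin(2 * np.pi * months / 12) + rng.normal(scale=0.4, size=months.size)

train, test = et0[:-24], et0[-24:]               # hold out the last two years
model = ARIMA(train, order=(1, 0, 1), seasonal_order=(0, 1, 1, 12)).fit()
forecast = model.forecast(steps=24)
rmse = np.sqrt(np.mean((forecast - test) ** 2))
print("24-month forecast RMSE:", round(float(rmse), 3))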

  14. Real-time GPS Satellite Clock Error Prediction Based On No-stationary Time Series Model

    Science.gov (United States)

    Wang, Q.; Xu, G.; Wang, F.

    2009-04-01

    Analysis Centers of the IGS provide precise satellite ephemeris for GPS data post-processing. The accuracy of orbit products is better than 5cm, and that of the satellite clock errors (SCE) approaches 0.1ns (igscb.jpl.nasa.gov), which meets the requirements of precise point positioning (PPP). Due to the 13-day latency of the IGS final products, only the broadcast ephemeris and IGS ultra rapid products (predicted) are applicable for real time PPP (RT-PPP). Therefore, development of an approach to estimate GPS SCE with high precision in real time is of particular importance for RT-PPP. Many studies have been carried out for forecasting the corrections using models, such as the Linear Model (LM), Quadratic Polynomial Model (QPM), Quadratic Polynomial Model with Cyclic corrected Terms (QPM+CT), Grey Model (GM) and Kalman Filter Model (KFM), etc. However, the precisions of these models are generally at the nanosecond level. The purpose of this study is to develop a method with which SCE forecasting for RT-PPP can reach sub-nanosecond precision. Analysis of the last 8 years of IGS SCE data showed that the prediction precision depends on the stability of the individual satellite clock. The clocks of the most recent GPS satellites (BLOCK IIR and BLOCK IIR-M) are more stable than those of the earlier GPS satellites (BLOCK IIA). For stable satellite clocks, the next 6 hours of SCE can easily be predicted with the LM. The residuals of unstable satellite clocks are periodic with noise components. Dominant periods of the residuals are found by using Fourier Transform and Spectrum Analysis. For the remaining part of the residuals, an auto-regression model is used to determine their systematic trends. Summarizing this study, a non-stationary time series model is proposed to predict GPS SCE in real time. This prediction model includes a linear term, cyclic correction terms and an auto-regression term, which represent the SCE trend, the cyclic parts and the remaining errors, respectively
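
    The proposed structure (a linear term, cyclic correction terms and an autoregression on the remaining residuals) can be sketched as below; the numpy/statsmodels calls, the synthetic series and the single dominant period are assumptions for illustration only.

import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(3)
t = np.arange(2000.0)                                   # epochs (illustrative units)
# Synthetic clock-error-like series: drift + one dominant periodic term + AR(1) noise.
sce = 0.002 * t + 0.5 * np.sin(2 * np.pi * t / 300)
ar = np.zeros(t.size)
for k in range(1, t.size):
    ar[k] = 0.9 * ar[k - 1] + rng.normal(scale=0.05)
sce += ar

period = 300.0                                          # dominant period, e.g. from a spectrum
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
coef, *_ = np.linalg.lstsq(X, sce, rcond=None)          # linear + cyclic terms
resid = sce - X @ coef
ar_fit = AutoReg(resid, lags=1).fit()                   # autoregression on the remainder

# One-step-ahead prediction combining the three components.
next_t = t[-1] + 1
det = coef @ np.array([1.0, next_t,
                       np.sin(2 * np.pi * next_t / period),
                       np.cos(2 * np.pi * next_t / period)])
pred = det + ar_fit.params[0] + ar_fit.params[1] * resid[-1]
print("predicted next clock error:", round(float(pred), 4))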

  15. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  16. TIME SERIES MODELS OF THREE SETS OF RXTE OBSERVATIONS OF 4U 1543–47

    International Nuclear Information System (INIS)

    Koen, C.

    2013-01-01

    The X-ray nova 4U 1543–47 was in a different physical state (low/hard, high/soft, and very high) during the acquisition of each of the three time series analyzed in this paper. Standard time series models of the autoregressive moving average (ARMA) family are fitted to these series. The low/hard data can be adequately modeled by a simple low-order model with fixed coefficients, once the slowly varying mean count rate has been accounted for. The high/soft series requires a higher order model, or an ARMA model with variable coefficients. The very high state is characterized by a succession of 'dips', with roughly equal depths. These seem to appear independently of one another. The underlying stochastic series can again be modeled by an ARMA form, or roughly as the sum of an ARMA series and white noise. The structuring of each model in terms of short-lived aperiodic and 'quasi-periodic' components is discussed.

  17. Hurst exponents for short time series

    Science.gov (United States)

    Qi, Jingchao; Yang, Huijie

    2011-12-01

    A concept called balanced estimator of diffusion entropy is proposed to detect quantitatively scalings in short time series. The effectiveness is verified by detecting successfully scaling properties for a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series with length ~10². It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger compared with that for the stocks included in the catalog. Selecting a window with size 650, the evolution of scaling for the Shanghai Stock Index is obtained by the window's sliding along the series. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing the patterns with the important event list in the history of the considered stock market, the evolution of scaling is matched with the stock index series. We can find that the important events fit very well with global transitions of the scaling behaviors.

  18. A framework for assessing frequency domain causality in physiological time series with instantaneous effects.

    Science.gov (United States)

    Faes, Luca; Erla, Silvia; Porta, Alberto; Nollo, Giandomenico

    2013-08-28

    We present an approach for the quantification of directional relations in multiple time series exhibiting significant zero-lag interactions. To overcome the limitations of the traditional multivariate autoregressive (MVAR) modelling of multiple series, we introduce an extended MVAR (eMVAR) framework allowing either exclusive consideration of time-lagged effects according to the classic notion of Granger causality, or consideration of combined instantaneous and lagged effects according to an extended causality definition. The spectral representation of the eMVAR model is exploited to derive novel frequency domain causality measures that generalize to the case of instantaneous effects the known directed coherence (DC) and partial DC measures. The new measures are illustrated in theoretical examples showing that they reduce to the known measures in the absence of instantaneous causality, and describe peculiar aspects of directional interaction among multiple series when instantaneous causality is non-negligible. Then, the issue of estimating eMVAR models from time-series data is faced, proposing two approaches for model identification and discussing problems related to the underlying model assumptions. Finally, applications of the framework on cardiovascular variability series and multichannel EEG recordings are presented, showing how it allows one to highlight patterns of frequency domain causality consistent with well-interpretable physiological interaction mechanisms.

  19. Inverse statistical approach in heartbeat time series

    International Nuclear Information System (INIS)

    Ebadi, H; Shirazi, A H; Mani, Ali R; Jafari, G R

    2011-01-01

    We present an investigation on heart cycle time series, using inverse statistical analysis, a concept borrowed from studying turbulence. Using this approach, we studied the distribution of the exit times needed to achieve a predefined level of heart rate alteration. Such analysis uncovers the most likely waiting time needed to reach a certain change in the rate of heart beat. This analysis showed a significant difference between the raw data and shuffled data, when the heart rate accelerates or decelerates to a rare event. We also report that inverse statistical analysis can distinguish between the electrocardiograms taken from healthy volunteers and patients with heart failure

  20. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator is often facing highly-volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes was not necessarily representing the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Non Linear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic non linear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by non linear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance Chaos theory). However, mathematics used in NLTS is much simpler than many other subjec...

  1. COMPUTATION OF IMAGE SIMILARITY WITH TIME SERIES

    Directory of Open Access Journals (Sweden)

    V. Balamurugan

    2011-11-01

    Full Text Available Searching for similar sequences in a large database is an important task in temporal data mining. Similarity search is concerned with efficiently locating subsequences or whole sequences in large archives of sequences. It is useful in typical data mining applications and it can be easily extended to image retrieval. In this work, time series similarity analysis that involves dimensionality reduction and clustering is adapted to digital images to find similarity between them. The dimensionality-reduced time series is represented as clusters by the use of K-Means clustering and the similarity distance between two images is found by finding the distance between the signatures of their clusters. To quantify the extent of similarity between two sequences, the Earth Mover's Distance (EMD) is used. From the experiments on different sets of images, it is found that this technique is well suited for measuring the subjective similarity between two images.

  2. Visibility graphlet approach to chaotic time series

    Energy Technology Data Exchange (ETDEWEB)

    Mutua, Stephen [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega (Kenya); Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn; Yang, Huijie, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2016-05-15

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.

  3. Markov Trends in Macroeconomic Time Series

    OpenAIRE

    Paap, Richard

    1997-01-01

    textabstractMany macroeconomic time series are characterised by long periods of positive growth, expansion periods, and short periods of negative growth, recessions. A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the value of an unobserved two-state first-order Markov process. The two slopes of the Markov trend describe the growth rates in the two phases of the business cycle. This thesis deals with a Bayesian ...

  4. Time lag between immigration and tuberculosis rates in immigrants in the Netherlands: a time-series analysis.

    Science.gov (United States)

    van Aart, C; Boshuizen, H; Dekkers, A; Korthals Altes, H

    2017-05-01

    In low-incidence countries, most tuberculosis (TB) cases are foreign-born. We explored the temporal relationship between immigration and TB in first-generation immigrants between 1995 and 2012 to assess whether immigration can be a predictor for TB in immigrants from high-incidence countries. We obtained monthly data on immigrant TB cases and immigration for the three countries of origin most frequently represented among TB cases in the Netherlands: Morocco, Somalia and Turkey. The best-fit seasonal autoregressive integrated moving average (SARIMA) model to the immigration time-series was used to prewhiten the TB time series. The cross-correlation function (CCF) was then computed on the residual time series to detect time lags between immigration and TB rates. We identified a 17-month lag between Somali immigration and Somali immigrant TB cases, but no time lag for immigrants from Morocco and Turkey. The absence of a lag in the Moroccan and Turkish population may be attributed to the relatively low TB prevalence in the countries of origin and an increased likelihood of reactivation TB in an ageing immigrant population. Understanding the time lag between Somali immigration and TB disease would benefit from a closer epidemiological analysis of cohorts of Somali cases diagnosed within the first years after entry.
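
    The prewhitening-and-cross-correlation procedure described above can be sketched as follows; the statsmodels calls, the SARIMA order and the synthetic monthly series (with a built-in lag) are illustrative assumptions, not the Dutch surveillance data.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import ccf

rng = np.random.default_rng(4)
n = 216                                                  # 18 years of monthly counts (synthetic)
month = np.arange(n)
immigration = 50 + 10 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 3, n)
# TB counts that follow immigration with a 17-month delay (circular shift, adequate for illustration).
tb_cases = 0.1 * np.roll(immigration, 17) + rng.normal(0, 1, n)

# Fit a seasonal ARIMA to the input (immigration) series ...
fit_x = ARIMA(immigration, order=(1, 0, 0), seasonal_order=(0, 1, 1, 12)).fit()
# ... and filter the TB series with the same fitted parameters (prewhitening).
fit_y = fit_x.apply(tb_cases)

# Cross-correlate the residual series; the peak points to the lead/lag structure.
r = ccf(fit_y.resid, fit_x.resid)[:37]
print("largest |cross-correlation| at lag", int(np.argmax(np.abs(r))), "months")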

  5. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly, regardless of the ...

  6. Clinical and epidemiological round: Interrupted time series

    Directory of Open Access Journals (Sweden)

    León-Álvarez, Alba Luz

    2017-07-01

    Full Text Available In quasi-experimental research, interrupted time series analysis is commonly used to measure the effect of an intervention from a specific time point. This technique integrates longitudinal data and allows detailed trends to be discovered before and after the intervention. It is considered an important tool for understanding patterns of change after an event; it is applicable in different disciplines and has great potential for drawing conclusions in research with long follow-up periods that requires objective evaluation of interventions.

  7. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    Science.gov (United States)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

    The purpose of this study is to obtain the best model and to predict the export value of Central Java using a hierarchical time series approach. The export value is an injection variable in a country's economy: if the export value increases, the country's economy grows. Appropriate modeling is therefore necessary to predict the export value, especially in Central Java. Export values in Central Java are grouped into 21 commodities, each with a different pattern. One approach that can be used for such time series is a hierarchical one; here, the bottom-up hierarchical approach is used. The individual series at all levels are forecast using Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models. The best models are selected using the symmetric mean absolute percentage error (sMAPE). Results of the analysis show that, for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions the bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best result.
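
    The bottom-up idea (forecast every bottom-level series, then aggregate by summation) can be sketched as follows; for brevity the sketch uses ARIMA for every series rather than the RBFNN and hybrid models of the study, and the four synthetic commodity series are assumptions.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
n_months, n_commodities, horizon = 120, 4, 12

# Synthetic bottom-level series (e.g. export value per commodity).
series = [np.cumsum(rng.normal(1.0, 0.5, n_months)) + 10 * k
          for k in range(n_commodities)]

# Bottom-up: forecast every bottom-level series, then aggregate by summation.
bottom_forecasts = []
for y in series:
    fit = ARIMA(y, order=(1, 1, 1)).fit()
    bottom_forecasts.append(fit.forecast(steps=horizon))

total_forecast = np.sum(bottom_forecasts, axis=0)        # forecast of the aggregate level
print(np.round(total_forecast, 1))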

  8. Timing calibration and spectral cleaning of LOFAR time series data

    OpenAIRE

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-01-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters,...

  9. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    Science.gov (United States)

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents a singular spectrum analysis (SSA) as a univariate time-series model and vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both short term (up to December 2017) and long term (up to 2020), as statistical proofs of the growth of the Chinese EV industry. PMID:28459872
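
    A basic singular spectrum analysis step (embedding, singular value decomposition, and diagonal-averaged reconstruction of the leading components) can be sketched as below; the numpy implementation, window length and synthetic sales-like series are illustrative assumptions, not the Chinese EV data.

import numpy as np

def ssa_reconstruct(y, window, n_components):
    # Basic SSA: embed into a trajectory matrix, decompose, reconstruct leading components.
    n = len(y)
    k = n - window + 1
    X = np.column_stack([y[i:i + window] for i in range(k)])   # trajectory (Hankel) matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Diagonal averaging (Hankelization) back to a series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += X_hat[:, j]
        counts[j:j + window] += 1
    return recon / counts

rng = np.random.default_rng(6)
t = np.arange(60)
sales = 100 + 5 * t + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, 60)
trend = ssa_reconstruct(sales, window=24, n_components=3)      # smoothed evolving trend
print(np.round(trend[:6], 1))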

  10. Non-stationary time series modeling on caterpillars pest of palm oil for early warning system

    Science.gov (United States)

    Setiyowati, Susi; Nugraha, Rida F.; Mukhaiyar, Utriweni

    2015-12-01

    Oil palm production plays an important role in the plantation and economic sector in Indonesia. One of the important problems in oil palm cultivation is pests, which damage the quality of the fruit. Caterpillar pests, which feed on the palm trees' leaves, cause a decline in the quality of palm oil production. An early warning system is needed to minimize losses due to this pest. Here, we applied non-stationary time series modeling, especially the family of autoregressive models, to predict the number of pests based on historical data. These pest data have a distinctive feature, namely spike values that occur almost periodically. Through simulations and a case study, we find that the selection of the constant factor has a significant influence on the model, so that it can capture the spike values precisely.

  11. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    Science.gov (United States)

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents a singular spectrum analysis (SSA) as a univariate time-series model and vector autoregressive model (VAR) as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprised exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both short term (up to December 2017) and long term (up to 2020), as statistical proofs of the growth of the Chinese EV industry.

  12. Weekly lottery sales volume and suicide numbers: a time series analysis on national data from Taiwan.

    Science.gov (United States)

    Chen, Vincent Chin-Hung; Stewart, Robert; Lee, Charles Tzu-Chi

    2012-07-01

    To investigate the association between weekly lottery sales and number of suicide deaths in Taiwan. All suicides aged 15+ years during 2004-2006 in Taiwan were included. Poisson autoregression time series models investigated associations of weekly numbers with contemporaneous and recent sales from two national lotteries in operation. Adjustments were made for seasonal fluctuation, temperature, monthly unemployment and autocorrelation. In fully adjusted models, suicide deaths were negatively correlated with sales of tickets for a low-prize, low-cost lottery system. However, they were correlated positively with recent sales for a higher-cost, larger-prize system. Both correlations were stronger for male than female suicide numbers but differed in terms of age groups most strongly implicated. Associations between lottery sales and suicide numbers differed according to the nature of the lottery. A low-prize, low-publicity system appeared to be more benign than a high-prize, high-publicity one.
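
    A stripped-down Poisson autoregression of the kind mentioned above can be sketched with a GLM that regresses weekly counts on a lagged own term and a contemporaneous sales covariate; the statsmodels call and the synthetic data are assumptions, and the study's seasonal, temperature and unemployment adjustments are omitted.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 156                                                  # three years of weekly data (synthetic)
sales = 50 + 10 * np.sin(2 * np.pi * np.arange(n) / 52) + rng.normal(0, 2, n)
counts = rng.poisson(lam=np.exp(2.5 + 0.005 * sales))    # synthetic weekly counts

# Poisson autoregression: regress counts on their own (log-transformed) lag and on sales.
lag1 = np.log(counts[:-1] + 1)
X = sm.add_constant(np.column_stack([lag1, sales[1:]]))
model = sm.GLM(counts[1:], X, family=sm.families.Poisson()).fit()
print(model.params)                                      # intercept, lag effect, sales effect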

  13. Fractal fluctuations in cardiac time series

    Science.gov (United States)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systematic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.

  14. Period Estimation in Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2011-09-01

    Detection of periodicity and period estimation in non-uniformly sampled time series data is frequently a goal in astronomical data analysis. Various problems are faced: firstly, the data are sampled non-uniformly, which makes it difficult to use a simple Fourier transform for spectral analysis. Secondly, there are large gaps in the data, which makes it difficult to interpolate the signal for re-sampling. Finally, in data sets with shorter time periods, the non-uniformity in sampling and the noise in the data pose even greater problems because of the smaller number of samples per period. In this talk we review existing methods and then propose new approaches for determining periods. The first uses correntropy (an alternative to autocorrelation), which encapsulates non-linear correlations using a spatio-temporal kernel, to estimate the time period of the data accurately. The other uses periodic kernels in a non-parametric Gaussian process. These new techniques are also used for identifying periodic signals.
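
    The correntropy statistic mentioned above (a kernelized alternative to autocorrelation) can be illustrated for evenly sampled data as follows; the Gaussian kernel width, the synthetic series and the restriction to uniform sampling are simplifying assumptions, since the talk's focus is the non-uniform case.

import numpy as np

def correntropy(x, max_lag, sigma=1.0):
    # Lag-wise correntropy: mean Gaussian-kernel similarity between x[t] and x[t+lag].
    v = np.empty(max_lag + 1)
    for lag in range(max_lag + 1):
        d = x[:len(x) - lag or None] - x[lag:]
        v[lag] = np.mean(np.exp(-d ** 2 / (2 * sigma ** 2)))
    return v

rng = np.random.default_rng(12)
t = np.arange(500)
x = np.sin(2 * np.pi * t / 25) + 0.3 * rng.normal(size=t.size)
v = correntropy(x, max_lag=100, sigma=0.5)
print("peak (excluding lag 0) at lag", int(np.argmax(v[1:]) + 1))   # close to the 25-sample period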

  15. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength...... and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...

  16. Time Series Modeling for Structural Response Prediction

    Science.gov (United States)

    1988-11-14

    Only front matter was recovered for this record (lists of figures and tables, and an abbreviation list: 2DOF, two-degree-of-freedom; 2LS, two-stage least squares; 3DOF, three-degree-of-freedom); no abstract text is available.

  17. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  18. Unit Root Vector Autoregression with volatility Induced Stationarity

    DEFF Research Database (Denmark)

    Rahbek, Anders; Nielsen, Heino Bohn

    We propose a discrete-time multivariate model where lagged levels of the process enter both the conditional mean and the conditional variance. This way we allow for the empirically observed persistence in time series such as interest rates, often implying unit-roots, while at the same time maintain...... stationarity despite such unit-roots. Specifically, the model bridges vector autoregressions and multivariate ARCH models in which residuals are replaced by levels lagged. An empirical illustration using recent US term structure data is given in which the individual interest rates have unit roots, have...... and geometrically ergodic. Interestingly, these conditions include the case of unit roots and a reduced rank structure in the conditional mean, known from linear co-integration to imply non-stationarity. Asymptotic theory of the maximum likelihood estimators for a particular structured case (so-called self...

  19. Big Data impacts on stochastic Forecast Models: Evidence from FX time series

    Directory of Open Access Journals (Sweden)

    Sebastian Dietz

    2013-12-01

    Full Text Available With the rise of the Big Data paradigm, new tasks for prediction models have appeared. In addition to the volume problem of such data sets, nonlinearity becomes important, as the more detailed data sets also contain more comprehensive information, e.g. about non-regular seasonal or cyclical movements as well as jumps in time series. This essay compares two nonlinear methods for predicting a high-frequency time series, the USD/Euro exchange rate. The first method investigated is Autoregressive Neural Network Processes (ARNN), a neural-network-based nonlinear extension of classical autoregressive process models from time series analysis (see Dietz 2011). Its advantage is its simple but scalable time series process model architecture, which is able to include all kinds of nonlinearities based on the universal approximation theorem of Hornik, Stinchcombe and White (1989) and the extensions of Hornik (1993). However, restrictions related to the numeric estimation procedures limit the flexibility of the model. The alternative is a Support Vector Machine model (SVM; Vapnik 1995). The two compared methods have different approaches to error minimization (empirical error minimization for the ARNN vs. structural error minimization for the SVM). Our new finding is that time series data classified as "Big Data" need new methods for prediction. Estimation and prediction were performed using the statistical programming language R. Besides prediction results, we also discuss the impact of Big Data on data preparation and model validation steps.

  20. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties, such as number of active nodes, average degree and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue

  1. Anomaly on Superspace of Time Series Data

    Science.gov (United States)

    Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin

    2017-11-01

    We apply the G-theory and anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over 14 extra dimensions of the new G-theory, compared with the 11-dimensional M-theory model of Edward Witten. We found that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of the moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand, as in the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group with the negative dimensions underlying 14 major hidden factors influencing the market is zero.

  2. Analyzing the House Fly’s Exploratory Behavior with Autoregression Methods

    Science.gov (United States)

    Takahashi, Hisanao; Horibe, Naoto; Shimada, Masakazu; Ikegami, Takashi

    2008-08-01

    This paper presents a detailed characterization of the trajectory of a single housefly allowed to range freely in a square cage. The trajectory of the fly was recorded and transformed into a time series, which was fully analyzed using an autoregressive model, which describes a stationary time series by a linear regression on prior state values plus white noise. The main discovery was that the fly switched styles of motion from a low-dimensional regular pattern to a higher-dimensional disordered pattern. This exploratory behavior is, irrespective of the presence of food, characterized by anomalous diffusion.
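
    The kind of analysis described above (fitting autoregressive models to a trajectory-derived series and comparing segments) can be sketched as below; the statsmodels AutoReg calls, the AIC-based order selection and the two-regime synthetic series are illustrative assumptions, not the recorded fly data.

import numpy as np
from statsmodels.tsa.ar_model import AutoReg, ar_select_order

rng = np.random.default_rng(8)
n = 2000
# Synthetic 1-D trajectory-derived series with two regimes: regular vs. more disordered motion.
x = np.zeros(n)
for t in range(2, n):
    phi1, phi2 = (1.6, -0.8) if t < n // 2 else (0.4, 0.1)
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.normal(scale=0.1)

# Select and fit an AR order separately in each half of the recording.
for name, seg in [("first half", x[: n // 2]), ("second half", x[n // 2:])]:
    sel = ar_select_order(seg, maxlag=10, ic="aic")
    fit = AutoReg(seg, lags=sel.ar_lags or 1).fit()
    print(name, "selected lags:", sel.ar_lags,
          "residual std:", round(float(np.std(fit.resid)), 3))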

  3. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures as references and 5 test signatures each from the original user, simple impostors and trained impostors. The final system was tested with 50 participants and 3 references. Without impostors, the system accuracy is 90.44897959% at threshold 44, with a rejection error rate (FNMR) of 5.2% and an acceptance error rate (FMR) of 4.35102%; with impostors, the system accuracy is 80.1361% at threshold 27, with a rejection error rate (FNMR) of 15.6% and an average acceptance error rate (FMR) of 4.263946%, broken down as follows: 0.391837% for the original users, 3.2% for simple impostors and 9.2% for trained impostors.
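
    A textbook dynamic time warping distance, of the kind used as the matching step above, can be sketched as follows; the pure-Python implementation, the toy traces and the fixed acceptance threshold are illustrative assumptions, not the verification system of the paper.

import numpy as np

def dtw_distance(a, b):
    # Dynamic time warping distance between two 1-D sequences (dynamic programming).
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy example: compare "test signature" traces against a reference trace.
reference = np.sin(np.linspace(0, 4 * np.pi, 80))
genuine = np.sin(np.linspace(0, 4 * np.pi, 95)) + 0.05 * np.random.default_rng(9).normal(size=95)
forged = np.cos(np.linspace(0, 3 * np.pi, 90))

threshold = 10.0                                          # would be tuned on training data
for name, trace in [("genuine", genuine), ("forged", forged)]:
    d = dtw_distance(reference, trace)
    print(name, round(float(d), 2), "accepted" if d < threshold else "rejected")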

  4. Monthly streamflow forecasting with auto-regressive integrated moving average

    Science.gov (United States)

    Nasir, Najah; Samsudin, Ruhaidah; Shabri, Ani

    2017-09-01

    Forecasting of streamflow is one of the many ways that can contribute to better decision making for water resource management. The auto-regressive integrated moving average (ARIMA) model was selected in this research for monthly streamflow forecasting with enhancement made by pre-processing the data using singular spectrum analysis (SSA). This study also proposed an extension of the SSA technique to include a step where clustering was performed on the eigenvector pairs before reconstruction of the time series. The monthly streamflow data of Sungai Muda at Jeniang, Sungai Muda at Jambatan Syed Omar and Sungai Ketil at Kuala Pegang was gathered from the Department of Irrigation and Drainage Malaysia. A ratio of 9:1 was used to divide the data into training and testing sets. The ARIMA, SSA-ARIMA and Clustered SSA-ARIMA models were all developed in R software. Results from the proposed model are then compared to a conventional auto-regressive integrated moving average model using the root-mean-square error and mean absolute error values. It was found that the proposed model can outperform the conventional model.

  5. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
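
    The Holt-Winters residual-forming step described above can be sketched as follows; the statsmodels ExponentialSmoothing call, the additive trend/seasonal specification and the synthetic daily counts with a weekly cycle are illustrative assumptions.

import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(10)
days = np.arange(365)
# Synthetic syndromic counts: trend + day-of-week effect + noise.
counts = (100 + 0.05 * days
          + 15 * (days % 7 < 5)                          # weekday effect
          + rng.normal(0, 5, days.size))

fit = ExponentialSmoothing(counts, trend="add", seasonal="add",
                           seasonal_periods=7).fit()
residuals = counts - fit.fittedvalues                     # input to the detection algorithm
print("1-day-lag autocorrelation of residuals:",
      round(float(np.corrcoef(residuals[:-1], residuals[1:])[0, 1]), 3))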

  6. Application of Time-series Modeling to Predict Infiltration of Different Soil Textures

    Directory of Open Access Journals (Sweden)

    S. Vazirpour

    2016-10-01

    Full Text Available Introduction: Infiltration is one of the most important parameters affecting irrigation. For this reason, measuring and estimating this parameter is very important, particularly when designing and managing irrigation systems. Infiltration affects water flow and solute transport in the soil surface and subsurface. Due to temporal and spatial variability, many measurements are needed to characterize the average soil infiltration behaviour under field conditions. The stochastic character of many natural phenomena has led to the application of random variables and time series in predicting their behaviour. Time-series analysis is a simple and efficient method for prediction, which is widely used in various sciences. However, few studies have investigated time-series modeling for predicting soil infiltration characteristics. In this study, the capability of time series in estimating the infiltration rate for different soil textures was evaluated. Materials and methods: For this purpose, the 60- and 120-minute data of double-ring infiltrometer tests in the Lali plain, Khuzestan, Iran, with the proposed time intervals (0, 1, 3, 5, 10, 15, 20, 30, 45, 60, 80, 100, 120, 150, 180, 210, 240 minutes), were used to predict cumulative infiltration until the end of the experiment for heavy (clay), medium (loam) and light (sand) soil textures. Moreover, using the Kostiakov-Lewis equation parameters recommended by NRCS, 24-hour cumulative infiltration curves were applied in time-series modeling for six different soil textures (clay, clay loam, silty, silty loam, sandy loam and sand). Different time-series models, including autoregressive (AR), moving average (MA), autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), ARMA with exogenous variables (ARMAX) and AR with exogenous variables (ARX), were evaluated in predicting cumulative infiltration. Autocorrelation and partial autocorrelation charts for each

  7. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    Science.gov (United States)

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the 2 (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
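
    The GGM edge weights are partial correlation coefficients, which can be read off the inverse covariance (precision) matrix. The sketch below shows this standard relationship on simulated data; it is a plain unregularized version for illustration, not the estimation routines implemented in graphicalVAR or mlVAR.

```python
import numpy as np

def partial_correlations(data):
    """Partial correlation matrix from the precision matrix: rho_ij = -P_ij / sqrt(P_ii * P_jj)."""
    precision = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcor = -precision / np.outer(d, d)
    np.fill_diagonal(pcor, 1.0)
    return pcor

rng = np.random.default_rng(2)
# Simulated cross-sectional data: x2 depends on x1, x3 depends on x2 only.
n = 2000
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(size=n)
x3 = 0.6 * x2 + rng.normal(size=n)
pcor = partial_correlations(np.column_stack([x1, x2, x3]))
print(np.round(pcor, 2))   # the x1-x3 entry should be near zero after conditioning on x2
```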

  8. Linear and non-linear autoregressive models for short-term wind speed forecasting

    International Nuclear Information System (INIS)

    Lydia, M.; Suresh Kumar, S.; Immanuel Selvakumar, A.; Edwin Prem Kumar, G.

    2016-01-01

    Highlights: • Models for wind speed prediction at 10-min intervals up to 1 h built on time-series wind speed data. • Four different multivariate models for wind speed built based on exogenous variables. • Non-linear models built using three data mining algorithms outperform the linear models. • Autoregressive models based on wind direction perform better than other models. - Abstract: Wind speed forecasting aids in estimating the energy produced from wind farms. The soaring energy demands of the world and minimal availability of conventional energy sources have significantly increased the role of non-conventional sources of energy like solar, wind, etc. Development of models for wind speed forecasting with higher reliability and greater accuracy is the need of the hour. In this paper, models for predicting wind speed at 10-min intervals up to 1 h have been built based on linear and non-linear autoregressive moving average models with and without external variables. The autoregressive moving average models based on wind direction and annual trends have been built using data obtained from Sotavento Galicia Plc. and autoregressive moving average models based on wind direction, wind shear and temperature have been built on data obtained from Centre for Wind Energy Technology, Chennai, India. While the parameters of the linear models are obtained using the Gauss–Newton algorithm, the non-linear autoregressive models are developed using three different data mining algorithms. The accuracy of the models has been measured using three performance metrics namely, the Mean Absolute Error, Root Mean Squared Error and Mean Absolute Percentage Error.

  9. Palmprint Verification Using Time Series Method

    Directory of Open Access Journals (Sweden)

    A. A. Ketut Agung Cahyawan Wiranatha

    2013-11-01

    Full Text Available The use of biometrics as an automatic recognition system is growing rapidly in solving security problems, and palmprint recognition is one of the most frequently used biometric systems. This paper uses a two-step center-of-mass moment method for region of interest (ROI) segmentation and applies the time series method combined with a block window method for feature representation. Normalized Euclidean distance is used to measure the degree of similarity of two palmprint feature vectors. System testing was done using 500 palm samples, with 4 samples as reference images and 6 samples as test images. Experimental results show that this system can achieve high performance, with a success rate of about 97.33% (FNMR = 1.67%, FMR = 1.00%, T = 0.036).

  10. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    [Figure residue from the original report: plots of time per subgraph isomorphism (sec) against the number of constants in one situation for Snort datasets 1 and 2 (runtime over constant count), comparing the Attention and BFS approaches and illustrating the scalability of the attention technique.] Segment: a segment in the relational time-series r = p1p2…pn is comprised of the percept subsequence [pa pa+1 pa+2 … pa+m pb) such that pa

  11. Nonlinear time series analysis of food intake in the dab and the rainbow trout.

    Science.gov (United States)

    King, J W; Ruohonen, K; Grove, D J; Hammerstein, A

    2007-04-21

    Evidence suggests that dab and rainbow trout are able to quickly adjust their food intake to an appropriate level when offered novel diets. In addition, day-to-day and meal-to-meal food intake varies greatly and meal timing is plastic. Why this is the case is not clear: food intake in fish is influenced by many factors, but the hierarchy and mechanisms by which these interact are not yet fully understood. A model of food intake may be helpful for understanding these phenomena; to determine the model type, it is necessary to understand the qualitative nature of food intake. Food intake can be regarded as an autoregressive (AR) time series, as the amount of food eaten at time t will be influenced by previous meals, and this allows food intake to be considered using time series analyses. Here, time series data were analysed using nonlinear techniques to obtain qualitative information from which evidence for the hierarchy of mechanisms controlling food intake may be drawn. Time series were obtained for analysis from a group of dab, from individual dab, and from a group of rainbow trout. Surrogate data sets were generated to test several null hypotheses describing linear processes, and all proved significantly different from the real data, suggesting nonlinear dynamics. Examination of topography and recurrence diagrams suggested that all series were deterministic and non-stationary. The point correlation dimension (PD2i) suggested low-dimensional dynamics. Our findings suggest therefore that any model of appetite should create output that is deterministic, non-stationary, low-dimensional and exhibits nonlinear dynamics.
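
    Surrogate testing of the kind described here typically compares a discriminating statistic computed on the observed series with its distribution over phase-randomized surrogates, which preserve the linear (spectral) structure but destroy any nonlinearity. The sketch below generates such surrogates; the statistic (a simple time-reversal asymmetry measure) and the toy series are illustrative assumptions and do not reproduce the authors' hypotheses or data.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomized Fourier phases."""
    x = np.asarray(x, float)
    spectrum = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, spectrum.size)
    phases[0] = 0.0
    if x.size % 2 == 0:
        phases[-1] = 0.0                       # keep the Nyquist component real
    surrogate = np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size)
    return surrogate + x.mean()

def reversal_asymmetry(x):
    """Simple nonlinearity statistic: mean of (x_t - x_{t-1})^3."""
    return np.mean(np.diff(x) ** 3)

rng = np.random.default_rng(3)
series = np.cumsum(rng.exponential(size=500)) % 7.0   # toy nonlinear stand-in for a food-intake series
observed = reversal_asymmetry(series)
surrogates = [reversal_asymmetry(phase_randomized_surrogate(series, rng)) for _ in range(200)]
p_value = np.mean(np.abs(surrogates) >= abs(observed))
print(f"statistic={observed:.3f}  surrogate p-value={p_value:.3f}")
```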

  12. On the relationship between health, education and economic growth: Time series evidence from Malaysia

    Science.gov (United States)

    Khan, Habib Nawaz; Razali, Radzuan B.; Shafei, Afza Bt.

    2016-11-01

    The objectives of this paper are twofold: first, to empirically investigate the effects of an enlarged number of healthy and well-educated people on economic growth in Malaysia within the endogenous growth model framework; second, to examine the causal links between education, health and economic growth using annual time series data from 1981 to 2014 for Malaysia. The data series were checked for their time series properties using ADF and KPSS tests. The long-run co-integration relationship was investigated with the vector autoregressive (VAR) method. For the short- and long-run dynamic relationships, a vector error correction model (VECM) was applied. Causality analysis was performed through the Engle-Granger technique. The study results showed a long-run co-integration relationship and positively significant effects of education and health on economic growth in Malaysia. The reported results also confirmed a feedback hypothesis between the variables in the case of Malaysia. The study results have policy relevance regarding the importance of human capital (health and education) to the growth process of Malaysia. Thus, it is suggested that policy makers focus on the education and health sectors for sustainable economic growth in Malaysia.

  13. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishment of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction is demonstrated for time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  14. Zero-crossing statistics for non-Markovian time series

    Science.gov (United States)

    Nyberg, Markus; Lizana, Ludvig; Ambjörnsson, Tobias

    2018-03-01

    In applications spanning from image analysis and speech recognition to energy dissipation in turbulence and time-to-failure of fatigued materials, researchers and engineers want to calculate how often a stochastic observable crosses a specific level, such as zero. At first glance this problem looks simple, but it is in fact theoretically very challenging, and therefore few exact results exist. One exception is the celebrated Rice formula that gives the mean number of zero crossings in a fixed time interval of a zero-mean Gaussian stationary process. In this study we use the so-called independent interval approximation to go beyond Rice's result and derive analytic expressions for all higher-order zero-crossing cumulants and moments. Our results agree well with simulations for the non-Markovian autoregressive model.
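
    A concrete discrete-time analogue of such zero-crossing counts is the classical "cosine formula" for a stationary Gaussian sequence: the probability of a sign change between consecutive samples is arccos(rho_1)/pi, where rho_1 is the lag-1 autocorrelation. The sketch below checks this against a simulated Gaussian AR(1) process; it is a textbook identity offered for orientation, not the independent interval approximation derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
phi = 0.6                                   # AR(1) coefficient = lag-1 autocorrelation
n = 200_000
eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + eps[t]

sign_changes = np.sum(x[1:] * x[:-1] < 0)
empirical_rate = sign_changes / (n - 1)
theoretical_rate = np.arccos(phi) / np.pi   # zero-crossing rate of a stationary Gaussian sequence
print(f"empirical={empirical_rate:.4f}  theoretical={theoretical_rate:.4f}")
```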

  15. Forecasting and simulating wind speed in Corsica by using an autoregressive model

    International Nuclear Information System (INIS)

    Poggi, P.; Muselli, M.; Notton, G.; Cristofari, C.; Louche, A.

    2003-01-01

    Alternative approaches for generating wind speed time series are discussed. The method utilized involves the use of an autoregressive process model. The model has been applied to three Mediterranean sites in Corsica and has been used to generate 3-hourly synthetic time series for these considered sites. The synthetic time series have been examined to determine their ability to preserve the statistical properties of the Corsican wind speed time series. In this context, using the main statistical characteristics of the wind speed (mean, variance, probability distribution, autocorrelation function), the data simulated are compared to experimental ones in order to check whether the wind speed behavior was correctly reproduced over the studied periods. The purpose is to create a data generator in order to construct a reference year for wind systems simulation in Corsica.

  16. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  17. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  18. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl

    2017-12-01

    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets, currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting to exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. The distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided, which shows the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of the BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is found to be narrow, at the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is found to be more substantial, reaching as high as above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic returns and the Realized Volatility, the sum of the squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility of Andersen et al. (2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers using 10-day moving window sampling of daily return predictors is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximately capturing the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short Term Memory) techniques from deep learning, may be necessary for higher prediction accuracy.
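
    The Heterogeneous Autoregressive (HAR) model mentioned above regresses next-day realized volatility on daily, weekly and monthly averages of past realized volatility. A minimal OLS version is sketched below on simulated data; the lag windows (1, 5 and 22 days) follow the usual HAR convention, and nothing here reproduces the paper's BTCUSD estimates.

```python
import numpy as np

rng = np.random.default_rng(5)
# Simulated daily realized volatility with slow persistence (stand-in for a BTCUSD RV series).
rv = np.abs(np.convolve(rng.normal(size=1500), np.ones(30) / 30, mode="same")) + 0.01

daily = rv[21:-1]
weekly = np.array([rv[t - 4:t + 1].mean() for t in range(21, len(rv) - 1)])
monthly = np.array([rv[t - 21:t + 1].mean() for t in range(21, len(rv) - 1)])
target = rv[22:]                                        # next-day realized volatility

X = np.column_stack([np.ones_like(daily), daily, weekly, monthly])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
print("HAR coefficients (const, daily, weekly, monthly):", np.round(beta, 3))
```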

  19. Failure and reliability prediction by support vector machines regression of time series data

    International Nuclear Information System (INIS)

    Chagas Moura, Marcio das; Zio, Enrico; Lins, Isis Didier; Droguett, Enrique

    2011-01-01

    Support Vector Machines (SVMs) are kernel-based learning methods, which have been successfully adopted for regression problems. However, their use in reliability applications has not been widely explored. In this paper, a comparative analysis is presented in order to evaluate the SVM effectiveness in forecasting time-to-failure and reliability of engineered components based on time series data. The performance on literature case studies of SVM regression is measured against other advanced learning methods such as the Radial Basis Function, the traditional MultiLayer Perceptron model, Box-Jenkins autoregressive-integrated-moving average and the Infinite Impulse Response Locally Recurrent Neural Networks. The comparison shows that in the analyzed cases, SVM outperforms or is comparable to other techniques. - Highlights: → Realistic modeling of reliability demands complex mathematical formulations. → SVM is proper when the relation input/output is unknown or very costly to be obtained. → Results indicate the potential of SVM for reliability time series prediction. → Reliability estimates support the establishment of adequate maintenance strategies.

  20. State-level gonorrhea rates and expedited partner therapy laws: insights from time series analyses.

    Science.gov (United States)

    Owusu-Edusei, K; Cramer, R; Chesson, H W; Gift, T L; Leichliter, J S

    2017-06-01

    In this study, we examined state-level monthly gonorrhea morbidity and assessed the potential impact of existing expedited partner therapy (EPT) laws in relation to the time that the laws were enacted. Longitudinal study. We obtained state-level monthly gonorrhea morbidity (number of cases/100,000 for males, females and total) from the national surveillance data. We used visual examination (of morbidity trends) and an autoregressive time series model in a panel format with intervention (interrupted time series) analysis to assess the impact of state EPT laws based on the months in which the laws were enacted. For over 84% of the states with EPT laws, the monthly morbidity trends did not show any noticeable decreases on or after the laws were enacted. Although we found statistically significant decreases in gonorrhea morbidity within four of the states with EPT laws (Alaska, Illinois, Minnesota, and Vermont), there were no significant decreases when the decreases in the four states were compared contemporaneously with the decreases in states that do not have the laws. We found no impact (decrease in gonorrhea morbidity) attributable exclusively to the EPT law(s). However, these results do not imply that the EPT laws themselves were not effective (or failed to reduce gonorrhea morbidity), because the effectiveness of the EPT law is dependent on necessary intermediate events/outcomes, including sexually transmitted infection service providers' awareness and practice, as well as acceptance by patients and their partners. Published by Elsevier Ltd.
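
    Intervention (interrupted time series) analysis of this general kind can be sketched as a segmented regression with a step dummy that switches on in the month a law takes effect, plus a post-intervention trend term; this is a common textbook formulation rather than the specific autoregressive panel model used in the paper, and the data below are simulated placeholders, not CDC surveillance counts.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n_months, law_month = 120, 72
t = np.arange(n_months)
step = (t >= law_month).astype(float)                 # 1 from the month the EPT law is enacted
time_after = np.where(step == 1, t - law_month, 0.0)  # months elapsed since enactment
# Assumed true effect: a drop of 6 cases/100,000 plus a small downward slope change.
rate = 100 + 0.1 * t - 6.0 * step - 0.2 * time_after + rng.normal(scale=3.0, size=n_months)

X = sm.add_constant(np.column_stack([t, step, time_after]))
fit = sm.OLS(rate, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})  # HAC errors for autocorrelation
print("estimated level shift and slope change after the law:", np.round(fit.params[2:], 2))
```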

  1. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of shorter periods, the change in the linear regression parameters becomes a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
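
    The core of such an algorithm can be sketched as: slide a window over the two series, fit a linear regression in each window, map the (slope sign, significance) pair to a discrete pattern, and accumulate transitions between consecutive patterns as weighted directed edges. The window length, the pattern definition and the simulated data below are illustrative assumptions, not the authors' exact parameter intervals.

```python
import numpy as np
from collections import Counter
from scipy import stats

rng = np.random.default_rng(7)
n = 600
x = np.cumsum(rng.normal(size=n))
y = 0.5 * x + np.cumsum(rng.normal(scale=0.5, size=n))   # two related time series

window = 30
patterns = []
for start in range(0, n - window):
    seg_x, seg_y = x[start:start + window], y[start:start + window]
    slope, _, _, p_value, _ = stats.linregress(seg_x, seg_y)
    sign = "pos" if slope > 0 else "neg"
    sig = "sig" if p_value < 0.05 else "nonsig"
    patterns.append(f"{sign}/{sig}")                     # node label = regression pattern

# Directed, weighted edges: frequency of transitions between adjacent window patterns.
edges = Counter(zip(patterns[:-1], patterns[1:]))
for (src, dst), weight in edges.most_common(5):
    print(f"{src} -> {dst}: {weight}")
```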

  2. An approach for generating synthetic fine temporal resolution solar radiation time series from hourly gridded datasets

    Directory of Open Access Journals (Sweden)

    Matthew Perry

    2017-06-01

    Full Text Available A tool has been developed to statistically increase the temporal resolution of solar irradiance time series. Fine temporal resolution time series are an important input into the planning process for solar power plants, and lead to increased understanding of the likely short-term variability of solar energy. The approach makes use of the spatial variability of hourly gridded datasets around a location of interest to make inferences about the temporal variability within the hour. The unique characteristics of solar irradiance data are modelled by classifying each hour into a typical weather situation. Low variability situations are modelled using an autoregressive process which is applied to ramps of clear-sky index. High variability situations are modelled as a transition between states of clear sky conditions and different levels of cloud opacity. The methods have been calibrated to Australian conditions using 1 min data from four ground stations for a 10 year period. These stations, together with an independent dataset, have also been used to verify the quality of the results using a number of relevant metrics. The results show that the method generates realistic fine resolution synthetic time series. The synthetic time series correlate well with observed data on monthly and annual timescales as they are constrained to the nearest grid-point value on each hour. The probability distributions of the synthetic and observed global irradiance data are similar, with Kolmogorov-Smirnov test statistic less than 0.04 at each station. The tool could be useful for the estimation of solar power output for integration studies.

  3. Predictability of monthly temperature and precipitation using automatic time series forecasting methods

    Science.gov (United States)

    Papacharalampous, Georgia; Tyralis, Hristos; Koutsoyiannis, Demetris

    2018-02-01

    We investigate the predictability of monthly temperature and precipitation by applying automatic univariate time series forecasting methods to a sample of 985 40-year-long monthly temperature and 1552 40-year-long monthly precipitation time series. The methods include a naïve one based on the monthly values of the last year, as well as the random walk (with drift), AutoRegressive Fractionally Integrated Moving Average (ARFIMA), exponential smoothing state-space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (BATS), simple exponential smoothing, Theta and Prophet methods. Prophet is a recently introduced model inspired by the nature of time series forecasted at Facebook and has not been applied to hydrometeorological time series before, while the use of random walk, BATS, simple exponential smoothing and Theta is rare in hydrology. The methods are tested in performing multi-step ahead forecasts for the last 48 months of the data. We further investigate how different choices of handling the seasonality and non-normality affect the performance of the models. The results indicate that: (a) all the examined methods apart from the naïve and random walk ones are accurate enough to be used in long-term applications; (b) monthly temperature and precipitation can be forecasted to a level of accuracy which can barely be improved using other methods; (c) the externally applied classical seasonal decomposition results mostly in better forecasts compared to the automatic seasonal decomposition used by the BATS and Prophet methods; and (d) Prophet is competitive, especially when it is combined with externally applied classical seasonal decomposition.

  4. A comparative study of shallow groundwater level simulation with three time series models in a coastal aquifer of South China

    Science.gov (United States)

    Yang, Q.; Wang, Y.; Zhang, J.; Delgado, J.

    2017-05-01

    Accurate and reliable groundwater level forecasting models can help ensure the sustainable use of a watershed's aquifers for urban and rural water supply. In this paper, three time series analysis methods, Holt-Winters (HW), integrated time series (ITS), and seasonal autoregressive integrated moving average (SARIMA), are explored to simulate the groundwater level in a coastal aquifer in China. The monthly groundwater table depth data, collected in a long time series from 2000 to 2011, are simulated and compared using these three time series models. The error criteria are estimated using the coefficient of determination (R²), the Nash-Sutcliffe model efficiency coefficient (E), and the root-mean-squared error. The results indicate that all three models are accurate in reproducing the historical time series of groundwater levels. The comparison of the three models shows that the HW model is more accurate in predicting the groundwater levels than the SARIMA and ITS models. It is recommended that additional studies explore this proposed method, which can be used in turn to facilitate the development and implementation of more effective and sustainable groundwater management strategies.
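
    The three error criteria named above are straightforward to compute; a small helper is sketched below so a comparison of competing forecasts can be reproduced on any observed/simulated pair. The short arrays are placeholders, not the coastal-aquifer data from the study.

```python
import numpy as np

def evaluation_metrics(observed, simulated):
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    residual = observed - simulated
    r2 = np.corrcoef(observed, simulated)[0, 1] ** 2        # coefficient of determination
    nse = 1.0 - np.sum(residual ** 2) / np.sum((observed - observed.mean()) ** 2)  # Nash-Sutcliffe efficiency
    rmse = np.sqrt(np.mean(residual ** 2))                  # root-mean-squared error
    return r2, nse, rmse

obs = np.array([3.2, 3.5, 3.1, 2.9, 3.4, 3.8, 3.6])         # placeholder groundwater table depths (m)
sim = np.array([3.3, 3.4, 3.0, 3.0, 3.5, 3.7, 3.5])
print("R2=%.3f  NSE=%.3f  RMSE=%.3f" % evaluation_metrics(obs, sim))
```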

  5. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  6. An autoregressive growth model for longitudinal item analysis.

    Science.gov (United States)

    Jeon, Minjeong; Rabe-Hesketh, Sophia

    2016-09-01

    A first-order autoregressive growth model is proposed for longitudinal binary item analysis where responses to the same items are conditionally dependent across time given the latent traits. Specifically, the item response probability for a given item at a given time depends on the latent trait as well as the response to the same item at the previous time, or the lagged response. An initial conditions problem arises because there is no lagged response at the initial time period. We handle this problem by adapting solutions proposed for dynamic models in panel data econometrics. Asymptotic and finite sample power for the autoregressive parameters are investigated. The consequences of ignoring local dependence and the initial conditions problem are also examined for data simulated from a first-order autoregressive growth model. The proposed methods are applied to longitudinal data on Korean students' self-esteem.

  7. Hierarchical Meta-Learning in Time Series Forecasting for Improved Interference-Less Machine Learning

    Directory of Open Access Journals (Sweden)

    David Afolabi

    2017-11-01

    Full Text Available The importance of an interference-less machine learning scheme in time series prediction is crucial, as an oversight can have a negative cumulative effect, especially when predicting many steps ahead of the currently available data. The on-going research on noise elimination in time series forecasting has led to a successful approach of decomposing the data sequence into component trends to identify noise-inducing information. The empirical mode decomposition method separates the time series/signal into a set of intrinsic mode functions ranging from high to low frequencies, which can be summed up to reconstruct the original data. The usual assumption that random noises are only contained in the high-frequency component has been shown not to be the case, as observed in our previous findings. The results from that experiment reveal that noise can be present in a low frequency component, and this motivates the newly-proposed algorithm. Additionally, to prevent the erosion of periodic trends and patterns within the series, we perform the learning of local and global trends separately in a hierarchical manner which succeeds in detecting and eliminating short/long term noise. The algorithm is tested on four datasets from financial market data and physical science data. The simulation results are compared with the conventional and state-of-the-art approaches for time series machine learning, such as the non-linear autoregressive neural network and the long short-term memory recurrent neural network, respectively. Statistically significant performance gains are recorded when the meta-learning algorithm for noise reduction is used in combination with these artificial neural networks. For time series data which cannot be decomposed into meaningful trends, applying the moving average method to create meta-information for guiding the learning process is still better than the traditional approach. Therefore, this new approach is applicable to the forecasting

  8. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    Science.gov (United States)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

    A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we have retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either sleep or the period preceding seizure onset. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ(k)^(-1) ξ^k exp(-ξ). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
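
    Approximate entropy itself is simple to compute: count, for each template of length m, the fraction of templates within a tolerance r, and compare with templates of length m+1. The sketch below applies a standard ApEn implementation to gamma-distributed surrogate "heart rate fluctuation" series for two values of the shape parameter k; the choices m = 2 and r = 0.2 times the standard deviation are the usual conventions, not values quoted in the abstract.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, float)
    r = r_factor * x.std()
    def phi(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        counts = [np.mean(np.max(np.abs(templates - t), axis=1) <= r) for t in templates]
        return np.mean(np.log(counts))
    return phi(m) - phi(m + 1)

rng = np.random.default_rng(8)
for k in (1.0, 4.0):                       # gamma shape parameter controls the fluctuation statistics
    series = rng.gamma(shape=k, scale=1.0, size=800)
    print(f"k={k}: ApEn={approximate_entropy(series):.3f}")
```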

  9. A comparison of performance of several artificial intelligence methods for forecasting monthly discharge time series

    Science.gov (United States)

    Wang, Wen-Chuan; Chau, Kwok-Wing; Cheng, Chun-Tian; Qiu, Lin

    2009-08-01

    Summary: Developing a hydrological forecasting model based on past records is crucial to effective hydropower reservoir management and scheduling. Traditionally, time series analysis and modeling are used to build mathematical models for generating hydrologic records in hydrology and water resources. Artificial intelligence (AI), as a branch of computer science, is capable of analyzing long and large-scale hydrological data series. In recent years, applying AI technology to hydrological forecasting modeling has become a leading research topic. In this paper, autoregressive moving-average (ARMA) models, artificial neural network (ANN) approaches, adaptive neural-based fuzzy inference system (ANFIS) techniques, genetic programming (GP) models and the support vector machine (SVM) method are examined using long-term observations of monthly river flow discharges. Four quantitative standard statistical performance evaluation measures, the coefficient of correlation (R), the Nash-Sutcliffe efficiency coefficient (E), the root mean squared error (RMSE) and the mean absolute percentage error (MAPE), are employed to evaluate the performances of the various models developed. Two case study river sites are also provided to illustrate their respective performances. The results indicate that the best performance can be obtained by ANFIS, GP and SVM, in terms of different evaluation criteria, during the training and validation phases.

  10. Integrating uncertainty in time series population forecasts: An illustration using a simple projection model

    Directory of Open Access Journals (Sweden)

    Guy J. Abel

    2013-12-01

    Full Text Available Background: Population forecasts are widely used for public policy purposes. Methods to quantify the uncertainty in forecasts tend to ignore model uncertainty and to be based on a single model. Objective: In this paper, we use Bayesian time series models to obtain future population estimates with associated measures of uncertainty. The models are compared based on Bayesian posterior model probabilities, which are then used to provide model-averaged forecasts. Methods: The focus is on a simple projection model with the historical data representing population change in England and Wales from 1841 to 2007. Bayesian forecasts to the year 2032 are obtained based on a range of models, including autoregression models, stochastic volatility models and random variance shift models. The computational steps to fit each of these models using the OpenBUGS software via R are illustrated. Results: We show that the Bayesian approach is adept in capturing multiple sources of uncertainty in population projections, including model uncertainty. The inclusion of non-constant variance improves the fit of the models and provides more realistic predictive uncertainty levels. The forecasting methodology is assessed through fitting the models to various truncated data series.

  11. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.

  12. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. Its main purpose is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is that it links different methods for time series analysis, connects traditional data mining tools with time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, meaning it is not restricted to a fixed set of methods. First of all, it is a model for transforming time series values, which prepares data for different sets of methods based on the same transformation model within the problem space. The REFII model offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. A further advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  13. Comparison of extended mean-reversion and time series models for electricity spot price simulation considering negative prices

    International Nuclear Information System (INIS)

    Keles, Dogan; Genoese, Massimo; Möst, Dominik; Fichtner, Wolf

    2012-01-01

    This paper evaluates different financial price and time series models, such as mean reversion, autoregressive moving average (ARMA), integrated ARMA (ARIMA) and generalized autoregressive conditional heteroscedasticity (GARCH) processes, usually applied for electricity price simulations. However, as these models are developed to describe the stochastic behaviour of electricity prices, they are extended by a separate treatment of the deterministic components (trend, daily, weekly and annual cycles) of electricity spot prices. Furthermore, price jumps are considered and implemented within a regime-switching model. Since 2008, the market design of the European Energy Exchange has allowed negative prices, which have in fact occurred for several hours in recent years. Up to now, only a few financial and time series approaches exist that are able to capture negative prices. This paper presents a new approach incorporating negative prices. The evaluation of the different approaches points out that the mean reversion and the ARMA models deliver the lowest root mean square error between simulated and historical electricity spot prices obtained from the European Energy Exchange. These models also possess lower mean absolute errors than GARCH models. Hence, they are more suitable for simulating well-fitting price paths. Furthermore, it is shown that the daily structure of historical price curves is better captured by applying ARMA or ARIMA processes instead of mean-reversion or GARCH models. Another important outcome of the paper is that the regime-switching approach and the consideration of negative prices via the newly proposed approach lead to a significant improvement of the electricity price simulation. - Highlights: ► Considering negative prices improves the results of time-series and financial models for electricity prices. ► Regime-switching approach captures the jumps and base prices quite well. ► Removing and separate modelling of deterministic annual, weekly and daily

  14. Time series analysis of onchocerciasis data from Mexico: a trend towards elimination.

    Science.gov (United States)

    Lara-Ramírez, Edgar E; Rodríguez-Pérez, Mario A; Pérez-Rodríguez, Miguel A; Adeleke, Monsuru A; Orozco-Algarra, María E; Arrendondo-Jiménez, Juan I; Guo, Xianwu

    2013-01-01

    In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011, using the software R. A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data come mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases two years ahead. According to the ARIMA model predictions, a very low number of cases (below one) is expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. The endemic regions of Mexico evolved from states with high onchocerciasis endemicity to the interruption of transmission, owing to the strategies followed by the MSH based on treatment with ivermectin. The extremely low level of expected cases predicted by the ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series to predict the case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance.
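
    The workflow reported here (fit candidate ARIMA orders, keep the one with the smallest AIC, then forecast two years ahead) can be sketched as a small grid search. The simulated monthly case counts and the candidate order grid below are assumptions for illustration, not the Mexican surveillance data or the orders selected in the paper.

```python
import itertools
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)
months = np.arange(288)                                   # 1988-2011, monthly
cases = np.round(np.exp(4.0 - 0.012 * months) + rng.poisson(2, size=months.size)).astype(float)

best_aic, best_order, best_fit = np.inf, None, None
for p, d, q in itertools.product(range(3), range(2), range(3)):   # small candidate grid
    try:
        fit = ARIMA(cases, order=(p, d, q)).fit()
        if fit.aic < best_aic:
            best_aic, best_order, best_fit = fit.aic, (p, d, q), fit
    except Exception:
        continue                                          # skip orders that fail to estimate

forecast = best_fit.forecast(steps=24)                    # two years ahead
print(f"best order {best_order}, AIC={best_aic:.1f}, mean predicted monthly cases={forecast.mean():.2f}")
```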

  15. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  16. A Two-Factor Autoregressive Moving Average Model Based on Fuzzy Fluctuation Logical Relationships

    Directory of Open Access Journals (Sweden)

    Shuang Guan

    2017-10-01

    Full Text Available Many of the existing autoregressive moving average (ARMA) forecast models are based on one main factor. In this paper, we propose a new two-factor first-order ARMA forecast model based on the fuzzy fluctuation logical relationships of both a main factor and a secondary factor of a historical training time series. Firstly, we generate a fluctuation time series (FTS) for the two factors by calculating the difference of each data point from its previous day, then find the absolute means of the two FTSs. We then construct a fuzzy fluctuation time series (FFTS) according to the defined linguistic sets. The next step is establishing fuzzy fluctuation logical relation groups (FFLRGs) for a two-factor first-order autoregressive (AR(1)) model and forecasting the training data with the AR(1) model. Then we build FFLRGs for a two-factor first-order autoregressive moving average (ARMA(1,m)) model. Lastly, we forecast the test data with the ARMA(1,m) model. To illustrate the performance of our model, we used the real Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and Dow Jones datasets, with the latter as a secondary factor to forecast TAIEX. The experimental results indicate that the proposed two-factor fluctuation ARMA method outperformed the one-factor method based on real historic data. The secondary factor may have some effect on the main factor and thereby impact the forecasting results. Using fuzzified fluctuations rather than fuzzified real data can avoid the influence of extreme values in the historic data, which affect forecasting negatively. To verify the accuracy and effectiveness of the model, we also employed our method to forecast the Shanghai Stock Exchange Composite Index (SHSECI) from 2001 to 2015 and the international gold price from 2000 to 2010.

  17. Forecasting hourly global solar radiation using hybrid k-means and nonlinear autoregressive neural network models

    International Nuclear Information System (INIS)

    Benmouiza, Khalil; Cheknane, Ali

    2013-01-01

    Highlights: • An unsupervised clustering algorithm combined with a neural network model was explored. • Forecasting results for solar radiation time series were simulated and their performance compared. • A new method combining the k-means algorithm and a NAR network was proposed to provide better prediction results. - Abstract: In this paper, we review our work on forecasting hourly global horizontal solar radiation based on the combination of the unsupervised k-means clustering algorithm and artificial neural networks (ANN). The k-means algorithm focuses on extracting useful information from the data, with the aim of modeling the time series behaviour and finding patterns in the input space by clustering the data. Nonlinear autoregressive (NAR) neural networks, on the other hand, are powerful computational models for modeling and forecasting nonlinear time series. Taking advantage of both methods, a new method combining the k-means algorithm and the NAR network was proposed to provide better forecasting results.

  18. Assessment and prediction of road accident injuries trend using time-series models in Kurdistan.

    Science.gov (United States)

    Parvareh, Maryam; Karimi, Asrin; Rezaei, Satar; Woldemichael, Abraha; Nili, Sairan; Nouri, Bijan; Nasab, Nader Esmail

    2018-01-01

    Road traffic accidents are commonly encountered incidents that can cause high-intensity injuries to the victims and have direct impacts on the members of society. Iran has one of the highest incidence rates of road traffic accidents. The objective of this study was to model the patterns of road traffic accidents leading to injury in Kurdistan province, Iran. A time-series analysis was conducted to characterize and predict the frequency of road traffic accidents that lead to injury in Kurdistan province. The injuries were categorized into three separate groups related to car occupant, motorcyclist and pedestrian road traffic accident injuries. The Box-Jenkins time-series approach was used to model the injury observations, applying autoregressive integrated moving average (ARIMA) and seasonal autoregressive integrated moving average (SARIMA) models from March 2009 to February 2015, and to predict the accidents up to 24 months ahead (February 2017). The analysis was carried out using the R-3.4.2 statistical software package. A total of 5199 pedestrian, 9015 motorcyclist, and 28,906 car occupant accidents were observed. The mean numbers of car occupant, motorcyclist and pedestrian accident injuries observed were 401.01 (SD 32.78), 123.70 (SD 30.18) and 71.19 (SD 17.92) per month, respectively. The best models for the patterns of car occupant, motorcyclist, and pedestrian injuries were ARIMA(1,0,0), SARIMA(1,0,2)(1,0,0)12, and SARIMA(1,1,1)(0,0,1)12, respectively. The motorcyclist and pedestrian injuries showed a seasonal pattern, with a peak during summer (August). The minimum frequencies for motorcyclist and pedestrian injuries were observed during late autumn and early winter (December and January). Our findings revealed that the observed motorcyclist and pedestrian injuries had a seasonal pattern that was explained by air temperature changes over time. These findings call for close monitoring of the

  19. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    Science.gov (United States)

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.

  20. Track Irregularity Time Series Analysis and Trend Forecasting

    OpenAIRE

    Jia Chaolong; Xu Weixiang; Wang Futian; Wang Hanning

    2012-01-01

    The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the underlying relationships within the time series data. In this paper, GM (1,1) is based on first-order, single-variable linear differential equations; after an adaptive improvement and error correction, it is used to predict the long-term changin...
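
    For reference, the GM(1,1) step mentioned in the abstract fits the whitened first-order equation to the accumulated (cumulative-sum) series and then differences the prediction back. A compact sketch under the textbook formulation is given below; the short irregularity series is made up for illustration and the adaptive improvement and error correction steps of the paper are not reproduced.

```python
import numpy as np

def gm11_forecast(x, steps):
    """Grey model GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series x1 = cumsum(x)."""
    x = np.asarray(x, float)
    x1 = np.cumsum(x)
    z = 0.5 * (x1[1:] + x1[:-1])                          # background (mean generating) sequence
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    k = np.arange(len(x) + steps)
    x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a      # prediction of the accumulated series
    x_hat = np.concatenate([[x[0]], np.diff(x1_hat)])     # restore by first-order differencing
    return x_hat[len(x):]

irregularity = [4.2, 4.5, 4.9, 5.4, 6.0, 6.5, 7.2]        # made-up track irregularity index
print(np.round(gm11_forecast(irregularity, steps=3), 2))
```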

  1. A Comparative Analysis of Short Time Series Processing Methods

    OpenAIRE

    Kiršners, A; Borisovs, A

    2012-01-01

    This article analyzes the traditional time series processing methods that are used to perform the task of short time series analysis in demand forecasting. The main aim of this paper is to scrutinize the ability of these methods to be used when analyzing short time series. The analyzed methods include exponential smoothing, exponential smoothing with the development trend and moving average method. The paper gives the description of the structure and main operating princi...

  2. Bag-of-Temporal-SIFT-Words for Time Series Classification

    OpenAIRE

    Bailly , Adeline; Malinowski , Simon; Tavenard , Romain; Guyet , Thomas; Chapel , Laetitia

    2015-01-01

    Time series classification is an application of particular interest with the increase of data to monitor. Classical techniques for time series classification rely on point-to-point distances. Recently, Bag-of-Words approaches have been used in this context. Words are quantized versions of simple features extracted from sliding windows. The SIFT framework has proved efficient for image classification. In this paper, we design a time series classification scheme that bui...

  3. “TIME SERIES WORKSHOP” OBSERVATIONS DATA PROCESSING TOOL

    OpenAIRE

    Shapovalova, L. L

    2017-01-01

    A new tool for mathematical and visual processing of time series is presented. The program ”Time Series WorkShop” (TSW) is specialized for processing visual observations of variable stars. The open structure of the program allows applying both established and new mathematical methods for searching for any parameters of variability. The program also allows visualizing the time series and any calculation results (periodograms, histograms, light curves and their smoothing curves) in a camera-ready form. The foll...

  4. Capturing Structure Implicitly from Time-Series having Limited Data

    OpenAIRE

    Emaasit, Daniel; Johnson, Matthew

    2018-01-01

    Scientific fields such as insider-threat detection and highway-safety planning often lack sufficient amounts of time-series data to estimate statistical models for the purpose of scientific discovery. Moreover, the available limited data are quite noisy. This presents a major challenge when estimating time-series models that are robust to overfitting and have well-calibrated uncertainty estimates. Most of the current literature in these fields involves visualizing the time-series for noticeabl...

  5. SEM Based CARMA Time Series Modeling for Arbitrary N.

    Science.gov (United States)

    Oud, Johan H L; Voelkle, Manuel C; Driver, Charles C

    2018-01-01

    This article explains in detail the state space specification and estimation of first and higher-order autoregressive moving-average models in continuous time (CARMA) in an extended structural equation modeling (SEM) context for N = 1 as well as N > 1. To illustrate the approach, simulations will be presented in which a single panel model (T = 41 time points) is estimated for a sample of N = 1,000 individuals as well as for samples of N = 100 and N = 50 individuals, followed by estimating 100 separate models for each of the one-hundred N = 1 cases in the N = 100 sample. Furthermore, we will demonstrate how to test the difference between the full panel model and each N = 1 model by means of a subject-group-reproducibility test. Finally, the proposed analyses will be applied in an empirical example, in which the relationships between mood at work and mood at home are studied in a sample of N = 55 women. All analyses are carried out by ctsem, an R-package for continuous time modeling, interfacing to OpenMx.
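
    One building block of continuous-time AR modeling is that a continuous-time AR(1) (Ornstein-Uhlenbeck) process has an exact discrete-time AR(1) representation whose autoregressive coefficient is exp(a*dt). The short simulation below illustrates this generic textbook identity; it is not the ctsem state-space estimator, and the drift, diffusion and sampling interval are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
a, sigma, dt, n = -0.5, 1.0, 0.1, 100_000       # drift a < 0, diffusion sigma, sampling interval dt

# Exact discretization of dx = a*x dt + sigma dW at interval dt.
ar_coef = np.exp(a * dt)
innov_sd = np.sqrt(sigma ** 2 * (np.exp(2 * a * dt) - 1.0) / (2.0 * a))

eps = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = ar_coef * x[t - 1] + innov_sd * eps[t]

lag1 = np.corrcoef(x[1:], x[:-1])[0, 1]
print(f"implied AR coefficient exp(a*dt)={ar_coef:.4f}, empirical lag-1 autocorrelation={lag1:.4f}")
```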

  6. The sample autocorrelation function of non-linear time series

    NARCIS (Netherlands)

    Basrak, Bojan

    2000-01-01

    When studying a real-life time series, it is frequently reasonable to assume, possibly after a suitable transformation, that the data come from a stationary time series (Xt). This means that the finite-dimensional distributions of this sequence are invariant under shifts of time. Various stationary

  7. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  8. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  9. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This article concerns the application of stochastic time series analysis to hydrology data for seasonal prediction, and the different statistical tests for predicting hydrology time series with the Thomas-Fiering model. Hydrological time series of flood flows have received a great deal of consideration worldwide. Interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.
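
    For readers unfamiliar with the Thomas-Fiering model, the sketch below shows the classical seasonal lag-1 recursion it is built on, in Python with synthetic monthly flows standing in for observed data; the wrap-around December-to-January correlation is a simplification.

    ```python
    import numpy as np

    # Minimal sketch of the Thomas-Fiering monthly streamflow generator
    # (the classical seasonal lag-1 model referenced above).  The flows below are
    # synthetic placeholders; a real study would use observed monthly flows
    # arranged as years x 12 months.

    rng = np.random.default_rng(42)
    flows = np.exp(rng.normal(3.0, 0.4, size=(30, 12)))   # 30 years of monthly flows

    mean = flows.mean(axis=0)                              # monthly means
    std = flows.std(axis=0, ddof=1)                        # monthly standard deviations
    # lag-1 correlation between month j and month j+1 (wrapping within the year
    # for simplicity)
    r = np.array([np.corrcoef(flows[:, j], np.roll(flows, -1, axis=1)[:, j])[0, 1]
                  for j in range(12)])

    def generate(n_years, q0=None):
        """Generate synthetic monthly flows with the Thomas-Fiering recursion."""
        q = mean[0] if q0 is None else q0
        out = []
        j = 0                                              # current month index
        for _ in range(n_years * 12):
            nxt = (j + 1) % 12
            b = r[j] * std[nxt] / std[j]                   # regression coefficient
            q = mean[nxt] + b * (q - mean[j]) \
                + rng.normal() * std[nxt] * np.sqrt(1 - r[j] ** 2)
            out.append(q)
            j = nxt
        return np.array(out)

    synthetic = generate(100)
    print(synthetic[:12].round(1))
    ```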

  10. Interpretable Early Classification of Multivariate Time Series

    Science.gov (United States)

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  11. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from the perspectives of both time and frequency, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, in this book we present various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, little studied until now. Part of the results are accompanied by their R code.

  12. A Bayesian Infinite Hidden Markov Vector Autoregressive Model

    NARCIS (Netherlands)

    D. Nibbering (Didier); R. Paap (Richard); M. van der Wel (Michel)

    2016-01-01

    textabstractWe propose a Bayesian infinite hidden Markov model to estimate time-varying parameters in a vector autoregressive model. The Markov structure allows for heterogeneity over time while accounting for state-persistence. By modelling the transition distribution as a Dirichlet process mixture

  13. Assessing Local Turbulence Strength from a Time Series

    Directory of Open Access Journals (Sweden)

    Mayer Humi

    2010-01-01

    Full Text Available We study the possible link between “local turbulence strength” in a flow which is represented by a finite time series and a “chaotic invariant”, namely, the leading Lyapunov exponent that characterizes this series. To validate a conjecture about this link, we analyze several time series of measurements taken by a plane flying at constant height in the upper troposphere. For each of these time series we estimate the leading Lyapunov exponent which we then correlate with the structure constants for the temperature. In addition, we introduce a quantitative technique to educe the scale contents of the flow and a methodology to validate its spectrum.

  14. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    % or higher increase (or, alternatively, decrease) in a chosen property of the stock (e.g. close-value) within a given time-window (e.g. 5 days). Initial results from a first prototype implementation of the architecture show that after training on a large set of stocks, the system is capable of finding...

  15. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations. Keywords. Cantor set; time series; earthquake; market crash. PACS Nos 05.00; 02.50.-r; 64.60; 89.65.Gh; 95.75.Wx. 1. Introduction. Capturing dynamical patterns of ...

  16. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    erated recursively up to any step greater than one. For nonlinear time series model, point forecast for step one can be done easily like in the linear case but forecast for a step greater than or equal to ..... London. Franses, P. H. (1998). Time series models for business and Economic forecasting, Cambridge University Press.

  17. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
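
    A rough sketch of the two core steps named above, symbolic aggregate approximation (SAX) followed by counting transitions between consecutive symbols into a per-group bag of patterns, is given below. It is not the authors' implementation; the alphabet size, segment count and synthetic groups are illustrative choices.

    ```python
    import numpy as np
    from collections import Counter
    from scipy.stats import norm

    # Rough sketch of the idea behind "transition icons": convert each series to a
    # SAX symbol string, then count transitions between consecutive symbols and
    # aggregate the counts per group.  All parameters here are illustrative.

    def sax_symbols(x, n_segments=20, alphabet=4):
        x = (x - x.mean()) / (x.std() + 1e-12)             # z-normalize
        # piecewise aggregate approximation (PAA): mean of each segment
        paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
        # breakpoints splitting a standard normal into equiprobable regions
        breakpoints = norm.ppf(np.arange(1, alphabet) / alphabet)
        return np.digitize(paa, breakpoints)                # symbols 0..alphabet-1

    def transition_counts(x, **kw):
        s = sax_symbols(x, **kw)
        return Counter(zip(s[:-1], s[1:]))                  # bag of transition patterns

    rng = np.random.default_rng(1)
    group_a = [np.cumsum(rng.normal(size=200)) for _ in range(10)]     # e.g. one cohort
    group_b = [np.sin(np.linspace(0, 8 * np.pi, 200)) + rng.normal(0, .2, 200)
               for _ in range(10)]                                      # another cohort

    bag_a = sum((transition_counts(x) for x in group_a), Counter())
    bag_b = sum((transition_counts(x) for x in group_b), Counter())
    print("group A most common transitions:", bag_a.most_common(3))
    print("group B most common transitions:", bag_b.most_common(3))
    ```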

  18. Time series analyses of mean monthly rainfall for drought ...

    African Journals Online (AJOL)

    This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the Nigeria Meteorological Agency (NIMET), Sokoto Airport; Sokoto. The data was subjected to time series tests (trend, cycle, seasonal ...

  19. Critical values for unit root tests in seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); B. Hobijn (Bart)

    1997-01-01

    textabstractIn this paper, we present tables with critical values for a variety of tests for seasonal and non-seasonal unit roots in seasonal time series. We consider (extensions of) the Hylleberg et al. and Osborn et al. test procedures. These extensions concern time series with increasing seasonal

  20. Time Series Econometrics for the 21st Century

    Science.gov (United States)

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  1. Time series forecasting based on deep extreme learning machine

    NARCIS (Netherlands)

    Guo, Xuqi; Pang, Y.; Yan, Gaowei; Qiao, Tiezhu; Yang, Guang-Hong; Yang, Dan

    2017-01-01

    Multi-layer Artificial Neural Networks (ANN) has caught widespread attention as a new method for time series forecasting due to the ability of approximating any nonlinear function. In this paper, a new local time series prediction model is established with the nearest neighbor domain theory, in

  2. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities in the features of the time series for the (model earthquakes or) overlap of two Cantor sets when one set moves with uniform relative velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  3. 461 TIME SERIES ANALYSES OF MEAN MONTHLY RAINFALL ...

    African Journals Online (AJOL)

    Osondu

    Abstract. This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the Nigeria Meteorological Agency (NIMET), Sokoto Airport, Sokoto. The data was subjected to time series tests (trend, cycle ...

  4. Time series prediction of apple scab using meteorological ...

    African Journals Online (AJOL)

    A new prediction model for the early warning of apple scab is proposed in this study. The method is based on artificial intelligence and time series prediction. The infection period of apple scab was evaluated as the time series prediction model instead of summation of wetness duration. Also, the relations of different ...

  5. A Dynamic Fuzzy Cluster Algorithm for Time Series

    Directory of Open Access Journals (Sweden)

    Min Ji

    2013-01-01

    clustering time series by introducing the definition of key point and improving FCM algorithm. The proposed algorithm works by determining those time series whose class labels are vague and further partitions them into different clusters over time. The main advantage of this approach compared with other existing algorithms is that the property of some time series belonging to different clusters over time can be partially revealed. Results from simulation-based experiments on geographical data demonstrate the excellent performance and the desired results have been obtained. The proposed algorithm can be applied to solve other clustering problems in data mining.

  6. Auto-correlograms and auto-regressive models of trace metal distributions in Cochin backwaters

    Digital Repository Service at National Institute of Oceanography (India)

    Jayalakshmy, K.V.; Sankaranarayanan, V.N.

    ,2 and 3 and for Zn at stations 1 and 4. The stability in time for the concentration profiles increases in the order Fe, Mn, Ni, Cu, Co. The fraction of variability in the variables obtained by the auto-regressive model of order 1 ranges from 20 to 50%. Auto-regressive...

  7. A new algorithm for automatic Outlier Detection in GPS Time Series

    Science.gov (United States)

    Cannavo', Flavio; Mattia, Mario; Rossi, Massimo; Palano, Mimmo; Bruno, Valentina

    2010-05-01

    Nowadays continuous GPS time series are considered a crucial product of GPS permanent networks, useful in many geo-science fields, such as active tectonics, seismology, crustal deformation and volcano monitoring (Altamimi et al. 2002, Elósegui et al. 2006, Aloisi et al. 2009). Although the GPS data elaboration software has increased in reliability, the time series are still affected by different kinds of noise, from the intrinsic noise (e.g. tropospheric delay) to the un-modeled noise (e.g. cycle slips, satellite faults, parameter changes). Typically GPS time series present characteristic noise that is a linear combination of white noise and correlated colored noise, and this characteristic is fractal in the sense that it is evident at every considered time scale or sampling rate. The un-modeled noise sources result in spikes, outliers and steps. These kinds of errors can appreciably influence the estimation of velocities of the monitored sites. Outlier detection in generic time series is a widely treated problem in the literature (Wei, 2005), while it is not fully developed for the specific kind of GPS series. We propose a robust automatic procedure for cleaning the GPS time series from outliers and, especially for long daily series, steps due to strong seismic or volcanic events or merely instrumentation changes such as antenna and receiver upgrades. The procedure is basically divided into two steps: a first step for colored noise reduction and a second step for outlier detection through adaptive series segmentation. Both algorithms present novel ideas and are nearly unsupervised. In particular, we propose an algorithm to estimate an autoregressive model for colored noise in GPS time series in order to subtract the effect of non-Gaussian noise on the series. This step is useful for the subsequent step (i.e. adaptive segmentation) which requires the hypothesis of Gaussian noise. The proposed algorithms are tested in a benchmark case study and the results
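
    The general two-step idea described in this abstract can be illustrated with a small sketch: fit an autoregressive model to absorb the colored noise, then flag outliers on the roughly whitened residuals. This is not the authors' adaptive-segmentation procedure; the AR order, the robust z-score threshold and the synthetic daily series are arbitrary choices.

    ```python
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    # Hedged sketch of the two-step idea: (1) reduce colored noise by fitting an
    # autoregressive model and working with its residuals, then (2) flag outliers
    # on the (approximately white) residuals with a robust z-score.

    rng = np.random.default_rng(0)
    n = 1500
    series = np.cumsum(rng.normal(0, 0.2, n)) * 0.05 + rng.normal(0, 1, n)  # toy daily series
    series[[300, 800, 801]] += [8, -10, -9]                                  # injected outliers

    fit = AutoReg(series, lags=5).fit()          # AR(5) model of the colored noise
    resid = fit.resid                            # roughly whitened residuals

    mad = np.median(np.abs(resid - np.median(resid)))
    z = (resid - np.median(resid)) / (1.4826 * mad)      # robust z-score
    outliers = np.where(np.abs(z) > 4)[0] + 5            # shift by the AR order
    print("flagged epochs:", outliers)
    ```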

  8. Analysis of forecasting malaria case with climatic factors as predictor in Mandailing Natal Regency: a time series study

    Science.gov (United States)

    Aulia, D.; Ayu, S. F.; Matondang, A.

    2018-01-01

    Malaria is a contagious disease of global concern. As a public health problem with outbreaks, it affects quality of life and the economy, and can also lead to death. Therefore, this research aims to forecast malaria cases with climatic factors as predictors in Mandailing Natal Regency. The total number of positive malaria cases from January 2008 to December 2016 was taken from the health department of Mandailing Natal Regency. Climate data such as rainfall, humidity, and temperature were taken from the Center of Statistic Department of Mandailing Natal Regency. E-views ver. 9 was used for the analysis. The autoregressive integrated moving average model ARIMA (0,1,1) (1,0,0)12 is the best model, explaining 67.2% of the variability of the data in this time series study. Rainfall (P value = 0.0005), temperature (P value = 0.0029) and humidity (P value = 0.0001) are significant predictors of malaria transmission. The seasonal adjusted factor (SAF) shows peaks in malaria cases in November and March.
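
    For reference, a seasonal ARIMA of the order reported above, with climate covariates entered as exogenous regressors, can be specified in a few lines with statsmodels; the sketch below uses synthetic placeholder data rather than the Mandailing Natal records.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Minimal sketch of the kind of model reported above: a seasonal ARIMA
    # (0,1,1)(1,0,0)[12] for monthly malaria counts with rainfall, temperature
    # and humidity as exogenous regressors.  All data are synthetic placeholders.

    rng = np.random.default_rng(3)
    idx = pd.date_range("2008-01", periods=108, freq="MS")
    climate = pd.DataFrame({"rain": rng.gamma(2, 50, 108),
                            "temp": 27 + rng.normal(0, 1, 108),
                            "humid": 80 + rng.normal(0, 5, 108)}, index=idx)
    cases = pd.Series(50 + 0.1 * climate["rain"] + rng.normal(0, 10, 108), index=idx)

    model = SARIMAX(cases, exog=climate,
                    order=(0, 1, 1), seasonal_order=(1, 0, 0, 12))
    res = model.fit(disp=False)
    print(res.summary().tables[1])

    # 12-month-ahead forecast, holding the climate covariates at their means
    future_exog = pd.DataFrame(np.tile(climate.mean().values, (12, 1)),
                               columns=climate.columns,
                               index=pd.date_range("2017-01", periods=12, freq="MS"))
    print(res.get_forecast(12, exog=future_exog).predicted_mean.round(1))
    ```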

  9. Frontiers in Time Series and Financial Econometrics : An overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    markdownabstract__Abstract__ Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  10. Frontiers in Time Series and Financial Econometrics: An Overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    markdownabstract__Abstract__ Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  11. Effectiveness of Multivariate Time Series Classification Using Shapelets

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of signal features in space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series which does not require enhancement of the signal features. The method uses shapelets of time series (time series shapelets), i.e. small fragments of the series that reflect the properties of one of its classes most of all. Despite the significant number of publications on the theory and applications of shapelets for the classification of time series, the task of evaluating the effectiveness of this technique remains relevant. An objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, which is a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the shapelet-based basic method of binary classification, as well as various generalizations and a proposed modification of the method. It also offers software that implements the modified method and results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and the software allow us to reach a classification accuracy of about 85% at best. The shapelet search time increases in proportion to the input data dimension.

  12. Period Estimation in Astronomical Time Series Using Slotted Correntropy

    OpenAIRE

    Huijse, Pablo; Estévez, Pablo A.; Zegers, Pablo; Príncipe, José; Protopapas, Pavlos

    2011-01-01

    In this letter, we propose a method for period estimation in light curves from periodic variable stars using correntropy. Light curves are astronomical time series of stellar brightness over time, and are characterized as being noisy and unevenly sampled. We propose to use slotted time lags in order to estimate correntropy directly from irregularly sampled time series. A new information theoretic metric is proposed for discriminating among the peaks of the correntropy spectral density. The sl...

  13. Time-Series Analysis of the Impact of Prescription Drug Monitoring Programs on Heroin Treatment Admissions.

    Science.gov (United States)

    Branham, Douglas Keith

    2018-03-21

    Prescription drug abuse has become a major issue in the United States in recent years. Prescription drug monitoring programs (PDMPs) are designed to help health care providers to prevent such abuses. There may be unintended effects of these programs. Specifically, PDMPs may lead prescription opioid users to begin using heroin. This article aims to evaluate the impact of PDMPs on heroin abuse across several different states through use of treatment admissions records obtained from the Treatment Episode Data Set. Operational dates and other characteristics of state PDMPs were obtained from the Prescription Drug Monitoring Program Training and Technical Assistance Center. Data for the dependent variable were collected from the Treatment Episodes Data Set from 1992 to 2012. Interrupted time-series analyses using autoregressive integrated moving average modeling were used to estimate the effect of the presence of an operational PDMP on the number of admissions reporting heroin as their primary drug being used. The relationship between heroin admissions and prescription opioid admissions was significant for the average data (β = 0.41, p = 0.0017) and the 5-year data (β = 0.5, p = 0.036), both showing positive associations between heroin and prescription drug admissions in states in the post PDMP implementation period. Conclusions/Importance: The study found a positive relationship between heroin and prescription opioid admissions post PDMP implementation. Future research should attempt to identify what this relationship means and how this information can be used to improve opioid policy.
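
    The interrupted time-series setup used here can be sketched generically: a step dummy switching on at the intervention date is added as a regressor to an ARIMA model, and its coefficient estimates the level shift. The example below is a hedged illustration with invented data, dates and magnitudes, not the article's analysis.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Rough sketch of an interrupted time series analysis with ARIMA errors: a
    # step dummy switches on at a (hypothetical) PDMP implementation date and its
    # coefficient estimates the level shift in monthly admissions.

    rng = np.random.default_rng(7)
    idx = pd.date_range("1992-01", periods=252, freq="MS")
    step = (idx >= "2006-01").astype(float)              # hypothetical policy start
    admissions = 200 + np.cumsum(rng.normal(0, 3, 252)) + 25 * step \
                 + rng.normal(0, 10, 252)

    res = SARIMAX(pd.Series(admissions, index=idx),
                  exog=pd.Series(step, index=idx, name="pdmp"),
                  order=(1, 1, 1)).fit(disp=False)
    print(res.params["pdmp"], res.pvalues["pdmp"])       # estimated level shift and p-value
    ```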

  14. Deterministic decomposition and seasonal ARIMA time series models applied to airport noise forecasting

    Science.gov (United States)

    Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine

    2017-06-01

    One of the most hazardous physical polluting agents, considering its effects on human health, is acoustical noise. Airports are a strong source of acoustical noise, due to airplane turbines, the aerodynamic noise of transits, the acceleration or braking during the take-off and landing phases of aircraft, the road traffic around the airport, etc. The monitoring and the prediction of the acoustical level emitted by airports can be very useful to assess the impact on human health and activities. In the airport noise scenario, thanks to flight scheduling, the predominant sources may have a periodic behaviour. Thus, a Time Series Analysis approach can be adopted, considering that a general trend and a seasonal behaviour can be highlighted and used to build a predictive model. In this paper, two different approaches are adopted, thus two predictive models are constructed and tested. The first model is based on deterministic decomposition and is built composing the trend, that is the long term behaviour, the seasonality, that is the periodic component, and the random variations. The second model is based on seasonal autoregressive moving average, and it belongs to the stochastic class of models. The two different models are fitted on an acoustical level dataset collected close to the Nice (France) international airport. Results are encouraging and show good prediction performance for both of the adopted strategies. A residual analysis is performed, in order to quantify the forecasting error features.

  15. Modeling annual Coffee production in Ghana using ARIMA time series Model

    Directory of Open Access Journals (Sweden)

    E. Harris

    2013-07-01

    Full Text Available In the international commodity trade, coffee, which represents the world’s most valuable tropical agricultural commodity, comes next to oil. Indeed, it is estimated that about 40 million people in the major producing countries in Africa derive their livelihood from coffee, with Africa accounting for about 12 per cent of global production. The paper applied the Autoregressive Integrated Moving Average (ARIMA) time series model to study the behavior of Ghana’s annual coffee production as well as to make five-year forecasts. Annual coffee production data from 1990 to 2010 was obtained from the Ghana Cocoa Board and analyzed using ARIMA. The results showed that in general, the trend of Ghana’s total coffee production follows an upward and downward movement. The best model arrived at on the basis of various diagnostics, selection and an evaluation criterion was ARIMA (0,3,1). Finally, the forecast figures based on the Box-Jenkins method showed that Ghana’s annual coffee production will decrease continuously in the next five (5) years, all things being equal
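
    Fitting the reported ARIMA(0,3,1) specification and producing a five-year forecast is straightforward with statsmodels, as the sketch below shows; the production figures are placeholders, not Ghana Cocoa Board data.

    ```python
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Minimal sketch: fit the ARIMA(0,3,1) specification reported above to an
    # annual production series and make a five-year forecast.  Figures are made up.

    years = pd.period_range("1990", "2010", freq="Y")
    production = pd.Series(
        [2.5, 2.9, 3.1, 2.2, 1.8, 2.4, 3.0, 3.3, 2.7, 2.1, 1.9,
         2.6, 3.2, 2.8, 2.3, 2.0, 2.5, 2.9, 2.4, 2.2, 2.1],
        index=years)                                     # thousand tonnes (illustrative)

    res = ARIMA(production, order=(0, 3, 1)).fit()
    print(res.forecast(steps=5).round(2))                # next five years
    ```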

  16. Burden of salmonellosis, campylobacteriosis and listeriosis: a time series analysis, Belgium, 2012 to 2020.

    Science.gov (United States)

    Maertens de Noordhout, Charline; Devleesschauwer, Brecht; Haagsma, Juanita A; Havelaar, Arie H; Bertrand, Sophie; Vandenberg, Olivier; Quoilin, Sophie; Brandt, Patrick T; Speybroeck, Niko

    2017-09-21

    Salmonellosis, campylobacteriosis and listeriosis are food-borne diseases. We estimated and forecasted the number of cases of these three diseases in Belgium from 2012 to 2020, and calculated the corresponding number of disability-adjusted life years (DALYs). The salmonellosis time series was fitted with a Bai and Perron two-breakpoint model, while a dynamic linear model was used for campylobacteriosis and a Poisson autoregressive model for listeriosis. The average monthly number of cases of salmonellosis was 264 (standard deviation (SD): 86) in 2012 and predicted to be 212 (SD: 87) in 2020; campylobacteriosis case numbers were 633 (SD: 81) and 1,081 (SD: 311); listeriosis case numbers were 5 (SD: 2) in 2012 and 6 (SD: 3) in 2014. After applying correction factors, the estimated DALYs for salmonellosis were 102 (95% uncertainty interval (UI): 8-376) in 2012 and predicted to be 82 (95% UI: 6-310) in 2020; campylobacteriosis DALYs were 1,019 (95% UI: 137-3,181) and 1,736 (95% UI: 178-5,874); listeriosis DALYs were 208 (95% UI: 192-226) in 2012 and 252 (95% UI: 200-307) in 2014. New actions are needed to reduce the risk of food-borne infection with Campylobacter spp. because campylobacteriosis incidence may almost double through 2020. This article is copyright of The Authors, 2017.

  17. Reduction in male suicide mortality following the 2006 Russian alcohol policy: an interrupted time series analysis.

    Science.gov (United States)

    Pridemore, William Alex; Chamlin, Mitchell B; Andreev, Evgeny

    2013-11-01

    We took advantage of a natural experiment to assess the impact on suicide mortality of a suite of Russian alcohol policies. We obtained suicide counts from anonymous death records collected by the Russian Federal State Statistics Service. We used autoregressive integrated moving average (ARIMA) interrupted time series techniques to model the effect of the alcohol policy (implemented in January 2006) on monthly male and female suicide counts between January 2000 and December 2010. Monthly male and female suicide counts decreased during the period under study. Although the ARIMA analysis showed no impact of the policy on female suicide mortality, the results revealed an immediate and permanent reduction of about 9% in male suicides (Ln ω0 = -0.096; P = .01). Despite a recent decrease in mortality, rates of alcohol consumption and suicide in Russia remain among the highest in the world. Our analysis revealed that the 2006 alcohol policy in Russia led to a 9% reduction in male suicide mortality, meaning the policy was responsible for saving 4000 male lives annually that would otherwise have been lost to suicide. Together with recent similar findings elsewhere, our results suggest an important role for public health and other population level interventions, including alcohol policy, in reducing alcohol-related harm.

  18. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang

    2016-01-01

    Full Text Available The rapid industrial development has led to the intermittent outbreak of pm2.5 or haze in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day’s Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability on the accuracy of the next 3–7 days’ AQI prediction.
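
    The vector autoregressive step that the structural model is reduced to can be sketched with statsmodels' VAR class; the variables and synthetic data below are illustrative only and do not reproduce the article's measurement model.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    # Rough sketch of a vector autoregression: jointly model daily AQI together
    # with meteorological drivers and predict the next week.  Variable choices and
    # the synthetic data are illustrative only.

    rng = np.random.default_rng(11)
    n = 886
    wind = rng.gamma(2.0, 1.5, n)
    humidity = 50 + rng.normal(0, 10, n)
    aqi = np.empty(n)
    aqi[0] = 150
    for t in range(1, n):
        aqi[t] = 0.8 * aqi[t - 1] - 8 * wind[t - 1] + 0.5 * humidity[t - 1] \
                 + rng.normal(0, 20)

    data = pd.DataFrame({"aqi": aqi, "wind": wind, "humidity": humidity})
    res = VAR(data).fit(maxlags=7, ic="aic")             # lag order chosen by AIC
    forecast = res.forecast(data.values[-res.k_ar:], steps=7)
    print("next-week AQI forecast:", forecast[:, 0].round(0))
    ```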

  19. The effect of the Swedish bicycle helmet law for children: an interrupted time series study.

    Science.gov (United States)

    Bonander, Carl; Nilson, Finn; Andersson, Ragnar

    2014-12-01

    Previous population-based research has shown that bicycle helmet laws can reduce head injury rates among cyclists. According to deterrence theory, such laws are mainly effective if there is a high likelihood of being apprehended. In this study, we investigated the effect of the Swedish helmet law for children under the age of 15, a population that cannot be fined. An interrupted time series design was used. Monthly inpatient data on injured cyclists from 1998-2012, stratified by age (0-14, 15+), sex, and injury diagnosis, was obtained from the National Patient Register. The main outcome measure was the proportion of head injury admissions per month. Intervention effect estimates were obtained using generalized autoregressive moving average (GARMA) models. Pre-legislation trend and seasonality were adjusted for, and differences-in-differences estimation was obtained using adults as a non-equivalent control group. There was a statistically significant intervention effect among male children, where the proportion of head injuries dropped by 7.8 percentage points. There was no evidence of an intervention effect on the proportion of head injuries among female children. According to hospital admission data, the bicycle helmet law appears to have had an effect only on male children. This study, while quasi-experimental and thus not strictly generalizable, can contribute to increased knowledge regarding the effects of bicycle helmet laws. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.

  20. Using SAR satellite data time series for regional glacier mapping

    Directory of Open Access Journals (Sweden)

    S. H. Winsvold

    2018-03-01

    Full Text Available With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.

  1. Using SAR satellite data time series for regional glacier mapping

    Science.gov (United States)

    Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas

    2018-03-01

    With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.

  2. Application of semi-parametric modelling to time series forecasting: case of the electricity consumption; Modeles semi-parametriques appliques a la prevision des series temporelles. Cas de la consommation d'electricite

    Energy Technology Data Exchange (ETDEWEB)

    Lefieux, V

    2007-10-15

    Reseau de Transport d'Electricite (RTE), in charge of operating the French electric transportation grid, needs an accurate forecast of the power consumption in order to operate it correctly. The forecasts used every day result from a model combining a nonlinear parametric regression and a SARIMA model. In order to obtain an adaptive forecasting model, nonparametric forecasting methods have already been tested without real success. In particular, it is known that a nonparametric predictor behaves badly with a great number of explanatory variables, which is commonly called the curse of dimensionality. Recently, semi-parametric methods which improve the pure nonparametric approach have been proposed to estimate a regression function. Based on the concept of 'dimension reduction', one of those methods (called MAVE: Moving Average -conditional- Variance Estimate) can be applied to time series. We empirically study its effectiveness in predicting the future values of an autoregressive time series. We then adapt this method, from a practical point of view, to forecast power consumption. We propose a partially linear semi-parametric model, based on the MAVE method, which allows us to take into account simultaneously the autoregressive aspect of the problem and the exogenous variables. The proposed estimation procedure is practically efficient. (author)

  3. Stochastic models in the DORIS position time series: estimates for IDS contribution to ITRF2014

    Science.gov (United States)

    Klos, Anna; Bogusz, Janusz; Moreaux, Guilhem

    2017-11-01

    This paper focuses on the investigation of the deterministic and stochastic parts of the Doppler Orbitography and Radiopositioning Integrated by Satellite (DORIS) weekly time series aligned to the newest release of ITRF2014. A set of 90 stations was divided into three groups depending on when the data were collected at an individual station. To reliably describe the DORIS time series, we employed a mathematical model that included the long-term nonlinear signal, linear trend, seasonal oscillations and a stochastic part, all being estimated with maximum likelihood estimation. We proved that the values of the parameters delivered for DORIS data are strictly correlated with the time span of the observations. The quality of the most recent data has significantly improved. Not only did the seasonal amplitudes decrease over the years, but also, and most importantly, the noise level and its type changed significantly. Among several tested models, the power-law process may be chosen as the preferred one for most of the DORIS data. Moreover, the preferred noise model has changed through the years from an autoregressive process to pure power-law noise with few stations characterised by a positive spectral index. For the latest observations, the medians of the velocity errors were equal to 0.3, 0.3 and 0.4 mm/year, respectively, for the North, East and Up components. In the best cases, a velocity uncertainty of DORIS sites of 0.1 mm/year is achievable when the appropriate coloured noise model is taken into consideration.

  4. Year Ahead Demand Forecast of City Natural Gas Using Seasonal Time Series Methods

    Directory of Open Access Journals (Sweden)

    Mustafa Akpinar

    2016-09-01

    Full Text Available Consumption of natural gas, a major clean energy source, increases as energy demand increases. We studied specifically the Turkish natural gas market. Turkey’s natural gas consumption increased as well in parallel with the world‘s over the last decade. This consumption growth in Turkey has led to the formation of a market structure for the natural gas industry. This significant increase requires additional investments since a rise in consumption capacity is expected. One of the reasons for the consumption increase is the user-based natural gas consumption influence. This effect yields imbalances in demand forecasts and if the error rates are out of bounds, penalties may occur. In this paper, three univariate statistical methods, which have not been previously investigated for mid-term year-ahead monthly natural gas forecasting, are used to forecast natural gas demand in Turkey’s Sakarya province. Residential and low-consumption commercial data is used, which may contain seasonality. The goal of this paper is minimizing more or less gas tractions on mid-term consumption while improving the accuracy of demand forecasting. In forecasting models, seasonality and single variable impacts reinforce forecasts. This paper studies time series decomposition, Holt-Winters exponential smoothing and autoregressive integrated moving average (ARIMA) methods. Here, 2011–2014 monthly data were prepared and divided into two series. The first series is 2011–2013 monthly data used for finding seasonal effects and model requirements. The second series is 2014 monthly data used for forecasting. For the ARIMA method, a stationary series was prepared and transformation process prior to forecasting was done. Forecasting results confirmed that as the computation complexity of the model increases, forecasting accuracy increases with lower error rates. Also, forecasting errors and the coefficients of determination values give more consistent results. Consequently
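
    One of the three methods compared above, Holt-Winters exponential smoothing with a yearly seasonal component, can be sketched as follows; the consumption series is a synthetic placeholder for the Sakarya data and the additive components are an assumption.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    # Minimal sketch of the Holt-Winters step: fit additive trend and seasonal
    # components on three years of monthly consumption and forecast the next year.
    # The consumption figures are synthetic placeholders.

    rng = np.random.default_rng(5)
    idx = pd.date_range("2011-01", periods=36, freq="MS")
    season = 1 + 0.6 * np.cos(2 * np.pi * (idx.month - 1) / 12)   # winter-peaking shape
    consumption = pd.Series(40 * season * (1 + 0.02 * np.arange(36) / 12)
                            + rng.normal(0, 2, 36), index=idx)     # million m3 (made up)

    fit = ExponentialSmoothing(consumption, trend="add",
                               seasonal="add", seasonal_periods=12).fit()
    print(fit.forecast(12).round(1))                               # year-ahead forecast
    ```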

  5. Automated analysis of brachial ultrasound time series

    Science.gov (United States)

    Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan

    1998-07-01

    Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, advances through adult life when occlusive vascular disease may result in coronary heart disease, stroke and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high quality brachial vessel diameter measurement method is necessary such that accurate diameters can be measured consistently in all frames in a sequence, across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability. It is also time-consuming. An automated measurement method is presented in this paper which utilizes quality assurance approaches to adapt to specific image features, to recognize and minimize the noise effect. Experimental results showed the method's potential for clinical usage in the epidemiological studies.

  6. Sensor-Generated Time Series Events: A Definition Language

    Directory of Open Access Journals (Sweden)

    Juan Pazos

    2012-08-01

    Full Text Available There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  7. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  8. Fractal dimension of wind speed time series

    International Nuclear Information System (INIS)

    Chang, Tian-Pau; Ko, Hong-Hsi; Liu, Feng-Jiao; Chen, Pai-Hsun; Chang, Ying-Pin; Liang, Ying-Hsin; Jang, Horng-Yuan; Lin, Tsung-Chi; Chen, Yi-Hwa

    2012-01-01

    Highlights: ► Fractal dimension of wind speeds in Taiwan is studied considering climate factors. ► Relevant algorithms for the calculation of fractal dimension are presented graphically. ► Fractal dimension reveals negative correlation with mean wind speed. ► Fractal dimension is not lower even when the wind distribution is well described by a Weibull pdf. - Abstract: The fluctuation of wind speed within a specific time period strongly affects the energy conversion rate of a wind turbine. In this paper, the concept of fractal dimension in chaos theory is applied to investigate wind speed characterizations; numerical algorithms for the calculation of the fractal dimension are presented graphically. The wind data selected were observed at three wind farms experiencing different climatic conditions from 2006 to 2008 in Taiwan, where wind speed distribution can be properly classified into a high wind season from October to March and a low wind season from April to September. The variations of fractal dimensions among different wind farms are analyzed from the viewpoint of climatic conditions. The results show that the wind speeds studied are characterized by medium to high values of fractal dimension; the annual dimension values lie between 1.61 and 1.66. Because of the monsoon factor, the fluctuation of wind speed during high wind months is not as significant as that during low wind months; the value of fractal dimension reveals negative correlation with that of mean wind speed, irrespective of the wind farm considered. For a location where the wind distribution is well described by a Weibull function, its fractal dimension is not necessarily lower. These findings are useful for wind analysis.
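
    The abstract does not state which estimator was used, so the sketch below shows one common choice, Higuchi's method, applied to a simulated wind-speed series; it is an illustration of the concept rather than the paper's algorithm.

    ```python
    import numpy as np

    # Hedged sketch: Higuchi's method, one common estimator of the fractal
    # dimension of a time series.  The wind-speed series below is simulated.

    def higuchi_fd(x, kmax=10):
        n = len(x)
        lk = []
        for k in range(1, kmax + 1):
            lengths = []
            for m in range(k):
                idx = np.arange(m, n, k)                 # sub-sampled curve
                if len(idx) < 2:
                    continue
                dist = np.sum(np.abs(np.diff(x[idx])))
                norm = (n - 1) / ((len(idx) - 1) * k)    # length normalization
                lengths.append(dist * norm / k)
            lk.append(np.mean(lengths))
        k = np.arange(1, kmax + 1)
        slope, _ = np.polyfit(np.log(1.0 / k), np.log(lk), 1)
        return slope                                      # fractal dimension estimate

    rng = np.random.default_rng(2)
    wind = 6 + np.cumsum(rng.normal(0, 0.3, 5000)) * 0.05 + rng.normal(0, 0.8, 5000)
    print(f"estimated fractal dimension: {higuchi_fd(wind):.2f}")
    ```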

  9. Automating Vector Autoregression on Electronic Patient Diary Data

    NARCIS (Netherlands)

    Emerencia, Ando Celino; van der Krieke, Lian; Bos, Elisabeth H.; de Jonge, Peter; Petkov, Nicolai; Aiello, Marco

    Finding the best vector autoregression model for any dataset, medical or otherwise, is a process that, to this day, is frequently performed manually in an iterative manner requiring statistical expertise and time. Very few software solutions for automating this process exist, and they still

  10. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
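
    As a point of reference, the conventional (non-parametric) multiscale entropy that this article takes as its starting point is sketched below: coarse-grain the series at each scale and compute sample entropy. This is not the proposed state-space LMSE; m = 2 and r = 0.2 times the standard deviation are the customary choices, and the heart-period series is simulated.

    ```python
    import numpy as np

    # Rough sketch of conventional multiscale entropy: coarse-grain the series at
    # each scale, then compute sample entropy of the coarse-grained series.

    def sample_entropy(x, m=2, r=0.2):
        x = np.asarray(x)
        r *= x.std()
        def count(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
            return (np.sum(dist <= r) - len(templates)) / 2   # matched pairs, no self-matches
        b, a = count(m), count(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.nan

    def multiscale_entropy(x, max_scale=5):
        out = []
        for tau in range(1, max_scale + 1):
            n = len(x) // tau
            coarse = x[:n * tau].reshape(n, tau).mean(axis=1)  # coarse-graining
            out.append(sample_entropy(coarse))
        return out

    rng = np.random.default_rng(4)
    rr = 0.8 + np.cumsum(rng.normal(0, 0.005, 300))            # toy heart-period series
    print([round(v, 2) for v in multiscale_entropy(rr)])
    ```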

  11. DEM time series of an agricultural watershed

    Science.gov (United States)

    Pineux, Nathalie; Lisein, Jonathan; Swerts, Gilles; Degré, Aurore

    2014-05-01

    In agricultural landscapes the soil surface evolves notably due to erosion and deposition phenomena. Even if most of the field data come from plot scale studies, the watershed scale seems to be more appropriate to understand them. Currently, small unmanned aircraft systems and image processing are improving. In this way, 3D models are built from multiple covering shots. When techniques for large areas would be too expensive for a watershed level study or techniques for small areas would be too time-consuming, the unmanned aerial system seems to be a promising solution to quantify the erosion and deposition patterns. The increasing technical improvements in this growing field allow us to obtain a really good quality of data and a very high spatial resolution with a high Z accuracy. In the center of Belgium, we equipped an agricultural watershed of 124 ha. For three years (2011-2013), we have been monitoring weather (including rainfall erosivity using a spectropluviograph), discharge at three different locations, sediment in runoff water, and watershed microtopography through unmanned airborne imagery (Gatewing X100). We also collected all available historical data to try to capture the "long-term" changes in watershed morphology during the last decades: old topography maps, soil historical descriptions, etc. An erosion model (LANDSOIL) is also used to assess the evolution of the relief. Short-term evolution of the surface is now observed through flights done at 200m height. The pictures are taken with a side overlap equal to 80%. To precisely georeference the DEM produced, ground control points are placed on the study site and surveyed using a Leica GPS1200 (accuracy of 1cm for x and y coordinates and 1.5cm for the z coordinate). Flights are done each year in December to have as bare a ground surface as possible. Specific treatments are developed to counteract the vegetation effect because it is known as a key source of error in the DEM produced by small unmanned aircraft

  12. Database for Hydrological Time Series of Inland Waters (DAHITI)

    Science.gov (United States)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry was designed for ocean applications. However, for some years now, satellite altimetry has also been used over inland water to estimate water level time series of lakes, rivers and wetlands. The resulting water level time series can help to understand the water cycle of system earth and makes altimetry a very useful instrument for hydrological applications. In this poster, we introduce the "Database for Hydrological Time Series of Inland Waters" (DAHITI). Currently, the database contains about 350 water level time series of lakes, reservoirs, rivers, and wetlands which are freely available after a short registration process via http://dahiti.dgfi.tum.de. In this poster, we introduce the product of DAHITI and the functionality of the DAHITI web service. Furthermore, selected examples of inland water targets are presented in detail. DAHITI provides time series of water level heights of inland water bodies and their formal errors. These time series are available within the period of 1992-2015 and have varying temporal resolutions depending on the data coverage of the investigated water body. The accuracies of the water level time series depend mainly on the extent of the investigated water body and the quality of the altimeter measurements. Hereby, an external validation with in-situ data reveals RMS differences between 5 cm and 40 cm for lakes and 10 cm and 140 cm for rivers, respectively.

  13. Housefly population density correlates with shigellosis among children in Mirzapur, Bangladesh: a time series analysis.

    Science.gov (United States)

    Farag, Tamer H; Faruque, Abu S; Wu, Yukun; Das, Sumon K; Hossain, Anowar; Ahmed, Shahnawaz; Ahmed, Dilruba; Nasrin, Dilruba; Kotloff, Karen L; Panchilangam, Sandra; Nataro, James P; Cohen, Dani; Blackwelder, William C; Levine, Myron M

    2013-01-01

    Shigella infections are a public health problem in developing and transitional countries because of high transmissibility, severity of clinical disease, widespread antibiotic resistance and lack of a licensed vaccine. Whereas Shigellae are known to be transmitted primarily by direct fecal-oral contact and less commonly by contaminated food and water, the role of the housefly Musca domestica as a mechanical vector of transmission is less appreciated. We sought to assess the contribution of houseflies to Shigella-associated moderate-to-severe diarrhea (MSD) among children less than five years old in Mirzapur, Bangladesh, a site where shigellosis is hyperendemic, and to model the potential impact of a housefly control intervention. Stool samples from 843 children presenting to Kumudini Hospital during 2009-2010 with new episodes of MSD (diarrhea accompanied by dehydration, dysentery or hospitalization) were analyzed. Housefly density was measured twice weekly in six randomly selected sentinel households. Poisson time series regression was performed and autoregression-adjusted attributable fractions (AFs) were calculated using the Bruzzi method, with standard errors via jackknife procedure. Dramatic springtime peaks in housefly density in 2009 and 2010 were followed one to two months later by peaks of Shigella-associated MSD among toddlers and pre-school children. Poisson time series regression showed that housefly density was associated with Shigella cases at three lags (six weeks) (Incidence Rate Ratio = 1.39 [95% CI: 1.23 to 1.58] for each log increase in fly count), an association that was not confounded by ambient air temperature. Autocorrelation-adjusted AF calculations showed that a housefly control intervention could have prevented approximately 37% of the Shigella cases over the study period. Houseflies may play an important role in the seasonal transmission of Shigella in some developing country ecologies. Interventions to control houseflies should be
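
    The core regression described above, a Poisson model for case counts with fly density entered at a lag, can be sketched as follows; the data are synthetic, the temperature adjustment and the attributable-fraction (Bruzzi) step are omitted, and the lag length is illustrative.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Minimal sketch of a Poisson time-series regression: weekly case counts
    # regressed on log fly density at a lag of three periods.  Data are synthetic.

    rng = np.random.default_rng(9)
    weeks = 104
    flies = np.exp(1.5 + 1.0 * np.sin(2 * np.pi * np.arange(weeks) / 52)
                   + rng.normal(0, 0.3, weeks))              # seasonal fly density
    lagged_log_flies = pd.Series(np.log(flies)).shift(3)     # three-period lag
    rate = np.exp(0.5 + 0.35 * lagged_log_flies.fillna(lagged_log_flies.mean()))
    cases = rng.poisson(rate)

    df = pd.DataFrame({"cases": cases, "log_flies_lag3": lagged_log_flies}).dropna()
    model = sm.GLM(df["cases"], sm.add_constant(df["log_flies_lag3"]),
                   family=sm.families.Poisson()).fit()
    irr = np.exp(model.params["log_flies_lag3"])             # incidence rate ratio
    print(f"IRR per log increase in fly count: {irr:.2f}")
    ```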

  14. Behavioural Pattern of Causality Parameter of Autoregressive ...

    African Journals Online (AJOL)

    In this paper, a causal form of Autoregressive Moving Average process, ARMA (p, q) of various orders and behaviour of the causality parameter of ARMA model is investigated. It is deduced that the behaviour of causality parameter ψi depends on positive and negative values of autoregressive parameter φ and moving ...

  15. Speech Segmentation Using Bayesian Autoregressive Changepoint Detector

    Directory of Open Access Journals (Sweden)

    P. Sovka

    1998-12-01

    Full Text Available This submission is devoted to the study of the Bayesian autoregressive changepoint detector (BCD) and its use for speech segmentation. Results of the detector application to autoregressive signals as well as to real speech are given. BCD basic properties are described and discussed. A novel two-step algorithm consisting of cepstral analysis and BCD is suggested for automatic speech segmentation.

  16. Estimation in autoregressive models with Markov regime

    OpenAIRE

    Ríos, Ricardo; Rodríguez, Luis

    2005-01-01

    In this paper we derive the consistency of the penalized likelihood method for the number of states of the hidden Markov chain in autoregressive models with Markov regime. A SAEM-type algorithm is used to estimate the model parameters. We test the null hypothesis of a hidden Markov model against an autoregressive process with Markov regime.

  17. Conditional time series forecasting with convolutional neural networks

    NARCIS (Netherlands)

    A. Borovykh (Anastasia); S.M. Bohte (Sander); C.W. Oosterlee (Cornelis)

    2017-01-01

    textabstractForecasting financial time series using past observations has been a significant topic of interest. While temporal relationships in the data exist, they are difficult to analyze and predict accurately due to the non-linear trends and noise present in the series. We propose to learn these

  18. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  19. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.

  20. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  1. Recursive wind speed forecasting based on Hammerstein Auto-Regressive model

    International Nuclear Information System (INIS)

    Ait Maatallah, Othman; Achuthan, Ajit; Janoyan, Kerop; Marzocca, Pier

    2015-01-01

    Highlights: • Developed a new recursive WSF model for 1–24 h horizon based on the Hammerstein model. • Nonlinear HAR model successfully captured chaotic dynamics of wind speed time series. • Recursive WSF intrinsic error accumulation corrected by applying rotation. • Model verified for real wind speed data from two sites with different characteristics. • HAR model outperformed both ARIMA and ANN models in terms of accuracy of prediction. - Abstract: A new Wind Speed Forecasting (WSF) model, suitable for a short-term 1–24 h forecast horizon, is developed by adapting the Hammerstein model to an autoregressive approach. The model is applied to real data collected for a period of three years (2004–2006) from two different sites. The performance of the HAR model is evaluated by comparing its predictions with the classical Autoregressive Integrated Moving Average (ARIMA) model and a multi-layer perceptron Artificial Neural Network (ANN). Results show that the HAR model outperforms both the ARIMA model and the ANN model in terms of root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). When compared to the conventional models, the new HAR model can better capture various wind speed characteristics, including the asymmetric (non-Gaussian) wind speed distribution, the non-stationary time series profile, and the chaotic dynamics. The new model is beneficial for various applications in the renewable energy area, particularly for power scheduling.
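
    As a rough illustration of how such forecasts are typically scored, the sketch below computes the three error metrics cited in this record (RMSE, MAE and MAPE); the toy wind-speed numbers and the function name are ours, not taken from the paper.

    ```python
    import numpy as np

    def forecast_errors(y_true, y_pred):
        """Return RMSE, MAE and MAPE for a forecast against observations."""
        y_true = np.asarray(y_true, dtype=float)
        y_pred = np.asarray(y_pred, dtype=float)
        err = y_true - y_pred
        rmse = np.sqrt(np.mean(err ** 2))
        mae = np.mean(np.abs(err))
        mape = 100.0 * np.mean(np.abs(err / y_true))   # assumes y_true has no zeros
        return rmse, mae, mape

    # toy usage with made-up wind speeds (m/s)
    obs = [5.1, 6.3, 7.0, 6.4]
    fc = [5.4, 6.0, 7.3, 6.1]
    print(forecast_errors(obs, fc))
    ```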

  2. Characterizing interdependencies of multiple time series theory and applications

    CERN Document Server

    Hosoya, Yuzo; Takimoto, Taro; Kinoshita, Ryo

    2017-01-01

    This book introduces academic researchers and professionals to the basic concepts and methods for characterizing interdependencies of multiple time series in the frequency domain. Detecting causal directions between a pair of time series and the extent of their effects, as well as testing the non existence of a feedback relation between them, have constituted major focal points in multiple time series analysis since Granger introduced the celebrated definition of causality in view of prediction improvement. Causality analysis has since been widely applied in many disciplines. Although most analyses are conducted from the perspective of the time domain, a frequency domain method introduced in this book sheds new light on another aspect that disentangles the interdependencies between multiple time series in terms of long-term or short-term effects, quantitatively characterizing them. The frequency domain method includes the Granger noncausality test as a special case. Chapters 2 and 3 of the book introduce an i...

  3. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  4. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge...... of the time series under consideration is available. Therefore, a black-box optimization approach is our method of choice for detecting structural breaks. We describe a genetic algorithm framework which easily adapts to a large number of statistical settings. To evaluate the usefulness of different crossover...... operator alone. Moreover, we present a specific fitness function which exploits the sparse structure of the break points and which can be evaluated particularly efficiently. The experiments on artificial and real-world time series show that the resulting algorithm detects break points with high precision...

  5. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  6. multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    2016-06-17

    Here Φ(L) indicates an (n × n) matrix polynomial in the lag operator L. The vector autoregression is covariance stationary if all values z satisfying |I − Φ1 z − Φ2 z^2 − … − Φp z^p| = 0 lie outside the unit circle. 2.2.2. Covariance and Correlation Matrices of the VAR(1) Process. The relationship among components of the vector ...
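
    The stationarity condition quoted above can be checked numerically. The minimal sketch below does this for a VAR(1), for which the roots condition reduces to all eigenvalues of the coefficient matrix lying strictly inside the unit circle; the example matrix is illustrative only.

    ```python
    import numpy as np

    def var1_is_stationary(phi):
        """Covariance stationarity of a VAR(1): all eigenvalues of Phi must lie
        inside the unit circle (equivalently, the roots of det(I - Phi z) = 0
        lie outside it)."""
        eigvals = np.linalg.eigvals(np.asarray(phi, dtype=float))
        return bool(np.all(np.abs(eigvals) < 1.0))

    phi = np.array([[0.5, 0.1],
                    [0.2, 0.4]])
    print(var1_is_stationary(phi))   # True for this illustrative matrix
    ```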

  7. Financial Time-series Analysis: a Brief Overview

    Science.gov (United States)

    Chakraborti, A.; Patriarca, M.; Santhanam, M. S.

    Prices of commodities or assets produce what is called time-series. Different kinds of financial time-series have been recorded and studied for decades. Nowadays, all transactions on a financial market are recorded, leading to a huge amount of data available, either for free in the Internet or commercially. Financial time-series analysis is of great interest to practitioners as well as to theoreticians, for making inferences and predictions. Furthermore, the stochastic uncertainties inherent in financial time-series and the theory needed to deal with them make the subject especially interesting not only to economists, but also to statisticians and physicists [1]. While it would be a formidable task to make an exhaustive review on the topic, with this review we try to give a flavor of some of its aspects.

  8. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  9. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  10. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  11. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  12. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords: geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  13. Signal Processing for Time-Series Functions on a Graph

    Science.gov (United States)

    2018-02-01

    ARL-TR-8276, FEB 2018, US Army Research Laboratory: Signal Processing for Time-Series Functions on a Graph, by Humberto Muñoz-Barona, Jean Vettel, and ... Previous research introduced signal processing on graphs, an approach to generalize signal processing tools such

  14. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces the interaction which couples to spins of other systems. Simulations from our model show that time series exhibit the volatility clustering that is often observed in the real financial markets. Furthermore we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where volatilities of stocks are mutually correlated

  15. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network containing two multilayer perceptrons: one with bipolar sigmoid activation functions, and the other with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  16. Extracting Chaos Control Parameters from Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, R B B [Centro Universitario da FEI, Avenida Humberto de Alencar Castelo Branco 3972, 09850-901, Sao Bernardo do Campo, SP (Brazil); Graves, J C, E-mail: rsantos@fei.edu.br [Instituto Tecnologico de Aeronautica, Praca Marechal Eduardo Gomes 50, 12228-900, Sao Jose dos Campos, SP (Brazil)

    2011-03-01

    We present a simple method to analyze time series, and estimate the parameters needed to control chaos in dynamical systems. Application of the method to a system described by the logistic map is also shown. Analyzing only two 100-point time series, we achieved results within 2% of the analytical ones. With these estimates, we show that OGY control method successfully stabilized a period-1 unstable periodic orbit embedded in the chaotic attractor.
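
    As a hedged sketch of the kind of estimation described here (not the authors' exact procedure), the snippet below recovers the logistic-map parameter r from a short scalar time series by least squares; with such an estimate, the local map information needed by the OGY control scheme follows directly. The series length and seed are arbitrary.

    ```python
    import numpy as np

    def estimate_logistic_r(x):
        """Least-squares estimate of r in x[n+1] = r * x[n] * (1 - x[n])
        from a single scalar time series."""
        x = np.asarray(x, dtype=float)
        g = x[:-1] * (1.0 - x[:-1])        # regressor
        y = x[1:]                          # response
        return float(np.dot(g, y) / np.dot(g, g))

    # generate a short chaotic series and recover r
    r_true, x = 3.9, [0.3]
    for _ in range(99):
        x.append(r_true * x[-1] * (1.0 - x[-1]))
    print(estimate_logistic_r(x))          # close to 3.9
    ```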

  17. An innovation approach to non-Gaussian time series analysis

    OpenAIRE

    Ozaki, Tohru; Iino, Mitsunori

    2001-01-01

    The paper shows that the use of both types of random noise, white noise and Poisson noise, can be justified when using an innovations approach. The historical background for this is sketched, and then several methods of whitening dependent time series are outlined, including a mixture of Gaussian white noise and a compound Poisson process: this appears as a natural extension of the Gaussian white noise model for the prediction errors of a non-Gaussian time series. A stati...

  18. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  19. Combined Forecasts from Linear and Nonlinear Time Series Models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  20. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  1. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Rays (GCR) time series is inevitable, since loss of data is due to mechanical and human failure, technical problems, and different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to depict the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to the observed ROME series, with 50 replicates. The CLMX station was then used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: AMELIA II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series were also compared with the observed ROME series using several skill measures, such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was more efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% of missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed reconstructing 43 time series.
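
    The skill measures mentioned in this record are straightforward to compute; the sketch below shows plausible formulas for RMSE, NRMSE, Pearson R and an agreement index (Willmott's d). The exact definitions used by the authors may differ slightly.

    ```python
    import numpy as np

    def imputation_skill(observed, imputed):
        """A few of the skill measures cited above: RMSE, NRMSE,
        Pearson r and Willmott's index of agreement."""
        o = np.asarray(observed, dtype=float)
        p = np.asarray(imputed, dtype=float)
        rmse = np.sqrt(np.mean((p - o) ** 2))
        nrmse = rmse / (o.max() - o.min())
        r = np.corrcoef(o, p)[0, 1]
        d = 1.0 - np.sum((p - o) ** 2) / np.sum(
            (np.abs(p - o.mean()) + np.abs(o - o.mean())) ** 2)
        return {"RMSE": rmse, "NRMSE": nrmse, "R": r, "agreement": d}

    rng = np.random.default_rng(0)
    obs = rng.normal(100, 10, 120)
    imp = obs + rng.normal(0, 2, 120)       # a hypothetical imputed series
    print(imputation_skill(obs, imp))
    ```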

  2. Period Estimation in Astronomical Time Series Using Slotted Correntropy

    Science.gov (United States)

    Huijse, Pablo; Estevez, Pablo A.; Zegers, Pablo; Principe, José C.; Protopapas, Pavlos

    2011-06-01

    In this letter, we propose a method for period estimation in light curves from periodic variable stars using correntropy. Light curves are astronomical time series of stellar brightness over time, and are characterized as being noisy and unevenly sampled. We propose to use slotted time lags in order to estimate correntropy directly from irregularly sampled time series. A new information theoretic metric is proposed for discriminating among the peaks of the correntropy spectral density. The slotted correntropy method outperformed slotted correlation, string length, VarTools (Lomb-Scargle periodogram and Analysis of Variance), and SigSpec applications on a set of light curves drawn from the MACHO survey.

  3. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity data, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time-domain as well as frequency-domain analysis. After that, prediction can be carried out for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of discrete wavelet transforms (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
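
    A minimal multiresolution decomposition of a return series can be obtained with the PyWavelets package, as sketched below; the random-walk stand-in for the KLCI closes, the db4 wavelet and the four-level depth are our assumptions, and an MODWT analogue would use pywt.swt instead of pywt.wavedec.

    ```python
    import numpy as np
    import pywt   # PyWavelets

    # illustrative price series (random walk standing in for index closes)
    rng = np.random.default_rng(1)
    prices = np.cumsum(rng.normal(0, 1, 512)) + 1600.0
    returns = np.diff(np.log(prices))

    # DWT multiresolution: one approximation plus detail coefficients per level
    coeffs = pywt.wavedec(returns, wavelet="db4", level=4)
    for i, c in enumerate(coeffs):
        label = "A4" if i == 0 else f"D{5 - i}"
        print(label, "energy:", float(np.sum(c ** 2)))
    ```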

  4. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider as a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found ≃15 years.
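
    A common way to turn an equal-time correlation coefficient into a distance is d = sqrt(2(1 − ρ)); the sketch below uses that convention on toy GDP-increment series, though the paper's exact definition may differ.

    ```python
    import numpy as np

    def correlation_distance(x, y):
        """Distance between two equal-length series based on the equal-time
        correlation coefficient: d = sqrt(2 * (1 - rho))."""
        rho = np.corrcoef(x, y)[0, 1]
        return float(np.sqrt(2.0 * (1.0 - rho)))

    # GDP-increment-like toy series for two "countries"
    rng = np.random.default_rng(2)
    a = rng.normal(0.03, 0.02, 56)
    b = 0.7 * a + 0.3 * rng.normal(0.03, 0.02, 56)
    print(correlation_distance(a, b))
    ```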

  5. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  6. Evaluation of Scaling Invariance Embedded in Short Time Series

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series. Calculations with specified Hurst exponent values show that, by using the standard central moving average de-trending procedure, this method can evaluate the scaling exponents for short time series with ignorable bias and a sharp confidence interval. Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate Shannon entropy from limited records. PMID:25549356

  7. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
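
    Detrended fluctuation analysis itself is compact enough to sketch; the code below is a minimal DFA-1 implementation (linear detrending in non-overlapping boxes) applied to white noise, for which the exponent should come out near 0.5. The box sizes and series length are arbitrary choices, not those of the study.

    ```python
    import numpy as np

    def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
        """Minimal detrended fluctuation analysis: returns the scaling
        exponent alpha from a log-log fit of F(s) versus box size s."""
        x = np.asarray(x, dtype=float)
        profile = np.cumsum(x - x.mean())
        fs = []
        for s in scales:
            n_boxes = len(profile) // s
            sq_fluct = []
            for i in range(n_boxes):
                seg = profile[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                sq_fluct.append(np.mean((seg - trend) ** 2))
            fs.append(np.sqrt(np.mean(sq_fluct)))
        return float(np.polyfit(np.log(scales), np.log(fs), 1)[0])

    rng = np.random.default_rng(3)
    print(dfa_alpha(rng.normal(size=2048)))   # white noise: alpha close to 0.5
    ```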

  8. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
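
    A much-simplified version of the pipeline described here can be sketched as follows: fixed-length segments stand in for the bottom-up piecewise linear segmentation, and synthetic lateral-position traces stand in for the simulator data. It only illustrates the slope/interval features feeding a support vector machine classifier.

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def segment_features(signal, seg_len=50):
        """Slope and duration of fixed-length segments, a simplified stand-in
        for the bottom-up piecewise linear representation."""
        feats = []
        for i in range(0, len(signal) - seg_len, seg_len):
            seg = signal[i:i + seg_len]
            slope = np.polyfit(np.arange(seg_len), seg, 1)[0]
            feats.append([slope, seg_len])
        return np.asarray(feats).ravel()

    # hypothetical lateral-position traces: 0 = normal, 1 = impaired (noisier)
    rng = np.random.default_rng(4)
    X = [segment_features(rng.normal(0, 0.2 + 0.3 * label, 500))
         for label in (0, 1) * 20]
    y = [0, 1] * 20
    clf = SVC(kernel="rbf").fit(X, y)
    print(clf.score(X, y))
    ```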

  9. [Autoregressive integrated moving average model in food poisoning prediction in Hunan Province].

    Science.gov (United States)

    Chen, Ling; Xu, Huilan

    2012-02-01

    To determine the applicability of the autoregressive integrated moving average (ARIMA) model to food poisoning prediction in Hunan Province, and to provide a scientific basis for the prevention and control of food poisoning. We collected the number of food poisoning cases from January 2003 to December 2009 in Hunan Province for ARIMA model fitting, used food poisoning data of 2010 to verify the model's predictions, and then predicted the number of cases in 2011. ARIMA (0,1,1)(0,1,1)12 fit the trends of the poisoning counts in previous time periods well, with a prediction fitting error of 9.59%. The number of food poisoning cases in Hunan Province in 2011 was predicted to be 834. The ARIMA model fits the short-term trend of the food poisoning series well; the model would need updating with new data if used for long-term forecasts.
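
    Fitting the seasonal ARIMA specification quoted above is a one-liner with statsmodels; the sketch below uses synthetic monthly counts in place of the Hunan surveillance data, so the numbers it prints are purely illustrative.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # synthetic monthly counts standing in for the 2003-2009 series
    rng = np.random.default_rng(5)
    idx = pd.date_range("2003-01", periods=84, freq="MS")
    y = pd.Series(60 + 20 * np.sin(2 * np.pi * idx.month / 12)
                  + rng.poisson(10, 84), index=idx)

    model = SARIMAX(y, order=(0, 1, 1), seasonal_order=(0, 1, 1, 12))
    res = model.fit(disp=False)
    print(res.forecast(steps=12).sum())   # predicted cases for the next year
    ```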

  10. Biogeochemistry from Gliders at the Hawaii Ocean Time-Series

    Science.gov (United States)

    Nicholson, D. P.; Barone, B.; Karl, D. M.

    2016-02-01

    At the Hawaii Ocean Time-series (HOT) autonomous, underwater gliders equipped with biogeochemical sensors observe the oceans for months at a time, sampling spatiotemporal scales missed by the ship-based programs. Over the last decade, glider data augmented by a foundation of time-series observations have shed light on biogeochemical dynamics occuring spatially at meso- and submesoscales and temporally on scales from diel to annual. We present insights gained from the synergy between glider observations, time-series measurements and remote sensing in the subtropical North Pacific. We focus on diel variability observed in dissolved oxygen and bio-optics and approaches to autonomously quantify net community production and gross primary production (GPP) as developed during the 2012 Hawaii Ocean Experiment - DYnamics of Light And Nutrients (HOE-DYLAN). Glider-based GPP measurements were extended to explore the relationship between GPP and mesoscale context over multiple years of Seaglider deployments.

  11. Compounding approach for univariate time series with nonstationary variances

    Science.gov (United States)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
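
    The first step of the compounding approach, estimating the empirical distribution of local variances, can be sketched as below; the window length and the toy series with a slowly drifting variance are our own choices for illustration.

    ```python
    import numpy as np

    def local_variances(x, window=250):
        """Sample variance in non-overlapping windows: the quantity whose
        distribution is compounded with the local Gaussian in this approach."""
        x = np.asarray(x, dtype=float)
        n = len(x) // window
        return np.array([x[i * window:(i + 1) * window].var() for i in range(n)])

    # toy nonstationary series: Gaussian with slowly drifting variance
    rng = np.random.default_rng(6)
    sigma = 1.0 + 0.5 * np.sin(np.linspace(0, 20, 50_000))
    x = rng.normal(0, sigma)
    v = local_variances(x)
    print(v.mean(), v.std())   # summarises the empirical parameter distribution
    ```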

  12. The Application of Bayesian Spectral Analysis in Photometric Time Series

    Directory of Open Access Journals (Sweden)

    saeideh latif

    2017-11-01

    Full Text Available The present paper introduces Bayesian spectral analysis as a powerful and efficient method for the spectral analysis of photometric time series. For this purpose, Bayesian spectral analysis was programmed in Matlab for the XZ Dra photometric time series, which is non-uniform with large gaps, and its power spectrum was compared with the power spectrum obtained from the Period04 software, which is designed for the statistical analysis of astronomical time series and uses artificial data to make the time series uniform. Although the main spectral peak, representing the main frequency of the XZ Dra variable star oscillations at f = 2.09864 day^-1, is well known in the Period04 power spectrum, false spectral peaks are also seen; moreover, it is not clear how this software generates the synthetic data. These false peaks are removed in the power spectrum obtained from the Bayesian analysis, and the spectral peak around the desired frequency has a smaller width and is more accurate. It should be noted that Bayesian spectral analysis does not require making the time series uniform to obtain the desired power spectrum. Moreover, the researcher also becomes aware of the exact calculation process.

  13. Recurrent Neural Network Applications for Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize for irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs) for predicting irregular time series. Feature engineering along with a non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition to this, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to set hyperparameters for a stable and performant solution correctly. In this work, we circumvent this obstacle by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNN without needing an in-depth understanding of the tuning procedure.

  14. Using empirical mode decomposition to correlate paleoclimatic time-series

    Directory of Open Access Journals (Sweden)

    J. Solé

    2007-01-01

    Full Text Available Determination of the timing and duration of paleoclimatic events is a challenging task. Classical techniques for time-series analysis rely too strongly on having a constant sampling rate, which poorly adapts to the uneven time recording of paleoclimatic variables; new, more flexible methods issued from Non-Linear Physics are hence required. In this paper, we have used Huang's Empirical Mode Decomposition (EMD) for the analysis of paleoclimatic series. We have studied three different time series of temperature proxies, characterizing oscillation patterns by using EMD. To measure the degree of temporal correlation of two variables, we have developed a method that relates couples of modes from different series by calculating the instantaneous phase differences among the associated modes. We observed that when two modes exhibited a constant phase difference, their frequencies were nearly equal to those of Milankovitch cycles. Our results show that EMD is a good methodology not only for synchronization of different records but also for determination of the different local frequencies in each time series. Some of the obtained modes may be interpreted as the result of global forcing mechanisms.
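
    The phase-difference idea can be illustrated without a full EMD implementation: given two modes (here synthetic sinusoids standing in for intrinsic mode functions), their instantaneous phases can be obtained from the analytic signal and compared, as in the sketch below. The frequencies and sampling are arbitrary.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def instantaneous_phase(mode):
        """Instantaneous phase of a mode via the analytic (Hilbert) signal."""
        return np.unwrap(np.angle(hilbert(mode)))

    # two synthetic 'modes' with nearly equal frequency: their phase difference
    # stays roughly constant, the signature of a shared cycle
    t = np.linspace(0, 100, 4000)
    m1 = np.sin(2 * np.pi * 0.10 * t)
    m2 = np.sin(2 * np.pi * 0.10 * t + 0.8)
    dphi = instantaneous_phase(m1) - instantaneous_phase(m2)
    print(dphi.std())   # small spread indicates phase-locked modes
    ```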

  15. Continuous baseflow separation from time series of daily and ...

    African Journals Online (AJOL)

    Continuous baseflow separation procedures have been frequently used to differentiate total flows into the high-frequency, low-amplitude 'baseflow' component and the low-frequency, high-amplitude 'flood' flows. In the past, such procedures have normally been applied to streamflow time-series data with time steps of 1 day ...

  16. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    specific data sets. The technique uses the scalar time series to reconstruct the dynamics in an embedding space of dimension M using delay coordinates scanned at a suitable time delay τ. But a major difficulty in implementing this procedure is that the scaling region in the correlation sum for the computation of D2 and K2 ...

  17. Applying Time Series Analysis Model to Temperature Data in Greenhouses

    Directory of Open Access Journals (Sweden)

    Abdelhafid Hasni

    2011-03-01

    Full Text Available The objective of the research is to find an appropriate Seasonal Auto-Regressive Integrated Moving Average (SARIMA) model for fitting the inside air temperature (Tin) of a naturally ventilated greenhouse under Mediterranean conditions by considering the minimum of the Akaike Information Criterion (AIC). The results of fitting were as follows: the best SARIMA model for fitting the greenhouse air temperature is SARIMA (1,0,0)(1,0,2)24.
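
    Selecting a SARIMA order by minimum AIC can be sketched with a small grid search in statsmodels; the synthetic hourly greenhouse temperature and the seasonal period of 24 are assumptions made for illustration, not the study's data or search space.

    ```python
    import itertools
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # synthetic hourly inside-air temperature with a daily (period-24) cycle
    rng = np.random.default_rng(7)
    idx = pd.date_range("2010-06-01", periods=24 * 30, freq="h")
    tin = pd.Series(22 + 6 * np.sin(2 * np.pi * idx.hour / 24)
                    + rng.normal(0, 0.5, len(idx)), index=idx)

    best = (np.inf, None)
    for p, q, P, Q in itertools.product(range(2), range(2), range(2), range(2)):
        try:
            res = SARIMAX(tin, order=(p, 0, q),
                          seasonal_order=(P, 0, Q, 24)).fit(disp=False)
            best = min(best, (res.aic, (p, 0, q, P, 0, Q, 24)))
        except Exception:
            continue
    print(best)   # smallest AIC and the corresponding SARIMA order
    ```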

  18. A Random Walk Test for Functional Time Series

    OpenAIRE

    Mingotti, Nicola; Lillo Rodríguez, Rosa Elvira; Romo Urroz, Juan

    2015-01-01

    In this paper we introduce a Random Walk test for Functional Autoregressive Processes of Order One. The test is non-parametric, based on the bootstrap and Functional Principal Components. The power of the test is shown through an extensive Monte Carlo simulation. We apply the test to two real datasets, Bitcoin prices and electrical energy consumption in France. The authors acknowledge financial support from the Spanish Ministry of Economy and Competition, research project ECO2012-38442.

  19. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.

  20. Time Series Prediction based on Hybrid Neural Networks

    Directory of Open Access Journals (Sweden)

    S. A. Yarushev

    2016-01-01

    Full Text Available In this paper, we suggest using a hybrid approach to the time series forecasting problem. In the first part of the paper, we give a literature review of time series forecasting methods based on hybrid neural networks and neuro-fuzzy approaches. Hybrid neural networks are especially effective for specific types of applications, such as forecasting or classification problems, in contrast to traditional monolithic neural networks. These classes of problems include problems with different characteristics in different modules. The main part of the paper gives a detailed overview of the benefits of hybrid networks, their architectures, and their performance compared with traditional neural networks. Hybrid neural network models for time series forecasting are discussed, and experiments with modular neural networks are given.

  1. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    Science.gov (United States)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields `objective' results. In the present paper it is argued that most of the classical nonlinear techniques don't satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we can show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.

  2. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  3. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both long-term and short-term changes prove that the model is effective and can achieve the expected accuracy.
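
    The classical GM(1,1) grey model mentioned here is short enough to sketch in full (without the paper's adaptive improvement and error correction); the toy input series is arbitrary.

    ```python
    import numpy as np

    def gm11_forecast(x0, steps=5):
        """Classical grey model GM(1,1): fit on the cumulated series and
        return forecasts for the original series."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)
        z = 0.5 * (x1[1:] + x1[:-1])                 # background values
        B = np.column_stack([-z, np.ones(len(z))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + steps)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
        return x0_hat[len(x0):]                      # the out-of-sample part

    print(gm11_forecast([2.1, 2.3, 2.4, 2.6, 2.7, 2.9], steps=3))
    ```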

  4. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  5. Semi-autonomous remote sensing time series generation tool

    Science.gov (United States)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to the lack of satellite architecture and frequent cloud cover issues, availability of daily high spatial data is still far from reality. Remote sensing time series generation of high spatial and temporal data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework of Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool will eliminate the difficulties by automating all the steps and enable the users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Later two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other one to perform data fusion to generate time series. The two frameworks can be used individually to perform specific tasks or they could be combined to perform both the processes in one go. This tool can handle most of the known geo data formats currently available which makes it a generic tool for time series generation of various remote sensing satellite data. This tool is developed as a common platform with good interface which provides lot of functionalities to enable further development of more remote sensing applications. A detailed description on the capabilities and the advantages of the frameworks are given in this paper.

  6. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the time series term we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced one speaks of period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, of many measures synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), file accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software through an heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart partitioning table strategy, that keeps balanced the percentage of data stored in each database table. TSDSystem also contains modules for the visualization of acquired data, that provide the possibility to query different time series on a specified time range, or follow the realtime signal acquisition, according to a data access policy from the users.

  7. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches use a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a traditional approach that calculates a similarity metric for each non-connected pair, sorts the pairs by their similarity scores, and labels those with higher scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of the social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time-series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time-series link occurrences are considered.

  8. Easily adaptable complexity measure for finite time series.

    Science.gov (United States)

    Ke, Da-Guan; Tong, Qin-Ye

    2008-06-01

    We present a complexity measure for any finite time series. This measure has invariance under any monotonic transformation of the time series, has a degree of robustness against noise, and has the adaptability of satisfying almost all the widely accepted but conflicting criteria for complexity measurements. Surprisingly, the measure is developed from Kolmogorov complexity, which is traditionally believed to represent only randomness and to satisfy one criterion to the exclusion of the others. For familiar iterative systems, our treatment may imply a heuristic approach to transforming symbolic dynamics into permutation dynamics and vice versa.

  9. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  10. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
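
    The core construction, identifying a recurrence matrix with a network adjacency matrix, can be sketched in a few lines; for brevity the example below works on scalar values of the logistic map without delay embedding, and the threshold eps is an arbitrary choice.

    ```python
    import numpy as np

    def recurrence_network_degrees(x, eps=0.1):
        """Build a recurrence matrix from a scalar series, interpret it as the
        adjacency matrix of an undirected network, and return node degrees."""
        x = np.asarray(x, dtype=float)
        dist = np.abs(x[:, None] - x[None, :])        # pairwise distances
        A = (dist <= eps).astype(int)
        np.fill_diagonal(A, 0)                        # no self-loops
        return A.sum(axis=1)                          # degree of each node

    # logistic map example, as in the paper's illustration
    x, r = [0.4], 3.9
    for _ in range(499):
        x.append(r * x[-1] * (1 - x[-1]))
    print(recurrence_network_degrees(np.array(x)).mean())
    ```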

  11. Testing for intracycle determinism in pseudoperiodic time series.

    Science.gov (United States)

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
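
    A minimal phase-randomized surrogate generator, the basic building block of such surrogate-data tests, is sketched below; note that the paper's method additionally preprocesses the pseudoperiodic data before standard surrogate analysis is applied, which this sketch omits.

    ```python
    import numpy as np

    def phase_randomized_surrogate(x, rng=None):
        """Minimal surrogate: keep the amplitude spectrum of x, randomize the
        Fourier phases, so linear correlations are preserved while nonlinear
        (deterministic) structure is destroyed."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x, dtype=float)
        spec = np.fft.rfft(x)
        phases = rng.uniform(0, 2 * np.pi, len(spec))
        new_spec = np.abs(spec) * np.exp(1j * phases)
        new_spec[0] = spec[0]                         # keep the mean component
        return np.fft.irfft(new_spec, n=len(x))

    rng = np.random.default_rng(8)
    x = np.sin(np.linspace(0, 40, 1000)) + 0.3 * rng.normal(size=1000)
    print(np.std(x), np.std(phase_randomized_surrogate(x, rng)))  # similar
    ```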

  12. Deep Learning in Multiple Multistep Time Series Prediction

    OpenAIRE

    Zang, Chuanyun

    2017-01-01

    The project aims to research combining deep learning, specifically Long Short-Term Memory (LSTM), and basic statistics in multiple multistep time series prediction. LSTM can dive into all the pages and learn the general trends of variation on a large scale, while well-selected medians for each page can keep the specific seasonality of different pages, so that the predicted future trend does not fluctuate too much from reality. A recent Kaggle competition on 145K Web Traffic Time Series Forecasting [1...

  13. Time-Series Analysis of Continuously Monitored Blood Glucose: The Impacts of Geographic and Daily Lifestyle Factors

    Science.gov (United States)

    Greaves, Stephen P.

    2015-01-01

    Type 2 diabetes is known to be associated with environmental, behavioral, and lifestyle factors. However, the actual impacts of these factors on blood glucose (BG) variation throughout the day have remained relatively unexplored. Continuous blood glucose monitors combined with human activity tracking technologies afford new opportunities for exploration in a naturalistic setting. Data from a study of 40 patients with diabetes is utilized in this paper, including continuously monitored BG, food/medicine intake, and patient activity/location tracked using global positioning systems over a 4-day period. Standard linear regression and more disaggregated time-series analysis using autoregressive integrated moving average (ARIMA) are used to explore patient BG variation throughout the day and over space. The ARIMA models revealed a wide variety of BG correlating factors related to specific activity types, locations (especially those far from home), and travel modes, although the impacts were highly personal. Traditional variables related to food intake and medications were less often significant. Overall, the time-series analysis revealed considerable patient-by-patient variation in the effects of geographic and daily lifestyle factors. We would suggest that maps of BG spatial variation or an interactive messaging system could provide new tools to engage patients and highlight potential risk factors. PMID:25893201

  14. Time-Series Analysis of Continuously Monitored Blood Glucose: The Impacts of Geographic and Daily Lifestyle Factors

    Directory of Open Access Journals (Sweden)

    Sean T. Doherty

    2015-01-01

    Full Text Available Type 2 diabetes is known to be associated with environmental, behavioral, and lifestyle factors. However, the actual impacts of these factors on blood glucose (BG) variation throughout the day have remained relatively unexplored. Continuous blood glucose monitors combined with human activity tracking technologies afford new opportunities for exploration in a naturalistic setting. Data from a study of 40 patients with diabetes is utilized in this paper, including continuously monitored BG, food/medicine intake, and patient activity/location tracked using global positioning systems over a 4-day period. Standard linear regression and more disaggregated time-series analysis using autoregressive integrated moving average (ARIMA) are used to explore patient BG variation throughout the day and over space. The ARIMA models revealed a wide variety of BG correlating factors related to specific activity types, locations (especially those far from home), and travel modes, although the impacts were highly personal. Traditional variables related to food intake and medications were less often significant. Overall, the time-series analysis revealed considerable patient-by-patient variation in the effects of geographic and daily lifestyle factors. We would suggest that maps of BG spatial variation or an interactive messaging system could provide new tools to engage patients and highlight potential risk factors.

  15. A Procedure for Identification of Appropriate State Space and ARIMA Models Based on Time-Series Cross-Validation

    Directory of Open Access Journals (Sweden)

    Patrícia Ramos

    2016-11-01

    Full Text Available In this work, a cross-validation procedure is used to identify an appropriate Autoregressive Integrated Moving Average model and an appropriate state space model for a time series. A minimum size for the training set is specified. The procedure is based on one-step forecasts and uses different training sets, each containing one more observation than the previous one. All possible state space models and all ARIMA models where the orders are allowed to range reasonably are fitted, considering raw data and log-transformed data with regular differencing (up to second order differences) and, if the time series is seasonal, seasonal differencing (up to first order differences). The root mean squared error of each model is calculated by averaging the squared one-step forecast errors. The model which has the lowest root mean squared error value and passes the Ljung–Box test using all of the available data with a reasonable significance level is selected among all the ARIMA and state space models considered. The procedure is exemplified in this paper with a case study of retail sales of different categories of women’s footwear from a Portuguese retailer, and its accuracy is compared with three reliable forecasting approaches. The results show that our procedure consistently forecasts more accurately than the other approaches and the improvements in accuracy are significant.
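    A compact sketch of the rolling one-step cross-validation step is given below. It uses statsmodels, a toy random-walk series, and a small ARIMA grid purely as assumptions for illustration; the paper's full search additionally covers state space models, transformations, differencing choices, and the Ljung-Box check.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA  # assumed dependency, not the paper's software

      def one_step_cv_rmse(y, order, min_train=60):
          # Expanding training window; each fit forecasts exactly one step ahead.
          errors = []
          for t in range(min_train, len(y)):
              fit = ARIMA(y[:t], order=order).fit()
              errors.append(y[t] - fit.forecast(1)[0])
          return float(np.sqrt(np.mean(np.square(errors))))

      rng = np.random.default_rng(1)
      y = np.cumsum(rng.normal(size=100)) + 50          # stand-in for a monthly sales series
      grid = [(p, 1, q) for p in range(3) for q in range(3)]
      scores = {order: one_step_cv_rmse(y, order) for order in grid}
      best = min(scores, key=scores.get)
      print("selected ARIMA order:", best, "CV RMSE:", round(scores[best], 3))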

  16. Time series analysis of human and bovine brucellosis in South Korea from 2005 to 2010.

    Science.gov (United States)

    Lee, Hu Suk; Her, Moon; Levine, Michael; Moore, George E

    2013-06-01

    Brucellosis is considered to be one of the most important zoonotic diseases in the world, affecting underdeveloped and developing countries. The primary purpose of brucellosis control is to prevent the spread of disease from animals (typically ruminants) to humans. The main objective of this study was to retrospectively develop an appropriate time series model for cattle-to-human transmission in South Korea using data from independent national surveillance systems. Monthly case counts for cattle and people as well as national population data were available for 2005-2010. The temporal relationship was evaluated using an autoregressive integrated moving average with exogenous input (ARIMAX) model [notated as ARIMA(p, d, q)-AR(p)] and a negative binomial regression (NBR) model. Human incidence rate was highly correlated to cattle incidence rate in the same month and the previous month (both r=0.82). In the final models, ARIMA (0, 1, 1)-AR (0, 1) was determined as the best fit with 191.5% error in the validation phase, whereas the best NBR model including lags (0, 1 months) for the cattle incidence rate yielded a 131.9% error in the validation phase. Error (MAPE) rates were high due to small absolute human case numbers (typically fewer than 10 per month in the validation phase). The NBR model, however, was able to demonstrate a marked reduction in human cases immediately following a hypothetical marked reduction in cattle cases, and may be better for public health decision making. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. A methodology to filter time series: application to minute-by-minute electric load series

    Directory of Open Access Journals (Sweden)

    Mayte Suarez-Farinas

    2004-12-01

    Full Text Available In this article a methodology for filtering a time series is presented, with application to high frequency series such as the minute-by-minute electric load series. The goal of this approach is to detect and substitute the irregularities of the time series that can produce distortions on the modelling stage. Outlier values are detected through a dynamic linear model and the Bayes factor tool; missing values are then interpolated with a Smoothing Cubic Spline. The performance of the proposed approach is illustrated using real data and evaluated through a series of tests where the irregularities have been simulated.

  18. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method termed Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity, thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has the potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Stochastic generation of hourly wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.

    2006-01-01

    In the present study, hourly wind speed data of Kuala Terengganu in Peninsular Malaysia are simulated using the transition matrix approach of a Markov process. The wind speed time series is divided into various states based on certain criteria. The next wind speed state is selected based on the previous state. A cumulative probability transition matrix is formed in which each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated. These states are then converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model is assessed by comparing statistical characteristics such as the average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data to those of the original data. The generated wind speed time series is capable of preserving the wind speed characteristics of the observed data.
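    The generator described above can be sketched as follows, with synthetic gamma-distributed values standing in for the Kuala Terengganu observations and equal-width bins defining the states (both assumptions made for the example):

      import numpy as np

      rng = np.random.default_rng(42)
      wind = rng.gamma(shape=2.0, scale=2.5, size=5000)        # stand-in for hourly wind speeds
      n_states = 12
      edges = np.linspace(wind.min(), wind.max() + 1e-9, n_states + 1)
      states = np.digitize(wind, edges) - 1                    # state index of every hour

      # Transition counts -> row-normalised probabilities -> cumulative matrix (rows end at 1).
      P = np.zeros((n_states, n_states))
      for a, b in zip(states[:-1], states[1:]):
          P[a, b] += 1
      row_sums = P.sum(axis=1, keepdims=True)
      P = np.where(row_sums > 0, P / np.where(row_sums == 0, 1, row_sums), 1.0 / n_states)
      C = np.cumsum(P, axis=1)

      # Generate future states with one uniform random number per hour, then convert each
      # state to a speed with a second uniform draw inside that state's bin.
      n_sim = 5000
      sim = np.empty(n_sim, dtype=int)
      sim[0] = states[0]
      u = rng.random(n_sim)
      for t in range(1, n_sim):
          sim[t] = min(int(np.searchsorted(C[sim[t - 1]], u[t])), n_states - 1)
      widths = np.diff(edges)
      sim_speed = edges[sim] + rng.random(n_sim) * widths[sim]

      print(f"observed mean/std: {wind.mean():.2f}/{wind.std():.2f}  "
            f"generated mean/std: {sim_speed.mean():.2f}/{sim_speed.std():.2f}")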

  20. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
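    As a concrete illustration of one of the measures used, a short sketch of normalised permutation entropy is given below; the order, delay, and synthetic inputs are assumptions chosen for the example rather than the settings of the study.

      import numpy as np
      from math import factorial

      def permutation_entropy(x, m=4, delay=1):
          # Count ordinal patterns of length m and return Shannon entropy, normalised to [0, 1].
          counts = {}
          for i in range(len(x) - (m - 1) * delay):
              pattern = tuple(np.argsort(x[i:i + (m - 1) * delay + 1:delay]))
              counts[pattern] = counts.get(pattern, 0) + 1
          p = np.array(list(counts.values()), dtype=float)
          p /= p.sum()
          return float(-(p * np.log2(p)).sum() / np.log2(factorial(m)))

      rng = np.random.default_rng(0)
      months = 780                                            # 65 years of monthly values, as in 1926-1990
      noise = rng.normal(size=months)                         # irregular flow: entropy close to 1
      regular = np.sin(2 * np.pi * np.arange(months) / 12)    # strongly seasonal flow: much lower entropy
      print(round(permutation_entropy(noise), 2), round(permutation_entropy(regular), 2))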

  1. Tempered fractional time series model for turbulence in geophysical flows

    Science.gov (United States)

    Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu

    2014-09-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Applications to wind speed and water velocity in a large lake are presented to demonstrate the practical utility of the model.

  2. Tempered fractional time series model for turbulence in geophysical flows

    International Nuclear Information System (INIS)

    Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu

    2014-01-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Applications to wind speed and water velocity in a large lake are presented to demonstrate the practical utility of the model.

  3. On the Application of Information in Time Series Analysis

    Czech Academy of Sciences Publication Activity Database

    Klán, Petr; Wilkie, J.; Ankenbrand, T.

    1998-01-01

    Vol. 8, No. 1 (1998), pp. 39-49. ISSN 1210-0552. Grant - others: Fonds National Suisse de la Recherche Scientifique (XE) CP93:9630. Keywords: time series analysis; measurement and application of information. Subject RIV: BA - General Mathematics

  4. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    -standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  5. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regimeswitching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...

  6. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    Two-fractal overlap time series: Earthquakes and market crashes. Bikas K Chakrabarti, Arnab Chatterjee and Pratip Bhattacharyya (Theoretical Condensed Matter Physics Division and Centre for Applied Mathematics and Computational Science, Saha Institute of Nuclear Physics), pp. 203-210.

  7. Time Series Data Visualization in World Wide Telescope

    Science.gov (United States)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.

  8. Seasonal time series data imputation: Comparison between feed ...

    African Journals Online (AJOL)

    Specifically we examine how recursive and direct estimates from forward and backward learning Artificial Neural Networks (ANN) compare with seasonal ARIMA estimates and interpolation estimates of additive outliers in seasonal ARIMA models. A comparison statistic is also proposed. Keywords: Time Series; Artificial ...

  9. Detecting cognizable trends of gene expression in a time series ...

    Indian Academy of Sciences (India)

    Detecting cognizable trends of gene expression in a time series RNA-sequencing experiment: a bootstrap approach. Shatakshee Chatterjee, Partha P. Majumder, Priyanka Pandey. Research Article, Journal of Genetics, Volume 95, Issue 3, September 2016, pp. 587- ...

  10. Buys – Ballot Estimates for time series decomposition | Iwueze ...

    African Journals Online (AJOL)

    An estimation procedure based on the Buys – Ballot (1847) table for time series decomposition is given in this paper. We give two alternative methods called the Chain Base Estimation and Fixed Base Estimation methods. Simulated examples are used to illustrate the methods, while comparing them with the least squares ...

  11. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  12. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    In this paper, we propose a new method of forecasting with a nonlinear time series model using the Monte-Carlo bootstrap method. This new method gives better results in terms of forecast root mean squared error (RMSE) when compared with the traditional bootstrap method and the Monte-Carlo method of forecasting using a ...

  13. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...

  14. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)] e-mail: mhanias@teihal.gr; Giannaris, G. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Spyridakis, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated. The corresponding Kolmogorov entropy was also calculated.

  15. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated. The corresponding Kolmogorov entropy was also calculated.

  16. Financial Intermediation and the Nigerian Economy: A Time Series ...

    African Journals Online (AJOL)

    This paper examines the level of development of financial intermediation and how it impacts on economic growth of Nigeria. Using a time series data covering a period of 40 years (1970 –2009) and employing the econometric tool of Ordinary Least Squares (OLS) and cointegration analysis based on Engle Granger ...

  17. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  18. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  19. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    ... by 0.002579KG/Month. Finally, this work adds to the growing body of literature on data-driven production and inventory management by utilizing historical data in the development of useful forecasting mathematical model. Keywords: production model, inventory management, multivariate time series, production forecast ...

  20. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  1. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
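    A brief sketch of the linear-representation idea is given below using exact dynamic mode decomposition (DMD), a standard data-driven route to Koopman spectral properties; the paper's specific model forms and distance notions are not reproduced, and the noisy oscillator is an assumed toy signal.

      import numpy as np

      def exact_dmd(X, rank=None):
          # X holds snapshots as columns; DMD eigenvalues approximate Koopman eigenvalues.
          X1, X2 = X[:, :-1], X[:, 1:]
          U, s, Vh = np.linalg.svd(X1, full_matrices=False)
          if rank is not None:
              U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
          A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
          eigvals, W = np.linalg.eig(A_tilde)
          modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
          return eigvals, modes

      t = np.linspace(0.0, 8 * np.pi, 400)
      rng = np.random.default_rng(0)
      X = np.vstack([np.sin(t), np.cos(t)]) + 0.01 * rng.normal(size=(2, 400))
      eigvals, _ = exact_dmd(X)
      # |lambda| near 1 indicates a neutrally stable oscillation; its phase gives the frequency.
      print(np.round(np.abs(eigvals), 3), np.round(np.angle(eigvals), 3))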

  2. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the...

  3. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    on data-driven production and inventory management by utilizing historical data in the development of useful forecasting mathematical model. Keywords: production model, inventory management, multivariate time series, production forecast. Introduction. A large assortment of forecasting techniques has been developed ...

  4. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    This paper analyses the empirical relationship between economic growth and export expansion in Mauritius as observed through time series data. Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings (total, textiles and manufacturing) ...

  5. Tests for nonlinearity in short stationary time series

    International Nuclear Information System (INIS)

    Chang, T.; Sauer, T.; Schiff, S.J.

    1995-01-01

    To compare direct tests for detecting determinism in chaotic time series, data from the Henon, Lorenz, and Mackey-Glass equations were contaminated with various levels of additive colored noise. These data were analyzed with a variety of recently developed tests for determinism, and the results were compared.

  6. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  7. Seasonal time series forecasting: a comparative study of arima and ...

    African Journals Online (AJOL)

    This paper addresses the concerns of Faraway and Chatfield (1998) who questioned the forecasting ability of Artificial Neural Networks (ANN). In particular the paper compares the performance of Artificial Neural Networks (ANN) and ARIMA models in forecasting of seasonal (monthly) Time series. Using the Airline data ...

  8. Time series prediction with simple recurrent neural networks ...

    African Journals Online (AJOL)

    Simple recurrent neural networks are widely used in time series prediction. Most researchers and application developers often choose arbitrarily between Elman or Jordan simple recurrent neural networks for their applications. A hybrid of the two called Elman-Jordan (or Multi-recurrent) neural network is also being used.

  9. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  10. Multiple imputation for time series data with Amelia package.

    Science.gov (United States)

    Zhang, Zhongheng

    2016-02-01

    Time series data are common in medical research. Many laboratory variables or study endpoints may be measured repeatedly over time. Multiple imputation (MI) that does not account for the time trend of a variable may be unreliable. This article illustrates how to perform MI using the Amelia package in a clinical scenario. The Amelia package is powerful in that it allows MI for time series data. External information on the variable of interest can also be incorporated by using the prior or bound argument. Such information may be based on previously published observations, academic consensus, and personal experience. Diagnostics of the imputation model can be performed by examining the distributions of imputed and observed values, or by using the over-imputation technique.

  11. Forecasting Rice Productivity and Production of Odisha, India, Using Autoregressive Integrated Moving Average Models

    Directory of Open Access Journals (Sweden)

    Rahul Tripathi

    2014-01-01

    Full Text Available Forecasting of rice area, production, and productivity of Odisha was made from the historical data of 1950-51 to 2008-09 by using univariate autoregressive integrated moving average (ARIMA) models and was compared with the forecasted all-India data. The autoregressive (p) and moving average (q) parameters were identified based on the significant spikes in the plots of the partial autocorrelation function (PACF) and autocorrelation function (ACF) of the different time series. The ARIMA (2, 1, 0) model was found suitable for all-India rice productivity and production, whereas ARIMA (1, 1, 1) was best fitted for forecasting of rice productivity and production in Odisha. Prediction was made for the immediate next three years, that is, 2007-08, 2008-09, and 2009-10, using the best fitted ARIMA models based on the minimum value of the selection criteria, that is, the Akaike information criterion (AIC) and Schwarz-Bayesian information criterion (SBC). The performances of the models were validated by comparing the percentage deviation from the actual values and the mean absolute percent error (MAPE), which was found to be 0.61% and 2.99% for the area under rice in Odisha and India, respectively. Similarly, for prediction of rice production and productivity in Odisha and India, the MAPE was found to be less than 6%.
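    The identification-fit-validate workflow reads roughly as below on synthetic data; statsmodels and the toy productivity series are assumptions, and ARIMA(1, 1, 1) is used only because it is the order reported for Odisha.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA       # assumed dependency
      from statsmodels.tsa.stattools import acf, pacf

      rng = np.random.default_rng(0)
      years = np.arange(1950, 2010)
      yield_series = 8.0 + 0.15 * (years - 1950) + rng.normal(scale=0.8, size=len(years))  # toy "productivity"

      # Identification step: look for significant spikes in the ACF/PACF of the differenced series.
      d1 = np.diff(yield_series)
      print(np.round(acf(d1, nlags=5), 2), np.round(pacf(d1, nlags=5), 2))

      # Fit on all but the last three years, forecast them, and report MAPE as the validation measure.
      train, test = yield_series[:-3], yield_series[-3:]
      fit = ARIMA(train, order=(1, 1, 1)).fit()
      forecast = fit.forecast(3)
      mape = 100.0 * np.mean(np.abs((test - forecast) / test))
      print("AIC:", round(fit.aic, 1), "MAPE (%):", round(mape, 2))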

  12. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  13. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets. Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  14. Normalization methods in time series of platelet function assays

    Science.gov (United States)

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, discussing the most suitable approach for platelet function data series. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts, as was the case when using all data as one dataset for normalization. PMID:27428217
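    The four normalisation variants named above can be sketched as follows; the formulas are the usual textbook definitions, and applying them per assay across time points (as in the example) corresponds to one of the two directions compared in the article.

      import numpy as np

      def z_transform(x):
          return (x - x.mean()) / x.std(ddof=1)

      def range_transform(x):
          return (x - x.min()) / (x.max() - x.min())

      def proportion_transform(x):
          return x / x.sum()

      def iqr_transform(x):
          q1, q3 = np.percentile(x, [25, 75])
          return (x - np.median(x)) / (q3 - q1)

      rng = np.random.default_rng(3)
      assay = rng.normal(loc=50.0, scale=10.0, size=8)   # one platelet-function assay over 8 time points
      for f in (z_transform, range_transform, proportion_transform, iqr_transform):
          print(f.__name__, np.round(f(assay), 2))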

  15. An entropic approach to the analysis of time series

    Science.gov (United States)

    Scafetta, Nicola

    Statistical analysis of time series. With compelling arguments we show that the Diffusion Entropy Analysis (DEA) is the only method of the literature of the Science of Complexity that correctly determines the scaling hidden within a time series reflecting a Complex Process. The time series is thought of as a source of fluctuations, and the DEA is based on the Shannon entropy of the diffusion process generated by these fluctuations. All traditional methods of scaling analysis, instead, are based on the variance of this diffusion process. The variance methods detect the real scaling only if the Gaussian assumption holds true. We call H the scaling exponent detected by the variance methods and delta the real scaling exponent. If the time series is characterized by Fractional Brownian Motion, we have H = delta and the scaling can be safely determined, in this case, by using the variance methods. If, on the contrary, the time series is characterized, for example, by Levy statistics, H ≠ delta and the variance methods cannot be used to detect the true scaling. Levy walk yields the relation delta = 1/(3 - 2H). In the case of Levy flights, the variance diverges and the exponent H cannot be determined, whereas the scaling delta exists and can be established by using the DEA. Therefore, only the joint use of two different scaling analysis methods, the variance scaling analysis and the DEA, can assess the real nature, Gauss or Levy or something else, of a time series. Moreover, the DEA determines the information content, under the form of Shannon entropy, or of any other convenient entropic indicator, at each time step of the process that, given a sufficiently large number of data, is expected to become diffusion with scaling. This makes it possible to study the regime of transition from dynamics to thermodynamics, non-stationary regimes, and the saturation regime as well. First of all, the efficiency of the DEA is proved with theoretical arguments and with numerical work

  16. On the plurality of times: disunified time and the A-series | Nefdt ...

    African Journals Online (AJOL)

    Then, I attempt to show that disunified time is a problem for a semantics based on the A-series since A-truthmakers are hard to come by in a universe of temporally disconnected time-series. Finally, I provide a novel argument showing that presentists should be particularly fearful of such a universe. South African Journal of ...

  17. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber......, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time...... to optimize the average accuracies of the data received by all subscribers within the dissemination network. Finally, we have conducted extensive experiments to study the performance of the algorithms....

  18. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit, applied to prediction of discrete time... series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  19. Estimating density dependence from time series of population age structure.

    Science.gov (United States)

    Lande, Russell; Engen, Steinar; Saether, Bernt-Erik; Coulson, Tim

    2006-07-01

    Population fluctuations are caused by demographic and environmental stochasticity, time lags due to life history, and density dependence. We model a general life history allowing density dependence within and among age or stage classes in a population undergoing small or moderate fluctuations around a stable equilibrium. We develop a method for estimating the overall strength of density dependence measured by the rate of return toward equilibrium, and we also consider a simplified population description and forecasting using the density-dependent reproductive value. This generality comes at the cost of requiring a time series of the population age or stage structure instead of a univariate time series of adult or total population size. The method is illustrated by analyzing the dynamics of a fully censused population of red deer (Cervus elaphus) based on annual fluctuations of age structure through 21 years.

  20. Reconstruction of ensembles of coupled time-delay systems from time series.

    Science.gov (United States)

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  1. Gamma and Exponential Autoregressive Moving Average (ARMA ...

    African Journals Online (AJOL)

    Time series data encountered in practice exhibit properties that deviate from those of Gaussian processes. The gamma and exponentially distributed processes, which are used as basic models for positive time series, fall in the class of non-Gaussian processes. In this paper, we develop new and simpler representations of the ...

  2. Perception of acoustically presented time series with varied intervals.

    Science.gov (United States)

    Wackermann, Jiří; Pacer, Jakob; Wittmann, Marc

    2014-03-01

    Data from three experiments on serial perception of temporal intervals in the supra-second domain are reported. Sequences of short acoustic signals ("pips") separated by periods of silence were presented to the observers. Two types of time series, geometric or alternating, were used, where the modulus 1+δ of the inter-pip series and the base duration Tb (range from 1.1 to 6s) were varied as independent parameters. The observers had to judge whether the series were accelerating, decelerating, or uniform (3 paradigm), or to distinguish regular from irregular sequences (2 paradigm). "Intervals of subjective uniformity" (isus) were obtained by fitting Gaussian psychometric functions to individual subjects' responses. Progression towards longer base durations (Tb=4.4 or 6s) shifts the isus towards negative δs, i.e., accelerating series. This finding is compatible with the phenomenon of "subjective shortening" of past temporal intervals, which is naturally accounted for by the lossy integration model of internal time representation. The opposite effect observed for short durations (Tb=1.1 or 1.5s) remains unexplained by the lossy integration model, and presents a challenge for further research. © 2013 Elsevier B.V. All rights reserved.

  3. Cardiac arrhythmia classification using autoregressive modeling

    Directory of Open Access Journals (Sweden)

    Srinivasan Narayanan

    2002-11-01

    Full Text Available Background: Computer-assisted arrhythmia recognition is critical for the management of cardiac disorders. Various techniques have been utilized to classify arrhythmias. Generally, these techniques classify two or three arrhythmias or have significantly large processing times. A simpler autoregressive modeling (AR) technique is proposed to classify normal sinus rhythm (NSR) and various cardiac arrhythmias including atrial premature contraction (APC), premature ventricular contraction (PVC), supraventricular tachycardia (SVT), ventricular tachycardia (VT) and ventricular fibrillation (VF). Methods: AR modeling was performed on ECG data from normal sinus rhythm as well as various arrhythmias. The AR coefficients were computed using Burg's algorithm. The AR coefficients were classified using a generalized linear model (GLM) based algorithm in various stages. Results: AR modeling results showed that an order of four was sufficient for modeling the ECG signals. The accuracy of detecting NSR, APC, PVC, SVT, VT and VF was 93.2% to 100% using the GLM-based classification algorithm. Conclusion: The results show that AR modeling is useful for the classification of cardiac arrhythmias, with reasonably high accuracies. Further validation of the proposed technique will yield acceptable results for clinical implementation.
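    A sketch of the feature-extraction step is given below. The paper estimates the order-4 AR coefficients with Burg's algorithm and classifies them with a GLM; in this sketch Yule-Walker estimation is substituted for Burg's recursion to keep the code short, and the synthetic rhythms and the absence of a classifier are further simplifications.

      import numpy as np

      def ar_coefficients(x, order=4):
          # Yule-Walker estimates of an order-4 AR model (substituted here for Burg's
          # algorithm purely to keep the sketch short and dependency-free).
          x = np.asarray(x, dtype=float) - np.mean(x)
          n = len(x)
          r = np.array([np.dot(x[: n - k], x[k:]) / n for k in range(order + 1)])
          R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
          return np.linalg.solve(R, r[1:])            # a_1..a_4 in x_t ~ sum_k a_k x_{t-k}

      rng = np.random.default_rng(7)
      t = np.arange(720)                               # two seconds of "ECG" at 360 Hz (toy surrogates)
      nsr_like = np.sin(2 * np.pi * 1.2 * t / 360) + 0.05 * rng.normal(size=t.size)
      vf_like = rng.normal(size=t.size)                # disorganised rhythm stand-in
      print(np.round(ar_coefficients(nsr_like), 2))    # feature vectors that would feed the GLM classifier
      print(np.round(ar_coefficients(vf_like), 2))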

  4. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.

  5. FTSPlot: fast time series visualization for large datasets.

    Directory of Open Access Journals (Sweden)

    Michael Riss

    Full Text Available The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n log N); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with < 20 ms. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes; on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes/1 TiB or 1.3 x 10^11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.

  6. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the difference in similarity between different stock markets in different time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, which shows that the method can distinguish the markets of different areas in the resulting phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can be used not only for physiologic time series, but also for financial time series.

  7. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  8. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  9. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  10. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    Dynamic Time Warping (DTW) is a widely used distance measure for time series that has been successfully used in science and many other application domains. As DTW is computationally expensive, there is a strong need for efficient query processing algorithms. Such algorithms exist for single queries....... In many of today’s applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW...

  11. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...... distribution exhibits skewness and nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible...

  12. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  13. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Full Text Available Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.

  14. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable... or linearity. The generalizations do not prescribe a particular smoothing technique. In fact, when the smoother is replaced by a linear regression the generalizations reduce to close approximations of SACF and SPACF. For this reason a smooth transition from the linear to the non-linear case can be obtained by varying the bandwidth of a local linear smoother. By adjusting the flexibility of the smoother the power of the tests for independence and linearity against specific alternatives can be adjusted. The generalizations allow for graphical presentations, very similar to those used for SACF and SPACF...
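    The idea can be sketched as an R²-type lag-dependence statistic in which the fitted value comes either from a linear regression (recovering essentially the squared SACF) or from a smoother; a Nadaraya-Watson kernel smoother is used below as an illustrative stand-in for the local linear smoother, and the logistic-map input is an assumed example.

      import numpy as np

      def lag_dependence(x, k, bandwidth=None):
          # R^2-type dependence of x_t on x_{t-k}. With a plain linear fit this is (close to)
          # the squared sample autocorrelation; with a kernel smoother it also picks up
          # nonlinear lag dependence.
          y, z = x[k:], x[:-k]
          if bandwidth is None:
              fitted = np.polyval(np.polyfit(z, y, 1), z)
          else:
              w = np.exp(-0.5 * ((z[None, :] - z[:, None]) / bandwidth) ** 2)
              fitted = (w @ y) / w.sum(axis=1)
          return 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - np.mean(y)) ** 2)

      x = np.empty(2000)
      x[0] = 0.3
      for t in range(1, 2000):
          x[t] = 3.8 * x[t - 1] * (1.0 - x[t - 1])   # nonlinear dependence, weak linear autocorrelation

      print(round(lag_dependence(x, 1), 2))                     # linear version: modest
      print(round(lag_dependence(x, 1, bandwidth=0.05), 2))     # smoothed version: close to 1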

  15. Deviations from uniform power law scaling in nonstationary time series

    Science.gov (United States)

    Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.

    1997-01-01

    A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10^5 beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
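    Of the tools named above, detrended fluctuation analysis is the simplest to sketch; the window sizes and the white-noise input (expected exponent near 0.5) are assumptions for the example.

      import numpy as np

      def dfa(x, scales):
          # Detrended fluctuation analysis: integrate, split into windows of size s,
          # remove a linear trend per window, and average the residual fluctuations.
          y = np.cumsum(x - np.mean(x))
          F = []
          for s in scales:
              n_seg = len(y) // s
              t = np.arange(s)
              ms = []
              for i in range(n_seg):
                  seg = y[i * s:(i + 1) * s]
                  trend = np.polyval(np.polyfit(t, seg, 1), t)
                  ms.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(ms)))
          return np.asarray(F)

      rng = np.random.default_rng(0)
      x = rng.normal(size=20000)                        # white noise: scaling exponent near 0.5
      scales = np.unique(np.logspace(1.0, 3.0, 12).astype(int))
      F = dfa(x, scales)
      alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
      print("DFA exponent:", round(alpha, 2))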

  16. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  17. Time series analysis of nuclear instrumentation in EBR-II

    Energy Technology Data Exchange (ETDEWEB)

    Imel, G.R.

    1996-05-01

    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  18. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.

  19. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals

  20. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA model with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi, and this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.

  1. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  2. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slow vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  3. Time series prediction by feedforward neural networks - is it difficult?

    Science.gov (United States)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2003-04-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slow vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  4. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Full Text Available Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
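    For readers who want to reproduce a portmanteau-style diagnostic check of the kind compared in this record, a minimal sketch using the Ljung-Box test from statsmodels is shown below; the simulated AR(1) series, the fitted model and the lag choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Simulate an AR(1) series, fit an AR(1) model, then check the residuals.
rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

fit = ARIMA(y, order=(1, 0, 0)).fit()

# Portmanteau (Ljung-Box) test on the residuals: small p-values would indicate
# remaining autocorrelation, i.e. an inadequate model.
print(acorr_ljungbox(fit.resid, lags=[10, 20], return_df=True))
```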

  5. Learning effective connectivity from fMRI using autoregressive hidden Markov model with missing data.

    Science.gov (United States)

    Dang, Shilpa; Chaudhury, Santanu; Lall, Brejesh; Roy, Prasun Kumar

    2017-02-15

    Effective connectivity (EC) analysis of neuronal groups using fMRI delivers insights about functional integration. However, the fMRI signal has low temporal resolution due to down-sampling and indirectly measures underlying neuronal activity. The aim is to address the above issues for more reliable EC estimates. This paper proposes the use of an autoregressive hidden Markov model with missing data (AR-HMM-md) in a dynamically multi-linked (DML) framework for learning EC from multiple fMRI time series. In our recent work (Dang et al., 2016), we have shown how AR-HMM-md for modelling a single fMRI time series outperforms the existing methods. AR-HMM-md models unobserved neuronal activity and lost data over time as variables and estimates their values by joint optimization given the fMRI observation sequence. The effectiveness in learning EC is shown using simulated experiments. The effects of sampling and noise on EC are also studied. Moreover, classification experiments are performed for Attention-Deficit/Hyperactivity Disorder subjects and age-matched controls for performance evaluation on real data. Using Bayesian model selection, we see that the proposed model converged to a higher log-likelihood and demonstrated that group classification can be performed with a higher cross-validation accuracy of above 94% using the distinctive network EC which characterizes patients vs. controls. The full data EC obtained from DML-AR-HMM-md is more consistent with previous literature than the classical multivariate Granger causality method. The proposed architecture leads to more reliable estimates of EC than the existing latent models. This framework overcomes the disadvantage of low temporal resolution and improves cross-validation accuracy significantly due to the presence of missing data variables and the autoregressive process. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Deriving dynamic marketing effectiveness from econometric time series models

    OpenAIRE

    Horváth, C.; Franses, Ph.H.B.F.

    2003-01-01

    To understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the literature are unit roots, cointegration, structural breaks and impulse response functions. In this paper we summarize the most important concepts by reviewing all possible empirical cases that can...

  7. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Full Text Available Abstract Background The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  8. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  9. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN, the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  10. Seglearn: A Python Package for Learning Sequences and Time Series

    OpenAIRE

    Burns, David M.; Whyne, Cari M.

    2018-01-01

    Seglearn is an open-source python package for machine learning time series or sequences using a sliding window segmentation approach. The implementation provides a flexible pipeline for tackling classification, regression, and forecasting problems with multivariate sequence and contextual data. This package is compatible with scikit-learn and is listed under scikit-learn Related Projects. The package depends on numpy, scipy, and scikit-learn. Seglearn is distributed under the BSD 3-Clause Lic...

  11. Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria

    Science.gov (United States)

    Palka, Jessica; Wessollek, Christine; Karrasch, Pierre

    2017-10-01

    The value of remote sensing data is particularly evident where areal monitoring is needed to provide information on the development of the earth's surface. The use of temporally high-resolution time series data allows short-term changes to be detected. In Kogi State in Nigeria different vegetation types can be found. As the majority of the population in this region lives in rural communities practicing crop farming, the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions are addressed, covering different aspects of vegetation development in Kogi State: the determination and evaluation of the general development of the vegetation in the study area (trend estimation), and analyses of the short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by NOAA, supplies information on the normalized difference vegetation index (NDVI) at a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the analyses described above. For the presented analysis, data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area considering the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in

  12. The complexity of carbon flux time series in Europe

    Science.gov (United States)

    Lange, Holger; Sippel, Sebastian

    2014-05-01

    Observed geophysical time series usually exhibit pronounced variability, part of which is process-related and deterministic ("signal"), another part is due to random fluctuations ("noise"). To discern these two sources for fluctuations is notoriously difficult using conventional analysis methods, unless sophisticated model assumptions are made. Here, we present an almost parameter-free innovative approach with the potential to draw a distinction between deterministic processes and structured noise, based on ordinal pattern statistics. The method determines one measure for the information content of time series (Shannon entropy) and two complexity measures, one based on global properties of the order pattern distribution (Jensen-Shannon complexity) and one based on local (derivative) properties (Fisher information or complexity). Each time series gets classified via its location in an entropy-complexity plane; using this representation, the method draws a qualitative distinction between different types of natural processes. As a case study, we investigate Gross Primary Productivity (GPP) and respiration which are key variables in terrestrial ecosystems quantifying carbon allocation and biomass growth of vegetation. Changes in GPP and ecosystem respiration can be induced by land use change, environmental disasters or extreme events, and changing climate. Numerous attempts to quantify these variables on larger spatial scales exist. Here, we investigate gridded time series at monthly resolution for the European continent either based on upscaled measurements ("observations") or modelled with two different process-based terrestrial ecosystem models ("simulations"). The complexity analysis is either visualized as maps of Europe showing "hotspots" of complexity for GPP and respiration, or used to provide a detailed observations-simulations and model-model comparison. Values found for information and complexity will be compared to known artificial reference processes

  13. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract Background Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values ...). Conclusion We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures

  14. Reconstruction of network topology using status-time-series data

    Science.gov (United States)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information of the network structure can help to devise the control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from the STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. High accuracy and efficiency of the proposed reconstruction procedure from the status-time-series data define the novelty of the method. Our proposed method outperforms compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to the weighted networks. The ordering of the edges in the weighted networks is identified with high accuracy.

  15. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  16. Genetic programming and serial processing for time series classification.

    Science.gov (United States)

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested in three different problems. Two of them are real world problems whose data were gathered for online or conference competitions. As there are published results of these two problems this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  17. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  18. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.

  19. Learning restricted Boolean network model by time-series data.

    Science.gov (United States)

    Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance [Formula: see text], the normalized Hamming distance of state transition [Formula: see text], and the steady-state distribution distance μ (ssd). Results show that the proposed algorithm outperforms the others according to both [Formula: see text] and [Formula: see text], whereas its performance according to μ (ssd) is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.

  20. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. Cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method of confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior to describe the correlation between time series.
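    A generic cross-sample entropy sketch (not the authors' code) is given below; the embedding dimension m = 2 and tolerance r = 0.2 standard deviations are common defaults assumed here, and the two synthetic series merely illustrate that a synchronized pair yields a lower cross-SampEn than an unrelated pair.

```python
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    """Cross-sample entropy of two standardized series (Chebyshev distance)."""
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()

    def count_matches(mm):
        u_templates = np.array([u[i:i + mm] for i in range(len(u) - mm)])
        v_templates = np.array([v[i:i + mm] for i in range(len(v) - mm)])
        total = 0
        for tpl in u_templates:
            d = np.max(np.abs(v_templates - tpl), axis=1)   # Chebyshev distance
            total += np.sum(d <= r)
        return total

    B = count_matches(m)        # template matches of length m
    A = count_matches(m + 1)    # template matches of length m + 1
    return -np.log(A / B)       # larger value = more asynchrony

rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y_sync = x + 0.1 * rng.standard_normal(300)      # nearly synchronous pair
y_indep = rng.standard_normal(300)               # unrelated pair
print(round(cross_sampen(x, y_sync), 3), round(cross_sampen(x, y_indep), 3))
```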

  1. Coastline detection with time series of SAR images

    Science.gov (United States)

    Ao, Dongyang; Dumitru, Octavian; Schwarz, Gottfried; Datcu, Mihai

    2017-10-01

    For maritime remote sensing, coastline detection is a vital task. With continuous coastline detection results from satellite image time series, the actual shoreline, the sea level, and environmental parameters can be observed to support coastal management and disaster warning. Established coastline detection methods are often based on SAR images and well-known image processing approaches. These methods involve a lot of complicated data processing, which is a big challenge for remote sensing time series. Additionally, a number of SAR satellites operating with polarimetric capabilities have been launched in recent years, and many investigations of target characteristics in radar polarization have been performed. In this paper, a fast and efficient coastline detection method is proposed which comprises three steps. First, we calculate a modified correlation coefficient of two SAR images of different polarization. This coefficient differs from the traditional computation where normalization is needed. Through this modified approach, the separation between sea and land becomes more prominent. Second, we set a histogram-based threshold to distinguish between sea and land within the given image. The histogram is derived from the statistical distribution of the polarized SAR image pixel amplitudes. Third, we extract continuous coastlines using a Canny image edge detector that is rather immune to speckle noise. Finally, the individual coastlines derived from time series of SAR images can be checked for changes.

  2. Autoregressive Logistic Regression Applied to Atmospheric Circulation Patterns

    Science.gov (United States)

    Guanche, Yanira; Mínguez, Roberto; Méndez, Fernando J.

    2013-04-01

    The study of atmospheric patterns, weather types or circulation patterns is a topic deeply studied by climatologists, and it is widely accepted to disaggregate the atmospheric conditions over regions into a certain number of representative states. This consensus simplifies the study of climate conditions, improving weather predictions and knowledge of the influence of anthropogenic activities on the climate system. Once the atmospheric conditions have been reduced to a catalogue of representative states, it is desirable to have numerical models available to improve our understanding of weather dynamics, i.e. i) to analyze climate change by studying trends in the probability of occurrence of weather types, ii) to study seasonality, and iii) to analyze the possible influence of previous states (autoregressive terms or Markov chains). This work introduces a mathematical framework to analyze those effects from a qualitative point of view. In particular, an autoregressive logistic regression model, which has been successfully applied in medical and pharmacological research fields, is presented. The main advantages of autoregressive logistic regression are that i) it can be used to model polytomous outcome variables, such as circulation types, and ii) standard statistical software can be used for fitting purposes. To show the potential of this kind of model for analyzing atmospheric conditions, a case study located in the Northeastern Atlantic is described. Results obtained show how the model is capable of dealing simultaneously with predictors related to different time scales, which can be used to simulate the behaviour of circulation patterns.
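    A minimal sketch of the autoregressive logistic regression idea, under assumptions of our own (a synthetic sequence of four weather types, one autoregressive lag, annual harmonics for seasonality and a linear trend), is shown below; it is not the authors' model or data, and it uses scikit-learn's multinomial logistic regression for fitting.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_days, n_types = 3000, 4

# Synthetic daily sequence of circulation types with some persistence.
states = np.zeros(n_days, dtype=int)
for t in range(1, n_days):
    states[t] = states[t - 1] if rng.random() < 0.6 else rng.integers(0, n_types)

day = np.arange(1, n_days)
target = states[1:]

# Predictors: one-hot previous type (autoregressive term), annual harmonics
# (seasonality) and a linear trend (long-term change).
X = np.column_stack([
    np.eye(n_types)[states[:-1]],
    np.sin(2 * np.pi * day / 365.25),
    np.cos(2 * np.pi * day / 365.25),
    day / n_days,
])

model = LogisticRegression(max_iter=1000).fit(X, target)
print(model.predict_proba(X[:1]))   # next-day probabilities for each weather type
```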

  3. A Method for the Monthly Electricity Demand Forecasting in Colombia based on Wavelet Analysis and a Nonlinear Autoregressive Model

    Directory of Open Access Journals (Sweden)

    Cristhian Moreno-Chaparro

    2011-12-01

    Full Text Available This paper proposes a monthly electricity forecast method for the National Interconnected System (SIN) of Colombia. The method preprocesses the time series using a Multiresolution Analysis (MRA) with the Discrete Wavelet Transform (DWT); a study of the selection of the mother wavelet and its order, as well as the decomposition level, was carried out. Given that the original series follows non-linear behaviour, a neural nonlinear autoregressive (NAR) model was used. The prediction was obtained by adding the trend forecast to the estimate obtained from the residual series, combined with further components extracted from the preprocessing. A bibliographic review of studies conducted internationally and in Colombia is included, in addition to references to investigations made with the wavelet transform applied to electric energy prediction and studies reporting the use of NAR in prediction.

  4. Adaptive modelling and forecasting of offshore wind power fluctuations with Markov-switching autoregressive models

    DEFF Research Database (Denmark)

    Pinson, Pierre; Madsen, Henrik

    2012-01-01

    Wind power production data at temporal resolutions of a few minutes exhibit successive periods with fluctuations of various dynamic nature and magnitude, which cannot be explained (so far) by the evolution of some explanatory variable. Our proposal is to capture this regime-switching behaviour ... The estimation criterion to be optimized is based on penalized maximum likelihood, with exponential forgetting of past observations. MSAR models are then employed for one-step-ahead point forecasting of 10 min resolution time series of wind power at two large offshore wind farms. They are favourably compared against persistence and autoregressive models. It is finally shown that the main interest of MSAR models lies in their ability to generate interval/density forecasts of significantly higher skill.

  5. Order Selection for General Expression of Nonlinear Autoregressive Model Based on Multivariate Stepwise Regression

    Science.gov (United States)

    Shi, Jinfei; Zhu, Songqing; Chen, Ruwen

    2017-12-01

    An order selection method based on multiple stepwise regressions is proposed for the General Expression of Nonlinear Autoregressive (GNAR) model, which converts the model order problem into variable selection for a multiple linear regression equation. The partial autocorrelation function is adopted to define the linear terms in the GNAR model. The result is set as the initial model, and then the nonlinear terms are introduced gradually. Statistics are chosen to assess the improvements contributed by both the newly introduced and the previously included variables to the model characteristics, and these are used to determine which model variables to retain or eliminate. The optimal model is then obtained through measures of goodness of fit or significance tests. Simulation results and experiments on classic time-series data show that the proposed method is simple, reliable and applicable to practical engineering.
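    The first step of such a procedure, defining the linear (autoregressive) terms from the partial autocorrelation function before nonlinear terms are introduced stepwise, can be sketched as follows; the simulated AR(2) series, the 12-lag horizon and the 1.96/sqrt(N) significance band are assumptions made for illustration.

```python
import numpy as np
from statsmodels.tsa.stattools import pacf

# Simulate an AR(2) process and keep the lags whose partial autocorrelation
# exceeds the approximate 95% band; these seed the linear part of the model.
rng = np.random.default_rng(0)
y = np.zeros(600)
for t in range(2, 600):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

values = pacf(y, nlags=12)
threshold = 1.96 / np.sqrt(len(y))
linear_lags = [k for k in range(1, 13) if abs(values[k]) > threshold]
print("candidate linear lags:", linear_lags)    # typically [1, 2]
```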

  6. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatic real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open source web application solution AMP (Apache, MySQL, and PHP), creating a website that can effectively show and help us manage the real-time database.

  7. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  8. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to nonparametric tests aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...

  9. Accelerating molecular dynamics simulations by linear prediction of time series

    Science.gov (United States)

    Brutovsky, B.; Mülders, T.; Kneller, G. R.

    2003-04-01

    We present a molecular dynamics simulation scheme which makes it possible to speed up molecular dynamics simulations by linear prediction of force time series. The explicit calculation of nonbonding forces is periodically replaced by linear prediction from past values. Applying our method to liquid oxygen consisting of flexible molecules, we obtained real speedups between 5.4 and 6.5 compared to conventional molecular dynamics simulations. Here only the bond-stretching forces were calculated at each time step. We demonstrate that essential dynamical quantities, such as the mean-square displacement and the velocity autocorrelation function, are preserved.
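    The linear-prediction ingredient of this scheme (not a molecular dynamics integrator) can be sketched as below: an autoregressive predictor is fitted to past samples of a force component by least squares and then used to extrapolate the next value; the toy oscillatory "force" series and the model order are assumptions.

```python
import numpy as np

def fit_ar_predictor(f, order):
    """Least-squares AR(order) coefficients for one force component."""
    rows = [f[i:i + order][::-1] for i in range(len(f) - order)]
    coeffs, *_ = np.linalg.lstsq(np.array(rows), f[order:], rcond=None)
    return coeffs

def predict_next(f, coeffs):
    """Linear prediction of the next sample from the most recent ones."""
    order = len(coeffs)
    return float(coeffs @ f[-1:-order - 1:-1])

# Toy "force" trajectory: smooth oscillation plus weak noise.
t = np.linspace(0, 20, 2000)
force = np.sin(1.3 * t) + 0.01 * np.random.randn(t.size)

coeffs = fit_ar_predictor(force[:1500], order=8)
print(predict_next(force[:1600], coeffs), force[1600])   # predicted vs. actual
```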

  10. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS), Normalized Difference Vegetation Index (NDVI), and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.

  11. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  12. Connectionist Architectures for Time Series Prediction of Dynamical Systems

    Science.gov (United States)

    Weigend, Andreas Sebastian

    We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rates series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.

  13. Seasonality of Tuberculosis in Delhi, India: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Varun Kumar

    2014-01-01

    Full Text Available Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, the study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. A retrospective record-based study was undertaken in a Directly Observed Treatment Short course (DOTS) centre located in the south district of Delhi. Six-year data from January 2007 to December 2012 was analyzed. The expert modeler of SPSS ver. 21 software was used to fit the best suitable model for the time series data. Results. The autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show a significant peak suggesting a seasonal component of the TB series. The seasonal adjusted factor (SAF) showed peak seasonal variation from March to May. The univariate model by the expert modeler in SPSS showed that Winters' multiplicative model could best predict the time series data with 69.8% variability. The forecast shows a declining trend with seasonality. Conclusion. A seasonal pattern and declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.
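    A Winters-type multiplicative seasonal model of the kind selected in this record can be fitted with statsmodels as sketched below; the monthly case counts are invented for illustration and do not reproduce the Delhi data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly case counts, Jan 2007 - Dec 2012 (72 observations).
rng = np.random.default_rng(3)
months = pd.date_range("2007-01", periods=72, freq="MS")
season = 1 + 0.3 * np.sin(2 * np.pi * (np.arange(72) % 12) / 12)
cases = pd.Series((120 - 0.5 * np.arange(72)) * season + rng.normal(0, 5, 72),
                  index=months)

# Exponential smoothing with additive trend and multiplicative seasonality
# (a Winters-type model), followed by a twelve-month forecast.
fit = ExponentialSmoothing(cases, trend="add", seasonal="mul",
                           seasonal_periods=12).fit()
print(fit.forecast(12))
```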

  14. Seasonality of tuberculosis in delhi, India: a time series analysis.

    Science.gov (United States)

    Kumar, Varun; Singh, Abhay; Adhikary, Mrinmoy; Daral, Shailaja; Khokhar, Anita; Singh, Saudan

    2014-01-01

    Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, the study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. Retrospective record based study was undertaken in a Directly Observed Treatment Short course (DOTS) centre located in the south district of Delhi. Six-year data from January 2007 to December 2012 was analyzed. Expert modeler of SPSS ver. 21 software was used to fit the best suitable model for the time series data. Results. Autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show significant peak suggesting seasonal component of the TB series. Seasonal adjusted factor (SAF) showed peak seasonal variation from March to May. Univariate model by expert modeler in the SPSS showed that Winter's multiplicative model could best predict the time series data with 69.8% variability. The forecast shows declining trend with seasonality. Conclusion. A seasonal pattern and declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.

  15. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
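    The Bayesian Blocks segmentation described here is available in astropy; the sketch below applies that implementation (not the authors' own code) to a synthetic event list whose rate changes abruptly, with an illustrative false-positive parameter p0.

```python
import numpy as np
from astropy.stats import bayesian_blocks

# Synthetic event list whose rate jumps at t = 50 (a "burst").
rng = np.random.default_rng(7)
t = np.sort(np.concatenate([rng.uniform(0, 50, 200),      # low-rate interval
                            rng.uniform(50, 60, 300)]))    # high-rate interval

# Optimal segmentation into blocks of constant event rate.
edges = bayesian_blocks(t, fitness="events", p0=0.01)
print(np.round(edges, 2))   # a change point should appear near t = 50
```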

  16. Assessing Coupling Dynamics from an Ensemble of Time Series

    Directory of Open Access Journals (Sweden)

    Germán Gómez-Herrero

    2015-04-01

    Full Text Available Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.

  17. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  18. A Time Series Analysis of Macroeconomic Determinants of Corporate Births in Romania in the period 2008-2013

    Directory of Open Access Journals (Sweden)

    Marușa Beca

    2015-06-01

    Full Text Available In this article, we studied the relationship between macroeconomic factors and the observed corporate births for the Romanian economy through the Autoregressive Distributed Lags Model (ADL). We performed a time series analysis that uses monthly data for the period January 2008 – December 2013 in order to establish the impact of the fiscal and monetary policy adopted by the Romanian government in times of economic crisis on the firms’ demography. The corporate birth rate is an endogenous variable in a linear function model with five exogenous macroeconomic variables such as the CPI, the loans ratio to GDP, the FDI, the long term interest rate, tax rate to GDP and the lags of the dependent variable. The main finding is that the variance of the corporate birth rate variable is negatively correlated with the variances of CPI in the current month and the interest rate two months lagged. We also determined that the variance of the dependent variable was positively correlated with the variances of the loans rate two months lagged, tax rate four months ago and FDI two months lagged and FDI in the current period.
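    An ADL-type specification of this general form can be estimated by ordinary least squares, as sketched below; the variable names, lag choices and synthetic monthly data are assumptions that loosely mirror the lags reported above, not the authors' data set or exact model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly series standing in for the variables discussed above.
rng = np.random.default_rng(2)
n = 72
df = pd.DataFrame({
    "birth_rate": rng.normal(10, 1, n),
    "cpi":        rng.normal(3, 0.5, n),
    "loans_gdp":  rng.normal(30, 2, n),
    "interest":   rng.normal(7, 0.8, n),
})

# ADL specification: y_t on y_{t-1}, CPI_t, loans_{t-2} and interest_{t-2}.
X = pd.DataFrame({
    "y_lag1":     df["birth_rate"].shift(1),
    "cpi":        df["cpi"],
    "loans_lag2": df["loans_gdp"].shift(2),
    "int_lag2":   df["interest"].shift(2),
}).dropna()
y = df["birth_rate"].loc[X.index]

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary().tables[1])
```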

  19. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available The natural rubber is a non-wood product obtained from the coagulation of the latices of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being the synthetic rubber derived from petroleum. Similarly to what happens with countless other products, the forecast of future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the more accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg), in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The studied models belonged to the ARIMA family. The main results were: the domestic market of natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best adjustment of the time series of natural rubber prices (R$/kg); the prognosis accomplished for the series supplied statistically adequate fittings.
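
    A minimal sketch of the modelling step described above: fitting an ARIMA(1,1,1) to a monthly price series and forecasting 12 months ahead with statsmodels. The simulated series stands in for the actual rubber price data, which are not reproduced here.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
# Random-walk-like series mimicking monthly prices (R$/kg), Jan/99 - Jun/2006.
idx = pd.date_range("1999-01", periods=90, freq="MS")
price = pd.Series(1.0 + np.cumsum(rng.normal(0.01, 0.05, 90)), index=idx)

model = ARIMA(price, order=(1, 1, 1))
res = model.fit()
forecast = res.get_forecast(steps=12)            # Jul/2006 - Jun/2007 horizon
print(res.summary().tables[1])
print(forecast.predicted_mean.head())
```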

  20. Prewhitening of hydroclimatic time series? Implications for inferred change and variability across time scales

    Science.gov (United States)

    Razavi, Saman; Vogel, Richard

    2018-02-01

    Prewhitening, the process of eliminating or reducing short-term stochastic persistence to enable detection of deterministic change, has been extensively applied to time series analysis of a range of geophysical variables. Despite the controversy around its utility, methodologies for prewhitening time series continue to be a critical feature of a variety of analyses including: trend detection of hydroclimatic variables and reconstruction of climate and/or hydrology through proxy records such as tree rings. With a focus on the latter, this paper presents a generalized approach to exploring the impact of a wide range of stochastic structures of short- and long-term persistence on the variability of hydroclimatic time series. Through this approach, we examine the impact of prewhitening on the inferred variability of time series across time scales. We document how a focus on prewhitened, residual time series can be misleading, as it can drastically distort (or remove) the structure of variability across time scales. Through examples with actual data, we show how such loss of information in prewhitened time series of tree rings (so-called "residual chronologies") can lead to the underestimation of extreme conditions in climate and hydrology, particularly droughts, reconstructed for centuries preceding the historical period.
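
    A simple sketch of the operation whose side effects the paper examines: AR(1) prewhitening removes lag-1 persistence, and comparing the raw and residual series at coarser aggregation scales shows how much low-frequency variability is lost. The simulated AR(1) series and scale choices are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, phi = 600, 0.7
x = np.empty(n)
x[0] = rng.standard_normal()
for t in range(1, n):                             # persistent (AR(1)) series
    x[t] = phi * x[t - 1] + rng.standard_normal()

r1 = np.corrcoef(x[:-1], x[1:])[0, 1]             # estimated lag-1 correlation
resid = x[1:] - r1 * x[:-1]                       # prewhitened (residual) series

def var_at_scale(y, k):
    """Variance of k-step block averages (variability at coarser time scales)."""
    m = len(y) // k
    return np.var(y[:m * k].reshape(m, k).mean(axis=1))

for k in (1, 5, 20):
    print(k, var_at_scale(x, k), var_at_scale(resid, k))
```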

  1. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    Science.gov (United States)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
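
    A minimal bootstrap particle filter for a stochastic-volatility-type model (a latent AR(1) log-volatility observed through a nonlinear mapping), as a hedged illustration of the SMC idea mentioned above; it is not the authors' method, which handles general ARMA innovations correlated in time, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
T, N = 300, 1000
phi, sigma_x, beta = 0.95, 0.3, 0.7

# Simulate latent AR(1) state x and observations y = beta * exp(x/2) * noise.
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + sigma_x * rng.standard_normal()
y = beta * np.exp(x / 2) * rng.standard_normal(T)

particles = rng.normal(0, sigma_x / np.sqrt(1 - phi**2), N)   # stationary init
x_hat = np.zeros(T)
for t in range(T):
    particles = phi * particles + sigma_x * rng.standard_normal(N)   # propagate
    scale = beta * np.exp(particles / 2)
    logw = -0.5 * (y[t] / scale) ** 2 - np.log(scale)   # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    x_hat[t] = np.sum(w * particles)                    # filtered posterior mean
    particles = particles[rng.choice(N, N, p=w)]        # multinomial resampling

print("RMSE of filtered state:", np.sqrt(np.mean((x_hat - x) ** 2)))
```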

  2. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  3. Neutron noise analysis of BWR using time series analysis

    International Nuclear Information System (INIS)

    Fukunishi, Kohyu

    1976-01-01

    The main purpose of this paper is to give a more quantitative understanding of noise sources in neutron flux and to provide a useful tool for the detection and diagnosis of the reactor. The space-dependent effects of distributed neutron flux signals in the axial direction of two different strings are investigated through the power contribution ratio among neutron fluxes and the incoherent noise spectra of neutron fluxes derived from autoregressive spectra. The signals were measured on a medium-sized 460 MWe commercial BWR in Japan. From the obtained results, local and global noise sources in neutron flux are discussed. The method is shown to be a useful tool for the detection and diagnosis of anomalous phenomena in BWRs. (orig./RW) [de

  4. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
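
    A hedged sketch of the general idea of down-weighting clumped measurements in an unevenly sampled series: each point is weighted by the time interval it "covers" (half the gap to its neighbours). This is a simple proxy consistent with the interpolation view above, not the paper's exact weighting scheme; the simulated sampling pattern is an assumption.

```python
import numpy as np

def interval_weights(t):
    """Trapezoidal-style weights for sorted, irregular sampling times t."""
    t = np.asarray(t, dtype=float)
    gaps = np.diff(t)
    w = np.empty_like(t)
    w[1:-1] = 0.5 * (gaps[:-1] + gaps[1:])
    w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
    return w / w.sum()

rng = np.random.default_rng(5)
t = np.sort(np.concatenate([rng.uniform(0, 10, 50),      # a dense clump...
                            rng.uniform(10, 100, 50)]))  # ...then sparse coverage
x = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(t.size)

w = interval_weights(t)
print("unweighted mean:", x.mean())
print("weighted mean:  ", np.sum(w * x))   # less biased toward the clump
```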

  5. Optimal Hedging with the Vector Autoregressive Model

    NARCIS (Netherlands)

    L. Gatarek (Lukasz); S.G. Johansen (Soren)

    2014-01-01

    We derive the optimal hedging ratios for a portfolio of assets driven by a Cointegrated Vector Autoregressive model with general cointegration rank. Our hedge is optimal in the sense of minimum variance portfolio. We consider a model that allows for the hedges to be
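
    As a hedged, much simpler illustration of the objective itself (not the authors' cointegrated-VAR estimator), this sketch computes the classical minimum-variance hedge ratio h = Cov(spot, futures) / Var(futures) from simulated returns; the data-generating process is an assumption.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 1000
futures_ret = rng.normal(0, 0.01, n)
spot_ret = 0.9 * futures_ret + rng.normal(0, 0.004, n)   # correlated returns

h = np.cov(spot_ret, futures_ret, ddof=1)[0, 1] / np.var(futures_ret, ddof=1)
hedged = spot_ret - h * futures_ret                      # minimum-variance hedge
print("hedge ratio:", h)
print("variance unhedged vs hedged:", np.var(spot_ret), np.var(hedged))
```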

  6. Improving Variational Autoencoders with Inverse Autoregressive Flow

    NARCIS (Netherlands)

    Kingma, D.; Salimans, T.; Josefowicz, R.; Chen, X.; Sutskever, I.; Welling, M.; Lee, D.D.; von Luxburg, U.; Garnett, R.; Sugiyama, M.; Guyon, I.

    2017-01-01

    The framework of normalizing flows provides a general strategy for flexible variational inference of posteriors over latent variables. We propose a new type of normalizing flow, inverse autoregressive flow (IAF), that, in contrast to earlier published flows, scales well to high-dimensional latent

  7. Interval Forecast for Smooth Transition Autoregressive Model ...

    African Journals Online (AJOL)

    In this paper, we propose a simple method for constructing an interval forecast for the smooth transition autoregressive (STAR) model. This interval forecast is based on bootstrapping the residual error of the estimated STAR model for each forecast horizon and computing various Akaike information criterion (AIC) functions. This new ...

  8. New interval forecast for stationary autoregressive models ...

    African Journals Online (AJOL)

    In this paper, we proposed a new forecasting interval for stationary Autoregressive, AR(p) models using the Akaike information criterion (AIC) function. Ordinarily, the AIC function is used to determine the order of an AR(p) process. In this study however, AIC forecast interval compared favorably with the theoretical forecast ...

  9. Bias-correction in vector autoregressive models

    DEFF Research Database (Denmark)

    Engsted, Tom; Pedersen, Thomas Quistgaard

    2014-01-01

    We analyze the properties of various methods for bias-correcting parameter estimates in both stationary and non-stationary vector autoregressive models. First, we show that two analytical bias formulas from the existing literature are in fact identical. Next, based on a detailed simulation study...

  10. The Integration Order of Vector Autoregressive Processes

    DEFF Research Database (Denmark)

    Franchi, Massimo

    We show that the order of integration of a vector autoregressive process is equal to the difference between the multiplicity of the unit root in the characteristic equation and the multiplicity of the unit root in the adjoint matrix polynomial. The equivalence with the standard I(1) and I(2...

  11. Forecasting nuclear power supply with Bayesian autoregression

    International Nuclear Information System (INIS)

    Beck, R.; Solow, J.L.

    1994-01-01

    We explore the possibility of forecasting the quarterly US generation of electricity from nuclear power using a Bayesian autoregression model. In terms of forecasting accuracy, this approach compares favorably with both the Department of Energy's current forecasting methodology and their more recent efforts using ARIMA models, and it is extremely easy and inexpensive to implement. (author)

  12. THE ALLOMETRIC-AUTOREGRESSIVE MODEL IN GENETIC ...

    African Journals Online (AJOL)

    THE ALLOMETRIC-AUTOREGRESSIVE MODEL IN GENETIC STUDIES: HERITABILITIES AND CORRELATIONS IN THE RAT*. M.M. Scholtz and C.Z. Roux. Animal and Dairy Science Research Institute, Private Bag X2, Irene, 1675, Republic of South Africa. (Keywords: Rat, growth model, heritabilities, correlations).

  13. Mixed Causal-Noncausal Autoregressions with Strictly Exogenous Regressors

    NARCIS (Netherlands)

    Hecq, Alain; Issler, J.V.; Telg, Sean

    2017-01-01

    The mixed autoregressive causal-noncausal model (MAR) has been proposed to estimate economic relationships involving explosive roots in their autoregressive part, as they have stationary forward solutions. In previous work, possible exogenous variables in economic relationships are substituted into

  14. Time series analytics using sliding window metaheuristic optimization-based machine learning system for identifying building energy consumption patterns

    International Nuclear Information System (INIS)

    Chou, Jui-Sheng; Ngo, Ngoc-Tri

    2016-01-01

    Highlights: • This study develops a novel time-series sliding window forecast system. • The system integrates metaheuristics, machine learning and time-series models. • Site experiment of smart grid infrastructure is installed to retrieve real-time data. • The proposed system accurately predicts energy consumption in residential buildings. • The forecasting system can help users minimize their electricity usage. - Abstract: Smart grids are a promising solution to the rapidly growing power demand because they can considerably increase building energy efficiency. This study developed a novel time-series sliding window metaheuristic optimization-based machine learning system for predicting real-time building energy consumption data collected by a smart grid. The proposed system integrates a seasonal autoregressive integrated moving average (SARIMA) model and metaheuristic firefly algorithm-based least squares support vector regression (MetaFA-LSSVR) model. Specifically, the proposed system fits the SARIMA model to linear data components in the first stage, and the MetaFA-LSSVR model captures nonlinear data components in the second stage. Real-time data retrieved from an experimental smart grid installed in a building were used to evaluate the efficacy and effectiveness of the proposed system. A k-week sliding window approach is proposed for employing historical data as input for the novel time-series forecasting system. The prediction system yielded high and reliable accuracy rates in 1-day-ahead predictions of building energy consumption, with a total error rate of 1.181% and mean absolute error of 0.026 kW h. Notably, the system demonstrates an improved accuracy rate in the range of 36.8–113.2% relative to those of the linear forecasting model (i.e., SARIMA) and nonlinear forecasting models (i.e., LSSVR and MetaFA-LSSVR). Therefore, end users can further apply the forecasted information to enhance efficiency of energy usage in their buildings, especially
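
    A hedged two-stage sketch in the spirit of the proposed system: a seasonal ARIMA captures the linear component and a support-vector regressor models the residual (nonlinear) component from lagged residuals. The metaheuristic firefly hyperparameter search of the paper is omitted, and the simulated series, model orders and lag count are assumptions.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n, period = 400, 24                              # e.g. hourly data, daily season
t = np.arange(n)
y = (5 + 2 * np.sin(2 * np.pi * t / period)
     + 0.3 * np.sin(0.07 * t) ** 3 + 0.2 * rng.standard_normal(n))

# Stage 1: linear/seasonal component via SARIMA.
sarima = SARIMAX(y, order=(1, 0, 1), seasonal_order=(1, 0, 1, period)).fit(disp=False)
resid = y - sarima.fittedvalues

# Stage 2: SVR on lagged residuals to capture leftover nonlinear structure.
lags = 3
Xr = np.column_stack([resid[i:n - lags + i] for i in range(lags)])
yr = resid[lags:]
svr = SVR(C=10.0, gamma="scale").fit(Xr, yr)

hybrid_fit = sarima.fittedvalues[lags:] + svr.predict(Xr)
print("in-sample MAE, SARIMA only:", np.mean(np.abs(resid[lags:])))
print("in-sample MAE, hybrid:     ", np.mean(np.abs(y[lags:] - hybrid_fit)))
```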

  15. Mapping Brazilian savanna vegetation gradients with Landsat time series

    Science.gov (United States)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, whereat inaccuracies occurred especially between similar classes and data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important

  16. An Interrupted Time-Series Analysis to Assess Impact of Introduction of Co-Payment on Emergency Room Visits in Cyprus.

    Science.gov (United States)

    Petrou, Panagiotis

    2015-10-01

    A co-payment fee of EUR10 was introduced in Cyprus, in order to cope with overcrowding of emergency room services. The scope of this paper is the assessment of the short-term impact of this measure. We used an interrupted time-series autoregressive integrated moving average model, and we analyzed official data from Cyprus' largest emergency room facility for three years. Co-payment is associated with a 16% statistically significant reduction of emergency room visits. No impact was observed in categories of teenagers, children, infants, and people over 70 years old. Co-payment was proven to be effective in Cyprus' emergency room setting and is expected to lessen congestion in the emergency room. The price insensitivity of people aged over 70 years, teenagers, children and infants, merits additional research for the identification of the underlying reasons.
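
    A hedged sketch of the interrupted time-series idea used here: a step (level-shift) dummy switched on at the policy date enters a seasonal ARIMA as an exogenous regressor, and its coefficient estimates the change in monthly visits. The data are simulated and the model order is an assumption, not the Cyprus analysis.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(8)
idx = pd.date_range("2011-01", periods=48, freq="MS")
baseline = 5000 + 300 * np.sin(2 * np.pi * np.arange(48) / 12)
step = (idx >= "2014-01").astype(float)                   # policy introduced here
visits = pd.Series(baseline - 800 * step + rng.normal(0, 100, 48), index=idx)
exog = pd.DataFrame({"copay": step}, index=idx)

res = SARIMAX(visits, exog=exog, order=(1, 0, 1)).fit(disp=False)
print("estimated level shift:", res.params["copay"])      # close to -800
```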

  17. GPS time series at Campi Flegrei caldera (2000-2013

    Directory of Open Access Journals (Sweden)

    Prospero De Martino

    2014-05-01

    Full Text Available The Campi Flegrei caldera is an active volcanic system associated with a high volcanic risk, and represents a well-known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods followed by subsidence phases with some episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation of the area. Seven main uplift episodes were detected, and during each uplift period the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurred. The complete archive of GPS time series at the Campi Flegrei area is reported in the Supplementary materials. These data can be useful for the scientific community in improving research on Campi Flegrei caldera dynamics and hazard assessment.

  18. Exploratory joint and separate tracking of geographically related time series

    Science.gov (United States)

    Balasingam, Balakumar; Willett, Peter; Levchuk, Georgiy; Freeman, Jared

    2012-05-01

    Target tracking techniques have usually been applied to physical systems via radar, sonar or imaging modalities. But the same techniques - filtering, association, classification, track management - can be applied to nontraditional data such as one might find in other fields such as economics, business and national defense. In this paper we explore a particular data set. The measurements are time series collected at various sites; but other than that little is known about it. We shall refer to the data as representing the Megawatt hour (MWH) output of various power plants located in Afghanistan. We pose such questions as: 1. Which power plants seem to have a common model? 2. Do any power plants change their models with time? 3. Can power plant behavior be predicted, and if so, how far into the future? 4. Are some of the power plants stochastically linked? That is, do we observe a lack of power demand at one power plant as implying a surfeit of demand elsewhere? The observations seem well modeled as hidden Markov. This HMM modeling is compared to other approaches; and tests are extended to other (albeit self-generated) data sets with similar characteristics. Keywords: Time-series analysis, hidden Markov models, statistical similarity, clustering weighted

  19. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, the increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is reduced to the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering important nodes than the common aggregating method.
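
    A hedged sketch of the underlying iterative idea: eigenvector centrality obtained by power iteration on a non-negative adjacency matrix. The paper's supra-evolution matrix couples several time layers, which is not reproduced here; the toy network is an assumption.

```python
import numpy as np

def eigenvector_centrality(A, n_iter=200, tol=1e-10):
    """Leading-eigenvector scores of a non-negative square matrix A (power iteration)."""
    x = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(n_iter):
        x_new = A @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Toy symmetric network of 5 nodes; node 2 is the hub and should score highest.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 1],
              [0, 0, 1, 0, 1],
              [0, 0, 1, 1, 0]], dtype=float)
print(eigenvector_centrality(A))
```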

  20. Detecting and characterising ramp events in wind power time series

    International Nuclear Information System (INIS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-01-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.

  1. Assemblage time series reveal biodiversity change but not systematic loss.

    Science.gov (United States)

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  2. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  3. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are very useful, especially in relation to climate condition monitoring. This study was conducted to identify changes in Malaysia's albedo pattern. The pattern recognition and changes will be useful for a variety of environmental and climate monitoring researches such as carbon budgeting and aerosol mapping. Ten years (2000–2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods for time-series analyses were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rise and fall of the line graph show a similar trend with regard to the daily observations. The difference can be identified in terms of the value or percentage of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia has uniform behaviour and effects with regard to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern changes of albedo with respect to the nebulosity index indicate that there are external factors that affect the albedo values, as the sky conditions and its diffusion plotted do not have a uniform trend over the years, especially when the trend of 5-year intervals is examined; 2000 shows high

  4. Ensemble Deep Learning for Biomedical Time Series Classification.

    Science.gov (United States)

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  5. Ensemble Deep Learning for Biomedical Time Series Classification

    Directory of Open Access Journals (Sweden)

    Lin-peng Jin

    2016-01-01

    Full Text Available Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  6. SaaS Platform for Time Series Data Handling

    Directory of Open Access Journals (Sweden)

    Oplachko Ekaterina

    2018-01-01

    Full Text Available The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a “Software as a Service” model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.

  7. SaaS Platform for Time Series Data Handling

    Science.gov (United States)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.

  8. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  9. Estimation of dynamic flux profiles from metabolic time series data

    Directory of Open Access Journals (Sweden)

    Chou I-Chun

    2012-07-01

    Full Text Available Abstract Background Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Special among them is the recent method of dynamic flux estimation (DFE), which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes and detailed kinetic information must be available, in addition to the time series data. Results The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format of

  10. Nonlinear analysis and prediction of time series in multiphase reactors

    CERN Document Server

    Liu, Mingyan

    2014-01-01

    This book reports on important nonlinear aspects or deterministic chaos issues in the systems of multi-phase reactors. The reactors treated in the book include gas-liquid bubble columns, gas-liquid-solid fluidized beds and gas-liquid-solid magnetized fluidized beds. The authors take pressure fluctuations in the bubble columns as time series for nonlinear analysis, modeling and forecasting. They present qualitative and quantitative non-linear analysis tools which include attractor phase plane plot, correlation dimension, Kolmogorov entropy and largest Lyapunov exponent calculations and local non-linear short-term prediction.

  11. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    Genetic Algorithm (GA) forms a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (A.I.). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw tooth GA, and differential evolution GA. This research article presents a real-coded GA for predicting enrollments of the University of Alabama. The University of Alabama enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict enrollments of the University of Alabama and a genetic algorithm optimizes the fuzzy intervals. Results are compared with other published works and found satisfactory, indicating that real-coded GAs are fast and accurate.

  12. To center or not to center? Investigating inertia with a multilevel autoregressive model

    Directory of Open Access Journals (Sweden)

    Ellen L. Hamaker

    2015-01-01

    Full Text Available Whether level 1 predictors should be centered per cluster has received considerable attention in the multilevel literature. While most agree that there is no one preferred approach, it has also been argued that cluster mean centering is desirable when the within-cluster slope and the between-cluster slope are expected to deviate, and the main interest is in the within-cluster slope. However, we show in a series of simulations that if one has a multilevel autoregressive model in which the level 1 predictor is the lagged outcome variable (i.e., the outcome variable at the previous occasion), cluster mean centering will in general lead to a downward bias in the parameter estimate of the within-cluster slope (i.e., the autoregressive relationship). This is particularly relevant if the main question is whether there is on average an autoregressive effect. Nonetheless, we show that if the main interest is in estimating the effect of a level 2 predictor on the autoregressive parameter (i.e., a cross-level interaction), cluster mean centering should be preferred over other forms of centering. Hence, researchers should be clear on what is considered the main goal of their study, and base their choice of centering method on this when using a multilevel autoregressive model.
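
    A hedged simulation sketch of the phenomenon discussed above, reduced to pooled least squares rather than a full multilevel model: when the predictor is the lagged outcome and both variables are centered on observed cluster means, the pooled within-cluster autoregressive slope is biased downward, most strongly for short series. Sample sizes and the true autoregressive value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)
n_clusters, T, phi = 200, 10, 0.4

y_list, ylag_list, cluster = [], [], []
for i in range(n_clusters):
    mu = rng.normal(0, 2)                       # cluster-specific mean
    y = np.empty(T)
    y[0] = mu + rng.standard_normal()
    for t in range(1, T):
        y[t] = mu + phi * (y[t - 1] - mu) + rng.standard_normal()
    y_list.append(y[1:])
    ylag_list.append(y[:-1])
    cluster.append(np.full(T - 1, i))

y_all = np.concatenate(y_list)
ylag_all = np.concatenate(ylag_list)
cl = np.concatenate(cluster)

def slope(x, y):
    """Simple least-squares slope of y on x."""
    return np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Center outcome and lagged predictor on their observed cluster means.
y_c = y_all - np.array([y_all[cl == i].mean() for i in range(n_clusters)])[cl]
x_c = ylag_all - np.array([ylag_all[cl == i].mean() for i in range(n_clusters)])[cl]

print("true phi:", phi)
print("cluster-mean centered estimate:", slope(x_c, y_c))   # noticeably below phi
```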

  13. Emergency department clinical redesign, team-based care and improvements in hospital performance: A time series analysis.

    Science.gov (United States)

    Dinh, Michael M; Green, Timothy C; Bein, Kendall J; Lo, Serigne; Jones, Aaron; Johnson, Terence

    2015-08-01

    The objective was to evaluate the impact of an ED clinical redesign project that involved team-based care and early senior assessment on hospital performance. This was an interrupted time series analysis performed using daily hospital performance data 6 months before and 8 months after the implementation of the clinical redesign intervention, which involved Emergency Consultant-led team-based care, redistribution of ED beds and implementation of senior nursing coordination roles in the ED. The primary outcome was the daily National Emergency Access Target (NEAT) performance (proportion of total daily ED presentations that were admitted to an inpatient ward or discharged from the ED within 4 h of arrival). Secondary outcomes were daily ALOS in the ED, inpatient Clinical Emergency Response System (CERS) calls and hospital mortality. Autoregressive Integrated Moving Average analysis was used to model NEAT performance. Hospital mortality was modelled using negative binomial regression. After adjusting for patient volume, inpatient admissions, ambulance, hospital occupancy, weekend ED Consultant numbers, weekends and underlying trends, there was a 17% improvement in NEAT performance associated with the post-intervention period (95% CI 12%, 19%), with no evidence of an increase in clinical deterioration on inpatient wards and evidence for an improvement in hospital mortality. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  14. Predicting the Water Level Fluctuation in an Alpine Lake Using Physically Based, Artificial Neural Network, and Time Series Forecasting Models

    Directory of Open Access Journals (Sweden)

    Chih-Chieh Young

    2015-01-01

    Full Text Available Accurate prediction of water level fluctuation is important in lake management due to its significant impacts in various aspects. This study utilizes four model approaches to predict water levels in the Yuan-Yang Lake (YYL) in Taiwan: a three-dimensional hydrodynamic model, an artificial neural network (ANN) model (back propagation neural network, BPNN), a time series forecasting (autoregressive moving average with exogenous inputs, ARMAX) model, and a combined hydrodynamic and ANN model. In particular, the black-box ANN model and the physically based hydrodynamic model are coupled to more accurately predict water level fluctuation. Hourly water level data (a total of 7296 observations) were collected for model calibration (training) and validation. Three statistical indicators (mean absolute error, root mean square error, and coefficient of correlation) were adopted to evaluate model performance. Overall, the results demonstrate that the hydrodynamic model can satisfactorily predict hourly water level changes during the calibration stage but not during the validation stage. The ANN and ARMAX models predict the water level better than the hydrodynamic model does. Meanwhile, the results from the ANN model are superior to those of the ARMAX model in both training and validation phases. The novel proposed concept of using a three-dimensional hydrodynamic model in conjunction with an ANN model has clearly shown improved prediction accuracy for the water level fluctuation.

  15. Forecasting the number of zoonotic cutaneous leishmaniasis cases in south of Fars province, Iran using seasonal ARIMA time series method.

    Science.gov (United States)

    Sharafi, Mehdi; Ghaem, Haleh; Tabatabaee, Hamid Reza; Faramarzi, Hossein

    2017-01-01

    To predict the trend of cutaneous leishmaniasis and assess the relationship between the disease trend and weather variables in the south of Fars province using the Seasonal Autoregressive Integrated Moving Average (SARIMA) model. The trend of cutaneous leishmaniasis was predicted using Minitab software and the SARIMA model. In addition, information about the disease and weather conditions was collected monthly based on a time series design during January 2010 to March 2016. Various SARIMA models were assessed and the best one was selected. Then, the model's fitness was evaluated based on normality of the residuals' distribution, correspondence between the fitted and real amounts, and calculation of the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC). The study results indicated that the SARIMA (4,1,4)(0,1,0)12 model in general and the SARIMA (4,1,4)(0,1,1)12 model in the below- and above-15-years age groups could appropriately predict the disease trend in the study area. Moreover, temperature with a three-month delay (lag 3) increased the disease trend, rainfall with a four-month delay (lag 4) decreased the disease trend, and rainfall with a nine-month delay (lag 9) increased the disease trend. Based on the results, leishmaniasis follows a descending trend in the study area if drought conditions continue, SARIMA models can suitably measure the disease trend, and the disease follows a seasonal trend. Copyright © 2017 Hainan Medical University. Production and hosting by Elsevier B.V. All rights reserved.

  16. Comparative Time Series Analysis of Aerosol Optical Depth over Sites in United States and China Using ARIMA Modeling

    Science.gov (United States)

    Li, X.; Zhang, C.; Li, W.

    2017-12-01

    Long-term spatiotemporal analysis and modeling of aerosol optical depth (AOD) distribution is of paramount importance for studying radiative forcing, climate change, and human health. This study focuses on the trends and variations of AOD over six stations located in the United States and China from 2003 to 2015, using satellite-retrieved Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 6 retrievals and ground measurements derived from the Aerosol Robotic NETwork (AERONET). An autoregressive integrated moving average (ARIMA) model is applied to simulate and predict AOD values. The R2, adjusted R2, Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Bayesian Information Criterion (BIC) are used as indices to select the best-fitted model. Results show that there is a persistent decreasing trend in AOD for both MODIS data and AERONET data over three stations. Monthly and seasonal AOD variations reveal consistent aerosol patterns over stations along mid-latitudes. Regional differences impacted by climatology and land cover types are observed for the selected stations. Statistical validation of time series models indicates that the non-seasonal ARIMA model performs better for AERONET AOD data than for MODIS AOD data over most stations, suggesting the method works better for data with higher quality. By contrast, the seasonal ARIMA model reproduces the seasonal variations of MODIS AOD data much more precisely. Overall, the reasonably predicted results indicate the applicability and feasibility of the stochastic ARIMA modeling technique to forecast future and missing AOD values.
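
    A hedged sketch of the model-selection step described above: fit a small grid of (p, d, q) ARIMA candidates to a monthly AOD-like series and keep the one with the lowest BIC. The simulated series and the candidate grid are assumptions, and only one of the paper's several criteria (BIC) is used here.

```python
import itertools
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(10)
idx = pd.date_range("2003-01", periods=156, freq="MS")     # 2003-2015, monthly
aod = pd.Series(0.3 + 0.1 * np.sin(2 * np.pi * np.arange(156) / 12)
                + 0.03 * rng.standard_normal(156), index=idx)

best = None
for p, d, q in itertools.product(range(3), range(2), range(3)):
    try:
        res = ARIMA(aod, order=(p, d, q)).fit()
    except Exception:
        continue                                  # skip orders that fail to fit
    if best is None or res.bic < best[1]:
        best = ((p, d, q), res.bic)
print("selected order and BIC:", best)
```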

  17. Modeling malaria control intervention effect in KwaZulu-Natal, South Africa using intervention time series analysis.

    Science.gov (United States)

    Ebhuoma, Osadolor; Gebreslasie, Michael; Magubane, Lethumusa

    The change of the malaria control intervention policy in South Africa (SA), the re-introduction of dichlorodiphenyltrichloroethane (DDT), may be responsible for the low and sustained malaria transmission in KwaZulu-Natal (KZN). We evaluated the effect of the re-introduction of DDT on malaria in KZN and suggested practical ways the province can strengthen its existing malaria control and elimination efforts to achieve zero malaria transmission. We obtained confirmed monthly malaria cases in KZN from the malaria control program of KZN from 1998 to 2014. Seasonal autoregressive integrated moving average (SARIMA) intervention time series analysis (ITSA) was employed to model the effect of the re-introduction of DDT on confirmed monthly malaria cases. The result is an abrupt and permanent decline of monthly malaria cases (w0 = -1174.781, p-value = 0.003) following the implementation of the intervention policy. The sustained low number of malaria cases observed over a long period suggests that the continued usage of DDT did not result in insecticide resistance as earlier anticipated. It may be due to exophagic malaria vectors, which render indoor residual spraying not totally effective. Therefore, reducing malaria transmission to zero in KZN requires other reliable and complementary intervention resources to optimize the existing ones. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  18. Impact of meteorological factors on the incidence of bacillary dysentery in Beijing, China: A time series analysis (1970-2012.

    Directory of Open Access Journals (Sweden)

    Long Yan

    Full Text Available The influence of meteorological variables on the transmission of bacillary dysentery (BD) is an under-investigated topic, and effective forecasting models as a public health tool are lacking. This paper aimed to quantify the relationship between meteorological variables and BD cases in Beijing and to establish an effective forecasting model. A time series analysis was conducted in the Beijing area based upon monthly data on weather variables (i.e. temperature, rainfall, relative humidity, vapor pressure, and wind speed) and on the number of BD cases during the period 1970-2012. Autoregressive integrated moving average models with explanatory variables (ARIMAX) were built based on the data from 1970 to 2004. Prediction of monthly BD cases from 2005 to 2012 was made using the established models. The prediction accuracy was evaluated by the mean square error (MSE). Firstly, temperature with 2-month and 7-month lags and rainfall with a 12-month lag were found to be positively correlated with the number of BD cases in Beijing. Secondly, the ARIMAX model with covariates of temperature with a 7-month lag (β = 0.021, 95% confidence interval (CI): 0.004-0.038) and rainfall with a 12-month lag (β = 0.023, 95% CI: 0.009-0.037) displayed the highest prediction accuracy. The ARIMAX model developed in this study showed a good fit and precise prediction accuracy in the short term, which would help government departments to take early public health measures to prevent and control possible BD outbreaks.
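
    A hedged sketch of an ARIMAX-style fit with lagged meteorological covariates, in the spirit of the model above: temperature lagged 7 months and rainfall lagged 12 months enter as exogenous regressors of simulated monthly case counts. The data, coefficients and model orders are assumptions, not the Beijing series.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(11)
n = 240
idx = pd.date_range("1990-01", periods=n, freq="MS")
temp = pd.Series(15 + 10 * np.sin(2 * np.pi * np.arange(n) / 12)
                 + rng.normal(0, 1, n), index=idx)
rain = pd.Series(np.clip(50 + 30 * np.sin(2 * np.pi * np.arange(n) / 12 + 1)
                         + rng.normal(0, 10, n), 0, None), index=idx)

# Cases respond to temperature 7 months earlier and rainfall 12 months earlier.
cases = (100 + 1.0 * temp.shift(7) + 0.2 * rain.shift(12)
         + pd.Series(rng.normal(0, 5, n), index=idx))

data = pd.concat([cases.rename("cases"),
                  temp.shift(7).rename("temp_lag7"),
                  rain.shift(12).rename("rain_lag12")], axis=1).dropna()

res = SARIMAX(data["cases"], exog=data[["temp_lag7", "rain_lag12"]],
              order=(1, 0, 1), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
print(res.params[["temp_lag7", "rain_lag12"]])   # recovered lagged effects
```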

  19. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    Broek, P.L.C. van den; Egmond, J. van; Rijn, C.M. van; Takens, F.; Coenen, A.M.L.; Booij, L.H.D.J.

    2005-01-01

    This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)-derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online calculation of

  20. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    van den Broek, PLC; van Egmond, J; van Rijn, CM; Takens, F; Coenen, AML; Booij, LHDJ

    2005-01-01

    Background: This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online