Periodic Time Series Models
Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)
2004-01-01
This book considers periodic time series models for seasonal data, characterized by parameters that differ across the seasons, and focuses on their usefulness for out-of-sample forecasting. Providing an up-to-date survey of the recent developments in periodic time series, the book …
Introduction to Time Series Modeling
Kitagawa, Genshiro
2010-01-01
In time series modeling, the behavior of a certain phenomenon is expressed in relation to its own past values and those of other covariates. Since many important phenomena in statistical analysis are actually time series, and the identification of the conditional distribution of the phenomenon is an essential part of statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f…
Models for dependent time series
Tunnicliffe Wilson, Granville; Haywood, John
2015-01-01
Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data. The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater…
Lag space estimation in time series modelling
DEFF Research Database (Denmark)
Goutte, Cyril
1997-01-01
The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...
A Simple Fuzzy Time Series Forecasting Model
DEFF Research Database (Denmark)
Ortiz-Arroyo, Daniel
2016-01-01
In this paper we describe a new first order fuzzy time series forecasting model. We show that our automatic fuzzy partitioning method provides an accurate approximation to the time series that, when combined with rule forecasting and an OWA operator, improves forecasting accuracy. Our model does not attempt to provide the best results in comparison with other forecasting methods, but to show how to improve first order models using simple techniques. However, we show that our first order model is still capable of outperforming some more complex higher order fuzzy time series models.
Time series modeling, computation, and inference
Prado, Raquel
2010-01-01
The authors systematically develop a state-of-the-art analysis and modeling of time series. "… this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book." (Hsun-Hsien Chang, Computing Reviews, March 2012) "My favorite chapters were on dynamic linear models and vector AR and vector ARMA models." (William Seaver, Technometrics, August 2011) "… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit…"
Nonlinear time series modelling: an introduction
Simon M. Potter
1999-01-01
Recent developments in nonlinear time series modelling are reviewed. Three main types of nonlinear models are discussed: Markov Switching, Threshold Autoregression and Smooth Transition Autoregression. Classical and Bayesian estimation techniques are described for each model. Parametric tests for nonlinearity are reviewed with examples from the three types of models. Finally, forecasting and impulse response analysis are developed.
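The threshold autoregression mentioned in this survey can be illustrated with a minimal sketch (the function name and parameter values here are ours, chosen for illustration, not taken from the paper): a two-regime SETAR process switches its autoregressive coefficient according to whether the previous value is below or above a threshold.

```python
import random

def simulate_setar(n, phi_low=0.9, phi_high=-0.5, threshold=0.0, sigma=0.5, seed=42):
    """Simulate a two-regime threshold autoregression (SETAR):
    y_t = phi_low  * y_{t-1} + e_t  if y_{t-1} <= threshold,
    y_t = phi_high * y_{t-1} + e_t  otherwise,
    with e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        phi = phi_low if y[-1] <= threshold else phi_high
        y.append(phi * y[-1] + rng.gauss(0.0, sigma))
    return y

series = simulate_setar(500)
```

Because both regime coefficients are below one in absolute value, the simulated path stays bounded; estimation and testing for such models are what the classical and Bayesian techniques in the survey address.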
Feature Matching in Time Series Modelling
Xia, Yingcun
2011-01-01
Using a time series model to mimic an observed time series has a long history. However, with regard to this objective, conventional estimation methods for discrete-time dynamical models are frequently found to be wanting. In the absence of a true model, we prefer an alternative approach to conventional model fitting that typically involves one-step-ahead prediction errors. Our primary aim is to match the joint probability distribution of the observable time series, including long-term features of the dynamics that underpin the data, such as cycles, long memory and others, rather than short-term prediction. For want of a better name, we call this specific aim feature matching. The challenges of model mis-specification, measurement errors and the scarcity of data are forever present in real time series modelling. In this paper, by synthesizing earlier attempts into an extended-likelihood, we develop a systematic approach to empirical time series analysis to address these challenges and to aim at achieving …
Building Chaotic Model From Incomplete Time Series
Siek, Michael; Solomatine, Dimitri
2010-05-01
This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from an observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail for various reasons. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) to estimate the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories; however, local model predictions can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. To measure the performance of the proposed methods, a synthetic time series with missing values, generated by applying a particular random variable to the original (complete) time series, is utilized. Two main performance measures are used in this work: (1) error measures between the actual …
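The time-delayed phase space reconstruction at the heart of this approach can be sketched in a few lines (a generic Takens-style delay embedding; the function name and parameters are illustrative, not the authors' implementation):

```python
def delay_embed(series, dim, tau):
    """Reconstruct a time-delayed phase space: each reconstructed point is
    (x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau}), so the scalar series becomes
    a cloud of dim-dimensional points whose neighbors drive local prediction."""
    start = (dim - 1) * tau  # first index with a full set of lagged values
    return [[series[t - k * tau] for k in range(dim)]
            for t in range(start, len(series))]

points = delay_embed([float(i) for i in range(10)], dim=3, tau=2)
```

With missing values, the non-imputing variant described above would simply skip reconstructed points that touch a gap, leaving non-continuous trajectories but still usable dynamical neighbors.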
Estimating High-Dimensional Time Series Models
DEFF Research Database (Denmark)
Medeiros, Marcelo C.; Mendes, Eduardo F.
We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume that both the number of covariates in the model and the number of candidate variables can increase with the number of observations, and that the number of candidate variables is possibly …
Forecasting with periodic autoregressive time series models
Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)
1999-01-01
This paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption …
Modeling noisy time series Physiological tremor
Timmer, J
1998-01-01
Empirical time series often contain observational noise. We investigate the effect of this noise on the estimated parameters of models fitted to the data. For data of physiological tremor, i.e. a small amplitude oscillation of the outstretched hand of healthy subjects, we compare the results for a linear model that explicitly includes additional observational noise to one that ignores this noise. We discuss problems and possible solutions for nonlinear deterministic as well as nonlinear stochastic processes. In particular, we discuss the state space model, applicable to modeling noisy stochastic systems, and Bock's algorithm, capable of modeling noisy deterministic systems.
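The effect of ignoring observational noise can be illustrated with a textbook special case (our sketch, not the paper's model): for an AR(1) signal observed with additive white noise, the naive coefficient estimate based on the lag-1 autocorrelation of the observations is attenuated toward zero in large samples.

```python
def naive_ar1_limit(phi, var_process_noise, var_obs_noise):
    """Large-sample limit of the naive AR(1) coefficient estimate when the
    signal x_t = phi * x_{t-1} + e_t is observed as y_t = x_t + noise.
    The signal's stationary variance is var(e) / (1 - phi^2); adding
    observational noise shrinks the lag-1 autocorrelation of y."""
    var_signal = var_process_noise / (1.0 - phi ** 2)
    return phi * var_signal / (var_signal + var_obs_noise)

clean = naive_ar1_limit(0.8, 1.0, 0.0)  # no observational noise: no bias
noisy = naive_ar1_limit(0.8, 1.0, 1.0)  # noise pulls the estimate toward zero
```

A model that explicitly includes the observational noise term, as the paper advocates, avoids this systematic bias.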
Outlier Detection in Structural Time Series Models
DEFF Research Database (Denmark)
Marczak, Martyna; Proietti, Tommaso
Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. The general … We investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality and a stationary component. Further, we apply both kinds of indicator saturation to detect additive outliers and level shifts in the industrial production series in five European countries.
Time Series Modelling using Proc Varmax
DEFF Research Database (Denmark)
Milhøj, Anders
2007-01-01
In this paper it will be demonstrated how various time series problems can be met using Proc Varmax. The procedure is rather new, and hence new features like cointegration and testing for Granger causality are included; but it also means that more traditional ARIMA modelling as outlined by Box & Jenkins is performed in a more modern way, using the computer resources which are now available.
Forecasting with nonlinear time series models
DEFF Research Database (Denmark)
Kock, Anders Bredahl; Teräsvirta, Timo
… and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic…
Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models
Koopman, Siem Jan; Ooms, Marius
2004-01-01
We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the stand…
Time series modeling for automatic target recognition
Sokolnikov, Andre
2012-05-01
Time series modeling is proposed for identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke and other factors obscuring and distorting the image. The complex of library data (images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while partial image features, distorted parts, irrelevant pieces and the absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that helps the prediction process for image creation or reconstruction. Results are provided.
Gil-Alana, L.A.; Moreno, A; Pérez-de-Gracia, F. (Fernando)
2011-01-01
The last 20 years have witnessed a considerable increase in the use of time series techniques in econometrics. The articles in this important set have been chosen to illustrate the main themes in time series work as it relates to econometrics. The editor has written a new concise introduction to accompany the articles. Sections covered include: Ad Hoc Forecasting Procedures, ARIMA Modelling, Structural Time Series Models, Unit Roots, Detrending and Non-stationarity, Seasonality, Seasonal Adju...
Modeling Time Series Data for Supervised Learning
Baydogan, Mustafa Gokce
2012-01-01
Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…
Time series modeling for syndromic surveillance
Directory of Open Access Journals (Sweden)
Mandl Kenneth D
2003-01-01
Abstract Background Emergency department (ED)-based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool …
Time series models of symptoms in schizophrenia.
Tschacher, Wolfgang; Kupper, Zeno
2002-12-15
The symptom courses of 84 schizophrenia patients (mean age: 24.4 years; mean previous admissions: 1.3; 64% males) of a community-based acute ward were examined to identify dynamic patterns of symptoms and to investigate the relation between these patterns and treatment outcome. The symptoms were monitored by systematic daily staff ratings using a scale composed of three factors: psychoticity, excitement, and withdrawal. Patients showed moderate to high symptomatic improvement documented by effect size measures. Each of the 84 symptom trajectories was analyzed by time series methods using vector autoregression (VAR) that models the day-to-day interrelations between symptom factors. Multiple and stepwise regression analyses were then performed on the basis of the VAR models. Two VAR parameters were found to be associated significantly with favorable outcome in this exploratory study: 'withdrawal preceding a reduction of psychoticity' as well as 'excitement preceding an increase of withdrawal'. The findings were interpreted as generating hypotheses about how patients cope with psychotic episodes.
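A VAR model of the kind used in this study relates each variable to the previous day's values of all variables. A minimal one-step forecast sketch (the three-variable coefficient matrix below is made up for illustration, not estimated from the study's symptom data):

```python
def var1_predict(A, y_prev):
    """One-step forecast from a VAR(1) model y_t = A @ y_{t-1} + e_t,
    written with plain lists (A is a list of rows). Off-diagonal entries
    of A encode how one variable's past value feeds into another's present."""
    return [sum(a * y for a, y in zip(row, y_prev)) for row in A]

# hypothetical day-to-day interrelations among three symptom factors
A = [[0.5, 0.1, 0.0],
     [0.0, 0.4, 0.2],
     [0.1, 0.0, 0.3]]
forecast = var1_predict(A, [1.0, 2.0, 3.0])
```

In the paper's terms, a significant off-diagonal coefficient is what statements like "withdrawal preceding a reduction of psychoticity" correspond to.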
Ruin Probability in Linear Time Series Model
Institute of Scientific and Technical Information of China (English)
ZHANG Lihong
2005-01-01
This paper analyzes a continuous time risk model in which a linear time series model is used for the claim process. Time is discretized stochastically, using the times when claims occur; Doob's stopping time theorem and martingale inequalities then yield expressions for the ruin probability, as well as both exponential and non-exponential upper bounds on the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.
Genetic programming-based chaotic time series modeling
Institute of Scientific and Technical Information of China (English)
张伟; 吴智铭; 杨根科
2004-01-01
This paper proposes a Genetic Programming-Based Modeling (GPM) algorithm on chaotic time series. GP is used here to search for appropriate model structures in function space, and the Particle Swarm Optimization (PSO) algorithm is used for Nonlinear Parameter Estimation (NPE) of dynamic model structures. In addition, GPM integrates the results of Nonlinear Time Series Analysis (NTSA) to adjust the parameters and takes them as the criteria of established models. Experiments showed the effectiveness of such improvements on chaotic time series modeling.
TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION
Directory of Open Access Journals (Sweden)
Goran Klepac
2007-12-01
The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is the linkage of different methods for time series analysis: linking traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, i.e. it does not have a finite set of methods. At its core, it is a model for transformation of the values of a time series, which prepares the data used by different sets of methods based on the same model of transformation in a domain of problem space. The REFII model gives a new approach to time series analysis based on a unique model of transformation, which serves as a base for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas, such as finance, medicine, voice recognition, face recognition and text mining.
Fisher Information Framework for Time Series Modeling
Venkatesan, R C
2016-01-01
A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishing of a constraint driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi…
Hidden Markov Models for Time Series An Introduction Using R
Zucchini, Walter
2009-01-01
Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.
Trend time-series modeling and forecasting with neural networks.
Qi, Min; Zhang, G Peter
2008-05-01
Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
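The differencing strategy that this paper finds most robust can be sketched in a few lines (a generic implementation, not the authors' code): first differencing turns a linear deterministic trend into a constant series, which is what makes the transformed data easier for an NN to learn regardless of the DGP.

```python
def difference(series, d=1):
    """Apply d-th order differencing: each pass replaces the series by
    its successive changes, y_t - y_{t-1}, shortening it by one element."""
    for _ in range(d):
        series = [b - a for a, b in zip(series, series[1:])]
    return series

# a linear trend y_t = 2t + 1 becomes the constant series 2, 2, ...
trend = [2.0 * t + 1.0 for t in range(10)]
diffed = difference(trend)
```

A stochastic trend (random walk) is likewise reduced to a stationary series by the same transformation, which is consistent with the paper's finding that differencing works across DGPs.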
Modelling road accidents: An approach using structural time series
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models
Price, Larry R.
2012-01-01
The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…
Time series modelling of overflow structures
DEFF Research Database (Denmark)
Carstensen, J.; Harremoës, P.
1997-01-01
The dynamics of a storage pipe is examined using a grey-box model based on on-line measured data. The grey-box modelling approach uses a combination of physically-based and empirical terms in the model formulation. The model provides an on-line state estimate of the overflows, pumping capacities … to the overflow structures. The capacity of a pump draining the storage pipe has been estimated for two rain events, revealing that the pump was malfunctioning during the first rain event. The grey-box modelling approach is applicable for automated on-line surveillance and control. (C) 1997 IAWQ.
Modeling Persistence In Hydrological Time Series Using Fractional Differencing
Hosking, J. R. M.
1984-12-01
The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
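The fractional difference operator (1 - B)^d can be expanded into a filter with infinitely many weights; a short sketch of the standard binomial-expansion recursion for those weights (our illustration, not Hosking's code):

```python
def fracdiff_weights(d, n):
    """First n coefficients of (1 - B)^d expanded as sum_k w_k B^k,
    via the recursion w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k.
    For fractional d the weights decay slowly, which is what lets the
    model capture long-term persistence."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

w = fracdiff_weights(0.4, 5)       # fractional case: slowly decaying weights
w_int = fracdiff_weights(1.0, 4)   # integer d = 1 recovers ordinary differencing
```

With integer d the recursion truncates to the usual binomial coefficients (1, -1, 0, 0, ... for d = 1), so ARIMA differencing is the special case.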
A multivariate approach to modeling univariate seasonal time series
Ph.H.B.F. Franses (Philip Hans)
1994-01-01
A seasonal time series can be represented by a vector autoregressive model for the annual series containing the seasonal observations. This model allows for periodically varying coefficients. When the vector elements are integrated, the maximum likelihood cointegration method can be used …
Lagrangian Time Series Models for Ocean Surface Drifter Trajectories
Sykulski, Adam M; Lilly, Jonathan M; Danioux, Eric
2016-01-01
This paper proposes stochastic models for the analysis of ocean surface trajectories obtained from freely-drifting satellite-tracked instruments. The proposed time series models are used to summarise large multivariate datasets and infer important physical parameters of inertial oscillations and other ocean processes. Nonstationary time series methods are employed to account for the spatiotemporal variability of each trajectory. Because the datasets are large, we construct computationally efficient methods through the use of frequency-domain modelling and estimation, with the data expressed as complex-valued time series. We detail how practical issues related to sampling and model misspecification may be addressed using semi-parametric techniques for time series, and we demonstrate the effectiveness of our stochastic models through application to both real-world data and to numerical model output.
Structural Equation Modeling of Multivariate Time Series
du Toit, Stephen H. C.; Browne, Michael W.
2007-01-01
The covariance structure of a vector autoregressive process with moving average residuals (VARMA) is derived. It differs from other available expressions for the covariance function of a stationary VARMA process and is compatible with current structural equation methodology. Structural equation modeling programs, such as LISREL, may therefore be…
The use of synthetic input sequences in time series modeling
de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.
2008-08-01
In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.
General expression for linear and nonlinear time series models
Institute of Scientific and Technical Information of China (English)
Ren HUANG; Feiyun XU; Ruwen CHEN
2009-01-01
Typical time series models such as ARMA, AR, and MA are founded on the normality and stationarity of a system and are expressed by a linear difference equation; therefore, they are strictly limited to linear systems. However, nonlinear factors are present in practical systems, so it is difficult to fit real systems with the above models. This paper proposes a general expression for linear and nonlinear auto-regressive time series models (GNAR). With a gradient optimization method and a modified AIC information criterion integrated with the prediction error, parameter estimation and order determination are achieved. Model simulations and experiments show that the GNAR model can accurately approximate the dynamic characteristics of most nonlinear models applied in academics and engineering. The modeling and prediction accuracy of the GNAR model is superior to the classical time series models. The proposed GNAR model is flexible and effective.
With string model to time series forecasting
Pinčák, Richard; Bartoš, Erik
2015-10-01
The overwhelming majority of econometric models applied on a long term basis in the financial forex market do not work sufficiently well. The reason is that transaction costs and arbitrage opportunities are not included, so the real financial markets are not simulated. Analyses are also not conducted on non-equidistant data but rather on aggregated data, which again does not reflect a real financial case. In this paper, we would like to show a new way to analyze and, moreover, forecast the financial market. We utilize the projections of the real exchange rate dynamics onto the string-like topology in the OANDA market. The latter approach allows us to build stable prediction models for trading in the financial forex market. A real application of the multi-string structures is provided to demonstrate our ideas for the solution of the problem of robust portfolio selection. A comparison with trend following strategies was performed, and the stability of the algorithm with respect to transaction costs for long trade periods was confirmed.
Parameterizing unconditional skewness in models for financial time series
DEFF Research Database (Denmark)
He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo
In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...
Stochastic modeling of hourly rainfall times series in Campania (Italy)
Giorgio, M.; Greco, R.
2009-04-01
Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models give the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, as point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this purpose are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
Combined forecasts from linear and nonlinear time series models
N. Terui (Nobuhiko); H.K. van Dijk (Herman)
1999-01-01
textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally (non)line
Statistical modelling of agrometeorological time series by exponential smoothing
Murat, Małgorzata; Malinowska, Iwona; Hoffmann, Holger; Baranowski, Piotr
2016-01-01
Meteorological time series are used in modelling agrophysical processes of the soil-plant-atmosphere system which determine plant growth and yield. Additionally, long-term meteorological series are used in climate change scenarios. Such studies often require forecasting or projection of meteorological variables, e.g. the projection of the occurrence of extreme events. The aim of the article was to determine the most suitable exponential smoothing models to generate forecasts using data on air temperature, wind speed, and precipitation time series in Jokioinen (Finland), Dikopshof (Germany), Lleida (Spain), and Lublin (Poland). These series exhibit regular additive seasonality or non-seasonality without any trend, which is confirmed by their autocorrelation functions and partial autocorrelation functions. The most suitable models were indicated by the smallest mean absolute error and the smallest root mean squared error.
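The model-selection criterion above can be illustrated with the simplest member of the exponential smoothing family. This sketch compares a few hypothetical smoothing constants by MAE on made-up temperature readings; it is not the paper's full model search:

```python
import math

def ses(series, alpha):
    """Simple exponential smoothing; returns in-sample one-step forecasts and the next forecast."""
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)              # forecast of this observation, made before seeing it
        level = alpha * y + (1 - alpha) * level
    return forecasts, level

def mae(actual, pred):
    return sum(abs(a - p) for a, p in zip(actual, pred)) / len(pred)

def rmse(actual, pred):
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(pred))

temps = [3.1, 4.0, 3.6, 5.2, 4.8, 5.5, 6.1, 5.9]   # made-up temperature readings
# Pick the smoothing constant with the smallest one-step MAE
best = min((mae(temps[1:], ses(temps, a)[0]), a) for a in (0.2, 0.5, 0.8))
print(best)
```

In practice one would also compare trend and seasonal variants (Holt, Holt-Winters) under the same error metrics.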
Time-varying parameter auto-regressive models for autocovariance nonstationary time series
Institute of Scientific and Technical Information of China (English)
FEI WanChun; BAI Lun
2009-01-01
In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full-order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TVPAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.
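The idea of time-varying AR parameters can be sketched crudely by re-estimating an AR(1) coefficient in sliding windows. This is an illustration only, not the minimum-AIC TVPAR estimation of the paper; the drifting coefficient path is invented:

```python
import numpy as np

def sliding_ar1(x, window):
    """Estimate an AR(1) coefficient separately in each sliding window of the series."""
    coefs = []
    for start in range(len(x) - window):
        seg = x[start:start + window]
        num = np.dot(seg[1:], seg[:-1])      # sum x_t * x_{t-1}
        den = np.dot(seg[:-1], seg[:-1])     # sum x_{t-1}^2
        coefs.append(num / den)
    return np.array(coefs)

rng = np.random.default_rng(1)
n = 600
phi = np.linspace(0.2, 0.8, n)               # AR coefficient drifts over time
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()
est = sliding_ar1(x, window=100)
print(est[0], est[-1])
```

The windowed estimates track the drift, showing why a fixed-parameter AR model is inadequate for autocovariance nonstationary series.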
A multivariate heuristic model for fuzzy time-series forecasting.
Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei
2007-08-01
Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.
Modelling, simulation and inference for multivariate time series of counts
Veraart, Almut E. D.
2016-01-01
This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise - called a trawl process - where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...
Sparse time series chain graphical models for reconstructing genetic networks
Abegaz, Fentaw; Wit, Ernst
2013-01-01
We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of co
Optimization of recurrent neural networks for time series modeling
DEFF Research Database (Denmark)
Pedersen, Morten With
1997-01-01
The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular is considered fully recurrent networks working from only a single external input, one layer of nonlinear hidden units and a li near output unit applied to prediction of discrete time...
Models for Pooled Time-Series Cross-Section Data
Directory of Open Access Journals (Sweden)
Lawrence E Raffalovich
2015-07-01
Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as "repeated observations on fixed units" (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed-effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
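One of the model classes named above, the fixed-effects (within) estimator, can be sketched on synthetic data. The data-generating process and all names below are invented for illustration and have nothing to do with the homicide data set:

```python
import numpy as np

def within_estimator(y, x, groups):
    """Fixed-effects (within) estimator: demean y and x inside each unit, then OLS."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    yd, xd = y.copy(), x.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()                 # demeaning removes the unit fixed effect
        xd[m] -= x[m].mean()
    return np.dot(xd, yd) / np.dot(xd, xd)

rng = np.random.default_rng(2)
units, T, beta = 10, 20, 1.5
groups = np.repeat(np.arange(units), T)
alpha = np.repeat(rng.normal(0, 5, units), T)    # unit fixed effects
x = rng.normal(size=units * T) + alpha / 5       # regressor correlated with the effects
y = alpha + beta * x + rng.normal(size=units * T)
print(within_estimator(y, x, groups))
```

Because the regressor is correlated with the unit effects, a completely pooled OLS fit would be biased here, while the within transformation recovers the slope.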
DEFF Research Database (Denmark)
Sørup, Hjalte Jomo Danielsen; Madsen, Henrik; Arnbjerg-Nielsen, Karsten
2011-01-01
A very fine temporal and volumetric resolution precipitation time series is modeled using Markov models. Both 1st and 2nd order Markov models as well as seasonal and diurnal models are investigated and evaluated using likelihood based techniques. The 2nd order Markov model is found to be insignif...
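A 1st-order Markov occurrence model of the kind investigated above can be sketched with a two-state (wet/dry) chain. The transition probabilities below are invented, not those of the precipitation series:

```python
import random

def simulate_markov(p_wd, p_ww, n, seed=3):
    """Simulate a wet(1)/dry(0) sequence from a 1st-order Markov chain.
    p_wd: P(wet | dry), p_ww: P(wet | wet)."""
    rng = random.Random(seed)
    state, seq = 0, []
    for _ in range(n):
        p = p_ww if state == 1 else p_wd
        state = 1 if rng.random() < p else 0
        seq.append(state)
    return seq

def estimate_transitions(seq):
    """Maximum-likelihood transition probabilities from a 0/1 sequence."""
    counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    for a, b in zip(seq, seq[1:]):
        counts[(a, b)] += 1
    p_wd = counts[(0, 1)] / max(counts[(0, 0)] + counts[(0, 1)], 1)
    p_ww = counts[(1, 1)] / max(counts[(1, 0)] + counts[(1, 1)], 1)
    return p_wd, p_ww

seq = simulate_markov(0.1, 0.6, 50000)
print(estimate_transitions(seq))
```

A 2nd-order chain would condition on the two previous states; likelihood-based comparison of the two orders follows the same counting logic.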
A refined fuzzy time series model for stock market forecasting
Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil
2008-05-01
Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We have proposed a fuzzy metric to use the frequency-density-based partitioning. The proposed fuzzy metric also uses a trend predictor to calculate the forecast. The new method is applied to forecasting the TAIEX and enrollments at the University of Alabama. It is shown that the proposed method works with higher accuracy compared to other fuzzy time series methods developed for forecasting the TAIEX and University of Alabama enrollments.
Kālī: Time series data modeler
Kasliwal, Vishal P.
2016-07-01
The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes, such as continuous-time ARMA (C-ARMA) processes, and uses Bayesian Markov Chain Monte Carlo (MCMC) to perform inference on a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power, and also as an acronym for KArma LIbrary.
Analyzing the Dynamics of Nonlinear Multivariate Time Series Models
Institute of Scientific and Technical Information of China (English)
Denghua Zhong; Zhengfeng Zhang; Donghai Liu; Stefan Mittnik
2004-01-01
This paper analyzes the dynamics of nonlinear multivariate time series models as represented by generalized impulse response functions and asymmetry functions. We illustrate measures of shock persistence and asymmetric effects of shocks derived from the generalized impulse response and asymmetry functions in bivariate smooth transition regression models. The empirical work investigates a bivariate smooth transition model of US GDP and the unemployment rate.
Recursive Bayesian recurrent neural networks for time-series modeling.
Mirikitani, Derrick T; Nikolaev, Nikolay
2010-02-01
This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
Deriving dynamic marketing effectiveness from econometric time series models
C. Horváth (Csilla); Ph.H.B.F. Franses (Philip Hans)
2003-01-01
textabstractTo understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the
Model and Variable Selection Procedures for Semiparametric Time Series Regression
Directory of Open Access Journals (Sweden)
Risa Kato
2009-01-01
Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.
Quality Quandaries- Time Series Model Selection and Parsimony
DEFF Research Database (Denmark)
Bisgaard, Søren; Kulahci, Murat
2009-01-01
Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important...... consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy....
Neural network versus classical time series forecasting models
Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam
2017-05-01
Artificial neural networks (ANN) have an advantage in time series forecasting, as they have the potential to solve complex forecasting problems. This is because the ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. Forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, the Consumer Price Index, crude oil price and foreign exchange rate are concluded to Granger-cause the Philippine Stock Exchange Composite Index.
Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective
Chen, Shyi-Ming
2013-01-01
Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...
Time series regression model for infectious disease and weather.
Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro
2015-10-01
Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Forecasting the Reference Evapotranspiration Using Time Series Model
Directory of Open Access Journals (Sweden)
H. Zare Abyaneh
2016-10-01
Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study, the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations, and the evapotranspiration time series were formed. The unit root test was used to identify whether each time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
Table 1. Geographical location and climate conditions of the synoptic stations (climate according to the De Martonne index classification)
Station   Longitude (E)  Latitude (N)  Altitude (m)  Mean air temp. (°C), annual  Mean air temp. (°C), min.-max.  Mean precipitation (mm)  Climate
Esfahan   51°40'         32°37'        1550.4        16.36                        9.4-23.3                        122                      Arid
Semnan    53°33'         35°35'        1130.8        18.0                         12.4-23.8                       140                      Arid
Shiraz    52°36'         29°32'        1484          18.0                         10.2-25.9                       324                      Semi-arid
Kerman    56°58'         30°15'        1753.8        15.6                         6.7-24.6                        142                      Arid
Yazd      54°17'         31°54'        1237.2        19.2                         11.8-26.0                       61                       Arid
Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
Time series ARIMA models for daily price of palm oil
Ariff, Noratiqah Mohd; Zamhawari, Nor Hashimah; Bakar, Mohd Aftar Abu
2015-02-01
Palm oil is deemed one of the most important commodities forming the economic backbone of Malaysia. Modeling and forecasting the daily price of palm oil is of great interest for Malaysia's economic growth. In this study, time series ARIMA models are used to fit the daily price of palm oil. The Akaike Information Criterion (AIC), the Akaike Information Criterion with a correction for finite sample sizes (AICc), and the Bayesian Information Criterion (BIC) are used to compare the different ARIMA models being considered. It is found that the ARIMA(1,2,1) model is suitable for the daily price of crude palm oil in Malaysia for the years 2010 to 2012.
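The three criteria can be computed directly under a Gaussian likelihood. The residual sums of squares below are hypothetical, purely to show how AIC, AICc and BIC trade goodness of fit against the parameter count k:

```python
import math

def criteria(n, k, rss):
    """AIC, AICc, BIC for a Gaussian model with k parameters and residual sum of squares rss."""
    ll_term = n * math.log(rss / n)             # -2 log-likelihood, up to an additive constant
    aic = ll_term + 2 * k
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)  # small-sample correction
    bic = ll_term + k * math.log(n)
    return aic, aicc, bic

# Hypothetical fits of three candidate ARIMA models to n = 100 observations
n = 100
fits = {"ARIMA(1,2,1)": (3, 48.0), "ARIMA(2,2,2)": (5, 46.5), "ARIMA(0,2,1)": (2, 55.0)}
for name, (k, rss) in fits.items():
    print(name, ["%.1f" % v for v in criteria(n, k, rss)])
```

With these invented numbers the middle-sized model wins on every criterion; BIC's log(n) penalty punishes the five-parameter model harder than AIC does, which is the usual pattern.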
TIME SERIES FORECASTING WITH MULTIPLE CANDIDATE MODELS: SELECTING OR COMBINING?
Institute of Scientific and Technical Information of China (English)
YU Lean; WANG Shouyang; K. K. Lai; Y.Nakamori
2005-01-01
Various mathematical models have been commonly used in time series analysis and forecasting. In these processes, academic researchers and business practitioners often come up against two important problems. One is whether to select an appropriate modeling approach for prediction purposes or to combine different individual approaches into a single forecast, when the modeling approaches are different or dissimilar. The other is whether to select the best candidate model for forecasting or to mix the various candidate models with different parameters into a new forecast, when the modeling approaches are the same or similar. In this study, we propose a set of computational procedures to solve these two issues via two judgmental criteria. Meanwhile, in view of the problems presented in the literature, a novel modeling technique is also proposed to overcome the drawbacks of existing combined forecasting methods. To verify the efficiency and reliability of the proposed procedure and modeling technique, simulations and real data examples are conducted in this study. The results reveal that the proposed procedure and modeling technique can be used as a feasible solution for time series forecasting with multiple candidate models.
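One common combining scheme (a textbook baseline, not necessarily the paper's novel technique) weights each candidate inversely to its past mean squared error. The forecasts and error histories below are invented:

```python
def combine_inverse_mse(errors_a, errors_b, fa, fb):
    """Combine two forecasts with weights inversely proportional to past MSE."""
    mse_a = sum(e * e for e in errors_a) / len(errors_a)
    mse_b = sum(e * e for e in errors_b) / len(errors_b)
    w_a = (1 / mse_a) / (1 / mse_a + 1 / mse_b)
    return w_a * fa + (1 - w_a) * fb

# Hypothetical past one-step errors of a linear and a nonlinear model
lin_err = [0.5, -0.8, 0.6, -0.4]
nl_err = [0.2, 0.3, -0.1, 0.25]
print(combine_inverse_mse(lin_err, nl_err, fa=10.0, fb=10.8))
```

The combined forecast lies between the two candidates, pulled toward the historically more accurate nonlinear model.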
Modeling Large Time Series for Efficient Approximate Query Processing
DEFF Research Database (Denmark)
Perera, Kasun S; Hahmann, Martin; Lehner, Wolfgang
2015-01-01
Evolving customer requirements and increasing competition force business organizations to store increasing amounts of data and query them for information at any given time. Due to the current growth of data volumes, timely extraction of relevant information becomes more and more difficult...... these issues, compression techniques have been introduced in many areas of data processing. In this paper, we outline a new system that does not query complete datasets but instead utilizes models to extract the requested information. For time series data we use Fourier and Cosine transformations and piece...
A Comparative Study of Portmanteau Tests for Univariate Time Series Models
Directory of Open Access Journals (Sweden)
Sohail Chand
2006-07-01
Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
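A standard portmanteau diagnostic of the kind compared above is the Ljung-Box Q statistic. A minimal implementation, applied to simulated white noise (for adequate residuals, Q is roughly chi-square distributed with m degrees of freedom):

```python
import math
import random

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x)
    ck = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    return ck / c0

def ljung_box(residuals, m):
    """Ljung-Box Q statistic over the first m residual autocorrelations."""
    n = len(residuals)
    return n * (n + 2) * sum(acf(residuals, k) ** 2 / (n - k) for k in range(1, m + 1))

rng = random.Random(4)
white = [rng.gauss(0, 1) for _ in range(300)]
print(ljung_box(white, 10))
```

A large Q relative to the chi-square reference rejects the hypothesis that the residuals are uncorrelated, flagging an inadequate model.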
Unsupervised Classification During Time-Series Model Building.
Gates, Kathleen M; Lane, Stephanie T; Varangis, E; Giovanello, K; Guiskewicz, K
2017-01-01
Researchers who collect multivariate time-series data across individuals must decide whether to model the dynamic processes at the individual level or at the group level. A recent innovation, group iterative multiple model estimation (GIMME), offers one solution to this dichotomy by identifying group-level time-series models in a data-driven manner while also reliably recovering individual-level patterns of dynamic effects. GIMME is unique in that it does not assume homogeneity in processes across individuals in terms of the patterns or weights of temporal effects. However, it can be difficult to make inferences from the nuances in varied individual-level patterns. The present article introduces an algorithm that arrives at subgroups of individuals that have similar dynamic models. Importantly, the researcher does not need to decide the number of subgroups. The final models contain reliable group-, subgroup-, and individual-level patterns that enable generalizable inferences, subgroups of individuals with shared model features, and individual-level patterns and estimates. We show that integrating community detection into the GIMME algorithm improves upon current standards in two important ways: (1) providing reliable classification and (2) increasing the reliability in the recovery of individual-level effects. We demonstrate this method on functional MRI from a sample of former American football players.
Disease management with ARIMA model in time series.
Sato, Renato Cesar
2013-01-01
The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
A Simple Pile-up Model for Time Series Analysis
Sevilla, Diego J. R.
2017-07-01
In this paper, a simple pile-up model is presented. This model calculates the probability P(n|N) of having n counts if N particles collide with a sensor during an exposure time. Through some approximations, an analytic expression depending on only one parameter is obtained. This parameter characterizes the pile-up magnitude and depends on features of the instrument and the source. The statistical model obtained permits the determination of probability distributions of measured counts from the probability distributions of incoming particles, which is valuable for time series analysis. Applicability limits are discussed, and an example of the improvement that can be achieved in statistical analysis by considering the proposed pile-up model is shown by analyzing real data.
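The pile-up mechanism can be mimicked with a small Monte Carlo sketch (an assumption-laden stand-in for the paper's analytic model, using an invented dead-time rule): N arrivals are scattered uniformly over the exposure, and any arrival within a dead time of the previously registered count is absorbed into it:

```python
import random

def piled_counts(N, exposure, dead_time, rng):
    """Register counts from N uniform arrival times; arrivals closer than dead_time
    to the previous registered count pile up into that count."""
    times = sorted(rng.uniform(0, exposure) for _ in range(N))
    counts, last = 0, None
    for t in times:
        if last is None or t - last >= dead_time:
            counts += 1
            last = t
    return counts

rng = random.Random(5)
# 50 incident particles per exposure; registered counts fall below 50 due to pile-up
trials = [piled_counts(50, 1.0, 0.02, rng) for _ in range(2000)]
print(sum(trials) / len(trials))
```

Repeating this over many exposures yields the empirical distribution of measured counts given the incoming-particle distribution, which is the quantity the analytic P(n|N) expression provides directly.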
On the maximum-entropy/autoregressive modeling of time series
Chao, B. F.
1984-01-01
The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (the z domain), on the one hand, to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (the frequency domain), is nothing but a convenient, but ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
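The pole-to-frequency correspondence described above is easy to verify numerically: factor the AR polynomial and read off each pole's modulus (damping) and angle (angular frequency). The AR(2) coefficients below are chosen arbitrarily to produce a complex conjugate pole pair:

```python
import numpy as np

# AR(2): x_t = 1.5 x_{t-1} - 0.85 x_{t-2} + e_t
phi = [1.5, -0.85]
# Poles are the roots of the characteristic polynomial z^2 - phi1*z - phi2 = 0
poles = np.roots([1.0, -phi[0], -phi[1]])
for z in poles:
    radius, angle = abs(z), np.angle(z)
    # Prony view: a pole at r*e^{i*w} corresponds to a damped sinusoid r^t * cos(w*t + phase)
    print("modulus %.3f  angular frequency %.3f rad/sample" % (radius, angle))
```

A modulus near 1 gives a sharp, slowly damped spectral peak at the pole's angle; a modulus well inside the unit circle gives a broad peak, which is why peak shape reflects the complex frequency rather than the amplitude.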
Single-Index Additive Vector Autoregressive Time Series Models
LI, YEHUA
2009-09-01
We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.
Clustering Multivariate Time Series Using Hidden Markov Models
Directory of Open Access Journals (Sweden)
Shima Ghassempour
2014-03-01
In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs, and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab, and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, making it accessible to a wide range of researchers.
Modeling financial time series with S-plus
Zivot, Eric
2003-01-01
The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. It is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...
Hybrid perturbation methods based on statistical time series models
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
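The additive Holt-Winters predictor named above follows textbook recursions for level, trend and seasonal components. The sketch below is a generic version, not the authors' code; the first-two-seasons initialization is a simplifying assumption of ours.

```python
import numpy as np

def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.1, horizon=1):
    """Additive Holt-Winters smoothing with season length m.
    Returns the h-step-ahead forecasts from the end of the series."""
    y = np.asarray(y, dtype=float)
    # crude initialization from the first two seasons (needs len(y) >= 2m)
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = y[:m] - level
    for t in range(m, len(y)):
        last_level = level
        level = alpha * (y[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * season[t % m]
    n = len(y)
    return np.array([level + h * trend + season[(n + h - 1) % m]
                     for h in range(1, horizon + 1)])

# e.g. a trending series with a period-4 seasonal pattern
y = 0.5 * np.arange(48) + np.tile([2.0, -1.0, 3.0, -4.0], 12)
fc = holt_winters_additive(y, m=4, horizon=4)
```

In the hybrid propagator the series fed to this predictor would be the residual between the analytical approximation and the reference dynamics, so the forecast corrects the integrated orbit.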
Empirical intrinsic geometry for nonlinear modeling and time series filtering.
Talmon, Ronen; Coifman, Ronald R
2013-07-30
In this paper, we present a method for time series analysis based on empirical intrinsic geometry (EIG). EIG enables one to reveal the low-dimensional parametric manifold as well as to infer the underlying dynamics of high-dimensional time series. By incorporating concepts of information geometry, this method extends existing geometric analysis tools to support stochastic settings and parametrizes the geometry of empirical distributions. However, the statistical models are not required as priors; hence, EIG may be applied to a wide range of real signals without existing definitive models. We show that the inferred model is noise-resilient and invariant under different observation and instrumental modalities. In addition, we show that it can be extended efficiently to newly acquired measurements in a sequential manner. These two advantages enable us to revisit the Bayesian approach and incorporate empirical dynamics and intrinsic geometry into a nonlinear filtering framework. We show applications to nonlinear and non-Gaussian tracking problems as well as to acoustic signal localization.
Model of a synthetic wind speed time series generator
DEFF Research Database (Denmark)
Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.
2008-01-01
of possible wind conditions. If this information is not available, synthetic wind speed time series may be a useful tool as well, but their generator must preserve the statistical and stochastic features of the phenomenon. This paper deals with this issue: a generator for synthetic wind speed time series...
Crop Yield Forecasted Model Based on Time Series Techniques
Institute of Scientific and Technical Information of China (English)
Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming
2012-01-01
Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. This paper defines a new concept of crop yield under average climate conditions, which is affected by the advancement of science and technology. Based on this new concept, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using average grain yields of Liaoning Province in China from 1949 to 2005. The testing combined dynamic n-choosing and micro tendency rectification, and the average forecasting error was 1.24%. Where a turning point occurred in the trend line of yield change, an inflexion model was used to handle it.
2013-01-01
Time series analysis can be used to quantitatively monitor, describe, explain, and predict road safety developments. Time series analysis techniques offer the possibility of quantitatively modelling road safety developments in such a way that the dependencies between the observations of time series
Incorporating Satellite Time-Series Data into Modeling
Gregg, Watson
2008-01-01
In situ time series observations have provided a multi-decadal view of long-term changes in ocean biology. These observations are sufficiently reliable to enable discernment of even relatively small changes, and provide continuous information on a host of variables. Their key drawback is their limited domain. Satellite observations from ocean color sensors do not suffer this drawback, and simultaneously view the global oceans. This attribute lends credence to their use in global and regional model validation and data assimilation. We focus on these applications using the NASA Ocean Biogeochemical Model. The enhancement of the satellite data using data assimilation is featured, and the limitation of long-term satellite data sets is also discussed.
Forecasting inflation in Montenegro using univariate time series models
Directory of Open Access Journals (Sweden)
Milena Lipovina-Božović
2015-04-01
The analysis of price trends and their prognosis is one of the key tasks of the economic authorities in each country. Because Montenegro is a small, open economy that uses the euro as its currency, forecasting inflation there is quite specific, and it is made harder by the low quality of the data. This paper analyzes the utility and applicability of univariate time series models for forecasting the price index in Montenegro. Analysis of key macroeconomic movements in previous decades indicates the presence of many possible determinants that could influence the forecasting result. The paper concludes that forecasting models (ARIMA) based only on the series' own previous values cannot adequately capture the key factors that determine the future price level, probably because of the numerous external factors that influence price movements in Montenegro.
Time series modelling and forecasting of emergency department overcrowding.
Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian
2014-09-01
Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.
Modeling Glacier Elevation Change from DEM Time Series
Directory of Open Access Journals (Sweden)
Di Wang
2015-08-01
In this study, a methodology for glacier elevation reconstruction from Digital Elevation Model (DEM) time series (tDEM) is described for modeling the evolution of glacier elevation and estimating related volume change, with focus on medium-resolution and noisy satellite DEMs. The method is robust with respect to outliers in individual DEM products. Fox Glacier and Franz Josef Glacier in New Zealand are used as test cases based on 31 Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEMs and the Shuttle Radar Topography Mission (SRTM) DEM. We obtained a mean surface elevation lowering rate of −0.51 ± 0.02 m·a−1 and −0.09 ± 0.02 m·a−1 between 2000 and 2014 for Fox and Franz Josef Glacier, respectively. The specific volume difference between 2000 and 2014 was estimated as −0.77 ± 0.13 m·a−1 and −0.33 ± 0.06 m·a−1 by our tDEM method. The comparably moderate thinning rates are mainly due to volume gains after 2013 that compensate larger thinning rates earlier in the series. Terminus thickening prevailed between 2002 and 2007.
Prediction and interpolation of time series by state space models
Helske, Jouni
2015-01-01
A large amount of data collected today is in the form of a time series. In order to make realistic inferences based on time series forecasts, in addition to point predictions, prediction intervals or other measures of uncertainty should be presented. Multiple sources of uncertainty are often ignored due to the complexities involved in accounting them correctly. In this dissertation, some of these problems are reviewed and some new solutions are presented. A state space approach...
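State space models deliver both the point forecast and its variance, from which prediction intervals follow directly. A minimal Kalman filter for the local level model illustrates the idea; this is a textbook sketch, not the dissertation's code, and the initialization is a simplifying assumption.

```python
import numpy as np

def local_level_forecast(y, sigma_eps2, sigma_eta2, horizon=1):
    """Kalman filter for the local level model
        y_t = mu_t + eps_t,   mu_t = mu_{t-1} + eta_t,
    returning h-step forecasts and their variances, so a 95%%
    prediction interval is forecast +/- 1.96 * sqrt(variance)."""
    a, p = y[0], sigma_eps2          # start the state at the first observation
    for obs in y[1:]:
        p = p + sigma_eta2           # predict the state variance
        k = p / (p + sigma_eps2)     # Kalman gain
        a = a + k * (obs - a)        # update the state estimate
        p = (1 - k) * p              # update the state variance
    forecasts, variances = [], []
    for h in range(1, horizon + 1):
        forecasts.append(a)
        variances.append(p + h * sigma_eta2 + sigma_eps2)
    return np.array(forecasts), np.array(variances)
```

The widening of the variance with the horizon h is exactly the kind of uncertainty the dissertation argues should accompany point predictions.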
Predicting chaotic time series with a partial model.
Hamilton, Franz; Berry, Tyrus; Sauer, Timothy
2015-07-01
Methods for forecasting time series are a critical aspect of the understanding and control of complex networks. When the model of the network is unknown, nonparametric methods for prediction have been developed, based on concepts of attractor reconstruction pioneered by Takens and others. In this Rapid Communication we consider how to make use of a subset of the system equations, if they are known, to improve the predictive capability of forecasting methods. A counterintuitive implication of the results is that knowledge of the evolution equation of even one variable, if known, can improve forecasting of all variables. The method is illustrated on data from the Lorenz attractor and from a small network with chaotic dynamics.
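The attractor-reconstruction baseline that such partial-model methods improve upon can be sketched as a delay-embedding nearest-neighbour predictor. The snippet is a generic illustration of the Takens-style approach only; it omits the paper's partial-model correction step.

```python
import numpy as np

def embed(x, dim, tau):
    """Delay-coordinate embedding of a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=3, tau=1, k=3):
    """Predict the next value of x by averaging the successors of the
    k nearest neighbours of the current delay vector."""
    emb = embed(x, dim, tau)
    query = emb[-1]
    candidates = emb[:-1]                 # each candidate has a known successor
    dists = np.linalg.norm(candidates - query, axis=1)
    idx = np.argsort(dists)[:k]
    # successor of embedding row i is x[i + (dim - 1) * tau + 1]
    return x[idx + (dim - 1) * tau + 1].mean()
```

Applied to a scalar observable of the Lorenz system, this kind of predictor is the model-free starting point; the paper's contribution is to inject any known subset of the equations into the forecast.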
Forecasting electricity usage using univariate time series models
Hock-Eam, Lim; Chee-Yin, Yip
2014-12-01
Electricity is one of the important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to the changing of socio-economic characteristics, increasing competition and deregulation of electricity supply industry, the electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods. This will provide further insights on the weakness and strengths of each method. In literature, there are mixed evidences on the best forecasting methods of electricity demand. This paper aims to compare the predictive performance of univariate time series models for forecasting the electricity demand using a monthly data of maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance. On the other hand, Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
MODELLING GASOLINE DEMAND IN GHANA: A STRUCTURAL TIME SERIES ANALYSIS
Directory of Open Access Journals (Sweden)
Ishmael Ackah
2014-01-01
Concerns about the role of energy consumption in global warming have led to policy designs that seek to reduce fossil fuel consumption or find a less polluting alternative, especially for the transport sector. This study seeks to estimate the elasticities of price, income, education and technology on gasoline demand in the transport sector in Ghana. The Structural Time Series Model reports short-run price and income elasticities of -0.0088 and 0.713. Total factor productivity is -0.408, whilst the elasticity for education is 2.33. In the long run, the reported price and income elasticities are -0.065 and 5.129 respectively. The long-run elasticity for productivity is -2.935. The study recommends that, in order to enhance efficiency in gasoline consumption in the transport sector, there should be investment in productivity.
Optimal model-free prediction from multivariate time series.
Runge, Jakob; Donner, Reik V; Kurths, Jürgen
2015-05-01
Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.
DEFF Research Database (Denmark)
Johansen, Søren
An overview of results for the cointegrated VAR model for nonstationary I(1) variables is given. The emphasis is on the analysis of the model and the tools for asymptotic inference. These include: formulation of criteria on the parameters, for the process to be nonstationary and I(1), formulati...
DEFF Research Database (Denmark)
Johansen, Søren
2015-01-01
An overview of results for the cointegrated VAR model for nonstationary I(1) variables is given. The emphasis is on the analysis of the model and the tools for asymptotic inference. These include: formulation of criteria on the parameters, for the process to be nonstationary and I(1), formulation...
Time series, correlation matrices and random matrix models
Energy Technology Data Exchange (ETDEWEB)
Vinayak [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca (Mexico); Seligman, Thomas H. [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca, México and Centro Internacional de Ciencias, C.P. 62210 Cuernavaca (Mexico)
2014-01-08
In this set of five lectures the authors have presented techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis or a minimum information hypothesis for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. As a consequence, random correlation matrices have a random component, and corresponding ensembles are used. In the latter case we use random matrices to describe a high-temperature environment or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.
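The null hypothesis mentioned above is concrete: for independent series, the eigenvalues of the empirical correlation matrix should fall inside the Marchenko-Pastur support, and eigenvalues escaping it signal genuine structure. A minimal numerical check (our illustration, not the lecture notes' code):

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 2000, 50                       # series length and number of series
X = rng.standard_normal((T, N))       # i.i.d. data: the null hypothesis
Xc = X - X.mean(axis=0)
C = Xc.T @ Xc / T                     # empirical correlation matrix
eig = np.linalg.eigvalsh(C)

# Marchenko-Pastur support for unit-variance noise with ratio q = N/T
q = N / T
lam_min = (1 - np.sqrt(q)) ** 2
lam_max = (1 + np.sqrt(q)) ** 2
# fraction of eigenvalues inside the (slightly widened) bulk
inside = np.mean((eig > lam_min - 0.1) & (eig < lam_max + 0.1))
```

For correlated series, repeating this experiment produces eigenvalues well outside `[lam_min, lam_max]`, which is how the bulk serves as a minimum-information baseline.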
Adaptive time-variant models for fuzzy-time-series forecasting.
Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching
2010-12-01
A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
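A first-order fuzzy time series forecaster of the kind the ATVF model extends can be sketched as follows. This is a fixed-window, Chen-style simplification of our own (equal-width intervals, first-order rules, midpoint defuzzification), not the adaptive mechanism proposed in the paper.

```python
import numpy as np

def fuzzy_ts_forecast(y, n_intervals=7):
    """Chen-style fuzzy time series forecast of the next value:
    partition the universe of discourse into equal intervals, build
    first-order rules A_i -> {A_j} from consecutive observations, and
    defuzzify by averaging the midpoints of the rule consequents for
    the last observed interval."""
    y = np.asarray(y, dtype=float)
    edges = np.linspace(y.min(), y.max(), n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # fuzzify: index of the interval containing each observation
    states = np.clip(np.searchsorted(edges, y, side='right') - 1,
                     0, n_intervals - 1)
    rules = {}
    for a, b in zip(states[:-1], states[1:]):
        rules.setdefault(a, set()).add(b)
    successors = rules.get(states[-1], {states[-1]})
    return float(np.mean([mids[s] for s in successors]))
```

The paper's contribution sits on top of this pipeline: instead of a fixed analysis window, the window size is adapted according to training-phase accuracy.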
A generalized exponential time series regression model for electricity prices
DEFF Research Database (Denmark)
Haldrup, Niels; Knapik, Oskar; Proietti, Tomasso
We consider the issue of modeling and forecasting daily electricity spot prices on the Nord Pool Elspot power market. We propose a method that can handle seasonal and non-seasonal persistence by modelling the price series as a generalized exponential process. As the presence of spikes can distort the estimation of the dynamic structure of the series, we consider an iterative estimation strategy which, conditional on a set of parameter estimates, clears the spikes using a data cleaning algorithm, and re-estimates the parameters using the cleaned data so as to robustify the estimates. Conditional on the estimated model, the best linear predictor is constructed. Our modeling approach provides a good fit within sample and outperforms competing benchmark predictors in terms of forecasting accuracy. We also find that building separate models for each hour of the day and averaging the forecasts is a better...
Auto-Regressive Models of Non-Stationary Time Series with Finite Length
Institute of Scientific and Technical Information of China (English)
FEI Wanchun; BAI Lun
2005-01-01
To analyze and simulate non-stationary time series with finite length, the statistical characteristics and auto-regressive (AR) models of such series are discussed and studied. A new AR model, the time varying parameter AR model, is proposed for non-stationary time series with finite length. The auto-covariances of time series simulated by means of several AR models are analyzed. The results show that the new AR model can be used to simulate and generate a new time series with the same auto-covariance as the original time series. The size curves of cocoon filaments, regarded as non-stationary time series with finite length, are experimentally simulated. The simulation results are significantly better than those obtained so far, and illustrate the utility of the time varying parameter AR model. The results are useful for analyzing and simulating non-stationary time series with finite length.
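The idea of AR coefficients that change over time can be illustrated by re-estimating the coefficients by ordinary least squares on a sliding window. This rolling sketch is our simplification for illustration, not the paper's estimator.

```python
import numpy as np

def tv_ar_coeffs(x, order=2, window=50, step=10):
    """Estimate time-varying AR coefficients by ordinary least squares
    on a sliding window; returns one coefficient vector per window,
    ordered from lag 1 to lag `order`."""
    coeffs = []
    for start in range(0, len(x) - window, step):
        seg = x[start:start + window]
        Y = seg[order:]                       # targets
        # column k holds lag k+1 of the target within the window
        Z = np.column_stack([seg[order - k - 1: len(seg) - k - 1]
                             for k in range(order)])
        beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        coeffs.append(beta)
    return np.array(coeffs)
```

On a stationary series all windows return (noisy versions of) the same coefficients; on a non-stationary series the rows drift, which is what a time varying parameter AR model is designed to capture.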
Forecasting Financial Time-Series using Artificial Market Models
Gupta, N; Johnson, N F; Gupta, Nachi; Hauser, Raphael; Johnson, Neil F.
2005-01-01
We discuss the theoretical machinery involved in predicting financial market movements using an artificial market model which has been trained on real financial data. This approach to market prediction - in particular, forecasting financial time-series by training a third-party or 'black box' game on the financial data itself -- was discussed by Johnson et al. in cond-mat/0105303 and cond-mat/0105258 and was based on some encouraging preliminary investigations of the dollar-yen exchange rate, various individual stocks, and stock market indices. However, the initial attempts lacked a clear formal methodology. Here we present a detailed methodology, using optimization techniques to build an estimate of the strategy distribution across the multi-trader population. In contrast to earlier attempts, we are able to present a systematic method for identifying 'pockets of predictability' in real-world markets. We find that as each pocket closes up, the black-box system needs to be 'reset' - which is equivalent to sayi...
Testing coefficients of AR and bilinear time series models by a graphical approach
Institute of Scientific and Technical Information of China (English)
IP; WaiCheung
2008-01-01
AR and bilinear time series models are expressed as time series chain graphical models, based on which it is shown that the coefficients of AR and bilinear models are the conditional correlation coefficients conditioned on the other components of the time series. A graphically based procedure is then proposed to test the significance of the coefficients of AR and bilinear time series. Simulations show that our procedure performs well in both size and power.
Model-Coupled Autoencoder for Time Series Visualisation
Gianniotis, Nikolaos; Tiňo, Peter; Polsterer, Kai L
2016-01-01
We present an approach for the visualisation of a set of time series that combines an echo state network with an autoencoder. For each time series in the dataset we train an echo state network, using a common and fixed reservoir of hidden neurons, and use the optimised readout weights as the new representation. Dimensionality reduction is then performed via an autoencoder on the readout weight representations. The crux of the work is to equip the autoencoder with a loss function that correctly interprets the reconstructed readout weights by associating them with a reconstruction error measured in the data space of sequences. This essentially amounts to measuring the predictive performance that the reconstructed readout weights exhibit on their corresponding sequences when plugged back into the echo state network with the same fixed reservoir. We demonstrate that the proposed visualisation framework can deal both with real valued sequences as well as binary sequences. We derive magnification factors in order t...
He, Yuning
2015-01-01
Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.
Learning restricted Boolean network model by time-series data
2014-01-01
Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data. PMID:25093019
Learning restricted Boolean network model by time-series data.
Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin
2014-01-01
Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.
Self-organising mixture autoregressive model for non-stationary time series modelling.
Ni, He; Yin, Hujun
2008-12-01
Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In such a way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented and the results show that the proposed SOMAR network is effective and superior to other similar approaches.
Extracting the relevant delays in time series modelling
DEFF Research Database (Denmark)
Goutte, Cyril
1997-01-01
In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to a nonparametric test aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...
Time series decomposition methods were applied to meteorological and air quality data and their numerical model estimates. Decomposition techniques express a time series as the sum of a small number of independent modes which hypothetically represent identifiable forcings, thereb...
The Exponential Model for the Spectrum of a Time Series: Extensions and Applications
DEFF Research Database (Denmark)
Proietti, Tommaso; Luati, Alessandra
The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...
Modeling PSInSAR time series without phase unwrapping
Zhang, L.; Ding, X.; Lu, Zhiming
2011-01-01
In this paper, we propose a least-squares-based method for multitemporal synthetic aperture radar interferometry that allows one to estimate deformations without the need for phase unwrapping. The method utilizes a series of multimaster wrapped differential interferograms with short baselines and focuses on arcs at which there are no phase ambiguities. An outlier detector is used to identify and remove the arcs with phase ambiguities, and a pseudoinverse of the variance-covariance matrix is used as the weight matrix of the correlated observations. The deformation rates at coherent points are estimated with a least squares model constrained by reference points. The proposed approach is verified with a set of simulated data.
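The core estimation step above, least squares with a weight matrix, can be sketched for the simplest possible case of a single unknown deformation rate. The reduction to one parameter, the data, and the weights are illustrative assumptions, not the paper's actual multimaster interferogram setup.

```python
def wls_rate(times, obs, weights):
    """Weighted least squares for a one-parameter linear model obs ~ rate * t,
    i.e. rate = (A^T W A)^{-1} A^T W b with A a single column of times."""
    num = sum(w * t * o for w, t, o in zip(weights, times, obs))
    den = sum(w * t * t for w, t in zip(weights, times))
    return num / den

times = [1, 2, 3, 4]          # temporal baselines (illustrative units)
obs = [2.1, 3.9, 6.0, 8.2]    # simulated displacements generated with rate = 2
print(wls_rate(times, obs, [1, 1, 1, 0.5]))   # close to 2.0
```

Down-weighting the last observation mimics the role of the variance-covariance pseudoinverse: noisier, more correlated observations contribute less to the estimate.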
A flexible coefficient smooth transition time series model.
Medeiros, Marcelo C; Veiga, Alvaro
2005-01-01
In this paper, we consider a flexible smooth transition autoregressive (STAR) model with multiple regimes and multiple transition variables. This formulation can be interpreted as a time-varying linear model where the coefficients are the outputs of a single hidden layer feedforward neural network. This proposal has the major advantage of nesting several nonlinear models, such as the self-exciting threshold autoregressive (SETAR), the autoregressive neural network (AR-NN), and the logistic STAR models. Furthermore, if the neural network is interpreted as a nonparametric universal approximation to any Borel measurable function, our formulation is directly comparable to the functional coefficient autoregressive (FAR) and the single-index coefficient regression models. A model building procedure is developed based on statistical inference arguments. A Monte Carlo experiment showed that the procedure works in small samples, and its performance improves, as it should, in medium-sized samples. Several real examples are also addressed.
INDUSTRIAL PRODUCTION IN GERMANY AND AUSTRIA: A CASE STUDY IN STRUCTURAL TIME SERIES MODELLING
Institute of Scientific and Technical Information of China (English)
Gerhard THURY
2003-01-01
Industrial production series are volatile and often cyclical. Time series models can be used to establish certain stylized facts, such as trends and cycles, which may be present in these series. In certain situations, it is also possible that common factors, which may have an interesting interpretation, can be detected in production series. Series from two neighboring countries with close economic relationships, such as Germany and Austria, are especially likely to exhibit such joint stylized facts.
Bayesian Modelling of fMRI Time Series
DEFF Research Database (Denmark)
Højen-Sørensen, Pedro; Hansen, Lars Kai; Rasmussen, Carl Edward
2000-01-01
We present a Hidden Markov Model (HMM) for inferring the hidden psychological state (or neural activity) during single trial fMRI activation experiments with blocked task paradigms. Inference is based on Bayesian methodology, using a combination of analytical and a variety of Markov Chain Monte Carlo (MCMC) sampling techniques. The advantage of this method is that detection of short time learning effects between repeated trials is possible since inference is based only on single trial experiments.
Directory of Open Access Journals (Sweden)
Kennedy Curtis E
2011-10-01
Background: Thousands of children experience cardiac arrest events every year in pediatric intensive care units, and most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk of cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units, though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that allows time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. Methods: We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Results: Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: (1) selecting candidate variables; (2) specifying measurement parameters; (3) defining data format; (4) defining time window duration and resolution; (5) calculating latent variables for candidate variables not directly measured; (6) calculating time series features as latent variables; (7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; (8) …
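Steps 4-6 of the modeling template above (windowing, then computing time series features as latent variables) can be sketched as follows. The variable name, window length, and feature set are illustrative assumptions, not the study's actual specification.

```python
def window_features(values, window):
    """Return (mean, slope) features for each non-overlapping time window."""
    feats = []
    for i in range(0, len(values) - window + 1, window):
        w = values[i:i + window]
        mean = sum(w) / window
        # least-squares slope of the window against t = 0..window-1
        t_mean = (window - 1) / 2
        num = sum((t - t_mean) * (x - mean) for t, x in enumerate(w))
        den = sum((t - t_mean) ** 2 for t in range(window))
        feats.append((mean, num / den))
    return feats

# Hypothetical monitored variable; the second window shows deterioration
# (rapidly rising trend), which the slope feature captures.
heart_rate = [80, 82, 84, 86, 120, 125, 130, 135]
print(window_features(heart_rate, 4))
```

Features like the slope turn "deterioration over time" into a static input that a conventional classifier can consume.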
A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series
Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.
2011-01-01
Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…
New Models for Forecasting Enrollments: Fuzzy Time Series and Neural Network Approaches.
Song, Qiang; Chissom, Brad S.
Since university enrollment forecasting is very important, many different methods and models have been proposed by researchers. Two new methods for enrollment forecasting are introduced: (1) the fuzzy time series model; and (2) the artificial neural networks model. Fuzzy time series has been proposed to deal with forecasting problems within a…
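A minimal sketch of a first-order fuzzy time series forecast in this spirit: fuzzify values into intervals (fuzzy sets), collect logical relationships A_i -> A_j, and defuzzify with interval midpoints. This is the simplified variant popularized after Song and Chissom's work, and the interval count and enrollment-like data are illustrative.

```python
def fuzzy_forecast(series, n_intervals):
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_intervals
    # fuzzify: map each value to the interval (fuzzy set) it falls in
    labels = [min(int((x - lo) / width), n_intervals - 1) for x in series]
    # first-order fuzzy logical relationships: A_i -> set of consequent A_j
    rules = {}
    for a, b in zip(labels, labels[1:]):
        rules.setdefault(a, set()).add(b)
    consequents = rules.get(labels[-1], {labels[-1]})
    # defuzzify: mean of the consequent interval midpoints
    return sum(lo + (k + 0.5) * width for k in consequents) / len(consequents)

enrollments = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861]
print(round(fuzzy_forecast(enrollments, 7)))
```

The appeal for enrollment data is that the method needs no distributional assumptions, only a handful of interval rules.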
Financial-Economic Time Series Modeling and Prediction Techniques – Review
2014-01-01
Financial-economic time series are distinguished from other time series because they contain a portion of uncertainty. Because of this, statistical theory and methods play an important role in their analysis. Moreover, the external influence of various parameters on the values in the time series makes them non-linear, which in turn suggests the employment of more complex techniques for their modeling. To cope with this challenging problem, many researchers and scientists have developed various models a...
Siggiridou, Elsa; Kugiumtzis, Dimitris
2016-04-01
Granger causality has been used for the investigation of the inter-dependence structure of the underlying systems of multi-variate time series. In particular, the direct causal effects are commonly estimated by the conditional Granger causality index (CGCI). In the presence of many observed variables and relatively short time series, CGCI may fail because it is based on vector autoregressive models (VAR) involving a large number of coefficients to be estimated. In this work, the VAR is restricted by a scheme that modifies the recently developed method of backward-in-time selection (BTS) of the lagged variables and the CGCI is combined with BTS. Further, the proposed approach is compared favorably to other restricted VAR representations, such as the top-down strategy, the bottom-up strategy, and the least absolute shrinkage and selection operator (LASSO), in terms of sensitivity and specificity of CGCI. This is shown by using simulations of linear and nonlinear, low and high-dimensional systems and different time series lengths. For nonlinear systems, CGCI from the restricted VAR representations are compared with analogous nonlinear causality indices. Further, CGCI in conjunction with BTS and other restricted VAR representations is applied to multi-channel scalp electroencephalogram (EEG) recordings of epileptic patients containing epileptiform discharges. CGCI on the restricted VAR, and BTS in particular, could track the changes in brain connectivity before, during and after epileptiform discharges, which was not possible using the full VAR representation.
Time series count data models: an empirical application to traffic accidents.
Quddus, Mohammed A
2008-09-01
Count data are primarily categorised as cross-sectional, time series, and panel. Over the past decade, Poisson and Negative Binomial (NB) models have been used widely to analyse cross-sectional and time series count data, and random effect and fixed effect Poisson and NB models have been used to analyse panel count data. However, recent literature suggests that although the underlying distributional assumptions of these models are appropriate for cross-sectional count data, they are not capable of taking into account the effect of serial correlation often found in pure time series count data. Real-valued time series models, such as the autoregressive integrated moving average (ARIMA) model, introduced by Box and Jenkins have been used in many applications over the last few decades. However, when modelling non-negative integer-valued data such as traffic accidents at a junction over time, Box and Jenkins models may be inappropriate. This is mainly due to the normality assumption of errors in the ARIMA model. Over the last few years, a new class of time series models known as integer-valued autoregressive (INAR) Poisson models, has been studied by many authors. This class of models is particularly applicable to the analysis of time series count data as these models hold the properties of Poisson regression and able to deal with serial correlation, and therefore offers an alternative to the real-valued time series models. The primary objective of this paper is to introduce the class of INAR models for the time series analysis of traffic accidents in Great Britain. Different types of time series count data are considered: aggregated time series data where both the spatial and temporal units of observation are relatively large (e.g., Great Britain and years) and disaggregated time series data where both the spatial and temporal units are relatively small (e.g., congestion charging zone and months). The performance of the INAR models is compared with the class of Box and
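The INAR(1) model discussed above can be sketched directly: X_t = α ∘ X_{t-1} + ε_t, where ∘ denotes binomial thinning (each of the previous X_{t-1} counts survives with probability α) and ε_t is a Poisson innovation, so the series stays integer-valued. Parameter values below are illustrative, not estimates from the accident data.

```python
import math
import random

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate X_t = alpha ∘ X_{t-1} + eps_t with Poisson(lam) innovations."""
    rng = random.Random(seed)

    def poisson(l):
        # Knuth's multiplication algorithm (adequate for small lam)
        threshold, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    x = poisson(lam)
    out = []
    for _ in range(n):
        survivors = sum(rng.random() < alpha for _ in range(x))  # binomial thinning
        x = survivors + poisson(lam)
        out.append(x)
    return out

counts = simulate_inar1(alpha=0.5, lam=2.0, n=200)
print(sum(counts) / len(counts))   # long-run mean should be near lam/(1-alpha) = 4
```

Unlike a Gaussian ARIMA fit, every simulated value is a non-negative integer, which is exactly the property needed for accident counts.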
Travel cost inference from sparse, spatio-temporally correlated time series using markov models
DEFF Research Database (Denmark)
Yang, B.; Guo, C.; Jensen, C.S.
2013-01-01
The monitoring of a system can yield a set of measurements that can be modeled as a collection of time series. These time series are often sparse, due to missing measurements, and spatiotemporally correlated, meaning that spatially close time series exhibit temporal correlation. The analysis of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending…
HIGH ORDER FUZZY TIME SERIES MODEL AND ITS APPLICATION TO IMKB
Directory of Open Access Journals (Sweden)
Çağdaş Hakan ALADAĞ
2010-12-01
The observations of some real time series, such as temperature and stock market series, can take different values within a day. Instead of representing the observations of these time series by real numbers, employing linguistic values or fuzzy sets can be more appropriate. In recent years, many approaches have been introduced to analyze time series whose observations are fuzzy sets; such time series are called fuzzy time series. In this study, a novel approach is proposed to analyze a high order fuzzy time series model. The proposed method is applied to IMKB data and the obtained results are discussed. The IMKB data are also analyzed using some other fuzzy time series methods available in the literature, and their results are compared to those obtained from the proposed method. The comparison shows that the proposed method produces accurate forecasts.
Directory of Open Access Journals (Sweden)
Entin Hidayah
2011-02-01
Disaggregation of hourly rainfall data is very important to fulfil the input requirements of continuous rainfall-runoff models when the availability of automatic rainfall records is limited. Continuous rainfall-runoff modeling requires rainfall data in the form of hourly series. Such data can be obtained by temporal disaggregation at a single site. This paper attempts to generate a single-site rainfall model based upon a time series (AR1) model by adjusting and establishing a dummy procedure. Estimated with Bayesian Markov Chain Monte Carlo (MCMC), the objective variable is hourly rainfall depth. The performance of the model has been evaluated by comparing historical data and model predictions. The result shows that the model performs well for dry interval periods, as represented by a small MAE of 0.21.
Multivariate nonlinear time series modeling of exposure and risk in road safety research
Bijleveld, F.; Commandeur, J.; Montfort, van K.; Koopman, S.J.
2010-01-01
A multivariate non-linear time series model for road safety data is presented. The model is applied in a case-study into the development of a yearly time series of numbers of fatal accidents (inside and outside urban areas) and numbers of kilometres driven by motor vehicles in the Netherlands betwee
On Fire regime modelling using satellite TM time series
Oddi, F.; . Ghermandi, L.; Lanorte, A.; Lasaponara, R.
2009-04-01
Wildfires can cause environmental deterioration by modifying vegetation dynamics, because they have the capacity to change vegetation diversity and physiognomy. In semiarid regions, like northwestern Patagonia, fire disturbance is also important because it can impact the potential productivity of the ecosystem: plant biomass is reduced, and with it the animal carrying capacity and/or the forest site quality, with negative economic implications. Knowledge of the fire regime in a region is therefore of great importance for understanding and predicting the responses of vegetation and its possible effect on the regional economy. Studies of this type at the landscape level can be addressed using GIS tools. Satellite imagery allows the detection of burned areas, and through temporal analysis the fire regime can be determined and changes detected at the landscape scale. The study area is located east of the city of Bariloche, including the San Ramon Ranch (22,000 ha) and its environs, in the ecotone formed by the sub-Antarctic forest and the Patagonian steppe. We worked with multispectral Landsat TM and Landsat ETM+ images of 30 m spatial resolution obtained at different times. For the spatial analysis we used the software Erdas Imagine 9.0 and ArcView 3.3. Vegetation types were discriminated, areas affected by fires in different years were determined, and the level of change in vegetation induced by fire was assessed. In the future, the use of high spatial resolution images combined with higher spectral resolution will allow burned areas to be distinguished with greater precision in the study area. The use of digital terrain models derived from satellite imagery, together with climatic variables, will also allow modelling of the relationship between them and the dynamics of vegetation.
Richly parameterized linear models additive, time series, and spatial models using random effects
Hodges, James S
2013-01-01
A First Step toward a Unified Theory of Richly Parameterized Linear ModelsUsing mixed linear models to analyze data often leads to results that are mysterious, inconvenient, or wrong. Further compounding the problem, statisticians lack a cohesive resource to acquire a systematic, theory-based understanding of models with random effects.Richly Parameterized Linear Models: Additive, Time Series, and Spatial Models Using Random Effects takes a first step in developing a full theory of richly parameterized models, which would allow statisticians to better understand their analysis results. The aut
The modified Yule-Walker method for α-stable time series models
Kruczek, Piotr; Wyłomańska, Agnieszka; Teuerle, Marek; Gajda, Janusz
2017-03-01
This paper discusses the problem of parameter estimation for stable periodic autoregressive (PAR) time series. The considered models generalize the popular and widely accepted autoregressive (AR) time series. By examining measures of dependence for α-stable processes, we first introduce a new empirical estimator of the autocovariation for α-stable sequences. Based on this approach, we generalize the Yule-Walker method to the estimation of parameters of PAR time series, thus filling a gap in estimation methods for non-Gaussian models. We test the proposed procedure and show its consistency. Moreover, we use our approach to model real empirical data, showing the usefulness of heavy-tailed models in statistical modelling.
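The classical (Gaussian) Yule-Walker step that the paper generalizes can be sketched for an AR(2) model: estimate autocovariances and solve the 2x2 Yule-Walker system. The α-stable autocovariation estimator itself is not reproduced here; this is only the standard building block it replaces, checked on simulated data.

```python
import random

def autocov(x, lag):
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / n

def yule_walker_ar2(x):
    """Solve the AR(2) Yule-Walker equations for (phi1, phi2)."""
    g0, g1, g2 = autocov(x, 0), autocov(x, 1), autocov(x, 2)
    # [[g0, g1], [g1, g0]] @ (phi1, phi2) = (g1, g2), solved as a 2x2 system
    det = g0 * g0 - g1 * g1
    return (g0 * g1 - g1 * g2) / det, (g0 * g2 - g1 * g1) / det

# Simulated AR(2) path with phi1 = 0.5, phi2 = 0.3 as a sanity check
rng = random.Random(1)
x = [0.0, 0.0]
for _ in range(5000):
    x.append(0.5 * x[-1] + 0.3 * x[-2] + rng.gauss(0, 1))
phi1, phi2 = yule_walker_ar2(x)
print(round(phi1, 2), round(phi2, 2))   # both close to the true values
```

For α-stable innovations the autocovariances above do not exist, which is precisely why the paper substitutes an autocovariation estimator into the same system.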
Fitting ARMA Time Series by Structural Equation Models.
van Buuren, Stef
1997-01-01
This paper outlines how the stationary ARMA (p,q) model (G. Box and G. Jenkins, 1976) can be specified as a structural equation model. Maximum likelihood estimates for the parameters in the ARMA model can be obtained by software for fitting structural equation models. The method is applied to three problem types. (SLD)
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
…commands, our application is select-and-click-driven. It allows one to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model-fits, the user may…
Madsen, Henrik
2007-01-01
"In this book the author gives a detailed account of estimation and identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book." -Journa…
Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.
Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau
2016-01-01
The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. An autoregressive integrated moving average (ARIMA) model was used to fit a univariate time series of syphilis incidence. A separate multi-variable time series model for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and an increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed the highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.
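The seasonal (0,1,1) part of the selected ARIMA(0,0,1)×(0,1,1) model implies one seasonal difference, y_t - y_{t-12} for monthly data, which removes a stable yearly pattern. A minimal sketch with an illustrative toy series (not the actual syphilis data):

```python
def seasonal_difference(y, period=12):
    """One seasonal difference: y_t - y_{t-period}."""
    return [y[t] - y[t - period] for t in range(period, len(y))]

# Toy monthly series: a fixed 12-month seasonal pattern plus a linear trend
season = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8]
y = [season[t % 12] + 0.5 * t for t in range(36)]
d = seasonal_difference(y)
print(d[:3])   # every entry equals 0.5 * 12 = 6.0: the seasonality is gone
```

After this transform only the trend increment remains, which the seasonal MA(1) term and the regular MA(1) term of the fitted model then describe.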
Mixed Portmanteau Test for Diagnostic Checking of Time Series Models
Directory of Open Access Journals (Sweden)
Sohail Chand
2014-01-01
Model criticism is an important stage of model building, and goodness-of-fit tests provide a set of tools for diagnostic checking of the fitted model. Several tests are suggested in the literature for diagnostic checking. These tests use autocorrelation or partial autocorrelation in the residuals to criticize the adequacy of the fitted model. The main idea underlying these portmanteau tests is to identify whether there is any dependence structure which is not yet explained by the fitted model. In this paper, we suggest mixed portmanteau tests based on the autocorrelation and partial autocorrelation functions of the residuals. We derive the asymptotic distribution of the mixed test and study its size and power using Monte Carlo simulations.
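A standard portmanteau statistic of the Ljung-Box type, computed from residual autocorrelations, can be sketched as follows. The paper's mixed test additionally incorporates partial autocorrelations, which are omitted here for brevity.

```python
import random

def acf(res, lag):
    """Sample autocorrelation of the residuals at the given lag."""
    n = len(res)
    m = sum(res) / n
    c0 = sum((r - m) ** 2 for r in res)
    return sum((res[t] - m) * (res[t - lag] - m) for t in range(lag, n)) / c0

def ljung_box(res, max_lag):
    """Ljung-Box Q statistic: n(n+2) * sum_k acf(k)^2 / (n - k)."""
    n = len(res)
    return n * (n + 2) * sum(acf(res, k) ** 2 / (n - k)
                             for k in range(1, max_lag + 1))

# White-noise-like residuals should give a small statistic relative to the
# chi-squared critical value (about 18.3 at the 5% level for 10 lags).
rng = random.Random(42)
res = [rng.gauss(0, 1) for _ in range(500)]
print(ljung_box(res, 10))
```

A large Q signals residual dependence the fitted model has not explained, which is exactly the situation the diagnostic checks above are designed to flag.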
A feature fusion based forecasting model for financial time series.
Guo, Zhiqiang; Wang, Huaiqing; Liu, Quan; Yang, Jie
2014-01-01
Predicting the stock market has become an increasingly interesting research area for both researchers and investors, and many prediction models have been proposed. In these models, feature selection techniques are used to pre-process the raw data and remove noise. In this paper, a prediction model is constructed to forecast stock market behavior with the aid of independent component analysis, canonical correlation analysis, and a support vector machine. First, two types of features are extracted from the historical closing prices and 39 technical variables obtained by independent component analysis. Second, a canonical correlation analysis method is utilized to combine the two types of features and extract intrinsic features to improve the performance of the prediction model. Finally, a support vector machine is applied to forecast the next day's closing price. The proposed model is applied to the Shanghai stock market index and the Dow Jones index, and experimental results show that the proposed model performs better in the area of prediction than the other two similar models.
Multilayer stock forecasting model using fuzzy time series.
Javedani Sadaei, Hossein; Lee, Muhammad Hisyam
2014-01-01
A review of the vast body of literature on using FTS in stock market forecasting reveals certain deficiencies in the hybridization of findings. In addition, a constructive systematic framework that could indicate the direction of growth for entire FTS forecasting systems is notably lacking. In this study, we propose a multilayer model for stock market forecasting comprising five logically significant layers. Each layer has its own detailed concern and assists forecast development by resolving certain problems exclusively. To verify the model, large datasets covering the Taiwan Stock Index (TAIEX), the National Association of Securities Dealers Automated Quotations (NASDAQ), the Dow Jones Industrial Average (DJI), and the S&P 500 were chosen as experimental datasets. The results indicate that the proposed methodology has the potential to be accepted as a framework for model development in stock market forecasts using FTS.
Nonlinear Time Series Model for Shape Classification Using Neural Networks
Institute of Scientific and Technical Information of China (English)
(no author listed)
2000-01-01
A complex nonlinear exponential autoregressive (CNEAR) model for invariant feature extraction is developed for recognizing arbitrary shapes on a plane. A neural network is used to calculate the CNEAR coefficients. The coefficients, which constitute the feature set, are proven to be invariant to boundary transformations such as translation, rotation, scale and choice of starting point in tracing the boundary. The feature set is then used as the input to a complex multilayer perceptron (C-MLP) network for learning and classification. Experimental results show that complicated shapes can be accurately recognized even with the low-order model and that the classification method has good fault tolerance when noise is present.
Linear models for multivariate, time series, and spatial data
Christensen, Ronald
1991-01-01
This is a companion volume to Plane Answers to Complex Questions: The Theory of Linear Models. It consists of six additional chapters written in the same spirit as the last six chapters of the earlier book. Brief introductions are given to topics related to linear model theory. No attempt is made to give a comprehensive treatment of the topics. Such an effort would be futile. Each chapter is on a topic so broad that an in-depth discussion would require a book-length treatment. People need to impose structure on the world in order to understand it. There is a limit to the number of unrelated facts that anyone can remember. If ideas can be put within a broad, sophisticatedly simple structure, not only are they easier to remember but often new insights become available. In fact, sophisticatedly simple models of the world may be the only ones that work. I have often heard Arnold Zellner say that, to the best of his knowledge, this is true in econometrics. The process of modeling is fundamental to understand…
Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition.
Munoz-Organero, Mario; Ruiz-Blazquez, Ramona
2017-02-08
Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates (F = 0.77) even in the case of using different people executing a different sequence of movements and using different hardware.
van der Heijden, Sven; Callau Poduje, Ana; Müller, Hannes; Shehu, Bora; Haberlandt, Uwe; Lorenz, Manuel; Wagner, Sven; Kunstmann, Harald; Müller, Thomas; Mosthaf, Tobias; Bárdossy, András
2015-04-01
For the design and operation of urban drainage systems with numerical simulation models, long, continuous precipitation time series with high temporal resolution are necessary. Suitable observed time series are rare. As a result, intelligent design concepts often use uncertain or unsuitable precipitation data, which renders them uneconomic or unsustainable. An expedient alternative to observed data is the use of long, synthetic rainfall time series as input for the simulation models. Within the project SYNOPSE, several different methods to generate synthetic precipitation data for urban drainage modelling are advanced, tested, and compared. The presented study compares four different approaches of precipitation models regarding their ability to reproduce rainfall and runoff characteristics. These include one parametric stochastic model (alternating renewal approach), one non-parametric stochastic model (resampling approach), one downscaling approach from a regional climate model, and one disaggregation approach based on daily precipitation measurements. All four models produce long precipitation time series with a temporal resolution of five minutes. The synthetic time series are first compared to observed rainfall reference time series. Comparison criteria include event based statistics like mean dry spell and wet spell duration, wet spell amount and intensity, long term means of precipitation sum and number of events, and extreme value distributions for different durations. Then they are compared regarding simulated discharge characteristics using an urban hydrological model on a fictitious sewage network. First results show a principal suitability of all rainfall models but with different strengths and weaknesses regarding the different rainfall and runoff characteristics considered.
Modelling Biophysical Parameters of Maize Using Landsat 8 Time Series
Dahms, Thorsten; Seissiger, Sylvia; Conrad, Christopher; Borg, Erik
2016-06-01
Open and free access to multi-frequent high-resolution data (e.g. Sentinel-2) will fortify agricultural applications based on satellite data. The temporal and spatial resolution of these remote sensing datasets directly affects the applicability of remote sensing methods, for instance a robust retrieval of biophysical parameters over the entire growing season with very high geometric resolution. In this study we use machine learning methods to predict biophysical parameters, namely the fraction of absorbed photosynthetic radiation (FPAR), the leaf area index (LAI) and the chlorophyll content, from high resolution remote sensing. 30 Landsat 8 OLI scenes were available in our study region in Mecklenburg-Western Pomerania, Germany. In-situ data were collected weekly to bi-weekly on 18 maize plots throughout the summer season 2015. The study aims at an optimized prediction of biophysical parameters and the identification of the best explaining spectral bands and vegetation indices. For this purpose, we used the entire in-situ dataset from 24.03.2015 to 15.10.2015. Random forests and conditional inference forests were used because of their explicit strong exploratory and predictive character. Variable importance measures allowed for analysing the relation between the biophysical parameters with respect to the spectral response, and the performance of the two approaches over the plant stock evolvement. Classical random forest regression outperformed conditional inference forests, in particular when modelling the biophysical parameters over the entire growing period. For example, modelling biophysical parameters of maize for the entire vegetation period using random forests yielded: FPAR: R² = 0.85, RMSE = 0.11; LAI: R² = 0.64, RMSE = 0.9; and chlorophyll content (SPAD): R² = 0.80, RMSE = 4.9. Our results demonstrate the great potential in using machine-learning methods for the interpretation of long-term multi-frequent remote sensing datasets to model
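The retrieval scheme described above can be sketched in a few lines. The following is a minimal illustration only, using synthetic reflectances and an NDVI-like relationship for LAI rather than the study's actual Landsat 8 bands and in-situ data; the band names and the target formula are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for per-plot spectral reflectance (band names are
# illustrative, not the actual Landsat 8 OLI band set used in the study).
n_samples = 200
bands = ["blue", "green", "red", "nir", "swir1", "swir2"]
X = rng.uniform(0.0, 0.6, size=(n_samples, len(bands)))

# Assume LAI depends mainly on a red/NIR contrast (NDVI-like) plus noise.
lai = 4.0 * (X[:, 3] - X[:, 2]) / (X[:, 3] + X[:, 2] + 1e-9) \
      + rng.normal(0, 0.2, n_samples)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X, lai)

# Variable importance identifies the bands driving the prediction,
# mirroring the variable-importance analysis in the abstract.
importance = dict(zip(bands, model.feature_importances_))
```

On this synthetic example the importance measure concentrates on the red and NIR bands, which is the kind of diagnostic the study uses to find the best explaining bands.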
Markov Model of Wind Power Time Series Using Bayesian Inference of Transition Matrix
DEFF Research Database (Denmark)
Chen, Peiyuan; Berthelsen, Kasper Klitgaard; Bak-Jensen, Birgitte
2009-01-01
This paper proposes to use Bayesian inference of transition matrix when developing a discrete Markov model of a wind speed/power time series and 95% credible interval for the model verification. The Dirichlet distribution is used as a conjugate prior for the transition matrix. Three discrete Markov...... models are compared, i.e. the basic Markov model, the Bayesian Markov model and the birth-and-death Markov model. The proposed Bayesian Markov model shows the best accuracy in modeling the autocorrelation of the wind power time series....
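The Dirichlet-conjugate update named above can be sketched directly. This is a minimal illustration, not the authors' exact model: it assumes an already-discretized three-state sequence, places a symmetric Dirichlet prior on each row of the transition matrix, and takes the posterior mean as the point estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic discrete "wind power" state sequence (3 states) from a known chain.
P_true = np.array([[0.7, 0.2, 0.1],
                   [0.3, 0.5, 0.2],
                   [0.1, 0.3, 0.6]])
states = [0]
for _ in range(5000):
    states.append(rng.choice(3, p=P_true[states[-1]]))
states = np.array(states)

# Transition counts n_ij.
K = 3
counts = np.zeros((K, K))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1

# Dirichlet(alpha) prior on each row; the posterior for row i is
# Dirichlet(alpha + n_i1, ..., alpha + n_iK), whose mean gives a
# point estimate of the transition matrix.
alpha = 1.0
P_post = (counts + alpha) / (counts + alpha).sum(axis=1, keepdims=True)
```

The full posterior (not just its mean) is what yields the 95% credible intervals used for model verification in the paper.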
Monitoring Poisson time series using multi-process models
DEFF Research Database (Denmark)
Engebjerg, Malene Dahl Skov; Lundbye-Christensen, Søren; Kjær, Birgitte B.
Surveillance of infectious diseases based on routinely collected public health data is important for at least three reasons: The early detection of an epidemic may facilitate prompt interventions and the seasonal variations and long term trend may be of general epidemiological interest. Furthermore...... aspects of health resource management may also be addressed. In this paper we center on the detection of outbreaks of infectious diseases. This is achieved by a multi-process Poisson state space model taking autocorrelation and overdispersion into account, which has been applied to a data set concerning...
Application of uncertainty reasoning based on cloud model in time series prediction
Institute of Scientific and Technical Information of China (English)
张锦春; 胡谷雨
2003-01-01
Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, network traffic forecasting, etc., and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to data history, and the trend changes of time series are ignored. In this paper, an uncertainty reasoning method, based on cloud model, is employed in time series prediction, which uses a cloud logic controller to adjust the smoothing coefficient of the simple exponential smoothing method dynamically to fit the current trend of the time series. The validity of this solution was proved by experiments on various data sets.
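The adaptive-coefficient idea can be illustrated without the cloud model itself. The sketch below substitutes a Trigg-Leach-style smoothed-error ratio for the paper's cloud logic controller; the function and its parameters are illustrative, not the paper's method.

```python
import numpy as np

def adaptive_ses(y, gamma=0.2):
    """Simple exponential smoothing whose coefficient is adapted from a
    smoothed-error ratio (a stand-in for the cloud logic controller)."""
    level = y[0]
    e_s, ae_s = 0.0, 1e-9        # smoothed error and smoothed absolute error
    preds = [level]
    for t in range(1, len(y)):
        err = y[t] - level
        e_s = gamma * err + (1 - gamma) * e_s
        ae_s = gamma * abs(err) + (1 - gamma) * ae_s
        alpha = abs(e_s) / ae_s  # grows toward 1 when errors are one-sided
        level = level + alpha * err
        preds.append(level)
    return np.array(preds)

# A level shift: the adaptive coefficient jumps when the trend changes,
# so the forecast recovers much faster than a fixed small alpha would.
y = np.concatenate([np.zeros(50), np.full(50, 10.0)])
pred = adaptive_ses(y)
```

When the recent errors are consistently of one sign (a trend change), the ratio approaches 1 and the smoother tracks the new level quickly; in quiet periods the ratio shrinks and the forecast stays smooth.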
An Exponential Model for the Spectrum of a Scalar Time Series
A new class of parametric models for the spectrum of a scalar time series is proposed, in which the logarithm of the spectral density function is represented by a finite Fourier series. Two alternative parameter estimation procedures are described, and the use of a fitted model to provide forecasts of future values is discussed. The model has been compared with the more conventional autoregressive/moving-average model, and the results of their comparison are given.
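A least-squares sketch of such a model: the log spectral density is represented by a low-order cosine series and fitted to the log-periodogram. The estimation details here (regression on the log-periodogram with an Euler-Mascheroni mean correction) are a common simplification, not necessarily the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1) series whose true log-spectrum is smooth, as a test signal.
n = 2048
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal()

# Periodogram at the Fourier frequencies (excluding 0 and pi).
freqs = np.arange(1, n // 2) * 2 * np.pi / n
I = np.abs(np.fft.rfft(x)[1:n // 2]) ** 2 / (2 * np.pi * n)

# Exponential model: log f(w) ~ sum_{k=0}^{p} theta_k cos(k w).
# Fit by least squares on the log-periodogram; adding the Euler-Mascheroni
# constant corrects the mean of the log of an exponentially distributed
# periodogram ordinate.
p = 4
basis = np.column_stack([np.cos(k * freqs) for k in range(p + 1)])
theta, *_ = np.linalg.lstsq(basis, np.log(I) + 0.57721, rcond=None)
log_spec_hat = basis @ theta
```

For an AR(1) process with positive coefficient the spectrum decreases monotonically in frequency, and the fitted log-spectrum reproduces that shape with only p + 1 parameters.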
On the Practice of Bayesian Inference in Basic Economic Time Series Models using Gibbs Sampling
M.D. de Pooter (Michiel); R. Segers (René); H.K. van Dijk (Herman)
2006-01-01
textabstractSeveral lessons learned from a Bayesian analysis of basic economic time series models by means of the Gibbs sampling algorithm are presented. Models include the Cochrane-Orcutt model for serial correlation, the Koyck distributed lag model, the Unit Root model, the Instrumental Variables
Calibration of transient groundwater models using time series analysis and moment matching
Bakker, M.; Maas, K.; Von Asmuth, J.R.
2008-01-01
A comprehensive and efficient approach is presented for the calibration of transient groundwater models. The approach starts with the time series analysis of the measured heads in observation wells using all active stresses as input series, which may include rainfall, evaporation, surface water leve
Travel cost inference from sparse, spatio-temporally correlated time series using markov models
DEFF Research Database (Denmark)
Yang, B.; Guo, C.; Jensen, C.S.
2013-01-01
of such time series offers insight into the underlying system and enables prediction of system behavior. While the techniques presented in the paper apply more generally, we consider the case of transportation systems and aim to predict travel cost from GPS tracking data from probe vehicles. Specifically, each...... road segment has an associated travel-cost time series, which is derived from GPS data. We use spatio-temporal hidden Markov models (STHMM) to model correlations among different traffic time series. We provide algorithms that are able to learn the parameters of an STHMM while contending...... with the sparsity, spatio-temporal correlation, and heterogeneity of the time series. Using the resulting STHMM, near future travel costs in the transportation network, e.g., travel time or greenhouse gas emissions, can be inferred, enabling a variety of routing services, e.g., eco-routing. Empirical studies...
A Stepwise Time Series Regression Procedure for Water Demand Model Identification
Miaou, Shaw-Pin
1990-09-01
Annual time series water demand has traditionally been studied through multiple linear regression analysis. Four associated model specification problems have long been recognized: (1) the length of the available time series data is relatively short, (2) a large set of candidate explanatory or "input" variables needs to be considered, (3) input variables can be highly correlated with each other (multicollinearity problem), and (4) model error series are often highly autocorrelated or even nonstationary. A stepwise time series regression identification procedure is proposed to alleviate these problems. The proposed procedure adopts the sequential input variable selection concept of stepwise regression and the "three-step" time series model building strategy of Box and Jenkins. Autocorrelated model error is assumed to follow an autoregressive integrated moving average (ARIMA) process. The stepwise selection procedure begins with a univariate time series demand model with no input variables. Subsequently, input variables are selected and inserted into the equation one at a time until the last entered variable is found to be statistically insignificant. The order of insertion is determined by a statistical measure called between-variable partial correlation. This correlation measure is free from the contamination of serial autocorrelation. Three data sets from previous studies are employed to illustrate the proposed procedure. The results are then compared with those from their original studies.
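The insertion loop described above can be sketched with ordinary correlations standing in for the paper's between-variable partial correlation (which additionally guards against serial autocorrelation); the threshold and data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "annual water demand" driven by two of four candidate inputs.
n = 60
X = rng.normal(size=(n, 4))          # candidate explanatory variables
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

def fit_residual(y, Xsel):
    """Residual of y after regressing on the currently selected inputs."""
    if Xsel.shape[1] == 0:
        return y - y.mean()
    A = np.column_stack([np.ones(len(y)), Xsel])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return y - A @ beta

# Stepwise insertion: pick the candidate most correlated with the current
# residual, stop when the best remaining correlation falls below a threshold
# (a crude stand-in for the significance test in the paper).
selected, remaining = [], list(range(4))
while remaining:
    resid = fit_residual(y, X[:, selected])
    corrs = {j: abs(np.corrcoef(resid, X[:, j])[0, 1]) for j in remaining}
    best = max(corrs, key=corrs.get)
    if corrs[best] < 0.3:
        break
    selected.append(best)
    remaining.remove(best)
```

On this synthetic example the procedure recovers the two true drivers (columns 0 and 2) and stops before admitting the pure-noise candidates.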
Research on power grid loss prediction model based on Granger causality property of time series
Energy Technology Data Exchange (ETDEWEB)
Wang, J. [North China Electric Power Univ., Beijing (China); State Grid Corp., Beijing (China); Yan, W.P.; Yuan, J. [North China Electric Power Univ., Beijing (China); Xu, H.M.; Wang, X.L. [State Grid Information and Telecommunications Corp., Beijing (China)
2009-03-11
This paper described a method of predicting power transmission line losses using the Granger causality property of time series. The stationarity of the time series was investigated using unit root tests. The Granger causality relationship between line losses and other variables was then determined. Granger-caused time series were then used to create the following 3 prediction models: (1) a model based on line loss binomials that used electricity sales as the predictor variable, (2) a model that considered both power sales and grid capacity, and (3) a model based on autoregressive distributed lag (ARDL) approaches that incorporated both power sales and the square of power sales as variables. A case study of data from China's electric power grid between 1980 and 2008 was used to evaluate model performance. Results of the study showed that the model error rates ranged between 2.7 and 3.9 percent. 6 refs., 3 tabs., 1 fig.
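A generic bivariate Granger test (not the paper's line-loss models) illustrates the causality screening step: regress the target on its own lag, then add the lagged candidate and compare residual sums of squares with an F statistic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic example: x Granger-causes y through one lag.
n = 300
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.4 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)

# Restricted model: y_t on its own lag.  Unrestricted: add lagged x.
Y = y[1:]
R = np.column_stack([np.ones(n - 1), y[:-1]])
U = np.column_stack([R, x[:-1]])

rss_r = np.sum((Y - R @ np.linalg.lstsq(R, Y, rcond=None)[0]) ** 2)
rss_u = np.sum((Y - U @ np.linalg.lstsq(U, Y, rcond=None)[0]) ** 2)

# F-test for the single extra lagged-x regressor; under the null of no
# Granger causality, F ~ F(1, df).  Large F rejects the null.
df = n - 1 - U.shape[1]
F = (rss_r - rss_u) / (rss_u / df)
```

Variables that pass this screen (here, lagged x) are the ones admitted into the downstream prediction models.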
Stochastic modeling of Lake Van water level time series with jumps and multiple trends
Directory of Open Access Journals (Sweden)
H. Aksoy
2013-06-01
In the 1990s, the water level in the closed-basin Lake Van, located in Eastern Anatolia, Turkey, rose by about 2 m. Analysis of the hydrometeorological data shows that the change in the water level is related to the water budget of the lake. In this study, stochastic models are proposed for simulating monthly water level data. Two models, considering mono- and multiple-trend time series, are developed. The models are derived after removal of trend and periodicity in the dataset. The trend observed in the lake water level time series is fitted by mono- and multiple-trend lines. In the so-called mono-trend model, the time series is treated as a whole under the hypothesis that the lake water level has an increasing trend. In the second model (the so-called multiple-trend model), the time series is divided into a number of segments, to each of which a linear trend can be fitted separately. Application to the lake water level data shows that four segments, each fitted with a trend line, are meaningful. Both the mono- and multiple-trend models are used for simulation of synthetic lake water level time series under the hypothesis that the observed mono- and multiple-trend structure of the lake water level persists during the simulation period. The multiple-trend model is found better for planning future infrastructural projects in surrounding areas of the lake, as it generates higher maxima for the simulated lake water level.
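The multiple-trend simulation scheme can be sketched on synthetic data. Here the segment boundary is assumed known (the study identifies segments from the data), and the stochastic term is modeled as a simple AR(1) process; both are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly "water level": two trend segments plus AR(1) noise.
n = 240
t = np.arange(n)
breakpoint = 120
trend = np.where(t < breakpoint, 0.01 * t, 1.2 + 0.03 * (t - breakpoint))
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.7 * noise[i - 1] + rng.normal(scale=0.05)
level = trend + noise

# Multiple-trend removal: fit a separate line to each (assumed known) segment.
segments = [(0, breakpoint), (breakpoint, n)]
detrended = level.copy()
for a, b in segments:
    coef = np.polyfit(t[a:b], level[a:b], 1)
    detrended[a:b] -= np.polyval(coef, t[a:b])

# AR(1) model of the detrended series, then simulate a synthetic series
# under the hypothesis that the trend structure persists.
phi = np.corrcoef(detrended[:-1], detrended[1:])[0, 1]
sigma = np.std(detrended[1:] - phi * detrended[:-1])
sim_noise = np.zeros(n)
for i in range(1, n):
    sim_noise[i] = phi * sim_noise[i - 1] + rng.normal(scale=sigma)
sim_level = trend + sim_noise
```

Repeating the simulation many times gives an ensemble of synthetic water-level traces whose maxima can be compared across the mono- and multiple-trend assumptions.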
A probabilistic method for constructing wave time-series at inshore locations using model scenarios
Long, Joseph W.; Plant, Nathaniel G.; Dalyander, P. Soupy; Thompson, David M.
2014-01-01
Continuous time-series of wave characteristics (height, period, and direction) are constructed using a base set of model scenarios and simple probabilistic methods. This approach utilizes an archive of computationally intensive, highly spatially resolved numerical wave model output to develop time-series of historical or future wave conditions without performing additional, continuous numerical simulations. The archive of model output contains wave simulations from a set of model scenarios derived from an offshore wave climatology. Time-series of wave height, period, direction, and associated uncertainties are constructed at locations included in the numerical model domain. The confidence limits are derived using statistical variability of oceanographic parameters contained in the wave model scenarios. The method was applied to a region in the northern Gulf of Mexico and assessed using wave observations at 12 m and 30 m water depths. Prediction skill for significant wave height is 0.58 and 0.67 at the 12 m and 30 m locations, respectively, with similar performance for wave period and direction. The skill of this simplified, probabilistic time-series construction method is comparable to existing large-scale, high-fidelity operational wave models but provides higher spatial resolution output at low computational expense. The constructed time-series can be developed to support a variety of applications including climate studies and other situations where a comprehensive survey of wave impacts on the coastal area is of interest.
A solution to the problem of constructing a state space model from time series
Directory of Open Access Journals (Sweden)
David Di Ruscio
1994-01-01
The problem of constructing minimal realizations from arbitrary input-output time series which are only covariance stationary (not necessarily stationary) is considered. An algorithm which solves this problem for a fairly nonrestrictive class of exogenous (input) signals is presented. The algorithm is based upon modeling nonzero exogenous signals by linear models and including these in the total system model.
Nonlinearity, Breaks, and Long-Range Dependence in Time-Series Models
DEFF Research Database (Denmark)
Hillebrand, Eric Tobias; Medeiros, Marcelo C.
We study the simultaneous occurrence of long memory and nonlinear effects, such as parameter changes and threshold effects, in ARMA time series models and apply our modeling framework to daily realized volatility. Asymptotic theory for parameter estimation is developed and two model building...
Almaraz, Pablo
2005-04-01
Time-series analyses in ecology usually involve the use of autoregressive modelling through direct and/or delayed difference equations, which severely restricts the ability of the modeler to structure complex causal relationships within a multivariate frame. This is especially problematic in the field of population regulation, where the proximate and ultimate causes of fluctuations in population size have been hotly debated for decades. Here it is shown that this debate can benefit from the implementation of structural modelling with latent constructs (SEM) to time-series analysis in ecology. A nonparametric bootstrap scheme illustrates how this modelling approach can circumvent some problems posed by the climate-ecology interface. Stochastic Monte Carlo simulation is further used to assess the effects of increasing time-series length and different parameter estimation methods on the performance of several model fit indexes. Throughout, the advantages and limitations of the SEM method are highlighted.
Road safety forecasts in five European countries using structural time series models.
Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George
2014-01-01
Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and the (2) latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
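The local linear trend model named above can be sketched as a small Kalman filter. This is a generic textbook filter on synthetic data, not the DaCoTA or latent-risk implementation, and the variance parameters are assumed rather than estimated by maximum likelihood.

```python
import numpy as np

def llt_filter(y, var_eps=1.0, var_xi=0.1, var_zeta=0.01):
    """Kalman filter for the local linear trend model:
       y_t = mu_t + eps_t
       mu_{t+1} = mu_t + nu_t + xi_t
       nu_{t+1} = nu_t + zeta_t
    Returns the filtered (level, slope) states."""
    T = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    Z = np.array([1.0, 0.0])                 # observation vector
    Q = np.diag([var_xi, var_zeta])
    a = np.array([y[0], 0.0])                # state mean (level, slope)
    P = np.eye(2) * 1e4                      # near-diffuse initial variance
    filtered = []
    for yt in y:
        # Measurement update.
        F = Z @ P @ Z + var_eps
        K = P @ Z / F
        a = a + K * (yt - Z @ a)
        P = P - np.outer(K, Z @ P)
        filtered.append(a.copy())
        # Time update (one-step prediction).
        a = T @ a
        P = T @ P @ T.T + Q
    return np.array(filtered)

# Synthetic "fatality risk" with a slowly wandering trend and slope.
rng = np.random.default_rng(9)
n = 200
slope = np.cumsum(rng.normal(0, 0.05, n))
level = np.cumsum(slope) + np.cumsum(rng.normal(0, 0.2, n))
y = level + rng.normal(0, 1.0, n)
states = llt_filter(y, var_eps=1.0, var_xi=0.04, var_zeta=0.0025)
```

Iterating the time update beyond the sample yields the medium- to long-term forecasts that this family of structural models is used for; the latent risk model extends the same state-space machinery to a bivariate exposure-plus-risk formulation.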
Hybrid model for forecasting time series with trend, seasonal and calendar variation patterns
Suhartono; Rahayu, S. P.; Prastyo, D. D.; Wijayanti, D. G. P.; Juliyanto
2017-09-01
Most of the monthly time series data in economics and business in Indonesia and other Moslem countries not only contain trend and seasonal patterns, but are also affected by two types of calendar variation effects, i.e. the effect of the number of working or trading days, and holiday effects. The purpose of this research is to develop a hybrid model, or a combination of several forecasting models, to predict time series that contain trend, seasonal and calendar variation patterns. This hybrid model is a combination of classical models (namely time series regression and ARIMA models) and/or modern methods (artificial intelligence methods, i.e. Artificial Neural Networks). A simulation study was used to show that the proposed procedure for building the hybrid model could work well for forecasting time series with trend, seasonal and calendar variation patterns. Furthermore, the proposed hybrid model is applied to forecasting real data, i.e. monthly data on the inflow and outflow of currency at Bank Indonesia. The results show that the hybrid model tends to provide more accurate forecasts than individual forecasting models. Moreover, this result is in line with the results of the M3 competition, i.e. the hybrid model on average provides a more accurate forecast than the individual models.
Bayesian dynamic modeling of time series of dengue disease case counts.
Directory of Open Access Journals (Sweden)
Daniel Adyro Martínez-Bello
2017-07-01
The aim of this study is to model the association between weekly time series of dengue case counts and meteorological variables, in a high-incidence city of Colombia, applying Bayesian hierarchical dynamic generalized linear models over the period January 2008 to August 2015. Additionally, we evaluate the model's short-term performance for predicting dengue cases. The methodology uses dynamic Poisson log-link models including constant or time-varying coefficients for the meteorological variables. Calendar effects were modeled using constant or first- or second-order random walk time-varying coefficients. The meteorological variables were modeled using constant coefficients and first-order random walk time-varying coefficients. We applied Markov chain Monte Carlo simulations for parameter estimation, and the deviance information criterion (DIC) for model selection. We assessed the short-term predictive performance of the selected final model at several time points within the study period using the mean absolute percentage error. The results showed that the best model includes first-order random walk time-varying coefficients for the calendar trend and for the meteorological variables. Besides the computational challenges, interpreting the results implies a complete analysis of the time series of dengue with respect to the parameter estimates of the meteorological effects. We found small values of the mean absolute percentage errors at one- or two-week out-of-sample predictions for most prediction points, associated with low-volatility periods in the dengue counts. We discuss the advantages and limitations of the dynamic Poisson models for studying the association between time series of dengue disease and meteorological variables. The key conclusion of the study is that dynamic Poisson models account for the dynamic nature of the variables involved in the modeling of time series of dengue disease
Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie
2010-04-01
Time domain ground penetrating radar (GPR) has been shown to be a powerful sensing phenomenology for detecting buried objects such as landmines. Landmine detection with GPR data typically utilizes a feature-based pattern classification algorithm to discriminate buried landmines from other sub-surface objects. In high-fidelity GPR, the time-frequency characteristics of a landmine response should be indicative of the physical construction and material composition of the landmine and could therefore be useful for discrimination from other non-threatening sub-surface objects. In this research we propose modeling landmine time-domain responses with a nonparametric Bayesian time-series model and we perform clustering of these time-series models with a hierarchical nonparametric Bayesian model. Each time-series is modeled as a hidden Markov model (HMM) with autoregressive (AR) state densities. The proposed nonparametric Bayesian prior allows for automated learning of the number of states in the HMM as well as the AR order within each state density. This creates a flexible time-series model with complexity determined by the data. Furthermore, a hierarchical non-parametric Bayesian prior is used to group landmine responses with similar HMM model parameters, thus learning the number of distinct landmine response models within a data set. Model inference is accomplished using a fast variational mean field approximation that can be implemented for on-line learning.
Multifractal Detrended Fluctuation Analysis of Interevent Time Series in a Modified OFC Model
Institute of Scientific and Technical Information of China (English)
LIN Min; YAN Shuang-Xi; ZHAO Gang; WANG Gang
2013-01-01
We use the multifractal detrended fluctuation analysis (MF-DFA) method to investigate the multifractal behavior of the interevent time series in a modified Olami-Feder-Christensen (OFC) earthquake model on assortative scale-free networks. We determine the generalized Hurst exponent and singularity spectrum and find that these fluctuations have a multifractal nature. Comparing the MF-DFA results for the original interevent time series with those for shuffled and surrogate series, we conclude that the origin of multifractality is due to both the broadness of the probability density function and long-range correlation.
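A minimal MF-DFA can be written in plain numpy. The sketch below uses only forward non-overlapping segments and first-order detrending, and is validated on white noise, for which all generalized Hurst exponents should be near 0.5; it is an illustration of the method, not the paper's implementation.

```python
import numpy as np

def mfdfa(x, scales, qs, order=1):
    """Minimal MF-DFA: generalized Hurst exponents h(q) from the
    log-log slopes of the fluctuation functions F_q(s)."""
    profile = np.cumsum(x - np.mean(x))
    Fq = np.zeros((len(qs), len(scales)))
    for si, s in enumerate(scales):
        n_seg = len(profile) // s
        rms = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            tt = np.arange(s)
            fit = np.polyval(np.polyfit(tt, seg, order), tt)
            rms.append(np.mean((seg - fit) ** 2))   # F^2(v, s)
        rms = np.array(rms)
        for qi, q in enumerate(qs):
            if q == 0:
                # Logarithmic averaging for the q = 0 case.
                Fq[qi, si] = np.exp(0.5 * np.mean(np.log(rms)))
            else:
                Fq[qi, si] = np.mean(rms ** (q / 2)) ** (1 / q)
    # h(q) is the slope of log F_q(s) versus log s.
    return np.array([np.polyfit(np.log(scales), np.log(Fq[qi]), 1)[0]
                     for qi in range(len(qs))])

rng = np.random.default_rng(6)
white = rng.normal(size=4096)
h = mfdfa(white, scales=[16, 32, 64, 128, 256], qs=[-2, 0, 2])
```

A spread of h(q) across q (absent here, by construction) is the signature of multifractality; comparing against shuffled and surrogate series, as in the abstract, separates distributional broadness from long-range correlation as its source.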
A four-stage hybrid model for hydrological time series forecasting.
Di, Chongli; Yang, Xiaohua; Wang, Xiaochao
2014-01-01
Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
Stochastic modeling of Lake Van water level time series with jumps and multiple trends
Directory of Open Access Journals (Sweden)
H. Aksoy
2013-02-01
In the 1990s, the water level in the closed-basin Lake Van, located in Eastern Anatolia, Turkey, rose by about 2 m. Analysis of the hydrometeorological data shows that the change in the water level is related to the water budget of the lake. In this study, a stochastic model is generated using the measured monthly water level data of the lake. The model is derived after removal of trend and periodicity in the data set. The trend observed in the lake water level time series is fitted by mono- and multiple-trend lines. For the multiple-trend case, the time series is first divided into homogeneous segments by means of SEGMENTER, a segmentation software. Four segments, each fitted with a trend line, are found practically meaningful. Two models considering mono- and multiple-trend time series are developed. The multiple-trend model is found better for planning future development in surrounding areas of the lake.
Fluctuation complexity of agent-based financial time series model by stochastic Potts system
Hong, Weijia; Wang, Jun
2015-03-01
Financial market is a complex evolved dynamic system with high volatilities and noises, and the modeling and analyzing of financial time series are regarded as the rather challenging tasks in financial research. In this work, by applying the Potts dynamic system, a random agent-based financial time series model is developed in an attempt to uncover the empirical laws in finance, where the Potts model is introduced to imitate the trading interactions among the investing agents. Based on the computer simulation in conjunction with the statistical analysis and the nonlinear analysis, we present numerical research to investigate the fluctuation behaviors of the proposed time series model. Furthermore, in order to get a robust conclusion, we consider the daily returns of Shanghai Composite Index and Shenzhen Component Index, and the comparison analysis of return behaviors between the simulation data and the actual data is exhibited.
Model-based Clustering of Categorical Time Series with Multinomial Logit Classification
Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea
2010-09-01
A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance-measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule by using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.
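The mixture-of-Markov-chains idea can be sketched with a small EM algorithm, a maximum-likelihood analogue of the Bayesian Gibbs sampler described above (and without the multinomial-logit extension); the two-state chains and group sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Two groups of categorical time series, each driven by its own Markov chain.
P1 = np.array([[0.9, 0.1], [0.2, 0.8]])
P2 = np.array([[0.3, 0.7], [0.7, 0.3]])

def simulate(P, T=200):
    s = [0]
    for _ in range(T - 1):
        s.append(rng.choice(2, p=P[s[-1]]))
    return np.array(s)

series = [simulate(P1) for _ in range(20)] + [simulate(P2) for _ in range(20)]

def transition_counts(s, K=2):
    C = np.zeros((K, K))
    for a, b in zip(s[:-1], s[1:]):
        C[a, b] += 1
    return C

counts = np.array([transition_counts(s) for s in series])  # (n, K, K)

# EM for a two-component mixture of first-order Markov chains.
w = np.array([0.5, 0.5])
P = np.stack([np.array([[0.8, 0.2], [0.2, 0.8]]),
              np.array([[0.4, 0.6], [0.6, 0.4]])])
for _ in range(50):
    # E-step: responsibilities from per-series transition log-likelihoods.
    loglik = np.einsum('nij,gij->ng', counts, np.log(P)) + np.log(w)
    loglik -= loglik.max(axis=1, keepdims=True)
    r = np.exp(loglik)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: mixture weights and row-normalized weighted transition counts.
    w = r.mean(axis=0)
    num = np.einsum('ng,nij->gij', r, counts)
    P = num / num.sum(axis=2, keepdims=True)
```

The responsibilities r play the role of the latent group indicators; the Bayesian version samples them, and the multinomial-logit extension lets covariates shift their prior probabilities.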
The application of time series models to cloud field morphology analysis
Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.
1987-01-01
A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive moving average (ARMA) process of Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and that synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.
DEFF Research Database (Denmark)
Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse
2012-01-01
under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers.
Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan
2017-04-01
Because time series stationarization plays a key role in stochastic modeling results, three methods are analyzed in this study: seasonal differencing, seasonal standardization and spectral analysis, all aimed at eliminating the periodic effect on time series stationarity. First, six time series, including 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term for these series obtained with ARIMA is subsequently modeled. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing maintains seasonal correlation structures. The obtained results indicate that all three methods present acceptable performance overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although it is quite similar to seasonal differencing, while spectral analysis performed the weakest in all cases. It is concluded that for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic effect elimination. Moreover, monthly water temperature is predicted with more accuracy than monthly streamflow. The ratios of the average stochastic term to the amplitude of the periodic term obtained for monthly streamflow and monthly water temperature were 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04, respectively. As a result, the periodic term is more dominant relative to the stochastic term in the monthly water temperature series than in the streamflow series.
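The first two stationarization methods compared above can be sketched directly (the spectral-analysis variant is omitted). A minimal version, assuming a monthly period of 12 and synthetic data:

```python
import math

def seasonal_difference(x, period=12):
    """Remove the periodic component by subtracting the value one season back
    (this keeps seasonal correlation structure, as the study notes)."""
    return [x[t] - x[t - period] for t in range(period, len(x))]

def seasonal_standardize(x, period=12):
    """Subtract each calendar month's long-run mean and divide by its
    standard deviation, removing the periodic term in mean and variance."""
    cols = [[x[t] for t in range(m, len(x), period)] for m in range(period)]
    means = [sum(c) / len(c) for c in cols]
    sds = [max(1e-12, (sum((v - mu) ** 2 for v in c) / len(c)) ** 0.5)
           for c, mu in zip(cols, means)]
    return [(v - means[t % period]) / sds[t % period] for t, v in enumerate(x)]

# Synthetic monthly flow: an annual cycle plus a deterministic ripple.
x = [10 + 5 * math.sin(2 * math.pi * t / 12) + 0.5 * math.sin(t) for t in range(120)]
std = seasonal_standardize(x)
diff = seasonal_difference(x)
```

After standardization every calendar month has zero mean and unit variance by construction; differencing instead shortens the series by one period and leaves seasonal autocorrelation intact.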
Analyzing Multiple Multivariate Time Series Data Using Multilevel Dynamic Factor Models.
Song, Hairong; Zhang, Zhiyong
2014-01-01
Multivariate time series data offer researchers opportunities to study the dynamics of various systems in the social and behavioral sciences. The dynamic factor model (DFM), as an idiographic approach for studying intraindividual variability and dynamics, has typically been applied to time series data obtained from a single unit. When multivariate time series data are collected from multiple units, how to synchronize dynamical information becomes a salient issue. To address this issue, the current study presents a multilevel dynamic factor model (MDFM) that analyzes multiple multivariate time series in multilevel SEM frameworks. The MDFM not only disentangles within- and between-person variability but also models the dynamics of the intraindividual processes. To illustrate the uses of MDFMs, we applied lag-0, lag-1, and lag-2 MDFMs to empirical data on affect collected from 205 dating couples who had at least 50 consecutive days of observations. We also considered a model extension where the dynamical coefficients were allowed to vary randomly in the population. The empirical analysis yielded interesting findings regarding affect regulation and coregulation within couples, demonstrating promising uses of MDFMs in analyzing multiple multivariate time series. In the end, we discuss a number of methodological issues in the applications of MDFMs and point out possible directions for future research.
Keil, Petr; Herben, Tomás; Rosindell, James; Storch, David
2010-07-07
There has recently been increasing interest in neutral models of biodiversity and their ability to reproduce the patterns observed in nature, such as species abundance distributions. Here we investigate the ability of a neutral model to predict phenomena observed in single-population time series, a study complementary to most existing work that concentrates on snapshots in time of the whole community. We consider tests for density dependence, the dominant frequencies of population fluctuation (spectral density) and a relationship between the mean and variance of a fluctuating population (Taylor's power law). We simulated an archipelago model of a set of interconnected local communities with variable mortality rate, migration rate, speciation rate, size of local community and number of local communities. Our spectral analysis showed 'pink noise': a departure from a standard random walk dynamics in favor of the higher frequency fluctuations which is partly consistent with empirical data. We detected density dependence in local community time series but not in metacommunity time series. The slope of the Taylor's power law in the model was similar to the slopes observed in natural populations, but the fit to the power law was worse. Our observations of pink noise and density dependence can be attributed to the presence of an upper limit to community sizes and to the effect of migration which distorts temporal autocorrelation in local time series. We conclude that some of the phenomena observed in natural time series can emerge from neutral processes, as a result of random zero-sum birth, death and migration. This suggests the neutral model would be a parsimonious null model for future studies of time series data.
DEFF Research Database (Denmark)
Ørregård Nielsen, Morten
2015-01-01
This article proves consistency and asymptotic normality for the conditional-sum-of-squares estimator, which is equivalent to the conditional maximum likelihood estimator, in multivariate fractional time-series models. The model is parametric and quite general and, in particular, encompasses...
An Alternative Bayesian Approach to Structural Breaks in Time Series Models
S. van den Hauwe (Sjoerd); R. Paap (Richard); D.J.C. van Dijk (Dick)
2011-01-01
textabstractWe propose a new approach to deal with structural breaks in time series models. The key contribution is an alternative dynamic stochastic specification for the model parameters which describes potential breaks. After a break new parameter values are generated from a so-called baseline pr
Applying ARIMA model for annual volume time series of the Magdalena River
Directory of Open Access Journals (Sweden)
Gloria Amaris
2017-04-01
Conclusions: The simulated results obtained with the ARIMA model showed fairly good agreement with the observed minimum and maximum magnitudes. This allows us to conclude that the model is a good tool for estimating minimum and maximum volumes, even though it is not capable of simulating the exact behaviour of an annual volume time series.
Multi-Scale Gaussian Processes: a Novel Model for Chaotic Time Series Prediction
Institute of Scientific and Technical Information of China (English)
ZHOU Ya-Tong; ZHANG Tai-Yi; SUN Jian-Cheng
2007-01-01
Based on the classical Gaussian process (GP) model, we propose a multi-scale Gaussian process (MGP) model for the prediction of chaotic time series. The MGP employs a covariance function constructed from a scaling function with its different dilations and translations, ensuring that the optimal hyperparameter is easy to determine.
Prediction of altimetric sea level anomalies using time series models based on spatial correlation
Miziński, Bartłomiej; Niedzielski, Tomasz
2014-05-01
Sea level anomaly (SLA) time series, which are time-varying gridded data, can be modelled and predicted using time series methods. This approach has been shown to provide accurate forecasts within the Prognocean system, the novel infrastructure for anticipating sea level change designed and built at the University of Wrocław (Poland), which utilizes real-time SLA data from Archiving, Validation and Interpretation of Satellite Oceanographic data (AVISO). The system runs a few models concurrently, and our ocean prediction experiment includes both uni- and multivariate time series methods. The univariate ones are: extrapolation of a polynomial-harmonic model (PH), extrapolation of a polynomial-harmonic model with autoregressive prediction (PH+AR), and extrapolation of a polynomial-harmonic model with self-exciting threshold autoregressive prediction (PH+SETAR). The following multivariate methods are used: extrapolation of a polynomial-harmonic model with vector autoregressive prediction (PH+VAR) and extrapolation of a polynomial-harmonic model with generalized space-time autoregressive prediction (PH+GSTAR). As the aforementioned models and the corresponding forecasts are computed in real time, independently and in the same computational setting, we can compare the accuracies offered by the models. The objective of this work is to verify the hypothesis that the multivariate prediction techniques, which make use of cross-correlation and spatial correlation, perform better than the univariate ones. The analysis is based on the daily-fitted and updated time series models predicting the SLA data (lead time of two weeks) over several months when the El Niño/Southern Oscillation (ENSO) was in its neutral state.
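The simplest of the listed models, PH+AR, can be sketched as follows, with the polynomial part reduced to a mean, a single harmonic, and an AR(1) on the residuals (a deliberate simplification of the Prognocean models, whose exact specification is not given here):

```python
import math

def ph_ar_forecast(x, period, horizon):
    """Fit mean + one harmonic by Fourier projection (the 'PH' part), then
    propagate the last residual with a fitted AR(1) (the '+AR' part)."""
    n = len(x)
    mean = sum(x) / n
    a = 2.0 / n * sum(v * math.cos(2 * math.pi * t / period) for t, v in enumerate(x))
    b = 2.0 / n * sum(v * math.sin(2 * math.pi * t / period) for t, v in enumerate(x))

    def ph(t):
        w = 2 * math.pi * t / period
        return mean + a * math.cos(w) + b * math.sin(w)

    resid = [v - ph(t) for t, v in enumerate(x)]
    den = sum(r * r for r in resid[:-1])
    phi = sum(resid[t] * resid[t - 1] for t in range(1, n)) / den if den > 1e-12 else 0.0
    return [ph(n + h) + phi ** (h + 1) * resid[-1] for h in range(horizon)]

# Four full cycles of a clean periodic signal; forecasts should track the truth.
x = [3.0 + 2.0 * math.sin(2 * math.pi * t / 50) for t in range(200)]
fcst = ph_ar_forecast(x, period=50, horizon=5)
truth = [3.0 + 2.0 * math.sin(2 * math.pi * (200 + h) / 50) for h in range(5)]
```

The multivariate variants (PH+VAR, PH+GSTAR) replace the scalar AR step with models that borrow strength from neighbouring grid cells, which is the hypothesis the paper tests.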
Modeling Financial Time Series Based on a Market Microstructure Model with Leverage Effect
Directory of Open Access Journals (Sweden)
Yanhui Xi
2016-01-01
The basic market microstructure model specifies that the price/return innovation and the volatility innovation are independent Gaussian white noise processes. However, the financial leverage effect has been found to be statistically significant in many financial time series. In this paper, a novel market microstructure model with leverage effects is proposed. The model specification assumes a negative correlation in the errors between the price/return innovation and the volatility innovation. With the new representation, a theoretical explanation of the leverage effect is provided. Simulated data and daily stock market indices (Shanghai Composite Index, Shenzhen Component Index, and Standard and Poor's 500 Composite Index) are used to estimate the leverage market microstructure model via the Bayesian Markov chain Monte Carlo (MCMC) method. The results verify the effectiveness of the model and its estimation approach and also indicate that the stock markets have strong leverage effects. Compared with the classical leverage stochastic volatility (SV) model in terms of DIC (Deviance Information Criterion), the leverage market microstructure model fits the data better.
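The core specification change, negatively correlated return and volatility innovations, can be sketched with a toy stochastic-volatility recursion. The persistence 0.95, volatility-of-volatility 0.2 and rho = -0.5 are illustrative values, not estimates from the paper:

```python
import math
import random

def leverage_sim(n=2000, rho=-0.5, seed=7):
    """Simulate returns where the volatility innovation e2 is correlated with
    the return innovation e1 (rho < 0 gives the leverage effect: negative
    returns tend to raise subsequent volatility)."""
    rng = random.Random(seed)
    h = 0.0  # log-volatility state
    rets, e1s, e2s = [], [], []
    for _ in range(n):
        e1 = rng.gauss(0.0, 1.0)
        e2 = rho * e1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        rets.append(math.exp(h / 2.0) * e1)
        h = 0.95 * h + 0.2 * e2  # volatility driven by the correlated innovation
        e1s.append(e1)
        e2s.append(e2)
    return rets, e1s, e2s

rets, e1s, e2s = leverage_sim()
```

Setting rho = 0 recovers the independent-innovation baseline the abstract starts from; the paper's contribution is estimating rho (and the other parameters) by MCMC rather than fixing them.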
Kusev, Petko; van Schaik, Paul; Tsaneva-Atanasova, Krasimira; Juliusson, Asgeir; Chater, Nick
2017-04-06
When attempting to predict future events, people commonly rely on historical data. One psychological characteristic of judgmental forecasting of time series, established by research, is that when people make forecasts from series, they tend to underestimate future values for upward trends and overestimate them for downward ones, so-called trend damping (modeled by anchoring on, and insufficient adjustment from, the average of recent time series values). Events in a time series can be experienced sequentially (dynamic mode), or they can be viewed retrospectively and simultaneously (static mode), not experienced individually in real time. In one experiment, we studied the influence of presentation mode (dynamic and static) on two sorts of judgment: (a) predictions of the next event (forecast) and (b) estimation of the average value of all the events in the presented series (average estimation). Participants' responses in dynamic mode were anchored on more recent events than in static mode for all types of judgment but with different consequences; hence, dynamic presentation improved prediction accuracy, but not estimation. These results are not anticipated by existing theoretical accounts; we develop and present an agent-based model, the adaptive anchoring model (ADAM), to account for the difference between processing sequences of dynamically and statically presented stimuli (visually presented data). ADAM captures how variation in presentation mode produces variation in responses (and the accuracy of these responses) in both forecasting and judgment tasks. ADAM's predictions for the forecasting and judgment tasks fit the response data better than a linear-regression time series model. Moreover, ADAM outperformed autoregressive integrated moving average (ARIMA) and exponential-smoothing models, while neither of these models accounts for people's responses on the average estimation task.
Time-series modeling of long-term weight self-monitoring data.
Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka
2015-08-01
Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. Such analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data present several challenges which complicate the analysis; in particular, irregular sampling, missing data, and periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individuals' behavior, periodic patterns, and weight series segmentation. Being able to understand behavior through weight data and give relevant feedback could lead to positive interventions on health behaviors.
Modeling and Forecasting of Water Demand in Isfahan Using Underlying Trend Concept and Time Series
Directory of Open Access Journals (Sweden)
H. Sadeghi
2016-02-01
Introduction: Accurate modeling of urban water demand is very important for forecasting and for adopting policies related to water resources management. Thus, for estimating, forecasting and modeling future water requirements, it is important to use models with small errors. Water has a special place among basic human needs, because human life cannot continue without it; its management, from extraction to consumption, is therefore a necessity. Municipal water applications include a variety of water demands for domestic, public, industrial and commercial use. Predicting urban water demand supports better planning of water resources in arid and semi-arid regions faced with water restrictions. Materials and Methods: Technological change is one of the most important factors affecting production and demand functions, and it deserves special attention in the model specification. Technology development concerns not only technical aspects; other, non-economic factors (population, geographical and social factors) can also be analyzed. The model examined in this study is a regression model composed of a series of structural components that are unobserved and allowed to change stochastically over time. Explanatory variables for technology would make such a model preferable, but because these variables cannot be measured over time they cannot be entered in the specification. In this study, structural time series models (STSM) and ARMA time series models have been used to model and estimate water demand in Isfahan, and the two models have been compared to each other in order to find the more efficient procedure. The data used in this research include water consumption in Isfahan, water price and the monthly pay
A new approach to calibrate steady groundwater flow models with time series of head observations
Obergfell, C.; Bakker, M.; Maas, C.
2012-04-01
We developed a new method to calibrate aquifer parameters of steady-state well field models using measured time series of head fluctuations. Our method is an alternative to standard pumping tests and is based on time series analysis using parametric impulse response functions. First, the pumping influence is isolated from the overall groundwater fluctuation observed at monitoring wells around the well field, and response functions are determined for each individual well. Time series parameters are optimized using a quasi-Newton algorithm. For one monitoring well, time series model parameters are also optimized by means of SCEM-UA, a Markov chain Monte Carlo algorithm, as a check on the validity of the parameters obtained by the faster quasi-Newton method. Subsequently, the drawdown corresponding to an average yearly pumping rate is calculated from the response functions determined by time series analysis. The drawdown values estimated with acceptable confidence intervals are used as calibration targets of a steady groundwater flow model. A case study is presented of the drinking water supply well field of Waalwijk (Netherlands). In this case study, a uniform aquifer transmissivity is optimized together with the conductance of ditches in the vicinity of the well field. Groundwater recharge or boundary heads do not have to be entered, which eliminates two important sources of uncertainty. The method constitutes a cost-efficient alternative to pumping tests and allows the determination of pumping influences without changes in well field operation.
Markov Chain Modelling for Short-Term NDVI Time Series Forecasting
Directory of Open Access Journals (Sweden)
Stepčenko Artūrs
2016-12-01
In this paper, an NDVI time series forecasting model has been developed based on the use of a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process that consists of a state space; the process undergoes transitions from one state to another in the state space with some probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is analytically described. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
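A hedged sketch of the forecasting idea, using a discrete-state chain in place of the paper's continuous-state formulation: discretize the NDVI range into bins, count transitions, and forecast the next value as the probability-weighted bin centre.

```python
def markov_forecast(x, n_states=4):
    """One-step forecast from a first-order chain fitted to a binned series."""
    lo, hi = min(x), max(x)
    width = (hi - lo) / n_states or 1.0

    def state(v):
        return min(int((v - lo) / width), n_states - 1)

    counts = [[0.0] * n_states for _ in range(n_states)]
    for a, b in zip(x, x[1:]):
        counts[state(a)][state(b)] += 1.0
    row = counts[state(x[-1])]
    total = sum(row)
    if total == 0.0:
        return x[-1]  # current state never left before: persist
    centres = [lo + (k + 0.5) * width for k in range(n_states)]
    return sum(p / total * c for p, c in zip(row, centres))

# A strictly alternating series: from the high state the chain always drops.
ndvi = [0.2, 0.8] * 20
fcst = markov_forecast(ndvi)
```

The continuous-state formulation in the paper avoids the binning step (and the associated discretization error) by working with transition densities instead of a transition matrix.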
Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction
Directory of Open Access Journals (Sweden)
Helmut Thome
2015-07-01
Criminological research is often based on time series data showing some type of trend movement. Trending time series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are "difference-stationary" (if the trend movements are stochastic) and "cointegrated" (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
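The spurious-correlation problem described here is easy to demonstrate: two independent random walks typically show substantial level correlation, which differencing removes. A minimal illustration (seed and lengths arbitrary):

```python
import random

def corr(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def random_walk(n, rng):
    level, out = 0.0, []
    for _ in range(n):
        level += rng.gauss(0.0, 1.0)
        out.append(level)
    return out

rng = random.Random(42)
x = random_walk(2000, rng)
y = random_walk(2000, rng)   # independent of x
r_levels = corr(x, y)        # often far from zero despite independence
dx = [b - a for a, b in zip(x, x[1:])]
dy = [b - a for a, b in zip(y, y[1:])]
r_diffs = corr(dx, dy)       # near zero: the spurious association is gone
```

Cointegration analysis goes a step further than blanket differencing: it keeps the level information when the stochastic trends of two series move together, which is exactly the "spurious non-causality" trap the article warns about.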
Long Memory of Financial Time Series and Hidden Markov Models with Time-Varying Parameters
DEFF Research Database (Denmark)
Nystrup, Peter; Madsen, Henrik; Lindström, Erik
2016-01-01
estimation approach that allows for the parameters of the estimated models to be time-varying. It is shown that a two-state Gaussian hidden Markov model with time-varying parameters is able to reproduce the long memory of squared daily returns that was previously believed to be the most difficult fact to reproduce with a hidden Markov model. Capturing the time-varying behavior of the parameters also leads to improved one-step density forecasts. Finally, it is shown that the forecasting performance of the estimated models can be further improved using local smoothing to forecast the parameter variations.
Automated Bayesian model development for frequency detection in biological time series
Directory of Open Access Journals (Sweden)
Oldroyd Giles ED
2011-06-01
Background: A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domains, such as Fourier transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results: In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlight the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions: Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and
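The Fourier baseline that the Bayesian method is compared against can be sketched with a direct periodogram peak-pick; the Bayesian machinery itself (priors over frequencies, noise-level estimation) is beyond a few lines.

```python
import math

def dominant_frequency(x):
    """Index (cycles per record) of the largest periodogram ordinate,
    skipping the zero-frequency mean term."""
    n = len(x)
    best_k, best_power = 1, -1.0
    for k in range(1, n // 2 + 1):
        re = sum(v * math.cos(2 * math.pi * k * t / n) for t, v in enumerate(x))
        im = sum(v * math.sin(2 * math.pi * k * t / n) for t, v in enumerate(x))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k

n = 120
# 5 cycles of a strong oscillation plus 17 cycles of a weaker one
x = [math.sin(2 * math.pi * 5 * t / n) + 0.3 * math.cos(2 * math.pi * 17 * t / n)
     for t in range(n)]
peak = dominant_frequency(x)
```

On clean, whole-period data the peak is unambiguous; the paper's point is that for short, noisy, trending series this peak-picking becomes unreliable, which motivates treating frequency identification as Bayesian inference.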
Advances in time series forecasting
Cagdas, Hakan Aladag
2012-01-01
Readers will learn how these methods work and how these approaches can be used to forecast real life time series. The hybrid forecasting model is also explained. Data presented in this e-book is problem based and is taken from real life situations. It is a valuable resource for students, statisticians and working professionals interested in advanced time series analysis.
The Evolutionary Modeling and Short-range Climatic Prediction for Meteorological Element Time Series
Institute of Scientific and Technical Information of China (English)
YU Kangqing; ZHOU Yuehua; YANG Jing'an; KANG Zhuo
2005-01-01
The time series of precipitation in the flood season (May-September) at Wuhan Station, taken as an example of a time series with chaotic characteristics, is split into two parts: one includes macro climatic timescale period waves affected by relatively steady climatic factors such as astronomical factors (sunspots, etc.) and other known and/or unknown factors, while the other includes micro climatic timescale period waves superimposed on the macro one. Evolutionary modeling (EM), which develops from genetic programming (GP), is well suited to simulating the former part because it creates a nonlinear ordinary differential equation (NODE) from the data series. Natural fractals (NF) are used to simulate the latter part. The final prediction is the sum of the results from both methods, so the model can reflect multi-timescale effects of forcing factors in the climate system. The results of this example for 2002 and 2003 are satisfactory for operational climate prediction. The NODE describes how the data vary with time, which is useful for short-range climate analysis and prediction. A comparison in principle between evolutionary modeling and linear modeling indicates that the evolutionary approach is a better way to simulate complex time series with nonlinear characteristics.
Multivariate Time Series Search
National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...
Estimation of time of death with a fourier series unsteady-state heat transfer model.
Smart, Jimmy L
2010-11-01
The purpose of this study was to return to fundamental principles of heat transfer and derive a suitable model to establish a firm basis for constructing a postmortem human cooling curve. A Fourier Series Model was successfully applied to unsteady heat transfer within a wooden cylinder in controlled laboratory conditions. Wood has similar thermal diffusivity properties as human tissue. By manipulation of the model, sensitivity analyses were performed to observe the impact of changes in values of input variables. Variables of initial temperature of the cylinder and ambient surrounding temperature were shown to be very sensitive and have the most impact upon predictive results of the model. The model was also used to demonstrate the existence of an initial temperature plateau, which is often the subject of controversy in estimating time of death. Finally, it was demonstrated how the Fourier Series Model can be applied to estimate time of death for humans.
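The kind of model the study describes can be sketched for the simpler slab geometry (the paper works with a cylinder): the classic Fourier series for conduction with surfaces held at ambient temperature, inverted for elapsed time by bisection. The thermal diffusivity and thickness below are illustrative values only, not the paper's calibrated parameters.

```python
import math

def centre_ratio(t, alpha, thickness, terms=50):
    """(T_centre - T_amb) / (T0 - T_amb) for a slab initially at uniform T0
    whose two faces are held at T_amb: the Fourier sine series solution."""
    total = 0.0
    for m in range(terms):
        n = 2 * m + 1
        lam = n * math.pi / thickness
        total += (4.0 / (n * math.pi)) * (-1.0) ** m * math.exp(-alpha * lam * lam * t)
    return total

def time_since_death(t_obs, t_amb, t0, alpha, thickness):
    """Invert the monotone cooling curve for elapsed time by bisection."""
    target = (t_obs - t_amb) / (t0 - t_amb)
    lo, hi = 1e-3, 1e6
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if centre_ratio(mid, alpha, thickness) > target:
            lo = mid  # still too warm: more time must have passed
        else:
            hi = mid
    return 0.5 * (lo + hi)

ALPHA = 1.0e-7   # m^2/s, roughly tissue-like diffusivity (assumed)
L = 0.2          # m, slab thickness (assumed)
t_true = 5 * 3600.0
ratio = centre_ratio(t_true, ALPHA, L)
t_est = time_since_death(t_obs=20.0 + ratio * (37.0 - 20.0),
                         t_amb=20.0, t0=37.0, alpha=ALPHA, thickness=L)
```

The sensitivity findings in the abstract show up directly here: `t0` and `t_amb` enter through the target ratio, so small errors in either shift the recovered time substantially.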
DEFF Research Database (Denmark)
Hisdal, H.; Holmqvist, E.; Hyvärinen, V.;
Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...
Nonlinear Fluctuation Behavior of Financial Time Series Model by Statistical Physics System
Directory of Open Access Journals (Sweden)
Wuyang Cheng
2014-01-01
We develop a random financial time series model of the stock market using a statistical physics system, the stochastic contact interacting system. The contact process is a continuous-time Markov process; one interpretation of this model is as a model for the spread of an infection, where the epidemic spreading mimics the interplay of local infections and recovery of individuals. From this financial model, we study the statistical behaviors of return time series, and the corresponding behaviors of returns for the Shanghai Stock Exchange Composite Index (SSECI) and the Hang Seng Index (HSI) are comparatively studied. Further, we investigate the Zipf distribution and the multifractal phenomenon of returns and price changes. Zipf analysis and MF-DFA analysis are applied to investigate the nature of the fluctuations of the stock market.
Long memory of financial time series and hidden Markov models with time-varying parameters
DEFF Research Database (Denmark)
Nystrup, Peter; Madsen, Henrik; Lindström, Erik
facts have not been thoroughly examined. This paper presents an adaptive estimation approach that allows for the parameters of the estimated models to be time-varying. It is shown that a two-state Gaussian hidden Markov model with time-varying parameters is able to reproduce the long memory of squared daily returns that was previously believed to be the most difficult fact to reproduce with a hidden Markov model. Capturing the time-varying behavior of the parameters also leads to improved one-step predictions.
Time-series analysis with a hybrid Box-Jenkins ARIMA and neural network model
Institute of Scientific and Technical Information of China (English)
Dilli R Aryal; WANG Yao-wu(王要武)
2004-01-01
Time series analysis is important to a wide range of disciplines transcending both the physical and social sciences, supporting proactive policy decisions. Statistical models have a sound theoretical basis and have been successfully used in a number of problem domains in time series forecasting. Due to its power and flexibility, the Box-Jenkins ARIMA model has gained enormous popularity in many areas and in research practice over the last three decades. More recently, neural networks have been shown to be a promising alternative tool for modeling and forecasting owing to their ability to capture nonlinearity in the data. However, despite the popularity and the strengths of ARIMA and ANN models, empirical forecasting performance has been rather mixed, so that no single method is best in every situation. In this study, a hybrid ARIMA and neural network model for time series forecasting is proposed. The basic idea behind the model combination is to use each model's unique features to capture different patterns in the data. With three real data sets, empirical results show that the hybrid model noticeably outperforms the ARIMA and ANN models used in isolation in terms of forecasting accuracy.
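The hybrid idea, a linear model capturing the linear structure and a neural network then fitted to its residuals, can be sketched with an AR(1) in place of a full ARIMA and a tiny one-hidden-layer network trained by gradient descent. All sizes, rates, and the synthetic data are illustrative assumptions, not the paper's setup:

```python
import math
import random

def fit_ar1(x):
    """Least-squares AR(1) coefficient."""
    den = sum(v * v for v in x[:-1])
    return sum(x[t] * x[t - 1] for t in range(1, len(x))) / den if den else 0.0

def train_residual_net(pairs, hidden=3, lr=0.1, epochs=200, seed=0):
    """One-input tanh network fitted to (previous residual, residual) pairs;
    returns a predictor and the per-epoch training-loss history."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    losses = []
    for _ in range(epochs):
        loss = 0.0
        for u, target in pairs:
            hvals = [math.tanh(w1[j] * u + b1[j]) for j in range(hidden)]
            out = sum(w2[j] * hvals[j] for j in range(hidden)) + b2
            err = out - target
            loss += err * err
            for j in range(hidden):  # backpropagate the squared error
                grad_h = err * w2[j] * (1.0 - hvals[j] ** 2)
                w2[j] -= lr * err * hvals[j]
                w1[j] -= lr * grad_h * u
                b1[j] -= lr * grad_h
            b2 -= lr * err
        losses.append(loss / len(pairs))

    def predict(u):
        return sum(w2[j] * math.tanh(w1[j] * u + b1[j]) for j in range(hidden)) + b2
    return predict, losses

# AR(1) series with a small deterministic nonlinear disturbance
y = [1.0]
for t in range(1, 300):
    y.append(0.6 * y[-1] + 0.05 * math.sin(3.0 * y[-1]))
phi = fit_ar1(y)
resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
pairs = [(resid[t - 1], resid[t]) for t in range(1, len(resid))]
net, losses = train_residual_net(pairs)
hybrid_next = phi * y[-1] + net(resid[-1])  # linear part + nonlinear correction
```

The division of labour is the point: the linear fit absorbs what it can, and the network only has to model what the linear model leaves behind.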
Point Processes Modeling of Time Series Exhibiting Power-Law Statistics
Kaulakys, B; Gontis, V
2010-01-01
We consider stochastic point processes generating time series that exhibit power laws of spectrum and distribution density (Phys. Rev. E 71, 051105 (2005)) and apply them to modeling the trading activity in financial markets and the frequencies of word occurrences in language.
ShapeSelectForest: a new R package for modeling Landsat time series
Mary Meyer; Xiyue Liao; Gretchen Moisen; Elizabeth. Freeman
2015-01-01
We present a new R package called ShapeSelectForest, recently posted to the Comprehensive R Archive Network. The package was developed to fit nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral...
Time series modeling of daily abandoned calls in a call centre ...
African Journals Online (AJOL)
Time series modeling of daily abandoned calls in a call centre. ... were shown to be both parsimonious and adequate using the P-P plots, Q-Q plots and residual analysis. ... The data for the application were obtained from a GSM telephone provider.
Molenaar, P.C.M.
1987-01-01
Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic a
Particle Markov Chain Monte Carlo Techniques of Unobserved Component Time Series Models Using Ox
DEFF Research Database (Denmark)
Nonejad, Nima
This paper details particle Markov chain Monte Carlo (PMCMC) techniques for the analysis of unobserved component time series models using several economic data sets. PMCMC combines the particle filter with the Metropolis-Hastings algorithm. Overall, PMCMC provides a very compelling, computationally fast...
Commandeur, J.J.F. Wesemann, P. Bijleveld, F.D. Chhoun, V. & Sann, S.
2017-01-01
The authors present the methodology used for estimating forecasts for the number of road traffic fatalities in 2011-2020 in Cambodia based on observed developments in Cambodian road traffic fatalities and motor vehicle ownership in the years 1995-2009. Using the latent risk time series model
Watanabe, Hayafumi; Sano, Yukie; Takayasu, Hideki; Takayasu, Misako
2016-11-01
To elucidate the nontrivial empirical statistical properties of fluctuations of a typical nonsteady time series representing the appearance of words in blogs, we investigated approximately 3 × 10^9 Japanese blog articles over a period of six years and analyzed some corresponding mathematical models. First, we introduce a solvable nonsteady extension of the random diffusion model, which can be deduced by modeling the behavior of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as the functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuation scalings of 1771 basic adjectives.
Time-series models on somatic cell score improve detection of mastitis
DEFF Research Database (Denmark)
Norberg, E; Korsgaard, I R; Sloth, K H M N
2008-01-01
In-line detection of mastitis using frequent milk sampling was studied in 241 cows in a Danish research herd. Somatic cell scores obtained on a daily basis were analyzed using a mixture of four time-series models. Probabilities were assigned to each model for the observations to belong to a normal "steady-state" development, a change in "level", a change of "slope", or an "outlier". Mastitis was indicated from the sum of probabilities for the "level" and "slope" models. Time-series models were based on the Kalman filter. Reference data were obtained from veterinary assessment of health status combined with bacteriological findings. At a sensitivity of 90% the corresponding specificity was 68%, which increased to 83% using a one-step back smoothing. It is concluded that mixture models based on Kalman filters are efficient in handling in-line sensor data for detection of mastitis and may be useful for similar...
Generation of future high-resolution rainfall time series with a disaggregation model
Müller, Hannes; Haberlandt, Uwe
2017-04-01
High-resolution rainfall data are needed in many fields of hydrology and water resources management. For analyses of future rainfall conditions, climate scenarios with hourly rainfall values exist. However, the direct use of these data is associated with uncertainties, as comparisons of observations with C20 control runs indicate. An alternative is to derive changes in rainfall behavior over time from climate simulations; conclusions about future rainfall conditions can then be drawn by adding these changes to observed time series. A multiplicative cascade model is used in this investigation for the disaggregation of daily rainfall amounts to hourly values. Model parameters are estimated from REMO rainfall time series (UBA, BfG and ENS realizations), based on ECHAM5. Parameter estimation is carried out for the C20 period as well as for the near-term and long-term future (2021-2050 and 2071-2100). Change factors for both future periods are derived by parameter comparison and added to the parameters estimated from observed time series. This enables the generation of hourly rainfall time series from observed daily values with respect to future changes. The investigation is carried out for rain gauges in Lower Saxony. The generated time series are analyzed regarding statistical characteristics, including extreme values, event-based characteristics (wet spell duration and amounts, dry spell duration, ...) and continuum characteristics (average intensity, fraction of dry intervals, ...). The generation of the time series is validated by comparing the changes in the statistical characteristics from the REMO data and from the disaggregated data.
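The multiplicative cascade idea, splitting a daily total into finer intervals with random weights while conserving mass, can be sketched as follows. This is a generic dyadic cascade for illustration only; the branching probabilities and the beta weight distribution are hypothetical, not the parameters estimated from the REMO runs in the paper.

```python
import numpy as np

def cascade_disaggregate(total, levels=3, p01=0.2, seed=0):
    """Split `total` into 2**levels intervals with a random multiplicative
    cascade. At each branching a fraction w goes left and 1-w right; with
    probability p01 all mass goes to one side (rainfall intermittency)."""
    rng = np.random.default_rng(seed)
    values = np.array([total])
    for _ in range(levels):
        out = []
        for v in values:
            u = rng.random()
            if u < p01 / 2:
                w = 0.0                  # all mass to the right child
            elif u < p01:
                w = 1.0                  # all mass to the left child
            else:
                w = rng.beta(2, 2)       # smooth split
            out.extend([v * w, v * (1 - w)])
        values = np.array(out)
    return values

hourlyish = cascade_disaggregate(24.0, levels=3)   # 8 sub-daily amounts
```

By construction every split conserves the parent amount, so the disaggregated values always sum back to the daily total.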
Nonlinear Behaviors of Tail Dependence and Cross-Correlation of Financial Time Series Model
Directory of Open Access Journals (Sweden)
Wei Deng
2014-01-01
Nonlinear behaviors of tail dependence and cross-correlation of financial time series are reproduced and investigated using a stochastic voter dynamic system. The voter process is a continuous-time Markov process and one of the interacting dynamic systems. The tail dependence of return time series for pairs of Chinese stock markets and the proposed financial models is studied by copula analysis, in an attempt to detect and illustrate the existence of relevant correlation relationships. Further, the multifractality of cross-correlations for return series is studied by multifractal detrended cross-correlation analysis, which indicates analogous cross-correlations and some fractal characteristics for both actual and simulated data and provides intuitive evidence for market inefficiency.
Stylised facts of financial time series and hidden Markov models in continuous time
DEFF Research Database (Denmark)
Nystrup, Peter; Madsen, Henrik; Lindström, Erik
2015-01-01
Hidden Markov models are often applied in quantitative finance to capture the stylised facts of financial returns. They are usually discrete-time models and the number of states rarely exceeds two because of the quadratic increase in the number of parameters with the number of states. This paper...
Time series models of environmental exposures: Good predictions or good understanding.
Barnett, Adrian G; Stephen, Dimity; Huang, Cunrui; Wolkewitz, Martin
2017-04-01
Time series data are popular in environmental epidemiology as they make use of the natural experiment of how changes in exposure over time might impact on disease. Many published time series papers have used parameter-heavy models that fully explained the second order patterns in disease to give residuals that have no short-term autocorrelation or seasonality. This is often achieved by including predictors of past disease counts (autoregression) or seasonal splines with many degrees of freedom. These approaches give great residuals, but add little to our understanding of cause and effect. We argue that modelling approaches should rely more on good epidemiology and less on statistical tests. This includes thinking about causal pathways, making potential confounders explicit, fitting a limited number of models, and not over-fitting at the cost of under-estimating the true association between exposure and disease. Copyright © 2017 Elsevier Inc. All rights reserved.
Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.
Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R
2009-08-01
This study performed a time-series analysis, frequency distribution and prediction of SO2 levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter and the least pollution in spring-summer. The frequency distributions show higher peaks at two residential sites. The potential for SO2 problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model; an ARMA model was therefore used for forecasting SO2 at the Bahman station. The partial autocorrelations become close to 0 after about 5 lags, while the autocorrelations remain strong through all the lags shown. The results showed that an ARMA(2,2) model can provide reliable, satisfactory predictions for the time series.
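The ACF diagnostics used for model identification here are easy to reproduce. A minimal NumPy sketch of the sample autocorrelation function, demonstrated on a synthetic AR(1) series rather than the Tehran SO2 data:

```python
import numpy as np

def acf(x, nlags):
    """Sample autocorrelation function, lags 0..nlags."""
    x = np.asarray(x, float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, nlags + 1)])

# AR(1) demo: the theoretical autocorrelation is rho(k) = 0.7**k
rng = np.random.default_rng(0)
y = np.zeros(4000)
for t in range(1, 4000):
    y[t] = 0.7 * y[t - 1] + rng.normal()
r = acf(y, 10)
```

A slow geometric decay like this, combined with a PACF that cuts off, is the classic signature used to pick AR, MA or mixed ARMA orders.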
Comparison of time series models for predicting campylobacteriosis risk in New Zealand.
Al-Sakkaf, A; Jones, G
2014-05-01
Predicting campylobacteriosis cases is a matter of considerable concern in New Zealand, where the number of notified cases in 2006 was the highest among developed countries. There is thus a need for a model or tool that accurately predicts the number of campylobacteriosis cases, as the Microbial Risk Assessment Model previously used for this purpose failed to predict the actual case numbers accurately. We explore the appropriateness of classical time series modelling approaches for predicting campylobacteriosis. Finding the most appropriate time series model for New Zealand data involves additional practical considerations given a possible structural change, that is, a specific and sudden change in response to the implemented interventions. A univariate methodological approach was used to predict monthly disease cases using New Zealand surveillance data of campylobacteriosis incidence from 1998 to 2009. The data from 1998 to 2008 were used to model the time series, with the year 2009 held out of the data set for model validation. The best two models were then fitted to the full 1998-2009 data and used to predict each month of 2010. The Holt-Winters (multiplicative) and ARIMA (additive) intervention models were considered the best models for predicting campylobacteriosis in New Zealand. The prediction of the annual total for 2010 by the additive ARIMA model with intervention was slightly better than that of the Holt-Winters multiplicative method, the former predicting only 23 cases fewer than the actual reported cases. It is confirmed that classical time series techniques such as ARIMA with intervention and Holt-Winters can provide good prediction performance for campylobacteriosis risk in New Zealand. The results of this study are useful to the New Zealand Health and Safety Authority's efforts in addressing the campylobacteriosis epidemic. © 2013 Blackwell Verlag GmbH.
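The Holt-Winters multiplicative method named as one of the two best models can be sketched compactly. This is an illustrative NumPy implementation with crude initialisation and arbitrary smoothing constants, not the tuned model used in the study; the demo series is synthetic, not the New Zealand surveillance data.

```python
import numpy as np

def holt_winters_mul(y, m, alpha=0.3, beta=0.05, gamma=0.1, h=12):
    """Multiplicative Holt-Winters; y: series, m: season length, h: horizon."""
    y = np.asarray(y, float)
    # crude initialisation from the first two seasons
    level = y[:m].mean()
    trend = (y[m:2 * m].mean() - y[:m].mean()) / m
    season = list(y[:m] / level)
    for t in range(len(y)):
        s = season[t % m]
        last_level = level
        level = alpha * y[t] / s + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * y[t] / level + (1 - gamma) * s
    return np.array([(level + (k + 1) * trend) * season[(len(y) + k) % m]
                     for k in range(h)])

# synthetic monthly counts: linear trend times a seasonal pattern (mean 1)
s = np.array([1.2, 0.8, 1.0, 1.1, 0.9, 1.0, 1.3, 0.7, 1.0, 1.05, 0.95, 1.0])
t = np.arange(120)
y = (10 + 0.1 * t) * s[t % 12]
fc = holt_winters_mul(y, m=12, h=12)
```

On this noise-free multiplicative series the 12-step forecast should track the true continuation of the trend-times-season structure closely.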
Data on copula modeling of mixed discrete and continuous neural time series
Directory of Open Access Journals (Sweden)
Meng Hu
2016-06-01
Copula is an important tool for modeling neural dependence. Recent work on copulas has been expanded to jointly model mixed time series in neuroscience ("Hu et al., 2016, Joint Analysis of Spikes and Local Field Potentials using Copula" [1]). Here we present further data for joint analysis of spikes and local field potentials (LFP) with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab codes, together with example data.
Structural damage detection using ARMAX time series models and cepstral distances
Indian Academy of Sciences (India)
K LAKSHMI; A RAMA MOHAN RAO
2016-09-01
A novel damage detection algorithm for structural health monitoring using time series models is presented. The proposed algorithm uses output-only acceleration time series obtained from sensors on the structure, which are fitted with auto-regressive moving-average with exogenous inputs (ARMAX) models. The algorithm uses cepstral distances between the ARMAX models of decorrelated data from the healthy and the current condition of the structure as the damage indicator. A numerical model of a simply supported beam, with variations due to temperature and operating conditions along with measurement noise, is used to demonstrate the effectiveness of the proposed damage diagnostic technique using ARMAX time series models and their cepstral distances with novelty indices. The effectiveness of the proposed method is validated using the benchmark data of the 8-DOF system made publicly available by the Engineering Institute of LANL and the simulated vibration data obtained from the FEM model of the IASC-ASCE 12-DOF steel frame. The results of the studies indicate that the proposed algorithm is robust in identifying damage from acceleration data contaminated with noise under varied environmental and operational conditions.
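The mapping from AR coefficients to cepstral coefficients is a standard recursion, so a cepstral-distance indicator of the kind this paper uses can be sketched directly. A minimal sketch, assuming the AR polynomial convention A(z) = 1 + sum a_k z^-k for the filter 1/A(z); the ARMAX fitting step itself is omitted, and the two example models are arbitrary.

```python
import numpy as np

def lpc_to_cepstrum(a, ncep):
    """Cepstral coefficients of the minimum-phase AR filter 1/A(z),
    via the standard LPC-to-cepstrum recursion."""
    c = np.zeros(ncep)
    for n in range(1, ncep + 1):
        acc = -(a[n - 1] if n <= len(a) else 0.0)
        for m in range(1, n):
            if n - m <= len(a):
                acc -= (m / n) * c[m - 1] * a[n - m - 1]
        c[n - 1] = acc
    return c

def cepstral_distance(a1, a2, ncep=20):
    """Euclidean distance between the cepstra of two AR models."""
    c1, c2 = lpc_to_cepstrum(a1, ncep), lpc_to_cepstrum(a2, ncep)
    return float(np.sqrt(np.sum((c1 - c2) ** 2)))

c = lpc_to_cepstrum([0.5], 3)   # matches -log(1 + 0.5 z^-1) term by term
```

Identical models have distance zero; a damage-induced shift in the fitted coefficients shows up as a positive distance, which is what the novelty index thresholds.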
Zakynthinaki, M. S.; Stirling, J. R.
2007-01-01
Stochastic optimization is applied to the problem of optimizing the fit of a model to the time series of raw physiological (heart rate) data. The physiological response to exercise has been recently modeled as a dynamical system. Fitting the model to a set of raw physiological time series data is, however, not a trivial task. For this reason and in order to calculate the optimal values of the parameters of the model, the present study implements the powerful stochastic optimization method ALOPEX IV, an algorithm that has been proven to be fast, effective and easy to implement. The optimal parameters of the model, calculated by the optimization method for the particular athlete, are very important as they characterize the athlete's current condition. The present study applies the ALOPEX IV stochastic optimization to the modeling of a set of heart rate time series data corresponding to different exercises of constant intensity. An analysis of the optimization algorithm, together with an analytic proof of its convergence (in the absence of noise), is also presented.
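The ALOPEX principle, nudging each parameter in the direction correlated with past cost changes plus exploratory noise, can be illustrated on a toy quadratic. This is a simplified correlative variant, not the ALOPEX IV implementation of the paper; step sizes and the annealing schedule are assumptions.

```python
import numpy as np

def alopex_minimize(f, w0, step=0.05, noise=0.02, iters=2000, seed=0):
    """ALOPEX-style correlative stochastic minimisation (illustrative variant):
    keep moving a parameter the same way while the cost falls, reverse it
    when the cost rises, and always add a little noise."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, float)
    dw = rng.normal(0, step, w.shape)
    e_prev = f(w)
    best_w, best_e = w.copy(), e_prev
    for _ in range(iters):
        w = w + dw
        e = f(w)
        de = e - e_prev
        # correlative rule: sign(dw * de) > 0 means the last move hurt
        dw = -step * np.sign(dw * de) + rng.normal(0, noise, w.shape)
        e_prev = e
        if e < best_e:
            best_w, best_e = w.copy(), e
    return best_w, best_e

best_w, best_e = alopex_minimize(lambda w: np.sum((w - 3.0) ** 2), [0.0, 0.0])
```

Because the rule only needs cost evaluations, not gradients, it applies directly to fitting a dynamical heart-rate model to noisy data, which is why the paper favours it.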
Directory of Open Access Journals (Sweden)
Trottier Helen
2006-08-01
The goal of this paper is to analyze the stochastic dynamics of childhood infectious disease time series. We present a univariate time series analysis of pertussis, mumps, measles and rubella based on Box-Jenkins or AutoRegressive Integrated Moving Average (ARIMA) modeling. The method, which enables the dependency structure embedded in time series data to be modeled, has potential research applications in studies of infectious disease dynamics. Canadian chronological series of pertussis, mumps, measles and rubella, before and after mass vaccination, are analyzed to characterize the statistical structure of these diseases. Despite the fact that these infectious diseases are biologically different, it is found that they are all represented by simple models with the same basic statistical structure. Aside from seasonal effects, the number of new cases is given by the incidence in the previous period and by periodically recurrent random factors. It is also shown that mass vaccination does not change this stochastic dependency. We conclude that the Box-Jenkins methodology does identify the collective pattern of the dynamics, but not the specifics of the diseases at the biological individual level.
Neural modeling for time series: A statistical stepwise method for weight elimination.
Cottrell, M; Girard, B; Girard, Y; Mangeas, M; Muller, C
1995-01-01
Many authors use feedforward neural networks for modeling and forecasting time series. Most of these applications are mainly experimental, and it is often difficult to extract a general methodology from the published studies. In particular, the choice of architecture is a tricky problem. We try to combine the statistical techniques of linear and nonlinear time series with the connectionist approach. The asymptotic properties of the estimators lead us to propose a systematic methodology to determine which weights are nonsignificant and to eliminate them to simplify the architecture. This method (SSM, or statistical stepwise method) is compared to other pruning techniques and is applied to some artificial series, to the famous Sunspots benchmark, and to daily electrical consumption data.
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2017-02-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture (for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments) as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (STL). The data series (daily Poaceae pollen concentrations over the period 2006-2014) was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. The correlation coefficient between predicted and observed values was r = 0.79 for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than analysis of the original non-decomposed data series, and for this reason the procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
Time series models to simulate and forecast hourly averaged wind speeds in Quetta, Pakistan
Energy Technology Data Exchange (ETDEWEB)
Lalarukh Kamal [Balochistan University, Quetta (Pakistan). Dept. of Mathematics; Yasmin Zahra Jafri [Balochistan University, Quetta (Pakistan). Dept. of Statistics
1997-07-01
Stochastic simulation and forecast models of hourly average wind speeds are presented. The time series models take into account several basic features of wind speed data, including autocorrelation, a non-Gaussian distribution and diurnal nonstationarity. The positive correlation between consecutive wind speed observations is accounted for by fitting an ARMA(p,q) process to wind speed data transformed to make their distribution approximately Gaussian and standardized to remove scattering of the transformed data. Diurnal variations have been taken into account to study the forecasts and their dependence on lead time. We find the ARMA(p,q) model suitable for prediction intervals and probability forecasts. (author)
Directory of Open Access Journals (Sweden)
Guy J. Abel
2013-12-01
Background: Population forecasts are widely used for public policy purposes. Methods to quantify the uncertainty in forecasts tend to ignore model uncertainty and to be based on a single model. Objective: In this paper, we use Bayesian time series models to obtain future population estimates with associated measures of uncertainty. The models are compared based on Bayesian posterior model probabilities, which are then used to provide model-averaged forecasts. Methods: The focus is on a simple projection model with the historical data representing population change in England and Wales from 1841 to 2007. Bayesian forecasts to the year 2032 are obtained based on a range of models, including autoregression models, stochastic volatility models and random variance shift models. The computational steps to fit each of these models using the OpenBUGS software via R are illustrated. Results: We show that the Bayesian approach is adept at capturing multiple sources of uncertainty in population projections, including model uncertainty. The inclusion of non-constant variance improves the fit of the models and provides more realistic predictive uncertainty levels. The forecasting methodology is assessed through fitting the models to various truncated data series.
Ahalpara, Dilip P.; Panigrahi, Prasanta K.; Parikh, Jitendra C.; Verma, Amit
2006-01-01
A method based on wavelet transform and genetic programming is proposed for characterizing and modeling variations at multiple scales in non-stationary time series. The cyclic variations, extracted by wavelets and smoothened by cubic splines, are well captured by genetic programming in the form of dynamical equations. For the purpose of illustration, we analyze two different non-stationary financial time series, S&P CNX Nifty closing index of the National Stock Exchange (India) and Dow Jones industrial average closing values through Haar, Daubechies-4 and continuous Morlet wavelets for studying the character of fluctuations at different scales, before modeling the cyclic behavior through GP. Cyclic variations emerge at intermediate time scales and the corresponding dynamical equations reveal characteristic behavior at different scales.
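The scale separation step described here (Haar among the wavelets considered) can be sketched with the orthonormal Haar transform, which splits a series into smooth and detail parts while preserving energy. Illustrative NumPy code only; the cubic-spline smoothing and the genetic-programming modeling stage of the paper are omitted.

```python
import numpy as np

def haar_decompose(x, levels):
    """One-dimensional orthonormal Haar wavelet decomposition.
    Returns the final approximation and the detail coefficients per level."""
    approx = np.asarray(x, float)
    details = []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)   # smooth / trend
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)   # detail / fluctuation
        details.append(d)
        approx = a
    return approx, details

x = np.array([4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 7.0, 1.0])
approx, details = haar_decompose(x, 3)
```

Because the transform is orthonormal, the total energy of the signal is exactly split across the approximation and the detail scales, which is what makes per-scale analysis of fluctuations meaningful.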
Li, Qiongge; Chan, Maria F
2017-01-01
Over half of cancer patients receive radiotherapy (RT) as partial or full cancer treatment. Daily quality assurance (QA) of RT in cancer treatment closely monitors the performance of the medical linear accelerator (Linac) and is critical for continuous improvement of patient safety and quality of care. Cumulative longitudinal QA measurements are valuable for understanding the behavior of the Linac and allow physicists to identify trends in the output and take preventive actions. In this study, artificial neural networks (ANNs) and autoregressive moving average (ARMA) time-series prediction modeling techniques were both applied to 5-year daily Linac QA data. Verification tests and other evaluations were then performed for all models. Preliminary results showed that ANN time-series predictive modeling has more advantages over ARMA techniques for accurate and effective applicability in the dosimetry and QA field.
Multi-factor high-order intuitionistic fuzzy time series forecasting model
Institute of Scientific and Technical Information of China (English)
Yanan Wang; Yingjie Lei; Yang Lei; Xiaoshi Fan
2016-01-01
Fuzzy sets theory cannot describe the neutrality degree of data, which has largely limited the objectivity of fuzzy time series in uncertain data forecasting. In this regard, a multi-factor high-order intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to get unequal intervals, and a more objective technique for ascertaining membership and non-membership functions of the intuitionistic fuzzy set is proposed. On these bases, forecast rules based on multidimensional intuitionistic fuzzy modus ponens inference are established. Finally, contrast experiments on the daily mean temperature of Beijing are carried out, which show that the novel model has a clear advantage in improving the forecast accuracy.
Intuitionistic Fuzzy Time Series Forecasting Model Based on Intuitionistic Fuzzy Reasoning
Directory of Open Access Journals (Sweden)
Ya’nan Wang
2016-01-01
Fuzzy sets theory cannot describe the data comprehensively, which has greatly limited the objectivity of fuzzy time series in uncertain data forecasting. In this regard, an intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to divide the universe of discourse into unequal intervals, and a more objective technique for ascertaining the membership function and non-membership function of the intuitionistic fuzzy set is proposed. On these bases, forecast rules based on intuitionistic fuzzy approximate reasoning are established. Finally, contrast experiments on the enrollments of the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index are carried out. The results show that the new model has a clear advantage in improving the forecast accuracy.
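The interval-partition-and-relationship mechanics underlying fuzzy time series models can be illustrated with a plain first-order scheme in the spirit of Chen's classic method (equal-width intervals, crisp midpoint defuzzification). This is far simpler than the intuitionistic model of the paper and uses equal rather than unequal intervals; the demo series mimics the enrollment-style data but is invented.

```python
import numpy as np

def fts_forecast(y, n_intervals=7):
    """First-order fuzzy time series: fuzzify values into intervals, group
    the fuzzy logical relationships A_i -> {A_j}, and forecast each step
    with the mean of the midpoints of the successor intervals."""
    y = np.asarray(y, float)
    edges = np.linspace(y.min(), y.max(), n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    labels = np.clip(np.searchsorted(edges, y, side='right') - 1,
                     0, n_intervals - 1)
    groups = {}
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(a, set()).add(b)      # relationship group for A_a
    return np.array([mids[sorted(groups.get(a, {a}))].mean()
                     for a in labels[:-1]])

y = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807,
     16919, 16388, 15433, 15497, 15145, 15163, 15984, 16859, 18150]
fitted = fts_forecast(y)
```

The intuitionistic extensions in these two papers replace the crisp interval membership with membership/non-membership pairs and the grouping with fuzzy reasoning, but the pipeline shape is the same.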
Woodward, Wayne A; Elliott, Alan C
2011-01-01
"There is scarcely a standard technique that the reader will find left out ... this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes." -International Statistical Review (2014), 82. "Current time series theory for practice is well summarized in this book." -Emmanuel Parzen, Texas A&M University. "What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table." -David Findley, U.S. Census Bureau (retired). ...
Time series prediction in agroecosystems
Cortina-Januchs, M. G.; Quintanilla-Dominguez, J.; Vega-Corona, A.; Andina, D.
2012-04-01
This work proposes a novel model to predict time series such as frost, precipitation, temperature and solar radiation, all of them important variables for the agricultural process. In the proposed model, Artificial Neural Networks (ANN) are combined with clustering algorithms and sensor data fusion. The real time series are obtained from different sensors. The clustering algorithms find relationships between variables; clustering divides the data sets, assigning the same label to members that belong to the same group, so that each group is homogeneous and distinct from the others. Those relationships provide information to the ANN in order to obtain the time series prediction. The most important issue for ANN in time series prediction is generalization, which refers to their ability to produce reasonable predictions on data sets other than those used for the estimation of the model parameters.
High-Order Fuzzy Time Series Model Based on Generalized Fuzzy Logical Relationship
Directory of Open Access Journals (Sweden)
Wangren Qiu
2013-01-01
Among techniques for constructing high-order fuzzy time series models, there are three kinds of methods, based respectively on advanced algorithms, computational methods, and grouping of the fuzzy logical relationships. The last kind of model has been widely applied and researched because it is easily understood by decision makers. To improve the fuzzy time series forecasting model, this paper presents a novel high-order fuzzy time series model, denoted GTS(M, N), on the basis of generalized fuzzy logical relationships. Firstly, the paper introduces some concepts of the generalized fuzzy logical relationship and an operation for combining the generalized relationships. Then, the proposed model is implemented in forecasting enrollments of the University of Alabama. As an example of in-depth research, the proposed approach is also applied to forecast the close price of the Shanghai Stock Exchange Composite Index. Finally, the effects of the number of orders and hierarchies of fuzzy logical relationships on the forecasting results are discussed.
Application of Time-Series Model to Predict Groundwater Dynamic in Sanjiang Plain,Northeast China
Institute of Scientific and Technical Information of China (English)
LUAN Zhaoqing; LIU Guihua; YAN Baixing
2011-01-01
To study the groundwater dynamic in a typical region of the Sanjiang Plain, long-term groundwater level observation data from the Honghe State Farm were collected and analyzed in this paper, and the seasonal and long-term groundwater dynamics were explored. From 1996 to 2008, the groundwater level kept declining due to intensive exploitation of groundwater resources for rice irrigation; a decline of nearly 5 m was found for almost all the monitoring wells. A time-series method was established to model the groundwater dynamic. Results from the time-series model showed that the groundwater level in this region will keep declining at the current exploitation intensity, with a total drawdown of 1.07 m from 2009 to 2012. The time-series model can be used to model and forecast the groundwater dynamic with high accuracy. Measures including control of the groundwater exploitation amount and application of water-saving irrigation techniques should be taken to prevent the continued decline of groundwater in the Sanjiang Plain.
A Bayesian Approach for Summarizing and Modeling Time-Series Exposure Data with Left Censoring.
Houseman, E Andres; Virji, M Abbas
2017-08-01
Direct reading instruments are valuable tools for measuring exposure as they provide real-time measurements for rapid decision making. However, their use is limited to general survey applications, in part due to issues related to their performance. Moreover, statistical analysis of real-time data is complicated by autocorrelation among successive measurements, non-stationary time series, and the presence of left-censoring due to the limit of detection (LOD). A Bayesian framework is proposed that accounts for non-stationary autocorrelation and LOD issues in exposure time-series data in order to model workplace factors that affect exposure and estimate summary statistics for tasks or other covariates of interest. A spline-based approach is used to model non-stationary autocorrelation with relatively few assumptions about the autocorrelation structure. Left-censoring is addressed by integrating over the left tail of the distribution. The model is fit using Markov chain Monte Carlo within a Bayesian paradigm. The method can flexibly account for hierarchical relationships, random effects and fixed effects of covariates. The method is implemented using the rjags package in R, and is illustrated by applying it to real-time exposure data. Estimates for task means and covariates from the Bayesian model are compared to those from conventional frequentist models, including linear regression, mixed-effects, and time-series models with different autocorrelation structures. Simulation studies are also conducted to evaluate method performance. Simulation studies with the percentage of measurements below the LOD ranging from 0 to 50% showed the lowest root mean squared errors for task means and the least biased standard deviations from the Bayesian model compared to the frequentist models across all levels of LOD. In the application, task means from the Bayesian model were similar to means from the frequentist models, while the standard deviations were different. Parameter estimates for covariates
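The LOD handling described here, integrating over the left tail below the limit of detection, can be illustrated with a censored-normal likelihood. This is a deliberately crude grid-search MLE for a single mean and SD, meant only to show the censoring term; the paper itself fits a spline-based Bayesian model by MCMC, and all numbers below are invented.

```python
import math
import numpy as np

def censored_normal_loglik(mu, sigma, x, lod):
    """Normal log-likelihood where values below `lod` contribute only
    log P(X < lod), i.e. the left tail is integrated over."""
    x = np.asarray(x, float)
    cens = x < lod
    z = (lod - mu) / sigma
    ll_cens = cens.sum() * math.log(0.5 * (1 + math.erf(z / math.sqrt(2))) + 1e-300)
    xo = x[~cens]
    ll_obs = (-0.5 * len(xo) * math.log(2 * math.pi * sigma ** 2)
              - np.sum((xo - mu) ** 2) / (2 * sigma ** 2))
    return ll_cens + ll_obs

def fit_censored(x, lod):
    """Crude grid-search MLE over (mu, sigma); illustration only."""
    grid = [(censored_normal_loglik(m, s, x, lod), m, s)
            for m in np.linspace(0.0, 2.0, 81)
            for s in np.linspace(0.5, 2.0, 31)]
    _, mu_hat, sigma_hat = max(grid)
    return mu_hat, sigma_hat

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, 500)               # true mu = 1, sigma = 1
mu_hat, sigma_hat = fit_censored(x, lod=0.5)  # roughly 31% fall below the LOD
```

Unlike the common substitution tricks (LOD/2, LOD/sqrt(2)), the integrated likelihood recovers the mean and SD without systematic bias, which is the property the paper's comparisons exploit.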
Indian Academy of Sciences (India)
Dilip P Ahalpara; Amit Verma; Jiterndra C Parikh; Prasanta K Panigrahi
2008-09-01
A method based on the wavelet transform is developed to characterize variations at multiple scales in non-stationary time series. We consider two different financial time series, the S&P CNX Nifty closing index of the National Stock Exchange (India) and the Dow Jones industrial average closing values. These time series are chosen since they are known to comprise stochastic fluctuations as well as cyclic variations at different scales. The wavelet transform isolates cyclic variations at higher scales when random fluctuations are averaged out; this corroborates the correlated behaviour observed earlier in financial time series through random matrix studies. Analysis is carried out with Haar, Daubechies-4 and continuous Morlet wavelets to study the character of fluctuations at different scales, and shows that cyclic variations emerge at intermediate time scales. It is found that the Daubechies family of wavelets can be effectively used to capture cyclic variations since these wavelets are local in nature. To get an insight into the occurrence of cyclic variations, we then proceed to model the wavelet coefficients using a genetic programming (GP) approach and the standard embedding technique in the reconstructed phase space. It is found that the standard methods (GP as well as artificial neural networks) fail to model these variations because of poor convergence. A novel interpolation approach is developed that overcomes this difficulty. The dynamical model equations have, primarily, linear terms with additive Padé-type terms. It is seen that the emergence of cyclic variations is due to an interplay of a few important terms in the model. Very interestingly, the GP model captures smooth variations as well as bursty behaviour quite nicely.
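The scale-wise separation that the Haar analysis performs can be illustrated with a bare-bones multilevel Haar transform. This is a numpy-only sketch, not the authors' code, and it assumes the series length is divisible by 2**levels:

```python
import numpy as np

def haar_dwt(x, levels):
    """Multilevel Haar wavelet transform: each level splits the current
    approximation into scaled pairwise means (coarser approximation) and
    scaled pairwise differences (detail coefficients at that scale)."""
    approx = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        details.append((even - odd) / np.sqrt(2.0))
        approx = (even + odd) / np.sqrt(2.0)
    return approx, details
```

Averaging out the small-scale detail coefficients and keeping the higher-level ones is the operation that isolates the cyclic variations discussed above.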
A Landsat Time-Series Stacks Model for Detection of Cropland Change
Chen, J.; Chen, J.; Zhang, J.
2017-09-01
Global, timely, accurate and cost-effective cropland monitoring at a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore more likely to generate pseudo changes. Here, we introduced and tested the LTSM (Landsat time-series stacks model), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo changes caused by phenology driven by seasonal patterns. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the "true changes" without overestimating the "false" ones, while CVA flagged "true change" pixels together with a large number of "false changes". The detection of change areas achieved an overall accuracy of 92.37%, with a kappa coefficient of 0.676.
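The per-pixel, per-band harmonic regression at the heart of the LTSM/CCDC idea can be sketched as an ordinary least-squares fit. This is an illustrative numpy version with a hypothetical `fit_harmonic` helper; the actual LTSM estimation is iterative and runs per spectral band:

```python
import numpy as np

def fit_harmonic(t, y, period=365.25, n_harmonics=2):
    """Fit y(t) ~ c0 + sum_k [a_k cos(2*pi*k*t/T) + b_k sin(2*pi*k*t/T)]
    by least squares; change is then flagged by differencing observed
    values against these harmonic predictions."""
    t = np.asarray(t, dtype=float)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / period
        cols += [np.cos(w), np.sin(w)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef, X @ coef
```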
Neighbourhood selection for local modelling and prediction of hydrological time series
Jayawardena, A. W.; Li, W. K.; Xu, P.
2002-02-01
The prediction of a time series using the dynamical systems approach requires the knowledge of three parameters: the time delay, the embedding dimension and the number of nearest neighbours. In this paper, a new criterion, based on the generalized degrees of freedom, for selecting the number of nearest neighbours needed for a better local model for time series prediction is presented. The validity of the proposed method is examined using time series that are known to be chaotic under certain initial conditions (Lorenz map, Henon map and Logistic map), and real hydro-meteorological time series (discharge data from the Chao Phraya river in Thailand, the Mekong river in Thailand and Laos, and sea surface temperature anomaly data). The predicted results are compared with observations, and with similar predictions obtained by using arbitrarily fixed numbers of neighbours. The results indicate superior predictive capability, as measured by mean square errors and coefficients of variation, for the proposed approach compared with the traditional approach of using a fixed number of neighbours.
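A scalar toy version of nearest-neighbour local prediction — the setting in which the number of neighbours k must be chosen — might look like the following. This is a one-dimensional sketch, not the paper's generalized-degrees-of-freedom criterion:

```python
import numpy as np

def knn_predict(series, k, query):
    """One-step local prediction: average the successors of the k
    historical values closest to the query state."""
    series = np.asarray(series, dtype=float)
    past, future = series[:-1], series[1:]
    idx = np.argsort(np.abs(past - query))[:k]
    return float(future[idx].mean())

# chaotic test signal: the logistic map x_{n+1} = 4 x_n (1 - x_n)
x = [0.3]
for _ in range(500):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
```

The quality of the prediction depends on k: too few neighbours gives a noisy local model, too many blurs the local dynamics, which is exactly the trade-off the selection criterion addresses.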
Lohani, A. K.; Kumar, Rakesh; Singh, R. D.
2012-06-01
Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and a neuro-fuzzy system in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR) models, artificial neural networks (ANNs) and an adaptive neural-based fuzzy inference system (ANFIS). To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and ANFIS models, a comparison is made with the autoregressive (AR) models. The ANFIS model trained with an input data vector including previous inflows and cyclic terms of monthly periodicity has shown a significant improvement in forecast accuracy in comparison with the ANFIS models trained with input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflows for the planning and operation of the reservoir.
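The idea of augmenting autoregressive inputs with cyclic terms of the monthly periodicity can be sketched with a plain linear AR model — an illustrative stand-in for the ANN/ANFIS input vectors, with `fit_ar_cyclic` a hypothetical helper:

```python
import numpy as np

def fit_ar_cyclic(y, p=2):
    """Least-squares AR(p) with added sin/cos terms of a 12-month cycle,
    mimicking the cyclic inputs used alongside previous inflows."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y), dtype=float)
    rows, targets = [], []
    for i in range(p, len(y)):
        rows.append([1.0, *y[i - p:i][::-1],
                     np.sin(2 * np.pi * t[i] / 12), np.cos(2 * np.pi * t[i] / 12)])
        targets.append(y[i])
    X = np.array(rows)
    coef, *_ = np.linalg.lstsq(X, np.array(targets), rcond=None)
    return coef, X @ coef
```

For a purely seasonal series the cyclic columns absorb the periodicity entirely, which is why adding them on top of lagged inflows improves forecasts of strongly seasonal rivers.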
A Bayesian Surrogate Model for Rapid Time Series Analysis and Application to Exoplanet Observations
Ford, Eric B; Veras, Dimitri
2011-01-01
We present a Bayesian surrogate model for the analysis of periodic or quasi-periodic time series data. We describe a computationally efficient implementation that enables Bayesian model comparison. We apply this model to simulated and real exoplanet observations. We discuss the results and demonstrate some of the challenges for applying our surrogate model to realistic exoplanet data sets. In particular, we find that analyses of real world data should pay careful attention to the effects of uneven spacing of observations and the choice of prior for the "jitter" parameter.
Modeling and Simulation of Time Series Prediction Based on Dynamic Neural Network
Institute of Scientific and Technical Information of China (English)
王雪松; 程玉虎; 彭光正
2004-01-01
Modeling and simulation of time series prediction based on a dynamic neural network (NN) are studied. A prediction model for non-linear and time-varying systems is proposed based on a dynamic Jordan NN. Aiming at the intrinsic defect of the back-propagation (BP) algorithm that it cannot update network weights incrementally, a hybrid algorithm combining the temporal difference (TD) method with the BP algorithm to train the Jordan NN is put forward. The proposed method is applied to predict the ash content of clean coal in jigging production in real time and over multiple steps. A practical example is also given, and its application results indicate that the method has better performance than others and also offers a beneficial reference for the prediction of nonlinear time series.
Prawirodirdjo, Linette; Ben-Zion, Yehuda; Bock, Yehuda
2006-02-01
We suggest that strain in the elastic part of the Earth's crust induced by surface temperature variations is a significant contributor to the seasonal variations observed in the spatially filtered daily position time series of Southern California Integrated GPS Network (SCIGN) stations. We compute the predicted thermoelastic strain from the observed local atmospheric temperature record assuming an elastically decoupled layer over a uniform elastic half-space and compare the seasonal variations in thermoelastic strain to the horizontal GPS position time series. We consider three regions (Palmdale, 29 Palms, and Idyllwild), each with one temperature station and three to six GPS stations. The temperature time series is used to compute thermoelastic strain at each station on the basis of its relative location in the temperature field. For each region we assume a wavelength for the temperature field that is related to the local topography. The depth of the decoupled layer is inferred from the phase delay between the temperature record and the GPS time series. The relative amplitude of strain variation at each GPS station, calculated to be on the order of 0.1 μstrain, is related to the relative location of that station in the temperature field. The goodness of fit between model and data is evaluated from the relative amplitudes of the seasonal signals, as well as the appropriateness of the chosen temperature field wavelength and decoupled layer depth. The analysis shows a good fit between the predicted strains and the GPS time series. This suggests that the model captures the key first-order ingredients that determine the thermoelastic strain in a given area. The results can be used to improve the signal/noise ratio in GPS data.
Uniting Mandelbrot’s Noah and Joseph Effects in Toy Models of Natural Hazard Time Series
Credgington, D.; Watkins, N. W.; Chapman, S. C.; Rosenberg, S. J.; Sanchez, R.
2009-12-01
The forecasting of extreme events is a highly topical, cross-disciplinary problem. One aspect which is potentially tractable even when the events themselves are stochastic is the probability of a "burst" of a given size and duration, defined as the area between a time series and a constant threshold. Many natural time series depart from the simplest, Brownian, case and in the 1960s Mandelbrot developed the use of fractals to describe these departures. In particular he proposed two kinds of fractal model to capture the way in which natural data is often persistent in time (his "Joseph effect", common in hydrology and exemplified by fractional Brownian motion) and/or prone to heavy-tailed jumps (the "Noah effect", typical of economic index time series, for which he gave Levy flights as an exemplar). Much of the earlier modelling, however, has emphasised one of the Noah and Joseph parameters (the tail exponent mu, or one derived from the temporal behaviour such as the power spectral exponent beta) at the expense of the other. I will describe work [1] in which we applied a simple self-affine stable model, linear fractional stable motion (LFSM), which unifies both effects to better describe natural data, in this case from space physics. I will show how we have resolved some contradictions seen in earlier work, where purely Joseph or Noah descriptions had been sought. I will also show recent work [2] using numerical simulations of LFSM and simple analytic scaling arguments to study the problem of the area between a fractional Levy model time series and a threshold. [1] Watkins et al, Space Science Reviews [2005] [2] Watkins et al, Physical Review E [2009]
A time series model: First-order integer-valued autoregressive (INAR(1))
Simarmata, D. M.; Novkaniza, F.; Widyaningsih, Y.
2017-07-01
Nonnegative integer-valued time series arise in many applications. A time series model, the first-order Integer-valued AutoRegressive (INAR(1)) model, is constructed with the binomial thinning operator to model nonnegative integer-valued time series. INAR(1) depends on one period of the process before. The parameter of the model can be estimated by Conditional Least Squares (CLS). The specification of INAR(1) follows that of AR(1). Forecasting in INAR(1) uses a median or Bayesian forecasting methodology. The median forecasting methodology finds the least integer s for which the cumulative distribution function (CDF) up to s is greater than or equal to 0.5. The Bayesian forecasting methodology forecasts h steps ahead by generating the model parameter and the innovation-term parameter using Adaptive Rejection Metropolis Sampling within Gibbs sampling (ARMS), then finding the least integer s for which the CDF up to s is greater than or equal to u, where u is a value drawn from the Uniform(0,1) distribution. INAR(1) is applied to monthly pneumonia cases in Penjaringan, Jakarta Utara, from January 2008 until April 2016.
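The binomial thinning construction can be made concrete with a short simulation. This is a numpy sketch under the common assumption of Poisson innovations; the paper's estimation via CLS or ARMS-within-Gibbs is not reproduced here:

```python
import numpy as np

def simulate_inar1(alpha, lam, n, seed=0):
    """Simulate INAR(1): X_t = alpha o X_{t-1} + e_t, where o is binomial
    thinning (each of the X_{t-1} counts survives with probability alpha)
    and e_t is a Poisson(lam) innovation, keeping the series integer-valued."""
    rng = np.random.default_rng(seed)
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1 - alpha))  # start near the stationary mean
    for t in range(1, n):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x
```

With Poisson(lam) innovations the stationary marginal is Poisson with mean lam/(1 - alpha), so a long simulated series should have its sample mean near that value.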
Fractal and Multifractal Time Series
Kantelhardt, Jan W
2008-01-01
Data series generated by complex systems exhibit fluctuations on many time scales and/or broad distributions of the values. In both equilibrium and non-equilibrium situations, the natural fluctuations are often found to follow a scaling relation over several orders of magnitude, allowing for a characterisation of the data and the generating complex system by fractal (or multifractal) scaling exponents. In addition, fractal and multifractal approaches can be used for modelling time series and deriving predictions regarding extreme events. This review article describes and exemplifies several methods originating from Statistical Physics and Applied Mathematics, which have been used for fractal and multifractal time series analysis.
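One of the standard methods such a review covers, detrended fluctuation analysis (DFA), fits in a few lines. This is a minimal first-order DFA sketch; production implementations also average over the reversed segmentation and support higher-order detrending:

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis: integrate the series,
    split it into windows of each scale, remove a linear trend per window,
    and return the RMS fluctuation F(s). Scaling F(s) ~ s**h reveals the
    fractal exponent h."""
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
    F = []
    for s in scales:
        n = len(y) // s
        segments = y[:n * s].reshape(n, s)
        t = np.arange(s)
        sq = []
        for seg in segments:
            a, b = np.polyfit(t, seg, 1)
            sq.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(sq)))
    return np.array(F)
```

For uncorrelated white noise the slope of log F(s) against log s is close to 0.5; persistent (Joseph-type) series give slopes above 0.5.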
Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing
Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa
2016-08-01
Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture—for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments—as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS) smoothing (STL). The data series—daily Poaceae pollen concentrations over the period 2006-2014—was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
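The decomposition step can be imitated with a crude moving-average stand-in for STL. This is a numpy-only sketch: real STL uses iterated LOESS fits, and the residual modelling with PLSR is not shown:

```python
import numpy as np

def decompose_additive(y, period):
    """Additive decomposition y = trend + seasonal + residual: a centred
    moving average estimates the trend, per-season means of the detrended
    series estimate the seasonal component, and the rest is residual."""
    y = np.asarray(y, dtype=float)
    trend = np.convolve(y, np.ones(period) / period, mode="same")
    detrended = y - trend
    one_cycle = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(one_cycle, len(y) // period + 1)[:len(y)]
    residual = y - trend - seasonal
    return trend, seasonal, residual
```

Modelling each component separately — phenology against the seasonal part, meteorology against the residual part — is the design choice that makes the sources of variability identifiable.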
Time-series gas prediction model using LS-SVR within a Bayesian framework
Institute of Scientific and Technical Information of China (English)
Qiao Meiying; Ma Xiaoping; Lan Jianyi; Wang Ying
2011-01-01
The traditional least squares support vector regression (LS-SVR) model, which uses cross-validation to determine the regularization parameter and kernel parameter, is time-consuming. We propose a Bayesian evidence framework to infer the LS-SVR model parameters. Three levels of Bayesian inference are used to determine the model parameters and regularization hyper-parameters, and to tune the kernel parameters by model comparison. On this basis, we established Bayesian LS-SVR time-series gas forecasting models and provide the steps of the algorithm. Gas outburst data from a working face of the Hebi 10th mine is used to validate the model. The optimal embedding dimension and delay time of the time series were obtained by the smallest differential entropy method. Finally, within a MATLAB 7.1 environment, we used actual coal gas data to compare the traditional LS-SVR and the Bayesian LS-SVR in simulations with the LS-SVMlab1.5 toolbox. The results show that the Bayesian framework for LS-SVR significantly improves the speed and accuracy of the forecast.
Big Data impacts on stochastic Forecast Models: Evidence from FX time series
Directory of Open Access Journals (Sweden)
Sebastian Dietz
2013-12-01
With the rise of the Big Data paradigm, new tasks for prediction models have appeared. In addition to the volume problem of such data sets, nonlinearity becomes important, as the more detailed data sets also contain more comprehensive information, e.g. about non-regular seasonal or cyclical movements as well as jumps in time series. This essay compares two nonlinear methods for predicting a high-frequency time series, the USD/Euro exchange rate. The first method investigated is Autoregressive Neural Network Processes (ARNN), a neural-network-based nonlinear extension of classical autoregressive process models from time series analysis (see Dietz 2011). Its advantage is its simple but scalable time series process model architecture, which is able to include all kinds of nonlinearities based on the universal approximation theorem of Hornik, Stinchcombe and White (1989) and the extensions of Hornik (1993). However, restrictions related to the numeric estimation procedures limit the flexibility of the model. The alternative is a Support Vector Machine model (SVM; Vapnik 1995). The two methods compared have different approaches to error minimization (empirical error minimization for the ARNN vs. structural error minimization for the SVM). Our new finding is that time series data classified as "Big Data" need new methods for prediction. Estimation and prediction were performed using the statistical programming language R. Besides prediction results, we also discuss the impact of Big Data on the data preparation and model validation steps.
A model-free characterization of recurrences in stationary time series
Chicheportiche, Rémy; Chakraborti, Anirban
2017-05-01
The study of recurrences in earthquakes, climate, financial time series, etc. is crucial to better forecast disasters and limit their consequences. Most previous phenomenological studies of recurrences have involved only a long-ranged autocorrelation function, and ignored the multi-scaling properties induced by potential higher-order dependencies. We argue that copulas are a natural model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Consequently, we find that (i) non-linear dependences do impact both the statistics and dynamics of recurrence times, and (ii) the scaling arguments for the unconditional distribution may not be applicable. Hence, fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system specific, and cannot actually benefit from universal features, in contrast to previous claims. This has important implications in epilepsy prognosis and financial risk management applications.
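The basic object of study, the recurrence interval between threshold exceedances, is straightforward to extract. This is a numpy sketch; the copula analysis of dependences between these intervals is beyond a few lines:

```python
import numpy as np

def recurrence_intervals(x, q):
    """Recurrence (waiting) times between exceedances of the empirical
    q-quantile of the series: the differences between the indices at
    which the series exceeds the threshold."""
    x = np.asarray(x, dtype=float)
    idx = np.flatnonzero(x > np.quantile(x, q))
    return np.diff(idx)
```

For an i.i.d. series the intervals above the q-quantile are approximately geometric with mean 1/(1 - q); systematic deviations from that baseline are the signature of the non-linear dependences discussed above.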
Wavelet time series MPARIMA modeling for power system short term load forecasting
Institute of Scientific and Technical Information of China (English)
冉启文; 单永正; 王建赜; 王骐
2003-01-01
The wavelet power system short-term load forecasting (STLF) approach uses a multiple periodical autoregressive integrated moving average (MPARIMA) model to capture the multiple near-periodicity, nonstationarity and nonlinearity present in power system short-term quarter-hour load time series. It can therefore accurately forecast the quarter-hour loads of weekdays and weekends, and provides more accurate results than conventional techniques such as artificial neural networks and autoregressive moving average (ARMA) models. Test results obtained with a power system network in a city in the northeastern part of China confirm the validity of the proposed approach.
Extracting Knowledge From Time Series An Introduction to Nonlinear Empirical Modeling
Bezruchko, Boris P
2010-01-01
This book addresses the fundamental question of how to construct mathematical models for the evolution of dynamical systems from experimentally-obtained time series. It places emphasis on chaotic signals and nonlinear modeling and discusses different approaches to the forecast of future system evolution. In particular, it teaches readers how to construct difference and differential model equations depending on the amount of a priori information that is available on the system in addition to the experimental data sets. This book will benefit graduate students and researchers from all natural sciences who seek a self-contained and thorough introduction to this subject.
Model for the respiratory modulation of the heart beat-to-beat time interval series
Capurro, Alberto; Diambra, Luis; Malta, C. P.
2005-09-01
In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embeddings of the pre-meditation and control cases have a roughly circular shape, the embedding acquires a polygonal shape during meditation: triangular for the Kundalini Yoga data and quadrangular for the Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.
Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander
2016-04-01
We suggest a method for empirical forecasting of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space the model works in - is no doubt the most important step in such modeling, but this task is non-trivial due to the huge dimension of time series of typical climatic fields. Actually, an appropriate expansion of observational time series is needed, yielding the number of principal components considered as phase variables that are efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have for capturing the main dynamical properties of the system: (i) taking into account time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, in this report we present a methodology which combines a new way of constructing an embedding by spatio-temporal data expansion with nonlinear model construction on the basis of artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data, including fields of sea level pressure, geopotential height, and wind speed, covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena, including ENSO, PDO, NAO and strong blocking-event conditions over the mid-latitudes, is demonstrated. Also, we investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS) [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random
Directory of Open Access Journals (Sweden)
Parneet Paul
2013-02-01
The computer modelling and simulation of wastewater treatment plants and their specific technologies, such as membrane bioreactors (MBRs), are becoming increasingly useful to consultant engineers when designing, upgrading, retrofitting, operating and controlling these plants. This research uses traditional phenomenological mechanistic models based on MBR filtration and biochemical processes to measure the effectiveness of alternative and novel time series models based upon input–output system identification methods. Both model types are calibrated and validated using similar plant layouts and data sets derived for this purpose. Results prove that although both approaches have their advantages, they also have specific disadvantages. In conclusion, the MBR plant designer and/or operator who wishes to use good-quality, calibrated models to gain a better understanding of their process should carefully consider which model type to select based upon their initial modelling objectives. Each situation usually proves unique.
Time series modeling for analysis and control advanced autopilot and monitoring systems
Ohtsu, Kohei; Kitagawa, Genshiro
2015-01-01
This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships' autopilot systems are considered throughout the entire book, the methods set forth can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system based on theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX model) and the radial basis function net-type coefficients ARX model. The noise contribution analysis can then be performed through the estimated autoregressive (AR) model, and various types of autopilot systems can be designed through the state–space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...
Applications of soft computing in time series forecasting simulation and modeling techniques
Singh, Pritpal
2016-01-01
This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...
International Work-Conference on Time Series
Pomares, Héctor; Valenzuela, Olga
2017-01-01
This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series, and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...
A sequential approach to calibrate ecosystem models with multiple time series data
Oliveros-Ramos, Ricardo; Verley, Philippe; Echevin, Vincent; Shin, Yunne-Jai
2017-02-01
When models are aimed to support decision-making, their credibility is essential to consider. Model fit to observed data is one major criterion for assessing that credibility. However, because the complexity of ecosystem models makes their calibration challenging, the scientific community has given more attention to the exploration of model behavior than to rigorous comparison with observations. This work highlights some issues related to the comparison of complex ecosystem models to data and proposes a methodology for a sequential multi-phase calibration (or parameter estimation) of ecosystem models. We first propose two criteria to classify the parameters of a model: the model dependency and the time variability of the parameters. These criteria, together with the availability of approximate initial estimates, are then used as decision rules to determine which parameters need to be estimated, and their precedence order in the sequential calibration process. The end-to-end (E2E) ecosystem model ROMS-PISCES-OSMOSE applied to the Northern Humboldt Current Ecosystem is used as an illustrative case study. The model is calibrated using an evolutionary algorithm and a likelihood approach to fit time series data of landings, abundance indices and catch-at-length distributions from 1992 to 2008. Testing different calibration schemes regarding the number of phases, the precedence of parameter estimation, and the consideration of time-varying parameters, the results show that a multiple-phase calibration conducted under our criteria improved the model fit.
Sample correlations of infinite variance time series models: an empirical and theoretical study
Directory of Open Access Journals (Sweden)
Jason Cohen
1998-01-01
Full Text Available When the elements of a stationary ergodic time series have finite variance, the sample correlation function converges (with probability 1) to the theoretical correlation function. What happens in the case where the variance is infinite? In certain cases, the sample correlation function converges in probability to a constant, but not always. If, within a class of heavy-tailed time series, the sample correlation functions do not converge to a constant, then more care must be taken in making inferences and in model selection on the basis of sample autocorrelations. We experimented with simulating various heavy-tailed stationary sequences in an attempt to understand what causes the sample correlation function to converge or not to converge to a constant. In two new cases, namely the sum of two independent moving averages and a random permutation scheme, we are able to provide theoretical explanations for a random limit of the sample autocorrelation function as the sample grows.
Chattopadhyay, Goutami; 10.1140/epjp/i2012-12043-9
2012-01-01
This study reports a statistical analysis of the monthly sunspot number time series and observes non-homogeneity and asymmetry within it. Using the Mann-Kendall test, a linear trend is revealed. After identifying stationarity within the time series, we generate autoregressive AR(p) and autoregressive moving average ARMA(p,q) models. Based on minimization of the AIC, we find p = 3 and q = 1 to be the best orders. In the next phase, an autoregressive neural network (AR-NN(3)) is generated by training a generalized feedforward neural network (GFNN). Assessing the model performances by means of Willmott's index of second order and the coefficient of determination, AR-NN(3) is identified as performing better than AR(3) and ARMA(3,1).
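The AIC-based order selection described in this abstract can be sketched with a conditional-least-squares AR fit. This is a minimal numpy illustration on simulated data; the AR(3) coefficients and sample size are hypothetical stand-ins, not values from the sunspot study:

```python
import numpy as np

def fit_ar_ols(x, p):
    """Fit AR(p) by conditional least squares; return coefficients and RSS."""
    X = np.column_stack([x[p - i - 1:len(x) - i - 1] for i in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return coef, float(resid @ resid)

def aic(rss, n_eff, k):
    """Gaussian AIC up to an additive constant: n*log(RSS/n) + 2*k parameters."""
    return n_eff * np.log(rss / n_eff) + 2 * k

rng = np.random.default_rng(42)
phi = np.array([0.5, -0.3, 0.2])       # hypothetical AR(3) coefficients
n = 3000
x = np.zeros(n + 100)
eps = rng.standard_normal(n + 100)
for t in range(3, n + 100):
    x[t] = phi[0] * x[t-1] + phi[1] * x[t-2] + phi[2] * x[t-3] + eps[t]
x = x[100:]                            # drop burn-in

# Score candidate orders p = 1..5 and keep the AIC minimizer
scores = {p: aic(fit_ar_ols(x, p)[1], len(x) - p, p + 1) for p in range(1, 6)}
best_p = min(scores, key=scores.get)
print(best_p)
```

With a long enough sample the selected order sits at or just above the true order (AIC is known to overselect occasionally), which is the same minimize-AIC logic the abstract applies to choose AR(3) and ARMA(3,1).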
Modeling Inter-Country Connection from Geotagged News Reports: A Time-Series Analysis
Yuan, Yihong
2016-01-01
The development of theories and techniques for big data analytics offers tremendous flexibility for investigating large-scale events and patterns that emerge over space and time. In this research, we utilize a unique open-access dataset "The Global Data on Events, Location and Tone" (GDELT) to model the image of China in mass media, specifically, how China has related to the rest of the world and how this connection has evolved over time, based on an autoregressive integrated moving average (ARIMA) model. This research contributes from both methodological and empirical perspectives: we examined the effectiveness of time series models in predicting trends in long-term mass media data. In addition, we identified various types of connection-strength patterns between China and its top 15 related countries. This study generates valuable input for interpreting China's diplomatic and regional relations based on mass media data, as well as providing methodological references for investigating international rel...
A stochastic HMM-based forecasting model for fuzzy time series.
Li, Sheng-Tun; Cheng, Yi-Chung
2010-10-01
Recently, fuzzy time series have attracted more academic attention than traditional time series due to their capability of dealing with the uncertainty and vagueness inherent in the data collected. The formulation of fuzzy relations is one of the key issues affecting forecasting results. Most of the present works adopt IF-THEN rules for relationship representation, which leads to higher computational overhead and rule redundancy. Sullivan and Woodall proposed a Markov-based formulation and a forecasting model to reduce computational overhead; however, its applicability is limited to handling one-factor problems. In this paper, we propose a novel forecasting model based on the hidden Markov model by enhancing Sullivan and Woodall's work to allow handling of two-factor forecasting problems. Moreover, in order to make the nature of conjecture and randomness of forecasting more realistic, the Monte Carlo method is adopted to estimate the outcome. To test the effectiveness of the resulting stochastic model, we conduct two experiments and compare the results with those from other models. The first experiment consists of forecasting the daily average temperature and cloud density in Taipei, Taiwan, and the second experiment is based on the Taiwan Weighted Stock Index and forecasting the exchange rate of the New Taiwan dollar against the U.S. dollar. In addition to improving forecasting accuracy, the proposed model adheres to the central limit theorem, and thus the result statistically approximates the real mean of the target value being forecast.
Time Series Model of Wind Speed for Multi Wind Turbines based on Mixed Copula
Directory of Open Access Journals (Sweden)
Nie Dan
2016-01-01
Full Text Available Because wind power is intermittent and random, large-scale grid integration will directly affect the safe and stable operation of the power grid. In order to study the wind-speed characteristics of wind turbines quantitatively, a wind speed time series model for multiple wind turbine generators is constructed in this paper using a mixed Copula-ARMA function, and a numerical example is given. The research results show that the model can effectively predict the wind speed, ensure the efficient operation of the wind turbines, and provide a theoretical basis for the stability of grid-connected wind power operation.
Water quality management using statistical analysis and time-series prediction model
Parmar, Kulwinder Singh; Bhardwaj, Rashmi
2014-12-01
This paper deals with water quality management using statistical analysis and a time-series prediction model. The monthly variation of water quality standards has been used to compare the statistical mean, median, mode, standard deviation, kurtosis, skewness and coefficient of variation at the Yamuna River. The model was validated using R-squared, root mean square error, mean absolute percentage error, maximum absolute percentage error, mean absolute error, maximum absolute error, normalized Bayesian information criterion, Ljung-Box analysis, predicted values and confidence limits. Using an autoregressive integrated moving average (ARIMA) model, future values of the water quality parameters have been estimated. It is observed that the predictive model is useful at 95 % confidence limits, and that the curve is platykurtic for potential of hydrogen (pH), free ammonia, total Kjeldahl nitrogen, dissolved oxygen and water temperature (WT), and leptokurtic for chemical oxygen demand and biochemical oxygen demand. It is also observed that the predicted series is close to the original series, indicating a very good fit. All parameters except pH and WT cross the prescribed limits of the World Health Organization/United States Environmental Protection Agency, and thus the water is not fit for drinking, agricultural or industrial use.
A Course in Time Series Analysis
Peña, Daniel; Tsay, Ruey S
2011-01-01
New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a
A LANDSAT TIME-SERIES STACKS MODEL FOR DETECTION OF CROPLAND CHANGE
Directory of Open Access Journals (Sweden)
J. Chen
2017-09-01
Full Text Available Global, timely, accurate and cost-effective cropland monitoring at a fine spatial resolution will dramatically improve our understanding of the effects of agriculture on greenhouse gas emissions, food safety, and human health. Time-series remote sensing imagery has shown particular potential for describing land cover dynamics. Traditional change detection techniques are often not capable of detecting land cover changes within time series that are severely influenced by seasonal differences, and are therefore likely to generate pseudo-changes. Here, we introduce and test the Landsat time-series stacks model (LTSM), an improvement of the previously proposed Continuous Change Detection and Classification (CCDC) approach, to extract spectral trajectories of land surface change using dense Landsat time-series stacks (LTS). The method is expected to eliminate pseudo-changes caused by phenology driven by seasonal patterns. The main idea of the method is that, using all available Landsat 8 images within a year, an LTSM consisting of a two-term harmonic function is estimated iteratively for each pixel in each spectral band. The LTSM defines change areas by differencing the predicted and observed Landsat images. The LTSM approach was compared with the change vector analysis (CVA) method. The results indicated that the LTSM method correctly detected the “true changes” without overestimating the “false” ones, while CVA pointed out “true change” pixels along with a large number of “false changes”. The detection of change areas achieved an overall accuracy of 92.37 %, with a kappa coefficient of 0.676.
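The per-pixel two-term harmonic model at the heart of CCDC-style approaches can be sketched as follows. The reflectance values, 16-day revisit and 3-sigma change threshold below are illustrative assumptions, not the LTSM authors' settings:

```python
import numpy as np

def harmonic_design(doy, period=365.25, n_harm=2):
    """Design matrix: intercept plus n_harm cosine/sine pairs (two-term harmonic)."""
    cols = [np.ones_like(doy)]
    for k in range(1, n_harm + 1):
        w = 2 * np.pi * k * doy / period
        cols += [np.cos(w), np.sin(w)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
doy = np.arange(0, 730, 16, dtype=float)               # 16-day revisit, two years
true = 0.3 + 0.15 * np.sin(2 * np.pi * doy / 365.25)   # seasonal reflectance
obs = true + 0.01 * rng.standard_normal(doy.size)
obs[-3:] += 0.2                                        # abrupt land-cover change

train = slice(0, doy.size - 3)                         # fit on the stable period
X = harmonic_design(doy)
coef, *_ = np.linalg.lstsq(X[train], obs[train], rcond=None)
pred = X @ coef
rmse = np.sqrt(np.mean((obs[train] - pred[train]) ** 2))
change = np.abs(obs - pred) > 3 * rmse                 # flag departures from the model
```

Because the harmonic fit absorbs the seasonal cycle, only the abrupt shift at the end exceeds the residual threshold — the same mechanism by which LTSM suppresses phenology-driven pseudo-changes.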
Liang, X San
2014-01-01
Given two time series, can one tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely information flow, we arrive at a concise formula and give this challenging question, which is of wide concern in different disciplines, a positive answer. Here causality is measured by the time rate of change of information flowing from one series, say, X2, to another, X1. The measure is asymmetric between the two parties and, particularly, if the process underlying X1 does not depend on X2, then the resulting causality from X2 to X1 vanishes. The formula is tight in form, involving only commonly used statistics, namely sample covariances. It has been validated with touchstone series purportedly generated with one-way causality. It has also been applied to the investigation of real world problems; an example presented here is the cause-effect relation between two climate modes, El Niño and the Indian Ocean Dipole, which have been linked to the hazards in f...
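A sketch of the covariance-based estimator in the form commonly stated for this approach (check the paper for the exact normalization); the coupled AR(1) system below is a hypothetical one-way-causal test case, not data from the study:

```python
import numpy as np

def liang_causality(x1, x2, dt=1.0):
    """Information flow T_{x2 -> x1} from sample covariances (Liang-style).

    A value near zero is expected when the dynamics of x1 do not depend on x2.
    """
    d1 = (x1[1:] - x1[:-1]) / dt          # Euler-forward estimate of dx1/dt
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([x1, x2]))
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1 = np.cov(x1, d1)[0, 1]
    c2d1 = np.cov(x2, d1)[0, 1]
    num = c11 * c12 * c2d1 - c12 ** 2 * c1d1
    den = c11 ** 2 * c22 - c11 * c12 ** 2
    return num / den

rng = np.random.default_rng(1)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
e1, e2 = rng.standard_normal(n), rng.standard_normal(n)
for t in range(n - 1):
    x2[t + 1] = 0.7 * x2[t] + e2[t]                 # x2 evolves on its own
    x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + e1[t]   # x1 is driven by x2

t21 = liang_causality(x1, x2)   # flow from x2 to x1: clearly nonzero
t12 = liang_causality(x2, x1)   # flow from x1 to x2: near zero
print(t21, t12)
```

The asymmetry is the point: with one-way coupling, the estimated flow in the causal direction dominates, while the reverse-direction estimate vanishes up to sampling noise.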
Klein, A.A.B.; Melard, G.; Zahaf, T.
2000-01-01
The Fisher information matrix is of fundamental importance for the analysis of parameter estimation of time series models. In this paper the exact information matrix of a multivariate Gaussian time series model expressed in state space form is derived. A computationally efficient procedure is used b
Interception modeling with vegetation time series derived from Landsat TM data
Polo, M. J.; Díaz-Gutiérrez, A.; González-Dugo, M. P.
2011-11-01
Rainfall interception by vegetation may constitute a significant fraction of the water budget at local and watershed scales, especially in Mediterranean areas. Different approaches can be found to model the interception fraction locally, but a distributed analysis requires time series of vegetation along the watershed for the study period, including both the type of vegetation and the ground cover fraction. In heterogeneous watersheds, remote sensing is usually the only viable alternative for characterizing medium- to large-size areas, but the high number of scenes necessary to capture the temporal variability during long periods, together with the sometimes extreme scarcity of data during the wet season, make it necessary to deal with a limited number of images and to interpolate vegetation maps between consecutive dates. This work presents an interception model for heterogeneous watersheds which combines a continuous interception simulation derived from the Gash model and its derivatives with a time series of vegetation cover fraction and type from Landsat TM data and vegetation inventories. A mountainous watershed in Southern Spain, where a physically based hydrological model had previously been calibrated, was selected for this study. The dominant species distribution and their relevant characteristics regarding the interception process were analyzed from the literature and digital cartography; the evolution of the vegetation cover fraction along the watershed during the study period (2002-2005) was produced by applying an NDVI analysis to the available scenes of Landsat TM images. The model was further calibrated with field data collected in selected areas of the watershed.
Modeling commodity salam contract between two parties for discrete and continuous time series
Hisham, Azie Farhani Badrol; Jaffar, Maheran Mohd
2017-08-01
In order for Islamic finance to remain competitive with conventional finance, new syariah-compliant products are needed, such as Islamic derivatives that can be used to manage risk. However, under syariah principles and regulations, all financial instruments must not conflict with five syariah elements, which are riba (interest paid), rishwah (corruption), gharar (uncertainty or unnecessary risk), maysir (speculation or gambling) and jahl (taking advantage of the counterparty's ignorance). This study proposes that a traditional Islamic contract, namely salam, can be built into an Islamic derivative product. Although many studies have discussed and proposed the implementation of the salam contract as an Islamic product, they are mostly qualitative and concerned with legal issues. Since quantitative studies of the salam contract are lacking, this study introduces mathematical models that can value the appropriate salam price for a commodity salam contract between two parties. In modeling the commodity salam contract, this study modifies the existing conventional derivative model with some adjustments to comply with syariah rules and regulations. The cost of carry model has been chosen as the foundation for developing the commodity salam model between two parties for discrete and continuous time series. However, the conventional time value of money results from the concept of interest, which is prohibited in Islam. Therefore, this study adopts the Islamic notion of the time value of money, known as positive time preference, in modeling the commodity salam contract between two parties for discrete and continuous time series.
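The discrete- and continuous-compounding forms of the cost-of-carry relation that the authors start from can be sketched in a few lines. The spot price and the rate are hypothetical numbers; interpreting the rate as a syariah-compliant positive time preference rather than interest is precisely the modeling question the paper addresses:

```python
import math

def deferred_price_discrete(spot, k, periods):
    """Cost-of-carry price with per-period compounding: S * (1 + k)^T."""
    return spot * (1 + k) ** periods

def deferred_price_continuous(spot, k, T):
    """Cost-of-carry price with continuous compounding: S * exp(k * T)."""
    return spot * math.exp(k * T)

# Hypothetical inputs: spot 100, time-preference rate 5% per period, 2 periods
print(deferred_price_discrete(100, 0.05, 2))    # 110.25
print(deferred_price_continuous(100, 0.05, 2))  # ~110.52
```

The two forms converge as the compounding interval shrinks, which is why the paper can treat the discrete and continuous time series cases within one framework.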
Time Series Model of Occupational Injuries Analysis in Ghanaian Mines-A Case Study
Directory of Open Access Journals (Sweden)
S.J. Aidoo
2012-02-01
Full Text Available This study has modeled occupational injuries at Gold Fields Ghana Limited (GFGL), Tarkwa Mine, using time series analysis. Data were collected from the Safety and Environment Department from January 2007 to December 2010. A test for stationarity using a line graph from the Statistical Package for Social Sciences (SPSS 17.0) failed; hence the Box-Jenkins method of differencing was applied, and the series tested stationary after the first difference. An ARIMA(1,1,1) model was then fitted to the stationary data, and model diagnostics were carried out to ensure its appropriateness. The model was further used to forecast the occurrence of injuries at GFGL for a two-year period spanning January 2011 to December 2012. The results show that occupational injuries at GFGL will exhibit slight upward and downward movements from January 2011 to May 2011, after which there will be stability (almost zero) from June 2011 to December 2012.
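The Box-Jenkins differencing step — the "I(1)" part of ARIMA(1,1,1) — can be illustrated on a hypothetical nonstationary series: a random walk with drift trends strongly, while its first difference is stationary around a constant mean:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 480
# Hypothetical nonstationary series: a random walk with drift
y = np.cumsum(0.3 + rng.standard_normal(n))
d = np.diff(y)          # first difference removes the stochastic trend

def halves_gap(x):
    """Gap between first- and second-half means: large for a trending series."""
    h = len(x) // 2
    return abs(x[:h].mean() - x[h:].mean())

print(halves_gap(y), halves_gap(d))
```

A crude split-half mean comparison stands in for the visual line-graph check described in the abstract: the gap collapses after one difference, so d = 1 suffices before fitting the ARMA part.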
A Trend-Switching Financial Time Series Model with Level-Duration Dependence
Directory of Open Access Journals (Sweden)
Qingsheng Wang
2012-01-01
overcome the difficult problem that motivates the research in this paper. An asymmetric and nonlinear model in which the change of local trend depends on the local high-low turning point process is first proposed. As the point process can be decomposed into two different processes, a high-low level process and an up-down duration process, we then establish the so-called trend-switching model, which depends on both level and duration (Trend-LD). The proposed model can efficiently predict the direction and magnitude of the local trend of a time series by incorporating local high-low turning point information. The numerical results on six indices in world stock markets show that the proposed Trend-LD model is suitable for fitting the market data and able to outperform the traditional random walk model.
Directory of Open Access Journals (Sweden)
Lihua Yang
2015-04-01
Full Text Available Export volume forecasting of fresh fruits is a complex task due to the large number of factors affecting demand. In order to guide fruit growers' sales, decrease cultivation costs and increase their incomes, a hybrid fresh apple export volume forecasting model is proposed. Using actual data on fresh apple export volumes, a Seasonal Decomposition (SD) time series model and a Radial Basis Function (RBF) artificial neural network model are built. The predictive results are compared among the three forecasting models based on the criterion of Mean Absolute Percentage Error (MAPE). The results indicate that the proposed combined forecasting model is effective because it improves the prediction accuracy of fresh apple export volumes.
Directory of Open Access Journals (Sweden)
I MADE ARYA ANTARA
2015-02-01
Full Text Available This paper elaborates and compares the performance of a Fuzzy Time Series (FTS) model with a Markov Chain (MC) model in forecasting the Gross Regional Domestic Product (GDRP) of Bali Province. Both methods are considered forecasting methods in the soft modeling domain. The data used were quarterly data of Bali's GDRP for 1992 through 2013 from the Indonesian Bureau of Statistics at the Denpasar office. Instead of using the original data, the rate of change between two consecutive quarters was modeled. From the in-sample forecasting conducted, we obtained an Average Forecasting Error Rate (AFER) for the FTS and MC models of 0.78 percent and 2.74 percent, respectively. Based on these findings, FTS outperformed MC in in-sample forecasting of Bali's GDRP data.
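The AFER criterion used above is the mean absolute percentage error expressed in percent. A minimal sketch with hypothetical GDRP-like figures (not the study's data):

```python
def afer(actual, forecast):
    """Average Forecasting Error Rate: mean of |A - F| / A, in percent."""
    return 100 * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical quarterly figures and two competing in-sample forecasts
actual = [100.0, 104.0, 103.0, 108.0]
fts    = [ 99.5, 104.5, 102.5, 108.5]   # tighter fit -> lower AFER
mc     = [ 97.0, 107.0, 100.0, 111.0]
print(afer(actual, fts), afer(actual, mc))
```

Lower AFER means a better in-sample fit, which is the sense in which the paper's 0.78 % for FTS beats 2.74 % for MC.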
Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian
2017-01-01
The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and the vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily indicates the evolving trend and provides reasonable results. The VAR model, which comprises exogenous parameters related to the market on a monthly basis, can significantly improve the prediction accuracy. The EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.
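Basic SSA — embedding into a trajectory matrix, truncated SVD, and diagonal averaging — can be sketched in numpy. The series, window length and component count below are illustrative choices, not the study's settings:

```python
import numpy as np

def ssa_reconstruct(x, L, k):
    """Basic SSA: embed into an L-lagged trajectory matrix, keep the k leading
    singular components, and reconstruct by anti-diagonal (Hankel) averaging."""
    n = len(x)
    K = n - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix (L x K)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xk = (U[:, :k] * s[:k]) @ Vt[:k]                      # rank-k approximation
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(K):                                     # average anti-diagonals
        rec[j:j + L] += Xk[:, j]
        cnt[j:j + L] += 1
    return rec / cnt

rng = np.random.default_rng(3)
t = np.arange(240)                        # e.g. 240 months of hypothetical sales
trend = 0.05 * t
season = 2.0 * np.sin(2 * np.pi * t / 12)
x = trend + season + 0.5 * rng.standard_normal(t.size)

# Rank 4: a linear trend (rank ~2) plus one oscillatory pair (rank 2)
smooth = ssa_reconstruct(x, L=36, k=4)
err_ssa = np.sqrt(np.mean((smooth - (trend + season)) ** 2))
err_raw = np.sqrt(np.mean((x - (trend + season)) ** 2))
print(err_ssa, err_raw)
```

Keeping only the leading singular components recovers the trend and seasonal signal while discarding most of the noise, which is why SSA "satisfactorily indicates the evolving trend" in the abstract's wording.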
Loos, Martin; Krauss, Martin; Fenner, Kathrin
2012-09-18
Formation of soil nonextractable residues (NER) is central to the fate and persistence of pesticides. To investigate pools and extent of NER formation, an established inverse modeling approach for pesticide soil degradation time series was evaluated with a Monte Carlo Markov Chain (MCMC) sampling procedure. It was found that only half of 73 pesticide degradation time series from a homogeneous soil source allowed for well-behaved identification of kinetic parameters with a four-pool model containing a parent compound, a metabolite, a volatile, and a NER pool. A subsequent simulation indeed confirmed distinct parameter combinations of low identifiability. Taking the resulting uncertainties into account, several conclusions regarding NER formation and its impact on persistence assessment could nonetheless be drawn. First, rate constants for transformation of parent compounds to metabolites were correlated to those for transformation of parent compounds to NER, leading to degradation half-lives (DegT50) typically not being larger than disappearance half-lives (DT50) by more than a factor of 2. Second, estimated rate constants were used to evaluate NER formation over time. This showed that NER formation, particularly through the metabolite pool, may be grossly underestimated when using standard incubation periods. It further showed that amounts and uncertainties in (i) total NER, (ii) NER formed from the parent pool, and (iii) NER formed from the metabolite pool vary considerably among data sets at t→∞, with no clear dominance between (ii) and (iii). However, compounds containing aromatic amine moieties were found to form significantly more total NER when extrapolating to t→∞ than the other compounds studied. Overall, our study stresses the general need for assessing uncertainties, identifiability issues, and resulting biases when using inverse modeling of degradation time series for evaluating persistence and NER formation.
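The distinction between disappearance (DT50) and degradation (DegT50) half-lives in a first-order multi-pool model can be illustrated as follows. The rate constants are hypothetical, and attributing DegT50 to the metabolite-formation pathway alone is a simplifying assumption for illustration:

```python
import math

# Hypothetical first-order rate constants (1/day) for a four-pool scheme
k_met = 0.04   # parent -> metabolite (transformation)
k_ner = 0.02   # parent -> nonextractable residues (sequestration)
k_vol = 0.005  # parent -> volatile pool

# Parent decline is a single exponential with total rate k_met + k_ner + k_vol
dt50 = math.log(2) / (k_met + k_ner + k_vol)  # disappearance half-life (all losses)
degt50 = math.log(2) / k_met                  # degradation half-life (transformation only)
print(dt50, degt50, degt50 / dt50)
```

With these illustrative rates the DegT50/DT50 ratio is about 1.6, consistent with the abstract's observation that the two half-lives typically differ by no more than a factor of 2 when metabolite- and NER-formation rates are correlated.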
Dynamics modeling for sugar cane sucrose estimation using time series satellite imagery
Zhao, Yu; Justina, Diego Della; Kazama, Yoriko; Rocha, Jansle Vieira; Graziano, Paulo Sergio; Lamparelli, Rubens Augusto Camargo
2016-10-01
Sugarcane, a mainstay crop in Brazil, plays an essential role in ethanol production. To monitor sugarcane crop growth and predict sucrose content, remote sensing technology is essential for providing accurate and timely crop growth information, particularly for large-scale farming. We focused on the issues of sugarcane sucrose content estimation using time-series satellite imagery. Firstly, we calculated spectral features and vegetation indices so that they correspond to the biological mechanism of sucrose accumulation. Secondly, we improved the statistical regression model by considering additional factors. In the evaluation, we obtained a precision of 90%, which is about 20% higher than the conventional method. The validation results showed that the prediction accuracy of our sugarcane growth modeling and improved mixed model is satisfactory.
An advection-based model to increase the temporal resolution of PIV time series.
Scarano, Fulvio; Moore, Peter
A numerical implementation of the advection equation is proposed to increase the temporal resolution of PIV time series. The method is based on the principle that velocity fluctuations are transported passively, similar to Taylor's hypothesis of frozen turbulence. In the present work, the advection model is extended to unsteady three-dimensional flows. The main objective of the method is to lower the requirement on the PIV repetition rate from the Eulerian frequency toward the Lagrangian one. The local trajectory of the fluid parcel is obtained by forward projection of the instantaneous velocity at the preceding time instant and backward projection from the subsequent time step. The trajectories are approximated by the instantaneous streamlines, which yields accurate results when the amplitude of velocity fluctuations is small with respect to the convective motion. The verification is performed with two experiments conducted at temporal resolutions significantly higher than that dictated by the Nyquist criterion. The flow past the trailing edge of a NACA0012 airfoil closely approximates frozen turbulence, where the largest ratio between the Lagrangian and Eulerian temporal scales is expected. An order of magnitude reduction of the needed acquisition frequency is demonstrated by the velocity spectra of super-sampled series. The application to three-dimensional data is made with time-resolved tomographic PIV measurements of a transitional jet. Here, the 3D advection equation is implemented to estimate the fluid trajectories. The reduction in the minimum sampling rate by the use of super-sampling in this case is smaller, because vortices occurring in the jet shear layer are not well approximated by sole advection at large time separation. Both cases reveal that the current requirements for time-resolved PIV experiments can be revised when information is poured from space to time. An additional favorable effect is observed by the analysis in the frequency
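The frozen-turbulence idea behind the super-sampling can be demonstrated in one dimension: a passively advected pattern sampled at two instants is reconstructed at an intermediate, unsampled instant by forward and backward advection. The pattern, advection speed and grid are illustrative, not experimental values:

```python
import numpy as np

# Frozen-pattern demo: a field f(x, t) = g(x - u*t) advected at constant speed u
u = 2.0
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
g = lambda s: np.sin(s) + 0.3 * np.sin(3 * s)

t0, t1, tau = 0.0, 0.5, 0.25          # two "PIV snapshots" and an unsampled instant
f0, f1 = g(x - u * t0), g(x - u * t1)

# Super-sample: advect the t0 snapshot forward and the t1 snapshot backward
# (periodic domain, linear interpolation), then average the two estimates.
fwd = np.interp(x - u * (tau - t0), x, f0, period=2 * np.pi)
bwd = np.interp(x + u * (t1 - tau), x, f1, period=2 * np.pi)
f_tau = 0.5 * (fwd + bwd)

err = np.max(np.abs(f_tau - g(x - u * tau)))
print(err)
```

For a perfectly frozen pattern the only error is interpolation error, which is the idealized limit the NACA0012 trailing-edge experiment approaches; the jet case deviates because the vortices are not purely advected.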
Directory of Open Access Journals (Sweden)
Wangren Qiu
2015-01-01
Full Text Available Among techniques for constructing high-order fuzzy time series models, there are three types: those based on advanced algorithms, on computational methods, and on grouping the fuzzy logical relationships. The last type of model is easy to understand for a decision maker who knows nothing about fuzzy set theory or advanced algorithms. To deal with forecasting problems, this paper presents novel high-order fuzzy time series models, denoted GTS(M, N), based on generalized fuzzy logical relationships and automatic clustering. The paper introduces the concept of a generalized fuzzy logical relationship and an operation for combining the generalized relationships. The procedure of the proposed model is then demonstrated on forecasting enrollment data at the University of Alabama. To demonstrate its considerably better performance, the proposed approach was also applied to forecasting the Shanghai Stock Exchange Composite Index. Finally, the effects on the forecasting results of the parameters M and N, the model order, and the principal fuzzy logical relationships considered are also discussed.
Directory of Open Access Journals (Sweden)
Kansuporn Sriyudthsak
2016-05-01
Full Text Available The high-throughput acquisition of metabolome data is greatly anticipated for the complete understanding of cellular metabolism in living organisms. A variety of analytical technologies have been developed to acquire large-scale metabolic profiles under different biological or environmental conditions. Time series data are useful for predicting the most likely metabolic pathways because they provide important information regarding the accumulation of metabolites, which implies causal relationships in the metabolic reaction network. Considerable effort has been undertaken to utilize these data for constructing a mathematical model merging system properties and quantitatively characterizing a whole metabolic system in toto. However, there are technical difficulties in benchmarking the provision and utilization of data. Although hundreds of metabolites can be measured, which provide information on the metabolic reaction system, simultaneous measurement of thousands of metabolites is still challenging. In addition, it is nontrivial to logically predict the dynamic behaviors of unmeasurable metabolite concentrations without sufficient information on the metabolic reaction network. Yet, consolidating the advantages of advancements in both metabolomics and mathematical modeling remains to be accomplished. This review outlines the conceptual basis of and recent advances in technologies in both the research fields. It also highlights the potential for constructing a large-scale mathematical model by estimating model parameters from time series metabolome data in order to comprehensively understand metabolism at the systems level.
Partitioning and interpolation based hybrid ARIMA–ANN model for time series forecasting
Indian Academy of Sciences (India)
C NARENDRA BABU; PALLAVIRAM SURE
2016-07-01
Time series data (TSD) originating from different applications have dissimilar characteristics. Hence for prediction of TSD, diversified varieties of prediction models exist. In many applications, hybrid models provide more accurate predictions than individual models. One such hybrid model, namely the auto regressive integrated moving average – artificial neural network (ARIMA–ANN) model, is devised in many different ways in the literature. However, the prediction accuracy of the hybrid ARIMA–ANN model can be further improved by devising suitable processing techniques. In this paper, a hybrid ARIMA–ANN model is proposed, which combines the concepts of the recently developed moving average (MA) filter based hybrid ARIMA–ANN model with a processing technique involving a partitioning–interpolation (PI) step. The improved prediction accuracy of the proposed PI based hybrid ARIMA–ANN model is justified using a simulation experiment. Further, on different experimental TSD such as sunspot TSD and electricity price TSD, the proposed hybrid model is applied along with four existing state-of-the-art models, and it is found that the proposed model outperforms all the others, and hence is a promising model for TSD prediction.
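The moving-average filter step of such hybrid models splits a series into a smooth component for the linear (ARIMA) stage and a residual for the nonlinear (ANN) stage. A minimal sketch, with a hypothetical window length and test series:

```python
import numpy as np

def ma_decompose(x, window=5):
    """Split a series into a smooth low-volatility part (for the linear stage)
    and a residual high-volatility part (for the nonlinear stage)."""
    kernel = np.ones(window) / window
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")           # simple edge handling
    low = np.convolve(xp, kernel, mode="valid")
    return low, x - low

rng = np.random.default_rng(5)
t = np.arange(300, dtype=float)
x = 0.02 * t + np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(t.size)
low, high = ma_decompose(x, window=5)
print(np.allclose(low + high, x))  # exact additive decomposition
```

Because the decomposition is exactly additive, the two stage forecasts can simply be summed — the property these hybrid ARIMA–ANN schemes rely on.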
Multivariate time series modeling of short-term system scale irrigation demand
Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara
2015-12-01
Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands with respect to aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across 5 command areas, which included the area covered by four irrigation channels and the study area. These command-area-specific ARMAX models forecast 1-5 days ahead daily IDCGi, ASP and IDCGi, OTR using real-time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. Model efficiency and predictive performance were quantified using the root mean squared error (RMSE), Nash-Sutcliffe model efficiency coefficient (NSE), anomaly correlation coefficient (ACC) and mean square skill score (MSSS). During the evaluation period, NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and the forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of the ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts have improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive and reflect short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as
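An ARX-style regression with exogenous weather inputs, evaluated with NSE, can be sketched as follows. The demand process, coefficients and weather series are entirely synthetic stand-ins for the study's flow and meteorological data:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 400
day = np.arange(n)
temp = 25 + 5 * np.sin(2 * np.pi * day / 365) + rng.standard_normal(n)  # weather
rain = rng.exponential(1.0, n) * (rng.random(n) < 0.3)                  # rain events

d = np.zeros(n)  # hypothetical daily irrigation demand
for t in range(1, n):
    d[t] = 0.6 * d[t - 1] + 0.8 * temp[t] - 2.0 * rain[t] + 0.5 * rng.standard_normal()

# ARX(1): regress demand on its own lag plus exogenous weather inputs
X = np.column_stack([d[:-1], temp[1:], rain[1:], np.ones(n - 1)])
y = d[1:]
split = 300
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)
pred = X[split:] @ coef                     # one-day-ahead forecasts

obs = y[split:]
nse = 1 - np.sum((obs - pred) ** 2) / np.sum((obs - obs.mean()) ** 2)
print(round(nse, 3))
```

NSE = 1 is a perfect forecast and NSE = 0 means no better than the observed mean, which is the scale on which the paper's 0.78-0.98 range should be read.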
Modeling time-series count data: the unique challenges facing political communication studies.
Fogarty, Brian J; Monogan, James E
2014-05-01
This paper demonstrates the importance of proper model specification when analyzing time-series count data in political communication studies. It is common for scholars of media and politics to investigate counts of coverage of an issue as it evolves over time. Many scholars rightly consider the issues of time dependence and dynamic causality to be the most important when crafting a model. However, to ignore the count features of the outcome variable overlooks an important feature of the data. This is particularly the case when modeling data with a low number of counts. In this paper, we argue that the Poisson autoregressive model (Brandt and Williams, 2001) accurately meets the needs of many media studies. We replicate the analyses of Flemming et al. (1997), Peake and Eshbaugh-Soha (2008), and Ura (2009) and demonstrate that models missing some of the assumptions of the Poisson autoregressive model often yield invalid inferences. We also demonstrate that the effect of any of these models can be illustrated dynamically with estimates of uncertainty through a simulation procedure. The paper concludes with implications of these findings for the practical researcher.
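The flavor of count-aware modeling can be illustrated with a plain Poisson regression on a lagged-count covariate, fitted by Fisher scoring. This is a hedged stand-in, not the Brandt and Williams PAR(p) estimator itself; the data and coefficients are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate low-count data whose log-intensity depends on the lagged count
n, b0, b1 = 2000, 0.3, 0.5
y = np.zeros(n, dtype=int)
for t in range(1, n):
    lam = np.exp(b0 + b1 * np.log1p(y[t - 1]))
    y[t] = rng.poisson(lam)

# Poisson regression of y_t on log(1 + y_{t-1}) via Fisher scoring (IRLS)
X = np.column_stack([np.ones(n - 1), np.log1p(y[:-1])])
yy = y[1:].astype(float)
beta = np.zeros(2)
for _ in range(50):
    mu = np.exp(X @ beta)               # conditional mean under the log link
    grad = X.T @ (yy - mu)              # score vector
    hess = X.T @ (X * mu[:, None])      # Fisher information
    beta = beta + np.linalg.solve(hess, grad)
```

A Gaussian ARMA fit to the same data would ignore the mean-variance link of the counts, which is exactly the mis-specification the paper warns about for low-count series.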
A Score Type Test for General Autoregressive Models in Time Series
Institute of Scientific and Technical Information of China (English)
Jian-hong Wu; Li-xing Zhu
2007-01-01
This paper is devoted to goodness-of-fit testing for general autoregressive models in time series. By averaging the weighted residuals, we construct a score type test which is asymptotically standard chi-squared under the null and has desirable power properties under the alternatives. Specifically, the test is sensitive to alternatives and can detect alternatives approaching the null, along a direction, at a rate arbitrarily close to n^{-1/2}. Furthermore, when the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of alternatives. The performance of the tests is evaluated through simulation studies.
Multi-horizon solar radiation forecasting for Mediterranean locations using time series models
Voyant, Cyril; Paoli, Christophe; Muselli, Marc; Nivet, Marie Laure
2013-01-01
Considering the grid manager's point of view, needs in terms of prediction of intermittent energy like the photovoltaic resource can be distinguished according to the considered horizon: following days (d+1, d+2 and d+3), next day by hourly step (h+24), next hour (h+1) and next few minutes (m+5 e.g.). Through this work, we have identified methodologies using time series models for the prediction horizon of global radiation and photovoltaic power. What we present here i...
Modeling of ionosphere time series using wavelet neural networks (case study: N-W of Iran)
Ghaffari Razin, Mir Reza; Voosoghi, Behzad
2016-07-01
Wavelet neural networks (WNNs) are important tools for analyzing time series, especially when they are non-linear and non-stationary, since they combine the high resolution of wavelets with the feed-forward nature of neural networks (NNs). In this paper, WNNs are therefore used to model ionosphere time series over Iran. To apply the method, observations collected at 22 GPS stations of the Azerbaijan local GPS network on 12 successive days of 2012 (DOY 219-230) are used. The WNN is trained with the back-propagation (BP) algorithm. The WNN results are compared with the international reference ionosphere 2012 (IRI-2012) model and international GNSS service (IGS) products. To assess the WNN error, relative and absolute error statistics are used. Relative to GPS TEC, the minimum and maximum relative errors of the WNN are 6.37% and 12.94%, and the maximum and minimum absolute errors are 6.32 and 0.13 TECU, respectively. Comparison of diurnal predicted TEC values from the WNN model and the IRI-2012 with GPS TEC revealed that the WNN provides more accurate predictions than the IRI-2012 model and IGS products in the test area.
Voyant, Cyril; Muselli, Marc; Paoli, Christophe; Nivet, Marie Laure
2014-01-01
When a territory is poorly instrumented, geostationary satellite data can be useful for predicting global solar radiation. In this paper, we use geostationary satellite data to generate 2-D time series of solar radiation for the next hour. The results presented here relate to a particular territory, Corsica Island, but as the data used are available for the entire surface of the globe, our method can easily be applied elsewhere. The 2-D hourly time series are extracted from the HelioClim-3 surface solar irradiation database, processed with the Heliosat-2 model. Each point of the map has been used as training data and input for artificial neural networks (ANN) and as input for two persistence models (scaled or not). Comparisons between these models and clear-sky estimations were performed to evaluate their performance. We found a normalized root mean square error (nRMSE) close to 16.5% for the two best predictors (scaled persistence and ANN), equivalent to 35-45% relative to ground measurements. F...
Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting
Zhang, Ningning; Lin, Aijing; Shang, Pengjian
2017-07-01
In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) to forecast the closing price and high price of stocks simultaneously. Modified k-nearest neighbors (KNN) algorithms are finding increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, because of mode mixing it cannot reveal the characteristic information of the signal with much accuracy. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, resolves this weakness by adding white noise to the original data. With EEMD, components with true physical meaning can be extracted from the time series. Combining the advantages of EEMD and MKNN, the proposed EEMD-MKNN model has high predictive precision for short-term forecasting. Moreover, we extend the methodology to the two-dimensional case to forecast the closing price and high price of four stocks (the NAS, S&P500, DJI and STI stock indices) at the same time. The results indicate that the proposed EEMD-MKNN model has higher forecast precision than the EMD-KNN, KNN and ARIMA methods.
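The KNN half of such a pipeline reduces to pattern matching on lag windows: find the k historical windows closest to the most recent one and average their successors. A minimal one-dimensional sketch without the EEMD stage, on a synthetic periodic series (window length, k, and the test signal are illustrative choices):

```python
import numpy as np

def knn_forecast(series, m=4, k=5):
    """Forecast the next value by averaging the successors of the k
    historical lag-m windows closest to the most recent window."""
    X = np.array([series[i:i + m] for i in range(len(series) - m)])
    y = np.array([series[i + m] for i in range(len(series) - m)])
    query = series[-m:]                       # the most recent window
    dist = np.linalg.norm(X - query, axis=1)  # Euclidean distance to history
    nearest = np.argsort(dist)[:k]
    return y[nearest].mean()

t = np.arange(400)
series = np.sin(2 * np.pi * t / 50)           # period-50 signal
pred = knn_forecast(series, m=4, k=5)
true_next = np.sin(2 * np.pi * 400 / 50)
```

In the paper's two-stage scheme, each EEMD component would be forecast this way (and in the multidimensional variant the windows stack closing and high prices); here the idea is shown on a single series.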
Scaling symmetry, renormalization, and time series modeling: the case of financial assets dynamics.
Zamparo, Marco; Baldovin, Fulvio; Caraglio, Michele; Stella, Attilio L
2013-12-01
We present and discuss a stochastic model of financial assets dynamics based on the idea of an inverse renormalization group strategy. With this strategy we construct the multivariate distributions of elementary returns based on the scaling with time of the probability density of their aggregates. In its simplest version the model is the product of an endogenous autoregressive component and a random rescaling factor designed to embody also exogenous influences. Mathematical properties like increments' stationarity and ergodicity can be proven. Thanks to the relatively low number of parameters, model calibration can be conveniently based on a method of moments, as exemplified in the case of historical data of the S&P500 index. The calibrated model accounts very well for many stylized facts, like volatility clustering, power-law decay of the volatility autocorrelation function, and multiscaling with time of the aggregated return distribution. In agreement with empirical evidence in finance, the dynamics is not invariant under time reversal, and, with suitable generalizations, skewness of the return distribution and leverage effects can be included. The analytical tractability of the model opens interesting perspectives for applications, for instance, in terms of obtaining closed formulas for derivative pricing. Further important features are the possibility of making contact, in certain limits, with autoregressive models widely used in finance and the possibility of partially resolving the long- and short-memory components of the volatility, with consistent results when applied to historical series.
Keller, D. E.; Fischer, A. M.; Frei, C.; Liniger, M. A.; Appenzeller, C.; Knutti, R.
2014-07-01
Many climate impact assessments over topographically complex terrain require high-resolution precipitation time series that have a spatio-temporal correlation structure consistent with observations. This consistency is essential for spatially distributed modelling of processes with non-linear responses to precipitation input (e.g. soil water and river runoff modelling). In this regard, weather generators (WGs) designed and calibrated for multiple sites are an appealing technique to stochastically simulate time series that approximate the observed temporal and spatial dependencies. In this study, we present a stochastic multi-site precipitation generator and validate it over the hydrological catchment Thur in the Swiss Alps. The model consists of several Richardson-type WGs that are run with correlated random number streams reflecting the observed correlation structure among all possible station pairs. A first-order two-state Markov process simulates the intermittence of daily precipitation, while precipitation amounts are simulated from a mixture of two exponential distributions. The model is calibrated separately for each month over the period 1961-2011. The WG is skilful at individual sites in representing the annual cycle of precipitation statistics, such as mean wet-day frequency and intensity as well as monthly precipitation sums. It realistically reproduces multi-day statistics such as the frequencies of dry and wet spell lengths and precipitation sums over consecutive wet days. Substantial added value is demonstrated in simulating daily areal precipitation sums in comparison to multiple WGs that lack spatial dependency in the stochastic process: the multi-site WG captures about 95% of the observed variability in daily areal sums, while summed time series from multiple single-site WGs explain only about 13%. Limitations of the WG have been detected in reproducing the observed variability from year to year, a component that has
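The single-site core of a Richardson-type generator, a two-state Markov occurrence process plus an exponential-mixture amounts model, can be sketched as follows. The transition probabilities and mixture parameters below are illustrative, not the calibrated Thur values, and the multi-site correlation of random number streams is omitted:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative parameters for one month at one site
p01, p11 = 0.3, 0.6           # P(wet | previous dry), P(wet | previous wet)
w, mu1, mu2 = 0.7, 2.0, 12.0  # mixture of two exponentials for wet-day amounts (mm)

n = 100_000
wet = np.zeros(n, dtype=bool)
for t in range(1, n):
    p = p11 if wet[t - 1] else p01   # first-order two-state Markov chain
    wet[t] = rng.random() < p

# Wet-day amounts: with probability w draw from the light-rain exponential,
# otherwise from the heavy-rain exponential; dry days get zero
amounts = np.where(
    rng.random(n) < w,
    rng.exponential(mu1, n),
    rng.exponential(mu2, n),
) * wet
```

The simulated wet-day frequency converges to the chain's stationary probability p01 / (1 + p01 - p11), which is the quantity a calibration would match to observed data; the multi-site version runs one such generator per station on correlated uniform streams.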
Diffusive and subdiffusive dynamics of indoor microclimate: a time series modeling.
Maciejewska, Monika; Szczurek, Andrzej; Sikora, Grzegorz; Wyłomańska, Agnieszka
2012-09-01
The indoor microclimate is an issue in modern society, where people spend about 90% of their time indoors. Temperature and relative humidity are commonly used for its evaluation. In this context, the two parameters are usually considered as behaving in the same manner, just inversely correlated. This opinion comes from observation of the deterministic components of temperature and humidity time series. We focus on the dynamics and the dependency structure of the time series of these parameters, without deterministic components. Here we apply the mean square displacement, the autoregressive integrated moving average (ARIMA), and the methodology for studying anomalous diffusion. The analyzed data originated from five monitoring locations inside a modern office building, covering a period of nearly one week. It was found that the temperature data exhibited a transition between diffusive and subdiffusive behavior, when the building occupancy pattern changed from the weekday to the weekend pattern. At the same time the relative humidity consistently showed diffusive character. Also the structures of the dependencies of the temperature and humidity data sets were different, as shown by the different structures of the ARIMA models which were found appropriate. In the space domain, the dynamics and dependency structure of the particular parameter were preserved. This work proposes an approach to describe the very complex conditions of indoor air and it contributes to the improvement of the representative character of microclimate monitoring.
Non-stationary time series modeling on caterpillars pest of palm oil for early warning system
Setiyowati, Susi; Nugraha, Rida F.; Mukhaiyar, Utriweni
2015-12-01
Oil palm production plays an important role in the plantation and economic sectors of Indonesia. One of the important problems in oil palm cultivation is pests, which damage fruit quality. The caterpillar pest, which feeds on the palm trees' leaves, causes a decline in the quality of palm oil production. An early warning system is needed to minimize losses due to this pest. Here, we applied non-stationary time series modeling, in particular the family of autoregressive models, to predict the number of pests based on historical data. These pest data have a distinctive feature: spike values that occur almost periodically. Through simulations and a case study, we find that the selection of the constant factor has a significant influence on the model, allowing it to capture the spike values precisely.
Estimating and Analyzing Savannah Phenology with a Lagged Time Series Model
DEFF Research Database (Denmark)
Boke-Olen, Niklas; Lehsten, Veiko; Ardo, Jonas
2016-01-01
Savannah regions are predicted to undergo changes in precipitation patterns according to current climate change projections. This change will affect leaf phenology, which controls net primary productivity. It is of importance to study this since savannahs play an important role in the global carbon cycle due to their areal coverage and can have an effect on the food security in regions that depend on subsistence farming. In this study we investigate how soil moisture, mean annual precipitation, and day length control savannah phenology by developing a lagged time series model. The model uses climate data for 15 flux tower sites across four continents, and normalized difference vegetation index from satellite to optimize a statistical phenological model. We show that all three variables can be used to estimate savannah phenology on a global scale. However, it was not possible to create...
A detection model of underwater topography with a series of SAR images acquired at different time
Institute of Scientific and Technical Information of China (English)
YANG Jungang; ZHANG Jie; MENG Junmin
2010-01-01
Underwater topography is one of the oceanic features detected by Synthetic Aperture Radar (SAR). The underwater topography SAR imaging mechanism shows that tidal current is an important factor in underwater topography SAR imaging. Thus, under the same wind field conditions, SAR images of the same area acquired at different times contain different information about the underwater topography. To make synchronous use of SAR images acquired at different times for underwater topography SAR detection and improve detection precision, a detection model of underwater topography based on a series of SAR images acquired at different times is developed, building on the single-image detection model and the periodicity of the tidal current, combined with tide and tidal current numerical simulation. To verify the feasibility of the presented model, the Taiwan Shoal, located at the southern outlet of the Taiwan Strait, is selected as the study area and three SAR images are used in the underwater topography detection. The detection results are compared with field observations of water depth carried out by R/V Dongfanghong 2, and the detection errors are compared with those of the single-SAR-image model. All comparisons show that the detection model presented in this paper improves the precision of underwater topography SAR detection and is feasible.
Cooling load calculation by the radiant time series method - effect of solar radiation models
Energy Technology Data Exchange (ETDEWEB)
Costa, Alexandre M.S. [Universidade Estadual de Maringa (UEM), PR (Brazil)], E-mail: amscosta@uem.br
2010-07-01
In this work, the effect of three different solar radiation models on the cooling load calculated by the radiant time series (RTS) method was analyzed numerically. The solar radiation models implemented were clear sky, isotropic sky and anisotropic sky. The RTS method was proposed by ASHRAE (2001) to replace classical cooling load calculation methods such as TETD/TA. The method is based on computing the effect of the space's thermal energy storage on the instantaneous cooling load. The computation is carried out by splitting the heat gain components into convective and radiant parts. The radiant part is then transformed using time series whose coefficients are a function of the construction type and the heat gain type (solar or non-solar). The transformed result is added to the convective part, giving the instantaneous cooling load. The method was applied to an example room located at 23 degrees S and 51 degrees W on 21 January, a typical summer day in the southern hemisphere. The room was composed of two vertical walls with windows exposed to the outdoors, with azimuth angles facing west and east. The output of the different solar radiation models for the two walls, in terms of direct and diffuse components as well as heat gains, was investigated. The clear-sky model was the least conservative (highest values) for the direct component of solar radiation, with the opposite trend for the diffuse component. For the heat gain, the clear-sky model gave the highest values, three times higher during peak hours than the other models. The isotropic and anisotropic models predicted heat gains of similar magnitude. The same behavior was also verified for the cooling load. The effect of room thermal inertia was to decrease the cooling load during peak hours; conversely, higher thermal inertia has a greater effect during non-peak hours. The effect
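The RTS bookkeeping can be sketched in a few lines: split each hourly gain into convective and radiant parts, and release the radiant part over 24 h through the radiant time factors. The factor values, the 70/30 radiant split, and the sinusoidal gain profile below are illustrative assumptions, not ASHRAE's published coefficients:

```python
import numpy as np

# 24 radiant time factors (illustrative values for a medium-weight zone);
# they must sum to 1 so stored radiant energy is fully released within 24 h
rtf = np.array([0.49, 0.17, 0.09, 0.05, 0.03, 0.02, 0.02, 0.01,
                0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01, 0.01,
                0.01, 0.01, 0.01, 0.005, 0.005, 0.0, 0.0, 0.0])

hours = np.arange(24)
# Idealized solar heat gain profile in W (daylight between 06:00 and 18:00)
solar_gain = np.maximum(0, np.sin(np.pi * (hours - 6) / 12)) * 1000

radiant_frac = 0.7   # assumed split: 70% radiant, 30% convective
radiant = radiant_frac * solar_gain
convective = (1 - radiant_frac) * solar_gain

# Cooling load: the convective part acts instantly; the radiant part is
# released over 24 h via a circular convolution with the radiant time factors
radiant_load = np.array([
    sum(rtf[j] * radiant[(h - j) % 24] for j in range(24)) for h in hours
])
cooling_load = convective + radiant_load
```

Because the factors sum to one, daily energy is conserved while the load peak is damped and delayed relative to the gain peak, which is exactly the thermal-inertia effect the abstract describes.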
Time Series Stream Temperature And Dissolved Oxygen Modeling In The Lower Flint River Basin
Li, G.; Jackson, C. R.
2004-12-01
The tributaries of the Lower Flint River Basin (LFRB) are incised into the upper Floridan semi-confined limestone aquifer, and thus seepage of relatively old groundwater sustains baseflows and provides some control over temperature and dissolved oxygen fluctuations. This hydrologic and geologic setting creates aquatic habitat that is unique in the state of Georgia. Groundwater withdrawals and possible water supply reservoirs threaten to exacerbate low flow conditions during summer droughts, which may force negative impacts to stream temperature and dissolved oxygen (DO). To evaluate the possible effects of human modifications to stream habitat, summer time series of stream temperature and DO (at 15-min intervals) were monitored over the last three years along these streams, and a Continuously Stirred Tank Reactor (CSTR) model was developed and calibrated with these data. The driving forces of the diel trends and the overall levels of stream temperature and DO were identified by this model. Simulations were conducted with assumed managed flow conditions to illustrate potential effects of various stream flow regimes on stream temperature and DO time series. The goal of this research is to provide an accurate simulation tool to guide management decisions.
Linear genetic programming for time-series modelling of daily flow rate
Indian Academy of Sciences (India)
Aytac Guven
2009-04-01
In this study linear genetic programming (LGP), which is a variant of genetic programming, and two versions of neural networks (NNs) are used in predicting time series of daily flow rates at a station on the Schuylkill River at Berne, PA, USA. The daily flow rate at present is predicted based on different time series scenarios. For this purpose, various LGP and NN models are calibrated with training sets and validated on testing sets. Additionally, the robustness of the proposed LGP and NN models is evaluated on application data, which are used neither in training nor at the testing stage. The results showed that both techniques predicted the flow rate data in quite good agreement with the observed ones, and that the predictions of LGP and NN are competitive. The performance of LGP, which was moderately better than that of NN, is very promising and hence supports the use of LGP in predicting river flow data.
Stochastic modeling for time series InSAR: with emphasis on atmospheric effects
Cao, Yunmeng; Li, Zhiwei; Wei, Jianchao; Hu, Jun; Duan, Meng; Feng, Guangcai
2017-08-01
Despite the many applications of time series interferometric synthetic aperture radar (TS-InSAR) techniques in geophysical problems, error analysis and assessment have been largely overlooked. Tropospheric propagation error is still the dominant error source of InSAR observations. However, the spatiotemporal variation of atmospheric effects is seldom considered in the present standard TS-InSAR techniques, such as persistent scatterer interferometry and small baseline subset interferometry. The failure to consider the stochastic properties of atmospheric effects not only affects the accuracy of the estimators, but also makes it difficult to assess the uncertainty of the final geophysical results. To address this issue, this paper proposes a network-based variance-covariance estimation method to model the spatiotemporal variation of tropospheric signals, and to estimate the temporal variance-covariance matrix of TS-InSAR observations. The constructed stochastic model is then incorporated into the TS-InSAR estimators both for parameters (e.g., deformation velocity, topography residual) estimation and uncertainty assessment. It is an incremental and positive improvement to the traditional weighted least squares methods to solve the multitemporal InSAR time series. The performance of the proposed method is validated by using both simulated and real datasets.
Time series modeling of soil moisture dynamics on a steep mountainous hillside
Kim, Sanghyun
2016-05-01
The response of soil moisture to rainfall events along hillslope transects is an important hydrologic process and a critical component of interactions between soil vegetation and the atmosphere. In this context, the research described in this article addresses the spatial distribution of soil moisture as a function of topography. In order to characterize the temporal variation in soil moisture on a steep mountainous hillside, a transfer function, including a model for noise, was introduced. Soil moisture time series with similar rainfall amounts, but different wetness gradients were measured in the spring and fall. Water flux near the soil moisture sensors was modeled and mathematical expressions were developed to provide a basis for input-output modeling of rainfall and soil moisture using hydrological processes such as infiltration, exfiltration and downslope lateral flow. The characteristics of soil moisture response can be expressed in terms of model structure. A seasonal comparison of models reveals differences in soil moisture response to rainfall, possibly associated with eco-hydrological process and evapotranspiration. Modeling results along the hillslope indicate that the spatial structure of the soil moisture response patterns mainly appears in deeper layers. Similarities between topographic attributes and stochastic model structures are spatially organized. The impact of temporal and spatial discretization scales on parameter expression is addressed in the context of modeling results that link rainfall events and soil moisture.
Effective Feature Preprocessing for Time Series Forecasting
DEFF Research Database (Denmark)
Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao
2006-01-01
Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting … performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.
Artificial neural networks for modeling time series of beach litter in the southern North Sea.
Schulz, Marcus; Matthies, Michael
2014-07-01
In European marine waters, existing monitoring programs of beach litter need to be improved concerning the litter items used as indicators of pollution levels, efficiency, and effectiveness. In order to ease and focus future monitoring of beach litter on a few important litter items, feed-forward neural networks consisting of three layers were developed to relate single litter items to general categories of marine litter. The neural networks developed were applied to seven beaches in the southern North Sea and modeled time series of five general categories of marine litter, such as litter from fishing, shipping, and tourism. Results of regression analyses show that the general categories were predicted significantly, with moderate to good fit. Measured and modeled data were of the same order of magnitude, and minima and maxima overlapped well. Neural networks were found to be eligible tools to deliver reliable predictions of marine litter with low computational effort and little input of information. Copyright © 2014 Elsevier Ltd. All rights reserved.
Research on Time-series Modeling and Filtering Methods for MEMS Gyroscope Random Drift Error
Wang, Xiao Yi; Meng, Xiu Yun
2017-03-01
The precision of MEMS gyroscopes is reduced by random drift error. This paper applies time series analysis to model the random drift error of a MEMS gyroscope. Based on the established model, a Kalman filter is employed to compensate for the error. To overcome the disadvantages of the conventional Kalman filter, the Sage-Husa adaptive filtering algorithm is utilized to improve the accuracy of the filtering results, and the orthogonality property of the innovation sequence is used to deal with outliers. The results show that, compared with the conventional Kalman filter, the modified filter not only enhances filtering accuracy but also resists outliers, assuring the stability of the filtering and thus improving gyroscope performance.
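The model-plus-filter idea can be sketched with an AR(1) drift model and a scalar Kalman filter. This is a minimal stand-in without the Sage-Husa adaptation or outlier handling; all parameters and the simulated data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate AR(1) random drift observed through white measurement noise
n, phi, q, r = 2000, 0.95, 0.01, 0.5   # AR coefficient, process var, measurement var
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
z = x + rng.normal(0, np.sqrt(r), n)   # noisy gyroscope readings

# Scalar Kalman filter built on the fitted AR(1) drift model
xhat = np.zeros(n)
P = 1.0
for t in range(1, n):
    x_pred = phi * xhat[t - 1]          # predict
    P_pred = phi**2 * P + q
    K = P_pred / (P_pred + r)           # Kalman gain
    xhat[t] = x_pred + K * (z[t] - x_pred)  # update with the innovation
    P = (1 - K) * P_pred

rmse_raw = np.sqrt(np.mean((z - x) ** 2))
rmse_kf = np.sqrt(np.mean((xhat - x) ** 2))
```

The Sage-Husa variant would additionally re-estimate q and r online from the innovation sequence; the plain filter above already reduces the error of the raw readings substantially.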
Modelling the neurovascular habituation effect on fMRI time series
Energy Technology Data Exchange (ETDEWEB)
Ciuciu, Ph.; Sockeel, S.; Vincent, T. [NeuroSpin/CEA, F-91191 Gif-sur-Yvette (France); Idier, J. [IRCCyN/CNRS, 1 rue de la Noe 44300 Nantes (France)
2009-07-01
In this paper, a novel non-stationary model of functional Magnetic Resonance Imaging (fMRI) time series is proposed. It allows us to account for a putative habituation effect arising in event-related fMRI paradigms, which involves the so-called repetition-suppression phenomenon and induces response magnitudes that decrease over successive trials. The model is defined over functionally homogeneous regions of interest (ROIs) and embedded in a joint detection-estimation approach to brain activity. Importantly, its non-stationary character is embodied in the trial-varying nature of the BOLD response magnitude. Habituation and activation maps are then estimated within the Bayesian framework in a fully unsupervised MCMC procedure. On artificial fMRI datasets, we show that habituation effects can be accurately recovered in activating voxels. (authors)
Institute of Scientific and Technical Information of China (English)
YE Liming; YANG Guixia; Eric VAN RANST; TANG Huajun
2013-01-01
A generalized, structural, time series modeling framework was developed to analyze the monthly records of absolute surface temperature, one of the most important environmental parameters, using a deterministic-stochastic combined (DSC) approach. Although the development of the framework was based on the characterization of the variation patterns of a global dataset, the methodology could be applied to any monthly absolute temperature record. Deterministic processes were used to characterize the variation patterns of the global trend and the cyclic oscillations of the temperature signal, involving polynomial functions and the Fourier method, respectively, while stochastic processes were employed to account for any remaining patterns in the temperature signal, involving seasonal autoregressive integrated moving average (SARIMA) models. A prediction of the monthly global surface temperature during the second decade of the 21st century using the DSC model shows that the global temperature will likely continue to rise at twice the average rate of the past 150 years. The evaluation of prediction accuracy shows that DSC models perform systematically well against selected models of other authors, suggesting that DSC models, when coupled with other eco-environmental models, can be used as a supplemental tool for short-term (~10-year) environmental planning and decision making.
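The deterministic half of such a decomposition, polynomial trend plus Fourier cycle fitted jointly by least squares, with the residual left for a SARIMA model, can be sketched on synthetic monthly data (the trend slope, cycle amplitude, and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monthly temperature record: trend + annual cycle + noise
n = 600                      # 50 years of monthly values
t = np.arange(n)
series = 0.002 * t + 3.0 * np.sin(2 * np.pi * t / 12 + 0.5) + rng.normal(0, 0.4, n)

# Deterministic part: quadratic trend + first annual Fourier harmonic,
# fitted jointly by ordinary least squares
X = np.column_stack([
    np.ones(n), t, t**2,                       # polynomial trend terms
    np.sin(2 * np.pi * t / 12),                # annual cycle (sine)
    np.cos(2 * np.pi * t / 12),                # annual cycle (cosine)
])
coef, *_ = np.linalg.lstsq(X, series, rcond=None)
deterministic = X @ coef
residual = series - deterministic              # would be handed to a SARIMA model

r2 = 1 - residual.var() / series.var()         # variance explained deterministically
```

Fitting a SARIMA model to `residual` (e.g. with a standard time series library) would complete the DSC combination; the deterministic stage alone already captures most of the variance here.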
Multi-horizon solar radiation forecasting for Mediterranean locations using time series models
Voyant, Cyril; Muselli, Marc; Nivet, Marie Laure
2013-01-01
Considering the grid manager's point of view, needs in terms of prediction of intermittent energy like the photovoltaic resource can be distinguished according to the considered horizon: following days (d+1, d+2 and d+3), next day by hourly step (h+24), next hour (h+1) and next few minutes (m+5, e.g.). Through this work, we have identified methodologies using time series models for the prediction of global radiation and photovoltaic power at these horizons. What we present here is a comparison of different predictors developed and tested to propose a hierarchy. For horizons d+1 and h+1, without advanced ad hoc time series pre-processing (stationarity), we find it is not easy to differentiate between autoregressive moving average (ARMA) and multilayer perceptron (MLP) models. However, we observed that using exogenous variables significantly improves the MLP results. We have shown that MLPs were better adapted for horizons h+24 and m+5. In summary, our results are complementary and improve the existing prediction techniques ...
A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis
Zhang, Zhongqiu; Sun, Liren; Xu, Cui
2016-01-01
The rapid industrial development has led to the intermittent outbreak of pm2.5 or haze in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability on the accuracy of the next 3–7 days' AQI prediction. PMID:27597861
A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis
Directory of Open Access Journals (Sweden)
Xiaoping Yang
2016-01-01
The rapid industrial development has led to the intermittent outbreak of pm2.5 or haze in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability on the accuracy of the next 3–7 days' AQI prediction.
A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis.
Yang, Xiaoping; Zhang, Zhongxia; Zhang, Zhongqiu; Sun, Liren; Xu, Cui; Yu, Li
2016-01-01
The rapid industrial development has led to the intermittent outbreak of pm2.5 or haze in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability on the accuracy of the next 3-7 days' AQI prediction.
Estimating and Analyzing Savannah Phenology with a Lagged Time Series Model.
Directory of Open Access Journals (Sweden)
Niklas Boke-Olén
Full Text Available Savannah regions are predicted to undergo changes in precipitation patterns under current climate change projections. This change will affect leaf phenology, which controls net primary productivity. Studying this is important because savannahs play a major role in the global carbon cycle due to their areal coverage, and can affect food security in regions that depend on subsistence farming. In this study we investigate how soil moisture, mean annual precipitation, and day length control savannah phenology by developing a lagged time series model. The model uses climate data for 15 flux tower sites across four continents, together with normalized difference vegetation index from satellite, to optimize a statistical phenological model. We show that all three variables can be used to estimate savannah phenology on a global scale. However, it was not possible to create a simplified savannah model that works equally well for all sites on the global scale without including more site-specific parameters. The simplified model showed no bias towards tree cover or between continents and resulted in a cross-validated r2 of 0.6 and a root mean squared error of 0.1. We therefore expect similar average results when applying the model to other savannah areas, and further expect that it could be used to estimate the productivity of savannah regions.
Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine
2017-06-01
One of the most hazardous physical polluting agents, considering its effects on human health, is acoustical noise. Airports are a strong source of acoustical noise, due to airplane turbines, the aerodynamic noise of transits, acceleration and braking during the take-off and landing phases of aircraft, the road traffic around the airport, etc. Monitoring and predicting the acoustical level emitted by an airport can be very useful for assessing the impact on human health and activities. In the airport noise scenario, thanks to flight scheduling, the predominant sources may have a periodic behaviour. Thus, a time series analysis approach can be adopted, since a general trend and a seasonal behaviour can be highlighted and used to build a predictive model. In this paper, two different approaches are adopted and two predictive models are constructed and tested. The first model is based on deterministic decomposition and is built by composing the trend, that is, the long-term behaviour; the seasonality, that is, the periodic component; and the random variations. The second model is based on a seasonal autoregressive moving average, and belongs to the stochastic class of models. The two models are fitted on an acoustical level dataset collected close to the Nice (France) international airport. Results are encouraging and show good prediction performance for both of the adopted strategies. A residual analysis is performed in order to quantify the features of the forecasting error.
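The deterministic decomposition described above (trend + seasonality + random variations) can be sketched as follows. The synthetic weekly-periodic series, period length, and all numbers are illustrative stand-ins, not the Nice airport data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic periodic "noise level" series: linear trend + weekly cycle + noise.
period = 7
n = 28 * period
t = np.arange(n)
series = 60 + 0.01 * t + 5 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, n)

# 1) Trend: ordinary least-squares straight line (the long-term behaviour).
slope, intercept = np.polyfit(t, series, 1)
trend = intercept + slope * t

# 2) Seasonality: mean detrended value at each position within the period.
detrended = series - trend
seasonal_profile = np.array([detrended[t % period == k].mean() for k in range(period)])
seasonal = seasonal_profile[t % period]

# 3) Random component: what trend and seasonality do not explain.
residual = series - trend - seasonal

# Forecast for the next period: extrapolated trend + repeated seasonal profile.
h = np.arange(n, n + period)
forecast = intercept + slope * h + seasonal_profile[h % period]
```

The stochastic (seasonal ARMA) alternative mentioned in the abstract would instead fit the autocorrelation structure directly; the decomposition above is the simpler deterministic route.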
Creating Discriminative Models for Time Series Classification and Clustering by HMM Ensembles.
Asadi, Nazanin; Mirzaei, Abdolreza; Haghshenas, Ehsan
2016-12-01
Classification of temporal data sequences is a fundamental branch of machine learning with a broad range of real-world applications. Since the dimensionality of temporal data is significantly larger than that of static data, and its modeling and interpretation are more complicated, performing classification and clustering on temporal data is more complex as well. Hidden Markov models (HMMs) are well-known statistical models for the modeling and analysis of sequence data. Ensemble methods, which employ multiple models to obtain the target model, have also shown good performance in experiments. These facts strongly motivate employing HMM ensembles for the classification and clustering of time series data. So far, no effective classification or clustering method based on HMM ensembles has been proposed, and the limited existing HMM ensemble methods struggle with the vital task of separating models of distinct classes. In this paper, a new framework based on HMM ensembles for classification and clustering is therefore proposed. Besides its strong theoretical grounding in the Rényi entropy for the ensemble learning procedure, the main contribution of the proposed method is addressing the difficulty HMM-based methods have in separating models of distinct classes, by considering the inverse emission matrix of the opposite class to build an opposite model. The proposed algorithms perform more effectively than other methods, especially other HMM ensemble-based methods. Moreover, the proposed clustering framework, which draws benefits from both similarity-based and model-based methods, together with the Rényi-based ensemble method, showed its superiority on several measurements.
Buishand, T. A.; Klein Tank, A. M. G.
1996-05-01
The precipitation amounts on wet days at De Bilt (the Netherlands) are linked to temperature and surface air pressure through advanced regression techniques. Temperature is chosen as a covariate to use the model for generating synthetic time series of daily precipitation in a CO2 induced warmer climate. The precipitation-temperature dependence can partly be ascribed to the phenomenon that warmer air can contain more moisture. Spline functions are introduced to reproduce the non-monotonous change of the mean daily precipitation amount with temperature. Because the model is non-linear and the variance of the errors depends on the expected response, an iteratively reweighted least-squares technique is needed to estimate the regression coefficients. A representative rainfall sequence for the situation of a systematic temperature rise is obtained by multiplying the precipitation amounts in the observed record with a temperature dependent factor based on a fitted regression model. For a temperature change of 3°C (reasonable guess for a doubled CO2 climate according to the present-day general circulation models) this results in an increase in the annual average amount of 9% (20% in winter and 4% in summer). An extended model with both temperature and surface air pressure is presented which makes it possible to study the additional effects of a potential systematic change in surface air pressure on precipitation.
Time-series microarray data simulation modeled with a case-control label.
Liu, Y J; Zhang, J Y
2016-05-12
With advances in molecular biology, microarray data have become an important resource in the exploration of complex human diseases. Although gene chip technology continues to grow, there are still many barriers to overcome, such as high costs, small sample sizes, complex procedures, poor repeatability, and the dependence on data analysis methods. To avoid these problems, simulation data have a vital role in the study of complex diseases. A simulation method of microarray data is introduced in this study to model the occurrence and development of general diseases. Using classic statistics and control theory, five risk models are proposed. One or more models can be introduced into the baseline simulation dataset with a case-control label. In addition, time-series gene expression data can be generated to model the dynamic evolutionary process of a disease. The prevalence of each model is estimated and disease-associated genes are tested by significance analysis of microarrays. The source code, written in MATLAB, is freely and publicly available at http://sourceforge.net/projects/genesimulation/files/.
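A minimal sketch of the kind of simulation described above: a null (baseline) expression matrix with a case-control label, one simple "risk model" injected into the case samples, and a time-series variant in which the effect grows with disease progression. All dimensions, the effect size, and the choice of risk genes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

n_genes, n_cases, n_controls = 1000, 30, 30
labels = np.array([1] * n_cases + [0] * n_controls)   # 1 = case, 0 = control
risk_genes = np.arange(20)      # hypothetical disease-associated genes
effect = 1.5                    # assumed mean expression shift in cases

# Baseline (null) log-expression for all genes and samples.
data = rng.normal(0.0, 1.0, size=(n_genes, n_cases + n_controls))

# Introduce one simple risk model: a constant shift in the case columns.
case_cols = np.where(labels == 1)[0]
data[np.ix_(risk_genes, case_cols)] += effect

# Time-series variant: the effect grows over normalized disease stages,
# modeling the dynamic evolutionary process of a disease.
stages = np.linspace(0.0, 1.0, 5)
series = []
for s in stages:
    frame = rng.normal(0.0, 1.0, size=(n_genes, n_cases + n_controls))
    frame[np.ix_(risk_genes, case_cols)] += effect * s
    series.append(frame)
```

Significance analysis (as in the paper) would then test each gene for a case-control difference; here the 20 risk genes carry the signal by construction.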
The string prediction models as invariants of time series in the forex market
Pincak, R.
2013-12-01
In this paper we apply a new, string-theory-inspired approach to real financial market data. The models are constructed around the idea of prediction models based on string invariants (PMBSI). The performance of PMBSI is compared to support vector machines (SVM) and artificial neural networks (ANN) on an artificial and a financial time series, and a brief overview of the results and analysis is given. The first model is based on the correlation function as the invariant, and the second is an application based on deviations from the closed string/pattern form (PMBCS). We examine the differences between these two approaches: the first model cannot predict the behavior of the forex market as efficiently as the second, which is moreover able to generate a meaningful yearly profit. The presented string models could be useful for portfolio creation and financial risk management in the banking sector, as well as for a nonlinear statistical approach to data optimization.
Optimizing the De-Noise Neural Network Model for GPS Time-Series Monitoring of Structures
Directory of Open Access Journals (Sweden)
Mosbeh R. Kaloop
2015-09-01
Full Text Available The Global Positioning System (GPS) is now widely used in structures and other applications. Nevertheless, GPS accuracy still suffers from the errors afflicting the measurements, particularly for the short-period displacement of structural components. Previously, multi-filter methods were utilized to remove the displacement errors. This paper presents a novel application of neural network prediction models to improve GPS monitoring time series data. Four prediction models based on different learning algorithms are applied with neural network solutions: back-propagation, cascade-forward back-propagation, an adaptive filter, and an extended Kalman filter, to assess which model can be recommended. Simulated noise and a bridge's short-period GPS displacement component, monitored at a one Hz sampling frequency, are used to validate the four models and the previous method. The results show that the adaptive neural network filter is recommended for de-noising the observations, specifically the GPS displacement components of structures. This model is also expected to have a significant influence on the design of structures in the low-frequency responses and measurement contents.
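One way an adaptive filter can de-noise a displacement series is the LMS adaptive line enhancer: an FIR filter, updated sample by sample, learns to predict the narrowband (structural) component while the broadband measurement noise remains unpredictable and is rejected. This is a generic sketch, not the paper's specific filter, and the sinusoidal "displacement" and noise levels are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 4000
t = np.arange(n)
clean = np.sin(2 * np.pi * 0.01 * t)      # slow structural displacement proxy
noisy = clean + rng.normal(0, 0.5, n)     # GPS-like broadband measurement noise

# LMS adaptive line enhancer: predict the current sample from a delayed
# window of past samples; the predictable (narrowband) part is retained.
order, mu, delay = 32, 0.005, 1
w = np.zeros(order)
enhanced = np.zeros(n)
for i in range(order + delay, n):
    x = noisy[i - delay - order:i - delay][::-1]   # delayed window of past samples
    y = w @ x                                      # filter prediction
    e = noisy[i] - y                               # prediction error
    w += 2 * mu * e * x                            # LMS weight update
    enhanced[i] = y

# After convergence the enhanced output tracks the clean signal better
# than the raw measurements do.
err_raw = np.mean((noisy[2000:] - clean[2000:]) ** 2)
err_enh = np.mean((enhanced[2000:] - clean[2000:]) ** 2)
```

The step size `mu` trades convergence speed against steady-state noise in the weights; it must stay well below the LMS stability bound set by the filter order and input power.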
U.S. Geological Survey, Department of the Interior — Abstract: This data release presents modeled time series of nearshore waves along the southern California coast, from Point Conception to the Mexican border,...
Accurate estimation of energy expenditure (EE) in children and adolescents is required for a better understanding of physiological, behavioral, and environmental factors affecting energy balance. Cross-sectional time series (CSTS) models, which account for correlation structure of repeated observati...
Bayesian models of thermal and pluviometric time series in the Fucino plateau
Directory of Open Access Journals (Sweden)
Adriana Trabucco
2011-09-01
Full Text Available This work was developed within the project Metodologie e sistemi integrati per la qualificazione di produzioni orticole del Fucino (Methodologies and integrated systems for the classification of horticultural products in the Fucino plateau), sponsored by the Italian Ministry of Education, University and Research, Strategic Projects, Law 448/97. Agro-system management, especially when high quality in speciality crops must be achieved, requires knowledge of the main features and intrinsic variability of the climate. Statistical models may properly summarize the structure behind the observed variability; furthermore, they may support the agronomic manager by providing the probability that meteorological events happen in a time window of interest. More than 30 years of daily values collected at four sites located on the Fucino plateau, Abruzzo region, Italy, were studied by fitting Bayesian generalized linear models to maximum/minimum air temperature and rainfall time series. Bayesian predictive distributions of climate variables supporting decision-making processes were calculated at different timescales, 5 days for temperature and 10 days for rainfall, both to reduce computational effort and to simplify the statistical model assumptions. Technicians and field operators, even with limited statistical training, may exploit the model output by inspecting graphs and climatic profiles of the cultivated areas during decision-making processes. Realizations taken from the predictive distributions may also be used as input for agro-ecological models (e.g. models of crop growth or water balance). The fitted models may be exploited to monitor climatic change and to revise the climatic profiles of areas of interest, periodically updating the probability distributions of the target climatic variables. For the sake of brevity, the description of results is limited to just one of the four sites; results for all other sites are available as supplementary information.
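The idea of a Bayesian predictive distribution for one climate window can be illustrated with a deliberately simple conjugate model: a normal likelihood with known variance and a normal prior on the window mean. This is a strong simplification of the paper's generalized linear models, and all numbers (30 synthetic "years", prior, variance) are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 30 years of mean maximum temperature for one 5-day window (°C).
obs = rng.normal(24.0, 2.5, 30)

# Conjugate normal model with known observation variance.
sigma2 = 2.5 ** 2              # assumed known within-window variance
mu0, tau2 = 20.0, 25.0         # vague prior on the window mean
n = obs.size
post_var = 1.0 / (1.0 / tau2 + n / sigma2)
post_mean = post_var * (mu0 / tau2 + obs.sum() / sigma2)

# Posterior predictive for a future 5-day mean: normal, with variance equal
# to posterior variance plus observation variance.
pred_mean = post_mean
pred_sd = np.sqrt(post_var + sigma2)
lo, hi = pred_mean - 1.96 * pred_sd, pred_mean + 1.96 * pred_sd
```

A field operator would read `(lo, hi)` as a 95% predictive interval for next year's window; the paper's GLMs do the analogous computation with richer likelihoods (e.g. for rainfall occurrence and amounts).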
GPS Position Time Series @ JPL
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users:
- JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS
- JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies
- ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused
- ARIA data system designed to integrate GPS and InSAR: GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR
Zhen Liu's talk covers InSAR time series analysis.
A LINEAR-NEURAL HYBRID MODEL FOR ANALYSIS AND FORECASTING OF TIME-SERIES
MARCELO CUNHA MEDEIROS
1998-01-01
This dissertation presents a nonlinear autoregressive model with exogenous variables (ARX) for time series analysis and forecasting. The model coefficients are estimated by the output of a feed-forward neural network trained with a hybrid optimization algorithm. The results obtained are compared with both linear and nonlinear models.
Goodness-of-fit tests for vector autoregressive models in time series
Institute of Scientific and Technical Information of China (English)
2010-01-01
The paper proposes and studies some diagnostic tools for checking the goodness-of-fit of general parametric vector autoregressive models in time series. The resulting tests are asymptotically chi-squared under the null hypothesis and can detect alternatives converging to the null at a parametric rate. The tests involve weight functions, which provide the flexibility to choose scores for enhancing power performance, especially under directional alternatives. When the alternatives are not directional, we construct asymptotically distribution-free maximin tests for a large class of alternatives. A possibility of constructing score-based omnibus tests is discussed for the case where the alternative is saturated, and the power performance is also investigated. In addition, when the sample size is small, a nonparametric Monte Carlo test approach for dependent data is proposed to improve the performance of the tests. The algorithm is easy to implement. Simulation studies and real applications are carried out for illustration.
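A basic version of the workflow being tested here, fitting a VAR by least squares and checking residual autocovariances, can be sketched as follows. The portmanteau-type statistic below is a standard, simplified stand-in for the paper's weighted score tests, and the bivariate VAR(1) is simulated for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t, e_t ~ N(0, I).
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
n = 500
y = np.zeros((n, 2))
for t in range(1, n):
    y[t] = A @ y[t - 1] + rng.normal(0.0, 1.0, 2)

# Multivariate least-squares fit of the VAR(1) coefficient matrix.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.solve(X.T @ X, X.T @ Y).T
resid = Y - X @ A_hat.T

# Portmanteau-type diagnostic: under a correct model, residual
# autocovariances C_l at lags 1..h are near zero, and the statistic is
# approximately chi-squared with k^2 (h - p) degrees of freedom.
h = 5
T = resid.shape[0]
C0 = resid.T @ resid / T
C0_inv = np.linalg.inv(C0)
stat = 0.0
for lag in range(1, h + 1):
    Cl = resid[lag:].T @ resid[:-lag] / T
    stat += T * np.trace(Cl.T @ C0_inv @ Cl @ C0_inv)
```

With k = 2, h = 5, p = 1 the reference distribution has 16 degrees of freedom, so values far above ~26 (the 95% point) would indicate lack of fit; here the model is correct by construction.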
Study on Apparent Kinetic Prediction Model of the Smelting Reduction Based on the Time-Series
Directory of Open Access Journals (Sweden)
Guo-feng Fan
2012-01-01
Full Text Available A series of direct smelting reduction experiments has been carried out with high-phosphorus iron ores of different basicities using a thermogravimetric analyzer, and derivative thermogravimetric (DTG) data have been obtained from the experiments. The one-step-forward local weighted linear (LWL) method, one of the most suitable ways of predicting chaotic time series with a focus on the errors, is used to predict the DTG. Meanwhile, empirical mode decomposition-autoregressive (EMD-AR) modeling, a data mining technique from signal processing, is also used to predict the DTG. The results show that (1) EMD-AR(4) is the most appropriate model and its error is smaller than that of the former; (2) the root mean square error (RMSE) has decreased by about two-thirds; and (3) the normalized root mean square error (NMSE) has decreased by an order of magnitude. Finally, the EMD-AR method is improved by golden section weighting, making its error smaller than before. The improved EMD-AR model is therefore a promising alternative for predicting the apparent reaction rate (DTG). The analytical results provide an important reference in the field of industrial control.
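The one-step-forward LWL method can be sketched as below: embed the series in delay space, find the nearest past neighbours of the latest delay vector, and fit a distance-weighted affine map to their one-step successors. The logistic map stands in for the DTG signal (which is not reproduced in the abstract), and the embedding dimension, neighbour count, and ridge term are illustrative choices:

```python
import numpy as np

# Chaotic test series from the logistic map.
n = 600
x = np.empty(n)
x[0] = 0.4
for i in range(n - 1):
    x[i + 1] = 3.9 * x[i] * (1.0 - x[i])

def lwl_predict(series, m=3, k=20, ridge=1e-6):
    """One-step-forward local weighted linear (LWL) prediction."""
    # All m-length delay vectors that have a known one-step successor.
    vecs = np.array([series[i:i + m] for i in range(len(series) - m)])
    succ = series[m:]
    query = series[-m:]                     # latest delay vector
    d = np.linalg.norm(vecs - query, axis=1)
    idx = np.argsort(d)[:k]                 # k nearest neighbours
    w = 1.0 / (d[idx] + 1e-9)               # closer neighbours weigh more
    Xd = np.hstack([np.ones((k, 1)), vecs[idx]])
    W = np.diag(w)
    # Ridge term guards against collinear neighbourhoods on the attractor.
    beta = np.linalg.solve(Xd.T @ W @ Xd + ridge * np.eye(m + 1),
                           Xd.T @ W @ succ[idx])
    return float(np.concatenate(([1.0], query)) @ beta)

# Predict the final value from everything before it.
pred = lwl_predict(x[:-1])
```

Because chaotic dynamics are locally smooth, the local affine fit tracks the map closely one step ahead even though long-horizon prediction is impossible.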
Zhang, Tingting; Wu, Jingwei; Li, Fan; Caffo, Brian; Boatman-Reich, Dana
2015-03-01
We introduce a dynamic directional model (DDM) for studying brain effective connectivity based on intracranial electrocorticographic (ECoG) time series. The DDM consists of two parts: a set of differential equations describing neuronal activity of brain components (state equations), and observation equations linking the underlying neuronal states to observed data. When applied to functional MRI or EEG data, DDMs usually have complex formulations and thus can accommodate only a few regions, due to limitations in spatial resolution and/or temporal resolution of these imaging modalities. In contrast, we formulate our model in the context of ECoG data. The combined high temporal and spatial resolution of ECoG data results in a much simpler DDM, allowing investigation of complex connections between many regions. To identify functionally segregated sub-networks, a form of biologically economical brain networks, we propose the Potts model for the DDM parameters. The neuronal states of brain components are represented by cubic spline bases and the parameters are estimated by minimizing a log-likelihood criterion that combines the state and observation equations. The Potts model is converted to the Potts penalty in the penalized regression approach to achieve sparsity in parameter estimation, for which a fast iterative algorithm is developed. The methods are applied to an auditory ECoG dataset.
Cheng, Qing; Lu, Xin; Wu, Joseph T.; Liu, Zhong; Huang, Jincai
2016-01-01
Guangdong experienced the largest dengue epidemic in recent history. In 2014, the number of dengue cases was the highest of the previous 10 years and comprised more than 90% of all cases. In order to analyze the heterogeneous transmission of dengue, a multivariate time series model decomposing dengue risk additively into endemic, autoregressive, and spatiotemporal components was used to model dengue transmission. Moreover, random effects were introduced into the model to deal with heterogeneous dengue transmission and incidence levels, and a power-law approach was embedded into the model to account for spatial interaction. There was little spatial variation in the autoregressive component. In contrast, for the endemic component there was a pronounced heterogeneity between the Pearl River Delta area and the remaining districts. For the spatiotemporal component, there was considerable heterogeneity across districts, with the highest values in some western and eastern districts. Clustering analysis revealed the patterns driving dengue transmission: the endemic component's contribution appears important in the Pearl River Delta area, where incidence is high (95 per 100,000), while areas with relatively low incidence (4 per 100,000) are highly dependent on spatiotemporal spread and local autoregression. PMID:27666657
A Study of Time Series Model for Predicting Jute Yarn Demand: Case Study
Directory of Open Access Journals (Sweden)
C. L. Karmaker
2017-01-01
Full Text Available In today's competitive environment, predicting sales for upcoming periods in the right quantity is crucial for ensuring product availability as well as improving customer satisfaction. This paper develops a model to identify the most appropriate forecasting method based on the lowest values of the forecasting errors. The necessary sales data for jute yarn were collected from a jute product manufacturer in Bangladesh, namely Akij Jute Mills, Akij Group Ltd., in Noapara, Jessore. A time series plot of the demand data indicates that demand fluctuates over time. In this paper, eight different forecasting techniques, including simple moving average, single exponential smoothing, trend analysis, the Winters method, and Holt's method, were applied using Minitab 17 software. The performance of all methods was evaluated on the basis of forecasting accuracy, and the analysis shows that the additive Winters model gives the best performance in terms of the lowest error measures. This work can serve as a guide for Bangladeshi manufacturers as well as other researchers in identifying the most suitable forecasting technique for their industry.
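The method-selection procedure described above, fitting several forecasters on a training window and keeping the one with the lowest holdout error, can be sketched with minimal hand-rolled implementations of two of the candidates (not Minitab, and on synthetic trending demand rather than the Akij data):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical monthly demand with an upward trend plus noise.
t = np.arange(48)
demand = 200 + 2.5 * t + rng.normal(0, 10, 48)
train, test = demand[:36], demand[36:]

def ses_forecast(y, alpha, h):
    """Single exponential smoothing: flat forecast at the final level."""
    level = y[0]
    for v in y[1:]:
        level = alpha * v + (1 - alpha) * level
    return np.full(h, level)

def holt_forecast(y, alpha, beta, h):
    """Holt's linear method: level plus trend extrapolation."""
    level, trend = y[0], y[1] - y[0]
    for v in y[1:]:
        prev = level
        level = alpha * v + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + trend * np.arange(1, h + 1)

h = len(test)
errors = {
    "SES": np.mean(np.abs(ses_forecast(train, 0.3, h) - test)),
    "Holt": np.mean(np.abs(holt_forecast(train, 0.3, 0.1, h) - test)),
}
best = min(errors, key=errors.get)
```

On trending data Holt's method wins, exactly the kind of verdict the paper's error comparison produces; the Winters model would add a seasonal term on top of Holt's level and trend.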
Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions
Barry Tyler. Wilson
2015-01-01
This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...
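Harmonic analysis of a dense, irregularly sampled time series typically means fitting sine and cosine terms at the annual frequency by least squares. A minimal first-order sketch, on a synthetic NDVI-like pixel series rather than actual Landsat data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical vegetation-index signal sampled irregularly over three years,
# standing in for a dense (cloud-gapped) Landsat pixel time series.
days = np.sort(rng.choice(np.arange(3 * 365), size=120, replace=False))
signal = 0.5 + 0.2 * np.sin(2 * np.pi * days / 365.25 - 1.0) + rng.normal(0, 0.03, 120)

# First-order harmonic model: y = c + a*cos(wt) + b*sin(wt), fit by OLS.
w = 2 * np.pi / 365.25
X = np.column_stack([np.ones(len(days)), np.cos(w * days), np.sin(w * days)])
coef, *_ = np.linalg.lstsq(X, signal, rcond=None)
fitted = X @ coef

# Mean level and seasonal amplitude summarize the pixel's phenology;
# changes in these coefficients over time flag changing forest conditions.
mean_level = coef[0]
amplitude = np.hypot(coef[1], coef[2])
```

Irregular sampling poses no problem because the design matrix is evaluated at the actual acquisition dates; higher-order harmonics can be added as extra columns.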
Revealing the Organization of Complex Adaptive Systems through Multivariate Time Series Modeling
Directory of Open Access Journals (Sweden)
David G. Angeler
2011-09-01
Full Text Available Revealing the adaptive responses of ecological, social, and economic systems to a transforming biosphere is crucial for understanding system resilience and preventing collapse. However, testing the theory that underpins complex adaptive system organization (e.g., panarchy theory is challenging. We used multivariate time series modeling to identify scale-specific system organization and, by extension, apparent resilience mechanisms. We used a 20-year time series of invertebrates and phytoplankton from 26 Swedish lakes to test the proposition that a few key-structuring environmental variables at specific scales create discontinuities in community dynamics. Cross-scale structure was manifested in two independent species groups within both communities across lakes. The first species group showed patterns of directional temporal change, which was related to environmental variables that acted at broad spatiotemporal scales (reduced sulfate deposition, North Atlantic Oscillation. The second species group showed fluctuation patterns, which often could not be explained by environmental variables. However, when significant relationships were found, species-group trends were predicted by variables (total organic carbon, nutrients that acted at narrower spatial scales (i.e., catchment and lake. Although the sets of environmental variables that predicted the species groups differed between phytoplankton and invertebrates, the scale-specific imprints of keystone environmental variables for creating cross-scale structure were clear for both communities. Temporal trends of functional groups did not track the observed structural changes, suggesting functional stability despite structural change. Our approach allows for identifying scale-specific patterns and processes, thus providing opportunities for better characterization of complex adaptive systems organization and dynamics. This, in turn, holds potential for more accurate evaluation of resilience in
Monitoring and modeling of wetland environment using time-series bi-sensor remotely sensed data
Michishita, Ryo
More than half of the wetlands in the world have been lost in the last century, mainly due to human activities. Since natural wetlands receive a significant amount of untreated runoff from urban and agricultural areas, it is necessary to account for other landscapes adjacent to wetlands, such as water bodies, agricultural areas, and urban areas, in the protection and restoration of wetlands. The goal of this dissertation is to monitor and model land cover changes using time-series Landsat-5 TM and Terra MODIS data in the Poyang Lake area of China from two perspectives: wetland cover changes and urbanization. A bi-scale monitoring approach was adopted in the monitoring and modeling of wetland cover changes to examine the similarities and differences derived from remotely sensed imagery with different spatial resolutions. The effects of different modeling settings of multiple endmember spectral mixture analysis (MESMA) were examined using a single pair of TM and MODIS scenes. MESMA applied to nine pairs of TM and MODIS scenes acquired from July 2004 to October 2005 captured phenological and hydrological trends of land cover fractions (LCFs) and the LCF agreement between the image pairs. Ground surface reflectance, rather than LCFs, was chosen as the key parameter in the blending of bi-scale remotely sensed data that utilizes the spatial details of one data type and the temporal details of the other. This research customized an existing fusion model to overcome the problem of unobserved pixels in MODIS data acquired on TM data acquisition dates. Interestingly, the input data combination that considered water level change achieved higher accuracy. In the monitoring of urbanization, this research investigated the relationship between urban land cover and human activities, and detected areas of new urban development and redevelopment of built-up areas. The different urbanization processes, largely influenced by the economic reforms of China, were demonstrated
Marc, Odin; Hovius, Niels; Meunier, Patrick; Uchida, Taro; Gorum, Tolga
2016-04-01
Earthquakes impart a catastrophic forcing on hillslopes that often leads to widespread landsliding and can contribute significantly to sedimentary and organic matter fluxes. We present a new expression for the total area and volume of populations of earthquake-induced landslides. This model builds on a set of scaling relationships between key parameters, such as landslide density, ground acceleration, fault size, earthquake source depth, and seismic moment, derived from geomorphological and seismological observations. To assess the model we have assembled and normalized a catalogue of landslide inventories for 40 earthquakes. We have found that low landscape steepness systematically leads to over-prediction of the total area and volume of landslides. When this effect is accounted for, the model is able to predict the landslide areas and associated volumes within a factor of 2 for about two-thirds of the cases in our databases. This is a significant improvement on a previously published empirical expression based only on earthquake moment. This model is suitable for integration into landscape evolution models, and for application to the assessment of secondary hazards and risks associated with earthquakes. However, it only models landslides associated with the strong ground shaking and neglects the intrinsic permanent damage that also occurs on hillslopes and persists for a longer period. With time series of landslide maps we have constrained the magnitude of the change in landslide susceptibility in the epicentral areas of 4 intermediate to large earthquakes. We propose likely causes for these transient ground strength perturbations and compare our observations to other observations of transient perturbations in epicentral areas, such as increases in suspended sediment transport, seismic velocity reductions, and hydrological perturbations. We conclude with some preliminary observations on the coseismic mass wasting and post-seismic landslide enhancement caused by the 2015 Mw.7
Bendel, David; Beck, Ferdinand; Dittmer, Ulrich
2013-01-01
In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).
Du, Kongchang; Zhao, Ying; Lei, Jiaqiang
2017-09-01
In hydrological time series prediction, singular spectrum analysis (SSA) and discrete wavelet transform (DWT) are widely used as preprocessing techniques for artificial neural network (ANN) and support vector machine (SVM) predictors. These hybrid or ensemble models seem to reduce the prediction error substantially. In the current literature, researchers apply these techniques to the whole observed time series and then use the resulting set of reconstructed or decomposed time series as inputs to the ANN or SVM. However, through two comparative experiments and mathematical deduction, we found that this usage of SSA and DWT in building hybrid models is incorrect. Since SSA and DWT use 'future' values to perform the calculation, the series generated by SSA reconstruction or DWT decomposition contain information from 'future' values. These hybrid models therefore report spuriously high prediction performance and may cause large errors in practice.
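The leakage mechanism described above can be demonstrated without SSA or DWT at all: any non-causal (centred) transform applied to the whole series injects future values into the "features". On a random walk, whose increments are unpredictable by construction, a centred smoother appears to predict the next increment while a causal smoother correctly cannot:

```python
import numpy as np

rng = np.random.default_rng(8)

# Random walk: its increments are pure noise and cannot be predicted.
x = np.cumsum(rng.normal(0, 1, 2000))

t = np.arange(2, len(x) - 1)
target = x[t + 1] - x[t]                     # next increment (unpredictable)
centred = (x[t - 1] + x[t] + x[t + 1]) / 3   # uses the future value x[t+1]
causal = (x[t - 2] + x[t - 1] + x[t]) / 3    # uses past values only

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Deviation of each smoothed value from the current level, used as a
# "predictor" of the next increment.
leaky_corr = corr(centred - x[t], target)
honest_corr = corr(causal - x[t], target)
```

The centred feature correlates strongly with the future increment only because it already contains `x[t+1]`; this is exactly the spurious skill that whole-series SSA reconstruction or DWT decomposition produces in hybrid models. The fix is to recompute the transform causally, using only data available up to each forecast origin.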
A scalable database model for multiparametric time series: a volcano observatory case study
Montalto, Placido; Aliotta, Marco; Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea
2014-05-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of a sampling period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. This standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.
Beyond Rating Curves: Time Series Models for in-Stream Turbidity Prediction
Wang, L.; Mukundan, R.; Zion, M.; Pierson, D. C.
2012-12-01
The New York City Department of Environmental Protection (DEP) manages New York City's water supply, which comprises over 20 reservoirs and supplies over 1 billion gallons of water per day to more than 9 million customers. DEP's "West of Hudson" reservoirs located in the Catskill Mountains are unfiltered per a renewable filtration avoidance determination granted by the EPA. While water quality is usually pristine, high volume storm events occasionally cause the reservoirs to become highly turbid. A logical strategy for turbidity control is to temporarily remove the turbid reservoirs from service. While effective in limiting delivery of turbid water and reducing the need for in-reservoir alum flocculation, this strategy runs the risk of negatively impacting water supply reliability. Thus, it is advantageous for DEP to understand how long a particular turbidity event will affect their system. In order to understand the duration, intensity and total load of a turbidity event, predictions of future in-stream turbidity values are important. Traditionally, turbidity predictions have been carried out by applying streamflow observations/forecasts to a flow-turbidity rating curve. However, predictions from rating curves are often inaccurate due to inter- and intra-event variability in flow-turbidity relationships. Predictions can be improved by applying an autoregressive moving average (ARMA) time series model in combination with a traditional rating curve. Since 2003, DEP and the Upstate Freshwater Institute have compiled a relatively consistent set of 15-minute turbidity observations at various locations on Esopus Creek above Ashokan Reservoir. Using daily averages of these data and streamflow observations at nearby USGS gauges, flow-turbidity rating curves were developed via linear regression. Time series analysis revealed that the linear regression residuals may be represented using an ARMA(1,2) process. Based on this information, flow-turbidity regressions with
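The gain from pairing a rating curve with an autoregressive error model can be sketched on synthetic data. Below, an AR(1) residual stands in for the ARMA(1,2) process reported in the abstract, and all series are made up rather than Esopus Creek data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
log_q = 3.0 + np.cumsum(rng.normal(0.0, 0.1, n))   # synthetic log-streamflow
eps = np.zeros(n)                                  # AR(1) residual process
for t in range(1, n):
    eps[t] = 0.8 * eps[t - 1] + rng.normal(0.0, 0.2)
log_turb = 0.5 + 1.2 * log_q + eps                 # hypothetical rating curve

# Step 1: fit the flow-turbidity rating curve by linear regression.
A = np.column_stack([np.ones(n), log_q])
beta, *_ = np.linalg.lstsq(A, log_turb, rcond=None)
resid = log_turb - A @ beta

# Step 2: model the residual autocorrelation (AR(1) here, a simple
# stand-in for the ARMA(1,2) process identified in the abstract).
phi = np.dot(resid[1:], resid[:-1]) / np.dot(resid[:-1], resid[:-1])

# One-step-ahead forecasts: rating curve alone vs. curve + AR residual.
pred_curve = (A @ beta)[1:]
pred_hybrid = pred_curve + phi * resid[:-1]
mse_curve = np.mean((log_turb[1:] - pred_curve) ** 2)
mse_hybrid = np.mean((log_turb[1:] - pred_hybrid) ** 2)
```

Because the rating-curve residuals are strongly autocorrelated, carrying the last residual forward through the AR term cuts the one-step prediction error well below that of the rating curve alone, which is the intra-event persistence the abstract exploits.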
Ohkubo, Jun
2011-12-01
A scheme is developed for estimating state-dependent drift and diffusion coefficients in a stochastic differential equation from time-series data. The scheme does not require parametric forms for the drift and diffusion coefficients to be specified in advance. To perform the nonparametric estimation, a maximum likelihood method is combined with a concept based on kernel density estimation. To deal with discrete observation or sparsity of the time-series data, a local linearization method is employed, which enables fast estimation.
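A minimal version of such a nonparametric estimator can be sketched with plain kernel regression on the increments; this replaces the paper's maximum-likelihood and local-linearization machinery with the simplest conditional-moment estimates, and uses an Ornstein-Uhlenbeck process as test data:

```python
import numpy as np

rng = np.random.default_rng(2)
dt, n = 0.01, 100_000
# Ornstein-Uhlenbeck test process: drift a(x) = -x, squared diffusion b(x) = 0.5.
x = np.empty(n)
x[0] = 0.0
noise = np.sqrt(0.5 * dt) * rng.normal(size=n - 1)
for t in range(n - 1):
    x[t + 1] = x[t] - x[t] * dt + noise[t]

dx, xs = np.diff(x), x[:-1]

def kernel_drift_diffusion(x0, h=0.1):
    """Nadaraya-Watson estimates at x0: drift ~ E[dX | X=x0] / dt and
    squared diffusion ~ E[dX^2 | X=x0] / dt (higher-order terms ignored)."""
    w = np.exp(-0.5 * ((xs - x0) / h) ** 2)   # Gaussian kernel weights
    return np.sum(w * dx) / (w.sum() * dt), np.sum(w * dx**2) / (w.sum() * dt)

drift_est, diff_est = kernel_drift_diffusion(0.5)
```

At x0 = 0.5 the true values are drift -0.5 and squared diffusion 0.5; with this much data the kernel estimates land close to both, and the bandwidth h is a tuning choice the full method would handle more carefully.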
Regenerating time series from ordinal networks
McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael
2017-03-01
Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discrete sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series by using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
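The construct-and-regenerate loop can be illustrated in a few lines. The fully chaotic logistic map replaces the continuous flows studied in the paper, and pattern length 3 is an arbitrary choice; nodes are ordinal patterns and a weighted random walk on the observed transitions regenerates a surrogate symbol sequence:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(3)

# A chaotic test series from the logistic map.
x = np.empty(2000)
x[0] = 0.4
for t in range(1999):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

def ordinal_pattern(window):
    """Rank-order pattern of a window, e.g. (0, 2, 1) for low-high-mid."""
    return tuple(int(r) for r in np.argsort(np.argsort(window)))

m = 3   # pattern (embedding) length
patterns = [ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1)]

# Directed edge weights count the observed pattern-to-pattern transitions.
edges = defaultdict(lambda: defaultdict(int))
for a, b in zip(patterns[:-1], patterns[1:]):
    edges[a][b] += 1

def random_walk(start, steps):
    """Regenerate a symbol sequence by walking the ordinal network."""
    node, out = start, [start]
    for _ in range(steps):
        nbrs = list(edges[node])
        w = np.array([edges[node][b] for b in nbrs], dtype=float)
        node = nbrs[rng.choice(len(nbrs), p=w / w.sum())]
        out.append(node)
    return out

surrogate = random_walk(patterns[0], 500)
```

The walk can only traverse transitions seen in the data, so structural constraints of the source dynamics survive; for instance, the strictly decreasing pattern (2, 1, 0) never occurs for the fully chaotic logistic map and therefore never appears in the surrogate either.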
Modeling GPP in the Nordic Forest Landscape Using MODIS Time Series Data
Schubert, P.; Lagergren, F.; Aurela, M.; Christensen, T. R.; Grelle, A.; Heliasz, M.; Klemedtsson, L. K.; Lindroth, A.; Pilegaard, K.; Vesala, T.; Eklundh, L.
2011-12-01
Satellite sensor-derived images cover the ground surface continuously throughout the landscapes and are therefore suitable for regional and global estimations of carbon dioxide (CO2) exchange. This study is aimed at developing an empirical model for regional estimations of gross primary productivity (GPP) in Nordic forests by using data from the Moderate Resolution Imaging Spectroradiometer (MODIS) and modeled incoming photosynthetic photon flux density (PPFD). Eddy covariance-measured net ecosystem exchange (NEE) from three deciduous and ten coniferous sites was partitioned into GPP. Linear regression analyses were made on 8-day averages of GPP in relation to MODIS 8-day composite data and 8-day averages of PPFD. Time series of the two-band enhanced vegetation index (EVI2) were calculated from MODIS 500 m reflectance data (MOD09A1). In order to reduce noise in the data, these time series were smoothed by a curve fitting procedure. For most sites, fairly strong to strong relationships were found between GPP and the product of EVI2 and PPFD (Deciduous: R2 = 0.45-0.86, Coniferous: R2 = 0.49-0.90). Similar relationships were found for GPP versus the product of EVI2 and the MODIS 1 km daytime land surface temperature (LST, MOD11A2) (R2 = 0.55-0.81, 0.57-0.77) and for GPP versus EVI2, PPFD and daytime LST in multiple linear regressions (R2 = 0.73-0.89, 0.65-0.93). The slope coefficient for GPP versus the product of EVI2 and PPFD was used as a proxy variable for the light use efficiency (LUE). An attempt was made to model the between-site variation in slope by linear regressions to other variables, but all relationships were found to be weak or very weak. One year of data was collected from each coniferous site and treated as one sample, in order to derive one general empirical model for GPP versus the product of EVI2 and PPFD (R2 = 0.70). General models were also derived for GPP versus the product of EVI2 and daytime LST (R2 = 0.62) and for GPP versus EVI2, PPFD and
ARMA modelled time-series classification for structural health monitoring of civil infrastructure
Peter Carden, E.; Brownjohn, James M. W.
2008-02-01
Structural health monitoring (SHM) is the subject of a great deal of ongoing research aimed at a capability for reliable remote monitoring of civil infrastructure that would allow a shift from schedule-based to condition-based maintenance strategies. The first stage in such a system would be the indication of an extraordinary change in the structure's behaviour. A statistical classification algorithm is presented here which is based on analysis of a structure's response in the time domain. The time-series responses are fitted with Autoregressive Moving Average (ARMA) models and the ARMA coefficients are fed to the classifier. The classifier is capable of learning in an unsupervised manner and of forming new classes when the structural response exhibits change. The approach is demonstrated with experimental data from the IASC-ASCE benchmark four-storey frame structure, the Z24 bridge and the Malaysia-Singapore Second Link bridge. The classifier is found to be capable of identifying structural change in all cases and of forming distinct classes corresponding to different structural states in most cases.
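The pipeline (fit a time series model to each response record, feed the coefficients to a classifier) is easy to sketch. Pure AR(2) fits and a nearest-centroid rule stand in for the ARMA models and the unsupervised classifier of the paper, and the two "structural states" below are simulated, not benchmark data:

```python
import numpy as np

rng = np.random.default_rng(4)

def ar2_series(a1, a2, n=500):
    """Simulated AR(2) response, standing in for a vibration record."""
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + rng.normal()
    return x

def ar_coeffs(x, p=2):
    """Least-squares AR(p) fit; the coefficients are the features."""
    X = np.column_stack([x[p - 1 - j:len(x) - 1 - j] for j in range(p)])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

# Two structural states with different dynamics (parameters made up).
state_a = [ar_coeffs(ar2_series(1.2, -0.5)) for _ in range(10)]
state_b = [ar_coeffs(ar2_series(0.6, -0.3)) for _ in range(10)]
centroid_a = np.mean(state_a, axis=0)
centroid_b = np.mean(state_b, axis=0)

def classify(x):
    """Nearest-centroid rule in AR-coefficient space."""
    f = ar_coeffs(x)
    da = np.linalg.norm(f - centroid_a)
    db = np.linalg.norm(f - centroid_b)
    return "state_a" if da < db else "state_b"
```

Because the AR coefficients summarize the dynamics rather than the raw amplitudes, records from the two states separate cleanly in coefficient space, which is the property the SHM classifier relies on.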
Vihermaa, Leena; Waldron, Susan; Newton, Jason
2013-04-01
Two small streams (New Colpita and Main Trail) and two rivers (Tambopata and La Torre), in the Tambopata National Reserve, Madre de Dios, Peru, were sampled for water chemistry (conductivity, pH and dissolved oxygen) and hydrology (stage height and flow velocity). In the small streams water chemistry and hydrology variables were logged at 15 minute intervals from Feb 2011 to November 2012. Water samples were collected from all four channels during field campaigns spanning different seasons and targeting the hydrological extremes. All the samples were analysed for dissolved inorganic carbon (DIC) concentration and δ13C (sample size ranging from 77 to 172 depending on the drainage system) and a smaller subset for dissolved organic carbon (DOC) and particulate organic carbon (POC) concentrations. Strong positive relationships were found between conductivity and both DIC concentration and δ13C in the New Colpita stream and the La Torre river. In Tambopata river the trends were less clear and in the Main Trail stream there was very little change in DIC and isotopic composition. The conductivity data was used to model continuous DIC time series for the New Colpita stream. The modelled DIC data agreed well with the measurements; the concordance correlation coefficients between predicted and measured data were 0.91 and 0.87 for mM-DIC and δ13C-DIC, respectively. The predictions of δ13C-DIC were improved when calendar month was included in the model, which indicates seasonal differences in the δ13C-DIC conductivity relationship. At present, continuous DIC sampling still requires expensive instrumentation. Therefore, modelling DIC from a proxy variable which can be monitored continuously with ease and at relatively low cost, such as conductivity, provides a powerful alternative method of DIC determination.
Forecasting Enrollments with Fuzzy Time Series.
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
A dynamic factor model for the analysis of multivariate time series
Molenaar, P.C.M.
1985-01-01
Describes the new statistical technique of dynamic factor analysis (DFA), which accounts for the entire lagged covariance function of an arbitrary 2nd-order stationary time series. DFA is shown to be applicable to a relatively short stretch of observations and is therefore considered worthwhile for
Practical Aspects of the Spectral Analysis of Irregularly Sampled Data With Time-Series Models
Broersen, P.M.T.
2009-01-01
Several algorithms for the spectral analysis of irregularly sampled random processes can estimate the spectral density for a low frequency range. A new time-series method extended that frequency range with a factor of thousand or more. The new algorithm has two requirements to give useful results. F
Directory of Open Access Journals (Sweden)
Lukas Falat
2014-01-01
In this paper, the authors apply a feed-forward artificial neural network (ANN) of RBF type to modelling and forecasting future values of the USD/CAD time series. The authors test a customized version of the RBF network and add an evolutionary approach to it. They also combine the standard algorithm for adapting weights in the neural network with an unsupervised clustering algorithm called K-means. Finally, the authors suggest a new hybrid model combining a standard ANN with a moving average for error modeling, used to enhance the outputs of the network via the error part of the original RBF. Using high-frequency data, they examine the ability to forecast exchange rate values for a horizon of one day. To determine forecasting efficiency, the authors perform a comparative out-of-sample analysis of the suggested hybrid model against statistical models and a standard neural network.
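The core RBF-plus-K-means idea fits in a short sketch. The target below is a synthetic sine rather than USD/CAD data, a few Lloyd iterations stand in for full K-means, and the evolutionary tuning and moving-average error-correction stages of the hybrid are omitted:

```python
import numpy as np

rng = np.random.default_rng(9)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(0.0, 0.05, 200)   # noisy target series

# K-means (a few Lloyd iterations) picks the RBF centres from the inputs.
k = 10
centres = rng.choice(x, k, replace=False)
for _ in range(20):
    assign = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
    for j in range(k):
        if np.any(assign == j):
            centres[j] = x[assign == j].mean()

# Gaussian RBF design matrix + least-squares output weights.
width = 0.5                                  # kernel width: a tuning choice
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * width**2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
rmse = np.sqrt(np.mean((Phi @ w - y) ** 2))
```

Clustering chooses where the basis functions sit and least squares then solves for the output layer in one step, which is what makes the K-means/RBF combination cheap to train compared with full gradient descent.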
Time Series Analysis and Forecasting by Example
Bisgaard, Soren
2011-01-01
An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in
Research on Optimize Prediction Model and Algorithm about Chaotic Time Series
Institute of Scientific and Technical Information of China (English)
JIANG Wei-jin; XU Yu-sheng
2004-01-01
We put forward a chaotic estimation model that uses the parameters of the chaotic system, the sensitivity of those parameters to small perturbations, and control of disturbances to the system, and we estimate the parameters of the model using a best-update criterion. Finally, we forecast the upcoming series values in the reconstructed phase space. The example shows that selecting the best number of model steps increases the precision of the estimation process. The approach not only overcomes the drawbacks of using delay-embedding technology alone, but also reduces the blindness of deciding the model input directly from the forecast error, and its results are better than those of statistical and other time series methods.
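The delay-embedding forecasting that such schemes build on can be sketched as analogue (nearest-neighbour) prediction in reconstructed phase space; the logistic map and embedding dimension 2 below are arbitrary choices for the illustration, not the paper's model:

```python
import numpy as np

# A chaotic test series from the logistic map.
x = np.empty(2200)
x[0] = 0.3
for t in range(2199):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Delay embedding: state (x[t], x[t+1]) -> target x[t+2].
states = np.column_stack([x[:-2], x[1:-1]])
targets = x[2:]
train_states, train_targets = states[:2000], targets[:2000]

# Analogue forecast: predict with the successor of the nearest past state.
errs = []
for t in range(2000, len(targets)):
    d = np.linalg.norm(train_states - states[t], axis=1)
    errs.append(abs(targets[t] - train_targets[np.argmin(d)]))
mae = float(np.mean(errs))
```

Because nearby states of a deterministic chaotic system have nearby short-term futures, the one-step analogue forecast is accurate despite long-term unpredictability; the choice of embedding dimension and number of steps is exactly the model-selection issue the abstract discusses.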
Analysis of MODIS snow cover time series over the alpine regions as input for hydrological modeling
Notarnicola, Claudia; Rastner, Philipp; Irsara, Luca; Moelg, Nico; Bertoldi, Giacomo; Dalla Chiesa, Stefano; Endrizzi, Stefano; Zebisch, Marc
2010-05-01
Snow extent and relative physical properties are key parameters in hydrology, weather forecast and hazard warning as well as in climatological models. Satellite sensors offer a unique advantage in monitoring snow cover due to their temporal and spatial synoptic view. The Moderate Resolution Imaging Spectroradiometer (MODIS) from NASA is especially useful for this purpose due to its high observation frequency. However, in order to evaluate the role of snow in the water cycle of a catchment, such as runoff generation due to snowmelt, remote sensing data need to be assimilated into hydrological models. This study presents a comparison on a multi-temporal basis between snow cover data derived from (1) MODIS images, (2) LANDSAT images, and (3) predictions by the hydrological model GEOtop [1,3]. The test area is located in the catchment of the Matscher Valley (South Tyrol, Northern Italy). The snow cover maps derived from MODIS images are obtained using a newly developed algorithm taking into account the specific requirements of mountain regions, with a focus on the Alps [2]. This algorithm requires the standard MODIS products MOD09 and MOD02 as input data and generates snow cover maps at a spatial resolution of 250 m. The final output is a combination of MODIS AQUA and MODIS TERRA snow cover maps, thus reducing the presence of cloudy pixels and no-data values due to topography. Using these maps, daily time series from the winter season (November-May) of 2002 until 2008/2009 have been created. Along with the snow maps from MODIS images, some snow cover maps derived from LANDSAT images have also been used. Due to their high resolution (… [2] manto nevoso in aree alpine con dati MODIS multi-temporali e modelli idrologici, 13th ASITA National Conference, 1-4.12.2009, Bari, Italy. [3] Zanotti F., Endrizzi S., Bertoldi G. and Rigon R. 2004. The GEOtop snow module. Hydrological Processes, 18: 3667-3679. DOI:10.1002/hyp.5794.
Directory of Open Access Journals (Sweden)
T. V. O. Fabson
2011-11-01
The bullwhip (or whiplash) effect is an observed phenomenon in forecast-driven distribution channels, and careful management of these effects is of great importance to managers of supply chains. The bullwhip effect refers to situations where orders to the supplier tend to have larger variance than sales to the buyer (demand distortion), and the distortion increases as we move up the supply chain. Because customer demand for products is unstable, business managers must forecast in order to properly position inventory and other resources. Forecasts are statistically based and, in most cases, are not very accurate. The existence of forecast errors makes it necessary for organizations to often carry an inventory buffer called "safety stock". Moving up the supply chain from the end-user customers to the raw materials supplier, much more variation in demand can be observed, which calls for a greater need for safety stock. This study compares the efficacy of simulation and time series models in quantifying the bullwhip effect in supply chain management.
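The variance amplification being quantified can be simulated directly with a textbook order-up-to policy and a moving-average demand forecast; the demand distribution, forecast window and lead time below are made-up parameters for the illustration:

```python
import numpy as np

rng = np.random.default_rng(12)
n, p, L = 2000, 5, 2                    # horizon, forecast window, lead time
demand = 100.0 + 10.0 * rng.normal(size=n)

# Retailer: order-up-to policy; order = last demand plus the lead-time
# adjustment for the change in the moving-average forecast.
orders = []
for t in range(p + 1, n):
    f_now = demand[t - p:t].mean()          # forecast made at time t
    f_prev = demand[t - 1 - p:t - 1].mean() # forecast made at time t-1
    orders.append(demand[t - 1] + (L + 1) * (f_now - f_prev))
orders = np.array(orders)

amplification = orders.var() / demand.var()  # > 1 is the bullwhip effect
```

Even though demand is i.i.d., the forecast updates feed back into the orders, so the order stream is markedly more variable than the sales stream; shortening the forecast window p or lengthening the lead time L makes the amplification worse, which is the pattern the study measures up the chain.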
Reconstruction of time-delay systems from chaotic time series.
Bezruchko, B P; Karavaev, A S; Ponomarenko, V I; Prokhorov, M D
2001-11-01
We propose a method that allows one to estimate the parameters of model scalar time-delay differential equations from time series. The method is based on a statistical analysis of time intervals between extrema in the time series. We verify our method by using it for the reconstruction of time-delay differential equations from their chaotic solutions and for modeling experimental systems with delay-induced dynamics from their chaotic time series.
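The raw statistic behind the method, time intervals between extrema, is simple to extract. The sketch below simulates a Mackey-Glass delay equation (standard chaotic parameters; the Euler step size and run length are our assumptions) and tallies how often two extrema are separated by a given lag:

```python
import numpy as np

# Euler integration of the Mackey-Glass delay equation.
dt, delay_steps, n = 0.1, 170, 30_000       # delay tau = 17 time units
x = np.full(n, 1.2)                          # constant history as start-up
for t in range(delay_steps, n - 1):
    xd = x[t - delay_steps]
    x[t + 1] = x[t] + dt * (0.2 * xd / (1.0 + xd**10) - 0.1 * x[t])
x = x[5000:]                                 # discard the transient

# All local extrema (maxima and minima) of the series.
ext = [i for i in range(1, len(x) - 1)
       if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0]

# N(lag): how often two extrema are separated by exactly `lag` samples.
max_lag = 300
ext_set = set(ext)
N = np.array([sum(1 for i in ext if i + lag in ext_set)
              for lag in range(max_lag + 1)])
```

The authors estimate the delay from the structure of exactly this interval statistic (a pronounced feature is expected near the true delay, 170 samples here); recovering it robustly requires the full statistical procedure of the paper rather than this bare tally.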
DEFF Research Database (Denmark)
Nielsen, Joakim Refslund; Dellwik, Ebba; Hahmann, Andrea N.
2014-01-01
A method is presented for development of satellite green vegetation fraction (GVF) time series for use in the Weather Research and Forecasting (WRF) model. The GVF data is in the WRF model used to describe the temporal evolution of many land surface parameters, in addition to the evolution...
Predicting Nonlinear Time Series
1993-12-01
The response becomes R_j(k) = f(Σ_i W_ij V_i(k)) (2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer, with interconnections for each of these previous nodes. [Figure 5: Delay block for the ATNN [9].] Thus, node j receives the computed values a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays are updated after each input
Almog, Assaf; Garlaschelli, Diego
2014-09-01
The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of multiple time series of activity of the constituent units, such as stocks or neurons, respectively. While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relations between binary and non-binary properties of financial time series. These relations are a novel quantification of the fact that extreme price increments occur more often when most stocks move in the same direction. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices and their mapping to spin models in statistical physics, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to accurately replicate, and mathematically characterize, the observed binary/non-binary relations. We also obtain a phase diagram allowing us to identify, based only on the instantaneous aggregate return of a set of multiple time series, a regime where the so-called ‘market mode’ has an optimal interpretation in terms of collective (endogenous) effects, a regime where it is parsimoniously explained by pure noise, and a regime where it can be regarded as a combination of endogenous and exogenous factors. Our approach allows us to connect spin models, simple stochastic processes, and ensembles of time series inferred from partial information.
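The headline empirical relation, that extreme aggregate increments co-occur with lopsided signs, can be reproduced in a toy one-factor model; this is an illustration of the binary/non-binary link, not the paper's maximum-entropy formalism, and all parameters are made up:

```python
import numpy as np

rng = np.random.default_rng(11)
n_t, n_s = 2000, 50                       # time steps, "stocks"
common = rng.normal(size=n_t)             # market-wide factor
returns = 0.5 * common[:, None] + rng.normal(size=(n_t, n_s))

# Binary projection: only the sign of each increment is kept.
signs = np.sign(returns)
frac_up = (signs > 0).mean(axis=1)        # fraction of stocks moving up
agg = returns.mean(axis=1)                # aggregate "market" return

# Nonlinear binary/non-binary link: big |aggregate| moves coincide with
# a lopsided sign distribution across the stocks.
link = np.corrcoef(np.abs(agg), np.abs(frac_up - 0.5))[0, 1]
```

With a common factor present, large aggregate increments only happen when most units move the same way, so the magnitude information is strongly correlated with a purely binary summary, which is the relation the information-theoretic formalism of the paper characterizes.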
Data mining in time series databases
Kandel, Abraham; Bunke, Horst
2004-01-01
Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described, and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.
Principles of Time Series Modeling and Forecasting
Institute of Scientific and Technical Information of China (English)
Wang Na
2012-01-01
This article introduces the basic content of time series theory and the various basic models, and finally gives forecasting formulas for each class of time series model.
Inventory Model (M,T) With Quadratic Backorder Costs And Continuous Lead Time Series 1
Dr. Martin Osawaru Omorodion
2014-01-01
In this paper we assume that demand follows a normal distribution, lead times follow a gamma distribution, and backorder costs are quadratic. In the (M, T) model, an order is placed at each review time to bring the inventory position up to M. The model is derived from the (nQ, R, T) model, in which an integral multiple of Q is ordered at review time. After deriving the inventory costs for the (nQ, R, T) model, we let Q tend to 0 to obtain the (M, T) inventory costs, making use of differentiation of the (nQ, R, T) model. The (M, T) inventory cos...
Kim, J R; Ko, J H; Im, J H; Lee, S H; Kim, S H; Kim, C W; Park, T J
2006-01-01
The information on the incoming load to wastewater treatment plants is often not available for applying modelling to evaluate the effect of control actions on a full-scale plant. In this paper, a time series model was developed to forecast the flow rate, COD, NH4(+)-N and PO4(3-)-P in the influent, using 250 days of field plant operation data. The data for 150 days and 100 days were used for model development and model validation, respectively. Missing data were interpolated by the spline method and the time series model. Three different methods were proposed for model development: one model and one-step to seven-step-ahead forecasting (Method 1); seven models and one-step-ahead forecasting (Method 2); and one model and one-step-ahead forecasting (Method 3). Method 3 featured only one-step-ahead forecasting, which avoids accumulated error and gives a simple estimation of coefficients. Therefore, Method 3 was the most reliable approach for developing the time series model for the purpose of this research.
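The advantage of one-step-ahead forecasting (Method 3) over recursive multi-step forecasting is easy to demonstrate; a toy AR(1) process stands in for the influent series, and the seven-step horizon matches the abstract's Method 1:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 600
y = np.zeros(n)
for t in range(1, n):                 # stand-in AR(1) "influent load" series
    y[t] = 0.9 * y[t - 1] + rng.normal()

train, test = y[:500], y[500:]
phi = np.dot(train[1:], train[:-1]) / np.dot(train[:-1], train[:-1])

# One-step-ahead (Method 3): each forecast uses the latest observation.
one_step = phi * y[499:-1]            # predicts y[500..599]
err1 = np.mean((test - one_step) ** 2)

# Recursive seven-step-ahead (Method 1): errors compound with the horizon.
seven_step = phi**7 * y[493:-7]       # predicts y[500..599] from y[493..592]
err7 = np.mean((test - seven_step) ** 2)
```

Because each one-step forecast restarts from a fresh observation, its error stays at the innovation variance, while the recursive seven-step forecast accumulates the intermediate errors; that accumulation is precisely what Method 3 avoids.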
Model for the heart beat-to-beat time series during meditation
Capurro, A.; Diambra, L.; Malta, C. P.
2003-09-01
We present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a pacemaker, that simulates the membrane potential of the sinoatrial node, modulated by a periodic input signal plus correlated noise that simulates the respiratory input. The model was used to assess the waveshape of the respiratory signals needed to reproduce in the phase space the trajectory of experimental heart beat-to-beat interval data. The data sets were recorded during meditation practices of the Chi and Kundalini Yoga techniques. Our study indicates that in the first case the respiratory signal has the shape of a smoothed square wave, and in the second case it has the shape of a smoothed triangular wave.
Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction
Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.
2012-01-01
In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS docume
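The core of HANTS, least-squares fitting of a few annual harmonics to the valid samples only, fits in a few lines. The sketch below uses a synthetic NDVI-like signal with random "cloud" gaps and omits the iterative outlier rejection of the full algorithm:

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.arange(365.0)
true = 0.5 + 0.25 * np.sin(2 * np.pi * t / 365) + 0.10 * np.sin(4 * np.pi * t / 365)
y = true + rng.normal(0.0, 0.02, t.size)     # hypothetical NDVI-like series
observed = rng.random(t.size) > 0.4          # ~60% kept; the rest are gaps

def harmonic_design(t, period=365.0, nharm=2):
    """Design matrix: a mean term plus nharm cosine/sine pairs."""
    cols = [np.ones_like(t)]
    for k in range(1, nharm + 1):
        cols += [np.cos(2 * np.pi * k * t / period),
                 np.sin(2 * np.pi * k * t / period)]
    return np.column_stack(cols)

# Fit on the gappy samples only, then evaluate on the full time axis.
coef, *_ = np.linalg.lstsq(harmonic_design(t[observed]), y[observed], rcond=None)
reconstruction = harmonic_design(t) @ coef
rmse = np.sqrt(np.mean((reconstruction - true) ** 2))
```

Because the harmonics are fitted globally, moderate random gaps barely degrade the reconstruction; the degradation studied in the paper arises when gaps become long or systematically placed relative to the harmonics.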
Time Series with Tailored Nonlinearities
Raeth, C
2015-01-01
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearity are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power-law character of the intensity distributions, typical of e.g. turbulence and financial data, can thus be explained in terms of phase correlations.
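The starting point of such phase-manipulation schemes is the standard amplitude-preserving surrogate; the minimal sketch below draws independent phases, which yields a *linear* surrogate, whereas the tailored nonlinearities of the paper come from imposing correlations between adjacent phases (not done here):

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.cumsum(rng.normal(size=1024))     # some correlated input series
X = np.fft.rfft(x)
amps = np.abs(X)

# Replace the phases while keeping the amplitude spectrum fixed.
phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
Xs = amps * np.exp(1j * phases)
Xs[0] = X[0]                             # keep the mean (DC bin) unchanged
Xs[-1] = X[-1]                           # even length: keep the Nyquist bin real
surrogate = np.fft.irfft(Xs, n=x.size)
```

All linear properties (power spectrum, autocorrelation) are preserved by construction, so any statistic that differs between the original and such surrogates must live in the phases; constraining the phases instead of randomizing them is what "tailors" the nonlinearity.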
Directory of Open Access Journals (Sweden)
Subanar Subanar
2006-01-01
Recently, one of the central topics for the neural network (NN) community has been the issue of data preprocessing for the use of NNs. In this paper we investigate this topic, particularly the effect of the decomposition method as data preprocessing and the use of NNs for effectively modeling time series with both trend and seasonal patterns. Limited empirical studies on seasonal time series forecasting with neural networks show that some find neural networks able to model seasonality directly, so that prior deseasonalization is not necessary, while others conclude just the opposite. In this research we study in particular the effectiveness of data preprocessing, including detrending and deseasonalization, by applying the decomposition method to NN modeling and forecasting performance. We use two kinds of data: simulated and real. The simulated data examine multiplicative trend and seasonality patterns. The results are compared to those obtained from classical time series models. Our results show that a combination of detrending and deseasonalization by the decomposition method is effective data preprocessing for the use of NNs in forecasting trend and seasonal time series.
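The detrend-plus-deseasonalize preprocessing being evaluated can be sketched with a classical multiplicative decomposition; the monthly series below is synthetic, with a linear trend and a sinusoidal seasonal factor as a stand-in for the paper's simulated patterns:

```python
import numpy as np

rng = np.random.default_rng(7)
n, s = 120, 12                        # ten years of hypothetical monthly data
t = np.arange(n)
season = 1.0 + 0.3 * np.sin(2 * np.pi * t / s)
y = (10 + 0.1 * t) * season * np.exp(rng.normal(0.0, 0.02, n))

# Classical multiplicative decomposition: fit a trend, then average the
# detrended values by calendar month to obtain seasonal indices.
trend = np.polyval(np.polyfit(t, y, 1), t)
detrended = y / trend
seasonal_idx = np.array([detrended[t % s == m].mean() for m in range(s)])
preprocessed = y / (trend * seasonal_idx[t % s])   # what the NN would see
```

After both steps the series is reduced to (approximately) unit-mean noise, so the network only has to model the residual structure rather than relearn the trend and the seasonal cycle, which is the rationale for the preprocessing the study finds effective.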
International Work-Conference on Time Series
Pomares, Héctor
2016-01-01
This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.
Ramli, Nazirah; Mutalib, Siti Musleha Ab; Mohamad, Daud
2017-08-01
Fuzzy time series forecasting models have been proposed since 1993 to cater for data in linguistic values. Many improvements and modifications have been made to the model, such as enhancements to the length of intervals and to the types of fuzzy logical relations. However, most of the improved models represent the linguistic terms in the form of discrete fuzzy sets. In this paper, a fuzzy time series model with data in the form of trapezoidal fuzzy numbers and a natural partitioning length approach is introduced for predicting the unemployment rate. Two types of fuzzy relations are used in this study: first-order and second-order fuzzy relations. The proposed model can produce forecasted values under different degrees of confidence.
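The first-order logical-relation machinery underneath such models can be sketched with crisp intervals; the trapezoidal fuzzy numbers and natural partitioning of this paper are not reproduced, and the data are the enrollment figures classically used in the fuzzy time series literature (values approximate, illustrative only):

```python
import numpy as np

# Enrollment-style series from the fuzzy time series literature (approximate).
y = np.array([13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861,
              16807, 16919, 16388, 15433, 15497, 15145, 15163, 15984,
              16859, 18150, 18970, 19328, 19337, 18876])

lo, hi, k = 13000, 20000, 7
edges = np.linspace(lo, hi, k + 1)
mids = (edges[:-1] + edges[1:]) / 2
# Fuzzification reduced to its crisp core: each value maps to one interval.
state = np.clip(np.digitize(y, edges) - 1, 0, k - 1)

# First-order fuzzy logical relationships: which states follow state s?
followers = {s: [] for s in range(k)}
for a, b in zip(state[:-1], state[1:]):
    followers[int(a)].append(int(b))

def forecast(s):
    """Defuzzified forecast: mean midpoint of historically following states."""
    nxt = followers[int(s)]
    return mids[nxt].mean() if nxt else float(mids[int(s)])

pred = forecast(state[-1])
```

The logical-relationship groups play the role the paper's fuzzy relations play; replacing the crisp intervals with trapezoidal fuzzy numbers is what lets the full model attach degrees of confidence to the forecast instead of a single midpoint.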
Event Discovery in Time Series
Preston, Dan; Brodley, Carla
2009-01-01
The discovery of events in time series can have important implications, such as identifying microlensing events in astronomical surveys, or changes in a patient's electrocardiogram. Current methods for identifying events require a sliding window of a fixed size, which is not ideal for all applications and could overlook important events. In this work, we develop probability models for calculating the significance of an arbitrary-sized sliding window and use these probabilities to find areas of significance. Because a brute force search of all sliding windows and all window sizes would be computationally intractable, we introduce a method for quickly approximating the results. We apply our method to over 100,000 astronomical time series from the MACHO survey, in which 56 different sections of the sky are considered, each with one or more known events. Our method was able to recover 100% of these events in the top 1% of the results, essentially pruning 99% of the data. Interestingly, our method was able to iden...
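The significance computation described above can be illustrated by a brute-force scan over window positions and sizes (the paper's contribution is precisely to avoid this exhaustive search; the window bounds and the normal-approximation z-score below are assumptions of this sketch):

```python
import math

def significant_windows(x, min_w=3, max_w=20, z_thresh=3.0):
    """Brute-force scan over window sizes: flag windows whose mean deviates
    significantly from the global mean under a normal approximation."""
    n = len(x)
    mu = sum(x) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / n)
    hits = []
    for w in range(min_w, min(max_w, n) + 1):
        for start in range(n - w + 1):
            wmean = sum(x[start:start + w]) / w
            # Standard error of the mean of w points under the null model.
            z = (wmean - mu) / (sd / math.sqrt(w))
            if abs(z) >= z_thresh:
                hits.append((start, w, z))
    return hits
```

Overlapping hits would normally be merged into a single detected event; the approximation scheme in the paper replaces the double loop with a far cheaper search.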
The projection of world geothermal energy consumption from time series and regression model
Simanullang, Elwin Y.; Supriatna, Agus; Supriatna, Asep K.
2015-12-01
World population growth has many impacts on human activities and other related aspects. One such aspect is the increasing use of energy to support daily human activities, covering industry, transportation, domestic activities, etc. It is plausible that the higher the population size of a country, the higher its energy needs across all aspects of human activity. Considering the depletion of petroleum and other fossil-based energy sources, there has recently been a tendency to use geothermal energy as an alternative. In this paper we discuss the prediction of world geothermal energy consumption by two different methods: via the time series of geothermal usage alone, and via the time series of geothermal usage combined with a prediction of the total world population. For the first case we use the simple exponential smoothing method, while for the second case we use the simple regression method. The results show that taking the predicted world population size into account yields a better short-term forecast of geothermal energy consumption.
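The two forecasting routes described above can be sketched in a few lines: simple exponential smoothing for the consumption series itself, and ordinary least squares for regressing consumption on predicted population. Both are textbook methods; the function names are illustrative, not from the paper.

```python
def ses_forecast(y, alpha=0.3):
    """Simple exponential smoothing: the smoothed level after the last
    observation serves as the flat forecast for all future horizons."""
    level = y[0]
    for value in y[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

def ols_line(x, y):
    """Simple linear regression y ~ a + b*x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b
```

In the second method, `ols_line` would be fitted on (population, consumption) pairs, and the fitted line evaluated at forecast population values.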
DATA MINING IN CANADIAN LYNX TIME SERIES
Directory of Open Access Journals (Sweden)
R.Karnaboopathy
2012-01-01
This paper sums up the application of statistical models, such as the ARIMA family of time series models, to the analysis of the Canadian lynx time series, and introduces a data mining method combined with statistical knowledge to analyse the Canadian lynx data series.
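As a minimal illustration of the ARIMA-family modeling mentioned above, an AR(2) can be fitted by least squares; for the lynx data this is classically done on log-transformed, centered abundances. The sketch below assumes a zero-mean (or pre-centered) input series and solves the 2x2 normal equations directly.

```python
def fit_ar2(z):
    """Least-squares AR(2) fit on a zero-mean series: z[t] ~ a1*z[t-1] + a2*z[t-2]."""
    t_range = range(2, len(z))
    # Entries of the 2x2 normal equations.
    s11 = sum(z[t - 1] * z[t - 1] for t in t_range)
    s22 = sum(z[t - 2] * z[t - 2] for t in t_range)
    s12 = sum(z[t - 1] * z[t - 2] for t in t_range)
    b1 = sum(z[t] * z[t - 1] for t in t_range)
    b2 = sum(z[t] * z[t - 2] for t in t_range)
    det = s11 * s22 - s12 * s12
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (s11 * b2 - s12 * b1) / det
    return a1, a2
```

On a noise-free AR(2) realization the fit recovers the generating coefficients exactly, which makes it a convenient sanity check before applying it to real abundance data.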
Time series analysis time series analysis methods and applications
Rao, Tata Subba; Rao, C R
2012-01-01
The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...
Zhou, Fuqun; Zhang, Aining
2016-10-25
Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets, including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to use these time-series MODIS datasets efficiently for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites and two features of Random Forests: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.
Nedorezov, L V
2015-01-01
For the approximation of some well-known time series of Paramecium caudatum population dynamics (G. F. Gause, The Struggle for Existence, 1934), the Verhulst and Gompertz models were used. The parameters were estimated for each of the models in two different ways: with the least squares method (global fitting) and with a non-traditional approach (the method of extreme points). The results obtained were compared with each other and with those reported by G. F. Gause. Deviations of the theoretical (model) trajectories from the experimental time series were tested using various non-parametric statistical tests. It was shown that the least-squares estimates lead to results which do not always meet the requirements imposed on a "fine" model. But in some cases a small modification of the least-squares estimates is possible, allowing a satisfactory representation of the experimental data set.
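As a rough stand-in for the global least-squares fitting described above, the closed-form Verhulst (logistic) trajectory can be fitted by brute-force grid search over the growth rate and carrying capacity. The grids and the choice to fix N(0) at the first observation are assumptions of this sketch; the method of extreme points is not reproduced here.

```python
import math

def verhulst(t, n0, r, k):
    """Closed-form Verhulst (logistic) trajectory with N(0) = n0."""
    return k / (1 + (k / n0 - 1) * math.exp(-r * t))

def fit_verhulst(times, obs, r_grid, k_grid):
    """Crude global least-squares fit by grid search over r and K."""
    n0 = obs[0]
    best = None
    for r in r_grid:
        for k in k_grid:
            sse = sum((verhulst(t, n0, r, k) - y) ** 2 for t, y in zip(times, obs))
            if best is None or sse < best[0]:
                best = (sse, r, k)
    return best  # (sse, r, k)
```

A Gompertz fit would follow the same pattern with a different trajectory function; in practice the grid search would be refined around the coarse minimum.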
Modeling time-dependent corrosion fatigue crack propagation in 7000 series aluminum alloys
Mason, Mark E.; Gangloff, Richard P.
1994-01-01
Stress corrosion cracking and corrosion fatigue experiments were conducted with the susceptible S-L orientation of AA7075-T651, immersed in acidified and inhibited NaCl solution, to provide a basis for incorporating environmental effects into fatigue crack propagation life prediction codes such as NASA FLAGRO. This environment enhances da/dN five- to ten-fold compared to fatigue in moist air. Time-based crack growth rates from quasi-static load experiments are an order of magnitude too small for accurate linear superposition prediction of da/dN for loading frequencies above 0.001 Hz. Alternate methods of establishing da/dt, based on rising-load or ripple-load-enhanced crack tip strain rate, do not increase da/dt and do not improve linear superposition. Corrosion fatigue is characterized by two regimes of frequency dependence: da/dN is proportional to f(exp -1) below 0.001 Hz and to f(exp 0) to f(exp -0.1) for higher frequencies. da/dN increases mildly both with increasing hold time at K(sub max) and with increasing rise time for a range of loading waveforms. The mild time dependence is due to cycle-time-dependent corrosion fatigue growth. This behavior is identical for S-L and L-T crack orientations. The frequency response of environmental fatigue in several 7000 series alloys is variable and depends on undefined compositional or microstructural variables. Speculative explanations are based on the effect of Mg on occluded crack chemistry and embrittling hydrogen uptake, or on variable hydrogen diffusion in the crack tip process zone. Cracking in the 7075/NaCl system is adequately described for life prediction by linear superposition for prolonged load-cycle periods, and by a time-dependent upper-bound relationship between da/dN and delta K for moderate loading times.
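The linear superposition model referenced above adds the inert-environment cyclic growth rate to the time-dependent cracking accumulated over one load cycle; for a constant da/dt over a cycle of frequency f this reduces to a single line. A schematic sketch with illustrative variable names:

```python
def dadn_superposition(dadn_inert, dadt, frequency):
    """Linear superposition estimate of environmental fatigue crack growth:
    cyclic (inert) component plus time-dependent cracking over one cycle
    of period 1/frequency."""
    return dadn_inert + dadt / frequency
```

At low frequency the second term dominates, reproducing the da/dN proportional to f(exp -1) regime the abstract reports below 0.001 Hz; the experiments show that quasi-static da/dt values are too small for this prediction to hold at higher frequencies.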
On reconstruction of time series in climatology
Directory of Open Access Journals (Sweden)
V. Privalsky
2015-10-01
The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series that contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter series into the past. The method presents a further development of previous efforts by a number of authors, starting from A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI), 1979–2014, and sunspot numbers (SSN), 1749–2014, to restore the TSI data over 1749–1978. The results of the reconstruction are in statistical agreement with observations.
Directory of Open Access Journals (Sweden)
David E. Allen
2016-03-01
This paper features an analysis of major currency exchange rate movements in relation to the US dollar, as constituted in US dollar terms. The euro, British pound, Chinese yuan, and Japanese yen are modelled using a variety of non-linear models, including smooth transition regression models, logistic smooth transition regression models, threshold autoregressive models, nonlinear autoregressive models, and additive nonlinear autoregressive models, plus neural network models. The models are evaluated on the basis of error metrics for twenty-day out-of-sample forecasts using the mean absolute percentage error (MAPE). The results suggest that there is no dominating class of time series models, and that the different currency pairs' relationships with the US dollar are captured best by neural network regression models over the ten-year sample of daily exchange rate returns data, from August 2005 to August 2015.
Directory of Open Access Journals (Sweden)
Suhartono Suhartono
2005-01-01
Many business and economic time series are non-stationary and contain trend and seasonal variations. Seasonality is a periodic and recurrent pattern caused by factors such as weather, holidays, or repeating promotions. A stochastic trend often accompanies the seasonal variations and can have a significant impact on various forecasting methods. In this paper, we investigate and compare some forecasting methods for modeling time series with both trend and seasonal patterns: Winters', Decomposition, Time Series Regression, ARIMA and Neural Network models. In this empirical research, we study the effectiveness of their forecasting performance, particularly to answer whether a complex method always gives a better forecast than a simpler one. We use real data, namely the airline passenger data. The results show that a more complex model does not always yield a better result than a simpler one. Additionally, we identify possibilities for further research, especially the use of hybrid models that combine forecasting methods to obtain better forecasts, for example a combination of decomposition (as data preprocessing) and a neural network model.
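Winters' method, one of the candidates compared above, can be sketched as triple exponential smoothing. The additive variant below is a common textbook form (the airline data would normally call for the multiplicative variant); initialization choices and smoothing constants are assumptions of this sketch.

```python
def holt_winters_additive(y, period, alpha=0.3, beta=0.1, gamma=0.2):
    """Additive Holt-Winters (Winters') smoothing; returns the
    one-step-ahead forecast after the last observation."""
    # Initialize level, trend, and seasonals from the first two periods.
    level = sum(y[:period]) / period
    trend = (sum(y[period:2 * period]) - sum(y[:period])) / (period * period)
    season = [y[i] - level for i in range(period)]
    for t in range(period, len(y)):
        last_level = level
        s = season[t % period]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % period] = gamma * (y[t] - level) + (1 - gamma) * s
    return level + trend + season[len(y) % period]
```

Replacing the subtractions of the seasonal term with divisions yields the multiplicative form, which mirrors the decomposition preprocessing discussed elsewhere in this collection.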
Predicting road accidents: Structural time series approach
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-07-01
In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 was developed, and with this model the number of road accidents was predicted using the structural time series approach. The models are developed using a stepwise method, and the residuals of each step are analyzed. The accuracy of the models is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach found that the local linear trend model is the best model to represent the road accidents. This model allows the level and slope components to vary over time. In addition, this approach also provides useful information for improving on conventional time series methods.
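The local linear trend model selected in this study lets both the level and the slope evolve as random walks. A minimal simulation sketch of those state recursions follows (estimation, normally done with the Kalman filter, is omitted; parameter names are illustrative):

```python
import random

def simulate_local_linear_trend(n, sd_eps=1.0, sd_xi=0.1, sd_zeta=0.01, seed=0):
    """Simulate y_t = mu_t + eps_t, where the level mu_t and slope beta_t
    each follow random walks: the structural 'local linear trend' form."""
    rng = random.Random(seed)
    mu, beta, out = 0.0, 0.0, []
    for _ in range(n):
        out.append(mu + rng.gauss(0, sd_eps))   # observation equation
        beta += rng.gauss(0, sd_zeta)           # slope disturbance
        mu += beta + rng.gauss(0, sd_xi)        # level disturbance
    return out
```

Setting all disturbance standard deviations to zero collapses the model to a fixed deterministic trend, which is the special case conventional regression assumes.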
Spaeder, M C; Fackler, J C
2012-04-01
Respiratory syncytial virus (RSV) is the most common cause of documented viral respiratory infections, and the leading cause of hospitalization, in young children. We performed a retrospective time-series analysis of all patients. Forecasting models of weekly RSV incidence for the local community, the inpatient paediatric hospital and the paediatric intensive-care unit (PICU) were created. Ninety-five percent confidence intervals calculated around our models' 2-week forecasts were accurate to ±9.3, ±7.5 and ±1.5 cases/week for the local community, inpatient hospital and PICU, respectively. Our results suggest that time-series models may be useful tools in forecasting the burden of RSV infection at the local and institutional levels, helping communities and institutions to optimize the distribution of resources based on the changing burden and severity of illness in their respective communities.
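Forecast intervals of the kind quoted above can be approximated from the standard deviation of past forecast errors, assuming roughly normal residuals. This is a generic sketch, not the authors' procedure:

```python
import statistics

def interval_95(forecast, residuals):
    """Approximate 95% forecast interval from past forecast errors,
    assuming roughly normal, zero-mean residuals."""
    half = 1.96 * statistics.stdev(residuals)
    return forecast - half, forecast + half
```

The interval half-width is what the abstract reports as the "accurate to ±N cases/week" figure for each setting.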
Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.
2009-01-01
The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.
Underwater Noise Modeling and Direction-Finding Based on Heteroscedastic Time Series
Directory of Open Access Journals (Sweden)
Kamarei Mahmoud
2007-01-01
We propose a new method for practical non-Gaussian and nonstationary underwater noise modeling. This model is very useful for passive sonar in shallow waters. In this application, measurements of the additive noise in the natural environment show that the noise can sometimes be significantly non-Gaussian and time-varying, especially in its variance. Therefore, signal processing algorithms such as direction-finding that are optimized for Gaussian noise may degrade significantly in this environment. Generalized autoregressive conditional heteroscedasticity (GARCH) models are suitable for heavy-tailed PDFs and the time-varying variances of stochastic processes. We use this more realistic GARCH-based noise model in a maximum-likelihood approach for the estimation of the directions of arrival (DOAs) of sources impinging on a linear array, and demonstrate using measured noise that this approach is feasible for additive noise and direction finding in an underwater environment.
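The GARCH(1,1) recursion underlying such a noise model can be simulated in a few lines. The parameter values below are illustrative, not those estimated from measured underwater noise:

```python
import math
import random

def simulate_garch11(n, omega=0.1, alpha=0.2, beta=0.7, seed=1):
    """Simulate a zero-mean GARCH(1,1) process:
    sigma2[t] = omega + alpha*x[t-1]^2 + beta*sigma2[t-1].
    The output is heavy-tailed with time-varying conditional variance."""
    rng = random.Random(seed)
    sigma2 = omega / (1 - alpha - beta)  # start at the unconditional variance
    x = []
    for _ in range(n):
        xt = math.sqrt(sigma2) * rng.gauss(0, 1)
        x.append(xt)
        sigma2 = omega + alpha * xt * xt + beta * sigma2
    return x
```

With alpha + beta close to 1, variance bursts persist, mimicking the nonstationary variance the paper observes in shallow-water noise.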
Dynamical modelling of measured time series from a Q-switched CO2 laser
Horbelt, W; Bünner, M J; Meucci, R; Ciofini, M
2003-01-01
The transient dynamics of a Q-switched CO2 laser is modelled quantitatively on the basis of the four-level model, a five-dimensional nonlinear system of ordinary differential equations. Using the multiple shooting technique, internal parameters of the laser are estimated and the unobserved time courses of the population densities are reconstructed. For excitations barely above the laser threshold, large pulse variations are identified as an effect of small variations of the pump parameter.
Transfer entropy between multivariate time series
Mao, Xuegeng; Shang, Pengjian
2017-06-01
It is a crucial topic to identify the direction and strength of the interdependence between time series in multivariate systems. In this paper, we propose a method of transfer entropy based on the theory of time-delay reconstruction of a phase space, which is a model-free approach to detecting causalities in multivariate time series. This method overcomes the limitation that the original transfer entropy can only capture how one system drives the transition probabilities of another in scalar time series. Using artificial time series, we show that the driving character is clearly reflected as the coupling strength between two signals increases, and we confirm the effectiveness of the method with noise added. Furthermore, we apply it to real-world data, namely financial time series, in order to characterize the information flow among different stocks.
Statistical criteria for characterizing irradiance time series.
Energy Technology Data Exchange (ETDEWEB)
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
Institute of Scientific and Technical Information of China (English)
CHAN Kung-Sik; TONG Howell; STENSETH Nils Chr
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important light on the structure of the prey-predator system. We propose to assess the uncertainty in the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjärvi, Northern Finland.
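In the simplest continuous TAR case described above, the regression function is piecewise linear in a single lag with continuity at the threshold, which is exactly a hinge term. A hypothetical one-step predictor (coefficient names and the single-lag form are assumptions of this sketch):

```python
def ctar_predict(y, t, a, b, c, threshold, d=1):
    """One-step prediction from a continuous TAR model that is piecewise
    linear in lag d: the slope changes by c at the threshold, but the
    response stays continuous thanks to the hinge term."""
    hinge = max(y[t - d] - threshold, 0.0)
    return a + b * y[t - 1] + c * hinge
```

The NBB described in the abstract would then be used to quantify the uncertainty of the fitted `threshold` rather than of the point prediction.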
Yoon, Heesung; Park, Eungyu; Yoon, Pilsun; Lee, Eunhee; Kim, Gyoo-Bum
2016-04-01
A method to filter out the effect of river stage fluctuations on groundwater level was designed using an artificial neural network-based time series model of groundwater level prediction. The designed method was applied to daily groundwater level data near the Gangjeong-Koryeong Barrage in the Nakdong river, South Korea. First, one-step ahead direct prediction time series models were successfully developed for both cases of before and after the barrage construction using past measurement data of rainfall, river stage, and groundwater level as inputs. The correlation coefficient values between observed and predicted data were over 0.97. Based on the direct prediction models, recursive prediction models for the simulation of groundwater level fluctuations were designed. The effect of river stage fluctuation on groundwater level data was filtered out by setting a constant value for river stage inputs of the recursive time series models. The hybrid water table fluctuation method was employed to estimate the groundwater recharge using the filtered data. The calculated ratios of groundwater recharge to precipitation before and after the barrage construction were 11.0% and 4.3%, respectively. It is expected that the proposed method can be a useful tool for groundwater level prediction and recharge estimation in the riverside area.
Mayaud, C; Wagner, T; Benischke, R; Birk, S
2014-04-16
The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the
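The time series analysis applied here rests on sample autocorrelation and cross-correlation functions. Below is a plain sketch of a normalized cross-correlogram for non-negative lags (a common textbook normalization, not the specific implementation used in the study):

```python
def cross_correlation(x, y, max_lag):
    """Sample cross-correlation r_xy(k) for lags k = 0..max_lag
    (y lagging behind x), normalized by the full-series deviations."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    out = []
    for k in range(max_lag + 1):
        num = sum((x[t] - mx) * (y[t + k] - my) for t in range(n - k))
        out.append(num / (sx * sy))
    return out
```

In the study, the lag of the maximum cross-correlation between recharge and spring discharge is the quantity compared between observed and MODFLOW-generated series.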
Directory of Open Access Journals (Sweden)
Richard R Stein
The intestinal microbiota is a microbial ecosystem of crucial importance to human health. Understanding how the microbiota confers resistance against enteric pathogens and how antibiotics disrupt that resistance is key to the prevention and cure of intestinal infections. We present a novel method to infer microbial community ecology directly from time-resolved metagenomics. This method extends generalized Lotka-Volterra dynamics to account for external perturbations. Data from recent experiments on antibiotic-mediated Clostridium difficile infection are analyzed to quantify microbial interactions, commensal-pathogen interactions, and the effect of the antibiotic on the community. Stability analysis reveals that the microbiota is intrinsically stable, explaining how antibiotic perturbations and C. difficile inoculation can produce catastrophic shifts that persist even after removal of the perturbations. Importantly, the analysis suggests a subnetwork of bacterial groups implicated in protection against C. difficile. Due to its generality, our method can be applied to any high-resolution ecological time-series data to infer community structure and response to external stimuli.
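The extended generalized Lotka-Volterra dynamics described above add an external perturbation term to each species' per-capita growth rate. A minimal forward-Euler sketch of one integration step (the paper infers these parameters from metagenomic data; here they are simply given):

```python
def glv_step(x, growth, interactions, perturbation, dt=0.01):
    """One Euler step of generalized Lotka-Volterra dynamics with an external
    perturbation: dx_i/dt = x_i * (mu_i + sum_j M_ij x_j + eps_i(t))."""
    n = len(x)
    out = []
    for i in range(n):
        rate = growth[i] + sum(interactions[i][j] * x[j] for j in range(n)) \
               + perturbation[i]
        # Clip at zero: abundances cannot go negative.
        out.append(max(x[i] + dt * x[i] * rate, 0.0))
    return out
```

Setting the perturbation to a nonzero value over a time window mimics an antibiotic pulse; the stability analysis in the paper concerns the fixed points of these same equations.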
Benchmarking of energy time series
Energy Technology Data Exchange (ETDEWEB)
Williamson, M.A.
1990-04-01
Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
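The simplest benchmarking adjustment surveyed in such reports is pro-rata distribution: scale each high-frequency block so its total matches the low-frequency benchmark. The report recommends a staged first-differencing approach for the coal series; the sketch below shows only the pro-rata baseline, with illustrative names.

```python
def prorate_benchmark(monthly, annual_benchmarks, period=12):
    """Pro-rata benchmarking: scale each block of high-frequency values so
    its total matches the corresponding low-frequency benchmark."""
    out = []
    for k, bench in enumerate(annual_benchmarks):
        block = monthly[k * period:(k + 1) * period]
        factor = bench / sum(block)
        out.extend(v * factor for v in block)
    return out
```

Pro-rata adjustment preserves within-year movement shape but can create artificial jumps at year boundaries, which is the defect that first-differencing (Denton-type) methods are designed to remove.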
Study on Forward-Facing Model and Real-Time Simulation for a Series Hybrid Electric Vehicle
Directory of Open Access Journals (Sweden)
Xudong Liu
2011-10-01
To shorten the design period and reduce development costs, computer modeling and simulation are important for HEV design and development. In this paper, a real-time simulation of a Series Hybrid Electric Vehicle (SHEV) is performed to verify its fuzzy logic control strategy, based on dSPACE DS1103 development kits. The overall real-time simulation schematic is designed and the forward-facing vehicle simulation model is set up. Modeling methods for the driver, the controller and the vehicle (including the engine, generator, motor, battery, etc.) under the MATLAB/Simulink environment are discussed in detail. Driver behavior is simulated by two potentiometers and introduced into the real-time system to realize closed-loop control. A real-time monitoring interface is also developed to observe the experimental results. The experimental results show that the real-time simulation platform works well and that the SHEV fuzzy logic control strategy is effective.
DEFF Research Database (Denmark)
Cavaliere, Giuseppe; Nielsen, Morten Ørregaard; Taylor, Robert
We consider the problem of conducting estimation and inference on the parameters of univariate heteroskedastic fractionally integrated time series models. We first extend existing results in the literature, developed for conditional sum-of-squares estimators in the context of parametric fractional time series models driven by conditionally homoskedastic shocks, to allow for conditional and unconditional heteroskedasticity both of a quite general and unknown form. Global consistency and asymptotic normality are shown to still obtain; however, the covariance matrix of the limiting distribution of the estimator now depends on nuisance parameters derived both from the weak dependence and heteroskedasticity present in the shocks. We then investigate classical methods of inference based on the Wald, likelihood ratio and Lagrange multiplier tests for linear hypotheses on either or both of the long and short...
Random time series in Astronomy
Vaughan, Simon
2013-01-01
Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle, and over time (usually called light curves by astronomers). In the time domain we see transient events such as supernovae, gamma-ray bursts, and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars, and pulsations of stars in nearby galaxies; and persistent aperiodic variations (`noise') from powerful systems like accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of Time Domain Astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher-order properties of accreting black holes, and time delays and correlations in multivariate time series.
Prostate cancer detection from model-free T1-weighted time series and diffusion imaging
Haq, Nandinee F.; Kozlowski, Piotr; Jones, Edward C.; Chang, Silvia D.; Goldenberg, S. Larry; Moradi, Mehdi
2015-03-01
The combination of Dynamic Contrast Enhanced (DCE) images with diffusion MRI has shown great potential in prostate cancer detection. The parameterization of DCE images to generate cancer markers is traditionally performed with pharmacokinetic modeling. However, pharmacokinetic models make simplistic assumptions about the tissue perfusion process, require knowledge of the contrast agent concentration in a major artery, and the fitting process is sensitive to noise and instabilities. We address these issues by extracting features directly from the DCE T1-weighted time course without modeling. In this work, we employed a set of data-driven features generated by mapping the DCE T1 time course to its principal component space, along with diffusion MRI features, to detect prostate cancer. The optimal set of DCE features is extracted with sparse regularized regression through a Least Absolute Shrinkage and Selection Operator (LASSO) model. We show that when our proposed features are used within the multiparametric MRI protocol to replace the pharmacokinetic parameters, the area under the ROC curve is 0.91 for peripheral zone classification and 0.87 for whole-gland classification. We were able to correctly classify 32 out of 35 peripheral tumor areas identified in the data when the proposed features were used with support vector machine classification. The proposed feature set was used to generate cancer likelihood maps for the prostate gland.
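The model-free feature extraction described above, projecting each voxel's DCE T1 time course onto its principal components, can be sketched with plain NumPy. The synthetic time courses below are illustrative stand-ins for real DCE data, and the LASSO selection step is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic DCE T1-weighted time courses: 200 voxels x 40 time points.
# Each curve is a shared enhancement pattern with a random amplitude
# plus noise, purely illustrative, not the study's data.
t = np.linspace(0.0, 1.0, 40)
uptake = 1.0 - np.exp(-5.0 * t)                 # generic contrast-uptake shape
X = np.outer(rng.uniform(0.5, 1.5, 200), uptake)
X += 0.05 * rng.standard_normal((200, 40))

# Model-free features: map each time course to its principal component space.
Xc = X - X.mean(axis=0)                         # center each time point
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)                 # variance fraction per component
scores = Xc @ Vt[:3].T                          # first three PC scores per voxel
```

In the paper these PC scores replace the pharmacokinetic parameters, and a LASSO step then picks the informative components; only the projection is shown here.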
Morrison, Kathryn T; Shaddick, Gavin; Henderson, Sarah B; Buckeridge, David L
2016-08-15
This paper outlines a latent process model for forecasting multiple health outcomes arising from a common environmental exposure. Traditionally, surveillance models in environmental health do not link health outcome measures, such as morbidity or mortality counts, to measures of exposure, such as air pollution. Moreover, different measures of health outcomes are treated as independent, while it is known that they are correlated with one another over time as they arise in part from a common underlying exposure. We propose modelling an environmental exposure as a latent process, and we describe the implementation of such a model within a hierarchical Bayesian framework and its efficient computation using integrated nested Laplace approximations. Through a simulation study, we compare distinct univariate models for each health outcome with a bivariate approach. The bivariate model outperforms the univariate models in bias and coverage of parameter estimation, in forecast accuracy and in computational efficiency. The methods are illustrated with a case study using healthcare utilization and air pollution data from British Columbia, Canada, 2003-2011, where seasonal wildfires produce high levels of air pollution, significantly impacting population health. Copyright © 2016 John Wiley & Sons, Ltd.
Fang, Xin; Li, Runkui; Kan, Haidong; Bottai, Matteo; Fang, Fang
2016-01-01
Objective To demonstrate an application of Bayesian model averaging (BMA) with generalised additive mixed models (GAMM) and provide a novel modelling technique to assess the association between inhalable coarse particles (PM10) and respiratory mortality in time-series studies. Design A time-series study using a regional death registry between 2009 and 2010. Setting 8 districts in a large metropolitan area in Northern China. Participants 9559 permanent residents of the 8 districts who died of respiratory diseases between 2009 and 2010. Main outcome measures Per cent increase in daily respiratory mortality rate (MR) per interquartile range (IQR) increase of PM10 concentration and corresponding 95% confidence interval (CI) in single-pollutant and multipollutant (including NOx, CO) models. Results The Bayesian model averaged GAMM (GAMM+BMA) and the optimal GAMM of PM10, multipollutants and principal components (PCs) of multipollutants showed comparable results for the effect of PM10 on daily respiratory MR, that is, one IQR increase in PM10 concentration corresponded to a 1.38% vs 1.39%, 1.81% vs 1.83% and 0.87% vs 0.88% increase, respectively, in daily respiratory MR. However, GAMM+BMA gave slightly but noticeably wider CIs for the single-pollutant model (−1.09 to 4.28 vs −1.08 to 3.93) and the PCs-based model (−2.23 to 4.07 vs −2.03 to 3.88). The CIs of the multipollutant model from the two methods are similar, that is, −1.12 to 4.85 versus −1.11 to 4.83. Conclusions The BMA method may represent a useful tool for modelling uncertainty in time-series studies when evaluating the effect of air pollution on fatal health outcomes. PMID:27531727
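The model-averaging idea can be illustrated with BIC-based approximate posterior model weights, a common shortcut for BMA; the BIC values below are made up for illustration and are not from the study:

```python
import numpy as np

def bma_weights(bics):
    """Approximate posterior model probabilities from BIC values,
    using w_i proportional to exp(-BIC_i / 2) under equal prior
    model probabilities."""
    b = np.asarray(bics, dtype=float)
    w = np.exp(-(b - b.min()) / 2.0)   # subtract the min for numerical stability
    return w / w.sum()

# Illustrative BICs for three candidate pollutant models
weights = bma_weights([1502.3, 1500.1, 1507.8])
```

The BMA point estimate is then the weight-averaged effect across models, and the widened intervals reflect the between-model component of uncertainty that a single "optimal" GAMM ignores.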
An introduction to state space time series analysis.
Commandeur, J.J.F. & Koopman, S.J.
2007-01-01
Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor wi
Nonlinear Time Series Analysis Since 1990: Some Personal Reflections
Institute of Scientific and Technical Information of China (English)
Howel Tong
2002-01-01
I reflect upon the development of nonlinear time series analysis since 1990 by focusing on five major areas of development. These areas include the interface between nonlinear time series analysis and chaos, the nonparametric/semiparametric approach, nonlinear state space modelling, financial time series and nonlinear modelling of panels of time series.
Time series modeling of pathogen-specific disease probabilities with subsampled data.
Fisher, Leigh; Wakefield, Jon; Bauer, Cici; Self, Steve
2017-03-01
Many diseases arise due to exposure to one of multiple possible pathogens. We consider the situation in which disease counts are available over time from a study region, along with a measure of clinical disease severity, for example, mild or severe. In addition, we suppose a subset of the cases are lab tested in order to determine the pathogen responsible for disease. In such a context, we focus interest on modeling the probabilities of disease incidence given pathogen type. The time course of these probabilities is of great interest, as is the association with time-varying covariates such as meteorological variables. In this setting, a natural Bayesian approach would be based on imputation of the unsampled pathogen information using Markov chain Monte Carlo, but this is computationally challenging. We describe a practical approach to inference that is easy to implement. We use an empirical Bayes procedure in a first step to estimate summary statistics. We then treat these summary statistics as the observed data and develop a Bayesian generalized additive model. We analyze data on hand, foot, and mouth disease (HFMD) in China in which there are two pathogens of primary interest, enterovirus 71 (EV71) and Coxsackie A16 (CA16). We find that both EV71 and CA16 are associated with temperature, relative humidity, and wind speed, with reasonably similar functional forms for both pathogens. The important issue of confounding by time is modeled using a penalized B-spline model with a random effects representation. The level of smoothing is addressed by a careful choice of the prior on the tuning variance. © 2016, The International Biometric Society.
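The confounding-by-time adjustment above uses a penalized B-spline with a random-effects representation; a rough stand-in for that idea is ridge-penalized spline smoothing. The sketch below uses a truncated-power (piecewise-linear) basis instead of B-splines, purely to stay self-contained, and the penalty weight `lam` is fixed rather than given a prior:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
t = np.linspace(0.0, 1.0, n)
y = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(n)  # noisy seasonal trend

# Spline basis: intercept, linear term, and 20 truncated-power knot terms.
knots = np.linspace(0.0, 1.0, 22)[1:-1]
B = np.column_stack([np.ones(n), t] + [np.maximum(t - k, 0.0) for k in knots])

# Ridge penalty on the knot coefficients only; lam plays the role of the
# tuning variance that the paper controls through its prior.
lam = 0.1
P = np.diag([0.0, 0.0] + [1.0] * len(knots))
coef = np.linalg.solve(B.T @ B + lam * P, B.T @ y)
fit = B @ coef
```

Larger `lam` shrinks the knot coefficients toward zero and yields a smoother time trend, which is exactly the trade-off the paper's prior on the tuning variance governs.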
Bronson, Jonathan E; Fei, Jingyi; Hofman, Jake M; Gonzalez, Ruben L; Wiggins, Chris H
2009-12-16
Time series data provided by single-molecule Förster resonance energy transfer (smFRET) experiments offer the opportunity to infer not only model parameters describing molecular complexes, e.g., rate constants, but also information about the model itself, e.g., the number of conformational states. Resolving whether such states exist, or how many of them exist, requires a careful approach to the problem of model selection, here meaning discrimination among models with differing numbers of states. The most straightforward approach to model selection generalizes the common idea of maximum likelihood (selecting the most likely parameter values) to maximum evidence (selecting the most likely model). In either case, such an inference presents a tremendous computational challenge, which we address here by exploiting an approximation technique termed variational Bayesian expectation maximization. We demonstrate how this technique can be applied to temporal data such as smFRET time series; show superior statistical consistency relative to the maximum likelihood approach; compare its performance on smFRET data generated from experiments on the ribosome; and illustrate how model selection in such probabilistic or generative modeling can facilitate analysis of closely related temporal data currently prevalent in biophysics. Source code used in this analysis, including a graphical user interface, is available open source via http://vbFRET.sourceforge.net.
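vbFRET scores each candidate number of states by a variational lower bound on the model evidence. As a rough, simplified stand-in for that idea, the sketch below fits one-dimensional Gaussian mixtures by EM (ignoring the temporal/HMM structure) and compares state counts by BIC, which penalizes likelihood in the same spirit as the evidence bound:

```python
import numpy as np

def gmm_bic(x, k, iters=200):
    """Fit a one-dimensional Gaussian mixture with k states by EM and
    return its BIC. BIC stands in here for the variational evidence
    bound that vbFRET itself maximizes."""
    x = np.asarray(x, dtype=float)
    n = x.size
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread-out initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each state for each observation
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2.0 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances (with small safeguards)
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    dens = (np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
            / np.sqrt(2.0 * np.pi * var) * pi).sum(axis=1)
    loglik = np.log(dens).sum()
    nparams = 3 * k - 1                 # means, variances, free mixture weights
    return -2.0 * loglik + nparams * np.log(n)

# Two well-separated synthetic "FRET states" (illustrative data)
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.2, 0.05, 300), rng.normal(0.7, 0.05, 300)])
bics = {k: gmm_bic(x, k) for k in (1, 2, 3)}
best_k = min(bics, key=bics.get)
```

Maximum likelihood alone always prefers more states; the penalty term is what lets `best_k` recover the number of states actually supported by the data.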
Time Series Analysis Forecasting and Control
Box, George E P; Reinsel, Gregory C
2011-01-01
A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl
Onisko, Agnieszka; Druzdzel, Marek J.; Austin, R. Marshall
2016-01-01
Background: Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. Aim: The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. Materials and Methods: This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approaches, such as the Kaplan–Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. Results: The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Conclusion: Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches. PMID:28163973
Autovino, Dario; Minacapilli, Mario; Provenzano, Giuseppe
2015-04-01
Estimation of actual evapotranspiration by means of the Penman-Monteith (P-M) equation requires knowledge of the so-called 'bulk surface resistance', rc,act, representing the vapour flow resistance through the transpiring crop and evaporating soil surface. The accurate parameterization of rc,act remains largely unexplored, especially in the case of heterogeneous land surfaces. In agro-hydrological applications, the P-M equation commonly used to evaluate reference evapotranspiration (ET0) of a well-watered 'standardized crop' (grass or alfalfa) generally assumes a bulk surface resistance of 70 s m-1. Moreover, specific crop coefficients have to be used to estimate maximum and/or actual evapotranspiration based on ET0. In this paper, a simple procedure for the indirect estimation of rc,act as a function of a vegetation index computed from remote acquisition of Land Surface Temperature (LST) is proposed. An application was carried out in an irrigation district located near Castelvetrano, in the south-west of Sicily, mainly cultivated with olive groves, in which actual evapotranspiration fluxes were measured during two years (2010-2011) by an Eddy Covariance (EC) flux tower. The evapotranspiration measurements allowed evaluating rc,act based on the numerical inversion of the P-M equation. In the same study area, a large time series of MODIS LST data, characterized by a spatial resolution of 1x1 km and a time step of 8 days, was also acquired for the period from 2000 to 2014. A simple Vegetation Index Temperatures (VTI), with values ranging from 0 to 1, was computed using normalized LST values. Evapotranspiration fluxes measured in 2010 were used to calibrate the relationship between rc,act and VTI, whereas data from 2011 were used for its validation. The preliminary results showed that, for the considered crop, an almost constant value of rc,act, corresponding to about 250 s m-1, can be considered typical of periods in which the crop is well
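Although the study obtains rc,act by numerical inversion of the P-M equation, the inversion can be written in closed form once the remaining terms are measured; all symbols and numbers below are generic illustrative values, not the study's data:

```python
def pm_latent_heat(Rn_G, delta, gamma, rho_cp, vpd, ra, rc):
    """Latent heat flux (W m-2) from the Penman-Monteith equation:
    LE = (delta*(Rn-G) + rho*cp*VPD/ra) / (delta + gamma*(1 + rc/ra))."""
    return (delta * Rn_G + rho_cp * vpd / ra) / (delta + gamma * (1.0 + rc / ra))

def invert_rc(LE, Rn_G, delta, gamma, rho_cp, vpd, ra):
    """Bulk surface resistance rc (s m-1) recovered from a measured latent
    heat flux (e.g. eddy covariance) by solving Penman-Monteith for rc."""
    return ra * (((delta * Rn_G + rho_cp * vpd / ra) / LE - delta) / gamma - 1.0)

# Round trip with plausible midday values (all numbers illustrative):
# available energy 400 W m-2, delta and gamma in kPa K-1, VPD in kPa,
# rho*cp in J m-3 K-1, aerodynamic resistance 30 s m-1, rc = 250 s m-1.
LE = pm_latent_heat(400.0, 0.19, 0.066, 1208.0, 1.5, 30.0, rc=250.0)
rc_back = invert_rc(LE, 400.0, 0.19, 0.066, 1208.0, 1.5, 30.0)
```

The round trip (forward P-M with rc = 250 s m-1, then inversion from the resulting flux) recovers rc exactly, which is the consistency check one would apply before inverting measured EC fluxes.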
Developing a dengue early warning system using time series model: Case study in Tainan, Taiwan
Chen, Xiao-Wei; Jan, Chyan-Deng; Wang, Ji-Shang
2017-04-01
Dengue fever (DF) is a climate-sensitive disease that has been emerging in southern regions of Taiwan over the past few decades, causing a significant health burden to affected areas. This study aims to propose a predictive model to implement an early warning system so as to enhance dengue surveillance and control in Tainan, Taiwan. The Seasonal Autoregressive Integrated Moving Average (SARIMA) model was used herein to forecast dengue cases. Temporal correlation between dengue incidences and climate variables were examined by Pearson correlation analysis and cross-correlation tests in order to identify key determinants to be included as predictors. The dengue surveillance data between 2000 and 2009, as well as their respective climate variables, were then used as inputs for the model. We validated the model by forecasting the number of dengue cases expected to occur each week between January 1, 2010 and December 31, 2015. In addition, we analyzed historical dengue trends and found that 25 cases occurring in one week was a trigger point that often led to a dengue outbreak. This threshold point was combined with the season-based framework put forth by the World Health Organization to create a more accurate epidemic threshold for a Tainan-specific warning system. A seasonal ARIMA model with the general form (1,0,5)(1,1,1)_52 is identified as the most appropriate model based on the lowest AIC, and was proven significant in the prediction of observed dengue cases. Based on the correlation coefficient, Lag-11 maximum 1-hr rainfall (r=0.319, Pdengue surveillance and control in Tainan, Taiwan. We conclude that this timely dengue early warning system will enable public health services to allocate limited resources more effectively, and public health officials to adjust dengue emergency response plans to their maximum capabilities.
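The trigger-point logic described above (25 cases in one week) reduces to a simple scan over the weekly surveillance counts; the SARIMA forecasting step and the WHO season-based adjustment are omitted from this sketch, and the counts are invented:

```python
def dengue_alerts(weekly_cases, threshold=25):
    """Flag the weeks whose case count reaches the outbreak trigger point.
    The 25-cases-per-week trigger follows the study; in the full system the
    threshold would also vary with the WHO season-based framework, and the
    counts scanned would be SARIMA forecasts rather than observed cases."""
    return [week for week, cases in enumerate(weekly_cases) if cases >= threshold]

# Illustrative weekly case counts for seven consecutive weeks
alerts = dengue_alerts([3, 8, 14, 27, 41, 19, 30])
```

Running the same scan over forecast counts rather than observed ones is what turns the threshold into an early warning: the alert fires before the outbreak week arrives.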
Institute of Scientific and Technical Information of China (English)
Xu Jiankun; Wang Enyuan; Li Zhonghui; Wang Chao
2011-01-01
In order to compensate for the deficiencies of present methods of monitoring plane displacement in similarity model tests, such as inadequate real-time monitoring and excessive manual intervention, an effective monitoring method was proposed in this study. The major steps of the monitoring method are as follows: first, time-series images of the similarity model in the test were obtained by a camera; second, measuring points marked as artificial targets were automatically tracked and recognized from the time-series images; finally, the real-time plane displacement field was calculated by the fixed magnification between objects and images under the specific conditions. The application device for the method was then designed and tested. In addition, a sub-pixel location method and a distortion error model were used to improve the measuring accuracy. The results indicate that this method can record the entire test, especially detailed non-uniform deformation and sudden deformation. Compared with traditional methods, this method has a number of advantages, such as greater measurement accuracy and reliability, less manual intervention, higher automation, strong practicality, and much more measurement information.
A time-series analysis framework for the flood-wave method to estimate groundwater model parameters
Obergfell, Christophe; Bakker, Mark; Maas, Kees
2016-11-01
The flood-wave method is implemented within the framework of time-series analysis to estimate aquifer parameters for use in a groundwater model. The resulting extended flood-wave method is applicable to situations where groundwater fluctuations are affected significantly by time-varying precipitation and evaporation. Response functions for time-series analysis are generated with an analytic groundwater model describing stream-aquifer interaction. The analytical response functions play the same role as the well function in a pumping test, which is to translate observed head variations into groundwater model parameters by means of a parsimonious model equation. An important difference compared with the traditional flood-wave method and pumping tests is that aquifer parameters are inferred from the combined effects of precipitation, evaporation, and stream stage fluctuations. Naturally occurring fluctuations are separated into contributions from the different stresses. The proposed method is illustrated with data collected near a lowland river in the Netherlands. Special emphasis is placed on the interpretation of the streambed resistance. The resistance of the streambed is the result of stream-line contraction rather than a semi-pervious streambed, a conclusion reached through comparison with the head loss calculated with an analytical two-dimensional cross-section model.
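The central mechanism, translating each stress series into head variations through an analytical response function, amounts to a discrete convolution. The exponential impulse response below is an illustrative stand-in for the paper's analytic stream-aquifer response functions, and all parameter values are invented:

```python
import numpy as np

def head_contribution(stress, A, a, nsteps=100, dt=1.0):
    """Contribution of one stress series to the observed head, computed as
    the discrete convolution of the stress with an exponential impulse
    response theta(t) = A * exp(-a * t)."""
    t = np.arange(nsteps) * dt
    theta = A * np.exp(-a * t) * dt
    return np.convolve(stress, theta)[: len(stress)]

rng = np.random.default_rng(3)
precip = rng.exponential(2.0, 365)      # daily precipitation (mm), synthetic
evap = np.full(365, 1.5)                # daily evaporation (mm), synthetic

# Head fluctuation as the sum of stress contributions (stream stage omitted)
head = head_contribution(precip, A=0.010, a=0.05) \
     - head_contribution(evap, A=0.008, a=0.05)

# Feeding in a unit impulse returns the response function itself
impulse = np.zeros(10)
impulse[0] = 1.0
resp = head_contribution(impulse, A=1.0, a=0.1)
```

Fitting the response-function parameters (here A and a) to observed heads, with one response per stress, is the time-series-analysis step that replaces the pumping-test well function.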
Directory of Open Access Journals (Sweden)
Chih-Chieh Young
2015-01-01
Accurate prediction of water level fluctuation is important in lake management due to its significant impacts in various aspects. This study utilizes four model approaches to predict water levels in the Yuan-Yang Lake (YYL) in Taiwan: a three-dimensional hydrodynamic model, an artificial neural network (ANN) model (back propagation neural network, BPNN), a time series forecasting model (autoregressive moving average with exogenous inputs, ARMAX), and a combined hydrodynamic and ANN model. In particular, the black-box ANN model and the physically based hydrodynamic model are coupled to more accurately predict water level fluctuation. Hourly water level data (a total of 7296 observations) were collected for model calibration (training) and validation. Three statistical indicators (mean absolute error, root mean square error, and coefficient of correlation) were adopted to evaluate model performance. Overall, the results demonstrate that the hydrodynamic model can satisfactorily predict hourly water level changes during the calibration stage but not during the validation stage. The ANN and ARMAX models predict the water level better than the hydrodynamic model does. Meanwhile, the results from the ANN model are superior to those of the ARMAX model in both training and validation phases. The proposed approach of using a three-dimensional hydrodynamic model in conjunction with an ANN model clearly improves the prediction accuracy for water level fluctuation.
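The ARMAX component can be sketched as a least-squares ARX fit (the moving-average error terms are omitted for simplicity); the synthetic "water level" series below is driven by an assumed rainfall input, so all coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic hourly water level: AR(2) dynamics plus a lagged rainfall input
n = 2000
rain = rng.exponential(1.0, n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.7 * y[t - 1] - 0.2 * y[t - 2] + 0.5 * rain[t - 1] \
         + 0.05 * rng.standard_normal()

# ARX fit by least squares: regress y[t] on its own lags and the lagged input
X = np.column_stack([y[1:-1], y[:-2], rain[1:-1]])
target = y[2:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
```

With enough data the least-squares estimates recover the generating coefficients closely; a full ARMAX fit would additionally model the correlated error term.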
Optimal transformations for categorical autoregressive time series
Buuren, S. van
1996-01-01
This paper describes a method for finding optimal transformations for analyzing time series by autoregressive models. 'Optimal' implies that the agreement between the autoregressive model and the transformed data is maximal. Such transformations help 1) to increase the model fit, and 2) to analyze c
Common large innovations across nonlinear time series
Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)
2002-01-01
textabstractWe propose a multivariate nonlinear econometric time series model, which can be used to examine if there is common nonlinearity across economic variables. The model is a multivariate censored latent effects autoregression. The key feature of this model is that nonlinearity appears as sep
Dalla Valle, Nicolas; Wutzler, Thomas; Meyer, Stefanie; Potthast, Karin; Michalzik, Beate
2017-04-01
Dual-permeability type models are widely used to simulate water fluxes and solute transport in structured soils. These models contain two spatially overlapping flow domains with different parameterizations or even entirely different conceptual descriptions of flow processes. They are usually able to capture preferential flow phenomena, but a large set of parameters is needed, which are very laborious to obtain or cannot be measured at all. Therefore, model inversions are often used to derive the necessary parameters. Although these require sufficient input data themselves, they can use measurements of state variables instead, which are often easier to obtain and can be monitored by automated measurement systems. In this work we show a method to estimate soil hydraulic parameters from high frequency soil moisture time series data gathered at two different measurement depths by inversion of a simple one dimensional dual-permeability model. The model uses an advection equation based on the kinematic wave theory to describe the flow in the fracture domain and a Richards equation for the flow in the matrix domain. The soil moisture time series data were measured in mesocosms during sprinkling experiments. The inversion consists of three consecutive steps: First, the parameters of the water retention function were assessed using vertical soil moisture profiles in hydraulic equilibrium. This was done using two different exponential retention functions and the Campbell function. Second, the soil sorptivity and diffusivity functions were estimated from Boltzmann-transformed soil moisture data, which allowed the calculation of the hydraulic conductivity function. Third, the parameters governing flow in the fracture domain were determined using the whole soil moisture time series. The resulting retention functions were within the range of values predicted by pedotransfer functions apart from very dry conditions, where all retention functions predicted lower matrix potentials
An Effective Time Series Analysis for Stock Trend Prediction Using ARIMA Model for Nifty Midcap-50
Directory of Open Access Journals (Sweden)
B.Uma Devi
2013-01-01
Data mining and its tools have played a vital role in exploring data from different warehouses. Using data mining tools and analytical technologies, a considerable amount of research explores new approaches for investment decisions. The market attracts a huge volume of investors, who need sufficient knowledge to predict and control their investments; the stock market nevertheless sometimes fails to attract new investors, because they are unaware of its behavior and reluctant to take on the risk. An approach with adequate expertise is designed to help investors ascertain hidden patterns in historic data that have feasible predictive ability for their investment decisions. In this paper, among the NSE Nifty Midcap50 companies, the top four companies with the largest midcap value were selected for analysis. The historical data play a significant role in helping investors get an overview of market behavior during the past decade. Stock data for the past five years were collected and trained using the ARIMA model with different parameters. Test criteria such as the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) were applied to assess the accuracy of the model. The performance of the trained model is analyzed, and the model is also tested to find the trend and market behavior for future forecasts.
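The AIC/BIC order-selection step can be sketched for the pure-AR case (no differencing or MA terms, unlike a full ARIMA fit) using least squares; the series below is simulated, not Nifty Midcap data:

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares AR(p) fit; returns (coefficients, residual variance, n_eff)."""
    X = np.column_stack([y[p - j : len(y) - j] for j in range(1, p + 1)])
    target = y[p:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ coef
    return coef, float(np.mean(resid ** 2)), len(target)

def bic_ar(y, p):
    """BIC for an AR(p) model, in the spirit of the AIC/BIC criteria above."""
    _, s2, n = fit_ar(y, p)
    return n * np.log(s2) + p * np.log(n)

# Simulated AR(2) series standing in for a daily price series (illustrative)
rng = np.random.default_rng(5)
n = 1000
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

bics = {p: bic_ar(y, p) for p in range(1, 6)}
best_p = min(bics, key=bics.get)
coef2, _, _ = fit_ar(y, 2)
```

Scanning candidate orders and keeping the minimum-criterion model is exactly the selection loop the paper applies over ARIMA parameterizations; BIC's stronger penalty tends to choose more parsimonious orders than AIC.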
Comprehensive model of annual plankton succession based on the whole-plankton time series approach.
Romagnan, Jean-Baptiste; Legendre, Louis; Guidi, Lionel; Jamet, Jean-Louis; Jamet, Dominique; Mousseau, Laure; Pedrotti, Maria-Luiza; Picheral, Marc; Gorsky, Gabriel; Sardet, Christian; Stemmann, Lars
2015-01-01
Ecological succession provides a widely accepted description of seasonal changes in phytoplankton and mesozooplankton assemblages in the natural environment, but concurrent changes in smaller (i.e. microbes) and larger (i.e. macroplankton) organisms are not included in the model because plankton ranging from bacteria to jellies are seldom sampled and analyzed simultaneously. Here we studied, for the first time in the aquatic literature, the succession of marine plankton in the whole-plankton assemblage that spanned 5 orders of magnitude in size from microbes to macroplankton predators (not including fish or fish larvae, for which no consistent data were available). Samples were collected in the northwestern Mediterranean Sea (Bay of Villefranche) weekly during 10 months. Simultaneously collected samples were analyzed by flow cytometry, inverse microscopy, FlowCam, and ZooScan. The whole-plankton assemblage underwent sharp reorganizations that corresponded to bottom-up events of vertical mixing in the water-column, and its development was top-down controlled by large gelatinous filter feeders and predators. Based on the results provided by our novel whole-plankton assemblage approach, we propose a new comprehensive conceptual model of the annual plankton succession (i.e. whole plankton model) characterized by both stepwise stacking of four broad trophic communities from early spring through summer, which is a new concept, and progressive replacement of ecological plankton categories within the different trophic communities, as recognised traditionally.
Institute of Scientific and Technical Information of China (English)
Chan, Kung-Sik; Tong, Howell; Stenseth, Nils Chr.
2009-01-01
The study of the rodent fluctuations of the North was initiated in its modern form with Elton's pioneering work. Many scientific studies have been designed to collect yearly rodent abundance data, but the resulting time series are generally subject to at least two "problems": being short and non-linear. We explore the use of the continuous threshold autoregressive (TAR) models for analyzing such data. In the simplest case, the continuous TAR models are additive autoregressive models, being piecewise linear in one lag and linear in all other lags. The location of the slope change is called the threshold parameter. The continuous TAR models for rodent abundance data can be derived from a general prey-predator model under some simplifying assumptions. The lag in which the threshold is located sheds important insights on the structure of the prey-predator system. We propose to assess the uncertainty on the location of the threshold via a new bootstrap called the nearest block bootstrap (NBB), which combines the methods of the moving block bootstrap and the nearest neighbor bootstrap. The NBB assumes an underlying finite-order time-homogeneous Markov process. Essentially, the NBB bootstraps blocks of random block sizes, with each block being drawn from a non-parametric estimate of the future distribution given the realized past bootstrap series. We illustrate the methods by simulations and on a particular rodent abundance time series from Kilpisjärvi, Northern Finland.
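A continuous TAR model that is piecewise linear in one lag can be written with a hinge term max(y_{t-1} − r, 0), which keeps the regression function continuous at the threshold r; the coefficients below are illustrative, not estimates from rodent data:

```python
import numpy as np

def ctar_step(y1, y2, r, params, eps):
    """One step of a continuous TAR(2): linear in lag 2, piecewise linear
    in lag 1 via the hinge max(y1 - r, 0), continuous at the threshold r."""
    a, b1, b2, c = params
    return a + b1 * y1 + b2 * y2 + c * max(y1 - r, 0.0) + eps

# Illustrative regime change: lag-1 slope is 0.8 below the threshold
# and 0.8 + (-0.9) = -0.1 above it (e.g. density-dependent regulation).
rng = np.random.default_rng(6)
n = 500
r, params = 0.0, (0.1, 0.8, -0.2, -0.9)
y = np.zeros(n)
for t in range(2, n):
    y[t] = ctar_step(y[t - 1], y[t - 2], r, params, 0.3 * rng.standard_normal())
```

In estimation, r and the lag carrying the hinge are typically found by grid search, and it is the uncertainty in that estimated r which the paper's nearest block bootstrap is designed to assess.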
The benefit of modeled ozone data for the reconstruction of a 99-year UV radiation time series
Junk, J.; Feister, U.; Helbig, A.; Görgen, K.; Rozanov, E.; Krzyścin, J. W.; Hoffmann, L.
2012-08-01
Solar erythemal UV radiation (UVER) is highly relevant for numerous biological processes that affect plants, animals, and human health. Nevertheless, long-term UVER records are scarce. As significant declines in the column ozone concentration were observed in the past and a recovery of the stratospheric ozone layer is anticipated by the middle of the 21st century, there is a strong interest in the temporal variation of UVER time series. Therefore, we combined ground-based measurements of different meteorological variables with modeled ozone data sets to reconstruct time series of daily totals of UVER at the Meteorological Observatory, Potsdam, Germany. Artificial neural networks were trained with measured UVER, sunshine duration, the day of year, measured and modeled total column ozone, as well as the minimum solar zenith angle. This allows for the reconstruction of daily totals of UVER for the period from 1901 to 1999. Additionally, analyses of the long-term variations from 1901 until 1999 of the reconstructed, new UVER data set are presented. The time series of monthly and annual totals of UVER provide a long-term meteorological basis for epidemiological investigations in human health and occupational medicine for the region of Potsdam and Berlin. A strong benefit of our ANN approach is the fact that it can be easily adapted to different geographical locations, as successfully tested in the framework of the COST Action 726.
Lakshmi, K.; Rama Mohan Rao, A.
2014-10-01
In this paper, a novel output-only damage-detection technique based on time-series models for structural health monitoring in the presence of environmental variability and measurement noise is presented. The large amount of data obtained in the form of time-history response is transformed using principal component analysis, in order to reduce the data size and thereby improve the computational efficiency of the proposed algorithm. The time instant of damage is obtained by fitting the acceleration time-history data from the structure using autoregressive (AR) and AR with exogenous inputs time-series prediction models. The probability density functions (PDFs) of damage features obtained from the variances of prediction errors corresponding to reference and current healthy data are found to shift away from each other due to the presence of various uncertainties such as environmental variability and measurement noise. Control limits based on a novelty index are obtained using the distances between the peaks of the PDF curves in healthy condition and used later for determining the current condition of the structure. Numerical simulation studies have been carried out using a simply supported beam and also validated using experimental benchmark data corresponding to a three-storey framed bookshelf structure proposed by Los Alamos National Laboratory. The studies carried out in this paper clearly indicate the efficiency of the proposed algorithm for damage detection in the presence of measurement noise and environmental variability.
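A stripped-down sketch of the AR prediction-error idea underlying such damage features: fit an AR model to a healthy reference response, then use the ratio of prediction-error variances on current versus reference data as a damage indicator. The PCA compression, exogenous inputs, and PDF-based control limits of the full method are omitted, and the AR coefficients below are illustrative, not taken from the benchmark structure.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ar(coefs, n, rng):
    """Simulate an AR process with unit innovation variance (burn-in dropped)."""
    x = np.zeros(n + 200)
    for t in range(len(coefs), len(x)):
        x[t] = sum(c * x[t - 1 - i] for i, c in enumerate(coefs)) \
               + rng.standard_normal()
    return x[200:]

def ar_design(x, p):
    """Lagged design matrix and target vector for an AR(p) fit."""
    X = np.column_stack([x[p - 1 - k : len(x) - 1 - k] for k in range(p)])
    return X, x[p:]

def fit_ar(x, p):
    X, y = ar_design(x, p)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def pred_err_var(x, coef):
    """Variance of one-step prediction errors under a fixed AR model."""
    X, y = ar_design(x, len(coef))
    return float(np.var(y - X @ coef))

# Reference (healthy) and current (changed-dynamics) responses.
healthy = simulate_ar([0.6, -0.3], 3000, rng)
damaged = simulate_ar([-0.2, 0.1], 3000, rng)

ref_model = fit_ar(healthy, 2)
damage_feature = pred_err_var(damaged, ref_model) / pred_err_var(healthy, ref_model)
```

When the current data come from the same dynamics as the reference, the feature stays near 1; a change in the structure inflates the prediction-error variance and pushes the feature above the control limit.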
Wang, Duan; Podobnik, Boris; Horvatić, Davor; Stanley, H Eugene
2011-04-01
We propose a modified time lag random matrix theory in order to study time-lag cross correlations in multiple time series. We apply the method to 48 world indices, one for each of 48 different countries. We find long-range power-law cross correlations in the absolute values of returns that quantify risk, and find that they decay much more slowly than cross correlations between the returns. The magnitude of the cross correlations constitutes "bad news" for international investment managers who may believe that risk is reduced by diversifying across countries. We find that when a market shock is transmitted around the world, the risk decays very slowly. We explain these time-lag cross correlations by introducing a global factor model (GFM) in which all index returns fluctuate in response to a single global factor. For each pair of individual time series of returns, the cross correlations between returns (or magnitudes) can be modeled with the autocorrelations of the global factor returns (or magnitudes). We estimate the global factor using principal component analysis, which minimizes the variance of the residuals after removing the global trend. Using random matrix theory, a significant fraction of the world index cross correlations can be explained by the global factor, which supports the utility of the GFM. We demonstrate applications of the GFM in forecasting risks at the world level, and in finding uncorrelated individual indices. We find ten indices that are practically uncorrelated with the global factor and with the remainder of the world indices, which is relevant information for world managers in reducing their portfolio risk. Finally, we argue that this general method can be applied to a wide range of phenomena in which time series are measured, ranging from seismology and physiology to atmospheric geophysics.
Watanabe, Hayafumi; Takayasu, Hideki; Takayasu, Misako
2016-01-01
To elucidate the non-trivial empirical statistical properties of fluctuations of a typical non-steady time series representing the appearance of words in blogs, we investigated approximately five billion Japanese blogs over a period of six years and analyse some corresponding mathematical models. First, we introduce a solvable non-steady extension of the random diffusion model, which can be deduced by modelling the behaviour of heterogeneous random bloggers. Next, we deduce theoretical expressions for both the temporal and ensemble fluctuation scalings of this model, and demonstrate that these expressions can reproduce all empirical scalings over eight orders of magnitude. Furthermore, we show that the model can reproduce other statistical properties of time series representing the appearance of words in blogs, such as functional forms of the probability density and correlations in the total number of blogs. As an application, we quantify the abnormality of special nationwide events by measuring the fluctuati...
Muchlisoh, Siti; Kurnia, Anang; Notodiputro, Khairil Anwar; Mangku, I. Wayan
2016-02-01
Labor force surveys based on a rotating panel design have been carried out in many countries, including Indonesia. The labor force survey in Indonesia is regularly conducted by Statistics Indonesia (Badan Pusat Statistik, BPS) and is known as the National Labor Force Survey (Sakernas). The main purpose of Sakernas is to obtain information about unemployment rates and their changes over time. Sakernas is a quarterly survey, designed only for estimating parameters at the provincial level. The quarterly unemployment rate published by BPS (official statistics) is calculated using only cross-sectional methods, despite the fact that the data are collected under a rotating panel design. The purpose of this study was to estimate the quarterly unemployment rate at the district level using a small area estimation (SAE) model that combines time series and cross-sectional data. The study focused on the application and comparison of the Rao-Yu model and the dynamic model in the context of estimating the unemployment rate based on a rotating panel survey. The goodness of fit of the two models was almost identical. Both models produced almost identical estimates and performed better than direct estimation, but the dynamic model was more capable than the Rao-Yu model of capturing heterogeneity across areas, although this advantage was reduced over time.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Serinaldi, F.
2010-12-01
Discrete multiplicative random cascade (MRC) models were extensively studied and applied to disaggregate rainfall data, thanks to their formal simplicity and the small number of involved parameters. Focusing on temporal disaggregation, the rationale of these models is based on multiplying the value assumed by a physical attribute (e.g., rainfall intensity) at a given time scale L, by a suitable number b of random weights, to obtain b attribute values corresponding to statistically plausible observations at a smaller L/b time resolution. In the original formulation of the MRC models, the random weights were assumed to be independent and identically distributed. However, for several studies this hypothesis did not appear to be realistic for the observed rainfall series as the distribution of the weights was shown to depend on the space-time scale and rainfall intensity. Since these findings contrast with the scale invariance assumption behind the MRC models and impact on the applicability of these models, it is worth studying their nature. This study explores the possible presence of dependence of the parameters of two discrete MRC models on rainfall intensity and time scale, by analyzing point rainfall series with 5-min time resolution. Taking into account a discrete microcanonical (MC) model based on beta distribution and a discrete canonical beta-logstable (BLS), the analysis points out that the relations between the parameters and rainfall intensity across the time scales are detectable and can be modeled by a set of simple functions accounting for the parameter-rainfall intensity relationship, and another set describing the link between the parameters and the time scale. Therefore, MC and BLS models were modified to explicitly account for these relationships and compared with the continuous in scale universal multifractal (CUM) model, which is used as a physically based benchmark model. Monte Carlo simulations point out that the dependence of MC and BLS
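The multiplicative splitting at the core of a microcanonical MRC model can be sketched as follows: each value at scale L is split into b = 2 parts with Beta-distributed weights that sum to one, so mass is conserved exactly at every cascade level. The beta parameter and the input values are illustrative; the intensity- and scale-dependent parameterizations studied in the paper are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def disaggregate(values, levels, alpha=4.0, rng=rng):
    """Microcanonical beta cascade: split each value into b = 2 parts
    with weights w and 1 - w, w ~ Beta(alpha, alpha), so that each
    split conserves mass exactly."""
    x = np.asarray(values, dtype=float)
    for _ in range(levels):
        w = rng.beta(alpha, alpha, size=x.size)
        x = np.column_stack([w * x, (1.0 - w) * x]).ravel()
    return x

# e.g. three hourly rainfall depths disaggregated through 3 cascade
# levels -> 8 values per hour at 7.5-min resolution (illustrative).
hourly = np.array([4.0, 0.0, 2.5])
fine = disaggregate(hourly, levels=3)
```

A canonical cascade would instead conserve mass only on average; making `alpha` a function of intensity and scale is exactly the kind of extension the abstract investigates.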
Nahlawi, Layan; Goncalves, Caroline; Imani, Farhad; Gaed, Mena; Gomez, Jose A.; Moussa, Madeleine; Gibson, Eli; Fenster, Aaron; Ward, Aaron D.; Abolmaesumi, Purang; Mousavi, Parvin; Shatkay, Hagit
2017-03-01
Recent studies have shown the value of Temporal Enhanced Ultrasound (TeUS) imaging for tissue characterization in transrectal ultrasound-guided prostate biopsies. Here, we present results of experiments designed to study the impact of temporal order of the data in TeUS signals. We assess the impact of variations in temporal order on the ability to automatically distinguish benign prostate-tissue from malignant tissue. We have previously used Hidden Markov Models (HMMs) to model TeUS data, as HMMs capture temporal order in time series. In the work presented here, we use HMMs to model malignant and benign tissues; the models are trained and tested on TeUS signals while introducing variation to their temporal order. We first model the signals in their original temporal order, followed by modeling the same signals under various time rearrangements. We compare the performance of these models for tissue characterization. Our results show that models trained over the original order-preserving signals perform statistically significantly better for distinguishing between malignant and benign tissues, than those trained on rearranged signals. The performance degrades as the amount of temporal-variation increases. Specifically, accuracy of tissue characterization decreases from 85% using models trained on original signals to 62% using models trained and tested on signals that are completely temporally-rearranged. These results indicate the importance of order in characterization of tissue malignancy from TeUS data.
Multivariate Time Series Decomposition into Oscillation Components.
Matsuda, Takeru; Komaki, Fumiyasu
2017-08-01
Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.
Time averaging, ageing and delay analysis of financial time series
Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf
2017-06-01
We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
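The time averaged MSD at the centre of this analysis is straightforward to compute from a single trajectory; a sketch on a simulated geometric Brownian motion log-price (illustrative drift and volatility, not fitted Dow Jones values):

```python
import numpy as np

rng = np.random.default_rng(4)

def tamsd(x, lags):
    """Time averaged mean squared displacement of one trajectory."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

# Geometric Brownian motion: simulate the log price, whose TAMSD grows
# linearly in the lag time (ordinary Brownian scaling).
n, mu, sigma, dt = 20000, 0.05, 0.2, 1.0 / 252
logp = np.cumsum((mu - sigma**2 / 2) * dt
                 + sigma * np.sqrt(dt) * rng.standard_normal(n))

lags = np.array([1, 2, 4, 8, 16])
msd = tamsd(logp, lags)
slope = float(np.polyfit(np.log(lags), np.log(msd), 1)[0])
```

For Brownian motion the log-log slope is close to 1; the ageing and delay variants in the paper restrict or shift the averaging window over the measurement time, which this sketch does not reproduce.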
Distributed analysis of simultaneous EEG-fMRI time-series: modeling and interpretation issues.
Esposito, Fabrizio; Aragri, Adriana; Piccoli, Tommaso; Tedeschi, Gioacchino; Goebel, Rainer; Di Salle, Francesco
2009-10-01
Functional magnetic resonance imaging (fMRI) and electroencephalography (EEG) represent brain activity in terms of a reliable anatomical localization and a detailed temporal evolution of neural signals. Simultaneous EEG-fMRI recordings offer the possibility to greatly enrich the significance and the interpretation of the single modality results because the same neural processes are observed from the same brain at the same time. Nonetheless, the different physical nature of the measured signals by the two techniques renders the coupling not always straightforward, especially in cognitive experiments where spatially localized and distributed effects coexist and evolve temporally at different temporal scales. The purpose of this article is to illustrate the combination of simultaneously recorded EEG and fMRI signals exploiting the principles of EEG distributed source modeling. We define a common source space for fMRI and EEG signal projection and gather a conceptually unique framework for the spatial and temporal comparative analysis. We illustrate this framework in a graded-load working-memory simultaneous EEG-fMRI experiment based on the n-back task where sustained load-dependent changes in the blood-oxygenation-level-dependent (BOLD) signals during continuous item memorization co-occur with parametric changes in the EEG theta power induced at each single item. In line with previous studies, we demonstrate on two single-subject cases how the presented approach is capable of colocalizing in midline frontal regions two phenomena simultaneously observed at different temporal scales, such as the sustained negative changes in BOLD activity and the parametric EEG theta synchronization. We discuss the presented approach in relation to modeling and interpretation issues typically arising in simultaneous EEG-fMRI studies.
Measuring nonlinear behavior in time series data
Wai, Phoong Seuk; Ismail, Mohd Tahir
2014-12-01
Stationarity testing is important in characterizing time series behavior, since financial and economic data series often contain missing data, structural changes, and jumps or breaks. Moreover, a nonstationary time series can often be transformed to stationarity by treating it as a difference-stationary or trend-stationary process. Two different stationarity hypothesis tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of the time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and the Lagrange multiplier is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of the oil price, the gold price, and the Malaysian stock market. Moreover, the Quandt-Andrews, Bai-Perron, and Chow tests are used to detect the existence of breaks in the data series. The monthly index data range from December 1989 until May 2012. The results show that these three series exhibit nonlinear properties but can be transformed to stationary series by taking first differences.
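The unit-root logic behind the ADF test can be illustrated with a bare-bones Dickey-Fuller statistic (no constant, no lag augmentation; a teaching sketch, not the full ADF or KPSS procedures used in the paper): a random walk gives a statistic near zero, while its first difference gives a strongly negative one.

```python
import numpy as np

rng = np.random.default_rng(5)

def df_stat(x):
    """t statistic for rho = 0 in  dx_t = rho * x_{t-1} + e_t.
    A minimal Dickey-Fuller sketch: no constant, no lag augmentation."""
    y, z = np.diff(x), x[:-1]
    rho = float(z @ y) / float(z @ z)
    resid = y - rho * z
    s2 = float(resid @ resid) / (len(y) - 1)
    return float(rho / np.sqrt(s2 / float(z @ z)))

walk = np.cumsum(rng.standard_normal(2000))   # unit-root series
level_stat = df_stat(walk)                    # typically small in magnitude
diff_stat = df_stat(np.diff(walk))            # strongly negative
```

The real ADF test adds a constant (and possibly a trend), augments the regression with lagged differences, and compares the statistic against Dickey-Fuller critical values rather than the normal distribution.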
DEFF Research Database (Denmark)
Sandoval, Santiago; Vezzaro, Luca; Bertrand-Krajewski, Jean-Luc
2016-01-01
seeks to evaluate the potential of the Singular Spectrum Analysis (SSA), a time-series modelling/gap-filling method, to complete dry weather time series. The SSA method is tested by reconstructing 1000 artificial discontinuous time series, randomly generated from real flow rate and total suspended...... solids (TSS) online measurements (year 2007, 2 minutes time-step, combined system, Ecully, Lyon, France). Results show up the potential of the method to fill gaps longer than 0.5 days, especially between 0.5 days and 1 day (mean NSE > 0.6) in the flow rate time series. TSS results still perform very...
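A basic SSA gap-filling sketch, assuming the standard embed / truncated-SVD / diagonal-average reconstruction with iterative imputation of the gap. The window, rank, and the synthetic daily-cycle signal below are illustrative stand-ins for the Ecully flow-rate series, which is not reproduced here.

```python
import numpy as np

def ssa_reconstruct(x, window, rank):
    """Rank-truncated SSA: embed, truncate the SVD, diagonal-average back."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[j:j + window] for j in range(k)])  # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    rec, cnt = np.zeros(n), np.zeros(n)
    for j in range(k):
        rec[j:j + window] += Xr[:, j]
        cnt[j:j + window] += 1.0
    return rec / cnt

def ssa_fill(x, mask, window, rank, iters=30):
    """Iteratively replace gap values (mask == True) by the SSA fit."""
    y = np.where(mask, x[~mask].mean(), x)
    for _ in range(iters):
        y[mask] = ssa_reconstruct(y, window, rank)[mask]
    return y

# Synthetic dry-weather-like signal with a daily cycle (720 steps of
# 2 min per day) and an artificial half-day gap.
t = np.arange(1440)
x = 10.0 + 3.0 * np.sin(2 * np.pi * t / 720)
mask = np.zeros(t.size, dtype=bool)
mask[600:960] = True                          # half-day gap
filled = ssa_fill(x, mask, window=480, rank=3)
gap_err = float(np.max(np.abs(filled[mask] - x[mask])))
```

Observed samples are left untouched; only the masked positions are iterated toward the low-rank fit, which is the regime (gaps between 0.5 and 1 day) where the abstract reports the best NSE scores.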
Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.
2002-01-01
In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing application
ARIMA-Based Time Series Model of Stochastic Wind Power Generation
DEFF Research Database (Denmark)
Chen, Peiyuan; Pedersen, Troels; Bak-Jensen, Birgitte
2010-01-01
This paper proposes a stochastic wind power model based on an autoregressive integrated moving average (ARIMA) process. The model takes into account the nonstationarity and physical limits of stochastic wind power generation. The model is constructed based on wind power measurement of one year from...... the Nysted offshore wind farm in Denmark. The proposed limited-ARIMA (LARIMA) model introduces a limiter and characterizes the stochastic wind power generation by mean level, temporal correlation and driving noise. The model is validated against the measurement in terms of temporal correlation...... and probability distribution. The LARIMA model outperforms a first-order transition matrix based discrete Markov model in terms of temporal correlation, probability distribution and model parameter number. The proposed LARIMA model is further extended to include the monthly variation of the stochastic wind power...
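The limiter idea in the LARIMA model can be sketched with an AR(1) fluctuation around a mean level, clipped to the physical range of the wind farm. All numbers below are illustrative, not the fitted Nysted parameters, and the full model's integrated (ARIMA) structure and monthly variation are omitted.

```python
import numpy as np

rng = np.random.default_rng(6)

# AR(1) fluctuation around a mean production level, hard-limited to the
# physical range [0, rated] (illustrative parameters only).
n, rated, mean_level = 2000, 165.0, 60.0
phi, sigma = 0.98, 4.0

power = np.empty(n)
state = 0.0
for t in range(n):
    state = phi * state + sigma * rng.standard_normal()
    power[t] = min(max(mean_level + state, 0.0), rated)  # limiter

lag1_corr = float(np.corrcoef(power[:-1], power[1:])[0, 1])
```

The limiter enforces the physical bounds that a plain ARMA process would violate, while the autoregressive state keeps the strong temporal correlation that the paper validates against the measurements.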
WIMAX TRAFFIC MODEL BASED ON TIME SERIES FOR FORECAST FUTURE VALUES OF TRAFFIC
Directory of Open Access Journals (Sweden)
Cesar Augusto Hernández Suarez
2009-03-01
Full Text Available The objective of this research is to show that time series are an excellent tool for modelling data traffic in WiMAX networks. To this end, the Box-Jenkins methodology, which is described in this article, was used. Modelling WiMAX traffic with correlated models such as time series makes it possible to capture much of the dynamics of the data's behaviour in a single equation and, on that basis, to estimate future traffic values. This is an advantage for coverage planning, resource reservation, and more timely and efficient control, integrated at different levels of the functional hierarchy of the WiMAX data network. The research yielded an ARIMA traffic model of order 18, which produced traffic forecasts with relatively small mean squared error values for a 10-day period.
On forecasting cointegrated seasonal time series
M. Löf (Marten); Ph.H.B.F. Franses (Philip Hans)
2000-01-01
textabstractWe analyze periodic and seasonal cointegration models for bivariate quarterly observed time series in an empirical forecasting study. We include both single equation and multiple equation methods. A VAR model in first differences with and without cointegration restrictions is also
Institute of Scientific and Technical Information of China (English)
LI W.K.; LI GuoDong
2009-01-01
The authors are to be congratulated for an innovative paper in terms of both modelling methodology and subject matter significance. The analysis of short time series is known to be difficult even for linear models.
A time series of TanDEM-X digital elevation models to monitor a glacier surge
Wendt, Anja; Mayer, Christoph; Lambrecht, Astrid; Floricioiu, Dana
2016-04-01
Bivachny Glacier, a tributary of the more than 70 km long Fedchenko Glacier in the Pamir Mountains, Central Asia, is a surge-type glacier with three known surges during the 20th century. In 2011, the most recent surge started which, in contrast to the previous ones, evolved down the whole glacier and reached the confluence with Fedchenko Glacier. Spatial and temporal glacier volume changes can be derived from high-resolution digital elevation models (DEMs) based on bistatic InSAR data from the TanDEM-X mission. There are nine DEMs available between 2011 and 2015 covering the entire surge period in time steps from a few months up to one year. During the surge, the glacier surface elevation increased by up to 130 m in the lower part of the glacier, and change rates of up to 0.6 m per day were observed. The surface height dataset was complemented with glacier surface velocity information from TerraSAR-X/TanDEM-X data as well as optical Landsat imagery. While the glacier was practically stagnant in 2000 after the end of the previous surge in the 1990s, the velocity increase started in 2011 in the upper reaches of the ablation area and successively moved downwards and intensified, reaching up to 4.0 m per day. The combination of surface elevation changes and glacier velocities, both of high temporal and spatial resolution, provides the unique opportunity to describe and analyse the evolution of the surge in unprecedented detail. Especially the relation between the mobilization front and the local mass transport provides insight into the surge dynamics.
VARIABLE SELECTION BY PSEUDO WAVELETS IN HETEROSCEDASTIC REGRESSION MODELS INVOLVING TIME SERIES
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
A simple but efficient method has been proposed to select variables in heteroscedastic regression models. It is shown that the pseudo empirical wavelet coefficients corresponding to the significant explanatory variables in the regression models are clearly larger than those nonsignificant ones, on the basis of which a procedure is developed to select variables in regression models. The coefficients of the models are also estimated. All estimators are proved to be consistent.
Robert E. Keane
2012-01-01
Simulation modeling can be a powerful tool for generating information about historical range of variation (HRV) in landscape conditions. In this chapter, I will discuss several aspects of the use of simulation modeling to generate landscape HRV data, including (1) the advantages and disadvantages of using simulation, (2) a brief review of possible landscape models, and...
Jump-Preserving Varying-Coefficient Models for Nonlinear Time Series
Cizek, Pavel; Koo, Chao
2017-01-01
An important and widely used class of semiparametric models is formed by the varying-coefficient models. Although the varying coefficients are traditionally assumed to be smooth functions, the varying-coefficient model is considered here with the coefficient functions containing a finite set of discontinuities.
The Biasing Effects of Unmodeled ARMA Time Series Processes on Latent Growth Curve Model Estimates
Sivo, Stephen; Fan, Xitao; Witta, Lea
2005-01-01
The purpose of this study was to evaluate the robustness of estimated growth curve models when there is stationary autocorrelation among manifest variable errors. The results suggest that when, in practice, growth curve models are fitted to longitudinal data, alternative rival hypotheses to consider would include growth models that also specify…
Xu, Xijin; Tang, Qian; Xia, Haiyue; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-04-01
Chaotic time series prediction based on nonlinear systems has shown superior performance in the prediction field. We studied prenatal exposure to polychlorinated biphenyls (PCBs) by chaotic time series prediction, using the least squares self-exciting threshold autoregressive (SEATR) model on umbilical cord blood in an electronic waste (e-waste) contaminated area. The specific prediction steps based on the proposed methods for prenatal PCB exposure were put forward, and the proposed scheme's validity was further verified by numerical simulation experiments. The experimental results show that: 1) seven kinds of PCB congeners correlate negatively with five different indices of birth status: newborn weight, height, gestational age, Apgar score, and anogenital distance; 2) the prenatal PCB-exposed group is at greater risk than the reference group; 3) PCBs increasingly accumulate with time in newborns; and 4) the possibility of newborns suffering from related diseases in the future is greater. The desirable numerical simulation results demonstrate the feasibility of applying mathematical models in the environmental toxicology field.
Directory of Open Access Journals (Sweden)
Ming Dong
2010-01-01
Full Text Available The primary objective of engineering asset management is to optimize an asset's service delivery potential and to minimize the related risks and costs over its entire life through the development and application of asset health and usage management, in which health and reliability prediction plays an important role. In real-life situations where an engineering asset operates under dynamic operational and environmental conditions, the lifetime of an engineering asset is generally described as monitored nonlinear time-series data and is subject to high levels of uncertainty and unpredictability. It has been shown that the application of data mining techniques is very useful for extracting relevant features which can be used as parameters for asset diagnosis and prognosis. In this paper, a tutorial on nonlinear time-series data mining in engineering asset health and reliability prediction is given. Besides providing an overview of health and reliability prediction techniques for engineering assets, this tutorial focuses on concepts, models, algorithms, and applications of hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs) in engineering asset health prognosis, which are representative of recent engineering asset health prediction techniques.
A Time Series Forecasting Method
Directory of Open Access Journals (Sweden)
Wang Zhao-Yu
2017-01-01
Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
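The incremental clustering and k-nearest-neighbour estimation steps described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the distance threshold, the centroid-based similarity test and all function names are assumptions, and the per-cluster weight learning of the paper's weighted variant is omitted.

```python
import math

def self_constructing_cluster(patterns, threshold=0.5):
    """Incrementally assign each pattern to its nearest cluster centroid,
    or start a new cluster when no centroid is close enough (simplified:
    the paper's weighted variant also learns per-cluster weights and
    reassigns patterns, both omitted here)."""
    clusters = []  # each cluster is a list of member patterns
    for p in patterns:
        best, best_d = None, float("inf")
        for c in clusters:
            centroid = [sum(dim) / len(c) for dim in zip(*c)]
            d = math.dist(p, centroid)
            if d < best_d:
                best, best_d = c, d
        if best is None or best_d > threshold:
            clusters.append([p])   # not similar enough: new cluster
        else:
            best.append(p)         # similar enough: join nearest cluster
    return clusters

def knn_forecast(train, query, k=3):
    """Estimate the next value as the inverse-distance-weighted mean of
    the targets of the k training patterns nearest to the input pattern."""
    nearest = sorted(train, key=lambda xy: math.dist(xy[0], query))[:k]
    weights = [1.0 / (1e-9 + math.dist(x, query)) for x, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)
```

In the abstract's scheme, the training patterns would first be grouped by the clustering step and the k neighbours then drawn from the clustered data; here the two functions are shown independently for clarity.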
Effective Feature Preprocessing for Time Series Forecasting
DEFF Research Database (Denmark)
Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao
2006-01-01
Time series forecasting is an important area in data mining research. Feature preprocessing techniques have a significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting, there is so far no systematic research to study and compare their performance. How to select effective feature preprocessing techniques for a forecasting model remains a problem. In this paper, the authors conduct a comprehensive study of existing feature preprocessing techniques to evaluate their empirical performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time…
Davis, Richard A
2012-01-01
This paper studies theory and inference related to a class of time series models that incorporates nonlinear dynamics. It is assumed that the observations follow a one-parameter exponential family of distributions given an accompanying process that evolves as a function of lagged observations. We employ an iterated random function approach and a special coupling technique to show that, under suitable conditions on the parameter space, the conditional mean process is a geometric moment contracting Markov chain and that the observation process is absolutely regular with geometrically decaying coefficients. Moreover the asymptotic theory of the maximum likelihood estimates of the parameters is established under some mild assumptions. These models are applied to two examples; the first is the number of transactions per minute of Ericsson stock and the second is related to return times of extreme events of Goldman Sachs Group stock.
Forecasting ocean wave energy: A Comparison of the ECMWF wave model with time series methods
DEFF Research Database (Denmark)
Reikard, Gordon; Pinson, Pierre; Bidlot, Jean
2011-01-01
Recently, the technology has been developed to make wave farms commercially viable. Since electricity is perishable, utilities will be interested in forecasting ocean wave energy. The horizons involved in short-term management of power grids range from as little as a few hours to as long as several days. In selecting a method, the forecaster has a choice between physics-based models and statistical techniques. A further idea is to combine both types of models. This paper analyzes the forecasting properties of a well-known physics-based model, the European Center for Medium-Range Weather Forecasts (ECMWF) wave model … energy flux. In the initial tests, the ECMWF model and the statistical models are compared directly. The statistical models do better at short horizons, producing more accurate forecasts in the 1–5 h range. The ECMWF model is superior at longer horizons. The convergence point, at which the two methods…
A New Approach to Improve Accuracy of Grey Model GMC(1,n) in Time Series Prediction
Directory of Open Access Journals (Sweden)
Sompop Moonchai
2015-01-01
Full Text Available This paper presents a modified grey model GMC(1,n) for use in systems that involve one dependent system behavior and n−1 relative factors. The proposed model was developed from the conventional GMC(1,n) model in order to improve its prediction accuracy by modifying the formula for calculating the background value, the system of parameter estimation, and the model prediction equation. The modified GMC(1,n) model was verified by two cases: the study of forecasting CO2 emission in Thailand and forecasting electricity consumption in Thailand. The results demonstrated that the modified GMC(1,n) model was able to achieve higher fitting and prediction accuracy compared with the conventional GMC(1,n) and D-GMC(1,n) models.
Martinez, Josue G.
2013-06-01
We describe a new approach to analyze chirp syllables of free-tailed bats from two regions of Texas in which they are predominant: Austin and College Station. Our goal is to characterize any systematic regional differences in the mating chirps and assess whether individual bats have signature chirps. The data are analyzed by modeling spectrograms of the chirps as responses in a Bayesian functional mixed model. Given the variable chirp lengths, we compute the spectrograms on a relative time scale interpretable as the relative chirp position, using a variable window overlap based on chirp length. We use 2D wavelet transforms to capture correlation within the spectrogram in our modeling and obtain adaptive regularization of the estimates and inference for the region-specific spectrograms. Our model includes random effect spectrograms at the bat level to account for correlation among chirps from the same bat, and to assess relative variability in chirp spectrograms within and between bats. The modeling of spectrograms using functional mixed models is a general approach for the analysis of replicated nonstationary time series, such as our acoustical signals, to relate aspects of the signals to various predictors, while accounting for between-signal structure. This can be done on raw spectrograms when all signals are of the same length, and can be done using spectrograms defined on a relative time scale for signals of variable length in settings where the idea of defining correspondence across signals based on relative position is sensible.
A study of finite mixture model: Bayesian approach on financial time series data
Phoong, Seuk-Yen; Ismail, Mohd Tahir
2014-07-01
Recently, statisticians have emphasized fitting finite mixture models using the Bayesian method. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical method used to fit the mixture model. The Bayesian method is widely used because it has asymptotic properties which provide remarkable results. In addition, the Bayesian method also shows a consistency characteristic, which means the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is studied by using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines and Indonesia. Lastly, the results showed that there is a negative effect between rubber prices and stock market prices for all selected countries.
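The Bayesian Information Criterion used above to choose the number of mixture components trades goodness of fit against model complexity. A minimal sketch of the generic criterion (the function name and the parameter-counting convention are assumptions; `k_params` would be the total number of free parameters of the k-component mixture):

```python
import math

def bic(log_likelihood, k_params, n_obs):
    """BIC = k * ln(n) - 2 * lnL; the model with the lowest BIC is
    preferred, penalizing mixtures with more components."""
    return k_params * math.log(n_obs) - 2.0 * log_likelihood
```

Choosing the number of components then amounts to fitting mixtures with k = 1, 2, … components and keeping the k with the smallest BIC.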
Directory of Open Access Journals (Sweden)
Georgia Doxani
2015-10-01
Full Text Available The Sentinel missions have been designed to support the operational services of the Copernicus program, ensuring long-term availability of data for a wide range of spectral, spatial and temporal resolutions. In particular, Sentinel-2 (S-2 data with improved high spatial resolution and higher revisit frequency (five days with the pair of satellites in operation will play a fundamental role in recording land cover types and monitoring land cover changes at regular intervals. Nevertheless, cloud coverage usually hinders the time series availability and consequently the continuous land surface monitoring. In an attempt to alleviate this limitation, the synergistic use of instruments with different features is investigated, aiming at the future synergy of the S-2 MultiSpectral Instrument (MSI and Sentinel-3 (S-3 Ocean and Land Colour Instrument (OLCI. To that end, an unmixing model is proposed with the intention of integrating the benefits of the two Sentinel missions, when both in orbit, in one composite image. The main goal is to fill the data gaps in the S-2 record, based on the more frequent information of the S-3 time series. The proposed fusion model has been applied on MODIS (MOD09GA L2G and SPOT4 (Take 5 data and the experimental results have demonstrated that the approach has high potential. However, the different acquisition characteristics of the sensors, i.e. illumination and viewing geometry, should be taken into consideration and bidirectional effects correction has to be performed in order to reduce noise in the reflectance time series.
Analyzing Developmental Processes on an Individual Level Using Nonstationary Time Series Modeling
Molenaar, Peter C. M.; Sinclair, Katerina O.; Rovine, Michael J.; Ram, Nilam; Corneal, Sherry E.
2009-01-01
Individuals change over time, often in complex ways. Generally, studies of change over time have combined individuals into groups for analysis, which is inappropriate in most, if not all, studies of development. The authors explain how to identify appropriate levels of analysis (individual vs. group) and demonstrate how to estimate changes in…
FUZZY MODEL OPTIMIZATION FOR TIME SERIES DATA USING A TRANSLATION IN THE EXTENT OF MEAN ERROR
Directory of Open Access Journals (Sweden)
Nurhayadi
2014-01-01
Full Text Available Recently, many researchers have written about the prediction of stock prices, electricity load demand and academic enrollment using fuzzy methods. In general, however, such modeling does not yet consider the position of the model relative to the actual data, which means that the error is not handled optimally. Error that is not managed well can reduce the accuracy of the forecasting. Therefore, this paper discusses reducing error using a model translation. The error to be reduced is the Mean Square Error (MSE). Here, the analysis is done mathematically, and the empirical study is done by applying the translation to a fuzzy model for enrollment forecasting at the University of Alabama. The results of this analysis show that the translation in the extent of mean error can reduce the MSE.
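The core of the mean-error translation can be illustrated independently of the fuzzy model: shifting any model's outputs by the mean residual is the vertical translation that minimizes MSE. A small sketch under that general principle (function names are assumptions, and the fuzzy model itself is not reproduced):

```python
def mse(actual, pred):
    """Mean square error between observations and predictions."""
    return sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)

def translate_by_mean_error(actual, pred):
    """Shift predictions by the mean residual. This vertical translation
    is the shift that minimizes MSE, which is the principle behind
    correcting the model's position relative to the actual data."""
    shift = sum(a - p for a, p in zip(actual, pred)) / len(actual)
    return [p + shift for p in pred]
```

After the translation the residuals have zero mean, so no further constant shift can lower the MSE.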
Abghari, H.; van de Giesen, N.; Mahdavi, M.; Salajegheh, A.
2009-04-01
Artificial intelligence modeling of nonstationary rainfall-runoff has some restrictions in simulation accuracy due to the complexity and nonlinearity of training patterns. Preprocessing of the training dataset can establish the homogeneity of rainfall-runoff patterns before modeling. In this presentation, a new hybrid model of artificial intelligence in conjunction with clustering is introduced and applied to flow prediction. Simulation of the Nazloochaei river flow in North-West Iran was the case used for development of a PNN-RBF model. The PNN classifies the training dataset into two groups based on Parzen theory using unsupervised classification. Subsequently, each data group is used to train and test two RBF networks, and the results are compared to the application of all data in a single RBF network without classification. Results show that classification of rainfall-runoff patterns using the PNN and prediction of runoff with the RBF increase the prediction precision of the networks. Keywords: Probabilistic Neural Network, Radial Basis Function Neural Network, Parzen theory, River Flow Prediction
A time series model of the occurrence of gastric dilatation-volvulus in a population of dogs
Directory of Open Access Journals (Sweden)
Moore George E
2009-04-01
Full Text Available Abstract Background Gastric dilatation-volvulus (GDV) is a life-threatening condition of mammals, with increased risk in large-breed dogs. The study of its etiological factors is difficult due to the variety of possible living conditions. The association between meteorological events and the occurrence of GDV has been postulated but remains unclear. This study introduces the binary time series approach to the investigation of possible meteorological risk factors for GDV. Data collected in a population of high-risk working dogs in Texas were used. Results The minimum and maximum daily atmospheric pressure on the day of the GDV event and the maximum daily atmospheric pressure on the day before the GDV event were positively associated with the probability of GDV. All of the odds/multiplicative factors of a day being a GDV day were interpreted conditionally on past GDV occurrences. There was minimal difference between the binary and Poisson generalized linear models. Conclusion Time series modeling provided a novel method for evaluating the association between meteorological variables and GDV in a large population of dogs. Appropriate application of this method was enhanced by a common environment for the dogs and the availability of meteorological data. The potential interaction between weather changes and patient risk factors for GDV deserves further investigation.
Introduction to time series analysis and forecasting
Montgomery, Douglas C; Kulahci, Murat
2015-01-01
Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both
Institute of Scientific and Technical Information of China (English)
Sudarat Chadsuthi; Charin Modchang; Yongwimon Lenbury; Sopon Iamsirithaworn; Wannapong Triampo
2012-01-01
ABSTRACT Objective: To study the number of leptospirosis cases in relation to the seasonal pattern, and its association with climate factors. Methods: Time series analysis was used to study the time variations in the number of leptospirosis cases. The Autoregressive Integrated Moving Average (ARIMA) model was used in data curve fitting and predicting the next leptospirosis cases. Results: We found that the amount of rainfall was correlated to leptospirosis cases in both regions of interest, namely the northern and northeastern regions of Thailand, while the temperature played a role in the northeastern region only. The use of the multivariate ARIMA (ARIMAX) model showed that factoring in rainfall (with an 8-month lag) yields the best model for the northern region, while the model which factors in rainfall (with a 10-month lag) and temperature (with an 8-month lag) was the best for the northeastern region. Conclusions: The models are able to show the trend in leptospirosis cases and closely fit the recorded data in both regions. The models can also be used to predict the next seasonal peak quite accurately.
Directory of Open Access Journals (Sweden)
Zahra Hojjati Tavassoli
2016-07-01
Full Text Available During a construction project life cycle, project costs and time estimations contribute greatly to baseline scheduling. Besides, schedule risk analysis and project control are also influenced by the above factors. Although many papers have offered estimation techniques, little attempt has been made to generate project time series data as daily progressive estimations in different project environments, which could help researchers in generating general and customized formulae in further studies. This paper is an attempt to introduce a new simulation approach to reflect data regarding the time series progress of the project, considering the specifications and the complexity of the project and the environment where the project is performed. Moreover, this simulator can equip project managers with estimated information, which reassures them about the execution stages of the project even when they lack historical data. A case study is presented to show the usefulness of the model and its applicability in practice. In this study, singular spectrum analysis has been employed to analyze the simulated outputs, and the results are separated based on their signal and noise trends. The signal trend is used as a point of reference to compare the outputs of a simulation employing the S-curve technique and the formulae corresponding to earned value management, as well as the life of a given project.
An introduction to state space time series analysis.
Commandeur, J.J.F. & Koopman, S.J.
2007-01-01
Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor with state space methods. The only background required in order to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is...
Model selection in time series studies of influenza-associated mortality.
Directory of Open Access Journals (Sweden)
Xi-Ling Wang
Full Text Available BACKGROUND: Poisson regression modeling has been widely used to estimate influenza-associated disease burden, as it has the advantage of adjusting for multiple seasonal confounders. However, few studies have discussed how to judge the adequacy of confounding adjustment. This study aims to compare the performance of commonly adopted model selection criteria in terms of providing a reliable and valid estimate of the health impact of influenza. METHODS: We assessed four model selection criteria: quasi-Akaike information criterion (QAIC), quasi-Bayesian information criterion (QBIC), partial autocorrelation function of residuals (PACF), and generalized cross-validation (GCV), by separately applying them to select the Poisson model best fitted to mortality datasets that were simulated under different assumptions of seasonal confounding. The performance of these criteria was evaluated by the bias and root-mean-square error (RMSE) of estimates from the pre-determined coefficients of the influenza proxy variable. These four criteria were subsequently applied to an empirical hospitalization dataset to confirm the findings of the simulation study. RESULTS: GCV consistently provided smaller biases and RMSEs for the influenza coefficient estimates than QAIC, QBIC and PACF under the different simulation scenarios. Sensitivity analysis of different pre-determined influenza coefficients, study periods and lag weeks showed that GCV consistently outperformed the other criteria. Similar results were found in applying these selection criteria to estimate influenza-associated hospitalization. CONCLUSIONS: The GCV criterion is recommended for selection of Poisson models to estimate influenza-associated mortality and morbidity burden with proper adjustment for confounding. These findings should help standardize the Poisson modeling approach for influenza disease burden studies.
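As a rough illustration of the GCV criterion favored by the study: GCV scores a fitted model by its residual sum of squares, inflated by its effective degrees of freedom. The abstract's models are Poisson regressions; this sketch uses the simpler least-squares form of GCV, and the function names and comparison setup are assumptions for illustration.

```python
def gcv_score(y, fitted, edf):
    """Generalized cross-validation: n * RSS / (n - edf)^2.
    Lower is better; edf is the effective degrees of freedom of the fit."""
    n = len(y)
    rss = sum((a - f) ** 2 for a, f in zip(y, fitted))
    return n * rss / (n - edf) ** 2

def fit_mean(y):
    """Constant fit (edf = 1)."""
    m = sum(y) / len(y)
    return [m] * len(y)

def fit_line(y):
    """Simple linear regression on the time index (edf = 2)."""
    n = len(y)
    xbar, ybar = (n - 1) / 2, sum(y) / n
    b = (sum((i - xbar) * (v - ybar) for i, v in enumerate(y))
         / sum((i - xbar) ** 2 for i in range(n)))
    return [ybar + b * (i - xbar) for i in range(n)]
```

Comparing `gcv_score` across candidate fits picks the model whose extra flexibility is actually supported by the data, which is the role the criterion plays in the paper's Poisson setting.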
Extensible biosignal metadata a model for physiological time-series data.
Brooks, David
2009-01-01
The domain-specific nature of biosignal storage formats, along with the lack of support for metadata in general-purpose biosignal libraries, has hampered the easy interchange of biosignals between disciplines and their integration with physiological modelling software. Extensible Biosignal Metadata (XBM) is introduced as a standard framework to facilitate the sharing of information between and within research groups for both experimentalists and modellers; to help establish more web-accessible biosignal repositories; and, by using semantic web technologies, to enable the discovery of knowledge by automated reasoning systems.
Lagged PM2.5 effects in mortality time series: Critical impact of covariate model
The two most common approaches to modeling the effects of air pollution on mortality are the Harvard and the Johns Hopkins (NMMAPS) approaches. These two approaches, which use different sets of covariates, result in dissimilar estimates of the effect of lagged fine particulate matter…
2010-09-30
Hyperfast Modeling of Nonlinear Ocean Waves. A. R. Osborne, Dipartimento di Fisica Generale, Università di Torino, Via Pietro Giuria 1, 10125 Torino, Italy.
J. Chen (Jinghui); M. Kobayashi (Masahito); M.J. McAleer (Michael)
2016-01-01
textabstractThe paper considers the problem as to whether financial returns have a common volatility process in the framework of stochastic volatility models that were suggested by Harvey et al. (1994). We propose a stochastic volatility version of the ARCH test proposed by Engle and Susmel (1993),
The role of initial values in nonstationary fractional time series models
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Morten Ørregaard
We consider the nonstationary fractional model $\\Delta^{d}X_{t}=\\varepsilon _{t}$ with $\\varepsilon_{t}$ i.i.d.$(0,\\sigma^{2})$ and $d>1/2$. We derive an analytical expression for the main term of the asymptotic bias of the maximum likelihood estimator of $d$ conditional on initial values, and we...
Detecting chaos from time series
Xiaofeng, Gong; Lai, C. H.
2000-02-01
In this paper, an entirely data-based method to detect chaos from a time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems, there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_{p,ε}, and the time step, p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.
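The ε_p-neighbour idea can be sketched directly: for each horizon p, count the pairs of points in the series whose trajectories stay within ε of each other for p consecutive steps. For chaotic data, the log of this average count should fall roughly linearly in p, with slope related to the KS entropy. A brute-force illustration (ε, the test series and all names are assumptions of this sketch, not the paper's procedure):

```python
def avg_neighbour_counts(series, eps, max_p):
    """For each p = 1..max_p, average over starting points the number of
    other trajectories that stay within eps for p consecutive steps.
    O(n^2 * p) brute force, intended only for small series."""
    n = len(series)
    counts = []
    for p in range(1, max_p + 1):
        total = 0
        for i in range(n - p):
            for j in range(i + 1, n - p):
                # the pair counts only if it stays eps-close at every
                # offset k = 0..p
                if all(abs(series[i + k] - series[j + k]) < eps
                       for k in range(p + 1)):
                    total += 1
        counts.append(total / (n - p))
    return counts
```

On a chaotic signal such as the logistic map, nearby trajectories separate exponentially, so the counts decay as p grows; plotting ln of the counts against p exposes the linear relationship the paper exploits.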
A wavelet method for modeling and despiking motion artifacts from resting-state fMRI time series
Patel, Ameera X.; Kundu, Prantik; Rubinov, Mikail; Jones, P. Simon; Vértes, Petra E.; Ersche, Karen D.; Suckling, John; Bullmore, Edward T.
2014-01-01
The impact of in-scanner head movement on functional magnetic resonance imaging (fMRI) signals has long been established as undesirable. These effects have been traditionally corrected by methods such as linear regression of head movement parameters. However, a number of recent independent studies have demonstrated that these techniques are insufficient to remove motion confounds, and that even small movements can spuriously bias estimates of functional connectivity. Here we propose a new data-driven, spatially-adaptive, wavelet-based method for identifying, modeling, and removing non-stationary events in fMRI time series, caused by head movement, without the need for data scrubbing. This method involves the addition of just one extra step, the Wavelet Despike, in standard pre-processing pipelines. With this method, we demonstrate robust removal of a range of different motion artifacts and motion-related biases including distance-dependent connectivity artifacts, at a group and single-subject level, using a range of previously published and new diagnostic measures. The Wavelet Despike is able to accommodate the substantial spatial and temporal heterogeneity of motion artifacts and can consequently remove a range of high and low frequency artifacts from fMRI time series, that may be linearly or non-linearly related to physical movements. Our methods are demonstrated by the analysis of three cohorts of resting-state fMRI data, including two high-motion datasets: a previously published dataset on children (N = 22) and a new dataset on adults with stimulant drug dependence (N = 40). We conclude that there is a real risk of motion-related bias in connectivity analysis of fMRI data, but that this risk is generally manageable, by effective time series denoising strategies designed to attenuate synchronized signal transients induced by abrupt head movements. The Wavelet Despiking software described in this article is freely available for download at www
Hu, Wenbiao; Clements, Archie; Williams, Gail; Tong, Shilu
2010-05-01
It remains unclear whether it is possible to develop an epidemic forecasting model for the transmission of dengue fever in Queensland, Australia. The aims were to examine the potential impact of El Niño/Southern Oscillation on the transmission of dengue fever in Queensland, Australia, and to explore the possibility of developing a forecast model of dengue fever. Data on the Southern Oscillation Index (SOI), an indicator of El Niño/Southern Oscillation activity, were obtained from the Australian Bureau of Meteorology. Numbers of dengue fever cases notified and numbers of postcode areas with dengue fever cases between January 1993 and December 2005 were obtained from Queensland Health, and relevant population data were obtained from the Australian Bureau of Statistics. A multivariate Seasonal Auto-regressive Integrated Moving Average model was developed and validated by dividing the data file into two datasets: data from January 1993 to December 2003 were used to construct the model, and those from January 2004 to December 2005 were used to validate it. A decrease in the average SOI (i.e., warmer conditions) during the preceding 3-12 months was significantly associated with an increase in the monthly numbers of postcode areas with dengue fever cases (beta = -0.038; p = 0.019). Predicted values from the Seasonal Auto-regressive Integrated Moving Average model were consistent with the observed values in the validation dataset (root-mean-square percentage error: 1.93%). Climate variability is directly and/or indirectly associated with dengue transmission, and the development of an SOI-based epidemic forecasting system is possible for dengue fever in Queensland, Australia.
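The validation statistic quoted above, the root-mean-square percentage error, can be written out as follows. This is a common textbook form; the exact definition used in the study may differ, and the function name is an assumption.

```python
import math

def rmspe(observed, predicted):
    """Root-mean-square percentage error (in %): penalizes relative,
    not absolute, deviations, so it is comparable across scales."""
    terms = [((o - p) / o) ** 2 for o, p in zip(observed, predicted)]
    return 100.0 * math.sqrt(sum(terms) / len(terms))
```

A value of 1.93%, as reported for the validation dataset, means the predictions typically deviated from the observations by about two percent.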
Evrendilek, F.; Karakaya, N.
2014-06-01
Continuous time-series measurements of diel dissolved oxygen (DO) through online sensors are vital to better understanding and management of metabolism of lake ecosystems, but are prone to noise. Discrete wavelet transforms (DWT) with the orthogonal Symmlet and the semiorthogonal Chui-Wang B-spline were compared in denoising diel, daytime and nighttime dynamics of DO, water temperature, pH, and chlorophyll-a. Predictive efficacies of multiple non-linear regression (MNLR) models of DO dynamics were evaluated with or without DWT denoising of either the response variable alone or all the response and explanatory variables. The combined use of the B-spline-based denoising of all the variables and the temporally partitioned data improved both the predictive power and the errors of the MNLR models better than the use of Symmlet DWT denoising of DO only or all the variables with or without the temporal partitioning.
Guarnaccia, Claudio; Quartieri, Joseph; Tepedino, Carmine
2017-06-01
The dangerous effect of noise on human health is well known. Both the auditory and non-auditory effects are largely documented in the literature and represent an important hazard in human activities. Particular care is devoted to road traffic noise, since it grows with the growth of residential, industrial and commercial areas. For these reasons, it is important to develop effective models able to predict the noise in a certain area. In this paper, a hybrid predictive model is presented. The model is based on the mixing of two different approaches: Time Series Analysis (TSA) and Artificial Neural Networks (ANN). The TSA model is based on the evaluation of trend and seasonality in the data, while the ANN model is based on the capacity of the network to "learn" the behavior of the data. The mixed approach consists in the evaluation of noise levels by means of TSA and, once the differences (residuals) between the TSA estimations and the observed data have been calculated, in the training of an ANN on the residuals. This hybrid model exhibits interesting features and results, with a significant variation related to the number of steps forward in the prediction. It is shown that the best results, in terms of prediction, are achieved predicting one step ahead into the future. A 7-day prediction can also be performed, with a slightly greater error but a larger range of prediction with respect to the single-day-ahead predictive model.
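The TSA half of the hybrid scheme, estimating trend and seasonality and then passing the residuals to a network, can be sketched as below. The linear-trend-plus-seasonal-means decomposition and all names are assumptions for illustration; the paper's actual TSA model and the ANN trained on the residuals are not reproduced.

```python
def tsa_decompose(series, period):
    """Fit a linear trend plus seasonal means; return (fitted, residuals).
    In the hybrid scheme, a neural network would then be trained on the
    residuals (network omitted in this sketch)."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    # least-squares slope of the linear trend on the time index
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(series))
             / sum((i - xbar) ** 2 for i in range(n)))
    trend = [ybar + slope * (i - xbar) for i in range(n)]
    detrended = [y - t for y, t in zip(series, trend)]
    # seasonal component: mean of the detrended values at each phase
    seasonal = [sum(detrended[i::period]) / len(detrended[i::period])
                for i in range(period)]
    fitted = [t + seasonal[i % period] for i, t in enumerate(trend)]
    residuals = [y - f for y, f in zip(series, fitted)]
    return fitted, residuals
```

Whatever structure the trend-plus-seasonality fit cannot capture remains in `residuals`, which is exactly the signal the ANN stage of the hybrid model is asked to learn.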
Directory of Open Access Journals (Sweden)
Farshad Fathian
2017-01-01
Full Text Available Introduction: Time series models are generally categorized as data-driven or mathematically-based methods. These models are known as some of the most important tools in the modeling and forecasting of hydrological processes, and are used in the design and scientific management of water resources projects. On the other hand, a better understanding of the river flow process is vital for appropriate streamflow modeling and forecasting. One of the main concerns of hydrological time series modeling is whether the hydrologic variable is governed by linear or nonlinear models through time. Although linear time series models have been widely applied in hydrology research, there has been some recent increasing interest in the application of nonlinear time series approaches. The threshold autoregressive (TAR) method is frequently applied in modeling the mean (first-order moment) of financial and economic time series. This type of model has not yet received considerable attention from the hydrological community. The main purposes of this paper are to analyze and discuss stochastic modeling of daily river flow time series of the study area using linear (such as ARMA: autoregressive moving average) and nonlinear (such as two- and three-regime TAR) models. Materials and Methods: The study area consists of four sub-basins, namely Saghez Chai, Jighato Chai, Khorkhoreh Chai and Sarogh Chai from west to east, respectively, which discharge water into the Zarrineh Roud dam reservoir. River flow time series of 6 hydro-gauge stations located on the upstream rivers of the Zarrineh Roud dam (located in the southern part of the Urmia Lake basin) were considered for modeling purposes. All data series used here start on January 1, 1997, and end on December 31, 2011. In this study, the daily river flow data from January 1, 1997 to December 31, 2009 (13 years) were chosen for calibration, and data from January 1, 2010 to December 31, 2011…
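A two-regime TAR model of the kind discussed can be illustrated with a short simulation and a per-regime least-squares fit. The threshold, the delay of one, the coefficient values and the function names are all assumptions of this sketch, not values from the study.

```python
import random

def simulate_tar(n, phi_low, phi_high, r=0.0, seed=1):
    """Simulate a two-regime TAR(1): the AR coefficient switches
    depending on whether the previous value is below or above the
    threshold r."""
    random.seed(seed)
    x = [0.0]
    for _ in range(n - 1):
        phi = phi_low if x[-1] <= r else phi_high
        x.append(phi * x[-1] + random.gauss(0, 1))
    return x

def fit_tar(x, r=0.0):
    """Estimate each regime's AR coefficient by per-regime least squares
    (through the origin), assuming the threshold r is known."""
    estimates = []
    for low in (True, False):
        pairs = [(a, b) for a, b in zip(x, x[1:]) if (a <= r) == low]
        estimates.append(sum(a * b for a, b in pairs)
                         / sum(a * a for a, _ in pairs))
    return estimates  # [phi_low_hat, phi_high_hat]
```

Splitting the sample by the threshold variable and fitting each regime separately is the basic estimation idea behind the two- and three-regime TAR models the abstract applies to daily river flow.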
Identifiability and Problems of Model Selection for Time-Series Analysis in Econometrics.
1980-01-01
feedback", in Proc. 1971 NRL-MRC Conference on Ordinary Differential Equations, edited by L. Weiss, Academic Press, pages 459-471. ...abstract sense. The difficulty is nonuniqueness, not identifiability. Third, there is the question of parametrization of models. In econometrics... with the equation, where u_t is a linear stochastic process whose values are generated with the aid of
A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series
Institute of Scientific and Technical Information of China (English)
孙青华; 张世英; 梁雄健
2003-01-01
In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system, and then proceed to the detection procedure for structural changes in a system whose components are of long memory. This approach is adaptive and model-free and can simulate the individual activities of the system's participants; it therefore has a strong ability to recognize the operating mechanism of the system. Based on this prior cognition of the system, a test statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed approach.
From time series analysis to a biomechanical multibody model of the human eye
Energy Technology Data Exchange (ETDEWEB)
Pascolo, P. [Dipartimento di Ingegneria Civile, Universita di Udine, Udine (Italy); Dipartimento di Bioingegneria, CISM, Udine (Italy)], E-mail: paolo.pascolo@uniud.it; Carniel, R. [Dipartimento di Georisorse e Territorio, Universita di Udine, Udine (Italy)], E-mail: roberto.carniel@uniud.it
2009-04-30
A mechanical model of the human eye is presented aimed at estimating the level of muscular activation. The applicability of the model in the biomedical field is discussed. Human eye movements studied in the laboratory are compared with the ones produced by a virtual eye described in kinematical terms and subject to the dynamics of six actuators, as many as the muscular systems devoted to the eye motion control. The definition of an error function between the experimental and the numerical response and the application of a suitable law that links activation and muscular force are at the base of the proposed methodology. The aim is the definition of a simple conceptual tool that could help the specialist in the diagnosis of potential physiological disturbances of saccadic and nystagmic movements but can also be extended in a second phase when more sophisticated data become available. The work is part of a collaboration between the Functional Mechanics Laboratory of the University and the Neurophysiopatology Laboratory of the 'S. Maria della Misericordia' Hospital in Udine, Italy.
Model Pemesanan Bahan Baku menggunakan Peramalan Time Series dengan CB Predictor
Directory of Open Access Journals (Sweden)
Tri Pujadi
2014-12-01
A company that manufactures finished goods often faces a shortage of raw materials because the raw material order quantity is determined improperly, by intuition, and because there are no raw material inventory reserves. This results in costs, because production processes are inhibited or an emergency procurement of raw materials must be carried out to meet customer orders. The company seeks a method for determining the raw material order quantity, comprising the steps of (1) collecting historical data on raw material use, (2) forecasting raw material needs, and (3) calculating the order quantity from the forecast data by comparing a deterministic method and a probabilistic method. Safety stock is calculated for each raw material so as to cope with situations outside normal conditions, such as a surge in orders. In its design, the system is developed using the Unified Modeling Language (UML) on the basis of object-oriented analysis and design concepts. With the proposed implementation of the information system, the company can estimate the need for raw materials more quickly and accurately, and can determine an order quantity tailored to its needs, so that the costs associated with ordering and storing raw materials can be minimized.
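The deterministic-versus-probabilistic comparison and the safety-stock step can be sketched with textbook inventory formulas: the economic order quantity (EOQ) for the deterministic order size, and z · sigma · sqrt(L) for the safety stock. The demand figures, costs, lead time, and service level below are hypothetical, not taken from the paper.

```python
import math
import statistics

def eoq(annual_demand, order_cost, holding_cost):
    # economic order quantity (deterministic model): sqrt(2DS/H)
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def safety_stock(daily_demand, lead_time_days, z=1.65):
    # probabilistic buffer: z * sigma_daily * sqrt(lead time);
    # z = 1.65 corresponds to roughly a 95% service level
    sigma = statistics.stdev(daily_demand)
    return z * sigma * math.sqrt(lead_time_days)

def reorder_point(daily_demand, lead_time_days, z=1.65):
    # expected lead-time demand plus the safety stock
    mean_d = statistics.mean(daily_demand)
    return mean_d * lead_time_days + safety_stock(daily_demand, lead_time_days, z)

demand = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14]   # hypothetical daily usage
q = eoq(annual_demand=sum(demand) * 36.5, order_cost=50.0, holding_cost=2.0)
rp = reorder_point(demand, lead_time_days=4)
print(round(q, 1), round(rp, 1))
```

The probabilistic reorder point exceeds plain lead-time demand by exactly the safety stock, which is the buffer the abstract describes for order surges.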
Haynes, S E
1983-10-01
It is widely known that linear restrictions involve bias. What is not known is that some linear restrictions are especially dangerous for hypothesis testing. For some, the expected value of the restricted coefficient does not lie between (among) the true unconstrained coefficients, which implies that the estimate is not a simple average of these coefficients. In this paper, the danger is examined regarding the additive linear restriction almost universally imposed in statistical research: the restriction of symmetry. Symmetry implies that the response of the dependent variable to a unit decrease in an explanatory variable is identical, but of opposite sign, to the response to a unit increase. The 1st section of the paper demonstrates theoretically that a coefficient restricted by symmetry (unlike coefficients embodying other additive restrictions) is not a simple average of the unconstrained coefficients because the relevant interacted variables are inversely correlated by definition. The next section shows that, under the restriction of symmetry, fertility in Finland from 1885-1925 appears to respond in a prolonged manner to infant mortality (significant and positive with a lag of 4-6 years), suggesting a response to expected deaths. However, unconstrained estimates indicate that this finding is spurious. When the restriction is relaxed, the dominant response is rapid (significant and positive with a lag of 1-2 years) and stronger for declines in mortality, supporting an asymmetric response to actual deaths. For 2 reasons, the danger of the symmetry restriction may be especially pervasive. 1st, unlike most other linear constraints, symmetry is passively imposed merely by ignoring the possibility of asymmetry. 2nd, models in a wide range of fields--including macroeconomics (e.g., demand for money, consumption, and investment models, and the Phillips curve), international economics (e.g., intervention models of central banks), and labor economics (e.g., sticky wage
Designer networks for time series processing
DEFF Research Database (Denmark)
Svarer, C; Hansen, Lars Kai; Larsen, Jan
1993-01-01
The conventional tapped-delay neural net may be analyzed using statistical methods and the results of such analysis can be applied to model optimization. The authors review and extend efforts to demonstrate the power of this strategy within time series processing. They attempt to design compact...
Time Series Rule Discovery: Tough, not Meaningless
Struzik, Z.R.
2003-01-01
`Model free' rule discovery from data has recently been subject to considerable criticism, which has cast a shadow over the emerging discipline of time series data mining. However, other than in data mining, rule discovery has long been the subject of research in statistical physics of complex pheno
ANCOVA Procedures in Time-Series Experiments: An Illustrative Example.
Willson, Victor L.
A statistical model for analysis of multiple time-series observation is briefly outlined. The model incorporates a change parameter corresponding to intervention or interruption of the dependent series. The additional time-series are included in the model as covariates. The practical application of the procedure is illustrated with traffic…
Improving the prediction of chaotic time series
Institute of Scientific and Technical Information of China (English)
李克平; 高自友; 陈天仑
2003-01-01
One of the features of deterministic chaos is sensitivity to initial conditions. This feature limits the prediction horizons of many chaotic systems. In this paper, we propose a new prediction technique for chaotic time series. In our method, neighbouring points of the predicted point for which the corresponding local Lyapunov exponent is particularly large are discarded when estimating the local dynamics, and thus the error accumulated by the prediction algorithm is reduced. The model is tested on the convection amplitude of the Lorenz system. The simulation results indicate that the technique can improve the prediction of chaotic time series.
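The idea of discarding badly diverging neighbours before local prediction can be sketched on the logistic map. This is a stdlib-only toy: the one-step embedding and the pairwise stretch ratio used as a proxy for the local Lyapunov exponent are simplifications of the paper's method, and all parameters are arbitrary choices.

```python
def logistic_series(n, x0=0.3, r=3.9):
    # chaotic test series from the logistic map x -> r*x*(1-x)
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

def predict_next(history, query, k=12, keep=6, eps=1e-9):
    # k nearest neighbours of the query in a one-step embedding
    cands = sorted(range(len(history) - 1),
                   key=lambda i: abs(history[i] - query))[:k]
    ref = cands[0]                       # closest neighbour as reference

    def stretch(i):
        # crude local-divergence estimate: how fast the gap to the
        # reference neighbour grows over one step (Lyapunov-like proxy)
        return abs(history[i + 1] - history[ref + 1]) / \
               (abs(history[i] - history[ref]) + eps)

    # keep only the least-diverging neighbours, then average their successors
    kept = sorted(cands[1:], key=stretch)[:keep]
    succ = [history[i + 1] for i in kept + [ref]]
    return sum(succ) / len(succ)

xs = logistic_series(2000)
history, query, truth = xs[:-1], xs[-2], xs[-1]
err = abs(predict_next(history, query) - truth)
print(round(err, 4))
```

With a dense history the retained neighbours shadow the true trajectory closely, so the one-step prediction error stays small despite the chaotic dynamics.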
An Approximate Markov Model for the Wright-Fisher Diffusion and Its Application to Time Series Data.
Ferrer-Admetlla, Anna; Leuenberger, Christoph; Jensen, Jeffrey D; Wegmann, Daniel
2016-06-01
The joint and accurate inference of selection and demography from genetic data is considered a particularly challenging question in population genetics, since both processes may lead to very similar patterns of genetic diversity. However, additional information for disentangling these effects may be obtained by observing changes in allele frequencies over multiple time points. Such data are common in experimental evolution studies, as well as in the comparison of ancient and contemporary samples. Leveraging this information, however, has been computationally challenging, particularly when considering multilocus data sets. To overcome these issues, we introduce a novel, discrete approximation for diffusion processes, termed mean transition time approximation, which preserves the long-term behavior of the underlying continuous diffusion process. We then derive this approximation for the particular case of inferring selection and demography from time series data under the classic Wright-Fisher model and demonstrate that our approximation is well suited to describe allele trajectories through time, even when only a few states are used. We then develop a Bayesian inference approach to jointly infer the population size and locus-specific selection coefficients with high accuracy and further extend this model to also infer the rates of sequencing errors and mutations. We finally apply our approach to recent experimental data on the evolution of drug resistance in influenza virus, identifying likely targets of selection and finding evidence for much larger viral population sizes than previously reported.
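The classic Wright-Fisher dynamics that such time series sample — one generation of deterministic selection followed by binomial drift — can be simulated in a few lines. This is a stdlib-only sketch; the population size, selection coefficient, generation count, and seed are arbitrary illustration values.

```python
import random

def binom(n, p, rng):
    # binomial draw as a sum of Bernoulli trials (stdlib only)
    return sum(1 for _ in range(n) if rng.random() < p)

def wright_fisher_trajectory(n_pop, p0, s, n_gen, rng):
    # one selection + drift cycle per generation:
    # selection shifts the expected frequency, drift resamples 2N alleles
    p, traj = p0, [p0]
    for _ in range(n_gen):
        p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))   # selection step
        p = binom(2 * n_pop, p_sel, rng) / (2 * n_pop)  # drift step
        traj.append(p)
    return traj

rng = random.Random(42)
neutral = wright_fisher_trajectory(500, 0.5, 0.0, 100, rng)
selected = wright_fisher_trajectory(500, 0.5, 0.05, 100, rng)
print(round(neutral[-1], 2), round(selected[-1], 2))
```

A selected allele (here 2Ns = 50) climbs toward fixation while the neutral one wanders around its starting frequency — precisely the signal the paper's inference machinery extracts from multi-time-point data.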
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef; Fast, P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error: situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than a random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
Dabbakuti, J. R. K. Kumar; Venkata Ratnam, D.
2017-10-01
Precise modeling of the ionospheric Total Electron Content (TEC) is a critical aspect of Positioning, Navigation, and Timing (PNT) services intended for Global Navigation Satellite Systems (GNSS) applications, as well as Earth Observation System (EOS), satellite communication, and space weather forecasting applications. In this paper, linear time series modeling has been carried out on ionospheric TEC at two different locations, Koneru Lakshmaiah University (KLU), Guntur (geographic 16.44° N, 80.62° E; geomagnetic 7.55° N) and Bangalore (geographic 12.97° N, 77.59° E; geomagnetic 4.53° N), in the northern low-latitude region, for the year 2013 in the 24th solar cycle. The impact of solar and geomagnetic activity on periodic oscillations of TEC has been investigated. Results confirm that the correlation coefficient between the TEC estimated from the linear model and the observed GPS-TEC is around 93%. Solar activity is the key component that influences daily averaged ionospheric TEC, while the periodic component reveals the seasonal dependency of TEC. Furthermore, it is observed that the influence of the geomagnetic activity component on TEC differs between the two latitudes. The accuracy of the model has been assessed by comparing the International Reference Ionosphere (IRI) 2012 model TEC with the TEC measurements. Moreover, the absence of a winter anomaly is remarkable, as determined by the Root Mean Square Error (RMSE) between the linear model TEC and GPS-TEC. In contrast, the IRI2012 model TEC evidently failed to predict the absence of the winter anomaly in the Equatorial Ionization Anomaly (EIA) crest region. The outcome of this work will be useful for improving ionospheric now-casting models under various geophysical conditions.
Directory of Open Access Journals (Sweden)
Y. Ye
2009-10-01
A one-dimensional model of Fe speciation and biogeochemistry, coupled with the General Ocean Turbulence Model (GOTM) and an NPZD-type ecosystem model, is applied to the Tropical Eastern North Atlantic Time-Series Observatory (TENATSO) site. Among the diverse processes affecting Fe speciation, this study focuses on investigating the role of dust particles in removing dissolved iron (DFe) through a more complex description of particle aggregation and sinking, and on explaining the abundance of organic Fe-binding ligands by modelling their origin and fate.
The vertical distribution of different particle classes in the model shows high sensitivity to changing aggregation rates. Using the aggregation rates from the sensitivity study in this work, modelled particle fluxes are close to observations, with dust particles dominating near the surface and aggregates deeper in the water column. POC export at 1000 m is a little higher than regional sediment trap measurements, suggesting further improvement of modelling particle aggregation, sinking or remineralisation.
Modelled strong ligands have a high abundance near the surface and decline rapidly below the deep chlorophyll maximum, showing qualitative similarity to observations. Without production of strong ligands, phytoplankton concentration falls to 0 within the first 2 years in the model integration, caused by strong Fe-limitation. A nudging of total weak ligands towards a constant value is required for reproducing the observed nutrient-like profiles, assuming a decay time of 7 years for weak ligands. This indicates that weak ligands have a longer decay time and therefore cannot be modelled adequately in a one-dimensional model.
The modelled DFe profile is strongly influenced by particle concentration and vertical distribution, because the most important removal of DFe in deeper waters is colloid formation and aggregation. Redissolution of particulate iron is required to reproduce an
Trend prediction of chaotic time series
Institute of Scientific and Technical Information of China (English)
Li Aiguo; Zhao Cai; Li Zhanhuai
2007-01-01
To predict the trend of chaotic time series in the time series analysis and time series data mining fields, a novel algorithm for predicting the trend of chaotic time series is presented, and an on-line segmenting algorithm is proposed to convert a time series into a binary string according to the ascending or descending trend of each subsequence. The on-line segmenting algorithm is independent of prior knowledge about the time series. The naive Bayesian algorithm is then employed to predict the trend of the chaotic time series from the binary string. The experimental results on three chaotic time series demonstrate that the proposed method predicts the ascending or descending trend of chaotic time series with few errors.
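The pipeline described — binarise the trends, then predict the next symbol with naive Bayes — might be sketched as follows on a logistic-map series. The context length and Laplace smoothing are assumptions, and the paper's on-line segmenting algorithm is replaced here by a simple pointwise up/down rule.

```python
from collections import Counter

def to_trend_string(series):
    # 'U' if the series ascends between consecutive points, else 'D'
    return ''.join('U' if b > a else 'D' for a, b in zip(series, series[1:]))

def train_nb(bits, k):
    # naive Bayes counts: class = next symbol, features = previous k symbols
    prior, cond = Counter(), Counter()
    for i in range(k, len(bits)):
        c = bits[i]
        prior[c] += 1
        for j in range(k):
            cond[(j, bits[i - k + j], c)] += 1
    return prior, cond

def predict_nb(prior, cond, context, k):
    # Laplace-smoothed class score under the independence assumption
    def score(c):
        p = float(prior[c] + 1)
        for j in range(k):
            p *= (cond[(j, context[j], c)] + 1) / (prior[c] + 2)
        return p
    return max('UD', key=score)

# chaotic source: logistic map, whose up/down pattern is partly predictable
x = [0.3]
for _ in range(600):
    x.append(3.9 * x[-1] * (1 - x[-1]))
bits = to_trend_string(x)
train, test = bits[:500], bits[500:]
prior, cond = train_nb(train, k=3)
hits = sum(predict_nb(prior, cond, test[i:i + 3], 3) == test[i + 3]
           for i in range(len(test) - 3))
acc = hits / (len(test) - 3)
print(round(acc, 2))
```

Even this crude binarisation beats coin-flipping on the logistic map, because a descending step is almost always followed by an ascending one.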
Background: Simulation studies have previously demonstrated that time-series analyses using smoothing splines correctly model null health-air pollution associations. Methods: We repeatedly simulated season, meteorology and air quality for the metropolitan area of Atlanta from cyc...
Smirnov, D A; Velazquez, J L P; Wennberg, R A; Bezruchko, B P
2005-01-01
We demonstrate in numerical experiments that estimators of strength and directionality of coupling between oscillators based on modeling of their phase dynamics [D.A. Smirnov and B.P. Bezruchko, Phys. Rev. E 68, 046209 (2003)] are widely applicable. Namely, although the expressions for the estimators and their confidence bands are derived for linear uncoupled oscillators under the influence of independent sources of Gaussian white noise, they turn out to allow reliable characterization of coupling from relatively short time series for different properties of noise, significant phase nonlinearity of the oscillators, and non-vanishing coupling between them. We apply the estimators to analyze a two-channel human intracranial epileptic electroencephalogram (EEG) recording with the purpose of epileptic focus localization.
The dynamic atmospheres of Mira stars: comparing the CODEX models to PTI time series of TU And
Hillen, M; Degroote, P; Acke, B; van Winckel, H
2012-01-01
Our comprehension of stellar evolution on the AGB still faces many difficulties. To improve on this, a quantified understanding of large-amplitude pulsator atmospheres and interpretation in terms of their fundamental stellar parameters are essential. We wish to evaluate the effectiveness of the recently released CODEX dynamical model atmospheres in representing M-type Mira variables through a confrontation with the time-resolved spectro-photometric and interferometric PTI data set of TU And. We calibrated the interferometric K-band time series to high precision. This results in 50 nights of observations, covering 8 subsequent pulsation cycles. At each phase, the flux at 2.2$\\mu$m is obtained, along with the spectral shape and visibility points in 5 channels across the K-band. We compared the data set to the relevant dynamical, self-excited CODEX models. Both spectrum and visibilities are consistently reproduced at visual minimum phases. Near maximum, our observations show that the current models predict a pho...
Fan, M.
2015-03-29
Parameter estimation is a challenging computational problem in the reverse engineering of biological systems. Because advances in biotechnology have facilitated wide availability of time-series gene expression data, systematic parameter estimation of gene circuit models from such time-series mRNA data has become an important method for quantitatively dissecting the regulation of gene expression. By focusing on the modeling of gene circuits, we examine here the performance of three types of state-of-the-art parameter estimation methods: population-based methods, online methods and model-decomposition-based methods. Our results show that certain population-based methods are able to generate high-quality parameter solutions. The performance of these methods, however, is heavily dependent on the size of the parameter search space, and their computational requirements substantially increase as the size of the search space increases. In comparison, online methods and model-decomposition-based methods are computationally faster alternatives and are less dependent on the size of the search space. Among other things, our results show that a hybrid approach that augments computationally fast methods with local search as a subsequent refinement procedure can substantially increase the quality of their parameter estimates, to a level on par with the best solutions obtained from the population-based methods, while maintaining high computational speed. This suggests that such hybrid methods can be a promising alternative to the more commonly used population-based methods for parameter estimation of gene circuit models when limited prior knowledge about the underlying regulatory mechanisms makes the parameter search space vastly large. © The Author 2015. Published by Oxford University Press.
Time Series Forecasting: A Nonlinear Dynamics Approach
Sello, Stefano
1999-01-01
The problem of predicting a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun magnetic cy...
Hillen, M.; Verhoelst, T.; Degroote, P.; Acke, B.; van Winckel, H.
2012-02-01
Context. Our comprehension of stellar evolution on the AGB still faces many difficulties. To improve on this, a quantified understanding of large-amplitude pulsator atmospheres and interpretation in terms of their fundamental stellar parameters are essential. Aims: We wish to evaluate the effectiveness of the recently released CODEX dynamical model atmospheres in representing M-type Mira variables through a confrontation with the time-resolved spectro-photometric and interferometric PTI data set of TU And. Methods: We calibrated the interferometric K-band time series to high precision. This results in 50 nights of observations, covering 8 subsequent pulsation cycles. At each phase, the flux at 2.2 μm is obtained, along with the spectral shape and visibility points in 5 channels across the K-band. We compared the data set to the relevant dynamical, self-excited CODEX models. Results: Both spectrum and visibilities are consistently reproduced at visual minimum phases. Near maximum, our observations show that the current models predict a photosphere that is too compact and hot, and we find that the extended atmosphere lacks H2O opacity. Since coverage in model parameter space is currently poor, more models are needed to make firm conclusions on the cause of the discrepancies. We argue that for TU And, the discrepancy could be lifted by adopting a lower value of the mixing length parameter combined with an increase in the stellar mass and/or a decrease in metallicity, but this requires the release of an extended model grid. Figure 4 and Table 1 are available in electronic form at http://www.aanda.org
Thakor, N V; Vaz, C A; McPherson, R W; Hanley, D F
1991-01-01
Evoked potentials (EPs) have traditionally been analyzed in time domain, with amplitude and latency of various signal components used in clinical interpretation. A new approach, called adaptive Fourier series modeling (FSM), is presented here. Dynamic changes in magnitudes of Fourier coefficients are analyzed for diagnostic purposes. In order to estimate the time-varying changes in the Fourier coefficients of noisy signals, a least mean-square filtering algorithm is applied. Results of computer simulations as well as experimental data are presented. Time-varying trends are presented in a new compressed evoked spectrum format. These techniques are applied to the study of alterations in human somatosensory EPs caused by the intravenous administration of etomidate during neurosurgical procedures. Amplitude increases of the order of 200-500% occurring within a time span of about 100 sec were captured. Due to its superior convergence properties, the adaptive FSM technique estimates more rapid changes in amplitude and latency than exponentially weighted averaging or moving window averaging schemes.
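The core of adaptive Fourier series modeling — least mean-square (LMS) updates of time-varying Fourier coefficients — can be illustrated on a toy signal whose fundamental amplitude doubles halfway through, mimicking the rapid evoked-potential amplitude changes described above. The step size, harmonic count, and synthetic signal are assumptions, not the paper's settings.

```python
import math

def lms_fourier_track(signal, n_harmonics, period, mu=0.05):
    # adaptively track Fourier coefficients a_k, b_k with the LMS rule:
    # w <- w + mu * error * regressor; returns the fundamental (a_1, b_1)
    # at every sample so amplitude trends can be inspected over time
    a = [0.0] * (n_harmonics + 1)          # a[0] is the DC term
    b = [0.0] * (n_harmonics + 1)
    track = []
    for t, y in enumerate(signal):
        phase = 2 * math.pi * t / period
        est = a[0] + sum(a[k] * math.cos(k * phase) + b[k] * math.sin(k * phase)
                         for k in range(1, n_harmonics + 1))
        err = y - est
        a[0] += mu * err                   # DC regressor is 1
        for k in range(1, n_harmonics + 1):
            a[k] += mu * err * math.cos(k * phase)
            b[k] += mu * err * math.sin(k * phase)
        track.append((a[1], b[1]))
    return track

# fundamental amplitude steps from 1.0 to 2.0 at t = 1000
period = 40
sig = [(1.0 if t < 1000 else 2.0) * math.sin(2 * math.pi * t / period)
       for t in range(2000)]
track = lms_fourier_track(sig, n_harmonics=3, period=period)
amp_early = math.hypot(*track[900])
amp_late = math.hypot(*track[1900])
print(round(amp_early, 2), round(amp_late, 2))
```

The tracked fundamental amplitude settles near 1.0 before the step and near 2.0 after it, which is the kind of rapid amplitude change the adaptive FSM approach is designed to capture.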
Hillen, M.; Verhoelst, T.; Degroote, P.; Acke, B.; Van Winckel, H.
2015-08-01
We present our already-published evaluation of the effectiveness of the CODEX models, released in 2011, in representing the atmospheres of M-type Mira variables. We present a high-precision interferometric K-band time series of TU And, consisting of 50 nights that cover eight consecutive pulsation cycles. At each phase, the flux at 2.2μm was obtained, along with the spectral shape and visibility points in five channels across the K band. We show a comparison between these data and the dynamical self-excited CODEX model which gives the closest match in stellar parameters yet available. Both the spectrum and the visibilities are consistently reproduced around visual minimum phases. Near the maximum phases, however, the current models predict a photosphere that is too hot and compact, surrounded by an extended atmosphere that lacks H2O opacity, compared to the observations. A better coverage in the model parameter space is needed to make firm conclusions as to the cause of the discrepancies. In the case of TU And, the discrepancy might be lifted by adopting a lower value of the mixing length parameter combined with an increased stellar mass and/or a decreased metallicity.
Description of complex time series by multipoles
DEFF Research Database (Denmark)
Lewkowicz, M.; Levitan, J.; Puzanov, N.
2002-01-01
We present a new method to describe time series with a highly complex time evolution. The time series is projected onto a two-dimensional phase-space plot which is quantified in terms of a multipole expansion where every data point is assigned a unit mass. The multipoles provide an efficient characterization of the original time series.
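A minimal version of this idea — project onto a 2-D phase-space plot, assign each point unit mass, and summarise with low-order moments — might look like this. The choice of delay and the moments reported (monopole, dipole, quadrupole) are assumptions; the paper's full multipole expansion goes further.

```python
import math
import random

def multipoles(series, delay=1):
    # phase-space plot: points (x_t, x_{t+delay}), each with unit mass;
    # monopole = total mass, dipole = centre of mass,
    # quadrupole = second moments about the centre of mass
    pts = list(zip(series, series[delay:]))
    n = len(pts)
    cx = sum(p[0] for p in pts) / n
    cy = sum(p[1] for p in pts) / n
    qxx = sum((x - cx) ** 2 for x, _ in pts) / n
    qyy = sum((y - cy) ** 2 for _, y in pts) / n
    qxy = sum((x - cx) * (y - cy) for x, y in pts) / n
    return {'monopole': n, 'dipole': (cx, cy), 'quadrupole': (qxx, qxy, qyy)}

rng = random.Random(0)
noise = [rng.gauss(0, 1) for _ in range(5000)]
sine = [math.sin(0.3 * t) for t in range(5000)]
m_noise = multipoles(noise)
m_sine = multipoles(sine)
# an uncorrelated series has a near-zero off-diagonal quadrupole (qxy);
# a smooth oscillation has strong lag-1 correlation, hence a large qxy
print(round(m_noise['quadrupole'][1], 2), round(m_sine['quadrupole'][1], 2))
```

The off-diagonal quadrupole component already separates structured from unstructured dynamics, which hints at why a handful of multipole moments can characterise a complex series efficiently.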
Shirzaei, M.; Burgmann, R.
2011-12-01
Interferometric synthetic aperture radar (InSAR) provides valuable spatiotemporal observations of surface deformation in volcanic and tectonic areas. In this study we generate a long time series of InSAR-measured deformation over the San Francisco Bay Area by combining over 100 ERS1/2 and Envisat SAR acquisitions from 1992 through 2011. We apply an advanced multitemporal processing algorithm that uses multiple-master interferometry and generate about 700 interferograms (ERS-ERS, Envisat-Envisat and ERS-Envisat pairs) with temporal and perpendicular baseline smaller than 4 years and 300 m, respectively. The systematic errors (such as DEM error and atmospheric delay) are estimated and reduced by using a variety of wavelet based filters. The differential displacement measured in each unwrapped interferogram is inverted by using an L1-norm minimization approach to generate time series of the surface displacement for identified stable pixels. Using a Kalman filter, the line-of-sight velocity is estimated, temporal random noise is reduced and the displacement variance-covariance matrix is refined. To solve for the time dependent model of aseismic slip on the Hayward fault, the upper-crustal fault plane is discretized into triangular patches. The size of these patches is optimized in a way that allows estimating the fault slip with maximum precision. Then, we apply an iterated inversion approach, combining static slip inversion and Kalman filtering to model temporal behavior of the slip. For the static inversion we expand the slip to the wavelet base functions and truncate noisy coefficients, which provide a solution equivalent to implementation of the Laplace smoothing operator in conventional slip inversion. This novel approach, however, overcomes the need of choosing a smoothing operator and allows automating the whole inversion step. Since we aim to integrate seismic and creepmeter data sets, the issue of relative weighting of these data sets becomes important, which
Highly comparative, feature-based time-series classification
Fulcher, Ben D
2014-01-01
A highly comparative, feature-based approach to time series classification is introduced that uses an extensive database of algorithms to extract thousands of interpretable features from time series. These features are derived from across the scientific time-series analysis literature, and include summaries of time series in terms of their correlation structure, distribution, entropy, stationarity, scaling properties, and fits to a range of time-series models. After computing thousands of features for each time series in a training set, those that are most informative of the class structure are selected using greedy forward feature selection with a linear classifier. The resulting feature-based classifiers automatically learn the differences between classes using a reduced number of time-series properties, and circumvent the need to calculate distances between time series. Representing time series in this way results in orders of magnitude of dimensionality reduction, allowing the method to perform well on ve...
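A toy version of the pipeline — interpretable features plus greedy forward selection with a simple linear (nearest-centroid) classifier — can be shown distinguishing white noise from an AR(1) process. The three features, the classifier, and the use of training accuracy for selection are deliberate simplifications of the method described, which draws on thousands of features and proper validation.

```python
import random
import statistics

def features(ts):
    # a tiny interpretable feature vector: spread, lag-1 autocorrelation,
    # and mean absolute successive difference
    mu = statistics.mean(ts)
    sd = statistics.stdev(ts)
    ac1 = sum((a - mu) * (b - mu) for a, b in zip(ts, ts[1:])) / \
          ((len(ts) - 1) * sd * sd)
    mad = statistics.mean(abs(b - a) for a, b in zip(ts, ts[1:]))
    return [sd, ac1, mad]

def centroid_classify(data, labels, x, feats):
    # nearest-centroid on a chosen feature subset (a simple linear rule)
    cents = {}
    for c in set(labels):
        rows = [data[i] for i in range(len(data)) if labels[i] == c]
        cents[c] = [statistics.mean(r[f] for r in rows) for f in feats]
    return min(cents, key=lambda c: sum((x[f] - m) ** 2
                                        for f, m in zip(feats, cents[c])))

def greedy_select(data, labels, n_feat):
    # greedy forward selection by (training-set) classification accuracy
    chosen = []
    for _ in range(n_feat):
        def acc(f):
            fs = chosen + [f]
            return sum(centroid_classify(data, labels, data[i], fs) == labels[i]
                       for i in range(len(data)))
        remaining = [f for f in range(len(data[0])) if f not in chosen]
        chosen.append(max(remaining, key=acc))
    return chosen

rng = random.Random(3)
def white(n): return [rng.gauss(0, 1) for _ in range(n)]
def ar1(n, phi=0.8):
    x = [0.0]
    for _ in range(n - 1):
        x.append(phi * x[-1] + rng.gauss(0, 1))
    return x

data = [features(white(200)) for _ in range(30)] + \
       [features(ar1(200)) for _ in range(30)]
labels = ['noise'] * 30 + ['ar1'] * 30
best = greedy_select(data, labels, 1)
acc = sum(centroid_classify(data, labels, data[i], best) == labels[i]
          for i in range(len(data))) / len(data)
print(best, round(acc, 2))
```

A single well-chosen feature already separates the two classes, illustrating the dimensionality reduction the abstract highlights: the classifier never computes distances between raw time series.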
Forecasting of PM10 time series using wavelet analysis and wavelet-ARMA model in Taiyuan, China.
Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng
2017-02-23
PM10 forecasting is difficult because of the uncertainties in describing the emission and meteorological fields. This paper proposes a wavelet-ARMA/ARIMA model to forecast short-term series of PM10 concentrations. It was evaluated by experiments using a 10-year dataset of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: 1) PM10 concentrations in Taiyuan had a decreasing trend from 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. 2) Spatial differences among the four stations showed that PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburban areas. 3) Wavelet analysis revealed that the trend variation and changes of the PM10 concentration of Taiyuan were complicated. 4) The proposed wavelet-ARIMA model could be efficiently and successfully applied to PM10 forecasting. Compared with the traditional ARMA/ARIMA methods, the wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multi-time-scale prediction.
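The hybrid idea — decompose with a wavelet, model the smooth part with an ARMA-family model, then reconstruct the forecast — can be sketched with a one-level Haar transform and an AR(1) fit. The synthetic "PM10-like" series, the single decomposition level, and the zeroed detail forecast are all simplifying assumptions; the paper's model uses a full wavelet-ARMA/ARIMA pipeline.

```python
import math

def haar_level1(x):
    # one-level Haar DWT: approximation (trend) and detail coefficients
    a = [(x[2 * i] + x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def ar1_fit(y):
    # least-squares AR(1): y[t] = c + phi * y[t-1]
    n = len(y) - 1
    mx = sum(y[:-1]) / n
    my = sum(y[1:]) / n
    phi = sum((y[i] - mx) * (y[i + 1] - my) for i in range(n)) / \
          sum((y[i] - mx) ** 2 for i in range(n))
    return my - phi * mx, phi

# toy series with trend and annual seasonality, loosely mimicking daily
# PM10 with winter peaks (synthetic data, not the Taiyuan measurements)
x = [50 + 0.01 * t + 20 * math.sin(2 * math.pi * t / 365) for t in range(730)]
a, d = haar_level1(x)
c, phi = ar1_fit(a)
a_next = c + phi * a[-1]                  # forecast next approximation coeff
# inverse Haar with the detail forecast set to 0: a smoothed 2-step forecast
f1 = a_next / math.sqrt(2)
f2 = a_next / math.sqrt(2)
print(round(f1, 1), round(f2, 1))
```

Modelling the approximation coefficients rather than the raw series is what lets the hybrid approach separate slow trend and seasonal structure from high-frequency fluctuations.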
Delay differential analysis of time series.
Lainscsek, Claudia; Sejnowski, Terrence J
2015-03-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time
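The two embeddings contrasted above are easy to construct explicitly. Here is a stdlib sketch of a nonuniform delay embedding and a finite-difference derivative embedding; the delays and the test signal are arbitrary illustration choices, not the paper's.

```python
import math

def delay_embed(x, delays):
    # nonuniform delay embedding: rows (x[t-d1], x[t-d2], ...) for the
    # chosen delays; uniform embedding is the special case d = 0, k, 2k, ...
    dmax = max(delays)
    return [[x[t - d] for d in delays] for t in range(dmax, len(x))]

def derivative_embed(x, order=2, dt=1.0):
    # derivative embedding via finite differences: (x, x', x'')
    rows = []
    for t in range(1, len(x) - 1):
        d1 = (x[t + 1] - x[t - 1]) / (2 * dt)           # central 1st derivative
        d2 = (x[t + 1] - 2 * x[t] + x[t - 1]) / dt ** 2  # 2nd derivative
        rows.append([x[t], d1, d2][:order + 1])
    return rows

x = [math.sin(0.1 * t) for t in range(100)]
E = delay_embed(x, delays=[0, 5, 13])     # nonuniform delays
D = derivative_embed(x)
print(len(E), len(E[0]), len(D), len(D[0]))  # 87 3 98 3
```

Delay differential analysis combines the two: derivative terms on the left of a small DDE, nonuniformly delayed terms on the right, with the best few terms selected from a candidate pool.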
Delay Differential Analysis of Time Series
Lainscsek, Claudia; Sejnowski, Terrence J.
2015-01-01
Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally, either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain, where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of the discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to the multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution and increased frequency and phase coupling information, and allows an easy and straightforward implementation of higher-order spectra across time
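The core idea above (fitting a small DDE model, linear in its parameters, to delayed copies of a signal) can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the model form x'(t) = a1·x(t-τ1) + a2·x(t-τ2) + a3·x(t-τ1)·x(t-τ2), the fixed delays, and the centered-difference derivative are all assumptions chosen for simplicity.

```python
import numpy as np

def dde_features(x, dt, tau1, tau2):
    """Least-squares fit of a small DDE model
        x'(t) = a1*x(t-tau1) + a2*x(t-tau2) + a3*x(t-tau1)*x(t-tau2)
    Delays tau1, tau2 are in samples; x' is a centered difference.
    Returns the coefficients (a1, a2, a3) and the RMS residual."""
    tmax = max(tau1, tau2)
    # centered-difference derivative; dx[k] approximates x'(k+1)
    dx = (x[2:] - x[:-2]) / (2.0 * dt)
    t = np.arange(tmax, len(x) - 1)          # usable time indices
    x1 = x[t - tau1]
    x2 = x[t - tau2]
    A = np.column_stack([x1, x2, x1 * x2])   # candidate-model regressors
    b = dx[t - 1]                            # align dx with index t
    coef, res, *_ = np.linalg.lstsq(A, b, rcond=None)
    err = np.sqrt(np.mean((A @ coef - b) ** 2))
    return coef, err

# usage: a short window of a 2 Hz sine, as a stand-in for real data
dt = 0.01
ts = np.sin(2 * np.pi * 2.0 * np.arange(500) * dt)
coef, err = dde_features(ts, dt, tau1=5, tau2=13)
```

In a detection or classification setting, the fitted coefficients (and the residual of each candidate model) would serve as the features, with the best-fitting model selected from a pool of such candidate forms.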
Almog, Assaf
2014-01-01
The dynamics of complex systems, from financial markets to the brain, can be monitored in terms of time series of activity of their fundamental elements (such as stocks or neurons respectively). While the main focus of time series analysis is on the magnitude of temporal increments, a significant piece of information is encoded into the binary projection (i.e. the sign) of such increments. In this paper we provide further evidence of this by showing strong nonlinear relationships between binary and non-binary properties of financial time series. We then introduce an information-theoretic approach to the analysis of the binary signature of single and multiple time series. Through the definition of maximum-entropy ensembles of binary matrices, we quantify the information encoded into the simplest binary properties of real time series and identify the most informative property given a set of measurements. Our formalism is able to replicate the observed binary/non-binary relations very well, and to mathematically...
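The binary projection discussed above is simply the sign of each temporal increment. A minimal sketch of extracting it and measuring one simple binary property (sign persistence), compared against its maximum-entropy null value of 0.5 for independent fair signs, follows; the random-walk input and the choice of persistence as the test statistic are illustrative assumptions, not the paper's formalism.

```python
import numpy as np

def binary_signature(x):
    """Binary projection of a time series: the signs of its increments.
    Returns a +1/-1 array (zero increments mapped to +1 for simplicity)."""
    s = np.sign(np.diff(x))
    s[s == 0] = 1
    return s

def persistence(s):
    """Fraction of consecutive increments sharing the same sign.
    Under a maximum-entropy null (independent fair signs) this is 0.5."""
    return np.mean(s[1:] == s[:-1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(10_000))   # i.i.d. increments
s = binary_signature(walk)
p = persistence(s)   # close to 0.5 for independent increments
```

For real financial series, a significant deviation of such binary statistics from their maximum-entropy expectation is exactly the kind of encoded information the abstract refers to.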
Iacobellis, V.; Gioia, A.; Milella, P.; Satalino, G.; Balenzano, A.; Mattia, F.
2012-04-01
Over the last years, a vast number of experimental and theoretical studies have demonstrated the sensitivity of SAR data to soil moisture content; however, operational services integrating SAR measurements into land process models are not yet available. Important progress in this field is expected, on the one hand, from SAR missions characterized by a short revisit time, such as COSMO-SkyMed or the forthcoming Sentinel-1 and ALOS-2 missions, and, on the other hand, from a strong effort in implementing hydrological models able to reproduce the dynamics of the soil moisture content of the top layer (5 cm depth) of the soil. With this latter purpose, we used the DREAM model [Manfreda et al., 2005], realized in a GIS-based approach, which explicitly takes into account the spatial heterogeneity of hydrological processes. The DREAM model carries out continuous hydrological simulations at the daily and hourly scales. The distinctive feature of the model, which consists of evaluating the lateral flow through a water content redistribution weighted by the topographic index, was preserved. The latter provided the basis for the nested implementation of the Richards equation, which has been used for evaluating vertical flows in the top soil layer (5 cm). The Richards routine exploits the numerical solution proposed by Simunek et al. [2009] and runs, for each cell of the river basin, in a sub-module of 60 minutes with a vertical (i.e. depth) resolution of 1 cm and a temporal resolution of 1 s. The model was applied to the portion of the Celone river basin at Foggia San Severo, a tributary of the Candelaro river in the Puglia region (Southern Italy), downstream of the San Giusto Dam. Over this area, quasi-dense time series of ALOS/PALSAR ScanSAR WB1 and COSMO-SkyMed StripMap images were acquired in 2007 and 2010, respectively. The SAR data have been used to derive time series of soil moisture maps by means of the SMOSAR software developed for Sentinel-1 data [Balenzano et
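The abstract's nested vertical-flow step (1 cm depth resolution, 1 s time steps, summed into a 60 s sub-module) can be illustrated with a deliberately simplified moisture update. This sketch uses a linearized diffusion form of the Richards equation with constant diffusivity and no-flux boundaries; the actual routine uses the full nonlinear numerical solution of Simunek et al. [2009], so everything here (constant D, the boundary choice, the gravity term) is an assumption for illustration only.

```python
import numpy as np

def step_moisture(theta, D, K_grad, dz, dt):
    """One explicit finite-difference step of a linearized (diffusion-form)
    moisture equation for volumetric water content theta(z):
        d(theta)/dt = D * d2(theta)/dz2 - dK/dz
    with constant diffusivity D and a constant drainage term K_grad.
    Boundaries are treated as no-flux (toy choice)."""
    lap = np.empty_like(theta)
    lap[1:-1] = (theta[2:] - 2 * theta[1:-1] + theta[:-2]) / dz**2
    lap[0] = (theta[1] - theta[0]) / dz**2       # no-flux surface
    lap[-1] = (theta[-2] - theta[-1]) / dz**2    # no-flux bottom
    return theta + dt * (D * lap - K_grad)

# toy run: 5 cm column at 1 cm resolution, 1 s steps, one 60 s sub-module
dz, dt, D = 0.01, 1.0, 1e-6                         # m, s, m^2/s (assumed)
theta = np.array([0.35, 0.30, 0.25, 0.22, 0.20])    # wetter at the surface
for _ in range(60):
    theta = step_moisture(theta, D, K_grad=0.0, dz=dz, dt=dt)
```

Note the explicit scheme is stable here because D·dt/dz² = 0.01 is well below the 0.5 stability limit; with no-flux boundaries and no drainage, total water in the column is conserved while the profile smooths out.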