WorldWideScience

Sample records for proxy time series

  1. Speleothem stable isotope records for east-central Europe: resampling sedimentary proxy records to obtain evenly spaced time series with spectral guidance

    Science.gov (United States)

    Hatvani, István Gábor; Kern, Zoltán; Leél-Őssy, Szabolcs; Demény, Attila

    2018-01-01

    Uneven spacing is a common feature of sedimentary paleoclimate records, in many cases causing difficulties in the application of classical statistical and time series methods. Although special statistical tools do exist to assess unevenly spaced data directly, the transformation of such data into a temporally equidistant time series, which may then be examined using commonly employed statistical tools, remains an unachieved goal. The present paper therefore introduces an approach to obtain evenly spaced time series (using cubic spline fitting) from unevenly spaced speleothem records, with spectral guidance applied to avoid the spectral bias caused by interpolation and to retain the original spectral characteristics of the data. The methodology was applied to stable carbon and oxygen isotope records derived from two stalagmites from the Baradla Cave (NE Hungary) dating back to the late 18th century. To show the benefit of the equally spaced records to climate studies, their coherence with climate parameters is explored using wavelet transform coherence and discussed. The obtained equally spaced time series are available at https://doi.org/10.1594/PANGAEA.875917.
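
    As a rough illustration of the resampling step described above (not the authors' exact spectral-guidance procedure), the sketch below interpolates a synthetic, unevenly spaced isotope record onto an even time axis with a cubic spline and compares a Lomb-Scargle spectrum of the raw samples with the FFT spectrum of the interpolated series; all data and variable names are hypothetical.

    ```python
    # Sketch: cubic-spline resampling of an unevenly spaced proxy record onto an
    # even grid, plus a crude spectral comparison (illustrative only).
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import lombscargle

    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 200.0, 150))                # uneven ages (years), synthetic
    d18o = np.sin(2 * np.pi * t / 30.0) + 0.3 * rng.standard_normal(t.size)

    dt = np.mean(np.diff(t))                                 # target even spacing
    t_even = np.arange(t[0], t[-1], dt)
    d18o_even = CubicSpline(t, d18o)(t_even)

    # spectra before/after resampling, to check that interpolation has not distorted them
    ang_freqs = 2 * np.pi * np.linspace(1.0 / 200.0, 0.5 / dt, 300)
    spec_uneven = lombscargle(t, d18o - d18o.mean(), ang_freqs)
    spec_even = np.abs(np.fft.rfft(d18o_even - d18o_even.mean())) ** 2
    ```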

  2. Time series analysis.

    NARCIS (Netherlands)

    2013-01-01

    Time series analysis can be used to quantitatively monitor, describe, explain, and predict road safety developments. Time series analysis techniques offer the possibility of quantitatively modelling road safety developments in such a way that the dependencies between the observations of time series …

  3. Time-and-ID-Based Proxy Reencryption Scheme

    Directory of Open Access Journals (Sweden)

    Kambombo Mtonga

    2014-01-01

    Full Text Available A time- and ID-based proxy reencryption scheme is proposed in this paper. A type-based proxy reencryption enables the delegator to implement fine-grained policies with one key pair without any additional trust on the proxy. However, in some applications, the time within which the data was sampled or collected is very critical. In such applications, for example, healthcare and criminal investigations, the delegatee may be interested in only some of the messages with some types sampled within some time bound instead of the entire subset. Hence, in order to cater for such situations, in this paper, we propose a time-and-identity-based proxy reencryption scheme that takes into account the time within which the data was collected as a factor to consider when categorizing data, in addition to its type. Our scheme is based on the Boneh and Boyen identity-based scheme (BB-IBE) and Matsuo’s proxy reencryption scheme for identity-based encryption (IBE to IBE). We prove that our scheme is semantically secure in the standard model.

  4. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers.
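
    For orientation, a minimal sketch of a time series momentum rule follows: the sign of the trailing 12-month return sets next month's position in each instrument, and positions are averaged across instruments. The pandas layout and the synthetic returns are assumptions, not the paper's data or exact methodology.

    ```python
    # Sketch: naive time series momentum strategy on monthly excess returns.
    import numpy as np
    import pandas as pd

    def tsmom(returns: pd.DataFrame, lookback: int = 12) -> pd.Series:
        """returns: monthly excess returns, one column per futures instrument."""
        signal = np.sign(returns.rolling(lookback).sum()).shift(1)  # position held over the next month
        return (signal * returns).mean(axis=1).dropna()             # equal-weight across instruments

    # usage with synthetic data standing in for the 58 futures series
    rets = pd.DataFrame(np.random.default_rng(1).normal(0.005, 0.05, size=(240, 58)))
    strategy_returns = tsmom(rets)
    ```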

  5. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  6. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the…

  7. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    ""In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book.""-Journa

  8. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  9. Fingerprinting Reverse Proxies Using Timing Analysis of TCP Flows

    Science.gov (United States)

    2013-09-01

    … of timing information that can translate into usable intelligence for detecting the use of reverse proxies by a network domain. 1.1 Problem Statement … websites (i.e., Sky News Arabia, Kemalist Gazete, Detroit News), and entertainment industry sites (i.e., HBO GO, LeoVegas Online Casino, FreeRide Games) …

  10. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    …commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may…

  11. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL - Use same GPS Precise Point Positioning Analysis raw time series - Variations in time series analysis/post-processing driven by different users. · JPL Global Time Series/Velocities - researchers studying reference frame, combining with VLBI/SLR/DORIS · JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, ground water studies · ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused · ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR - Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR - Zhen Liu is talking tomorrow on InSAR time series analysis

  12. Advances in time series forecasting

    CERN Document Server

    Cagdas, Hakan Aladag

    2012-01-01

    Readers will learn how these methods work and how these approaches can be used to forecast real life time series. The hybrid forecasting model is also explained. Data presented in this e-book is problem based and is taken from real life situations. It is a valuable resource for students, statisticians and working professionals interested in advanced time series analysis.

  13. Analysis of Forgery Attack on One-Time Proxy Signature and the Improvement

    Science.gov (United States)

    Wang, Tian-Yin; Wei, Zong-Li

    2016-02-01

    In a recent paper, Yang et al. (Quant. Inf. Process. 13(9), 2007-2016, 2014) analyzed the security of the one-time proxy signature scheme of Wang and Wei (Quant. Inf. Process. 11(2), 455-463, 2012) and pointed out that it cannot satisfy the security requirements of unforgeability and undeniability because an eavesdropper Eve can forge a valid proxy signature on a message chosen by herself. However, we find that the so-called proxy message-signature pair forged by Eve is in fact issued by the proxy signer, and anybody can obtain it as a requester, which means that the forgery attack is not considered a successful attack. Therefore, the conclusion that this scheme cannot satisfy the security requirements of proxy signature against forging and denying is not appropriate in this sense. Finally, we study the reason for the misunderstanding and clarify the security requirements for proxy signatures.

  14. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology; discusses a wide variety of diverse applications and recent developments; contributors are internationally renowned experts in their respect...

  15. Random time series in astronomy.

    Science.gov (United States)

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.
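
    One standard tool for the "reliable noise power spectra from sparsely sampled time series" problem mentioned above is the Lomb-Scargle periodogram; a generic sketch with a synthetic light curve follows (illustrative only, not the review's own analysis).

    ```python
    # Sketch: Lomb-Scargle periodogram of an unevenly sampled light curve.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(42)
    t = np.sort(rng.uniform(0.0, 100.0, 300))          # observation epochs (days), with gaps
    flux = 1.0 + 0.1 * np.sin(2 * np.pi * t / 7.3) + 0.05 * rng.standard_normal(t.size)

    periods = np.linspace(2.0, 20.0, 500)
    ang_freqs = 2 * np.pi / periods
    power = lombscargle(t, flux - flux.mean(), ang_freqs)
    best_period = periods[np.argmax(power)]             # recovers ~7.3 days for this toy signal
    ```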

  16. Pattern Recognition in Time Series

    Science.gov (United States)

    Lin, Jessica; Williamson, Sheri; Borne, Kirk D.; DeBarr, David

    2012-03-01

    Perhaps the most commonly encountered data types are time series, touching almost every aspect of human life, including astronomy. One obvious problem in handling time-series databases is their typically massive size—gigabytes or even terabytes are common, with more and more databases reaching the petabyte scale. For example, in telecommunication, large companies like AT&T produce several hundred million long-distance records per day [Cort00]. In astronomy, time-domain surveys are relatively new—these are surveys that cover a significant fraction of the sky with many repeat observations, thereby producing time series for millions or billions of objects. Several such time-domain sky surveys are now completed, such as the MACHO [Alco01], OGLE [Szym05], SDSS Stripe 82 [Bram08], SuperMACHO [Garg08], and Berkeley’s Transients Classification Pipeline (TCP) [Star08] projects. The Pan-STARRS project is an active sky survey—it began in 2010, a 3-year survey covering three-fourths of the sky with ˜60 observations of each field [Kais04]. The Large Synoptic Survey Telescope (LSST) project proposes to survey 50% of the visible sky repeatedly approximately 1000 times over a 10-year period, creating a 100-petabyte image archive and a 20-petabyte science database (http://www.lsst.org/). The LSST science database will include time series of over 100 scientific parameters for each of approximately 50 billion astronomical sources—this will be the largest data collection (and certainly the largest time series database) ever assembled in astronomy, and it rivals any other discipline’s massive data collections for sheer size and complexity. More common in astronomy are time series of flux measurements. As a consequence of many decades of observations (and in some cases, hundreds of years), a large variety of flux variations have been detected in astronomical objects, including periodic variations (e.g., pulsating stars, rotators, pulsars, eclipsing binaries

  17. JWST NIRCam Time Series Observations

    Science.gov (United States)

    Greene, Tom; Schlawin, E.

    2017-01-01

    We explain how to make time-series observations with the Near-Infrared camera (NIRCam) science instrument of the James Webb Space Telescope. Both photometric and spectroscopic observations are described. We present the basic capabilities and performance of NIRCam and show examples of how to set its observing parameters using the Space Telescope Science Institute's Astronomer's Proposal Tool (APT).

  18. Stochastic Time-Series Spectroscopy

    CERN Document Server

    Scoville, John

    2015-01-01

    Spectroscopically measuring low levels of non-equilibrium phenomena (e.g. emission in the presence of a large thermal background) can be problematic due to an unfavorable signal-to-noise ratio. An approach is presented to use time-series spectroscopy to separate non-equilibrium quantities from slowly varying equilibria. A stochastic process associated with the non-equilibrium part of the spectrum is characterized in terms of its central moments or cumulants, which may vary over time. This parameterization encodes information about the non-equilibrium behavior of the system. Stochastic time-series spectroscopy (STSS) can be implemented at very little expense in many settings since a series of scans are typically recorded in order to generate a low-noise averaged spectrum. Higher moments or cumulants may be readily calculated from this series, enabling the observation of quantities that would be difficult or impossible to determine from an average spectrum or from principal components analysis (PCA). This meth...
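
    A minimal sketch of the moment computation the abstract describes, assuming a stack of repeated scans: per-channel variance, skewness and kurtosis are computed across the scan index rather than averaging the scans away. The synthetic data and the injected "signal" channel are made up for illustration.

    ```python
    # Sketch: per-channel central moments over a series of repeated spectral scans.
    import numpy as np
    from scipy.stats import skew, kurtosis

    rng = np.random.default_rng(7)
    n_scans, n_channels = 500, 1024
    scans = rng.normal(loc=10.0, scale=1.0, size=(n_scans, n_channels))   # thermal background
    scans[:, 300] += rng.exponential(0.2, n_scans)                        # weak non-equilibrium emission

    mean_spec = scans.mean(axis=0)        # the usual averaged spectrum
    var_spec = scans.var(axis=0)          # 2nd central moment per channel
    skew_spec = skew(scans, axis=0)       # 3rd standardized moment
    kurt_spec = kurtosis(scans, axis=0)   # excess kurtosis; flags non-Gaussian channels like 300
    ```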

  19. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

    Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
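
    The weighted self-constructing clustering is specific to the paper, but the final k-nearest-neighbour estimation step can be sketched directly on lagged training patterns; the window length, k, and distance weighting below are simplifying assumptions, not the authors' algorithm.

    ```python
    # Sketch: k-nearest-neighbour one-step forecast from lagged patterns of a series.
    import numpy as np

    def knn_forecast(series: np.ndarray, window: int = 6, k: int = 5) -> float:
        """Predict the value at t+1 from the k training patterns closest to the last window."""
        patterns = np.lib.stride_tricks.sliding_window_view(series[:-1], window)
        targets = series[window:]                       # value following each pattern
        query = series[-window:]
        dist = np.linalg.norm(patterns - query, axis=1)
        nearest = np.argsort(dist)[:k]
        weights = 1.0 / (dist[nearest] + 1e-9)          # closer neighbours weigh more
        return float(np.average(targets[nearest], weights=weights))
    ```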

  20. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  1. Munchausen by Proxy: A Case, Chart Series, and Literature Review of Older Victims

    Science.gov (United States)

    Awadallah, Nida; Vaughan, Aaron; Franco, Kathleen; Munir, Farah; Sharaby, Na'ama; Goldfarb, Johanna

    2005-01-01

    The history of an older child victim of Munchausen by proxy (MBP) is described. He was referred for evaluation after repeated sinus surgeries for recurrent sinus infections believed to be related to a falsified history of an immunodeficiency. The perpetrator was the mother of this 14-year-old victim, consistent with the majority of such cases.…

  2. Constructing Proxy Variables to Measure Adult Learners' Time Management Strategies in LMS

    Science.gov (United States)

    Jo, Il-Hyun; Kim, Dongho; Yoon, Meehyun

    2015-01-01

    This study describes the process of constructing proxy variables from recorded log data within a Learning Management System (LMS), which represents adult learners' time management strategies in an online course. Based on previous research, three variables of total login time, login frequency, and regularity of login interval were selected as…

  3. Appraising timing response of paleoenvironmental proxies to the Bond cycle in the western Mediterranean over the last 20 kyr

    Science.gov (United States)

    Rodrigo-Gámiz, Marta; Martínez-Ruiz, Francisca; Rodríguez-Tovar, Francisco J.; Pardo-Igúzquiza, Eulogio; Ortega-Huertas, Miguel

    2017-07-01

    The timing of climate responses to the Bond cycle is investigated in the western Mediterranean. Periodicities had been previously reported in a marine sediment record from this region spanning the last 20 kyr, registered by diverse paleoenvironmental proxies, in particular those associated with terrigenous input, redox conditions, productivity, sea surface temperature (SST) and salinity. Further cross-spectral analyses on these time series reveal lead-lag relationships in the 1400-year climate cycle. Taking a terrigenous input proxy (the K/Al ratio) as reference, all the paleoenvironmental proxies displayed time shifts varying from ca. 700 years to ca. 350 years. SST and salinity variations show the earliest response, with the inflow of cold and less salty Atlantic waters. This is followed, with a lead of 525 years, by progressively arid conditions with an increase of eolian dust transport to the area, as indicated by the Zr/Al signal. The intensification of dust transport could have triggered a later biological response, with a lead of 350 years, namely an increase of productivity, as suggested by the Ba/Al ratio. Lastly, changes in the Mediterranean thermohaline circulation, indicated by a selected redox proxy (the U/Th ratio), are observed. These results support the view that the oceanic response triggered the atmospheric response to the Bond cycle in the western Mediterranean. Changes in the North Atlantic Oscillation mode and in Inter-Tropical Convergence Zone migrations, together with variations in monsoon activity or the Saharan wind system, are considered the main forcing mechanisms, with a complex relationship among the phenomena involved.
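
    A rough sketch of how such a lead or lag at the ~1400-year band can be read off the cross-spectrum phase of two evenly resampled proxy series is given below; the synthetic series, sampling step, and sign convention are assumptions, not the authors' cross-spectral code.

    ```python
    # Sketch: lead/lag between two proxy series from the cross-spectrum phase.
    import numpy as np
    from scipy.signal import csd

    dt = 20.0                                            # sampling step in years (assumed)
    n = 1024
    t = np.arange(n) * dt
    rng = np.random.default_rng(3)
    x = np.sin(2 * np.pi * t / 1400.0) + 0.3 * rng.standard_normal(n)            # reference proxy (e.g. K/Al)
    y = np.sin(2 * np.pi * (t - 350.0) / 1400.0) + 0.3 * rng.standard_normal(n)  # proxy lagging by 350 yr

    freqs, pxy = csd(x, y, fs=1.0 / dt, nperseg=512)
    i = np.argmin(np.abs(freqs - 1.0 / 1400.0))             # frequency bin nearest the 1400-yr cycle
    lag_years = -np.angle(pxy[i]) / (2 * np.pi * freqs[i])  # ~350 here; verify the csd sign convention
    ```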

  4. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  5. Global Population Density Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...

  6. Global Population Count Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Count Grid Time Series Estimates provide a back-cast time series of population grids based on the year 2000 population grid from SEDAC's Global...

  7. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  8. A review of subsequence time series clustering.

    Science.gov (United States)

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
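
    For orientation, a generic, minimal version of subsequence time series clustering is sketched below (the surveyed methods are far more careful about the known pitfalls of this setup): slide a window over the series, z-normalize each subsequence, and cluster with k-means; window length and cluster count are arbitrary choices.

    ```python
    # Sketch: naive subsequence time series clustering (sliding window + k-means).
    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_subsequences(series: np.ndarray, window: int = 24, n_clusters: int = 4):
        subs = np.lib.stride_tricks.sliding_window_view(series, window).copy()
        subs = (subs - subs.mean(axis=1, keepdims=True)) / (subs.std(axis=1, keepdims=True) + 1e-9)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(subs)
        return km.labels_, km.cluster_centers_

    labels, centers = cluster_subsequences(
        np.sin(np.linspace(0, 60, 2000)) + 0.1 * np.random.default_rng(0).standard_normal(2000))
    ```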

  9. A Review of Subsequence Time Series Clustering

    Science.gov (United States)

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  10. Component extraction analysis of multivariate time series

    NARCIS (Netherlands)

    Akman, I.; de Gooijer, J.G.

    1996-01-01

    A method for modelling several observed parallel time series is proposed. The method involves seeking possible common underlying pure AR and MA components in the series. The common components are forced to be mutually uncorrelated so that univariate time series modelling and forecasting techniques

  11. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

    Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.

  12. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  13. Description of complex time series by multipoles

    DEFF Research Database (Denmark)

    Lewkowicz, M.; Levitan, J.; Puzanov, N.

    2002-01-01

    We present a new method to describe time series with a highly complex time evolution. The time series is projected onto a two-dimensional phase-space plot which is quantified in terms of a multipole expansion where every data point is assigned a unit mass. The multipoles provide an efficient char...

  14. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  15. Coping with Nonstationarity in Categorical Time Series

    Directory of Open Access Journals (Sweden)

    Monnie McGee

    2012-01-01

    …more categories. In this paper, we introduce an algorithm which corrects for nonstationarity in categorical time series. The algorithm produces series which are not stationary in the traditional sense often used for stationary categorical time series. The form of stationarity is weaker but still useful for parameter estimation. Simulation results show that this simple algorithm applied to a DAR(1) model can dramatically improve the parameter estimates.

  16. Fuzzy time series forecasting of wheat production

    OpenAIRE

    Narendra Kumar; Sachin Ahuja; Shashank Bhardwaj; Vipin Kumar

    2010-01-01

    The present study provides a foundation for the development and application of a fuzzy time series model for short-term agricultural production forecasting. The present study can provide an advantageous basis to farm administration for better post-harvest management and to the local industries in planning for their raw material requirement management. The fuzzy time series forecasting can be optimally utilized in agri-business management.

  17. FATS: Feature Analysis for Time Series

    Science.gov (United States)

    Nun, Isadora; Protopapas, Pavlos; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim

    2017-11-01

    FATS facilitates and standardizes feature extraction for time series data; it quickly and efficiently calculates a compilation of many existing light curve features. Users can characterize or analyze an astronomical photometric database, though this library is not necessarily restricted to the astronomical domain and can also be applied to any kind of time series data.

  18. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  19. Forecasting daily time series using periodic unobserved components time series models

    NARCIS (Netherlands)

    Ooms, M.; Koopman, S.J.

    2006-01-01

    A periodic time series analysis is explored in the context of unobserved components time series models that include stochastic time functions for trend, seasonal and irregular effects. Periodic time series models allow dynamic characteristics (autocovariances) to depend on the period of the year,

  20. Network structure of multivariate time series

    Science.gov (United States)

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-01

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
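
    One widely used way to turn each variable of a multivariate series into a network layer is the horizontal visibility graph; the sketch below builds one adjacency matrix per variable and stacks them as layers of a multiplex network. This illustrates the general idea only and is not claimed to be the authors' exact construction.

    ```python
    # Sketch: one horizontal visibility graph (HVG) layer per variable of a
    # multivariate time series; the stacked layers form a multiplex network.
    import numpy as np

    def hvg_adjacency(x: np.ndarray) -> np.ndarray:
        n = x.size
        a = np.zeros((n, n), dtype=bool)
        for i in range(n - 1):
            for j in range(i + 1, n):
                if j == i + 1 or x[i + 1:j].max() < min(x[i], x[j]):
                    a[i, j] = a[j, i] = True
                if x[j] >= x[i]:      # no later point can see i past a taller intermediate
                    break
        return a

    def multiplex_layers(series: np.ndarray) -> np.ndarray:
        """series: shape (n_variables, n_time); returns (n_variables, n_time, n_time)."""
        return np.stack([hvg_adjacency(row) for row in series])
    ```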

  1. Forecasting of nonlinear time series using ANN

    Directory of Open Access Journals (Sweden)

    Ahmed Tealab

    2017-06-01

    Full Text Available When forecasting time series, it is important to classify them according to their linearity behavior. Although linear time series remain at the forefront of academic and applied research, it has often been found that simple linear time series models usually leave certain aspects of economic and financial data unexplained. The dynamic behavior of most real-life time series, with their autoregressive and inherited moving average terms, poses the challenge of forecasting nonlinear time series that contain inherited moving average terms using computational intelligence methodologies such as neural networks. It is rare to find studies that concentrate on forecasting nonlinear time series that contain moving average terms. In this study, we demonstrate that common neural networks are not efficient for recognizing the behavior of nonlinear or dynamic time series which have moving average terms, and hence have low forecasting capability. This leads to the importance of formulating new models of neural networks, such as deep learning neural networks, with or without hybrid methodologies such as fuzzy logic.

  2. Modelling of nonlinear filtering Poisson time series

    Science.gov (United States)

    Bochkarev, Vladimir V.; Belashova, Inna A.

    2016-08-01

    In this article, algorithms of non-linear filtering of Poisson time series are tested using statistical modelling. The objective is to find a representation of a time series as a wavelet series with a small number of non-linear coefficients, which allows distinguishing statistically significant details. There are well-known efficient algorithms of non-linear wavelet filtering for the case when the values of a time series have a normal distribution. However, if the distribution is not normal, good results can be expected using maximum likelihood estimation. The filtration is studied according to the criterion of maximum likelihood by the example of Poisson time series. For direct optimisation of the likelihood function, different stochastic (genetic algorithms, annealing method) and deterministic optimisation algorithms are used. Testing of the algorithm using both simulated series and empirical data (series of rare-word frequencies from the Google Books Ngram data were used) showed that filtering based on the criterion of maximum likelihood has a great advantage over well-known algorithms for the case of Poisson series. Also, the most promising optimisation methods were selected for this problem.

  3. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  4. Some aspects of harmonic time series analysis

    OpenAIRE

    2012-01-01

    Ph.D. Harmonic time series are often used to describe the periodic nature of a time series, for example the periodic nature of a variable star’s observed light curve. Statistical methods for determining the number of harmonic components to include in harmonic time series are limited. In this thesis a stepwise bootstrap procedure based on an F-type statistic is suggested. The performance of the stepwise procedure is compared to that of Schwarz’s Bayesian Criterion (SBC) and a procedure base...

  5. Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models

    NARCIS (Netherlands)

    Koopman, Siem Jan; Ooms, Marius

    2004-01-01

    We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the

  6. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among them the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS

  7. G-Filtering Nonstationary Time Series

    Directory of Open Access Journals (Sweden)

    Mengyuan Xu

    2012-01-01

    Full Text Available The classical linear filter can successfully filter the components from a time series for which the frequency content does not change with time, and those nonstationary time series with time-varying frequency (TVF components that do not overlap. However, for many types of nonstationary time series, the TVF components often overlap in time. In such a situation, the classical linear filtering method fails to extract components from the original process. In this paper, we introduce and theoretically develop the G-filter based on a time-deformation technique. Simulation examples and a real bat echolocation example illustrate that the G-filter can successfully filter a G-stationary process whose TVF components overlap with time.

  8. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  9. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  10. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang

    2016-01-01

    …queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regard to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume of data grows. This is a particular problem when querying time series data, which generally contains multiple measures recorded at fine time granularities. Usually, this issue is addressed either by scaling up hardware or by employing workload based query optimization techniques. However, these solutions...

  11. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
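
    A short sketch of the central observable, the time averaged MSD of a trajectory as a function of the lag, applied to a synthetic geometric-Brownian-motion log-price series standing in for an index (illustrative only; not the paper's data or exact procedure).

    ```python
    # Sketch: time averaged mean squared displacement (TAMSD) of a log-price series.
    import numpy as np

    def tamsd(x: np.ndarray, lags: np.ndarray) -> np.ndarray:
        """x: 1-d trajectory (e.g. log prices); TAMSD at each lag, in samples."""
        return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

    rng = np.random.default_rng(5)
    log_price = np.cumsum(0.0002 + 0.01 * rng.standard_normal(10_000))  # synthetic GBM log prices
    lags = np.arange(1, 200)
    msd = tamsd(log_price, lags)     # grows roughly linearly in the lag for Brownian-type dynamics
    ```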

  12. Highly comparative time-series analysis: the empirical structure of time series and their methods

    Science.gov (United States)

    Fulcher, Ben D.; Little, Max A.; Jones, Nick S.

    2013-01-01

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines. PMID:23554344

  13. Turbulencelike behavior of seismic time series.

    Science.gov (United States)

    Manshour, P; Saberi, S; Sahimi, Muhammad; Peinke, J; Pacheco, Amalio F; Rahimi Tabar, M Reza

    2009-01-09

    We report on a stochastic analysis of Earth's vertical velocity time series by using methods originally developed for complex hierarchical systems and, in particular, for turbulent flows. Analysis of the fluctuations of the detrended increments of the series reveals a pronounced transition in their probability density function from Gaussian to non-Gaussian. The transition occurs 5-10 hours prior to a moderate or large earthquake, hence representing a new and reliable precursor for detecting such earthquakes.

  14. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting, there is so far no systematic research to study and compare their performance. How to select effective techniques of feature preprocessing in a forecasting model remains a problem. In this paper, the authors conduct a comprehensive study of existing feature preprocessing techniques to evaluate their empirical performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...

  15. Comparative Analysis of Squid Proxy Response Time on Windows Server and Linux Server

    OpenAIRE

    Sirait, Parulian

    2016-01-01

    In the development of information technology, information is obtained quickly through computer network technology known as the Internet. The bandwidth used for Internet access can be utilized more effectively by using a proxy server. One such proxy server is Squid. Using Squid as a proxy server requires consideration of the operating system on the server, and its best performance on each operating system is not yet known. It is therefore necessary to analyze the performance of the Squid proxy server on a different op...

  16. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  17. Quasifiltering for time-series modeling

    OpenAIRE

    Tsyplakov, Alexander

    2015-01-01

    In the paper a method for constructing new varieties of time-series models is proposed. The idea is to start from an unobserved components model in a state-space form and use it as an inspiration for development of another time-series model, in which time-varying underlying variables are directly observed. The goal is to replace a state-space model with an intractable likelihood function by another model, for which the likelihood function can be written in a closed form. If state transition e...

  18. Fractal and natural time analysis of geoelectrical time series

    Science.gov (United States)

    Ramirez Rojas, A.; Moreno-Torres, L. R.; Cervantes, F.

    2013-05-01

    In this work we show the analysis of geoelectric time series linked with two earthquakes of M=6.6 and M=7.4. These time series were monitored at the South Pacific Mexican coast, which is the most important active seismic subduction zone in México. The geoelectric time series were analyzed by using two complementary methods: a fractal analysis, by means of the detrended fluctuation analysis (DFA) in conventional time, and the power spectrum defined in the natural time domain (NTD). In conventional time we found long-range correlations prior to the EQ occurrences, and simultaneously in NTD the behavior of the power spectrum suggests the possible existence of seismo-electric signals (SES) similar to those previously reported in equivalent time series monitored in Greece prior to earthquakes of relevant magnitude.
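
    A compact, generic sketch of the DFA step named above: integrate the demeaned series, remove a linear trend in non-overlapping windows of size s, and read the scaling exponent from the slope of log F(s) against log s. This is a textbook implementation, not the authors' code; the scale choices are arbitrary.

    ```python
    # Sketch: detrended fluctuation analysis (DFA) scaling exponent of a series.
    import numpy as np

    def dfa_exponent(x: np.ndarray, scales=None) -> float:
        y = np.cumsum(x - np.mean(x))                            # integrated "profile"
        if scales is None:
            scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
        flucts = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[: n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            res = [seg - np.polyval(np.polyfit(t, seg, 1), t) for seg in segs]  # remove local trend
            flucts.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
        slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return slope        # ~0.5 for white noise, >0.5 for long-range correlated series
    ```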

  19. Automatic Regulation Time Series for Industry Processes

    Directory of Open Access Journals (Sweden)

    Tain-Sou Tsay

    2012-01-01

    Full Text Available A nonlinear digital control scheme is proposed for analyses and designs of stable industry processes. It is derived from the converging characteristic of a specified numerical time series. The ratios of neighbourhoods of the series are formulated as a function of the output of the plant and the reference input command, and converge to unity after the output has tracked the reference input command. Lead compensations are also found by another numerical time series to speed up the system responses in an online adjusting manner. A servosystem, a time-delay system, a high-order system, a very-high-order system, and a 2 × 2 multivariable aircraft gas turbine engine are used to illustrate the effectiveness of the proposed nonlinear digital controller. Comparisons with other conventional methods are also made.

  20. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
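
    A small sketch of the kind of modification described: a quasi-Poisson regression of weekly case counts on temperature, with annual harmonics for seasonality and the log of lagged counts as a proxy for contagion-driven autocorrelation. The statsmodels call pattern is standard, but the data, columns, and coefficients are hypothetical, not taken from the article's datasets.

    ```python
    # Sketch: quasi-Poisson time series regression with log-lagged cases and seasonal harmonics.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(11)
    n = 300
    week = np.arange(n)
    temp = 15 + 10 * np.sin(2 * np.pi * week / 52) + rng.normal(0, 2, n)   # weekly mean temperature
    cases = rng.poisson(np.exp(1.5 + 0.03 * temp))                         # synthetic weekly counts

    df = pd.DataFrame({"cases": cases, "temp": temp, "week": week})
    df["log_lag_cases"] = np.log(df["cases"].shift(1) + 1)      # stand-in for the contagion term
    df["sin52"] = np.sin(2 * np.pi * df["week"] / 52)           # seasonality adjustment
    df["cos52"] = np.cos(2 * np.pi * df["week"] / 52)
    df = df.dropna()

    X = sm.add_constant(df[["temp", "log_lag_cases", "sin52", "cos52"]])
    fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit(scale="X2")  # Pearson scale ~ quasi-Poisson
    print(fit.summary())
    ```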

  1. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are actually using the Self-organizing map (SOM) with the unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009) it seems that the whole concept of the clustering algorithm is correct but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose in a need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. We are searching for recordings with a similar development of information density in this experiment. This can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of feature vectors and the SOM itself. We describe time series in a simplistic way, evaluating standard deviations for separate parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are also discussed, and finally conclusions for further research are presented. We also present an overview of the related current literature and projects.
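
    A minimal stand-in for the pipeline described: feature vectors of segment standard deviations, clustered with a tiny one-dimensional SOM trained online. Nothing here reproduces the authors' batch-mode configuration or map topologies; all parameters and helper names are hypothetical.

    ```python
    # Sketch: segment-standard-deviation features plus a tiny 1-D self-organizing map.
    import numpy as np

    def segment_std_features(series: np.ndarray, n_segments: int = 20) -> np.ndarray:
        return np.array([seg.std() for seg in np.array_split(series, n_segments)])

    def train_som(data: np.ndarray, n_units: int = 9, n_epochs: int = 50, seed: int = 0) -> np.ndarray:
        rng = np.random.default_rng(seed)
        weights = rng.normal(size=(n_units, data.shape[1]))
        grid = np.arange(n_units)                            # 1-D map for simplicity
        for epoch in range(n_epochs):
            lr = 0.5 * (1 - epoch / n_epochs)                # decaying learning rate
            sigma = max(1.0, (n_units / 2) * (1 - epoch / n_epochs))
            for x in rng.permutation(data):
                bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
                h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
                weights += lr * h[:, None] * (x - weights)
        return weights

    def assign_clusters(data: np.ndarray, weights: np.ndarray) -> np.ndarray:
        return np.argmin(np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2), axis=1)
    ```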

  2. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  3. Multifractal Analysis of Polyalanines Time Series

    CERN Document Server

    Figueirêdo, P H; Moret, M A; Coutinho, Sérgio; 10.1016/j.physa.2009.11.045

    2010-01-01

    Multifractal properties of the energy time series of short $\alpha$-helix structures, specifically from a polyalanine family, are investigated through the MF-DFA technique (multifractal detrended fluctuation analysis). Estimates for the generalized Hurst exponent $h(q)$ and its associated multifractal exponents $\tau(q)$ are obtained for several series generated by numerical simulations of molecular dynamics in different systems from distinct initial conformations. All simulations were performed using the GROMOS force field, implemented in the program THOR. The main results have shown that all series exhibit multifractal behavior depending on the number of residues and temperature. Moreover, the multifractal spectra reveal important aspects of the time evolution of the system and suggest that the nucleation process of the secondary structures during the visits on the energy hyper-surface is an essential feature of the folding process.

  4. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    …and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...

  5. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used at the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  6. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    Our analysis uncovers that crop production in AFR is strongly interdependent with the regional rainfall. While the gross ... Keywords: interdependence; correlation; inner composition alignment; time series analysis. PACS Nos 05.45.; 02.50. ... data and the growing need of big data approaches, several real-world applications face ...

  7. Iconic CO2 Time Series at Risk

    NARCIS (Netherlands)

    Houweling, S.; Badawy, B.; Basu, S.; Krol, M.C.; Röckmann, T.; Vermeulen, A.

    2012-01-01

    The steady rise in atmospheric long-lived greenhouse gas concentrations is the main driver of contemporary climate change. The Mauna Loa CO2 time series (1, 2), started by C. D. Keeling in 1958 and maintained today by the Scripps Institution of Oceanography and the Earth System Research Laboratory

  8. Iconic CO2 Time Series at Risk

    Energy Technology Data Exchange (ETDEWEB)

    Houweling, S. [SRON Netherlands Institute for Space Research, 3584 CA, Utrecht (Netherlands); Badawy, B. [Max-Planck-Institute for Biogeochemistry, 07745, Jena (Germany); Vermeulen, A.T. [Energieonderzoek Centrum Nederland ECN, 1755 ZG Petten (Netherlands)] [and others

    2012-08-31

    The Mauna Loa CO2 time series is iconic evidence of the effect of human-caused fossil fuel and land-use change emissions on the atmospheric increase of CO2. The continuity of such records depends critically on having stable funding, which is currently threatened by the financial crisis.

  9. Time series tapering for short data samples

    DEFF Research Database (Denmark)

    Kaimal, J.C.; Kristensen, L.

    1991-01-01

    We explore the effect of applying tapered windows on atmospheric data to eliminate overestimation inherent in spectra computed from short time series. Some windows are more effective than others in correcting this distortion. The Hamming window gave the best results with experimental data. The Ha...

  10. On modeling panels of time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    2002-01-01

    textabstractThis paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a

  11. Optimal transformations for categorical autoregressive time series

    NARCIS (Netherlands)

    Buuren, S. van

    1996-01-01

    This paper describes a method for finding optimal transformations for analyzing time series by autoregressive models. 'Optimal' implies that the agreement between the autoregressive model and the transformed data is maximal. Such transformations help 1) to increase the model fit, and 2) to analyze

  12. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  13. Recent Advances in Energy Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2017-06-01

    Full Text Available This editorial summarizes the performance of the special issue entitled Energy Time Series Forecasting, which was published in MDPI’s Energies journal. The special issue ran in 2016 and accepted a total of 21 papers from twelve different countries. Electrical, solar, and wind energy forecasting were the most analyzed topics, introducing brand new methods with very sound results.

  14. Designer networks for time series processing

    DEFF Research Database (Denmark)

    Svarer, C; Hansen, Lars Kai; Larsen, Jan

    1993-01-01

    The conventional tapped-delay neural net may be analyzed using statistical methods and the results of such analysis can be applied to model optimization. The authors review and extend efforts to demonstrate the power of this strategy within time series processing. They attempt to design compact...

  15. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks for effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system's behaviour based on them.

  16. Robust Control Charts for Time Series Data

    NARCIS (Netherlands)

    Croux, C.; Gelper, S.; Mahieu, K.

    2010-01-01

    This article presents a control chart for time series data, based on the one-step- ahead forecast errors of the Holt-Winters forecasting method. We use robust techniques to prevent that outliers affect the estimation of the control limits of the chart. Moreover, robustness is important to maintain

  17. Remote Sensing Time Series Product Tool

    Science.gov (United States)

    Prados, D.; Ryan, R. E.; Ross, K. W.

    2006-12-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB®, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; however, the TSPT development tool makes the process simplified and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its high temporal revisit time (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata is used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting "what-if" scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing

  18. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  19. Fractal Analysis On Internet Traffic Time Series

    OpenAIRE

    Chong, K. B.; Choo, K. Y.

    2002-01-01

    Fractal behavior and long-range dependence have been observed in tele-traffic measurement and characterization. In this paper we show results of applying fractal analysis to internet traffic via various methods. Our results demonstrate that the internet traffic exhibits self-similarity. Time-scale analysis is shown to be an effective way to characterize the local irregularity. Based on the results of this study, these two Internet time series exhibit fractal characteristics with long-range...

  20. Night-time lights as a proxy of human pressure on freshwater resources

    Science.gov (United States)

    Ceola, Serena; Montanari, Alberto; Laio, Francesco

    2017-04-01

    The presence and availability of freshwater resources at the global scale control the dynamics and the biodiversity of river ecosystems, as well as human development and the security of people and economies. The increasing human pressure on freshwater is known to potentially drive significant alterations in both ecohydrological and social dynamics. To date, a spatially detailed snapshot (i.e. single-in-time) analysis of human water security and river biodiversity threats revealed that the majority of the world's population and river ecosystems are exposed to high levels of endangerment. However, the temporal evolution of these effects at the global scale is still unexplored. To this aim, building on recent progress in remote sensing techniques, we employed yearly averaged night-time light images available from 1992 to 2013 as a proxy of anthropogenic presence and activity, and we investigated how threats to human water security and river biodiversity evolved in time in 405 major river basins. Our results show a consistent correlation between night-time lights and ecohydrological threats, providing innovative support for freshwater resources management.

  1. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. The general ... We investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality, and a stationary component. Further, we apply both kinds of indicator saturation to detect additive outliers and level shifts in the industrial production series in five European countries.

  2. Algorithm for Compressing Time-Series Data

    Science.gov (United States)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
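
The block-wise Chebyshev idea above can be illustrated in a few lines with NumPy's Chebyshev utilities. This is a minimal sketch, not the flight algorithm: the block size, polynomial degree, and the least-squares fit used here are illustrative assumptions based only on the description in the abstract.

```python
# Minimal sketch (assumptions: illustrative block size and degree, least-squares fit).
import numpy as np
from numpy.polynomial import chebyshev as C

def compress(data, block_size=64, degree=7):
    """Fit a short Chebyshev series to each block; len(data) is assumed divisible by block_size."""
    x = np.linspace(-1.0, 1.0, block_size)            # each block is its own fitting interval
    return [C.chebfit(x, block, degree) for block in data.reshape(-1, block_size)]

def decompress(coeff_blocks, block_size=64):
    x = np.linspace(-1.0, 1.0, block_size)
    return np.concatenate([C.chebval(x, c) for c in coeff_blocks])

t = np.linspace(0, 10, 1024)
signal = np.sin(2 * np.pi * 0.7 * t) + 0.3 * np.cos(2 * np.pi * 2.1 * t)
coeffs = compress(signal)
restored = decompress(coeffs)
print("max abs error:", np.abs(signal - restored).max())              # lossy, but small
print("compression ratio:", signal.size / sum(len(c) for c in coeffs))
```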

  3. Time Series Analysis by State Space Methods

    Science.gov (United States)

    Durbin, James; Koopman, Siem Jan

    2001-08-01

    Providing analyses from both classical and Bayesian perspectives, this book presents a comprehensive treatment of the state space approach to time series analysis. The distinguishing feature of state space time models is that observations are regarded as made up of distinct components such as trend, seasonal, regression elements and disturbance terms, each of which is modelled separately. The techniques that emerge from this approach are very flexible and are capable of handling a much wider range of problems than the main analytical system currently in use for time series analysis, the Box-Jenkins ARIMA system.Visit the authors' website for supplementary materials - download programs, data and find further information: www.ssfpack.com/dkbook/

  4. Statistical analysis of hydroclimatic time series: Uncertainty and insights

    Science.gov (United States)

    Koutsoyiannis, Demetris; Montanari, Alberto

    2007-05-01

    Today, hydrologic research and modeling depends largely on climatological inputs, whose physical and statistical behavior are the subject of many debates in the scientific community. A relevant ongoing discussion is focused on long-term persistence (LTP), a natural behavior identified in several studies of instrumental and proxy hydroclimatic time series, which, nevertheless, is neglected in some climatological studies. LTP may reflect a long-term variability of several factors and thus can support a more complete physical understanding and uncertainty characterization of climate. The implications of LTP in hydroclimatic research, especially in statistical questions and problems, may be substantial but appear to be not fully understood or recognized. To offer insights on these implications, we demonstrate by using analytical methods that the characteristics of temperature series, which appear to be compatible with the LTP hypothesis, imply a dramatic increase of uncertainty in statistical estimation and reduction of significance in statistical testing, in comparison with classical statistics. Therefore we maintain that statistical analysis in hydroclimatic research should be revisited in order not to derive misleading results and simultaneously that merely statistical arguments do not suffice to verify or falsify the LTP (or another) climatic hypothesis.

  5. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate ... between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.

  6. Spartan random processes in time series modeling

    Science.gov (United States)

    Žukovič, M.; Hristopulos, D. T.

    2008-06-01

    A Spartan random process (SRP) is used to estimate the correlation structure of time series and to predict (interpolate and extrapolate) the data values. SRPs are motivated from statistical physics, and they can be viewed as Ginzburg-Landau models. The temporal correlations of the SRP are modeled in terms of ‘interactions’ between the field values. Model parameter inference employs the computationally fast modified method of moments, which is based on matching sample energy moments with the respective stochastic constraints. The parameters thus inferred are then compared with those obtained by means of the maximum likelihood method. The performance of the Spartan predictor (SP) is investigated using real time series of the quarterly S&P 500 index. SP prediction errors are compared with those of the Kolmogorov-Wiener predictor. Two predictors, one of which is explicit, are derived and used for extrapolation. The performance of the predictors is similarly evaluated.

  7. Aggregated Indexing of Biomedical Time Series Data.

    Science.gov (United States)

    Woodbridge, Jonathan; Mortazavi, Bobak; Sarrafzadeh, Majid; Bui, Alex A T

    2012-09-01

    Remote and wearable medical sensing has the potential to create very large and high dimensional datasets. Medical time series databases must be able to efficiently store, index, and mine these datasets to enable medical professionals to effectively analyze data collected from their patients. Conventional high dimensional indexing methods are a two stage process. First, a superset of the true matches is efficiently extracted from the database. Second, supersets are pruned by comparing each of their objects to the query object and rejecting any objects falling outside a predetermined radius. This pruning stage heavily dominates the computational complexity of most conventional search algorithms. Therefore, indexing algorithms can be significantly improved by reducing the amount of pruning. This paper presents an online algorithm to aggregate biomedical time series data to significantly reduce the search space (index size) without compromising the quality of search results. The algorithm is built on the observation that biomedical time series signals are composed of cyclical and often similar patterns. It takes in a stream of segments and groups them into highly concentrated collections. Locality Sensitive Hashing (LSH) is used to reduce the overall complexity of the algorithm, allowing it to run online. The output of this aggregation is used to populate an index. The proposed algorithm yields logarithmic growth of the index (with respect to the total number of objects) while keeping sensitivity and specificity simultaneously above 98%. Both memory and runtime complexities of time series search are improved when using aggregated indexes. In addition, data mining tasks, such as clustering, exhibit runtimes that are orders of magnitude faster when run on aggregated indexes.

  8. Revisiting algorithms for generating surrogate time series

    CERN Document Server

    Raeth, C; Papadakis, I E; Brinkmann, W

    2011-01-01

    The method of surrogates is one of the key concepts of nonlinear data analysis. Here, we demonstrate that commonly used algorithms for generating surrogates often fail to generate truly linear time series. Rather, they create surrogate realizations with Fourier phase correlations leading to non-detections of nonlinearities. We argue that reliable surrogates can only be generated, if one tests separately for static and dynamic nonlinearities.
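
For context, the sketch below generates the standard Fourier-transform ("phase-randomized") surrogate that this line of work scrutinizes: the amplitude spectrum of the original series is preserved while the Fourier phases are drawn at random. It is the textbook construction, shown only to make the abstract concrete; it is not the improved procedure the paper argues for.

```python
# Minimal sketch of the classical phase-randomized surrogate (illustrative only).
import numpy as np

def ft_surrogate(x, rng=None):
    rng = np.random.default_rng(rng)
    n = len(x)
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    phases[0] = 0.0                      # keep the zero-frequency (mean) term real
    if n % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist term real for even-length series
    randomized = np.abs(spectrum) * np.exp(1j * phases)
    return np.fft.irfft(randomized, n)

x = np.sin(np.linspace(0, 20 * np.pi, 1000)) ** 3   # a nonlinear transform of a sine
s = ft_surrogate(x, rng=42)
# The surrogate shares the amplitude spectrum of the original by construction:
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```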

  9. Analysis of Polyphonic Musical Time Series

    Science.gov (United States)

    Sommer, Katrin; Weihs, Claus

    A general model for pitch tracking of polyphonic musical time series will be introduced. Based on a model of Davy and Godsill (Bayesian harmonic models for musical pitch estimation and analysis, Technical Report 431, Cambridge University Engineering Department, 2002), the different pitches of the musical sound are estimated simultaneously with MCMC methods. Additionally a preprocessing step is designed to improve the estimation of the fundamental frequencies (A comparative study on polyphonic musical time series using MCMC methods. In C. Preisach et al., editors, Data Analysis, Machine Learning, and Applications, Springer, Berlin, 2008). The preprocessing step compares real audio data with an alphabet constructed from the McGill Master Samples (Opolko and Wapnick, McGill University Master Samples [Compact disc], McGill University, Montreal, 1987), which consists of tones of different instruments. The tones with minimal Itakura-Saito distortion (Gray et al., Transactions on Acoustics, Speech, and Signal Processing ASSP-28(4):367-376, 1980) are chosen as first estimates and as starting points for the MCMC algorithms. Furthermore, the implementation of the alphabet is an approach to recognizing the instruments generating the musical time series. Results are presented for mixed monophonic data from McGill and for self-recorded polyphonic audio data.

  10. Interpretable Categorization of Heterogeneous Time Series Data

    Science.gov (United States)

    Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua

    2017-01-01

    We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.

  11. Normalizing the causality between time series

    CERN Document Server

    Liang, X San

    2015-01-01

    Recently, a rigorous yet concise formula has been derived to evaluate the information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing three types of fundamental mechanisms that govern the marginal entropy change of the flow recipient. A normalized or relative flow measures its importance relative to other mechanisms. In analyzing realistic series, both absolute and relative information flows need to be taken into account, since the normalizers for a pair of reverse flows belong to two different entropy balances; it is quite normal that two identical flows may differ a lot in relative importance in their respective balances. We have reproduced these results with several autoregressive models. We have also shown applications to a climate change problem and a financial analysis problem. For the former, reconfirmed is the role of the Indian Ocean Dipole as ...

  12. Forgery attack on one-time proxy signature and the improvement

    Science.gov (United States)

    Yang, Chun-Wei; Luo, Yi-Ping; Hwang, Tzonelih

    2014-09-01

    This paper points out that in Wang and Wei's scheme (Quantum Inf Process 11:455-463, 2012), an eavesdropper, Eve, can replace the original message of a proxy signature with a forged one of her choice without being detected by the verifier. Accordingly, one of the security requirements of a quantum signature, i.e., unforgeability, may not be satisfied in their scheme. An improvement is given to avoid this attack, and the comparisons with the existing quantum proxy signature are also demonstrated.

  13. Transformation-cost time-series method for analyzing irregularly sampled data

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequent analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degrading the quality of the data set. Instead of using interpolation, we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations, each with an associated cost, to transform the time series segments, we obtain a new time series: the transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allow us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
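
A heavily simplified sketch of the transformation-cost idea: the irregularly sampled record is cut into fixed-duration windows, and the "cost" of turning each window into the next becomes one value of a new, regularly sampled series. The cost function below (a point-count penalty plus summed amplitude differences) is an illustrative stand-in, not the operation set actually defined by TACTS.

```python
# Minimal sketch (assumptions: simplified cost function, illustrative window length).
import numpy as np

def segment_cost(seg_a, seg_b, point_penalty=1.0):
    """Cost of transforming one segment into the next: amplitude shifts of matched
    points plus a penalty for every point that must be added or deleted."""
    a, b = np.asarray(seg_a, float), np.asarray(seg_b, float)
    m = min(len(a), len(b))
    amplitude_cost = np.abs(a[:m] - b[:m]).sum()
    size_cost = point_penalty * abs(len(a) - len(b))
    return amplitude_cost + size_cost

def cost_series(times, values, window=5.0):
    """Regularly sampled transformation-cost series from an irregularly sampled record."""
    times, values = np.asarray(times, float), np.asarray(values, float)
    edges = np.arange(times.min(), times.max() + window, window)
    segments = [values[(times >= lo) & (times < hi)] for lo, hi in zip(edges[:-1], edges[1:])]
    return np.array([segment_cost(s0, s1) for s0, s1 in zip(segments[:-1], segments[1:])])

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 200, 400))          # irregular sampling times
x = np.sin(0.3 * t) + rng.normal(0, 0.2, 400)
print(cost_series(t, x, window=10.0)[:5])
```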

  14. A nonorthogonal spectral analysis of time series

    Science.gov (United States)

    Anufriev, A.; Bchvarov, I.

    A method of nonorthogonal spectral analysis of time series (applicable in the study of geomagnetism) is developed which can be used to find the true period and to exclude the incorrect maxima which occur in Fourier analysis. The method is based on the reversal of a matrix which connects the oscillation amplitudes with the Fourier images, and the spectrum is determined by a numerical iteration technique. The correctness of the solution is tested by amplitude annulment at frequencies which are absent from the spectrum.

  15. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  16. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    We present the architecture of a “useful pattern” mining system that is capable of detecting thousands of different candlestick sequence patterns at the tick or any higher granularity level. The system architecture is highly distributed and performs most of its highly compute-intensive aggregation calculations as complex but efficient distributed SQL queries on the relational databases that store the time-series. We present initial results from mining all frequent candlestick sequences with the characteristic property that when they occur then, with an average probability of at least 60%, they signal a 2...

  17. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but an important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather an estimate of some of the properties is sufficient enough to launch the attack. We, in this paper show that even if the network structure at a future time point is not available one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To our aim, we consider eight properties such as number of active nodes, average degree, clustering coefficient etc. and apply our prediction framework on them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on an average across all datasets. As an application we show how such prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue

  18. Anomaly on Superspace of Time Series Data

    Science.gov (United States)

    Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin

    2017-11-01

    We apply the G-theory and anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over 14 extradimensions of the new G-theory compared with the M-theory of the 11 dimensions model of Edward Witten. We found that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand as the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group with the negative dimensions underlying 14 major hidden factors influencing the market is zero.

  19. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time series based feature extraction method and dynamic time warping as the matching method. The system was built and tested on 900 signatures belonging to 50 participants: 3 signatures per participant as references, and 5 signatures each from the original user, from simple imposters, and from trained imposters as test signatures. The final system was tested with 50 participants and 3 references each. The tests show that system accuracy without imposters is 90.44897959% at threshold 44, with rejection errors (FNMR) of 5.2% and acceptance errors (FMR) of 4.35102%; with imposters, system accuracy is 80.1361% at threshold 27, with rejection errors (FNMR) of 15.6% and acceptance errors (average FMR) of 4.263946%, with details as follows: acceptance errors 0.391837%, acceptance errors for simple imposters 3.2%, and acceptance errors for trained imposters 9.2%.
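
A minimal sketch of the matching step described above: classical dynamic time warping compares a test feature sequence against stored references, and the claim is accepted when the best distance falls below a tuned threshold. The toy sequences, threshold value, and decision rule are illustrative assumptions standing in for the paper's actual feature extraction and score scale.

```python
# Minimal sketch (assumptions: toy sequences, illustrative threshold).
import numpy as np

def dtw_distance(a, b):
    """Classical O(n*m) dynamic time warping distance between two 1-D sequences."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def verify(test_seq, reference_seqs, threshold=5.0):
    """Accept the claimed identity if the best DTW distance to any reference is below threshold."""
    return min(dtw_distance(test_seq, ref) for ref in reference_seqs) < threshold

references = [[0, 1, 3, 2, 1], [0, 1, 2.5, 2, 1]]
print(verify([0, 1, 1, 3, 2, 1], references))   # genuine-looking attempt -> True
print(verify([5, 5, 5, 5, 5], references))      # forged-looking attempt  -> False
```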

  20. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
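
The Holt-Winters step described above can be sketched with statsmodels' additive exponential smoothing: fit on a historical baseline, forecast the monitoring window, and keep the residuals as algorithmic input for the detector. The synthetic day-of-week pattern, seasonal period, and train/test split are illustrative assumptions, not the study's syndromic data or tuning.

```python
# Minimal sketch (assumptions: synthetic counts, weekly season, illustrative split).
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
days = pd.date_range("2006-01-01", periods=400, freq="D")
counts = 50 + 8 * np.sin(2 * np.pi * np.arange(400) / 7) + rng.poisson(5, 400)  # day-of-week effect
series = pd.Series(counts.astype(float), index=days)

train, test = series[:-14], series[-14:]
model = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=7).fit()
residuals = test - model.forecast(len(test))      # forecast errors fed to the control chart
print(residuals.round(1))
```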

  1. Entropy of geoelectrical time series in the natural time domain

    Directory of Open Access Journals (Sweden)

    A. Ramírez-Rojas

    2011-01-01

    Full Text Available Seismic electric signals (SES) have been considered precursors of strong earthquakes, and, recently, their dynamics have been investigated within the Natural Time Domain (NTD) (Varotsos et al., 2004). In this paper we apply the NTD approach and chaotic map signal analysis to two geoelectric time series recorded in a seismically very active area of Mexico, where two strong earthquakes, M=6.6 and M=7.4, occurred on 24 October 1993 and 14 September 1995, respectively. The low frequency geoelectric signals measured display periods with dichotomic behavior. Our findings point to an increase of the correlation degree of the geoelectric signals before the occurrence of strong earthquakes; furthermore, the power spectrum and entropy in NTD are in good agreement with the results published in the literature. Our results were validated by the analysis of a simulated chaotic map time series, which revealed the typical characteristics of artificial noise.

  2. Periodograms for multiband astronomical time series

    Science.gov (United States)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data, developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.

  3. Palmprint Verification Using Time Series Method

    Directory of Open Access Journals (Sweden)

    A. A. Ketut Agung Cahyawan Wiranatha

    2013-11-01

    Full Text Available The use of biometrics as an automatic recognition system is growing rapidly in solving security problems, and palmprint is one of the biometric traits often used. This paper uses a two-step center-of-mass moment method for region of interest (ROI) segmentation and applies the time series method combined with a block window method as the feature representation. Normalized Euclidean distance is used to measure the degree of similarity between two palmprint feature vectors. System testing is done using 500 palm samples, with 4 samples as reference images and 6 samples as test images. Experiment results show that this system can achieve a high performance with a success rate of about 97.33% (FNMR=1.67%, FMR=1.00%, T=0.036).

  4. Reconstructing complex networks without time series

    Science.gov (United States)

    Ma, Chuang; Zhang, Hai-Feng; Lai, Ying-Cheng

    2017-08-01

    In the real world there are situations where the network dynamics are transient (e.g., various spreading processes) and the final nodal states represent the available data. Can the network topology be reconstructed based on data that are not time series? Assuming that an ensemble of the final nodal states resulting from statistically independent initial triggers (signals) of the spreading dynamics is available, we develop a maximum likelihood estimation-based framework to accurately infer the interaction topology. For dynamical processes that result in a binary final state, the framework enables network reconstruction based solely on the final nodal states. Additional information, such as the first arrival time of each signal at each node, can improve the reconstruction accuracy. For processes with a uniform final state, the first arrival times can be exploited to reconstruct the network. We derive a mathematical theory for our framework and validate its performance and robustness using various combinations of spreading dynamics and real-world network topologies.

  5. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    VanderPlas, Jacob T. [eScience Institute, University of Washington, Seattle, WA (United States); Ivezic, Željko [Department of Astronomy, University of Washington, Seattle, WA (United States)

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.
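
For orientation, the snippet below runs the classical single-band Lomb-Scargle periodogram (via astropy) on a simulated irregularly sampled light curve; the multiband, Tikhonov-regularized extension described above is what the authors' GitHub code adds on top of this baseline. The simulated sampling, period, and uncertainties are illustrative assumptions.

```python
# Minimal sketch of the single-band Lomb-Scargle baseline (illustrative data).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 180, 120))               # ~6 months of random sampling (days)
true_period = 0.61                                   # days, a typical RR Lyrae scale
dy = 0.05 * np.ones_like(t)                          # heteroscedastic errors could go here
y = 0.4 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, dy)

frequency, power = LombScargle(t, y, dy).autopower(maximum_frequency=5.0)
print("recovered period (days):", 1.0 / frequency[np.argmax(power)])
```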

  6. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter) based model in a vector setting (with least-squares constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis, solely in terms of the observed data. Prediction is demonstrated for time series obtained from (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the Physionet online repository. These examples demonstrate the efficiency of the prediction model, and numerical examples for exemplary cases are provided.

  7. Timing calibration and spectral cleaning of LOFAR time series data

    Science.gov (United States)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.

  8. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
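
A minimal sketch of the detection idea: model expected daily visit counts from history, forecast the monitoring window, and flag days that exceed the forecast by more than a few forecast standard errors. The synthetic counts, the ARIMA order, and the 3-sigma rule are illustrative assumptions, not the paper's trimmed-mean seasonal plus ARIMA-residual models.

```python
# Minimal sketch (assumptions: synthetic visit counts, illustrative ARIMA order and rule).
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
days = pd.date_range("2000-01-01", periods=730, freq="D")
baseline = 120 + 15 * np.sin(2 * np.pi * np.arange(730) / 365.25)   # slow seasonal pattern
visits = pd.Series(rng.poisson(baseline).astype(float), index=days)
visits.iloc[-5:] += 80                                               # simulated 5-day outbreak

train, monitor = visits[:-14], visits[-14:]
fit = ARIMA(train, order=(2, 0, 1)).fit()
pred = fit.get_forecast(len(monitor))
alarm = monitor > pred.predicted_mean + 3 * pred.se_mean             # flag unusually high days
print(monitor[alarm])
```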

  9. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  10. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  11. Applications of nonlinear time-series analysis

    Science.gov (United States)

    Nichols, Jonathan Michael

    In this work, new applications in chaos theory and nonlinear time-series analysis are explored. Tools for attractor-based analysis are developed along with a complete description of invariant measures. The focus is on the computation of dimension and Lyapunov spectra from a single time-history for the purposes of system identification. The need for accurate attractor reconstruction is stressed as it may have severe effects on the quality of estimated invariants and of attractor-based predictions. These tools are then placed in the context of several different problems of importance to the engineering community. Dimension and Lyapunov spectra are used to indicate the operating regime of a nonlinear mechanical oscillator. Subtle changes to the way in which the oscillator is forced may give rise to a response with different state space characteristics. These differences are clearly discernible using invariant measures yet are undetectable using linear-based techniques. A state space approach is also used to extract damping estimates from the oscillator by means of the complete Lyapunov spectrum. The sum of the exponents may be thought of as the average divergence of the system which will, for a viscous damping model, provide quantitative information about the coefficient of viscous damping. The notion of chaotic excitation of a linear system is also explored. A linear structure subject to chaotic excitation will effectively act as a filter. The resulting dynamical interaction gives rise to response (filtered) attractors which possess information about the linear system. Differences in the geometric properties of the filtered attractors are used to detect damage in structures. These attractor-based statistics are shown to be more robust indicators of damage than linear-based statistics (e.g. mode shapes, frequencies, etc.). The same procedure is also used to estimate the coefficient of viscous damping for a multi-degree-of-freedom linear structure.

  12. Old and New Spectral Techniques for Economic Time Series

    OpenAIRE

    Sella Lisa

    2008-01-01

    This methodological paper reviews different spectral techniques well suited to the analysis of economic time series. While econometric time series analysis is generally carried out in the time domain, these techniques offer a complementary approach based on the frequency domain. Spectral decomposition and time series reconstruction provide a precise quantitative and formal description of the main oscillatory components of a series: thus, it is possible to formally identify trends, low-frequenc...

  13. Transmission of linear regression patterns between time series: From relationship in time series to complex networks

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
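
A minimal sketch of the algorithm described above: a window slides over two related series, the regression slope and its significance inside each window define a coarse "pattern" label, and transitions between the labels of consecutive windows become the edges of a directed, weighted network. The window length, the slope bins, and the 5% significance level are illustrative choices, much coarser than the interval scheme in the paper.

```python
# Minimal sketch (assumptions: coarse pattern labels, illustrative window and data).
import numpy as np
import networkx as nx
from scipy.stats import linregress

def pattern_label(x, y, alpha=0.05):
    fit = linregress(x, y)
    direction = "up" if fit.slope >= 0 else "down"
    significant = "sig" if fit.pvalue < alpha else "nonsig"
    return f"{direction}/{significant}"

def transmission_network(x, y, window=30):
    labels = [pattern_label(x[i:i + window], y[i:i + window])
              for i in range(len(x) - window + 1)]
    g = nx.DiGraph()
    for a, b in zip(labels[:-1], labels[1:]):
        if g.has_edge(a, b):
            g[a][b]["weight"] += 1          # edge weight = transmission frequency
        else:
            g.add_edge(a, b, weight=1)
    return g

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(size=500))
y = 0.6 * x + np.cumsum(rng.normal(size=500))
g = transmission_network(x, y)
print(sorted(g.edges(data="weight"), key=lambda e: -e[2]))
```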

  14. Time Series Observations in the North Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Shenoy, D.M.; Naik, H.; Kurian, S.; Naqvi, S.W.A.; Khare, N.

    Ocean and the ongoing time series study (Candolim Time Series; CaTS) off Goa. In addition, this article also focuses on the new time series initiative in the Arabian Sea and the Bay of Bengal under Sustained Indian Ocean Biogeochemistry and Ecosystem...

  15. Intercomparison of six Mediterranean zooplankton time series

    Science.gov (United States)

    Berline, Léo; Siokou-Frangou, Ioanna; Marasović, Ivona; Vidjak, Olja; Fernández de Puelles, M.a. Luz; Mazzocchi, Maria Grazia; Assimakopoulou, Georgia; Zervoudaki, Soultana; Fonda-Umani, Serena; Conversi, Alessandra; Garcia-Comas, Carmen; Ibanez, Frédéric; Gasparini, Stéphane; Stemmann, Lars; Gorsky, Gabriel

    2012-05-01

    We analyzed and compared Mediterranean mesozooplankton time series spanning 1957-2006 from six coastal stations in the Balearic, Ligurian, Tyrrhenian, North and Middle Adriatic and Aegean Sea. Our analysis focused on fluctuations of major zooplankton taxonomic groups and their relation with environmental and climatic variability. Average seasonal cycles and interannual trends were derived. Stations spanned a large range of trophic status from oligotrophic to moderately eutrophic. Intra-station analyses showed (1) coherent multi-taxa trends off Villefranche sur mer that diverge from the previous results found at species level, (2) in Baleares, covariation of zooplankton and water masses as a consequence of the boundary hydrographic regime in the middle Western Mediterranean, (3) decrease in trophic status and abundance of some taxonomic groups off Naples, and (4) off Athens, an increase of zooplankton abundance and decrease in chlorophyll possibly caused by reduction of anthropogenic nutrient input, increase of microbial components, and more efficient grazing control on phytoplankton. (5) At basin scale, the analysis of temperature revealed significant positive correlations between Villefranche, Trieste and Naples for annual and/or winter average, and synchronous abrupt cooling and warming events centered in 1987 at the same three sites. After correction for multiple comparisons, we found no significant correlations between climate indices and local temperature or zooplankton abundance, nor between stations for zooplankton abundance, therefore we suggest that for these coastal stations local drivers (climatic, anthropogenic) are dominant and that the link between local and larger scale of climate should be investigated further if we are to understand zooplankton fluctuations.

  16. Generalized Framework for Similarity Measure of Time Series

    Directory of Open Access Journals (Sweden)

    Hongsheng Yin

    2014-01-01

    Full Text Available Currently, there is no definitive and uniform description for the similarity of time series, which results in difficulties for relevant research on this topic. In this paper, we propose a generalized framework to measure the similarity of time series. In this generalized framework, whether the time series is univariable or multivariable, and linear transformed or nonlinear transformed, the similarity of time series is uniformly defined using norms of vectors or matrices. The definitions of the similarity of time series in the original space and the transformed space are proved to be equivalent. Furthermore, we also extend the theory on similarity of univariable time series to multivariable time series. We present some experimental results on published time series datasets tested with the proposed similarity measure function of time series. Through the proofs and experiments, it can be claimed that the similarity measure functions of linear multivariable time series based on the norm distance of covariance matrix and nonlinear multivariable time series based on kernel function are reasonable and practical.
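
As a small illustration of the norm-based view described above: for univariable series the dissimilarity can be a vector norm of the difference, and for multivariable series a matrix norm of the difference of covariance matrices can play the same role. The specific norms shown (Euclidean and Frobenius) are illustrative choices within that generalized framework, not the paper's full definitions.

```python
# Minimal sketch (assumptions: illustrative norm choices and toy data).
import numpy as np

def univariate_distance(x, y):
    """Euclidean norm of the pointwise difference between two equal-length series."""
    return np.linalg.norm(np.asarray(x, float) - np.asarray(y, float))

def multivariate_distance(X, Y):
    """Frobenius norm of the difference of covariance matrices (X, Y: shape (time, variables))."""
    return np.linalg.norm(np.cov(X, rowvar=False) - np.cov(Y, rowvar=False), ord="fro")

rng = np.random.default_rng(5)
a = rng.normal(size=(200, 3))
b = a @ np.diag([1.0, 1.1, 0.9]) + rng.normal(scale=0.1, size=(200, 3))
print(univariate_distance(a[:, 0], b[:, 0]), multivariate_distance(a, b))
```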

  17. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  18. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is that it links different methods for time series analysis, connects traditional data mining tools with time series, and allows new algorithms for analyzing time series to be constructed. It is worth mentioning that the REFII model is not a closed system, that is, it is not restricted to a fixed set of methods. At its core, it is a model for transforming the values of a time series, which prepares data used by different sets of methods based on the same transformation model in a given problem domain. The REFII model offers a new approach to time series analysis based on a unique model of transformation, which serves as a basis for all kinds of time series analysis. A further advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  19. Indian monsoon variability at different time scales: Marine and terrestrial proxy records

    Digital Repository Service at National Institute of Oceanography (India)

    Patnaik, R.; Gupta, A.K.; Naidu, P.D.; Yadav, R.R.; Bhattacharyya, A.; Kumar, M.


  20. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses a stochastic automaton to predict the most probable structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  1. Estimating the Persistence and the Autocorrelation Function of a Time Series that is Measured with Error

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    2014-01-01

    An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized volatility measures that are imperfect estimates of actual volatility. In an empirical analysis using realized measures for the Dow Jones industrial average stocks, we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our application despite...

  2. Estimating the Persistence and the Autocorrelation Function of a Time Series that is Measured with Error

    DEFF Research Database (Denmark)

    Hansen, Peter Reinhard; Lunde, Asger

    An economic time series can often be viewed as a noisy proxy for an underlying economic variable. Measurement errors will influence the dynamic properties of the observed process and may conceal the persistence of the underlying time series. In this paper we develop instrumental variable (IV) methods for extracting information about the latent process. Our framework can be used to estimate the autocorrelation function of the latent volatility process and a key persistence parameter. Our analysis is motivated by the recent literature on realized (volatility) measures, such as the realized variance, that are imperfect estimates of actual volatility. In an empirical analysis using realized measures for the DJIA stocks we find the underlying volatility to be near unit root in all cases. Although standard unit root tests are asymptotically justified, we find them to be misleading in our...

  3. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    Science.gov (United States)

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
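
    A small, self-contained numpy sketch of the wavelet idea mentioned above: a complex Morlet transform whose power ridge tracks how the dominant period of a drifting circadian-like rhythm changes over time. The scale-to-period conversion is the usual approximation for a Morlet with w0 = 6, and all names and parameter values are illustrative assumptions rather than the article's own analysis.

    ```python
    import numpy as np

    def morlet_power(x, dt, periods, w0=6.0):
        """Continuous wavelet power of series x (sampled every dt) at the given
        periods, using a complex Morlet mother wavelet; returns (len(periods), len(x))."""
        x = np.asarray(x, dtype=float) - np.mean(x)
        power = np.empty((len(periods), len(x)))
        for i, p in enumerate(periods):
            s = p * w0 / (2 * np.pi)                 # approximate scale for Fourier period p
            t = np.arange(-4 * s, 4 * s + dt, dt)    # wavelet support of +/- 4 scales
            wavelet = np.exp(1j * w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
            conv = np.convolve(x, np.conj(wavelet[::-1]), mode="same") * dt
            power[i] = np.abs(conv) ** 2
        return power

    # A rhythm whose period drifts from 24 h to 26 h over 20 days, sampled every 0.5 h
    dt, days = 0.5, 20
    t = np.arange(0, days * 24, dt)
    period_true = 24 + 2 * t / t[-1]
    x = np.sin(2 * np.pi * np.cumsum(dt / period_true))
    x += 0.3 * np.random.default_rng(0).standard_normal(t.size)
    periods = np.linspace(20, 30, 41)
    ridge = periods[morlet_power(x, dt, periods).argmax(axis=0)]
    # dominant period early vs. late in the record (the drift toward 26 h should show)
    print(ridge[240], ridge[-240])
    ```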

  4. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) . The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  5. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This process revealed that, except for tetanus, the malaria, URTI, pneumonia and anaemia series are interrelated. Hence, the four interrelated time series were considered in the multivariate analysis. Order selection criteria were employed to determine the order of the vector autoregressive (VAR) model to be fitted to these series.

  6. A novel weight determination method for time series data aggregation

    Science.gov (United States)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced into the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
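
    A rough sketch, under stated assumptions, of the two ingredients the abstract combines: exponential time-decay weights (standing in for the IOWA step) and weights proportional to node degrees in the natural visibility graph (the VGA step), linearly mixed with a parameter lam. The paper's exact operators and parameterization are not reproduced; all names and values are illustrative.

    ```python
    import numpy as np

    def visibility_degrees(y):
        """Degree of each point in the natural visibility graph of series y."""
        n = len(y)
        deg = np.zeros(n, dtype=int)
        for i in range(n):
            for j in range(i + 1, n):
                ks = np.arange(i + 1, j)
                # i and j are mutually visible if no intermediate point blocks the line
                if ks.size == 0 or np.all(y[ks] < y[j] + (y[i] - y[j]) * (j - ks) / (j - i)):
                    deg[i] += 1
                    deg[j] += 1
        return deg

    def combined_weights(y, decay=0.9, lam=0.5):
        """Linear combination of time-decay weights (older points count less) and
        visibility-graph-degree weights (structurally important points count more)."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        w_decay = decay ** np.arange(n - 1, -1, -1)   # most recent point gets weight 1
        w_decay = w_decay / w_decay.sum()
        deg = visibility_degrees(y)
        w_graph = deg / deg.sum()
        return lam * w_decay + (1 - lam) * w_graph

    y = np.array([3.0, 1.0, 4.0, 1.5, 5.0, 9.0, 2.0, 6.0])
    w = combined_weights(y)
    print(w, float(np.dot(w, y)))   # weights and the resulting aggregated value
    ```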

  7. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge ...

  8. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  9. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This article concerns stochastic time series analysis in hydrology with an emphasis on seasonality, and the different statistical tests for predicting hydrological time series with the Thomas-Fiering model. Hydrological time series of flood flows have received a great deal of attention worldwide, and interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrological flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes predicting seasonal periods in hydrology using the Thomas-Fiering model.
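
    A minimal numpy sketch of the lag-one Thomas-Fiering monthly synthesis scheme the record refers to, assuming an untransformed formulation (no log or skewness correction) and a historical matrix of year-by-month flows; variable names and the toy data are illustrative.

    ```python
    import numpy as np

    def thomas_fiering(monthly, n_years, seed=0):
        """Generate synthetic monthly flows with the lag-one Thomas-Fiering model.

        monthly : array of shape (years, 12) with historical flows.
        Returns an (n_years, 12) array of synthetic flows.
        """
        rng = np.random.default_rng(seed)
        monthly = np.asarray(monthly, dtype=float)
        mean = monthly.mean(axis=0)                    # monthly means
        std = monthly.std(axis=0, ddof=1)              # monthly standard deviations
        # lag-one correlation between month j and month j+1 (December pairs with the
        # same year's January here, a simplification of the usual year-crossing lag)
        r = np.array([np.corrcoef(monthly[:, j], monthly[:, (j + 1) % 12])[0, 1]
                      for j in range(12)])

        out = np.empty((n_years, 12))
        prev = mean[-1]                                # seed the recursion at the December mean
        for y in range(n_years):
            for j in range(12):
                pj = (j - 1) % 12                      # previous calendar month
                b = r[pj] * std[j] / std[pj]           # regression slope, month pj -> j
                eps = rng.standard_normal()
                q = mean[j] + b * (prev - mean[pj]) + eps * std[j] * np.sqrt(1.0 - r[pj] ** 2)
                prev = max(q, 0.0)                     # flows cannot be negative
                out[y, j] = prev
        return out

    # Example with a small synthetic "historical" record (30 years x 12 months)
    hist = np.abs(np.random.default_rng(1).normal(100.0, 20.0, size=(30, 12)))
    synthetic = thomas_fiering(hist, n_years=5)
    print(synthetic.shape, synthetic.mean(axis=0).round(1))
    ```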

  10. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from the perspectives of both time and frequency, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data plays an important role in planning, risk assessment and decision making in environmental management. In this context, in this book we present various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, less studied till now. Part of the results are accompanied by their R code.

  11. Assessing Local Turbulence Strength from a Time Series

    Directory of Open Access Journals (Sweden)

    Mayer Humi

    2010-01-01

    Full Text Available We study the possible link between “local turbulence strength” in a flow which is represented by a finite time series and a “chaotic invariant”, namely, the leading Lyapunov exponent that characterizes this series. To validate a conjecture about this link, we analyze several time series of measurements taken by a plane flying at constant height in the upper troposphere. For each of these time series we estimate the leading Lyapunov exponent which we then correlate with the structure constants for the temperature. In addition, we introduce a quantitative technique to educe the scale contents of the flow and a methodology to validate its spectrum.

  12. SINOMA - A new iterative statistical approach for the identification of linear relationships between noisy time series

    Science.gov (United States)

    Thees, Barnim; Buras, Allan; Jetschke, Gottfried; Kutzbach, Lars; Zorita, Eduardo; Wilmking, Martin

    2014-05-01

    In paleoclimatology, reconstructions of environmental conditions play a significant role. Such reconstructions rely on the relationship between proxies (e.g. tree-rings, lake sediments) and the processes which are to be reconstructed (e.g. temperature, precipitation, solar activity). However, both of these variable types are in general noisy. For instance, ring-width is only a proxy for tree growth and is further determined by several other environmental signals (e.g. precipitation, length of growing season, competition). On the other hand, records of the process data that are to be reconstructed are mostly available for too short periods (too short in terms of calibration) at the particular site at which the proxy data have been sampled. The resulting 'spatial' noise (e.g. from using climate station data not situated at the proxy site) causes additional errors in the relationship between measured proxy data and available process data (e.g. Kutzbach et al., 2011). When deriving models from such noisy data, Thees et al. (2009) and Kutzbach et al. (2011) could show, among other things, that model slopes (the factor by which one variable is multiplied to predict the other variable) are in most cases misestimated, depending on the ratio of the variances of the respective variable noises. Despite these facts, many recent reconstructions are based on ordinary least squares regressions, which underestimate model slopes as they do not account for the noise in the predictor variable (Kutzbach et al., 2011). This is because as yet only few methodological approaches are available to treat noisy data in terms of modeling, and those methods need additional information (e.g. a good estimate of the error noise ratio) which is often impossible to acquire. Here we introduce the Sequential Iterative NOise Matching Algorithm - SINOMA - with which we are able to derive good estimates of model slopes between noisy time series. The mathematical background of SINOMA is described
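
    SINOMA itself is an iterative algorithm described in the cited papers and is not reproduced here; the short simulation below only illustrates the problem it addresses: with a noisy predictor, the ordinary least squares slope is attenuated, while a classical errors-in-variables (Deming) estimator that assumes a known noise-variance ratio recovers the true slope. All numbers are toy assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n, true_slope = 2000, 2.0
    signal = rng.standard_normal(n)                  # latent driver (e.g. temperature)
    x = signal + 0.8 * rng.standard_normal(n)        # noisy predictor (e.g. proxy record)
    y = true_slope * signal + 0.5 * rng.standard_normal(n)   # noisy target

    # Ordinary least squares: attenuated, because the predictor noise inflates var(x)
    ols_slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)

    # Errors-in-variables (Deming) correction with an assumed known noise-variance
    # ratio lam = var(noise_y) / var(noise_x), here taken from the simulation itself
    lam = 0.5 ** 2 / 0.8 ** 2
    sxx, syy = np.var(x, ddof=1), np.var(y, ddof=1)
    sxy = np.cov(x, y)[0, 1]
    eiv_slope = ((syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2))
                 / (2 * sxy))

    print(f"true {true_slope:.2f}  OLS {ols_slope:.2f}  errors-in-variables {eiv_slope:.2f}")
    ```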

  13. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    A percept that describes “a ball hits the wall” becomes false immediately after it occurs, whereas a percept that describes “a ball is in the box” remains true until the ball is removed; a timed percept marks the beginning of such an interval state.

  14. Time Series Econometrics for the 21st Century

    Science.gov (United States)

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  15. Time series analyses of mean monthly rainfall for drought ...

    African Journals Online (AJOL)

    This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the Nigeria Metrological Agency (NIMET), Sokoto Airport; Sokoto. The data was subjected to time series tests (trend, cycle, seasonal ...

  16. 461 TIME SERIES ANALYSES OF MEAN MONTHLY RAINFALL ...

    African Journals Online (AJOL)

    Osondu

    Abstract. This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the. Nigeria Metrological Agency (NIMET), Sokoto Airport; Sokoto. The data was subjected to time series tests (trend, cycle ...

  17. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities in the features of the time series for the (model earthquakes or) overlap of two Cantor sets when one set moves with uniform relative velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  18. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  19. Time series prediction of apple scab using meteorological ...

    African Journals Online (AJOL)

    A new prediction model for the early warning of apple scab is proposed in this study. The method is based on artificial intelligence and time series prediction. The infection period of apple scab was evaluated as the time series prediction model instead of summation of wetness duration. Also, the relations of different ...

  20. Prediction of nonlinear time series by kernel regression smoothing

    NARCIS (Netherlands)

    Borovkova, S; Burton, R; Dehling, H; Prochazka, A; Uhlir, J; Sovka, P

    1997-01-01

    We address the problem of prediction of nonlinear time series by kernel estimation of autoregression, and introduce a variation of this method. We apply this method to an experimental time series and compare its performance with predictions by feed-forward neural networks as well as with fitting a

  1. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...

  2. Time series forecasting based on deep extreme learning machine

    NARCIS (Netherlands)

    Guo, Xuqi; Pang, Y.; Yan, Gaowei; Qiao, Tiezhu; Yang, Guang-Hong; Yang, Dan

    2017-01-01

    Multi-layer Artificial Neural Networks (ANNs) have caught widespread attention as a new method for time series forecasting due to their ability to approximate any nonlinear function. In this paper, a new local time series prediction model is established with the nearest neighbor domain theory, in

  3. Effects of dating errors on nonparametric trend analyses of speleothem time series

    Directory of Open Access Journals (Sweden)

    M. Mudelsee

    2012-10-01

    Full Text Available A fundamental problem in paleoclimatology is to take fully into account the various error sources when examining proxy records with quantitative methods of statistical time series analysis. Records from dated climate archives such as speleothems add extra uncertainty from the age determination to the other sources, which consist of measurement and proxy errors. This paper examines three stalagmite time series of oxygen isotopic composition (δ18O) from two caves in western Germany, the series AH-1 from the Atta Cave and the series Bu1 and Bu4 from the Bunker Cave. These records carry regional information about past changes in winter precipitation and temperature. U/Th and radiocarbon dating reveal that they cover the later part of the Holocene, the past 8.6 thousand years (ka). We analyse centennial- to millennial-scale climate trends by means of nonparametric Gasser–Müller kernel regression. Error bands around fitted trend curves are determined by combining (1) block bootstrap resampling, to preserve noise properties (shape, autocorrelation) of the δ18O residuals, and (2) timescale simulations (models StalAge and iscam). The timescale error influences on centennial- to millennial-scale trend estimation are not excessively large. We find a "mid-Holocene climate double-swing", from warm to cold to warm winter conditions (6.5 ka to 6.0 ka to 5.1 ka), with warm–cold amplitudes of around 0.5‰ δ18O; this finding is documented by all three records with high confidence. We also quantify the Medieval Warm Period (MWP), the Little Ice Age (LIA) and the current warmth. Our analyses cannot unequivocally support the conclusion that current regional winter climate is warmer than that during the MWP.
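
    The Gasser–Müller estimator and the StalAge/iscam timescale simulations are not reproduced here; the sketch below only illustrates the residual block-bootstrap part of the recipe on an evenly spaced toy series: smooth, resample blocks of residuals to preserve their autocorrelation, re-smooth, and read off pointwise percentile bands. Function names, the Gaussian kernel smoother and all parameter values are assumptions for illustration.

    ```python
    import numpy as np

    def kernel_trend(t, y, bandwidth):
        """Gaussian-kernel (Nadaraya-Watson) smoother evaluated at the sample times."""
        w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
        return (w @ y) / w.sum(axis=1)

    def block_bootstrap_bands(t, y, bandwidth=150.0, block=20, n_boot=500, seed=0):
        """Pointwise 2.5/97.5% trend bands from a moving-block bootstrap of residuals."""
        rng = np.random.default_rng(seed)
        trend = kernel_trend(t, y, bandwidth)
        resid = y - trend
        n = len(y)
        starts = np.arange(n - block + 1)
        boots = np.empty((n_boot, n))
        for b in range(n_boot):
            # stitch random blocks of residuals to keep their short-range autocorrelation
            picks = rng.choice(starts, size=int(np.ceil(n / block)))
            r = np.concatenate([resid[s:s + block] for s in picks])[:n]
            boots[b] = kernel_trend(t, trend + r, bandwidth)
        lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)
        return trend, lo, hi

    # Toy example: a slow swing plus noise on an evenly spaced timescale
    t = np.linspace(0, 8600, 400)
    y = 0.3 * np.sin(2 * np.pi * t / 4000) + np.random.default_rng(3).normal(0, 0.2, t.size)
    trend, lo, hi = block_bootstrap_bands(t, y)
    print(trend[:3], lo[:3], hi[:3])
    ```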

  4. Stationary determinism in observed time series: the earth's surface temperature

    CERN Document Server

    Gutíerrez, R M

    1999-01-01

    In this work we address the feasibility of estimating and isolating the stationary and deterministic content of observational time series (Ots), which in general have very limited characteristics. In particular, we study the valuable earth's surface mean temperature time series (Tts) by applying several treatments intended to isolate the stationary and deterministic content. We give particular attention to the sensitivity of the results to the different parameters involved. The effects of such treatments were assessed by means of several methods designed to estimate the stationarity of time series. In order to strengthen the significance of the results obtained we have created a comparative framework with seven test time series of well-known origin and characteristics with a similarly small number of data points. We have obtained a greater understanding of the potential and limitations of the different methods when applied to real world time series. The study of the stationarity and deterministic content ...

  5. Effectiveness of Multivariate Time Series Classification Using Shapelets

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of signal features in feature space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series that does not require enhancement of the signal features. The method uses the shapelets of a time series (time series shapelets), i.e. small fragments of the series that best reflect the properties of one of its classes. Despite the significant number of publications on the theory and applications of shapelets for classification of time series, the task of evaluating the effectiveness of this technique remains relevant. An objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, which is a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the basic shapelet-based method of binary classification, as well as various generalizations and a proposed modification of the method. It also offers software that implements the modified method and results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and the accompanying software allow a classification accuracy of about 85% to be reached, at best. The shapelet search time increases in proportion to the input data dimension.

  6. Battery Grouping with Time Series Clustering Based on Affinity Propagation

    OpenAIRE

    Zhiwei He; Mingyu Gao; Guojin Ma; Yuanyuan Liu; Lijun Tang

    2016-01-01

    Battery grouping is a technology widely used to improve the performance of battery packs. In this paper, we propose a time series clustering based battery grouping method. The proposed method utilizes the whole battery charge/discharge sequence for battery grouping. The time sequences are first denoised with a wavelet denoising technique. The similarity matrix is then computed with the dynamic time warping distance, and finally the time series are clustered with the affinity propagation algorithm according to the calculated similarity matrices.

  7. Sunspot Time Series: Passive and Active Intervals

    Science.gov (United States)

    Zięba, S.; Nieckarz, Z.

    2014-07-01

    Solar activity slowly and irregularly decreases from the first spotless day (FSD) in the declining phase of the old sunspot cycle and systematically, but also in an irregular way, increases to the new cycle maximum after the last spotless day (LSD). The time interval between the first and the last spotless day can be called the passive interval (PI), while the time interval from the last spotless day to the first one after the new cycle maximum is the related active interval (AI). Minima of solar cycles are inside PIs, while maxima are inside AIs. In this article, we study the properties of passive and active intervals to determine the relation between them. We have found that some properties of PIs, and related AIs, differ significantly between two group of solar cycles; this has allowed us to classify Cycles 8 - 15 as passive cycles, and Cycles 17 - 23 as active ones. We conclude that the solar activity in the PI declining phase (a descending phase of the previous cycle) determines the strength of the approaching maximum in the case of active cycles, while the activity of the PI rising phase (a phase of the ongoing cycle early growth) determines the strength of passive cycles. This can have implications for solar dynamo models. Our approach indicates the important role of solar activity during the declining and the rising phases of the solar-cycle minimum.

  8. A thermodynamic geography: night-time satellite imagery as a proxy measure of emergy.

    Science.gov (United States)

    Coscieme, Luca; Pulselli, Federico M; Bastianoni, Simone; Elvidge, Christopher D; Anderson, Sharolyn; Sutton, Paul C

    2014-11-01

    Night-time satellite imagery enables the measurement, visualization, and mapping of energy consumption in an area. In this paper, an index of the "sum of lights" as observed by night-time satellite imagery within national boundaries is compared with the emergy of the nations. Emergy is a measure of the solar energy equivalent used, directly or indirectly, to support the processes that characterize the economic activity in a country. Emergy has renewable and non-renewable components. Our results show that the non-renewable component of national emergy use is positively correlated with night-time satellite imagery. This relationship can be used to produce emergy density maps which enable the incorporation of spatially explicit representations of emergy in geographic information systems. The region of Abruzzo (Italy) is used to demonstrate this relationship as a spatially disaggregate case.

  9. High performance biomedical time series indexes using salient segmentation.

    Science.gov (United States)

    Woodbridge, Jonathan; Mortazavi, Bobak; Bui, Alex A T; Sarrafzadeh, Majid

    2012-01-01

    The advent of remote and wearable medical sensing has created a dire need for efficient medical time series databases. Wearable medical sensing devices provide continuous patient monitoring by various types of sensors and have the potential to create massive amounts of data. Therefore, time series databases must utilize highly optimized indexes in order to efficiently search and analyze stored data. This paper presents a highly efficient technique for indexing medical time series signals using Locality Sensitive Hashing (LSH). Unlike previous work, only salient (or interesting) segments are inserted into the index. This technique reduces search times by up to 95% while yielding near identical search results.
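
    The paper's salient-segmentation step is not reproduced; the sketch below shows only the generic random-hyperplane LSH idea applied to fixed-length, z-normalised segments, so that near-identical segments hash to the same bucket. Class and parameter names are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from collections import defaultdict

    class SegmentLSH:
        """Random-hyperplane LSH index for fixed-length time-series segments."""

        def __init__(self, seg_len, n_bits=16, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.standard_normal((n_bits, seg_len))   # one hyperplane per bit
            self.buckets = defaultdict(list)

        def _key(self, seg):
            seg = (seg - seg.mean()) / (seg.std() + 1e-12)          # z-normalise the segment
            bits = (self.planes @ seg) > 0
            return bits.tobytes()

        def insert(self, seg_id, seg):
            self.buckets[self._key(np.asarray(seg, float))].append(seg_id)

        def query(self, seg):
            return self.buckets.get(self._key(np.asarray(seg, float)), [])

    # Index non-overlapping windows of a signal, then look up a slightly noisy copy
    rng = np.random.default_rng(1)
    signal = np.cumsum(rng.standard_normal(5000))
    L = 64
    index = SegmentLSH(seg_len=L)
    for start in range(0, len(signal) - L + 1, L):
        index.insert(start, signal[start:start + L])
    probe = signal[960:960 + L] + 0.01 * rng.standard_normal(L)
    print(index.query(probe))   # should contain 960 with high probability
    ```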

  10. Clustering Financial Time Series by Network Community Analysis

    Science.gov (United States)

    Piccardi, Carlo; Calatroni, Lisa; Bertoni, Fabio

    In this paper, we describe a method for clustering financial time series which is based on community analysis, a recently developed approach for partitioning the nodes of a network (graph). A network with N nodes is associated to the set of N time series. The weight of the link (i, j), which quantifies the similarity between the two corresponding time series, is defined according to a metric based on symbolic time series analysis, which has recently proved effective in the context of financial time series. Then, searching for network communities allows one to identify groups of nodes (and then time series) with strong similarity. A quantitative assessment of the significance of the obtained partition is also provided. The method is applied to two distinct case-studies concerning the US and Italy Stock Exchange, respectively. In the US case, the stability of the partitions over time is also thoroughly investigated. The results favorably compare with those obtained with the standard tools typically used for clustering financial time series, such as the minimal spanning tree and the hierarchical tree.

  11. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  12. Time series modeling of system self-assessment of survival

    Energy Technology Data Exchange (ETDEWEB)

    Lu, H.; Kolarik, W.J. [Texas Tech Univ., Lubbock, TX (United States). Dept. of Industrial Engineering

    1999-06-01

    Self-assessment of survival for a system, subsystem or component is implemented by assessing conditional performance reliability in real-time, which includes modeling and analysis of physical performance data. This paper proposes a time series analysis approach to system self-assessment (prediction) of survival. In the approach, physical performance data are modeled in a time series. The performance forecast is based on the model developed and is converted to the reliability of system survival. In contrast to a standard regression model, a time series model, using on-line data, is suitable for the real-time performance prediction. This paper illustrates an example of time series modeling and survival assessment, regarding an excessive tool edge wear failure mode for a twist drill operation.

  13. Time Series Decomposition into Oscillation Components and Phase Estimation.

    Science.gov (United States)

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished by this model like the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes' method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.

  14. Conditional time series forecasting with convolutional neural networks

    NARCIS (Netherlands)

    A. Borovykh (Anastasia); S.M. Bohte (Sander); C.W. Oosterlee (Cornelis)

    2017-01-01

    textabstractForecasting financial time series using past observations has been a significant topic of interest. While temporal relationships in the data exist, they are difficult to analyze and predict accurately due to the non-linear trends and noise present in the series. We propose to learn these

  15. Algorithms for Linear Time Series Analysis: With R Package

    Directory of Open Access Journals (Sweden)

    A. Ian McLeod

    2007-11-01

    Full Text Available Our ltsa package implements the Durbin-Levinson and Trench algorithms and provides a general approach to the problems of fitting, forecasting and simulating linear time series models as well as fitting regression models with linear time series errors. For computational efficiency both algorithms are implemented in C and interfaced to R. Examples are given which illustrate the efficiency and accuracy of the algorithms. We provide a second package, FGN, which illustrates the use of the ltsa package with fractional Gaussian noise (FGN). It is hoped that ltsa will provide a base for further time series software.
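
    The ltsa package itself is written in R and C; as a plain-numpy illustration of one of the algorithms it implements, the sketch below runs the Durbin-Levinson recursion, mapping an autocovariance sequence to best-linear-predictor coefficients and innovation variances, and checks it on an AR(1) example. Function and variable names are illustrative only.

    ```python
    import numpy as np

    def durbin_levinson(acvf):
        """Durbin-Levinson recursion.

        acvf : autocovariances gamma(0), ..., gamma(p).
        Returns (phi, v): coefficients phi_1..phi_p of the best linear predictor of
        order p and the innovation variance at each order 0..p.
        """
        gamma = np.asarray(acvf, dtype=float)
        p = len(gamma) - 1
        phi = np.zeros(p)
        v = gamma[0]
        variances = [v]
        for k in range(1, p + 1):
            # partial autocorrelation at lag k
            kappa = (gamma[k] - phi[:k - 1] @ gamma[1:k][::-1]) / v
            phi_new = np.copy(phi)
            phi_new[k - 1] = kappa
            phi_new[:k - 1] = phi[:k - 1] - kappa * phi[:k - 1][::-1]
            phi = phi_new
            v = v * (1.0 - kappa ** 2)
            variances.append(v)
        return phi, np.array(variances)

    # Check on an AR(1) process with phi = 0.6, sigma^2 = 1: gamma(h) = 0.6**h / (1 - 0.36)
    gamma = np.array([0.6 ** h for h in range(4)]) / (1 - 0.36)
    phi, v = durbin_levinson(gamma)
    print(phi)    # approximately [0.6, 0.0, 0.0]
    print(v[-1])  # approximately 1.0
    ```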

  16. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regime-switching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently from data, where clustering is used to propose one single split candidate at each split level. We use the class of ART time series models to serve as illustration, but because of the non-parametric nature of our segmentation approach, it readily generalizes to a wide range of time-series models that go...

  17. Time series with mixed spectra theory and methods

    CERN Document Server

    Li, Ta-Hsin

    2013-01-01

    Time series with mixed spectra are characterized by hidden periodic components buried in random noise. Despite strong interest in the statistical and signal processing communities, no book offers a comprehensive and up-to-date treatment of the subject. Filling this void, Time Series with Mixed Spectra focuses on the methods and theory for the statistical analysis of time series with mixed spectra. It presents detailed theoretical and empirical analyses of important methods and algorithms. Using both simulated and real-world data to illustrate the analyses, the book discusses periodogram analys

  18. FORECASTING FINANCIAL TIME SERIES USING A METHOD OF SELF-ORGANIZED CRITICALITY

    Directory of Open Access Journals (Sweden)

    Michail E. Mazurov

    2014-01-01

    Full Text Available There are four main methods of forecasting financial time series: technical analysis, mathematical analysis, fundamental analysis, and the use of neural networks. The evolution of financial time series is accompanied by bifurcations that characterize the internal properties of the system. An unstable state then arises, together with momentum that spreads through the distributed system of stock exchanges. Given this mechanism, to analyze the behavior of financial time series we use bifurcation theory and a system of nonlinear differential equations of parabolic type, which are the basic equations in synergetics.

  19. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  20. Characterizing interdependencies of multiple time series theory and applications

    CERN Document Server

    Hosoya, Yuzo; Takimoto, Taro; Kinoshita, Ryo

    2017-01-01

    This book introduces academic researchers and professionals to the basic concepts and methods for characterizing interdependencies of multiple time series in the frequency domain. Detecting causal directions between a pair of time series and the extent of their effects, as well as testing the non existence of a feedback relation between them, have constituted major focal points in multiple time series analysis since Granger introduced the celebrated definition of causality in view of prediction improvement. Causality analysis has since been widely applied in many disciplines. Although most analyses are conducted from the perspective of the time domain, a frequency domain method introduced in this book sheds new light on another aspect that disentangles the interdependencies between multiple time series in terms of long-term or short-term effects, quantitatively characterizing them. The frequency domain method includes the Granger noncausality test as a special case. Chapters 2 and 3 of the book introduce an i...

  1. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  2. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  3. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our resu...

  4. Forecasting Time Series Movement Direction with Hybrid Methodology

    Directory of Open Access Journals (Sweden)

    Salwa Waeto

    2017-01-01

    Full Text Available Forecasting the tendencies of time series is a challenging task that yields a better understanding of their behaviour. The purpose of this paper is to present a hybrid model that combines support vector regression with the autoregressive integrated moving average (ARIMA) model. The proposed model is convenient for practical use. Modeling the tendencies of time series related to Thailand’s southern insurgency is of interest in this research article. The empirical results, using monthly time series of the numbers of deaths, injuries, and incidents in Thailand’s southern insurgency, indicate that the proposed hybrid model is an effective way to construct estimates that outperform the classical time series model or support vector regression alone. Forecast accuracy is assessed using the mean square error.

  5. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  6. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  7. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  8. Environmental Kuznets curve (EKC): Times series evidence from Portugal

    OpenAIRE

    Shahbaz, Muhammad; Jalil, Abdul; Dube, Smile

    2010-01-01

    The paper provides empirical evidence of an EKC – a relationship between income and environmental degradation for Portugal by applying autoregressive distributed lag (ARDL) to time series data. In order to capture Portugal’s historical experience, demographic changes, and international trade on CO2 emissions, we assess the traditional income-emissions model with variables such as energy consumption, urbanization, and trade openness in a time series framework. There is evidence o...

  9. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one normal multilayer perceptron with bipolar sigmoid activation functions, and another with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  10. Learning Multiple Temporal Matching for Time Series Classification

    OpenAIRE

    Frambourg, Cedric; Douzal-Chouakria, Ahlame; Gaussier, Eric

    2013-01-01

    In real applications, time series are generally of complex structure, exhibiting different global behaviors within classes. To discriminate such challenging time series, we propose a multiple temporal matching approach that reveals the commonly shared features within classes, and the most differential ones across classes. For this, we rely on a new framework based on the variance/covariance criterion to strengthen or weaken matched observations according to the ind...

  11. Prediction of Long-Memory Time Series: A Tutorial Review

    Science.gov (United States)

    Bhansali, R. J.; Kokoszka, P. S.

    Two different approaches, called Type-I and Type-II, to linear least-squares prediction of a long-memory time series are distinguished. In the former, no new theory is required and a long-memory time series is treated on par with a standard short-memory time series and its multistep predictions are obtained by using the existing modelling approaches to prediction of such time series. The latter, by contrast, seeks to model the long-memory stochastic characteristics of the observed time series by a fractional process such that its dth fractional difference, 0 < d < 0.5, follows a short-memory process. The various approaches to constructing long-memory stochastic models are reviewed, and the associated question of parameter estimation for these models is discussed. Having fitted a long-memory stochastic model to a time series, linear multi-step forecasts of its future values are constructed from the model itself. The question of how to evaluate the multistep prediction constants is considered and three different methods proposed for doing so are outlined; it is further noted that, under appropriate regularity conditions, these methods apply also to the class of linear long memory processes with infinite variance. In addition, a brief review of the class of non-linear chaotic maps implying long-memory is given.
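
    To make the "dth fractional difference" concrete, here is a small numpy sketch of the binomial-expansion weights of (1 - B)^d and their application to a toy series, assuming the stationary long-memory range 0 < d < 0.5; it illustrates the definition only, not the prediction methods reviewed in the article.

    ```python
    import numpy as np

    def frac_diff_weights(d, n):
        """Coefficients of (1 - B)^d up to lag n-1: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
        w = np.empty(n)
        w[0] = 1.0
        for k in range(1, n):
            w[k] = w[k - 1] * (k - 1 - d) / k
        return w

    def frac_diff(x, d):
        """Apply the dth fractional difference to series x (expanding window)."""
        x = np.asarray(x, dtype=float)
        w = frac_diff_weights(d, len(x))
        return np.array([w[:t + 1] @ x[t::-1] for t in range(len(x))])

    # The weights decay hyperbolically, which is what produces long memory;
    # differencing a persistent toy series with d = 0.4 shortens its memory.
    print(frac_diff_weights(0.4, 8))
    rng = np.random.default_rng(0)
    x = np.cumsum(rng.standard_normal(200)) * 0.05
    print(frac_diff(x, 0.4)[:5])
    ```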

  12. Effect of reference stations on continuous GPS (CGPS) time series

    Science.gov (United States)

    Sella, G. F.; Malservisi, R.; Wdowinski, S.; Dixon, T.; Lafemina, P.

    2004-12-01

    Time series of CGPS sites in the ITRF2000 reference frame show a significant "bump" in late 2002. The time series suggest that the position of the GPS monument has moved significantly compared to the expected steady velocity prediction. Although the change has only a slight effect on computed velocities for long time series, it may introduce a significant bias in the computed velocity for shorter time series and in time series only using data collected during episodic GPS campaigns. The "bump" is easily recognized in North American sites but can also be observed in sites around the world, indicating a global effect. It is also present in time series produced by different analysis groups with different software packages (GIPSY-OASIS, GAMIT). We propose that the bump is related to "misbehavior" of one or more references sites used to transform raw GPS positions into ITRF2000. The "bump" appears to coincide with discontinuity in the operation of some reference stations. We also find that daily positions estimates are particularly sensitive to the behavior of some reference stations.

  13. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  14. Three essays in applied macroeconomics and time series analysis

    NARCIS (Netherlands)

    Abi Morshed, Alaa

    2017-01-01

    This dissertation revolves around topics in Applied Macroeconomics and Time series analysis. Generally speaking, we explore different forms of instability ranging from discrete sudden breaks to time varying parameter (TVP) models. In the second chapter, we study the time-varying impact of

  15. Combined Forecasts from Linear and Nonlinear Time Series Models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  16. Battery Grouping with Time Series Clustering Based on Affinity Propagation

    Directory of Open Access Journals (Sweden)

    Zhiwei He

    2016-07-01

    Full Text Available Battery grouping is a technology widely used to improve the performance of battery packs. In this paper, we propose a time series clustering based battery grouping method. The proposed method utilizes the whole battery charge/discharge sequence for battery grouping. The time sequences are first denoised with a wavelet denoising technique. The similarity matrix is then computed with the dynamic time warping distance, and finally the time series are clustered with the affinity propagation algorithm according to the calculated similarity matrices. The silhouette index is utilized for assessing the performance of the proposed battery grouping method. Test results show that the proposed battery grouping method is effective.
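
    A compact sketch of the clustering step described above (the wavelet-denoising stage is omitted): a plain dynamic-time-warping distance between charge-like curves is paired with scikit-learn's AffinityPropagation on a precomputed similarity matrix. The curve shapes and parameters are toy assumptions, not the paper's data.

    ```python
    import numpy as np
    from sklearn.cluster import AffinityPropagation

    def dtw_distance(a, b):
        """Classical O(len(a)*len(b)) dynamic time warping distance between 1-D sequences."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
        return cost[n, m]

    # Toy charge curves: two families of profiles with small per-cell variations
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 80)
    curves = [3.0 + 1.2 * t + 0.02 * rng.standard_normal(t.size) for _ in range(5)] + \
             [3.0 + 0.6 * np.sqrt(t) + 0.02 * rng.standard_normal(t.size) for _ in range(5)]

    n = len(curves)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw_distance(curves[i], curves[j])

    # Affinity propagation expects similarities, so use negative distances
    labels = AffinityPropagation(affinity="precomputed", random_state=0).fit(-dist).labels_
    print(labels)   # the two families should fall into (at least) two distinct groups
    ```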

  17. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period of time. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time-domain as well as frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.

  18. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and consequently invalidate currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale-invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  19. Historical evidence for nature disconnection in a 70-year time series of Disney animated films.

    Science.gov (United States)

    Prévot-Julliard, Anne-Caroline; Julliard, Romain; Clayton, Susan

    2015-08-01

    The assumed ongoing disconnection between humans and nature in Western societies represents a profoundly challenging conservation issue. Here, we demonstrate one manifestation of this nature disconnection, via an examination of the representation of natural settings in a 70-year time series of Disney animated films. We found that natural settings are increasingly less present as a representation of outdoor environments in these films. Moreover, these drawn natural settings tend to be more and more human controlled and are less and less complex in terms of the biodiversity they depict. These results demonstrate the increasing nature disconnection of the filmmaking teams, which we consider as a proxy of the Western relation to nature. Additionally, because nature experience of children is partly based on movies, the depleted representation of biodiversity in outdoor environments of Disney films may amplify the current disconnection from nature for children. This reduction in exposure to nature may hinder the implementation of biodiversity conservation measures. © The Author(s) 2014.

  20. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
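
    The driving-simulator data and the SVM stage are not reproduced; the snippet sketches only the bottom-up piecewise-linear representation on a single channel: start from fine segments and repeatedly merge the neighbouring pair whose joint linear fit costs least, then report each segment's slope and duration as candidate classification features. Names and thresholds are illustrative assumptions.

    ```python
    import numpy as np

    def fit_cost(y, start, end):
        """Sum of squared residuals (and slope) of a least-squares line over y[start:end]."""
        t = np.arange(start, end)
        slope, intercept = np.polyfit(t, y[start:end], 1)
        resid = y[start:end] - (slope * t + intercept)
        return float(resid @ resid), slope

    def bottom_up_segments(y, n_segments):
        """Merge neighbouring segments greedily until n_segments remain.
        Returns a list of (start, end, slope) tuples usable as classifier features."""
        bounds = list(range(0, len(y), 4)) + [len(y)]           # fine initial segmentation
        while len(bounds) - 1 > n_segments:
            costs = [fit_cost(y, bounds[i], bounds[i + 2])[0]   # cost of merging pair (i, i+1)
                     for i in range(len(bounds) - 2)]
            i = int(np.argmin(costs))
            del bounds[i + 1]                                    # merge the cheapest pair
        return [(bounds[i], bounds[i + 1], fit_cost(y, bounds[i], bounds[i + 1])[1])
                for i in range(len(bounds) - 1)]

    # Toy "lateral position" trace: steady, drift, correction
    y = np.concatenate([np.zeros(40), np.linspace(0, 1.5, 40), np.linspace(1.5, 0.2, 40)])
    y += 0.05 * np.random.default_rng(0).standard_normal(y.size)
    for start, end, slope in bottom_up_segments(y, 3):
        print(f"segment [{start:3d}, {end:3d})  slope {slope:+.3f}")
    ```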

  1. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
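
    DFA itself is a standard procedure, so a compact reference implementation is easy to give. The following is a generic textbook DFA with first-order detrending, not the authors' code; the scale choices are illustrative.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Generic first-order DFA; returns the scaling exponent alpha."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(4), np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segments = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        sq_res = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)             # local linear detrending
            sq_res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(sq_res)))
    # Slope of log F(s) versus log s is the DFA exponent.
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

# Sanity check: uncorrelated noise should give alpha close to 0.5,
# persistent (correlated) series give alpha > 0.5.
# print(dfa_alpha(np.random.default_rng(0).standard_normal(4096)))
```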

  2. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    The occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlations between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which allows larger lead-times to be gained. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models give their best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil

  3. Periodicity Estimation in Mechanical Acoustic Time-Series Data

    Directory of Open Access Journals (Sweden)

    Zhu Yongbo

    2015-01-01

    Full Text Available Periodicity estimation in mechanical acoustic time-series data is a well-established problem in data mining, as it is applicable in a variety of disciplines, either for anomaly detection or for prediction purposes in industry. In this paper, we develop a new approach for capturing and characterizing periodic patterns in time-series data by virtue of dynamic time warping (DTW). We have conducted extensive experiments to evaluate the proposed approach with synthetic data and with data collected in practice. Experimental results demonstrate its effectiveness and robustness for periodicity detection in highly noisy data.
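
    The abstract does not spell out how DTW is applied, so the sketch below shows one plausible reading: compare the series with lag-shifted copies of itself and take the lag with the smallest DTW distance as the dominant period. The function names and the lag-scan strategy are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def dtw_distance(a, b):
    # Classic O(len(a) * len(b)) dynamic-programming DTW distance.
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def estimate_period(x, min_lag, max_lag):
    # The lag at which the series best warps onto its own shifted copy is
    # taken as the dominant period.
    scores = {lag: dtw_distance(x[:-lag], x[lag:]) for lag in range(min_lag, max_lag + 1)}
    return min(scores, key=scores.get)

# e.g. estimate_period(acoustic_signal, min_lag=10, max_lag=200)  # hypothetical signal
```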

  4. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  5. Visibility graph analysis of wall turbulence time-series

    Science.gov (United States)

    Iacobello, Giovanni; Scarsoglio, Stefania; Ridolfi, Luca

    2018-01-01

    The spatio-temporal features of the velocity field of a fully-developed turbulent channel flow are investigated through the natural visibility graph (NVG) method, which is able to fully map the intrinsic structure of the time-series into complex networks. Time-series of the three velocity components, (u , v , w), are analyzed at fixed grid-points of the whole three-dimensional domain. Each time-series was mapped into a network by means of the NVG algorithm, so that each network corresponds to a grid-point of the simulation. The degree centrality, the transitivity and the here proposed mean link-length were evaluated as indicators of the global visibility, inter-visibility, and mean temporal distance among nodes, respectively. The metrics were averaged along the directions of homogeneity (x , z) of the flow, so they only depend on the wall-normal coordinate, y+. The visibility-based networks, inheriting the flow field features, unveil key temporal properties of the turbulent time-series and their changes moving along y+. Although intrinsically simple to be implemented, the visibility graph-based approach offers a promising and effective support to the classical methods for accurate time-series analyses of inhomogeneous turbulent flows.
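
    Constructing the natural visibility graph of a series is straightforward, which is part of what makes the approach attractive. The brute-force sketch below builds the NVG and computes the average degree and a sample-index version of the mean link-length; it is a generic illustration (the paper works on the physical time axis and also uses the transitivity), not the authors' implementation.

```python
import numpy as np

def natural_visibility_edges(y):
    """Naive O(n^3) natural visibility graph: samples a and b are connected if
    no intermediate sample rises above the straight line joining them."""
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

def nvg_metrics(y):
    """Average degree and mean link-length (here in sample units)."""
    edges = natural_visibility_edges(y)
    degree = np.zeros(len(y), dtype=int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    mean_link_length = float(np.mean([b - a for a, b in edges]))
    return degree.mean(), mean_link_length

# e.g. nvg_metrics(np.random.default_rng(0).standard_normal(500))
```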

  6. Multitask Gaussian processes for multivariate physiological time-series analysis.

    Science.gov (United States)

    Dürichen, Robert; Pimentel, Marco A F; Clifton, Lei; Schweikard, Achim; Clifton, David A

    2015-01-01

    Gaussian process (GP) models are a flexible means of performing nonparametric Bayesian regression. However, GP models in healthcare are often only used to model a single univariate output time series, denoted as single-task GPs (STGP). Due to an increasing prevalence of sensors in healthcare settings, there is an urgent need for robust multivariate time-series tools. Here, we propose a method using multitask GPs (MTGPs) which can model multiple correlated multivariate physiological time series simultaneously. The flexible MTGP framework can learn the correlation between multiple signals even though they might be sampled at different frequencies and have training sets available for different intervals. Furthermore, prior knowledge of any relationship between the time series such as delays and temporal behavior can be easily integrated. A novel normalization is proposed to allow interpretation of the various hyperparameters used in the MTGP. We investigate MTGPs for physiological monitoring with synthetic data sets and two real-world problems from the field of patient monitoring and radiotherapy. The results are compared with standard Gaussian processes and other existing methods in the respective biomedical application areas. In both cases, we show that our framework learned the correlation between physiological time series efficiently, outperforming the existing state of the art.

  7. Recurrent Neural Network Applications for Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize to irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition to this, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to set hyperparameters correctly for a stable and performant solution. Since manual tuning is the main obstacle here, we circumvent it by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning procedure.

  8. Continuous baseflow separation from time series of daily and ...

    African Journals Online (AJOL)

    Continuous baseflow separation procedures have been frequently used to differentiate total flows into the high-frequency, low-amplitude 'baseflow' component and the low-frequency, high-amplitude 'flood' flows. In the past, such procedures have normally been applied to streamflow time-series data with time steps of 1 day ...

  9. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of

  10. Analysis and generation of groundwater concentration time series

    Science.gov (United States)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.
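
    As a toy illustration of the kind of generator described above (a time-dependent trend plus amplitude-modulated AR(1) noise with a time-varying coefficient), consider the sketch below. The trend shape, modulation and coefficient schedule are invented for illustration, and the linear-regression terms coupling the fluctuations to their increments are omitted.

```python
import numpy as np

def synthetic_concentration_series(n=500, seed=0):
    """Toy generator: smooth trend plus amplitude-modulated AR(1) noise with a
    slowly varying coefficient (a simplification of the algorithm above)."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 1.0, n)
    trend = np.exp(-((t - 0.4) / 0.25) ** 2)             # hypothetical breakthrough-like trend
    phi = 0.5 + 0.4 * t                                   # time-varying AR(1) coefficient
    amplitude = 0.05 * (1.0 + np.sin(6.0 * np.pi * t))    # time-variable fluctuation amplitude
    noise = np.zeros(n)
    for k in range(1, n):
        noise[k] = phi[k] * noise[k - 1] + rng.standard_normal()
    return trend + amplitude * noise

# series = synthetic_concentration_series()
```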

  11. MODIS Vegetation Indices time series improvement considering real acquisition dates

    Science.gov (United States)

    Testa, S.; Borgogno Mondino, E.

    2013-12-01

    Satellite Vegetation Indices (VI) time series images are widely used for the characterization of phenology, which requires a high temporal accuracy of the satellite data. The present work is based on the MODerate resolution Imaging Spectroradiometer (MODIS) MOD13Q1 product - Vegetation Indices 16-Day L3 Global 250m, which is generated through a maximum value compositing process that reduces the number of cloudy pixels and excludes, when possible, off-nadir ones. Because of its 16-day compositing period, the distance between two adjacent-in-time values within each pixel NDVI time series can range from 1 to 32 days, which is not acceptable for phenological studies. Moreover, most of the available smoothing algorithms, which are widely used for phenology characterization, assume that data points are equidistant in time and contemporary over the image. The objective of this work was to assess the temporal features of NDVI time series over a test area, composed of Castanea sativa (chestnut) and Fagus sylvatica (beech) pure pixels within the Piemonte region in Northwestern Italy. Firstly, NDVI, Pixel Reliability (PR) and Composite Day of the Year (CDOY) data ranging from 2000 to 2011 were extracted from MOD13Q1 and the corresponding time series were generated (in further computations, 2000 was not considered since it is not complete, because acquisition began in February, and calibration is unreliable until October). Analysis of the CDOY time series (containing the actual reference date of each NDVI value) over the selected study areas showed NDVI values to be prevalently generated from data acquired at the centre of each 16-day period (the 9th day), at least constantly along the year. This leads us to consider each original NDVI value as nominally placed at the centre of its 16-day reference period. Then, a new NDVI time series was generated: a) moving each NDVI value to its actual "acquisition" date, b) interpolating the obtained temporary time series through SPLINE functions, c) sampling such
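
    Steps (a)-(c) amount to re-anchoring each composite value at its real acquisition date and resampling on a regular grid. A minimal sketch of that idea is given below; the 16-day output step, the use of SciPy's CubicSpline and the omission of Pixel Reliability filtering are assumptions made for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def regularize_ndvi(ndvi, acq_day, step=16):
    """Move each composite NDVI value to its actual acquisition day (taken from
    the MOD13Q1 composite-day-of-the-year layer, expressed here as cumulative
    days since the series start), fit a cubic spline, and resample on a regular
    grid. Acquisition days are assumed strictly increasing once sorted."""
    acq_day = np.asarray(acq_day, dtype=float)
    ndvi = np.asarray(ndvi, dtype=float)
    order = np.argsort(acq_day)
    spline = CubicSpline(acq_day[order], ndvi[order])
    regular_days = np.arange(acq_day.min(), acq_day.max() + 1, step)
    return regular_days, spline(regular_days)

# days, ndvi_regular = regularize_ndvi(ndvi_values, acquisition_days)  # hypothetical inputs
```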

  12. Cloud masking and removal in remote sensing image time series

    Science.gov (United States)

    Gómez-Chova, Luis; Amorós-López, Julia; Mateo-García, Gonzalo; Muñoz-Marí, Jordi; Camps-Valls, Gustau

    2017-01-01

    Automatic cloud masking of Earth observation images is one of the first required steps in optical remote sensing data processing since the operational use and product generation from satellite image time series might be hampered by undetected clouds. The high temporal revisit of current and forthcoming missions and the scarcity of labeled data force us to cast cloud screening as an unsupervised change detection problem in the temporal domain. We introduce a cloud screening method based on detecting abrupt changes along the time dimension. The main assumption is that image time series follow smooth variations over land (background) and abrupt changes will be mainly due to the presence of clouds. The method estimates the background surface changes using the information in the time series. In particular, we propose linear and nonlinear least squares regression algorithms that minimize both the prediction and the estimation error simultaneously. Then, significant differences in the image of interest with respect to the estimated background are identified as clouds. The use of kernel methods allows the generalization of the algorithm to account for higher-order (nonlinear) feature relations. After the proposed cloud masking and cloud removal, cloud-free time series at high spatial resolution can be used to obtain a better monitoring of land cover dynamics and to generate more elaborated products. The method is tested in a dataset with 5-day revisit time series from SPOT-4 at high resolution and with Landsat-8 time series. Experimental results show that the proposed method yields more accurate cloud masks when confronted with state-of-the-art approaches typically used in operational settings. In addition, the algorithm has been implemented in the Google Earth Engine platform, which allows us to access the full Landsat-8 catalog and work in a parallel distributed platform to extend its applicability to a global planetary scale.

  13. Mining approximate periodic pattern in hydrological time series

    Science.gov (United States)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

    Long sequences of hydrological time series contain a great deal of information about the hidden laws of natural evolution and the influence of human activities on the earth's surface. Data mining technology can help find those hidden laws, such as flood frequency and abrupt changes, which is useful for the decision support of hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of droughts and floods and for hydraulic engineering planning. In hydrology, full-period analysis of hydrological time series has attracted a lot of attention, including the discrete periodogram, the simple partial wave method, Fourier analysis, maximum entropy spectral analysis and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and by stochastic ones. For example, the tidal level is also affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, there is some kind of approximate period hidden in hydrological time series, which is sometimes also called the cryptic period. Recently, partial period mining, originating from the data mining domain, has emerged as a remedy for the traditional period analysis methods in hydrology, since it places looser requirements on data integrity and continuity and can find partial periods in the time series. This paper is focused on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with a suffix tree, this paper proposes to mine multi-event asynchronous periodic patterns based on a modified suffix tree representation and traversal, and introduces a dynamic adjustment method for candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  14. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

    Full Text Available In order to detect outliers in hydrological time series data for improving data quality and decision-making quality related to design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. PCI is used as the threshold mainly because it accounts for the uncertainty of the data series parameters in the forecasting model, which addresses the problem of selecting a suitable threshold. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different hydrologic real-world time series showed that the proposed method is fast, correctly identifies abnormal data and can be used for hydrologic time series analysis.
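
    The flagging rule is easy to sketch. The example below uses a deliberately simple forecaster (the mean of the preceding window) and a symmetric PCI; the paper's actual forecasting model and confidence coefficient may differ, so treat this as an illustration of the thresholding idea only.

```python
import numpy as np

def pci_outliers(x, window=24, z=1.96):
    """Sliding-window outlier flagging: forecast from the preceding window and
    flag observations falling outside forecast +/- z * window standard deviation."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for t in range(window, len(x)):
        history = x[t - window:t]
        forecast = history.mean()
        half_width = z * history.std(ddof=1)      # prediction confidence interval half-width
        flags[t] = abs(x[t] - forecast) > half_width
    return flags

# flagged = pci_outliers(water_levels, window=48, z=2.58)  # hypothetical hourly stage series
```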

  15. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  16. Increment entropy as a measure of complexity for time series

    CERN Document Server

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series in which each increment is mapped into a word of two letters, one letter corresponding to direction and the other corresponding to magnitude. The Shannon entropy of the words is termed the increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals have demonstrated its ability to detect abrupt changes, whether energetic (e.g. spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series and is applicable to arbitrary real-world data.
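
    The coding scheme described above translates directly into a short routine. The sketch below follows the general idea (a sign letter plus a quantized-magnitude letter, words of m consecutive coded increments, Shannon entropy normalized by m); the quantization resolution R and the normalization are common IncrEn choices, not necessarily the authors' exact ones.

```python
import numpy as np
from collections import Counter

def increment_entropy(x, m=2, R=4):
    """Increment entropy (IncrEn) sketch based on two-letter coding of increments."""
    d = np.diff(np.asarray(x, dtype=float))
    sd = d.std(ddof=1)
    if sd == 0:
        return 0.0                                   # constant series: no increment information
    sign = np.sign(d).astype(int)                    # direction letter
    mag = np.minimum(np.floor(R * np.abs(d) / sd), R).astype(int)   # magnitude letter
    letters = list(zip(sign, mag))
    words = [tuple(letters[i:i + m]) for i in range(len(letters) - m + 1)]
    p = np.array(list(Counter(words).values()), dtype=float)
    p /= p.sum()
    return float(-(p * np.log2(p)).sum() / m)

# e.g. increment_entropy(np.random.default_rng(0).standard_normal(1000))
```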

  17. Feature-preserving interpolation and filtering of environmental time series

    CERN Document Server

    Mariethoz, Gregoire; Jougnot, Damien; Rezaee, Hassan

    2015-01-01

    We propose a method for filling gaps and removing interferences in time series for applications involving continuous monitoring of environmental variables. The approach is non-parametric and based on an iterative pattern-matching between the affected and the valid parts of the time series. It considers several variables jointly in the pattern matching process and allows preserving linear or non-linear dependences between variables. The uncertainty in the reconstructed time series is quantified through multiple realizations. The method is tested on self-potential data that are affected by strong interferences as well as data gaps, and the results show that our approach allows reproducing the spectral features of the original signal. Even in the presence of intense signal perturbations, it significantly improves the signal and corrects bias introduced by asymmetrical interferences. Potential applications are wide-ranging, including geophysics, meteorology and hydrology.

  18. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  19. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship between the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit-section level. Both the long-term and short-term predictions show that the model is effective and can achieve the expected accuracy.

  20. Causal analysis of time series from hydrological systems

    Science.gov (United States)

    Selle, Benny; Aufgebauer, Britta; Knorr, Klaus-Holger

    2017-04-01

    It is often difficult to infer cause and effect in hydrological systems for which time series of system inputs, outputs and state variables are observed. A recently published technique called Convergent Cross Mapping could be a promising tool to detect causality between time series. A response variable Y may be causally related to a forcing variable X if the so-called cross mapping of X using Y improves with the amount of data included. The idea is that a response variable contains information on the history of its driving variable, whereas the reverse may not be true. We propose an alternative approach based on similar ideas using neural networks. Our approach is first compared to Convergent Cross Mapping using a synthetic time series of precipitation and streamflow generated by a rainfall-runoff model. Secondly, measured concentrations of dissolved organic carbon and dissolved iron from a mountainous stream in Germany, which were previously hypothesised to be causally linked, are tested.
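
    For reference, the cross-mapping step of CCM is compact enough to sketch. The routine below is a simplified, generic cross-map skill estimate (delay embedding of the candidate response, exponentially weighted nearest-neighbour prediction of the candidate driver, Pearson correlation as skill); the embedding dimension, the lag and the omission of the convergence test over library sizes are illustrative assumptions, and this is not the authors' neural-network alternative.

```python
import numpy as np

def cross_map_skill(x, y, E=3, tau=1):
    """Simplified convergent cross mapping: estimate driver x from the delay
    embedding of response y and return the correlation with the true x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(y) - (E - 1) * tau
    shadow = np.column_stack([y[i * tau:i * tau + n] for i in range(E)])   # shadow manifold of y
    target = x[(E - 1) * tau:(E - 1) * tau + n]
    estimate = np.empty(n)
    for t in range(n):
        dist = np.linalg.norm(shadow - shadow[t], axis=1)
        dist[t] = np.inf                                # exclude the point itself
        idx = np.argsort(dist)[:E + 1]                  # E + 1 nearest neighbours
        weights = np.exp(-dist[idx] / max(dist[idx][0], 1e-12))
        weights /= weights.sum()
        estimate[t] = np.dot(weights, target[idx])
    return float(np.corrcoef(estimate, target)[0, 1])

# Skill that grows as more data are included is read as evidence that x drives y.
```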

  1. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the internal dynamics of the system and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and if this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations...

  2. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

    The artificial neural network (ANN) has an advantage in time series forecasting, as it has the potential to solve complex forecasting problems. This is because the ANN is a data-driven approach which can be trained to map past values of a time series. In this study the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.

  3. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for structural identification of non-linear time series models. A similar generalization of the sample cross correlation function is discussed. Furthermore, a measure of the departure from linearity is suggested. It is shown how bootstrapping can be applied to construct confidence intervals under independence or linearity. The generalizations do not prescribe a particular smoothing technique. In fact, when the smoother is replaced by a linear regression the generalizations reduce to close approximations of SACF and SPACF. For this reason a smooth transition from the linear to the non-linear case can be obtained...

  4. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time-series compression models. Because of the bandwidth limits and the potentially sheer speed of the data, it is inevitable that data are compressed and re-compressed along the dissemination paths according to the subscription level of each node. Compression causes a loss of data accuracy, so we devise several algorithms...

  5. A statistical proxy for sulphuric acid concentration

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2011-11-01

    Full Text Available Gaseous sulphuric acid is a key precursor for new particle formation in the atmosphere. Previous experimental studies have confirmed a strong correlation between the number concentrations of freshly formed particles and the ambient concentrations of sulphuric acid. This study evaluates a body of experimental gas phase sulphuric acid concentrations, as measured by Chemical Ionization Mass Spectrometry (CIMS) during six intensive measurement campaigns and one long-term observational period. The campaign datasets were measured in Hyytiälä, Finland, in 2003 and 2007, in San Pietro Capofiume, Italy, in 2009, in Melpitz, Germany, in 2008, in Atlanta, Georgia, USA, in 2002, and in Niwot Ridge, Colorado, USA, in 2007. The long term data were obtained in Hohenpeissenberg, Germany, during 1998 to 2000. The measured time series were used to construct proximity measures ("proxies") for sulphuric acid concentration by using statistical analysis methods. The objective of this study is to find a proxy for sulphuric acid that is valid in as many different atmospheric environments as possible. Our most accurate and universal formulation of the sulphuric acid concentration proxy uses global solar radiation, SO2 concentration, condensation sink and relative humidity as predictor variables, yielding a correlation measure (R) of 0.87 between observed concentration and the proxy predictions. Interestingly, the role of the condensation sink in the proxy was only minor, since similarly accurate proxies could be constructed with global solar radiation and SO2 concentration alone. This could be attributed to SO2 being an indicator for anthropogenic pollution, including particulate and gaseous emissions which represent sinks for the OH radical that, in turn, is needed for the formation of sulphuric acid.
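
    A proxy of this kind is typically fitted as a regression of measured [H2SO4] on the predictor variables. The sketch below fits a generic power-law proxy by least squares in log space; the functional form and predictor set are assumptions made for illustration, and the published proxy's actual coefficients are not reproduced.

```python
import numpy as np

def fit_powerlaw_proxy(h2so4, radiation, so2, cs, rh):
    """Fit a generic power-law proxy
        [H2SO4] ~ a * Rad^b1 * SO2^b2 * CS^b3 * RH^b4
    by ordinary least squares in log space. All inputs are positive arrays."""
    X = np.column_stack([np.ones(len(h2so4)),
                         np.log(radiation), np.log(so2), np.log(cs), np.log(rh)])
    beta, *_ = np.linalg.lstsq(X, np.log(h2so4), rcond=None)

    def proxy(radiation, so2, cs, rh):
        Z = np.column_stack([np.ones(len(radiation)),
                             np.log(radiation), np.log(so2), np.log(cs), np.log(rh)])
        return np.exp(Z @ beta)

    return beta, proxy

# beta, proxy = fit_powerlaw_proxy(cims_h2so4, glob_rad, so2, cond_sink, rel_hum)  # hypothetical campaign arrays
```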

  6. Semi-autonomous remote sensing time series generation tool

    Science.gov (United States)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to the lack of suitable satellite architectures and frequent cloud cover issues, the availability of daily data at high spatial resolution is still far from reality. Remote sensing time series generation of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a Geo Information System (GIS) based tool framework is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables the users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks or they can be combined to perform both processes in one go. This tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation of various remote sensing satellite data. This tool is developed as a common platform with a good interface which provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and the advantages of the frameworks is given in this paper.

  7. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The operation of standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them using a common time scale. The proposed architecture follows a multiple layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity) and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each piece of running software through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a data access policy for the users.

  8. Neuro-fuzzy system for chaotic time series forecasting

    Science.gov (United States)

    Masulli, Francesco; Studer, Leonard

    1997-10-01

    We report on an on-going study to assess potential benefits of using soft computing methods in forecasting problems. Our goal is to forecast natural phenomena represented by time series that show chaotic features. We use a neuro-fuzzy system for its ability to adapt to numerical data and for the possibility to input and extract expert knowledge expressed in words. We present results of experiments designed to study how to shape a neuro-fuzzy system to forecast chaotic time series. Our main conclusions are: (1) The neuro-fuzzy system is able to forecast a synthetic chaotic time series with high accuracy if the number of inputs and the time delay between them are chosen adequately. (2) The Takens-Mañé theorem from chaos theory gives a useful lower bound on the minimal number of inputs. (3) The time delay between the inputs cannot be set a priori; it has to be tuned for every different time series. (4) The number of fuzzy rules seems related to the size of the learning set and not to the structure of the chaotic dynamical system. We tentatively try to interpret the rules that the neuro-fuzzy system has learned. Finally we discuss the adequacy of the whole set of fuzzy rules to forecast locally the dynamical system.

  9. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. The common link prediction approaches to predicting hidden links use a static graph representation in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected link, sorts the links based on their similarity metrics and labels the links with higher similarity scores as the future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of the social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.

  10. Time series patterns and language support in DBMS

    Science.gov (United States)

    Telnarova, Zdenka

    2017-07-01

    This contribution is focused on the pattern type Time Series as a semantically rich representation of data. Some examples of the implementation of this pattern type in traditional Database Management Systems are briefly presented. There are many approaches to manipulating and querying patterns. The crucial issue lies in a systematic approach to pattern management and a specific pattern query language which takes the semantics of patterns into consideration. The query language SQL-TS for manipulating patterns is demonstrated on Time Series data.

  11. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  12. Testing for intracycle determinism in pseudoperiodic time series.

    Science.gov (United States)

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
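
    The surrogate-data machinery used in the second step is standard and easy to sketch. Below is a generic phase-randomized surrogate test: `stat` stands for whatever predictability measure is chosen, and the series is assumed to have already gone through the trend/seasonality preprocessing described above; none of this is the authors' exact implementation.

```python
import numpy as np

def phase_surrogate(x, rng):
    """Fourier-phase-randomized surrogate: preserves the power spectrum but
    destroys any nonlinear deterministic structure."""
    spectrum = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
    phases[0] = 0.0
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=x.size) + x.mean()

def determinism_pvalue(x, stat, n_surrogates=99, seed=0):
    """Rank the original statistic among surrogate statistics (one-sided test)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    s0 = stat(x)
    surrogate_stats = [stat(phase_surrogate(x, rng)) for _ in range(n_surrogates)]
    return (1 + sum(s >= s0 for s in surrogate_stats)) / (n_surrogates + 1)
```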

  13. Noise in GPS position time series from Taiwan

    Science.gov (United States)

    Rau, Ruey-Juin; Hung, Huang-Kai

    2014-05-01

    Position time series of 393 continuous GPS (CGPS) stations with durations of 5-10 years are generated and analyzed for the noise model and seasonal motions in Taiwan. The noise parameters obtained are also used to evaluate the reliability of GPS velocity uncertainties. GPS data are processed by GAMIT/GLOBK to obtain the position time series. To obtain the seasonal properties in the CGPS time series, we modeled each GPS position time series to derive the secular velocity, seasonal variations in annual and semi-annual periods, offsets due to antenna changes, and co-seismic deformations using the weighted least squares method. Uncertainties of the residual position time series after removing the modeled motions for each CGPS station are evaluated by the weighted root mean square (WRMS). The average WRMS of the position time series for all CGPS stations is 1.8-2.5 mm and 3.9-7.7 mm in the horizontal and vertical components, respectively. GPS daily position time series are considered to follow a white plus flicker noise pattern. The overall seasonal amplitudes for all the GPS stations are 1.8-3.3 mm in the horizontal and 3.2-8.8 mm in the vertical component. Areas of strongest annual amplitudes in both horizontal and vertical components are concentrated in the west and southwest coastal plain in Taiwan, which suffers severe ground subsidence from water over-pumping. The motions shown here appear to be elastic and are most likely induced by the effective stress in the regional aquifer changing between expansion and contraction. Stations located just east of the Longitudinal Valley in eastern Taiwan show high correlation between the horizontal seasonal motions and rainfall and groundwater levels. This may result from the periodic contraction and extension motions of the Longitudinal Valley fault due to variations in hydrological loadings. Uncertainty of CGPS velocities estimated by the noise patterns of the Power Law noise series indicated that the

  14. Microbial oceanography and the Hawaii Ocean Time-series programme.

    Science.gov (United States)

    Karl, David M; Church, Matthew J

    2014-10-01

    The Hawaii Ocean Time-series (HOT) programme has been tracking microbial and biogeochemical processes in the North Pacific Subtropical Gyre since October 1988. The near-monthly time series observations have revealed previously undocumented phenomena within a temporally dynamic ecosystem that is vulnerable to climate change. Novel microorganisms, genes and unexpected metabolic pathways have been discovered and are being integrated into our evolving ecological paradigms. Continued research, including higher-frequency observations and at-sea experimentation, will help to provide a comprehensive scientific understanding of microbial processes in the largest biome on Earth.

  15. Detecting unstable periodic orbits in chaotic time series using synchronization

    Science.gov (United States)

    Olyaei, Ali Azimi; Wu, Christine; Kinsner, Witold

    2017-07-01

    An alternative approach to detecting unstable periodic orbits in chaotic time series is proposed using synchronization techniques. A master-slave synchronization scheme is developed, in which the chaotic system drives a system of harmonic oscillators through a proper coupling condition. The proposed scheme is designed so that the power of the coupling signal exhibits notches that drop to zero once the system approaches an unstable orbit, yielding an explicit indication of the presence of a periodic motion. The results show that the proposed approach is particularly suitable in practical situations, where the time series is short and noisy, or is obtained from high-dimensional chaotic systems.

  16. A methodology to filter time series: application to minute-by-minute electric load series

    Directory of Open Access Journals (Sweden)

    Mayte Suarez-Farinas

    2004-12-01

    Full Text Available In this article a methodology for filtering a time series is presented, with application to high frequency series such as the minute-by-minute electric load series. The goal of this approach is to detect and substitute the irregularities of the time series that can produce distortions on the modelling stage. Outlier values are detected through a dynamic linear model and the Bayes factor tool; missing values are then interpolated with a Smoothing Cubic Spline. The performance of the proposed approach is illustrated using real data and evaluated through a series of tests where the irregularities have been simulated.

  17. Segmentation of time series with long-range fractal correlations

    Science.gov (United States)

    Bernaola-Galván, P.; Oliver, J. L.; Hackenberg, M.; Coronado, A. V.; Ivanov, P. Ch.; Carpena, P.

    2012-06-01

    Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome.

  18. A Short Image Series Based Scheme for Time Series Digital Image Correlation

    CERN Document Server

    Wang, Xian

    2014-01-01

    A new scheme for digital image correlation, i.e., short time series DIC (STS-DIC), is proposed. Instead of processing the original deformed speckle images individually, STS-DIC combines several adjacent deformed speckle images from a short time series and then processes the averaged image, for which deformation continuity over time is introduced. The deformation of several adjacent images is assumed to be linear in time and a new spatial-temporal displacement representation method with eight unknowns is presented based on the subset-based representation method. Then, the model of STS-DIC is created and a solving scheme is developed based on the Newton-Raphson iteration. The proposed method is verified for numerical and experimental cases. The results show that the proposed STS-DIC greatly improves the accuracy of traditional DIC, under both simple and complicated deformation conditions, while keeping the computational cost acceptable.

  19. Time Series Data Visualization in World Wide Telescope

    Science.gov (United States)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of both interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.

  20. Seasonal time series forecasting: a comparative study of arima and ...

    African Journals Online (AJOL)

    ANN) and ARIMA models in forecasting of seasonal (monthly) Time series. Using the Airline data which Faraway and Chatfield (1998) used and two other data sets and taking into consideration their suggestions, we show that ANN are not as ...

  1. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  2. An observed 20-year time series of Agulhas leakage

    NARCIS (Netherlands)

    Le Bars, D.; Durgadoo, J. V.; Dijkstra, H. A.; Biastoch, A.; De Ruijter, W. P M

    2014-01-01

    We provide a time series of Agulhas leakage anomalies over the last 20 years from satellite altimetry. Until now, measuring the interannual variability of Indo-Atlantic exchange has been the major barrier in the investigation of the dynamics and large-scale impact of Agulhas leakage. We compute the

  3. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
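
    As a concrete example of extracting Koopman spectral properties directly from data, the sketch below implements standard exact dynamic mode decomposition (DMD), one common finite-dimensional approximation of the Koopman operator. It is generic reference code with an illustrative truncation rank, not the specific model forms or distances defined in the paper.

```python
import numpy as np

def exact_dmd(X, rank=10):
    """Exact dynamic mode decomposition of a snapshot matrix X (columns are
    successive states). Returns eigenvalues and modes, the kind of spectral
    'model form' the framework above builds on."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    r = min(rank, int(np.sum(s > 1e-10)))              # truncate to numerical rank
    U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
    A_tilde = U.conj().T @ X2 @ V @ np.diag(1.0 / s)   # reduced linear operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ V @ np.diag(1.0 / s) @ W              # exact DMD modes
    return eigvals, modes

# For a scalar time series, X would first be built by delay (Hankel) embedding.
```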

  4. Estimating continuous monthly baseflow time series and their ...

    African Journals Online (AJOL)

    drinie

    2001-04-02

    Apr 2, 2001 ... possible applications in the context of groundwater and estuarine components of the ecological reserve determination are discussed. Introduction .... future analysis with other modules provided by the time series computer .... major interactive components of the water cycle, namely surface water bodies and ...

  5. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series in the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes which occurred during the RC measurement. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
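
    A SARIMAX fit of this general kind can be sketched with statsmodels. The example below uses synthetic stand-ins for hourly radon and pressure readings and an illustrative model order, not the paper's exact regARIMA(5,1,3) specification or its atmospheric covariates.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic stand-ins for hourly radon concentration and atmospheric pressure.
rng = np.random.default_rng(1)
idx = pd.date_range("2014-01-01", periods=24 * 60, freq="H")
hours = np.arange(len(idx))
pressure = 1000 + 5 * np.sin(2 * np.pi * hours / (24 * 14)) + rng.normal(0, 1, len(idx))
radon = 800 + 100 * np.sin(2 * np.pi * hours / 24) - 2 * (pressure - 1000) + rng.normal(0, 20, len(idx))

# Seasonal ARIMA with an exogenous regressor and a 24-hour seasonal period.
model = SARIMAX(pd.Series(radon, index=idx), exog=pd.Series(pressure, index=idx),
                order=(2, 1, 1), seasonal_order=(1, 0, 1, 24))
fit = model.fit(disp=False)

residuals = fit.resid   # candidate series to compare retrospectively with seismic records
# Forecast the next day; real use would supply genuine *future* pressure values here.
forecast = fit.get_forecast(steps=24, exog=pressure[-24:].reshape(-1, 1)).predicted_mean
```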

  6. Time series prediction with simple recurrent neural networks ...

    African Journals Online (AJOL)

    Simple recurrent neural networks are widely used in time series prediction. Most researchers and application developers often choose arbitrarily between Elman or Jordan simple recurrent neural networks for their applications. A hybrid of the two called Elman-Jordan (or Multi-recurrent) neural network is also being used.

  7. forecasting with nonlinear time series model: a monte-carlo ...

    African Journals Online (AJOL)

    ABSTRACT. In this paper, we propose a new method of forecasting with a nonlinear time series model using the Monte-Carlo Bootstrap method. This new method gives better results in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and the Monte-Carlo method of forecasting ...

  8. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    This study focuses on the establishment of an optimum forecast model that predicts future production trends of the 7UP Bottling company. Sixty (60) months of time series data from the 7UP Bottling company were used, after ascertaining the presence of seasonal variation and trend components in the data, to establish the ...

  9. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    In this paper, we propose a new method of forecasting with a nonlinear time series model using the Monte-Carlo Bootstrap method. This new method gives better results in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and the Monte-Carlo method of forecasting using a ...

  10. Change detection in a time series of polarimetric SAR images

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    can be used to detect at which points changes occur in the time series. [1] T. W. Anderson, An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third edition, 2003. [2] K. Conradsen, A. A. Nielsen, J. Schou, and H. Skriver, “A test statistic in the complex Wishart distribution...

  11. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  12. Approaches to Time-series Catch Data Reconstruction

    African Journals Online (AJOL)

    Reliable time-series catch and effort data are fundamental for fisheries assessment and management; however, such data are usually not readily available. The Food and Agricultural Organization (FAO) compiles statistical reports from its member countries, but their reliability is questionable. Several approaches were ...

  13. Approaches to Time-series Catch Data Reconstruction

    African Journals Online (AJOL)

    The Food and Agricultural Organization (FAO) compiles statistical reports from its member countries, but their reliability is questionable. Several approaches were explored in this study for the reconstruction of time-series catch data using Red Sea fisheries as case studies, starting from 1950. Historical documents, published and ...

  14. Practical implementation of nonlinear time series methods The TISEAN package

    CERN Document Server

    Hegger, R; Schreiber, T; Hegger, Rainer; Kantz, Holger; Schreiber, Thomas

    1998-01-01

    Nonlinear time series analysis is becoming a more and more reliable tool for the study of complicated dynamics from measurements. The concept of low-dimensional chaos has proven to be fruitful in the understanding of many complex phenomena despite the fact that very few natural systems have actually been found to be low dimensional deterministic in the sense of the theory. In order to evaluate the long term usefulness of the nonlinear time series approach as inspired by chaos theory, it will be important that the corresponding methods become more widely accessible. This paper, while not a proper review on nonlinear time series analysis, tries to make a contribution to this process by describing the actual implementation of the algorithms, and their proper usage. Most of the methods require the choice of certain parameters for each specific time series application. We will try to give guidance in this respect. The scope and selection of topics in this article, as well as the implementational choices that have ...

  15. ISO 9000 Series Certification Over Time: what have we learnt?

    NARCIS (Netherlands)

    A. van der Wiele (Ton); A.M. Brown (Alan)

    2002-01-01

    The ISO 9000 experiences of the same sample of organisations over a five-year time period are examined in this paper. The responses to a questionnaire sent out at the end of 1999 to companies which had a reasonably long-term experience with the ISO 9000 series quality system are analysed.

  16. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator...

  17. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available Unsupervised Land Cover Change Detection: Meaningful Sequential Time Series Analysis. Salmon, B.P.; Olivier, J.C.; Wessels, K.J.; Kleynhans, W.; van den Bergh, F.; Steenkamp, K.C. Dept. of Electr., Electron. & Comput. Eng., Univ. of Pretoria, Pretoria...

  18. United States forest disturbance trends observed with landsat time series

    Science.gov (United States)

    Jeffrey G. Masek; Samuel N. Goward; Robert E. Kennedy; Warren B. Cohen; Gretchen G. Moisen; Karen Schleweiss; Chengquan. Huang

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing US land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest...

  19. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  20. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    Two-fractal overlap time series: Earthquakes and market crashes. Bikas K. Chakrabarti, Arnab Chatterjee and Pratip Bhattacharyya. Theoretical Condensed Matter Physics Division and Centre for Applied Mathematics and Computational Science, Saha Institute of Nuclear Physics, ...

  1. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    This paper analyses the empirical relationship between economic growth and export expansion in Mauritius as observed through time series data. Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings (total, textiles and manufacturing) ...

  2. Evaluating Bilingual Education Using a Time Series Design.

    Science.gov (United States)

    McConnell, Beverly B.

    1982-01-01

    A six-year evaluation of the long-range benefits of bilingual education illustrates the utility of time series and between-group variance for programs involving linguistic and ethnic minority populations. Through the described Individualized Bilingual Instruction children were brought to a level of balanced bilingualism in approximately a…

  3. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    FIRST LADY

    0.002579 KG/month. Finally, this work adds to the growing body of literature on data-driven production and inventory management by utilizing historical data in the development of a useful forecasting mathematical model. Keywords: production model, inventory management, multivariate time series, production forecast.

  4. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  5. Finding Time Series Discord Based on Bit Representation Clustering

    NARCIS (Netherlands)

    Li, G.; Braysy, O.M.P.; Jiang, L.; Wu, Z.; Wang, Y.

    2013-01-01

    The problem of finding time series discord has attracted much attention recently due to its numerous applications and several algorithms have been suggested. However, most of them suffer from high computation cost and cannot satisfy the requirement of real applications. In this paper, we propose a

  6. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  7. Additive nonparametric reconstruction of dynamical systems from time series

    Science.gov (United States)

    Abel, Markus; Ahnert, Karsten; Kurths, Jürgen; Mandelj, Simon

    2005-01-01

    We present a nonparametric way to retrieve an additive system of differential equations in embedding space from a single time series. These equations can be treated with dynamical systems theory and allow for long-term predictions. We apply our method to a modified chaotic Chua oscillator in order to demonstrate its potential.

  8. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version...

  9. Economic growth - Quality of life nexus in Ethiopia: Time series ...

    African Journals Online (AJOL)

    This study examines the economic growth and quality of life (QoL) nexus in Ethiopia by using objective indicators of QoL from economic, social and political aspects, employing descriptive and time series analysis methods. The results from the descriptive analysis confirmed an increasing trend in all ...

  10. Financial Intermediation and the Nigerian Economy: A Time Series ...

    African Journals Online (AJOL)

    This paper examines the level of development of financial intermediation and how it impacts on economic growth of Nigeria. Using a time series data covering a period of 40 years (1970 –2009) and employing the econometric tool of Ordinary Least Squares (OLS) and cointegration analysis based on Engle Granger ...

  11. Publicly Verifiable Private Aggregation of Time-Series Data

    NARCIS (Netherlands)

    Bakondi, B.G.; Peter, A.; Everts, M.H.; Hartel, P.H.; Jonker, W.

    2015-01-01

    Aggregation of time-series data offers the possibility to learn certain statistics over data periodically uploaded by different sources. In case of privacy sensitive data, it is desired to hide every data provider's individual values from the other participants (including the data aggregator).

  12. Publicly Verifiable Private Aggregation of Time-Series Data

    NARCIS (Netherlands)

    Bakondi, Bence Gábor; Peter, Andreas; Everts, Maarten Hinderik; Hartel, Pieter H.; Jonker, Willem

    2015-01-01

    Aggregation of time-series data offers the possibility to learn certain statistics over data periodically uploaded by different sources. In case of privacy sensitive data, it is desired to hide every data provider’s individual values from the other participants (including the data aggregator).

  13. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    Dynamic Time Warping (DTW) is a widely used distance measure for time series that has been successfully used in science and many other application domains. As DTW is computationally expensive, there is a strong need for efficient query processing algorithms. Such algorithms exist for single queries. ... In many of today’s applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW ... queries. We base our approach on the observation that algorithms in areas such as data mining and interactive visualization incur many queries that share certain characteristics. Our solution exploits these shared characteristics by pruning database time series with respect to sets of queries, and we ...
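
    The paper's multi-query pruning scheme is not reproduced here; the following is only a minimal sketch of the underlying DTW distance (with an optional Sakoe-Chiba band) on which such query processing builds. The band width and toy sequences are illustrative assumptions.

```python
import numpy as np

def dtw_distance(a, b, band=None):
    """Dynamic Time Warping distance between 1-D sequences a and b,
    optionally constrained to a Sakoe-Chiba band of half-width `band`."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        lo = 1 if band is None else max(1, i - band)
        hi = m if band is None else min(m, i + band)
        for j in range(lo, hi + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            # extend the cheapest of the three admissible warping steps
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return np.sqrt(D[n, m])

query = np.sin(np.linspace(0, 6, 80))
candidate = np.sin(np.linspace(0.3, 6.3, 90))
print(dtw_distance(query, candidate, band=10))
```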

  14. [Anomaly Detection of Multivariate Time Series Based on Riemannian Manifolds].

    Science.gov (United States)

    Xu, Yonghong; Hou, Xiaoying; Li, Shuting; Cui, Jie

    2015-06-01

    Multivariate time series problems widely exist in production and everyday life. Anomaly detection has provided valuable information in financial, hydrological and meteorological applications, and in research areas such as earthquake monitoring, video surveillance and medicine. In order to quickly and efficiently find exceptions in a time sequence and present them in an intuitive way, in this study we combined Riemannian manifold methods with statistical process control charts, using a sliding window and describing each window of the multivariate time series by its covariance matrix, to achieve anomaly detection and its visualization. We used a simulated MA (moving average) data stream and abnormal electrocardiogram data from MIT-BIH as experimental objects to verify the anomaly detection method. The results showed that the method was reasonable and effective.
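
    A minimal sketch of the sliding-window covariance idea, assuming the affine-invariant Riemannian distance between symmetric positive-definite matrices and a crude control-chart threshold; the window size, reference state and threshold rule are illustrative, not the paper's procedure.

```python
import numpy as np
from scipy.linalg import eigvalsh

def riemannian_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices A and B:
    sqrt(sum(log(eig(A^-1 B))^2)), via the generalized eigenvalue problem."""
    w = eigvalsh(B, A)                    # generalized eigenvalues of (B, A)
    return np.sqrt(np.sum(np.log(w) ** 2))

def window_covariances(X, win):
    """Covariance matrix of each non-overlapping window of a series X (T, d)."""
    return [np.cov(X[t:t + win].T) + 1e-6 * np.eye(X.shape[1])
            for t in range(0, len(X) - win, win)]

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 3))
X[1200:1300, 1] *= 4                      # injected anomaly: variance change in one channel

covs = window_covariances(X, win=100)
ref = covs[0]                             # reference state taken from the first window
dists = np.array([riemannian_distance(ref, C) for C in covs])
limit = dists[:5].mean() + 3 * dists[:5].std()    # crude control limit from early windows
print("anomalous windows:", np.where(dists > limit)[0])
```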

  15. Time-series prediction of shellfish farm closure: A comparison of alternatives

    Directory of Open Access Journals (Sweden)

    Ashfaqur Rahman

    2014-08-01

    Full Text Available Shellfish farms are closed for harvest when microbial pollutants are present. Such pollutants are typically present in rainfall runoff from various land uses in catchments. Experts currently use a number of observable parameters (river flow, rainfall, salinity) as proxies to determine when to close farms. We have proposed using short-term historical rainfall data as a time-series prediction problem where we aim to predict the closure of shellfish farms based only on rainfall. Time-series event prediction consists of two steps: (i) feature extraction, and (ii) prediction. A number of data mining challenges exist for these scenarios: (i) which feature extraction method best captures the rainfall pattern over successive days that leads to opening or closure of the farms? (ii) The farm closure events occur infrequently and this leads to a class imbalance problem; the question is what is the best way to deal with this problem? In this paper we have analysed and compared different combinations of balancing methods (under-sampling and over-sampling), feature extraction methods (cluster profile, curve fitting, Fourier Transform, Piecewise Aggregate Approximation, and Wavelet Transform) and learning algorithms (neural network, support vector machine, k-nearest neighbour, decision tree, and Bayesian Network) to predict closure events accurately considering the above data mining challenges. We have identified the best combination of techniques to accurately predict shellfish farm closure from rainfall, given the above data mining challenges.
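
    A minimal sketch of one such combination: raw rainfall-window features, naive random over-sampling of the rare closure class, and an RBF support vector machine. The synthetic rainfall, the closure rule and the window length are illustrative stand-ins, not the paper's catchment data or its best-performing pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Illustrative daily rainfall and rare closure labels (closures follow heavy rain).
rain = rng.gamma(shape=0.5, scale=8.0, size=3000)
closed = (np.convolve(rain, np.ones(3), mode="same") > 40).astype(int)

# Feature extraction: the previous 7 days of rainfall for each day.
win = 7
X = np.array([rain[i - win:i] for i in range(win, len(rain))])
y = closed[win:]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)

# Naive over-sampling of the minority (closure) class in the training set.
pos = np.where(y_tr == 1)[0]
rep = rng.choice(pos, size=max(0, (y_tr == 0).sum() - pos.size), replace=True)
X_bal = np.vstack([X_tr, X_tr[rep]])
y_bal = np.concatenate([y_tr, y_tr[rep]])

clf = SVC(kernel="rbf", gamma="scale").fit(X_bal, y_bal)
print(classification_report(y_te, clf.predict(X_te)))
```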

  16. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  17. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets. Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  18. Dependency Structures in Differentially Coded Cardiovascular Time Series

    OpenAIRE

    Tatjana Tasic; Sladjana Jovanovic; Omer Mohamoud; Tamara Skoric; Nina Japundzic-Zigon; Dragana Bajic

    2017-01-01

    Objectives. This paper analyses temporal dependency in the time series recorded from aging rats, the healthy ones and those with early developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationship along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter ones were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar ra...

  19. An entropic approach to the analysis of time series

    Science.gov (United States)

    Scafetta, Nicola

    Statistical analysis of time series. With compelling arguments we show that the Diffusion Entropy Analysis (DEA) is the only method of the literature of the Science of Complexity that correctly determines the scaling hidden within a time series reflecting a Complex Process. The time series is thought of as a source of fluctuations, and the DEA is based on the Shannon entropy of the diffusion process generated by these fluctuations. All traditional methods of scaling analysis, instead, are based on the variance of this diffusion process. The variance methods detect the real scaling only if the Gaussian assumption holds true. We call H the scaling exponent detected by the variance methods and delta the real scaling exponent. If the time series is characterized by Fractional Brownian Motion, we have H = delta and the scaling can be safely determined, in this case, by using the variance methods. If, on the contrary, the time series is characterized, for example, by Levy statistics, H ≠ delta and the variance methods cannot be used to detect the true scaling. Levy walk yields the relation delta = 1/(3 - 2H). In the case of Levy flights, the variance diverges and the exponent H cannot be determined, whereas the scaling delta exists and can be established by using the DEA. Therefore, only the joint use of two different scaling analysis methods, the variance scaling analysis and the DEA, can assess the real nature, Gauss or Levy or something else, of a time series. Moreover, the DEA determines the information content, under the form of Shannon entropy, or of any other convenient entropic indicator, at each time step of the process that, given a sufficiently large number of data, is expected to become diffusion with scaling. This makes it possible to study the regime of transition from dynamics to thermodynamics, non-stationary regimes, and the saturation regime as well. First of all, the efficiency of the DEA is proved with theoretical arguments and with numerical work
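
    A minimal numerical sketch of the Diffusion Entropy Analysis idea described above: build diffusion trajectories from the fluctuations, estimate the Shannon entropy of their distribution at each window length t, and read the scaling exponent delta from S(t) = A + delta ln(t). The binning, window lengths and test signal are illustrative assumptions.

```python
import numpy as np

def diffusion_entropy(xi, times, bins=60):
    """Shannon entropy S(t) of the diffusion process generated by the
    fluctuations xi, for a set of window lengths t (DEA-style analysis)."""
    c = np.concatenate(([0.0], np.cumsum(xi)))
    S = []
    for t in times:
        x = c[t:] - c[:-t]                 # displacement over every window of length t
        p, edges = np.histogram(x, bins=bins, density=True)
        dx = edges[1] - edges[0]
        p = p[p > 0]
        S.append(-np.sum(p * np.log(p) * dx))   # entropy of the estimated density
    return np.array(S)

rng = np.random.default_rng(0)
xi = rng.normal(size=20000)                # uncorrelated Gaussian fluctuations
times = np.unique(np.logspace(0.3, 2.5, 20).astype(int))
S = diffusion_entropy(xi, times)

# Scaling exponent delta from S(t) = A + delta * ln(t); expect delta ~ 0.5 here.
delta = np.polyfit(np.log(times), S, 1)[0]
print("estimated delta:", round(delta, 3))
```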

  20. Nonlinear transformation on the transfer entropy of financial time series

    Science.gov (United States)

    Wu, Zhenyu; Shang, Pengjian

    2017-09-01

    Transfer entropy (TE) is now widely used in data mining and in economics. However, TE itself demands that the time series be stationary and satisfy the Markov condition. Naturally, we are interested in investigating the effect of nonlinear transformations of the two series on the TE. The paper is therefore designed to study the TE of five nonlinear "volatile" transformations based on data generated by linear models and by logistic maps, as well as on datasets from financial markets. Only one of the nonlinear transformations yields a TE that fluctuates around the TE of the original series; the TE of the others all increases to different degrees.

  1. Wavelet analysis on paleomagnetic (and computer simulated VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    Full Text Available We present a Continuous Wavelet Transform (CWT) data analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic and geodynamo simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra, and the local analysis helped disclose their statistical significance. Even if this feature could be an indication of orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of the excursions in the series. The comparison between the paleomagnetic and simulated spectra shows a similarity in the high-frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to have a guide in understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normality of the series and its inherent noise. The relative chronology found for the paleomagnetic data reversals was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of polarity stability intervals are found to be similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.
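
    A minimal sketch of a Morlet continuous wavelet transform of a VGP-latitude-like series, assuming PyWavelets is available; the toy signal, injected excursion and scale range are illustrative assumptions, not the paper's data or wavelet settings.

```python
import numpy as np
import pywt

# Toy stand-in for a VGP latitude series: mostly near +90 deg (normal polarity)
# with a periodic modulation and a brief excursion toward low latitudes.
n, dt = 2048, 1.0
t = np.arange(n) * dt
lat = 80 + 5 * np.sin(2 * np.pi * t / 128) + np.random.normal(0, 2, n)
lat[900:960] -= 120                    # injected excursion/reversal-like event

scales = np.arange(2, 128)
coeffs, freqs = pywt.cwt(lat - lat.mean(), scales, "morl", sampling_period=dt)

# Global wavelet spectrum (time-averaged power) and the dominant period.
power = np.abs(coeffs) ** 2
global_spectrum = power.mean(axis=1)
print("dominant period ~", 1.0 / freqs[np.argmax(global_spectrum)], "time units")
```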

  2. Multiscale Stochastic Generator of Multivariate Met-Ocean Time Series

    Science.gov (United States)

    Guanche, Yanira; Mínguez, Roberto; Méndez, Fernando J.

    2013-04-01

    The design of maritime structures requires information on the sea state conditions that influence their behavior during their life cycle. In recent decades there has been an increasing development of sea-state databases (buoys, reanalysis, satellite) that allow an accurate description of the marine climate and its interaction with a given structure in terms of functionality and stability. However, these databases have a limited time length, and their application entails an associated uncertainty. To avoid this limitation, engineers try to sample synthetically generated time series, statistically consistent with the observations, which allow the simulation of longer time periods. The present work proposes a hybrid methodology to deal with this issue. It is based on the combination of clustering algorithms (k-means) and an autoregressive logistic regression model (logit). Since the marine climate is directly related to the atmospheric conditions at a synoptic scale, the proposed methodology takes both systems into account, generating simultaneously circulation pattern (weather type) time series and the related sea state time series. The generation of these time series can be summarized in three steps: (1) by applying the k-means clustering technique, the atmospheric conditions are classified into a representative number of synoptic patterns; (2) taking into account the different covariates involved (such as seasonality, interannual variability, trends or an autoregressive term), the autoregressive logistic model is adjusted; (3) once the model is able to simulate weather type time series, the last step is to generate multivariate hourly met-ocean parameters related to these weather types. This is done by an autoregressive model (ARMA) for each variable, including cross-correlation between them. To show the goodness of the proposed method the following data have been used: Sea Level Pressure (SLP) databases from NCEP-NCAR and the Global Ocean Wave (GOW) reanalysis from IH Cantabria. The synthetic met-ocean hourly

  3. Time-series analysis of Music: Perceptual and Information Dynamics

    Directory of Open Access Journals (Sweden)

    Marcus T. Pearce

    2011-12-01

    Full Text Available Dean and Bailes (2010) provide a tutorial on the use of time-series analysis in research on music perception and a study of the influence of acoustic factors on real-time perception of music. They illustrate their approach with a detailed case study of an electroacoustic composition by Trevor Wishart. In this commentary, I discuss four aspects of Dean and Bailes’ presentation: first, the importance of focusing on dynamic changes in musical structure; second, the benefits of computer-generated music for research on music perception; third, the need for caution in averaging responses from multiple listeners; and finally, the role of time-series analysis in understanding computational information-dynamic models of music cognition.

  4. Environmental time series interpolation based on Spartan random processes

    Science.gov (United States)

    Žukovič, Milan; Hristopulos, D. T.

    In many environmental applications, time series are either incomplete or irregularly spaced. We investigate the application of the Spartan random process to missing data prediction. We employ a novel modified method of moments (MMoM) and the established method of maximum likelihood (ML) for parameter inference. The CPU time of MMoM is shown to be much faster than that of ML estimation and almost independent of the data size. We formulate an explicit Spartan interpolator for estimating missing data. The model validation is performed on both synthetic data and real time series of atmospheric aerosol concentrations. The prediction performance is shown to be comparable with that attained by means of the best linear unbiased (Kolmogorov-Wiener) predictor at reduced computational cost.

  5. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  6. A comprehensive characterization of recurrences in time series

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    Study of recurrences in earthquakes, climate, financial time-series, etc. is crucial to better forecast disasters and limit their consequences. However, almost all the previous phenomenological studies involved only a long-ranged autocorrelation function, or disregarded the multi-scaling properties induced by potential higher order dependencies. Consequently, they missed the facts that non-linear dependences do impact both the statistics and dynamics of recurrence times, and that scaling arguments for the unconditional distribution may not be applicable. We argue that copulas are the correct model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system specific, and cannot actually benefit from universal features, in contrast to previous claims. This has important implications in epilepsy prognosis and financial risk management applications.

  7. TimeSeriesStreaming.vi: LabVIEW program for reliable data streaming of large analog time series

    CERN Document Server

    Czerwinski, Fabian

    2010-01-01

    With modern data acquisition devices that work fast and very precise, scientists often face the task of dealing with huge amounts of data. These need to be rapidly processed and stored onto a hard disk. We present a LabVIEW program which reliably streams analog time series of MHz sampling. Its run time has virtually no limitation. We explicitly show how to use the program to extract time series from two experiments: For a photodiode detection system that tracks the position of an optically trapped particle and for a measurement of ionic current through a glass capillary. The program is easy to use and versatile as the input can be any type of analog signal. Also, the data streaming software is simple, highly reliable, and can be easily customized to include, e.g., real-time power spectral analysis and Allan variance noise quantification.

  8. Perception of acoustically presented time series with varied intervals.

    Science.gov (United States)

    Wackermann, Jiří; Pacer, Jakob; Wittmann, Marc

    2014-03-01

    Data from three experiments on serial perception of temporal intervals in the supra-second domain are reported. Sequences of short acoustic signals ("pips") separated by periods of silence were presented to the observers. Two types of time series, geometric or alternating, were used, where the modulus 1+δ of the inter-pip series and the base duration Tb (range from 1.1 to 6s) were varied as independent parameters. The observers had to judge whether the series were accelerating, decelerating, or uniform (3 paradigm), or to distinguish regular from irregular sequences (2 paradigm). "Intervals of subjective uniformity" (isus) were obtained by fitting Gaussian psychometric functions to individual subjects' responses. Progression towards longer base durations (Tb=4.4 or 6s) shifts the isus towards negative δs, i.e., accelerating series. This finding is compatible with the phenomenon of "subjective shortening" of past temporal intervals, which is naturally accounted for by the lossy integration model of internal time representation. The opposite effect observed for short durations (Tb=1.1 or 1.5s) remains unexplained by the lossy integration model, and presents a challenge for further research. © 2013 Elsevier B.V. All rights reserved.

  9. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.

  10. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, St. Lawrence, the English Channel and McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and the energy of each mode is normalised by the total energy over all modes for each region.

  11. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  12. West Africa land use and land cover time series

    Science.gov (United States)

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  13. The Puoko-nui CCD Time-Series Photometer

    Science.gov (United States)

    Chote, P.; Sullivan, D. J.

    2013-01-01

    Puoko-nui (te reo Maori for ‘big eye’) is a precision time series photometer developed at Victoria University of Wellington, primarily for use with the 1m McLellan telescope at Mt John University Observatory (MJUO), at Lake Tekapo, New Zealand. GPS based timing provides excellent timing accuracy, and online reduction software processes frames as they are acquired. The user is presented with a simple user interface that includes instrument control and an up to date lightcurve and Fourier amplitude spectrum of the target star. Puoko-nui has been operating in its current form since early 2011, where it is primarily used to monitor pulsating white dwarf stars.

  14. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.
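
    A minimal sketch of one of the "local" methods this kind of survey covers: delay-coordinate embedding of a scalar series followed by nearest-neighbour (zeroth-order local) prediction. The embedding dimension, delay, neighbour count and the logistic-map test signal are illustrative assumptions.

```python
import numpy as np

def embed(x, dim, tau):
    """Delay-coordinate embedding of a scalar series x."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def local_predict(x, dim=3, tau=1, k=5, horizon=1):
    """Predict x[t+horizon] as the average future of the k nearest
    neighbours of the current delay vector (zeroth-order local model)."""
    E = embed(x, dim, tau)
    query = E[-1]
    library = E[:-horizon - 1]                       # exclude the query's own neighbourhood
    futures = np.arange(len(library)) + horizon      # embedded index of each point's future
    d = np.linalg.norm(library - query, axis=1)
    nn = np.argsort(d)[:k]
    idx = (dim - 1) * tau + futures[nn]              # map embedded index back to series index
    return x[idx].mean()

# Toy chaotic data: the logistic map at r = 3.9.
x = np.empty(2000); x[0] = 0.4
for i in range(1999):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])

print("one-step prediction:", local_predict(x[:-1]), "actual:", x[-1])
```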

  15. Machine learning for cardiac ultrasound time series data

    Science.gov (United States)

    Yuan, Baichuan; Chitturi, Sathya R.; Iyer, Geoffrey; Li, Nuoyu; Xu, Xiaochuan; Zhan, Ruohan; Llerena, Rafael; Yen, Jesse T.; Bertozzi, Andrea L.

    2017-03-01

    We consider the problem of identifying frames in a cardiac ultrasound video associated with left ventricular chamber end-systolic (ES, contraction) and end-diastolic (ED, expansion) phases of the cardiac cycle. Our procedure involves a simple application of non-negative matrix factorization (NMF) to a series of frames of a video from a single patient. Rank-2 NMF is performed to compute two end-members. The end members are shown to be close representations of the actual heart morphology at the end of each phase of the heart function. Moreover, the entire time series can be represented as a linear combination of these two end-member states thus providing a very low dimensional representation of the time dynamics of the heart. Unlike previous work, our methods do not require any electrocardiogram (ECG) information in order to select the end-diastolic frame. Results are presented for a data set of 99 patients including both healthy and diseased examples.
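
    A minimal sketch of rank-2 NMF on a stack of frames, assuming each frame is flattened into a non-negative column of the data matrix; frames where each end-member's activation peaks are then candidates for the two phases. The synthetic "video" stands in for real ultrasound data.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Synthetic "video": 200 frames of 32x32, oscillating between two morphologies.
frames, h, w = 200, 32, 32
state_a = rng.random((h, w))          # stand-in for end-diastolic morphology
state_b = rng.random((h, w))          # stand-in for end-systolic morphology
phase = 0.5 * (1 + np.sin(np.linspace(0, 8 * np.pi, frames)))
video = np.array([p * state_a + (1 - p) * state_b for p in phase])

# Rank-2 NMF of the (pixels x frames) non-negative matrix.
V = video.reshape(frames, -1).T
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(V)            # two end-member images (pixels x 2)
H = model.components_                 # their activations over time (2 x frames)

print("frames closest to end-member 0:", np.argsort(-H[0])[:5])
print("frames closest to end-member 1:", np.argsort(-H[1])[:5])
```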

  16. Model of a synthetic wind speed time series generator

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2008-01-01

    Wind energy has assumed a great relevance in the operation and planning of today's power systems due to the exponential increase of installations in the last 10 years. For this reason, many studies have looked at suitable representations of wind generation for power system analysis. One of the main elements to consider for this purpose is the model of the wind speed that is usually required as input. Wind speed measurements may represent a solution to this problem, but, for techniques such as sequential Monte Carlo simulation, they have to be long enough in order to describe a wide range of possible wind conditions. If this information is not available, synthetic wind speed time series may be a useful tool as well, but their generator must preserve the statistical and stochastic features of the phenomenon. This paper deals with this issue: a generator for synthetic wind speed time series...

  17. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  18. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  19. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  20. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.

  1. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slowly vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  2. Time series prediction by feedforward neural networks - is it difficult?

    Science.gov (United States)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2003-04-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slowly vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  3. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
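
    A minimal sketch of the Granger causality screening step, using statsmodels on two simulated series that stand in for, say, the foreign exchange rate and the index returns; the lag range and the simulated driving relationship are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 400

# Simulated stand-ins: x "drives" y with a two-period lag.
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 2] + rng.normal(scale=0.5)

# Column order matters: the test asks whether the 2nd column Granger-causes the 1st.
data = pd.DataFrame({"index_returns": y, "fx_rate": x})
res = grangercausalitytests(data[["index_returns", "fx_rate"]], maxlag=4)
for lag, out in res.items():
    print(f"lag {lag}: ssr F-test p-value = {out[0]['ssr_ftest'][1]:.4f}")
```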

  4. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal... networks is proposed. The tool allows for assessment of the length of the effective memory of previous inputs built up in the recurrent network during application. Time series modeling is also treated from a more general point of view, namely modeling of the joint probability distribution function

  5. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In the present study, for all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations, and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.

    Table 1. Geographical location and climate conditions of the synoptic stations
    Station | Longitude (E) | Latitude (N) | Altitude (m) | Mean annual air temp. (°C) | Min.-max. air temp. (°C) | Mean precipitation (mm) | Climate (De Martonne classification)
    Esfahan | 51° 40' | 32° 37' | 1550.4 | 16.36 | 9.4-23.3  | 122 | Arid
    Semnan  | 53° 33' | 35° 35' | 1130.8 | 18.0  | 12.4-23.8 | 140 | Arid
    Shiraz  | 52° 36' | 29° 32' | 1484   | 18.0  | 10.2-25.9 | 324 | Semi-arid
    Kerman  | 56° 58' | 30° 15' | 1753.8 | 15.6  | 6.7-24.6  | 142 | Arid
    Yazd    | 54° 17' | 31° 54' | 1237.2 | 19.2  | 11.8-26.0 | 61  | Arid

    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
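
    The FAO Penman-Monteith equation mentioned above can be sketched as follows for a single daily record, assuming the standard FAO-56 daily form rather than the paper's monthly implementation; the input values are made up, and the radiation and pressure terms are taken as given rather than derived from station data.

```python
import math

def fao56_et0_daily(T, RH, u2, Rn, G=0.0, pressure=101.3):
    """FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
    T: mean air temperature (degC), RH: mean relative humidity (%),
    u2: wind speed at 2 m (m/s), Rn: net radiation (MJ m-2 day-1),
    G: soil heat flux (MJ m-2 day-1), pressure: air pressure (kPa)."""
    es = 0.6108 * math.exp(17.27 * T / (T + 237.3))    # saturation vapour pressure (kPa)
    ea = es * RH / 100.0                               # actual vapour pressure (kPa)
    delta = 4098.0 * es / (T + 237.3) ** 2             # slope of the vapour pressure curve
    gamma = 0.000665 * pressure                        # psychrometric constant (kPa/degC)
    num = 0.408 * delta * (Rn - G) + gamma * (900.0 / (T + 273.0)) * u2 * (es - ea)
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den

# Illustrative summer day for an arid station (values are made up).
print(round(fao56_et0_daily(T=28.0, RH=25.0, u2=2.5, Rn=15.0), 2), "mm/day")
```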

  6. Fast Algorithms for Mining Co-evolving Time Series

    Science.gov (United States)

    2011-09-01

    abundant in many application areas such as motion capture, sensor networks, weather forecasting, and financial market modeling. The major goal of analyzing ... studies of financial markets, network intrusion detection, forecasting, etc. Mining and forecasting are popular operations relevant to time series ...

  7. An Intervention Time Series Analysis: Specialization and Competitiveness in Sports”

    OpenAIRE

    Zuzana Janko; Janusz Kokoszewski

    2013-01-01

    We utilize a time-series intervention model in the spirit of Enders (1995) and ask to what extent – if any – institutional specialization improves competitiveness in sports. Specifically, we analyze the impact on the competitiveness of Polish swimmers internationally due to the establishment of high-school sport centers in late 1980's specializing in swimming. This allows us to measure the quantitative and qualitative effects of a standardized system on competitiveness. Our analysis shows tha...

  8. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  9. Multifractal analysis of time series generated by discrete Ito equations

    Science.gov (United States)

    Telesca, Luciano; Czechowski, Zbigniew; Lovallo, Michele

    2015-06-01

    In this study, we show that discrete Ito equations with short-tail Gaussian marginal distribution function generate multifractal time series. The multifractality is due to the nonlinear correlations, which are hidden in Markov processes and are generated by the interrelation between the drift and the multiplicative stochastic forces in the Ito equation. A link between the range of the generalized Hurst exponents and the mean of the squares of all averaged net forces is suggested.

  10. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness (very irregular data), interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
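
    The kernel idea described above can be illustrated in a few lines: observation pairs are weighted by how close their time difference is to the lag of interest, so no interpolation is needed. The sketch below is our own simplified illustration under that assumption (the function name and the kernel-width heuristic are ours, not the authors' code):

```python
import numpy as np

def gaussian_kernel_acf(t, x, lag, h=None):
    """Kernel-weighted autocorrelation at a given lag for an irregularly
    sampled series (t, x): pairs are weighted by a Gaussian kernel of how
    close their time difference is to the requested lag."""
    x = (x - x.mean()) / x.std()
    dt = t[None, :] - t[:, None]           # all pairwise time differences
    if h is None:
        h = 0.25 * np.median(np.diff(t))   # kernel width: a common heuristic
    w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
    np.fill_diagonal(w, 0.0)               # exclude zero-lag self pairs
    return np.sum(w * np.outer(x, x)) / np.sum(w)

# toy usage: a quasi-periodic signal observed at irregular times
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 200))
x = np.sin(2 * np.pi * t / 20) + 0.3 * rng.standard_normal(t.size)
print(gaussian_kernel_acf(t, x, lag=1.0))
```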

  11. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract Background Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values Conclusion We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures
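
    As a minimal illustration of the model class favoured here, a three-parameter logistic curve can be fitted to a single sparse time profile with ordinary nonlinear least squares; the study itself uses nonlinear mixed-effects models and likelihood ratio tests, which this sketch deliberately omits (the data values and parameter names below are invented):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic3(t, a, b, k):
    """Three-parameter logistic: asymptote a, midpoint b, growth rate k."""
    return a / (1.0 + np.exp(-k * (t - b)))

# sparse, synthetic glucose-consumption-like profile (hours vs. cumulative amount)
t = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)
y = np.array([0.4, 1.1, 2.9, 5.8, 7.6, 8.3, 8.5])

params, _ = curve_fit(logistic3, t, y, p0=[8.0, 12.0, 0.3])
print(dict(zip(["a", "b", "k"], params)))
```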

  12. Reconstruction of network topology using status-time-series data

    Science.gov (United States)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information of the network structure can help to devise the control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from the STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. High accuracy and efficiency of the proposed reconstruction procedure from the status-time-series data define the novelty of the method. Our proposed method outperforms compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to the weighted networks. The ordering of the edges in the weighted networks is identified with high accuracy.

  13. Genetic programming and serial processing for time series classification.

    Science.gov (United States)

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested in three different problems. Two of them are real world problems whose data were gathered for online or conference competitions. As there are published results for these two problems, this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  14. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  15. Complexity analysis of the UV radiation dose time series

    CERN Document Server

    Mihailovic, Dragutin T

    2013-01-01

    We have used the Lempel-Ziv and sample entropy measures to assess the complexity in the UV radiation activity in the Vojvodina region (Serbia) for the period 1990-2007. In particular, we have examined the reconstructed daily sum (dose) of the UV-B time series from seven representative places in this region and calculated the Lempel-Ziv Complexity (LZC) and Sample Entropy (SE) values for each time series. The results indicate that the LZC values in some places are close to each other while in others they differ. We have divided the period 1990-2007 into two subintervals: (a) 1990-1998 and (b) 1999-2007 and calculated LZC and SE values for the various time series in these subintervals. It is found that during the period 1999-2007, there is a decrease in their complexities, and corresponding changes in the SE, in comparison to the period 1990-1998. This complexity loss may be attributed to increased (i) human intervention in the post civil war period (land and crop use and urbanization) and military activities i...

  16. Acute ischaemic stroke prediction from physiological time series patterns

    Directory of Open Access Journals (Sweden)

    Qing Zhang

    2013-05-01

    Full Text Available Background Stroke is one of the major causes of human mortality. Recent clinical research has indicated that early changes in common physiological variables represent a potential therapeutic target, thus the manipulation of these variables may eventually yield an effective way to optimise stroke recovery. Aims We examined correlations between physiological parameters of patients during the first 48 hours after a stroke, and their stroke outcomes after 3 months. We wanted to discover physiological determinants that could be used to improve health outcomes by supporting the medical decisions that need to be made early in a patient's stroke experience. Method We applied regression-based machine learning techniques to build a prediction algorithm that can forecast 3-month outcomes from initial physiological time series data during the first 48 hours after stroke. In our method, not only did we use statistical characteristics as traditional prediction features, but we also adopted trend patterns of time series data as new key features. Results We tested our prediction method on a real physiological data set of stroke patients. The experimental results revealed an average precision rate of 90%. We also tested prediction methods considering only statistical characteristics of physiological data, which yielded an average precision rate of 71%. Conclusion We demonstrated that using trend pattern features in prediction methods improved the accuracy of stroke outcome prediction. Therefore, trend patterns of physiological time series data have an important role in the early treatment of patients with acute ischaemic stroke.
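
    The feature idea is easy to sketch: alongside plain summary statistics, simple trend-pattern descriptors of each physiological series are fed to a regression model. The example below is a hypothetical sketch with synthetic data and a generic ridge regressor, not the study's actual pipeline or feature set:

```python
import numpy as np
from sklearn.linear_model import Ridge

def series_features(x):
    """Summary statistics plus simple trend-pattern descriptors for one series."""
    slope = np.polyfit(np.arange(len(x)), x, 1)[0]   # overall linear trend
    diffs = np.sign(np.diff(x))
    reversals = np.mean(diffs[1:] != diffs[:-1])     # how often the trend reverses
    return [np.mean(x), np.std(x), np.min(x), np.max(x), slope, reversals]

rng = np.random.default_rng(1)
X = np.array([series_features(rng.normal(120, 10, 48)) for _ in range(50)])  # 48 h per patient
y = rng.normal(60, 15, 50)          # synthetic 3-month outcome scores
model = Ridge().fit(X, y)
print(model.predict(X[:3]))
```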

  17. Time series analysis of age related cataract hospitalizations and phacoemulsification

    Directory of Open Access Journals (Sweden)

    Moineddin Rahim

    2006-01-01

    Full Text Available Abstract Background Cataract surgery remains a commonly performed elective surgical procedure in the aging and the elderly. The purpose of this study was to utilize time series methodology to determine the temporal and seasonal variations and the strength of the seasonality in age-related (senile) cataract hospitalizations and phacoemulsification surgeries. Methods A retrospective, cross-sectional time series analysis was used to assess the presence and strength of seasonal and temporal patterns of age-related cataract hospitalizations and phacoemulsification surgeries from April 1, 1991 to March 31, 2002. Hospital admission rates for senile cataract (n = 70,281) and phacoemulsification (n = 556,431) were examined to determine monthly rates of hospitalization per 100,000 population. Time series methodology was then applied to the monthly aggregates. Results During the study period, age-related cataract hospitalizations in Ontario have declined from approximately 40 per 100,000 to only one per 100,000. Meanwhile, the use of phacoemulsification procedures has risen dramatically. The study found evidence of biannual peaks in both procedures during the spring and autumn months, and summer and winter troughs. Statistical analysis revealed significant overall seasonal patterns for both age-related cataract hospitalizations and phacoemulsifications (p Conclusion This study illustrates the decline in age-related cataract hospitalizations in Ontario resulting from the shift to outpatient phacoemulsification surgery, and demonstrates the presence of biannual peaks (a characteristic indicative of seasonality) in hospitalization and phacoemulsification during the spring and autumn throughout the study period.
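
    Seasonality of this kind can be inspected with a standard decomposition of the monthly aggregates. The sketch below uses statsmodels' classical seasonal decomposition on synthetic monthly rates shaped like the study period; it is only an illustration of the general approach, not the authors' time series method:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# synthetic monthly hospitalization rate per 100,000 with two peaks per year
idx = pd.date_range("1991-04-01", periods=132, freq="MS")
rng = np.random.default_rng(0)
rate = (30 - 0.2 * np.arange(132)                      # declining trend
        + 3 * np.sin(2 * np.pi * np.arange(132) / 6)   # biannual seasonal cycle
        + rng.normal(0, 1, 132))
series = pd.Series(rate, index=idx)

decomp = seasonal_decompose(series, model="additive", period=12)
print(decomp.seasonal.head(12))    # the repeating within-year pattern
```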

  18. Data visualization in interactive maps and time series

    Science.gov (United States)

    Maigne, Vanessa; Evano, Pascal; Brockmann, Patrick; Peylin, Philippe; Ciais, Philippe

    2014-05-01

    State-of-the-art data visualization has little in common with the plots and maps we used a few years ago. Many open-source tools are now available to provide access to scientific data and to implement accessible, interactive, and flexible web applications. Here we present a web site, opened in November 2013, to create custom global and regional maps and time series from research models and datasets. For maps, we explore and access data sources from a THREDDS Data Server (TDS) with the OGC WMS protocol (using the ncWMS implementation), then create interactive maps with the OpenLayers JavaScript library and extra information layers from a GeoServer. Maps become dynamic, zoomable, synchronously connected to each other, and exportable to Google Earth. For time series, we extract data from a TDS with the NetCDF Subset Service (NCSS), then display interactive graphs with a custom library based on the Data Driven Documents JavaScript library (D3.js). This time series application provides dynamic functionalities such as interpolation, interactive zoom on different axes, display of point values, and export to different formats. These tools were implemented for the Global Carbon Atlas (http://www.globalcarbonatlas.org): a web portal to explore, visualize, and interpret global and regional carbon fluxes from various model simulations arising from both human activities and natural processes, a work led by the Global Carbon Project.

  19. Coastline detection with time series of SAR images

    Science.gov (United States)

    Ao, Dongyang; Dumitru, Octavian; Schwarz, Gottfried; Datcu, Mihai

    2017-10-01

    For maritime remote sensing, coastline detection is a vital task. With continuous coastline detection results from satellite image time series, the actual shoreline, the sea level, and environmental parameters can be observed to support coastal management and disaster warning. Established coastline detection methods are often based on SAR images and well-known image processing approaches. These methods involve a lot of complicated data processing, which is a big challenge for remote sensing time series. Additionally, a number of SAR satellites operating with polarimetric capabilities have been launched in recent years, and many investigations of target characteristics in radar polarization have been performed. In this paper, a fast and efficient coastline detection method is proposed which comprises three steps. First, we calculate a modified correlation coefficient of two SAR images of different polarization. This coefficient differs from the traditional computation, where normalization is needed. Through this modified approach, the separation between sea and land becomes more prominent. Second, we set a histogram-based threshold to distinguish between sea and land within the given image. The histogram is derived from the statistical distribution of the polarized SAR image pixel amplitudes. Third, we extract continuous coastlines using a Canny image edge detector that is rather immune to speckle noise. Finally, the individual coastlines derived from time series of SAR images can be checked for changes.
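
    The three-step pipeline can be prototyped compactly with standard array and image-processing tools. The sketch below is a strongly simplified stand-in (a local product of the two polarizations for step 1, Otsu's threshold for the histogram step, and scikit-image's Canny detector), so its details differ from the paper's modified correlation coefficient:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.filters import threshold_otsu
from skimage.feature import canny

def coastline_from_dual_pol(vv, vh, win=9):
    """Sketch of the three steps: (1) local cross-polarization statistic,
    (2) histogram-based sea/land threshold, (3) Canny edges as coastline."""
    corr = uniform_filter(vv * vh, size=win)       # un-normalized local "correlation"
    land = corr > threshold_otsu(corr)             # histogram-based threshold
    return canny(land.astype(float), sigma=2.0)    # edge of the land/sea mask

# toy scene: brighter "land" half with speckle-like gamma noise
rng = np.random.default_rng(0)
vv = rng.gamma(2.0, 1.0, (128, 128)); vv[:, 64:] *= 4
vh = rng.gamma(2.0, 1.0, (128, 128)); vh[:, 64:] *= 3
print(coastline_from_dual_pol(vv, vh).sum(), "edge pixels")
```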

  20. Testing frequency-domain causality in multivariate time series.

    Science.gov (United States)

    Faes, Luca; Porta, Alberto; Nollo, Giandomenico

    2010-08-01

    We introduce a new hypothesis-testing framework, based on surrogate data generation, to assess in the frequency domain, the concept of causality among multivariate (MV) time series. The approach extends the traditional Fourier transform (FT) method for generating surrogate data in a MV process and adapts it to the specific issue of causality. It generates causal FT (CFT) surrogates with FT modulus taken from the original series, and FT phase taken from a set of series with causal interactions set to zero over the direction of interest and preserved over all other directions. Two different zero-setting procedures, acting on the parameters of a MV autoregressive (MVAR) model fitted on the original series, were used to test the null hypotheses of absence of direct causal influence (CFTd surrogates) and of full (direct and indirect) causal influence (CFTf surrogates), respectively. CFTf and CFTd surrogates were utilized in combination with the directed coherence (DC) and the partial DC (PDC) spectral causality estimators, respectively. Simulations reproducing different causality patterns in linear MVAR processes demonstrated the better accuracy of CFTf and CFTd surrogates with respect to traditional FT surrogates. Application on real MV biological data measured from healthy humans, i.e., heart period, arterial pressure, and respiration variability, as well as multichannel EEG signals, showed that CFT surrogates disclose causal patterns in accordance with expected cardiorespiratory and neurophysiological mechanisms.
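
    The building block behind such surrogates is the classical phase-randomized Fourier transform surrogate, which preserves the FT modulus of a series while scrambling its phases; the paper's CFT surrogates additionally constrain the phases with a causality-restricted MVAR model, which is not reproduced here. A minimal sketch of the classical step:

```python
import numpy as np

def ft_surrogate(x, rng=None):
    """Classical FT surrogate: keep the Fourier modulus, randomize the phases."""
    rng = np.random.default_rng() if rng is None else rng
    X = np.fft.rfft(x - x.mean())
    phases = rng.uniform(0, 2 * np.pi, X.size)
    phases[0] = 0.0                      # keep the mean component real
    if x.size % 2 == 0:
        phases[-1] = 0.0                 # keep the Nyquist component real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size) + x.mean()

x = np.sin(np.linspace(0, 20 * np.pi, 512)) \
    + 0.1 * np.random.default_rng(2).standard_normal(512)
s = ft_surrogate(x)
# the amplitude spectrum is preserved while the temporal structure is destroyed
print(np.allclose(np.abs(np.fft.rfft(x - x.mean())), np.abs(np.fft.rfft(s - s.mean()))))
```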

  1. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters, in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that effectively presents and helps us manage the real-time database.

  2. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  3. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS), Normalized Difference Vegetation Index (NDVI), and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.
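
    A comparable decomposition can be reproduced with an off-the-shelf EEMD implementation. The sketch below assumes the third-party PyEMD package (distributed as EMD-signal) and a synthetic NDVI-like series; how the resulting IMFs are grouped into trend and seasonal components is a heuristic choice here, not the paper's selection rule:

```python
import numpy as np
from PyEMD import EEMD   # assumes the PyEMD package (pip install EMD-signal)

# synthetic NDVI-like series: trend + annual cycle (23 16-day composites/year) + noise
t = np.arange(23 * 10, dtype=float)
ndvi = (0.4 + 0.001 * t + 0.2 * np.sin(2 * np.pi * t / 23)
        + 0.05 * np.random.default_rng(0).standard_normal(t.size))

eemd = EEMD(trials=100)          # ensemble size for the noise-assisted sifting
imfs = eemd.eemd(ndvi, t)        # intrinsic mode functions, fastest oscillations first
print(imfs.shape)                # (number of IMFs, len(t))

# crude grouping: high-frequency IMFs ~ noise, the IMF with a period near 23
# samples ~ seasonal component, the slowest IMFs plus what remains ~ trend
residue = ndvi - imfs.sum(axis=0)
```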

  4. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  5. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, William; Masuoka, Penny

    1997-05-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file.

  6. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being the synthetic rubber derived from petroleum. Similarly to what happens with countless other products, the forecast of future prices of natural rubber has been the object of many studies. The use of univariate time-series forecasting models stands out as the more accurate and useful way to reduce the uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg) in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The models studied were the ones belonging to the ARIMA family. The main results were: the domestic market of natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA (1,1,1) model provided the best adjustment of the time series of prices of natural rubber (R$/kg); the forecasts made for the series provided statistically adequate fits.
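
    An ARIMA(1,1,1) fit and a 12-month forecast of the kind described can be reproduced with statsmodels; the sketch below uses a synthetic monthly price series in place of the study's R$/kg data:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# synthetic monthly price series standing in for the Jan/99-Jun/2006 data (90 months)
rng = np.random.default_rng(0)
idx = pd.date_range("1999-01-01", periods=90, freq="MS")
prices = pd.Series(2.0 + np.cumsum(0.02 + 0.05 * rng.standard_normal(90)), index=idx)

model = ARIMA(prices, order=(1, 1, 1)).fit()    # the ARIMA(1,1,1) form selected in the study
forecast = model.forecast(steps=12)             # 12-month horizon (Jul/2006-Jun/2007)
print(forecast.head())
```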

  7. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing.

  8. Loading effects in GPS vertical displacement time series

    Science.gov (United States)

    Memin, A.; Boy, J. P.; Santamaría-Gómez, A.; Watson, C.; Gravelle, M.; Tregoning, P.

    2015-12-01

    Surface deformations due to loading, which still lack a comprehensive representation, account for a significant part of the variability in geodetic time series. We assess the effects of loading in GPS vertical displacement time series in several frequency bands. We compare displacements derived from up-to-date loading models to two global sets of positioning time series, and investigate how the observed variability is reduced at interannual periods (> 2 months), intermediate periods (> 7 days) and over the whole spectrum (> 1 day). We assess the impact of interannual loading on estimating velocities. We compute atmospheric loading effects using surface pressure fields from the ECMWF. We use the inverted barometer (IB) hypothesis, valid for periods exceeding a week, to describe the ocean response to the pressure forcing. We use general circulation ocean models (ECCO and GLORYS) to account for wind, heat and fresh water fluxes. We separately use the Toulouse Unstructured Grid Ocean model (TUGO-m), forced by air pressure and winds, to represent the dynamics of the ocean response at high frequencies. The continental water storage is described using the GLDAS/Noah and MERRA-land models. Non-hydrology loading reduces the variability of the observed vertical displacement differently according to the frequency band. The hydrology loading leads to a further reduction mostly at annual periods. ECMWF+TUGO-m agrees better with vertical surface motion than the ECMWF+IB model at all frequencies. The interannual deformation is time-correlated at most of the locations. It is adequately described by a power-law process of spectral index varying from -1.5 to -0.2. Depending on the power-law parameters, the predicted non-linear deformation due to mass loading variations leads to vertical velocity biases of up to 0.7 mm/yr when estimated from 5 years of continuous observations. The maximum velocity bias can reach up to 1 mm/yr in regions around the southern Tropical band.

  9. Detecting switching and intermittent causalities in time series

    Science.gov (United States)

    Zanin, Massimiliano; Papo, David

    2017-04-01

    During the last decade, complex network representations have emerged as a powerful instrument for describing the cross-talk between different brain regions both at rest and as subjects are carrying out cognitive tasks, in healthy brains and neurological pathologies. The transient nature of such cross-talk has nevertheless by and large been neglected, mainly due to the inherent limitations of some metrics, e.g., causality ones, which require a long time series in order to yield statistically significant results. Here, we present a methodology to account for intermittent causal coupling in neural activity, based on the identification of non-overlapping windows within the original time series in which the causality is strongest. The result is a less coarse-grained assessment of the time-varying properties of brain interactions, which can be used to create a high temporal resolution time-varying network. We apply the proposed methodology to the analysis of the brain activity of control subjects and alcoholic patients performing an image recognition task. Our results show that short-lived, intermittent, local-scale causality is better at discriminating both groups than global network metrics. These results highlight the importance of the transient nature of brain activity, at least under some pathological conditions.

  10. Long-term time series prediction using OP-ELM.

    Science.gov (United States)

    Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury

    2014-03-01

    In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for long-term time series prediction, i.e. Recursive, Direct and DirRec, are considered in combination with OP-ELM and compared with a baseline linear least squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies DirRec is the most time consuming and its usage with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, being also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all our experiments, except one, OP-ELM with the DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of the three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELMs is studied and it is shown that averaging predictions of the ensemble can improve the accuracy (Mean Square Error) dramatically. Copyright © 2013 Elsevier Ltd. All rights reserved.
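
    The Recursive and Direct strategies mentioned above are model-agnostic and easy to sketch; below they are shown with a generic ridge regressor standing in for OP-ELM (which is not part of common libraries), so this illustrates the strategies only, not OP-ELM itself:

```python
import numpy as np
from sklearn.linear_model import Ridge   # stand-in learner; the paper uses OP-ELM

def forecast_recursive(x, lags, horizon):
    """Recursive strategy: one 1-step-ahead model, fed back its own predictions."""
    X = np.column_stack([x[i:len(x) - lags + i] for i in range(lags)])
    model = Ridge().fit(X, x[lags:])
    window = list(x[-lags:])
    preds = []
    for _ in range(horizon):
        preds.append(model.predict([window[-lags:]])[0])
        window.append(preds[-1])
    return np.array(preds)

def forecast_direct(x, lags, horizon):
    """Direct strategy: a dedicated model trained for each forecast horizon h."""
    preds = []
    for h in range(1, horizon + 1):
        X = np.column_stack([x[i:len(x) - lags - h + 1 + i] for i in range(lags)])
        model = Ridge().fit(X, x[lags + h - 1:])
        preds.append(model.predict([x[-lags:]])[0])
    return np.array(preds)

x = np.sin(np.arange(300) * 0.1) + 0.05 * np.random.default_rng(0).standard_normal(300)
print(forecast_recursive(x, lags=10, horizon=5))
print(forecast_direct(x, lags=10, horizon=5))
```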

  11. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
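
    A much-simplified version of such a weighting scheme gives each observation a weight proportional to the time span it "covers", so that clumps of points do not dominate the estimates. The sketch below is our own reduced illustration, not the paper's full noise-adaptive scheme:

```python
import numpy as np

def interval_weights(t):
    """Weights proportional to half the gap to each neighbouring observation."""
    t = np.asarray(t, dtype=float)
    gaps = np.diff(t)
    w = np.empty_like(t)
    w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
    w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
    return w / w.sum()

def weighted_mean_std(t, x):
    w = interval_weights(t)
    mean = np.sum(w * x)
    return mean, np.sqrt(np.sum(w * (x - mean) ** 2))

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, 80))          # clumpy, uneven sampling
x = np.sin(2 * np.pi * t / 25) + 0.1 * rng.standard_normal(t.size)
print(weighted_mean_std(t, x))
```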

  12. Synthesis of rainfall time series in a high temporal resolution

    Science.gov (United States)

    Callau Poduje, Ana Claudia; Haberlandt, Uwe

    2014-05-01

    In order to optimize the design and operation of urban drainage systems, long and continuous rain series in a high temporal resolution are essential. As the length of the rainfall records is often short, particularly the data available with the temporal and regional resolutions required for urban hydrology, it is necessary to find some numerical representation of the precipitation phenomenon to generate long synthetic rainfall series. An Alternating Renewal Model (ARM) is applied for this purpose, which consists of two structures: external and internal. The former is the sequence of wet and dry spells, described by their durations which are simulated stochastically. The internal structure is characterized by the amount of rain corresponding to each wet spell and its distribution within the spell. A multivariate frequency analysis is applied to analyze the internal structure of the wet spells and to generate synthetic events. The stochastic time series must reproduce the statistical characteristics of observed high resolution precipitation measurements used to generate them. The spatio-temporal interdependencies between stations are addressed by resampling the continuous synthetic series based on the Simulated Annealing (SA) procedure. The state of Lower-Saxony and surrounding areas, located in the north-west of Germany is used to develop the ARM. A total of 26 rainfall stations with high temporal resolution records, i.e. rainfall data every 5 minutes, are used to define the events, find the most suitable probability distributions, calibrate the corresponding parameters, simulate long synthetic series and evaluate the results. The length of the available data ranges from 10 to 20 years. The rainfall series involved in the different steps of calculation are compared using a rainfall-runoff model to simulate the runoff behavior in urban areas. The EPA Storm Water Management Model (SWMM) is applied for this evaluation. The results show a good representation of the

  13. Comparison of Statistical Models for Analyzing Wheat Yield Time Series

    Science.gov (United States)

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha−1 year−1 in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale. PMID:24205280
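
    Of the two best-performing model families, the Holt-Winters (exponential smoothing) variant with an additive trend is straightforward to run with statsmodels; the yield series below is synthetic, and the dynamic linear models discussed in the paper would require a state-space formulation instead:

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# synthetic national wheat-yield series (t/ha), one value per year, 1961-2010
rng = np.random.default_rng(0)
yield_tha = 2.0 + 0.05 * np.arange(50) + 0.2 * rng.standard_normal(50)

# Holt's linear-trend smoothing, i.e. the non-seasonal Holt-Winters variant
fit = ExponentialSmoothing(yield_tha, trend="add").fit()
print(fit.forecast(5))   # five-year-ahead yield forecast
```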

  14. Time-series modeling of gross migration and dynamic equilibrium.

    Science.gov (United States)

    Tabuchi, T

    1985-02-01

    Firstly, the high association between in- and out-migration is investigated in a time-series context and modeled according to three categories: 1) job transfer, 2) job search and marriage, and 3) return migration. Under certain conditions it is shown that aggregation of these migrations yields a bivariate time-series model having feedbacks in both directions. Secondly, the recent phenomenon of sharp changes in net migration seems to be discontinuous and, hence, catastrophic modeling [Casetti (1981)] may be appropriate. However, this paper considers gross migration between cores (metropolitan areas) and peripheries (rest of the nation) for which a continuous function seems adequate. This is done by introducing a multivariate time-series model. This model is empirically supported, especially in Japan, divided into 32 regions, by t-tests and Durbin-Watson ratios, although it excludes economic variables such as employment growth and wage differentials. This may imply that the recent dispersal from core to peripheral regions could be explained primarily by feedback from return migrants. Finally, provided future streams of gross migration follow the past trends given by simultaneous equation estimates, in-migration and out-migration would approach a stable state in most regions. Irrespective of random shocks in the future, in- and out-migration would tend to approach a stable equilibrium. According to the estimation of the stable states, the 45 core regions in the US would continue to lose population through net outflows while those in Japan would continue to gain. The present model may thus be valid only for short-term forecasts. By introducing feedback and lag structures, however, it does offer one explanation for the recent population turnaround.

  15. Comparison of statistical models for analyzing wheat yield time series.

    Science.gov (United States)

    Michel, Lucie; Makowski, David

    2013-01-01

    The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha⁻¹ year⁻¹ in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale.

  16. Comparison of statistical models for analyzing wheat yield time series.

    Directory of Open Access Journals (Sweden)

    Lucie Michel

    Full Text Available The world's population is predicted to exceed nine billion by 2050 and there is increasing concern about the capability of agriculture to feed such a large population. Foresight studies on food security are frequently based on crop yield trends estimated from yield time series provided by national and regional statistical agencies. Various types of statistical models have been proposed for the analysis of yield time series, but the predictive performances of these models have not yet been evaluated in detail. In this study, we present eight statistical models for analyzing yield time series and compare their ability to predict wheat yield at the national and regional scales, using data provided by the Food and Agriculture Organization of the United Nations and by the French Ministry of Agriculture. The Holt-Winters and dynamic linear models performed equally well, giving the most accurate predictions of wheat yield. However, dynamic linear models have two advantages over Holt-Winters models: they can be used to reconstruct past yield trends retrospectively and to analyze uncertainty. The results obtained with dynamic linear models indicated a stagnation of wheat yields in many countries, but the estimated rate of increase of wheat yield remained above 0.06 t ha⁻¹ year⁻¹ in several countries in Europe, Asia, Africa and America, and the estimated values were highly uncertain for several major wheat producing countries. The rate of yield increase differed considerably between French regions, suggesting that efforts to identify the main causes of yield stagnation should focus on a subnational scale.

  17. Thorium isotopes tracing the iron cycle at the Hawaii Ocean Time-series Station ALOHA

    Science.gov (United States)

    Hayes, Christopher T.; Fitzsimmons, Jessica N.; Boyle, Edward A.; McGee, David; Anderson, Robert F.; Weisend, Rachel; Morton, Peter L.

    2015-11-01

    The role of iron as a limiting micronutrient motivates an effort to understand the supply and removal of lithogenic trace metals in the ocean. The long-lived thorium isotopes (232Th and 230Th) in seawater can be used to quantify the input of lithogenic metals attributable to the partial dissolution of aerosol dust. Thus, Th can help in disentangling the Fe cycle by providing an estimate of its ultimate supply and turnover rate. Here we present time-series (1994-2014) data on thorium isotopes and iron concentrations in seawater from the Hawaii Ocean Time-series Station ALOHA. By comparing Th-based dissolved Fe fluxes with measured dissolved Fe inventories, we derive Fe residence times of 6-12 months for the surface ocean. Therefore, Fe inventories in the surface ocean are sensitive to seasonal changes in dust input. Ultrafiltration results further reveal that Th has a much lower colloidal content than Fe does, despite a common source. On this basis, we suggest Fe colloids may be predominantly organic in composition, at least at Station ALOHA. In the deep ocean (>2 km), Fe approaches a solubility limit while Th, surprisingly, is continually leached from lithogenic particles. This distinction has implications for the relevance of Fe ligand availability in the deep ocean, but also suggests Th is not a good tracer for Fe in deep waters. While uncovering divergent behavior of these elements in the water column, this study finds that dissolved Th flux is a suitable proxy for the supply of Fe from dust in the remote surface ocean.

  18. Variability of African Farming Systems from Phenological Analysis of NDVI Time Series

    Science.gov (United States)

    Vrieling, Anton; deBeurs, K. M.; Brown, Molly E.

    2011-01-01

    Food security exists when people have access to sufficient, safe and nutritious food at all times to meet their dietary needs. The natural resource base is one of the many factors affecting food security. Its variability and decline create problems for local food production. In this study we characterize vegetation phenology for sub-Saharan Africa and assess variability and trends of phenological indicators based on NDVI time series from 1982 to 2006. We focus on cumulated NDVI over the season (cumNDVI), which is a proxy for net primary productivity. Results are aggregated at the level of major farming systems, while also determining spatial variability within farming systems. High temporal variability of cumNDVI occurs in semiarid and subhumid regions. The results show a large area of positive cumNDVI trends between Senegal and South Sudan. These correspond to positive CRU rainfall trends and relate to recovery after the 1980s droughts. We find significant negative cumNDVI trends near the south coast of West Africa (Guinea coast) and in Tanzania. For each farming system, causes of change and variability are discussed based on available literature (Appendix A). Although food security comprises more than the local natural resource base, our results can provide an input for food security analysis by identifying zones of high variability or downward trends. Farming systems are found to be a useful level of analysis. Diversity and trends found within farming system boundaries underline that farming systems are dynamic.

  19. Acoustic and optical methods to infer water transparency at Time Series Station Spiekeroog, Wadden Sea

    Science.gov (United States)

    Schulz, Anne-Christin; Badewien, Thomas H.; Garaba, Shungudzemwoyo P.; Zielinski, Oliver

    2016-11-01

    Water transparency is a primary indicator of optical water quality that is driven by suspended particulate and dissolved material. A data set from the operational Time Series Station Spiekeroog located at a tidal inlet of the Wadden Sea was used to perform (i) an inter-comparison of observations related to water transparency, (ii) correlation tests among these measured parameters, and (iii) an exploration of the utility of both acoustic and optical tools in monitoring water transparency. An Acoustic Doppler Current Profiler was used to derive the backscatter signal in the water column. Optical observations were collected using above-water hyperspectral radiometers and a submerged turbidity metre. Bio-fouling on the turbidity sensor's optical windows resulted in measurement drift and abnormal values during quality control steps. We observed significant correlations between turbidity collected by the submerged metre and that derived from above-water radiometer observations. Turbidity from these sensors was also associated with the backscatter signal derived from the acoustic measurements. These findings suggest that both optical and acoustic measurements can be reasonable proxies of water transparency with the potential to mitigate gaps and increase data quality in long-term observation of marine environments.

  20. Time-series analysis of vibrational nuclear wave packet dynamics

    Science.gov (United States)

    Thumm, Uwe; Niederhausen, Thomas; Feuerstein, Bernold

    2008-10-01

    We discuss the extent to which measured time-dependent fragment kinetic energy release (KER) spectra and calculated nuclear probability densities can reveal 1) the transition frequencies between stationary vibrational states, 2) the nodal structure of stationary vibrational states, 3) the ground-state adiabatic electronic potential curve of the molecular ion, and 4) the progression of decoherence induced by random interactions with the environment. We illustrate our discussion with numerical simulations for the time-dependent nuclear motion of vibrational wave packets in the D2^+ molecular ion caused by the ionization of its neutral D2 parent molecule with an intense pump laser pulse. Based on a harmonic time-series analysis, we suggest a general scheme for the full reconstruction, up to an overall phase factor, of the initial wave packets based on measured KER spectra, cf., Phys. Rev. A 77, 063401 (2008).

  1. Assemblage time series reveal biodiversity change but not systematic loss.

    Science.gov (United States)

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  2. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, the increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering the important nodes than the common aggregating method.

  3. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Science.gov (United States)

    Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli

    2016-01-01

    In recent years, financial market dynamics forecasting has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combined Elman recurrent neural networks with stochastic time effective function. By analyzing the proposed model with the linear regression, complexity invariant distance (CID), and multiscale CID (MCID) analysis methods and taking the model compared with different models such as the backpropagation neural network (BPNN), the stochastic time effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices. PMID:27293423

  4. GPS time series at Campi Flegrei caldera (2000-2013

    Directory of Open Access Journals (Sweden)

    Prospero De Martino

    2014-05-01

    Full Text Available The Campi Flegrei caldera is an active volcanic system associated with a high volcanic risk, and represents a well-known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods, followed by subsidence phases with some episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation of the area. Seven main uplift episodes were detected, and during each uplift period, the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurs. The complete archive of GPS time series for the Campi Flegrei area is reported in the Supplementary materials. These data can be useful for the scientific community in improving research on Campi Flegrei caldera dynamics and hazard assessment.

  5. Pseudometrics for Nearest Neighbor Classification of Time Series Data

    Directory of Open Access Journals (Sweden)

    Boonserm Kijsirikul

    2009-05-01

    Full Text Available We propose that the pseudometric, a subadditive distance measure, has sufficient properties to be a good structure for performing nearest neighbor pattern classification. There exist some theoretical results that asymptotically guarantee the classification accuracy of the k-nearest neighbor classifier as the sample size grows larger. These results hold true under the assumption that the distance measure is a metric. The results still hold for pseudometrics up to some technicality. Whether the results are valid for non-subadditive distance measures is still left unanswered. Pseudometrics are also practically appealing. Once we have a subadditive distance measure, the measure will have at least one significant advantage over non-subadditive ones; one can directly plug such a distance measure into systems which exploit the subadditivity to perform faster nearest neighbor search techniques. This work focuses on pseudometrics for time series. We propose two frameworks for studying and designing subadditive distance measures and a few examples of distance measures resulting from the frameworks. One framework is more general than the other and can be used to tailor distances from the other framework to gain better classification performance. Experimental results of nearest neighbor classification with the designed pseudometrics, in comparison with well-known existing distance measures including Dynamic Time Warping, showed that the designed distance measures are practical for time series classification.

  6. Indirect inference with time series observed with error

    DEFF Research Database (Denmark)

    Rossi, Eduardo; Santucci de Magistris, Paolo

    We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect...... to estimate the parameters of continuous-time stochastic volatility models with auxiliary specifications based on realized volatility measures. Monte Carlo simulations show the bias reduction of the indirect estimates obtained when the microstructure noise is explicitly modeled. Finally, an empirical...

  7. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  8. Albedo Pattern Recognition and Time-Series Analyses in Malaysia

    Science.gov (United States)

    Salleh, S. A.; Abd Latif, Z.; Mohd, W. M. N. Wan; Chan, A.

    2012-07-01

    Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify albedo pattern changes over Malaysia. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000-2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum values of the albedo. The rises and falls of the line graph show a similar trend to the daily observations, differing only in the magnitude or percentage of the rises and falls. It can therefore be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the changes of albedo with respect to the nebulosity index indicate that external factors also influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high negative linear

  9. Nonlinear analysis and prediction of time series in multiphase reactors

    CERN Document Server

    Liu, Mingyan

    2014-01-01

    This book reports on important nonlinear aspects or deterministic chaos issues in the systems of multi-phase reactors. The reactors treated in the book include gas-liquid bubble columns, gas-liquid-solid fluidized beds and gas-liquid-solid magnetized fluidized beds. The authors take pressure fluctuations in the bubble columns as time series for nonlinear analysis, modeling and forecasting. They present qualitative and quantitative nonlinear analysis tools, which include attractor phase-plane plots, correlation dimension, Kolmogorov entropy and largest Lyapunov exponent calculations, and local nonlinear short-term prediction.

  10. Disease management with ARIMA model in time series.

    Science.gov (United States)

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be carried out through time series analysis. In this study, we expect to measure the results and effects of interventions on the disease. Clinical studies have benefited from the use of these techniques, particularly because of the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution to researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
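
    As a minimal illustration of the workflow the abstract describes, the sketch below fits an ARIMA model to a synthetic weekly case-count series and produces a short forecast with confidence intervals; the data, the statsmodels library and the chosen order are assumptions for illustration, not the study's own material.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic weekly case counts standing in for real surveillance data.
rng = np.random.default_rng(42)
t = np.arange(200)
counts = 50 + 0.1 * t + 10 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 3, size=t.size)
series = pd.Series(counts, index=pd.date_range("2018-01-07", periods=t.size, freq="W"))

# Fit a simple ARIMA(1, 1, 1); in practice the order is chosen from ACF/PACF plots or AIC.
model = ARIMA(series, order=(1, 1, 1)).fit()
print(model.summary().tables[1])

# Forecast the next 8 weeks with confidence intervals.
print(model.get_forecast(steps=8).summary_frame())
```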

  11. Quantum spectrum as a time series: fluctuation measures.

    Science.gov (United States)

    Santhanam, M S; Bandyopadhyay, Jayendra N; Angom, Dilip

    2006-01-01

    The fluctuations in the quantum spectrum could be treated like a time series. In this framework, we explore the statistical self-similarity in the quantum spectrum using the detrended fluctuation analysis (DFA) and random matrix theory (RMT). We calculate the Hausdorff measure for the spectra of atoms and Gaussian ensembles and study their self-affine properties. We show that DFA is equivalent to the Delta3 statistics of RMT, unifying two different approaches. We exploit this connection to obtain theoretical estimates for the Hausdorff measure.
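
    A minimal NumPy sketch of first-order DFA, the technique named above, is given below: the series is integrated, divided into non-overlapping windows, linearly detrended in each window, and the root-mean-square fluctuation is recorded per window size. The white-noise test signal and window sizes are only illustrative assumptions, not a quantum spectrum.

```python
import numpy as np

def dfa(signal, window_sizes):
    """Detrended fluctuation analysis (DFA-1) of a 1-D series.

    Returns the fluctuation F(s) for each window size s; the scaling exponent is
    the slope of log F(s) versus log s.
    """
    x = np.asarray(signal, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrated (profile) series
    fluctuations = []
    for s in window_sizes:
        n_windows = len(profile) // s
        f2 = []
        for i in range(n_windows):
            segment = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, segment, 1)   # local linear detrending
            residual = segment - np.polyval(coeffs, t)
            f2.append(np.mean(residual ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    return np.array(fluctuations)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    noise = rng.normal(size=4096)                # white noise: expected exponent ~0.5
    sizes = np.unique(np.logspace(1, 3, 15).astype(int))
    F = dfa(noise, sizes)
    alpha = np.polyfit(np.log(sizes), np.log(F), 1)[0]
    print(f"estimated DFA exponent: {alpha:.2f}")
```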

  12. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    Genetic Algorithms (GAs) form a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (A.I.). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw-tooth GA and differential evolution GA. This research article presents a real-coded GA for predicting enrollments of the University of Alabama. The University of Alabama enrollment data form a fuzzy time series. Here, fuzzy logic is used to predict the enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with other well-known works, found satisfactory, and indicate that real-coded GAs are fast and accurate.

  13. Simulation of transcontinental wind and solar PV generation time series

    DEFF Research Database (Denmark)

    Nuño Martinez, Edgar; Maule, Petr; Hahmann, Andrea N.

    2018-01-01

    The deployment of Renewable Energy Sources (RES) is driving modern power systems towards a fundamental green transition. In this regard, there is a need to develop models to accurately capture the variability of wind and solar photovoltaic (PV) power at different geographical and temporal scales....... This paper presents a general methodology based on meteorological reanalysis techniques that allows simulating aggregated RES time series over large geographical areas. It also introduces a novel PV conversion approach based on aggregated power curves in order to capture the uncertainty associated...

  14. Ensemble Deep Learning for Biomedical Time Series Classification.

    Science.gov (United States)

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  15. Kernel canonical-correlation Granger causality for multiple time series.

    Science.gov (United States)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data. ©2011 American Physical Society
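
    As background for the multivariate, kernelised method described above, the sketch below runs the standard linear, bivariate Granger causality test on synthetic data in which x drives y with a two-step lag. It is a baseline illustration only and does not reproduce the kernel canonical-correlation approach; the data, lag order and use of statsmodels are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic bivariate system in which x drives y with a two-step lag.
rng = np.random.default_rng(8)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 2] + rng.normal()

data = pd.DataFrame({"y": y, "x": x})
# Tests whether the second column ("x") Granger-causes the first ("y").
results = grangercausalitytests(data[["y", "x"]], maxlag=3)
p_value = results[2][0]["ssr_ftest"][1]      # F-test p-value at lag 2
print(f"linear Granger test, lag 2: p = {p_value:.4f}")
```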

  16. Ensemble Deep Learning for Biomedical Time Series Classification

    Directory of Open Access Journals (Sweden)

    Lin-peng Jin

    2016-01-01

    Full Text Available Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  17. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify albedo pattern changes over Malaysia. The recognized patterns and changes will be useful for a variety of environmental and climate monitoring studies, such as carbon budgeting and aerosol mapping. Ten years (2000–2009) of MODIS satellite images were used for the analyses and interpretation. The images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years, and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified in the maximum and minimum values of the albedo. The rises and falls of the line graph show a similar trend to the daily observations, differing only in the magnitude or percentage of the rises and falls. It can therefore be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the changes of albedo with respect to the nebulosity index indicate that external factors also influence the albedo values, as the plotted sky conditions and diffusion do not show a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high

  18. Estimation of dynamic flux profiles from metabolic time series data

    Directory of Open Access Journals (Sweden)

    Chou I-Chun

    2012-07-01

    Full Text Available Abstract Background Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Special among them is the recent method of dynamic flux estimation (DFE, which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes and detailed kinetic information must be available, in addition to the time series data. Results The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format of

  19. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    Broek, P.L.C. van den; Egmond, J. van; Rijn, C.M. van; Takens, F.; Coenen, A.M.L.; Booij, L.H.D.J.

    2005-01-01

    This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)-derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online calculation of

  20. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    van den Broek, PLC; van Egmond, J; van Rijn, CM; Takens, F; Coenen, AML; Booij, LHDJ

    2005-01-01

    Background: This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)-derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online

  1. Analysis of short subdiffusive time series: scatter of the time-averaged mean-squared displacement

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Jae-Hyung; Metzler, Ralf, E-mail: jae-hyung.jeon@ph.tum.d, E-mail: metz@ph.tum.d [Department of Physics, Technical University of Munich, James-Franck Strasse, 85747 Garching (Germany)

    2010-06-25

    We analyse the statistical behaviour of short time series in systems performing subdiffusion. Comparing the non-ergodic continuous time random walk model to the ergodic fractional Brownian motion, we demonstrate that the scatter between individual trajectories is not purely dominated by finite sample size effects but preserves some of the characteristics of the individual processes. In particular we show that the distribution of the time-averaged mean-squared displacements allows one to clearly distinguish between the two stochastic mechanisms even for a very short time series. (fast track communication)
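
    The quantity at the centre of this analysis, the time-averaged mean-squared displacement of a single short trajectory, can be computed as in the sketch below. Ordinary Brownian-motion trajectories are used as an assumed stand-in, so the modest scatter seen here corresponds to the ergodic reference case, not the CTRW behaviour discussed in the paper.

```python
import numpy as np

def time_averaged_msd(trajectory, lags):
    """Time-averaged mean-squared displacement of a single 1-D trajectory:
    delta^2(lag) = mean over t of [x(t + lag) - x(t)]^2."""
    x = np.asarray(trajectory, dtype=float)
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    n_traj, n_steps = 20, 500              # deliberately short trajectories
    lags = np.arange(1, 50)
    # Ergodic reference process: ordinary Brownian motion (fBm with H = 0.5).
    curves = np.array([
        time_averaged_msd(np.cumsum(rng.normal(size=n_steps)), lags)
        for _ in range(n_traj)
    ])
    # The trajectory-to-trajectory scatter of these curves is the quantity of interest;
    # for an ergodic process it shrinks with measurement time, for CTRW it does not.
    spread = curves[:, 9].std() / curves[:, 9].mean()
    print(f"relative scatter of the TAMSD at lag 10: {spread:.2f}")
```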

  2. A New Hybrid Methodology for Nonlinear Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Mehdi Khashei

    2011-01-01

    Full Text Available Artificial neural networks (ANNs) are flexible computing frameworks and universal approximators that can be applied to a wide range of forecasting problems with a high degree of accuracy. However, using ANNs to model linear problems has yielded mixed results, and hence it is not wise to apply them blindly to any type of data. This is the reason that hybrid methodologies combining linear models such as ARIMA and nonlinear models such as ANNs have been proposed in the time series forecasting literature. Despite all the advantages of the traditional methodologies for combining ARIMA and ANNs, they rest on assumptions that degrade their performance when the opposite situation occurs. In this paper, a new methodology is proposed to combine ANNs with ARIMA in order to overcome the limitations of traditional hybrid methodologies and yield more general and more accurate hybrid models. Empirical results with the Canadian lynx data set indicate that the proposed methodology can combine linear and nonlinear models more effectively than traditional hybrid methodologies. Therefore, it can be applied as an appropriate alternative methodology for hybridization in the time series forecasting field, especially when higher forecasting accuracy is needed.
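
    For context, the sketch below implements the traditional Zhang-style ARIMA + ANN hybrid that this paper sets out to generalise: ARIMA captures the linear structure and a small neural network models its residuals. The synthetic series, model orders and network size are assumptions, and the paper's own methodology is not reproduced here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

# Synthetic series with both linear and nonlinear structure.
rng = np.random.default_rng(0)
n = 300
t = np.arange(n)
y = 10 + 0.05 * t + np.sin(0.3 * t) ** 3 + rng.normal(0, 0.2, n)
train, test = y[:250], y[250:]

# Step 1: linear component with ARIMA.
arima = ARIMA(train, order=(2, 1, 1)).fit()
linear_forecast = arima.forecast(steps=len(test))
residuals = arima.resid

# Step 2: nonlinear component - an ANN trained on lagged residuals.
p = 4                                                   # residual lags fed to the ANN
X = np.column_stack([residuals[i:len(residuals) - p + i] for i in range(p)])
target = residuals[p:]
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, target)

# Step 3: combine, rolling the residual forecast forward one step at a time.
last_lags = residuals[-p:].copy()
residual_forecast = []
for _ in range(len(test)):
    r_hat = ann.predict(last_lags.reshape(1, -1))[0]
    residual_forecast.append(r_hat)
    last_lags = np.append(last_lags[1:], r_hat)

hybrid_forecast = linear_forecast + np.array(residual_forecast)
print("hybrid RMSE:", np.sqrt(np.mean((hybrid_forecast - test) ** 2)))
```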

  3. Land surface phenology from SPOT VEGETATION time series

    Directory of Open Access Journals (Sweden)

    A. Verger

    2016-12-01

    Full Text Available Land surface phenology from time series of satellite data is expected to contribute to improving the representation of vegetation phenology in earth system models. We characterized the baseline phenology of the vegetation at the global scale from GEOCLIM-LAI, a global climatology of leaf area index (LAI) derived from 1-km SPOT VEGETATION time series for 1999-2010. The calibration with ground measurements showed that the start and end of season were best identified using thresholds of 30% and 40% of the LAI amplitude, respectively. The satellite-derived phenology was spatially consistent with the global distributions of climatic drivers and biome land cover. The accuracy of the derived phenological metrics, evaluated using available ground observations for birch forests in Europe, cherry in Asia and lilac shrubs in North America, showed an overall root mean square error lower than 19 days for the start, end and length of season, and good agreement between the latitudinal gradients of VEGETATION LAI phenology and ground data.

  4. Automatising the analysis of stochastic biochemical time-series.

    Science.gov (United States)

    Caravagna, Giulio; De Sano, Luca; Antoniotti, Marco

    2015-01-01

    Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline.

  5. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis result can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and has been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.

  6. Predicting Physical Time Series Using Dynamic Ridge Polynomial Neural Networks

    Science.gov (United States)

    Al-Jumeily, Dhiya; Ghazali, Rozaida; Hussain, Abir

    2014-01-01

    Forecasting naturally occurring phenomena is a common problem in many domains of science, and this has been addressed and investigated by many scientists. The importance of time series prediction stems from the fact that it has a wide range of applications, including control systems, engineering processes, environmental systems and economics. From the knowledge of some aspects of the previous behaviour of the system, the aim of the prediction process is to determine or predict its future behaviour. In this paper, we consider a novel application of a higher order polynomial neural network architecture called the Dynamic Ridge Polynomial Neural Network, which combines the properties of higher order and recurrent neural networks, for the prediction of physical time series. In this study, four types of signals have been used: the Lorenz attractor, the mean value of the AE index, the sunspot number, and heat wave temperature. The simulation results showed good improvements in terms of signal-to-noise ratio in comparison to a number of higher order and feedforward neural networks used as benchmarks. PMID:25157950

  7. Intermittency and multifractional Brownian character of geomagnetic time series

    Directory of Open Access Journals (Sweden)

    G. Consolini

    2013-07-01

    Full Text Available The Earth's magnetosphere exhibits a complex behavior in response to the solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of the geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in the geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, which is related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  8. Intermittency and multifractional Brownian character of geomagnetic time series

    Science.gov (United States)

    Consolini, G.; De Marco, R.; De Michelis, P.

    2013-07-01

    The Earth's magnetosphere exhibits a complex behavior in response to the solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of the geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in the geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, which is related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  9. Financial time series prediction using spiking neural networks.

    Directory of Open Access Journals (Sweden)

    David Reid

    Full Text Available In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded neural networks (a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data; and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, and this in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  10. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in a precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
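
    The additive Holt-Winters component mentioned above can be sketched as follows; this is a minimal illustration using the statsmodels implementation on a synthetic signal standing in for the residual dynamics left by the analytical theory, not the authors' actual hybrid propagator.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic "missing dynamics" signal: a slow drift plus a periodic term plus noise.
rng = np.random.default_rng(3)
t = np.arange(400)
signal = 0.002 * t + 0.5 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.05, t.size)

# Additive trend and additive seasonality, as in an additive Holt-Winters scheme.
model = ExponentialSmoothing(signal, trend="add", seasonal="add",
                             seasonal_periods=50).fit()

# Predict the correction 30 steps ahead; in a hybrid propagator this forecast
# would be added to the analytical (e.g. Keplerian) approximation.
print(model.forecast(30)[:5])
```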

  11. Time series trends of the safety effects of pavement resurfacing.

    Science.gov (United States)

    Park, Juneyoung; Abdel-Aty, Mohamed; Wang, Jung-Han

    2017-04-01

    This study evaluated the safety performance of pavement resurfacing projects on urban arterials in Florida using observational before-and-after approaches. The safety effects of pavement resurfacing were quantified as crash modification factors (CMFs) and estimated for different ranges of heavy vehicle traffic volume and time changes for different severity levels. In order to evaluate the variation of CMFs over time, crash modification functions (CMFunctions) were developed using nonlinear regression and time series models. The results showed that pavement resurfacing projects decrease crash frequency and are more effective at reducing severe crashes in general. Moreover, the results of the general relationship between the safety effects and time changes indicated that the CMFs increase over time after the resurfacing treatment. It was also found that pavement resurfacing projects on urban roadways with a higher heavy vehicle volume rate are more safety effective than those on roadways with a lower heavy vehicle volume rate. Based on the exploration and comparison of the developed CMFunctions, the seasonal autoregressive integrated moving average (SARIMA) and exponential functional forms of the nonlinear regression models can be utilized to identify the trend of CMFs over time. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Dependency Structures in Differentially Coded Cardiovascular Time Series

    Directory of Open Access Journals (Sweden)

    Tatjana Tasic

    2017-01-01

    Full Text Available Objectives. This paper analyses temporal dependency in time series recorded from aging rats, both healthy ones and those with early-developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationships along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneously hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ=0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ=1 and 2. Conclusion. The dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of the systolic blood pressure signal, compared to the unconditional one, shows an increased level of discrepancy, except for time lag 1, where the equality is preserved in spite of the memory of the differential coder. The antiparallel streams play an important role at the single-beat time lag.

  13. Dependency Structures in Differentially Coded Cardiovascular Time Series

    Science.gov (United States)

    Tasic, Tatjana; Jovanovic, Sladjana; Mohamoud, Omer; Skoric, Tamara; Japundzic-Zigon, Nina

    2017-01-01

    Objectives. This paper analyses temporal dependency in time series recorded from aging rats, both healthy ones and those with early-developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationships along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneously hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ = 0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ = 1 and 2. Conclusion. The dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of the systolic blood pressure signal, compared to the unconditional one, shows an increased level of discrepancy, except for time lag 1, where the equality is preserved in spite of the memory of the differential coder. The antiparallel streams play an important role at the single-beat time lag. PMID:28127384

  14. Fundamental State Space Time Series Models for JEPX Electricity Prices

    Science.gov (United States)

    Ofuji, Kenta; Kanemoto, Shigeru

    Time series models are popular in attempts to model and forecast price dynamics in various markets. In this paper, we formulate two state space models and test their applicability to power price modeling and forecasting using JEPX (Japan Electric Power eXchange) data. State space models generally have a high degree of flexibility, with their time-dependent state transition matrix and system equation configurations. Based on empirical data analysis and the past literature, we used calculation assumptions to a) extract a stochastic trend component to capture non-stationarity, and b) detect structural changes underlying the market. The stepwise calculation algorithm followed that of the Kalman filter. We then evaluated the two models' forecasting capabilities in comparison with ordinary AR (autoregressive) and ARCH (autoregressive conditional heteroskedasticity) models. By choosing proper explanatory variables, the latter state space model yielded as good a forecasting capability as that of the AR and ARCH models for a short forecasting horizon.
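
    A minimal sketch of the Kalman filter recursion for a local-level (stochastic-trend) state space model is given below; the model matrices and the synthetic price series are assumptions for illustration, not the JEPX specification used in the paper.

```python
import numpy as np

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Standard Kalman filter recursion for a linear Gaussian state space model.

    x_t = F x_{t-1} + w_t,  w_t ~ N(0, Q)
    y_t = H x_t + v_t,      v_t ~ N(0, R)
    """
    x, P = x0, P0
    filtered = []
    for obs in y:
        # Prediction step
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (obs - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        filtered.append(x.copy())
    return np.array(filtered)

if __name__ == "__main__":
    # Local-level model: a noisy observation of a slowly drifting "price level".
    rng = np.random.default_rng(5)
    true_level = np.cumsum(rng.normal(0, 0.5, 200))
    prices = true_level + rng.normal(0, 2.0, 200)
    est = kalman_filter(
        prices.reshape(-1, 1),
        F=np.eye(1), H=np.eye(1),
        Q=np.array([[0.25]]), R=np.array([[4.0]]),
        x0=np.zeros(1), P0=np.eye(1),
    )
    print("last filtered level:", est[-1, 0])
```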

  15. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  16. Estimation of Hurst Exponent for the Financial Time Series

    Science.gov (United States)

    Kumar, J.; Manchanda, P.

    2009-07-01

    Till recently, statistical methods and Fourier analysis were employed to study fluctuations in stock markets in general and the Indian stock market in particular. However, the current trend is to apply the concepts of wavelet methodology and the Hurst exponent; see for example the work of Manchanda, J. Kumar and Siddiqi, Journal of the Franklin Institute 144 (2007), 613-636, and the paper of Cajueiro and B. M. Tabak. Cajueiro and Tabak, Physica A, 2003, checked the efficiency of emerging markets by computing the Hurst exponent over a time window of 4 years of data. Our goal in the present paper is to understand the dynamics of the Indian stock market. We look for persistence in the stock market through the Hurst exponent and the fractal dimension of time series data of BSE 100 and NIFTY 50.
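
    A simple rescaled-range (R/S) estimate of the Hurst exponent, one common way to quantify the persistence discussed above, can be sketched as follows; the uncorrelated synthetic returns are an assumed stand-in for the BSE 100 and NIFTY 50 data.

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent with a simple rescaled-range (R/S) analysis."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    sizes, rs_values = [], []
    size = min_chunk
    while size <= n // 2:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())   # cumulative deviation from the mean
            r = dev.max() - dev.min()               # range
            s = chunk.std()                         # standard deviation
            if s > 0:
                rs.append(r / s)
        sizes.append(size)
        rs_values.append(np.mean(rs))
        size *= 2
    # H is the slope of log(R/S) versus log(window size).
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    returns = rng.normal(size=8192)          # uncorrelated returns: H close to 0.5
    print(f"Hurst exponent: {hurst_rs(returns):.2f}")
```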

  17. Financial Time Series Forecasting Using Directed-Weighted Chunking SVMs

    Directory of Open Access Journals (Sweden)

    Yongming Cai

    2014-01-01

    Full Text Available Support vector machines (SVMs) are a promising alternative to traditional regression estimation approaches. However, when dealing with massive-scale data sets, many problems arise, such as long training times and excessive memory demands, so the standard SVM algorithm is not well suited to financial time series data. In order to solve these problems, a directed-weighted chunking SVM algorithm is proposed. In this algorithm, the whole training data set is split into several chunks, and the support vectors are obtained on each subset. Furthermore, weighted support vector regressions are calculated to obtain the forecast model on the new working data set. Our directed-weighted chunking algorithm provides a new method of decomposing and combining support vectors according to the importance of chunks, which can improve operation speed without reducing prediction accuracy. Finally, IBM stock daily close prices are used to verify the validity of the proposed algorithm.

  18. Time series power flow analysis for distribution connected PV generation.

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  19. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.

  20. Munchausen syndrome by proxy

    Science.gov (United States)

    MedlinePlus encyclopedia entry (//medlineplus.gov/ency/article/001555.htm): Munchausen syndrome by proxy is a mental illness and ...

  1. Computer Program Recognizes Patterns in Time-Series Data

    Science.gov (United States)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data like digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.
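
    The N-clock counting logic described above can be expressed compactly in software. The sketch below is a hypothetical Python illustration in which a fixed-length queue of detection timestamps plays the role of the N countdown clocks; the trained ANN waveform detector itself is not reproduced, only simulated detection times.

```python
from collections import deque

def make_event_counter(n_events, window):
    """Return a detector that reports True when n_events waveform detections
    fall within `window` time units, mirroring the N-countdown-clocks idea."""
    timestamps = deque(maxlen=n_events)      # equivalent to N cyclically reused clocks

    def on_detection(t):
        # Called each time the (assumed) ANN recognises waveform W at time t.
        timestamps.append(t)
        return len(timestamps) == n_events and (t - timestamps[0]) <= window

    return on_detection

if __name__ == "__main__":
    detector = make_event_counter(n_events=3, window=10.0)
    # Simulated times at which the ANN flags waveform W.
    for t in [1.0, 4.0, 20.0, 22.0, 25.0]:
        if detector(t):
            print(f"alert: 3 detections within 10 time units, ending at t={t}")
```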

  2. Adaptive Sampling of Time Series During Remote Exploration

    Science.gov (United States)

    Thompson, David R.

    2012-01-01

    This work deals with the challenge of online adaptive data collection in a time series. A remote sensor or explorer agent adapts its rate of data collection in order to track anomalous events while obeying constraints on time and power. This problem is challenging because the agent has limited visibility (all its datapoints lie in the past) and limited control (it can only decide when to collect its next datapoint). This problem is treated from an information-theoretic perspective, fitting a probabilistic model to collected data and optimizing the future sampling strategy to maximize information gain. The performance characteristics of stationary and nonstationary Gaussian process models are compared. Self-throttling sensors could benefit environmental sensor networks and monitoring as well as robotic exploration. Explorer agents can improve performance by adjusting their data collection rate, preserving scarce power or bandwidth resources during uninteresting times while fully covering anomalous events of interest. For example, a remote earthquake sensor could conserve power by limiting its measurements during normal conditions and increasing its cadence during rare earthquake events. A similar capability could improve sensor platforms traversing a fixed trajectory, such as an exploration rover transect or a deep space flyby. These agents can adapt observation times to improve sample coverage during moments of rapid change. An adaptive sampling approach couples sensor autonomy, instrument interpretation, and sampling. The challenge is addressed as an active learning problem, which already has extensive theoretical treatment in the statistics and machine learning literature. A statistical Gaussian process (GP) model is employed to guide sample decisions that maximize information gain. Nonstationary (e.g., time-varying) covariance relationships permit the system to represent and track local anomalies, in contrast with current GP approaches. Most common GP models

  3. United States Forest Disturbance Trends Observed Using Landsat Time Series

    Science.gov (United States)

    Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha/yr of forests were disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rate estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand-clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with an annual time step for improved accuracy.

  4. Modeling Glacier Elevation Change from DEM Time Series

    Directory of Open Access Journals (Sweden)

    Di Wang

    2015-08-01

    Full Text Available In this study, a methodology for glacier elevation reconstruction from Digital Elevation Model (DEM) time series (tDEM) is described for modeling the evolution of glacier elevation and estimating related volume change, with focus on medium-resolution and noisy satellite DEMs. The method is robust with respect to outliers in individual DEM products. Fox Glacier and Franz Josef Glacier in New Zealand are used as test cases based on 31 Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) DEMs and the Shuttle Radar Topography Mission (SRTM) DEM. We obtained a mean surface elevation lowering rate of −0.51 ± 0.02 m·a⁻¹ and −0.09 ± 0.02 m·a⁻¹ between 2000 and 2014 for Fox and Franz Josef Glacier, respectively. The specific volume difference between 2000 and 2014 was estimated as −0.77 ± 0.13 m·a⁻¹ and −0.33 ± 0.06 m·a⁻¹ by our tDEM method. The comparably moderate thinning rates are mainly due to volume gains after 2013 that compensate larger thinning rates earlier in the series. Terminus thickening prevailed between 2002 and 2007.

  5. MODELLING GASOLINE DEMAND IN GHANA: A STRUCTURAL TIME SERIES ANALYSIS

    Directory of Open Access Journals (Sweden)

    Ishmael Ackah

    2014-01-01

    Full Text Available Concerns about the role of energy consumption in global warming have led to policy designs that seek to reduce fossil fuel consumption or find less polluting alternatives, especially for the transport sector. This study seeks to estimate the elasticities of price, income, education and technology on transport-sector gasoline demand in Ghana. The Structural Time Series Model reports short-run price and income elasticities of -0.0088 and 0.713. Total factor productivity is -0.408, whilst the elasticity for education is 2.33. In the long run, the reported price and income elasticities are -0.065 and 5.129 respectively. The long-run elasticity for productivity is -2.935. The study recommends that, in order to enhance efficiency in gasoline consumption in the transport sector, there should be investment in productivity.

  6. Time-series analysis of Campylobacter incidence in Switzerland.

    Science.gov (United States)

    Wei, W; Schüpbach, G; Held, L

    2015-07-01

    Campylobacteriosis has been the most common food-associated notifiable infectious disease in Switzerland since 1995. Contact with and ingestion of raw or undercooked broilers are considered the dominant risk factors for infection. In this study, we investigated the temporal relationship between the disease incidence in humans and the prevalence of Campylobacter in broilers in Switzerland from 2008 to 2012. We use a time-series approach to describe the pattern of the disease by incorporating seasonal effects and autocorrelation. The analysis shows that prevalence of Campylobacter in broilers, with a 2-week lag, has a significant impact on disease incidence in humans. Therefore Campylobacter cases in humans can be partly explained by contagion through broiler meat. We also found a strong autoregressive effect in human illness, and a significant increase of illness during Christmas and New Year's holidays. In a final analysis, we corrected for the sampling error of prevalence in broilers and the results gave similar conclusions.

  7. Determinants of suicides in Denmark: evidence from time series data.

    Science.gov (United States)

    Andrés, Antonio R; Halicioglu, Ferda

    2010-12-01

    This research examines empirically the determinants of suicides in Denmark over the period 1970-2006. To our knowledge, no previous study estimates a dynamic econometric model of suicides on the basis of time series data and a cointegration framework at the disaggregate level. Our results indicate that suicide is associated with a range of socio-economic factors, but the strength of the association can differ by gender. In particular, we find that a rise in real per capita income and in the fertility rate decreases suicides for males and females. Divorce is positively associated with suicides, and this effect seems to be stronger for men. A fall in unemployment rates seems to lower suicides significantly in males and females. Policy implications of suicides are discussed with some appropriate recommendations. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Multivariate time series with linear state space structure

    CERN Document Server

    Gómez, Víctor

    2016-01-01

    This book presents a comprehensive study of multivariate time series with linear state space structure. The emphasis is put on both the clarity of the theoretical concepts and on efficient algorithms for implementing the theory. In particular, it investigates the relationship between VARMA and state space models, including canonical forms. It also highlights the relationship between Wiener-Kolmogorov and Kalman filtering both with an infinite and a finite sample. The strength of the book also lies in the numerous algorithms included for state space models that take advantage of the recursive nature of the models. Many of these algorithms can be made robust, fast, reliable and efficient. The book is accompanied by a MATLAB package called SSMMATLAB and a webpage presenting implemented algorithms with many examples and case studies. Though it lays a solid theoretical foundation, the book also focuses on practical application, and includes exercises in each chapter. It is intended for researchers and students wor...

  9. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  10. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

    Full Text Available The aim of this analysis is to forecast a mini-market's sales volume for the twelve-month period from August 2015 to August 2016. The study is based on the monthly sales, in Iraqi Dinar, of a private local mini-market from April 2014 to July 2015. As the graph reveals, and if the stagnant economic conditions continue, the trend of future sales is downward. Based on time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, low profit margin and operating expenses, the revenues may not be adequate to produce positive net income, and the business may not be able to operate afterward. The principal issue raised by this is that forecasting sales in the region will be difficult where the business cycle is so dynamic and volatile due to systematic risks and an unforeseeable future.

  11. Forecasting electricity usage using univariate time series models

    Science.gov (United States)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

    Electricity is one of the important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics, increasing competition and the deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. In the literature, there is mixed evidence on the best method for forecasting electricity demand. This paper aims to compare the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance. On the other hand, the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.

  12. Efficient Bayesian inference for natural time series using ARFIMA processes

    Science.gov (United States)

    Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas

    2016-04-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
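
    To make the long-memory notion concrete, the sketch below builds the binomial weights of the fractional-differencing operator (1 − B)^d and simulates an ARFIMA(0, d, 0) series by truncated fractional integration of white noise. It is a toy illustration with assumed parameters, not the paper's Bayesian estimation machinery.

```python
import numpy as np

def fractional_diff_weights(d, n):
    """Binomial weights of the fractional differencing operator (1 - B)^d.

    w_0 = 1 and w_k = w_{k-1} * (k - 1 - d) / k; for 0 < d < 0.5 the weights of the
    inverse operator decay hyperbolically, the signature of long memory."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def simulate_arfima_0d0(d, n, rng):
    """Simulate ARFIMA(0, d, 0) by truncated fractional integration of white noise."""
    eps = rng.normal(size=n)
    w_int = fractional_diff_weights(-d, n)   # weights of (1 - B)^{-d}
    return np.convolve(eps, w_int)[:n]       # x_t = sum_k w_k * eps_{t-k}

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = simulate_arfima_0d0(d=0.3, n=5000, rng=rng)
    # Slowly decaying sample autocorrelations are the signature of long memory.
    acf_lag50 = np.corrcoef(x[:-50], x[50:])[0, 1]
    print(f"sample ACF at lag 50: {acf_lag50:.3f}")
```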

  13. Controlled, distributed data management of an Antarctic time series

    Science.gov (United States)

    Leadbetter, Adam; Connor, David; Cunningham, Nathan; Reynolds, Sarah

    2010-05-01

    The Rothera Time Series (RaTS) presents over ten years of oceanographic data collected off the Antarctic Peninsula comprising conductivity, temperature, depth cast data; current meter data; and bottle sample data. The data set has been extensively analysed and is well represented in the scientific literature. However, it has never been available to browse as a coherent entity. Work has been undertaken by both the data collecting organisation (the British Antarctic Survey, BAS) and the associated national data centre (the British Oceanographic Data Centre, BODC) to describe the parameters comprising the dataset in a consistent manner. To this end, each data point in the RaTS dataset has now been ascribed a parameter usage term, selected from the appropriate controlled vocabulary of the Natural Environment Research Council's Data Grid (NDG). By marking up the dataset in this way the semantic richness of the NDG vocabularies is fully accessible, and the dataset can be then explored using the Global Change Master Directory keyword set, the International Standards Organisation topic categories, SeaDataNet disciplines and agreed parameter groups, and the NDG parameter discovery vocabulary. We present a single data discovery and exploration tool, a web portal which allows the user to drill down through the dataset using their chosen keyword set. The spatial coverage of the chosen data is displayed through a Google Earth web plugin. Finally, as the time series data are held at BODC and the discrete sample data held at BAS (which are separate physical locations), a mechanism has been established to provide metadata from one site to another. This takes the form of an Open Geospatial Consortium Web Map Service server at BODC feeding information into the portal hosted at BAS.

  14. Time series inversion of spectra from ground-based radiometers

    Directory of Open Access Journals (Sweden)

    O. M. Christensen

    2013-07-01

    Full Text Available Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer, confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
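
    A compact way to see the idea is a linear optimal-estimation (MAP) update in which the state vector stacks several measurement times and the a priori covariance carries an exponential temporal correlation. The sketch below is illustrative, with assumed dimensions and decorrelation time; it is not the paper's retrieval code.

```python
# Hedged sketch: MAP retrieval with temporal correlation built into the prior.
import numpy as np

def map_retrieval(y, K, xa, Sa, Se):
    """x_hat = xa + Sa K^T (K Sa K^T + Se)^(-1) (y - K xa)."""
    gain = Sa @ K.T @ np.linalg.inv(K @ Sa @ K.T + Se)
    return xa + gain @ (y - K @ xa)

n_times, n_levels, tau_hours = 10, 5, 3.0            # assumed sizes and decorrelation time
t = np.arange(n_times, dtype=float)
corr_t = np.exp(-np.abs(t[:, None] - t[None, :]) / tau_hours)
Sa = np.kron(corr_t, np.eye(n_levels))               # stacked a priori covariance, 50 x 50
```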

  15. Muscle segmentation in time series images of Drosophila metamorphosis.

    Science.gov (United States)

    Yadav, Kuleesha; Lin, Feng; Wasser, Martin

    2015-01-01

    In order to study genes associated with muscular disorders, we characterize the phenotypic changes in Drosophila muscle cells during metamorphosis caused by genetic perturbations. We collect in vivo images of muscle fibers during the remodeling of larval to adult muscles. In this paper, we focus on the new image processing pipeline designed to quantify the changes in shape and size of muscles. We propose a new two-step approach to muscle segmentation in time series images. First, we implement a watershed algorithm to divide the image into edge-preserving regions, and then we classify these regions into muscle and non-muscle classes on the basis of shape and intensity. The advantage of our method is twofold: first, better results are obtained because the classification of regions is constrained by the shape of the muscle cell from the previous time point; and second, minimal user intervention results in faster processing times. The segmentation results are used to compare the changes in cell size between controls and reduction of the autophagy-related gene Atg9 during Drosophila metamorphosis.
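
    A rough sketch of the two-step idea follows, using scikit-image: watershed over-segmentation of a gradient image, then a toy shape/intensity rule standing in for the classifier (the shape constraint from the previous time point is not reproduced). Thresholds and the marker count are placeholders.

```python
# Hedged sketch: watershed regions, then a simple muscle / non-muscle rule per region.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed
from skimage.measure import regionprops

def segment_muscle(image, intensity_thresh=0.4, ecc_thresh=0.9):
    gradient = sobel(image)                                   # edge-preserving elevation map
    labels = watershed(gradient, markers=250, compactness=0.001)
    mask = np.zeros(labels.shape, dtype=bool)
    for region in regionprops(labels, intensity_image=image):
        # toy rule: bright, elongated regions are treated as muscle fibres
        if region.mean_intensity > intensity_thresh and region.eccentricity > ecc_thresh:
            mask[labels == region.label] = True
    return mask
```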

  16. MaTSE: the gene expression time-series explorer.

    Science.gov (United States)

    Craig, Paul; Cannon, Alan; Kukla, Robert; Kennedy, Jessie

    2013-01-01

    High-throughput gene expression time-course experiments provide a perspective on biological functioning recognized as having huge value for the diagnosis, treatment, and prevention of diseases. There are, however, significant challenges to properly exploiting these data due to their massive scale and complexity. In particular, existing techniques are found to be ill suited to finding patterns of changing activity over a limited interval of an experiment's time frame. The Time-Series Explorer (TSE) was developed to overcome this limitation by allowing users to explore their data by controlling an animated scatter-plot view. MaTSE improves and extends TSE by allowing users to visualize data with missing values, cross-reference multiple conditions, highlight gene groupings, and collaborate by sharing their findings. MaTSE was developed using an iterative software development cycle that involved a high level of user feedback and evaluation. The resulting software combines a variety of visualization and interaction techniques which work together to allow biologists to explore their data and reveal temporal patterns of gene activity. These include a scatter-plot that can be animated to view different temporal intervals of the data, a multiple coordinated view framework to support the cross-referencing of multiple experimental conditions, a novel method for highlighting overlapping groups in the scatter-plot, and a pattern browser component that can be used with scatter-plot box queries to support cooperative visualization. A final evaluation demonstrated the tool's effectiveness in allowing users to find unexpected temporal patterns and the benefits of functionality such as the overlay of gene groupings and the ability to store patterns. We have developed a new exploratory analysis tool, MaTSE, that allows users to find unexpected patterns of temporal activity in gene expression time-series data. Overall, the study acted well to demonstrate the benefits of an iterative software

  17. Time Series Analysis of Floods across the Niger River Basin

    Directory of Open Access Journals (Sweden)

    Valentin Aich

    2016-04-01

    Full Text Available This study analyses the increasing number of catastrophic floods in the Niger River Basin, focusing on the relation between long-term hydro-climatic variability and flood risk over the last 40 to 100 years. Time series for three subregions (Guinean, Sahelian, Benue) show a general consistency between the annual maximum discharge (AMAX) and climatic decadal patterns in West Africa regarding both trends and major changepoints. Variance analysis reveals rather stable AMAX distributions except for the Sahelian region, implying that the changes in flood behavior differ within the basin and affect mostly the dry Sahelian region. The timing of the floods within the year has changed only downstream of the Inner Niger Delta due to retention processes. The results of the hydro-climatic analysis generally correspond to the presented damage statistics on people affected by catastrophic floods. The damage statistics show positive trends for the entire basin since records began in the 1980s, with the most extreme increase in the Middle Niger.

  18. Patch-Based Forest Change Detection from Landsat Time Series

    Directory of Open Access Journals (Sweden)

    M. Joseph Hughes

    2017-05-01

    Full Text Available In the species-rich and structurally complex forests of the Eastern United States, disturbance events are often partial and therefore difficult to detect using remote sensing methods. Here we present a set of new algorithms, collectively called Vegetation Regeneration and Disturbance Estimates through Time (VeRDET), which employ a novel patch-based approach to detect periods of vegetation disturbance, stability, and growth from the historical Landsat image record. VeRDET generates a yearly clear-sky composite from satellite imagery, calculates a spectral vegetation index for each pixel in that composite, spatially segments the vegetation index image into patches, temporally divides the time series into differently sloped segments, and then labels those segments as disturbed, stable, or regenerating. Segmentation at both the spatial and temporal steps is performed using total variation regularization, an algorithm originally designed for signal denoising. This study explores VeRDET's effectiveness in detecting forest change using four vegetation indices and two parameters controlling the spatial and temporal scales of segmentation within a calibration region. We then evaluate algorithm effectiveness within a 386,000 km2 area in the Eastern United States, where VeRDET has an overall error of 23% and omission errors ranging from 22% to 78% depending on the disturbance agent.
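
    The temporal step can be illustrated with a one-dimensional total-variation denoiser followed by slope labelling; scikit-image's denoiser stands in here for the regularization used by VeRDET, and the weight and slope threshold are assumed values.

```python
# Hedged sketch: piecewise-flatten an annual vegetation-index series and label each year.
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def label_years(vi_series, weight=0.05, slope_thresh=0.01):
    smooth = denoise_tv_chambolle(np.asarray(vi_series, dtype=float), weight=weight)
    slope = np.gradient(smooth)
    labels = np.where(slope < -slope_thresh, "disturbed",
                      np.where(slope > slope_thresh, "regenerating", "stable"))
    return smooth, labels
```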

  19. Impact of Sensor Degradation on the MODIS NDVI Time Series

    Science.gov (United States)

    Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert

    2012-01-01

    Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.

  20. Urban Area Monitoring using MODIS Time Series Data

    Science.gov (United States)

    Devadiga, S.; Sarkar, S.; Mauoka, E.

    2015-12-01

    Growing urban sprawl and its impact on global climate due to urban heat island effects has been an active area of research over recent years. This is especially significant in light of the rapid urbanization happening in some of the fast-developing nations across the globe. But so far, the study of urban area growth has been largely restricted to local and regional scales, using high- to medium-resolution satellite observations taken at distinct time periods. In this presentation we propose a new approach to detect and monitor urban area expansion using a long time series of MODIS data. This work characterizes data points using a vector of several annual metrics computed from the MODIS 8-day and 16-day composite L3 data products, at 250 m resolution and over several years, and then uses a vector angle mapping classifier to detect and segment the urban area. The classifier is trained using a set of training points obtained from a reference vector point and polygon pre-filtered using the MODIS VI product. This work gains additional significance given that, despite unprecedented urban growth since 2000, the area covered by the urban class in the MODIS Global Land Cover (MCD12Q1, MCDLCHKM and MCDLC1KM) products hasn't changed since the launch of Terra and Aqua. The proposed approach was applied to delineate the urban area around several cities in Asia known to have had maximum growth in the last 15 years. Results were verified using high-resolution Landsat data.
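
    The vector-angle classification step can be sketched as a spectral-angle comparison between each pixel's vector of annual metrics and a reference urban vector; the reference vector and angle threshold below are assumptions, not the presentation's trained values.

```python
# Hedged sketch: vector-angle (spectral-angle) labelling of pixels as urban / non-urban.
import numpy as np

def vector_angle(x, ref):
    cosang = np.dot(x, ref) / (np.linalg.norm(x) * np.linalg.norm(ref))
    return np.arccos(np.clip(cosang, -1.0, 1.0))             # angle in radians

def classify_urban(pixel_metrics, urban_ref, angle_thresh=0.15):
    """pixel_metrics: (n_pixels, n_metrics) array of annual metrics per pixel."""
    angles = np.array([vector_angle(p, urban_ref) for p in pixel_metrics])
    return angles < angle_thresh                              # True where labelled urban
```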

  1. Noninvertibility and resonance in discrete-time neural networks for time-series processing

    Science.gov (United States)

    Gicquel, N.; Anderson, J. S.; Kevrekidis, I. G.

    1998-01-01

    We present a computer-assisted study emphasizing certain elements of the dynamics of artificial neural networks (ANNs) used for discrete time-series processing and nonlinear system identification. The structure of the network gives rise to the possibility of multiple inverses of a phase point backward in time; this is not possible for the continuous-time system from which the time series are obtained. Using a two-dimensional illustrative model in an oscillatory regime, we study here the interaction of attractors predicted by the discrete-time ANN model (invariant circles and periodic points locked on them) with critical curves. These curves constitute a generalization of critical points for maps of the interval (in the sense of Julia-Fatou); their interaction with the model-predicted attractors plays a crucial role in the organization of the bifurcation structure and ultimately in determining the dynamic behavior predicted by the neural network.

  2. GPS coordinate time series measurements in Ontario and Quebec, Canada

    Science.gov (United States)

    Samadi Alinia, Hadis; Tiampo, Kristy F.; James, Thomas S.

    2017-06-01

    New precise network solutions for continuous GPS (cGPS) stations distributed in eastern Ontario and western Québec provide constraints on the regional three-dimensional crustal velocity field. Five years of continuous observations at fourteen cGPS sites were analyzed using Bernese GPS processing software. Several different sub-networks were chosen from these stations, and the data were processed and compared in order to select the optimal configuration to accurately estimate the vertical and horizontal station velocities and minimize the associated errors. The coordinate time series were then compared to the crustal motions from global solutions, and the optimized solution is presented here. A noise analysis model with power-law and white noise, which best describes the noise characteristics of all three components, was employed for the GPS time series analysis. The linear trend, associated uncertainties, and the spectral index of the power-law noise were calculated using a maximum likelihood estimation approach. The residual horizontal velocities, after removal of rigid plate motion, have a magnitude consistent with expected glacial isostatic adjustment (GIA). The vertical velocities increase from subsidence of almost 1.9 mm/year south of the Great Lakes to uplift near Hudson Bay, where the highest rate is approximately 10.9 mm/year. The residual horizontal velocities range from approximately 0.5 mm/year, oriented south-southeastward, at the Great Lakes to nearly 1.5 mm/year directed toward the interior of Hudson Bay at stations adjacent to its shoreline. Here, the velocity uncertainties are estimated at less than 0.6 mm/year for the horizontal component and 1.1 mm/year for the vertical component. A comparison between the observed velocities and GIA model predictions, for a limited range of Earth models, shows a better fit to the observations for the Earth model with the smallest upper mantle viscosity and the largest lower mantle viscosity. However, the
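
    A greatly simplified stand-in for that analysis is sketched below: the station velocity is obtained by least squares and the power-law spectral index is approximated from the slope of the residual periodogram (daily sampling assumed). The paper itself uses a full maximum-likelihood estimation; this is only an illustration of the quantities involved.

```python
# Hedged sketch: linear velocity plus a rough spectral-index estimate for one coordinate.
import numpy as np
from scipy.signal import periodogram

def velocity_and_spectral_index(t_years, coord_mm):
    A = np.column_stack([np.ones_like(t_years), t_years])
    coef, *_ = np.linalg.lstsq(A, coord_mm, rcond=None)       # [offset, velocity in mm/yr]
    resid = coord_mm - A @ coef
    f, p = periodogram(resid, fs=365.25)                      # frequency in cycles per year
    ok = (f > 0) & (p > 0)
    slope, _ = np.polyfit(np.log(f[ok]), np.log(p[ok]), 1)    # ~ power-law spectral index
    return coef[1], slope
```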

  3. Mackenzie River Delta morphological change based on Landsat time series

    Science.gov (United States)

    Vesakoski, Jenni-Mari; Alho, Petteri; Gustafsson, David; Arheimer, Berit; Isberg, Kristina

    2015-04-01

    Arctic rivers are sensitive and yet largely unexplored river systems on which climate change will have an impact. Research has not focused in detail on the fluvial geomorphology of Arctic rivers, mainly due to the remoteness and vastness of the watersheds, problems with data availability and difficult accessibility. Nowadays, wide collaborative spatial databases in hydrology as well as extensive remote sensing datasets over the Arctic are available, and they enable improved investigation of the Arctic watersheds. It is therefore also important to develop and improve methods that enable detection of fluvio-morphological processes based on the available data. Furthermore, it is essential to reconstruct and improve the understanding of past fluvial processes in order to better understand prevailing and future fluvial processes. In this study we sum up the fluvial geomorphological change in the Mackenzie River Delta during the last ~30 years. The Mackenzie River Delta (~13 000 km2) is situated in the Northwest Territories, Canada, where the Mackenzie River enters the Beaufort Sea, Arctic Ocean, near the city of Inuvik. The Mackenzie River Delta is a lake-rich, productive ecosystem and an ecologically sensitive environment. The research objective is achieved through two sub-objectives: 1) interpretation of the deltaic river channel planform change by applying Landsat time series; 2) definition of the variables that have had the greatest impact on the detected changes by applying statistics and long hydrological time series derived from the Arctic-HYPE model (HYdrologic Predictions for Environment) developed by the Swedish Meteorological and Hydrological Institute. According to our satellite interpretation, field observations and statistical analyses, notable spatio-temporal changes have occurred in the morphology of the river channel and delta during the past 30 years. For example, the channels have changed in braiding and sinuosity. In addition, various linkages between the studied

  4. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    Frequency domain representation of a short-term heart-rate time series (HRTS) signal is a popular method for evaluating the cardiovascular control system. The spectral parameters, viz. percentage power in low frequency band (%PLF), percentage power in high frequency band (%PHF), power ratio of low frequency to high ...
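
    The spectral parameters mentioned above are conventionally obtained from a power spectral density of the evenly resampled RR-interval series; the sketch below uses a Welch periodogram and the customary 0.04-0.15 Hz (LF) and 0.15-0.4 Hz (HF) bands, and is a generic illustration rather than this paper's exact method.

```python
# Hedged sketch: %PLF, %PHF and the LF/HF ratio from a resampled RR-interval series.
import numpy as np
from scipy.signal import welch

def hrv_band_powers(rr, fs=4.0):
    """rr: RR intervals resampled onto an even grid at fs Hz."""
    f, psd = welch(rr, fs=fs, nperseg=min(256, len(rr)))
    band = lambda lo, hi: np.trapz(psd[(f >= lo) & (f < hi)], f[(f >= lo) & (f < hi)])
    lf, hf = band(0.04, 0.15), band(0.15, 0.40)
    total = lf + hf                     # total power approximated here by LF + HF for simplicity
    return 100 * lf / total, 100 * hf / total, lf / hf        # %PLF, %PHF, LF/HF
```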

  5. Academic Workload and Working Time: Retrospective Perceptions versus Time-Series Data

    Science.gov (United States)

    Kyvik, Svein

    2013-01-01

    The purpose of this article is to examine the validity of perceptions by academic staff about their past and present workload and working hours. Retrospective assessments are compared with time-series data. The data are drawn from four mail surveys among academic staff in Norwegian universities undertaken in the period 1982-2008. The findings show…

  6. Estimation of vegetation cover resilience from satellite time series

    Directory of Open Access Journals (Sweden)

    T. Simoniello

    2008-07-01

    Full Text Available Resilience is a fundamental concept for understanding vegetation as a dynamic component of the climate system. It expresses the ability of ecosystems to tolerate disturbances and to recover their initial state. Recovery times are basic parameters of the vegetation's response to forcing and, therefore, are essential for describing realistic vegetation within dynamical models. Healthy vegetation tends to rapidly recover from shock and to persist in growth and expansion. On the contrary, climatic and anthropic stress can reduce resilience thus favouring persistent decrease in vegetation activity.

    In order to characterize resilience, we analyzed the time series 1982–2003 of 8 km GIMMS AVHRR-NDVI maps of the Italian territory. Persistence probability of negative and positive trends was estimated according to the vegetation cover class, altitude, and climate. Generally, mean recovery times from negative trends were shorter than those estimated for positive trends, as expected for vegetation of healthy status. Some signatures of inefficient resilience were found in high-level mountainous areas and in the Mediterranean sub-tropical ones. This analysis was refined by aggregating pixels according to phenology. This multitemporal clustering synthesized information on vegetation cover, climate, and orography rather well. The consequent persistence estimates confirmed and refined the hints obtained from the previous analyses. Under the same climatic regime, different vegetation resilience levels were found. In particular, within the Mediterranean sub-tropical climate, clustering was able to identify features with different persistence levels in areas that are liable to different levels of anthropic pressure. Moreover, it was also able to highlight reduced vegetation resilience in the southern areas under a Warm Temperate sub-continental climate. The general consistency of the obtained results showed that, with the help of suited analysis

  7. A unified nonlinear stochastic time series analysis for climate science

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John

    2017-04-01

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some period of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  8. River flow time series using least squares support vector machines

    Directory of Open Access Journals (Sweden)

    R. Samsudin

    2011-06-01

    Full Text Available This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables which serve as the inputs for time series forecasting with the LSSVM model. Monthly river flow data from two stations, the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia, were taken into consideration in the development of this hybrid model. The performance of this model was compared with conventional artificial neural network (ANN) models, the Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models using long-term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performances. In both cases, the new hybrid model has been found to provide more accurate flow forecasts compared to the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.
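
    The evaluation part of such a study can be sketched with lagged monthly flows as predictors and kernel ridge regression standing in for the LSSVM (the two methods are closely related); the GMDH input-selection stage is not reproduced, and the lag count and hyperparameters are assumptions.

```python
# Hedged sketch: lagged-input regression forecast of monthly flow, scored by RMSE and R.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def evaluate_flow_forecast(flow, n_lags=3, n_test=24):
    flow = np.asarray(flow, dtype=float)
    X = np.column_stack([flow[i:len(flow) - n_lags + i] for i in range(n_lags)])
    y = flow[n_lags:]
    Xtr, Xte, ytr, yte = X[:-n_test], X[-n_test:], y[:-n_test], y[-n_test:]
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(Xtr, ytr)
    pred = model.predict(Xte)
    rmse = float(np.sqrt(np.mean((yte - pred) ** 2)))
    r = float(np.corrcoef(yte, pred)[0, 1])
    return rmse, r
```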

  9. River flow time series using least squares support vector machines

    Science.gov (United States)

    Samsudin, R.; Saad, P.; Shabri, A.

    2011-06-01

    This paper proposes a novel hybrid forecasting model known as GLSSVM, which combines the group method of data handling (GMDH) and the least squares support vector machine (LSSVM). The GMDH is used to determine the useful input variables which serve as the inputs for time series forecasting with the LSSVM model. Monthly river flow data from two stations, the Selangor and Bernam rivers in Selangor state of Peninsular Malaysia, were taken into consideration in the development of this hybrid model. The performance of this model was compared with conventional artificial neural network (ANN) models, the Autoregressive Integrated Moving Average (ARIMA) model, and the GMDH and LSSVM models using long-term observations of monthly river flow discharge. The root mean square error (RMSE) and coefficient of correlation (R) are used to evaluate the models' performances. In both cases, the new hybrid model has been found to provide more accurate flow forecasts compared to the other models. The results of the comparison indicate that the new hybrid model is a useful tool and a promising new method for river flow forecasting.

  10. Enhancing Time-Series Detection Algorithms for Automated Biosurveillance

    Science.gov (United States)

    Burkom, Howard; Xing, Jian; English, Roseanne; Bloom, Steven; Cox, Kenneth; Pavlin, Julie A.

    2009-01-01

    BioSense is a US national system that uses data from health information systems for automated disease surveillance. We studied 4 time-series algorithm modifications designed to improve sensitivity for detecting artificially added data. To test these modified algorithms, we used reports of daily syndrome visits from 308 Department of Defense (DoD) facilities and 340 hospital emergency departments (EDs). At a constant alert rate of 1%, sensitivity was improved for both datasets by using a minimum standard deviation (SD) of 1.0, a 14–28 day baseline duration for calculating mean and SD, and an adjustment for total clinic visits as a surrogate denominator. Stratifying baseline days into weekdays versus weekends to account for day-of-week effects increased sensitivity for the DoD data but not for the ED data. These enhanced methods may increase sensitivity without increasing the alert rate and may improve the ability to detect outbreaks by using automated surveillance system data. PMID:19331728
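
    A stripped-down version of the kind of detector being modified is sketched below: a rolling 28-day baseline, a standard-deviation floor of 1.0, and a fixed Z-score threshold. The total-visit denominator adjustment and the weekday/weekend stratification discussed above are omitted, and the threshold value is an assumption.

```python
# Hedged sketch: adaptive Z-score alerts on a daily syndrome-count series with an SD floor.
import numpy as np

def daily_alerts(counts, baseline=28, z_thresh=3.0, min_sd=1.0):
    counts = np.asarray(counts, dtype=float)
    flags = np.zeros(len(counts), dtype=bool)
    for t in range(baseline, len(counts)):
        base = counts[t - baseline:t]
        mu = base.mean()
        sd = max(base.std(ddof=1), min_sd)    # SD floor avoids over-alerting on quiet series
        flags[t] = (counts[t] - mu) / sd > z_thresh
    return flags
```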

  11. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.

  12. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Meng Li

    2015-01-01

    Full Text Available This paper puts forward a prediction model based on a membrane computing optimization algorithm for chaotic time series; the model simultaneously optimizes the parameters of the phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) by using the membrane computing optimization algorithm. Accurately predicting the changing trend of parameters in the electromagnetic environment is an important basis for spectrum management and can help decision makers adopt an optimal action. The model presented in this paper is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, this paper compares it with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best based on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE).
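
    The prediction stage can be sketched as delay embedding followed by kernel regression; support-vector regression is used here as a readily available relative of the LS-SVM, and the embedding parameters (τ, m) are fixed by hand rather than optimized by the membrane algorithm.

```python
# Hedged sketch: delay embedding plus SVR for one-step-ahead prediction of a scalar series.
import numpy as np
from sklearn.svm import SVR

def embed(x, tau, m):
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def one_step_forecast(x, tau=2, m=4):
    x = np.asarray(x, dtype=float)
    X = embed(x, tau, m)
    y = x[(m - 1) * tau + 1:]                 # value following each embedded vector
    X = X[:len(y)]
    model = SVR(kernel="rbf", C=10.0, gamma="scale").fit(X[:-1], y[:-1])
    return float(model.predict(X[-1:])[0])
```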

  13. Change classification in SAR time series: a functional approach

    Science.gov (United States)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of the type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised land-use / land-cover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but meaningful reference data must nevertheless be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. In contrast to the traditional strategies outlined above, it requires no training data. Moreover, the method can be applied by an operator who does not yet have detailed knowledge of the scene; this knowledge is provided by the algorithm. The final step of the procedure, whose main aspect is the iterative optimization of an initial class scheme with respect to the categorized change objects, is the assignment of these objects to the finally resulting classes. This assignment step is the subject of this paper.

  14. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
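
    The core of detrended fluctuation analysis can be sketched in a few lines for the monofractal case; the multifractal variant used above repeats the same computation for a range of moment orders q. The window sizes below are assumptions, and this is a generic illustration rather than the study's implementation.

```python
# Hedged sketch: monofractal DFA scaling exponent of a pressure or noise time series.
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    profile = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
    flucts = []
    for s in scales:
        rms = []
        for i in range(len(profile) // s):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend in each window
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        flucts.append(np.mean(rms))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha        # alpha > 0.5 indicates persistent correlations
```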

  15. Decadal Time Series of UV Irradiances at two NDSC Sites

    Science.gov (United States)

    McKenzie, R. L.; Johnston, P. V.; Kotkamp, M.; O'Neill, M.; Hofmann, D. J.

    2005-05-01

    The Network for the Detection of Stratospheric Change (NDSC) comprises a small number of well-instrumented unpolluted measurement sites, selected to represent large geographical areas. Its aim is to better understand the causes and effects of long term changes in atmospheric composition. In order to monitor long term ozone change and its effects, UV spectrometers were installed at the mid-latitude southern hemisphere NDSC site (Lauder New Zealand), and the tropical NDSC site (Mauna Loa Observatory, Hawaii). At NIWA's Lauder site, measurements began in December 1989; while at NOAA's Mauna Loa Observatory, measurements began in June 1995. Since deployment, data have been obtained with a high success rate. The instrumentation and data-processing are similar at both sites, and comply with the exacting standards required by the NDSC. Here we present time series of data products from these spectrometers (e.g., erythemally-weighted UV irradiance) to compare and contrast the results from each site and to illustrate the causes for variabilities, and their influences on validation of radiative transfer models and satellite data products.

  16. Scene Context Dependency of Pattern Constancy of Time Series Imagery

    Science.gov (United States)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2008-01-01

    A fundamental element of future generic pattern recognition technology is the ability to extract similar patterns for the same scene despite wide ranging extraneous variables, including lighting, turbidity, sensor exposure variations, and signal noise. In the process of demonstrating pattern constancy of this kind for retinex/visual servo (RVS) image enhancement processing, we found that the pattern constancy performance depended somewhat on scene content. Most notably, the scene topography and, in particular, the scale and extent of the topography in an image, affects the pattern constancy the most. This paper will explore these effects in more depth and present experimental data from several time series tests. These results further quantify the impact of topography on pattern constancy. Despite this residual inconstancy, the results of overall pattern constancy testing support the idea that RVS image processing can be a universal front-end for generic visual pattern recognition. While the effects on pattern constancy were significant, the RVS processing still does achieve a high degree of pattern constancy over a wide spectrum of scene content diversity, and wide ranging extraneous variations in lighting, turbidity, and sensor exposure.

  17. On the C++ Object Programming for Time Series, in the Linux framework

    OpenAIRE

    Mateescu, George Daniel

    2013-01-01

    We study the implementation of time series through C++ classes, using the fundamentals of the C++ programming language, in the Linux framework. Such an implementation may be useful in time series modelling.

  18. KALREF--A Kalman filter and time series approach to the International Terrestrial Reference Frame realization

    National Research Council Canada - National Science Library

    Xiaoping Wu; Claudio Abbondanza; Zuheir Altamimi; T Mike Chin; Xavier Collilieux; Richard S Gross; Michael B Heflin; Yan Jiang; Jay W Parker

    2015-01-01

    ...) quasi-instantaneously. Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates...

  19. KALREF—A Kalman filter and time series approach to the International Terrestrial Reference Frame realization

    National Research Council Canada - National Science Library

    Wu, Xiaoping; Abbondanza, Claudio; Altamimi, Zuheir; Chin, T. Mike; Collilieux, Xavier; Gross, Richard S; Heflin, Michael B; Jiang, Yan; Parker, Jay W

    2015-01-01

    .... Here, we use a Kalman filter and smoother approach to combine time series from four space geodetic techniques to realize an experimental TRF through weekly time series of geocentric coordinates...
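
    The combination idea can be illustrated with a toy one-dimensional Kalman filter: a random-walk state for a single station coordinate, updated each week with whatever technique solutions are available. The state and noise variances below are placeholders, not KALREF's values.

```python
# Hedged sketch: Kalman-filter combination of weekly coordinate solutions from several techniques.
import numpy as np

def combine_weekly(obs, obs_var, process_var=1e-4):
    """obs: (n_weeks, n_techniques) array with np.nan where a technique has no solution."""
    x, p = 0.0, 1.0                       # state estimate and its variance
    combined = []
    for week in obs:
        p += process_var                  # predict step: random-walk process noise
        for z, r in zip(week, obs_var):
            if np.isnan(z):
                continue
            k = p / (p + r)               # gain for this technique's observation
            x, p = x + k * (z - x), (1.0 - k) * p
        combined.append(x)
    return np.array(combined)
```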

  20. A time-series approach to dynamical systems from classical and quantum worlds

    Energy Technology Data Exchange (ETDEWEB)

    Fossion, Ruben [Instituto Nacional de Geriatría, Periférico Sur No. 2767, Col. San Jerónimo Lídice, Del. Magdalena Contreras, 10200 México D.F., Mexico and Centro de Ciencias de la Complejidad (C3), Universidad Nacional Autó (Mexico)

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  1. Time series analysis of collective motions in proteins.

    Science.gov (United States)

    Alakent, Burak; Doruker, Pemra; Camurdan, Mehmet C

    2004-01-08

    The dynamics of alpha-amylase inhibitor tendamistat around its native state is investigated using time series analysis of the principal components of the C(alpha) atomic displacements obtained from molecular dynamics trajectories. Collective motion along a principal component is modeled as a homogeneous nonstationary process, which is the result of the damped oscillations in local minima superimposed on a random walk. The motion in local minima is described by a stationary autoregressive moving average model, consisting of the frequency, damping factor, moving average parameters and random shock terms. Frequencies for the first 50 principal components are found to be in the 3-25 cm(-1) range, which are well correlated with the principal component indices and also with atomistic normal mode analysis results. Damping factors, though their correlation is less pronounced, decrease as principal component indices increase, indicating that low frequency motions are less affected by friction. The existence of a positive moving average parameter indicates that the stochastic force term is likely to disturb the mode in opposite directions for two successive sampling times, showing the mode's tendency to stay close to the minimum. All these four parameters affect the mean square fluctuations of a principal mode within a single minimum. The inter-minima transitions are described by a random walk model, which is driven by a random shock term considerably smaller than that for the intra-minimum motion. The principal modes are classified into three subspaces based on their dynamics: essential, semiconstrained, and constrained, at least in partial consistency with previous studies. The Gaussian-type distributions of the intermediate modes, called "semiconstrained" modes, are explained by asserting that this random walk behavior is not completely free but between energy barriers. (c) 2004 American Institute of Physics
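
    The modelling step can be sketched by fitting a low-order ARMA model to a single principal-component series and reading an oscillation frequency and damping factor from the complex roots of its AR part. The ARMA order, the sampling step, and the cm^-1 conversion factor below are assumptions for illustration, not the paper's exact settings.

```python
# Hedged sketch: ARMA(2,1) fit to one principal component; frequency and damping from AR roots.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def mode_frequency_damping(pc_series, dt_ps=1.0):
    """Assumes the fitted AR(2) part has a complex-conjugate (oscillatory) root pair."""
    res = ARIMA(pc_series, order=(2, 0, 1)).fit()
    a1, a2 = res.arparams
    lam = np.roots([1.0, -a1, -a2])[0]                 # one pole of the AR(2) part
    r, theta = np.abs(lam), np.abs(np.angle(lam))
    freq_cm1 = theta / (2.0 * np.pi * dt_ps) * 33.36   # cycles/ps converted to ~cm^-1
    damping = -np.log(r) / dt_ps                       # damping factor per ps
    return freq_cm1, damping
```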

  2. 4D City Transformations by Time Series of Aerial Images

    Science.gov (United States)

    Adami, A.

    2015-02-01

    Recent photogrammetric applications based on dense image matching algorithms make it possible not only to use images acquired by digital cameras, amateur or otherwise, but also to recover the vast heritage of analogue photographs. This opens up many possibilities for the use and enhancement of the existing photographic heritage. Research into the original appearance of old buildings, the virtual reconstruction of disappeared architectures and the study of urban development are some of the application areas that exploit the great cultural heritage of photography. Nevertheless, there are some restrictions on the use of historical images for the automatic reconstruction of buildings, such as image quality, availability of camera parameters and ineffective geometry of image acquisition. These constraints are very hard to overcome, and it is difficult to find good datasets in the case of terrestrial close-range photogrammetry for the above reasons. Even the photographic archives of museums and superintendencies, while retaining a wealth of documentation, have no datasets suitable for a dense image matching approach. Compared to the vast collection of historical photos, the class of aerial photos meets both criteria stated above. In this paper historical aerial photographs are used with dense image matching algorithms to produce 3D models of a city in different years. The models can be used to study the urban development of the city and its changes through time. The application relates to the city centre of Verona, for which several time series of aerial photographs have been retrieved. The models obtained in this way allowed, right away, observation of the urban development of the city, the places of expansion and new urban areas. But a more interesting aspect emerged from the analytical comparison between models. The difference, as the Euclidean distance, between two models gives information about new buildings or demolitions. Concerning accuracy, it is necessary to point out that the quality of the final

  3. Nonlinear time series analysis of normal and pathological human walking

    Science.gov (United States)

    Dingwell, Jonathan B.; Cusumano, Joseph P.

    2000-12-01

    Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. Lastly, stride-to-stride fluctuations in the

  4. New Tools for Comparing Beliefs about the Timing of Recurrent Events with Climate Time Series Datasets

    Science.gov (United States)

    Stiller-Reeve, Mathew; Stephenson, David; Spengler, Thomas

    2017-04-01

    For climate services to be relevant and informative for users, scientific data definitions need to match users' perceptions or beliefs. This study proposes and tests novel yet simple methods to compare beliefs of timing of recurrent climatic events with empirical evidence from multiple historical time series. The methods are tested by applying them to the onset date of the monsoon in Bangladesh, where several scientific monsoon definitions can be applied, yielding different results for monsoon onset dates. It is a challenge to know which monsoon definition compares best with people's beliefs. Time series from eight different scientific monsoon definitions in six regions are compared with respondent beliefs from a previously completed survey concerning the monsoon onset. Beliefs about the timing of the monsoon onset are represented probabilistically for each respondent by constructing a probability mass function (PMF) from elicited responses about the earliest, normal, and latest dates for the event. A three-parameter circular modified triangular distribution (CMTD) is used to allow for the possibility (albeit small) of the onset at any time of the year. These distributions are then compared to the historical time series using two approaches: likelihood scores, and the mean and standard deviation of time series of dates simulated from each belief distribution. The methods proposed give the basis for further iterative discussion with decision-makers in the development of eventual climate services. This study uses Jessore, Bangladesh, as an example and finds that a rainfall definition, applying a 10 mm day-1 threshold to NCEP-NCAR reanalysis (Reanalysis-1) data, best matches the survey respondents' beliefs about monsoon onset.
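
    The belief-versus-data comparison can be sketched as follows: a triangular probability mass function over day of year is built from a respondent's earliest, normal, and latest onset dates (with a small floor playing the role of the year-round tail of the modified triangular distribution), and a historical series of onset days is scored by its log-likelihood. The floor value and function names are assumptions.

```python
# Hedged sketch: triangular belief PMF over day-of-year and a log-likelihood score.
import numpy as np

def belief_pmf(earliest, normal, latest, n_days=365, floor=1e-4):
    days = np.arange(1, n_days + 1)
    rising = (days - earliest) / max(normal - earliest, 1)
    falling = (latest - days) / max(latest - normal, 1)
    pmf = np.clip(np.minimum(rising, falling), 0.0, None) + floor   # small year-round tail
    return pmf / pmf.sum()

def log_likelihood(onset_doys, pmf):
    """Score a historical series of onset days (day-of-year) under one belief PMF."""
    return float(np.sum(np.log(pmf[np.asarray(onset_doys) - 1])))
```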

  5. Here be web proxies

    DEFF Research Database (Denmark)

    Weaver, Nicholas; Kreibich, Christian; Dam, Martin

    2014-01-01

    HTTP proxies serve numerous roles, from performance enhancement to access control to network censorship, but often operate stealthily without explicitly indicating their presence to the communicating endpoints. In this paper we present an analysis of the evidence of proxying manifest in executions...... of the ICSI Netalyzr spanning 646,000 distinct IP addresses ("clients"). To identify proxies we employ a range of detectors at the transport and application layer, and report in detail on the extent to which they allow us to fingerprint and map proxies to their likely intended uses. We also analyze 17...

  6. Fusing Landsat and SAR time series to detect deforestation in the tropics

    NARCIS (Netherlands)

    Reiche, J.; Verbesselt, J.; Hoekman, D.H.; Herold, M.

    2015-01-01

    Fusion of optical and SAR time series imagery has the potential to improve forest monitoring in tropical regions, where cloud cover limits optical satellite time series observations. We present a novel pixel-based Multi-sensor Time-series correlation and Fusion approach (MulTiFuse) that exploits the

  7. The method of trend analysis of parameters time series of gas-turbine engine state

    Science.gov (United States)

    Hvozdeva, I.; Myrhorod, V.; Derenh, Y.

    2017-10-01

    This research substantiates an approach to interval estimation of the trend component of a time series. The well-known methods of spectral and trend analysis are used for multidimensional data arrays. Interval estimation of the trend component is proposed for time series whose autocorrelation matrix possesses a prevailing eigenvalue. The properties of the time series autocorrelation matrix are identified.

  8. Error Sources in Deforestation Detection Using BFAST Monitor on Landsat Time Series Across Three Tropical Sites

    NARCIS (Netherlands)

    Schultz, Michael; Verbesselt, Jan; Avitabile, Valerio; Souza, Carlos; Herold, Martin

    2016-01-01

    Accurate tropical deforestation monitoring using time series requires methods which can capture gradual to abrupt changes and can account for site-specific properties of the environment and the available data. The generic time series algorithm BFAST Monitor was tested using Landsat time series at

  9. Novel Quantum Proxy Signature without Entanglement

    Science.gov (United States)

    Xu, Guang-bao

    2015-08-01

    Proxy signature is an important research topic in classical cryptography since it has many applications in real life. But only a few quantum proxy signature schemes have been proposed up to now. In this paper, we propose a quantum proxy signature scheme, which is designed based on the quantum one-time pad. Our scheme can be realized easily since it only uses single-particle states. Security analysis shows that it is secure and meets all the properties of a proxy signature, such as verifiability, distinguishability, unforgeability and undeniability.

  10. Statistical framework for evaluation of climate model simulations by use of climate proxy data from the last millennium – Part 2: A pseudo-proxy study addressing the amplitude of solar forcing

    Directory of Open Access Journals (Sweden)

    A. Hind

    2012-08-01

    Full Text Available The statistical framework of Part 1 (Sundberg et al., 2012), for comparing ensemble simulation surface temperature output with temperature proxy and instrumental records, is implemented in a pseudo-proxy experiment. A set of previously published millennial forced simulations (Max Planck Institute – COSMOS), including both "low" and "high" solar radiative forcing histories together with other important forcings, was used to define "true" target temperatures as well as pseudo-proxy and pseudo-instrumental series. In a global land-only experiment, using annual mean temperatures at a 30-yr time resolution with realistic proxy noise levels, it was found that the low and high solar full-forcing simulations could be distinguished. In an additional experiment, where pseudo-proxies were created to reflect a current set of proxy locations and noise levels, the low and high solar forcing simulations could only be distinguished when the latter served as targets. To improve the detectability of the low solar simulations, increasing the signal-to-noise ratio in local temperature proxies was more efficient than increasing the spatial coverage of the proxy network. The experience gained here will provide guidance when these methods are applied to real proxy and instrumental data, for example when the aim is to distinguish which of the alternative solar forcing histories is most compatible with the observed/reconstructed climate.
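
    The pseudo-proxy construction can be illustrated very simply: simulated temperatures at proxy locations are degraded with white noise at a chosen signal-to-noise ratio, and the resulting network is compared against candidate forcing simulations. The SNR definition and the distance measure below are illustrative choices, not the framework of Part 1.

```python
# Hedged sketch: white-noise pseudo-proxies and a simple distance to a candidate simulation.
import numpy as np

def make_pseudo_proxy(true_temp, snr=0.5, seed=0):
    """true_temp: e.g. 30-yr-mean temperature anomalies at one proxy location."""
    rng = np.random.default_rng(seed)
    noise_sd = np.std(true_temp) / snr
    return true_temp + rng.normal(0.0, noise_sd, size=len(true_temp))

def distance_to_target(pseudo_proxies, target):
    """Mean squared distance between the proxy-network mean and a candidate simulation."""
    return float(np.mean((np.mean(pseudo_proxies, axis=0) - target) ** 2))
```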

  11. The Time Series Data Server (TSDS) for Standards-Compliant, Convenient, and Efficient Access to Time Series Data

    Science.gov (United States)

    Lindholm, D. M.; Weigel, R. S.; Wilson, A.; Ware Dewolfe, A.

    2009-12-01

    Data analysis in the physical sciences is often plagued by the difficulty of acquiring the desired data. A great deal of work has been done in the area of metadata and data discovery; however, many such discoveries simply provide links that lead directly to a data file. Often these files are impractically large, containing more time samples or variables than desired, and are slow to access. Once these files are downloaded, format issues further complicate using the data. Some data servers have begun to address these problems by improving data virtualization and ease of use. However, these services often don't scale to large datasets. Also, the generic nature of the data models used by these servers, while providing greater flexibility, may complicate setting up such a service for data providers and may lack the semantics that would otherwise simplify use for clients, machine or human. The Time Series Data Server (TSDS) aims to address these problems within the limited, yet common, domain of time series data. With the simplifying assumption that all data products served are a function of time, the server can optimize data access based on time subsets, a common use case. The server also supports requests for specific variables, which can be of type scalar, structure, or sequence. It also supports data types with higher-level semantics, such as "spectrum." The TSDS is implemented using Java Servlet technology and can be dropped into any servlet container and customized for a data provider's needs. The interface is based on OPeNDAP (http://opendap.org) and conforms to the Data Access Protocol (DAP) 2.0, a NASA standard (ESDS-RFC-004), which defines a simple HTTP request and response paradigm. Thus a TSDS server instance is a compliant OPeNDAP server that can be accessed by any OPeNDAP client or directly via RESTful web service requests. The TSDS reads the data that it serves into a common data model via the NetCDF Markup Language (NcML, http

  12. Historical Time Series of Extreme Convective Weather in Finland

    Science.gov (United States)

    Laurila, T. K.; Mäkelä, A.; Rauhala, J.; Olsson, T.; Jylhä, K.

    2016-12-01

    Thunderstorms, lightning, tornadoes, downbursts, large hail and heavy precipitation are well known for their impacts on human life. At high latitudes, as in Finland, these hazardous warm-season convective weather events are concentrated in the summer season, roughly from May to September, with a peak in midsummer. The position of Finland between the maritime Atlantic and the continental Asian climate zones allows large variability in weather in general, which is also reflected in the occurrence of severe weather; hot, moist and extremely unstable air masses sometimes reach Finland and make possible the occurrence of extreme and devastating weather events. Compared to lower latitudes, the Finnish climate of severe convection is "moderate" and shows large year-to-year variation; however, behind the modest annual average is hidden a climate of severe weather events that practically every year cause large economic losses and sometimes even loss of life. Because of the increased vulnerability of our modern society, these episodes have recently gained plenty of interest. Over the decades, the Finnish Meteorological Institute (FMI) has collected observations and damage descriptions of severe weather episodes in Finland: thunderstorm days (1887-present), annual number of lightning flashes (1960-present), tornadoes (1796-present), large hail (1930-present), and heavy rainfall (1922-present). The research findings show, e.g., that a severe weather event may occur practically anywhere in the country, although in general the probability of occurrence is smaller in Northern Finland. This study, funded by the Finnish Research Programme on Nuclear Power Plant Safety (SAFIR), combines the individual Finnish severe weather time series and examines their trends, cross-correlations and correlations with other atmospheric parameters. Furthermore, a numerical weather model (HARMONIE) simulation is performed for a historical severe weather case for analyzing how

  13. Deformation time series at Llaima volcano, southern Andes

    Science.gov (United States)

    Bathke, Hannes; Walter, Thomas; Motagh, Mahdi; Shirzaei, Manoochehr

    2010-05-01

    Llaima volcano, with an edifice height of 3125 m and a volume of about 400 km³, is one of the largest and most active volcanoes in South America. Its eruptive history suggests a potential for very large and hazardous eruptions, including pyroclastic flows, air falls and material remobilization in the form of lahars affecting regions even at the lower apron and beyond, posing a significant risk to communities, infrastructure and transport routes. Llaima volcano is almost constantly active; since the 17th century, strombolian eruptions have occurred at a mean frequency of one eruptive phase every five years. Despite this strong activity and socioeconomic importance, the source of magma, possible magma reservoirs and deformation prior to or associated with eruptions are hitherto unknown. One of the problems in establishing a monitoring system is that Llaima is difficult to access and located in vegetated and topographically rough terrain. To better understand the volcano physics, we created an InSAR time series based on the PS technique using 18 Envisat images from December 2002 to November 2008. Using the StaMPS software we obtained 24,000 stable pixels in the vicinity of the volcano, which allow investigation of the spatiotemporal displacement field. Associated with the recent eruptions, we observed non-linear subsidence in the vicinity of the volcano base. We assessed the validity of the deformation signal using statistical tests and discussed the possible influence of atmospheric and topographic errors. To investigate the cause of the observed spatiotemporal deformation we employed an inverse source modelling approach and simulated the dislocation source as an analytical pressurized spherical model. The inverted source can reproduce the observed deformation and allows us to constrain the location of the magma reservoir under Llaima. Moreover, we observed a signal that might be associated with a slow landslide on the eastern flank of the volcano between December 2007 and January 2008. In
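
    The analytical pressurized spherical source mentioned above is commonly represented by the Mogi point-source approximation; the sketch below gives the standard surface-displacement formulas for an assumed volume change and depth, and is not the study's inversion result.

```python
# Hedged sketch: Mogi point-source surface displacements for a pressurized spherical reservoir.
import numpy as np

def mogi_displacement(x, y, depth, d_volume, nu=0.25):
    """East, north, and up displacements at surface points (x, y) from a source at depth."""
    r2 = x ** 2 + y ** 2
    R3 = (r2 + depth ** 2) ** 1.5
    c = (1.0 - nu) * d_volume / np.pi
    uz = c * depth / R3                       # vertical displacement
    ur = c * np.sqrt(r2) / R3                 # radial displacement
    az = np.arctan2(y, x)
    return ur * np.cos(az), ur * np.sin(az), uz
```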

  14. Detection of cavity migration risks using radar interferometric time series

    Science.gov (United States)

    Chang, L.; Hanssen, R. F.

    2012-12-01

    , ERS-2, Envisat, and Radarsat-2, to investigate the dynamics (deformation) of the area. In particular, we show, for the first time, shear-stress change distribution patterns within the structure of a building over a period of close to 20 years. Time series analysis shows that deformation rates of ~4 mm/a could be detected for about 18 years, followed by a dramatic increase of up to 20 mm/a in the last period. These results imply that the driving mechanisms of the 2011 catastrophe had a very long lead time and are therefore likely due to a long-lasting gradual motion, such as the upward migration of a cavity. The analysis shows the collocation of the deformation with relatively shallow, near-horizontal mine shafts, suggesting that cavity migration is highly likely to be the driving mechanism of the collapse sinkhole.

  15. Time series momentum and contrarian effects in the Chinese stock market

    Science.gov (United States)

    Shi, Huai-Long; Zhou, Wei-Xing

    2017-10-01

    This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.

  16. Jemen - the Proxy War

    Directory of Open Access Journals (Sweden)

    Magdalena El Ghamari

    2015-12-01

    Full Text Available The military operation in Yemen is a significant departure from Saudi Arabia's foreign policy tradition and customs. Riyadh has always relied on three strategies to pursue its interests abroad: wealth, a global network of Muslim education, and diplomacy and mediation. The term "proxy war" has experienced a new popularity in stories on the Middle East. A proxy war is one in which two opposing countries avoid direct war and instead support combatants that serve their interests; on some occasions, one country is a direct combatant while the other supports its enemy. Various news sources began using the term to describe the conflict in Yemen immediately, as if on cue, after Saudi Arabia launched its bombing campaign against Houthi targets in Yemen on 25 March 2015. This is why the author tries to answer the following questions: has the Yemen conflict devolved into a proxy war, and who is fighting whom in Yemen's proxy war? The research area covers the problem of proxy war in the Middle East. Certainly, any discussion of proxy war must begin with the fact that the United States and its NATO allies opened the floodgates for regional proxy wars with two major wars for regime change: in Iraq and Libya. Those two destabilising wars provided opportunities and motives for Sunni states across the Middle East to pursue their own sectarian and political power objectives through "proxy war".

  17. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze the short-range and long-range characteristics of financial time series, and then applies the method to time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent of the five properties of the four stock time series, which allows more universal and subtle fluctuation characteristics of financial time series to be studied. By analyzing the Rényi entropy curves and the profiles of the logarithmic distribution of MFDFA for the five properties of the four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of the five properties across the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in studies that use only one property of the time series.
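
    The MFDFA component referred to above can be sketched in a few lines. The following Python fragment is not part of the record; the scale range, q values and the linear detrending order are illustrative assumptions. It estimates the generalized Hurst exponents h(q) whose curves the 3MPAR method inspects.

        import numpy as np

        def mfdfa_hurst(x, scales, qs):
            """Generalized Hurst exponents h(q) via multifractal DFA with linear detrending."""
            profile = np.cumsum(np.asarray(x, float) - np.mean(x))
            Fq = np.zeros((len(qs), len(scales)))
            for j, s in enumerate(scales):
                n_seg = len(profile) // s
                f2 = np.empty(n_seg)
                for v in range(n_seg):
                    seg = profile[v * s:(v + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    f2[v] = np.mean((seg - trend) ** 2)
                for i, q in enumerate(qs):
                    if q == 0:
                        Fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                    else:
                        Fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
            # slope of log F_q(s) against log s gives h(q)
            return np.array([np.polyfit(np.log(scales), np.log(Fq[i]), 1)[0]
                             for i in range(len(qs))])

        # illustrative use on a synthetic return series
        rng = np.random.default_rng(0)
        returns = rng.standard_normal(10_000)
        print(mfdfa_hurst(returns, scales=[16, 32, 64, 128, 256], qs=[-4, -2, 0, 2, 4]))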

  18. HIGH ORDER FUZZY TIME SERIES MODEL AND ITS APPLICATION TO IMKB

    Directory of Open Access Journals (Sweden)

    Çağdaş Hakan ALADAĞ

    2010-12-01

    Full Text Available The observations of some real time series, such as temperature and stock market data, can take different values within a day. Instead of representing the observations of these time series by real numbers, employing linguistic values or fuzzy sets can be more appropriate. In recent years, many approaches have been introduced to analyze time series whose observations are fuzzy sets; such time series are called fuzzy time series. In this study, a novel approach is proposed to analyze a high order fuzzy time series model. The proposed method is applied to IMKB data and the obtained results are discussed. The IMKB data are also analyzed using some other fuzzy time series methods available in the literature, and the results are compared with those obtained from the proposed method. The comparison shows that the proposed method produces accurate forecasts.

  19. Time Granularity Transformation of Time Series Data for Failure Prediction of Overhead Line

    Science.gov (United States)

    Ma, Yan; Zhu, Wenbing; Yao, Jinxia; Gu, Chao; Bai, Demeng; Wang, Kun

    2017-01-01

    In this paper, we give an approach for transforming time series data with different time granularities onto the same plane, which is the basis of further association analysis. We focus on the application of overhead line tripping. First, all the state variables related to line tripping are collected into our big data platform: line account, line fault, lightning, power load and meteorological data. Second, we pre-process the five kinds of data to guarantee the integrity of the data and the simplicity of the analysis. We use a representation that combines aggregated representation and trend extraction, which considers both the short-term variation and the long-term trend of a time sequence. Last, we use extensive experiments to demonstrate that the proposed time granularity transformation approach not only allows multiple variables to be analysed on the same plane, but also achieves high prediction accuracy and low running time for both SVM and logistic regression algorithms.

  20. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
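
    To make the autoregressive part of this kind of modelling concrete, the short Python sketch below (not from the paper, which used FORTRAN; the synthetic AR(2) coefficients are illustrative assumptions) estimates AR coefficients from sampled data via the Yule-Walker equations.

        import numpy as np

        def yule_walker_ar(x, order):
            """Estimate AR(p) coefficients of a time series via the Yule-Walker equations."""
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            # biased autocovariance estimates at lags 0..order
            acov = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
            R = np.array([[acov[abs(i - j)] for j in range(order)] for i in range(order)])
            r = acov[1:order + 1]
            phi = np.linalg.solve(R, r)          # AR coefficients
            sigma2 = acov[0] - phi @ r           # innovation variance
            return phi, sigma2

        # quick check on a synthetic AR(2) process
        rng = np.random.default_rng(0)
        x = np.zeros(5000)
        for t in range(2, 5000):
            x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()
        print(yule_walker_ar(x, 2))   # estimates should be close to (0.6, -0.3)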

  1. multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    2016-06-17

    Jun 17, 2016 ... Forecasts were generated. The model revealed that upper respiratory tract infection, pneumonia and anaemia are linked to or .... the correlation structures among the component series. Pre-whitening is used in this .... correlation matrix function for the vector process is defined by ...

  2. West Africa Land Use Land Cover Time Series

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — This series of three-period land use land cover (LULC) datasets (1975, 2000, and 2013) aids in monitoring change in West Africa’s land resources (exception is...

  3. Climatic signal from Pinus leucodermis axial resin ducts: a tree-ring time series approach

    OpenAIRE

    Antonio Saracino; Angelo Rita; Sergio Rossi; Laia Andreu-Hayles; G. Helle; Luigi Todaro

    2017-01-01

    Developing long-term chronologies of tree-ring anatomical features to evaluate climatic relationships within species might serve as an annual proxy to explore and elucidate the climatic drivers affecting xylem differentiation. Pinus leucodermis response to climate was examined by analyzing vertical xylem resin ducts in wood growing at high elevation in the Apennines of peninsular Southern Italy. Early- and latewood tree-ring resin duct chronologies, spanning the 1804–2010 time period, were co...

  4. Comparison of the Performance of Two Advanced Spectral Methods for the Analysis of Times Series in Paleoceanography

    Directory of Open Access Journals (Sweden)

    Eulogio Pardo-Igúzquiza

    2015-08-01

    Full Text Available Many studies have revealed the cyclicity of past ocean/atmosphere dynamics at a wide range of time scales (from decadal to millennial time scales), based on the spectral analysis of time series of climate proxies obtained from deep sea sediment cores. Among the many techniques available for spectral analysis, the maximum entropy method and the Thomson multitaper approach have frequently been used because of their good statistical properties and high resolution with short time series. The novelty of the present study is that we compare the two methods according to the performance of the statistical tests used to assess the significance of their power spectrum estimates. The statistical significance of the maximum entropy estimates was assessed by a random permutation test (Pardo-Igúzquiza and Rodríguez-Tovar, 2000), while the statistical significance of the Thomson multitaper method was assessed by an F-test (Thomson, 1982). We compared the results obtained in a case study using simulated data, where the spectral content of the time series was known, and in a case study with real data. In both cases the results are similar: while the cycles identified as significant by maximum entropy and the permutation test have a clear physical interpretation, the F-test with the Thomson multitaper estimator tends to classify the low-frequency peaks as not significant and tends to flag more spurious peaks as significant in the middle and high frequencies. Nevertheless, the best strategy is to use both techniques and exploit the advantages of each of them.
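
    For readers unfamiliar with the Thomson multitaper estimator mentioned above, a minimal Python sketch is given below. It is not from the paper; the time-bandwidth product and taper count are common but illustrative choices, and the eigenspectra are averaged with unit weights (no adaptive weighting).

        import numpy as np
        from scipy.signal.windows import dpss

        def multitaper_psd(x, nw=4.0, fs=1.0):
            """Simple Thomson multitaper PSD estimate: average of Slepian-tapered periodograms."""
            x = np.asarray(x, float) - np.mean(x)
            n = len(x)
            k = int(2 * nw) - 1                      # commonly used number of tapers
            tapers = dpss(n, nw, Kmax=k)             # (k, n) Slepian sequences
            spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
            psd = spectra.mean(axis=0) / fs
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)
            return freqs, psd

        # illustrative use on a synthetic "proxy" series with a single cycle plus noise
        rng = np.random.default_rng(1)
        t = np.arange(512)
        proxy = np.sin(2 * np.pi * t / 41.0) + rng.standard_normal(t.size)
        freqs, psd = multitaper_psd(proxy)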

  5. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Full Text Available Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are particularly hard to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can easily identify the malicious packets. In this paper, a novel approach to detect application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify the attacks. Experiments with several datasets were performed, and the results show that this approach can detect application-layer DDoS attacks effectively.
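
    The HRPI feature itself is straightforward to compute. The Python sketch below is an illustration, not the paper's code; the 60-second window and the (timestamp, ip) request stream format are assumptions. It produces the per-window entropy series that could then feed an AAR/SVM stage.

        import numpy as np
        from collections import Counter

        def hrpi_entropy(source_ips):
            """Shannon entropy of GET-request counts per source IP within one time window."""
            counts = np.array(list(Counter(source_ips).values()), float)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def entropy_series(requests, window=60.0):
            """Entropy per fixed-length window over a stream of (timestamp, ip) requests."""
            requests = sorted(requests)
            t0 = requests[0][0]
            series, bucket = [], []
            for ts, ip in requests:
                if ts - t0 >= window:
                    series.append(hrpi_entropy(bucket))
                    bucket, t0 = [], ts
                bucket.append(ip)
            if bucket:
                series.append(hrpi_entropy(bucket))
            return np.array(series)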

  6. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs that are constructed out of the sampling of the solution of a time-delay differential equation and show their good performance in the forecasting of the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type as well as in the prediction of factual daily market realized volatilities computed with intraday quotes, using as training input daily log-return series of moderate size. We tackle some problems associated with the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Tsunami arrival time detection system applicable to discontinuous time series data with outliers

    Science.gov (United States)

    Lee, Jun-Whan; Park, Sun-Cheon; Lee, Duk Kee; Lee, Jong Ho

    2016-12-01

    Timely detection of tsunamis with water level records is a critical but logistically challenging task because of outliers and gaps. Since tsunami detection algorithms require several hours of past data, outliers could cause false alarms, and gaps can stop the tsunami detection algorithm even after the recording is restarted. In order to avoid such false alarms and time delays, we propose the Tsunami Arrival time Detection System (TADS), which can be applied to discontinuous time series data with outliers. TADS consists of three algorithms, outlier removal, gap filling, and tsunami detection, which are designed to update whenever new data are acquired. After calibrating the thresholds and parameters for the Ulleung-do surge gauge located in the East Sea (Sea of Japan), Korea, the performance of TADS was discussed based on a 1-year dataset with historical tsunamis and synthetic tsunamis. The results show that the overall performance of TADS is effective in detecting a tsunami signal superimposed on both outliers and gaps.

  8. Record statistics of financial time series and geometric random walks

    Science.gov (United States)

    Sabir, Behlool; Santhanam, M. S.

    2014-09-01

    The study of record statistics of correlated series in physics, such as random walks, is gaining momentum, and several analytical results have been obtained in the past few years. In this work, we study the record statistics of correlated empirical data for which random walk models have relevance. We obtain results for the record statistics of selected stock market data and the geometric random walk, primarily through simulations. We show that the distribution of the age of records is a power law with the exponent α lying in the range 1.5≤α≤1.8. Further, the longest record ages follow the Fréchet distribution of extreme value theory. The record statistics of the geometric random walk series are in good agreement with those obtained from empirical stock data.
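
    As a small illustration of the record statistics discussed above, the Python sketch below (not from the paper; the drift, volatility and series length are illustrative assumptions) simulates a geometric random walk and collects the ages of its upper records.

        import numpy as np

        def record_ages(series):
            """Ages (steps elapsed since the previous record) of successive upper records."""
            ages, last_record_t, current_max = [], 0, series[0]
            for t, v in enumerate(series[1:], start=1):
                if v > current_max:
                    ages.append(t - last_record_t)
                    last_record_t, current_max = t, v
            return np.array(ages)

        # geometric random walk akin to a stock price: S_t = S_{t-1} * exp(mu + sigma * eps_t)
        rng = np.random.default_rng(2)
        log_returns = 0.0002 + 0.01 * rng.standard_normal(10_000)
        prices = 100.0 * np.exp(np.cumsum(log_returns))
        ages = record_ages(prices)
        print(len(ages), ages.max())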

  9. Nonlinear time series theory, methods and applications with R examples

    CERN Document Server

    Douc, Randal; Stoffer, David

    2014-01-01

    FOUNDATIONS. Linear Models: Stochastic Processes; The Covariance World; Linear Processes; The Multivariate Cases; Numerical Examples; Exercises. Linear Gaussian State Space Models: Model Basics; Filtering, Smoothing, and Forecasting; Maximum Likelihood Estimation; Smoothing Splines and the Kalman Smoother; Asymptotic Distribution of the MLE; Missing Data Modifications; Structural Component Models; State-Space Models with Correlated Errors; Exercises. Beyond Linear Models: Nonlinear Non-Gaussian Data; Volterra Series Expansion; Cumulants and Higher-Order Spectra; Bilinear Models; Conditionally Heteroscedastic Models; Thre

  10. Change detection in a time series of polarimetric SAR data

    DEFF Research Database (Denmark)

    Conradsen, Knut; Nielsen, Allan Aasbjerg; Skriver, Henning

    2014-01-01

    A test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution, with an associated probability of finding a smaller value of the test statistic, is introduced. Unlike tests based on pairwise comparisons between all temporally consecutive acquisitions, the new omnibus test statistic and the probability measure successfully detect change in two short series of L- and C-band polarimetric EMISAR data.

  11. Speleothem Mg-isotope time-series data from different climate belts

    Science.gov (United States)

    Riechelmann, S.; Buhl, D.; Richter, D. K.; Schröder-Ritzrau, A.; Riechelmann, D. F. C.; Niedermayr, A.; Vonhof, H. B.; Wassenburg, J.; Immenhauser, A.

    2012-04-01

    The magnesium isotope proxy in Earth surface research is still underexplored. Recently, field and laboratory experiments have shed light on the complex suite of processes affecting Mg isotope fractionation in continental weathering systems. Magnesium isotope fractionation in speleothems depends on a series of factors including the biogenic activity and composition of soils, the mineralogy of the host rock, changes in silicate versus carbonate weathering ratios, water residence time in the soil and host rock, and disequilibrium factors such as the precipitation rate of calcite in speleothems. Furthermore, the silicate (here mainly Mg-bearing clays) versus carbonate weathering ratio depends on air temperature and rainfall amount, which also influence soil biogenic activity. It must be emphasized that carbonate weathering is generally dominant, but under increasingly warm and more arid climate conditions, silicate weathering rates increase and release 26Mg-enriched magnesium to the soil water. Furthermore, as shown in laboratory experiments, increasing calcite precipitation rates lead to elevated delta26Mg values and vice versa. Here, data from six stalagmite time-series Mg-isotope records (Thermo Fisher Scientific Neptune MC-ICP-MS) are shown. Stalagmites

  12. GPS Sensor Web Time Series Analysis Using SensorGrid Technology

    Science.gov (United States)

    Granat, R.; Pierce, M.; Aydin, G.; Qi, Z.

    2006-12-01

    We present a method for performing signal detection and classification on real-time streams of GPS sensor web data. Our approach has two parts. The first is a hidden Markov model fitting methodology that enables us to robustly describe the statistics of the data. The second is the SensorGrid technology which allows us to manage the data streams through a series of filters tied together with a publish/subscribe messaging system. In this framework, the HMM algorithm is viewed as a filter. The sensor web data we use in this work comes from the Southern California Integrated GPS Network (SCIGN), which produces a number of data products. In this work, we use the real-time (1Hz for most stations) three-dimensional position information. This data is collected from a system which is not only noisy but also poorly understood; driving forces on the system derive not only from the physical processes of the solid earth but also from external factors, including atmospheric effects and human activity. Fitting an HMM to time series allows us to describe the statistics of the data in a simple way that ascribes discrete modes of behavior to the system. By matching incoming data against the statistics of previously learned modes, we can perform classification according to the best match. In addition, we can perform signal detection across the entire sensor web by correlating mode changes in time; a significant number of mode changes across the network or within a certain sub-network is an indication of an event that is occurring over a wide geographical area. For most applications, reliable HMM fitting results are achieved by using a priori information to form constraints that reduce the number of free parameters. For GPS data, however, this information is not available as the underlying system is not well understood. As a result, we use the regularized deterministic annealing expectation-maximization (RDAEM) algorithm to perform the fit. This method provides high-quality, self
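
    The record describes fitting hidden Markov models to GPS position streams and flagging correlated mode changes. The abstract's RDAEM fitting is not shown here; the following Python sketch is an illustration using the third-party hmmlearn package and its standard EM fit, and the synthetic two-mode displacement data are an assumption. It conveys the basic mode-classification idea.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM   # third-party package, assumed installed

        # X: (n_samples, 3) array of east/north/up positions (synthetic here)
        rng = np.random.default_rng(3)
        quiet = rng.normal(0.0, 1.0, size=(500, 3))
        offset = rng.normal(5.0, 1.0, size=(500, 3))    # a step-like mode change
        X = np.vstack([quiet, offset])

        model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200)
        model.fit(X)
        modes = model.predict(X)                         # discrete behavior mode per epoch
        change_epochs = np.where(np.diff(modes) != 0)[0]
        print(change_epochs[:5])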

  13. Spatially shifting temporal points: estimating pooled within-time series variograms for scarce hydrological data

    OpenAIRE

    Bhowmik, A. K.; Cabral, P.

    2015-01-01

    Estimation of pooled within-time series (PTS) variograms is a frequently used technique for geostatistical interpolation of continuous hydrological variables in spatial data-scarce regions conditional that time series are available. The only available method for estimating PTS variograms averages semivariances, which are computed for individual time steps, over each spatial lag within a pooled time series. However, semivariances computed by a few paired comp...

  14. Multiscale Poincaré plots for visualizing the structure of heartbeat time series

    OpenAIRE

    Henriques, Teresa S.; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F.; Goldberger, Ary L.

    2016-01-01

    Background Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Methods Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coa...

  15. BTW: a web server for Boltzmann time warping of gene expression time series

    Science.gov (United States)

    Ferrè, F.; Clote, P.

    2006-01-01

    Dynamic time warping (DTW) is a well-known quadratic time algorithm to determine the smallest distance and optimal alignment between two numerical sequences, possibly of different length. Originally developed for speech recognition, this method has been used in data mining, medicine and bioinformatics. For gene expression time series data, time warping distance is arguably a more flexible tool to determine genes having similar temporal expression, hence possibly related biological function, than either Euclidean distance or correlation coefficient—especially since time warping accommodates sequences of different length. The BTW web server allows a user to upload two tab-separated text files A,B of gene expression data, each possibly having a different number of time intervals of different durations. BTW then computes time warping distance between each gene of A with each gene of B, using a recently developed symmetric algorithm which additionally computes the Boltzmann partition function and outputs Boltzmann pair probabilities. The Boltzmann pair probabilities, not available with any other existent software, suggest possible biological significance of certain positions in an optimal time warping alignment. Availability: . PMID:16845055
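
    The Boltzmann partition-function variant computed by the BTW server is beyond a short example, but the dynamic time warping recursion it builds on can be sketched as follows. This is illustrative Python, not the server's code, and it handles the plain distance only.

        import numpy as np

        def dtw_distance(a, b):
            """Classic dynamic time warping distance between two 1-D sequences of possibly different length."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        # two expression profiles of different length with similar shape
        print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 0, 1, 2, 3, 2, 1, 0]))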

  16. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.

  17. Time-causal decomposition of geomagnetic time series into secular variation, solar quiet, and disturbance signals

    Science.gov (United States)

    Rigler, E. Joshua

    2017-04-26

    A theoretical basis and prototype numerical algorithm are provided that decompose regular time series of geomagnetic observations into three components: secular variation, solar quiet, and disturbance. Respectively, these three components correspond roughly to slow changes in the Earth's internal magnetic field, periodic daily variations caused by quasi-stationary (with respect to the sun) electrical current systems in the Earth's magnetosphere, and episodic perturbations to the geomagnetic baseline that are typically driven by fluctuations in the solar wind, which interacts electromagnetically with the Earth's magnetosphere. In contrast to similar algorithms applied to geomagnetic data in the past, this one addresses the issue of real-time data acquisition directly by applying a time-causal, exponential smoother with "seasonal corrections" to the data as soon as they become available.
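
    The idea of a time-causal, exponentially smoothed decomposition can be illustrated with the crude Python sketch below. It is an assumption-laden toy, not the USGS prototype: the smoothing constants, the 1-minute cadence and the per-minute-of-day update of the solar-quiet curve are all illustrative choices.

        import numpy as np

        def exp_smooth(x, alpha):
            """One-pass, time-causal exponential smoother."""
            s = np.empty(len(x))
            s[0] = x[0]
            for t in range(1, len(x)):
                s[t] = alpha * x[t] + (1.0 - alpha) * s[t - 1]
            return s

        def decompose_minute_data(x, alpha_sv=1e-4, alpha_sq=0.01, period=1440):
            """Split a 1-minute geomagnetic series into secular variation (SV),
            solar quiet (SQ) and disturbance (residual), updating causally."""
            sv = exp_smooth(np.asarray(x, float), alpha_sv)      # slow baseline
            resid = x - sv
            sq = np.zeros_like(resid)
            daily = np.zeros(period)                             # smoothed value per minute-of-day
            for t in range(len(resid)):
                m = t % period
                daily[m] = alpha_sq * resid[t] + (1 - alpha_sq) * daily[m]
                sq[t] = daily[m]
            dist = x - sv - sq
            return sv, sq, dist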

  18. Persistent homology of time-dependent functional networks constructed from coupled time series

    Science.gov (United States)

    Stolz, Bernadette J.; Harrington, Heather A.; Porter, Mason A.

    2017-04-01

    We use topological data analysis to study "functional networks" that we construct from time-series data from both experimental and synthetic sources. We use persistent homology with a weight rank clique filtration to gain insights into these functional networks, and we use persistence landscapes to interpret our results. Our first example uses time-series output from networks of coupled Kuramoto oscillators. Our second example consists of biological data in the form of functional magnetic resonance imaging data that were acquired from human subjects during a simple motor-learning task in which subjects were monitored for three days during a five-day period. With these examples, we demonstrate that (1) using persistent homology to study functional networks provides fascinating insights into their properties and (2) the position of the features in a filtration can sometimes play a more vital role than persistence in the interpretation of topological features, even though conventionally the latter is used to distinguish between signal and noise. We find that persistent homology can detect differences in synchronization patterns in our data sets over time, giving insight both on changes in community structure in the networks and on increased synchronization between brain regions that form loops in a functional network during motor learning. For the motor-learning data, persistence landscapes also reveal that on average the majority of changes in the network loops take place on the second of the three days of the learning process.

  19. A generalized exponential time series regression model for electricity prices

    DEFF Research Database (Denmark)

    Haldrup, Niels; Knapik, Oskar; Proietti, Tomasso

    We consider the issue of modeling and forecasting daily electricity spot prices on the Nord Pool Elspot power market. We propose a method that can handle seasonal and non-seasonal persistence by modelling the price series as a generalized exponential process. As the presence of spikes can distort...... on the estimated model, the best linear predictor is constructed. Our modeling approach provides good fit within sample and outperforms competing benchmark predictors in terms of forecasting accuracy. We also find that building separate models for each hour of the day and averaging the forecasts is a better...... strategy than forecasting the daily average directly....

  20. Piecewise aggregate representations and lower-bound distance functions for multivariate time series

    Science.gov (United States)

    Li, Hailin

    2015-06-01

    Dimensionality reduction is one of the most important methods for improving the efficiency of techniques applied to multivariate time series data mining. Because multivariate time series have both variable-based and time-based dimensions, reduction techniques must take both into consideration. To achieve this goal, we use a center sequence to represent a multivariate time series so that the new sequence can be treated as a univariate time series. Two sophisticated piecewise aggregate representations, piecewise aggregate approximation and symbolization, originally applied to univariate time series, are then used to further represent the extended sequence derived from the center one. Furthermore, distance functions are designed to measure the similarity between two representations. As proven by the related mathematical analysis, the proposed functions are lower bounds on Euclidean distance and dynamic time warping; in this way, false dismissals can be avoided when they are used to index the time series. In addition, multivariate time series with different lengths can be transformed into extended sequences of equal length, and their corresponding distance functions can measure the similarity between two unequal-length multivariate time series. The experimental results demonstrate that the proposed methods reduce the dimensionality and that their corresponding distance functions satisfy the lower-bound condition, which speeds up similarity search and indexing in multivariate time series datasets.
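
    Piecewise aggregate approximation, the first of the two representations named above, and its associated lower-bounding distance can be sketched in a few lines of Python. This is illustrative only: it assumes univariate series whose length is divisible by the number of segments, i.e. the building block rather than the paper's full multivariate scheme.

        import numpy as np

        def paa(x, n_segments):
            """Piecewise aggregate approximation: mean of each of n_segments equal-length blocks."""
            x = np.asarray(x, float)
            return x.reshape(n_segments, -1).mean(axis=1)   # assumes len(x) % n_segments == 0

        def paa_distance(a, b, n_segments):
            """Distance between PAA representations; lower-bounds the Euclidean distance
            between the original equal-length series, so it never causes false dismissals."""
            seg_len = len(a) / n_segments
            return np.sqrt(seg_len * np.sum((paa(a, n_segments) - paa(b, n_segments)) ** 2))

        a = np.sin(np.linspace(0, 6, 120))
        b = np.cos(np.linspace(0, 6, 120))
        print(paa_distance(a, b, 10), np.linalg.norm(a - b))   # PAA distance <= Euclidean distance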

  1. Quirky patterns in time-series of estimates of recruitment could be artefacts

    DEFF Research Database (Denmark)

    Dickey-Collas, M.; Hinzen, N.T.; Nash, R.D.M.

    2015-01-01

    employed, and the associated modelling assumptions, can have an important influence on the characteristics of each time-series. We explore this idea by investigating recruitment time-series with three different recruitment parameterizations: a stock–recruitment model, a random-walk time-series model......, and non-parametric “free” estimation of recruitment. We show that the recruitment time-series is sensitive to model assumptions and that this can impact reference points used in management and the perception of variability in recruitment, and thus undermine meta-analyses. The assumption of the direct comparability...... of recruitment time-series in databases is therefore not consistent across or within species and stocks. Caution is therefore required as perhaps the characteristics of the time-series of stock dynamics may be determined by the model used to generate them, rather than underlying ecological phenomena...

  2. Anthropogenic radiative forcing time series from pre-industrial times until 2010

    Directory of Open Access Journals (Sweden)

    R. B. Skeie

    2011-11-01

    Full Text Available In order to use knowledge of past climate change to improve our understanding of the sensitivity of the climate system, detailed knowledge about the time development of radiative forcing (RF of the earth atmosphere system is crucial. In this study, time series of anthropogenic forcing of climate from pre-industrial times until 2010, for all well established forcing agents, are estimated. This includes presentation of RF histories of well mixed greenhouse gases, tropospheric ozone, direct- and indirect aerosol effects, surface albedo changes, stratospheric ozone and stratospheric water vapour. For long lived greenhouse gases, standard methods are used for calculating RF, based on global mean concentration changes. For short lived climate forcers, detailed chemical transport modelling and radiative transfer modelling using historical emission inventories is performed. For the direct aerosol effect, sulphate, black carbon, organic carbon, nitrate and secondary organic aerosols are considered. For aerosol indirect effects, time series of both the cloud lifetime effect and the cloud albedo effect are presented. Radiative forcing time series due to surface albedo changes are calculated based on prescribed changes in land use and radiative transfer modelling. For the stratospheric components, simple scaling methods are used. Long lived greenhouse gases (LLGHGs are the most important radiative forcing agent with a RF of 2.83±0.28 W m−2 in year 2010 relative to 1750. The two main aerosol components contributing to the direct aerosol effect are black carbon and sulphate, but their contributions are of opposite sign. The total direct aerosol effect was −0.48±0.32 W m−2 in year 2010. Since pre-industrial times the positive RF (LLGHGs and tropospheric O3 has been offset mainly by the direct and indirect aerosol effects, especially in the second half of the 20th century, which possibly lead to a decrease in the total

  3. Jemen - the Proxy War

    OpenAIRE

    Magdalena El Ghamari

    2015-01-01

    The military operation in Yemen is a significant departure from Saudi Arabia's foreign policy tradition and customs. Riyadh has always relied on three strategies to pursue its interests abroad: wealth, a global network of Muslim education, and diplomacy and mediation. The term "proxy war" has experienced a new popularity in stories on the Middle East. A proxy war is one in which two opposing countries avoid direct war and instead support combatants that serve their interests. In some occas...

  4. Groundwater similarity across a watershed derived from time-warped and flow-corrected time series

    Science.gov (United States)

    Rinderer, M.; McGlynn, B. L.; van Meerveld, H. J.

    2017-05-01

    Information about catchment-scale groundwater dynamics is necessary to understand how catchments store and release water and why water quantity and quality varies in streams. However, groundwater level monitoring is often restricted to a limited number of sites. Knowledge of the factors that determine similarity between monitoring sites can be used to predict catchment-scale groundwater storage and connectivity of different runoff source areas. We used distance-based and correlation-based similarity measures to quantify the spatial and temporal differences in shallow groundwater similarity for 51 monitoring sites in a Swiss prealpine catchment. The 41 months long time series were preprocessed using Dynamic Time-Warping and a Flow-corrected Time Transformation to account for small timing differences and bias toward low-flow periods. The mean distance-based groundwater similarity was correlated to topographic indices, such as upslope contributing area, topographic wetness index, and local slope. Correlation-based similarity was less related to landscape position but instead revealed differences between seasons. Analysis of variance and partial Mantel tests showed that landscape position, represented by the topographic wetness index, explained 52% of the variability in mean distance-based groundwater similarity, while spatial distance, represented by the Euclidean distance, explained only 5%. The variability in distance-based similarity and correlation-based similarity between groundwater and streamflow time series was significantly larger for midslope locations than for other landscape positions. This suggests that groundwater dynamics at these midslope sites, which are important to understand runoff source areas and hydrological connectivity at the catchment scale, are most difficult to predict.

  5. Study of Track Irregularity Time Series Calibration and Variation Pattern at Unit Section

    Directory of Open Access Journals (Sweden)

    Chaolong Jia

    2014-01-01

    Full Text Available Focusing on problems existing in the quality of track irregularity time series data, this paper first presents abnormal data identification, data offset correction, local outlier data identification, and noise cancellation algorithms. It then proposes track irregularity time series decomposition and reconstruction through a wavelet decomposition and reconstruction approach. Finally, the patterns and features of the track irregularity standard deviation data sequence in unit sections are studied, and the changing trend of the track irregularity time series is identified and described.
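
    A wavelet decomposition and reconstruction step of the kind described above can be sketched with the third-party PyWavelets package. This is illustrative Python; the choice of wavelet, decomposition level and which detail levels to suppress are assumptions, not the paper's settings.

        import numpy as np
        import pywt   # PyWavelets, assumed installed

        def wavelet_trend(x, wavelet="db4", level=4, drop_finest=2):
            """Decompose a track-irregularity series and reconstruct a smoothed version
            by zeroing the finest `drop_finest` detail coefficient levels."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            for i in range(len(coeffs) - drop_finest, len(coeffs)):
                coeffs[i] = np.zeros_like(coeffs[i])
            return pywt.waverec(coeffs, wavelet)[:len(x)]

        # toy example: slow trend plus measurement noise
        t = np.linspace(0, 10, 1024)
        series = 0.5 * np.sin(0.6 * t) + 0.1 * np.random.default_rng(4).standard_normal(t.size)
        smooth = wavelet_trend(series)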

  6. A new non-parametric stationarity test of time series in the time domain

    KAUST Repository

    Jin, Lei

    2014-11-07

    © 2015 The Royal Statistical Society and Blackwell Publishing Ltd. We propose a new double-order selection test for checking second-order stationarity of a time series. To develop the test, a sequence of systematic samples is defined via Walsh functions. Then the deviations of the autocovariances based on these systematic samples from the corresponding autocovariances of the whole time series are calculated and the uniform asymptotic joint normality of these deviations over different systematic samples is obtained. With a double-order selection scheme, our test statistic is constructed by combining the deviations at different lags in the systematic samples. The null asymptotic distribution of the statistic proposed is derived and the consistency of the test is shown under fixed and local alternatives. Simulation studies demonstrate well-behaved finite sample properties of the method proposed. Comparisons with some existing tests in terms of power are given both analytically and empirically. In addition, the method proposed is applied to check the stationarity assumption of a chemical process viscosity readings data set.

  7. Scale and time dependence of serial correlations in word-length time series of written texts

    Science.gov (United States)

    Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.

    2014-11-01

    This work considered the quantitative analysis of large written texts. To this end, the text was converted into a time series by taking the sequence of word lengths. Detrended fluctuation analysis (DFA) was used to characterize the long-range serial correlations of the time series. The DFA was implemented within a rolling-window framework to estimate the variations of the correlation strength, quantified in terms of the scaling exponent, along the text. Also, a filtering derivative was used to compute the dependence of the scaling exponent on the scale. The analysis was applied to three famous English-language literary narrations, namely Alice in Wonderland (by Lewis Carroll), Dracula (by Bram Stoker) and Sense and Sensibility (by Jane Austen). The results showed that high correlations appear for scales of about 50-200 words, suggesting that at these scales the text has the strongest coherence. The scaling exponent was not constant along the text, showing important variations with apparently cyclical behavior. An interesting coincidence between the scaling exponent variations and changes in narrative units (e.g., chapters) was found. This suggests that the scaling exponent obtained from the DFA is able to detect changes in narrative structure as expressed by the usage of words of different lengths.
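
    A rolling-window estimate of the DFA scaling exponent on a word-length series, in the spirit of the approach described above, might look like the following Python sketch. It is illustrative; the window length, step and scale range are assumptions.

        import numpy as np

        def dfa_alpha(x, scales=(8, 16, 32, 64)):
            """Ordinary (monofractal) DFA scaling exponent of a short 1-D series."""
            profile = np.cumsum(np.asarray(x, float) - np.mean(x))
            fluct = []
            for s in scales:
                n_seg = len(profile) // s
                ms = []
                for v in range(n_seg):
                    seg = profile[v * s:(v + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)
                    ms.append(np.mean((seg - trend) ** 2))
                fluct.append(np.sqrt(np.mean(ms)))
            return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

        def rolling_alpha(text, window=2000, step=500):
            """Scaling exponent of the word-length series inside a rolling window along the text."""
            lengths = np.array([len(w) for w in text.split()])
            return [dfa_alpha(lengths[i:i + window])
                    for i in range(0, len(lengths) - window + 1, step)]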

  8. Online time and resource management based on surgical workflow time series analysis.

    Science.gov (United States)

    Maktabi, M; Neumuth, T

    2017-02-01

    Hospitals' effectiveness and efficiency can be enhanced by automating the resource and time management of the most cost-intensive unit in the hospital: the operating room (OR). The key elements required for the ideal organization of hospital staff and technical resources (such as instruments in the OR) are an exact online forecast of both the surgeon's resource usage and the remaining intervention time. This paper presents a novel online approach relying on time series analysis and the application of a linear time-variant system. We calculated the power spectral density and the spectrogram of surgical perspectives (e.g., used instrument) of interest to compare several surgical workflows. Considering only the use of the surgeon's right hand during an intervention, we were able to predict the remaining intervention time online with an error of 21 min 45 s ±9 min 59 s for lumbar discectomy. Furthermore, the performance of forecasting of technical resource usage in the next 20 min was calculated for a combination of spectral analysis and the application of a linear time-variant system (sensitivity: 74 %; specificity: 75 %) focusing on just the use of surgeon's instrument in question. The outstanding benefit of these methods is that the automated recording of surgical workflows has minimal impact during interventions since the whole set of surgical perspectives need not be recorded. The resulting predictions can help various stakeholders such as OR staff and hospital technicians. Moreover, reducing resource conflicts could well improve patient care.

  9. Imprints of approximately 8 year oscillation in climatic time series

    Science.gov (United States)

    Mikšovský, Jiří; Paluš, Milan; Jajcay, Nikola; Donner, Reik

    2017-04-01

    Due to activity and complex interaction of various climate-determining agents, a wide range of variability patterns can be found in the meteorological records. Some of these can be attributed to external factors such as changes in solar or volcanic activity; others are linked to internally induced climate variations. Among the perhaps less prominent, yet still potentially influential variability modes is the approximately 8 year oscillation. In the past, its presence has been reported in various (particularly European) climate records, typically related to temperature or pressure data. Here, the presence of the approximately 8y cycle has been investigated in climate signals originating from a range of observational, reanalyzed and simulated datasets. Through statistical techniques based primarily on wavelet transform and regression analysis, magnitude and statistical significance of the approximately 8y oscillation were evaluated, as well as its temporal stability and geographical patterns. In addition to confirming a statistically significant presence of the approximately 8y periodicity in the mean temperature series over a large part of Europe, its existence has also been demonstrated for minimum and maximum temperature series, while only limited traces were found in precipitation data. The analysis of long European temperature records has revealed a distinct multidecadal variation of the magnitude of the approximately 8y oscillation, although the specific mechanism responsible for this behavior still remains unclear. A link of the approximately 8y component in the index of the North Atlantic Oscillation to near-ground temperatures has been detected for much of Europe as well as some areas in the North Atlantic region. Finally, the presence of the approximately 8y cycle in the general circulation model outputs has been investigated: while some indications of the 8y oscillations were found in the simulated data, they seem generally weaker than in their

  10. Reconstruction of Past Climatic Proxy Series

    Science.gov (United States)

    1975-08-29

    zones, herb pollen, such as Artemisia and Chenopodiaceae, are dominant (6e). Deciduous arboreal pollen is restricted to the mixed forest zone (6f... reproduction and an increase in the longer-lived species such as hemlock (Tsuga canadensis) and spruce (Picea sp.), which can reproduce in the absence of fire...are only temporary during the 1450-1850 A.D. interval since it is later replaced by hemlock. White pine also requires fire for good reproduction

  11. Transforming the autocorrelation function of a time series to detect land cover change

    CSIR Research Space (South Africa)

    Salmon, BP

    2015-07-01

    Full Text Available methods. A robust change detection metric can be derived by analyzing the area under the autocorrelation function for a time series. The time dependence on the first and second moment causes a non-stationary event within the time series which results...
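
    A per-pixel change metric of the kind described in the record, based on the area under the autocorrelation function of a time series, can be sketched as follows. This is illustrative Python, not the CSIR implementation; the maximum lag and the synthetic NDVI-like series are assumptions.

        import numpy as np

        def acf(x, max_lag):
            """Sample autocorrelation function of a 1-D series up to max_lag."""
            x = np.asarray(x, float) - np.mean(x)
            var = np.dot(x, x) / len(x)
            return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                             for k in range(max_lag + 1)])

        def acf_area(x, max_lag=23):
            """Area under the ACF; shifts in this area over time can flag land cover change."""
            return np.trapz(acf(x, max_lag))

        # e.g. one pixel's vegetation index series with ~23 observations per year
        rng = np.random.default_rng(5)
        ndvi = 0.4 + 0.2 * np.sin(2 * np.pi * np.arange(100) / 23) + 0.05 * rng.standard_normal(100)
        print(acf_area(ndvi))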

  12. Optimal Warping Paths are unique for almost every pair of Time Series

    OpenAIRE

    Jain, Brijnesh J.; Schultz, David

    2017-01-01

    An optimal warping path between two time series is generally not unique. The size and form of the set of pairs of time series with non-unique optimal warping path is unknown. This article shows that optimal warping paths are unique for almost every pair of time series in a measure-theoretic sense. All pairs of time series with non-unique optimal warping path form a negligible set and are geometrically the union of zero sets of quadratic forms. The result is useful for analyzing and understand...

  13. Forecasting and analyzing high O3 time series in educational area through an improved chaotic approach

    Science.gov (United States)

    Hamid, Nor Zila Abd; Adenan, Nur Hamiza; Noorani, Mohd Salmi Md

    2017-08-01

    Forecasting and analyzing the ozone (O3) concentration time series is important because the pollutant is harmful to health. This study is a pilot study for forecasting and analyzing the O3 time series in one Malaysian educational area, namely Shah Alam, using a chaotic approach. Through this approach, the observed hourly scalar time series is reconstructed into a multi-dimensional phase space, which is then used to forecast the future time series through the local linear approximation method. The main purpose is to forecast high O3 concentrations. The original method performed poorly, but the improved method addressed this weakness, enabling the high concentrations to be successfully forecast. The correlation coefficient between the observed and forecasted time series through the improved method is 0.9159, and both the mean absolute error and root mean squared error are low. Thus, the improved method is advantageous. The time series analysis by means of the phase space plot and the Cao method identified the presence of low-dimensional chaotic dynamics in the observed O3 time series. Results showed that at least seven factors affect the studied O3 time series, which is consistent with the factors listed in the diurnal variation investigation and the sensitivity analysis from past studies. In conclusion, the chaotic approach successfully forecasts and analyzes the O3 time series in the educational area of Shah Alam. These findings are expected to help stakeholders such as the Ministry of Education and the Department of Environment achieve better air pollution management.
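
    The phase-space reconstruction step described above, followed by a simple neighbour-based forecast (a zeroth-order stand-in for the paper's local linear approximation), can be sketched as follows. This is illustrative Python; the embedding dimension, delay, neighbour count and the synthetic hourly series are assumptions.

        import numpy as np

        def embed(x, dim, tau):
            """Time-delay embedding of a scalar series into dim-dimensional phase space."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

        def local_forecast(x, dim=7, tau=1, k=20):
            """One-step forecast: average the observations that followed the k nearest
            phase-space neighbours of the current state."""
            x = np.asarray(x, float)
            X = embed(x, dim, tau)
            target, history = X[-1], X[:-1]
            nn = np.argsort(np.linalg.norm(history - target, axis=1))[:k]
            followers = x[nn + (dim - 1) * tau + 1]   # value observed right after each neighbour
            return followers.mean()

        # demo on a noisy diurnal cycle as a stand-in for hourly O3 readings
        hours = np.arange(2000)
        o3 = 40 + 15 * np.sin(2 * np.pi * hours / 24) + np.random.default_rng(6).normal(0, 2, hours.size)
        print(local_forecast(o3))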

  14. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Biomass monitoring,...

  15. Time-Elastic Generative Model for Acceleration Time Series in Human Activity Recognition

    Directory of Open Access Journals (Sweden)

    Mario Munoz-Organero

    2017-02-01

    Full Text Available Body-worn sensors in general and accelerometers in particular have been widely used in order to detect human movements and activities. The execution of each type of movement by each particular individual generates sequences of time series of sensed data from which specific movement-related patterns can be assessed. Several machine learning algorithms have been used over windowed segments of sensed data in order to detect such patterns in activity recognition based on intermediate features (either hand-crafted or automatically learned from data). The underlying assumption is that the computed features will capture statistical differences that can properly classify different movements and activities after a training phase based on sensed data. In order to achieve high accuracy and recall rates (and guarantee the generalization of the system to new users), the training data have to contain enough information to characterize all possible ways of executing the activity or movement to be detected. This could imply large amounts of data and a complex and time-consuming training phase, which has been shown to be even more relevant when automatically learning the optimal features to be used. In this paper, we present a novel generative model that is able to generate sequences of time series for characterizing a particular movement based on the time elasticity properties of the sensed data. The model is used to train a stack of auto-encoders in order to learn the particular features able to detect human movements. The results of movement detection using a newly generated database with information on five users performing six different movements are presented. The generalization of results using an existing database is also presented in the paper. The results show that the proposed mechanism is able to obtain acceptable recognition rates (F = 0.77) even in the case of using different people executing a different sequence of movements and using different

  16. Climatic interpretation of tree-ring methoxyl d2H time-series from a central alpine larch forest

    Science.gov (United States)

    Riechelmann, Dana F. C.; Greule, Markus; Siegwolf, Rolf T. W.; Esper, Jan; Keppler, Frank

    2017-04-01

    We measured stable hydrogen isotope ratios of lignin methoxyl groups (d2HLM) in high elevation larch trees (Larix decidua Mill.) from the Simplon Valley in southern Switzerland. Thirty-seven larch trees were sampled and five individuals were analysed for their d2HLM values at annual (1971-2009) and pentadal (1746-2009) resolution. Testing the climate response of the d2HLM series, the annually resolved series show a positive correlation of r = 0.60 with June/July precipitation and a weaker but negative correlation with June/July temperature. In addition, a negative correlation with June-August d2H in precipitation at the nearby GNIP station in Locarno is observed. The pentadally resolved d2HLM series show no significant correlation with climate parameters. The positive correlation of the annually resolved data with summer precipitation is uncommon for d2H measurements from tree rings (Feakins et al., 2013; Helle and Schleser, 2004; McCarroll and Loader, 2004; Mischel et al., 2015; White et al., 1994). However, we explain the positive association with warm-season hydroclimate as follows: methoxyl groups of lignin are formed directly from the xylem water in the tissues. More precipitation during June and July, which are on average relatively dry months, results in higher d2H values of the xylem water and therefore higher d2H values in the lignin methoxyl groups. Therefore, we suggest that d2HLM values of high elevation larch trees might serve as a summer precipitation proxy. References: Feakins, S.J., Ellsworth, P.V., Sternberg, L.d.S.L., 2013. Lignin methoxyl hydrogen isotope ratios in a coastal ecosystem. Geochimica et Cosmochimica Acta, 121: 54-66. Helle, G., Schleser, G.H., 2004. Interpreting Climate Proxies from Tree-rings. In: Fischer, H., Floeser, G., Kumke, T., Lohmann, G., Miller, H., Negendank, J.F.W., et al., editors. The Climate in Historical Times. Springer Berlin Heidelberg, pp. 129-148. McCarroll, D., Loader, N.J., 2004. Stable isotopes in tree rings. Quaternary

  17. Academic Training: Real Time Process Control - Lecture series

    CERN Document Server

    Françoise Benz

    2004-01-01

    ACADEMIC TRAINING LECTURE REGULAR PROGRAMME 7, 8 and 9 June, from 11:00 to 12:00 - Main Auditorium, bldg. 500. Real Time Process Control, T. Riesco / CERN-TS. What exactly is meant by real-time? There are several definitions of real-time, most of them contradictory. Unfortunately the topic is controversial, and there does not seem to be 100% agreement over the terminology. Real-time applications are becoming increasingly important in our daily lives and can be found in diverse environments such as the automatic braking system on an automobile, a lottery ticket system, or robotic environmental samplers on a space station. These lectures will introduce concepts and theory such as basic timing constraints, task scheduling, periodic server mechanisms, and hard and soft real-time.

  18. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    Science.gov (United States)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time

  19. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

    We analyzed databases with gait time series of adults and persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we believe that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectral widths, and found that the spectra of the healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series the multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path wearing sensors on both feet, so there is one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We tried to find differences between the two time series that can be used as indicators of equilibrium problems.

  20. Detecting cognizable trends of gene expression in a time series ...

    Indian Academy of Sciences (India)

    We propose a bootstrap algorithm to identify 'cognizable' 'time-trends' of gene expression. Properties of the proposed algorithm are derived using a simulation study. The proposed algorithm captured known 'time-trends' in the simulated data with a high probability of success, even when sample sizes were small (n<10).
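
    The abstract does not spell out the resampling scheme, so the sketch below uses a simple permutation-style variant: the observed least-squares slope is compared against slopes obtained after randomly permuting the expression values, which remains workable for the small sample sizes (n < 10) mentioned above. The function and variable names are illustrative only, not the authors' algorithm.

```python
import numpy as np

def resampled_trend_pvalue(times, expression, n_resamples=2000, seed=0):
    """Permutation-style resampling test for a linear time-trend in a short
    expression series (a sketch only; the paper's actual algorithm for
    'cognizable' trends is not fully specified in the abstract)."""
    rng = np.random.default_rng(seed)
    slope_obs = np.polyfit(times, expression, 1)[0]
    count = 0
    for _ in range(n_resamples):
        # permuting expression values breaks any time ordering (null model)
        perm = rng.permutation(expression)
        if abs(np.polyfit(times, perm, 1)[0]) >= abs(slope_obs):
            count += 1
    return (count + 1) / (n_resamples + 1)

# toy usage with a small sample size (n < 10), as in the abstract
t = np.arange(8)
expr = 0.4 * t + np.random.default_rng(1).normal(0, 0.5, 8)
print(resampled_trend_pvalue(t, expr))
```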

  1. TAX ELASTICITY IN SIERRA LEONE: A TIME SERIES APPROACH

    Directory of Open Access Journals (Sweden)

    Brima Ibrahim Baimba Kargbo

    2012-01-01

    Full Text Available The fiscal authorities in Sierra Leone introduced a series of reforms in the tax system, ranging from continual revisions of tax rates to harmonization and the institution of new taxes that are relatively easy to collect. Despite these measures, the output of the tax system as measured by the tax/GDP ratio remains very low, averaging 11 per cent, contributing to higher fiscal deficits. This study examined the base elasticity of the tax system in Sierra Leone and its major handles using annual data covering the period between 1977 and 2009. The Singer method of dummy variables was employed to adjust for the effect of discretionary tax measures and then compare buoyancy and elasticity measures. The empirical results indicated that buoyancy estimates were higher than elasticity estimates, and that short-run elasticities were lower than the static long-run elasticities. Estimation results further showed that discretionary tax measures were effective in mobilizing additional tax revenues and that the tax system was inelastic during the period.
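
    A minimal sketch of the buoyancy/elasticity idea is given below: tax buoyancy is the coefficient on log GDP in a plain log-log regression, while an elasticity-style estimate additionally controls for years with discretionary tax measures using dummy variables, a simplified stand-in for the Singer adjustment used in the study. The function and variable names are assumptions.

```python
import numpy as np

def log_log_elasticity(revenue, gdp, reform_dummies=None):
    """Regress log(tax revenue) on log(GDP), optionally adding dummy columns
    (shape: n_years x n_dummies) for years with discretionary tax measures.
    Returns the coefficient on log(GDP): buoyancy without dummies, an
    elasticity-style estimate with them (simplified Singer-type adjustment)."""
    y = np.log(np.asarray(revenue, dtype=float))
    X = [np.ones_like(y), np.log(np.asarray(gdp, dtype=float))]
    if reform_dummies is not None:
        X.append(np.asarray(reform_dummies, dtype=float).reshape(len(y), -1))
    X = np.column_stack(X)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

# buoyancy uses actual revenues as observed; the dummy-adjusted regression
# approximates the revenue response net of discretionary measures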

  2. WEIGHTED TIME SERIES ANALYSIS FOR ELECTROENCEPHALOGRAPHIC SOURCE LOCALIZATION

    Directory of Open Access Journals (Sweden)

    EDUARDO GIRALDO

    2012-01-01

    Full Text Available This article presents a new method for estimating neural activity from electroencephalographic signals using weighted time series analysis. The method considers a linear model based on physiological constraints that accounts for both spatial and temporal dynamics, and a weighting stage that modifies the model assumptions according to the observations. The computed weight matrix is included in the cost function used to solve the dynamic inverse problem, and therefore in the Kalman filter formulation. In this way, a weighted Kalman filter that incorporates the weight matrix is proposed. The filter performance (in terms of localization error) is analyzed for several SNRs. Optimal performance is achieved using the linear model with the weighting matrix computed by the inner product method.

  3. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Frutiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time se...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing...

  4. Time Series Analysis: A New Methodology for Comparing the Temporal Variability of Air Temperature

    Directory of Open Access Journals (Sweden)

    Piia Post

    2013-01-01

    Full Text Available Temporal variability of three different temperature time series was compared by the use of statistical modeling of time series. The three temperature time series represent the same physical process, but are at different levels of spatial averaging: temperatures from point measurements, from regional Baltan65+, and from global ERA-40 reanalyses. The first order integrated average model IMA(0, 1, 1) is used to compare the temporal variability of the time series. The applied IMA(0, 1, 1) model is divisible into a sum of random walk and white noise component, where the variances for both white noises (one of them serving as a generator of the random walk) are computable from the parameters of the fitted model. This approach enables us to compare the models fitted independently to the original and restored series using two new parameters. This operation adds a certain new method to the analysis of nonstationary series.
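
    The random-walk-plus-white-noise decomposition mentioned above can be made concrete: if the fitted IMA(0, 1, 1) model has MA coefficient theta (negative for a valid decomposition) and innovation variance s2, then the white-noise variance is -theta*s2 and the random-walk step variance is (1+theta)^2*s2. The sketch below, using statsmodels, is an assumed reading of that computation rather than the authors' code.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def ima_noise_decomposition(series):
    """Fit an IMA(0,1,1) model and recover the variances of its equivalent
    random-walk-plus-white-noise (local level) representation."""
    res = ARIMA(series, order=(0, 1, 1)).fit()
    params = dict(zip(res.model.param_names, np.asarray(res.params)))
    theta, s2 = params["ma.L1"], params["sigma2"]
    white_noise_var = -theta * s2                # observation noise variance
    random_walk_var = (1.0 + theta) ** 2 * s2    # random walk step variance
    return white_noise_var, random_walk_var

# toy usage: simulate a random walk plus observation noise, then recover variances
rng = np.random.default_rng(2)
level = np.cumsum(rng.normal(0, 0.3, 2000))      # random walk component
temp = level + rng.normal(0, 1.0, 2000)          # plus white noise
print(ima_noise_decomposition(temp))             # roughly (1.0, 0.09)
```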

  5. Methods for serial analysis of long time series in the study of biological rhythms

    OpenAIRE

    Díez-Noguera, Antoni

    2013-01-01

    When one is faced with the analysis of long time series, one often finds that the characteristics of circadian rhythms vary with time throughout the series. To cope with this situation, the whole series can be fragmented into successive sections which are analyzed one after the other, which constitutes a serial analysis. This article discusses serial analysis techniques, beginning with the characteristics that the sections must have and how they can affect the results. After consideration of ...

  6. Nonlinear Analysis of Guillain Barré Time Series to Elucidate Its Epidemiology

    OpenAIRE

    Lestayo O'Farrill, Zurina; Hernández Cáceres, José Luís; O'Farrill Mons, Esperanza

    2013-01-01

    The etiology of Guillain Barré Syndrome (GBS) is not fully clarified, and there is a lack of agreement concerning its putative epidemic character. The low incidence rate of this disease is a disadvantage for employing the traditional statistical methods used in the analysis of epidemics. The objective of this paper is to clarify the GBS epidemic behavior applying a nonlinear time series identification approach. The authors obtained one time series of GBS and nine series of classical infectiou...

  8. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  9. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable...

  10. Detecting cognizable trends of gene expression in a time series ...

    Indian Academy of Sciences (India)

    costs involved in carrying out RNA-seq experiments and difficult conditions of sampling usually resulted in a small sample size and a limited number of time points at which biospecimens are sampled. Standard statistical tests with small sample sizes are untenable for formal tests of significance. On the other hand, without ...

  11. Five Separate Bias Contributions in Time Series Models for Equidistantly Resampled Irregular Data

    NARCIS (Netherlands)

    Broersen, P.M.T.

    2009-01-01

    The use of time series models for irregular data requires resampling of the data on an equidistant grid. Slotted resampling transforms an irregular randomly sampled process into an equidistant signal where data are missing. An approximate maximum-likelihood time series estimator has been developed

  12. Effects of changing sea ice on phytoplankton bloom strength and composition at the Rothera Time Series

    NARCIS (Netherlands)

    Venables, Hugh; Meredith, Michael; Clarke, Andrew; Rozema, Patrick

    2016-01-01

    The Rothera Time Series has collected year-round physical, biological and biogeochemical data from Ryder Bay since 1998. The sample site is 500m deep, close to Rothera Research Station on the west Antarctic Peninsula, just inside the Antarctic Circle. Over the course of the time series there has

  13. An Adaptive Density-Based Time Series Clustering Algorithm: A Case Study on Rainfall Patterns

    Directory of Open Access Journals (Sweden)

    Xiaomi Wang

    2016-11-01

    Full Text Available Current time series clustering algorithms fail to effectively mine clustering distribution characteristics of time series data without sufficient prior knowledge. Furthermore, these algorithms fail to simultaneously consider the spatial attributes, non-spatial time series attribute values, and non-spatial time series attribute trends. This paper proposes an adaptive density-based time series clustering (DTSC) algorithm that simultaneously considers the three above-mentioned attributes to address these limitations. In this algorithm, the Delaunay triangulation is first utilized in combination with particle swarm optimization (PSO) to adaptively obtain objects with similar spatial attributes. An improved density-based clustering strategy is then adopted to detect clusters with similar non-spatial time series attribute values and time series attribute trends. The effectiveness and efficiency of the DTSC algorithm are validated by experiments on simulated datasets and real applications. The results indicate that the proposed DTSC algorithm effectively detects time series clusters with arbitrary shapes and similar attributes and densities while accounting for noise.

  14. Linear genetic programming for time-series modelling of daily flow rate

    Indian Academy of Sciences (India)

    In this study linear genetic programming (LGP), which is a variant of Genetic Programming, and two versions of Neural Networks (NNs) are used in predicting time-series of daily flow rates at a station on Schuylkill River at Berne, PA, USA. Daily flow rate at present is being predicted based on different time-series scenarios.

  15. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and its tributaries, as well as the main patterns of change in DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Are we in a bubble? A simple time-series-based diagnostic

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    2013-01-01

    Time series with bubble-like patterns display an unbalance between growth and acceleration, in the sense that growth in the upswing is “too fast” and then there is a collapse. In fact, such time series show periods where both the first differences (1-L) and the second differences (1-L)^2
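
    A crude numerical reading of the growth-versus-acceleration unbalance is sketched below; the paper develops a formal test, which is not reproduced here, and the window length and ratio statistic are assumptions made for illustration.

```python
import numpy as np

def growth_acceleration_ratio(prices, window=60):
    """Rolling comparison of growth ((1-L)y) with acceleration ((1-L)^2 y):
    a rough stand-in for the unbalance diagnostic sketched in the abstract."""
    y = np.asarray(prices, dtype=float)
    growth = np.diff(y)        # (1-L) y
    accel = np.diff(y, n=2)    # (1-L)^2 y
    ratios = []
    for end in range(window, len(accel) + 1):
        g = growth[end - window:end]
        a = accel[end - window:end]
        ratios.append(np.mean(np.abs(g)) / np.mean(np.abs(a)))
    # persistently large values hint at "too fast" growth relative to acceleration
    return np.array(ratios)
```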

  17. Modeling BAS Dysregulation in Bipolar Disorder : Illustrating the Potential of Time Series Analysis

    NARCIS (Netherlands)

    Hamaker, Ellen L.; Grasman, Raoul P P P; Kamphuis, Jan Henk

    2016-01-01

    Time series analysis is a technique that can be used to analyze the data from a single subject and has great potential to investigate clinically relevant processes like affect regulation. This article uses time series models to investigate the assumed dysregulation of affect that is associated with

  18. Dimension Reduction of Multi-Spectral Satellite Image Time Series to Improve Deforestation Monitoring

    NARCIS (Netherlands)

    Lu, Meng; Hamunyela, E.; Verbesselt, Jan; Pebesma, Edzer

    2017-01-01

    In recent years, sequential tests for detecting structural changes in time series have been adapted for deforestation monitoring using satellite data. The input time series of such sequential tests is typically a vegetation index (e.g., NDVI), which uses two or three bands and ignores all other

  19. MAPT and PAICE: Tools for time series and single time point transcriptomics visualization and knowledge discovery.

    Science.gov (United States)

    Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W

    2012-01-01

    With the advent of next-generation sequencing, -omics fields such as transcriptomics have experienced increases in data throughput of orders of magnitude. In terms of analyzing and visually representing these huge datasets, an intuitive and computationally tractable approach is to map quantified transcript expression onto biochemical pathways while employing data-mining and visualization principles to accelerate knowledge discovery. We present two cross-platform tools: MAPT (Mapping and Analysis of Pathways through Time) and PAICE (Pathway Analysis and Integrated Coloring of Experiments), an easy-to-use analysis suite to facilitate time series and single time point transcriptomics analysis. In unison, MAPT and PAICE serve as a visual workbench for transcriptomics knowledge discovery, data-mining and functional annotation. PAICE and MAPT are distinct but inextricably linked tools. The former is specifically designed to map EC accessions onto KEGG pathways while handling multiple gene copies, detection-call analysis, as well as un/annotated EC accessions lacking quantifiable expression. The latter tool integrates PAICE datasets to drive visualization, annotation, and data-mining. Both tools are available for free at http://sourceforge.net/projects/paice/ and http://sourceforge.net/projects/mapt/

  20. Entropy in natural time of geoelectric time series of dichotomic nature.

    Science.gov (United States)

    Ramírez-Rojas, Alejandro; Rubén Luevano, J.; Vargas, Carlos A.

    2010-05-01

    It is well known that short-term earthquake prediction is one of the most debated topics in the Earth Sciences. In this context, seismic electric signal (SES) activities have been a polemic topic within the scientific community. SES are electric self-potential fluctuations of low frequency (≤ 1 Hz) and dichotomic nature. These fluctuations are detected in seismically active regions as anomalies in geoelectric signals from a few hours to some weeks before impending earthquakes. SES have been reported in Greece, Japan and Mexico. In this work we present a case study, in the natural time domain, of dichotomous signals monitored in the seismically active Mexican region located on the Guerrero-Oaxaca coast. The monitored geoelectrical signals are associated with two earthquakes that occurred on October 24, 1993 and September 14, 1995. To provide a comparison for our experimental data set, we consider a chaotic dichotomic time series obtained from the Liebovitch and Thot model. Our results show that the entropy and power spectrum in the natural time domain analysis are consistent with previously reported results.
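
    The entropy in natural time referred to above has a standard definition: S = <chi ln chi> - <chi> ln <chi>, where chi_k = k/N is the natural time of the k-th event and the averages are weighted by p_k = Q_k / sum(Q). The sketch below assumes that definition and uses the durations of the dichotomous pulses as the event "energies" Q_k, which is an assumption about how the signal is reduced to events.

```python
import numpy as np

def natural_time_entropy(pulse_energies):
    """Entropy in natural time for a series of N events, with chi_k = k/N and
    weights p_k = Q_k / sum(Q): S = <chi ln chi> - <chi> ln <chi>."""
    q = np.asarray(pulse_energies, dtype=float)
    n = len(q)
    p = q / q.sum()
    chi = np.arange(1, n + 1) / n
    mean_chi = np.sum(p * chi)
    return np.sum(p * chi * np.log(chi)) - mean_chi * np.log(mean_chi)

# for a uniform ("shuffled") series the entropy approaches
# S_u = (ln 2)/2 - 1/4 ~ 0.0966 as N grows
print(natural_time_entropy(np.ones(1000)))
```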

  1. Comparing entropy with tests for randomness as a measure of complexity in time series

    CERN Document Server

    Gan, Chee Chun

    2015-01-01

    Entropy measures have become increasingly popular as an evaluation metric for complexity in the analysis of time series data, especially in physiology and medicine. Entropy measures the rate of information gain, or degree of regularity in a time series e.g. heartbeat. Ideally, entropy should be able to quantify the complexity of any underlying structure in the series, as well as determine if the variation arises from a random process. Unfortunately current entropy measures mostly are unable to perform the latter differentiation. Thus, a high entropy score indicates a random or chaotic series, whereas a low score indicates a high degree of regularity. This leads to the observation that current entropy measures are equivalent to evaluating how random a series is, or conversely the degree of regularity in a time series. This raises the possibility that existing tests for randomness, such as the runs test or permutation test, may have similar utility in diagnosing certain conditions. This paper compares various t...
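
    One of the classical randomness tests alluded to above is the Wald-Wolfowitz runs test; a minimal implementation on the above/below-median dichotomisation of a series is sketched below so that its verdict can be compared with an entropy score. The dichotomisation rule is an assumption of this sketch.

```python
import numpy as np
from math import erf, sqrt

def runs_test_pvalue(series):
    """Wald-Wolfowitz runs test for randomness, applied to the
    above/below-median dichotomisation of a series (two-sided p-value
    from the normal approximation)."""
    x = np.asarray(series, dtype=float)
    signs = x > np.median(x)
    n1, n2 = int(signs.sum()), int((~signs).sum())
    runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
    mean = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1.0))
    z = (runs - mean) / sqrt(var)
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

print(runs_test_pvalue(np.random.default_rng(0).standard_normal(500)))
```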

  2. PERAMALAN INDEKS HARGA SAHAM GABUNGAN (IHSG DENGAN METODE FUZZY TIME SERIES MARKOV CHAIN

    Directory of Open Access Journals (Sweden)

    Y Aristyani

    2016-04-01

    Full Text Available The purpose of this research was to determine the accuracy of the Fuzzy Time Series Markov Chain method for forecasting the Jakarta Composite Index (IHSG/JCI) and to build a JCI forecasting application using MATLAB software. In this research, the data were sourced from Yahoo Finance; historical Composite Index (JCI) data were taken for the period January 2010 to February 2014. By transforming the JCI time series into fuzzy logic groups to determine the transition probability matrix, the forecasting results can be obtained. The initial phase of building the application was system design; the JCI forecasting application was designed using the MATLAB GUI with appropriate coding so that the application can run. After system testing, the MSE obtained for the Fuzzy Time Series Markov Chain method was 9827.1292 and the MSE for the Fuzzy Time Series S&C method was 15769.7036. Because it achieves the smaller MSE, the Fuzzy Time Series Markov Chain method is more accurate and performs better for forecasting. The application achieves a forecasting accuracy of 98.03458% with the Fuzzy Time Series Markov Chain method and 97.38003% with the Fuzzy Time Series S&C method.

  3. A Radial Basis Function Approach to Financial Time Series Analysis

    Science.gov (United States)

    1993-12-01

    primarily to the work of William Sharpe (1964), John Lintner (1965), and Jan Mossin (1966), is one of a number of models that grew out of Modern... Crutchfield and MacNamara (1987) introduced a general method for estimating the "equations of motion" (i.e. a model of the time behavior) of a data... John Lintner, The valuation of risk assets and the selection of risky investments in stock portfolios and capital budgets, Review of Economics and...

  4. How long will the traffic flow time series keep efficacious to forecast the future?

    Science.gov (United States)

    Yuan, PengCheng; Lin, XuXun

    2017-02-01

    This paper investigates how long a historical traffic flow time series remains efficacious for forecasting the future. In this framework, we first collect traffic flow time series data with different granularities. Then, using the modified rescaled range analysis method, we analyze the long memory property of the traffic flow time series by computing the Hurst exponent. We calculate the long-term memory cycle and test its significance, and we also compare it with the result of the maximum Lyapunov exponent method. Our results show that both the freeway traffic flow time series and the ground way traffic flow time series demonstrate a positively correlated trend (i.e., a long-term memory property), and both of their memory cycles are about 30 h. We consider this study useful for short-term and long-term traffic flow prediction and management.
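
    A basic (unmodified) rescaled range estimate of the Hurst exponent can be sketched as follows; the study uses a modified R/S variant and a significance test for the memory cycle, neither of which is reproduced here.

```python
import numpy as np

def hurst_rs(series, min_window=16):
    """Hurst exponent via rescaled range analysis: for doubling window sizes n,
    average R/S over non-overlapping windows and fit log(R/S) ~ H log(n).
    H > 0.5 indicates long-term memory (persistence)."""
    x = np.asarray(series, dtype=float)
    sizes, rs_values = [], []
    n = min_window
    while n <= len(x) // 2:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())        # cumulative deviations
            r = dev.max() - dev.min()            # range
            s = w.std(ddof=1)                    # standard deviation
            if s > 0:
                rs.append(r / s)
        sizes.append(n)
        rs_values.append(np.mean(rs))
        n *= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs_values), 1)
    return h

print(hurst_rs(np.cumsum(np.random.default_rng(0).standard_normal(4096))))
```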

  5. Estimating the level of dynamical noise in time series by using fractal dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Sase, Takumi, E-mail: sase@sat.t.u-tokyo.ac.jp [Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 153-8505 (Japan); Ramírez, Jonatán Peña [CONACYT Research Fellow, Center for Scientific Research and Higher Education at Ensenada (CICESE), Carretera Ensenada-Tijuana No. 3918, Zona Playitas, C.P. 22860, Ensenada, Baja California (Mexico); Kitajo, Keiichi [BSI-Toyota Collaboration Center, RIKEN Brain Science Institute, Wako, Saitama 351-0198 (Japan); Aihara, Kazuyuki; Hirata, Yoshito [Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 153-8505 (Japan); Institute of Industrial Science, The University of Tokyo, Tokyo 153-8505 (Japan)

    2016-03-11

    We present a method for estimating the dynamical noise level of a ‘short’ time series even if the dynamical system is unknown. The proposed method estimates the level of dynamical noise by calculating the fractal dimensions of the time series. Additionally, the method is applied to EEG data to demonstrate its possible effectiveness as an indicator of temporal changes in the level of dynamical noise. - Highlights: • A dynamical noise level estimator for time series is proposed. • The estimator does not need any information about the dynamics generating the time series. • The estimator is based on a novel definition of time series dimension (TSD). • It is demonstrated that there exists a monotonic relationship between the TSD and the level of dynamical noise. • We apply the proposed method to human electroencephalographic data.

  6. The parametric modified limited penetrable visibility graph for constructing complex networks from time series

    Science.gov (United States)

    Li, Xiuming; Sun, Mei; Gao, Cuixia; Han, Dun; Wang, Minggang

    2018-02-01

    This paper presents the parametric modified limited penetrable visibility graph (PMLPVG) algorithm for constructing complex networks from time series. We modify the penetrable visibility criterion of limited penetrable visibility graph (LPVG) in order to improve the rationality of the original penetrable visibility and preserve the dynamic characteristics of the time series. The addition of view angle provides a new approach to characterize the dynamic structure of the time series that is invisible in the previous algorithm. The reliability of the PMLPVG algorithm is verified by applying it to three types of artificial data as well as the actual data of natural gas prices in different regions. The empirical results indicate that PMLPVG algorithm can distinguish the different time series from each other. Meanwhile, the analysis results of natural gas prices data using PMLPVG are consistent with the detrended fluctuation analysis (DFA). The results imply that the PMLPVG algorithm may be a reasonable and significant tool for identifying various time series in different fields.
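
    For reference, the sketch below builds the basic natural visibility graph from which LPVG and PMLPVG are derived; the penetrable-visibility distance and the view-angle parameter added by the paper are not implemented here.

```python
import numpy as np

def natural_visibility_edges(y):
    """Edges of the basic natural visibility graph of a time series: nodes
    a < b are connected if every intermediate sample c satisfies
    y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

print(natural_visibility_edges([1.0, 3.0, 2.0, 4.0, 1.5]))
# [(0, 1), (1, 2), (1, 3), (2, 3), (3, 4)]
```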

  7. Permutation entropy analysis based on Gini-Simpson index for financial time series

    Science.gov (United States)

    Jiang, Jun; Shang, Pengjian; Zhang, Zuoquan; Li, Xuemei

    2017-11-01

    In this paper, a new coefficient is proposed with the objective of quantifying the level of complexity of financial time series. To study complexity measures from the viewpoint of entropy, we propose a new permutation entropy based on the Gini-Simpson index (GPE). The logistic map is used to simulate time series in order to show the accuracy of the GPE method, and the results on the simulated series demonstrate the strong robustness of GPE. We also compare the effect of different orders of GPE. We then apply it to the US, European and Chinese stock markets in order to reveal the inner mechanism hidden in the original financial time series. Comparing the results across stock indexes shows clear relationships between the different stock markets. By studying the complexity features and properties of financial time series, the method can provide valuable information for understanding the inner mechanism of financial markets.
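
    A plausible reading of the GPE coefficient is to replace the Shannon sum in ordinary permutation entropy with the Gini-Simpson index 1 - sum(p_i^2) over the ordinal-pattern probabilities; the sketch below implements that reading, which may differ from the paper's exact normalisation.

```python
import numpy as np
from itertools import permutations

def gini_simpson_permutation_entropy(series, order=3, delay=1):
    """Ordinal-pattern probabilities combined with the Gini-Simpson index
    1 - sum(p_i^2) (an assumed reading of the 'GPE' coefficient)."""
    x = np.asarray(series, dtype=float)
    patterns = {p: 0 for p in permutations(range(order))}
    n_vectors = len(x) - (order - 1) * delay
    for i in range(n_vectors):
        window = x[i:i + order * delay:delay]
        patterns[tuple(np.argsort(window))] += 1
    p = np.array(list(patterns.values())) / n_vectors
    return 1.0 - np.sum(p ** 2)

# logistic map example, as used in the abstract to check the measure
x = [0.4]
for _ in range(2000):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
print(gini_simpson_permutation_entropy(x, order=3))
```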

  8. Formulating and testing a method for perturbing precipitation time series to reflect anticipated climatic changes

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Georgiadis, Stylianos; Gregersen, Ida Bülow

    2017-01-01

    Urban water infrastructure has very long planning horizons, and planning is thus very dependent on reliable estimates of the impacts of climate change. Many urban water systems are designed using time series with a high temporal resolution. To assess the impact of climate change on these systems......, similarly high-resolution precipitation time series for future climate are necessary. Climate models cannot at their current resolutions provide these time series at the relevant scales. Known methods for stochastic downscaling of climate change to urban hydrological scales have known shortcomings...... in constructing realistic climate-changed precipitation time series at the sub-hourly scale. In the present study we present a deterministic methodology to perturb historical precipitation time series at the minute scale to reflect non-linear expectations to climate change. The methodology shows good skill...

  9. SINOMA - a better tool for proxy based reconstructions?

    Science.gov (United States)

    Buras, Allan; Thees, Barnim; Czymzik, Markus; Dräger, Nadine; Kienel, Ulrike; Neugebauer, Ina; Ott, Florian; Scharnweber, Tobias; Simard, Sonia; Slowinski, Michal; Slowinski, Sandra; Tecklenburg, Christina; Zawiska, Izabela; Wilmking, Martin

    2014-05-01

    than on its mathematical background which we intend to present in another contribution to this EGU session (Thees et al., 2014). On average, SINOMA performs better than or, under specific error noise conditions, equal to the traditional modeling techniques. However, some of the investigated data reveal constraints of SINOMA, which have to be considered in 'real-world' applications. Nevertheless, our results indicate that SINOMA likely is a more reliable tool for estimating regression parameters if compared to traditional techniques. Based on the generally noisy character of typically used proxies, applications of SINOMA to already existing reconstructions will probably result in different model parameter estimates, most likely leading to differing amplitudes of reconstructed past environmental conditions. Therefore, SINOMA has the potential to reframe our picture of the past. References: Kutzbach L, Thees B, and Wilmking M (2011): Identification of linear relationships from noisy data using errors-in-variables models - relevance for reconstruction of past climate from tree-ring and other proxy information, Climatic Change, 105, 155-177. Thees B, Kutzbach L, Wilmking M, and Zorita E (2009): Ein Bewertungsmaß für die amplitudentreue regressive Abbildung von verrauschten Daten im Rahmen einer iterativen "Errors in Variables"-Modellierung (EVM), GKSS-report 2009/8. Thees B, Buras A, Jetschke G, Zorita E, Wilmking M, and Kutzbach L: The Sequential Iterative Noise Matching Algorithm: A new statistical approach for the unbiased estimation of linear relationships between noisy serial data streams and their respective error variances. Submitted. Thees B, Buras A, Jetschke G, Kutzbach L, Zorita E, and Wilmking M, 2014: SINOMA - A new iterative statistical approach for the identification of linear relationships between noisy time series. Abstract submitted to EGU-session CL 6.1.

  10. Molecular proxies for paleoclimatology

    Science.gov (United States)

    Eglinton, Timothy I.; Eglinton, Geoffrey

    2008-10-01

    We summarize the applications of molecular proxies in paleoclimatology. Marine molecular records especially are proving to be of value but certain environmentally persistent compounds can also be measured in lake sediments, loess deposits and ice cores. The fundamentals of this approach are the molecular parameters, the compound abundances and carbon, hydrogen, nitrogen and oxygen isotopic contents which can be derived by the analysis of sediment extracts. These afford proxy measures which can be interpreted in terms of the conditions which control climate and also reflect its operation. We discuss two types of proxy; those of terrigenous and those of aquatic origin, and exemplify their application in the study of marine sediments through the medium of ten case studies based in the Atlantic, Mediterranean and Pacific Oceans, and in Antarctica. The studies are mainly for periods in the present, the Holocene and particularly the last glacial/interglacial, but they also include one study from the Cretaceous. The terrigenous proxies, which are measures of continental vegetation, are based on higher plant leaf wax compounds, i.e. long-chain (circa C30) hydrocarbons, alcohols and acids. They register the relative contributions of C3 vs. C4 type plants to the vegetation in the source areas. The two marine proxies are measures of sea surface temperatures (SST). The longer established one (U37K') is based on the relative abundances of C37 alkenones photosynthesized by unicellular algae, members of the Haptophyta. The newest proxy (TEX86) is based on C86 glycerol tetraethers (GDGTs) synthesized in the water column by some of the archaeal microbiota, the Crenarchaeota.
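
    As a worked example of how such proxies are turned into numbers, the sketch below computes the alkenone unsaturation index U37K' and converts it to SST with the widely used global core-top calibration of Mueller et al. (1998); the review surveys the proxies in general and does not prescribe this particular calibration, so treat the equation as one published option.

```python
def uk37_prime(c37_2, c37_3):
    """Alkenone unsaturation index U37K' = C37:2 / (C37:2 + C37:3), from the
    abundances of the di- and tri-unsaturated C37 alkenones."""
    return c37_2 / (c37_2 + c37_3)

def sst_from_uk37(uk37):
    """Sea surface temperature from U37K' using the global core-top
    calibration U37K' = 0.033*T + 0.044 (Mueller et al., 1998); one of
    several published calibrations."""
    return (uk37 - 0.044) / 0.033

print(sst_from_uk37(uk37_prime(c37_2=80.0, c37_3=20.0)))  # about 22.9 degC
```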

  11. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    Science.gov (United States)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Normally, the status of land cover is inherently dynamic and changes continuously over time. However, disturbances or abnormal changes of land cover, caused for example by forest fire, flood, deforestation, and plant disease, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label the detection results as "Change/No change", while few focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling the historical time series data based on Breaks For Additive Season and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite imagery with an estimation error of less than 5% and an overall accuracy of up to 90%.
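
    A minimal sketch of the confidence-interval idea in step (3) is given below: a season-plus-trend model (here a single harmonic, standing in for the BFAST model) is fitted to the historical series, new observations are compared with prediction intervals, and the reliability is reported as the highest confidence level at which the observation is still flagged. The model form and the reliability rule are assumptions of this sketch, not the authors' exact procedure.

```python
import numpy as np

def disturbance_reliability(t_hist, y_hist, t_new, y_new, period=365.25):
    """Flag new observations falling outside prediction intervals of a
    trend + single-harmonic model fitted to the history; return, for each new
    observation, the highest confidence level at which it is flagged."""
    def design(t):
        t = np.asarray(t, dtype=float)
        w = 2 * np.pi * t / period
        return np.column_stack([np.ones(len(t)), t, np.sin(w), np.cos(w)])

    X, y = design(t_hist), np.asarray(y_hist, dtype=float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma = np.std(y - X @ beta, ddof=X.shape[1])

    # z-scores for 90 %, 95 % and 99 % two-sided prediction intervals
    levels = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}
    pred = design(t_new) @ beta
    reliability = []
    for obs, fit in zip(np.asarray(y_new, dtype=float), pred):
        flagged = [lvl for lvl, z in levels.items() if abs(obs - fit) > z * sigma]
        reliability.append(max(flagged) if flagged else None)
    return reliability
```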

  12. DTW4Omics: comparing patterns in biological time series.

    Directory of Open Access Journals (Sweden)

    Rachel Cavill

    Full Text Available When studying time courses of biological measurements and comparing these to other measurements, e.g. gene expression and phenotypic endpoints, the analysis is complicated by the fact that although the associated elements may show the same patterns of behaviour, the changes do not occur simultaneously. In these cases standard correlation-based measures of similarity will fail to find significant associations. Dynamic time warping (DTW) is a technique which can be used in these situations to find the optimal match between two time courses, which may then be assessed for its significance. We implement DTW4Omics, a tool for performing DTW in R. This tool extends existing R scripts for DTW, making them applicable to "omics" datasets where thousands of entities may need to be compared with a range of markers and endpoints. It includes facilities to estimate the significance of the matches between the supplied data, and provides a set of plots to enable the user to easily visualise the output. We illustrate the utility of this approach using a dataset linking the exposure of the colon carcinoma Caco-2 cell line to oxidative stress by hydrogen peroxide (H2O2) and menadione across 9 timepoints, and show that on average 85% of the genes found are not obtained from a standard correlation analysis between the genes and the measured phenotypic endpoints. We then show that when we analyse the genes identified by DTW4Omics as significantly associated with a marker for oxidative DNA damage (8-oxodG) through over-representation, an Oxidative Stress pathway is identified as the most over-represented pathway, demonstrating that the genes found by DTW4Omics are biologically relevant. In contrast, when the positively correlated genes were similarly analysed, no pathways were found. The tool is implemented as an R Package and is available, along with a user guide, from http://web.tgx.unimaas.nl/svn/public/dtw/.
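
    For readers unfamiliar with DTW itself, a textbook implementation of the DTW distance between two time courses is sketched below; DTW4Omics wraps an existing R implementation and adds significance estimation and plotting, which this sketch does not reproduce.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two time courses
    (absolute-difference local cost, full warping window)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# two shifted but similar time courses still match closely under DTW
gene = [0, 1, 3, 4, 3, 1, 0]
endpoint = [0, 0, 1, 3, 4, 3, 1]
print(dtw_distance(gene, endpoint))
```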

  13. West Africa land use land cover time series

    Science.gov (United States)

    Tappan, G. Gray; Cushing, W. Matthew; Cotillon, Suzanne E.; Mathis, Melissa L.; Hutchinson, John A.; Dalsted, K. J.

    2016-01-01

    The West Africa Land Use Dynamics Project provides AGRHYMET and its 17 participating countries with a comprehensive two-kilometer (2-km) resolution land use land cover (LULC) dataset of the region for three time periods: 1975, 2000, and 2013. Hundreds of Landsat images were visually interpreted to develop a 2-km LULC dataset for each of the three time periods. To assist in validating the interpretations, thousands of aerial photographs and high-resolution satellite images were used. Starting from the initial datasets produced by the national teams, the U.S. Geological Survey (USGS) conducted an independent, detailed review of the interpretations. In concurrence with the respective country teams, the data have been revised to produce an accurate and consistent LULC assessment within the countries and their transboundary areas. The West Africa Land Use Dynamics Project represents an effort to document and quantify, in both time and space, the environmental and land resource trends across West Africa. The project was carried out through the AGRHYMET Regional Center in Niamey, Niger, in partnership with the 17 participating countries, the Sahel Institute (INSAH), and the USGS Earth Resources Observation and Science (EROS) Center, with major support from the U.S. Agency for International Development (USAID) West Africa Regional Program. The overarching goal of the project is to promote awareness of natural resource trends, and the use of spatial information about them, among national and regional decision-makers. For a complete description of the project, visit https://eros.usgs.gov/westafrica

  14. Alcohol Messages in Prime-Time Television Series.

    Science.gov (United States)

    Russell, Cristel Antonia; Russell, Dale W

    2009-01-01

    Alcohol messages contained in television programming serve as sources of information about drinking. To better understand the ways embedded messages about alcohol are communicated, it is crucial to objectively monitor and analyze television alcohol depictions. This article presents a content analysis of an eight-week sample of eighteen prime-time programs. Alcohol messages were coded based on modalities of presentation, level of plot connection, and valence. The analysis reveals that mixed messages about alcohol often coexist but the ways in which they are presented differ: whereas negative messages are tied to the plot and communicated verbally, positive messages are associated with subtle visual portrayals.

  15. Time series modelling of the Kobe-Osaka earthquake recordings

    Directory of Open Access Journals (Sweden)

    N. Singh

    2002-01-01

    generated by an earthquake. With a view to comparing these two types of waveforms, Singh (1992) developed a technique for identifying a model in the time domain. Fortunately, this technique has been found useful in modelling the recordings of the killer earthquake that occurred in the Kobe-Osaka region of Japan at 5.46 am on 17 January 1995. The aim of the present study is to show how well the model identification method developed by Singh (1992) can be used to describe the vibrations of the above-mentioned earthquake as recorded at Charters Towers in Queensland, Australia.

  16. Non-linear forecasting in high-frequency financial time series

    Science.gov (United States)

    Strozzi, F.; Zaldívar, J. M.

    2005-08-01

    A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis which states that no profitable information about future movements can be obtained by studying the past prices series. In our (off-line) analysis positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
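
    The core of the state space reconstruction mentioned above is a time-delay (Takens) embedding; a minimal version is sketched below, while the trading rule built on top of the reconstructed states is not reproduced here.

```python
import numpy as np

def delay_embed(series, dimension=3, delay=1):
    """Time-delay embedding: each row is the reconstructed state vector
    (x[t], x[t + delay], ..., x[t + (dimension - 1) * delay])."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (dimension - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dimension)])

# example: embed a short exchange-rate-like series in 3 dimensions
print(delay_embed(np.sin(np.linspace(0, 10, 20)), dimension=3, delay=2).shape)  # (16, 3)
```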

  17. Ocean wavenumber estimation from wave-resolving time series imagery

    Science.gov (United States)

    Plant, N.G.; Holland, K.T.; Haller, M.C.

    2008-01-01

    We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that the cross-spectral correlation fitting improves resolution by a factor of about ten times as compared to the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals: short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). © 2008 IEEE.

  18. Single Object & Time Series Spectroscopy with JWST NIRCam

    Science.gov (United States)

    Greene, Tom; Schlawin, Everett A.

    2017-01-01

    JWST will enable high signal-to-noise spectroscopic observations of the atmospheres of transiting planets with high sensitivity at wavelengths that are inaccessible with HST or other existing facilities. We plan to exploit this by measuring abundances, chemical compositions, cloud properties, and temperature-pressure parameters of a set of mostly warm (T ~ 600-1200 K) and low mass (14-200 Earth mass) planets in our guaranteed time program. These planets are expected to have significant molecular absorptions of H2O, CH4, CO2, CO, and other molecules that are key for determining these parameters and illuminating how and where the planets formed. We describe how we will use the NIRCam grisms to observe slitless transmission and emission spectra of these planets over 2.4 - 5.0 microns wavelength and how well these observations can measure our desired parameters. This will include how we set integration times, exposure parameters, and obtain simultaneous shorter wavelength images to track telescope pointing and stellar variability. We will illustrate this with specific examples showing model spectra, simulated observations, expected information retrieval results, completed Astronomer's Proposal Tools observing templates, target visibility, and other considerations.

  19. Diurnal Differences in OLR Climatologies and Anomaly Time Series

    Science.gov (United States)

    Susskind, Joel; Lee, Jae N.; Iredell, Lena; Loeb, Norm

    2015-01-01

    AIRS (Atmospheric Infrared Sounder) Version-6 OLR (Outgoing Long-Wave Radiation) matches CERES (Clouds and the Earth's Radiant Energy System) Edition-2.8 OLR very closely on a 1x1 latitude x longitude scale, both with regard to absolute values and with regard to anomalies of OLR. There is a bias of 3.5 watts per meter squared, which is nearly constant in both time and space. Contiguous areas containing large positive or negative OLR differences between AIRS and CERES are where the day-night difference of OLR is large. For AIRS, the larger the diurnal cycle, the more likely it is that sampling twice a day is inadequate. Lower values of OLRclr (Clear Sky OLR) and LWCRF (Longwave Cloud Radiative Forcing) in AIRS compared to CERES are at least in part a result of AIRS sampling over cold and cloudy cases.

  20. Optimal trading strategies—a time series approach

    Science.gov (United States)

    Bebbington, Peter A.; Kühn, Reimer

    2016-05-01

    Motivated by recent advances in the spectral theory of auto-covariance matrices, we are led to revisit a reformulation of Markowitz’ mean-variance portfolio optimization approach in the time domain. In its simplest incarnation it applies to a single traded asset and allows an optimal trading strategy to be found which—for a given return—is minimally exposed to market price fluctuations. The model is initially investigated for a range of synthetic price processes, taken to be either second order stationary, or to exhibit second order stationary increments. Attention is paid to consequences of estimating auto-covariance matrices from small finite samples, and auto-covariance matrix cleaning strategies to mitigate against these are investigated. Finally we apply our framework to real world data.
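
    A stripped-down version of the constrained quadratic optimisation behind this reformulation is sketched below: holdings w minimise w'Cw subject to a target return, with C an auto-covariance matrix of price increments over the trading horizon. This is a generic Lagrange solution under assumed inputs, not the paper's full treatment of estimation from small samples or covariance cleaning.

```python
import numpy as np

def minimum_variance_weights(auto_cov, expected_increments, target_return):
    """Minimise w' C w subject to w' mu = target; the Lagrange solution is
    w = target * C^{-1} mu / (mu' C^{-1} mu)."""
    c_inv_mu = np.linalg.solve(auto_cov, expected_increments)
    return target_return * c_inv_mu / (expected_increments @ c_inv_mu)

# toy usage with an AR(1)-like auto-covariance structure over 5 periods
lags = np.abs(np.subtract.outer(np.arange(5), np.arange(5)))
C = 0.8 ** lags
mu = np.full(5, 0.01)
print(minimum_variance_weights(C, mu, target_return=0.02))
```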