WorldWideScience

Sample records for climate time series

  1. Forecasting the underlying potential governing climatic time series

    CERN Document Server

    Livina, V N; Mudelsee, M; Lenton, T M

    2012-01-01

    We introduce a technique of time series analysis, potential forecasting, which is based on dynamical propagation of the probability density of time series. We employ polynomial coefficients of the orthogonal approximation of the empirical probability distribution and extrapolate them in order to forecast the future probability distribution of data. The method is tested on artificial data, used for hindcasting observed climate data, and then applied to forecast Arctic sea-ice time series. The proposed methodology completes a framework for 'potential analysis' of climatic tipping points which, taken together, serves to anticipate, detect and forecast climate transitions and bifurcations using several independent techniques of time series analysis.
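
    A minimal Python sketch of the general idea described above, not the authors' implementation: the empirical density is estimated in successive windows, approximated by polynomial coefficients (a Chebyshev basis is assumed here), and those coefficients are extrapolated to obtain a forecast density. Window length, polynomial degree and linear extrapolation are all illustrative assumptions.

```python
# Illustrative sketch only (not the authors' code): approximate the windowed
# empirical density by Chebyshev coefficients and extrapolate them forward.
import numpy as np

def window_density_coeffs(x, edges, deg=4):
    """Histogram density of one window, approximated by a Chebyshev fit."""
    hist, _ = np.histogram(x, bins=edges, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return np.polynomial.chebyshev.chebfit(centers, hist, deg)

def forecast_density(series, n_windows=8, window=200, deg=4, steps_ahead=1):
    edges = np.linspace(series.min(), series.max(), 31)
    coeffs = np.array([
        window_density_coeffs(series[i * window:(i + 1) * window], edges, deg)
        for i in range(n_windows)
    ])
    t = np.arange(n_windows)
    # Extrapolate each coefficient linearly to the forecast window.
    future = np.array([np.polyval(np.polyfit(t, c, 1), n_windows - 1 + steps_ahead)
                       for c in coeffs.T])
    centers = 0.5 * (edges[:-1] + edges[1:])
    pdf = np.clip(np.polynomial.chebyshev.chebval(centers, future), 0, None)
    return centers, pdf / (pdf.sum() * (centers[1] - centers[0]))  # renormalise

# Synthetic series whose distribution drifts slowly toward larger values.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.1 * k, 1.0, 200) for k in range(8)])
centers, pdf = forecast_density(x)
```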

  2. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  3. A unified nonlinear stochastic time series analysis for climate science

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S.

    2017-01-01

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability. PMID:28287128

  4. A unified nonlinear stochastic time series analysis for climate science

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S.

    2017-03-01

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.
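
    A hedged sketch of the flavour of this decomposition, not the authors' method: for each calendar month the month-to-month change of the anomaly is regressed on the anomaly itself, giving a crude "stability" coefficient, and the residual spread gives a noise amplitude. The toy seasonal AR(1)-like process and all parameter choices are assumptions for illustration.

```python
# Hedged sketch (not the authors' code): month-by-month estimates of a linear
# "stability" coefficient and a noise amplitude from a monthly anomaly series.
import numpy as np

def monthly_stability_and_noise(anom):
    """anom: 1-D array of monthly anomalies, assumed to start in January."""
    stability, noise = np.full(12, np.nan), np.full(12, np.nan)
    for m in range(12):
        x_m = anom[m::12]                     # anomaly in calendar month m
        nxt = anom[m + 1::12]                 # anomaly in the following month
        n = min(len(x_m), len(nxt))
        x, dx = x_m[:n], nxt[:n] - x_m[:n]    # month-to-month change
        a, b = np.polyfit(x, dx, 1)           # dx ~ a*x + b : a < 0 is stabilising
        stability[m] = a
        noise[m] = np.std(dx - (a * x + b), ddof=2)
    return stability, noise

# Toy seasonal AR(1)-like process with a seasonally varying drift coefficient.
rng = np.random.default_rng(1)
x = np.zeros(600)
for t in range(599):
    a_t = -0.5 + 0.4 * np.cos(2 * np.pi * t / 12)
    x[t + 1] = x[t] + a_t * x[t] + 0.3 * rng.normal()
stab, sig = monthly_stability_and_noise(x - x.mean())
```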

  5. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  6. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  7. Formulating and testing a method for perturbing precipitation time series to reflect anticipated climatic changes

    DEFF Research Database (Denmark)

    Sørup, Hjalte Jomo Danielsen; Georgiadis, Stylianos; Gregersen, Ida Bülow;

    2017-01-01

    Urban water infrastructure has very long planning horizons, and planning is thus very dependent on reliable estimates of the impacts of climate change. Many urban water systems are designed using time series with a high temporal resolution. To assess the impact of climate change on these systems, similarly high-resolution precipitation time series for future climate are necessary. Climate models cannot at their current resolutions provide these time series at the relevant scales. Known methods for stochastic downscaling of climate change to urban hydrological scales have known shortcomings in constructing realistic climate-changed precipitation time series at the sub-hourly scale. In the present study we present a deterministic methodology to perturb historical precipitation time series at the minute scale to reflect non-linear expectations to climate change. The methodology shows good skill...

  8. Formulating and testing a method for perturbing precipitation time series to reflect anticipated climatic changes

    Science.gov (United States)

    Jomo Danielsen Sørup, Hjalte; Georgiadis, Stylianos; Bülow Gregersen, Ida; Arnbjerg-Nielsen, Karsten

    2017-01-01

    Urban water infrastructure has very long planning horizons, and planning is thus very dependent on reliable estimates of the impacts of climate change. Many urban water systems are designed using time series with a high temporal resolution. To assess the impact of climate change on these systems, similarly high-resolution precipitation time series for future climate are necessary. Climate models cannot at their current resolutions provide these time series at the relevant scales. Known methods for stochastic downscaling of climate change to urban hydrological scales have known shortcomings in constructing realistic climate-changed precipitation time series at the sub-hourly scale. In the present study we present a deterministic methodology to perturb historical precipitation time series at the minute scale to reflect non-linear expectations to climate change. The methodology shows good skill in meeting the expectations to climate change in extremes at the event scale when evaluated at different timescales from the minute to the daily scale. The methodology also shows good skill with respect to representing expected changes of seasonal precipitation. The methodology is very robust against the actual magnitude of the expected changes as well as the direction of the changes (increase or decrease), even for situations where the extremes are increasing for seasons that in general should have a decreasing trend in precipitation. The methodology can provide planners with valuable time series representing future climate that can be used as input to urban hydrological models and give better estimates of climate change impacts on these systems.
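
    The following is an illustrative sketch only, not the published methodology: a 1-minute rain series is split into events and each event is scaled by a multiplicative factor that grows with event intensity, standing in for the IDF-based, return-period-dependent change factors described above. The dry-gap threshold, the factor range and the severity weighting are invented for the example.

```python
# Illustrative sketch only: event-based multiplicative perturbation of a
# 1-minute rain series, with a change factor that grows with event intensity.
import numpy as np

def split_events(rain, dry_gap=60):
    """Split a 1-minute depth series into events separated by >= dry_gap dry minutes."""
    events, start, dry = [], None, 0
    for i, r in enumerate(rain):
        if r > 0:
            if start is None:
                start = i
            dry = 0
        elif start is not None:
            dry += 1
            if dry >= dry_gap:
                events.append((start, i - dry + 1))   # exclusive end after last wet minute
                start, dry = None, 0
    if start is not None:
        events.append((start, len(rain)))
    return events

def perturb(rain, events, cf_small=1.02, cf_large=1.20, ref_peak=2.0):
    """Scale each event; the factor interpolates from cf_small to cf_large with peak intensity."""
    out = rain.copy()
    for s, e in events:
        w = min(rain[s:e].max() / ref_peak, 1.0)      # crude severity weight in [0, 1]
        out[s:e] *= cf_small + w * (cf_large - cf_small)
    return out

rng = np.random.default_rng(2)
rain = np.where(rng.random(10_000) < 0.02, rng.exponential(0.5, 10_000), 0.0)
future = perturb(rain, split_events(rain))
```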

  9. Evaluating the uncertainty of predicting future climate time series at the hourly time scale

    Science.gov (United States)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2011-12-01

    A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion, which reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods 2000-2009, 2046-2065 and 2081-2100, using the 1962-1992 period as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period of 2000-2009 are tested against observations, permitting assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
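
    A minimal sketch of the factor-of-change step under stated assumptions: factors are sampled from assumed distributions and applied to a handful of invented observed statistics. In the actual methodology the distributions come from a Bayesian weighting of GCM realizations and the perturbed statistics re-parameterize the AWE-GEN weather generator, none of which is reproduced here.

```python
# Sketch under stated assumptions: sample factors of change and apply them to a
# few invented observed statistics (additive for temperature, multiplicative for
# precipitation), producing an ensemble of plausible "future" statistics.
import numpy as np

rng = np.random.default_rng(3)

observed_stats = {"mean_T_jul": 24.5, "P_mean_jul": 30.0, "P_var_jul": 90.0}

# Hypothetical factor-of-change distributions: (type, mean, standard deviation).
foc_dist = {
    "mean_T_jul": ("additive", 2.1, 0.6),
    "P_mean_jul": ("multiplicative", 0.92, 0.08),
    "P_var_jul": ("multiplicative", 1.10, 0.15),
}

def sample_future_stats(n_members=100):
    ensemble = []
    for _ in range(n_members):
        member = {}
        for key, (kind, mu, sd) in foc_dist.items():
            f = rng.normal(mu, sd)
            if kind == "additive":
                member[key] = observed_stats[key] + f
            else:
                member[key] = observed_stats[key] * max(f, 0.0)
        ensemble.append(member)
    return ensemble

future_ensemble = sample_future_stats()   # would feed a weather generator downstream
```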

  10. Long-term ERP time series as indicators for global climate variability and climate change

    Science.gov (United States)

    Lehmann, E.; Grötzsch, A.; Ulbrich, U.; Leckebusch, G. C.; Nevir, P.; Thomas, M.

    2009-04-01

    This study assesses whether variations in observed Earth orientation parameters (EOPs, IERS) such as length-of-day (LOD EOP C04) and polar motion (PM EOP C04) can be applied as climate indicators. Data analyses suggest that observed EOPs are differently affected by parameters associated with the atmosphere and ocean. On interannual time scales the varying ocean-atmosphere effects on EOPs are particularly pronounced during episodes of the coupled ocean-atmosphere phenomenon El Niño-Southern Oscillation (ENSO). Observed ENSO anomalies of spatial patterns of parameters affected by atmosphere and ocean (climate indices and sea surface temperatures) are related to LOD and PM variability and associated with possible physical background processes. Present-day analyses (1962-2000) indicate that the main source of the varying ENSO signal on observed LOD can be associated with anomalies of the relative angular momentum (AAM) related to variations in location and strength of jet streams of the upper troposphere. While on interannual time scales observed LOD and AAM are highly correlated (r = 0.75), results suggest that strong El Niño events affect the observed LOD-AAM relation to differing degrees (explained variance 71%-98%). Accordingly, the relation between AAM and ocean sea surface temperatures (SST) in the NIÑO 3.4 region differs (explained variances 15%-73%). Corresponding analysis is conducted on modelled EOPs (ERA40 reanalysis, ECHAM5-OM1) to obtain Earth rotation parameters undisturbed by core-mantle activities, and to study rotational variations under climate variability and change. A total of 91 strong El Niño events are analysed in coupled ocean-atmosphere ECHAM5-OM1 scenarios concerning the 20th century (20C), climate warming (A1B) and pre-industrial climate variability. Analyses on a total of 61 strong El Niño events covering a time period of 505 simulation years under pre-industrial climate conditions indicate a range of El Niño events with a strong or

  11. Detrending phenological time series improves climate-phenology analyses and reveals evidence of plasticity.

    Science.gov (United States)

    Iler, Amy M; Inouye, David W; Schmidt, Niels M; Høye, Toke T

    2017-03-01

    Time series have played a critical role in documenting how phenology responds to climate change. However, regressing phenological responses against climatic predictors involves the risk of finding potentially spurious climate-phenology relationships simply because both variables also change across years. Detrending by year is a way to address this issue. Additionally, detrending isolates interannual variation in phenology and climate, so that detrended climate-phenology relationships can represent statistical evidence of phenotypic plasticity. Using two flowering phenology time series from Colorado, USA and Greenland, we detrend flowering date and two climate predictors known to be important in these ecosystems: temperature and snowmelt date. In Colorado, all climate-phenology relationships persist after detrending. In Greenland, 75% of the temperature-phenology relationships disappear after detrending (three of four species). At both sites, the relationships that persist after detrending suggest that plasticity is a major component of sensitivity of flowering phenology to climate. Finally, simulations that created different strengths of correlations among year, climate, and phenology provide broader support for our two empirical case studies. This study highlights the utility of detrending to determine whether phenology is related to a climate variable in observational data sets. Applying this as a best practice will increase our understanding of phenological responses to climatic variation and change.
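
    A minimal sketch of detrending by year before a climate-phenology regression, assuming linear detrending; data and variable names are synthetic placeholders.

```python
# Minimal sketch: detrend phenology and the climate predictor by year, then
# relate the residuals; data and variable names are synthetic placeholders.
import numpy as np
from scipy import stats

def detrended_relationship(year, flowering_doy, climate_var):
    resid_pheno = flowering_doy - np.polyval(np.polyfit(year, flowering_doy, 1), year)
    resid_clim = climate_var - np.polyval(np.polyfit(year, climate_var, 1), year)
    slope, intercept, r, p, se = stats.linregress(resid_clim, resid_pheno)
    return slope, r, p   # phenological change per unit climate, after detrending

rng = np.random.default_rng(4)
year = np.arange(1975, 2015)
temp = 0.03 * (year - 1975) + rng.normal(0, 0.5, year.size)       # warming trend + noise
doy = 160 - 4.0 * (temp - temp.mean()) + rng.normal(0, 3, year.size)
print(detrended_relationship(year, doy, temp))
```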

  12. The Evolutionary Modeling and Short-range Climatic Prediction for Meteorological Element Time Series

    Institute of Scientific and Technical Information of China (English)

    YU Kangqing; ZHOU Yuehua; YANG Jing'an; KANG Zhuo

    2005-01-01

    The time series of precipitation in flood season (May-September) at Wuhan Station, which is taken as an example of the kind of time series with chaotic characteristics, is split into two parts: one includes macro climatic timescale period waves that are affected by some relatively steady climatic factors such as astronomical factors (sunspots, etc.) and some other known and/or unknown factors, and the other includes micro climatic timescale period waves superimposed on the macro one. The evolutionary modeling (EM), which develops from genetic programming (GP), is supposed to be adept at simulating the former part because it creates a nonlinear ordinary differential equation (NODE) based upon the data series. The natural fractals (NF) are used to simulate the latter part. The final prediction is the sum of results from both methods, thus the model can reflect multi-time scale effects of forcing factors in the climate system. The results of this example for 2002 and 2003 are satisfactory for climatic prediction operation. The NODE can describe how the data vary with time, which is helpful for short-range climatic analysis and prediction. Comparison in principle between evolutionary modeling and linear modeling indicates that the evolutionary one is a better way to simulate complex time series with nonlinear characteristics.

  13. A comparison of two methods for detecting abrupt changes in the variance of climatic time series

    CERN Document Server

    Rodionov, Sergei

    2016-01-01

    Two methods for detecting abrupt shifts in the variance, Integrated Cumulative Sum of Squares (ICSS) and Sequential Regime Shift Detector (SRSD), have been compared on both synthetic and observed time series. In Monte Carlo experiments, SRSD outperformed ICSS in the overwhelming majority of the modelled scenarios with different sequences of variance regimes. The SRSD advantage was particularly apparent in the case of outliers in the series. When tested on climatic time series, in most cases both methods detected the same change points in the longer series (252-787 monthly values). The only exception was the Arctic Ocean SST series, when ICSS found one extra change point that appeared to be spurious. As for the shorter time series (66-136 yearly values), ICSS failed to detect any change points even when the variance doubled or tripled from one regime to another. For these time series, SRSD is recommended. Interestingly, all the climatic time series tested, from the Arctic to the Tropics, had one thing in commo...
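
    For reference, a sketch of the centred cumulative-sum-of-squares statistic on which ICSS-type variance-shift tests are built; the 1.358 value quoted in the code is the commonly cited asymptotic 95% critical level and should be treated as indicative. This is not the SRSD algorithm.

```python
# Sketch of the centred cumulative-sum-of-squares statistic for a single
# variance shift (Inclan-Tiao style); not the SRSD algorithm.
import numpy as np

def variance_shift(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    T = x.size
    C = np.cumsum(x ** 2)
    D = C / C[-1] - np.arange(1, T + 1) / T
    k_star = int(np.argmax(np.abs(D))) + 1          # candidate change point
    stat = np.sqrt(T / 2.0) * np.max(np.abs(D))
    return k_star, stat, stat > 1.358               # 1.358: quoted asymptotic 95% level

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0, 1, 150), rng.normal(0, 2, 150)])
print(variance_shift(x))   # change point expected near index 150
```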

  14. Perturbing high-resolution precipitation time series to represent future climates

    Science.gov (United States)

    Jomo Danielsen Sørup, Hjalte; Arnbjerg-Nielsen, Karsten

    2016-04-01

    Climate change impacts water management worldwide as the water cycle is embedded in the climate system. For urban infrastructure, the time resolution of precipitation data needed for design and planning (minutes) is much finer than what is normally provided by climate models (hourly to daily). Thus, a lot of effort is put into giving reliable estimates of what the expected change in precipitation will be at these fine scales. The relevant urban design criteria span from the minute scale up to the yearly water balance scale, and time series that show realistic changes across these scales and all those in-between are needed. Generally, fine resolution precipitation time series for future climates do not exist and a multitude of statistical approaches exist to try to overcome this problem. RCM outputs must be downscaled to higher spatial and temporal resolution to meet these needs. This is often done by applying weather generators or scaling of model output statistics. Both of these methods have known shortcomings in generating representative time series at the sub-hourly to hourly time scales. In the present study we utilize 1) that we have high resolution precipitation for the present climate in the form of observational data, and 2) that we have robust estimates of how precipitation will change due to climate change for all temporal scales. The latter is quantified through change factors which are available for yearly and seasonal precipitation as well as for short term extreme events for a range of return periods. We demonstrate a novel methodology where the regional knowledge about expected changes in precipitation through the use of Intensity-Frequency-Duration (IDF) relationships is used to non-linearly perturb existing precipitation time series at 1-minute resolution to reflect complex expectations to a future changed climate. The methodology processes the precipitation time series at the event level where individual change factors are calculated based on the actual IDF

  15. Time Series

    OpenAIRE

    Gil-Alana, L.A.; Moreno, A; Pérez-de-Gracia, F. (Fernando)

    2011-01-01

    The last 20 years have witnessed a considerable increase in the use of time series techniques in econometrics. The articles in this important set have been chosen to illustrate the main themes in time series work as it relates to econometrics. The editor has written a new concise introduction to accompany the articles. Sections covered include: Ad Hoc Forecasting Procedures, ARIMA Modelling, Structural Time Series Models, Unit Roots, Detrending and Non-stationarity, Seasonality, Seasonal Adju...

  16. Modeling climate change impacts on combined sewer overflow using synthetic precipitation time series.

    Science.gov (United States)

    Bendel, David; Beck, Ferdinand; Dittmer, Ulrich

    2013-01-01

    In the presented study climate change impacts on combined sewer overflows (CSOs) in Baden-Wuerttemberg, Southern Germany, were assessed based on continuous long-term rainfall-runoff simulations. As input data, synthetic rainfall time series were used. The applied precipitation generator NiedSim-Klima accounts for climate change effects on precipitation patterns. Time series for the past (1961-1990) and future (2041-2050) were generated for various locations. Comparing the simulated CSO activity of both periods we observe significantly higher overflow frequencies for the future. Changes in overflow volume and overflow duration depend on the type of overflow structure. Both values will increase at simple CSO structures that merely divide the flow, whereas they will decrease when the CSO structure is combined with a storage tank. However, there is a wide variation between the results of different precipitation time series (representative for different locations).

  17. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.;

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  18. Unraveling multiple changes in complex climate time series using Bayesian inference

    Science.gov (United States)

    Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias

    2016-04-01

    Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic patterns of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are composed into a proxy probability for a posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that is capable of indicating multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to environmental time series (about 100 a) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from the ODP sites 659, 721/722 and 967 interpreted as climate indicators of the African region of the Plio-Pleistocene period (about 5 Ma). The detailed inference unravels multiple transitions underlying the indirect climate observations coinciding with established
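
    A drastically simplified illustration of Bayesian change-point location, assuming a single change in the mean, a known noise level and flat priors; the kernel-based, multi-transition inference described above is far more general.

```python
# Drastically simplified illustration: grid posterior for a single change in the
# mean of a noisy series, with known noise level and flat priors.
import numpy as np

def change_point_posterior(y, sigma=1.0):
    T = y.size
    log_like = np.full(T, -np.inf)
    for tau in range(5, T - 5):                     # keep both segments non-trivial
        r1 = y[:tau] - y[:tau].mean()
        r2 = y[tau:] - y[tau:].mean()
        log_like[tau] = -(np.sum(r1 ** 2) + np.sum(r2 ** 2)) / (2.0 * sigma ** 2)
    post = np.exp(log_like - log_like.max())        # -inf entries become exactly zero
    return post / post.sum()

rng = np.random.default_rng(6)
y = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.2, 1.0, 80)])
post = change_point_posterior(y)
print(int(np.argmax(post)))                         # most probable change location (~120)
```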

  19. Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    Science.gov (United States)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2013-04-01

    This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.

  20. Simulation of an ensemble of future climate time series with an hourly weather generator

    Science.gov (United States)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.

    2010-12-01

    There is evidence that climate change is occurring in many regions of the world. The necessity of climate change predictions at the local scale and fine temporal resolution is thus warranted for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCM) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to entirely account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes, to the low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered as representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ

  1. Processing and analysis of Global snow cover time series for climate change assessment

    OpenAIRE

    2014-01-01

    Remote sensing data offer the opportunity to detect terrestrial snow cover in high temporal and spatial resolution. Such information is essential for various applications – ranging from small scale predictions of runoff or floods, ground water recharge and hydro power generation to large scale planetary processes connected to climate change. The processing of globally available time series of remote sensing data constitutes a challenging task due to the huge data volume and computational dema...

  2. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    Science.gov (United States)

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, the study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly cases of malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from Regional Meteorological Centre, Delhi. Expert modeler of SPSS ver. 21 was used for analyzing the time series data. Results. Autoregressive integrated moving average, ARIMA (0,1,1)(0,1,0)12, was the best fit model and it could explain 72.5% variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors for malaria transmission in the study area. Seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA models of time series analysis are a simple and reliable tool for producing forecasts of malaria in Delhi, India.
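
    A hedged sketch of the reported model class, a seasonal ARIMA(0,1,1)(0,1,0)12 with climatic regressors, fitted with statsmodels; the monthly data below are synthetic placeholders, not the Delhi surveillance series.

```python
# Hedged sketch of a seasonal ARIMA(0,1,1)(0,1,0)12 with climatic regressors,
# fitted with statsmodels on synthetic placeholder data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
idx = pd.date_range("2006-01-01", periods=96, freq="MS")
monsoon = np.clip(np.sin(2 * np.pi * (idx.month.values - 4) / 12), 0, None)
rain = pd.Series(50 + 80 * monsoon + rng.normal(0, 10, 96), index=idx, name="rain")
humidity = pd.Series(55 + 20 * monsoon + rng.normal(0, 3, 96), index=idx, name="humidity")
cases = pd.Series(np.round(20 + 0.3 * rain.values + 0.5 * humidity.values
                           + rng.normal(0, 8, 96)), index=idx)

exog = pd.concat([rain, humidity], axis=1)
res = SARIMAX(cases, exog=exog, order=(0, 1, 1), seasonal_order=(0, 1, 0, 12)).fit(disp=False)

# Forecast 12 months ahead, recycling the last year of covariates as a stand-in
# for assumed future climate.
future_exog = exog.iloc[-12:].copy()
future_exog.index = pd.date_range("2014-01-01", periods=12, freq="MS")
print(res.forecast(steps=12, exog=future_exog).round(1))
```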

  3. Forecasting Malaria Cases Using Climatic Factors in Delhi, India: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Varun Kumar

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, the study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly cases of malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from Regional Meteorological Centre, Delhi. Expert modeler of SPSS ver. 21 was used for analyzing the time series data. Results. Autoregressive integrated moving average, ARIMA (0,1,1)(0,1,0)12, was the best fit model and it could explain 72.5% variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors for malaria transmission in the study area. Seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA models of time series analysis are a simple and reliable tool for producing forecasts of malaria in Delhi, India.

  4. Statistical downscaling of meteorological time series and climatic projections in a watershed in Turkey

    Science.gov (United States)

    Göncü, S.; Albek, E.

    2016-10-01

    In this study, meteorological time series from five meteorological stations in and around a watershed in Turkey were used in the statistical downscaling of global climate model results to be used for future projections. Two general circulation models (GCMs), Canadian Climate Center (CGCM3.1(T63)) and Met Office Hadley Centre (2012) (HadCM3) models, were used with three Special Report Emission Scenarios, A1B, A2, and B2. The statistical downscaling model SDSM was used for the downscaling. The downscaled ensembles were validated with GCM predictors against observations using nonparametric statistical tests. The two most important meteorological variables, temperature and precipitation, passed validation statistics, and partial validation was achieved with other time series relevant in hydrological studies, namely, cloudiness, relative humidity, and wind velocity. Heat waves, number of dry days, length of dry and wet spells, and maximum precipitation were derived from the primary time series as annual series. The change in monthly predictor sets used in constructing the multiple regression equations for downscaling was examined over the watershed and over the months in a year. Projections between 1962 and 2100 showed that temperatures and dryness indicators show increasing trends while precipitation, relative humidity, and cloudiness tend to decrease. The spatial changes over the watershed and monthly temporal changes revealed that the western parts of the watershed where water is produced for subsequent downstream use will get drier than the rest and the precipitation distribution over the year will shift. Temperatures showed increasing trends over the whole watershed, unparalleled by any other period in history. The results emphasize the necessity of mitigation efforts to combat climate change on local and global scales and the introduction of adaptation strategies for the region under study, which was shown to be vulnerable to climate change.

  5. Cascade-based disaggregation of continuous rainfall time series: the influence of climate

    Directory of Open Access Journals (Sweden)

    A. Güntner

    2001-01-01

    Rainfall data of high temporal resolution are required in a multitude of hydrological applications. In the present paper, a temporal rainfall disaggregation model is applied to convert daily time series into an hourly resolution. The model is based on the principles of random multiplicative cascade processes. Its parameters are dependent on (1) the volume and (2) the position in the rainfall sequence of the time interval with rainfall to be disaggregated. The aim is to compare parameters and performance of the model between two contrasting climates with different rainfall generating mechanisms, a semi-arid tropical (Brazil) and a temperate (United Kingdom) climate. In the range of time scales studied, the scale-invariant assumptions of the model are approximately equally well fulfilled for both climates. The model parameters differ distinctly between climates, reflecting the dominance of convective processes in the Brazilian rainfall and of advective processes associated with frontal passages in the British rainfall. In the British case, the parameters exhibit a slight seasonal variation consistent with the higher frequency of convection during summer. When applied for disaggregation, the model reproduces a range of hourly rainfall characteristics with a high accuracy in both climates. However, the overall model performance is somewhat better for the semi-arid tropical rainfall. In particular, extreme rainfall in the UK is overestimated whereas extreme rainfall in Brazil is well reproduced. Transferability of parameters in time is associated with larger uncertainty in the semi-arid climate due to its higher interannual variability and lower percentage of rainy intervals. For parameter transferability in space, no restrictions are found between the Brazilian stations whereas in the UK regional differences are more pronounced. The overall high accuracy of disaggregated data supports the potential usefulness of the model in hydrological applications.
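
    A generic illustration of one multiplicative random cascade disaggregation step with dyadic branching and probabilistic 0/1 splitting; it is not the exact daily-to-hourly branching scheme or the volume- and position-dependent parameterisation used in the paper.

```python
# Generic illustration of a multiplicative random cascade disaggregation step
# (dyadic branching, probabilistic 0/1 splitting); parameters are invented.
import numpy as np

rng = np.random.default_rng(8)

def cascade_split(volume, p_left=0.25, p_right=0.25):
    """Split one interval's volume onto its two halves."""
    u = rng.random()
    if u < p_left:                     # all rain falls in the left half
        return volume, 0.0
    if u < p_left + p_right:           # all rain falls in the right half
        return 0.0, volume
    w = rng.beta(2.0, 2.0)             # otherwise divide by a random weight
    return w * volume, (1.0 - w) * volume

def disaggregate(volume, levels=4):
    """Disaggregate one coarse value into 2**levels sub-interval values."""
    values = [volume]
    for _ in range(levels):
        nxt = []
        for v in values:
            nxt.extend(cascade_split(v) if v > 0 else (0.0, 0.0))
        values = nxt
    return np.array(values)

daily_total = 18.0                                  # mm on one rainy day
sub_values = disaggregate(daily_total, levels=4)    # 16 sub-intervals
assert np.isclose(sub_values.sum(), daily_total)    # mass is conserved
```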

  6. Describing temporal variability of the mean Estonian precipitation series in climate time scale

    Science.gov (United States)

    Post, P.; Kärner, O.

    2009-04-01

    Applicability of random walk type models to represent the temporal variability of various atmospheric temperature series has been successfully demonstrated recently (e.g. Kärner, 2002). The main problem in temperature modeling is connected to the scale break in the generally self-similar air temperature anomaly series (Kärner, 2005). The break separates short-range strong non-stationarity from the nearly stationary longer-range variability region. This is an indication of the fact that several geophysical time series show a short-range non-stationary behaviour and a stationary behaviour in the longer range (Davis et al., 1996). In order to model series like that, the choice of time step appears to be crucial. To characterize the long-range variability we can neglect the short-range non-stationary fluctuations, provided that we are able to model properly the long-range tendencies. The structure function (Monin and Yaglom, 1975) was used to determine an approximate segregation line between the short and the long scale in terms of modeling. The longer scale can be called the climate scale, because such models are applicable on scales over some decades. In order to get rid of the short-range fluctuations in daily series, the variability can be examined using a sufficiently long time step. In the present paper, we show that the same philosophy is useful for finding a model to represent the climate-scale temporal variability of the Estonian daily mean precipitation amount series over 45 years (1961-2005). Temporal variability of the obtained daily time series is examined by means of an autoregressive and integrated moving average (ARIMA) family model of the type (0,1,1). This model is applicable for simulating daily precipitation provided that an appropriate time step is selected, one that enables us to neglect the short-range non-stationary fluctuations. A considerably longer time step than one day (30 days) is used in the current paper to model the precipitation time series variability. Each ARIMA (0

  7. Constructing the reduced dynamical models of interannual climate variability from spatial-distributed time series

    Science.gov (United States)

    Mukhin, Dmitry; Gavrilov, Andrey; Loskutov, Evgeny; Feigin, Alexander

    2016-04-01

    We suggest a method for empirical forecast of climate dynamics based on the reconstruction of reduced dynamical models in the form of random dynamical systems [1,2] derived from observational time series. The construction of a proper embedding - the set of variables determining the phase space the model works in - is no doubt the most important step in such modeling, but this task is non-trivial due to the huge dimension of time series of typical climatic fields. Actually, an appropriate expansion of observational time series is needed, yielding the number of principal components considered as phase variables, which are to be efficient for the construction of a low-dimensional evolution operator. We emphasize two main features the reduced models should have for capturing the main dynamical properties of the system: (i) taking into account time-lagged teleconnections in the atmosphere-ocean system and (ii) reflecting the nonlinear nature of these teleconnections. In accordance with these principles, in this report we present a methodology which combines a new way of constructing an embedding by spatio-temporal data expansion with nonlinear model construction on the basis of artificial neural networks. The methodology is applied to NCEP/NCAR reanalysis data including fields of sea level pressure, geopotential height, and wind speed, covering the Northern Hemisphere. Its efficiency for the interannual forecast of various climate phenomena, including ENSO, PDO, NAO and strong blocking conditions over the mid-latitudes, is demonstrated. Also, we investigate the ability of the models to reproduce and predict the evolution of qualitative features of the dynamics, such as spectral peaks, critical transitions and statistics of extremes. This research was supported by the Government of the Russian Federation (Agreement No. 14.Z50.31.0033 with the Institute of Applied Physics RAS) [1] Y. I. Molkov, E. M. Loskutov, D. N. Mukhin, and A. M. Feigin, "Random

  8. A love story about forest drought detection: the relationship between MODIS data and Climate time series.

    Science.gov (United States)

    Domingo, Cristina; Ninyerola, Miquel; Pons, Xavier; Cristóbal, Jordi

    2015-04-01

    The scientific community recognizes drought as an important phenomenon with important implications for many Social Benefit Areas (SBA) that GEOSS addresses and whose impacts need to be managed and assessed through policy decisions. The traditional assessment of drought has often been based on both precipitation shortages and differences between actual and potential evapotranspiration, among others. During the last fifteen years, new advances in drought indices, integrating time-scales and effortless computing, have resulted in many drought indices such as the Standardized Precipitation Evapotranspiration Index (SPEI). The SPEI uses precipitation data and potential evapotranspiration to emphasize climatic anomalies along different time frames. However, a non-traditional point of view based not only on climatic variables but also on biological data is evaluated here as an encouraging tool for drought detection analysis. Therefore, the real physiological state of the vegetation will be introduced as a new variable required in order to understand the vulnerabilities of forest ecosystems to drought, considering the existing time lag between meteorological events and biological responses. Earth Observation satellites provide the research community with large volumes of imagery which, processed as Vegetation Index (VI) time series such as the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), the Normalized Difference Water Index (NDWI), the Normalized Difference Drought Index (NDDI) and the Temperature Vegetation Dryness Index (TVDI), offer large possibilities for forest applications. This research is focused on the global effects of droughts on forests, given the invaluable ecosystem services they provide to society. In this study, remote sensing and climate data are used to characterize drought on forests, supporting the idea that SPEI and MODIS VI clearly respond to drought situations on forests. Results from the analysis of

  9. Climatic significance of δD time series in tree rings from Tianmu Mountain

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on cross-dated tree-ring ages from Tianmu Mountain, Zhejiang Province, the δD (δD = [(D/H)sample/(D/H)standard − 1] × 1000‰) of each tree ring's nitrocellulose was measured and an annual δD time series was established. Using meteorological data from the Tianmu Mountain Observatory, the responses of tree-ring δD to climatic factors were analyzed. The results suggest that the tree-ring δD time series correlates well with climatic conditions, primarily with precipitation of the second half of each year, average annual air temperature and average annual maximum air temperature. The maximum winter air temperature reconstructed from tree-ring δD is in good agreement with local instrumental data. The low-frequency variations of the reconstructed mean maximum winter air temperature at Tianmu Mountain are consistent with temperature change on a large spatial scale. Tianmu Mountain is located in a winter-monsoon-sensitive zone, thus the influence of winter temperature on tree growth is quite obvious. The results in this paper suggest that tree-ring δD is an effective proxy for winter temperature in non-limited regions.

  10. Climate Change Impacts and Vulnerabilities Assessment on Forest Vegetation Through Time-Series Multisensor Satellite Data

    Science.gov (United States)

    Zoran, Maria; Savastru, Dan; Dida, Adrian

    2016-08-01

    Sustaining forest resources in Romania requires a better understanding of forest ecosystem processes, and how management decisions and climate and anthropogenic change may affect these processes in the future. Spatio-temporal forest vegetation dynamics have been quantified as the total amount of vegetation (mean NDVI) and the seasonal difference (annual NDVI amplitude) by a time series analysis of NDVI and LAI satellite images over the 2000-2015 period for a forest ecosystem located in the north-eastern part of Bucharest, Romania, from MODIS Terra/Aqua, LANDSAT TM/ETM and Sentinel satellite and meteorological data. For the investigated test area, considerable NDVI decline was observed for drought events during the years 2003, 2007 and 2010. Under stress conditions, it is evident that environmental factors such as soil type, parent material, and topography are not correlated with NDVI dynamics. EO-based estimates of forest biophysical variables were shown to be similar to predictions derived from forest field inventories.

  11. A time-series analysis of the 20th century climate simulations produced for the IPCC's Fourth Assessment Report.

    Directory of Open Access Journals (Sweden)

    Francisco Estrada

    In this paper, evidence of anthropogenic influence over the warming of the 20th century is presented and the debate regarding the time-series properties of global temperatures is addressed in depth. The 20th century global temperature simulations produced for the Intergovernmental Panel on Climate Change's Fourth Assessment Report and a set of the radiative forcing series used to drive them are analyzed using modern econometric techniques. Results show that both temperatures and radiative forcing series share similar time-series properties and a common nonlinear secular movement. This long-term co-movement is characterized by the existence of time-ordered breaks in the slope of their trend functions. The evidence presented in this paper suggests that while natural forcing factors may help explain the warming of the first part of the century, anthropogenic forcing has been its main driver since the 1970s. In terms of Article 2 of the United Nations Framework Convention on Climate Change, significant anthropogenic interference with the climate system has already occurred and the current climate models are capable of accurately simulating the response of the climate system, even if it consists of a rapid or abrupt change, to changes in external forcing factors. This paper presents a new methodological approach for conducting time-series based attribution studies.

  12. Climatic signal from Pinus leucodermis axial resin ducts: a tree-ring time series approach

    OpenAIRE

    Antonio Saracino; Angelo Rita; Sergio Rossi; Laia Andreu-Hayles; G. Helle; Luigi Todaro

    2016-01-01

    Developing long-term chronologies of tree-ring anatomical features to evaluate climatic relationships within species might serve as an annual proxy to explore and elucidate the climatic drivers affecting xylem differentiation. Pinus leucodermis response to climate was examined by analyzing vertical xylem resin ducts in wood growing at high elevation in the Apennines of peninsular Southern Italy. Early- and latewood tree-ring resin duct chronologies, spanning the 1804–2010 time period, were co...

  13. Unravelling the community structure of the climate system by using lags and symbolic time-series analysis

    Science.gov (United States)

    Tirabassi, Giulio; Masoller, Cristina

    2016-07-01

    Many natural systems can be represented by complex networks of dynamical units with modular structure in the form of communities of densely interconnected nodes. Unraveling this community structure from observed data requires the development of appropriate tools, particularly when the nodes are embedded in a regular space grid and the datasets are short and noisy. Here we propose two methods to identify communities, and validate them with the analysis of climate datasets recorded at a regular grid of geographical locations covering the Earth surface. By identifying mutual lags among time-series recorded at different grid points, and by applying symbolic time-series analysis, we are able to extract meaningful regional communities, which can be interpreted in terms of large-scale climate phenomena. The methods proposed here are valuable tools for the study of other systems represented by networks of dynamical units, allowing the identification of communities, through time-series analysis of the observed output signals.
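
    A sketch of the general workflow, not the paper's exact procedure: build a network from lagged similarities between grid-point series and extract communities. The correlation threshold, the maximum lag and the use of modularity-based community detection are assumptions.

```python
# Sketch of the workflow: lagged similarities between grid-point series define a
# network, and communities are extracted from it. Threshold, maximum lag and the
# modularity-based community detection are assumptions.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def max_lagged_corr(x, y, max_lag=6):
    """Maximum absolute Pearson correlation over lags -max_lag..+max_lag."""
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:lag], y[-lag:]
        best = max(best, abs(np.corrcoef(a, b)[0, 1]))
    return best

def communities_from_series(series, threshold=0.5, max_lag=6):
    """series: (n_nodes, n_times) array; returns a list of node communities."""
    n = series.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if max_lagged_corr(series[i], series[j], max_lag) > threshold:
                G.add_edge(i, j)
    return [sorted(c) for c in greedy_modularity_communities(G)]

rng = np.random.default_rng(10)
s1, s2 = rng.normal(0, 1, 200), rng.normal(0, 1, 200)
series = np.vstack([s1 + rng.normal(0, 0.4, 200) for _ in range(5)] +
                   [s2 + rng.normal(0, 0.4, 200) for _ in range(5)])
print(communities_from_series(series))              # two groups of five expected
```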

  14. Constructing new satellite-only time series of global mean, sea surface temperature data for climate from ATSR data

    Science.gov (United States)

    Veal, Karen; Remedios, John; Ghent, Darren

    2013-04-01

    The Along Track Scanning Radiometers (ATSRs) have provided a near-continuous record of sea surface temperature (SST) data for climate from the launch of ATSR-1 in 1991 to the loss of the Advanced ATSR (AATSR) in April 2012. The intention was always to provide an SST record, independent of in situ data, to corroborate and improve climate data records in recent times. We show that the ATSR record provides a very suitable data set with which to study the recent climate record, particularly during the ATSR-2 and AATSR periods (1995 to 2012), in three major respects. First, ATSR climate time series achieve anomaly accuracies of better than 0.05 K (and high stability). Second, the overlap between instruments allows for excellent determination and removal of biases; between ATSR-2 and AATSR, these are less than 0.05 K for the highest accuracy SST data. Finally, uncertainties on global monthly mean data are less than 0.02 K and hence comparable to those achieved by in situ analyses such as HadSST3. A particular hallmark of the ATSR instruments was their exceptional design for accuracy, incorporating high accuracy radiometric calibration, dual-view of the Earth's surface and the use of three thermal emission channels; additional channels are included for cloud clearing in this context. The use of dual-view and multiple thermal wavelengths allows a number of combinations for retrievals of SST, the most accurate being the dual-view, three-channel retrieval (D3) at nighttime. This restriction is due to the use of the 3.7 micron channel which is sensitive to solar radiation during the day. Extensive work has recently resulted in major advances, yielding both an operational V2.0 SST product and a further improved ATSR Re-analysis for Climate (ARC) product, a particular feature of the latter being the development of a depth SST product in addition to the skin SST directly determined from satellite data. We will discuss the characteristics of these data sets in terms of

  15. Particulate matter time-series and Köppen-Geiger climate classes in North America and Europe

    Science.gov (United States)

    Pražnikar, Jure

    2017-02-01

    Four years of time-series data on the particulate matter (PM) concentrations from 801 monitoring stations located in Europe and 234 stations in North America were analyzed. Using k-means clustering with distance correlation as a measure for similarity, 5 distinct PM clusters in Europe and 9 clusters across the United States of America (USA) were found. This study shows that meteorology has an important role in controlling PM concentrations, as comparison between Köppen-Geiger climate zones and identified PM clusters revealed very good spatial overlapping. Moreover, the Köppen-Geiger boundaries in Europe show a high similarity to the boundaries as defined by PM clusters. The western USA is much more diverse regarding climate zones; this characteristic was confirmed by cluster analysis, as 6 clusters were identified in the west, and only 3 were identified on the eastern side of the USA. The lowest similarity between PM time-series in Europe was observed between the Iberian Peninsula and the north Europe clusters. These two regions also show considerable differences, as the cold semi-arid climate has a long and hot summer period, while the cool continental climate has a short summertime and long and cold winters. Additionally, intra-continental examination of European clusters showed meteorologically driven phenomena in autumn 2011 encompassing a large European region from Bulgaria in the south, Germany in central Europe and Finland in the north with high PM concentrations in November and a decline in December 2011. Inter-continental comparison between Europe and the USA clusters revealed a remarkable difference between the PM time-series located in humid continental zone. It seems that because of higher shortwave downwelling radiation (≈210 W m-2) over the USA's continental zone, and consequently more intense production of secondary aerosols, a summer peak in PM concentration was observed. On the other hand, Europe's humid continental climate region experiences
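
    A sketch of clustering time series with distance correlation as the similarity measure. Because standard k-means requires vector means, hierarchical clustering on a 1 − dCor dissimilarity matrix is substituted here; the data are synthetic.

```python
# Sketch: pairwise distance correlation between series, clustered hierarchically
# on a 1 - dCor dissimilarity (a substitute for the paper's k-means step).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def distance_correlation(x, y):
    """Biased sample distance correlation between two 1-D series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(max(dcov2, 0.0) / denom) if denom > 0 else 0.0

def cluster_series(series, n_clusters=5):
    """series: (n_stations, n_times) array; returns a cluster label per station."""
    n = series.shape[0]
    dissim = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dissim[i, j] = dissim[j, i] = 1.0 - distance_correlation(series[i], series[j])
    Z = linkage(squareform(dissim, checks=False), method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

rng = np.random.default_rng(9)
s1, s2 = rng.normal(0, 1, 365), rng.normal(0, 1, 365)
stations = np.vstack([s1 + rng.normal(0, 0.4, 365) for _ in range(10)] +
                     [s2 + rng.normal(0, 0.4, 365) for _ in range(10)])
print(cluster_series(stations, n_clusters=2))        # two groups of ten expected
```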

  16. Causality between time series

    CERN Document Server

    Liang, X San

    2014-01-01

    Given two time series, can one tell, in a rigorous and quantitative way, the cause and effect between them? Based on a recently rigorized physical notion, namely information flow, we arrive at a concise formula and give this challenging question, which is of wide concern in different disciplines, a positive answer. Here causality is measured by the time rate of change of information flowing from one series, say, X2, to another, X1. The measure is asymmetric between the two parties and, particularly, if the process underlying X1 does not depend on X2, then the resulting causality from X2 to X1 vanishes. The formula is tight in form, involving only the commonly used statistics, sample covariances. It has been validated with touchstone series purportedly generated with one-way causality. It has also been applied to the investigation of real world problems; an example presented here is the cause-effect relation between two climate modes, El Niño and the Indian Ocean Dipole, which have been linked to the hazards in f...
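
    A sketch of the covariance-based estimator, following the formula reported in Liang (2014) for the information flow rate from X2 to X1; the normalisation should be checked against the paper before serious use, and the one-way-coupled toy system is only for illustration.

```python
# Sketch of the covariance-based estimator of information flow from X2 to X1,
# following the formula reported in Liang (2014); verify against the paper.
import numpy as np

def liang_information_flow(x1, x2, dt=1.0):
    """Estimate the rate of information flowing from series x2 to series x1."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    dx1 = (x1[1:] - x1[:-1]) / dt            # Euler-forward time derivative of X1
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([x1, x2]))
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1 = np.cov(x1, dx1)[0, 1]             # Cov(X1, dX1/dt)
    c2d1 = np.cov(x2, dx1)[0, 1]             # Cov(X2, dX1/dt)
    num = c11 * c12 * c2d1 - c12 ** 2 * c1d1
    den = c11 ** 2 * c22 - c11 * c12 ** 2
    return num / den                          # near zero: X2 not causal to X1

# One-way coupled toy system: x2 drives x1, but not the reverse.
rng = np.random.default_rng(11)
n = 5000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(n - 1):
    x2[t + 1] = 0.7 * x2[t] + rng.normal()
    x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + rng.normal()
print(liang_information_flow(x1, x2))   # clearly non-zero (flow from x2 to x1)
print(liang_information_flow(x2, x1))   # near zero (no flow from x1 to x2)
```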

  17. Climate Variability and the Outbreaks of Cholera in Zanzibar, East Africa: A Time Series Analysis

    OpenAIRE

    Reyburn, Rita; Kim, Deok Ryun; Emch, Michael; Khatib, Ahmed; von Seidlein, Lorenz; Ali, Mohammad

    2011-01-01

    Global cholera incidence is increasing, particularly in sub-Saharan Africa. We examined the impact of climate and ocean environmental variability on cholera outbreaks, and developed a forecasting model for outbreaks in Zanzibar. Routine cholera surveillance reports between 1997 and 2006 were correlated with remotely and locally sensed environmental data. A seasonal autoregressive integrated moving average (SARIMA) model determined the impact of climate and environmental variability on cholera...

  18. Climate variability, weather and enteric disease incidence in New Zealand: time series analysis.

    Directory of Open Access Journals (Sweden)

    Aparna Lal

    BACKGROUND: Evaluating the influence of climate variability on enteric disease incidence may improve our ability to predict how climate change may affect these diseases. OBJECTIVES: To examine the associations between regional climate variability and enteric disease incidence in New Zealand. METHODS: Associations between monthly climate and enteric diseases (campylobacteriosis, salmonellosis, cryptosporidiosis, giardiasis) were investigated using Seasonal Auto Regressive Integrated Moving Average (SARIMA) models. RESULTS: No climatic factors were significantly associated with campylobacteriosis and giardiasis, with similar predictive power for univariate and multivariate models. Cryptosporidiosis was positively associated with average temperature of the previous month (β = 0.130, SE = 0.060, p < 0.01) and inversely related to the Southern Oscillation Index (SOI) two months previously (β = -0.008, SE = 0.004, p < 0.05). By contrast, salmonellosis was positively associated with temperature (β = 0.110, SE = 0.020, p < 0.001) of the current month and SOI of the current (β = 0.005, SE = 0.002, p < 0.05) and previous month (β = 0.005, SE = 0.002, p < 0.05). Forecasting accuracy of the multivariate models for cryptosporidiosis and salmonellosis was significantly higher. CONCLUSIONS: Although spatial heterogeneity in the observed patterns could not be assessed, these results suggest that temporally lagged relationships between climate variables and national communicable disease incidence data can contribute to disease prediction models and early warning systems.

  19. The enhanced greenhouse signal versus natural variations in observed climate time series: a statistical approach

    Energy Technology Data Exchange (ETDEWEB)

    Schoenwiese, C.D. [J.W. Goethe Univ., Frankfurt (Germany). Inst. for Meteorology and Geophysics

    1995-12-31

    It is a well-known fact that human activities lead to an increase in the atmospheric concentration of some IR-active trace gases (greenhouse gases, GHG) and that this influence enhances the 'greenhouse effect'. However, there are major quantitative and regional uncertainties in the related climate model projections, and the observational data reflect the whole complex of both anthropogenic and natural forcing of the climate system. This contribution aims to separate the enhanced anthropogenic greenhouse signal in observed global surface air temperature data from other forcings, using statistical methods such as multiple (multiforced) regressions and neural networks. The competing natural forcings considered are volcanic and solar activity, in addition to the ENSO (El Niño/Southern Oscillation) mechanism. The analysis will also be extended to the NAO (North Atlantic Oscillation) and to anthropogenic sulfate formation in the troposphere.

  20. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities...... of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers....

  1. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  2. Predictive Time Series Analysis Linking Bengal Cholera with Terrestrial Water Storage Measured from Gravity Recovery and Climate Experiment Sensors

    Science.gov (United States)

    Jutla, Antarpreet; Akanda, Ali; Unnikrishnan, Avinash; Huq, Anwar; Colwell, Rita

    2015-01-01

    Outbreaks of diarrheal diseases, including cholera, are related to floods and droughts in regions where water and sanitation infrastructure are inadequate or insufficient. However, availability of data on water scarcity and abundance in transnational basins is a prerequisite for developing cholera forecasting systems. With more than a decade of terrestrial water storage (TWS) data from the Gravity Recovery and Climate Experiment, conditions favorable for predicting cholera occurrence may now be determined. We explored lead–lag relationships between TWS in the Ganges–Brahmaputra–Meghna basin and endemic cholera in Bangladesh. Since bimodal seasonal peaks in cholera in Bangladesh occur during the spring and autumn seasons, two separate logistic models between TWS and the disease time series (2002–2010) were developed. TWS, representing water availability, showed an asymmetrical, strong association with cholera prevalence in the spring (τ = −0.53; P < 0.001) and autumn (τ = 0.45; P < 0.001) up to 6 months in advance. A one-unit (centimeter of water) decrease in water availability in the basin increased the odds of above-normal cholera by 24% (confidence interval [CI] = 20–31%; P < 0.05) in the spring, while an increase in regional water by 1 unit, through floods, increased the odds of above-average cholera in the autumn by 29% (CI = 22–33%; P < 0.05). PMID:26526921
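
    A lagged logistic-regression setup of the kind described above can be sketched as follows; the synthetic TWS series, the 6-month lag and the variable names are illustrative assumptions, not the study's data or exact model.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly series: terrestrial water storage anomaly (cm) and a
# binary indicator of above-normal cholera prevalence.
rng = np.random.default_rng(2)
n = 108
tws = rng.normal(0, 5, size=n)
above_normal = (rng.normal(size=n) - 0.1 * tws > 0).astype(int)

lag = 6                                   # TWS leads cholera by `lag` months
X = sm.add_constant(tws[:-lag])
y = above_normal[lag:]

fit = sm.Logit(y, X).fit(disp=False)
print(fit.summary())

# Odds ratio per 1 cm change in water storage, with its confidence interval.
print(np.exp(fit.params[1]), np.exp(fit.conf_int()[1]))
```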

  3. Assessment of Regional Vegetation Response to Climate Anomalies: A Case Study for Australia Using GIMMS NDVI Time Series between 1982 and 2006

    Directory of Open Access Journals (Sweden)

    Wanda De Keersmaecker

    2017-01-01

    Full Text Available Within the context of climate change, it is of utmost importance to quantify the stability of ecosystems with respect to climate anomalies. It is well acknowledged that ecosystem stability may change over time. As these temporal stability changes may provide a warning for increased vulnerability of the system, this study provides a methodology to quantify and assess these temporal changes in vegetation stability. Within this framework, vegetation stability changes were quantified over Australia from 1982 to 2006 using GIMMS NDVI and climate time series (i.e., the SPEI, Standardized Precipitation and Evaporation Index). Starting from a stability assessment on the complete time series, we aim to assess: (i) the magnitude and direction of stability changes; and (ii) the similarity in these changes for different stability metrics, i.e., the standard deviation of the NDVI anomaly (SD), the auto-correlation at lag one of the NDVI anomaly (AC) and the correlation of the NDVI anomaly with SPEI (CS). Results show high variability in magnitude and direction for the different stability metrics. Large areas and types of Australian vegetation showed an increase in variability (SD) over time; however, vegetation memory (AC) decreased. The association of NDVI anomalies with drought events (CS) showed a mixed response: the association increased in the western part, while it decreased in the eastern part. This methodology shows the potential for quantifying vegetation responses to major climate shifts and land use change, but results could be enhanced with higher resolution time series data.
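
    The three stability metrics mentioned above (SD, AC, CS) can be computed on moving windows with a few lines of pandas; this is a minimal sketch assuming monthly NDVI-anomaly and SPEI series, not the study's implementation.

```python
import numpy as np
import pandas as pd

def stability_metrics(ndvi_anom, spei, window=60):
    """Moving-window stability metrics of an NDVI anomaly series:
    standard deviation (SD), lag-1 autocorrelation (AC) and the
    correlation with a drought index (CS)."""
    sd = ndvi_anom.rolling(window).std()
    ac = ndvi_anom.rolling(window).apply(
        lambda w: pd.Series(w).autocorr(lag=1), raw=True)
    cs = ndvi_anom.rolling(window).corr(spei)
    return pd.DataFrame({"SD": sd, "AC": ac, "CS": cs})

# Toy monthly series standing in for NDVI anomalies and SPEI.
idx = pd.date_range("1982-01-01", periods=300, freq="MS")
rng = np.random.default_rng(3)
ndvi = pd.Series(rng.normal(size=300), index=idx)
spei = pd.Series(rng.normal(size=300), index=idx)
print(stability_metrics(ndvi, spei).dropna().head())
```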

  4. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  5. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    ""In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book.""-Journa

  6. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may...

  7. Climatic and ecological drivers of euphausiid community structure vary spatially in the Barents Sea: relationships from a long time series (1952-2009)

    Directory of Open Access Journals (Sweden)

    Emma Lvovna Orlova

    2015-01-01

    Full Text Available Euphausiids play an important role in transferring energy from ephemeral primary producers to fish, seabirds, and marine mammals in the Barents Sea ecosystem. Climatic impacts have been suggested to occur at all levels of the Barents Sea food-web, but adequate exploration of these phenomena on ecologically relevant spatial scales has not been integrated sufficiently. We used a time-series of euphausiid abundance data spanning 58 years, one of the longest biological time-series in the Arctic, to explore qualitative and quantitative relationships among climate, euphausiids, and their predators, and how these parameters vary spatially in the Barents Sea. We detected four main hydrographic regions, each with distinct patterns of interannual variability in euphausiid abundance and community structure. Assemblages varied primarily in the relative abundance of Thysanoessa inermis versus T. raschii, or T. inermis versus T. longicaudata and Meganyctiphanes norvegica. Climate proxies and the abundance of capelin or cod explained 30-60% of the variability in euphausiid abundance in each region. Climate also influenced patterns of variability in euphausiid community structure, but correlations were generally weaker. Advection of boreal euphausiid taxa from the Norwegian Sea is clearly more prominent in warmer years than in colder years, and interacts with seasonal fish migrations to help explain spatial differences in primary drivers of euphausiid community structure. Non-linear effects of predators were common, and must be considered more carefully if a mechanistic understanding of the ecosystem is to be achieved. Quantitative relationships among euphausiid abundance, climate proxies, and predator stock-sizes derived from these time series are valuable for ecological models being used to predict impacts of climate change on the Barents Sea ecosystem, and how the system should be managed.

  8. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  9. Time Series of Aerosol Column Optical Depth at the Barrow, Alaska, ARM Climate Research Facility for 2008 Fourth Quarter 2009 ARM and Climate Change Prediction Program Metric Report

    Energy Technology Data Exchange (ETDEWEB)

    C Flynn; AS Koontz; JH Mather

    2009-09-01

    The uncertainties in current estimates of anthropogenic radiative forcing are dominated by the effects of aerosols, both in relation to the direct absorption and scattering of radiation by aerosols and also with respect to aerosol-related changes in cloud formation, longevity, and microphysics (see Figure 1; Intergovernmental Panel on Climate Change, Assessment Report 4, 2008). Moreover, the Arctic region is especially sensitive to changes in climate, with the magnitude of temperature changes (both observed and predicted) being several times larger than global averages (Kaufman et al. 2009). Recent studies confirm that aerosol-cloud interactions in the Arctic generate climatologically significant radiative effects equivalent in magnitude to those of greenhouse gases (Lubin and Vogelmann 2006, 2007). The aerosol optical depth is the most immediate representation of the aerosol direct effect and is also important for consideration of aerosol-cloud interactions, and thus this quantity is essential for studies of aerosol radiative forcing.

  10. Identification of Extreme Events Under Climate Change Conditions Over Europe and The Northwest-atlantic Region: Spatial Patterns and Time Series Characteristics

    Science.gov (United States)

    Leckebusch, G.; Ulbrich, U.; Speth, P.

    In the context of climate change and its possible impacts on socio-economic conditions for human activities, more severe consequences are expected from a changed occurrence of extreme events than from changes in the mean climate. Extreme events such as floods, excessive heat and droughts, or windstorms affect human social and economic life in different categories such as forestry, agriculture, energy use, tourism and the reinsurance business. Windstorms account for nearly 70% of all insured damages over Europe; the December 1999 French windstorms alone caused damages of about 10 billion. A new EU-funded project (MICE = Modelling the Impact of Climate Extremes) will focus on these impacts caused by changed occurrences of extreme events over Europe. Based upon the output of general circulation models as well as regional climate models, investigations are carried out with regard to time series characteristics as well as the spatial patterns of extremes under changed climate conditions. After the definition of specific thresholds for climate extremes, this talk focuses on the results of the analysis for the different data sets (HadCM3 and CGCMII GCMs and RCMs, re-analyses, observations) with regard to windstorm events. First, the model outputs are validated against re-analyses and observations; in particular, a comparison of the storm track (2.5- to 8-day bandpass-filtered 500 hPa geopotential height), cyclone track, cyclone frequency and intensity is presented. Highly relevant to damages is the extreme wind near ground level, so the 10 m wind speed is investigated additionally. Of special interest for possible impacts is the changed spatial occurrence of wind speed maxima under 2xCO2-induced climate change.

  11. Time series analysis of dengue incidence in Guadeloupe, French West Indies: Forecasting models using climate variables as predictors

    Directory of Open Access Journals (Sweden)

    Ruche Guy

    2011-06-01

    better than humidity and rainfall. SARIMA models using climatic data as independent variables could be easily incorporated into an early (3-months-ahead) and reliable monitoring system of dengue outbreaks. This approach, which is practicable for a surveillance system, has public health implications, helping to predict dengue epidemics and therefore to implement prevention activities in a timely, appropriate and efficient manner.

  12. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users.
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS.
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, ground water studies.
    - ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused.
    - The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  13. Improving Intercomparability of Marine Biogeochemical Time Series

    Science.gov (United States)

    Benway, Heather M.; Telszewski, Maciej; Lorenzoni, Laura

    2013-04-01

    Shipboard biogeochemical time series represent one of the most valuable tools scientists have to quantify marine elemental fluxes and associated biogeochemical processes and to understand their links to changing climate. They provide the long, temporally resolved data sets needed to characterize ocean climate, biogeochemistry, and ecosystem variability and change. However, to monitor and differentiate natural cycles and human-driven changes in the global oceans, time series methodologies must be transparent and intercomparable when possible. To review current shipboard biogeochemical time series sampling and analytical methods, the International Ocean Carbon Coordination Project (IOCCP; http://www.ioccp.org/) and the Ocean Carbon and Biogeochemistry Program (http://www.us-ocb.org/) convened an international ocean time series workshop at the Bermuda Institute for Ocean Sciences.

  14. Relevance of long term time - Series of atmospheric parameters at a mountain observatory to models for climate change

    Science.gov (United States)

    Kancírová, M.; Kudela, K.; Erlykin, A. D.; Wolfendale, A. W.

    2016-10-01

    A detailed analysis has been made based on annual meteorological and cosmic ray data from the Lomnicky stit mountain observatory (LS, 2634 masl; 49.40°N, 20.22°E; vertical cut-off rigidity 3.85 GV), from the standpoint of looking for possible solar cycle (including cosmic ray) manifestations. A comparison of the mountain data with the Global average for the cloud cover in general shows no correlation but there is a possible small correlation for low clouds (LCC in the Global satellite data). However, whereas it cannot be claimed that cloud cover observed at Lomnicky stit (LSCC) can be used directly as a proxy for the Global LCC, its examination has value because it is an independent estimate of cloud cover and one that has a different altitude weighting to that adopted in the satellite-derived LCC. This statement is derived from satellite data (http://isccp.giss.nasa.gov/climanal7.html) which shows the time series for the period 1983-2010 for 9 cloud regimes. There is a significant correlation only between cosmic ray (CR) intensity (and sunspot number (SSN)) and the cloud cover of the types cirrus and stratus. This effect is mainly confined to the CR intensity minimum during the epoch around 1990, when the SSN was at its maximum. This fact, together with the present study of the correlation of LSCC with our measured CR intensity, shows that there is no firm evidence for a significant contribution of CR induced ionization to the local (or, indeed, Global) cloud cover. Pressure effects are the preferred cause of the cloud cover changes. A consequence is that there is no evidence favouring a contribution of CR to the Global Warming problem. Our analysis shows that the LS data are consistent with the Gas Laws for a stable mass of atmosphere.

  15. Detecting Climate Effects on Vegetation in Northern Mixed Prairie Using NOAA AVHRR 1-km Time-Series NDVI Data

    Directory of Open Access Journals (Sweden)

    Zhaoqin Li

    2012-01-01

    Full Text Available Grasslands hold varied grazing capacity, provide multiple habitats for diverse wildlife, and are a key component of carbon stock. Research has indicated that grasslands are experiencing effects related to recent climate trends. Understanding how grasslands respond to climate variation thus is essential. However, it is difficult to separate the effects of climate variation from grazing. This study aims to document vegetation condition under climate variation in Grasslands National Park (GNP) of Canada, a grassland ecosystem without grazing for over 20 years, using Normalized Difference Vegetation Index (NDVI) data to establish vegetation baselines. The main findings are (1) precipitation has more effect than temperature on vegetation; (2) the growing season of vegetation had an expanding trend indicated by earlier green-up and later senescence; (3) phenologically-tuned annual NDVI had an increasing trend from 1985 to 2007; and (4) the baselines of annual NDVI range from 0.13 to 0.32, and only the NDVI in 1999 is beyond the upper bound of the baseline. Our results indicate that vegetation phenology and condition have slightly changed in GNP since 1985, although vegetation condition in most years was still within the baselines.

  16. Climatic Factors and Community-Associated Methicillin-Resistant Staphylococcus aureus Skin and Soft-Tissue Infections: A Time-Series Analysis Study

    Directory of Open Access Journals (Sweden)

    Krushna Chandra Sahoo

    2014-08-01

    Full Text Available Skin and soft tissue infections caused by Staphylococcus aureus (SA-SSTIs), including methicillin-resistant Staphylococcus aureus (MRSA), have experienced a significant surge all over the world. Changing climatic factors are affecting the global burden of dermatological infections and there is a lack of information on the association between climatic factors and MRSA infections. Therefore, the association of temperature and relative humidity (RH) with the occurrence of SA-SSTIs (n = 387) and also MRSA (n = 251) was monitored for 18 months in the outpatient clinic at a tertiary care hospital located in Bhubaneswar, Odisha, India. The Kirby-Bauer disk diffusion method was used for antibiotic susceptibility testing. Time-series analysis was used to investigate the potential association of climatic factors (weekly averages of maximum temperature, minimum temperature and RH) with the weekly incidence of SA-SSTIs and MRSA infections. The analysis showed that a combination of weekly average maximum temperature above 33 °C coinciding with weekly average RH between 55% and 78% is most favorable for the occurrence of SA-SSTIs and MRSA, and within these parameters, each unit increase in occurrence of MRSA was associated with an increase in weekly average maximum temperature of 1.7 °C (p = 0.044) and a weekly average RH increase of 10% (p = 0.097).

  17. Detecting relationships between the interannual variability in climate records and ecological time series using a multivariate statistical approach - four case studies for the North Sea region

    Energy Technology Data Exchange (ETDEWEB)

    Heyen, H. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Gewaesserphysik

    1998-12-31

    A multivariate statistical approach is presented that allows a systematic search for relationships between the interannual variability in climate records and ecological time series. Statistical models are built between climatological predictor fields and the variables of interest. Relationships are sought on different temporal scales and for different seasons and time lags. The possibilities and limitations of this approach are discussed in four case studies dealing with salinity in the German Bight, abundance of zooplankton at Helgoland Roads, macrofauna communities off Norderney and the arrival of migratory birds on Helgoland. (orig.) [German abstract, translated] A statistical multivariate model is presented that allows a systematic search for potential relationships between variability in climate and ecological time series. Four application examples examine the climate influence on salinity in the German Bight, zooplankton off Helgoland, macrofauna off Norderney, and the arrival of migratory birds on Helgoland. (orig.)

  18. Effect of Climate Factors on the Childhood Pneumonia in Papua New Guinea: A Time-Series Analysis

    OpenAIRE

    Jinseob Kim; Jong-Hun Kim; Hae-Kwan Cheong; Ho Kim; Yasushi Honda; Mina Ha; Masahiro Hashizume; Joel Kolam; Kasis Inape

    2016-01-01

    This study aimed to assess the association between climate factors and the incidence of childhood pneumonia in Papua New Guinea quantitatively and to evaluate the variability of the effect size according to their geographic properties. The pneumonia incidence in children under five-year and meteorological factors were obtained from six areas, including monthly rainfall and the monthly average daily maximum temperatures during the period from 1997 to 2006 from national health surveillance data...

  19. Effect of Climate Factors on the Childhood Pneumonia in Papua New Guinea: A Time-Series Analysis.

    Science.gov (United States)

    Kim, Jinseob; Kim, Jong-Hun; Cheong, Hae-Kwan; Kim, Ho; Honda, Yasushi; Ha, Mina; Hashizume, Masahiro; Kolam, Joel; Inape, Kasis

    2016-02-15

    This study aimed to assess the association between climate factors and the incidence of childhood pneumonia in Papua New Guinea quantitatively and to evaluate the variability of the effect size according to geographic properties. The pneumonia incidence in children under five years of age and meteorological factors were obtained from six areas, including monthly rainfall and the monthly average of daily maximum temperatures during the period from 1997 to 2006, from national health surveillance data. A generalized linear model was applied to measure the effect size of local and regional climate factors. The pooled risk of pneumonia in children per every 10 mm increase of rainfall was 0.24% (95% confidence interval: -0.01%-0.50%), and the risk per every 1 °C increase of the monthly mean of the maximum daily temperatures was 4.88% (95% CI: 1.57-8.30). The Southern Oscillation Index and the Dipole Mode Index showed an overall negative effect on childhood pneumonia incidence, -0.57% and -4.30%, respectively, and the risk of pneumonia was higher in the dry season than in the rainy season (pooled effect: 12.08%). There was variability in the relationship between climate factors and pneumonia, which is assumed to reflect the distribution of the determinants of, and vulnerability to, pneumonia in the community.
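
    A generalized linear model of the kind described can be sketched as a Poisson regression of monthly counts on climate covariates; the synthetic data and coefficient values below are assumptions for illustration only, not the study's estimates.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly data for one site: pneumonia case counts, rainfall (mm)
# and the monthly mean of daily maximum temperature (deg C).
rng = np.random.default_rng(4)
n = 120
rain = rng.gamma(shape=2.0, scale=80.0, size=n)
tmax = 30 + 2 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 0.5, n)
cases = rng.poisson(lam=np.exp(3.0 + 0.0003 * rain + 0.04 * (tmax - 30)))

X = sm.add_constant(np.column_stack([rain, tmax]))
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()

# Percentage change in incidence per 10 mm rainfall and per 1 deg C warming.
print((np.exp(10 * fit.params[1]) - 1) * 100)
print((np.exp(fit.params[2]) - 1) * 100)
```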

  20. Predicting Nonlinear Time Series

    Science.gov (United States)

    1993-12-01

    response becomes R_j(k) = f(Σ_i W_ij V_i(k)) (2.4), where W_ij specifies the weight associated with the output of node i to the input of node j in the next layer, with such interconnections for each of the previous-layer nodes. [Figure 5: Delay block for the ATNN [9]; figure not reproduced here.] Thus, node j receives the computed values a_j(t_n), and d_j(t_n) denotes the desired output of node j at time t_n. In this thesis, the weights and time delays update after each input

  1. Evaluation of the inter-annual variability of stratospheric chemical composition in chemistry-climate models using ground-based multi species time series

    Science.gov (United States)

    Poulain, V.; Bekki, S.; Marchand, M.; Chipperfield, M. P.; Khodri, M.; Lefèvre, F.; Dhomse, S.; Bodeker, G. E.; Toumi, R.; De Maziere, M.; Pommereau, J.-P.; Pazmino, A.; Goutail, F.; Plummer, D.; Rozanov, E.; Mancini, E.; Akiyoshi, H.; Lamarque, J.-F.; Austin, J.

    2016-07-01

    The variability of stratospheric chemical composition occurs on a broad spectrum of timescales, ranging from days to decades. A large part of the variability appears to be driven by external forcings such as volcanic aerosols, solar activity, halogen loading, levels of greenhouse gases (GHG), and modes of climate variability (quasi-biennial oscillation (QBO), El Niño-Southern Oscillation (ENSO)). We estimate the contributions of different external forcings to the interannual variability of stratospheric chemical composition and evaluate how well 3-D chemistry-climate models (CCMs) can reproduce the observed response-forcing relationships. We carry out multivariate regression analyses on long observed and simulated time series of several trace gases in order to estimate the contributions of individual forcings and of unforced variability to their interannual variability. The observations are typically decadal time series of ground-based data from the international Network for the Detection of Atmospheric Composition Change (NDACC) and the CCM simulations are taken from the CCMVal-2 REF-B1 simulations database. The chemical species considered are column O3, HCl, NO2, and N2O. We check the consistency between observations and model simulations in terms of the forced and internal components of the total interannual variability and identify the driving factors in the interannual variations of stratospheric chemical composition over NDACC measurement sites. Overall, there is a reasonably good agreement between regression results from models and observations regarding the externally forced interannual variability. A much larger fraction of the observed and modelled interannual variability is explained by external forcings in the tropics than in the extratropics, notably in polar regions. CCMs are able to reproduce the amplitudes of responses in chemical composition to specific external forcings
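
    A multivariate regression of a trace-gas time series onto external forcing proxies, as described above, might look like the following sketch; the proxy construction and coefficients are invented for illustration and do not reproduce the CCMVal-2 analysis.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly proxies: a linear trend term, an ~11-year solar cycle,
# a QBO-like oscillation and an ENSO index; the response is a deseasonalised
# column-O3 anomaly series built from them plus noise.
rng = np.random.default_rng(5)
n = 360
proxies = np.column_stack([
    np.linspace(0, 1, n),                      # trend (e.g. halogen/GHG)
    np.sin(2 * np.pi * np.arange(n) / 132),    # ~11-yr solar cycle
    np.sin(2 * np.pi * np.arange(n) / 28),     # ~QBO
    rng.normal(size=n),                        # ENSO index
])
o3 = proxies @ np.array([-2.0, 0.8, 0.5, 0.3]) + rng.normal(0, 0.5, n)

fit = sm.OLS(o3, sm.add_constant(proxies)).fit()
print(fit.params)     # estimated contribution of each forcing
print(fit.rsquared)   # fraction of interannual variance explained by forcings
```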

  2. Wavelet analysis of the singular spectral reconstructed time series to study the imprints of Solar–ENSO–Geomagnetic activity on Indian climate

    Directory of Open Access Journals (Sweden)

    S. Sri Lakshmi

    2015-09-01

    Full Text Available In order to study the imprints of solar–ENSO–geomagnetic activity on the Indian subcontinent, we have applied Singular Spectral Analysis (SSA) and wavelet analysis to the tree ring temperature variability record from the western Himalayas. The data used in the present study are the solar sunspot number (SSN), geomagnetic indices (aa index), the Southern Oscillation Index (SOI) and the tree ring temperature record from the western Himalayas (WH), for the period 1876–2000. The SSA and wavelet spectra reveal the presence of 5-year short-term ENSO variations and the 11-year solar cycle, indicating the influence of both solar–geomagnetic and ENSO imprints in the tree ring data. The presence of a 33-year periodicity suggests a Sun-temperature variability link, probably involving induced changes in the basic state of the atmosphere. Our wavelet analysis of the SSA reconstructed time series agrees with our previous results and also enhances the amplitude of the signals by removing noise, showing a strong influence of solar–geomagnetic and ENSO patterns throughout the record. Solar flares are considered to be responsible for causing changes in the circulation patterns of the atmosphere. The net effect of solar–geomagnetic processes on the temperature record thus appears to be the result of counteracting influences on shorter (about 5–6 years) and longer (about 11–12 years) time scales. The present analysis thus suggests that the influence of solar processes on Indian temperature variability operates in part indirectly through ENSO, but on more than one time scale. The analyses hence provide credible evidence for teleconnections of tropical Pacific climatic variability with the Indian climate ranging from interannual to decadal time scales and also demonstrate the possible role of exogenic triggering in reorganizing the global earth–ocean–atmospheric systems.
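
    Basic singular spectrum analysis of the kind used here (embedding, singular value decomposition, diagonal averaging) can be sketched as follows; the window length, the component selection and the toy series are illustrative assumptions, not the study's settings.

```python
import numpy as np

def ssa_reconstruct(x, L, components):
    """Reconstruct a time series from selected singular-spectrum components
    (embed -> SVD -> diagonal averaging)."""
    N = len(x)
    K = N - L + 1
    # Trajectory (Hankel) matrix of lagged copies.
    X = np.column_stack([x[i:i + L] for i in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Sum of the chosen elementary matrices.
    Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    # Anti-diagonal averaging back to a series of length N.
    rec = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            rec[i + j] += Xr[i, j]
            counts[i + j] += 1
    return rec / counts

# Example: extract a slow oscillation buried in noise.
t = np.arange(500)
x = np.sin(2 * np.pi * t / 60) + 0.5 * np.random.default_rng(6).normal(size=500)
trend_cycle = ssa_reconstruct(x, L=120, components=[0, 1])
```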

  3. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  4. Fractal and Multifractal Time Series

    CERN Document Server

    Kantelhardt, Jan W

    2008-01-01

    Data series generated by complex systems exhibit fluctuations on many time scales and/or broad distributions of the values. In both equilibrium and non-equilibrium situations, the natural fluctuations are often found to follow a scaling relation over several orders of magnitude, allowing for a characterisation of the data and the generating complex system by fractal (or multifractal) scaling exponents. In addition, fractal and multifractal approaches can be used for modelling time series and deriving predictions regarding extreme events. This review article describes and exemplifies several methods originating from Statistical Physics and Applied Mathematics, which have been used for fractal and multifractal time series analysis.
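
    One widely used method from this family, detrended fluctuation analysis (DFA), can be sketched in a few lines; this is a generic DFA-1 illustration, not code from the review.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (DFA-1): fluctuation F(s) per window
    size s. The slope of log F versus log s estimates the scaling exponent."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for k in range(n_seg):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear detrending
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

scales = np.unique(np.logspace(1, 3, 20).astype(int))
x = np.random.default_rng(7).normal(size=10000)        # uncorrelated noise
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(alpha)   # close to 0.5 for uncorrelated noise
```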

  5. Time Series with Tailored Nonlinearities

    CERN Document Server

    Raeth, C

    2015-01-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for e.g. turbulence and financial data can thus be explained in terms of phase correlations.
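
    The complementary, more familiar operation of randomising Fourier phases while preserving the power spectrum is sketched below; the paper imposes specific phase constraints rather than randomising them, but the spectral bookkeeping is the same. The function name and details are assumptions for illustration.

```python
import numpy as np

def phase_randomized_surrogate(x, rng=None):
    """Surrogate series with the same power spectrum as x but with the
    Fourier phases replaced; any phase correlations in x are destroyed."""
    rng = np.random.default_rng() if rng is None else rng
    Xf = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, size=Xf.shape)
    Zf = np.abs(Xf) * np.exp(1j * phases)
    Zf[0] = Xf[0]                    # keep the mean unchanged
    if len(x) % 2 == 0:
        Zf[-1] = Xf[-1]              # keep the Nyquist component real
    return np.fft.irfft(Zf, n=len(x))

x = np.random.default_rng(8).normal(size=1024)
surrogate = phase_randomized_surrogate(x)
```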

  6. Impact of change in climate and policy from 1988 to 2007 on environmental and microbial variables at the time series station Boknis Eck, Baltic Sea

    Directory of Open Access Journals (Sweden)

    H.-G. Hoppe

    2013-07-01

    Full Text Available Phytoplankton and bacteria are sensitive indicators of environmental change. The temporal development of these key organisms was monitored from 1988 to the end of 2007 at the time series station Boknis Eck in the western Baltic Sea. This period was characterized by the adaptation of the Baltic Sea ecosystem to changes in the environmental conditions caused by the conversion of the political system in the southern and eastern border states, accompanied by the general effects of global climate change. Measured variables were chlorophyll, primary production, bacterial numbers, biomass and production, glucose turnover rate, macro-nutrients, pH, temperature and salinity. Negative trends with time were recorded for chlorophyll, bacteria number, bacterial biomass and bacterial production, nitrate, ammonia, phosphate, silicate, oxygen and salinity, while temperature, pH, and the ratio between bacteria numbers and chlorophyll increased. The strongest reductions with time occurred for the annual maximum values, e.g. for chlorophyll during the spring bloom or for nitrate during winter, while the annual minimum values remained more stable. In deep water above the sediment, the negative trends of oxygen, nitrate, phosphate and bacterial variables as well as the positive trend of temperature were similar to those at the surface, while the trends of salinity, ammonia and silicate were opposite to those at the surface. Decreasing oxygen, even in the surface layer, was of particular interest because it suggested enhanced recycling of nutrients from the deep hypoxic zones to the surface by vertical mixing. The long-term seasonal patterns of all variables correlated positively with temperature, except chlorophyll and salinity. Salinity correlated negatively with all bacterial variables (as well as with precipitation) and positively with chlorophyll. Surprisingly, bacterial variables did not correlate with chlorophyll, which may be related to the time lag between the peaks of

  7. Bridging long proxy data time series and instrumental observation in the Virtual Institute of Integrated Climate and Landscape Evolution Analyses - ICLEA

    Science.gov (United States)

    Schwab, Markus J.; Brauer, Achim; Błaszkiewicz, Mirosław; Raab, Thomas; Wilmking, Martin

    2015-04-01

    Understanding causes and effects of present-day climate change on landscapes and the human habitat faces two main challenges: (i) instrumental observation time series that are too short to cover the full range of variability, since mechanisms of climate change and landscape evolution work on different time scales, often not susceptible to human perception; and (ii) distinct regional differences due to the location with respect to oceanic/continental climatic influences, the geological underground, and the history and intensity of anthropogenic land use. Both challenges are central to the ICLEA research strategy and demand a high degree of interdisciplinarity. In particular, the need to link observations and measurements of ongoing changes with information from the past taken from natural archives requires the joint work of scientists with very different time perspectives: on the one hand, scientists who work on geological time scales of thousands of years and more, and on the other hand, those observing and investigating recent processes on short time scales. The GFZ, Greifswald University and the Brandenburg University of Technology, together with their partner the Polish Academy of Sciences, strive to focus their research capacities and expertise in ICLEA. ICLEA offers young researchers an interdisciplinary and structured education and promotes their early independence through coaching and mentoring. Postdoctoral rotation positions at the ICLEA partner institutions ensure mobility of young researchers and promote dissemination of information and expertise between disciplines. Training, research and analytical workshops between research partners of the ICLEA virtual institute are another important measure to qualify young researchers. The long-term mission of the Virtual Institute is to provide a substantiated data basis for sustained environmental maintenance, based on a profound process understanding at all relevant time scales. The aim is to explore processes of

  8. Wavelet analysis of the singular spectral reconstructed time series to study the imprints of solar-ENSO-geomagnetic activity on Indian climate

    Science.gov (United States)

    Lakshmi Sunkara, Sri; Krishna Tiwari, Rama

    2016-09-01

    To study the imprints of the solar-ENSO-geomagnetic activity on the Indian subcontinent, we have applied singular spectral analysis (SSA) and wavelet analysis to the tree-ring temperature variability record from the Western Himalayas. Other data used in the present study are the solar sunspot number (SSN), geomagnetic indices (aa index), and the Southern Oscillation Index (SOI) for the common time period of 1876-2000. Both SSA and wavelet spectral analyses reveal the presence of 5-7-year short-term ENSO variations and the 11-year solar cycle, indicating the possible combined influences of solar-geomagnetic activities and ENSO on the Indian temperature. Another prominent signal corresponding to 33-year periodicity in the tree-ring record suggests the Sun-temperature variability link probably induced by changes in the basic state of the Earth's atmosphere. In order to complement the above findings, we performed a wavelet analysis of SSA reconstructed time series, which agrees well with our earlier results and increases the signal-to-noise ratio, thereby showing the strong influence of solar-geomagnetic activity and ENSO throughout the entire period. The solar flares are considered responsible for causing the atmospheric circulation patterns. The net effect of solar-geomagnetic processes on the temperature record might suggest counteracting influences on shorter (about 5-6-year) and longer (about 11-12-year) timescales. The present analyses suggest that the influence of solar activities on the Indian temperature variability operates in part indirectly through coupling of ENSO on multilateral timescales. The analyses, hence, provide credible evidence of teleconnections of tropical Pacific climatic variability and Indian climate ranging from inter-annual to decadal timescales and also suggest the possible role of exogenic triggering in reorganizing the global Earth-ocean-atmospheric systems.

  9. Complex network analysis of time series

    Science.gov (United States)

    Gao, Zhong-Ke; Small, Michael; Kurths, Jürgen

    2016-12-01

    Revealing complicated behaviors from time series constitutes a fundamental problem of continuing interest, and it has attracted a great deal of attention from a wide variety of fields on account of its significant importance. The past decade has witnessed a rapid development of complex network studies, which allow many types of systems in nature and technology, containing a large number of components interacting with each other in a complicated manner, to be characterized. Recently, complex network theory has been incorporated into the analysis of time series and fruitful achievements have been obtained. Complex network analysis of time series opens up new avenues to address interdisciplinary challenges in climate dynamics, multiphase flow, brain functions, ECG dynamics, economics and traffic systems.

  10. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.
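
    As a toy illustration of benchmarking, the sketch below scales monthly values so that each year matches an annual benchmark; this simple pro-rata adjustment is for illustration only and is neither the EIA procedure nor the first-difference method recommended in the report (which is designed to avoid the step changes this introduces at year boundaries).

```python
import numpy as np

def prorata_benchmark(monthly, annual_benchmarks):
    """Scale each year's 12 monthly values so that they sum to the annual
    benchmark (simple pro-rata benchmarking)."""
    monthly = np.asarray(monthly, dtype=float).reshape(-1, 12)
    factors = np.asarray(annual_benchmarks, dtype=float) / monthly.sum(axis=1)
    return (monthly * factors[:, None]).ravel()

# Example: two years of monthly data adjusted to known annual totals.
monthly = np.random.default_rng(9).uniform(80, 120, size=24)
print(prorata_benchmark(monthly, [1250.0, 1180.0]))
```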

  11. Random time series in astronomy.

    Science.gov (United States)

    Vaughan, Simon

    2013-02-13

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle and over time (usually called light curves by astronomers). In the time domain, we see transient events such as supernovae, gamma-ray bursts and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars and pulsations of stars in nearby galaxies; and we see persistent aperiodic variations ('noise') from powerful systems such as accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of time domain astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher order properties of accreting black holes, and time delays and correlations in multi-variate time series.

  12. Random time series in Astronomy

    CERN Document Server

    Vaughan, Simon

    2013-01-01

    Progress in astronomy comes from interpreting the signals encoded in the light received from distant objects: the distribution of light over the sky (images), over photon wavelength (spectrum), over polarization angle, and over time (usually called light curves by astronomers). In the time domain we see transient events such as supernovae, gamma-ray bursts, and other powerful explosions; we see periodic phenomena such as the orbits of planets around nearby stars, radio pulsars, and pulsations of stars in nearby galaxies; and persistent aperiodic variations (`noise') from powerful systems like accreting black holes. I review just a few of the recent and future challenges in the burgeoning area of Time Domain Astrophysics, with particular attention to persistently variable sources, the recovery of reliable noise power spectra from sparsely sampled time series, higher-order properties of accreting black holes, and time delays and correlations in multivariate time series.

  13. Climate change in Bangladesh: a spatio-temporal analysis and simulation of recent temperature and rainfall data using GIS and time series analysis model

    Science.gov (United States)

    Rahman, Md. Rejaur; Lateh, Habibah

    2015-12-01

    In this paper, temperature and rainfall data series were analysed from 34 meteorological stations distributed throughout Bangladesh over a 40-year period (1971 to 2010) in order to evaluate the magnitude of these changes statistically and spatially. Linear regression, the coefficient of variation, inverse distance weighted interpolation techniques and geographical information systems were used to analyse the trends, variability and spatial patterns of temperature and rainfall. An autoregressive integrated moving average time series model was used to simulate the temperature and rainfall data. The results confirm a particularly strong and recent climate change in Bangladesh with a 0.20 °C per decade upward trend of mean temperature. The highest upward trend in minimum temperature (range of 0.80-2.4 °C) was observed in the northern, northwestern, northeastern, central and central southern parts, while the greatest warming in the maximum temperature (range of 1.20-2.48 °C) was found in the southern, southeastern and northeastern parts during 1971-2010. An upward trend of annual rainfall (+7.13 mm per year) and downward pre-monsoon (-0.75 mm per year) and post-monsoon rainfall (-0.55 mm per year) trends were observed during this period. Rainfall was erratic in the pre-monsoon season and even more so during the post-monsoon season (variability of 44.84 and 85.25 % per year, respectively). The mean forecasted temperature exhibited an increase of 0.018 °C per year in 2011-2020, and if this trend continues, it would lead to approximately 1.0 °C warmer temperatures in Bangladesh by 2020, compared to 1971. A greater rise is projected for the mean minimum (0.20 °C) than the mean maximum (0.16 °C) temperature. Annual rainfall is projected to decline by 153 mm from 2011 to 2020, and a drying condition will persist in the northwestern, western and southwestern parts of the country during the pre- and post-monsoonal seasons.

  14. Normalizing the causality between time series

    CERN Document Server

    Liang, X San

    2015-01-01

    Recently, a rigorous yet concise formula has been derived to evaluate the information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing three types of fundamental mechanisms that govern the marginal entropy change of the flow recipient. A normalized or relative flow measures its importance relative to other mechanisms. In analyzing realistic series, both absolute and relative information flows need to be taken into account, since the normalizers for a pair of reverse flows belong to two different entropy balances; it is quite normal that two identical flows may differ a lot in relative importance in their respective balances. We have reproduced these results with several autoregressive models. We have also shown applications to a climate change problem and a financial analysis problem. For the former, reconfirmed is the role of the Indian Ocean Dipole as ...

  15. Event Discovery in Time Series

    CERN Document Server

    Preston, Dan; Brodley, Carla

    2009-01-01

    The discovery of events in time series can have important implications, such as identifying microlensing events in astronomical surveys, or changes in a patient's electrocardiogram. Current methods for identifying events require a sliding window of a fixed size, which is not ideal for all applications and could overlook important events. In this work, we develop probability models for calculating the significance of an arbitrary-sized sliding window and use these probabilities to find areas of significance. Because a brute force search of all sliding windows and all window sizes would be computationally intractable, we introduce a method for quickly approximating the results. We apply our method to over 100,000 astronomical time series from the MACHO survey, in which 56 different sections of the sky are considered, each with one or more known events. Our method was able to recover 100% of these events in the top 1% of the results, essentially pruning 99% of the data. Interestingly, our method was able to iden...

  16. Detecting chaos from time series

    Science.gov (United States)

    Xiaofeng, Gong; Lai, C. H.

    2000-02-01

    In this paper, an entirely data-based method to detect chaos from a time series is developed by introducing ε_p-neighbour points (the p-step ε-neighbour points). We demonstrate that for deterministic chaotic systems, there exists a linear relationship between the logarithm of the average number of ε_p-neighbour points, ln n_{p,ε}, and the time step, p. The coefficient can be related to the KS entropy of the system. The effects of the embedding dimension and noise are also discussed.

  17. Climate Change Crunch Time

    Institute of Scientific and Technical Information of China (English)

    Xie Zhenhua

    2011-01-01

    CLIMATE change is a severe challenge facing humanity in the 21st century and thus the Chinese Government always attaches great importance to the problem. Actively dealing with climate change is China's important strategic policy in its social and economic development. China will make a positive contribution to the world in this regard.

  18. The Imprint of Extreme Climate Events in Century-Long Time Series of Wood Anatomical Traits in High-Elevation Conifers.

    Science.gov (United States)

    Carrer, Marco; Brunetti, Michele; Castagneri, Daniele

    2016-01-01

    Extreme climate events are of key importance for forest ecosystems. However, both the inherent infrequency, stochasticity and multiplicity of extreme climate events, and the array of biological responses, challenge investigations. To cope with the long life cycle of trees and the paucity of the extreme events themselves, our inferences should be based on long-term observations. In this context, tree rings and the related xylem anatomical traits represent promising sources of information, due to the wide time perspective and quality of the information they can provide. Here we test, on two high-elevation conifers (Larix decidua and Picea abies sampled at 2100 m a.s.l. in the Eastern Alps), the associations between temperature extremes during the growing season and xylem anatomical traits, specifically the number of cells per ring (CN), cell wall thickness (CWT), and cell diameter (CD). To better track the effect of extreme events over the growing season, tree rings were partitioned into 10 sectors. Climate variability has been reconstructed, for 1800-2011 at monthly resolution and for 1926-2011 at daily resolution, by exploiting the excellent availability of very long and high quality instrumental records for the surrounding area, and taking into account the relationship between meteorological variables and site topographical settings. Summer temperature influenced anatomical traits of both species, and tree-ring anatomical profiles turned out to be associated with temperature extremes. Most of the extreme values in anatomical traits occurred with warm (positive extremes) or cold (negative) conditions. However, 0-34% of occurrences did not match a temperature extreme event. Specifically, CWT and CN extremes were more clearly associated with climate than CD, which showed a bias towards tracking cold extremes. Dendroanatomical analysis, coupled with high-quality daily-resolved climate records, seems a promising approach to study the effects of extreme events on trees

  19. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    Li Aiguo; Zhao Cai; Li Zhanhuai

    2007-01-01

    To predict the trend of chaotic time series in the time series analysis and time series data mining fields, a novel algorithm for predicting the trend of chaotic time series is presented, and an on-line segmenting algorithm is proposed to convert a time series into a binary string according to the ascending or descending trend of each subsequence. The on-line segmenting algorithm is independent of prior knowledge about the time series. The naive Bayesian algorithm is then employed to predict the trend of the chaotic time series from the binary string. The experimental results on three chaotic time series demonstrate that the proposed method predicts the ascending or descending trend of chaotic time series with few errors.
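
    The general idea, symbolising a series as an up/down string and feeding recent symbols to a naive Bayes classifier, can be sketched as follows; the fixed-step symbolisation below is a simplification of the paper's on-line segmentation and is not the authors' algorithm.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB

# Toy series; 1 = ascending step, 0 = descending step.
rng = np.random.default_rng(10)
x = np.cumsum(rng.normal(size=2000))
symbols = (np.diff(x) > 0).astype(int)

# Predict the next trend symbol from the previous k symbols.
k = 5
X = np.array([symbols[i:i + k] for i in range(len(symbols) - k)])
y = symbols[k:]

split = int(0.8 * len(y))
clf = BernoulliNB().fit(X[:split], y[:split])
print("test accuracy:", clf.score(X[split:], y[split:]))
```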

  20. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  1. Description of complex time series by multipoles

    DEFF Research Database (Denmark)

    Lewkowicz, M.; Levitan, J.; Puzanov, N.;

    2002-01-01

    We present a new method to describe time series with a highly complex time evolution. The time series is projected onto a two-dimensional phase-space plot which is quantified in terms of a multipole expansion where every data point is assigned a unit mass. The multipoles provide an efficient...... characterization of the original time series....

  2. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...... series forecasting models....

  3. Kolmogorov space in time series data

    Science.gov (United States)

    Kanjamapornkul, Kabin; Pinčák, Richard

    2016-10-01

    We prove that the space of time series data is a Kolmogorov space with the $T_{0}$-separation axiom, using the loop space of time series data. In our approach we define a cyclic coordinate of the intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of data around the price and time axes by defining a new extra dimension for time series data. We show that there exist eight hidden dimensions in the Kolmogorov space for time series data. Our concept is realized as an algorithm of empirical mode decomposition and intrinsic time scale decomposition and is subsequently used for preliminary analysis of real time series data.

  4. Regenerating time series from ordinal networks

    Science.gov (United States)

    McCullough, Michael; Sakellariou, Konstantinos; Stemler, Thomas; Small, Michael

    2017-03-01

    Recently proposed ordinal networks not only afford novel methods of nonlinear time series analysis but also constitute stochastic approximations of the deterministic flow time series from which the network models are constructed. In this paper, we construct ordinal networks from discretely sampled continuous chaotic time series and then regenerate new time series by taking random walks on the ordinal network. We then investigate the extent to which the dynamics of the original time series are encoded in the ordinal networks and retained through the process of regenerating new time series, using several distinct quantitative approaches. First, we use recurrence quantification analysis on traditional recurrence plots and order recurrence plots to compare the temporal structure of the original time series with random walk surrogate time series. Second, we estimate the largest Lyapunov exponent from the original time series and investigate the extent to which this invariant measure can be estimated from the surrogate time series. Finally, estimates of correlation dimension are computed to compare the topological properties of the original and surrogate time series dynamics. Our findings show that ordinal networks constructed from univariate time series data constitute stochastic models which approximate important dynamical properties of the original systems.
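
    A minimal sketch of constructing an ordinal (permutation) transition network and regenerating a symbolic series by a random walk is given below; the embedding parameters, the logistic-map example and the handling of dead-end nodes are illustrative assumptions, and mapping the regenerated patterns back to amplitudes is omitted.

```python
import numpy as np
from collections import defaultdict

def ordinal_network(x, m=4, tau=1):
    """Ordinal transition network: nodes are ordinal patterns of embedded
    windows, edge weights count transitions between consecutive patterns."""
    patterns = [tuple(np.argsort(x[i:i + m * tau:tau]))
                for i in range(len(x) - (m - 1) * tau)]
    edges = defaultdict(float)
    for a, b in zip(patterns[:-1], patterns[1:]):
        edges[(a, b)] += 1.0
    return patterns, edges

def regenerate(patterns, edges, length, rng=None):
    """Regenerate a symbolic series by a random walk on the transition network."""
    rng = np.random.default_rng() if rng is None else rng
    out_edges = defaultdict(list)
    for (a, b), w in edges.items():
        out_edges[a].append((b, w))
    node = patterns[0]
    walk = [node]
    for _ in range(length - 1):
        if not out_edges[node]:          # dead end: restart the walk
            node = patterns[0]
            walk.append(node)
            continue
        nxt, w = zip(*out_edges[node])
        p = np.array(w) / np.sum(w)
        node = nxt[rng.choice(len(nxt), p=p)]
        walk.append(node)
    return walk

# Example: build the network from a logistic-map series and regenerate a walk.
x = np.empty(5000)
x[0] = 0.4
for i in range(1, 5000):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])
patterns, edges = ordinal_network(x, m=4, tau=1)
surrogate_symbols = regenerate(patterns, edges, length=1000)
```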

  5. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease. Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  6. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
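
    As a rough illustration of this idea (not the authors' code), the sketch below builds a recurrence matrix for a logistic-map series, reinterprets it as the adjacency matrix of an undirected network and computes two elementary network measures; the threshold eps and the map parameter are assumptions chosen for the example.

      import numpy as np

      # logistic map as a toy dynamical system (illustrative choice)
      x = np.empty(2000); x[0] = 0.3
      for i in range(1, len(x)):
          x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

      eps = 0.05                                    # recurrence threshold (assumed)
      D = np.abs(x[:, None] - x[None, :])           # pairwise distances (1-d state space here)
      R = (D <= eps).astype(int)                    # recurrence matrix
      A = R - np.eye(len(x), dtype=int)             # adjacency matrix of the recurrence network

      degree = A.sum(axis=1)                        # degree of each node (time point)
      link_density = A.sum() / (len(x) * (len(x) - 1))
      print(f"mean degree {degree.mean():.1f}, link density {link_density:.3f}")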

  7. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

    Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described, and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.

  8. Outliers Mining in Time Series Data Sets

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In this paper, we present a cluster-based algorithm for time series outlier mining. We use the discrete Fourier transform (DFT) to map time series from the time domain to the frequency domain, so that each series becomes a point in a k-dimensional space. For these points, a cluster-based algorithm is developed to mine outliers: it first partitions the input points into disjoint clusters and then prunes the clusters that, by this judgment, cannot contain outliers. The algorithm has been run on the electrical load time series of a steel enterprise and proved effective.
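
    A compact sketch of the same pipeline is given below, with synthetic load data and scikit-learn k-means standing in for the paper's specific clustering and pruning scheme; the window length, number of retained DFT coefficients, cluster count and the small-cluster rule are all assumptions. Windows are mapped to a few DFT magnitudes, and windows falling in clusters too small to represent normal behaviour are flagged.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      load = 100 + 10 * np.sin(np.linspace(0, 40 * np.pi, 4000)) + rng.normal(0, 1, 4000)
      load[2496:2560] += 100                       # injected anomalous window for illustration

      win, k_coef = 64, 8                          # window length and retained DFT coefficients (assumed)
      windows = np.array([load[i:i + win] for i in range(0, len(load) - win, win)])
      feats = np.abs(np.fft.rfft(windows, axis=1))[:, :k_coef]   # k-dimensional frequency-domain points

      km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(feats)
      sizes = np.bincount(km.labels_, minlength=4)
      small = sizes < 0.05 * len(feats)            # simplified stand-in for the paper's cluster pruning
      outlier_windows = np.where(small[km.labels_])[0]
      print("suspect windows:", outlier_windows)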

  9. Landscape dynamics assessment of dry climatic zones on the Baikal-Gobi transect from NDVI time series and field investigations data

    Science.gov (United States)

    Sayapina, D. O.; Zharnikova, M. A.; Tsydypov, B. Z.; Sodnomov, B. V.; Garmaev, E. Zh

    2016-11-01

    Since the 1980s, scientists of the Baikal Institute of Nature Management (BINM SB RAS) have been conducting field observations of the transformation of Transbaikalia geosystems due to changes in climate and nature management. Utmost importance is placed on the study of the negative response of the land geosystems, shown in particular through their deterioration, degradation, and desertification. Over the years of research (1985-2015) in the dry areas of the north of Central Asia, the scientists of the BINM SB RAS established a network of key sites for contact monitoring of the status and dynamics of the geosystems and of the negative natural-anthropogenic processes along the Baikal-Gobi meridional transect (51-44° N, 105-107° E). Monitoring of the status and dynamics of the vegetation cover of the key sites is conducted by processing and analysing multitemporal and multispectral Landsat and MODIS Terra imagery. An automatic analysis of the temporal variation of NDVI is performed and compared with the evolution of the index in previous seasons. Landscape indication of the key sites is carried out on the basis of the satellite imagery and complete geobotanical descriptions. Landscape profiles and facies maps with natural boundaries are created.

  10. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  11. Coupling between time series: a network view

    CERN Document Server

    Mehraban, Saeed; Zamani, Maryam; Jafari, Gholamreza

    2013-01-01

    Recently, the visibility graph has been introduced as a novel view for analyzing time series by mapping them to complex networks. In this paper, we introduce a new visibility algorithm, "cross-visibility", which reveals the conjugation of two coupled time series. The correspondence between the two time series is mapped to a network, the "cross-visibility graph", to demonstrate the correlation between them. We applied the algorithm to several correlated and uncorrelated time series generated by the linear stationary ARFIMA process. The results demonstrate that the cross-visibility graph associated with correlated time series with power-law auto-correlation is scale-free. If the time series are uncorrelated, the degree distribution of their cross-visibility network deviates from a power law. To further clarify the process, we applied the algorithm to real-world data from the financial trades of two companies and observed significant small-scale coupling in their dynamics.

  12. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  13. Comparison of time series using entropy and mutual correlation

    Science.gov (United States)

    Madonna, Fabio; Rosoldi, Marco

    2015-04-01

    The potential for redundant time series to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. Moreover, comparisons among time series of in situ and ground-based remote sensing measurements have been performed with several methods, quite often relying on linear models. In this work, the concepts of entropy (H) and mutual correlation (MC), defined in the framework of information theory, are applied to the study of essential climate variables with the aim of characterizing the uncertainty of a time series and the redundancy of collocated measurements provided by different surface-based techniques. In particular, integrated water vapor (IWV) and water vapour mixing ratio time series obtained at five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations with several sensors (e.g. radiosondes, GPS, microwave and infrared radiometers, Raman lidar), in the period 2010-2012, are analyzed in terms of H and MC. The comparison between the probability density functions of the time series shows that caution is needed when using linear assumptions, and that statistics robust to outliers, such as entropy, are recommended for investigating measurement time series. Results reveal that the random uncertainties on the IWV measured with radiosonde, global positioning system, microwave and infrared radiometer, and Raman lidar measurements differed by less than 8 % over the considered time period. Comparisons of the time series of IWV content from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by 60% by constraining the measurements with those from
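
    The information-theoretic quantities involved are straightforward to estimate with histograms. The sketch below uses synthetic data in place of the GRUAN measurements, and plain Shannon entropy and mutual information as stand-ins for the paper's H and MC statistics; the bin counts and the noise model are assumptions of this illustration.

      import numpy as np

      def entropy(x, bins=30):
          counts, _ = np.histogram(x, bins=bins)
          p = counts[counts > 0] / counts.sum()
          return -np.sum(p * np.log2(p))            # Shannon entropy in bits

      def mutual_information(x, y, bins=30):
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          nz = pxy > 0
          return np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz]))

      rng = np.random.default_rng(2)
      iwv_sonde = rng.gamma(4.0, 5.0, 1000)                 # synthetic radiosonde-like IWV values
      iwv_mwr = iwv_sonde + rng.normal(0, 1.0, 1000)        # collocated radiometer-like series with noise

      print("H(sonde) =", round(entropy(iwv_sonde), 2), "bits")
      print("MI(sonde, mwr) =", round(mutual_information(iwv_sonde, iwv_mwr), 2), "bits")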

  14. How to analyse irregularly sampled geophysical time series?

    Science.gov (United States)

    Eroglu, Deniz; Ozken, Ibrahim; Stemler, Thomas; Marwan, Norbert; Wyrwoll, Karl-Heinz; Kurths, Juergen

    2015-04-01

    One of the challenges of time series analysis is to detect changes in the dynamics of the underlying system. There are numerous methods that can be used to detect such regime changes in regularly sampled time series. Here we present a new approach that can be applied when the time series is irregularly sampled. Such data sets occur frequently in real-world applications, as in paleoclimate proxy records. The basic idea follows Victor and Purpura [1] and considers segments of the time series. For each segment we compute the cost of transforming the segment into the following one. If the time series stems from a single dynamical regime, the cost of transformation should be similar for each segment of the data. Dramatic changes in the cost time series indicate a change in the underlying dynamics. Any kind of analysis is applicable to the cost time series, since it is regularly sampled. While recurrence plots are not the best choice for irregularly sampled data with some measurement noise component, we show that a recurrence plot analysis based on the cost time series can successfully identify the changes in the dynamics of the system. We tested this method using synthetically created time series and use these results to highlight its performance. Furthermore, we present our analysis of a suite of calcite and aragonite stalagmites located in the eastern Kimberley region of tropical Western Australia. This oxygen isotope data is a proxy for monsoon activity over the last 8,000 years. In this time series our method picks up several so far undetected changes from wet to dry in the monsoon system and therefore enables us to better understand the monsoon dynamics in the north-east of Australia over the last couple of thousand years. [1] J. D. Victor and K. P. Purpura, Network: Computation in Neural Systems 8, 127 (1997)

  15. Hurst Exponent Analysis of Financial Time Series

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Statistical properties of stock market time series and the implications of their Hurst exponents are discussed. Hurst exponents of DJIA (Dow Jones Industrial Average) components are estimated using re-scaled range analysis. In addition to the original stock return series, the linear prediction errors of the daily returns are also tested. Numerical results show that Hurst exponent analysis can provide some information about the statistical properties of financial time series.
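
    Re-scaled range analysis itself is short to write down. The sketch below estimates the Hurst exponent of a synthetic i.i.d. return series, a stand-in for the DJIA component returns; the window sizes are an assumed choice, and small-sample bias means the estimate will land only roughly near the theoretical 0.5.

      import numpy as np

      def rs_hurst(x, window_sizes=(16, 32, 64, 128, 256, 512)):
          rs_vals = []
          for n in window_sizes:
              rs_n = []
              for start in range(0, len(x) - n + 1, n):
                  seg = x[start:start + n]
                  dev = np.cumsum(seg - seg.mean())     # cumulative deviation from the segment mean
                  r = dev.max() - dev.min()             # range
                  s = seg.std(ddof=1)                   # standard deviation
                  if s > 0:
                      rs_n.append(r / s)
              rs_vals.append(np.mean(rs_n))
          # Hurst exponent = slope of log(R/S) against log(window size)
          slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_vals), 1)
          return slope

      rng = np.random.default_rng(3)
      returns = rng.normal(0, 1, 4096)                  # i.i.d. returns: H should come out near 0.5
      print("Hurst exponent estimate:", round(rs_hurst(returns), 3))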

  16. Statistical criteria for characterizing irradiance time series.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.

  17. Reconstruction of time-delay systems from chaotic time series.

    Science.gov (United States)

    Bezruchko, B P; Karavaev, A S; Ponomarenko, V I; Prokhorov, M D

    2001-11-01

    We propose a method that allows one to estimate the parameters of model scalar time-delay differential equations from time series. The method is based on a statistical analysis of time intervals between extrema in the time series. We verify our method by using it for the reconstruction of time-delay differential equations from their chaotic solutions and for modeling experimental systems with delay-induced dynamics from their chaotic time series.

  18. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  19. On reconstruction of time series in climatology

    Directory of Open Access Journals (Sweden)

    V. Privalsky

    2015-10-01

    The approach to time series reconstruction in climatology based upon cross-correlation coefficients and regression equations is mathematically incorrect because it ignores the dependence of time series upon their past. The proper method described here for the bivariate case requires autoregressive time- and frequency-domain modeling of the time series that contains simultaneous observations of both scalar series, with subsequent application of the model to restore the shorter one into the past. The method presents a further development of previous efforts by a number of authors, starting with A. Douglass, who introduced some concepts of time series analysis into paleoclimatology. The method is applied to the monthly data of total solar irradiance (TSI, 1979–2014) and sunspot numbers (SSN, 1749–2014) to restore the TSI data over 1749–1978. The results of the reconstruction are in statistical agreement with observations.
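
    A heavily simplified sketch of the idea, not Privalsky's exact procedure: fit a bivariate autoregressive (VAR) model on the period with simultaneous observations and use it, together with the longer series, to extend the shorter one into the past. Time is reversed so the restoration becomes an ordinary step-by-step forecast; the synthetic series, the fixed AR order and the use of statsmodels' VAR are all assumptions of this illustration.

      import numpy as np
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(4)
      n_long, n_overlap = 600, 200
      ssn = np.sin(np.linspace(0, 30 * np.pi, n_long)) + 0.3 * rng.normal(size=n_long)
      tsi = 1361 + 0.1 * ssn + 0.02 * rng.normal(size=n_long)   # synthetic stand-in for TSI

      # pretend the shorter series (tsi) is only observed over the last n_overlap points;
      # reverse time so that "restoring the past" becomes a forward forecast
      overlap = np.column_stack([tsi[-n_overlap:], ssn[-n_overlap:]])[::-1]
      model = VAR(overlap).fit(3)                   # AR order fixed at 3 for simplicity
      p = model.k_ar

      hist, restored = overlap[-p:].copy(), []
      for t in range(n_long - n_overlap - 1, -1, -1):
          tsi_hat, _ = model.forecast(hist, steps=1)[0]
          restored.append(tsi_hat)
          hist = np.vstack([hist[1:], [tsi_hat, ssn[t]]])   # keep the observed longer series
      print("restored", len(restored), "early values of the shorter series")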

  20. A radar image time series

    Science.gov (United States)

    Leberl, F.; Fuchs, H.; Ford, J. P.

    1981-01-01

    A set of ten side-looking radar images of a mining area in Arizona that were acquired over a period of 14 yr is studied to demonstrate the photogrammetric differential-rectification technique applied to radar images and to examine changes that occurred in the area over time. Five of the images are rectified by using ground control points and a digital height model taken from a map. Residual coordinate errors in ground control are reduced from several hundred meters in all cases to ±19 to 70 m. The contents of the radar images are compared with a Landsat image and with aerial photographs. Effects of radar system parameters on radar images are briefly reviewed.

  1. Detecting Inhomogeneity in Daily Climate Series Using Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    YAN Zhongwei; Phil D.JONES

    2008-01-01

    A wavelet method was applied to detect inhomogeneities in daily meteorological series, data which are being increasingly used in studies of climate extremes. The wavelet method has been applied to a few well-established long-term daily temperature series extending back to the 18th century, which have been "homogenized" with conventional approaches. Various types of problems remaining in the series were revealed with the wavelet method, and their influence on analyses of change in climate extremes is discussed. The results are important for understanding issues in conventional climate data processing and for developing improved homogenization methods in order to improve the analysis of climate extremes based on daily data.

  2. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  3. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book. -Hsun-Hsien Chang, Computing Reviews, March 2012. My favorite chapters were on dynamic linear models and vector AR and vector ARMA models. -William Seaver, Technometrics, August 2011. … a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  4. A Simple Fuzzy Time Series Forecasting Model

    DEFF Research Database (Denmark)

    Ortiz-Arroyo, Daniel

    2016-01-01

    In this paper we describe a new first order fuzzy time series forecasting model. We show that our automatic fuzzy partitioning method provides an accurate approximation to the time series that when combined with rule forecasting and an OWA operator improves forecasting accuracy. Our model does...... not attempt to provide the best results in comparison with other forecasting methods but to show how to improve first order models using simple techniques. However, we show that our first order model is still capable of outperforming some more complex higher order fuzzy time series models....

  5. DATA MINING IN CANADIAN LYNX TIME SERIES

    Directory of Open Access Journals (Sweden)

    R.Karnaboopathy

    2012-01-01

    This paper sums up the application of statistical models such as the ARIMA family of time series models to the analysis of the Canadian lynx time series and introduces a data mining method combined with statistical knowledge to analyse the Canadian lynx data series.

  6. Visibility Graph Based Time Series Analysis

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
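
    For readers unfamiliar with the underlying construction, the sketch below builds a standard natural visibility graph for a single series segment and reports two network descriptors. It is a generic illustration (random-walk toy data, networkx), not the segment-linking, network-of-networks procedure of the paper.

      import numpy as np
      import networkx as nx

      def visibility_graph(y):
          g = nx.Graph()
          g.add_nodes_from(range(len(y)))
          for a in range(len(y)):
              for b in range(a + 1, len(y)):
                  # a and b "see" each other if no intermediate point rises above the line joining them
                  if all(y[c] < y[a] + (y[b] - y[a]) * (c - a) / (b - a) for c in range(a + 1, b)):
                      g.add_edge(a, b)
          return g

      rng = np.random.default_rng(5)
      segment = np.cumsum(rng.normal(size=200))          # a random-walk segment as toy data
      g = visibility_graph(segment)
      degrees = [d for _, d in g.degree()]
      print("mean degree:", np.mean(degrees), " clustering:", nx.average_clustering(g))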

  7. Integrating vegetation index time series and meteorological data to understand the effect of the land use/land cover (LULC) in the climatic seasonality of the Brazilian Cerrado

    Science.gov (United States)

    Lins, D. B.; Zullo, J.; Friedel, M. J.

    2013-12-01

    The Cerrado (savanna ecosystem) of São Paulo state (Brazil) represent a complex mosaic of different typologies of uses, actors and biophysical and social restrictions. Originally, 14% of the state of São Paulo area was covered by the diversity of Cerrado phytophysiognomies. Currently, only 1% of this original composition remains fragmented into numerous relicts of biodiversity, mainly concentrated in the central-eastern of the state. A relevant part of the fragments are found in areas of intense coverage change by human activities, whereas the greatest pressure comes from sugar cane cultivation, either by direct replacement of Cerrado vegetation or occupying pasture areas in the fragments edges. As a result, new local level dynamics has been introduced, directly or indirectly, affecting the established of processes in climate systems. In this study, the main goal is analyzing the relationship between the Cerrado landscape changing and the climate dynamics in regional and local areas. The multi-temporal MODIS 250 m Vegetation Index (VI) datasets (period of 2000 to 2012) are integrated with precipitation data of the correspondent period (http://www.agritempo.gov.br/),one of the most important variable of the spatial phytophysiognomies distribution. The integration of meteorological data enable the development of an integrated approach to understand the relationship between climatic seasonality and the changes in the spatial patterns. A procedure to congregated diverse dynamics information is the Self Organizing Map (SOM, Kohonen, 2001), a technique that relies on unsupervised competitive learning (Kohonen and Somervuo 2002) to recognize patterns. In this approach, high-dimensional data are represented on two dimensions, making possible to obtain patterns that takes into account information from different natures. Observed advances will contribute to bring machine-learning techniques as a valid tool to provide improve in land use/land cover (LULC) analyzes at

  8. Evaluation of Harmonic Analysis of Time Series (HANTS): impact of gaps on time series reconstruction

    NARCIS (Netherlands)

    Zhou, J.Y.; Jia, L.; Hu, G.; Menenti, M.

    2012-01-01

    In recent decades, researchers have developed methods and models to reconstruct time series of irregularly spaced observations from satellite remote sensing, among which is the widely used Harmonic Analysis of Time Series (HANTS) method. Many studies based on time series reconstructed with HANTS docume
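
    The core of such harmonic reconstruction can be illustrated with an ordinary least-squares fit of a mean plus a few annual harmonics to the valid observations, as sketched below on synthetic NDVI-like data. HANTS itself additionally iterates and down-weights outlying (e.g. cloud-contaminated) points, which is omitted here; the compositing interval, noise level, gap rate and number of harmonics are assumptions.

      import numpy as np

      rng = np.random.default_rng(12)
      doy = np.arange(0, 365 * 3, 8.0)                        # 8-day compositing over three years
      ndvi = 0.5 + 0.25 * np.sin(2 * np.pi * doy / 365 - 1.5) + 0.03 * rng.normal(size=len(doy))
      valid = rng.random(len(doy)) > 0.25                     # ~25% of observations missing (e.g. clouds)

      def design(t, period=365.0, n_harm=2):
          cols = [np.ones_like(t)]
          for k in range(1, n_harm + 1):
              cols += [np.cos(2 * np.pi * k * t / period), np.sin(2 * np.pi * k * t / period)]
          return np.column_stack(cols)

      coef, *_ = np.linalg.lstsq(design(doy[valid]), ndvi[valid], rcond=None)
      reconstructed = design(doy) @ coef                      # gap-free reconstructed series
      print("RMSE on the gaps:", np.sqrt(np.mean((reconstructed[~valid] - ndvi[~valid]) ** 2)))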

  9. Forecasting Daily Time Series using Periodic Unobserved Components Time Series Models

    NARCIS (Netherlands)

    Koopman, Siem Jan; Ooms, Marius

    2004-01-01

    We explore a periodic analysis in the context of unobserved components time series models that decompose time series into components of interest such as trend and seasonal. Periodic time series models allow dynamic characteristics to depend on the period of the year, month, week or day. In the stand

  10. Measuring nonlinear behavior in time series data

    Science.gov (United States)

    Wai, Phoong Seuk; Ismail, Mohd Tahir

    2014-12-01

    Stationarity testing is an important tool for detecting time series behavior, since financial and economic data series often contain missing data, structural changes, and jumps or breaks. Moreover, a non-stationary series can be rendered stationary by treating it as a difference-stationary or trend-stationary process. Two types of stationarity hypothesis tests, the Augmented Dickey-Fuller (ADF) test and the Kwiatkowski-Phillips-Schmidt-Shin (KPSS) test, are examined in this paper to describe the properties of the time series variables in a financial model. The least squares method is used in the Augmented Dickey-Fuller test to detect changes in the series, and a Lagrange multiplier test is used in the Kwiatkowski-Phillips-Schmidt-Shin test to examine the properties of the oil price, the gold price and the Malaysian stock market. In addition, the Quandt-Andrews, Bai-Perron and Chow tests are used to detect the existence of breaks in the data series. The monthly index data range from December 1989 to May 2012. The results show that the three series exhibit nonlinear properties but can be transformed into stationary series by taking first differences.
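
    Both tests are available in standard statistical software. The sketch below applies the statsmodels implementations to a synthetic random-walk price series and to its first difference (the actual oil, gold and stock index data are not reproduced here); the test options shown are assumed, reasonable defaults.

      import numpy as np
      from statsmodels.tsa.stattools import adfuller, kpss

      rng = np.random.default_rng(6)
      price = 100 + np.cumsum(rng.normal(0, 1, 500))      # random walk: non-stationary in level

      for name, series in [("level", price), ("first difference", np.diff(price))]:
          adf_stat, adf_p = adfuller(series)[:2]          # ADF null hypothesis: unit root (non-stationary)
          kpss_stat, kpss_p = kpss(series, regression="c", nlags="auto")[:2]  # KPSS null: stationary
          print(f"{name}: ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")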

  11. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  12. Spectra: Time series power spectrum calculator

    Science.gov (United States)

    Gallardo, Tabaré

    2017-01-01

    Spectra calculates the power spectrum of a time series, equally spaced or not, based on the Spectral Correlation Coefficient (Ferraz-Mello 1981, Astron. Journal 86 (4), 619). It is very efficient for the detection of low frequencies.

  13. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  14. Combination prediction method of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    ZHAO DongHua; RUAN Jiong; CAI ZhiJie

    2007-01-01

    In the present paper, we propose an approach for combination prediction of chaotic time series. The method is based on the adding-weight one-rank local-region method for chaotic time series. It allows us to define an interval containing a future value with a given probability, which is obtained by studying the prediction error distribution. Its effectiveness is shown with data generated by the logistic map.
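
    The flavour of such local-region prediction can be conveyed by a simplified distance-weighted nearest-neighbour predictor in delay-embedding space, sketched below on logistic-map data. This is not the paper's exact adding-weight one-rank scheme, nor its prediction-interval analysis; the embedding dimension, delay, neighbour count and weighting are assumptions.

      import numpy as np

      def delay_embed(x, dim, tau):
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

      def local_predict(x, dim=3, tau=1, k=10):
          emb = delay_embed(x, dim, tau)
          query = emb[-1]                                   # current state
          lib, targets = emb[:-1], x[(dim - 1) * tau + 1:]  # past states with a known next value
          d = np.linalg.norm(lib - query, axis=1)
          idx = np.argsort(d)[:k]                           # k nearest neighbours
          w = np.exp(-d[idx] / (d[idx].min() + 1e-12))      # larger weight for closer states
          return np.sum(w * targets[idx]) / w.sum()

      # toy data: logistic map in its chaotic regime
      x = np.empty(2000); x[0] = 0.2
      for i in range(1, len(x)):
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

      print("predicted next value:", local_predict(x),
            " true generator value:", 4 * x[-1] * (1 - x[-1]))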

  15. Pseudotime estimation: deconfounding single cell time series

    OpenAIRE

    John E Reid; Wernisch, Lorenz

    2016-01-01

    Motivation: Repeated cross-sectional time series single cell data confound several sources of variation, with contributions from measurement noise, stochastic cell-to-cell variation and cell progression at different rates. Time series from single cell assays are particularly susceptible to confounding as the measurements are not averaged over populations of cells. When several genes are assayed in parallel these effects can be estimated and corrected for under certain smoothness assumptions o...

  16. FATS: Feature Analysis for Time Series

    CERN Document Server

    Nun, Isadora; Sim, Brandon; Zhu, Ming; Dave, Rahul; Castro, Nicolas; Pichara, Karim

    2015-01-01

    In this paper, we present the FATS (Feature Analysis for Time Series) library. FATS is a Python library which facilitates and standardizes feature extraction for time series data. In particular, we focus on one application: feature extraction for astronomical light curve data, although the library is generalizable for other uses. We detail the methods and features implemented for light curve analysis, and present examples for its usage.

  17. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  18. Time Series Forecasting with Missing Values

    Directory of Open Access Journals (Sweden)

    Shin-Fu Wu

    2015-11-01

    Time series prediction has become popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. When dealing with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting missing values may cause temporal discontinuity, while imputation methods may alter the original time series. In this study, we propose a novel forecasting method based on the least squares support vector machine (LSSVM). We employ input patterns that carry temporal information in the form of a local time index (LTI). The time series data together with the local time indexes are fed to the LSSVM for forecasting without imputation. We compare the forecasting performance of our method with that of several imputation methods. Experimental results show that the proposed method is promising and worth further investigation.
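
    The input-pattern construction is the interesting part and is easy to sketch. Below, scikit-learn's SVR stands in for the LSSVM used in the paper: each pattern consists of the most recent observed values plus their local time indexes (distances to the prediction time), so gaps are passed to the model instead of being imputed. The synthetic series, gap rate, lag count and SVR settings are assumptions of this illustration.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(7)
      t_full = np.arange(600)
      y_full = np.sin(2 * np.pi * t_full / 50) + 0.1 * rng.normal(size=600)
      observed = rng.random(600) > 0.2                   # ~20% of points missing at random
      t_obs, y_obs = t_full[observed], y_full[observed]

      p = 5                                              # number of past observations per pattern (assumed)
      X, y = [], []
      for i in range(p, len(y_obs)):
          lags = y_obs[i - p:i]
          lti = t_obs[i] - t_obs[i - p:i]                # local time index: distance of each lag to "now"
          X.append(np.concatenate([lags, lti]))
          y.append(y_obs[i])
      X, y = np.array(X), np.array(y)

      model = SVR(kernel="rbf", C=10.0).fit(X[:-50], y[:-50])
      print("test RMSE:", np.sqrt(np.mean((model.predict(X[-50:]) - y[-50:]) ** 2)))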

  19. Feature Matching in Time Series Modelling

    CERN Document Server

    Xia, Yingcun

    2011-01-01

    Using a time series model to mimic an observed time series has a long history. However, with regard to this objective, conventional estimation methods for discrete-time dynamical models are frequently found to be wanting. In the absence of a true model, we prefer an alternative approach to conventional model fitting that typically involves one-step-ahead prediction errors. Our primary aim is to match the joint probability distribution of the observable time series, including long-term features of the dynamics that underpin the data, such as cycles, long memory and others, rather than short-term prediction. For want of a better name, we call this specific aim "feature matching". The challenges of model mis-specification, measurement errors and the scarcity of data are forever present in real time series modelling. In this paper, by synthesizing earlier attempts into an extended-likelihood, we develop a systematic approach to empirical time series analysis to address these challenges and to aim at achieving...

  20. Density dependence and climate effects in Rocky Mountain elk: an application of regression with instrumental variables for population time series with sampling error.

    Science.gov (United States)

    Creel, Scott; Creel, Michael

    2009-11-01

    1. Sampling error in annual estimates of population size creates two widely recognized problems for the analysis of population growth. First, if sampling error is mistakenly treated as process error, one obtains inflated estimates of the variation in true population trajectories (Staples, Taper & Dennis 2004). Second, treating sampling error as process error is thought to overestimate the importance of density dependence in population growth (Viljugrein et al. 2005; Dennis et al. 2006). 2. In ecology, state-space models are used to account for sampling error when estimating the effects of density and other variables on population growth (Staples et al. 2004; Dennis et al. 2006). In econometrics, regression with instrumental variables is a well-established method that addresses the problem of correlation between regressors and the error term, but requires fewer assumptions than state-space models (Davidson & MacKinnon 1993; Cameron & Trivedi 2005). 3. We used instrumental variables to account for sampling error and fit a generalized linear model to 472 annual observations of population size for 35 Elk Management Units in Montana, from 1928 to 2004. We compared this model with state-space models fit with the likelihood function of Dennis et al. (2006). We discuss the general advantages and disadvantages of each method. Briefly, regression with instrumental variables is valid with fewer distributional assumptions, but state-space models are more efficient when their distributional assumptions are met. 4. Both methods found that population growth was negatively related to population density and winter snow accumulation. Summer rainfall and wolf (Canis lupus) presence had much weaker effects on elk (Cervus elaphus) dynamics [though limitation by wolves is strong in some elk populations with well-established wolf populations (Creel et al. 2007; Creel & Christianson 2008)]. 5. Coupled with predictions for Montana from global and regional climate models, our results
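
    A generic two-stage least squares calculation is easy to write down with numpy, as sketched below on synthetic count data measured with sampling error; this illustrates the estimator only and is not a reproduction of the elk analysis. The data-generating model, the choice of the earlier count as instrument and the coefficient values are all assumptions.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 500
      snow = rng.normal(size=n)                      # exogenous covariate (e.g. winter snow)
      true_n = np.empty(n); true_n[0] = 0.0
      for t in range(1, n):                          # density dependence on the log scale
          true_n[t] = 2.0 + 0.6 * true_n[t - 1] - 0.3 * snow[t] + 0.2 * rng.normal()
      obs_n = true_n + 0.3 * rng.normal(size=n)      # counts observed with sampling error

      growth = obs_n[2:] - obs_n[1:-1]               # observed growth rate
      X = np.column_stack([np.ones(n - 2), obs_n[1:-1], snow[2:]])   # regressors (density is error-ridden)
      Z = np.column_stack([np.ones(n - 2), obs_n[:-2], snow[2:]])    # instruments: the earlier count

      # stage 1: project X on Z;  stage 2: regress growth on the fitted regressors
      X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
      beta_iv = np.linalg.lstsq(X_hat, growth, rcond=None)[0]
      beta_ols = np.linalg.lstsq(X, growth, rcond=None)[0]
      print("OLS density-dependence estimate:", round(beta_ols[1], 3))
      print("IV  density-dependence estimate:", round(beta_iv[1], 3))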

  1. Predicting road accidents: Structural time series approach

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-07-01

    In this paper, a model for the occurrence of road accidents in Malaysia between 1970 and 2010 is developed, and the number of road accidents is predicted with this model using the structural time series approach. The models are developed by a stepwise method and the residuals of each step are analyzed. The accuracy of the model is assessed using the mean absolute percentage error (MAPE), and the best model is chosen based on the smallest Akaike information criterion (AIC) value. The structural time series approach finds that the local linear trend model is the best model to represent the road accidents; it allows the level and slope components to vary over time. In addition, this approach provides useful information for improving conventional time series methods.
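
    The local linear trend model is available off the shelf, for example in statsmodels, as sketched below on synthetic annual counts standing in for the Malaysian series. The model specification string is standard statsmodels usage, while the data, the AIC/MAPE summary and the five-step forecast horizon are illustrative assumptions.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(9)
      years = np.arange(1970, 2011)
      accidents = 50_000 + 7_000 * (years - 1970) + rng.normal(0, 8_000, len(years))  # synthetic counts

      # local linear trend: both the level and the slope are allowed to vary over time
      model = sm.tsa.UnobservedComponents(accidents, level="local linear trend")
      res = model.fit(disp=False)

      fitted = res.fittedvalues
      mape = np.mean(np.abs((accidents[2:] - fitted[2:]) / accidents[2:])) * 100  # skip diffuse start-up
      print(f"AIC = {res.aic:.1f}, in-sample MAPE = {mape:.1f}%")
      print(res.forecast(steps=5))                     # five-year-ahead forecast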

  2. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang;

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regards to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume...... are either costly or require continuous maintenance. In this paper we propose an approach for approximate OLAP querying of time series that offers constant latency and is maintenance-free. To achieve this, we identify similarities between aggregation cuboids and propose algorithms that eliminate...

  3. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  4. Testing the sensitivity of stable carbon isotopes of sub-fossil Sphagnum cellulose to past climate variability: a two millennia high resolution stable carbon isotope time series from the peat deposit "Dürres Maar", Germany

    Science.gov (United States)

    Moschen, Robert; Kühl, Norbert; Peters, Sabrina; Vos, Heinz; Lücke, Andreas

    2010-05-01

    Peat deposits are terrestrial archives of environmental changes and climate dynamics over time. They are widely distributed and cover a large part of the earth's land surface, often within human habitat, and thus form an excellent basis for evaluating ecosystem and climate dynamics by multiple geochemical and biological methods. Records of the stable carbon isotope composition of cellulose separately extracted from selected Sphagnum plant components (δ13CSphagnum) from the kettle-hole type peat deposit of 'Dürres Maar' are presented. Manually separated Sphagnum stems, branches and the small leaves covering Sphagnum branches were used for cellulose extraction and subsequent isotope measurements, because intra-plant δ13CSphagnum variability between different physical components of individual modern plants has been described (Loader et al. 2007). We observed the same isotopic offset between single plant components of sub-fossil Sphagnum, which is statistically highly significant and observable down-core (Moschen et al. 2009). Using the size fraction of 355-630 μm, which almost exclusively consists of single Sphagnum leaves, allows environmental and climate signals to be derived from a plant response to external controls, presumably including temperature and relative humidity. Because down-core changes in the ratio of different plant components in the peat profile seem probable, erroneous interpretations of isotope records are likely if no differentiation into single Sphagnum plant components is possible. A high resolution time series of δ13CSphagnum is presented covering the last two millennia, tracing decadal to sub-decadal past environmental and climate dynamics. The thickness of the water film surrounding the chloroplasts of Sphagnum plants has been suggested as the most important factor influencing δ13CSphagnum. This points to bog surface wetness, which is primarily driven by precipitation and evaporation temperature, as the major control of δ13

  5. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble, which indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series datasets. In terms of forecasting accuracy, our experimental results clearly reveal that LEA is better than other ensemble and non-ensemble methods.

  6. Improving the prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    李克平; 高自友; 陈天仑

    2003-01-01

    One of the features of deterministic chaos is sensitivity to initial conditions. This feature limits the prediction horizon of many chaotic systems. In this paper, we propose a new prediction technique for chaotic time series. In our method, neighbouring points of the predicted point for which the corresponding local Lyapunov exponent is particularly large are discarded when estimating the local dynamics, and thus the error accumulated by the prediction algorithm is reduced. The model is tested on the convection amplitude of the Lorenz system. The simulation results indicate that the technique can improve the prediction of chaotic time series.

  7. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  8. Dynamical networks reconstructed from time series

    CERN Document Server

    Levnajić, Zoran

    2012-01-01

    A novel method of reconstructing dynamical networks from empirically measured time series is proposed. By statistically examining the correlations between the motions displayed by network nodes, we derive a simple equation that directly yields the adjacency matrix, assuming the intra-network interaction functions to be known. We illustrate the method's implementation on a simple example and discuss the dependence of the reconstruction precision on the properties of the time series. Our method is applicable to any network, allowing the reconstruction precision to be maximized and errors to be estimated.

  9. Integrating SPOT-VEGETATION 13-yr time series and land-surface modelling to forecast the terrestrial carbon dynamics in a changing climate - The VEGECLIM project: achievements and lessons learned

    Science.gov (United States)

    Defourny, Pierre; Verbeeck, Hans; Moreau, Inès; De Weirdt, Marjolein; Verhegghen, Astrid; Kibambe-Lubamba, Jean-Paul; Jungers, Quentin; Maignan, Fabienne; Najdovski, Nicolas; Poulter, Benjamin; MacBean, Natasha; Peylin, Philippe

    2014-05-01

    Vegetation is a major carbon sink and is as such a key component of the international response to climate change caused by the build-up of greenhouse gases in the atmosphere. However, anthropogenic disturbances like deforestation are the primary mechanism that changes ecosystems from carbon sinks to sources, and are hardly included in current carbon modelling approaches. Moreover, in tropical regions the seasonal/interannual variability of carbon fluxes is still uncertain, and weak or even no seasonality is taken into account in global vegetation models. In the context of climate change and mitigation policies like "Reducing Emissions from Deforestation and Forest Degradation in Developing Countries" (REDD), it is particularly important to be able to quantify and forecast the vegetation dynamics and carbon fluxes in these regions. The overall objective of the VEGECLIM project is to increase our knowledge of the terrestrial carbon cycle in tropical regions and to improve the forecast of the vegetation dynamics and carbon stocks and fluxes under different climate-change and deforestation scenarios. Such an approach aims to determine whether the African terrestrial carbon balance will remain a net sink or could become a carbon source by the end of the century, according to different climate-change and deforestation scenarios. The research strategy is to integrate the information from the land surface characterizations obtained from 13 years of consistent SPOT-VEGETATION time series (land cover, vegetation phenology through vegetation indices such as the Enhanced Vegetation Index (EVI)) as well as in-situ carbon flux data into the process-based ORCHIDEE global vegetation model, capable of simulating vegetation dynamics and carbon balance. A key challenge of this project was to bridge the gap between the land cover and the land surface model teams. Several improvements to the ORCHIDEE model have been realized, such as new seasonal leaf dynamics for tropical evergreen

  10. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  11. Multifractal Analysis of Polyalanines Time Series

    CERN Document Server

    Figueirêdo, P H; Moret, M A; Coutinho, Sérgio; 10.1016/j.physa.2009.11.045

    2010-01-01

    Multifractal properties of the energy time series of short $\alpha$-helix structures, specifically from a polyalanine family, are investigated through the MF-DFA technique (multifractal detrended fluctuation analysis). Estimates for the generalized Hurst exponent $h(q)$ and its associated multifractal exponents $\tau(q)$ are obtained for several series generated by numerical simulations of molecular dynamics in different systems from distinct initial conformations. All simulations were performed using the GROMOS force field, implemented in the program THOR. The main results show that all series exhibit multifractal behavior depending on the number of residues and the temperature. Moreover, the multifractal spectra reveal important aspects of the time evolution of the system and suggest that the nucleation process of the secondary structures during visits to the energy hyper-surface is an essential feature of the folding process.
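
    For orientation, a bare-bones MF-DFA with first-order detrending is sketched below on white noise, where $h(q)$ should come out near 0.5 for all q. The energy series, force field and temperature dependence of the paper are not reproduced, and the q values and scales are assumptions.

      import numpy as np

      def mfdfa_h(x, q_values, scales):
          profile = np.cumsum(x - np.mean(x))                     # integrated (profile) series
          h = {}
          for q in q_values:
              log_f = []
              for s in scales:
                  n_seg = len(profile) // s
                  f2 = []
                  for v in range(n_seg):
                      seg = profile[v * s:(v + 1) * s]
                      t = np.arange(s)
                      trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                      f2.append(np.mean((seg - trend) ** 2))
                  f2 = np.array(f2)
                  if q == 0:
                      log_f.append(0.5 * np.mean(np.log(f2)))        # limiting case q -> 0
                  else:
                      log_f.append(np.log(np.mean(f2 ** (q / 2))) / q)
              h[q] = np.polyfit(np.log(scales), log_f, 1)[0]         # slope = generalized Hurst exponent
          return h

      rng = np.random.default_rng(10)
      x = rng.normal(size=8192)                                      # white noise: h(q) near 0.5
      print(mfdfa_h(x, q_values=[-2, 0, 2], scales=[16, 32, 64, 128, 256]))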

  12. Time Series Rule Discovery: Tough, not Meaningless

    NARCIS (Netherlands)

    Struzik, Z.R.

    2003-01-01

    `Model free' rule discovery from data has recently been subject to considerable criticism, which has cast a shadow over the emerging discipline of time series data mining. However, other than in data mining, rule discovery has long been the subject of research in statistical physics of complex pheno

  13. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    This paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  14. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  15. Nonlinear time series modelling: an introduction

    OpenAIRE

    Simon M. Potter

    1999-01-01

    Recent developments in nonlinear time series modelling are reviewed. Three main types of nonlinear models are discussed: Markov Switching, Threshold Autoregression and Smooth Transition Autoregression. Classical and Bayesian estimation techniques are described for each model. Parametric tests for nonlinearity are reviewed with examples from the three types of models. Finally, forecasting and impulse response analysis is developed.

  16. Common large innovations across nonlinear time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    2002-01-01

    We propose a multivariate nonlinear econometric time series model, which can be used to examine if there is common nonlinearity across economic variables. The model is a multivariate censored latent effects autoregression. The key feature of this model is that nonlinearity appears as sep

  17. Designer networks for time series processing

    DEFF Research Database (Denmark)

    Svarer, C; Hansen, Lars Kai; Larsen, Jan;

    1993-01-01

    The conventional tapped-delay neural net may be analyzed using statistical methods and the results of such analysis can be applied to model optimization. The authors review and extend efforts to demonstrate the power of this strategy within time series processing. They attempt to design compact...

  18. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks for effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system's behaviour based on them.

  19. Similarity estimators for irregular and age uncertain time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60–55% (in the linear case) to 53–42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time

  20. Similarity estimators for irregular and age-uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
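
    The Gaussian-kernel idea can be sketched compactly: at a given lag, every pair of observations contributes to the correlation with a weight that decays with the mismatch between their time difference and that lag. The code below is a simplified illustration in that spirit, with synthetic irregularly sampled series and an ad hoc bandwidth and normalization; it is not the gXCF implementation of Rehfeld et al.

      import numpy as np

      def gaussian_xcf(tx, x, ty, y, lag, h):
          x = (x - x.mean()) / x.std()
          y = (y - y.mean()) / y.std()
          dt = (ty[None, :] - tx[:, None]) - lag          # time mismatch of every pair at this lag
          w = np.exp(-0.5 * (dt / h) ** 2)                # Gaussian kernel weights
          return np.sum(w * np.outer(x, y)) / np.sum(w)

      rng = np.random.default_rng(11)
      tx = np.sort(rng.uniform(0, 500, 300))              # irregular sampling times of record x
      ty = np.sort(rng.uniform(0, 500, 280))              # irregular sampling times of record y
      signal = lambda t: np.sin(2 * np.pi * t / 60)
      x = signal(tx) + 0.3 * rng.normal(size=len(tx))
      y = signal(ty - 10) + 0.3 * rng.normal(size=len(ty))   # y lags x by ~10 time units

      lags = np.arange(-30, 31, 2)
      xcf = [gaussian_xcf(tx, x, ty, y, lag, h=2.0) for lag in lags]
      print("estimated coupling lag:", lags[int(np.argmax(xcf))])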

  1. Similarity estimators for irregular and age uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
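
    As a rough illustration of the Gaussian-kernel cross-correlation idea described above, the sketch below weights products of irregularly sampled observation pairs with a Gaussian kernel of their time separation minus the trial lag. The bandwidth heuristic, the synthetic series and the coupling lag of 2 are illustrative assumptions, not the exact choices of Rehfeld et al.

        import numpy as np

        def gaussian_kernel_xcf(tx, x, ty, y, lags, h=None):
            """Gaussian-kernel cross-correlation for two irregularly sampled series."""
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            if h is None:
                # heuristic bandwidth: a quarter of the mean sampling interval (an assumption)
                h = 0.25 * 0.5 * (np.mean(np.diff(tx)) + np.mean(np.diff(ty)))
            dt = ty[None, :] - tx[:, None]          # all pairwise time differences
            prod = x[:, None] * y[None, :]
            xcf = []
            for lag in lags:
                w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
                xcf.append(np.sum(w * prod) / np.sum(w))
            return np.array(xcf)

        # toy example: two irregular series, y lags x by about 2 time units
        rng = np.random.default_rng(0)
        tx = np.sort(rng.uniform(0, 100, 150))
        ty = np.sort(rng.uniform(0, 100, 150))
        x = np.sin(0.3 * tx) + 0.3 * rng.standard_normal(tx.size)
        y = np.sin(0.3 * (ty - 2.0)) + 0.3 * rng.standard_normal(ty.size)
        lags = np.linspace(-10, 10, 41)
        xcf = gaussian_kernel_xcf(tx, x, ty, y, lags)
        print("estimated coupling lag:", lags[np.argmax(xcf)])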

  2. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing application

  3. Remote Sensing Time Series Product Tool

    Science.gov (United States)

    Predos, Don; Ryan, Robert E.; Ross, Kenton W.

    2006-01-01

    The TSPT (Time Series Product Tool) software was custom-designed for NASA to rapidly create and display single-band and band-combination time series, such as NDVI (Normalized Difference Vegetation Index) images, for wide-area crop surveillance and for other time-critical applications. The TSPT, developed in MATLAB, allows users to create and display various MODIS (Moderate Resolution Imaging Spectroradiometer) or simulated VIIRS (Visible/Infrared Imager Radiometer Suite) products as single images, as time series plots at a selected location, or as temporally processed image videos. Manually creating these types of products is extremely labor intensive; however, the TSPT makes the process simple and efficient. MODIS is ideal for monitoring large crop areas because of its wide swath (2330 km), its relatively small ground sample distance (250 m), and its frequent revisit time (twice daily). Furthermore, because MODIS imagery is acquired daily, rapid changes in vegetative health can potentially be detected. The new TSPT technology provides users with the ability to temporally process high-revisit-rate satellite imagery, such as that acquired from MODIS and from its successor, the VIIRS. The TSPT features the important capability of fusing data from both MODIS instruments onboard the Terra and Aqua satellites, which drastically improves cloud statistics. With the TSPT, MODIS metadata are used to find and optionally remove bad and suspect data. Noise removal and temporal processing techniques allow users to create low-noise time series plots and image videos and to select settings and thresholds that tailor particular output products. The TSPT GUI (graphical user interface) provides an interactive environment for crafting what-if scenarios by enabling a user to repeat product generation using different settings and thresholds. The TSPT Application Programming Interface provides more fine-tuned control of product generation, allowing experienced

  4. Time Series Forecasting: A Nonlinear Dynamics Approach

    OpenAIRE

    Sello, Stefano

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun magnetic cy...

  5. Delay Differential Analysis of Time Series

    Science.gov (United States)

    Lainscsek, Claudia; Sejnowski, Terrence J.

    2015-01-01

    Nonlinear dynamical system analysis based on embedding theory has been used for modeling and prediction, but it also has applications to signal detection and classification of time series. An embedding creates a multidimensional geometrical object from a single time series. Traditionally either delay or derivative embeddings have been used. The delay embedding is composed of delayed versions of the signal, and the derivative embedding is composed of successive derivatives of the signal. The delay embedding has been extended to nonuniform embeddings to take multiple timescales into account. Both embeddings provide information on the underlying dynamical system without having direct access to all the system variables. Delay differential analysis is based on functional embeddings, a combination of the derivative embedding with nonuniform delay embeddings. Small delay differential equation (DDE) models that best represent relevant dynamic features of time series data are selected from a pool of candidate models for detection or classification. We show that the properties of DDEs support spectral analysis in the time domain where nonlinear correlation functions are used to detect frequencies, frequency and phase couplings, and bispectra. These can be efficiently computed with short time windows and are robust to noise. For frequency analysis, this framework is a multivariate extension of discrete Fourier transform (DFT), and for higher-order spectra, it is a linear and multivariate alternative to multidimensional fast Fourier transform of multidimensional correlations. This method can be applied to short or sparse time series and can be extended to cross-trial and cross-channel spectra if multiple short data segments of the same experiment are available. Together, this time-domain toolbox provides higher temporal resolution, increased frequency and phase coupling information, and it allows an easy and straightforward implementation of higher-order spectra across time

  6. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  7. Nonlinear Analysis of Physiological Time Series

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-fang; PENG Yu-hua; XUE Yu-li; HAN Min

    2007-01-01

    Heart rate variability could be explained by a low-dimensional governing mechanism. There has been increasing interest in verifying and understanding the coupling between the respiration and the heart rate. In this paper we use the nonlinear detection method to detect the nonlinear deterministic component in physiological time series using a single-variable series and a two-variable series, respectively, and use the conditional information entropy to analyze the correlation between the heart rate, the respiration and the blood oxygen concentration. The conclusions are that there is a nonlinear deterministic component in the heart rate and respiration data, and that the heart rate and the respiration are two variables originating from the same underlying dynamics.

  8. TIME SERIES FORECASTING USING NEURAL NETWORKS

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2013-05-01

    Full Text Available Recent studies have shown the classification and prediction power of neural networks. It has been demonstrated that an NN can approximate any continuous function. Neural networks have been successfully used for forecasting financial data series. The classical methods used for time series prediction, like Box-Jenkins or ARIMA, assume that there is a linear relationship between inputs and outputs. Neural networks have the advantage that they can approximate nonlinear functions. In this paper we compared the performances of different feed-forward and recurrent neural networks and training algorithms for predicting the EUR/RON and USD/RON exchange rates. We used data series with daily exchange rates from 2005 until 2013.
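
    The following sketch illustrates the general recipe behind such studies: forecasting the next value of a series from a window of lagged values with a small feed-forward network. It uses scikit-learn's MLPRegressor on a synthetic random-walk stand-in for an exchange-rate series; the lag length, network size and data are illustrative assumptions, not the configurations tested by the author.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def make_lagged(series, n_lags):
            """Build (X, y) pairs: n_lags past values predict the next value."""
            X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
            return X, series[n_lags:]

        # synthetic stand-in for a daily exchange-rate series (random walk)
        rng = np.random.default_rng(1)
        rate = 4.0 + np.cumsum(0.01 * rng.standard_normal(2000))

        X, y = make_lagged(rate, n_lags=5)
        split = int(0.8 * len(X))
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        model.fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        print("one-step-ahead RMSE: %.4f" % np.sqrt(np.mean((pred - y[split:]) ** 2)))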

  9. Estimation of vegetation cover resilience from satellite time series

    Directory of Open Access Journals (Sweden)

    T. Simoniello

    2008-07-01

    Full Text Available Resilience is a fundamental concept for understanding vegetation as a dynamic component of the climate system. It expresses the ability of ecosystems to tolerate disturbances and to recover their initial state. Recovery times are basic parameters of the vegetation's response to forcing and, therefore, are essential for describing realistic vegetation within dynamical models. Healthy vegetation tends to rapidly recover from shock and to persist in growth and expansion. On the contrary, climatic and anthropic stress can reduce resilience thus favouring persistent decrease in vegetation activity.

    In order to characterize resilience, we analyzed the time series 1982–2003 of 8 km GIMMS AVHRR-NDVI maps of the Italian territory. Persistence probability of negative and positive trends was estimated according to the vegetation cover class, altitude, and climate. Generally, mean recovery times from negative trends were shorter than those estimated for positive trends, as expected for vegetation of healthy status. Some signatures of inefficient resilience were found in high-level mountainous areas and in the Mediterranean sub-tropical ones. This analysis was refined by aggregating pixels according to phenology. This multitemporal clustering synthesized information on vegetation cover, climate, and orography rather well. The consequent persistence estimations confirmed and detailed hints obtained from the previous analyses. Under the same climatic regime, different vegetation resilience levels were found. In particular, within the Mediterranean sub-tropical climate, clustering was able to identify features with different persistence levels in areas that are liable to different levels of anthropic pressure. Moreover, it was capable of enhancing reduced vegetation resilience also in the southern areas under Warm Temperate sub-continental climate. The general consistency of the obtained results showed that, with the help of suited analysis

  10. A comprehensive characterization of recurrences in time series

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    Study of recurrences in earthquakes, climate, financial time-series, etc. is crucial to better forecast disasters and limit their consequences. However, almost all the previous phenomenological studies involved only a long-ranged autocorrelation function, or disregarded the multi-scaling properties induced by potential higher order dependencies. Consequently, they missed the fact that non-linear dependences do impact both the statistics and dynamics of recurrence times, and that scaling arguments for the unconditional distribution may not be applicable. We argue that copulas are the correct model-free framework to study non-linear dependencies in time series and related concepts like recurrences. Fitting and/or simulating the intertemporal distribution of recurrence intervals is very much system specific, and cannot actually benefit from universal features, in contrast to previous claims. This has important implications in epilepsy prognosis and financial risk management applications.

  11. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...

  12. Visibility graphlet approach to chaotic time series.

    Science.gov (United States)

    Mutua, Stephen; Gu, Changgui; Yang, Huijie

    2016-05-01

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
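
    A minimal sketch of the basic natural visibility construction that the graphlet approach builds on is given below; it maps a series to a graph in which two samples are linked when no intermediate sample blocks the straight line between them. The graphlet tracking of reconstructed local states described above is not reproduced here.

        import numpy as np

        def natural_visibility_graph(y):
            """Adjacency matrix of the natural visibility graph of a series y.

            Samples i and j are linked if every intermediate sample lies strictly
            below the straight line connecting (i, y[i]) and (j, y[j]).
            """
            n = len(y)
            A = np.zeros((n, n), dtype=int)
            for i in range(n):
                for j in range(i + 1, n):
                    ks = np.arange(i + 1, j)
                    line = y[j] + (y[i] - y[j]) * (j - ks) / (j - i)
                    if ks.size == 0 or np.all(y[ks] < line):
                        A[i, j] = A[j, i] = 1
            return A

        # example series: the logistic map in its chaotic regime (r = 4)
        x = np.empty(300)
        x[0] = 0.4
        for t in range(299):
            x[t + 1] = 4.0 * x[t] * (1.0 - x[t])
        A = natural_visibility_graph(x)
        print("mean degree of the visibility graph:", A.sum(axis=1).mean())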

  13. Univariate time series forecasting algorithm validation

    Science.gov (United States)

    Ismail, Suzilah; Zakaria, Rohaiza; Muda, Tuan Zalizam Tuan

    2014-12-01

    Forecasting is a complex process which requires expert tacit knowledge to produce accurate forecast values. This complexity contributes to the gap between end users and experts. Automating the process with an algorithm can act as a bridge between them. An algorithm is a well-defined rule for solving a problem. In this study a univariate time series forecasting algorithm was developed in JAVA and validated using SPSS and Excel. Two sets of simulated data (yearly and non-yearly), several univariate forecasting techniques (i.e. Moving Average, Decomposition, Exponential Smoothing, Time Series Regressions and ARIMA) and recent forecasting practices (such as data partitioning, several error measures and recursive evaluation) were employed. The results of the algorithm tally with the results of SPSS and Excel. The algorithm will benefit not just forecasters but also end users who lack in-depth knowledge of the forecasting process.
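
    As a small illustration of the validation workflow described above (data partition, a candidate technique, an error measure), the sketch below runs simple exponential smoothing over a grid of smoothing constants and scores one-step-ahead forecasts on a holdout partition with MAPE. The synthetic series and the 80/20 split are assumptions; the study's own algorithm was implemented in JAVA and covers several further techniques.

        import numpy as np

        def simple_exponential_smoothing(y, alpha):
            """One-step-ahead forecasts f[t] = alpha*y[t-1] + (1-alpha)*f[t-1]."""
            f = np.empty(len(y))
            f[0] = y[0]                       # initialise with the first observation
            for t in range(1, len(y)):
                f[t] = alpha * y[t - 1] + (1 - alpha) * f[t - 1]
            return f

        def mape(actual, forecast):
            """Mean absolute percentage error, in percent."""
            return 100.0 * np.mean(np.abs((actual - forecast) / actual))

        # toy validation run: choose alpha on a holdout partition (last 20%)
        rng = np.random.default_rng(2)
        y = 100 + np.cumsum(rng.standard_normal(200))
        split = int(0.8 * len(y))
        scores = [(mape(y[split:], simple_exponential_smoothing(y, a)[split:]), a)
                  for a in np.arange(0.1, 1.0, 0.1)]
        best_mape, best_alpha = min(scores)
        print("best alpha %.1f with holdout MAPE %.2f%%" % (best_alpha, best_mape))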

  14. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, C; Toft, P; Rostrup, E

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate whether sets of voxels are activated in a similar way or in different ways. Typically, delays between two activated signals are not identified. In this article, we use clustering methods to detect similarities in activation between voxels. We employ a novel metric that measures the similarity between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.

  15. Revisiting algorithms for generating surrogate time series

    CERN Document Server

    Raeth, C; Papadakis, I E; Brinkmann, W

    2011-01-01

    The method of surrogates is one of the key concepts of nonlinear data analysis. Here, we demonstrate that commonly used algorithms for generating surrogates often fail to generate truly linear time series. Rather, they create surrogate realizations with Fourier phase correlations leading to non-detections of nonlinearities. We argue that reliable surrogates can only be generated, if one tests separately for static and dynamic nonlinearities.

  16. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    genetic algorithms can generate a sequence of events to maximize some functions or the likelihood to achieve the assumed goals. With reference...Reinforcement learning is not the same as relational time-series learning mainly because its main focus is to learn a set of policies to maximize the...scope blending, and has been applied to machine poetry generation [48] and the generation of animation characters [49]. Tan and Kowk [50] applied the

  17. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it will be demonstrated how various time series problems could be met using Proc Varmax. The procedure is rather new and hence new features like cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box & Jenkins is performed in a more modern way using the computer resources which are now available...

  18. Vegetation Dynamics of NW Mexico using MODIS time series data

    Science.gov (United States)

    Valdes, M.; Bonifaz, R.; Pelaez, G.; Leyva Contreras, A.

    2010-12-01

    Northwestern Mexico is an area subjected to a combination of marine and continental climatic influences which produce highly variable vegetation dynamics over time. Using Moderate Resolution Imaging Spectroradiometer (MODIS) vegetation indices data (NDVI and EVI) from 2001 to 2008, mean and standard deviation image values of the time series were calculated. Using these data, annual vegetation dynamics was characterized based on the different values for the different vegetation types. Annual mean values were compared and interannual variations or anomalies were analyzed by calculating departures from the mean. An anomaly was considered if the value was over or under two standard deviations. Using this procedure it was possible to determine spatio-temporal patterns over the study area and relate them to climatic conditions.
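
    The departure-from-the-mean rule described above is easy to sketch: a pixel/date is flagged when it deviates from the pixel's multi-year mean by more than two standard deviations. The array shapes and the synthetic NDVI stack below are illustrative assumptions.

        import numpy as np

        def ndvi_anomalies(ndvi_stack):
            """Flag anomalies in a (time, rows, cols) NDVI stack.

            A pixel/date is anomalous when its departure from the pixel's
            multi-year mean exceeds two standard deviations.
            """
            mean = ndvi_stack.mean(axis=0)
            std = ndvi_stack.std(axis=0)
            departures = ndvi_stack - mean            # broadcasts over the time axis
            return np.abs(departures) > 2.0 * std

        # small synthetic example: 8 annual 10x10 NDVI composites
        rng = np.random.default_rng(3)
        stack = 0.5 + 0.05 * rng.standard_normal((8, 10, 10))
        stack[5, 2, 2] = 0.1                          # inject a drought-like drop
        flags = ndvi_anomalies(stack)
        print("first flagged (time, row, col) triples:", np.argwhere(flags)[:5])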

  19. Mapping Brazilian savanna vegetation gradients with Landsat time series

    Science.gov (United States)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, whereat inaccuracies occurred especially between similar classes and data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important

  20. Argos: An Optimized Time-Series Photometer

    Indian Academy of Sciences (India)

    Anjum S. Mukadam; R. E. Nather

    2005-06-01

    We designed a prime focus CCD photometer, Argos, optimized for high speed time-series measurements of blue variables (Nather & Mukadam 2004) for the 2.1 m telescope at McDonald Observatory. Lack of any intervening optics between the primary mirror and the CCD makes the instrument highly efficient. We measure an improvement in sensitivity by a factor of nine over the 3-channel PMT photometers used on the same telescope and for the same exposure time. The CCD frame transfer operation triggered by GPS-synchronized pulses serves as an electronic shutter for the photometer. This minimizes the dead time between exposures, but more importantly, allows a precise control of the start and duration of the exposure. We expect the uncertainty in our timing to be less than 100 μs.

  1. Finding recurrence networks' threshold adaptively for a specific time series

    Science.gov (United States)

    Eroglu, D.; Marwan, N.; Prasad, S.; Kurths, J.

    2014-11-01

    Recurrence-plot-based recurrence networks are an approach used to analyze time series using complex network theory. In both approaches, recurrence plots and recurrence networks, a threshold to identify recurrent states is required. The selection of the threshold is important in order to avoid bias of the recurrence network results. In this paper, we propose a novel method to choose a recurrence threshold adaptively. We show a comparison between the constant threshold and adaptive threshold cases to study period-chaos and even period-period transitions in the dynamics of a prototypical model system. This novel method is then used to identify climate transitions from a lake sediment record.
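
    For readers unfamiliar with recurrence thresholds, the sketch below builds a binary recurrence matrix and fixes the threshold from a distance quantile so that a target recurrence rate is reached. This is a common, simpler heuristic shown only for orientation; it is not the adaptive criterion proposed in the paper.

        import numpy as np

        def recurrence_matrix(x, threshold):
            """Binary recurrence matrix: R[i, j] = 1 if |x[i] - x[j]| <= threshold."""
            dist = np.abs(x[:, None] - x[None, :])
            return (dist <= threshold).astype(int)

        # fix the threshold so that roughly 5% of all state pairs are recurrent,
        # a simple fixed-recurrence-rate heuristic (not the paper's adaptive rule)
        rng = np.random.default_rng(4)
        x = np.sin(np.linspace(0, 20 * np.pi, 500)) + 0.1 * rng.standard_normal(500)
        dist = np.abs(x[:, None] - x[None, :])
        threshold = np.quantile(dist, 0.05)
        R = recurrence_matrix(x, threshold)
        print("recurrence rate: %.3f" % R.mean())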

  2. Directed networks with underlying time structures from multivariate time series

    CERN Document Server

    Tanizawa, Toshihiro; Taya, Fumihiko

    2014-01-01

    In this paper we propose a method of constructing directed networks of time-dependent phenomena from multivariate time series. As the construction method is based on the linear model, the network fully reflects dynamical features of the system such as time structures of periodicities. Furthermore, this method can construct networks even if these time series show no similarity: situations in which common methods fail. We explicitly introduce a case where common methods do not work. This fact indicates the importance of constructing networks from a dynamical perspective when we consider time-dependent phenomena. We apply the method to multichannel electroencephalography (EEG) data and the result reveals underlying interdependency among the components in the brain system.

  3. Fractal fluctuations in cardiac time series

    Science.gov (United States)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systematic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.
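
    A minimal sketch of the aggregation procedure behind the relative dispersion analysis is given below: the data are averaged over blocks of increasing size and the ratio of standard deviation to mean is tracked, whose log-log slope gives an exponent H - 1. The uncorrelated synthetic series is a stand-in (it yields H near 0.5); real beat-to-beat data are reported to show long-time memory.

        import numpy as np

        def relative_dispersion(x, block_sizes):
            """Std/mean of block-aggregated data for a range of block sizes."""
            out = []
            for m in block_sizes:
                n = (len(x) // m) * m
                blocks = x[:n].reshape(-1, m).mean(axis=1)
                out.append(blocks.std() / blocks.mean())
            return np.array(out)

        # synthetic positive series standing in for beat-to-beat intervals;
        # uncorrelated noise gives H near 0.5, long-memory data would give more
        rng = np.random.default_rng(5)
        rr = 0.8 + 0.05 * rng.standard_normal(4096)
        sizes = np.array([1, 2, 4, 8, 16, 32, 64])
        rd = relative_dispersion(rr, sizes)
        slope = np.polyfit(np.log(sizes), np.log(rd), 1)[0]   # slope = H - 1
        print("estimated scaling exponent H: %.2f" % (1.0 + slope))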

  4. Time Series Forecasting A Nonlinear Dynamics Approach

    CERN Document Server

    Sello, S

    1999-01-01

    The problem of prediction of a given time series is examined on the basis of recent nonlinear dynamics theories. Particular attention is devoted to forecasting the amplitude and phase of one of the most common solar activity indicators, the international monthly smoothed sunspot number. It is well known that the solar cycle is very difficult to predict due to the intrinsic complexity of the related time behaviour and to the lack of a successful quantitative theoretical model of the Sun magnetic cycle. Starting from a previous recent work, we checked the reliability and accuracy of a forecasting model based on concepts of nonlinear dynamical systems applied to experimental time series, such as the embedding phase space, the Lyapunov spectrum and chaotic behaviour. The model is based on a local hypothesis of the behaviour in the embedding space, utilizing an optimal number k of neighbour vectors to predict the future evolution of the current point with the set of characteristic parameters determined by several previous paramet...

  5. Time Series Photometry of KZ Lacertae

    Science.gov (United States)

    Joner, Michael D.

    2016-01-01

    We present BVRI time series photometry of the high amplitude delta Scuti star KZ Lacertae secured using the 0.9-meter telescope located at the Brigham Young University West Mountain Observatory. In addition to the multicolor light curves that are presented, the V data from the last six years of observations are used to plot an O-C diagram in order to determine the ephemeris and evaluate evidence for period change. We wish to thank the Brigham Young University College of Physical and Mathematical Sciences as well as the Department of Physics and Astronomy for their continued support of the research activities at the West Mountain Observatory.

  6. Time series modeling for automatic target recognition

    Science.gov (United States)

    Sokolnikov, Andre

    2012-05-01

    Time series modeling is proposed for identification of targets whose images are not clearly seen. The model building takes into account air turbulence, precipitation, fog, smoke and other factors obscuring and distorting the image. The complex of library data (of images, etc.) serving as a basis for identification provides the deterministic part of the identification process, while the partial image features, distorted parts, irrelevant pieces and absence of particular features comprise the stochastic part of the target identification. A missing-data approach is elaborated that supports the prediction process for image creation or reconstruction. The results are provided.

  7. Outlier Detection in Structural Time Series Models

    DEFF Research Database (Denmark)

    Marczak, Martyna; Proietti, Tommaso

    The general–to–specific approach to the detection of structural change, currently implemented in Autometrics via indicator saturation, has proven to be both practical and effective in the context of stationary dynamic regression models and unit–root autoregressions. By focusing on impulse– and step–indicator saturation, we investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, stochastic seasonality...

  8. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  9. Modeling noisy time series Physiological tremor

    CERN Document Server

    Timmer, J

    1998-01-01

    Empirical time series often contain observational noise. We investigate the effect of this noise on the estimated parameters of models fitted to the data. For data of physiological tremor, i.e. a small-amplitude oscillation of the outstretched hand of healthy subjects, we compare the results for a linear model that explicitly includes additional observational noise to one that ignores this noise. We discuss problems and possible solutions for nonlinear deterministic as well as nonlinear stochastic processes. In particular, we discuss the state space model, applicable for modeling noisy stochastic systems, and Bock's algorithm, capable of modeling noisy deterministic systems.

  10. Time Series Analysis of SOLSTICE Measurements

    Science.gov (United States)

    Wen, G.; Cahalan, R. F.

    2003-12-01

    Solar radiation is the major energy source for the Earth's biosphere and atmospheric and ocean circulations. Variations of solar irradiance have been a major concern of scientists both in solar physics and atmospheric sciences. A number of missions have been carried out to monitor changes in total solar irradiance (TSI) [see Fröhlich and Lean, 1998 for review] and spectral solar irradiance (SSI) [e.g., SOLSTICE on UARS and VIRGO on SOHO]. Observations over a long time period reveal the connection between variations in solar irradiance and surface magnetic fields of the Sun [Lean1997]. This connection provides a guide to scientists in modeling solar irradiances [e.g., Fontenla et al., 1999; Krivova et al., 2003]. Solar spectral observations have now been made over a relatively long time period, allowing statistical analysis. This paper focuses on predictability of solar spectral irradiance using observed SSI from SOLSTICE . Analysis of predictability is based on nonlinear dynamics using an artificial neural network in a reconstructed phase space [Abarbanel et al., 1993]. In the analysis, we first examine the average mutual information of the observed time series and a delayed time series. The time delay that gives local minimum of mutual information is chosen as the time-delay for phase space reconstruction [Fraser and Swinney, 1986]. The embedding dimension of the reconstructed phase space is determined using the false neighbors and false strands method [Kennel and Abarbanel, 2002]. Subsequently, we use a multi-layer feed-forward network with back propagation scheme [e.g., Haykin, 1994] to model the time series. The predictability of solar irradiance as a function of wavelength is considered. References Abarbanel, H. D. I., R. Brown, J. J. Sidorowich, and L. Sh. Tsimring, Rev. Mod. Phys. 65, 1331, 1993. Fraser, A. M. and H. L. Swinney, Phys. Rev. 33A, 1134, 1986. Fontenla, J., O. R. White, P. Fox, E. H. Avrett and R. L. Kurucz, The Astrophysical Journal, 518, 480
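
    The delay-embedding steps mentioned above (choosing a delay from the average mutual information and reconstructing a phase space) can be sketched as follows. The histogram-based AMI estimator, the minimum-over-scanned-lags rule and the fixed embedding dimension are simplifying assumptions; the false-neighbors test and the neural network predictor used in the study are omitted.

        import numpy as np

        def average_mutual_information(x, max_lag, bins=16):
            """Histogram estimate of I(x_t; x_{t+lag}) for lag = 1..max_lag."""
            ami = []
            for lag in range(1, max_lag + 1):
                pxy, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=bins)
                pxy /= pxy.sum()
                px = pxy.sum(axis=1, keepdims=True)
                py = pxy.sum(axis=0, keepdims=True)
                nz = pxy > 0
                ami.append(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
            return np.array(ami)

        def delay_embed(x, dim, delay):
            """Rows are delay vectors [x_t, x_{t+delay}, ..., x_{t+(dim-1)*delay}]."""
            n = len(x) - (dim - 1) * delay
            return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

        # choose the delay at the AMI minimum over the scanned lags
        # (a crude stand-in for the usual first-local-minimum rule)
        rng = np.random.default_rng(6)
        x = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.05 * rng.standard_normal(4000)
        ami = average_mutual_information(x, max_lag=60)
        delay = int(np.argmin(ami)) + 1
        embedded = delay_embed(x, dim=3, delay=delay)
        print("chosen delay:", delay, "embedded shape:", embedded.shape)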

  11. An introduction to state space time series analysis.

    NARCIS (Netherlands)

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor wi

  12. Nonlinear Time Series Analysis Since 1990:Some Personal Reflections

    Institute of Scientific and Technical Information of China (English)

    Howel Tong

    2002-01-01

    I reflect upon the development of nonlinear time series analysis since 1990 by focusing on five major areas of development. These areas include the interface between nonlinear time series analysis and chaos, the nonparametric/semiparametric approach, nonlinear state space modelling, financial time series and nonlinear modelling of panels of time series.

  13. Ensemble vs. time averages in financial time series analysis

    Science.gov (United States)

    Seemann, Lars; Hua, Jia-Chen; McCauley, Joseph L.; Gunaratne, Gemunu H.

    2012-12-01

    Empirical analysis of financial time series suggests that the underlying stochastic dynamics are not only non-stationary, but also exhibit non-stationary increments. However, financial time series are commonly analyzed using the sliding interval technique that assumes stationary increments. We propose an alternative approach that is based on an ensemble over trading days. To determine the effects of time averaging techniques on analysis outcomes, we create an intraday activity model that exhibits periodic variable diffusion dynamics and we assess the model data using both ensemble and time averaging techniques. We find that ensemble averaging techniques detect the underlying dynamics correctly, whereas sliding intervals approaches fail. As many traded assets exhibit characteristic intraday volatility patterns, our work implies that ensemble averages approaches will yield new insight into the study of financial markets’ dynamics.

  14. Partial spectral analysis of hydrological time series

    Science.gov (United States)

    Jukić, D.; Denić-Jukić, V.

    2011-03-01

    Hydrological time series comprise the influences of numerous processes involved in the transfer of water in the hydrological cycle. This implies that an ambiguity with respect to the processes encoded in spectral and cross-spectral density functions exists. Previous studies have not paid adequate attention to this issue. Spectral and cross-spectral density functions represent the Fourier transforms of auto-covariance and cross-covariance functions. Using this basic property, the ambiguity is resolved by applying a novel approach based on the spectral representation of partial correlation. Mathematical background for partial spectral density, partial amplitude and partial phase functions is presented. The proposed functions yield the estimates of spectral density, amplitude and phase that are not affected by a controlling process. If an input-output relation is the subject of interest, antecedent and subsequent influences of the controlling process can be distinguished considering the input event as a referent point. The method is used for analyses of the relations between the rainfall, air temperature and relative humidity, as well as the influences of air temperature and relative humidity on the discharge from a karst spring. Time series are collected in the catchment of the Jadro Spring located in the Dinaric karst area of Croatia.

  15. Forecasting the Time Series of Sunspot Numbers

    Science.gov (United States)

    Aguirre, L. A.; Letellier, C.; Maquet, J.

    2008-05-01

    Forecasting the solar cycle is of great importance for weather prediction and environmental monitoring, and also constitutes a difficult scientific benchmark in nonlinear dynamical modeling. This paper describes the identification of a model and its use in forecasting the time series comprised of Wolf’s sunspot numbers. A key feature of this procedure is that the original time series is first transformed into a symmetrical space where the dynamics of the solar dynamo are unfolded in a better way, thus improving the model. The nonlinear model obtained is parsimonious and has both deterministic and stochastic parts. Monte Carlo simulation of the whole model produces results that are very consistent with the deterministic part of the model but allows for the determination of confidence bands. The obtained model was used to predict cycles 24 and 25, although the forecast of the latter is seen as a crude approximation, given the long prediction horizon required. As for the 24th cycle, two estimates were obtained with peaks of 65±16 and of 87±13 units of sunspot numbers. The simulated results suggest that the 24th cycle will be shorter and less active than the preceding one.

  16. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as neglecting it may lead to inaccurate model predictions. This paper derives generally applicable recommendations, no matter whether a change in persistence occurs or not. Seven different forecasting strategies based on a bias-corrected estimator are compared by means of a large-scale Monte Carlo study. The results for decreasing and increasing persistence are highly asymmetric and new to the literature. Its good predictive ability and its balanced performance among different settings strongly advocate the use of forecasting strategies based on the Bai-Perron procedure.

  17. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    We present the architecture of a “useful pattern” mining system that is capable of detecting thousands of different candlestick sequence patterns at the tick or any higher granularity levels. The system architecture is highly distributed and performs most of its highly compute-intensive aggregation calculations as complex but efficient distributed SQL queries on the relational databases that store the time-series. We present initial results from mining all frequent candlestick sequences with the characteristic property that when they occur, they signal, with an average probability of at least 60%, a 2% or higher increase (or, alternatively, decrease) in a chosen property of the stock (e.g. close-value) within a given time-window (e.g. 5 days). Initial results from a first prototype implementation of the architecture show that after training on a large set of stocks, the system is capable of finding...

  18. Learning with Latent Factors in Time Series

    CERN Document Server

    Jalali, Ali

    2011-01-01

    This paper considers the problem of learning, from samples, the dependency structure of a system of linear stochastic differential equations, when some of the variables are latent. In particular, we observe the time evolution of some variables, and never observe other variables; from this, we would like to find the dependency structure between the observed variables -- separating out the spurious interactions caused by the (marginalizing out of the) latent variables' time series. We develop a new method, based on convex optimization, to do so in the case when the number of latent variables is smaller than the number of observed ones. For the case when the dependency structure between the observed variables is sparse, we theoretically establish a high-dimensional scaling result for structure recovery. We verify our theoretical result with both synthetic and real data (from the stock market).

  19. Long Series of GNSS Integrated Precipitable Water as a Climate Change Indicator

    Directory of Open Access Journals (Sweden)

    Kruczyk Michał

    2015-12-01

    Full Text Available This paper investigates the information potential contained in the tropospheric delay product for selected International GNSS Service (IGS) stations in climatologic research. Long time series of daily averaged Integrated Precipitable Water (IPW) can serve as a climate indicator. A seasonal model of IPW change has been adjusted to the multi-year series (by the least squares method). The author applied two modes: sinusoidal and composite (two or more oscillations). Even a simple sinusoidal seasonal model (of the daily IPW value series) clearly represents the diversity of world climates. Residuals in periods from 10 up to 17 years are searched for a long-term IPW trend, a self-evident climate change indicator. Results are ambiguous: for some stations or periods IPW trends are quite clear, while in the following years (or at other stations) they are not visible. The method of fitting a linear trend to the IPW series does not considerably influence the value of the trend. The results are mostly influenced by series length, completeness and data (e.g. meteorological) quality. The longer and more homogeneous the IPW series, the better the chance to estimate the magnitude of climatologic IPW changes.
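
    The sinusoidal seasonal model plus linear trend described above amounts to an ordinary least-squares fit with constant, time, sine and cosine regressors. The sketch below fits such a model to a synthetic daily IPW-like series; the units, noise level and trend size are assumptions for illustration.

        import numpy as np

        def fit_seasonal_trend(t_years, ipw):
            """Least-squares fit of mean + linear trend + annual sinusoid.

            Model: ipw(t) = a0 + a1*t + b1*cos(2*pi*t) + b2*sin(2*pi*t),
            with t in decimal years so the sinusoid has a one-year period.
            """
            A = np.column_stack([np.ones_like(t_years), t_years,
                                 np.cos(2 * np.pi * t_years),
                                 np.sin(2 * np.pi * t_years)])
            coeffs, *_ = np.linalg.lstsq(A, ipw, rcond=None)
            return coeffs, A @ coeffs

        # synthetic daily IPW-like series over 12 years with a small positive trend
        rng = np.random.default_rng(7)
        t = np.arange(0, 12, 1 / 365.25)
        ipw = (15 + 0.05 * t + 8 * np.sin(2 * np.pi * t - 1.0)
               + 2 * rng.standard_normal(t.size))
        coeffs, model = fit_seasonal_trend(t, ipw)
        print("estimated linear trend: %.3f (IPW units per year)" % coeffs[1])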

  20. Consistent forest change maps 1981 – 2000 from the AVHRR time series. Case studies for South America and Indonesia

    NARCIS (Netherlands)

    Eberenz, J.; Herold, M.; Verbesselt, J.; Wijaya, A.; Lindquist, E.; Defourny, P.; Gibbs, H.K.; Arino, O.; Achard, F.

    2015-01-01

    This study predicts global forest cover change for the 1980s and 1990s from AVHRR time series metrics in order to show how the series of consistent land cover maps for climate modeling produced by the ESA climate change initiative land cover project can be extended back in time. A Random Forest mode

  1. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
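
    As an illustration of the forecasting-plus-residuals scheme and the MedAPE criterion described above, the sketch below fits an additive Holt-Winters model (here via statsmodels) to a synthetic daily count series with a day-of-week effect, forms residuals on a holdout period and reports the median absolute percentage error. The synthetic data and model settings are assumptions, not the study's 16 syndromic streams.

        import numpy as np
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        def medape(actual, forecast):
            """Median absolute percentage error, one of the criteria named above."""
            return 100.0 * np.median(np.abs((actual - forecast) / actual))

        # synthetic syndromic-style daily counts with seasonal and day-of-week effects
        rng = np.random.default_rng(8)
        days = np.arange(730)
        baseline = 100 + 10 * np.sin(2 * np.pi * days / 365.25)
        day_of_week = np.tile([0, -15, -10, 0, 5, 10, 15], 105)[:730]
        counts = rng.poisson(baseline + day_of_week).astype(float)

        train, test = counts[:700], counts[700:]
        fit = ExponentialSmoothing(train, trend="add", seasonal="add",
                                   seasonal_periods=7).fit()
        forecast = fit.forecast(len(test))
        residuals = test - forecast        # residuals would feed the detection algorithm
        print("holdout MedAPE: %.1f%%, residual std: %.1f"
              % (medape(test, forecast), residuals.std()))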

  2. Trend prediction of chaotic time series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Trend prediction of chaotic time series is an interesting problem in time series analysis and time series data mining (TSDM) fields [1]. TSDM-based methods can successfully characterize and predict complex, irregular, and chaotic time series. Some methods have been proposed to predict the trend of chaotic time series. To our knowledge, these methods can be classified into two categories as follows. The first category is based on the embedded space [2-3], where raw time series data is mapped to a reconstructed phase spac...

  3. Periodograms for multiband astronomical time series

    Science.gov (United States)

    Ivezic, Z.; VanderPlas, J. T.

    2016-05-01

    We summarize the multiband periodogram, a general extension of the well-known Lomb-Scargle approach for detecting periodic signals in time-domain data, developed by VanderPlas & Ivezic (2015). A Python implementation of this method is available on GitHub. The multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST), and can treat non-uniform sampling and heteroscedastic errors. The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. We use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature, and find that this method will be able to efficiently determine the correct period in the majority of LSST's bright RR Lyrae stars with as little as six months of LSST data.
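
    The single-band Lomb-Scargle periodogram that the multiband method generalizes can be computed with SciPy as sketched below; the full multiband, regularized version is the implementation on GitHub referenced above. The simulated light curve and frequency grid are assumptions for illustration.

        import numpy as np
        from scipy.signal import lombscargle

        # irregularly sampled single-band light curve with a 0.6-day period
        rng = np.random.default_rng(9)
        t = np.sort(rng.uniform(0, 180, 300))                # observation times in days
        true_period = 0.6
        mag = (15.0 + 0.3 * np.sin(2 * np.pi * t / true_period)
               + 0.05 * rng.standard_normal(t.size))

        periods = np.linspace(0.2, 1.2, 20000)
        angular_freqs = 2 * np.pi / periods
        power = lombscargle(t, mag - mag.mean(), angular_freqs)
        print("best-fit period: %.4f d (true %.1f d)"
              % (periods[np.argmax(power)], true_period))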

  4. Correcting and combining time series forecasters.

    Science.gov (United States)

    Firmino, Paulo Renato A; de Mattos Neto, Paulo S G; Ferreira, Tiago A E

    2014-02-01

    Combined forecasters have been in the vanguard of stochastic time series modeling. In this way it has been usual to suppose that each single model generates a residual or prediction error like a white noise. However, mostly because of disturbances not captured by each model, it is yet possible that such supposition is violated. The present paper introduces a two-step method for correcting and combining forecasting models. Firstly, the stochastic process underlying the bias of each predictive model is built according to a recursive ARIMA algorithm in order to achieve a white noise behavior. At each iteration of the algorithm the best ARIMA adjustment is determined according to a given information criterion (e.g. Akaike). Then, in the light of the corrected predictions, it is considered a maximum likelihood combined estimator. Applications involving single ARIMA and artificial neural networks models for Dow Jones Industrial Average Index, S&P500 Index, Google Stock Value, and Nasdaq Index series illustrate the usefulness of the proposed framework.
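
    A much-simplified sketch of the two-step idea is given below: each base forecaster's past errors are modeled with an ARIMA process to predict its next bias, and the corrected forecasts are then combined, here with inverse-variance weights as a stand-in for the maximum likelihood combination in the paper. The toy base forecasters and the fixed ARIMA order are assumptions.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        def next_bias(errors, order=(1, 0, 1)):
            """Model a forecaster's past errors with ARIMA and predict its next bias."""
            fit = ARIMA(errors, order=order).fit()
            return float(np.asarray(fit.forecast(1))[0])

        # two crude base forecasters for a toy series: last value and 5-point mean
        rng = np.random.default_rng(10)
        y = 20 + np.cumsum(0.5 * rng.standard_normal(300))
        moving = np.convolve(y, np.ones(5) / 5, mode="valid")   # moving[i] forecasts y[i + 5]

        err_naive = y[1:] - y[:-1]                # errors of the "last value" forecaster
        err_moving = y[5:] - moving[:-1]          # errors of the moving-average forecaster

        # bias-corrected one-step forecasts for the next, unseen point
        f_naive = y[-1] + next_bias(err_naive)
        f_moving = moving[-1] + next_bias(err_moving)

        # inverse-variance weights as a simple stand-in for the ML combination
        w = np.array([1.0 / err_naive.var(), 1.0 / err_moving.var()])
        w /= w.sum()
        print("combined forecast: %.3f" % (w[0] * f_naive + w[1] * f_moving))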

  5. Assemblage time series reveal biodiversity change but not systematic loss.

    Science.gov (United States)

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  6. Bringing a Global Issue Closer to Home: The OSU Climate Change Webinar Series

    Science.gov (United States)

    Jentes Banicki, J.; Dierkes, C.

    2012-12-01

    to share climate research and response projects with a diverse group of individuals. For webinar attendees, real-time and recorded webinars provide access to current research data and the ability to interact with like-minded colleagues working to mitigate and adapt to regional impacts of climate change. This presentation will provide an overview of this ongoing project, as well as the available online climate resources and webinar survey results from the series.

  7. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indicate whether sets of voxels are activated in a similar way or in different ways. Typically, delays between two activated signals are not identified. In this article, we use clustering methods to detect similarities in activation between voxels. We employ a novel metric that measures the similarity between the activation stimulus and the fMRI signal. We present two different clustering algorithms and use them to identify regions of similar activations in an fMRI experiment involving a visual stimulus.

  8. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  9. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian;

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength and even causality direction in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise injections in intermediate-to-strongly coupled systems could enable more accurate causal inferences. Given the inherent noisy nature of real-world systems, our findings enable a more accurate evaluation of CCM applicability and advance suggestions on how to overcome its weaknesses.

  10. Highly comparative, feature-based time-series classification

    CERN Document Server

    Fulcher, Ben D

    2014-01-01

    A highly comparative, feature-based approach to time series classification is introduced that uses an extensive database of algorithms to extract thousands of interpretable features from time series. These features are derived from across the scientific time-series analysis literature, and include summaries of time series in terms of their correlation structure, distribution, entropy, stationarity, scaling properties, and fits to a range of time-series models. After computing thousands of features for each time series in a training set, those that are most informative of the class structure are selected using greedy forward feature selection with a linear classifier. The resulting feature-based classifiers automatically learn the differences between classes using a reduced number of time-series properties, and circumvent the need to calculate distances between time series. Representing time series in this way results in orders of magnitude of dimensionality reduction, allowing the method to perform well on ve...
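
    The sketch below shows the skeleton of the approach on a toy problem: a handful of interpretable features (the paper uses thousands), greedy forward selection driven by cross-validated accuracy of a linear classifier, and two synthetic classes of series. Feature choices, class definitions and the scikit-learn estimators are illustrative assumptions.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        def features(x):
            """A tiny stand-in feature vector (the paper computes thousands of features)."""
            return np.array([
                x.mean(),
                x.std(),
                np.corrcoef(x[:-1], x[1:])[0, 1],              # lag-1 autocorrelation
                np.mean(np.abs(np.diff(x))),                    # roughness
                np.mean((x - x.mean()) ** 3) / x.std() ** 3,    # skewness
            ])

        def greedy_forward_selection(X, y, n_keep=2):
            """Add, one at a time, the feature that most improves CV accuracy."""
            selected, remaining = [], list(range(X.shape[1]))
            while len(selected) < n_keep:
                scores = [(cross_val_score(LogisticRegression(max_iter=1000),
                                           X[:, selected + [j]], y, cv=5).mean(), j)
                          for j in remaining]
                _, best_j = max(scores)
                selected.append(best_j)
                remaining.remove(best_j)
            return selected

        # two synthetic classes: white noise versus a smoother AR(1) process
        rng = np.random.default_rng(11)
        series, labels = [], []
        for _ in range(60):
            e = rng.standard_normal(300)
            series.append(e)
            labels.append(0)
            ar = np.zeros(300)
            for t in range(1, 300):
                ar[t] = 0.8 * ar[t - 1] + e[t]
            series.append(ar)
            labels.append(1)
        X = np.array([features(s) for s in series])
        y = np.array(labels)
        print("selected feature indices:", greedy_forward_selection(X, y))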

  11. PERIODOGRAMS FOR MULTIBAND ASTRONOMICAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    VanderPlas, Jacob T. [eScience Institute, University of Washington, Seattle, WA (United States); Ivezic, Željko [Department of Astronomy, University of Washington, Seattle, WA (United States)

    2015-10-10

    This paper introduces the multiband periodogram, a general extension of the well-known Lomb–Scargle approach for detecting periodic signals in time-domain data. In addition to advantages of the Lomb–Scargle method such as treatment of non-uniform sampling and heteroscedastic errors, the multiband periodogram significantly improves period finding for randomly sampled multiband light curves (e.g., Pan-STARRS, DES, and LSST). The light curves in each band are modeled as arbitrary truncated Fourier series, with the period and phase shared across all bands. The key aspect is the use of Tikhonov regularization which drives most of the variability into the so-called base model common to all bands, while fits for individual bands describe residuals relative to the base model and typically require lower-order Fourier series. This decrease in the effective model complexity is the main reason for improved performance. After a pedagogical development of the formalism of least-squares spectral analysis, which motivates the essential features of the multiband model, we use simulated light curves and randomly subsampled SDSS Stripe 82 data to demonstrate the superiority of this method compared to other methods from the literature and find that this method will be able to efficiently determine the correct period in the majority of LSST’s bright RR Lyrae stars with as little as six months of LSST data, a vast improvement over the years of data reported to be required by previous studies. A Python implementation of this method, along with code to fully reproduce the results reported here, is available on GitHub.

  12. Climate Forcing Datasets for Agricultural Modeling: Merged Products for Gap-Filling and Historical Climate Series Estimation

    Science.gov (United States)

    Ruane, Alex C.; Goldberg, Richard; Chryssanthacopoulos, James

    2014-01-01

    The AgMERRA and AgCFSR climate forcing datasets provide daily, high-resolution, continuous, meteorological series over the 1980-2010 period designed for applications examining the agricultural impacts of climate variability and climate change. These datasets combine daily resolution data from retrospective analyses (the Modern-Era Retrospective Analysis for Research and Applications, MERRA, and the Climate Forecast System Reanalysis, CFSR) with in situ and remotely-sensed observational datasets for temperature, precipitation, and solar radiation, leading to substantial reductions in bias in comparison to a network of 2324 agricultural-region stations from the Hadley Integrated Surface Dataset (HadISD). Results compare favorably against the original reanalyses as well as the leading climate forcing datasets (Princeton, WFD, WFD-EI, and GRASP), and AgMERRA distinguishes itself with substantially improved representation of daily precipitation distributions and extreme events owing to its use of the MERRA-Land dataset. These datasets also peg relative humidity to the maximum temperature time of day, allowing for more accurate representation of the diurnal cycle of near-surface moisture in agricultural models. AgMERRA and AgCFSR enable a number of ongoing investigations in the Agricultural Model Intercomparison and Improvement Project (AgMIP) and related research networks, and may be used to fill gaps in historical observations as well as a basis for the generation of future climate scenarios.

  13. Timing calibration and spectral cleaning of LOFAR time series data

    Science.gov (United States)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-05-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw time series data for each receiver. The spectral cleaning has a calculated optimal sensitivity corresponding to a power signal-to-noise ratio of 0.08 (or -11 dB) in a spectral window of 25 kHz, for 2 ms of data in 48 antennas. This is well sufficient for our application. Timing calibration across individual antenna pairs has been performed at 0.4 ns precision; for calibration of signal clocks across stations of 48 antennas the precision is 0.1 ns. Monitoring differences in timing calibration per antenna pair over the course of the period 2011 to 2015 shows a precision of 0.08 ns, which is useful for monitoring and correcting drifts in signal path synchronizations. A cross-check method for timing calibration is presented, using a pulse transmitter carried by a drone flying over the array. Timing precision is similar, 0.3 ns, but is limited by transmitter position measurements, while requiring dedicated flights.
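
    A hedged numpy sketch of the core idea only: split two receivers' voltage traces into blocks, take FFTs, and measure how stable the inter-receiver phase difference is per frequency channel across blocks. Channels with nearly constant phase difference are flagged as narrow-band transmitters, while clean channels show near-random phase differences. Block size, threshold and the toy signal are illustrative assumptions, not LOFAR's actual settings.

      import numpy as np

      def stable_phase_channels(v1, v2, block=1024, threshold=0.6):
          nblocks = len(v1) // block
          s1 = np.fft.rfft(v1[:nblocks * block].reshape(nblocks, block), axis=1)
          s2 = np.fft.rfft(v2[:nblocks * block].reshape(nblocks, block), axis=1)
          phase_diff = np.angle(s1) - np.angle(s2)
          # Resultant length of unit phasors across blocks: ~1 for stable phase, ~0 for random.
          stability = np.abs(np.mean(np.exp(1j * phase_diff), axis=0))
          return np.where(stability > threshold)[0], stability

      # Toy usage: white noise in both receivers plus one common narrow-band transmitter.
      rng = np.random.default_rng(2)
      n, fs = 200_000, 200e6                      # 1 ms of data at an assumed 200 MHz sampling rate
      t = np.arange(n) / fs
      rfi = np.sin(2 * np.pi * 60e6 * t)          # 60 MHz carrier seen by both receivers
      v1 = rng.normal(size=n) + rfi
      v2 = rng.normal(size=n) + np.roll(rfi, 3)   # same carrier with a small relative delay
      channels, stability = stable_phase_channels(v1, v2)
      freqs = np.fft.rfftfreq(1024, d=1 / fs)
      print("flagged channels near:", freqs[channels] / 1e6, "MHz")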

  14. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
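
    A hedged sketch of the two-stage modelling described above: a day-of-week trimmed-mean seasonal profile, an ARIMA model fitted to the residuals, and a simple exceedance rule for flagging unusual visit counts. The orders, trim fraction, alarm rule and synthetic counts are illustrative assumptions, not the study's values.

      import numpy as np
      from scipy.stats import trim_mean
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(3)
      n_days = 3 * 365
      dow = np.arange(n_days) % 7
      visits = rng.poisson(lam=120 + 25 * (dow < 5))       # synthetic ED visit counts
      visits[-7:] += 30                                     # simulated 7-day outbreak

      # Stage 1: day-of-week baseline via a 10% trimmed mean (training part only).
      baseline = np.array([trim_mean(visits[:-7][dow[:-7] == d], 0.1) for d in range(7)])
      residuals = visits - baseline[dow]

      # Stage 2: ARIMA on the training residuals, then a one-week-ahead forecast.
      fit = ARIMA(residuals[:-7], order=(2, 0, 1)).fit()
      forecast = fit.get_forecast(steps=7)
      upper = forecast.conf_int(alpha=0.05)[:, 1]           # upper 95% band of the residual forecast

      expected = baseline[dow[-7:]] + forecast.predicted_mean
      alarm = visits[-7:] > baseline[dow[-7:]] + upper      # flag days above the upper band
      print("expected:", np.round(expected))
      print("observed:", visits[-7:])
      print("alarm:   ", alarm)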

  15. Timing calibration and spectral cleaning of LOFAR time series data

    CERN Document Server

    Corstanje, A; Enriquez, J E; Falcke, H; Hörandel, J R; Krause, M; Nelles, A; Rachen, J P; Schellart, P; Scholten, O; ter Veen, S; Thoudam, S; Trinh, T N G

    2016-01-01

    We describe a method for spectral cleaning and timing calibration of short voltage time series data from individual radio interferometer receivers. It makes use of the phase differences in Fast Fourier Transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters, a first-order solution for relative timing calibrations, and faulty data channels. No knowledge of gain response or quiescent noise levels of the receivers is required. With relatively small data volumes, this approach is suitable for use in an online system monitoring setup for interferometric arrays. We have applied the method to our cosmic-ray data collection, a collection of measurements of short pulses from extensive air showers, recorded by the LOFAR radio telescope. Per air shower, we have collected 2 ms of raw tim...

  16. Time series models of symptoms in schizophrenia.

    Science.gov (United States)

    Tschacher, Wolfgang; Kupper, Zeno

    2002-12-15

    The symptom courses of 84 schizophrenia patients (mean age: 24.4 years; mean previous admissions: 1.3; 64% males) of a community-based acute ward were examined to identify dynamic patterns of symptoms and to investigate the relation between these patterns and treatment outcome. The symptoms were monitored by systematic daily staff ratings using a scale composed of three factors: psychoticity, excitement, and withdrawal. Patients showed moderate to high symptomatic improvement documented by effect size measures. Each of the 84 symptom trajectories was analyzed by time series methods using vector autoregression (VAR) that models the day-to-day interrelations between symptom factors. Multiple and stepwise regression analyses were then performed on the basis of the VAR models. Two VAR parameters were found to be associated significantly with favorable outcome in this exploratory study: 'withdrawal preceding a reduction of psychoticity' as well as 'excitement preceding an increase of withdrawal'. The findings were interpreted as generating hypotheses about how patients cope with psychotic episodes.
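
    A hedged sketch of the modelling step only: fit a first-order vector autoregression (VAR) to daily ratings of the three symptom factors named above and inspect the lag-1 coefficients that encode "factor A today precedes factor B tomorrow". The synthetic data and the VAR(1) order are illustrative assumptions, not the study's data or model selection.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(4)
      n = 60                                                # days of daily staff ratings
      X = np.zeros((n, 3))                                  # psychoticity, excitement, withdrawal
      for t in range(1, n):
          X[t, 0] = 0.7 * X[t - 1, 0] - 0.2 * X[t - 1, 2] + rng.normal(scale=0.3)
          X[t, 1] = 0.5 * X[t - 1, 1] + rng.normal(scale=0.3)
          X[t, 2] = 0.6 * X[t - 1, 2] + 0.3 * X[t - 1, 1] + rng.normal(scale=0.3)

      data = pd.DataFrame(X, columns=["psychoticity", "excitement", "withdrawal"])
      res = VAR(data).fit(1)                                # VAR(1): one-day lag
      print(res.summary())                                  # lag-1 coefficients per equation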

  17. Testing whether a time series is Gaussian

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.

    1991-01-01

    The author first tests whether a stationary linear process with mean 0 is Gaussian. For invertible processes, he considers the empirical process based on the residuals as the basis of a test procedure. By applying the results of Boldin (1983) and Kreiss (1988), he shows that this process behaves asymptotically like the one based on the true errors. For non-invertible processes, on the other hand, Lee uses the empirical process based on the data themselves rather than the one based on residuals. Here, the time series is assumed to be a strongly mixing process with a suitable mixing order. The asymptotic behavior of the empirical process in each case is then studied under a sequence of contiguous alternatives, and quadratic functionals of the empirical process are employed for AR(∞) processes in order to compare the efficiencies of the two procedures. The rest of the thesis is devoted to extending Boldin's results to nonstationary processes such as unstable AR(p) processes and explosive AR(1) processes, analyzed by means of a general stochastic regression model.

  18. Spectral Estimation of Non-Gaussian Time Series

    OpenAIRE

    Fabián, Z. (Zdeněk)

    2010-01-01

    Based on the concept of the scalar score of a probability distribution, we introduce a concept of a scalar score of time series and propose to characterize a non-Gaussian time series by spectral density of its scalar score.

  19. An introduction to state space time series analysis.

    OpenAIRE

    Commandeur, J.J.F. & Koopman, S.J.

    2007-01-01

    Providing a practical introduction to state space methods as applied to unobserved components time series models, also known as structural time series models, this book introduces time series analysis using state space methodology to readers who are neither familiar with time series analysis, nor with state space methods. The only background required in order to understand the material presented in the book is a basic knowledge of classical linear regression models, of which a brief review is...

  20. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
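
    A hedged sketch of the scheme: run sliding-window regressions between the two series, map each window to a coarse pattern label, and build a directed, weighted network of transitions between the patterns of adjacent windows. The pattern definition here (sign of slope combined with significance) is a simplified stand-in for the paper's interval-based patterns.

      import numpy as np
      import networkx as nx
      from scipy import stats

      def regression_pattern_network(x, y, window=30, alpha=0.05):
          patterns = []
          for start in range(len(x) - window + 1):
              res = stats.linregress(x[start:start + window], y[start:start + window])
              sig = "sig" if res.pvalue < alpha else "ns"
              patterns.append(f"{'pos' if res.slope >= 0 else 'neg'}_{sig}")
          G = nx.DiGraph()
          for a, b in zip(patterns[:-1], patterns[1:]):     # transitions between adjacent windows
              w = G[a][b]["weight"] if G.has_edge(a, b) else 0
              G.add_edge(a, b, weight=w + 1)
          return G

      rng = np.random.default_rng(5)
      x = np.sin(2 * np.pi * np.arange(400) / 100) + 0.2 * rng.normal(size=400)
      y = 0.8 * x + 0.3 * rng.normal(size=400)
      G = regression_pattern_network(x, y)
      print(list(G.edges(data=True)))                       # pattern transitions and their weights
      print(nx.betweenness_centrality(G, weight="weight"))  # mediating patterns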

  1. Seasonal Time Series Analysis Based on Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Pattern discovery from seasonal time series is of practical importance. Traditionally, most algorithms for pattern discovery in time series are similar. A novel time-series model is proposed which integrates the Genetic Algorithm (GA) to address this problem. Experiments on electric power yield sequence models show that the algorithm is practicable and effective.

  2. Crop Yield Forecasted Model Based on Time Series Techniques

    Institute of Scientific and Technical Information of China (English)

    Li Hong-ying; Hou Yan-lin; Zhou Yong-juan; Zhao Hui-ming

    2012-01-01

    Traditional studies on potential yield mainly referred to attainable yield: the maximum yield which could be reached by a crop in a given environment. A new concept of crop yield under average climate conditions, which is affected by the advancement of science and technology, was defined in this paper. Based on this new concept of crop yield, time series techniques relying on past yield data were employed to set up a forecasting model. The model was tested using average grain yields of Liaoning Province in China from 1949 to 2005. The testing combined dynamic n-choosing with micro tendency rectification, and the average forecasting error was 1.24%. Where a turning point appeared in the trend line of yield change, an inflexion model was used to handle the yield turning point.

  3. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  4. Investigation of the 16-year and 18-year ZTD Time Series Derived from GPS Data Processing

    Directory of Open Access Journals (Sweden)

    Bałdysz Zofia

    2015-08-01

    Full Text Available The GPS system can play an important role in activities related to climate monitoring. Long time series, a coherent processing strategy, and the very high quality of the tropospheric parameter Zenith Tropospheric Delay (ZTD) estimated from GPS data analysis make it possible to investigate its usefulness for climate research as a direct GPS product. This paper presents results of an analysis of 16-year time series derived from the EUREF Permanent Network (EPN) reprocessing performed by the Military University of Technology. For 58 stations, Lomb-Scargle periodograms were computed in order to obtain information about the oscillations in the ZTD time series. Seasonal components and a linear trend were estimated using Least Squares Estimation (LSE), and the Mann-Kendall trend test was used to confirm the presence of the linear trend identified by the LSE method. In order to verify the impact of the length of the time series on the trend value, a comparison between the 16- and 18-year series was performed.
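
    A hedged sketch of the estimation step: a least-squares fit of a linear trend plus annual and semi-annual harmonics to a synthetic ZTD series, followed by Kendall's tau against time as a simple stand-in for the Mann-Kendall monotonic-trend check (the Mann-Kendall statistic is built on Kendall's tau). Amplitudes, noise level and sampling are illustrative assumptions.

      import numpy as np
      from scipy.stats import kendalltau

      rng = np.random.default_rng(6)
      t_years = np.arange(0, 16, 1 / 365)                   # ~16 years of daily ZTD values
      ztd = (2.40 + 0.0005 * t_years                        # mean plus a small trend (metres)
             + 0.05 * np.cos(2 * np.pi * t_years)           # annual cycle
             + 0.01 * np.cos(4 * np.pi * t_years)           # semi-annual cycle
             + 0.02 * rng.normal(size=t_years.size))

      # Least-squares estimation of offset, trend and harmonic coefficients.
      A = np.column_stack([np.ones_like(t_years), t_years,
                           np.cos(2 * np.pi * t_years), np.sin(2 * np.pi * t_years),
                           np.cos(4 * np.pi * t_years), np.sin(4 * np.pi * t_years)])
      coef, *_ = np.linalg.lstsq(A, ztd, rcond=None)
      print("estimated trend [m/yr]:", coef[1])

      # Monotonic-trend check on the deseasonalized series.
      deseasonalized = ztd - A[:, 2:] @ coef[2:]
      tau, p = kendalltau(t_years, deseasonalized)
      print("Kendall tau:", tau, "p-value:", p)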

  5. Moving from Envisat MERIS to Sentinel-3 to Provide Consistent Global Land Cover Time Series at 300 M up to 2016: The Land Cover Component of the ESA Climate Change Initiative

    Science.gov (United States)

    Defourny, Pierre; Bontemps, Sophie; Boettcher, Martin; Brockmann, Carten; De Maet, Thomas; Kirches, Grit; Lamarche, Celine; Van Bogaert, Eric; Ramoino, Fabrizio; Arino, Olivier

    2015-12-01

    At the end of 2015, Sentinel-3 will be launched. With its two instruments OLCI (Ocean Land Colour Instrument) and SLSTR (Sea and Land Surface Temperature Radiometer), the successor of the Envisat MERIS sensor will allow ensuring the continuity of global land cover maps production initiated in the CCI Land Cover project. At the end of its 3-year Phase, the project delivered a first database made of three global land cover maps representative of three 5-year epochs (2000, 2005 and 2010) based on MERIS time series. One requirement for the second phase of the project is to extend the dataset in the future and to produce an additional global land cover map covering the 2015 epoch. That will be done relying on the coming Sentinel-3 sensor, which is the only one that can ensure continuity in the global acquisition of medium spatial resolution time series on daily intervals. Waiting for Sentinel-3, the project will rely on PROBA-V time series.

  6. Land surface phenology from SPOT VEGETATION time series

    Directory of Open Access Journals (Sweden)

    A. Verger

    2016-12-01

    Full Text Available Land surface phenology from time series of satellite data is expected to contribute to improving the representation of vegetation phenology in earth system models. We characterized the baseline phenology of the vegetation at the global scale from GEOCLIM-LAI, a global climatology of leaf area index (LAI) derived from 1-km SPOT VEGETATION time series for 1999-2010. The calibration with ground measurements showed that the start and end of season were best identified using 30% and 40% thresholds of the LAI amplitude values, respectively. The satellite-derived phenology was spatially consistent with the global distributions of climatic drivers and biome land cover. The accuracy of the derived phenological metrics, evaluated using available ground observations for birch forests in Europe, cherry in Asia and lilac shrubs in North America, showed an overall root mean square error lower than 19 days for the start, end and length of season, and good agreement between the latitudinal gradients of VEGETATION LAI phenology and ground data.
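
    A hedged numpy sketch of the threshold rule reported above: start of season where the rising limb first reaches 30% of the seasonal LAI amplitude, end of season where the falling limb last exceeds 40%. The synthetic LAI profile is illustrative, and a single well-defined growing season is assumed.

      import numpy as np

      def season_metrics(doy, lai, start_frac=0.30, end_frac=0.40):
          lai_min, lai_max = lai.min(), lai.max()
          start_level = lai_min + start_frac * (lai_max - lai_min)
          end_level = lai_min + end_frac * (lai_max - lai_min)
          peak = np.argmax(lai)
          rising, falling = lai[:peak + 1], lai[peak:]
          sos = doy[np.argmax(rising >= start_level)]                                   # first crossing before the peak
          eos = doy[peak + len(falling) - 1 - np.argmax(falling[::-1] >= end_level)]    # last day above the 40% level
          return sos, eos, eos - sos

      doy = np.arange(1, 366)
      lai = 0.5 + 3.0 * np.exp(-0.5 * ((doy - 200) / 40) ** 2)                          # synthetic seasonal LAI curve
      print("start / end / length of season:", season_metrics(doy, lai))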

  7. Principal Component and Time Series Analysis of a 500-year Stalagmite Geochemical Record from Yucatán, Mexico Reveals Climate Variability, Land-use changes, and Volcanic Ashfall

    Science.gov (United States)

    Kuklewicz, K. B.; Frappier, A. E.

    2015-12-01

    Principal Component Analysis of stalagmite multivariate geochemical records can provide insight into climate variability as well as the frequency of high-magnitude events (i.e. volcanic eruptions) and even land use changes above cave systems. For most environmental proxies, large trace element data sets can pose difficulties for analysis and interpretation due to natural processes acting across wide ranges of time scales and magnitudes with overlapping influences on individual chemical species. To reduce the complexity of geochemical data, we applied Principal Component Analysis (PCA) and Evolutionary Spectral Analysis to a large high-resolution Laser Ablation Inductively Coupled Plasma Mass Spectrometer (LA-ICP-MS) stalagmite trace element data set from northern Yucatán, Mexico (CH-1), from about 1500-2007 CE. In our study, PCA identified five significant principal components (PCs) in this CH-1 record, which explain >83% of the data set's variability. Our analysis reveals that PC1 responds to overall trace element loading, including both short-lived trace element influxes associated with volcanic eruptions, and sustained land use changes associated with the Spanish settlement and Henequen (succulent plant) production. PC2 reflects prior calcite precipitation associated with regional dry climate anomalies by increasing Sr and Mg substitution in calcite. High loadings for B and Na indicate that PC3 is sensitive to wet climate anomalies. PCs 4 and 5 reflect related but lagged trace element transport mechanisms. Evolutionary spectral analysis results for the PCs reveal the changing influence of solar 11 and 22-year cycles and the 3-7 year El Niño/Southern Oscillation (ENSO) system over the last 500 years. This study adds to growing evidence that speleothems can record multivariate trace element fingerprints of volcanic eruptions, soil erosion, and different styles of climate variability, which can be useful for model verification and sensitivity testing studies.
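
    A hedged sketch of the PCA step only: standardize the element columns of a multivariate trace-element record, extract principal components, and inspect loadings, explained variance and the PC score time series that would feed the spectral analysis. Element names, mixing weights and the synthetic drivers below are illustrative placeholders, not the CH-1 data.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(7)
      n = 500
      climate = rng.normal(size=n)                          # hidden "dry anomaly" driver (illustrative)
      ash = (rng.random(n) < 0.02) * rng.uniform(2, 5, n)   # sparse "ashfall" spikes (illustrative)
      elements = np.column_stack([
          0.9 * climate + 0.2 * rng.normal(size=n),         # Sr
          0.8 * climate + 0.3 * rng.normal(size=n),         # Mg
          1.0 * ash + 0.3 * rng.normal(size=n),             # S
          0.7 * ash + 0.4 * rng.normal(size=n),             # P
      ])

      Z = StandardScaler().fit_transform(elements)
      pca = PCA(n_components=3).fit(Z)
      print("explained variance ratio:", pca.explained_variance_ratio_)
      print("loadings (components x elements):")
      print(np.round(pca.components_, 2))
      scores = pca.transform(Z)                             # PC time series for further spectral analysis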

  8. Assessing homogeneity and climate variability of temperature and precipitation series in the capitals of northeastern Brazil

    Science.gov (United States)

    Hänsel, Stephanie; Medeiros, Deusdedit; Matschullat, Jörg; Silva, Isamara; Petta, Reinaldo

    2016-03-01

    A 51-year dataset (1961 to 2011) from nine meteorological stations in the capitals of northeastern Brazil (NEB), with daily data of precipitation totals and of mean, minimum and maximum temperatures, was statistically analyzed for data homogeneity and for signals of climate variability. The hypothesis was explored that a connection exists between inhomogeneities of the time series and the meteorological systems influencing the region. Results of the homogeneity analysis depend on the selected test variable, the test algorithm and the chosen significance level, all more or less subjective choices. Most of the temperature series were classified as "suspect", while most of the precipitation series were categorized as "useful". Displaying and visually checking the time series demonstrates the power of expertise and may allow for a deeper data analysis. Consistent changes in the seasonality of temperature and precipitation emerge over NEB despite manifold breaks in the temperature series. Both series appear to be coupled. The intra-annual temperature and precipitation ranges have increased, along with an intensified seasonal cycle. Temperature mainly increased during DJF, MAM and SON, with decreases in JJA being related to wetter conditions and more frequent heavy precipitation events. Drought conditions mostly increased in SON and DJF, depending on the timing of the local dry season.

  9. Generalized Framework for Similarity Measure of Time Series

    Directory of Open Access Journals (Sweden)

    Hongsheng Yin

    2014-01-01

    Full Text Available Currently, there is no definitive and uniform description for the similarity of time series, which results in difficulties for relevant research on this topic. In this paper, we propose a generalized framework to measure the similarity of time series. In this generalized framework, whether the time series is univariable or multivariable, and linear transformed or nonlinear transformed, the similarity of time series is uniformly defined using norms of vectors or matrices. The definitions of the similarity of time series in the original space and the transformed space are proved to be equivalent. Furthermore, we also extend the theory on similarity of univariable time series to multivariable time series. We present some experimental results on published time series datasets tested with the proposed similarity measure function of time series. Through the proofs and experiments, it can be claimed that the similarity measure functions of linear multivariable time series based on the norm distance of covariance matrix and nonlinear multivariable time series based on kernel function are reasonable and practical.
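
    A hedged sketch of one concrete instance of the framework: a similarity for linear multivariable time series based on the norm distance between covariance matrices (Frobenius norm here), mapped to (0, 1] so that identical second-order structure gives similarity 1. The mapping and the toy data are illustrative choices, not the paper's exact definitions.

      import numpy as np

      def covariance_similarity(X, Y):
          """X, Y: arrays of shape (time, variables) with the same number of variables."""
          d = np.linalg.norm(np.cov(X, rowvar=False) - np.cov(Y, rowvar=False), ord='fro')
          return 1.0 / (1.0 + d)

      rng = np.random.default_rng(8)
      base = rng.normal(size=(300, 3))
      X = base @ np.array([[1.0, 0.5, 0.0], [0.0, 1.0, 0.3], [0.0, 0.0, 1.0]])
      Y = X + 0.05 * rng.normal(size=X.shape)               # nearly the same dependence structure
      Z = rng.normal(size=(300, 3))                         # independent variables
      print("similar pair:   ", covariance_similarity(X, Y))
      print("dissimilar pair:", covariance_similarity(X, Z))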

  10. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  11. Time and ensemble averaging in time series analysis

    CERN Document Server

    Latka, Miroslaw; Jernajczyk, Wojciech; West, Bruce J

    2010-01-01

    In many applications expectation values are calculated by partitioning a single experimental time series into an ensemble of data segments of equal length. Such single trajectory ensemble (STE) is a counterpart to a multiple trajectory ensemble (MTE) used whenever independent measurements or realizations of a stochastic process are available. The equivalence of STE and MTE for stationary systems was postulated by Wang and Uhlenbeck in their classic paper on Brownian motion (Rev. Mod. Phys. 17, 323 (1945)) but surprisingly has not yet been proved. Using the stationary and ergodic paradigm of statistical physics -- the Ornstein-Uhlenbeck (OU) Langevin equation, we revisit Wang and Uhlenbeck's postulate. In particular, we find that the variance of the solution of this equation is different for these two ensembles. While the variance calculated using the MTE quantifies the spreading of independent trajectories originating from the same initial point, the variance for STE measures the spreading of two correlated r...
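
    A hedged numerical illustration of the comparison discussed above: simulate the Ornstein-Uhlenbeck (OU) Langevin equation with an Euler-Maruyama scheme and compare the variance computed from a single-trajectory ensemble (one long run cut into equal segments) with that from a multiple-trajectory ensemble (independent runs from the same initial point). Parameter values and the discretization are illustrative choices, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(9)
      gamma, D, dt = 1.0, 1.0, 0.01                         # relaxation rate, noise strength, time step

      def ou_paths(x0, nsteps):
          """Euler-Maruyama integration of dx = -gamma*x dt + sqrt(2D) dW for several paths."""
          x = np.empty((len(x0), nsteps))
          x[:, 0] = x0
          for i in range(1, nsteps):
              x[:, i] = x[:, i - 1] * (1 - gamma * dt) + np.sqrt(2 * D * dt) * rng.normal(size=len(x0))
          return x

      seg_len, nseg = 500, 2000
      ste = ou_paths(np.zeros(1), seg_len * nseg)[0].reshape(nseg, seg_len)  # one long run, segmented
      mte = ou_paths(np.zeros(nseg), seg_len)                                # independent runs, same x0

      k = 100                                               # compare variances k steps into a segment
      print("STE variance:", ste[:, k].var())               # close to the stationary value D/gamma
      print("MTE variance:", mte[:, k].var())               # (D/gamma)*(1 - exp(-2*gamma*k*dt)), smaller
      print("stationary value D/gamma:", D / gamma)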

  12. Scale-dependent intrinsic entropies of complex time series.

    Science.gov (United States)

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease.
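
    A hedged sketch of the multi-scale core only (coarse-graining plus sample entropy); the empirical mode decomposition detrending described above is omitted, as it would require an EMD implementation such as PyEMD. The parameters m = 2 and r = 0.15 times the original standard deviation, and the synthetic white-versus-smoothed-noise comparison, are illustrative choices.

      import numpy as np

      def coarse_grain(u, scale):
          """Non-overlapping window averages used in multi-scale entropy."""
          n = len(u) // scale
          return u[:n * scale].reshape(n, scale).mean(axis=1)

      def sample_entropy(u, m=2, r=0.2):
          """SampEn(m, r) with Chebyshev distance; O(N^2) brute force, fine for a sketch."""
          u = np.asarray(u, dtype=float)
          N = len(u)
          def count_pairs(length):
              templ = np.array([u[i:i + length] for i in range(N - m)])
              dist = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
              return np.sum(dist <= r) - len(templ)         # exclude self-matches
          return -np.log(count_pairs(m + 1) / count_pairs(m))

      rng = np.random.default_rng(12)
      white = rng.normal(size=2000)
      smoothed = np.convolve(rng.normal(size=2020), np.ones(20) / 20, mode='valid')
      r_white, r_smooth = 0.15 * white.std(), 0.15 * smoothed.std()
      for scale in (1, 2, 4, 8):
          print(scale,
                round(sample_entropy(coarse_grain(white, scale), r=r_white), 3),
                round(sample_entropy(coarse_grain(smoothed, scale), r=r_smooth), 3))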

  13. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
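
    A hedged sketch of the dynamic-programming segmentation idea: the measure function used here (the union of a segment's item sets) and the segment difference (total symmetric-difference count against that union) are illustrative stand-ins for the paper's definitions, and the cubic-time DP is the straightforward, unoptimized version.

      import numpy as np

      def segment_cost(items, i, j):
          """Segment difference for a segment covering time points i..j (inclusive, 0-indexed)."""
          rep = set().union(*items[i:j + 1])                # measure function: union of item sets
          return sum(len(rep ^ s) for s in items[i:j + 1])  # symmetric-difference count

      def optimal_segmentation(items, k):
          n = len(items)
          cost = [[segment_cost(items, i, j) for j in range(n)] for i in range(n)]
          dp = np.full((n + 1, k + 1), np.inf)
          cut = np.zeros((n + 1, k + 1), dtype=int)
          dp[0][0] = 0.0
          for j in range(1, n + 1):                         # first j time points
              for s in range(1, min(k, j) + 1):             # split into s segments
                  for i in range(s, j + 1):                 # last segment covers points i..j (1-indexed)
                      cand = dp[i - 1][s - 1] + cost[i - 1][j - 1]
                      if cand < dp[j][s]:
                          dp[j][s], cut[j][s] = cand, i - 1
          bounds, j, s = [], n, k                           # recover segment boundaries
          while s > 0:
              i = int(cut[j][s])
              bounds.append((i, j - 1))
              j, s = i, s - 1
          return float(dp[n][k]), bounds[::-1]

      # Toy usage: item sets drift from {a, b} towards {c, d}; k = 3 recovers the three blocks.
      items = [{"a", "b"}] * 5 + [{"a", "b", "c"}] * 5 + [{"c", "d"}] * 5
      print(optimal_segmentation(items, 3))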

  14. Sparse Representation for Time-Series Classification

    Science.gov (United States)

    2015-02-08

  15. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of the time series. An advantage of this approach is that it links different methods of time series analysis, connects traditional data mining tools with time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system with a fixed, finite set of methods. First of all, it is a model for transforming the values of a time series, which prepares data to be used by different sets of methods based on the same transformation model within the problem domain. The REFII model thus offers a new approach to time series analysis based on a unique transformation model, which serves as a basis for all kinds of time series analysis. A further advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  16. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  17. Ruin Probability in Linear Time Series Model

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lihong

    2005-01-01

    This paper analyzes a continuous-time risk model in which a linear time series model is used for the claim process. Time is discretized stochastically at the instants when claims occur; Doob's stopping time theorem and martingale inequalities are then used to obtain expressions for the ruin probability, as well as both exponential and non-exponential upper bounds on the ruin probability over an infinite time horizon. Numerical results are included to illustrate the accuracy of the non-exponential bound.

  18. Time varying arctic climate change amplification

    Energy Technology Data Exchange (ETDEWEB)

    Chylek, Petr [Los Alamos National Laboratory; Dubey, Manvendra K [Los Alamos National Laboratory; Lesins, Glen [DALLHOUSIE U; Wang, Muyin [NOAA/JISAO

    2009-01-01

    During the past 130 years the global mean surface air temperature has risen by about 0.75 K. Due to feedbacks -- including the snow/ice albedo feedback -- the warming in the Arctic is expected to proceed at a faster rate than the global average. Climate model simulations suggest that this Arctic amplification produces warming that is two to three times larger than the global mean. Understanding the Arctic amplification is essential for projections of future Arctic climate, including sea ice extent and melting of the Greenland ice sheet. We use the temperature records from Arctic stations to show that (a) the Arctic amplification is larger at latitudes above 70°N than within the 64-70°N belt, and that, surprisingly, (b) the ratio of the Arctic to global rate of temperature change is not constant but varies on the decadal timescale. This time dependence will affect future projections of climate changes in the Arctic.

  19. On correlations and fractal characteristics of time series

    CERN Document Server

    Vitanov, N K; Yankulova, E D; Vitanov, Nikolay K.; Sakai, kenschi; Yankulova, Elka D.

    2005-01-01

    Correlation analysis is a convenient and frequently used tool for the investigation of time series from complex systems. Recently new methods such as the multifractal detrended fluctuation analysis (MFDFA) and the wavelet transform modulus maximum method (WTMM) have been developed. By means of these methods (i) we can investigate long-range correlations in time series and (ii) we can calculate fractal spectra of these time series. But in contrast to the classical tool for correlation analysis, the autocorrelation function, the newly developed tools are not applicable to all kinds of time series. Inappropriate application of MFDFA or WTMM leads to wrong results and conclusions. In this article we discuss the opportunities and risks connected to the application of the MFDFA method to time series from a random number generator and to experimentally measured time series (i) for accelerations of an agricultural tractor and (ii) for the heartbeat activity of Drosophila melanogaster. Our main goal is to emphasize ...

  20. Clustering Time Series Data Stream - A Literature Survey

    CERN Document Server

    Kavitha, V

    2010-01-01

    Mining time series data has seen a tremendous growth of interest in today's world. To provide an indication, various implementations are studied and summarized to identify the different problems in existing applications. Clustering time series is a problem that has applications in an extensive assortment of fields and has recently attracted a large amount of research. Time series data are frequently large and may contain outliers. In addition, time series are a special type of data set where elements have a temporal ordering. Therefore clustering of such data streams is an important issue in the data mining process. Numerous techniques and clustering algorithms have been proposed earlier to assist clustering of time series data streams. The clustering algorithms and their effectiveness on various applications are compared to develop a new method to solve the existing problem. This paper presents a survey on various clustering algorithms available for time series datasets. Moreover, the distinctiveness and restriction ...

  1. Non-parametric causal inference for bivariate time series

    CERN Document Server

    McCracken, James M

    2015-01-01

    We introduce new quantities for exploratory causal inference between bivariate time series. The quantities, called penchants and leanings, are computationally straightforward to apply, follow directly from assumptions of probabilistic causality, do not depend on any assumed models for the time series generating process, and do not rely on any embedding procedures; these features may provide a clearer interpretation of the results than those from existing time series causality tools. The penchant and leaning are computed based on a structured method for computing probabilities.

  2. Predicting Chaotic Time Series Using Recurrent Neural Network

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jia-Shu; XIAO Xian-Ci

    2000-01-01

    A new proposed method, i.e. the recurrent neural network (RNN), is introduced to predict chaotic time series. The effectiveness of using the RNN for making one-step and multi-step predictions is tested on remarkably few data points from computer-generated chaotic time series. Numerical results show that the RNN proposed here is a very powerful tool for making predictions of chaotic time series.

  3. Information distance and its application in time series

    Directory of Open Access Journals (Sweden)

    B. Mirza

    2008-03-01

    Full Text Available In this paper a new method is introduced for studying the time series of complex systems. The method is based on the concepts of entropy and Jensen-Shannon divergence, and is applied to time series of a billiard system and to heart signals. With this method we can distinguish healthy from unhealthy hearts, and chaotic from non-chaotic billiards. The method can also be applied to other time series.

  4. Intrusion Detection Forecasting Using Time Series for Improving Cyber Defence

    OpenAIRE

    Abdullah, Azween Bin; Pillai, Thulasyammal Ramiah; Cai, Long Zheng

    2015-01-01

    The strength of time series modeling is generally not exploited in current intrusion detection and prevention systems. By having time series models, system administrators will be able to better plan resource allocation and system readiness to defend against malicious activities. In this paper, we address this knowledge gap by investigating the possible inclusion of statistically based time series modeling that can be seamlessly integrated into existing cyber defence systems. Cyber-attack ...

  5. Reconstructing Ocean Circulation using Coral Δ¹⁴C Time Series

    Energy Technology Data Exchange (ETDEWEB)

    Kashgarian, M; Guilderson, T P

    2001-02-23

    We utilize monthly ¹⁴C data derived from coral archives in conjunction with ocean circulation models to address two questions: (1) how does the shallow circulation of the tropical Pacific vary on seasonal to decadal time scales and (2) which dynamic processes determine the mean vertical structure of the equatorial Pacific thermocline. Our results directly impact the understanding of global climate events such as the El Niño-Southern Oscillation (ENSO). To study changes in ocean circulation and water mass distribution involved in the genesis and evolution of ENSO and decadal climate variability, it is necessary to have records of climate variables several decades in length. Continuous instrumental records are limited because technology for continuous monitoring of ocean currents (e.g. satellites and moored arrays) has only recently been available, and ships-of-opportunity archives such as COADS contain large spatial and temporal biases. In addition, temperature and salinity in surface waters are not conservative and thus cannot be independently relied upon to trace water masses, reducing the utility of historical observations. Radiocarbon in sea water is a quasi-conservative water mass tracer and is incorporated into coral skeletal material, thus coral ¹⁴C records can be used to reconstruct changes in shallow circulation that would be difficult to characterize using instrumental data. High-resolution Δ¹⁴C time series such as ours provide a powerful constraint on the rate of surface ocean mixing and hold great promise to augment one-time oceanographic surveys. Δ¹⁴C time series such as these not only provide fundamental information about the shallow circulation of the Pacific, but can also be directly used as a benchmark for the next generation of high-resolution ocean models used in prognosticating climate. The measurement of Δ¹⁴C in biological archives such as tree rings and coral growth bands is a direct record of

  6. Using neural networks for dynamic light scattering time series processing

    Science.gov (United States)

    Chicea, Dan

    2017-04-01

    A basic experiment to record dynamic light scattering (DLS) time series was assembled using basic components. The DLS time series processing using the Lorentzian function fit was considered as reference. A neural network was designed and trained using simulated frequency spectra for spherical particles in the range 0–350 nm, assumed to be scattering centers, and the neural network design and training procedure are described in detail. The neural network output accuracy was tested both on simulated and on experimental time series. The match with the DLS results, considered as reference, was good, serving as a proof of concept for using neural networks in fast DLS time series processing.

  7. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    K P Harikrishnan; R Misra; G Ambika

    2009-02-01

    The correlation dimension D₂ and correlation entropy K₂ are both important quantifiers in nonlinear time series analysis. However, the use of D₂ has been more common than that of K₂ as a discriminating measure. One reason for this is that D₂ is a static measure and can be easily evaluated from a time series. However, in many cases, especially those involving coloured noise, K₂ is regarded as a more useful measure. Here we present an efficient algorithmic scheme to compute K₂ directly from time series data and show that K₂ can be used as a more effective measure compared to D₂ for analysing practical time series involving coloured noise.

  8. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.

  9. gatspy: General tools for Astronomical Time Series in Python

    Science.gov (United States)

    VanderPlas, Jake

    2016-10-01

    Gatspy contains efficient, well-documented implementations of several common routines for Astronomical time series analysis, including the Lomb-Scargle periodogram, the Supersmoother method, and others.
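
    A minimal usage sketch, hedged: the class name and call pattern below follow the gatspy documentation as recalled (gatspy.periodic.LombScargleFast with fit(t, y, dy) and periodogram_auto); treat the exact interface as an assumption and check the package documentation before relying on it.

      import numpy as np
      from gatspy.periodic import LombScargleFast

      rng = np.random.default_rng(10)
      t = np.sort(rng.uniform(0, 100, 200))                 # irregular sampling times
      y = 2.0 + np.sin(2 * np.pi * t / 0.75) + 0.1 * rng.normal(size=200)
      dy = np.full_like(t, 0.1)

      model = LombScargleFast().fit(t, y, dy)
      periods, power = model.periodogram_auto(nyquist_factor=100)
      print("best candidate period:", periods[np.argmax(power)])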

  10. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  11. Transatlantic flight times and climate change

    Science.gov (United States)

    Williams, Paul

    2016-04-01

    Aircraft do not fly through a vacuum, but through an atmosphere whose meteorological characteristics are changing because of global warming. The impacts of aviation on climate change have long been recognised, but the impacts of climate change on aviation have only recently begun to emerge. These impacts include intensified turbulence (Williams and Joshi 2013) and increased take-off weight restrictions. A forthcoming study (Williams 2016) investigates the influence of climate change on flight routes and journey times. This is achieved by feeding synthetic atmospheric wind fields generated from climate model simulations into a routing algorithm of the type used operationally by flight planners. The focus is on transatlantic flights between London and New York, and how they change when the atmospheric concentration of carbon dioxide is doubled. It is found that a strengthening of the prevailing jet-stream winds causes eastbound flights to significantly shorten and westbound flights to significantly lengthen in all seasons, causing round-trip journey times to increase. Eastbound and westbound crossings in winter become approximately twice as likely to take under 5h 20m and over 7h 00m, respectively. The early stages of this effect perhaps contributed to a well-publicised British Airways flight from New York to London on 8 January 2015, which took a record time of only 5h 16m because of a strong tailwind from an unusually fast jet stream. Even assuming no future growth in aviation, extrapolation of our results to all transatlantic traffic suggests that aircraft may collectively be airborne for an extra 2,000 hours each year, burning an extra 7.2 million gallons of jet fuel at a cost of US$22 million, and emitting an extra 70 million kg of carbon dioxide. These findings provide further evidence of the two-way interaction between aviation and climate change. References Williams PD (2016) Transatlantic flight times and climate change. Environmental Research Letters, in

  12. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from the perspectives of time and frequency, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, in this book we present various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, less studied till now. Part of the results are accompanied by their R code.

  13. A Method for Determining Periods in Time Series.

    Science.gov (United States)

    1981-04-01

    KEY WORDS: Univariate time series; spectral density function; Newton's method. The method is applied to a series of hormone level data. Let Y(t), t ∈ Z (Z the set of integers), be a zero-mean covariance-stationary time series with autocovariance function R(v) = E(Y(t)Y(t+v)), v ∈ Z, and spectral density function f.

  14. FRACTAL ANALYSIS OF MONTHLY EVAPORATION AND PRECIPITATION TIME SERIES AT CENTRAL MEXICO

    Directory of Open Access Journals (Sweden)

    Rafael Magallanes Quintanar

    2015-09-01

    Full Text Available Advances in climate change research, as well as the assessment of the potential impacts of climate change on water resources, would allow the understanding of the spatial and temporal variability of land-surface precipitation and evaporation time series at local and regional levels. In the present study, the spectral analysis approach was applied to monthly evaporation and precipitation anomaly time series with the aim of estimating their self-affinity statistics. The behavior of the estimated fractal dimension values of the evaporation time series throughout the Zacatecas State territory is irregular, and noise in all the evaporation anomaly time series tends to have a persistent behavior. On the other hand, the behavior of the estimated fractal dimension values of most of the precipitation time series throughout the Zacatecas State territory tends to be like Brownian motion. Self-affinity statistics of the monthly evaporation or precipitation anomaly time series and the geographic coordinates of 32 stations were used to estimate correlation coefficients; the results provide compelling evidence that monthly precipitation anomaly behavior tends to become more regular toward the north of the Zacatecas State territory, that is, toward the driest areas.

  15. Distance measure with improved lower bound for multivariate time series

    Science.gov (United States)

    Li, Hailin

    2017-02-01

    The lower bound function is one of the important techniques used for fast search and indexing of time series data. Multivariate time series have two aspects of high dimensionality: the time-based dimension and the variable-based dimension. Due to the influence of the variable-based dimension, a novel method is proposed to deal with the lower bound distance computation for multivariate time series. The proposed method, like the traditional ones, also reduces the dimensionality of the time series in its first step and thus does not directly apply the lower bound function to the multivariate time series. In the dimensionality reduction, the multivariate time series is reduced to a univariate time series, denoted as a center sequence, according to the principle of piecewise aggregate approximation. In addition, an extended lower bound function is designed to obtain good tightness and quickly measure the distance between any two center sequences. The experimental results demonstrate that the proposed lower bound function has better tightness and improves the performance of similarity search in multivariate time series datasets.
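
    A hedged numpy sketch of the two-step idea: reduce the multivariate series to a univariate center sequence (mean across variables), compress it with piecewise aggregate approximation (PAA), and use a cheap scaled Euclidean distance on the PAA sequences as a lower-bound-style pre-filter before the exact distance. The specific extended lower bound function of the paper is not reproduced here.

      import numpy as np

      def center_sequence(X):
          """X: array (time, variables) -> univariate center sequence."""
          return X.mean(axis=1)

      def paa(x, n_segments):
          """Piecewise aggregate approximation of a univariate series."""
          n = len(x) // n_segments * n_segments
          return x[:n].reshape(n_segments, -1).mean(axis=1)

      def paa_distance(X, Y, n_segments=8):
          w = len(X) // n_segments                          # points per PAA segment
          d = paa(center_sequence(X), n_segments) - paa(center_sequence(Y), n_segments)
          return np.sqrt(w * np.sum(d ** 2))                # never exceeds the exact distance below

      def exact_distance(X, Y):
          return np.sqrt(np.sum((X - Y) ** 2))

      rng = np.random.default_rng(11)
      X = np.cumsum(rng.normal(size=(128, 3)), axis=0)
      Y = X + 0.2 * rng.normal(size=X.shape)
      print("PAA pre-filter distance:", paa_distance(X, Y))
      print("exact distance:         ", exact_distance(X, Y))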

  16. Recovery of the Time-Evolution Equation of Time-Delay Systems from Time Series

    CERN Document Server

    Bünner, M J; Kittel, A; Parisi, J; Meyer, Th.

    1997-01-01

    We present a method for time series analysis of both scalar and nonscalar time-delay systems. If the dynamics of the system investigated is governed by a time-delay induced instability, the method allows the delay time to be determined. In a second step, the time-delay differential equation can be recovered from the time series. The method is a generalization of our recently proposed method suitable for time series analysis of scalar time-delay systems. The dynamics is not required to be settled on its attractor, which also makes transient motion accessible to the analysis. If the motion actually takes place on a chaotic attractor, the applicability of the method does not depend on the dimensionality of the chaotic attractor - one main advantage over all time series analysis methods known until now. For demonstration, we analyze time series, which are obtained with the help of the numerical integration of a two-dimensional time-delay differential equation. After having determined the delay time, we recover...

  17. Time series prediction using wavelet process neural network

    Institute of Scientific and Technical Information of China (English)

    Ding Gang; Zhong Shi-Sheng; Li Yang

    2008-01-01

    In the real world, the inputs of many complicated systems are time-varying functions or processes. In order to predict the outputs of these systems with high speed and accuracy, this paper proposes a time series prediction model based on the wavelet process neural network, and develops the corresponding learning algorithm based on the expansion of the orthogonal basis functions. The effectiveness of the proposed time series prediction model and its learning algorithm is proved by the Mackey-Glass time series prediction, and the comparative prediction results indicate that the proposed time series prediction model based on the wavelet process neural network seems to perform well and appears suitable for using as a good tool to predict the highly complex nonlinear time series.

  18. Time series requirements and trends of temperature and precipitation extremes over Italy

    Science.gov (United States)

    Fioravanti, Guido; Desiato, Franco; Fraschetti, Piero; Perconti, Walter; Piervitali, Emanuela

    2013-04-01

    Extreme climate events have strong impacts on society and economy; accordingly, the knowledge of their trends over long periods is crucial for the definition and implementation of a national adaptation strategy to climate change. The Research Programme on Climate Variability and Predictability (CLIVAR) identified a set of temperature and precipitation indices suited to investigate variability and trends of climate extremes. It is well known that the calculation of extreme indices is more demanding than that of first and second order statistics: daily temperature and precipitation data are required, and strict constraints in terms of continuity and completeness must be met. In addition, possible inhomogeneities affecting the time series must be identified and adjusted before indices calculation. When metadata are not available, statistical methods can provide scientists relevant support for homogeneity checks; however, ad-hoc decision criteria (sometimes subjective) must be applied whenever different statistical homogeneity tests give contradictory results. In this work, a set of daily (minimum and maximum) temperature and precipitation time series for the period 1961-2011 was selected in order to guarantee a fairly uniform spatial distribution of the stations over the Italian territory and according to the aforesaid continuity and completeness criteria. Following the method described by Vincent, the homogeneity check of the temperature time series was run at the annual level. Two well-documented tests were employed (F-test and T-test), both implemented in the free R-package RHtestV3. The Vincent method was also used for a further investigation of time series homogeneity. Inhomogeneous temperature series were discarded. For precipitation series, no homogeneity check was run. The selected series were employed at the daily level to calculate a reliable set of extreme indices. For each station, a linear model was employed for indices trend estimation. Finally, single station results were

  19. RECONSTRUCTION OF PRECIPITATION SERIES AND ANALYSIS OF CLIMATE CHANGE OVER PAST 500 YEARS IN NORTHERN CHINA

    Institute of Scientific and Technical Information of China (English)

    RONG Yan-shu; TU Qi-pu

    2005-01-01

    It is important and necessary to obtain a much longer precipitation series in order to study the features of drought/flood and climate change. Based on dryness and wetness grade series of 533 years (1470 to 2002) from 18 stations in Northern China, the Moving Cumulative Frequency Method (MCFM) was developed; moving-average precipitation series from 1499 to 2002 were reconstructed by testing three kinds of average precipitation, and the features of climate change and of dry and wet periods were investigated using the reconstructed precipitation series in the present paper. The results showed that there is a good relationship between the reconstructed precipitation series and the observed precipitation series since 1954, with relative root-mean-square errors below 1.89%, and that the relation between the reconstructed series and the dryness and wetness grade series is nonlinear; this nonlinear relation implies that the reconstructed series are reliable and can become foundation data for researching the evolution of drought and flood. Analysis of climate change based on the reconstructed precipitation series revealed that, although the drought intensity of the recent dry period from the mid-1970s until the early 21st century was not the strongest in the historical climate of Northern China, the intensity and duration of wet periods have decreased and shortened considerably, and the climate of Northern China is evolving toward aridification.

  20. Fixed Points in Self-Similar Analysis of Time Series

    OpenAIRE

    Gluzman, S.; Yukalov, V. I.

    1998-01-01

    Two possible definitions of fixed points in the self-similar analysis of time series are considered. One definition is based on the minimal-difference condition and another, on a simple averaging. From studying stock market time series, one may conclude that these two definitions are practically equivalent. A forecast is made for the stock market indices for the end of March 1998.

  1. Robust Forecasting of Non-Stationary Time Series

    NARCIS (Netherlands)

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable foreca

  2. Mean shifts, unit roots and forecasting seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard); H. Hoek (Henk)

    1997-01-01

    textabstractExamples of descriptive models for changing seasonal patterns in economic time series are autoregressive models with seasonal unit roots or with deterministic seasonal mean shifts. In this paper we show through a forecasting comparison for three macroeconomic time series (for which tests

  3. Stata: The language of choice for time series analysis?

    OpenAIRE

    Baum, Christopher F

    2004-01-01

    This paper discusses the use of Stata for the analysis of time series and panel data. The evolution of time-series capabilities in Stata is reviewed. Facilities for data management, graphics, and econometric analysis from both official Stata and the user community are discussed. A new routine to provide moving-window regression estimates, rollreg, is described, and its use illustrated.

  4. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  5. Time series analysis : Smoothed correlation integrals, autocovariances, and power spectra

    NARCIS (Netherlands)

    Takens, F; Dumortier, F; Broer, H; Mawhin, J; Vanderbauwhede, A; Lunel, SV

    2005-01-01

    In this paper we relate notions from linear time series analysis, like autocovariances and power spectra, with notions from nonlinear time series analysis, like (smoothed) correlation integrals and the corresponding dimensions and entropies. The complete proofs of the results announced in this pape

  6. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    Bikas K Chakrabarti; Arnab Chatterjee; Pratip Bhattacharyya

    2008-08-01

    We find prominent similarities in the features of the time series for the (model earthquakes or) overlap of two Cantor sets when one set moves with uniform relative velocity over the other and the time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  7. Extracting a common pulse-like signal from Time Series using a nonlinear Kalman Filter

    Science.gov (United States)

    Gazeaux, J.; Batista, D.; Ammann, C.; Naveau, P.; Jégat, C.; Gao, C.

    2009-04-01

    To understand the nature and cause of natural climate variability, it is important to attribute past climate variations to particular forcing factors. In this work, our main focus is to introduce an automatic assimilation procedure to estimate the magnitude of strong but short-lived perturbations, such as large explosive volcanic eruptions, using climate/proxy time series. The extraction and decomposition procedure is run on real multivariate time series of sulfate from ice cores drilled at different sites in Greenland. The sulfate ejected by volcanoes is transported through the stratosphere towards the poles and deposited via sedimentation near the pole. Sulfate in Greenland is therefore a marker of large volcanic eruptions occurring all over the world. Such pulse-like processes are highly nonlinear, both in their timing and in their intensity. If they are not detected, such pulse-like signals of extreme and rare events can perturb an objective calculation of the trend. This work is therefore as much an estimation procedure for such signals as a first step toward estimating an a posteriori trend in the time series. Our extraction algorithm handles multivariate time series with a common but unknown forcing. This statistical procedure is based on a multivariate multi-state space model and a nonlinear Kalman filter. The nonlinearity is handled through the calculation of a twice-conditional expectation and variance. It can provide an accurate estimate of the timing and duration of individual pulse-like events from a set of different series covering the same temporal span. It not only allows for a more objective estimation of the associated peak amplitude and the subsequent time evolution of the signal, but at the same time it provides a measure of confidence through the posterior probability for each pulse-like event. The flexibility, robustness and limitations of our approach are discussed by applying our method to simulated time series and to the Monte-Carlo method to test the

  8. The PRIMAP-hist national historical emissions time series

    Science.gov (United States)

    Gütschow, Johannes; Jeffery, M. Louise; Gieseke, Robert; Gebel, Ronja; Stevens, David; Krapp, Mario; Rocha, Marcia

    2016-11-01

    To assess the history of greenhouse gas emissions and individual countries' contributions to emissions and climate change, detailed historical data are needed. We combine several published datasets to create a comprehensive set of emissions pathways for each country and Kyoto gas, covering the years 1850 to 2014 with yearly values, for all UNFCCC member states and most non-UNFCCC territories. The sectoral resolution is that of the main IPCC 1996 categories. Additional time series of CO2 are available for energy and industry subsectors. Country-resolved data are combined from different sources and supplemented using year-to-year growth rates from regionally resolved sources and numerical extrapolations to complete the dataset. Regional deforestation emissions are downscaled to country level using estimates of the deforested area obtained from potential vegetation and simulations of agricultural land. In this paper, we discuss the data sources and methods used and present the resulting dataset, including its limitations and uncertainties. The dataset is available from doi:10.5880/PIK.2016.003 and can be viewed on the website accompanying this paper (http://www.pik-potsdam.de/primap-live/primap-hist/).
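
    One step mentioned above, completing a country series with year-to-year growth rates from a regionally resolved source, can be sketched as follows; all numbers are invented placeholders, not PRIMAP-hist data.

      # Minimal sketch of extending a country emissions series backwards with
      # year-to-year growth rates taken from a regional series.
      import numpy as np

      years = np.arange(1990, 2001)
      regional = np.array([10.0, 10.5, 10.8, 11.0, 11.6, 12.0, 12.4, 12.9, 13.3, 13.8, 14.2])
      country = np.full(len(years), np.nan)
      country[5:] = [1.20, 1.26, 1.31, 1.35, 1.41, 1.45]   # country data only from 1995 onwards

      growth = regional[1:] / regional[:-1]                 # regional year-to-year growth factors
      for i in range(4, -1, -1):                            # walk backwards from the first known year
          country[i] = country[i + 1] / growth[i]

      print(dict(zip(years, np.round(country, 3))))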

  9. Comparison of New and Old Sunspot Number Time Series

    Science.gov (United States)

    Cliver, E. W.

    2016-11-01

    Four new sunspot number time series have been published in this Topical Issue: a backbone-based group number in Svalgaard and Schatten ( Solar Phys., 2016; referred to here as SS, 1610 - present), a group number series in Usoskin et al. ( Solar Phys., 2016; UEA, 1749 - present) that employs active day fractions from which it derives an observational threshold in group spot area as a measure of observer merit, a provisional group number series in Cliver and Ling ( Solar Phys., 2016; CL, 1841 - 1976) that removed flaws in the Hoyt and Schatten ( Solar Phys. 179, 189, 1998a; 181, 491, 1998b) normalization scheme for the original relative group sunspot number (RG, 1610 - 1995), and a corrected Wolf (international, RI) number in Clette and Lefèvre ( Solar Phys., 2016; SN, 1700 - present). Despite quite different construction methods, the four new series agree well after about 1900. Before 1900, however, the UEA time series is lower than SS, CL, and SN, particularly so before about 1885. Overall, the UEA series most closely resembles the original RG series. Comparison of the UEA and SS series with a new solar wind B time series (Owens et al. in J. Geophys. Res., 2016; 1845 - present) indicates that the UEA time series is too low before 1900. We point out incongruities in the Usoskin et al. ( Solar Phys., 2016) observer normalization scheme and present evidence that this method under-estimates group counts before 1900. In general, a correction factor time series, obtained by dividing an annual group count series by the corresponding yearly averages of raw group counts for all observers, can be used to assess the reliability of new sunspot number reconstructions.
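
    The correction-factor check described in the last sentence amounts to a simple ratio of series. A hedged sketch with placeholder numbers (not real sunspot observations):

      # Yearly "correction factor": a reconstructed annual group-count series
      # divided by the yearly mean of raw group counts over all observers.
      # All inputs are hypothetical.
      import pandas as pd

      raw = pd.DataFrame({                     # raw yearly group counts per observer
          "year":     [1880, 1880, 1881, 1881, 1882],
          "observer": ["A", "B", "A", "B", "A"],
          "groups":   [5.0, 6.0, 4.0, 5.0, 3.0],
      })
      reconstructed = pd.Series({1880: 6.6, 1881: 5.4, 1882: 3.9})   # hypothetical calibrated series

      raw_yearly_mean = raw.groupby("year")["groups"].mean()
      correction_factor = reconstructed / raw_yearly_mean
      print(correction_factor)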

  10. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishing of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme, is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  11. Time series analysis and inverse theory for geophysicists

    Institute of Scientific and Technical Information of China (English)

    Junzo Kasahara

    2006-01-01

    Thanks to the advances in geophysical measurement technologies, most geophysical data are now recorded in digital form. But to extract the ‘Earth's nature’ from observed data, it is necessary to apply the signal-processing method to the time-series data, seismograms and geomagnetic records being the most common. The processing of time-series data is one of the major subjects of this book. By the processing of time-series data, numerical values such as travel-times are obtained. The first stage of data analysis is forward modeling, but the more advanced step is the inversion method. This is the second subject of this book.

  12. Performance of multifractal detrended fluctuation analysis on short time series

    CERN Document Server

    Lopez, Juan Luis

    2013-01-01

    The performance of the multifractal detrended analysis on short time series is evaluated for synthetic samples of several mono- and multifractal models. The reconstruction of the generalized Hurst exponents is used to determine the range of applicability of the method and the precision of its results as a function of the decreasing length of the series. As an application the series of the daily exchange rate between the U.S. dollar and the euro is studied.
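
    An illustrative implementation of multifractal DFA, in the spirit of the method evaluated above but not the authors' code, which estimates generalized Hurst exponents h(q) from a short synthetic series:

      import numpy as np

      def mfdfa(x, scales, q_values, order=1):
          y = np.cumsum(x - np.mean(x))                       # profile of the series
          fq = np.zeros((len(q_values), len(scales)))
          for j, s in enumerate(scales):
              n_seg = len(y) // s
              f2 = []
              for start in (0, len(y) - n_seg * s):           # forward and backward segmentation
                  for v in range(n_seg):
                      seg = y[start + v * s : start + (v + 1) * s]
                      t = np.arange(s)
                      coeffs = np.polyfit(t, seg, order)      # local detrending polynomial
                      f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
              f2 = np.asarray(f2)
              for i, q in enumerate(q_values):
                  if q == 0:
                      fq[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
                  else:
                      fq[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
          log_s = np.log(scales)
          # h(q) is the slope of log Fq(s) versus log s
          return {q: np.polyfit(log_s, np.log(fq[i]), 1)[0] for i, q in enumerate(q_values)}

      rng = np.random.default_rng(1)
      x = rng.normal(size=500)                                # short white-noise sample, h(2) ~ 0.5 expected
      print(mfdfa(x, scales=[8, 16, 32, 64], q_values=[-2, 0, 2]))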

  13. Inverse method for estimating respiration rates from decay time series

    Directory of Open Access Journals (Sweden)

    D. C. Forney

    2012-03-01

    Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However, we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.

  14. Inverse method for estimating respiration rates from decay time series

    Directory of Open Access Journals (Sweden)

    D. C. Forney

    2012-09-01

    Long-term organic matter decomposition experiments typically measure the mass lost from decaying organic matter as a function of time. These experiments can provide information about the dynamics of carbon dioxide input to the atmosphere and controls on natural respiration processes. Decay slows down with time, suggesting that organic matter is composed of components (pools) with varied lability. Yet it is unclear how the appropriate rates, sizes, and number of pools vary with organic matter type, climate, and ecosystem. To better understand these relations, it is necessary to properly extract the decay rates from decomposition data. Here we present a regularized inverse method to identify an optimally-fitting distribution of decay rates associated with a decay time series. We motivate our study by first evaluating a standard, direct inversion of the data. The direct inversion identifies a discrete distribution of decay rates, where mass is concentrated in just a small number of discrete pools. It is consistent with identifying the best fitting "multi-pool" model, without prior assumption of the number of pools. However, we find these multi-pool solutions are not robust to noise and are over-parametrized. We therefore introduce a method of regularized inversion, which identifies the solution which best fits the data but not the noise. This method shows that the data are described by a continuous distribution of rates, which we find is well approximated by a lognormal distribution, and consistent with the idea that decomposition results from a continuum of processes at different rates. The ubiquity of the lognormal distribution suggests that decay may be simply described by just two parameters: a mean and a variance of log rates. We conclude by describing a procedure that estimates these two lognormal parameters from decay data. Matlab codes for all numerical methods and procedures are provided.
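
    The closing step of the abstract, fitting a two-parameter lognormal distribution of decay rates to mass-loss data, can be sketched as below. The model treats the remaining mass as a weighted sum of exponentials over a rate grid; the data, grid and bounds are assumptions for illustration, not the authors' Matlab procedure.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import lognorm

      def mass_remaining(t, mu, sigma):
          k = np.logspace(-4, 2, 400)                         # decay-rate grid (1/time)
          p = lognorm.pdf(k, s=sigma, scale=np.exp(mu))       # lognormal density over rates
          w = p * np.gradient(k)
          w = w / w.sum()                                     # normalised quadrature weights
          return np.array([(w * np.exp(-k * ti)).sum() for ti in t])

      t = np.linspace(0, 10, 30)
      true = mass_remaining(t, mu=-1.0, sigma=1.0)
      rng = np.random.default_rng(2)
      data = true + rng.normal(0, 0.005, t.size)              # noisy synthetic observations

      popt, _ = curve_fit(mass_remaining, t, data, p0=(0.0, 0.5),
                          bounds=([-5.0, 0.01], [5.0, 5.0]))
      mu_hat, sigma_hat = popt
      print(f"estimated log-rate mean {mu_hat:.2f}, std {sigma_hat:.2f}")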

  15. Cross recurrence plot based synchronization of time series

    OpenAIRE

    N. Marwan; Thiel, M.; Nowaczyk, N. R.

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plot (CRP) which, among other things, enables the study of synchronization or time differences in two time series. This shows up as a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametric fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so ...
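
    A minimal cross recurrence plot can be computed directly from a pairwise distance matrix. The sketch below uses a plain 1-D comparison and a hypothetical threshold eps rather than the delay embedding a full CRP analysis would normally use:

      import numpy as np

      def cross_recurrence_plot(x, y, eps):
          # 1-D embedding for brevity; real CRPs usually use delay embedding.
          d = np.abs(x[:, None] - y[None, :])       # pairwise distance matrix
          return (d < eps).astype(int)              # 1 where the two series "recur"

      t = np.linspace(0, 8 * np.pi, 400)
      x = np.sin(t)
      y = np.sin(t + 0.5)                           # same signal, shifted in time
      crp = cross_recurrence_plot(x, y, eps=0.1)
      # The line of synchronization (LOS) appears as a distorted diagonal of ones;
      # its offset here reflects the constant 0.5 phase shift between x and y.
      print(crp.shape, crp.sum())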

  16. Solving Nonlinear Time Delay Control Systems by Fourier series

    Directory of Open Access Journals (Sweden)

    Mohammad Hadi Farahi

    2014-06-01

    In this paper we present a method to find the solution of time-delay optimal control systems using Fourier series. The method is based upon expanding various time functions in the system as their truncated Fourier series. Operational matrices of integration and delay are presented and are utilized to reduce the solution of time-delay control systems to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and applicability of the technique.

  17. Modeling Persistence In Hydrological Time Series Using Fractional Differencing

    Science.gov (United States)

    Hosking, J. R. M.

    1984-12-01

    The class of autoregressive integrated moving average (ARIMA) time series models may be generalized by permitting the degree of differencing d to take fractional values. Models including fractional differencing are capable of representing persistent series (d > 0) or short-memory series (d = 0). The class of fractionally differenced ARIMA processes provides a more flexible way than has hitherto been available of simultaneously modeling the long-term and short-term behavior of a time series. In this paper some fundamental properties of fractionally differenced ARIMA processes are presented. Methods of simulating these processes are described. Estimation of the parameters of fractionally differenced ARIMA models is discussed, and an approximate maximum likelihood method is proposed. The methodology is illustrated by fitting fractionally differenced models to time series of streamflows and annual temperatures.
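
    Fractional differencing itself is a short calculation: the filter (1-B)^d applied with recursively computed binomial weights. A hedged sketch on a synthetic series (not the paper's streamflow or temperature data):

      import numpy as np

      def frac_diff(x, d):
          n = len(x)
          w = np.zeros(n)
          w[0] = 1.0
          for k in range(1, n):
              w[k] = w[k - 1] * (k - 1 - d) / k      # recursive binomial weights of (1-B)^d
          # convolve the truncated filter with the series
          return np.array([np.dot(w[: t + 1], x[t::-1]) for t in range(n)])

      rng = np.random.default_rng(3)
      x = np.cumsum(rng.normal(size=200))            # a random walk (d = 1 would remove the trend)
      print(np.round(frac_diff(x, d=0.4)[:5], 3))    # long-memory filtering with d = 0.4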

  18. Spectral analysis of hydrological time series of a river basin in southern Spain

    Science.gov (United States)

    Luque-Espinar, Juan Antonio; Pulido-Velazquez, David; Pardo-Igúzquiza, Eulogio; Fernández-Chacón, Francisca; Jiménez-Sánchez, Jorge; Chica-Olmo, Mario

    2016-04-01

    Spectral analysis has been applied with the aim of determining the presence and statistical significance of climate cycles in data series from different rainfall, piezometric and gauging stations located in the upper Genil River Basin. This river starts in the Sierra Nevada Range at 3,480 m a.s.l. and is one of the most important rivers of this region. The study area covers more than 2,500 km2, with large topographic differences. For this previous study, we have used more than 30 rain data series, 4 piezometric data series and 3 data series from gauging stations. Considering a monthly temporal unit, the studied period ranges from 1951 to 2015, but most of the data series have some gaps. Spectral analysis is a methodology widely used to discover cyclic components in time series. The time series is assumed to be a linear combination of sinusoidal functions of known periods but of unknown amplitude and phase. The amplitude is related to the variance of the time series explained by the oscillation at each frequency (Blackman and Tukey, 1958, Bras and Rodríguez-Iturbe, 1985, Chatfield, 1991, Jenkins and Watts, 1968, among others). The signal component represents the structured part of the time series, made up of a small number of embedded periodicities. Then, we take into account the known result for the one-sided confidence band of the power spectrum estimator. For this study, we established confidence levels of <90%, 90%, 95%, and 99%. Different climate signals have been identified: ENSO, QBO, NAO, sunspot cycles, as well as others related to solar activity, but the most powerful signals correspond to the annual cycle, followed by the 6-month and NAO cycles. Nevertheless, significant differences between rain data series and piezometric/flow data series have been pointed out. In piezometric data series and flow data series, ENSO and NAO signals could be stronger than others with high frequencies. The climatic peaks in lower frequencies in rain data are smaller and the confidence
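
    A simplified version of this spectral screening, a periodogram of a synthetic monthly series with an approximate chi-square threshold against a white-noise background, is sketched below; the real study uses station data and one-sided confidence bands rather than this rough rule.

      import numpy as np
      from scipy import signal
      from scipy.stats import chi2

      rng = np.random.default_rng(4)
      months = np.arange(12 * 60)                              # 60 years of synthetic monthly data
      series = (np.sin(2 * np.pi * months / 12)                # annual cycle
                + 0.5 * np.sin(2 * np.pi * months / 6)         # semi-annual cycle
                + rng.normal(0, 1, months.size))               # white noise

      freqs, pxx = signal.periodogram(series, fs=1.0)          # frequency in cycles per month
      # For white noise each raw periodogram ordinate is roughly the mean level
      # times chi2(2)/2, giving a simple 99% detection threshold:
      threshold = np.mean(pxx) * chi2.ppf(0.99, df=2) / 2
      significant = freqs[pxx > threshold]
      print("periods flagged (months):", np.round(1 / significant[significant > 0], 1))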

  19. Intensity-Duration-Frequency (IDF) rainfall curves, for data series and climate projection in African cities

    Science.gov (United States)

    De Paola, Francesco; Giugni, Maurizio; Topa, Maria Elena; Coly, Adrien; Yeshitela, Kumelachew; Kombe, Wilbard; Tonye, Emmanuel; Touré, Hamidou

    2013-04-01

    The intensity-duration-frequency (IDF) curves are used in hydrology to express, in a synthetic way, the link between the maximum rainfall height h and a generic duration d of a rainfall event, for a fixed return period T. Generally, IDF curves can be characterized by a two-parameter power law, h(d,T) = a(T)·d^n, where a(T) and n are the parameters that have to be estimated through a probabilistic approach. An intensity-duration-frequency analysis starts by gathering time series records of different durations and extracting annual extremes for each duration. The annual extreme data are then fitted by a probability distribution. The present study, carried out within the FP7-ENV-2010 CLUVA project (CLimate change and Urban Vulnerability in Africa), regards the evaluation of the IDF curves for five case studies: Addis Ababa (Ethiopia), Dar Es Salaam (Tanzania), Douala (Cameroon), Ouagadougou (Burkina Faso) and Saint Louis (Senegal). The probability distribution chosen to fit the annual extreme data is the classic Gumbel distribution. However, for the case studies, only the maximum annual daily rainfall heights are available. Therefore, to define the IDF curves and the extreme values in smaller time windows (10', 30', 1 h, 3 h, 6 h, 12 h), it is necessary to develop disaggregation techniques for the collected data, in order to generate a synthetic sequence of rainfall with statistical properties equal to those of the recorded data. The daily rainfalls were disaggregated using two models: a short-time intensity disaggregation model (10', 30', 1 h) and a cascade-based disaggregation model (3 h, 6 h, 12 h). On the basis of the disaggregation models and the Gumbel distribution, the parameters of the IDF curves for the five test cities were evaluated. In order to estimate the contingent influence of climate change on the IDF curves, the illustrated procedure has been applied to the climate (rainfall) simulations over the time period 2010-2050 provided by the CMCC (Centro Euro-Mediterraneo sui Cambiamenti Climatici
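
    The two estimation steps named above, a Gumbel fit to annual maxima per duration and a log-log fit of h(d,T) = a(T)·d^n, can be sketched as follows, with invented annual maxima standing in for the disaggregated records:

      import numpy as np
      from scipy.stats import gumbel_r

      durations_h = np.array([1.0, 3.0, 6.0, 12.0, 24.0])
      annual_maxima = {                                # mm, one array per duration (hypothetical)
          1.0:  np.array([22, 31, 27, 40, 25, 33, 29, 36, 24, 30]),
          3.0:  np.array([35, 44, 39, 58, 37, 47, 42, 52, 36, 43]),
          6.0:  np.array([44, 55, 50, 71, 47, 58, 54, 66, 46, 56]),
          12.0: np.array([55, 68, 61, 88, 58, 72, 66, 81, 57, 69]),
          24.0: np.array([66, 83, 74, 106, 70, 87, 80, 98, 69, 84]),
      }

      T = 20.0                                         # return period in years
      quantiles = []
      for d in durations_h:
          loc, scale = gumbel_r.fit(annual_maxima[d])  # Gumbel fit to the annual maxima
          quantiles.append(gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale))

      # Fit h = a * d**n by linear regression in log-log space.
      n_exp, log_a = np.polyfit(np.log(durations_h), np.log(quantiles), 1)
      print(f"a(T={T:.0f}) = {np.exp(log_a):.1f} mm, n = {n_exp:.2f}")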

  20. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
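
    A local level plus seasonal specification of this kind can be written down directly with statsmodels' UnobservedComponents; the monthly counts below are synthetic stand-ins for the road-accident series, so this is a sketch of the model class, not a reproduction of the paper's fit.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      n = 12 * 11                                          # monthly data, 2001-2011
      level = 400 + np.cumsum(rng.normal(0, 2, n))         # slowly drifting level
      season = 30 * np.sin(2 * np.pi * np.arange(n) / 12)  # annual seasonal pattern
      y = pd.Series(level + season + rng.normal(0, 10, n),
                    index=pd.period_range("2001-01", periods=n, freq="M"))

      model = sm.tsa.UnobservedComponents(y, level="local level", seasonal=12)
      result = model.fit(disp=False)
      print(np.round(result.params, 3))                    # estimated disturbance variances
      print(np.round(result.forecast(steps=12).head(3), 1))  # next-year prediction, as in the validation step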

  1. Algorithms for Linear Time Series Analysis: With R Package

    Directory of Open Access Journals (Sweden)

    A. Ian McLeod

    2007-11-01

    Our ltsa package implements the Durbin-Levinson and Trench algorithms and provides a general approach to the problems of fitting, forecasting and simulating linear time series models, as well as fitting regression models with linear time series errors. For computational efficiency both algorithms are implemented in C and interfaced to R. Examples are given which illustrate the efficiency and accuracy of the algorithms. We provide a second package, FGN, which illustrates the use of the ltsa package with fractional Gaussian noise (FGN). It is hoped that ltsa will provide a base for further time series software.
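
    For orientation, the Durbin-Levinson recursion that the package implements (in C) maps autocovariances to AR coefficients and an innovation variance; a plain Python sketch:

      import numpy as np

      def durbin_levinson(r):
          # r contains autocovariances r(0), r(1), ..., r(p)
          p = len(r) - 1
          phi = np.zeros((p + 1, p + 1))
          v = np.zeros(p + 1)
          v[0] = r[0]
          for k in range(1, p + 1):
              acc = r[k] - np.dot(phi[k - 1, 1:k], r[1:k][::-1])
              phi[k, k] = acc / v[k - 1]
              for j in range(1, k):
                  phi[k, j] = phi[k - 1, j] - phi[k, k] * phi[k - 1, k - j]
              v[k] = v[k - 1] * (1 - phi[k, k] ** 2)
          return phi[p, 1:], v[p]                     # AR coefficients and innovation variance

      # Autocovariances of an AR(1) with coefficient 0.6 and unit innovation variance.
      phi_true, sigma2 = 0.6, 1.0
      r = np.array([sigma2 / (1 - phi_true**2) * phi_true**k for k in range(4)])
      print(durbin_levinson(r))                       # approximately ([0.6, 0, 0], 1.0)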

  2. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  3. On the detection of superdiffusive behaviour in time series

    CERN Document Server

    Gottwald, Georg A

    2016-01-01

    We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data and relies on one realisation (i.e. one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. α-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement as well as the p-variation method.
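
    The standard comparison method mentioned above, estimating the growth rate of the mean-square displacement of the integrated series, is easy to sketch. This is an illustration under simple assumptions, not the paper's test statistic; the α-stable estimate is noisy and varies between runs, but its fitted exponent is typically well above the diffusive value of 1.

      import numpy as np
      from scipy.stats import levy_stable

      def msd_exponent(increments, lags):
          x = np.cumsum(increments)                           # integrated series ("walker position")
          msd = [np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags]
          return np.polyfit(np.log(lags), np.log(msd), 1)[0]  # ~1 diffusive, >1 superdiffusive

      rng = np.random.default_rng(6)
      lags = np.arange(10, 200, 10)
      gauss = rng.normal(size=20000)
      stable = levy_stable.rvs(alpha=1.5, beta=0.0, size=20000, random_state=rng)
      print("Gaussian increments:", round(msd_exponent(gauss, lags), 2))    # close to 1
      print("alpha-stable (1.5):", round(msd_exponent(stable, lags), 2))    # usually well above 1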

  4. A vector of quarters representation for bivariate time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1995-01-01

    textabstractIn this paper it is shown that several models for a bivariate nonstationary quarterly time series are nested in a vector autoregression with cointegration restrictions for the eight annual series of quarterly observations. Or, the Granger Representation Theorem is extended to incorporate

  5. A multivariate approach to modeling univariate seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1994-01-01

    textabstractA seasonal time series can be represented by a vector autoregressive model for the annual series containing the seasonal observations. This model allows for periodically varying coefficients. When the vector elements are integrated, the maximum likelihood cointegration method can be used

  6. Seasonality, nonstationarity and the forecasting of monthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1991-01-01

    textabstractWe focus on two forecasting models for a monthly time series. The first model requires that the variable is first order and seasonally differenced. The second model considers the series only in its first differences, while seasonality is modeled with a constant and seasonal dummies. A me

  7. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  8. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  9. Multi-dimensional sparse time series: feature extraction

    CERN Document Server

    Franciosi, Marco

    2008-01-01

    We show an analysis of multi-dimensional time series via entropy and statistical linguistic techniques. We define three markers encoding the behavior of the series, after it has been translated into a multi-dimensional symbolic sequence. The leading component and the trend of the series with respect to a mobile window analysis result from the entropy analysis and label the dynamical evolution of the series. The diversification formalizes the differentiation in the use of recurrent patterns, from a Zipf law point of view. These markers are the starting point of further analysis such as classification or clustering of large database of multi-dimensional time series, prediction of future behavior and attribution of new data. We also present an application to economic data. We deal with measurements of money investments of some business companies in advertising market for different media sources.

  10. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui;

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our...

  11. Distinguishing chaotic time series from noise: A random matrix approach

    Science.gov (United States)

    Ye, Bin; Chen, Jianxing; Ju, Chen; Li, Huijun; Wang, Xuesong

    2017-03-01

    Deterministically chaotic systems can often give rise to random and unpredictable behaviors which make the time series obtained from them almost indistinguishable from noise. Motivated by the fact that data points in a chaotic time series will have intrinsic correlations between them, we propose a random matrix theory (RMT) approach to identify the deterministic or stochastic dynamics of the system. We show that the spectral distributions of the correlation matrices, constructed from the chaotic time series, deviate significantly from the predictions of random matrix ensembles. On the contrary, the eigenvalue statistics for a noisy signal follow closely those of random matrix ensembles. Numerical results also indicate that the approach is to some extent robust to additive observational noise which pollutes the data in many practical situations. Our approach is efficient in recognizing the continuous chaotic dynamics underlying the evolution of the time series.
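
    A hedged sketch of the general recipe, not the authors' exact construction: build a correlation matrix from segments of the series and count eigenvalues beyond the Marchenko-Pastur edge expected for independent noise. A finely sampled chaotic signal (the Lorenz x-component) carries strong correlations and typically pushes eigenvalues past the edge, while white noise stays close to it.

      import numpy as np

      def lorenz_x(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          # crude Euler integration of the Lorenz system, returning the x-component
          x, y, z = 1.0, 1.0, 1.0
          out = np.empty(n)
          for i in range(n):
              dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
              x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
              out[i] = x
          return out

      def segment_correlation_eigenvalues(series, n_seg, seg_len):
          segs = series[: n_seg * seg_len].reshape(n_seg, seg_len)
          segs = (segs - segs.mean(1, keepdims=True)) / segs.std(1, keepdims=True)
          return np.linalg.eigvalsh(segs @ segs.T / seg_len)

      n_seg, seg_len = 40, 400
      mp_upper = (1.0 + np.sqrt(n_seg / seg_len)) ** 2            # Marchenko-Pastur upper edge
      rng = np.random.default_rng(7)
      chaotic = segment_correlation_eigenvalues(lorenz_x(n_seg * seg_len), n_seg, seg_len)
      noisy = segment_correlation_eigenvalues(rng.normal(size=n_seg * seg_len), n_seg, seg_len)
      # For noise a stray eigenvalue just past the edge can occur by chance.
      print("eigenvalues above the MP edge - Lorenz:", int((chaotic > mp_upper).sum()),
            "| white noise:", int((noisy > mp_upper).sum()))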

  12. On robust forecasting of autoregressive time series under censoring

    OpenAIRE

    Kharin, Y.; Badziahin, I.

    2009-01-01

    Problems of robust statistical forecasting are considered for autoregressive time series observed under distortions generated by interval censoring. Three types of robust forecasting statistics are developed; mean-square risk is evaluated for the developed forecasting statistics. Numerical results are given.

  13. A probability distribution approach to synthetic turbulence time series

    Science.gov (United States)

    Sinhuber, Michael; Bodenschatz, Eberhard; Wilczek, Michael

    2016-11-01

    The statistical features of turbulence can be described in terms of multi-point probability density functions (PDFs). The complexity of these statistical objects increases rapidly with the number of points. This raises the question of how much information has to be incorporated into statistical models of turbulence to capture essential features such as inertial-range scaling and intermittency. Using high Reynolds number hot-wire data obtained at the Variable Density Turbulence Tunnel at the Max Planck Institute for Dynamics and Self-Organization, we establish a PDF-based approach on generating synthetic time series that reproduce those features. To do this, we measure three-point conditional PDFs from the experimental data and use an adaption-rejection method to draw random velocities from this distribution to produce synthetic time series. Analyzing these synthetic time series, we find that time series based on even low-dimensional conditional PDFs already capture some essential features of real turbulent flows.

  14. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Naknek River was retrieved from the Alaska Department of Fish and Game....

  15. Phenotyping of Clinical Time Series with LSTM Recurrent Neural Networks

    OpenAIRE

    Lipton, Zachary C.; Kale, David C.; Wetzell, Randall C.

    2015-01-01

    We present a novel application of LSTM recurrent neural networks to multilabel classification of diagnoses given variable-length time series of clinical measurements. Our method outperforms a strong baseline on a variety of metrics.

  16. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  17. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956–2002) collected from adult sockeye salmon returning to Ugashik River was retrieved from the Alaska Department of Fish and...

  18. Lagrangian Time Series Models for Ocean Surface Drifter Trajectories

    CERN Document Server

    Sykulski, Adam M; Lilly, Jonathan M; Danioux, Eric

    2016-01-01

    This paper proposes stochastic models for the analysis of ocean surface trajectories obtained from freely-drifting satellite-tracked instruments. The proposed time series models are used to summarise large multivariate datasets and infer important physical parameters of inertial oscillations and other ocean processes. Nonstationary time series methods are employed to account for the spatiotemporal variability of each trajectory. Because the datasets are large, we construct computationally efficient methods through the use of frequency-domain modelling and estimation, with the data expressed as complex-valued time series. We detail how practical issues related to sampling and model misspecification may be addressed using semi-parametric techniques for time series, and we demonstrate the effectiveness of our stochastic models through application to both real-world data and to numerical model output.

  19. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one normal multilayer perceptron with bipolar sigmoid activation functions, and the other with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  20. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable for ... In this paper the generalizations are applied to some simulated data sets and to the Canadian lynx data. The generalizations seem to perform well and the measure of the departure from linearity proves to be an important additional tool....
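
    For reference, the classical tools being generalized, the sample ACF and PACF, are available in statsmodels; the smoothing-based generalizations of the paper are not reproduced here.

      import numpy as np
      from statsmodels.tsa.stattools import acf, pacf

      rng = np.random.default_rng(8)
      x = np.zeros(500)
      for t in range(1, 500):
          x[t] = 0.7 * x[t - 1] + rng.normal()        # AR(1) example series

      print(np.round(acf(x, nlags=5), 2))             # geometric decay, roughly 0.7**k
      print(np.round(pacf(x, nlags=5), 2))            # cuts off after lag 1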

  1. Prediction and interpolation of time series by state space models

    OpenAIRE

    Helske, Jouni

    2015-01-01

    A large amount of data collected today is in the form of a time series. In order to make realistic inferences based on time series forecasts, in addition to point predictions, prediction intervals or other measures of uncertainty should be presented. Multiple sources of uncertainty are often ignored due to the complexities involved in accounting for them correctly. In this dissertation, some of these problems are reviewed and some new solutions are presented. A state space approach...

  2. The use of synthetic input sequences in time series modeling

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Dair Jose de [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Letellier, Christophe [CORIA/CNRS UMR 6614, Universite et INSA de Rouen, Av. de l' Universite, BP 12, F-76801 Saint-Etienne du Rouvray cedex (France); Gomes, Murilo E.D. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil); Aguirre, Luis A. [Programa de Pos-Graduacao em Engenharia Eletrica, Universidade Federal de Minas Gerais, Av. Antonio Carlos 6627, 31.270-901 Belo Horizonte, MG (Brazil)], E-mail: aguirre@cpdee.ufmg.br

    2008-08-04

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  3. The use of synthetic input sequences in time series modeling

    Science.gov (United States)

    de Oliveira, Dair José; Letellier, Christophe; Gomes, Murilo E. D.; Aguirre, Luis A.

    2008-08-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure.

  4. Extracting Chaos Control Parameters from Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, R B B [Centro Universitario da FEI, Avenida Humberto de Alencar Castelo Branco 3972, 09850-901, Sao Bernardo do Campo, SP (Brazil); Graves, J C, E-mail: rsantos@fei.edu.br [Instituto Tecnologico de Aeronautica, Praca Marechal Eduardo Gomes 50, 12228-900, Sao Jose dos Campos, SP (Brazil)

    2011-03-01

    We present a simple method to analyze time series, and estimate the parameters needed to control chaos in dynamical systems. Application of the method to a system described by the logistic map is also shown. Analyzing only two 100-point time series, we achieved results within 2% of the analytical ones. With these estimates, we show that OGY control method successfully stabilized a period-1 unstable periodic orbit embedded in the chaotic attractor.
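
    In the same spirit (though not necessarily the authors' exact procedure), the logistic-map parameter and the quantities OGY control needs can be estimated from a short series by least squares:

      import numpy as np

      a_true, n = 3.9, 100
      x = np.empty(n)
      x[0] = 0.3
      for i in range(1, n):
          x[i] = a_true * x[i - 1] * (1 - x[i - 1])         # chaotic logistic-map time series

      u = x[:-1] * (1 - x[:-1])                             # regressor: x_n (1 - x_n)
      a_hat = np.dot(u, x[1:]) / np.dot(u, u)               # least-squares estimate of the map parameter
      x_star = 1 - 1 / a_hat                                # period-1 unstable fixed point
      slope = a_hat * (1 - 2 * x_star)                      # local derivative at the fixed point (= 2 - a)
      print(f"a ~ {a_hat:.3f}, fixed point ~ {x_star:.3f}, slope ~ {slope:.2f}")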

  5. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    FEI WanChun; BAI Lun

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TVPAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  6. Time-varying parameter auto-regressive models for autocovariance nonstationary time series

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In this paper, autocovariance nonstationary time series is clearly defined on a family of time series. We propose three types of TVPAR (time-varying parameter auto-regressive) models: the full order TVPAR model, the time-unvarying order TVPAR model and the time-varying order TV-PAR model for autocovariance nonstationary time series. Related minimum AIC (Akaike information criterion) estimations are carried out.

  7. A method for detecting changes in long time series

    Energy Technology Data Exchange (ETDEWEB)

    Downing, D.J.; Lawkins, W.F.; Morris, M.D.; Ostrouchov, G.

    1995-09-01

    Modern scientific activities, both physical and computational, can result in time series of many thousands or even millions of data values. Here the authors describe a statistically motivated algorithm for quick screening of very long time series data for the presence of potentially interesting but arbitrary changes. The basic data model is a stationary Gaussian stochastic process, and the approach to detecting a change is the comparison of two predictions of the series at a time point or contiguous collection of time points. One prediction is a "forecast", i.e. based on data from earlier times, while the other is a "backcast", i.e. based on data from later times. The statistic is the absolute value of the log-likelihood ratio for these two predictions, evaluated at the observed data. A conservative procedure is suggested for specifying critical values for the statistic under the null hypothesis of "no change".

  8. LEGENDRE SERIES SOLUTIONS FOR TIME-VARIATION DYNAMICS

    Institute of Scientific and Technical Information of China (English)

    Cao Zhiyuan; Zou Guiping; Tang Shougao

    2000-01-01

    In this topic, a new approach to the analysis of time-variation dynamics is proposed by use of Legendre series expansion and Legendre integral operator matrix. The theoretical basis for effective solution of time-variation dynamics is therefore established, which is beneficial to further research of time-variation science.

  9. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    textabstractCombined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally (non)line

  10. Analyses of Inhomogeneities in Radiosonde Temperature and Humidity Time Series.

    Science.gov (United States)

    Zhai, Panmao; Eskridge, Robert E.

    1996-04-01

    Twice daily radiosonde data from selected stations in the United States (period 1948 to 1990) and China (period 1958 to 1990) were sorted into time series. These stations have one sounding taken in darkness and the other in sunlight. The analysis shows that the 0000 and 1200 UTC time series are highly correlated. Therefore, the Easterling and Peterson technique was tested on the 0000 and 1200 UTC time series to detect inhomogeneities and to estimate the size of the biases. Discontinuities were detected using the difference series created from the 0000 and 1200 UTC time series. To establish that the detected bias was significant, a t-test was performed to confirm that the change occurs in the daytime series but not in the nighttime series. Both U.S. and Chinese radiosonde temperature and humidity data include inhomogeneities caused by changes in radiosonde sensors and observation times. The U.S. humidity data have inhomogeneities that were caused by instrument changes and the censoring of data. The practice of reporting relative humidity as 19% when it is lower than 20% or the temperature is below -40°C is called censoring. This combination of procedural and instrument changes makes the detection of biases and adjustment of the data very difficult. In the Chinese temperatures, there are inhomogeneities related to a change in the radiation correction procedure. Test results demonstrate that a modified Easterling and Peterson method is suitable for use in detecting and adjusting time series radiosonde data. Accurate station histories are very desirable. Station histories can confirm that detected inhomogeneities are related to instrument or procedural changes. Adjustments can then be made to the data with some confidence.
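
    A simplified sketch of the testing idea: form the 0000 minus 1200 UTC difference series, then confirm with a t-test that a candidate shift appears in the daytime series but not in the nighttime series. The data below are synthetic and the change point is assumed known, unlike in the Easterling and Peterson procedure.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      n, change = 360, 200                                  # monthly values, instrument change at month 200
      night = 10 + rng.normal(0, 0.8, n)                    # nighttime series (no radiation bias)
      day = 12 + rng.normal(0, 0.8, n)
      day[change:] += 0.6                                   # bias introduced by a new daytime sensor

      diff = day - night                                    # difference series highlights the discontinuity
      for name, series in [("difference", diff), ("day", day), ("night", night)]:
          t_stat, p = stats.ttest_ind(series[:change], series[change:])
          print(f"{name:10s} p-value for a mean shift at month {change}: {p:.3g}")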

  11. Downscaled TRMM Rainfall Time-Series for Catchment Hydrology Applications

    Science.gov (United States)

    Tarnavsky, E.; Mulligan, M.

    2009-04-01

    Hydrology in semi-arid regions is controlled, to a large extent, by the spatial and temporal distribution of rainfall defined in terms of rainfall depth and intensity. Thus, appropriate representation of the space-time variability of rainfall is essential for catchment-scale hydrological models applied in semi-arid regions. While spaceborne platforms equipped with remote sensing instruments provide information on a range of variables for hydrological modelling, including rainfall, the necessary spatial and temporal detail is rarely obtained from a single dataset. This paper presents a new dynamic model of dryland hydrology, DryMOD, which makes best use of free, public-domain remote sensing data for representation of key variables with a particular focus on (a) simulation of spatial rainfall fields and (b) the hydrological response to rainfall, particularly in terms of rainfall-runoff partitioning. In DryMOD, rainfall is simulated using a novel approach combining 1-km spatial detail from a climatology derived from the TRMM 2B31 dataset (mean monthly rainfall) and 3-hourly temporal detail from time-series derived from the 0.25-degree gridded TRMM 3B42 dataset (rainfall intensity). This allows for rainfall simulation at the hourly time step, as well as accumulation of infiltration, recharge, and runoff at the monthly time step. In combination with temperature, topography, and soil data, rainfall-runoff and soil moisture dynamics are simulated over large dryland regions. In order to investigate the hydrological response to rainfall and variable catchment characteristics, the model is applied to two very different catchments in the drylands of North and West Africa. The results of the study demonstrate the use of remote sensing-based estimates of precipitation intensity and volume for the simulation of critical hydrological parameters. The model allows for better spatial planning of water harvesting activities, as well as for optimisation of agricultural activities

  12. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider as proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
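
    The distance itself is a one-line transformation of the equal-time correlation coefficient, d_ij = sqrt(2(1 - C_ij)). A sketch with three synthetic "countries" standing in for the GDP increment series:

      import numpy as np

      rng = np.random.default_rng(11)
      common = rng.normal(size=56)                                  # shared global component
      gdp_increments = np.vstack([
          0.8 * common + 0.2 * rng.normal(size=56),                 # strongly globalized economy
          0.8 * common + 0.2 * rng.normal(size=56),
          rng.normal(size=56),                                      # idiosyncratic economy
      ])

      corr = np.corrcoef(gdp_increments)
      dist = np.sqrt(2.0 * (1.0 - corr))                            # metric distance in [0, 2]
      print(np.round(dist, 2))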

  13. Exploratory Causal Analysis in Bivariate Time Series Data

    Science.gov (United States)

    McCracken, James M.

    Many scientific disciplines rely on observational data of systems for which it is difficult (or impossible) to implement controlled experiments, and data analysis techniques are required for identifying causal information and relationships directly from observational data. This need has led to the development of many different time series causality approaches and tools, including transfer entropy, convergent cross-mapping (CCM), and Granger causality statistics. In this thesis, the existing time series causality method of CCM is extended by introducing a new method called pairwise asymmetric inference (PAI). It is found that CCM may provide counter-intuitive causal inferences for simple dynamics with strong intuitive notions of causality, and the CCM causal inference can be a function of physical parameters that are seemingly unrelated to the existence of a driving relationship in the system. For example, a CCM causal inference might alternate between "voltage drives current" and "current drives voltage" as the frequency of the voltage signal is changed in a series circuit with a single resistor and inductor. PAI is introduced to address both of these limitations. Many of the current approaches in the time series causality literature are not computationally straightforward to apply, do not follow directly from assumptions of probabilistic causality, depend on assumed models for the time series generating process, or rely on embedding procedures. A new approach, called causal leaning, is introduced in this work to avoid these issues. The leaning is found to provide causal inferences that agree with intuition for both simple systems and more complicated empirical examples, including space weather data sets. The leaning may provide a clearer interpretation of the results than those from existing time series causality tools. A practicing analyst can explore the literature to find many proposals for identifying drivers and causal connections in time series data

  14. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Scaling invariance of time series has been making great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept, called correlation-dependent balanced estimation of diffusion entropy, is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that, by using the standard central moving average de-trending procedure, this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that although the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  15. Berlin Workshop Series 2010 : Climate Governance and Development

    OpenAIRE

    Ansohn, Albrecht; Pleskovic, Boris

    2011-01-01

    This volume, Berlin workshop series 2010, contains a selection of papers presented at the 11th International Policy Workshop, held in Berlin, September 28-30, 2008. The workshop was jointly organized by Inwent-Capacity Building International, Germany, and the World Bank in preparation for the World Bank's World Development Report 2010. It provided a forum for an exchange of ideas and viewp...

  16. Wavelet matrix transform for time-series similarity measurement

    Institute of Scientific and Technical Information of China (English)

    HU Zhi-kun; XU Fei; GUI Wei-hua; YANG Chun-hua

    2009-01-01

    A time-series similarity measurement method based on wavelet and matrix transform was proposed, and its anti-noise ability, sensitivity and accuracy were discussed. The time-series sequences were compressed into wavelet subspace, and the sample feature vector and orthogonal basis of the sample time-series sequences were obtained by K-L transform. Then the inner product transform was carried out to project the analyzed time-series sequence onto the orthogonal basis and obtain the analyzed feature vectors. The similarity between the sample feature vector and the analyzed feature vector was calculated by the Euclidean distance. Taking the fault wave of power electronic devices as an example, the experimental results show that the proposed method yields a low-dimensional feature vector, that its anti-noise ability is 30 times that of the plain wavelet method, that its sensitivity is 1/3 that of the plain wavelet method, and that its accuracy is higher than that of the wavelet singular value decomposition method. The proposed method can be applied to similarity matching and indexing for larger time series databases.

  17. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.

  18. Stationary Time Series Analysis Using Information and Spectral Analysis

    Science.gov (United States)

    1992-09-01

    spectral density function of the time series. The spectral density function f(w), 0 < w < 1, is defined as the Fourier transform of ... series with spectral density function f(w). An important result of Pinsker [(1964), p. 196] can be interpreted as providing a formula for asymptotic ... Analysis Papers, Holden-Day, San Francisco, California. Parzen, E. (1958) "On asymptotically efficient consistent estimates of the spectral density function

  19. Gaussian semiparametric estimation of non-stationary time series

    OpenAIRE

    Velasco, Carlos

    1998-01-01

    Generalizing the definition of the memory parameter d in terms of the differentiated series, we showed in Velasco (Non-stationary log-periodogram regression, Forthcoming J. Economet., 1997) that it is possible to estimate consistently the memory of non-stationary processes using methods designed for stationary long-range-dependent time series. In this paper we consider the Gaussian semiparametric estimate analysed by Robinson (Gaussian semiparametric estimation of long range dependence. Ann. ...

  20. Moderate Growth Time Series for Dynamic Combinatorics Modelisation

    CERN Document Server

    Jaff, Luaï; Kacem, Hatem Hadj; Bertelle, Cyrille

    2007-01-01

    Here, we present a family of time series with a simple growth constraint. This family can serve as the basis of a model for emerging computation in business and micro-economy, where global functions can be expressed from local rules. We make explicit a double statistic on these series which allows us to establish a one-to-one correspondence with three other ballot-like structures.

  1. Image-Based Learning Approach Applied to Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    J. C. Chimal-Eguía

    2012-06-01

    Full Text Available In this paper, a new learning approach based on time-series image information is presented. In order to implement this new learning technique, a novel time-series input data representation is also defined. This input data representation is based on information obtained by dividing the image axes into boxes. The difference between this new input data representation and the classical one is that this technique is not time-dependent. This new information is implemented in the new Image-Based Learning Approach (IBLA), and by means of a probabilistic mechanism this learning technique is applied to the interesting problem of time series forecasting. The experimental results indicate that, by using the methodology proposed in this article, it is possible to obtain better results than with classical techniques such as artificial neural networks and support vector machines.

  2. A refined fuzzy time series model for stock market forecasting

    Science.gov (United States)

    Jilani, Tahseen Ahmed; Burney, Syed Muhammad Aqil

    2008-05-01

    Time series models have been used to make predictions of stock prices, academic enrollments, weather, road accident casualties, etc. In this paper we present a simple time-variant fuzzy time series forecasting method. The proposed method uses a heuristic approach to define frequency-density-based partitions of the universe of discourse. We propose a fuzzy metric that uses the frequency-density-based partitioning, together with a trend predictor, to calculate the forecast. The new method is applied to forecasting the TAIEX and the enrollments of the University of Alabama. It is shown that the proposed method works with higher accuracy compared to other fuzzy time series methods developed for forecasting the TAIEX and the enrollments of the University of Alabama.

  3. Weighted statistical parameters for irregularly sampled time series

    CERN Document Server

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time, or corrupt measurements, for example, or can be inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. This paper aims at improving the accuracy of common statistical parameters for the characterization of irregularly sampled signals. The uneven representation of time series, often including clumps of measurements and gaps with no data, can severely disrupt the values of estimators. A weighting scheme adapting to the sampling density and noise level of the signal is formulated. Its application to time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the sugg...

  4. First time-series optical photometry from Antarctica

    CERN Document Server

    Strassmeier, K G; Granzer, T; Tosti, G; DiVarano, I; Savanov, I; Bagaglia, M; Castellini, S; Mancini, A; Nucciarelli, G; Straniero, O; Distefano, E; Messina, S; Cutispoto, G

    2008-01-01

    Beating the Earth's day-night cycle is mandatory for long and continuous time-series photometry and has been achieved with either large ground-based networks of observatories at different geographic longitudes or observations conducted from space. A third possibility is offered by a polar location with astronomically qualified site characteristics. Aims: In this paper, we present the first scientific stellar time-series optical photometry from Dome C in Antarctica and analyze approximately 13,000 CCD frames taken in July 2007. We conclude that high-precision CCD photometry with exceptional time coverage and cadence can be obtained at Dome C in Antarctica and successfully used for time-series astrophysics.

  5. Time series analysis of the response of measurement instruments

    CERN Document Server

    Georgakaki, Dimitra; Polatoglou, Hariton

    2012-01-01

    In this work the significance of treating a set of measurements as a time series is explored. Time Series Analysis (TSA) techniques, part of the Exploratory Data Analysis (EDA) approach, can provide much insight into the stochastic correlations that are induced on the outcome of an experiment by the measurement system, and can provide criteria for the limited use of the classical variance in metrology. Specifically, techniques such as lag plots, the autocorrelation function, the power spectral density and the Allan variance are used to analyze series of sequential measurements, collected at equal time intervals from an electromechanical transducer. These techniques are used in conjunction with power-law models of stochastic noise in order to characterize time or frequency regimes for which the usually assumed white-noise model is adequate for describing the measurement system response. However, through the detection of colored noise, usually referred to as flicker noise, which is expected to appear ...
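
    As an illustration of one of the tools mentioned (the Allan variance), the sketch below computes a non-overlapping Allan variance of equally spaced measurements as a function of the averaging-bin size; it is a generic textbook formulation, not the instrument-specific analysis of the paper.

```python
import numpy as np

def allan_variance(x, m):
    """Non-overlapping Allan variance for averaging bins of m consecutive samples."""
    x = np.asarray(x, dtype=float)
    n_bins = len(x) // m
    bin_means = x[:n_bins * m].reshape(n_bins, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(bin_means) ** 2)

# Example: for white noise the Allan variance should fall roughly as 1/m,
# whereas flicker (1/f) noise would flatten out at large m.
rng = np.random.default_rng(1)
noise = rng.normal(size=10000)
for m in (1, 10, 100):
    print(m, allan_variance(noise, m))
```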

  6. Periodicity Estimation in Mechanical Acoustic Time-Series Data

    Directory of Open Access Journals (Sweden)

    Zhu Yongbo

    2015-01-01

    Full Text Available Periodicity estimation in mechanical acoustic time-series data is a well-established problem in data mining, as it is applicable in a variety of disciplines either for anomaly detection or for prediction purposes in industry. In this paper, we develop a new approach for capturing and characterizing periodic patterns in time-series data by means of dynamic time warping (DTW). We have conducted extensive experiments to evaluate the proposed approach with synthetic data and with data collected in practice. The experimental results demonstrate its effectiveness and robustness for periodicity detection in highly noisy data.
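
    The abstract does not give implementation details, but the core DTW distance it relies on is standard. Below is a minimal dynamic-programming DTW in NumPy; comparing a reference window against windows taken at different offsets of the signal is one simple (assumed) way such a distance can be used to probe periodic structure.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Example: a noisy sinusoid with period 50 samples
rng = np.random.default_rng(2)
t = np.arange(400)
signal = np.sin(2 * np.pi * t / 50) + 0.2 * rng.normal(size=t.size)
template = signal[:50]
print(dtw_distance(template, signal[50:100]))   # window aligned with the period: small distance
print(dtw_distance(template, signal[25:75]))    # window half a period out of phase: larger distance
```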

  7. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to them. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge ... and mutation operations for this problem, we conduct extensive experiments to determine good choices for the parameters and operators of the genetic algorithm. One surprising observation is that use of uniform and one-point crossover together gave significantly better results than using either crossover ...

  8. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most existing research, which focuses on a single futures contract and lacks comparison across different periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the three most recent years. Besides basic statistical analysis, the paper uses GARCH and EGARCH models to describe the series that exhibit ARCH effects and analyzes the persistence of volatility shocks and the leverage effect. The results show that, compared with a normal distribution, the wheat futures reward series are non-normal, leptokurtic and heavy-tailed. The study also found that two of the reward series show no autocorrelation. Among the six correlated series, three exhibit ARCH effects. Using the autoregressive distributed lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; on the other hand, they reflect some shortcomings, such as the immaturity of, and over-regulation by the government in, the Chinese futures market.

  9. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    CERN Document Server

    Donges, Jonathan F; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen

    2015-01-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence qua...
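
    Without reproducing the pyunicorn API here, the recurrence-based part of such an analysis reduces to a thresholded distance matrix of delay-embedded state vectors. The sketch below (plain NumPy, not pyunicorn code) shows this basic construction and the recurrence rate as the simplest recurrence quantification measure; the embedding parameters and threshold are arbitrary choices.

```python
import numpy as np

def delay_embedding(x, dim=3, tau=1):
    """Embed a scalar series into dim-dimensional delay vectors."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def recurrence_matrix(x, eps, dim=3, tau=1):
    """Binary recurrence matrix R[i, j] = 1 if states i and j are closer than eps."""
    emb = delay_embedding(x, dim, tau)
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dists <= eps).astype(int)

# Example: recurrence rate of a noisy sine wave
rng = np.random.default_rng(3)
t = np.linspace(0, 20 * np.pi, 1000)
x = np.sin(t) + 0.1 * rng.normal(size=t.size)
R = recurrence_matrix(x, eps=0.3, dim=3, tau=10)
print("recurrence rate:", R.mean())
```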

  10. Preparing Landsat Image Time Series (LITS) for Monitoring Changes in Vegetation Phenology in Queensland, Australia

    OpenAIRE

    Santosh Bhandari; Tony Gill; Stuart Phinn

    2012-01-01

    Time series of images are required to extract and separate information on vegetation change due to phenological cycles, inter-annual climatic variability, and long-term trends. While images from the Landsat Thematic Mapper (TM) sensor have the spatial and spectral characteristics suited to mapping a range of vegetation structural and compositional properties, its 16-day revisit period, combined with cloud cover problems and a seasonally limited latitudinal range, limits the availability of image...

  11. The Application of Kernel Smoothing to Time Series Data

    Institute of Scientific and Technical Information of China (English)

    Zhao-jun Wang; Yi Zhao; Chun-jie Wu; Yan-ting Li

    2006-01-01

    There are already many models for fitting stationary time series, such as AR, MA, and ARMA models. For non-stationary data, an ARIMA or seasonal ARIMA model can be used to fit the given data. Moreover, there are also many statistical software packages that can be used to build a stationary or non-stationary time series model for a given set of time series data, such as SAS, SPLUS, etc. However, some statistical packages do not work well for small samples with or without missing data, especially for small time series with a seasonal trend. A nonparametric smoothing technique for building a forecasting model for small seasonal time series data is developed in this paper. Both the method provided in this paper and that in the SAS package are then applied to modeling the international airline passengers data, and the two methods are compared. The results of the comparison show that the method provided in this paper outperforms the SAS approach.
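
    The paper's own smoother is not specified in the abstract; as a generic illustration of kernel smoothing applied to a short seasonal series, the sketch below uses a Nadaraya-Watson estimator with a Gaussian kernel. The bandwidth value is an arbitrary assumption, not the paper's choice.

```python
import numpy as np

def nadaraya_watson(t, y, t_eval, bandwidth):
    """Gaussian-kernel Nadaraya-Watson smoother evaluated at the points t_eval."""
    t, y, t_eval = (np.asarray(a, dtype=float) for a in (t, y, t_eval))
    w = np.exp(-0.5 * ((t_eval[:, None] - t[None, :]) / bandwidth) ** 2)  # kernel weights
    return (w @ y) / w.sum(axis=1)

# Example: smooth a short noisy seasonal series (monthly data, 3 years)
rng = np.random.default_rng(4)
months = np.arange(36)
y = 100 + 2 * months + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, 36)
smoothed = nadaraya_watson(months, y, months, bandwidth=1.5)
print(np.round(smoothed[:6], 1))
```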

  12. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    2013-01-01

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of co

  13. Nonlinear projective filtering; 1, Application to real time series

    CERN Document Server

    Schreiber, T

    1998-01-01

    We discuss applications of nonlinear filtering of time series by locally linear phase space projections. Noise can be reduced whenever the error due to the manifold approximation is smaller than the noise in the system. Examples include the real time extraction of the fetal electrocardiogram from abdominal recordings.

  14. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, with one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time...

  15. Detection of long term persistence in time series of the Neuquen River (Argentina)

    Science.gov (United States)

    Seoane, Rafael; Paz González, Antonio

    2014-05-01

    In the Patagonian region (Argentina), previous hydrometeorological studies based on general circulation models show variations in annual mean flows. Future climate scenarios obtained from high-resolution models indicate decreases in total annual precipitation, and these decreases are most pronounced in the Neuquén river basin (23,000 km2). The aim of this study was the estimation of long-term persistence in the Neuquén River basin (Argentina). The variation in long-range dependence and long memory of the time series was evaluated with the Hurst exponent. We applied rescaled adjusted range (R/S) analysis to time series of river discharges measured from 1903 to 2011, divided into two subperiods: 1903 to 1970 and 1970 to 2011. Results show a small increase in persistence for the second period. Our results are consistent with those obtained by Koch and Markovic (2007), who observed an increase of the H exponent for the period 1960-2000 in the Elbe River (Germany). References: Hurst, H. (1951). Long-term storage capacities of reservoirs. Trans. Am. Soc. Civil Engrs., 116:776-808. Koch and Markovic (2007). Evidences for Climate Change in Germany over the 20th Century from the Stochastic Analysis of hydro-meteorological Time Series, MODSIM07, International Congress on Modelling and Simulation, Christchurch, New Zealand.
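
    The rescaled adjusted range (R/S) estimate of the Hurst exponent used in the study follows a standard recipe; the sketch below is a generic NumPy version (window sizes chosen arbitrarily), not the authors' code.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    mean_rs = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            z = np.cumsum(seg - seg.mean())      # cumulative deviations from the window mean
            r = z.max() - z.min()                # adjusted range
            s = seg.std(ddof=1)                  # window standard deviation
            if s > 0:
                rs.append(r / s)
        mean_rs.append(np.mean(rs))
    # H is the slope of log(R/S) versus log(window size)
    h, _ = np.polyfit(np.log(window_sizes), np.log(mean_rs), 1)
    return h

# Example: white noise should give H close to 0.5; persistent discharge series give H > 0.5
rng = np.random.default_rng(5)
print(hurst_rs(rng.normal(size=4096)))
```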

  16. Mining approximate periodic pattern in hydrological time series

    Science.gov (United States)

    Zhu, Y. L.; Li, S. J.; Bao, N. N.; Wan, D. S.

    2012-04-01

    Long sequences of hydrological time series contain a wealth of information about the hidden laws of natural evolution and the influence of human activities on the Earth's surface. Data mining technology can help find those hidden laws, such as flood frequency and abrupt change, which is useful for decision support in hydrological prediction and flood control scheduling. The periodic nature of hydrological time series is important for trend forecasting of drought and flood and for hydraulic engineering planning. In hydrology, full-period analysis of hydrological time series has attracted a lot of attention, for example the discrete periodogram, the simple partial wave method, Fourier analysis, the maximum entropy spectral analysis method and wavelet analysis. In fact, the hydrological process is influenced both by deterministic factors and by stochastic ones. For example, the tidal level is affected by the Moon circling the Earth, in addition to the Earth's revolution and rotation. Hence, some kind of approximate period is hidden in the hydrological time series, sometimes also called a cryptic period. Recently, partial period mining, which originated in the data mining domain, has become a remedy for the traditional period analysis methods in hydrology, since it imposes looser requirements on data integrity and continuity and can find partial periods in the time series. This paper focuses on partial period mining in hydrological time series. Based on asynchronous periodic patterns and partial period mining with a suffix tree, this paper proposes to mine multi-event asynchronous periodic patterns based on a modified suffix tree representation and traversal, and devises a dynamic method for adjusting candidate period intervals, which avoids period omissions and waste of time and space. The experimental results on synthetic data and real water level data of the Yangtze River at Nanjing station indicate that this algorithm can discover hydrological

  17. Real Time Clustering of Time Series Using Triangular Potentials

    Directory of Open Access Journals (Sweden)

    Aldo Pacchiano

    2015-01-01

    Full Text Available Motivated by the problem of computing investment portfolio weightings we investigate various methods of clustering as alternatives to traditional mean-variance approaches. Such methods can have significant benefits from a practical point of view since they remove the need to invert a sample covariance matrix, which can suffer from estimation error and will almost certainly be non-stationary. The general idea is to find groups of assets which share similar return characteristics over time and treat each group as a single composite asset. We then apply inverse volatility weightings to these new composite assets. In the course of our investigation we devise a method of clustering based on triangular potentials and we present associated theoretical results as well as various examples based on synthetic data.

  18. A Platform for Processing Expression of Short Time Series (PESTS

    Directory of Open Access Journals (Sweden)

    Markatou Marianthi

    2011-01-01

    Full Text Available Abstract Background Time course microarray profiles examine the expression of genes over a time domain. They are necessary in order to determine the complete set of genes that are dynamically expressed under given conditions, and to determine the interaction between these genes. Because of cost and resource issues, most time series datasets contain less than 9 points and there are few tools available geared towards the analysis of this type of data. Results To this end, we introduce a platform for Processing Expression of Short Time Series (PESTS. It was designed with a focus on usability and interpretability of analyses for the researcher. As such, it implements several standard techniques for comparability as well as visualization functions. However, it is designed specifically for the unique methods we have developed for significance analysis, multiple test correction and clustering of short time series data. The central tenet of these methods is the use of biologically relevant features for analysis. Features summarize short gene expression profiles, inherently incorporate dependence across time, and allow for both full description of the examined curve and missing data points. Conclusions PESTS is fully generalizable to other types of time series analyses. PESTS implements novel methods as well as several standard techniques for comparability and visualization functions. These features and functionality make PESTS a valuable resource for a researcher's toolkit. PESTS is available to download for free to academic and non-profit users at http://www.mailman.columbia.edu/academic-departments/biostatistics/research-service/software-development.

  19. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

    Full Text Available In order to detect outliers in hydrological time series data, with the goal of improving data quality and the quality of decisions related to the design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which is calculated from the predicted value and a confidence coefficient. The use of the PCI as a threshold rests mainly on the fact that it accounts for the uncertainty of the data series parameters in the forecasting model, which addresses the problem of selecting a suitable threshold. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
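
    A minimal sketch of the idea (not the authors' implementation): fit a simple autoregressive forecaster on a trailing window, predict the next observation, and flag it when it falls outside a prediction interval built from the in-window residual spread. The window length, AR order and confidence coefficient below are arbitrary assumptions.

```python
import numpy as np

def detect_outliers(x, window=50, p=3, conf_coef=3.0):
    """Flag points falling outside an AR(p) prediction interval fitted on a sliding window."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for t in range(window, len(x)):
        hist = x[t - window:t]
        y = hist[p:]
        lags = np.column_stack([hist[p - k - 1: len(hist) - k - 1] for k in range(p)])
        design = np.column_stack([np.ones(len(y)), lags])
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid_std = (y - design @ beta).std(ddof=p + 1)
        pred = float(np.concatenate(([1.0], hist[::-1][:p])) @ beta)
        # prediction confidence interval: pred +/- conf_coef * resid_std
        flags[t] = abs(x[t] - pred) > conf_coef * resid_std
    return flags

# Example: inject a spike into a noisy water-level-like series and list the flagged indices
rng = np.random.default_rng(6)
level = np.cumsum(rng.normal(0, 0.1, 500)) + 10
level[300] += 5.0
print(np.where(detect_outliers(level))[0])
```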

  20. Increment entropy as a measure of complexity for time series

    CERN Document Server

    Liu, Xiaofeng; Xu, Ning; Xue, Jianru

    2015-01-01

    Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce increment entropy to measure the complexity of time series, in which each increment is mapped onto a word of two letters, one letter corresponding to the direction and the other to the magnitude. The Shannon entropy of the words is termed increment entropy (IncrEn). Simulations on synthetic data and tests on epileptic EEG signals demonstrate its ability to detect abrupt changes, regardless of whether they are energetic (e.g. spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series and is applicable to arbitrary real-world data.
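
    A rough sketch of the idea, under stated assumptions: each increment is encoded as a (sign, quantised magnitude) letter pair, letters are grouped into words of length m, and the Shannon entropy of the word distribution is returned. The quantisation rule and the parameters m and R below follow common choices and are not taken from the paper.

```python
import numpy as np

def increment_entropy(x, m=2, R=4):
    """Shannon entropy of words built from (sign, quantised magnitude) of increments."""
    inc = np.diff(np.asarray(x, dtype=float))
    sd = inc.std()
    signs = np.sign(inc).astype(int)
    mags = np.zeros_like(signs) if sd == 0 else np.minimum(
        np.floor(np.abs(inc) * R / sd), R).astype(int)
    letters = list(zip(signs.tolist(), mags.tolist()))
    words = [tuple(letters[i:i + m]) for i in range(len(letters) - m + 1)]
    _, counts = np.unique([str(w) for w in words], return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Example: a smooth regular signal should score lower than white noise
rng = np.random.default_rng(7)
t = np.arange(1000)
print(increment_entropy(np.sin(0.1 * t)), increment_entropy(rng.normal(size=1000)))
```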

  1. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and data transformation methods, trying to find the implicit relationships within the time series data. In this paper, GM(1,1), based on first-order, single-variable linear differential equations, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both the long-term and short-term predictions show that the model is effective and can achieve the expected accuracy.

  2. Feature-preserving interpolation and filtering of environmental time series

    CERN Document Server

    Mariethoz, Gregoire; Jougnot, Damien; Rezaee, Hassan

    2015-01-01

    We propose a method for filling gaps and removing interferences in time series for applications involving continuous monitoring of environmental variables. The approach is non-parametric and based on an iterative pattern-matching between the affected and the valid parts of the time series. It considers several variables jointly in the pattern matching process and allows preserving linear or non-linear dependences between variables. The uncertainty in the reconstructed time series is quantified through multiple realizations. The method is tested on self-potential data that are affected by strong interferences as well as data gaps, and the results show that our approach allows reproducing the spectral features of the original signal. Even in the presence of intense signal perturbations, it significantly improves the signal and corrects bias introduced by asymmetrical interferences. Potential applications are wide-ranging, including geophysics, meteorology and hydrology.

  3. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  4. GAS DETECTING AND FORECASTING VIA TIME SERIES METHOD

    Institute of Scientific and Technical Information of China (English)

    黄养光

    1990-01-01

    The importance and urgency of gas detection and forecasting in underground coal mining are self-evident. Unfortunately, this problem has not yet been solved thoroughly. In this paper, the author suggests that the time series analysis method be adopted for processing stochastic gas data. The time series method is superior to conventional Fourier analysis in some aspects; in particular, it provides a forecasting (prediction) capability which is highly valuable for gas monitoring. An example of a set of gas data sampled from a certain foul coal mine is investigated and an AR(3) model is established. The fitting result and the forecasting error are satisfactory. At the end of the paper several remarks are presented for further discussion.
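
    As an illustration of the kind of AR(3) model mentioned, the sketch below fits an AR(3) by ordinary least squares and produces a one-step-ahead forecast; this is a generic formulation, not the estimation procedure of the original paper.

```python
import numpy as np

def fit_ar(x, p=3):
    """Least-squares fit of an AR(p) model with intercept; returns the coefficient vector."""
    x = np.asarray(x, dtype=float)
    y = x[p:]
    lags = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    design = np.column_stack([np.ones(len(y)), lags])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef

def forecast_next(x, coef):
    """One-step-ahead forecast from the last p observations."""
    p = len(coef) - 1
    recent = np.asarray(x, dtype=float)[::-1][:p]     # x[t-1], x[t-2], ..., x[t-p]
    return float(coef[0] + coef[1:] @ recent)

# Example with a synthetic 'gas concentration' series generated by an AR(3) process
rng = np.random.default_rng(8)
series = [0.5, 0.52, 0.55]
for _ in range(500):
    series.append(0.1 + 0.5 * series[-1] + 0.2 * series[-2] + 0.1 * series[-3]
                  + 0.01 * rng.normal())
coef = fit_ar(series, p=3)
print(coef, forecast_next(series, coef))
```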

  5. The time series forecasting: from the aspect of network

    CERN Document Server

    Chen, S; Hu, Y; Liu, Q; Deng, Y

    2014-01-01

    Forecasting estimates the state of future events from historical data and is considerably important in many disciplines. At present, time series models have been utilized to solve forecasting problems in various domains. In general, researchers use curve fitting and parameter estimation methods (moment estimation, maximum likelihood estimation and the least squares method) to forecast. In this paper, a new view of forecasting is given and a completely different method is proposed to forecast time series. Inspired by the visibility graph and link prediction, this letter converts a time series into a network and then finds the nodes which are most likely to link with the predicted node. Finally, the predicted value is obtained according to the state of the link. The TAIEX data set is used in a case study to illustrate that the proposed method is effective. Compared with the ARIMA model, the proposed method shows good forecasting performance when there is a small amount of data.

  6. Parameter-Free Search of Time-Series Discord

    Institute of Scientific and Technical Information of China (English)

    Wei Luo; Marcus Gallagher; Janet Wiles

    2013-01-01

    Time-series discord is widely used in data mining applications to characterize anomalous subsequences in time series. Compared to some other discord search algorithms, the direct search algorithm based on the recurrence plot shows the advantage of being fast and parameter free. The direct search algorithm, however, relies on quasi-periodicity in the input time series, an assumption that limits the algorithm's applicability. In this paper, we eliminate the periodicity assumption from the direct search algorithm by proposing a reference function for subsequences and a new sampling strategy based on the reference function. These measures result in a new algorithm with improved efficiency and robustness, as evidenced by our empirical evaluation.

  7. Time series characterization via horizontal visibility graph and Information Theory

    Science.gov (United States)

    Gonçalves, Bruna Amin; Carpi, Laura; Rosso, Osvaldo A.; Ravetti, Martín G.

    2016-12-01

    Complex network theory has gained wider applicability since methods for transforming time series into networks were proposed and successfully tested. In the last few years, the horizontal visibility graph has become a popular method due to its simplicity and good results when applied to natural and artificially generated data. In this work, we explore different ways of extracting information from the network constructed from the horizontal visibility graph, evaluated by Information Theory quantifiers. Most works use the degree distribution of the network; however, we found alternative probability distributions that are more efficient than the degree distribution in characterizing dynamical systems. In particular, we find that, when using distributions based on distances and amplitude values, significantly shorter time series are required. We analyze fractional Brownian motion time series and a paleoclimatic proxy record of ENSO from the Pallcacocha Lake to study dynamical changes during the Holocene.
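
    For concreteness, here is a minimal (brute-force) horizontal visibility graph construction together with the Shannon entropy of its degree distribution, which is the baseline quantifier the authors compare against; the alternative distance- and amplitude-based distributions they propose are not reproduced here.

```python
import numpy as np

def hvg_degrees(x):
    """Node degrees of the horizontal visibility graph of a series (brute force)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(i + 1, n):
            # i and j are linked if every intermediate value lies strictly below both
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of integer values."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Example: degree entropy of white noise versus a strongly periodic signal
rng = np.random.default_rng(9)
noise = rng.normal(size=500)
periodic = np.sin(2 * np.pi * np.arange(500) / 20)
print(shannon_entropy(hvg_degrees(noise)), shannon_entropy(hvg_degrees(periodic)))
```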

  8. Asymptotics for Nonlinear Transformations of Fractionally Integrated Time Series

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The asymptotic theory for nonlinear transformations of fractionally integrated time series is developed. By use of the fractional occupation times formula, various nonlinear functions of fractionally integrated series such as ARFIMA time series are studied, and the asymptotic distributions of the sample moments of such functions are obtained and analyzed. The transformations considered in this paper include a variety of functions, such as regular functions, integrable functions and asymptotically homogeneous functions, that are often used in practical nonlinear econometric analysis. It is shown that the asymptotic theory of nonlinear transformations of original and normalized fractionally integrated processes is different from that of fractionally integrated processes, but is similar to the asymptotic theory of nonlinear transformations of integrated processes.

  9. Complex Network Approach to the Fractional Time Series

    CERN Document Server

    Manshour, Pouya

    2015-01-01

    In order to extract the correlation information inherited in a stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. Parabolic exponential functions are found to fit the corresponding degree distributions, with a Hurst-dependent fitting parameter. Further, we take into account other topological properties such as the maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between the node's degree and its corresp...

  10. On the detection of superdiffusive behaviour in time series

    Science.gov (United States)

    Gottwald, G. A.; Melbourne, I.

    2016-12-01

    We present a new method for detecting superdiffusive behaviour and for determining rates of superdiffusion in time series data. Our method applies equally to stochastic and deterministic time series data (with no prior knowledge required of the nature of the data) and relies on one realisation (i.e. one sample path) of the process. Linear drift effects are automatically removed without any preprocessing. We show numerical results for time series constructed from i.i.d. α-stable random variables and from deterministic weakly chaotic maps. We compare our method with the standard method of estimating the growth rate of the mean-square displacement, as well as with the p-variation method, maximum likelihood, quantile matching and linear regression of the empirical characteristic function.
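
    For comparison purposes, the "standard method" the authors benchmark against is easy to sketch: estimate the mean-square displacement as a function of lag from a single sample path and fit its growth rate on a log-log scale. Note that, unlike the proposed method, this naive version does not remove linear drift, and the lag range used for the fit is an arbitrary assumption.

```python
import numpy as np

def msd_exponent(x, max_lag=None):
    """Growth-rate exponent gamma of the mean-square displacement, MSD(lag) ~ lag**gamma."""
    x = np.asarray(x, dtype=float)
    max_lag = max_lag or len(x) // 10
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean((x[l:] - x[:-l]) ** 2) for l in lags])
    gamma, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return gamma

# Example: ordinary Brownian motion gives gamma close to 1 (diffusive);
# gamma significantly above 1 would indicate superdiffusion.
rng = np.random.default_rng(10)
brownian = np.cumsum(rng.normal(size=20000))
print(msd_exponent(brownian))
```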

  11. Increment Entropy as a Measure of Complexity for Time Series

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    2016-01-01

    Full Text Available Entropy has been a common index to quantify the complexity of time series in a variety of fields. Here, we introduce an increment entropy to measure the complexity of time series in which each increment is mapped onto a word of two letters, one corresponding to the sign and the other corresponding to the magnitude. Increment entropy (IncrEn) is defined as the Shannon entropy of the words. Simulations on synthetic data and tests on epileptic electroencephalogram (EEG) signals demonstrate its ability to detect abrupt changes, regardless of whether they are energetic (e.g., spikes or bursts) or structural. The computation of IncrEn makes no assumptions about the time series, and it is applicable to arbitrary real-world data.

  12. Time series, correlation matrices and random matrix models

    Energy Technology Data Exchange (ETDEWEB)

    Vinayak [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca (Mexico); Seligman, Thomas H. [Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México, C.P. 62210 Cuernavaca, México and Centro Internacional de Ciencias, C.P. 62210 Cuernavaca (Mexico)

    2014-01-08

    In this set of five lectures the authors present techniques to analyze open classical and quantum systems using correlation matrices. For diverse reasons we shall see that random matrices play an important role in describing a null hypothesis, or a minimum-information hypothesis, for the description of a quantum system or subsystem. In the former case we consider various forms of correlation matrices of time series associated with the classical observables of some system. The fact that such series are necessarily finite inevitably introduces noise, and this finite-time influence leads to a random or stochastic component in these time series. As a consequence, the correlation matrices have a random component, and corresponding ensembles are used. In the latter case we use random matrices to describe high-temperature environments or uncontrolled perturbations, ensembles of differing chaotic systems, etc. The common theme of the lectures is thus the importance of random matrix theory in a wide range of fields in and around physics.

  13. Time series analysis by the Maximum Entropy method

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, B.L.; Rust, B.W.; Van Winkle, W.

    1979-01-01

    The principal subject of this report is the use of the Maximum Entropy method for spectral analysis of time series. The classical Fourier method is also discussed, mainly as a standard for comparison with the Maximum Entropy method. Examples are given which clearly demonstrate the superiority of the latter method over the former when the time series is short. The report also includes a chapter outlining the theory of the method, a discussion of the effects of noise in the data, a chapter on significance tests, a discussion of the problem of choosing the prediction filter length, and, most importantly, a description of a package of FORTRAN subroutines for making the various calculations. Cross-referenced program listings are given in the appendices. The report also includes a chapter demonstrating the use of the programs by means of an example. Real time series like the lynx data and sunspot numbers are also analyzed. 22 figures, 21 tables, 53 references.
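
    The report itself ships FORTRAN subroutines; as a small, independent illustration of the Maximum Entropy (Burg) idea it describes, the sketch below implements the classic Burg recursion for the AR coefficients and evaluates the corresponding all-pole spectrum (up to an overall normalisation). The model order is an arbitrary choice, not the report's.

```python
import numpy as np

def burg_ar(x, order):
    """Burg's method: AR coefficients a (with a[0] = 1) and residual power E."""
    x = np.asarray(x, dtype=float)
    a = np.array([1.0])
    E = np.dot(x, x) / len(x)
    f, b = x.copy(), x.copy()                # forward and backward prediction errors
    for _ in range(order):
        fp, bp = f[1:], b[:-1]
        k = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))  # reflection coefficient
        f, b = fp + k * bp, bp + k * fp      # update prediction errors
        a = np.concatenate([a, [0.0]]) + k * np.concatenate([[0.0], a[::-1]])
        E *= 1.0 - k * k
    return a, E

def burg_spectrum(a, E, n_freq=512):
    """All-pole spectral estimate at normalised frequencies in [0, 0.5]."""
    freqs = np.linspace(0.0, 0.5, n_freq)
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(len(a))))
    return freqs, E / np.abs(z @ a) ** 2

# Example: a short noisy sinusoid should show a clear peak near its frequency (0.1)
rng = np.random.default_rng(11)
t = np.arange(128)
x = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.normal(size=t.size)
freqs, psd = burg_spectrum(*burg_ar(x, order=8))
print(freqs[np.argmax(psd)])
```

    The sharpness of such AR-based spectral peaks on short records is exactly the advantage over the classical Fourier approach that the report emphasises.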

  14. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J;

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the internal dynamics of the system and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and whether this determinism has chaotic attributes. The method relies on fitting a nonlinear autoregressive model to the time series, followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations...

  15. Extracting unstable periodic orbits from chaotic time series data

    Energy Technology Data Exchange (ETDEWEB)

    So, P.; Schiff, S.; Gluckman, B.J., [Center for Neuroscience, Childrens Research Institute, Childrens National Medical Center and the George Washington University, NW, Washington, D.C. 20010 (United States); So, P.; Ott, E.; Grebogi, C., [Institute for Plasma Research, University of Maryland, College Park, Maryland 20742 (United States); Sauer, T., [Department of Mathematics, The George Mason University, Fairfax, Virginia 22030 (United States); Gluckman, B.J., [Naval Surface Warfare Center, Carderock Division, Bethesda, Maryland 20054-5000 (United States)

    1997-05-01

    A general nonlinear method to extract unstable periodic orbits from chaotic time series is proposed. By utilizing the estimated local dynamics along a trajectory, we devise a transformation of the time series data such that the transformed data are concentrated on the periodic orbits. Thus, one can extract unstable periodic orbits from a chaotic time series by simply looking for peaks in a finite grid approximation of the distribution function of the transformed data. Our method is demonstrated using data from both numerical and experimental examples, including neuronal ensemble data from mammalian brain slices. The statistical significance of the results in the presence of noise is assessed using surrogate data. © 1997 The American Physical Society.

  16. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    Science.gov (United States)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists, and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields 'objective' results. In the present paper it is argued that most of the classical nonlinear techniques do not satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. Using artificial data sets and empirical time series, we show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.
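
    The surrogate-data approach the authors advocate can be sketched generically: generate phase-randomised surrogates that preserve the linear (spectral) structure of the series, compute a nonlinear discriminating statistic on the original and on each surrogate, and compare ranks. The time-asymmetry statistic in the example is just one common choice, not necessarily the one used in the paper.

```python
import numpy as np

def phase_randomised_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomised Fourier phases."""
    x = np.asarray(x, dtype=float)
    spec = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, spec.shape)
    phases[0] = 0.0
    if len(x) % 2 == 0:
        phases[-1] = 0.0                     # keep the Nyquist component real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), len(x))

def surrogate_test(x, statistic, n_surrogates=99, seed=0):
    """Rank-based p-value of a nonlinearity statistic against phase-randomised surrogates."""
    rng = np.random.default_rng(seed)
    s_obs = statistic(x)
    s_sur = np.array([statistic(phase_randomised_surrogate(x, rng))
                      for _ in range(n_surrogates)])
    p_value = (1 + np.sum(np.abs(s_sur) >= abs(s_obs))) / (n_surrogates + 1)
    return s_obs, p_value

# Example statistic: third-order time asymmetry, which vanishes in expectation
# for linear Gaussian processes, so a linear series should give a large p-value.
time_asymmetry = lambda x: float(np.mean(np.diff(x) ** 3))
rng = np.random.default_rng(12)
linear_series = np.convolve(rng.normal(size=2100), np.ones(5) / 5, mode='valid')
print(surrogate_test(linear_series, time_asymmetry))
```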

  17. General expression for linear and nonlinear time series models

    Institute of Scientific and Technical Information of China (English)

    Ren HUANG; Feiyun XU; Ruwen CHEN

    2009-01-01

    Typical time series models such as ARMA, AR, and MA are founded on the normality and stationarity of a system and are expressed by a linear difference equation; therefore, they are strictly limited to linear systems. However, nonlinear factors are present in practical systems, so it is difficult to fit models for real systems with the above models. This paper proposes a general expression for linear and nonlinear auto-regressive time series models (GNAR). With the gradient optimization method and a modified AIC information criterion integrated with the prediction error, parameter estimation and order determination are achieved. Model simulation and experiments show that the GNAR model can accurately approximate the dynamic characteristics of most nonlinear models used in academia and engineering. The modeling and prediction accuracy of the GNAR model is superior to that of the classical time series models. The proposed GNAR model is flexible and effective.

  18. Detection of inhomogeneities in precipitation time series in Portugal using direct sequential simulation

    Science.gov (United States)

    Ribeiro, Sara; Caineta, Júlio; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2016-05-01

    Climate data homogenisation is of major importance in climate change monitoring, validation of weather forecasting, general circulation and regional atmospheric models, modelling of erosion, drought monitoring, and other studies of hydrological and environmental impacts. The reason is that non-climatic factors can cause discontinuities in time series which may hide the true climatic signal and patterns, thus potentially biasing the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of them is based on a geostatistical simulation technique (DSS - direct sequential simulation), where local probability density functions (pdfs) are calculated at candidate monitoring stations using spatially and temporally neighbouring observations, which are then used for the detection of inhomogeneities. Such an approach has previously been applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). That study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneity detection in climate time series. This work extends the previous case study and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall, Wald-Wolfowitz runs, Von Neumann ratio, Pettitt, Buishand range test, and the standard normal homogeneity test (SNHT) for a single break). Moreover, a sensitivity analysis is performed to investigate the number of simulated realisations that should be used to infer the local pdfs with more accuracy. Accordingly, the number of simulations per iteration was increased from 50 to 500, which resulted in a more representative local pdf. As in the previous study, the results are compared with those from the

  19. Chaotic time series prediction and additive white Gaussian noise

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Teck Por [Department of Mathematics, 6M30 Huxley, Imperial College London, 180 Queen's Gate, London, SW7 2BZ (United Kingdom)]. E-mail: teckpor@gmail.com; Puthusserypady, Sadasivan [Department of Electrical and Computer Engineering, National University of Singapore, 4 Engineering Drive 3, Singapore 117576 (Singapore)]. E-mail: elespk@nus.edu.sg

    2007-06-04

    Takens' delay embedding theorem states that a pseudo state-space can be reconstructed from a time series consisting of observations of a chaotic process. However, experimental observations are inevitably corrupted by measurement noise, which can be modelled as Additive White Gaussian Noise (AWGN). This Letter analyses time series prediction in the presence of AWGN using the triangle inequality and the mean of the Nakagami distribution. It is shown that using more delay coordinates than those used by a typical delay embedding can improve prediction accuracy, when the mean magnitude of the input vector dominates the mean magnitude of the AWGN.
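
    To make the setting concrete, here is a minimal delay-embedding, nearest-neighbour one-step predictor in NumPy. It is a generic textbook construction (the Letter's analysis of embedding dimension versus AWGN level is not reproduced), and the embedding parameters are arbitrary.

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Delay-coordinate vectors of a scalar series."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def nn_forecast(x, dim=5, tau=1):
    """Predict the next value as the successor of the nearest embedded neighbour."""
    x = np.asarray(x, dtype=float)
    emb = delay_embed(x, dim, tau)
    query = emb[-1]                              # current state
    past = emb[:-1]                              # earlier states that have a known successor
    nearest = int(np.argmin(np.linalg.norm(past - query, axis=1)))
    return x[nearest + (dim - 1) * tau + 1]      # value following the nearest neighbour

# Example: logistic map observed with additive measurement noise
rng = np.random.default_rng(13)
z = [0.4]
for _ in range(2000):
    z.append(3.9 * z[-1] * (1.0 - z[-1]))
observed = np.array(z) + 0.01 * rng.normal(size=len(z))
print(nn_forecast(observed), 3.9 * z[-1] * (1.0 - z[-1]))   # prediction vs. noise-free next value
```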

  20. Mining Rules from Electrical Load Time Series Data Set

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The mining of rules from electrical load time series data collected from the EMS (Energy Management System) is discussed. The data from the EMS are too large and complex to be understood and used directly by the power system engineer, yet useful information is hidden in the electrical load data. The authors discuss the use of fuzzy linguistic summaries as a data mining method to induce rules from the electrical load time series. The data preprocessing techniques are also discussed in the paper.

  1. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version ...-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  2. Nonlinear Time Series Forecast Using Radial Basis Function Neural Networks

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xin; CHEN Tian-Lun

    2003-01-01

    In the research on using Radial Basis Function Neural Networks (RBF NN) to forecast nonlinear time series, we investigate how different clusterings affect the process of learning and forecasting. We find that k-means clustering is very suitable. In order to increase the precision, we introduce a nonlinear feedback term to escape from local minima of the energy; we then use the model to forecast nonlinear time series produced by the Mackey-Glass equation and by stocks. By selecting k-means clustering and a suitable feedback term, much better forecasting results are obtained.

  3. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...

  4. Testing for intracycle determinism in pseudoperiodic time series

    Science.gov (United States)

    Coelho, Mara C. S.; Mendes, Eduardo M. A. M.; Aguirre, Luis A.

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  5. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    , which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time-series compression models. Due to the bandwidth limits relative to the potentially sheer speed of the data, it is inevitable to compress and re-compress data along the dissemination paths according to the subscription level of each node. Compression causes a loss of accuracy in the data, so we devise several algorithms...

  6. Bootstrap Power of Time Series Goodness of fit tests

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2013-10-01

    Full Text Available In this article, we examine the power of various versions of the Box-Pierce statistic and of the Cramer-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison is also made between the semi-parametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation is reversed for the Cramer-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed-design bootstrap method.

  7. Kālī: Time series data modeler

    Science.gov (United States)

    Kasliwal, Vishal P.

    2016-07-01

    The fully parallelized and vectorized software package Kālī models time series data using various stochastic processes such as continuous-time ARMA (C-ARMA) processes and uses Bayesian Markov Chain Monte Carlo (MCMC) to infer a stochastic light curve. Kālī is written in C++ with Python language bindings for ease of use. Kālī is named jointly after the Hindu goddess of time, change, and power, and also as an acronym for KArma LIbrary.

  8. Time is an affliction: Why ecology cannot be as predictive as physics and why it needs time series

    Science.gov (United States)

    Boero, F.; Kraberg, A. C.; Krause, G.; Wiltshire, K. H.

    2015-07-01

    Ecological systems depend on both constraints and historical contingencies, both of which shape their present observable system state. In contrast to ahistorical systems, which are governed solely by constraints (i.e. laws), historical systems and their dynamics can be understood only if properly described over the course of time. Describing these dynamics and understanding long-term variability can be seen as the mission of long time series, measuring not only simple abiotic features but also complex biological variables, such as species diversity and abundances, allowing deep insights into the functioning of food webs and ecosystems in general. Long time series are irreplaceable for understanding change and the crucially inherent system variability, and thus for envisaging future scenarios. Notwithstanding this, current policies for funding and evaluating scientific research discourage the maintenance of long-term series, despite a clear need for long-term strategies to cope with climate change. Time series are crucial for the pursuit of the much-invoked Ecosystem Approach and for the passage from simple monitoring programs to large-scale and long-term Earth observatories, thus promoting a better understanding of the causes and effects of change in ecosystems. The few ongoing long time series in European waters must be integrated and networked so as to facilitate the formation of nodes of a series of observatories which, together, should allow the long-term management of the features and characteristics of European waters. Human capacity building in this area of expertise and stronger societal involvement are also urgently needed, since the expertise in recognizing and describing species, and therefore recording them reliably in the context of time series, is rapidly vanishing from the European scientific community.

  9. Fractal dimension of electroencephalographic time series and underlying brain processes.

    Science.gov (United States)

    Lutzenberger, W; Preissl, H; Pulvermüller, F

    1995-10-01

    Fractal dimension has been proposed as a useful measure for the characterization of electrophysiological time series. This paper investigates what the pointwise dimension of electroencephalographic (EEG) time series can reveal about underlying neuronal generators. The following theoretical assumptions concerning brain function were made: (i) within the cortex, strongly coupled neural assemblies exist which oscillate at certain frequencies when they are active, (ii) several such assemblies can oscillate at a time, and (iii) activity flow between assemblies is minimal. Under these assumptions, cortical activity can be considered as the weighted sum of a finite number of oscillations (plus noise). It is shown that the correlation dimension of finite time series generated by multiple oscillators increases monotonically with the number of oscillators. Furthermore, it is shown that a reliable estimate of the pointwise dimension of the raw EEG signal can be calculated from a time series as short as a few seconds. These results indicate that (i) the pointwise dimension of the EEG allows conclusions regarding the number of independently oscillating networks in the cortex, and (ii) a reliable estimate of the pointwise dimension of the EEG is possible on the basis of short raw signals.

  10. FTSPlot: fast time series visualization for large datasets.

    Science.gov (United States)

    Riss, Michael

    2014-01-01

    The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionally with the volume of data. This paper presents FTSPlot, a visualization concept for large-scale time series datasets using techniques from the field of high-performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n x log(N)); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented, and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free, providing a visualization method for long-term electrophysiological experiments.

  11. Seasonal river discharge forecast in alpine catchments using snow map time series and support vector regression approach

    OpenAIRE

    Callegari, Mattia; Mazzoli, Paolo; Gregorio, Ludovica de; Notarnicola, Claudia; PETITTA Marcello; Pasolli, Luca; Seppi, Roberto; Pistocchi, Alberto

    2014-01-01

    The prediction of monthly mean discharge is critical for water resources management. Statistical methods applied on discharge time series are traditionally used for predicting this kind of slow response hydrological events. With this paper we present a Support Vector Regression (SVR) system able to predict monthly mean discharge considering discharge and snow cover extent (250 meters resolution obtained by MODIS images) time series as input. Additional meteorological and climatic variables ar...
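
    A minimal version of such a forecasting setup can be sketched with scikit-learn's SVR, stacking recent discharge and snow-cover values as predictors of the next monthly mean discharge; the synthetic data, lag choices and hyperparameters below are illustrative assumptions, not the configuration used by the authors.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
months = np.arange(240)                                   # 20 years of monthly data
snow = 50 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, months.size)
discharge = 20 + 0.4 * np.roll(snow, 2) + rng.normal(0, 3, months.size)

# Predictors: previous month's discharge and snow extent, plus snow two months back
X = np.column_stack([discharge[2:-1], snow[2:-1], snow[1:-2]])
y = discharge[3:]                                         # target: next month's mean discharge

split = 180                                               # first 15 years for training
model = make_pipeline(StandardScaler(), SVR(kernel='rbf', C=10.0, epsilon=0.5))
model.fit(X[:split], y[:split])
print("test R^2:", round(model.score(X[split:], y[split:]), 3))
```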

  12. LEARNING GRANGER CAUSALITY GRAPHS FOR MULTIVARIATE NONLINEAR TIME SERIES

    Institute of Scientific and Technical Information of China (English)

    Wei GAO; Zheng TIAN

    2009-01-01

    An information theory method is proposed to test Granger causality and contemporaneous conditional independence in Granger causality graph models. In the graphs, the vertex set denotes the component series of the multivariate time series, and the directed edges denote causal dependence, while the undirected edges reflect the instantaneous dependence. The presence of the edges is measured by a statistic based on conditional mutual information and tested by a permutation procedure. Furthermore, for the relations found to exist, a statistic based on the difference between general conditional mutual information and linear conditional mutual information is proposed to test for nonlinearity. The significance of the nonlinearity test statistic is determined by a bootstrap method based on surrogate data. We investigate the finite sample behavior of the procedure through simulated time series with different dependence structures, including linear and nonlinear relations.
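
    As a simplified stand-in for the edge statistic described above, the following sketch uses plain lagged mutual information with a permutation test (rather than the paper's conditional mutual information); the histogram binning and the simulated coupled pair are illustrative assumptions.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Plug-in mutual information estimate from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def lagged_mi_permutation_test(x, y, lag=1, n_perm=500, seed=0):
    """Test whether x at time t-lag carries information about y at time t."""
    rng = np.random.default_rng(seed)
    xs, ys = x[:-lag], y[lag:]
    observed = mutual_info(xs, ys)
    null = np.array([mutual_info(rng.permutation(xs), ys) for _ in range(n_perm)])
    p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
    return observed, p_value

# Illustrative coupled pair: y is driven by lagged x plus noise
rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = 0.6 * np.roll(x, 1) + 0.8 * rng.standard_normal(2000)
print(lagged_mi_permutation_test(x, y, lag=1))
```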

  13. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    CERN Document Server

    Scargle, Jeffrey D; Jackson, Brad; Chiang, James

    2012-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it - an improved and generalized version of Bayesian Blocks (Scargle 1998) - that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multi-variate time series data, analysis of vari...
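
    Readers who want to experiment with this family of algorithms can use, for example, the implementation shipped with Astropy as astropy.stats.bayesian_blocks; the event times below are simulated purely for illustration.

```python
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(42)
# Simulated event arrival times: a quiet interval followed by a brighter one
t_events = np.sort(np.concatenate([
    rng.uniform(0.0, 50.0, size=200),    # low rate
    rng.uniform(50.0, 60.0, size=200),   # burst: higher rate
]))

# Optimal piecewise-constant segmentation of the event rate
edges = bayesian_blocks(t_events, fitness='events', p0=0.01)
print("change points (block edges):", np.round(edges, 2))
```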

  14. Recovery of delay time from time series based on the nearest neighbor method

    Science.gov (United States)

    Prokhorov, M. D.; Ponomarenko, V. I.; Khorev, V. S.

    2013-12-01

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.

  15. Recovery of delay time from time series based on the nearest neighbor method

    Energy Technology Data Exchange (ETDEWEB)

    Prokhorov, M.D., E-mail: mdprokhorov@yandex.ru [Saratov Branch of Kotel'nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Ponomarenko, V.I. [Saratov Branch of Kotel'nikov Institute of Radio Engineering and Electronics of Russian Academy of Sciences, Zelyonaya Street, 38, Saratov 410019 (Russian Federation); Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation); Khorev, V.S. [Department of Nano- and Biomedical Technologies, Saratov State University, Astrakhanskaya Street, 83, Saratov 410012 (Russian Federation)

    2013-12-09

    We propose a method for the recovery of delay time from time series of time-delay systems. The method is based on the nearest neighbor analysis. The method allows one to reconstruct delays in various classes of time-delay systems including systems of high order, systems with several coexisting delays, and nonscalar time-delay systems. It can be applied to time series heavily corrupted by additive and dynamical noise.
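
    A generic variant of the nearest-neighbour idea (a sketch under illustrative assumptions, not necessarily the authors' exact statistic) scans candidate delays and, for each one, checks how similar the time derivatives are at points that are close in the (x(t), x(t-tau)) plane; the mismatch should be smallest near the true delay. The Mackey-Glass parameters below are arbitrary.

```python
import numpy as np

def mackey_glass(n=2000, delay=170, dt=0.1, beta=0.2, gamma=0.1, seed=0):
    """Euler-integrated Mackey-Glass time-delay system (illustrative test signal)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n + delay)
    x[:delay] = 0.5 + 0.05 * rng.standard_normal(delay)
    for i in range(delay, n + delay - 1):
        x[i + 1] = x[i] + dt * (beta * x[i - delay] / (1.0 + x[i - delay] ** 10)
                                - gamma * x[i])
    return x[delay:]

def delay_scan(x, candidates, dt=0.1, exclude=50):
    """Mean derivative mismatch of nearest neighbours in the (x(t), x(t-tau)) plane."""
    dx = np.gradient(x, dt)
    scores = []
    for tau in candidates:
        pts = np.column_stack([x[tau:], x[:-tau]])
        d = dx[tau:]
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        idx = np.arange(len(pts))
        # exclude the point itself and its temporal neighbours
        dist[np.abs(idx[:, None] - idx[None, :]) < exclude] = np.inf
        nn = dist.argmin(axis=1)
        scores.append(np.mean(np.abs(d - d[nn])))
    return np.array(scores)

x = mackey_glass()
candidates = np.arange(50, 301, 10)
scores = delay_scan(x, candidates)
print("recovered delay (samples):", candidates[scores.argmin()], "(true: 170)")
```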

  16. Effects of dating errors on nonparametric trend analyses of speleothem time series

    Directory of Open Access Journals (Sweden)

    M. Mudelsee

    2012-10-01

    A fundamental problem in paleoclimatology is to take fully into account the various error sources when examining proxy records with quantitative methods of statistical time series analysis. Records from dated climate archives such as speleothems add extra uncertainty from the age determination to the other sources, which consist of measurement and proxy errors. This paper examines three stalagmite time series of oxygen isotopic composition (δ18O) from two caves in western Germany: the series AH-1 from the Atta Cave and the series Bu1 and Bu4 from the Bunker Cave. These records carry regional information about past changes in winter precipitation and temperature. U/Th and radiocarbon dating reveal that they cover the later part of the Holocene, the past 8.6 thousand years (ka). We analyse centennial- to millennial-scale climate trends by means of nonparametric Gasser–Müller kernel regression. Error bands around fitted trend curves are determined by combining (1) block bootstrap resampling to preserve noise properties (shape, autocorrelation) of the δ18O residuals and (2) timescale simulations (models StalAge and iscam). The influence of timescale errors on centennial- to millennial-scale trend estimation is not excessively large. We find a "mid-Holocene climate double-swing", from warm to cold to warm winter conditions (6.5 ka to 6.0 ka to 5.1 ka), with warm-cold amplitudes of around 0.5‰ δ18O; this finding is documented by all three records with high confidence. We also quantify the Medieval Warm Period (MWP), the Little Ice Age (LIA) and the current warmth. Our analyses cannot unequivocally support the conclusion that current regional winter climate is warmer than that during the MWP.
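
    The error-band construction can be illustrated with a bare-bones moving-block bootstrap around a kernel-smoothed trend; in this sketch a Nadaraya-Watson smoother stands in for the Gasser–Müller regression of the paper, the timescale simulations are omitted, and the unevenly sampled proxy record is simulated.

```python
import numpy as np

def kernel_trend(t, y, bandwidth):
    """Nadaraya-Watson (Gaussian kernel) trend estimate at the observation times."""
    w = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
    return (w * y[None, :]).sum(axis=1) / w.sum(axis=1)

def block_bootstrap_band(t, y, bandwidth=300.0, block=20, n_boot=400, seed=0):
    """Percentile error band for the trend, preserving residual autocorrelation."""
    rng = np.random.default_rng(seed)
    trend = kernel_trend(t, y, bandwidth)
    resid = y - trend
    n, trends = len(y), []
    for _ in range(n_boot):
        starts = rng.integers(0, n - block, size=n // block + 1)
        boot_resid = np.concatenate([resid[s:s + block] for s in starts])[:n]
        trends.append(kernel_trend(t, trend + boot_resid, bandwidth))
    lo, hi = np.percentile(trends, [2.5, 97.5], axis=0)
    return trend, lo, hi

# Illustrative proxy record: slow trend plus autocorrelated noise on an uneven timescale
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 8600, 500))                       # ages in years
noise = np.convolve(rng.standard_normal(520), np.ones(20) / 20, mode='valid')[:500]
y = 0.0002 * (t - 4300) + 0.5 * np.sin(t / 1200) + noise
trend, lo, hi = block_bootstrap_band(t, y)
print("band width at midpoint:", round(hi[250] - lo[250], 3))
```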

  17. Time-lag effects of global vegetation responses to climate change.

    Science.gov (United States)

    Wu, Donghai; Zhao, Xiang; Liang, Shunlin; Zhou, Tao; Huang, Kaicheng; Tang, Bijian; Zhao, Wenqian

    2015-09-01

    Climate conditions significantly affect vegetation growth in terrestrial ecosystems. Due to the spatial heterogeneity of ecosystems, the vegetation responses to climate vary considerably, with diverse spatial patterns and time-lag effects, which are among the most important mechanisms of climate-vegetation interaction. Extensive studies of large-scale vegetation-climate interactions use simultaneous meteorological and vegetation indicators to develop models; however, the time-lag effects are less considered, which tends to increase uncertainty. In this study, we aim to quantitatively determine the time-lag effects of global vegetation responses to different climatic factors using the GIMMS3g NDVI time series and the CRU temperature, precipitation, and solar radiation datasets. First, this study analyzed the time-lag effects of global vegetation responses to different climatic factors. Then, a multiple linear regression model and partial correlation model were established to statistically analyze the roles of the different climatic factors in the vegetation responses, from which the primary climate-driving factors for different vegetation types were determined. The results showed that (i) both the time-lag effects of the vegetation responses and the major climate-driving factors that significantly affect vegetation growth varied significantly at the global scale, which was related to the diverse vegetation and climate characteristics; (ii) when the time-lag effects were taken into account, the climatic factors explained 64% of the variation in global vegetation growth, 11% (in relative terms) more than a model ignoring the time-lag effects; and (iii) for the areas with a significant change trend in the global GIMMS3g NDVI over the period 1982-2008 (P < 0.05), considering the time-lag effects is quite important for better predicting and evaluating vegetation dynamics against the background of global climate change.
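
    The notion of a time-lag effect can be made concrete with a simple per-pixel lag scan: correlate the NDVI series with each climate series shifted by 0-3 months and keep the lag with the largest absolute correlation. The synthetic monthly series below are purely illustrative; the study itself fits multiple regression and partial-correlation models to GIMMS3g and CRU data.

```python
import numpy as np

def best_lag(ndvi, climate, max_lag=3):
    """Return (lag, correlation) maximizing |corr(ndvi_t, climate_{t-lag})|."""
    results = []
    for lag in range(max_lag + 1):
        x = climate[:-lag] if lag else climate
        y = ndvi[lag:]
        results.append((lag, np.corrcoef(x, y)[0, 1]))
    return max(results, key=lambda r: abs(r[1]))

# Illustrative pixel: NDVI responds to temperature with a two-month delay
rng = np.random.default_rng(0)
months = 27 * 12                               # 1982-2008, monthly
temp = 10 + 8 * np.sin(2 * np.pi * np.arange(months) / 12) + rng.normal(0, 1, months)
ndvi = 0.3 + 0.02 * np.roll(temp, 2) + rng.normal(0, 0.02, months)

print(best_lag(ndvi, temp))   # expected best lag: 2 months
```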

  18. Change detection in a time series of polarimetric SAR images

    DEFF Research Database (Denmark)

    Skriver, Henning; Nielsen, Allan Aasbjerg; Conradsen, Knut

    can be used to detect at which points changes occur in the time series. [1] T. W. Anderson, An Introduction to Multivariate Statistical Analysis, John Wiley, New York, third edition, 2003. [2] K. Conradsen, A. A. Nielsen, J. Schou, and H. Skriver, “A test statistic in the complex Wishart distribution...

  19. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  20. Wavelet methods in (financial) time-series processing

    NARCIS (Netherlands)

    Struzik, Z.R.

    2000-01-01

    We briefly describe the major advantages of using the wavelet transform for the processing of financial time series, using the example of the S&P index. In particular, we show how to uncover the local scaling (correlation) characteristics of the S&P index with the wavelet-based effective Hölder exponent.

  1. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  2. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reor­ ganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  3. Deriving dynamic marketing effectiveness from econometric time series models

    NARCIS (Netherlands)

    C. Horváth (Csilla); Ph.H.B.F. Franses (Philip Hans)

    2003-01-01

    textabstractTo understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the

  4. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...

  5. Publicly Verifiable Private Aggregation of Time-Series Data

    NARCIS (Netherlands)

    Bakondi, B.G.; Peter, A.; Everts, M.H.; Hartel, P.H.; Jonker, W.

    2015-01-01

    Aggregation of time-series data offers the possibility to learn certain statistics over data periodically uploaded by different sources. In case of privacy sensitive data, it is desired to hide every data provider's individual values from the other participants (including the data aggregator). Exist

  6. Noise in multivariate GPS position time-series

    NARCIS (Netherlands)

    Amiri-Simkooei, A.R.

    2008-01-01

    A methodology is developed to analyze a multivariate linear model, which occurs in many geodetic and geophysical applications. Proper analysis of multivariate GPS coordinate time-series is considered to be an application. General, special, and more practical stochastic models are adopted to assess t

  7. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    textabstractWe advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons simultane

  8. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regimeswitching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...

  9. A test of conditional heteroscedasticity in time series

    Institute of Scientific and Technical Information of China (English)

    陈敏; 安鸿志

    1999-01-01

    A new test of conditional heteroscedasticity for time series is proposed. The new testing method is based on a goodness-of-fit type test statistic and a Cramer-von Mises type test statistic. The asymptotic properties of the new test statistic are established. The results demonstrate that such a test is consistent.

  10. What Makes a Coursebook Series Stand the Test of Time?

    Science.gov (United States)

    Illes, Eva

    2009-01-01

    Intriguingly, at a time when the ELT market is inundated with state-of-the-art coursebooks teaching modern-day English, a 30-year-old series enjoys continuing popularity in some secondary schools in Hungary. Why would teachers, several of whom are school-based teacher-mentors in the vanguard of the profession, purposefully choose materials which…

  11. Inhomogeneities detection in annual precipitation time series in Portugal using direct sequential simulation

    Science.gov (United States)

    Caineta, Júlio; Ribeiro, Sara; Costa, Ana Cristina; Henriques, Roberto; Soares, Amílcar

    2014-05-01

    Climate data homogenisation is of major importance in the monitoring of climate change, the validation of weather forecasting, general circulation and regional atmospheric models, the modelling of erosion, and drought monitoring, among other studies of hydrological and environmental impacts. This is because non-climatic factors can cause discontinuities in time series which may hide the true climatic signal and patterns, and thus potentially bias the conclusions of those studies. In the last two decades, many methods have been developed to identify and remove these inhomogeneities. One of them is based on geostatistical simulation (DSS - direct sequential simulation), where local probability density functions (pdf) are calculated at candidate monitoring stations, using spatial and temporal neighbouring observations, and then used for the detection of inhomogeneities. This approach has been previously applied to detect inhomogeneities in four precipitation series (wet day count) from a network with 66 monitoring stations located in the southern region of Portugal (1980-2001). This study revealed promising results and the potential advantages of geostatistical techniques for inhomogeneities detection in climate time series. This work extends the previous case study and investigates the application of the geostatistical stochastic approach to ten precipitation series that were previously classified as inhomogeneous by one of six absolute homogeneity tests (Mann-Kendall test, Wald-Wolfowitz runs test, Von Neumann ratio test, standard normal homogeneity test (SNHT) for a single break, Pettitt test, and Buishand range test). Moreover, a sensitivity analysis is implemented to investigate the number of simulated realisations that should be used to accurately infer the local pdfs. Accordingly, the number of simulations per iteration is increased from 50 to 500, which results in a more representative local pdf. A set of default and recommended settings is provided, which will help
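
    One of the absolute homogeneity tests listed above, the Pettitt test, is easy to sketch: it looks for a single change point using a Mann-Whitney-type rank statistic, with the usual large-sample approximation for the p-value; the annual series with an artificial break is simulated for illustration.

```python
import numpy as np

def pettitt_test(x):
    """Pettitt's rank-based test for a single change point in a series."""
    n = len(x)
    # U_k = sum_{i<=k} sum_{j>k} sign(x_i - x_j)
    sgn = np.sign(x[:, None] - x[None, :])
    U = np.array([sgn[:k + 1, k + 1:].sum() for k in range(n - 1)])
    k_hat = int(np.abs(U).argmax())                              # most likely break index
    K = float(np.abs(U).max())
    p_value = 2.0 * np.exp(-6.0 * K ** 2 / (n ** 3 + n ** 2))    # large-sample approximation
    return k_hat, K, min(p_value, 1.0)

# Illustrative annual precipitation series with a break after year 12
rng = np.random.default_rng(3)
x = np.concatenate([rng.normal(800, 60, 12), rng.normal(700, 60, 10)])
print(pettitt_test(x))
```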

  12. Irreversibility of financial time series: A graph-theoretical approach

    Science.gov (United States)

    Flanagan, Ryan; Lacasa, Lucas

    2016-04-01

    The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify, in graph-theoretical terms, time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years, based on time irreversibility features, clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.
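
    A minimal flavour of the graph-theoretical irreversibility measure can be given with a directed horizontal visibility graph (a simplified sketch, not the exact estimator or data of the paper): link each point to the later points it can 'see' over the intervening values, then compare the out-degree and in-degree distributions with a Kullback-Leibler divergence; a chaotic map yields a clearly larger value than statistically time-reversible white noise.

```python
import numpy as np

def hvg_degrees(x):
    """Out- and in-degrees of the directed horizontal visibility graph of x."""
    n = len(x)
    k_out, k_in = np.zeros(n, dtype=int), np.zeros(n, dtype=int)
    for i in range(n):
        top = -np.inf                    # running max of the intermediate values
        for j in range(i + 1, n):
            if x[j] > top:               # j is horizontally visible from i
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:              # nothing beyond j can be seen from i
                break
    return k_out, k_in

def kld_irreversibility(x, kmax=20):
    """KL divergence between out- and in-degree distributions (irreversibility proxy)."""
    k_out, k_in = hvg_degrees(x)
    p = np.bincount(k_out, minlength=kmax)[:kmax] + 1e-12
    q = np.bincount(k_in, minlength=kmax)[:kmax] + 1e-12
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
white = rng.standard_normal(3000)                   # time-reversible in distribution
logistic = np.empty(3000)
logistic[0] = 0.4
for i in range(2999):
    logistic[i + 1] = 4.0 * logistic[i] * (1.0 - logistic[i])   # chaotic, irreversible

print("white noise :", round(kld_irreversibility(white), 3))
print("logistic map:", round(kld_irreversibility(logistic), 3))
```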

  13. Metagenomics meets time series analysis: unraveling microbial community dynamics.

    Science.gov (United States)

    Faust, Karoline; Lahti, Leo; Gonze, Didier; de Vos, Willem M; Raes, Jeroen

    2015-06-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic patterns, help to build predictive models or, on the contrary, quantify irregularities that make community behavior unpredictable. Microbial communities can change abruptly in response to small perturbations, linked to changing conditions or the presence of multiple stable states. With sufficient samples or time points, such alternative states can be detected. In addition, temporal variation of microbial interactions can be captured with time-varying networks. Here, we apply these techniques on multiple longitudinal datasets to illustrate their potential for microbiome research.

  14. Climate model boundary conditions for four Cretaceous time slices

    NARCIS (Netherlands)

    Sewall, J.O.; Wal, R.S.W. van de; Zwan, C.J. van der; Oosterhout, C. van; Dijkstra, H.A.; Scotese, C.R.

    2007-01-01

    General circulation models (GCMs) are useful tools for investigating the characteristics and dynamics of past climates. Understanding of past climates contributes significantly to our overall understanding of Earth’s climate system. One of the most time consuming, and often daunting, tasks facing th

  15. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data GraphicsA data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets.Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  16. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  17. Modelling, simulation and inference for multivariate time series of counts

    OpenAIRE

    Veraart, Almut E. D.

    2016-01-01

    This article presents a new continuous-time modelling framework for multivariate time series of counts which have an infinitely divisible marginal distribution. The model is based on a mixed moving average process driven by Lévy noise - called a trawl process - where the serial correlation and the cross-sectional dependence are modelled independently of each other. Such processes can exhibit short or long memory. We derive a stochastic simulation algorithm and a statistical inference meth...

  18. Statistical Analysis of Time Series Data (STATS). Users Manual (Preliminary)

    Science.gov (United States)

    1987-05-01

    [OCR fragment of the users manual describing time-series input record fields, including NPRDS (the actual number of periods for the event), JEND (the order number of the last period in the time series to select for analysis; if blank, the last period is assumed), JPPF (a plotting option), and the IN record carrying the time series data.]

  19. Improving predictability of time series using maximum entropy methods

    Science.gov (United States)

    Chliamovitch, G.; Dupuis, A.; Golub, A.; Chopard, B.

    2015-04-01

    We discuss how maximum entropy methods may be applied to the reconstruction of Markov processes underlying empirical time series and compare this approach to usual frequency sampling. It is shown that, in low dimension, there exists a subset of the space of stochastic matrices for which the MaxEnt method is more efficient than sampling, in the sense that shorter historical samples have to be considered to reach the same accuracy. Considering short samples is of particular interest when modelling smoothly non-stationary processes, which provides, under some conditions, a powerful forecasting tool. The method is illustrated for a discretized empirical series of exchange rates.

  20. Tuning the Voices of a Choir: Detecting Ecological Gradients in Time-Series Populations.

    Science.gov (United States)

    Buras, Allan; van der Maaten-Theunissen, Marieke; van der Maaten, Ernst; Ahlgrimm, Svenja; Hermann, Philipp; Simard, Sonia; Heinrich, Ingo; Helle, Gerd; Unterseher, Martin; Schnittler, Martin; Eusemann, Pascal; Wilmking, Martin

    2016-01-01

    This paper introduces a new approach - Principal Component Gradient Analysis (PCGA) - to detect ecological gradients in time-series populations, i.e. several time series originating from different individuals of a population. Detection of ecological gradients is of particular importance when dealing with time series from heterogeneous populations which express differing trends. PCGA makes use of the polar coordinates of loadings from the first two axes obtained by principal component analysis (PCA) to define groups of similar trends. Based on the mean inter-series correlation (rbar), the gain in the strength of a common underlying signal obtained by using PCGA groups is quantified using Monte Carlo simulations. For validation, PCGA is compared to three other existing approaches. Focusing on dendrochronological examples, PCGA is shown to correctly determine population gradients and, in particular cases, to be advantageous over the other methods considered. Furthermore, in each example the PCGA groups allowed the strength of a common underlying signal to be enhanced comparably well to hierarchical cluster analysis. Our results indicate that PCGA potentially allows for a better understanding of the mechanisms causing time-series population gradients, as well as for objectively enhancing the performance of climate transfer functions in dendroclimatology. While our examples highlight the relevance of PCGA to the field of dendrochronology, we believe that other disciplines working with data of comparable structure may also benefit from PCGA.
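
    The core step of PCGA can be sketched as follows (an illustrative reading of the approach, not the authors' code): perform a PCA across the individual series, convert each series' loadings on the first two components into a polar angle, and group the series by that angle. In the synthetic example two sub-populations with different underlying signals separate into two clusters of angles.

```python
import numpy as np

def pcga_angles(series_matrix):
    """Polar angles of each series' loadings on the first two principal components."""
    X = series_matrix - series_matrix.mean(axis=0, keepdims=True)   # columns = series
    U, s, Vt = np.linalg.svd(X, full_matrices=False)                # PCA via SVD
    loadings = Vt[:2].T * s[:2]              # one (PC1, PC2) loading pair per series
    return np.degrees(np.arctan2(loadings[:, 1], loadings[:, 0]))

# Two sub-populations sharing different underlying signals (e.g. opposing trends)
rng = np.random.default_rng(0)
t = np.arange(100)
signal_a = 0.05 * t
signal_b = -0.05 * t + 3.0 * np.sin(t / 8.0)
population = np.column_stack(
    [signal_a + rng.normal(0, 1, t.size) for _ in range(12)]
    + [signal_b + rng.normal(0, 1, t.size) for _ in range(12)]
)

angles = pcga_angles(population)
print("group A angles:", np.round(angles[:12]))   # angles cluster by sub-population
print("group B angles:", np.round(angles[12:]))
```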

  1. Autoregression of Quasi-Stationary Time Series (Invited)

    Science.gov (United States)

    Meier, T. M.; Küperkoch, L.

    2009-12-01

    Autoregression is a model-based tool for spectral analysis and prediction of time series. It has the potential to increase the resolution of spectral estimates. However, the validity of the assumed model has to be tested. Here we briefly review methods for the determination of the parameters of autoregression and summarize properties of autoregressive prediction and autoregressive spectral analysis. Time series with a limited number of dominant frequencies varying slowly in time (quasi-stationary time series) may well be described by a time-dependent autoregressive model of low order. An algorithm for the estimation of the autoregression parameters in a moving window is presented. Time-varying dominant frequencies are estimated. The comparison to results obtained by Fourier-transform-based methods and the visualization of the time-dependent normalized prediction error are essential for quality assessment of the results. The algorithm is applied to synthetic examples as well as to microseism and tremor. The sensitivity of the results to the choice of model and filter parameters is discussed. Autoregressive forward prediction offers the opportunity to detect body wave phases in seismograms and to determine arrival times automatically. Examples are shown for P- and S-phases at local and regional distances. In order to determine S-wave arrival times, the autoregressive model is extended to multi-component recordings. For the detection of significant temporal changes in waveforms, the choice of the model appears to be less crucial compared to spectral analysis. Temporal changes in frequency, amplitude, phase, and polarisation are detectable by autoregressive prediction. Quality estimates of automatically determined onset times may be obtained from the slope of the absolute prediction error as a function of time and the signal-to-noise ratio. Results are compared to manual readings.
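
    The moving-window idea can be sketched with a low-order autoregressive fit per window: solve the Yule-Walker equations for an AR(2) model and read the dominant frequency off the complex pole angle. Window length, model order and the chirp test signal below are illustrative assumptions.

```python
import numpy as np

def ar2_dominant_freq(x, fs):
    """Dominant frequency from a Yule-Walker AR(2) fit of one window."""
    x = x - x.mean()
    r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(3)])
    a = np.linalg.solve([[r[0], r[1]], [r[1], r[0]]], [r[1], r[2]])
    poles = np.roots([1.0, -a[0], -a[1]])      # model: x_t = a1 x_{t-1} + a2 x_{t-2} + e_t
    return abs(np.angle(poles[0])) * fs / (2 * np.pi)

fs = 100.0                                      # sampling rate in Hz
t = np.arange(0, 20, 1 / fs)
# Quasi-stationary signal: instantaneous frequency drifts slowly from 5 Hz to 10 Hz
signal = (np.sin(2 * np.pi * (5 + 0.125 * t) * t)
          + 0.2 * np.random.default_rng(0).standard_normal(t.size))

window = 200                                    # 2 s analysis windows
for start in range(0, len(signal) - window, 500):
    f = ar2_dominant_freq(signal[start:start + window], fs)
    print(f"t = {start / fs:5.1f} s  dominant frequency ~ {f:4.1f} Hz")
```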

  2. Wavelet analysis on paleomagnetic (and computer simulated) VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    We present a Continuous Wavelet Transform (CWT) analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic and geodynamo simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra, and the local analysis helped disclose their statistical significance. Even if this feature could be an indication of orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of the excursions in the series. The comparison among the paleomagnetic and simulated spectra shows a similarity in the high frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to have a guide in understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normality of the series and its inherent noise. The relative chronology of the reversals found in the paleomagnetic data was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of polarity stability intervals are found to be similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.

  3. Minimum Entropy Density Method for the Time Series Analysis

    CERN Document Server

    Lee, J W; Moon, H T; Park, J B; Yang, J S; Jo, Hang-Hyun; Lee, Jeong Won; Moon, Hie-Tae; Park, Joongwoo Brian; Yang, Jae-Suk

    2006-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the most correlated time interval of a given time series and define the effective delay of information (EDI) as the correlation length that minimizes the entropy density in relation to the velocity of information flow. The MEDM is applied to the financial time series of Standard and Poor's 500 (S&P500) index from February 1983 to April 2006. It is found that EDI of S&P500 index has decreased for the last twenty years, which suggests that the efficiency of the U.S. market dynamics became close to the efficient market hypothesis.

  4. Adaptively Sharing Time-Series with Differential Privacy

    CERN Document Server

    Fan, Liyue

    2012-01-01

    Sharing real-time aggregate statistics of private data has given much benefit to the public to perform data mining for understanding important phenomena, such as Influenza outbreaks and traffic congestions. We propose an adaptive approach with sampling and estimation to release aggregated time series under differential privacy, the key innovation of which is that we utilize feedback loops based on observed (perturbed) values to dynamically adjust the estimation model as well as the sampling rate. To minimize the overall privacy cost, our solution uses the PID controller to adaptively sample long time-series according to detected data dynamics. To improve the accuracy of data release per timestamp, the Kalman filter is used to predict data values at non-sampling points and to estimate true values from perturbed query answers at sampling points. Our experiments with three real data sets show that it is beneficial to incorporate feedback into both the estimation model and the sampling process. The results confir...

  5. Cross Recurrence Plot Based Synchronization of Time Series

    CERN Document Server

    Marwan, N; Nowaczyk, N R

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP), which among others enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of it is e.g. compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.

  6. A multivariate heuristic model for fuzzy time-series forecasting.

    Science.gov (United States)

    Huarng, Kun-Huang; Yu, Tiffany Hui-Kuang; Hsu, Yu Wei

    2007-08-01

    Fuzzy time-series models have been widely applied due to their ability to handle nonlinear data directly and because no rigid assumptions for the data are needed. In addition, many such models have been shown to provide better forecasting results than their conventional counterparts. However, since most of these models require complicated matrix computations, this paper proposes the adoption of a multivariate heuristic function that can be integrated with univariate fuzzy time-series models into multivariate models. Such a multivariate heuristic function can easily be extended and integrated with various univariate models. Furthermore, the integrated model can handle multiple variables to improve forecasting results and, at the same time, avoid complicated computations due to the inclusion of multiple variables.

  7. Cross recurrence plot based synchronization of time series

    Directory of Open Access Journals (Sweden)

    N. Marwan

    2002-01-01

    The method of recurrence plots is extended to the cross recurrence plots (CRP) which, among others, enables the study of synchronization or time differences in two time series. This is emphasized in a distorted main diagonal in the cross recurrence plot, the line of synchronization (LOS). A non-parametrical fit of this LOS can be used to rescale the time axis of the two data series (whereby one of them is compressed or stretched) so that they are synchronized. An application of this method to geophysical sediment core data illustrates its suitability for real data. The rock magnetic data of two different sediment cores from the Makarov Basin can be adjusted to each other by using this method, so that they are comparable.

  8. Time series data mining for the Gaia variability analysis

    CERN Document Server

    Nienartowicz, Krzysztof; Guy, Leanne; Holl, Berry; Lecoeur-Taïbi, Isabelle; Mowlavi, Nami; Rimoldini, Lorenzo; Ruiz, Idoia; Süveges, Maria; Eyer, Laurent

    2014-01-01

    Gaia is an ESA cornerstone mission, which was successfully launched December 2013 and commenced operations in July 2014. Within the Gaia Data Processing and Analysis consortium, Coordination Unit 7 (CU7) is responsible for the variability analysis of over a billion celestial sources and nearly 4 billion associated time series (photometric, spectrophotometric, and spectroscopic), encoding information in over 800 billion observations during the 5 years of the mission, resulting in a petabyte scale analytical problem. In this article, we briefly describe the solutions we developed to address the challenges of time series variability analysis: from the structure for a distributed data-oriented scientific collaboration to architectural choices and specific components used. Our approach is based on Open Source components with a distributed, partitioned database as the core to handle incrementally: ingestion, distributed processing, analysis, results and export in a constrained time window.

  9. Recursive Bayesian recurrent neural networks for time-series modeling.

    Science.gov (United States)

    Mirikitani, Derrick T; Nikolaev, Nikolay

    2010-02-01

    This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.

  10. Measures of Analysis of Time Series (MATS): A MATLAB Toolkit for Computation of Multiple Measures on Time Series Data Bases

    Directory of Open Access Journals (Sweden)

    Dimitris Kugiumtzis

    2010-02-01

    In many applications, such as physiology and finance, large time series data bases are to be analyzed, requiring the computation of linear, nonlinear and other measures. Such measures have been developed and implemented in commercial and freeware software rather selectively and independently. The Measures of Analysis of Time Series (MATS) MATLAB toolkit is designed to handle an arbitrarily large set of scalar time series and compute a large variety of measures on them, allowing also for the specification of varying measure parameters. The variety of options, with added facilities for visualization of the results, supports different settings of time series analysis, such as the detection of dynamics changes in long data records, resampling (surrogate or bootstrap) tests for independence and linearity with various test statistics, and the assessment of the discrimination power of different measures and of different combinations of their parameters. The basic features of MATS are presented and the implemented measures are briefly described. The usefulness of MATS is illustrated on some empirical examples along with screenshots.

  11. Building Real-Time Network Intrusion Detection System Based on Parallel Time-Series Mining Techniques

    Institute of Scientific and Technical Information of China (English)

    Zhao Feng; Li Qinghua

    2005-01-01

    A new real-time model based on parallel time-series mining is proposed to improve the accuracy and efficiency of network intrusion detection systems. In this model, a multidimensional dataset is constructed to describe network events, and a sliding-window updating algorithm is used to maintain the network stream. Moreover, parallel frequent-pattern and frequent-episode mining algorithms are applied to implement the parallel time-series mining engine, which can intelligently generate rules to distinguish intrusions from normal activities. Analysis and study on the basis of DAWNING 3000 indicate that this parallel time-series mining-based model provides a more accurate and efficient way to build a real-time NIDS.

  12. Reconstruction of ensembles of coupled time-delay systems from time series

    Science.gov (United States)

    Sysoev, I. V.; Prokhorov, M. D.; Ponomarenko, V. I.; Bezruchko, B. P.

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  13. Exploring large scale time-series data using nested timelines

    Science.gov (United States)

    Xie, Zaixian; Ward, Matthew O.; Rundensteiner, Elke A.

    2013-01-01

    When data analysts study time-series data, an important task is to discover how data patterns change over time. If the dataset is very large, this task becomes challenging. Researchers have developed many visualization techniques to help address this problem. However, little work has been done regarding the changes of multivariate patterns, such as linear trends and clusters, in time-series data. In this paper, we describe a set of history views to fill this gap. This technique works in two modes: merge and non-merge. In the merge mode, merge algorithms are applied to selected time windows to generate a change-based hierarchy. Contiguous time windows having similar patterns are merged first. Users can choose different levels of merging, trading off more detail in the data against less visual clutter in the visualizations. In the non-merge mode, the framework can use natural hierarchical time units or ones defined by domain experts to represent timelines. This can help users navigate across long time periods. Grid-based views were designed to provide a compact overview of the history data. In addition, MDS pattern starfields and distance maps were developed to enable users to quickly investigate the degree of pattern similarity among different time periods. The usability evaluation demonstrated that most participants could understand the concepts of the history views correctly and finished the assigned tasks with high accuracy and relatively fast response times.

  14. World Climate Classification and Search: Data Mining Approach Utilizing Dynamic Time Warping Similarity Function

    Science.gov (United States)

    Stepinski, T. F.; Netzel, P.; Jasiewicz, J.

    2014-12-01

    We have developed a novel method for the classification and search of climate over the global land surface excluding Antarctica. Our method classifies climate on the basis of the outcome of time series segmentation and clustering. We use the WorldClim 30 arc sec. (approx. 1 km) resolution grid data, which is based on 50 years of climatic observations. Each cell in the grid is assigned a 12-month series consisting of 50-year monthly averages of mean, maximum, and minimum temperatures as well as the total precipitation. The presented method introduces several innovations in comparison with existing data-driven methods of world climate classification. First, it uses only climatic rather than bioclimatic data. Second, it employs an object-oriented methodology - the grid is first segmented before climatic segments are classified. Third, and most importantly, the similarity between climates in two given cells is measured using dynamic time warping (DTW) instead of the Euclidean distance. DTW is known to be superior to the Euclidean distance for time series, but has not been utilized before in the classification of global climate. To account for the computational expense of DTW we use the highly efficient GeoPAT software (http://sil.uc.edu/gitlist/) that, in the first step, segments the grid into local regions of uniform climate. In the second step, the segments are classified. We also introduce a climate search - a GeoWeb-based method for the interactive presentation of global climate information in the form of query-and-retrieval. A user selects a geographical location and the system returns a global map indicating the level of similarity between local climates and the climate in the selected location. The results of the search for the location "University of Cincinnati, Main Campus" are presented on the accompanying map. We have compared the results of our method to the Koeppen classification scheme
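
    The dynamic time warping measure at the heart of the classification can be written down in a few lines of textbook dynamic programming (not the GeoPAT code). The example compares two hypothetical 12-month temperature climatologies whose seasonal cycles are shifted by one month: the Euclidean distance penalizes the shift heavily, DTW much less.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

months = np.arange(12)
clim_a = 10 + 12 * np.sin(2 * np.pi * (months - 3) / 12)   # monthly mean temperature
clim_b = 10 + 12 * np.sin(2 * np.pi * (months - 4) / 12)   # same climate, shifted 1 month

print("Euclidean:", round(float(np.linalg.norm(clim_a - clim_b)), 2))
print("DTW      :", round(float(dtw_distance(clim_a, clim_b)), 2))
```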

  15. Assessing spatial covariance among time series of abundance.

    Science.gov (United States)

    Jorgensen, Jeffrey C; Ward, Eric J; Scheuerell, Mark D; Zabel, Richard W

    2016-04-01

    For species of conservation concern, an essential part of the recovery planning process is identifying discrete population units and their location with respect to one another. A common feature among geographically proximate populations is that the number of organisms tends to covary through time as a consequence of similar responses to exogenous influences. In turn, high covariation among populations can threaten the persistence of the larger metapopulation. Historically, explorations of the covariance in population size of species with many (>10) time series have been computationally difficult. Here, we illustrate how dynamic factor analysis (DFA) can be used to characterize diversity among time series of population abundances and the degree to which all populations can be represented by a few common signals. Our application focuses on anadromous Chinook salmon (Oncorhynchus tshawytscha), a species listed under the US Endangered Species Act, that is impacted by a variety of natural and anthropogenic factors. Specifically, we fit DFA models to 24 time series of population abundance and used model selection to identify the minimum number of latent variables that explained the most temporal variation after accounting for the effects of environmental covariates. We found support for grouping the time series according to 5 common latent variables. The top model included two covariates: the Pacific Decadal Oscillation in spring and summer. The assignment of populations to the latent variables matched the currently established population structure at a broad spatial scale. At a finer scale, there was more population grouping complexity. Some relatively distant populations were grouped together, and some relatively close populations - considered to be more aligned with each other - were more associated with populations further away. These coarse- and fine-grained examinations of spatial structure are important because they reveal different structural patterns not evident

  16. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide an interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river run-off. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and the energy of each mode is normalised by the total energy computed over all the modes for each region.

  17. West Africa land use and land cover time series

    Science.gov (United States)

    Cotillon, Suzanne E.

    2017-02-16

    Started in 1999, the West Africa Land Use Dynamics project represents an effort to map land use and land cover, characterize the trends in time and space, and understand their effects on the environment across West Africa. The outcome of the West Africa Land Use Dynamics project is the production of a three-time period (1975, 2000, and 2013) land use and land cover dataset for the Sub-Saharan region of West Africa, including the Cabo Verde archipelago. The West Africa Land Use Land Cover Time Series dataset offers a unique basis for characterizing and analyzing land changes across the region, systematically and at an unprecedented level of detail.

  18. Copulas and time series with long-ranged dependences

    CERN Document Server

    Chicheportiche, Rémy

    2013-01-01

    We review ideas on temporal dependences and recurrences in discrete time series from several areas of natural and social sciences. We revisit existing studies and redefine the relevant observables in the language of copulas (joint laws of the ranks). We propose that copulas provide an appropriate mathematical framework to study non-linear time dependences and related concepts - like aftershocks, Omori law, recurrences, waiting times. We also critically argue using this global approach that previous phenomenological attempts involving only a long-ranged autocorrelation function lacked complexity in that they were essentially mono-scale.

  19. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
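
    The best-known portmanteau diagnostic, the Ljung-Box test, can be sketched directly from its definition, Q = n(n+2) * sum_{k=1..m} r_k^2 / (n-k), referred to a chi-squared distribution; the white-noise and AR(1) series below are simulated for illustration.

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(x, m=10, fitted_params=0):
    """Ljung-Box portmanteau statistic and p-value for residual autocorrelation."""
    x = np.asarray(x) - np.mean(x)
    n = len(x)
    acf = np.array([np.dot(x[:-k], x[k:]) / np.dot(x, x) for k in range(1, m + 1)])
    Q = n * (n + 2) * np.sum(acf ** 2 / (n - np.arange(1, m + 1)))
    p = chi2.sf(Q, df=m - fitted_params)
    return Q, p

rng = np.random.default_rng(0)
white = rng.standard_normal(500)
ar1 = np.empty(500)
ar1[0] = 0.0
for t in range(1, 500):
    ar1[t] = 0.6 * ar1[t - 1] + rng.standard_normal()

print("white noise:", ljung_box(white))   # large p-value: no evidence of autocorrelation
print("AR(1) data :", ljung_box(ar1))     # tiny p-value: such residuals fail the check
```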

  20. Simple Patterns in Fluctuations of Time Series of Economic Interest

    Science.gov (United States)

    Fanchiotti, H.; García Canal, C. A.; García Zúñiga, H.

    Time series corresponding to nominal exchange rates between the US dollar and the currencies of Argentina, Brazil and the European Economic Community; different financial indexes such as the Industrial Dow Jones, the British Footsie, the German DAX Composite, the Australian Share Price and the Nikkei Cash; and also different Argentine local tax revenues, are analyzed looking for the appearance of simple patterns and the possible definition of forecast evaluators. In every case, the statistical fractal dimensions are obtained from the behavior of the corresponding variance of increments at a given lag. The detrended fluctuation analysis of the data, in terms of the corresponding exponent in the resulting power law, is carried out. Finally, the frequency power spectra of all the time series considered are computed and compared
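
    Detrended fluctuation analysis, mentioned above, has a compact standard recipe: integrate the centred series, split the profile into boxes, detrend each box with a linear fit, and read the scaling exponent from the log-log slope of the fluctuation function. The sketch below runs this generic recipe on synthetic data, not on the economic series of the paper.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Scaling exponent alpha from first-order detrended fluctuation analysis."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    n = len(y)
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(n // 4), 12).astype(int))
    F = []
    for s in scales:
        n_boxes = n // s
        boxes = y[:n_boxes * s].reshape(n_boxes, s)
        t = np.arange(s)
        # Linear detrending inside every box
        resid = [box - np.polyval(np.polyfit(t, box, 1), t) for box in boxes]
        F.append(np.sqrt(np.mean(np.concatenate(resid) ** 2)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(0)
print("white noise, alpha ~ 0.5 :", round(dfa_exponent(rng.standard_normal(4000)), 2))
print("random walk, alpha ~ 1.5 :", round(dfa_exponent(np.cumsum(rng.standard_normal(4000))), 2))
```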

  1. Time series analysis of physiological response during ICU visitation.

    Science.gov (United States)

    Hepworth, J T; Hendrickson, S G; Lopez, J

    1994-12-01

    Time series analysis (TSA) is an important statistical procedure for clinical nursing research. The current paucity of nursing research reports using TSA may be due to unfamiliarity with this technique. In this article, TSA is compared with the ordinary least squares regression model; validity concerns of time series designs are discussed; and concomitant and interrupted TSA of data collected on the effects of family visitation on intracranial pressure (ICP), heart rate, and blood pressure of patients in ICUs are presented. The concomitant TSA of the effect of family on ICP suggested that family presence tended to be associated with decreased ICP. Interrupted TSA indicated the effect of family on heart rate and blood pressure was not as consistent: The overall effect on blood pressure appeared to be negligible, and heart rate may increase overall. Restrictive visiting policies, once typical of intensive care units, should be reconsidered.

  2. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  3. Model of a synthetic wind speed time series generator

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;

    2008-01-01

    Wind energy has assumed a great relevance in the operation and planning of today's power systems due to the exponential increase of installations in the last 10 years. For this reason, many studies have looked at suitable representations of wind generation for power system analysis. One of the main elements to consider for this purpose is the model of the wind speed, which is usually required as input. Wind speed measurements may represent a solution for this problem, but, for techniques such as sequential Monte Carlo simulation, they have to be long enough in order to describe a wide range of possible wind conditions. If this information is not available, synthetic wind speed time series may be a useful tool as well, but their generator must preserve the statistical and stochastic features of the phenomenon. This paper deals with this issue: a generator for synthetic wind speed time series...

  4. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contrast to this very slowly vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  5. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and the geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  6. Model-Coupled Autoencoder for Time Series Visualisation

    CERN Document Server

    Gianniotis, Nikolaos; Tiňo, Peter; Polsterer, Kai L

    2016-01-01

    We present an approach for the visualisation of a set of time series that combines an echo state network with an autoencoder. For each time series in the dataset we train an echo state network, using a common and fixed reservoir of hidden neurons, and use the optimised readout weights as the new representation. Dimensionality reduction is then performed via an autoencoder on the readout weight representations. The crux of the work is to equip the autoencoder with a loss function that correctly interprets the reconstructed readout weights by associating them with a reconstruction error measured in the data space of sequences. This essentially amounts to measuring the predictive performance that the reconstructed readout weights exhibit on their corresponding sequences when plugged back into the echo state network with the same fixed reservoir. We demonstrate that the proposed visualisation framework can deal both with real valued sequences as well as binary sequences. We derive magnification factors in order t...
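
    A compact sketch of the first stage only, following the stated idea of a common fixed reservoir: each series drives the same random reservoir and its ridge-regression readout weights become the new representation (the autoencoder stage and the magnification-factor analysis are omitted).

        # Stage 1 sketch: a shared fixed reservoir; ridge-regression readout weights
        # act as the representation of each individual series.
        import numpy as np

        def esn_readout(series, n_res=50, rho=0.9, ridge=1e-3, seed=0):
            rng = np.random.default_rng(seed)          # same seed -> same reservoir for every series
            w_in = rng.uniform(-0.5, 0.5, n_res)
            W = rng.normal(0.0, 1.0, (n_res, n_res))
            W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius rho
            x = np.zeros(n_res)
            states, targets = [], []
            for t in range(len(series) - 1):
                x = np.tanh(w_in * series[t] + W @ x)
                states.append(x.copy())
                targets.append(series[t + 1])
            S, y = np.array(states), np.array(targets)
            # ridge-regression readout weights = the series' new representation
            return np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

        rep = esn_readout(np.sin(np.linspace(0, 20, 300)))
        print(rep.shape)                               # (50,) readout-weight vector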

  7. Radial basis function network design for chaotic time series prediction

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Chang Yong; Kim, Taek Soo; Park, Sang Hui [Yonsei University, Seoul (Korea, Republic of); Choi, Yoon Ho [Kyonggi University, Suwon (Korea, Republic of)

    1996-04-01

    In this paper, radial basis function networks with two hidden layers, which employ the K-means clustering method and hierarchical training, are proposed for improving the short-term predictability of chaotic time series. Furthermore, the recursive training method of the radial basis function network using the recursive modified Gram-Schmidt algorithm is proposed for this purpose. In addition, the radial basis function networks trained by the proposed training methods are compared with the model of X.D. He and A. Lapedes and with the radial basis function network trained by the non-recursive method. Through this comparison, an improved radial basis function network for predicting chaotic time series is presented. (author). 17 refs., 8 figs., 3 tabs.
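
    The sketch below shows the generic K-means-plus-least-squares recipe for a Gaussian RBF network on a toy prediction task; the hierarchical and recursive Gram-Schmidt training procedures proposed in the paper are not reproduced.

        # Generic Gaussian RBF network: K-means centres + linear least-squares output layer.
        import numpy as np
        from sklearn.cluster import KMeans

        def design_matrix(X, centres, width):
            d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
            return np.exp(-d**2 / (2.0 * width**2))

        def fit_rbf(X, y, n_centres=20, seed=0):
            centres = KMeans(n_clusters=n_centres, n_init=10, random_state=seed).fit(X).cluster_centers_
            width = np.mean(np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2))  # heuristic
            w, *_ = np.linalg.lstsq(design_matrix(X, centres, width), y, rcond=None)
            return centres, width, w

        # toy task: predict x[t+1] from (x[t], x[t-1])
        x = np.sin(np.linspace(0, 50, 1000)) + 0.3 * np.sin(np.linspace(0, 173, 1000))
        X = np.column_stack([x[1:-1], x[:-2]])
        y = x[2:]
        centres, width, w = fit_rbf(X, y)
        pred = design_matrix(X, centres, width) @ w
        print(np.mean((pred - y) ** 2))                # in-sample mean squared error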

  8. Chaotic time series. Part II. System Identification and Prediction

    Directory of Open Access Journals (Sweden)

    Bjørn Lillekjendlie

    1994-10-01

    Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  9. TESTING FOR OUTLIERS IN TIME SERIES USING WAVELETS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Tong; ZHANG Xibin; ZHANG Shiying

    2003-01-01

    One remarkable feature of wavelet decomposition is that the wavelet coefficients are localized, and any singularity in the input signals can only affect the wavelet coefficients at the point near the singularity. The localized property of the wavelet coefficients allows us to identify the singularities in the input signals by studying the wavelet coefficients at different resolution levels. This paper considers wavelet-based approaches for the detection of outliers in time series. Outliers are high-frequency phenomena which are associated with the wavelet coefficients with large absolute values at different resolution levels. On the basis of the first-level wavelet coefficients, this paper presents a diagnostic to identify outliers in a time series. Under the null hypothesis that there is no outlier, the proposed diagnostic is distributed as a χ² variable with one degree of freedom (χ²₁). Empirical examples are presented to demonstrate the application of the proposed diagnostic.
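
    A hedged sketch in the same spirit, assuming the PyWavelets package: standardized squared first-level detail coefficients are compared against a chi-square(1) quantile to flag candidate outliers (the exact form and calibration of the paper's diagnostic may differ).

        # First-level wavelet outlier check (PyWavelets), chi-square(1) threshold.
        import numpy as np
        import pywt
        from scipy import stats

        rng = np.random.default_rng(0)
        x = np.sin(np.linspace(0, 8 * np.pi, 256)) + rng.normal(0, 0.1, 256)
        x[100] += 2.0                                  # inject a single outlier

        cA, cD = pywt.dwt(x, 'haar')                   # first-level decomposition
        sigma = np.median(np.abs(cD)) / 0.6745         # robust scale of the detail coefficients
        stat = (cD / sigma) ** 2                       # ~ chi-square(1) under "no outlier"
        flagged = np.where(stat > stats.chi2.ppf(0.999, df=1))[0]
        print(flagged * 2)                             # approximate positions in the original series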

  10. Chaotic time series; 2, system identification and prediction

    CERN Document Server

    Lillekjendlie, B

    1994-01-01

    This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series. Sampled data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.

  11. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA model with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are the factors found to Granger-cause the Philippine Stock Exchange Composite Index.
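
    For the causality step, a Granger test can be run with statsmodels as sketched below; the two simulated series stand in for the PSEi and an explanatory variable, and the lag order is illustrative.

        # Granger-causality check with statsmodels on simulated stand-in series.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 300
        fx = rng.normal(size=n).cumsum()                   # "foreign exchange rate" stand-in
        psei = 0.5 * np.roll(fx, 1) + rng.normal(size=n)   # index partly driven by lagged fx
        data = pd.DataFrame({"psei": psei[1:], "fx": fx[1:]})

        # tests whether the second column ("fx") Granger-causes the first ("psei")
        results = grangercausalitytests(data[["psei", "fx"]], maxlag=4)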

  12. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  13. DEM error retrieval by analyzing time series of differential interferograms

    OpenAIRE

    Bombrun, Lionel; Gay, Michel; Trouvé, Emmanuel; Vasile, Gabriel; Mars, Jerome,

    2009-01-01

    International audience; 2-pass Differential Synthetic Aperture Radar Interferometry (D-InSAR) processing has been successfully used by the scientific community to derive velocity fields. Nevertheless, a precise Digital Elevation Model (DEM) is necessary to remove the topographic component from the interferograms. This letter presents a novel method to detect and retrieve DEM errors by analyzing time series of differential interferograms. The principle of the method is based on the comparison...

  14. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN, the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  15. On Clustering Time Series Using Euclidean Distance and Pearson Correlation

    OpenAIRE

    Berthold, Michael R.; Höppner, Frank

    2016-01-01

    For time series comparisons, it has often been observed that z-score normalized Euclidean distances far outperform the unnormalized variant. In this paper we show that a z-score normalized, squared Euclidean Distance is, in fact, equal to a distance based on Pearson Correlation. This has profound impact on many distance-based classification or clustering methods. In addition to this theoretically sound result we also show that the often used k-Means algorithm formally needs a modification to...
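
    The stated identity is easy to verify numerically: for series of length n that are z-score normalised (population standard deviation), the squared Euclidean distance equals 2n(1 − r), with r the Pearson correlation.

        # Numerical check: squared Euclidean distance of z-normalised series = 2*n*(1 - r).
        import numpy as np

        rng = np.random.default_rng(0)
        n = 500
        x, y = rng.normal(size=n), rng.normal(size=n)

        def znorm(v):
            return (v - v.mean()) / v.std()        # population standard deviation (ddof=0)

        d2 = np.sum((znorm(x) - znorm(y)) ** 2)
        r = np.corrcoef(x, y)[0, 1]
        print(d2, 2 * n * (1 - r))                 # the two values coincide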

  16. Analyzing the Dynamics of Nonlinear Multivariate Time Series Models

    Institute of Scientific and Technical Information of China (English)

    Denghua Zhong; Zhengfeng Zhang; Donghai Liu; Stefan Mittnik

    2004-01-01

    This paper analyzes the dynamics of nonlinear multivariate time series models that are represented by generalized impulse response functions and asymmetric functions. We illustrate the measures of shock persistence and the asymmetric effects of shocks derived from the generalized impulse response functions and asymmetric functions in bivariate smooth transition regression models. The empirical work investigates a bivariate smooth transition model of US GDP and the unemployment rate.

  17. Time-series properties of state-level public expenditure.

    OpenAIRE

    Rajaraman, Indira; Mukhopadhyaya, Hiranya; Rao, Kavita R.

    2001-01-01

    Public expenditure reform must be underpinned by some understanding of the time-series properties of public expenditure. This paper examines the univariate properties of aggregate revenue expenditure at the level of State governments in India over the period 1974-98 for three states: Punjab, Haryana and Maharashtra. The empirical exercise is performed on the logarithmic transformation of aggregate revenue expenditure in terms of nominal (rather than ex post real) expenditure, not normalised t...

  18. Note---New Confidence Interval Estimators Using Standardized Time Series

    OpenAIRE

    David Goldsman; Lee Schruben

    1990-01-01

    We develop new asymptotically valid confidence interval estimators (CIE's) for the underlying mean of a stationary simulation process. The new estimators are weighted generalizations of Schruben's standardized time series area CIE. We show that the weighted CIE's have the same asymptotic expected length and variance of the length as the area CIE; but in the small sample environment, the new CIE's exhibit performance characteristics which are different from those of the area CIE.

  19. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
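
    A minimal version of a Gaussian-kernel correlation estimator for irregular sampling is sketched below (in the spirit of the kernel methods compared here; bandwidth and test data are illustrative, not the authors' settings).

        # Gaussian-kernel cross-correlation for irregularly sampled, z-normalised series.
        import numpy as np

        def kernel_xcf(tx, x, ty, y, lag, h):
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            dt = ty[None, :] - tx[:, None] - lag       # pairwise offsets around the target lag
            w = np.exp(-0.5 * (dt / h) ** 2)           # Gaussian weights
            return np.sum(w * np.outer(x, y)) / np.sum(w)

        rng = np.random.default_rng(0)
        tx = np.sort(rng.uniform(0, 100, 150))          # irregular sampling times
        ty = np.sort(rng.uniform(0, 100, 150))
        x = np.sin(0.3 * tx) + 0.2 * rng.normal(size=150)
        y = np.sin(0.3 * (ty - 2.0)) + 0.2 * rng.normal(size=150)   # y lags x by ~2 time units

        lags = np.arange(-10, 11)
        xcf = [kernel_xcf(tx, x, ty, y, lag, h=1.0) for lag in lags]
        print(lags[int(np.argmax(xcf))])                # recovered lag, close to 2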

  20. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract. Background: Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results: The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values. Conclusion: We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures
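
    As a small illustration of the curve form involved, the sketch below fits a three-parameter logistic profile to one simulated sparse time course with plain nonlinear least squares; the full nonlinear mixed-effects model and the ANOVA/likelihood-ratio testing of the paper are not reproduced.

        # Fit a three-parameter logistic profile to one sparse simulated time course.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic3(t, a, k, t0):
            return a / (1.0 + np.exp(-k * (t - t0)))

        t = np.linspace(0, 10, 12)                      # sparse sampling times (hypothetical)
        rng = np.random.default_rng(0)
        y = logistic3(t, 5.0, 1.2, 4.0) + rng.normal(0, 0.15, t.size)

        params, _ = curve_fit(logistic3, t, y, p0=[y.max(), 1.0, float(np.median(t))])
        print(params)                                   # estimated (a, k, t0)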

  1. Deconvolution of mixing time series on a graph

    CERN Document Server

    Blocker, Alexander W

    2011-01-01

    In many applications we are interested in making inference on latent time series from indirect measurements, which are often low-dimensional projections resulting from mixing or aggregation. Positron emission tomography, super-resolution, and network traffic monitoring are some examples. Inference in such settings requires solving a sequence of ill-posed inverse problems, y_t = A x_t, where the projection mechanism provides information on A. We consider problems in which A specifies mixing on a graph of time series that are bursty and sparse. We develop a multilevel state-space model for mixing time series and an efficient approach to inference. A simple model is used to calibrate regularization parameters that lead to efficient inference in the multilevel state-space model. We apply this method to the problem of estimating point-to-point traffic flows on a network from aggregate measurements. Our solution outperforms existing methods for this problem, and our two-stage approach suggests an efficient inferen...

  2. Toward automatic time-series forecasting using neural networks.

    Science.gov (United States)

    Yan, Weizhong

    2012-07-01

    Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc, which does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.

  3. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  4. Fractal Characteristic of Rock Cutting Load Time Series

    Directory of Open Access Journals (Sweden)

    Hongxiang Jiang

    2014-01-01

    Full Text Available A test-bed was developed to perform the rock cutting experiments under different cutting conditions. The fractal theory was adopted to investigate the fractal characteristic of cutting load time series and fragment size distribution in rock cutting. The box-counting dimension for the cutting load time series was consistent with the fractal dimension of the corresponding fragment size distribution, which indicated that there were inherent relations between the rock fragmentation and the cutting load. Furthermore, the box-counting dimension was used to describe the fractal characteristic of cutting load time series under different conditions. The results show that the rock compressive strength, cutting depth, cutting angle, and assisted water-jet types all have no significant effect on the fractal characteristic of cutting load. The box-counting dimension can be an evaluation index to assess the extent of rock crushing or cutting. The rock fracture mechanism would not be changed by a water-jet in front of or behind the cutter, but it would be changed when the water-jet was in the cutter.
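
    A bare-bones box-counting estimate of the fractal dimension of a signal's graph is sketched below on a simulated rough series (the cutting-load data themselves are not available here; the scale choices are illustrative).

        # Box-counting dimension of a 1-D signal's graph (illustrative).
        import numpy as np

        def box_counting_dimension(y, scales=(2, 4, 8, 16, 32, 64)):
            y = (y - y.min()) / (y.max() - y.min())     # place the graph in the unit square
            t = np.linspace(0, 1, len(y))
            counts = []
            for s in scales:
                boxes = {(int(ti * s), int(yi * s)) for ti, yi in zip(t, y)}
                counts.append(len(boxes))               # boxes of side 1/s touched by samples
            slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(0)
        signal = np.cumsum(rng.normal(size=4096))       # rough, Brownian-like test signal
        print(box_counting_dimension(signal))           # roughly 1.5 is expected for such graphs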

  5. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  6. LS-SVR and AGO Based Time Series Prediction Method

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shou-peng; LIU Shan; CHAI Wang-xu; ZHANG Jia-qi; GUO Yang-ming

    2016-01-01

    Recently, fault or health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for complex systems, and time series properties often need to be incorporated into the prediction in practice. Currently, the LS-SVR is widely adopted for the prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, the accumulated generating operation (AGO) is carried out to improve the data quality and regularity of the raw time series data based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in improving the accuracy of prediction through LS-SVR, a modified Gaussian radial basis function (RBF) is proposed. The requirements of distance-function-based kernel functions are satisfied, which ensures fast damping near the test point and moderate damping at infinity. The presented model is applied to the analysis of benchmarks. As indicated by the results, the proposed method is an effective prediction method with good precision.
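
    The AGO/IAGO mechanics are simple to sketch: accumulate the raw series, fit a kernel regressor in the accumulated domain, and difference the predictions back. KernelRidge with an RBF kernel is used below as a stand-in closely related to LS-SVR; the paper's modified kernel is not implemented and the data are simulated.

        # AGO -> kernel regression -> IAGO round trip on a simulated degradation-like series.
        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        def ago(x):
            return np.cumsum(x)                          # accumulated generating operation

        def iago(z, x0):
            return np.diff(np.concatenate(([x0], z)))    # inverse AGO (first differences)

        rng = np.random.default_rng(0)
        raw = 5 + 0.02 * np.arange(200) + 0.3 * rng.normal(size=200)
        acc = ago(raw)                                   # smoother, more regular series

        p = 5                                            # regress acc[t] on its p predecessors
        X = np.array([acc[i - p:i] for i in range(p, len(acc))])
        y = acc[p:]
        model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1e-6).fit(X, y)

        fitted_acc = model.predict(X)                    # in-sample check of the pipeline
        fitted_raw = iago(fitted_acc, x0=acc[p - 1])     # back to the original domain
        print(np.mean(np.abs(fitted_raw - raw[p:])))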

  7. Learning restricted Boolean network model by time-series data.

    Science.gov (United States)

    Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μ (hame), the normalized Hamming distance of state transition μ (hamst), and the steady-state distribution distance μ (ssd). Results show that the proposed algorithm outperforms the others according to both μ (hame) and μ (hamst), whereas its performance according to μ (ssd) is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.

  8. Genetic programming and serial processing for time series classification.

    Science.gov (United States)

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested in three different problems. Two of them are real world problems whose data were gathered for online or conference competitions. As there are published results of these two problems this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  9. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.

  10. Learning restricted Boolean network model by time-series data

    Science.gov (United States)

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance μhame, the normalized Hamming distance of state transition μhamst, and the steady-state distribution distance μssd. Results show that the proposed algorithm outperforms the others according to both μhame and μhamst, whereas its performance according to μssd is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data. PMID:25093019

  11. Complexity analysis of the UV radiation dose time series

    CERN Document Server

    Mihailovic, Dragutin T

    2013-01-01

    We have used the Lempel-Ziv and sample entropy measures to assess the complexity in the UV radiation activity in the Vojvodina region (Serbia) for the period 1990-2007. In particular, we have examined the reconstructed daily sum (dose) of the UV-B time series from seven representative places in this region and calculated the Lempel-Ziv Complexity (LZC) and Sample Entropy (SE) values for each time series. The results indicate that the LZC values in some places are close to each other while in others they differ. We have divided the period 1990-2007 into two subintervals: (a) 1990-1998 and (b) 1999-2007 and calculated LZC and SE values for the various time series in these subintervals. It is found that during the period 1999-2007, there is a decrease in their complexities, and corresponding changes in the SE, in comparison to the period 1990-1998. This complexity loss may be attributed to increased (i) human intervention in the post civil war period (land and crop use and urbanization) and military activities i...
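
    For reference, a small LZ76-style complexity counter on a binarised series is sketched below; normalisation conventions and the binarisation rule vary between studies, and the UV-dose data are replaced by a synthetic series.

        # LZ76-style complexity of a binarised series (synthetic stand-in for the UV dose).
        import numpy as np

        def lempel_ziv_complexity(bits):
            s = "".join(map(str, bits))
            i, c, n = 0, 0, len(s)
            while i < n:
                l = 1
                # extend the phrase while it is still found in the preceding text
                while i + l <= n and s[i:i + l] in s[:i + l - 1]:
                    l += 1
                c += 1
                i += l
            return c

        rng = np.random.default_rng(0)
        dose = np.sin(np.linspace(0, 20 * np.pi, 3650)) + 0.5 * rng.normal(size=3650)
        bits = (dose > np.median(dose)).astype(int)      # binarise about the median
        c = lempel_ziv_complexity(bits)
        print(c, round(c * np.log2(len(bits)) / len(bits), 3))   # raw and normalised values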

  12. An approach for estimating time-variable rates from geodetic time series

    Science.gov (United States)

    Didova, Olga; Gunter, Brian; Riva, Riccardo; Klees, Roland; Roese-Koerner, Lutz

    2016-11-01

    There has been considerable research in the literature focused on computing and forecasting sea-level changes in terms of constant trends or rates. The Antarctic ice sheet is one of the main contributors to sea-level change with highly uncertain rates of glacial thinning and accumulation. Geodetic observing systems such as the Gravity Recovery and Climate Experiment (GRACE) and the Global Positioning System (GPS) are routinely used to estimate these trends. In an effort to improve the accuracy and reliability of these trends, this study investigates a technique that allows the estimated rates, along with co-estimated seasonal components, to vary in time. For this, state space models are defined and then solved by a Kalman filter (KF). The reliable estimation of noise parameters is one of the main problems encountered when using a KF approach, which is solved by numerically optimizing likelihood. Since the optimization problem is non-convex, it is challenging to find an optimal solution. To address this issue, we limited the parameter search space using classical least-squares adjustment (LSA). In this context, we also tested the usage of inequality constraints by directly verifying whether they are supported by the data. The suggested technique for time-series analysis is expanded to classify and handle time-correlated observational noise within the state space framework. The performance of the method is demonstrated using GRACE and GPS data at the CAS1 station located in East Antarctica and compared to commonly used LSA. The results suggest that the outlined technique allows for more reliable trend estimates, as well as for more physically valuable interpretations, while validating independent observing systems.
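
    A hedged sketch of the general idea, using a structural state-space model solved by a Kalman filter (statsmodels' UnobservedComponents): the slope state gives a time-varying rate alongside a co-estimated seasonal term. The GRACE/GPS-specific noise modelling and the constrained likelihood optimisation described in the paper are not reproduced.

        # Time-varying trend via a structural state-space model and the Kalman filter.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        t = np.arange(120)                               # ten years of monthly data (simulated)
        y = -0.02 * t - 0.0005 * t**2 + 2.0 * np.sin(2 * np.pi * t / 12) + 0.5 * rng.normal(size=t.size)

        model = sm.tsa.UnobservedComponents(y, level="local linear trend", seasonal=12)
        res = model.fit(disp=False)
        slope = res.smoothed_state[1]                    # time-varying rate (slope state)
        print(slope[:3], slope[-3:])                     # rate estimates at the start and end of the record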

  13. Hydroxyl time series and recirculation in turbulent nonpremixed swirling flames

    Energy Technology Data Exchange (ETDEWEB)

    Guttenfelder, Walter A.; Laurendeau, Normand M.; Ji, Jun; King, Galen B.; Gore, Jay P. [School of Mechanical Engineering, Purdue University, West Lafayette, IN 47907-1288 (United States); Renfro, Michael W. [Department of Mechanical Engineering, University of Connecticut, Storrs, CT 06269-3139 (United States)

    2006-10-15

    Time-series measurements of OH, as related to accompanying flow structures, are reported using picosecond time-resolved laser-induced fluorescence (PITLIF) and particle-imaging velocimetry (PIV) for turbulent, swirling, nonpremixed methane-air flames. The [OH] data portray a primary reaction zone surrounding the internal recirculation zone, with residual OH in the recirculation zone approaching chemical equilibrium. Modeling of the OH electronic quenching environment, when compared to fluorescence lifetime measurements, offers additional evidence that the reaction zone burns as a partially premixed flame. A time-series analysis affirms the presence of thin flamelet-like regions based on the relation between swirl-induced turbulence and fluctuations of [OH] in the reaction and recirculation zones. The OH integral time-scales are found to correspond qualitatively to local mean velocities. Furthermore, quantitative dependencies can be established with respect to axial position, Reynolds number, and global equivalence ratio. Given these relationships, the OH time-scales, and thus the primary reaction zone, appear to be dominated by convection-driven fluctuations. Surprisingly, the OH time-scales for these nominally swirling flames demonstrate significant similarities to previous PITLIF results in nonpremixed jet flames. (author)

  14. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package.

    Science.gov (United States)

    Donges, Jonathan F; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V; Marwan, Norbert; Dijkstra, Henk A; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.

  15. Estimating the Lyapunov spectrum of time delay feedback systems from scalar time series.

    Science.gov (United States)

    Hegger, R

    1999-08-01

    On the basis of a recently developed method for modeling time delay systems, we propose a procedure to estimate the spectrum of Lyapunov exponents from a scalar time series. It turns out that the spectrum is approximated very well and allows for good estimates of the Lyapunov dimension even if the sampling rate of the time series is so low that the infinite dimensional tangent space is spanned quite sparsely.

  16. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength...... of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  17. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable...... selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to a nonparametric test aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...

  18. Inferring complex networks from time series of dynamical systems: Pitfalls, misinterpretations, and possible solutions

    CERN Document Server

    Bialonski, S

    2012-01-01

    Understanding the dynamics of spatially extended systems represents a challenge in diverse scientific disciplines, ranging from physics and mathematics to the earth and climate sciences or the neurosciences. This challenge has stimulated the development of sophisticated data analysis approaches adopting concepts from network theory: systems are considered to be composed of subsystems (nodes) which interact with each other (represented by edges). In many studies, such complex networks of interactions have been derived from empirical time series for various spatially extended systems and have been repeatedly reported to possess the same, possibly desirable, properties (e.g. small-world characteristics and assortativity). In this thesis we study whether and how interaction networks are influenced by the analysis methodology, i.e. by the way how empirical data is acquired (the spatial and temporal sampling of the dynamics) and how nodes and edges are derived from multivariate time series. Our modeling and numeric...

  19. Applying Markov Chains for NDVI Time Series Forecasting of Latvian Regions

    Directory of Open Access Journals (Sweden)

    Stepchenko Arthur

    2015-12-01

    Full Text Available Time series of earth observation based estimates of vegetation inform about variations in vegetation at the scale of Latvia. A vegetation index is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation. The NDVI index is an important variable for vegetation forecasting and management of various problems, such as climate change monitoring, energy usage monitoring, managing the consumption of natural resources, agricultural productivity monitoring, drought monitoring and forest fire detection. In this paper, we make a one-step-ahead prediction of 7-daily time series of the NDVI index using Markov chains. The choice of a Markov chain is due to the fact that a Markov chain is a sequence of random variables where each variable is located in some state, and a Markov chain contains the probabilities of moving from one state to another.
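
    A minimal version of such a predictor is sketched below: discretise the series into a few states, estimate the transition matrix from counts, and take the most probable next state as the one-step-ahead prediction (toy data, not the Latvian NDVI composites).

        # Markov-chain one-step-ahead prediction from a discretised series (toy data).
        import numpy as np

        rng = np.random.default_rng(0)
        ndvi = 0.5 + 0.3 * np.sin(np.linspace(0, 12 * np.pi, 300)) + 0.05 * rng.normal(size=300)

        n_states = 5
        edges = np.quantile(ndvi, np.linspace(0, 1, n_states + 1))
        states = np.digitize(ndvi, edges[1:-1])          # state index 0..n_states-1 per observation

        P = np.zeros((n_states, n_states))
        for a, b in zip(states[:-1], states[1:]):
            P[a, b] += 1                                 # transition counts
        P /= np.maximum(P.sum(axis=1, keepdims=True), 1) # row-normalised transition probabilities

        current = states[-1]
        print("P(next | current):", np.round(P[current], 2))
        print("one-step-ahead state:", int(np.argmax(P[current])))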

  20. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.
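
    A usage sketch is given below, assuming the third-party PyEMD package ("EMD-signal" on PyPI) and a synthetic NDVI-like series; the split of IMFs into "seasonal" and "trend" groups is purely illustrative, whereas the paper selects relevant IMFs more carefully.

        # EEMD decomposition of a synthetic NDVI-like series (assumes the PyEMD package).
        import numpy as np
        from PyEMD import EEMD

        rng = np.random.default_rng(0)
        t = np.arange(23 * 10, dtype=float)              # ~10 years of 16-day composites
        signal = 0.002 * t + 0.3 * np.sin(2 * np.pi * t / 23) + 0.05 * rng.normal(size=t.size)

        eemd = EEMD(trials=50)                           # ensemble of noise-assisted EMD runs
        imfs = eemd.eemd(signal, t)                      # rows ordered from fastest to slowest

        half = len(imfs) // 2
        seasonal = imfs[:half].sum(axis=0)               # faster IMFs ~ seasonal component
        trend = imfs[half:].sum(axis=0)                  # slower IMFs ~ trend component
        print(imfs.shape)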

  1. Linear and nonlinear dynamic systems in financial time series prediction

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2012-10-01

    Full Text Available Autoregressive moving average (ARMA) processes and dynamic neural networks, namely the nonlinear autoregressive moving average with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series; for instance the S&P500 returns. Two classes of ARMA are considered. The first one is the standard ARMA model which is a linear static system. The second one uses a Kalman filter (KF) to estimate and predict ARMA coefficients. This model is a linear dynamic system. The forecasting ability of each system is evaluated by means of mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone. Thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component found in the S&P500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P500 time series prediction.

  2. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  3. Nonlinear time-series-based adaptive control applications

    Science.gov (United States)

    Mohler, R. R.; Rajkumar, V.; Zakrzewski, R. R.

    1991-01-01

    A control design methodology based on a nonlinear time-series reference model is presented. It is indicated by highly nonlinear simulations that such designs successfully stabilize troublesome aircraft maneuvers undergoing large changes in angle of attack as well as large electric power transients due to line faults. In both applications, the nonlinear controller was significantly better than the corresponding linear adaptive controller. For the electric power network, a flexible AC transmission system with series capacitor power feedback control is studied. A bilinear autoregressive moving average reference model is identified from system data, and the feedback control is manipulated according to a desired reference state. The control is optimized according to a predictive one-step quadratic performance index. A similar algorithm is derived for control of rapid changes in aircraft angle of attack over a normally unstable flight regime. In the latter case, however, a generalization of a bilinear time-series model reference includes quadratic and cubic terms in angle of attack.

  4. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks [Scargle 1998]-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
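
    For point measurements with known Gaussian errors, the algorithm is available in astropy, as sketched below on a simulated step signal (the error level and fitness option shown here are illustrative).

        # Bayesian Blocks segmentation of noisy point measurements (astropy implementation).
        import numpy as np
        from astropy.stats import bayesian_blocks

        rng = np.random.default_rng(0)
        t = np.sort(rng.uniform(0, 100, 400))
        x = np.where(t < 40, 1.0, np.where(t < 70, 3.0, 1.5)) + 0.3 * rng.normal(size=t.size)

        edges = bayesian_blocks(t, x, sigma=0.3, fitness="measures")
        print(edges)      # change points of the optimal piecewise-constant segmentation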

  5. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  6. FAULT IDENTIFICATION IN HETEROGENEOUS NETWORKS USING TIME SERIES ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    孙钦东; 张德运; 孙朝晖

    2004-01-01

    Fault management is crucial to provide quality of service guarantees for future networks, and fault identification is an essential part of it. A novel fault identification algorithm is proposed in this paper, which focuses on the anomaly detection of network traffic. Since fault identification is achieved using statistical information in the management information base, the algorithm is compatible with the existing simple network management protocol framework. The network traffic time series is verified to be non-stationary. By fitting the adaptive autoregressive model, the series is transformed into a multidimensional vector. The training samples and identifiers are acquired from the network simulation. A k-nearest neighbor classifier identifies the system faults after being trained. The experimental results are consistent with the given fault scenarios, which proves the accuracy of the algorithm. The identification errors are discussed to illustrate that the novel fault identification algorithm is adaptive in fault scenarios with network traffic change.
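
    A schematic version of the pipeline is sketched below: an autoregressive model is fitted to each traffic window, its coefficient vector is used as the feature, and a k-nearest neighbor classifier separates normal from faulty windows (simulated windows stand in for SNMP traffic counters, and the paper's adaptive AR fitting is simplified to a fixed-order fit).

        # AR-coefficient features + k-NN classification of simulated traffic windows.
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier
        from statsmodels.tsa.ar_model import AutoReg

        rng = np.random.default_rng(0)

        def window(kind, n=200):
            x = np.zeros(n)
            phi = 0.6 if kind == 0 else 0.95              # "normal" vs "faulty" dynamics
            for t in range(1, n):
                x[t] = phi * x[t - 1] + rng.normal()
            return x

        def ar_features(x, p=4):
            return AutoReg(x, lags=p).fit().params[1:]    # drop the intercept, keep lag coefficients

        X = np.array([ar_features(window(k % 2)) for k in range(60)])
        labels = np.array([k % 2 for k in range(60)])
        clf = KNeighborsClassifier(n_neighbors=3).fit(X[:40], labels[:40])
        print("held-out accuracy:", clf.score(X[40:], labels[40:]))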

  7. Research on time series mining based on shape concept time warping

    Institute of Scientific and Technical Information of China (English)

    翁颖钧; 朱仲英

    2004-01-01

    Time series is an important kind of complex data, and growing attention has been paid to mining time series knowledge recently. Typically, the Euclidean distance measure is used for comparing time series. However, it may be a brittle distance measure because of its limited robustness. Dynamic time warping is a pattern matching algorithm based on a nonlinear dynamic programming technique; however, it is computationally expensive and suffers from local shape variance. A modified algorithm, named shape DTW, is presented, which uses the linguistic variable concept to describe the slope feature of time series. The concept tree is developed using cloud models theory, which integrates randomness and the probability of uncertainty, so that it enables conversion between qualitative and quantitative knowledge. Experiments on cluster analysis based on this algorithm, compared with the Euclidean measure, are implemented on synthetic control chart time series. The results show that this method has strong robustness to the loss of feature data due to piecewise segment preprocessing. Moreover, after the construction of the shape concept tree, we can discover knowledge of time series at different time granularities.
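
    For reference, the plain dynamic-programming DTW distance that the paper modifies is sketched below; the shape/cloud-model extension itself is not implemented.

        # Plain dynamic-programming DTW distance between two series of different lengths.
        import numpy as np

        def dtw_distance(a, b):
            n, m = len(a), len(b)
            D = np.full((n + 1, m + 1), np.inf)
            D[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    cost = abs(a[i - 1] - b[j - 1])
                    D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
            return D[n, m]

        x = np.sin(np.linspace(0, 2 * np.pi, 60))
        y = np.sin(np.linspace(0, 2 * np.pi, 80))          # same shape, different sampling speed
        print(dtw_distance(x, y))                          # small despite the length mismatch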

  8. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. As with countless other products, forecasting future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of Brazilian natural rubber prices (R$/kg) in the Jan/1999 - Jun/2006 period in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber in the Jul/2006 - Jun/2007 period, based on the estimated models. The models studied were those belonging to the ARIMA family. The main results were: the domestic market for natural rubber is expanding due to the growth of the world economy; among the fitted models, the ARIMA(1,1,1) model provided the best adjustment to the time series of natural rubber prices (R$/kg); and the forecasts produced for the series supplied statistically adequate fits.
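
    The fit-and-forecast step reported above can be sketched with statsmodels on a simulated monthly price series (the original R$/kg data are not reproduced here):

        # ARIMA(1,1,1) fit and 12-month forecast on a simulated monthly price series.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(0)
        n = 90                                             # Jan/1999 - Jun/2006, monthly
        price = np.exp(np.cumsum(0.01 + 0.05 * rng.normal(size=n)))

        res = ARIMA(price, order=(1, 1, 1)).fit()
        forecast = res.forecast(steps=12)                  # Jul/2006 - Jun/2007 horizon
        print(np.round(forecast, 2))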

  9. Characterizing weak chaos using time series of Lyapunov exponents.

    Science.gov (United States)

    da Silva, R M; Manchein, C; Beims, M W; Altmann, E G

    2015-06-01

    We investigate chaos in mixed-phase-space Hamiltonian systems using time series of the finite-time Lyapunov exponents. The methodology we propose uses the number of Lyapunov exponents close to zero to define regimes of ordered (stickiness), semiordered (or semichaotic), and strongly chaotic motion. The dynamics is then investigated looking at the consecutive time spent in each regime, the transition between different regimes, and the regions in the phase space associated to them. Applying our methodology to a chain of coupled standard maps we obtain (i) that it allows for an improved numerical characterization of stickiness in high-dimensional Hamiltonian systems, when compared to the previous analyses based on the distribution of recurrence times; (ii) that the transition probabilities between different regimes are determined by the phase-space volume associated to the corresponding regions; and (iii) the dependence of the Lyapunov exponents with the coupling strength.

  10. Removing atmosphere loading effect from GPS time series

    Science.gov (United States)

    Tiampo, K. F.; Samadi Alinia, H.; Samsonov, S. V.; Gonzalez, P. J.

    2015-12-01

    The GPS time series of site position are contaminated by various sources of noise; in particular, the ionospheric and tropospheric path delays are significant [Gray et al., 2000; Meyer et al., 2006]. The GPS path delay in the ionosphere depends largely on the wave frequency, whereas the delay in the troposphere depends on the length of the travel path and therefore on site elevation. The approaches available for compensating the ionospheric path delay cannot be used to remove the tropospheric component. Quantifying the tropospheric delay plays an important role in determining the precision of the vertical GPS component, as tropospheric parameters over large distances have very little correlation with each other. Several methods have been proposed for eliminating the tropospheric signal from GPS vertical time series. Here we utilize surface temperature fluctuations and seasonal variations in water vapour and air pressure data for various spatial and temporal profiles in order to remove the atmospheric path delay more accurately [Samsonov et al., 2014]. In this paper, we model the atmospheric path delay of the vertical position time series by analyzing the signal in the frequency domain and study its dependence on topography in eastern Ontario for the period from January 2008 to December 2012. The systematic dependence of the amplitude of the atmospheric path delay on height, and its temporal variations, are described through the development of a new, physics-based model relating tropospheric/atmospheric effects to topography, which can help in determining the most accurate GPS position.
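
    The full atmospheric correction is the subject of the cited work; as a loose, hedged illustration of removing seasonal signals from a vertical-position series, the sketch below fits and subtracts annual and semi-annual harmonics by ordinary least squares (the daily series and its parameters are invented for the example).

      import numpy as np

      # Synthetic daily vertical GPS component (mm), five years, annual signal plus noise
      t = np.arange(5 * 365) / 365.25                      # time in years
      up = 3.0 * np.sin(2 * np.pi * t + 0.4) + 1.0 * np.sin(4 * np.pi * t) \
           + np.random.normal(0, 1.5, t.size)

      # Design matrix: offset, linear trend, annual and semi-annual harmonics
      X = np.column_stack([np.ones_like(t), t,
                           np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                           np.sin(4 * np.pi * t), np.cos(4 * np.pi * t)])
      coef, *_ = np.linalg.lstsq(X, up, rcond=None)
      residual = up - X @ coef                             # series with seasonal signal removed
      print(coef)
      print(residual.std())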

  11. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata are used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System could provide the foundation for a large-area crop-surveillance system that could identify

  12. Improvement in global forecast for chaotic time series

    Science.gov (United States)

    Alves, P. R. L.; Duarte, L. G. S.; da Mota, L. A. C. P.

    2016-10-01

    In the Polynomial Global Approach to Time Series Analysis, the most computationally costly step is finding the fitting polynomial. Here we present two routines that improve the forecasting. In the first, an algorithm that greatly improves this situation is introduced and implemented; the heart of the procedure is a specific routine which performs the mapping with great efficiency. In comparison with the similar procedure of the TimeS package developed by Carli et al. (2014), an enormous gain in efficiency and an increase in accuracy are obtained. Another development in this work is the establishment of a level of confidence in the global prediction, with a statistical test for evaluating whether the minimization performed is suitable or not. The second program presented in this article applies the Shapiro-Wilk test to check the normality of the distribution of errors and calculates the expected deviation. The development is applied to observed and simulated time series to illustrate the performance obtained.
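
    The normality check described above is easy to reproduce in outline; the sketch below (not the authors' TimeS-based routine) fits a global polynomial to a toy series, applies the Shapiro-Wilk test to the residuals with scipy, and reports their standard deviation as an expected deviation.

      import numpy as np
      from scipy.stats import shapiro

      # Toy series and a global polynomial fit (degree chosen arbitrarily)
      t = np.linspace(0, 10, 300)
      y = np.sin(t) + 0.1 * np.random.normal(size=t.size)
      coeffs = np.polyfit(t, y, deg=7)
      errors = y - np.polyval(coeffs, t)

      stat, p_value = shapiro(errors)       # H0: errors are normally distributed
      print(f"W = {stat:.4f}, p = {p_value:.4f}, expected deviation = {errors.std():.4f}")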

  13. Estimation of coupling between time-delay systems from time series.

    Science.gov (United States)

    Prokhorov, M D; Ponomarenko, V I

    2005-07-01

    We propose a method for estimation of coupling between the systems governed by scalar time-delay differential equations of the Mackey-Glass type from the observed time series data. The method allows one to detect the presence of certain types of linear coupling between two time-delay systems, to define the type, strength, and direction of coupling, and to recover the model equations of coupled time-delay systems from chaotic time series corrupted by noise. We verify our method using both numerical and experimental data.

  14. A Tool to Recover Scalar Time-Delay Systems from Experimental Time Series

    CERN Document Server

    Bünner, M J; Meyer, T; Kittel, A; Parisi, J; Meyer, Th.

    1996-01-01

    We propose a method that is able to analyze chaotic time series gained from experimental data. The method allows one to identify scalar time-delay systems. If the dynamics of the system under investigation is governed by a scalar time-delay differential equation of the form $dy(t)/dt = h(y(t),y(t-\tau_0))$, the delay time $\tau_0$ and the function $h$ can be recovered. There are no restrictions on the dimensionality of the chaotic attractor. The method turns out to be insensitive to noise. We successfully apply the method to various time series taken from a computer experiment and two different electronic oscillators.

  15. Estimating and Analyzing Savannah Phenology with a Lagged Time Series Model

    DEFF Research Database (Denmark)

    Boke-Olen, Niklas; Lehsten, Veiko; Ardo, Jonas

    2016-01-01

    Savannah regions are predicted to undergo changes in precipitation patterns according to current climate change projections. This change will affect leaf phenology, which controls net primary productivity. It is of importance to study this since savannahs play an important role in the global carbon cycle due to their areal coverage and can have an effect on food security in regions that depend on subsistence farming. In this study we investigate how soil moisture, mean annual precipitation, and day length control savannah phenology by developing a lagged time series model. The model uses climate data for 15 flux tower sites across four continents, and normalized difference vegetation index from satellite, to optimize a statistical phenological model. We show that all three variables can be used to estimate savannah phenology on a global scale. However, it was not possible to create...

  16. Characterizability of metabolic pathway systems from time series data.

    Science.gov (United States)

    Voit, Eberhard O

    2013-12-01

    Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE.
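
    In spirit, the pseudo-inverse step amounts to solving an underdetermined linear system N v(t) ≈ dX/dt at each time point; a minimal sketch under invented assumptions (the hypothetical stoichiometric matrix and metabolite series below are not from the paper) is:

      import numpy as np

      # Hypothetical stoichiometric matrix: 2 metabolites, 3 unknown fluxes (underdetermined)
      N = np.array([[ 1.0, -1.0,  0.0],
                    [ 0.0,  1.0, -1.0]])

      # Smoothed metabolite time series and their numerical time derivatives
      t = np.linspace(0, 10, 101)
      X = np.vstack([2.0 + np.sin(0.5 * t), 1.0 + 0.3 * np.cos(0.5 * t)])
      dXdt = np.gradient(X, t, axis=1)

      # Minimum-norm flux estimate at each time point via the Moore-Penrose pseudo-inverse
      V = np.linalg.pinv(N) @ dXdt          # shape: (3 fluxes, 101 time points)
      print(V[:, :5])

    Note that np.linalg.pinv returns only the minimum-norm member of the solution family; the cited work is concerned with deciding which fluxes, or combinations of fluxes, are actually characterizable.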

  17. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks.

    Science.gov (United States)

    Wang, Jie; Wang, Jun; Fang, Wen; Niu, Hongli

    2016-01-01

    In recent years, financial market dynamics forecasting has been a focus of economic research. To predict the price indices of stock markets, we developed an architecture which combines Elman recurrent neural networks with a stochastic time-effective function. Analyzing the proposed model with linear regression, complexity-invariant distance (CID), and multiscale CID (MCID) methods, and comparing it with models such as the backpropagation neural network (BPNN), the stochastic time-effective neural network (STNN), and the Elman recurrent neural network (ERNN), the empirical results show that the proposed network performs best among these neural networks in financial time series forecasting. Further, the predictive ability of the established model is tested on the SSE, TWSE, KOSPI, and Nikkei 225 indices, and the corresponding statistical comparisons of these market indices are also exhibited. The experimental results show that this approach gives good performance in predicting values of the stock market indices.

  18. VARTOOLS: A Program for Analyzing Astronomical Time-Series Data

    CERN Document Server

    Hartman, Joel D

    2016-01-01

    This paper describes the VARTOOLS program, which is an open-source command-line utility, written in C, for analyzing astronomical time-series data, especially light curves. The program provides a general-purpose set of tools for processing light curves including signal identification, filtering, light curve manipulation, time conversions, and modeling and simulating light curves. Some of the routines implemented include the Generalized Lomb-Scargle periodogram, the Box-Least Squares transit search routine, the Analysis of Variance periodogram, the Discrete Fourier Transform including the CLEAN algorithm, the Weighted Wavelet Z-Transform, light curve arithmetic, linear and non-linear optimization of analytic functions including support for Markov Chain Monte Carlo analyses with non-trivial covariances, characterizing and/or simulating time-correlated noise, and the TFA and SYSREM filtering algorithms, among others. A mechanism is also provided for incorporating a user's own compiled processing routines into th...
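
    As one small example of the kind of routine such packages provide, a Generalized Lomb-Scargle periodogram is also available in Python via astropy; the sketch below is unrelated to VARTOOLS itself and simply recovers the period of an unevenly sampled, noisy sinusoid.

      import numpy as np
      from astropy.timeseries import LombScargle

      # Unevenly sampled light curve with a 2.5-day period plus noise
      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0, 100, 300))
      y = 0.1 * np.sin(2 * np.pi * t / 2.5) + 0.02 * rng.normal(size=t.size)

      frequency, power = LombScargle(t, y).autopower()
      best_period = 1.0 / frequency[np.argmax(power)]
      print(f"best period = {best_period:.3f} days")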

  19. Scaling in Non-stationary time series I

    CERN Document Server

    Ignaccolo, M; Grigolini, P; Hamilton, P; West, B J

    2003-01-01

    Most data processing techniques, applied to biomedical and sociological time series, are only valid for random fluctuations that are stationary in time. Unfortunately, these data are often non-stationary, and the use of analysis techniques resting on the stationarity assumption can produce wrong information about the scaling, and hence about the complexity, of the process under study. Herein, we test and compare two techniques for removing non-stationary influences from computer-generated time series consisting of the superposition of a slow signal and a random fluctuation. The former technique is based on wavelet decomposition, and the latter is a proposal of this paper, which we denote the step detrending technique. We focus on two cases: when the slow signal is a periodic function mimicking the influence of seasons, and when it is an aperiodic signal mimicking the influence of a population change (increase or decrease). For the purpose of computational simplicity the random fluctuation is taken...
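
    The paper's own step detrending technique is not specified in this record; as a hedged reading of the general idea, the sketch below removes the slow seasonal component from a synthetic series by subtracting the mean of each non-overlapping window, which may differ from the authors' exact algorithm.

      import numpy as np

      # Slow periodic signal (seasons) plus random fluctuations
      n = 2000
      t = np.arange(n)
      slow = 5.0 * np.sin(2 * np.pi * t / 365.0)
      x = slow + np.random.normal(0, 1.0, n)

      # Piecewise ("step") detrending: subtract the mean of each non-overlapping window
      window = 50
      detrended = x.copy()
      for start in range(0, n, window):
          seg = slice(start, min(start + window, n))
          detrended[seg] -= x[seg].mean()

      print(x.std(), detrended.std())   # the slow component is largely removed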

  20. Artificial neural networks applied to forecasting time series.

    Science.gov (United States)

    Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar

    2011-04-01

    This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved to be useful in time series forecasting, and also a standard procedure for the practical application of ANN in this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by the four neural network models analyzed is less than 10%. In accordance with the interpretation criteria of this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model is the one with the worst performance. Finally, we analyze the advantages and limitations of ANN, the possible solutions to these limitations, and provide an orientation towards future research.

  1. GPS time series at Campi Flegrei caldera (2000-2013)

    Directory of Open Access Journals (Sweden)

    Prospero De Martino

    2014-05-01

    Full Text Available The Campi Flegrei caldera is an active volcanic system associated with high volcanic risk, and represents a well-known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods followed by subsidence phases with some episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation in the area. Seven main uplift episodes were detected; during each uplift period, the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurred. The complete archive of GPS time series for the Campi Flegrei area is reported in the Supplementary materials. These data can be useful to the scientific community in improving research on Campi Flegrei caldera dynamics and hazard assessment.

  2. Time series prediction of mining subsidence based on a SVM

    Institute of Scientific and Technical Information of China (English)

    Li Peixian; Tan Zhixiang; Yah Lili; Deng Kazhong

    2011-01-01

    In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time-series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data and subjected to tests of stationarity, zero mean and normality. The data were then used to train the SVM model. A time series model was established to predict mining subsidence through rational choices of embedding dimensions and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy and generalization performance of the model. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigating the dynamics of surface movements.
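
    A hedged sketch of the general recipe (time-delay embedding of equal-interval data followed by support-vector regression), using scikit-learn rather than the authors' implementation, is shown below; the embedding dimension, kernel and parameters are illustrative only.

      import numpy as np
      from sklearn.svm import SVR

      # Equal-interval toy subsidence record (mm); replace with observation-station data
      x = np.cumsum(np.random.normal(-0.5, 0.2, 300))

      # Time-delay embedding: predict x[t] from the previous m values
      m = 5
      X = np.array([x[i:i + m] for i in range(len(x) - m)])
      y = x[m:]

      model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X[:-20], y[:-20])
      pred = model.predict(X[-20:])                    # one-step predictions on held-out points
      mape = np.mean(np.abs((y[-20:] - pred) / y[-20:])) * 100
      print(f"MAPE on the last 20 points: {mape:.2f}%")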

  3. On the maximum-entropy/autoregressive modeling of time series

    Science.gov (United States)

    Chao, B. F.

    1984-01-01

    The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
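
    The z-plane interpretation can be made concrete in a few lines: fit an AR(p) model, form its characteristic polynomial and inspect the roots (poles), whose angles and moduli give the frequencies and damping of the constituent complex harmonics. A hedged sketch with numpy and statsmodels (order and data chosen arbitrarily):

      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      # Noisy sinusoid: we expect a complex-conjugate pole pair near the driving frequency
      t = np.arange(500)
      x = np.sin(2 * np.pi * 0.05 * t) + 0.2 * np.random.normal(size=t.size)

      p = 4
      fit = AutoReg(x, lags=p).fit()
      phi = fit.params[1:]                      # AR coefficients (skip the constant)

      # Poles are the roots of z^p - phi_1 z^(p-1) - ... - phi_p
      poles = np.roots(np.r_[1.0, -phi])
      freqs = np.angle(poles) / (2 * np.pi)     # pole angle -> frequency (cycles/sample)
      print(np.c_[np.abs(poles), freqs])        # pole modulus and frequency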

  4. Reconstruction of Ordinary Differential Equations From Time Series Data

    CERN Document Server

    Mai, Manuel; O'Hern, Corey S

    2016-01-01

    We develop a numerical method to reconstruct systems of ordinary differential equations (ODEs) from time series data without {\\it a priori} knowledge of the underlying ODEs using sparse basis learning and sparse function reconstruction. We show that employing sparse representations provides more accurate ODE reconstruction compared to least-squares reconstruction techniques for a given amount of time series data. We test and validate the ODE reconstruction method on known 1D, 2D, and 3D systems of ODEs. The 1D system possesses two stable fixed points; the 2D system possesses an oscillatory fixed point with closed orbits; and the 3D system displays chaotic dynamics on a strange attractor. We determine the amount of data required to achieve an error in the reconstructed functions to less than $0.1\\%$. For the reconstructed 1D and 2D systems, we are able to match the trajectories from the original ODEs even at long times. For the 3D system with chaotic dynamics, as expected, the trajectories from the original an...
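
    A minimal, hedged stand-in for the general idea (not the authors' sparse basis-learning code): regress numerically estimated derivatives onto a polynomial library and threshold small coefficients, here for near-noiseless data from the 1D system dx/dt = x - x^3.

      import numpy as np

      # Simulate dx/dt = x - x**3 with simple Euler steps
      dt, n = 0.01, 4000
      x = np.empty(n); x[0] = 0.1
      for i in range(n - 1):
          x[i + 1] = x[i] + dt * (x[i] - x[i] ** 3)
      x += np.random.normal(0, 1e-6, n)          # very mild measurement noise

      dxdt = np.gradient(x, dt)

      # Polynomial library up to cubic terms
      library = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])
      coef, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
      coef[np.abs(coef) < 0.05] = 0.0            # hard threshold to enforce sparsity
      print(coef)                                # should be close to [0, 1, 0, -1]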

  5. Monthly hail time series analysis related to agricultural insurance

    Science.gov (United States)

    Tarquis, Ana M.; Saa, Antonio; Gascó, Gabriel; Díaz, M. C.; Garcia Moreno, M. R.; Burgaz, F.

    2010-05-01

    Hail is one of the most important causes of crop-insurance claims in Spain, accounting for more than 50% of the total insurance in cereal crops. The purpose of the present study is to analyze hail damage in cereals. Four provinces were chosen, those with the highest production values: Burgos and Zaragoza for wheat, and Cuenca and Valladolid for barley. The data available for studying the evolution and intensity of hail damage include an analysis of the correlation between the agricultural insurance ratios provided by ENESA and the number of annual hail days (from 1981 to 2007). At the same time, several weather stations per province were selected for having the longest and most complete records (from 1963 to 2007) in order to perform an analysis of the monthly time series of the number of hail days (HD). The results show that the relation between the agricultural insurance ratios and the number of hail days is not clear. Several observations are discussed to explain these results, as well as whether it is possible to determine a change of trend in the HD time series.

  6. Kernel canonical-correlation Granger causality for multiple time series

    Science.gov (United States)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.

  7. Time series analysis using semiparametric regression on oil palm production

    Science.gov (United States)

    Yundari, Pasaribu, U. S.; Mukhaiyar, U.

    2016-04-01

    This paper presents a semiparametric kernel regression method, which has shown flexibility and ease of mathematical calculation, especially in estimating density and regression functions. The kernel function is continuous and produces a smooth estimate. The classical kernel density estimator is constructed by a completely nonparametric analysis and works reasonably well for all forms of function. Here, we discuss parameter estimation in time series analysis. First, we assume the parameters exist; we then use nonparametric estimation, and the combination is called semiparametric. The optimum bandwidth is selected by considering an approximation of the Mean Integrated Squared Error (MISE).
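
    As a generic, hedged illustration of kernel smoothing for a time series (not the authors' semiparametric oil-palm model), the sketch below implements a Nadaraya-Watson estimator with a Gaussian kernel and selects the bandwidth by leave-one-out cross-validation, a common practical stand-in when the MISE-optimal bandwidth is not available in closed form.

      import numpy as np

      def nw_smooth(t, y, t_eval, h):
          """Nadaraya-Watson estimate with a Gaussian kernel of bandwidth h."""
          w = np.exp(-0.5 * ((t_eval[:, None] - t[None, :]) / h) ** 2)
          return (w @ y) / w.sum(axis=1)

      # Toy monthly production-like series: trend + seasonality + noise
      t = np.arange(120, dtype=float)
      y = 0.02 * t + np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, t.size)

      # Leave-one-out cross-validation over a grid of candidate bandwidths
      best_h, best_err = None, np.inf
      for h in (1.0, 2.0, 3.0, 5.0, 8.0):
          errs = []
          for i in range(t.size):
              mask = np.arange(t.size) != i
              yhat = nw_smooth(t[mask], y[mask], t[i:i + 1], h)[0]
              errs.append((y[i] - yhat) ** 2)
          if np.mean(errs) < best_err:
              best_h, best_err = h, np.mean(errs)

      print(f"selected bandwidth: {best_h}, CV error: {best_err:.4f}")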

  8. Signatures of discrete scale invariance in Dst time series

    Science.gov (United States)

    Balasis, Georgios; Papadimitriou, Constantinos; Daglis, Ioannis A.; Anastasiadis, Anastasios; Athanasopoulou, Labrini; Eftaxias, Konstantinos

    2011-07-01

    Self-similar systems are characterized by continuous scale invariance and, in response, the existence of power laws. However, a significant number of systems exhibits discrete scale invariance (DSI) which in turn leads to log-periodic corrections to scaling that decorate the pure power law. Here, we present the results of a search of log-periodic corrections to scaling in the squares of Dst index increments which are taken as proxies of the energy dissipation rate in the magnetosphere. We show that Dst time series exhibit DSI and discuss the consequence of this feature, as well as the possible implications of Dst DSI on space weather forecasting efforts.

  9. Nonlinear analysis and prediction of time series in multiphase reactors

    CERN Document Server

    Liu, Mingyan

    2014-01-01

    This book reports on important nonlinear aspects or deterministic chaos issues in the systems of multi-phase reactors. The reactors treated in the book include gas-liquid bubble columns, gas-liquid-solid fluidized beds and gas-liquid-solid magnetized fluidized beds. The authors take pressure fluctuations in the bubble columns  as time series for nonlinear analysis, modeling and forecasting. They present qualitative and quantitative non-linear analysis tools which include attractor phase plane plot, correlation dimension, Kolmogorov entropy and largest Lyapunov exponent calculations and local non-linear short-term prediction.

  10. Quality Quandaries: Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.
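
    A hedged sketch of the kind of parsimony-driven comparison discussed above, using statsmodels on a simulated series (standing in for the Internet-server data): fit a few candidate orders and compare them by AIC, preferring a low-order mixed ARMA model when scores are close.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from statsmodels.tsa.arima_process import ArmaProcess

      # Simulate an ARMA(1,1) series as a stand-in for the Internet-server data
      ar, ma = np.array([1.0, -0.6]), np.array([1.0, 0.4])
      y = ArmaProcess(ar, ma).generate_sample(nsample=500)

      candidates = [(1, 0, 0), (2, 0, 0), (3, 0, 0), (1, 0, 1), (2, 0, 1)]
      for order in candidates:
          aic = ARIMA(y, order=order).fit().aic
          print(order, round(aic, 1))
      # A parsimonious mixed ARMA(1,1) model typically competes with, or beats,
      # higher-order pure AR alternatives on AIC.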

  11. Nonlinear Time Series Forecast Using Radial Basis Function Neural Networks

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xin; CHEN Tian-Lun

    2003-01-01

    In the research of using Radial Basis Function Neural Networks (RBF NN) to forecast nonlinear time series, we investigate how different clusterings affect the process of learning and forecasting. We find that k-means clustering is very suitable. In order to increase the precision we introduce a nonlinear feedback term to escape from the local minima of the energy, and then we use the model to forecast the nonlinear time series produced by the Mackey-Glass equation and by stocks. By selecting the k-means clustering and a suitable feedback term, much better forecasting results are obtained.

  12. Nonlinear Time Series Prediction Using Chaotic Neural Networks

    Institute of Scientific and Technical Information of China (English)

    LI KePing; CHEN TianLun

    2001-01-01

    A nonlinear feedback term is introduced into the evaluation equation for the weights of the backpropagation algorithm for a neural network, so that the network becomes a chaotic one. So that we can investigate how different feedback terms affect the process of learning and forecasting, we use the model to forecast the nonlinear time series produced by the Mackey-Glass equation. By selecting a suitable feedback term, the system can escape from local minima and converge to the global minimum or its approximate solutions, and the forecasting results are better than those of the backpropagation algorithm.

  13. Ensemble Deep Learning for Biomedical Time Series Classification

    Directory of Open Access Journals (Sweden)

    Lin-peng Jin

    2016-01-01

    Full Text Available Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  14. Ensemble Deep Learning for Biomedical Time Series Classification.

    Science.gov (United States)

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  15. Phase space reconstruction using input-output time series data

    Science.gov (United States)

    Walker, David M.; Tufillaro, Nicholas B.

    1999-10-01

    In this paper we suggest that an extension of a procedure recently proposed by Wayland et al. [Phys. Rev. Lett. 70, 580 (1993)] for recognizing determinism in an autonomous time series can also be used as a diagnostic for determining an appropriate embedding dimension for driven (``input-output'') systems. We compare the results of this extension to the results produced by the extensions to the method of false nearest neighbors put forward by Rhodes and Morari [Proceedings of the American Control Conference, Seattle, edited by The American Automatic Control Council (IEEE, Piscataway, 1995)] and the method of averaged false nearest neighbors by Cao et al. [Int. J. Bifurcation Chaos 8, 1491 (1998)].

  16. Ensemble Deep Learning for Biomedical Time Series Classification

    Science.gov (United States)

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  17. A Suspicious Action Detection System Considering Time Series

    Science.gov (United States)

    Kozuka, Noriaki; Kimura, Koji; Hagiwara, Masafumi

    The paper proposes a new system, based on image processing, that can detect suspicious actions, such as a car break-in, and their surroundings in an open-space parking lot. The proposed system focuses on three aspects of human actions: “order”, “time”, and “location”. The proposed system has the following features: it 1) handles time-series data flow, 2) estimates human actions and their locations, 3) extracts suspicious-action detection rules automatically, and 4) detects suspicious actions using a suspiciousness score. We carried out experiments using real image sequences. As a result, we obtained an estimation rate about 7.8% higher than that of the conventional system.

  18. A series expansion for the time autocorrelation of dynamical variables

    CERN Document Server

    Maiocchi, A M; Giorgilli, A

    2011-01-01

    We present here a general iterative formula which gives a (formal) series expansion for the time autocorrelation of smooth dynamical variables, for all Hamiltonian systems endowed with an invariant measure. We add some criteria, theoretical in nature, which enable one to decide whether the decay of the correlations is exponentially fast or not. One of these criteria is implemented numerically for the case of the Fermi-Pasta-Ulam system, and we find indications which might suggest a sub-exponential decay for such a system.

  19. Disease management with ARIMA model in time series.

    Science.gov (United States)

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through time series analysis. In this study, we aim to measure the results of interventions and their effects on the disease. Clinical studies have benefited from the use of these techniques, particularly because of the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a valuable contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.

  20. Estimation of dynamic flux profiles from metabolic time series data

    Directory of Open Access Journals (Sweden)

    Chou I-Chun

    2012-07-01

    Full Text Available Abstract Background Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Special among them is the recent method of dynamic flux estimation (DFE, which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes and detailed kinetic information must be available, in addition to the time series data. Results The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format of

  1. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    1981-01-01

    This paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with a simple transfer model of the ARMA type (autoregressive moving average) applied to a storm sewer system. The output of the simulation is the frequency distribution of the peak flow, overflow volume etc. from the overflow or the retention storage. The parameters in the transfer model are found either from rainfall/runoff measurements in the catchment or from one or more simulations with an advanced hydraulic computer model.

  2. Real Rainfall Time Series for Storm Sewer Design

    DEFF Research Database (Denmark)

    Larsen, Torben

    The paper describes a simulation method for the design of retention storages, overflows etc. in storm sewer systems. The method is based on computer simulation with real rainfall time series as input and with the application of a simple transfer model of the ARMA type (autoregressive moving average model) as the model of the storm sewer system. The output of the simulation is the frequency distribution of the peak flow, overflow volume etc. from the overflow or retention storage. The parameters in the transfer model are found either from rainfall/runoff measurements in the catchment or from one or a few simulations with an advanced hydraulic computer model.

  3. Almost Periodically Correlated Time Series in Business Fluctuations Analysis

    CERN Document Server

    Lenart, Lukasz

    2012-01-01

    We propose a non-standard subsampling procedure to make formal statistical inference about the business cycle, one of the most important unobserved features characterising fluctuations of economic growth. We show that some characteristics of the business cycle can be modelled in a non-parametric way by the discrete spectrum of an Almost Periodically Correlated (APC) time series. On the basis of the estimated characteristics of this spectrum, the business cycle is extracted by filtering. As an illustration we characterise the main properties of business cycles in the industrial production index for the Polish economy.

  4. Time series analysis for minority game simulations of financial markets

    CERN Document Server

    Ferreira, F F; Machado, B S; Muruganandam, P

    2003-01-01

    The minority game model introduced recently provides promising insights into the understanding of the evolution of prices, indices and rates in financial markets. In this paper we perform a time series analysis of the model employing tools from statistics, dynamical systems theory and stochastic processes. Using benchmark systems and a financial index for comparison, we draw conclusions about the generating mechanism for this kind of evolution. The trajectories of the model are found to be similar to those of the first differences of the S&P 500 index: stochastic, nonlinear and (unit root) stationary.

  5. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  6. Visualizing trends and clusters in ranked time-series data

    Science.gov (United States)

    Gousie, Michael B.; Grady, John; Branagan, Melissa

    2013-12-01

    There are many systems that provide visualizations for time-oriented data. Of those, few provide the means of finding patterns in time-series data in which rankings are also important. Fewer still have the fine granularity necessary to visually follow individual data points through time. We propose the Ranking Timeline, a novel visualization method for modestly-sized multivariate data sets that include the top ten rankings over time. The system includes two main visualization components: a ranking over time and a cluster analysis. The ranking visualization, loosely based on line plots, allows the user to track individual data points so as to facilitate comparisons within a given time frame. Glyphs represent additional attributes within the framework of the overall system. The user has control over many aspects of the visualization, including viewing a subset of the data and/or focusing on a desired time frame. The cluster analysis tool shows the relative importance of individual items in conjunction with a visualization showing the connection(s) to other, similar items, while maintaining the aforementioned glyphs and user interaction. The user controls the clustering according to a similarity threshold. The system has been implemented as a Web application, and has been tested with data showing the top ten actors/actresses from 1929-2010. The experiments have revealed patterns in the data heretofore not explored.

  7. Local polynomial method for ensemble forecast of time series

    Directory of Open Access Journals (Sweden)

    S. Regonda

    2005-01-01

    Full Text Available We present a nonparametric approach based on local polynomial regression for ensemble forecast of time series. The state space is first reconstructed by embedding the univariate time series of the response variable in a space of dimension D with a delay time τ. To obtain a forecast from a given time point t, three steps are involved: (i) the current state of the system is mapped onto the state space, known as the feature vector; (ii) a small number K = α*n (α = a fraction in (0,1], n = data length) of neighbors of the feature vector (and their future evolution) are identified in the state space; and (iii) a polynomial of order p is fitted to the identified neighbors, which is then used for prediction. A suite of parameter combinations (D, τ, α, p) is selected based on an objective criterion, called the Generalized Cross Validation (GCV). All of the selected parameter combinations are then used to issue a T-step iterated forecast starting from the current time t, thus generating an ensemble forecast which can be used to obtain the forecast probability density function (PDF). The ensemble approach improves upon the traditional method of providing a single mean forecast by providing the forecast uncertainty. Further, for short noisy data it can provide better forecasts. We demonstrate the utility of this approach on two synthetic data sets (Henon and Lorenz attractors) and two real data sets (Great Salt Lake bi-weekly volume and NINO3 index). This framework can also be used to forecast a vector of response variables based on a vector of predictors.
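
    A stripped-down, hedged version of the procedure (delay embedding, nearest neighbours, local polynomial fit) for a single one-step forecast might look as follows; the parameter choices are illustrative rather than GCV-selected, and the logistic map stands in for a real series.

      import numpy as np

      def local_poly_forecast(x, D=3, tau=1, alpha=0.1, p=1):
          """One-step forecast from the last embedded state using a local polynomial fit."""
          # Delay embedding of the series (each row is one state vector)
          idx = np.arange((D - 1) * tau, len(x) - 1)
          states = np.column_stack([x[idx - k * tau] for k in range(D)])
          targets = x[idx + 1]

          # Current (most recent) feature vector
          current = np.array([x[len(x) - 1 - k * tau] for k in range(D)])

          # K nearest neighbours of the current state
          K = max(D + 1, int(alpha * len(targets)))
          order = np.argsort(np.linalg.norm(states - current, axis=1))[:K]

          # Local polynomial of order p (p=1 is a local linear model) fitted to the neighbours
          A = np.column_stack([np.ones(K)] + [states[order] ** d for d in range(1, p + 1)])
          coef, *_ = np.linalg.lstsq(A, targets[order], rcond=None)
          a_cur = np.concatenate([[1.0]] + [current ** d for d in range(1, p + 1)])
          return a_cur @ coef

      # Example on the logistic map as a simple nonlinear test series
      x = np.empty(1000); x[0] = 0.4
      for i in range(999):
          x[i + 1] = 3.9 * x[i] * (1.0 - x[i])
      print(local_poly_forecast(x), 3.9 * x[-1] * (1.0 - x[-1]))  # forecast vs true next value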

  8. Detection of intermittent events in atmospheric time series

    Science.gov (United States)

    Paradisi, P.; Cesari, R.; Palatella, L.; Contini, D.; Donateo, A.

    2009-04-01

    associated with the occurrence of critical events in the atmospheric dynamics. The critical events are associated with transitions between meta-stable configurations. Consequently, this approach could be of use in the study of extreme events in meteorology and climatology and in weather classification schemes. The renewal approach could also help in the modelling of non-Gaussian closures for turbulent fluxes [3]. In the proposed approach the main features that need to be estimated are: (a) the distribution of life-times of a given atmospheric meta-stable structure (waiting times between two critical events); (b) the statistical distribution of fluctuations; (c) the presence of memory in the time series. These features are related to the evaluation of memory content and scaling from the time series. In order to analyze these features, some novel statistical techniques have been developed in recent years. In particular, the analysis of Diffusion Entropy [4] was shown to be a robust method for the determination of the dynamical scaling. This property is related to the power-law behaviour of the life-time statistics and to the memory properties of the time series. The analysis of Renewal Aging [5], based on renewal theory [2], allows one to estimate the memory content of a time series, which is related to the amount of critical events in the time series itself. After a brief review of the statistical techniques (Diffusion Entropy and Renewal Aging), an application to experimental atmospheric time series will be illustrated. References [1] Weiss G.H., Rubin R.J., Random Walks: theory and selected applications, Advances in Chemical Physics, 52, 363-505 (1983). [2] D.R. Cox, Renewal Theory, Methuen, London (1962). [3] P. Paradisi, R. Cesari, F. Mainardi, F. Tampieri: The fractional Fick's law for non-local transport processes, Physica A, 293, p. 130-142 (2001). [4] P. Grigolini, L. Palatella, G. Raffaelli, Fractals 9 (2001) 439. [5] P. Allegrini, F. Barbi, P

  9. Geologic Carbon Sequestration: Mitigating Climate Change by Injecting CO2 Underground (LBNL Summer Lecture Series)

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Curtis M [LBNL Earth Sciences Division

    2009-07-21

    Summer Lecture Series 2009: Climate change provides strong motivation to reduce CO2 emissions from the burning of fossil fuels. Carbon dioxide capture and storage involves the capture, compression, and transport of CO2 to geologically favorable areas, where it is injected into porous rock more than one kilometer underground for permanent storage. Oldenburg, who heads Berkeley Lab's Geologic Carbon Sequestration Program, will focus on the challenges, opportunities, and research needs of this innovative technology.

  10. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.

  11. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The results of the data analysis can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.

  12. TIME SERIES FORECASTING WITH MULTIPLE CANDIDATE MODELS: SELECTING OR COMBINING?

    Institute of Scientific and Technical Information of China (English)

    YU Lean; WANG Shouyang; K. K. Lai; Y.Nakamori

    2005-01-01

    Various mathematical models have been commonly used in time series analysis and forecasting. In these processes, academic researchers and business practitioners often come up against two important problems. One is whether, for different or dissimilar modeling approaches, to select a single appropriate approach for prediction purposes or to combine the different individual approaches into a single forecast. The other is whether, for the same or similar modeling approaches, to select the best candidate model for forecasting or to mix the various candidate models with different parameters into a new forecast. In this study, we propose a set of computational procedures to solve the above two issues via two judgmental criteria. Meanwhile, in view of the problems presented in the literature, a novel modeling technique is also proposed to overcome the drawbacks of existing combined forecasting methods. To verify the efficiency and reliability of the proposed procedure and modeling technique, simulations and real data examples are conducted in this study. The results obtained reveal that the proposed procedure and modeling technique can be used as a feasible solution for time series forecasting with multiple candidate models.

  13. Time series modelling and forecasting of emergency department overcrowding.

    Science.gov (United States)

    Kadri, Farid; Harrou, Fouzi; Chaabane, Sondès; Tahon, Christian

    2014-09-01

    Efficient management of patient flow (demand) in emergency departments (EDs) has become an urgent issue for many hospital administrations. Today, more and more attention is being paid to hospital management systems to optimally manage patient flow and to improve management strategies, efficiency and safety in such establishments. To this end, EDs require significant human and material resources, but unfortunately these are limited. Within such a framework, the ability to accurately forecast demand in emergency departments has considerable implications for hospitals to improve resource allocation and strategic planning. The aim of this study was to develop models for forecasting daily attendances at the hospital emergency department in Lille, France. The study demonstrates how time-series analysis can be used to forecast, at least in the short term, demand for emergency services in a hospital emergency department. The forecasts were based on daily patient attendances at the paediatric emergency department in Lille regional hospital centre, France, from January 2012 to December 2012. An autoregressive integrated moving average (ARIMA) method was applied separately to each of the two GEMSA categories and total patient attendances. Time-series analysis was shown to provide a useful, readily available tool for forecasting emergency department demand.

  14. A New Hybrid Methodology for Nonlinear Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Mehdi Khashei

    2011-01-01

    Full Text Available Artificial neural networks (ANNs) are flexible computing frameworks and universal approximators that can be applied to a wide range of forecasting problems with a high degree of accuracy. However, using ANNs to model linear problems has yielded mixed results, and hence it is not wise to apply them blindly to any type of data. This is the reason that hybrid methodologies combining linear models such as ARIMA and nonlinear models such as ANNs have been proposed in the time series forecasting literature. Despite all the advantages of the traditional methodologies for combining ARIMA and ANNs, they rely on assumptions that will degrade their performance if the opposite situation occurs. In this paper, a new methodology is proposed to combine ANNs with ARIMA in order to overcome the limitations of traditional hybrid methodologies and yield more general and more accurate hybrid models. Empirical results with the Canadian lynx data set indicate that the proposed methodology can be a more effective way to combine linear and nonlinear models than traditional hybrid methodologies. Therefore, it can be applied as an appropriate alternative methodology for hybridization in the time series forecasting field, especially when higher forecasting accuracy is needed.
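
    The record does not give the exact combination scheme, so the sketch below only shows the classic additive baseline that such hybrids refine: fit ARIMA, model its residuals with a small neural network, and sum the two one-step forecasts (the data and settings are illustrative, not the Canadian lynx series).

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from sklearn.neural_network import MLPRegressor

      # Toy nonlinear series standing in for the real data
      t = np.arange(300)
      y = 10 + np.sin(2 * np.pi * t / 19) * (2 + 0.5 * np.sin(2 * np.pi * t / 5)) \
          + np.random.normal(0, 0.2, t.size)

      # Step 1: linear component with ARIMA; keep its in-sample residuals
      arima_fit = ARIMA(y, order=(2, 0, 1)).fit()
      resid = arima_fit.resid

      # Step 2: nonlinear component -- an MLP models the residuals from their own lags
      m = 4
      X = np.array([resid[i:i + m] for i in range(len(resid) - m)])
      z = resid[m:]
      mlp = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, z)

      # Hybrid one-step forecast = ARIMA forecast + predicted residual
      linear_part = arima_fit.forecast(steps=1)[0]
      nonlinear_part = mlp.predict(resid[-m:].reshape(1, -1))[0]
      print(linear_part + nonlinear_part)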

  15. Intermittency and multifractional Brownian character of geomagnetic time series

    Directory of Open Access Journals (Sweden)

    G. Consolini

    2013-07-01

    Full Text Available The Earth's magnetosphere exhibits a complex behavior in response to solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  16. Exponential smoothing for financial time series data forecasting

    Directory of Open Access Journals (Sweden)

    Kuzhda, Tetyana Ivanivna

    2014-05-01

    Full Text Available The article begins with the formulation of a predictive-learning approach called exponential smoothing forecasting. Exponential smoothing is commonly applied to financial markets such as the stock, bond, foreign exchange, insurance, credit, and primary and secondary markets. Exponential smoothing models are useful in providing valuable decision information for investors. Simple and double exponential smoothing are the two basic types of exponential smoothing method. The simple exponential smoothing method is suitable for financial time series forecasting over a specified time period; it weights past observations with exponentially decreasing weights to forecast future values. Double exponential smoothing is a refinement of the simple exponential smoothing model that adds another component designed to take into account any trend in the data. Measurement of forecast accuracy is described in this article. Finally, the quantitative forecast of the price per common share using simple exponential smoothing is calculated, and applied recommendations concerning determination of the price-per-common-share forecast using double exponential smoothing are presented.
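
    A hedged sketch of both variants with statsmodels, using a synthetic share-price series in place of the article's example, is given below; the smoothing constant is chosen arbitrarily for the simple model and optimized automatically for Holt's double (trend-corrected) model.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.holtwinters import SimpleExpSmoothing, Holt

      # Synthetic weekly price per common share with a mild upward trend
      price = pd.Series(50 + 0.3 * np.arange(104) + np.random.normal(0, 1.5, 104))

      # Simple exponential smoothing: level only, no trend
      simple_fc = SimpleExpSmoothing(price).fit(smoothing_level=0.3, optimized=False).forecast(4)

      # Double exponential smoothing (Holt's method): level plus trend component
      double_fc = Holt(price).fit().forecast(4)

      print(simple_fc.round(2))
      print(double_fc.round(2))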

  17. A new complexity measure for time series analysis and classification

    Science.gov (United States)

    Nagaraj, Nithin; Balasubramanian, Karthi; Dey, Sutirth

    2013-07-01

    Complexity measures are used in a number of applications, including extraction of information from data such as ecological time series, detection of non-random structure in biomedical signals, testing of random number generators, language recognition, and authorship attribution. Different complexity measures proposed in the literature, such as Shannon entropy, relative entropy, Lempel-Ziv, Kolmogorov and algorithmic complexity, are mostly ineffective in analyzing short sequences that are further corrupted with noise. To address this problem, we propose a new complexity measure, ETC, defined as the "Effort To Compress" the input sequence by a lossless compression algorithm. Here, we employ the lossless compression algorithm known as Non-Sequential Recursive Pair Substitution (NSRPS) and define ETC as the number of iterations needed for NSRPS to transform the input sequence into a constant sequence. We demonstrate the utility of ETC in two applications. ETC is shown to have better correlation with the Lyapunov exponent than Shannon entropy, even with relatively short and noisy time series. The measure also has a greater rate of success in automatic identification and classification of short noisy sequences, compared to entropy and a popular measure based on Lempel-Ziv compression (implemented by Gzip).
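
    A compact sketch of the ETC computation follows, assuming a sequence of non-negative integer symbols. Each NSRPS iteration replaces the most frequent adjacent pair with a new symbol, and ETC is the number of iterations until the sequence becomes constant. Pair frequencies are counted in a simple overlapping manner here, which is a simplification of the original non-overlapping counting.

    def etc(seq):
        """Effort To Compress via iterated pair substitution (NSRPS-style)."""
        seq = list(seq)
        steps = 0
        next_symbol = max(seq) + 1          # assumes non-negative integer symbols
        while len(seq) > 1 and len(set(seq)) > 1:
            # Find the most frequent adjacent pair
            counts = {}
            for a, b in zip(seq, seq[1:]):
                counts[(a, b)] = counts.get((a, b), 0) + 1
            best = max(counts, key=counts.get)
            # Replace non-overlapping occurrences of that pair with a new symbol
            new_seq, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == best:
                    new_seq.append(next_symbol)
                    i += 2
                else:
                    new_seq.append(seq[i])
                    i += 1
            seq = new_seq
            next_symbol += 1
            steps += 1
        return steps

    print(etc([0, 1, 0, 1, 1, 0, 1, 0]))    # small binary example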

  18. Financial time series prediction using spiking neural networks.

    Science.gov (United States)

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks (a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison, three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, which in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.
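
    The Polychronous Spiking Network itself is too involved for a short sketch, but the simplest benchmark named above, a linear predictor fitted by least squares, is easy to illustrate for 5-step-ahead prediction. The model order and the synthetic series below are illustrative assumptions, not the data used in the paper.

    import numpy as np

    def fit_linear_predictor(x, p=10):
        """Least-squares AR(p) coefficients for one-step-ahead prediction."""
        X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
        coeffs, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
        return coeffs

    def predict_k_steps(x, coeffs, k=5):
        """Iterate the one-step predictor k times (5-step-ahead, as in the record)."""
        history, p = list(x), len(coeffs)
        for _ in range(k):
            history.append(float(np.dot(coeffs, history[-p:])))
        return history[-k:]

    rng = np.random.default_rng(2)
    series = np.cumsum(rng.normal(size=1000))     # placeholder for a price series
    print(predict_k_steps(series, fit_linear_predictor(series)))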

  19. Financial time series prediction using spiking neural networks.

    Directory of Open Access Journals (Sweden)

    David Reid

    Full Text Available In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks (a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network) and a standard Linear Predictor Coefficients model. For this comparison, three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, which in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  20. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air or most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis in the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of the predictions was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best-fitting model, with a MAPE of 3.704%, indicating that the predictions are very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
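
    A minimal sketch of the workflow reported above, fitting ARIMA(3,1,1) and scoring a hold-out period with MAPE, is shown below. The Malaysian gold-production figures are not included in this record, so a placeholder monthly series stands in for them.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(3)
    production = 100 + np.cumsum(rng.normal(0.5, 2.0, size=120))   # placeholder data

    train, test = production[:-12], production[-12:]
    fitted = ARIMA(train, order=(3, 1, 1)).fit()
    forecast = fitted.forecast(steps=len(test))

    # Mean absolute percentage error over the hold-out year
    mape = np.mean(np.abs((test - forecast) / test)) * 100
    print(f"MAPE: {mape:.3f}%")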