WorldWideScience

Sample records for multi-city time-series analysis

  1. Air pollution and emergency department visits for cardiac and respiratory conditions: a multi-city time-series analysis

    Directory of Open Access Journals (Sweden)

    Rowe Brian H

    2009-06-01

Full Text Available Abstract Background Relatively few studies have been conducted of the association between air pollution and emergency department (ED) visits, and most of these have been based on a small number of visits, for a limited number of health conditions and pollutants, and only daily measures of exposure and response. Methods A time-series analysis was conducted on nearly 400,000 ED visits to 14 hospitals in seven Canadian cities during the 1990s and early 2000s. Associations were examined between carbon monoxide (CO), nitrogen dioxide (NO2), ozone (O3), sulfur dioxide (SO2), and particulate matter (PM10 and PM2.5), and visits for angina/myocardial infarction, heart failure, dysrhythmia/conduction disturbance, asthma, chronic obstructive pulmonary disease (COPD), and respiratory infections. Daily and 3-hourly visit counts were modeled as quasi-Poisson, and analyses controlled for effects of temporal cycles, weather, day of week and holidays. Results 24-hour average concentrations of CO and NO2 at lag 0 days exhibited the most consistent associations with cardiac conditions (2.1% (95% CI, 0.0–4.2%) and 2.6% (95% CI, 0.2–5.0%) increases in visits for myocardial infarction/angina per 0.7 ppm CO and 18.4 ppb NO2 respectively; 3.8% (95% CI, 0.7–6.9%) and 4.7% (95% CI, 1.2–8.4%) increases in visits for heart failure). Ozone (lag 2 days) was most consistently associated with respiratory visits (3.2% (95% CI, 0.3–6.2%) and 3.7% (95% CI, -0.5–7.9%) increases in asthma and COPD visits respectively per 18.4 ppb). Associations tended to be of greater magnitude during the warm season (April–September). In particular, the associations of PM10 and PM2.5 with asthma visits were respectively nearly three- and over fourfold larger vs. all-year analyses (14.4% increase in visits, 95% CI, 0.2–30.7, per 20.6 μg/m3 PM10, and 7.6% increase in visits, 95% CI, 5.1–10.1, per 8.2 μg/m3 PM2.5). No consistent associations were observed between three-hour average pollutant
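
    The modeling step described above (daily visit counts modeled as quasi-Poisson, with control for temporal cycles, weather, day of week and holidays) can be sketched roughly as follows. This is an illustrative sketch only, not the authors' code; the data frame columns (date, visits, no2, temperature) and the simple seasonal controls are assumptions.

```python
# Hedged sketch of a quasi-Poisson time-series regression relating daily ED
# visit counts to a lagged pollutant; column names and covariates are assumed.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def pollutant_risk(df, pollutant="no2", lag=0):
    d = df.sort_values("date").copy()
    d["exposure"] = d[pollutant].shift(lag)            # lagged exposure
    doy = d["date"].dt.dayofyear
    X = pd.DataFrame({
        "exposure": d["exposure"],
        "temp": d["temperature"],
        "sin_year": np.sin(2 * np.pi * doy / 365.25),  # crude seasonal control
        "cos_year": np.cos(2 * np.pi * doy / 365.25),
    })
    dow = pd.get_dummies(d["date"].dt.dayofweek, prefix="dow",
                         drop_first=True, dtype=float)
    X = sm.add_constant(pd.concat([X, dow], axis=1)).dropna()
    y = d.loc[X.index, "visits"]
    # scale="X2" estimates overdispersion from the Pearson chi-square (quasi-Poisson)
    res = sm.GLM(y, X, family=sm.families.Poisson()).fit(scale="X2")
    iqr = d["exposure"].quantile(0.75) - d["exposure"].quantile(0.25)
    pct = 100 * (np.exp(res.params["exposure"] * iqr) - 1)
    return res, pct  # percent increase in visits per IQR increase in pollutant
```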

  2. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate...... commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may...... choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  3. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  4. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  5. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher

    1989-01-01

    .... A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  6. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...

  7. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  8. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  9. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  10. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  11. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  12. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatigue during one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the time dependence of the Shannon entropy. The analysis of time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self-organization) at about 0.01 s.
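
    A rough illustration of the windowed Shannon-entropy computation described above; the sampling rate, window length and bin count are assumptions, not the authors' settings.

```python
# Illustrative sketch: Shannon entropy of a surface-EMG signal computed over
# sliding windows, so its time dependence can be inspected.
import numpy as np

def shannon_entropy(x, bins=64):
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_profile(emg, fs=1000, window_s=1.0, step_s=0.5):
    """Entropy of consecutive windows of an EMG record sampled at fs Hz."""
    w, s = int(window_s * fs), int(step_s * fs)
    starts = range(0, len(emg) - w + 1, s)
    return np.array([shannon_entropy(emg[i:i + w]) for i in starts])
```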

  13. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  14. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.
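
    The core mapping used by this method, the natural visibility graph, can be sketched as follows; this is a library-free illustration under the standard visibility criterion, not the authors' implementation.

```python
# Minimal O(n^2) sketch of the natural visibility graph: each sample becomes a
# node, and two samples are linked if they can "see" each other over the
# intermediate values.
import itertools
import numpy as np

def visibility_graph(y):
    n = len(y)
    edges = set()
    for a, b in itertools.combinations(range(n), 2):
        visible = all(
            y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
            for c in range(a + 1, b)
        )
        if visible:
            edges.add((a, b))
    return edges

# Example: segments of a series can each be mapped to a graph, and the
# successive graphs linked, giving the "network of networks" view.
series = np.random.default_rng(0).standard_normal(200)
segment_graphs = [visibility_graph(series[i:i + 50]) for i in range(0, 200, 50)]
```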

  15. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.

  16. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  17. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to make an effective forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)] pattern recognition. Our goal is to find and recognize important patterns which repeatedly appear in the market history to adapt our trading system behaviour based on them.

  18. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of this notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  19. Time series analysis of barometric pressure data

    International Nuclear Information System (INIS)

    La Rocca, Paola; Riggi, Francesco; Riggi, Daniele

    2010-01-01

    Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.

  20. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance Chaos theory). However, mathematics used in NLTS is much simpler than many other subjec...

  1. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  2. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  3. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  4. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson, Statistical Analysis of Time Series; T. S. Arthanari & Yadolah Dodge, Mathematical Programming in Statistics; Emil Artin, Geometric Algebra; Norman T. J. Bailey, The Elements of Stochastic Processes with Applications to the Natural Sciences; George

  5. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  6. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties such as number of active nodes, average degree, clustering coefficient etc. and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive Integrated Moving Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
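
    A minimal sketch of the forecasting step described above, using an ARIMA model from statsmodels on a per-snapshot property series; the series name and model order are illustrative assumptions, not the paper's choices.

```python
# Sketch: forecast one temporal-network property (e.g., number of active nodes
# per snapshot) a few snapshots ahead with an ARIMA model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def forecast_property(active_nodes, order=(2, 1, 1), horizon=5):
    model = ARIMA(np.asarray(active_nodes, dtype=float), order=order)
    fitted = model.fit()
    return fitted.forecast(steps=horizon)  # predicted values for future snapshots

# Cross-validation as in the paper could be approximated by refitting on a
# growing window and comparing forecasts against held-out snapshots.
```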

  7. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  8. Time Series Analysis Using Geometric Template Matching.

    Science.gov (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.

  9. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

    Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and reasonable divertor life time. In order to control ELMs, the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time-series. In turn, it means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  10. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through the cutting experiment and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to the dynamic model of cutting is introduced as the feature of diagnosis. Consequently, it is found that the early tool wear state (i.e. flank wear under 40 µm) can be monitored, and also the optimal tool exchange time and the tool wear state for actual turning machining can be judged by this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by the monitoring of the residual error.

  11. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series, and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that we do not have a finite set of methods. First of all, this is a model for transformation of values of time series, which prepares data used by different sets of methods based on the same model of transformation in a domain of problem space. The REFII model gives a new approach to time series analysis based on a unique model of transformation, which is a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  12. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  13. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  14. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This work applies stochastic time series analysis to hydrology data and seasonal stages, using different statistical tests for predicting the hydrology time series with the Thomas-Fiering model. Hydrologic time series of flood flow have received a great deal of consideration worldwide, and interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.

  15. Topic Time Series Analysis of Microblogs

    Science.gov (United States)

    2014-10-01

    …may be distributed more globally. Tweets on a specific topic that cluster spatially, temporally or both might be of interest to analysts, marketers… of $ and @, with the latter only in the case that it is the only character in the token (the @ symbol is significant in its usage by Instagram)… is generated by Instagram. Topic 80, Distance: 143.2101. Top words: 1. rawr 2. ^0^ 3. kill 4. jurassic 5. dinosaur. Analysis: This topic is quite…

  16. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  17. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl

    2017-12-01

    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets, currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting to exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. Distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided that shows the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is found to be narrow, at the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is found to be more substantial, reaching as high as above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic returns and the Realized Volatility, the sum of the squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility (Andersen et al., 2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers using 10-day moving window sampling of daily return predictors is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximate capture of the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short Term Memory) techniques from deep learning, may be necessary for higher prediction accuracy.
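
    The realized-volatility construction and the Heterogeneous Autoregressive (HAR) regression mentioned above can be sketched as below; the input series, its 5-minute sampling and the use of plain OLS are assumptions, not the paper's exact procedure.

```python
# Hedged sketch: daily realized volatility from 5-minute prices, then a
# HAR-style regression on daily, weekly and monthly RV averages.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def realized_volatility(price):
    """price: 5-minute price series with a DatetimeIndex (hypothetical data)."""
    r = np.log(price).diff().dropna()
    return (r ** 2).resample("1D").sum()               # daily RV = sum of squared 5-min returns

def har_rv(rv):
    df = pd.DataFrame({"rv": rv})
    df["rv_d"] = df["rv"].shift(1)                      # previous day
    df["rv_w"] = df["rv"].shift(1).rolling(5).mean()    # weekly average
    df["rv_m"] = df["rv"].shift(1).rolling(22).mean()   # monthly average
    df = df.dropna()
    X = sm.add_constant(df[["rv_d", "rv_w", "rv_m"]])
    return sm.OLS(df["rv"], X).fit()                    # HAR-RV regression
```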

  18. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    Growth And Export Expansion In Mauritius - A Time Series Analysis. ... RV Sannassee, R Pearce ... Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...

  19. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.

  20. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
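
    A minimal sketch of the time averaged mean squared displacement (MSD) for a single, equally sampled trajectory, as used above; the maximum-lag choice is an assumption.

```python
# Sketch: time averaged MSD of one price or log-price trajectory x.
import numpy as np

def time_averaged_msd(x, max_lag=None):
    x = np.asarray(x, dtype=float)
    max_lag = max_lag or len(x) // 4
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])
    return lags, msd  # plot msd vs lag to compare with geometric Brownian motion
```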

  1. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  2. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  3. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  4. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  5. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz

    2005-01-01

    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method

  6. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  7. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity data, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time- as well as frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT) will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
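
    A hedged sketch of the wavelet multiresolution step: PyWavelets' DWT is used here for illustration (the MODWT used in the study would require a dedicated implementation); the input array and wavelet choice are assumptions.

```python
# Sketch: DWT multiresolution decomposition of a daily closing-price array,
# returning one reconstructed detail series per resolution level.
import numpy as np
import pywt

def multiresolution(close, wavelet="db4", level=4):
    coeffs = pywt.wavedec(close, wavelet, level=level)
    details = []
    for i in range(1, len(coeffs)):
        # keep one detail level, zero everything else, then reconstruct
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        details.append(pywt.waverec(kept, wavelet)[:len(close)])
    return details
```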

  8. The Photoplethysmographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia

    2001-01-01

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method, and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and also for estimating the signal's stochastic components

  9. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. (e-mail: mhanias@teihal.gr); Giannaris, G.; Spyridakis, A.; Rigas, A. [all: TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)]

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min, respectively, were calculated. Also the corresponding Kolmogorov entropy was calculated.

  10. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min, respectively, were calculated. Also the corresponding Kolmogorov entropy was calculated
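
    The Grassberger-Procaccia correlation sum referenced in the two records above can be sketched as follows; the embedding dimension, delay and radii are placeholder choices, not the authors' values.

```python
# Sketch: delay-embed a scalar series, then count pairs of embedded points
# closer than r; the slope of log C(r) vs log r estimates the correlation dimension.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_sum(series, m=3, tau=1, radii=None):
    x = np.asarray(series, dtype=float)
    n = len(x) - (m - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
    d = pdist(emb)                                    # pairwise distances
    radii = radii if radii is not None else np.logspace(-2, 0, 20) * d.max()
    C = np.array([(d < r).mean() for r in radii])
    return radii, C
```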

  11. Time series analysis of monthly pulpwood use in the Northeast

    Science.gov (United States)

    James T. Bones

    1980-01-01

    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.

  12. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  13. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
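
    A rough sketch of the running Mann-Whitney procedure described above, comparing each moving window against a preceding baseline window and converting U to Z by the normal approximation; the window lengths are assumptions.

```python
# Sketch: running Mann-Whitney U statistics over moving windows, normalized to Z.
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, baseline_len=30, window_len=30, step=1):
    x = np.asarray(series, dtype=float)
    zs = []
    for start in range(baseline_len, len(x) - window_len + 1, step):
        a = x[start - baseline_len:start]             # baseline window
        b = x[start:start + window_len]               # current window
        u, _ = mannwhitneyu(a, b, alternative="two-sided")
        mu = len(a) * len(b) / 2
        sigma = np.sqrt(len(a) * len(b) * (len(a) + len(b) + 1) / 12)
        zs.append((u - mu) / sigma)                   # normal approximation of U
    return np.array(zs)
```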

  14. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  15. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.

    2016-01-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  16. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink

    2016-01-01

    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  17. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  18. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  19. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  20. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  1. Time series analysis of ozone data in Isfahan

    Science.gov (United States)

    Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.

    2008-07-01

    Time series analysis was used to investigate the stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reason for extremely high ozone concentrations in each season. Data were converted into a seasonal component and the frequency domain; the latter was evaluated using Fast Fourier Transform (FFT) spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, most fluctuation was in 1999 and 2000, while the least fluctuation was in 2003. The best correlation between ozone and sun radiation was found in 2000. Other variables, which are not available, caused the fluctuation in 1999 and 2001. The trend of ozone is increasing in 1999 and decreasing in other years.
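
    The FFT-based periodicity search mentioned above can be sketched as follows for a daily ozone series; the input array is hypothetical, and detrending beyond mean removal is omitted.

```python
# Sketch: find dominant cycle lengths (in days) from the power spectrum of a
# daily ozone series.
import numpy as np

def dominant_periods(ozone, top=6):
    x = np.asarray(ozone, dtype=float)
    x = x - x.mean()                                  # remove the mean before the FFT
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0)            # cycles per day
    order = np.argsort(power[1:])[::-1] + 1           # skip the zero frequency
    return [1.0 / freqs[i] for i in order[:top]]      # periods in days
```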

  2. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  3. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals

  4. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  5. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    Science.gov (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach is to obtain the "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps by using the endmembers. Then, the estimated endmember is the mean value of the "purified" pixels, i.e., the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with the "separate unmixing" approach.
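
    A minimal sketch of the NNLS abundance-estimation step, assuming endmember spectra are already available (e.g., from VCA); the small matrices below are illustrative stand-ins for Landsat spectra, not data from the paper.

      import numpy as np
      from scipy.optimize import nnls

      # Illustrative endmember matrix E (bands x endmembers) and a mixed pixel spectrum y.
      E = np.array([[0.10, 0.60],
                    [0.20, 0.55],
                    [0.35, 0.40],
                    [0.50, 0.30]])          # 4 spectral bands, 2 endmembers
      y = 0.3 * E[:, 0] + 0.7 * E[:, 1]     # a synthetic mixture

      # Nonnegative least squares gives the abundance vector a >= 0 minimizing ||E a - y||.
      abundances, residual = nnls(E, y)

      # Optionally renormalize so the abundances sum to one (sum-to-one constraint).
      abundances = abundances / abundances.sum()
      print("Estimated abundances:", np.round(abundances, 3), "residual:", round(residual, 6))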

  6. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistence homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.

  7. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  8. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    Science.gov (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
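
    Not the RDE-CN reduction itself, but a minimal sketch of the underlying recurrence-plot-to-network step it builds on, using the logistic map mentioned above; the threshold and map parameters are illustrative choices.

      import numpy as np
      import networkx as nx

      # Generate a logistic-map time series in the chaotic regime (r = 4).
      n, r, x = 500, 4.0, 0.4
      series = np.empty(n)
      for i in range(n):
          series[i] = x = r * x * (1.0 - x)

      # Recurrence matrix: two time points are "recurrent" if closer than a threshold eps.
      eps = 0.05
      dist = np.abs(series[:, None] - series[None, :])
      recurrence = (dist < eps).astype(int)
      np.fill_diagonal(recurrence, 0)  # ignore self-recurrences

      # Reinterpret the recurrence matrix as the adjacency matrix of a complex network.
      G = nx.from_numpy_array(recurrence)
      degrees = np.array([d for _, d in G.degree()])
      print("mean degree:", degrees.mean(), "transitivity:", round(nx.transitivity(G), 3))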

  9. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  10. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most existing research, which focuses on single futures contracts and lacks comparison across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Besides the basic statistical analysis, the paper used the GARCH and EGARCH models to describe the time series that exhibited the ARCH effect and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series was non-normal, leptokurtic and heavy-tailed. The study also found that two parts of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. Using the autoregressive distributed lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are similar to those of mature futures markets abroad as a whole; on the other hand, they reflect some shortcomings of the Chinese futures market, such as its immaturity and over-regulation by the government.
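
    A minimal sketch of fitting GARCH and EGARCH models of this kind, assuming the Python arch package and an illustrative return series in place of the wheat futures data.

      import numpy as np
      import pandas as pd
      from arch import arch_model

      # Illustrative daily returns (percent); in practice use the wheat futures reward series.
      rng = np.random.default_rng(1)
      returns = pd.Series(rng.standard_t(df=5, size=750))

      # GARCH(1,1) with an AR(1) mean, capturing volatility clustering (the ARCH effect).
      garch = arch_model(returns, mean="AR", lags=1, vol="GARCH", p=1, q=1)
      garch_fit = garch.fit(disp="off")
      print(garch_fit.summary())

      # EGARCH(1,1) additionally allows asymmetric (leverage) responses to shocks.
      egarch = arch_model(returns, mean="AR", lags=1, vol="EGARCH", p=1, o=1, q=1)
      print(egarch.fit(disp="off").params)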

  11. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  12. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show

  13. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  14. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  15. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large-scale application to paleo-data.
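
    A minimal sketch of the Gaussian-kernel idea for estimating autocorrelation from irregularly sampled data; the bandwidth, lag and test series are illustrative, and this is not the authors' exact implementation.

      import numpy as np

      def kernel_acf(t, x, lag, bandwidth):
          """Gaussian-kernel estimate of the autocorrelation at a given lag
          for an irregularly sampled series (t, x)."""
          x = (x - x.mean()) / x.std()
          dt = t[:, None] - t[None, :]                 # all pairwise time differences
          w = np.exp(-0.5 * ((dt - lag) / bandwidth) ** 2)
          np.fill_diagonal(w, 0.0)                     # exclude zero-lag self pairs
          return np.sum(w * np.outer(x, x)) / np.sum(w)

      # Illustrative irregular sampling of a slowly oscillating signal plus noise.
      rng = np.random.default_rng(2)
      t = np.sort(rng.uniform(0, 500, 400))
      x = np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.3, t.size)

      print("kernel ACF at lag 1:", round(kernel_acf(t, x, lag=1.0, bandwidth=0.5), 3))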

  16. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  17. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January, 2012 to March, 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis result can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology and has been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.

  18. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element and unaffected by air or most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial application, dentistry and medical applications. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's gold production in the future. The Box-Jenkins time series method was used to perform the time series analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA (3,1,1) model was found to be the best fitted model with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors to understand the gold production scenario and later plan gold mining activities in Malaysia.
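
    A minimal sketch of the Box-Jenkins fit and MAPE check described above, assuming statsmodels and an illustrative monthly series in place of the Malaysian gold production data.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Illustrative monthly production series; in practice load the gold production data.
      rng = np.random.default_rng(3)
      production = 100 + np.cumsum(rng.normal(0.5, 2.0, 120))

      # Hold out the last 12 months to test forecast accuracy.
      train, test = production[:-12], production[-12:]

      # Box-Jenkins ARIMA(3,1,1): 3 AR terms, first differencing, 1 MA term.
      model = ARIMA(train, order=(3, 1, 1)).fit()
      forecast = model.forecast(steps=12)

      # Mean absolute percentage error of the 12-month forecast.
      mape = np.mean(np.abs((test - forecast) / test)) * 100
      print(f"MAPE = {mape:.3f}%")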

  19. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
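
    Astropy provides an implementation of this Bayesian Blocks algorithm; a minimal sketch for event data follows, with simulated photon arrival times standing in for real observations.

      import numpy as np
      from astropy.stats import bayesian_blocks

      # Simulated photon arrival times: a constant background plus a burst around t = 50.
      rng = np.random.default_rng(4)
      background = rng.uniform(0, 100, 200)
      burst = rng.normal(50, 1.0, 80)
      t = np.sort(np.concatenate([background, burst]))

      # Optimal piecewise-constant segmentation for event (point) data.
      edges = bayesian_blocks(t, fitness="events", p0=0.01)
      print("Change-point edges:", np.round(edges, 2))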

  20. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji

    2010-01-01

    The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because such volumetric reports along the course of an individual's aging have scarcely been presented. The subjects analyzed were brain images of 2 healthy males and 18 females of average age 69.0 y, for which T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a time series of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As the template for the regions, the standard gray matter atlas (icbn452 atlas probability gray) and its labeled version (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for standardization of individual brains. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Volumes of regions were calculated as their voxel ratio to the whole brain voxel count in percent. It was found that the regional volumes decreased with aging in all of the above lobes and the cerebellum, by average percentages per year of -0.11, -0.07, -0.04, -0.02, and -0.03, respectively. The procedure for calculation of the regional volumes, which has hitherto been operated manually, can be conducted automatically for individual brains using the standard atlases above. (T.T.)

  1. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  2. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  3. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

    Full Text Available The aim of this analysis is to forecast a mini-market's sales volume for the period of twelve months from August 2015 to August 2016. The study is based on the monthly sales, in Iraqi Dinars, of a private local mini-market from April 2014 to July 2015. As the graph reveals, and assuming the stagnant economic conditions continue, the trend of future sales is downward. Based on time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, low profit margin and operating expenses, the revenues may not be adequate to produce positive net income and the business may not be able to operate afterward. The principal question arising from this is that forecasting sales in the region will be difficult where the business cycle is so dynamic and volatile, owing to systematic risks and an unforeseeable future.

  4. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has a wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering the important nodes than the common aggregating method.
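
    Not the supra-evolution construction itself, but a minimal sketch of the low-level step it reduces to: computing a leading eigenvector by power iteration. The small adjacency matrix below is an illustrative placeholder, not data from the paper.

      import numpy as np

      def power_iteration(A, tol=1e-10, max_iter=1000):
          """Leading eigenvector of a nonnegative matrix A by power iteration."""
          v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
          for _ in range(max_iter):
              w = A @ v
              w /= np.linalg.norm(w)
              if np.linalg.norm(w - v) < tol:
                  break
              v = w
          return v

      # Illustrative adjacency matrix of a small (single-layer) network.
      A = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 1],
                    [1, 1, 0, 0],
                    [0, 1, 0, 0]], dtype=float)

      centrality = power_iteration(A)
      print("Eigenvector centrality:", np.round(centrality / centrality.sum(), 3))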

  5. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  6. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
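
    A rough sketch of the coarse-graining and template-counting steps behind RCMSE, on a white-noise test series; the parameter choices (m = 2, r = 0.15 of the standard deviation) are conventional defaults, not taken from the paper, and the counting is a simplified variant.

      import numpy as np

      def count_matches(y, m, r):
          """Number of template pairs of length m within Chebyshev distance r."""
          n = len(y) - m
          templates = np.array([y[i:i + m] for i in range(n)])
          count = 0
          for i in range(n - 1):
              dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
              count += np.sum(dist <= r)
          return count

      def rcmse(x, scale, m=2, r=0.15):
          """Refined composite multiscale entropy at one scale (r as a fraction of std)."""
          r_abs = r * np.std(x)
          n_m, n_m1 = 0, 0
          for k in range(scale):                       # all coarse-graining offsets
              trimmed = x[k:]
              n_pts = len(trimmed) // scale
              coarse = trimmed[:n_pts * scale].reshape(n_pts, scale).mean(axis=1)
              n_m += count_matches(coarse, m, r_abs)
              n_m1 += count_matches(coarse, m + 1, r_abs)
          return -np.log(n_m1 / n_m) if n_m > 0 and n_m1 > 0 else np.inf

      # Illustrative white-noise series; its entropy should fall with increasing scale.
      rng = np.random.default_rng(5)
      x = rng.normal(size=1000)
      print([round(rcmse(x, s), 3) for s in (1, 2, 3, 4, 5)])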

  7. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in biological sciences known as the Multifractal Detrended Fluctuation Analysis has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  8. Interrupted time-series analysis: studying trends in neurosurgery.

    Science.gov (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
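
    A minimal sketch of the segmented-regression form of ITSA (one intervention, with a level and a slope change), assuming statsmodels and simulated monthly data rather than any neurosurgical series.

      import numpy as np
      import statsmodels.api as sm

      # Simulated monthly outcome: 36 months pre-intervention, 24 months post.
      rng = np.random.default_rng(6)
      n_pre, n_post = 36, 24
      t = np.arange(n_pre + n_post)
      post = (t >= n_pre).astype(float)               # indicator of the post-intervention period
      time_after = np.where(post == 1, t - n_pre, 0)  # months elapsed since the intervention
      y = 50 + 0.2 * t - 5 * post - 0.4 * time_after + rng.normal(0, 1.5, t.size)

      # Interrupted time-series (segmented) regression:
      #   baseline trend, immediate level change, and change in trend after the intervention.
      X = sm.add_constant(np.column_stack([t, post, time_after]))
      fit = sm.OLS(y, X).fit()
      print(fit.params)   # [intercept, pre-trend, level change, trend change]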

  9. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  10. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley

    2016-01-01

    Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract)

  11. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  12. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  13. Time Series Analysis of the Quasar PKS 1749+096

    Science.gov (United States)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.

  14. Modeling activity patterns of wildlife using time-series analysis.

    Science.gov (United States)

    Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo

    2017-04-01

    The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda ( Ailuropoda melanoleuca ). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.

  15. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    International Nuclear Information System (INIS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-01-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  16. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J., E-mail: frederic.auchere@ias.u-psud.fr [Institut d’Astrophysique Spatiale, CNRS, Univ. Paris-Sud, Université Paris-Saclay, Bât. 121, F-91405 Orsay (France)

    2016-07-10

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  17. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the underlying relationship between the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both the long-term and short-term predictions prove that the model is effective and can achieve the expected accuracy.
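
    A rough sketch of the plain GM(1,1) step mentioned above, before any adaptive improvement or error correction; the short series is an illustrative placeholder for track-irregularity measurements.

      import numpy as np

      def gm11_forecast(x0, steps):
          """Basic GM(1,1) grey model: fit on series x0 and forecast `steps` values ahead."""
          n = len(x0)
          x1 = np.cumsum(x0)                                  # accumulated generating operation
          z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
          B = np.column_stack([-z1, np.ones(n - 1)])
          Y = x0[1:]
          a, b = np.linalg.lstsq(B, Y, rcond=None)[0]         # develop coefficient, grey input

          def x1_hat(k):                                      # fitted accumulated series
              return (x0[0] - b / a) * np.exp(-a * k) + b / a

          k = np.arange(n + steps)
          fitted_x1 = x1_hat(k)
          fitted_x0 = np.concatenate([[x0[0]], np.diff(fitted_x1)])  # inverse accumulation
          return fitted_x0[n:]

      # Illustrative slowly growing irregularity index at a fixed measuring point.
      x0 = np.array([2.67, 2.74, 2.82, 2.91, 3.03, 3.12, 3.24])
      print("Next 3 forecasts:", np.round(gm11_forecast(x0, 3), 3))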

  18. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since loss of data is due to mechanical and human failure or technical problems and to the different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to depict the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to the observed ROME series, with 50 replicates. The CLMX station was then used as a proxy for allocation of these scenarios. Three different methods for monthly dataset imputation were selected: AMÉLIA II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series compared with the observed ROME series were also evaluated using several skill measures, such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was more efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% of missing data for the imputation of monthly averages, and no more than this. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed the reconstruction of 43 time series.

  19. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society
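
    A minimal sketch of generating such a chaotic time series and its histogram directly from the Lorenz equations at r = 60; the UPO escape-time weighting itself is not reproduced here.

      import numpy as np
      from scipy.integrate import solve_ivp

      def lorenz(t, state, sigma=10.0, r=60.0, b=8.0 / 3.0):
          x, y, z = state
          return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

      # Integrate past a transient, then sample the attractor densely.
      t_eval = np.linspace(0, 100, 50000)
      sol = solve_ivp(lorenz, (0, 100), [1.0, 1.0, 1.0], t_eval=t_eval, rtol=1e-8)
      z = sol.y[2, 10000:]                      # discard the initial transient

      # Histogram of the z component of the chaotic time series.
      counts, edges = np.histogram(z, bins=100, density=True)
      print("Histogram peak near z =", round(edges[np.argmax(counts)], 1))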

  20. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.

  1. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
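
    Dynamic mode decomposition (DMD) is one standard data-driven route to a finite-dimensional linear Koopman approximation; the following minimal sketch is not the authors' specific model forms, and uses an illustrative oscillatory series.

      import numpy as np

      def dmd(X, rank):
          """Exact DMD: linear operator with x_{k+1} ~ A x_k, identified via a rank-r SVD."""
          X1, X2 = X[:, :-1], X[:, 1:]                # snapshot pairs
          U, s, Vh = np.linalg.svd(X1, full_matrices=False)
          U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
          A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
          eigvals, W = np.linalg.eig(A_tilde)         # DMD (approximate Koopman) eigenvalues
          modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
          return eigvals, modes

      # Illustrative 2-state oscillatory time series arranged as columns of snapshots.
      t = np.linspace(0, 20, 400)
      X = np.vstack([np.cos(t), np.sin(t)])
      eigvals, modes = dmd(X, rank=2)
      print("DMD eigenvalues (should lie on the unit circle):", np.round(eigvals, 4))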

  2. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.

  3. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
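
    A minimal sketch of the Granger causality test, assuming statsmodels and two illustrative series standing in for PSEi and a candidate driver such as the foreign exchange rate.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.stattools import grangercausalitytests

      # Illustrative series: `driver` leads `psei` by one period plus noise.
      rng = np.random.default_rng(7)
      driver = rng.normal(size=300).cumsum()
      psei = 0.8 * np.roll(driver, 1) + rng.normal(0, 0.5, 300)
      data = pd.DataFrame({"psei": psei[1:], "driver": driver[1:]}).diff().dropna()

      # Test whether `driver` Granger-causes `psei` (second column -> first column).
      results = grangercausalitytests(data[["psei", "driver"]], maxlag=4)
      for lag, res in results.items():
          print(f"lag {lag}: F-test p-value = {res[0]['ssr_ftest'][1]:.4f}")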

  4. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify in the data series sub-periods with a similar structure. Moreover, in this article we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. This article presents two examples, the first one corresponding to seven simulated series with autoregressive structure of first order, and the second corresponding to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions
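
    For an AR(1) process, one common form of the equivalent (effective) sample size shrinks the nominal sample size according to the lag-1 autocorrelation; a minimal sketch with a simulated series follows (the persistence value is illustrative).

      import numpy as np

      # Simulate an AR(1) series with known persistence phi.
      rng = np.random.default_rng(8)
      n, phi = 500, 0.7
      x = np.zeros(n)
      for i in range(1, n):
          x[i] = phi * x[i - 1] + rng.normal()

      # Estimate the lag-1 autocorrelation.
      x_c = x - x.mean()
      rho1 = np.sum(x_c[1:] * x_c[:-1]) / np.sum(x_c ** 2)

      # Equivalent (effective) sample size for an AR(1) process.
      n_eff = n * (1 - rho1) / (1 + rho1)
      print(f"lag-1 autocorrelation = {rho1:.3f}, equivalent sample size = {n_eff:.1f} of {n}")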

  5. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  6. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  7. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method is used to examine the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report the results for the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain similarity results for 10 stock indices in three regions, showing that the method can distinguish markets from different regions in the resulting phylogenetic trees. The results show that we can extract satisfactory information from financial markets with this method. The information categorization method can be used not only for physiologic time series, but also for financial time series.

  8. Mortality on extreme heat days using official thresholds in Spain: a multi-city time series analysis

    Directory of Open Access Journals (Sweden)

    Tobias Aurelio

    2012-02-01

    Full Text Available Abstract Background The 2003 heat wave had a high impact on mortality in Europe, which made it necessary to develop heat health watch warning systems. In Spain this was carried out by the Ministry of Health in 2004, based on the exceedance of city-specific simultaneous thresholds of minimum and maximum daily temperatures. The aim of this study is to assess the effectiveness of the official thresholds established by the Ministry of Health for each provincial capital city, by quantifying and comparing the short-term effects of above-threshold days on total daily mortality. Methods Total daily mortality and minimum and maximum temperatures for the 52 provincial capitals in Spain were collected during the summer months (June to September) for the study period 1995-2004. Data were analysed using GEE for Poisson regression. The Relative Risk (RR) of total daily mortality was quantified for the current day on which official thresholds were exceeded. Results The number of days on which the thresholds were exceeded shows great inconsistency, with provinces with a great number of exceeded days adjacent to provinces that rarely or never exceeded them. The average overall excess risk of dying during an extreme heat day was about 25% (RR = 1.24; 95% confidence interval (CI) = [1.19-1.30]). Relative risks showed significant heterogeneity between cities (I2 = 54.9%). A western location and low mean summer temperatures were associated with higher relative risks, suggesting thresholds may have been set too high in these areas. Conclusions This study confirmed that extreme heat days have a considerable impact on total daily mortality in Spain. Official thresholds gave consistent relative risks in the large capital cities. However, in some other cities thresholds
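
    A stripped-down sketch of the kind of GEE Poisson regression described in the Methods is given below: a 0/1 heat-day indicator in a city-day panel, with exp(beta) read as the relative risk. All column names and the synthetic panel are hypothetical, and the study's additional controls (temporal cycles, weather, day of week) are omitted for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.genmod.cov_struct import Exchangeable

# Hypothetical city-day panel: daily deaths, a heat-day indicator and max temperature.
rng = np.random.default_rng(2)
rows = []
for city in range(10):
    for day in range(1200):
        heat = rng.random() < 0.05
        temp = rng.normal(28, 4) + (6 if heat else 0)
        mu = np.exp(1.5 + 0.22 * heat + 0.01 * (temp - 28))
        rows.append((city, rng.poisson(mu), int(heat), temp))
df = pd.DataFrame(rows, columns=["city", "deaths", "above_threshold", "temp_max"])

X = sm.add_constant(df[["above_threshold", "temp_max"]].astype(float))
model = sm.GEE(df["deaths"], X, groups=df["city"],
               family=sm.families.Poisson(), cov_struct=Exchangeable())
res = model.fit()
beta, se = res.params["above_threshold"], res.bse["above_threshold"]
print("RR =", np.exp(beta), "95% CI:", np.exp([beta - 1.96 * se, beta + 1.96 * se]))
```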

  9. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    Science.gov (United States)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
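
    The lagged relationships reported above (e.g. size versus velocity with a lag under 12 hours) can be probed with a plain lagged Pearson correlation. The helper below is a generic sketch, not the authors' pipeline; it assumes roughly stationary series standardized over their full length.

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Correlation of x(t) with y(t + lag) for lag = -max_lag .. +max_lag."""
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    result = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            result[lag] = np.mean(x[:-lag] * y[lag:])
        elif lag < 0:
            result[lag] = np.mean(x[-lag:] * y[:lag])
        else:
            result[lag] = np.mean(x * y)
    return result

# The lag with the largest |correlation| indicates the apparent lead/lag time.
```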

  10. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki

    2008-01-01

    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed the Coarse Mesh Projection Method (CMPM) and can be implemented using coarse statistical bins for the acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit from the signal processing discipline and the neutron multiplication eigenvalue problem from the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three-dimensional spatial modeling of the initial core of a pressurized water reactor.

  11. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being the synthetic rubber derived from petroleum. Similarly to what happens with countless other products, the forecasting of future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg) in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber in the Jul/2006 - Jun/2007 period, based on the estimated models. The models studied were those belonging to the ARIMA family. The main results were: the domestic market for natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best fit to the time series of natural rubber prices (R$/kg); and the forecasts obtained for the series provided statistically adequate fits.
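
    As a hedged illustration of the ARIMA(1,1,1) fit and the 12-month forecast horizon mentioned above, the snippet below uses statsmodels on a synthetic monthly series standing in for the observed R$/kg prices (the real data are not reproduced here).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for the monthly price series, Jan/1999 - Jun/2006.
rng = np.random.default_rng(3)
idx = pd.date_range("1999-01-01", "2006-06-01", freq="MS")
prices = pd.Series(2.0 + np.cumsum(rng.normal(0.01, 0.08, idx.size)), index=idx)

fit = ARIMA(prices, order=(1, 1, 1)).fit()      # ARIMA(p=1, d=1, q=1)
print(fit.summary())

forecast = fit.get_forecast(steps=12)            # Jul/2006 - Jun/2007 horizon
print(forecast.predicted_mean)
print(forecast.conf_int())
```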

  12. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze short-range and long-range characteristics of financial time series, and then applies this method to the time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithmic distribution of MFDFA of five properties of four stock indices, the 3MPAR method shows some fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of the five properties of the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in some research works that use only one property of the time series.

  13. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A

    2005-01-01

    We analyzed databases with gait time series of adults and persons with Parkinson's, Huntington's and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponent of these series does not show clear tendencies; we suggest that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra width and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series the multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path, with sensors on both feet, so there is one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We tried to find differences between the two time series that can be used as indicators of equilibrium problems

  14. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

    We analyzed databases with gait time series of adults and persons with Parkinson's, Huntington's and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponent of these series does not show clear tendencies; we suggest that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra width and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series the multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path, with sensors on both feet, so there is one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We tried to find differences between the two time series that can be used as indicators of equilibrium problems.

  15. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.

  16. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.

    2011-01-01

    This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do this. The results suggest that output, real electricity prices and a UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand, and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and -0.16 respectively, with an increasing (but at a decreasing rate) UEDT, and based on the estimated equation and different forecast assumptions, it is predicted that Turkish industrial electricity demand will be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.
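
    The structural time series idea (a stochastic trend playing the role of the UEDT alongside output and price terms) can be sketched with statsmodels' unobserved-components model. The example below runs on synthetic log-level data with made-up elasticities; it is an assumption-laden illustration, not the authors' specification.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic annual data (49 years): log demand driven by log output, log price
# and a slowly drifting trend standing in for the UEDT.
rng = np.random.default_rng(4)
n = 49
log_output = np.cumsum(rng.normal(0.03, 0.02, n))
log_price = np.cumsum(rng.normal(0.0, 0.05, n))
drift = np.cumsum(rng.normal(0.01, 0.005, n))
log_demand = 0.15 * log_output - 0.16 * log_price + drift + rng.normal(0, 0.01, n)

exog = np.column_stack([log_output, log_price])
model = sm.tsa.UnobservedComponents(log_demand, level="local linear trend", exog=exog)
result = model.fit(disp=False)

print(result.params)                   # exog coefficients ~ elasticities in a log-log setup
uedt_estimate = result.level.smoothed  # smoothed stochastic level (UEDT analogue)
```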

  17. A unified nonlinear stochastic time series analysis for climate science.

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S

    2017-03-13

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  18. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
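
    Of the techniques listed, the CUSUM is the easiest to sketch directly: a tabular CUSUM run over standardized inventory differences, alarming once the cumulative drift exceeds a decision interval. Parameter names and the textbook choices k = 0.5 and h = 5 (in standard-deviation units) are illustrative, not the document's values.

```python
import numpy as np

def tabular_cusum(inventory_differences, sigma, k=0.5, h=5.0):
    """Two-sided tabular CUSUM on standardized inventory differences (IDs).
    k is the reference value and h the decision interval, in units of sigma."""
    z = np.asarray(inventory_differences, dtype=float) / sigma
    s_hi = np.zeros(len(z))
    s_lo = np.zeros(len(z))
    alarms = []
    for t in range(1, len(z)):
        s_hi[t] = max(0.0, s_hi[t - 1] + z[t] - k)   # accumulates positive drift (loss)
        s_lo[t] = max(0.0, s_lo[t - 1] - z[t] - k)   # accumulates negative drift
        if s_hi[t] > h or s_lo[t] > h:
            alarms.append(t)
    return s_hi, s_lo, alarms

# A sustained 0.8-sigma loss starting at period 60 trips the upper CUSUM.
rng = np.random.default_rng(5)
ids = rng.normal(0.0, 1.0, 100)
ids[60:] += 0.8
print(tabular_cusum(ids, sigma=1.0)[2][:1])   # index of the first alarm
```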

  19. Time series analysis of diverse extreme phenomena: universal features

    Science.gov (United States)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that the dynamics of earthquakes, epileptic seizures, solar flares, and magnetic storms can be analyzed within similar mathematical frameworks. A central property of the generation of the aforementioned extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it provides an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in the time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from a nonextensive Tsallis formalism, starting from first principles, has recently been introduced. This approach leads to an energy distribution function (Gutenberg-Richter type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, epileptic and earthquake shocks. The above mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.
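
    Monitoring the "degree of organization" with Tsallis entropy amounts to evaluating S_q = (1 - Σ p_i^q)/(q - 1) over a sliding window of the binned signal, with a drop in S_q flagging increasing organization. The sketch below is a generic realization of that idea with illustrative bin counts and q; it is not the authors' exact estimator.

```python
import numpy as np

def tsallis_entropy(window, q=1.8, bins=32):
    """S_q = (1 - sum p_i**q) / (q - 1) of the binned amplitude distribution."""
    counts, _ = np.histogram(window, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def sliding_tsallis(series, window, step, q=1.8):
    """Temporal evolution of S_q over sliding windows of the series."""
    series = np.asarray(series, dtype=float)
    return [tsallis_entropy(series[i:i + window], q)
            for i in range(0, len(series) - window + 1, step)]
```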

  20. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of physical processes, i.e., time series data. Such data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U. S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
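
    The pre-processing steps named above (interpolation, aggregation) map directly onto standard pandas operations. The snippet below is a generic sketch on a synthetic irregular gauge record; it does not use the Timeseries Toolbox itself, and all frequencies and window lengths are illustrative.

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for an uploaded record: irregularly spaced gauge readings.
rng = np.random.default_rng(6)
times = pd.date_range("2015-01-01", periods=5000, freq="37min")
raw = pd.Series(10 + np.cumsum(rng.normal(0, 0.05, times.size)), index=times)
raw = raw.sample(frac=0.9, random_state=0).sort_index()       # drop 10% to mimic gaps

hourly = raw.resample("H").mean().interpolate(method="time")  # regularize and fill gaps
daily = hourly.resample("D").mean()                           # aggregate to daily means
annual_max = daily.resample("YS").max()                       # block maxima for extremes
print(daily.rolling(window=365, min_periods=180).mean().tail())  # crude trend summary
```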

  1. Time series analysis of aerobic bacterial flora during Miso fermentation.

    Science.gov (United States)

    Onda, T; Yanagida, F; Tsuji, M; Shinohara, T; Yokotsuka, K

    2003-01-01

    This article reports a microbiological study of aerobic mesophilic bacteria that are present during the fermentation process of Miso. Aerobic bacteria were enumerated and isolated from Miso during fermentation and divided into nine groups using traditional phenotypic tests. The strains were identified by biochemical analysis and 16S rRNA sequence analysis. They were identified as Bacillus subtilis, B. amyloliquefaciens, Kocuria kristinae, Staphylococcus gallinarum and S. kloosii. All strains were sensitive to the bacteriocins produced by the lactic acid bacteria isolated from Miso. The dominant undesirable species throughout the fermentation process were B. subtilis and B. amyloliquefaciens. It is suggested that bacteriocin-producing lactic acid bacteria are effective in preventing the growth of aerobic bacteria in Miso. This study has provided useful information for controlling bacterial flora during Miso fermentation.

  2. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  3. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  4. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and its tributaries, as well as the main patterns of change in DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
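
    The paper's cloud-model granulation is specific to that framework, but the downstream similarity-search task can be illustrated generically with a z-normalized sliding-window Euclidean search, shown below as a substitute sketch (the function names are made up and this is not the authors' normal-cloud similarity measure).

```python
import numpy as np

def znorm(w):
    s = np.std(w)
    return (w - np.mean(w)) / s if s > 0 else w - np.mean(w)

def similarity_search(series, query, top_k=3):
    """Start indices of the top_k windows whose z-normalized shape is closest
    (Euclidean distance) to the z-normalized query subsequence."""
    series = np.asarray(series, dtype=float)
    q = znorm(np.asarray(query, dtype=float))
    m = len(q)
    dists = [np.linalg.norm(znorm(series[i:i + m]) - q)
             for i in range(len(series) - m + 1)]
    order = np.argsort(dists)
    return [(int(i), float(dists[i])) for i in order[:top_k]]

# The same distance profile can flag anomalies: windows far from all others.
```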

  5. Chaos in Electronic Circuits: Nonlinear Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wheat, Jr., Robert M. [Kennedy Western Univ., Cheyenne, WY (United States)

    2003-07-01

    Chaos in electronic circuits is a phenomenon that was largely ignored by engineers, manufacturers, and researchers until the early 1990s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices become more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. Also, a better understanding of chaotic behavior, in a general sense, could potentially lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step toward this improved understanding, and as part of the research being reported here, methods of computer modeling, identifying, analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques were implemented using the C and C++ programming languages, and the chaotically behaving circuits were built using “off the shelf” electronic components.

  6. Financing Human Development for Sectorial Growth: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Shobande Abdul Olatunji

    2017-06-01

    Full Text Available The role which financing human development plays in fostering the sectorial growth of an economy cannot be overstated. It is a key instrument which can be utilized to alleviate poverty, create employment and ensure the sustenance of economic growth and development. Thus financing human development for sectorial growth has taken center stage in economic growth and development strategies in most countries. In a constructive effort to examine the in-depth relationship between the variables in the Nigerian context, this paper provides evidence on the impact of financing human development on sectorial growth in Nigeria between 1982 and 2016, using the Johansen co-integration technique to test for co-integration among the variables and the Vector Error Correction Model (VECM) to ascertain the speed of adjustment of the variables to their long run equilibrium position. The analysis shows that a long and short run relationship exists between financing human capital development and sectorial growth during the period reviewed. Therefore, the paper argues that for an active foundation for sustainable sectorial growth and development, financing human capital development across each unit is urgently required through increased budgetary allocation for both the health and educational sectors, since they are key components of human capital development in a nation.
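
    The Johansen test followed by a VECM is a standard statsmodels workflow; a hedged sketch on synthetic cointegrated series is shown below (series, lag order and cointegration rank are illustrative, not the paper's specification).

```python
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

# Three synthetic annual series sharing one stochastic trend (hence cointegrated).
rng = np.random.default_rng(7)
common = np.cumsum(rng.normal(size=35))
data = np.column_stack([common + rng.normal(scale=0.3, size=35) for _ in range(3)])

# Johansen trace test for the cointegration rank.
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
print(jres.lr1)   # trace statistics
print(jres.cvt)   # 90/95/99% critical values

# VECM with the selected rank; alpha holds the speed-of-adjustment coefficients.
vecm_res = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci").fit()
print(vecm_res.alpha)
```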

  7. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al., 2005 and Makni et al., 2008, a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently, first using path sampling for a small subset of fields and then using a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  8. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained
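
    For reference, a compact first-order DFA (the exponent α used above) can be written in a few lines; the scale grid and the use of linear detrending per box are conventional choices, not taken from the paper.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """First-order DFA: slope of log F(s) vs log s, with F(s) the RMS of the
    linearly detrended integrated profile over non-overlapping boxes of size s."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 15).astype(int))
    flucts = []
    for s in scales:
        boxes = len(profile) // s
        sq = []
        for b in range(boxes):
            seg = profile[b * s:(b + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(sq)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

rng = np.random.default_rng(8)
print(dfa_exponent(rng.normal(size=4000)))   # white noise gives alpha close to 0.5
```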

  9. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    Science.gov (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.

    1998-01-01

    Fractal analyses of indoor 222Rn time series were performed using different chaos-theory-based measures such as the time delay method, Hurst's rescaled range analysis, the capacity (fractal) dimension, and the Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents, which is an indication of chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)
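
    Hurst's rescaled range analysis, one of the measures listed above, is easy to sketch: the Hurst exponent is the log-log slope of the average R/S statistic against chunk length, with H < 0.5 read as antipersistence. The chunk-size grid below is an arbitrary illustrative choice.

```python
import numpy as np

def rescaled_range_hurst(x, min_chunk=16):
    """Hurst exponent from rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes = np.unique(np.logspace(np.log10(min_chunk),
                                  np.log10(n // 2), 12).astype(int))
    rs_means = []
    for size in sizes:
        rs = []
        for start in range(0, n - size + 1, size):
            chunk = x[start:start + size]
            dev = np.cumsum(chunk - chunk.mean())
            r, s = dev.max() - dev.min(), chunk.std()
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    h, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return h

rng = np.random.default_rng(9)
print(rescaled_range_hurst(rng.normal(size=8192)))   # near 0.5 for uncorrelated noise
```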

  11. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding to the explanatory variables may be allowed to be time dependent in order to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: the Southern Oscillation index (SOI), the volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of
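
    The simplest member of the structural family estimated with the Kalman filter is the local level ("random walk plus noise") model; a minimal filter recursion for it is sketched below. This is a generic textbook illustration with made-up variances, not the trend-regression model of the paper.

```python
import numpy as np

def local_level_filter(y, q, r, a0=0.0, p0=1e6):
    """Kalman filter for y_t = mu_t + e_t, mu_t = mu_{t-1} + w_t,
    with Var(e_t) = r and Var(w_t) = q; returns the filtered level and its variance."""
    n = len(y)
    level = np.zeros(n)
    var = np.zeros(n)
    a_prev, p_prev = a0, p0
    for t in range(n):
        a_pred, p_pred = a_prev, p_prev + q          # prediction step
        gain = p_pred / (p_pred + r)                 # Kalman gain
        level[t] = a_pred + gain * (y[t] - a_pred)   # update with the observation
        var[t] = (1.0 - gain) * p_pred
        a_prev, p_prev = level[t], var[t]
    return level, var

# Recover a slowly drifting trend buried in noisy annual anomalies.
rng = np.random.default_rng(10)
truth = np.cumsum(rng.normal(0, 0.05, 200))
obs = truth + rng.normal(0, 0.3, 200)
trend, _ = local_level_filter(obs, q=0.05 ** 2, r=0.3 ** 2)
```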

  12. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio

    2013-01-01

    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that parameters can be obtained appropriately as network coefficients according to backpropagation-through-time-based training algorithm. The network is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  13. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture the dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. Though both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
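
    A hedged sketch of the two ingredients discussed above follows: phase synchronization measured via Hilbert-transform phases (a phase-locking value) and a minimum spanning tree built from 1 - PLV distances. The PLV is one common phase-synchronization index, not necessarily the exact measure used in the paper, and Prim's algorithm is used here for the MST for simplicity.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase synchronization strength (1 = fully locked, 0 = none), using
    instantaneous phases from the analytic (Hilbert) signal."""
    px = np.angle(hilbert(x - np.mean(x)))
    py = np.angle(hilbert(y - np.mean(y)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

def mst_edges(dist):
    """Prim's algorithm on a dense symmetric distance matrix; returns (i, j, w) edges."""
    n = dist.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i, j] < best[2]):
                    best = (i, j, dist[i, j])
        edges.append(best)
        in_tree.add(best[1])
    return edges

def ps_based_mst(series_matrix):
    """series_matrix: (n_series, n_times). Link weight = 1 - PLV, so the tree
    keeps the most strongly synchronized pairs."""
    n = series_matrix.shape[0]
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = 1.0 - phase_locking_value(series_matrix[i], series_matrix[j])
            dist[i, j] = dist[j, i] = d
    return mst_edges(dist)
```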

  14. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes, directly from two discrete time series, auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
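
    The quantities CROSAT produces (auto- and cross-spectra, transfer and coherence functions) have direct modern counterparts in scipy.signal; the snippet below is a generic Welch-based sketch on synthetic signals, not a port of the original program, and the H1-type transfer estimate Pxy/Pxx is one common convention.

```python
import numpy as np
from scipy import signal

# Two synthetic records sharing a 0.05 Hz component (sampled at 1 Hz).
rng = np.random.default_rng(11)
t = np.arange(4096)
common = np.sin(2 * np.pi * 0.05 * t)
x = common + rng.normal(size=t.size)
y = 0.5 * common + rng.normal(size=t.size)

f, pxx = signal.welch(x, fs=1.0, nperseg=512)           # auto-spectrum of x
_, pxy = signal.csd(x, y, fs=1.0, nperseg=512)          # cross-spectrum
_, coh = signal.coherence(x, y, fs=1.0, nperseg=512)    # magnitude-squared coherence

transfer = pxy / pxx                                    # H1-type transfer function estimate
print("coherence peaks near", f[np.argmax(coh)], "Hz")
```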

  15. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    Directory of Open Access Journals (Sweden)

    Dennis A Dean

    Full Text Available We present a novel approach for analyzing biological time-series data using a context-free language (CFL representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  16. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals whose values have been assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered as i.i.d. random variables due to the periodicity present in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  17. Time Series Analysis of Wheat flour Price Shocks in Pakistan: A Case Analysis

    OpenAIRE

    Asad Raza Abdi; Ali Hassan Halepoto; Aisha Bashir Shah; Faiz M. Shaikh

    2013-01-01

    The current research investigates wheat flour price shocks in Pakistan: a case analysis. Data were collected from secondary sources and analyzed using time series analysis in SPSS version 20. It was revealed that the price of wheat flour has increased over the last four decades, and the trend of price shocks shows that market variation and supply and demand shocks play a positive role in the shocks to wheat prices. It was further revealed th...

  18. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow the use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly

  19. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features in both magnitude, with α+ physiological data after shuffling or with a group of symmetric synthetic time series.

  20. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., separating time series corresponding to regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed the time series into networks and evaluated topological properties of the networks such as the degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics
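
    The visibility algorithm mentioned above is straightforward to sketch: two samples are linked when every intermediate sample lies strictly below the straight line joining them. The O(n²) implementation below is a generic natural-visibility-graph construction, not the authors' code.

```python
import numpy as np

def natural_visibility_edges(x):
    """Edges (a, b) of the natural visibility graph of the series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = []
    for a in range(n - 1):
        max_slope = -np.inf                    # largest slope from a to any sample before b
        for b in range(a + 1, n):
            slope_ab = (x[b] - x[a]) / (b - a)
            if slope_ab > max_slope:           # all intermediate samples lie below the a-b line
                edges.append((a, b))
            max_slope = max(max_slope, slope_ab)
    return edges

def degree_sequence(edges, n):
    """Node degrees, e.g. for inspecting the degree distribution of the network."""
    deg = np.zeros(n, dtype=int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg
```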

  1. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover — caused by events such as forest fire, flood, deforestation, and plant diseases — occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most of the present methods only label the detection results with “Change/No change”, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.

  2. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-03-01

    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE) analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v) is provided. The results show that the temporal scales of the current climate (1960–2014) are different from the long-term average (1850–1960). For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
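
    Multi-scale entropy as described above is sample entropy computed on coarse-grained copies of the series; a compact sketch following the usual Costa-style recipe (tolerance fixed at a fraction of the original standard deviation) is given below, with m, the tolerance factor and the scale range as illustrative defaults rather than the paper's settings.

```python
import numpy as np

def sample_entropy(x, m, r):
    """SampEn(m, r): -log(A/B), where B counts template pairs matching within
    tolerance r at length m and A counts matches at length m + 1."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    tm = np.array([x[i:i + m] for i in range(n - m)])
    tm1 = np.array([x[i:i + m + 1] for i in range(n - m)])
    a = b = 0
    for i in range(n - m):
        b += np.sum(np.max(np.abs(tm - tm[i]), axis=1) <= r) - 1    # exclude self-match
        a += np.sum(np.max(np.abs(tm1 - tm1[i]), axis=1) <= r) - 1
    return -np.log(a / b) if a > 0 and b > 0 else np.nan

def multiscale_entropy(x, max_scale=12, m=2, r_factor=0.15):
    """Sample entropy of coarse-grained (non-overlapping mean) copies at scales 1..max_scale."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    mse = []
    for tau in range(1, max_scale + 1):
        n = (len(x) // tau) * tau
        coarse = x[:n].reshape(-1, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m, r))
    return mse

# For white noise the curve falls with scale; correlated (1/f-like) signals stay flatter.
```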

  3. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Science.gov (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors, which displays irregular calcium activity, and a dataset from murine synaptic neurons, which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  4. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

    Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  5. On statistical inference in time series analysis of the evolution of road safety.

    Science.gov (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident-occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
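
    The standard-error point in the abstract can be illustrated with a short statsmodels sketch: ordinary least squares on a serially correlated series versus the same regression with HAC (Newey-West) standard errors. The HAC correction is used here only as a convenient stand-in for the dedicated time series models the paper discusses, and all data below are synthetic.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 120                                   # e.g. ten years of monthly counts
exposure = np.linspace(100, 140, n) + rng.normal(0, 2, n)

# AR(1) disturbances violate the independence assumption behind classical OLS inference.
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal(0, 5)
accidents = 50 + 0.8 * exposure + eps

X = sm.add_constant(exposure)
naive = sm.OLS(accidents, X).fit()
robust = sm.OLS(accidents, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})

# Same slope estimate, but the naive standard error understates the uncertainty.
print("naive SE:", naive.bse[1], "  HAC SE:", robust.bse[1])
```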

  6. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  7. Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.

    Science.gov (United States)

    Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav

    2017-05-26

    Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature time series. Unlike previous works which have concentrated on suppressing the transmission of some data samples by time-series analysis but still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake up rate) and looks at the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
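
    A minimal NumPy illustration of the trade-off studied above: thin a temperature series to longer sampling periods and measure the RMSE of linearly interpolating the skipped samples. The series and candidate periods are invented, not the deployed network's data.

```python
import numpy as np

rng = np.random.default_rng(7)
minutes = np.arange(0, 7 * 24 * 60, 5)                 # one week sampled every 5 min
temp = (20 + 5 * np.sin(2 * np.pi * minutes / (24 * 60))
        + rng.normal(0, 0.1, minutes.size))            # daily cycle plus sensor noise

for period in (15, 30, 60):                            # candidate wake-up periods (min)
    step = period // 5
    kept_t, kept_y = minutes[::step], temp[::step]
    estimate = np.interp(minutes, kept_t, kept_y)      # reconstruct the skipped samples
    rmse = np.sqrt(np.mean((estimate - temp) ** 2))
    print(f"sampling every {period:3d} min -> interpolation RMSE {rmse:.3f} °C")
```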

  8. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  9. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    Science.gov (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
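
    Segmented regression, the method most often reported in the review, can be sketched as an ordinary least-squares fit with level-change and trend-change terms around an intervention date. The data and intervention below are invented; a real analysis would also address autocorrelation and seasonality, as the review recommends.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
months = np.arange(48)
intervention = 24                                       # hypothetical policy change
post = (months >= intervention).astype(float)
months_after = np.where(months >= intervention, months - intervention, 0)

# Simulated prescribing rate: baseline trend, then a level drop and a new slope.
rate = 100 + 0.5 * months - 8 * post - 0.7 * months_after + rng.normal(0, 2, months.size)

X = sm.add_constant(np.column_stack([months, post, months_after]))
fit = sm.OLS(rate, X).fit()
print(fit.params)   # [intercept, pre-trend, level change, trend change]
```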

  10. BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data

    Directory of Open Access Journals (Sweden)

    Madeira Sara C

    2009-07-01

    Full Text Available Abstract Background The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard. In the case of time series this problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, both from a computational and a biological perspective. Findings BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion BiGGEsTS is a free open source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available at: http://kdbio.inesc-id.pt/software/biggests. We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress.

  11. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
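
    The sketch below illustrates SSA-based imputation with the ordinary (L2) SVD rather than the L1-norm variant proposed in the paper: missing values are filled with an initial guess and then repeatedly replaced by a low-rank SSA reconstruction. Window length, rank and the test series are arbitrary assumptions.

```python
import numpy as np

def ssa_reconstruct(x, L, r):
    """Rank-r SSA approximation of series x with window length L."""
    n = len(x)
    K = n - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix (L x K)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :r] * s[:r]) @ Vt[:r]                       # keep the leading r components
    # Diagonal (Hankel) averaging back to a 1-D series.
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(K):
        recon[j:j + L] += Xr[:, j]
        counts[j:j + L] += 1
    return recon / counts

def ssa_impute(x, L=24, r=4, n_iter=50):
    x = np.asarray(x, dtype=float).copy()
    gaps = np.isnan(x)
    x[gaps] = np.nanmean(x)                                # crude initial fill
    for _ in range(n_iter):
        x[gaps] = ssa_reconstruct(x, L, r)[gaps]           # update only the gaps
    return x

# Synthetic seasonal series with 10% of the values removed at random.
rng = np.random.default_rng(11)
t = np.arange(240)
truth = 10 + np.sin(2 * np.pi * t / 12) + 0.02 * t
observed = truth + rng.normal(0, 0.1, t.size)
observed[rng.choice(t.size, 24, replace=False)] = np.nan
filled = ssa_impute(observed)
gaps = np.isnan(observed)
print("RMSE on imputed points:", np.sqrt(np.mean((filled[gaps] - truth[gaps]) ** 2)))
```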

  12. Analysis of Land Subsidence Monitoring in Mining Area with Time-Series Insar Technology

    Science.gov (United States)

    Sun, N.; Wang, Y. J.

    2018-04-01

    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years, because of its advantages such as high accuracy, wide coverage, low expenditure, dense monitoring points and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods by time-series InSAR technology. By analyzing the deformation range, rate and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can be used to monitor land subsidence over large areas and meet the demands of subsidence monitoring in mining areas.

  13. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return price of gold, West Texas Intermediate and Brent crude oil, foreign exchange rate data, over a period of 18 years. The cross correlation has been measured from the Hurst scaling exponents and the singularity spectrum quantitatively. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possess uncorrelated behavior and the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q0 and for one bivariate series the cross-correlation exponent is greater than GHE for all q values.

  14. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.

  15. Regression and regression analysis time series prediction modeling on climate data of Quetta, Pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five-year data from 1998-2002 of average humidity, rainfall, maximum and minimum temperatures, respectively. The relationships to regression analysis time series (RATS) were developed for determining the overall trend of these climate parameters on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit, to our polynomial regression analysis time series (PRATS). The correlation to multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on a five-year data of rainfall and humidity, respectively, which showed that the variances in rainfall data were not homogenous while in case of humidity, were homogenous. Our results on regression and regression analysis time series show the best fit to prediction modeling on climatic data of Quetta, Pakistan. (author)
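
    A minimal NumPy sketch of the polynomial-regression-plus-R² step described above, on made-up monthly humidity values (the Quetta observations are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(5)
months = np.arange(60)                                   # five years of monthly data
humidity = 40 + 0.1 * months - 0.001 * months**2 + rng.normal(0, 1.5, months.size)

coeffs = np.polyfit(months, humidity, deg=2)             # quadratic trend fit
fitted = np.polyval(coeffs, months)

ss_res = np.sum((humidity - fitted) ** 2)
ss_tot = np.sum((humidity - humidity.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot                          # coefficient of determination
print(f"R^2 of the quadratic fit: {r_squared:.3f}")
```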

  16. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  17. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    Science.gov (United States)

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…

  18. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    Science.gov (United States)

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  19. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    Science.gov (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.

  20. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  1. Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions

    Science.gov (United States)

    Barry Tyler. Wilson

    2015-01-01

    This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...
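
    Harmonic analysis of a dense image time series is commonly implemented as a least-squares fit of sine/cosine terms plus a trend; the sketch below shows that general idea on a synthetic NDVI-like series, not the Landsat 7 ETM+ data themselves.

```python
import numpy as np

rng = np.random.default_rng(2)
doy = np.sort(rng.choice(np.arange(3650), size=200, replace=False))   # ~10 yr of scenes
ndvi = (0.5 + 0.25 * np.sin(2 * np.pi * doy / 365.25 - 1.5)
        - 0.00002 * doy + rng.normal(0, 0.03, doy.size))              # seasonality + slow decline

# Design matrix: intercept, linear trend, and the first annual harmonic.
A = np.column_stack([
    np.ones(doy.size),
    doy,
    np.sin(2 * np.pi * doy / 365.25),
    np.cos(2 * np.pi * doy / 365.25),
])
coef, *_ = np.linalg.lstsq(A, ndvi, rcond=None)
amplitude = np.hypot(coef[2], coef[3])
print("trend per day:", coef[1], " seasonal amplitude:", amplitude)
```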

  2. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    Science.gov (United States)

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  3. Time-series analysis of Nigeria rice supply and demand: Error ...

    African Journals Online (AJOL)

    The study examined a time-series analysis of Nigeria rice supply and demand with a view to determining any long-run equilibrium between them using the Error Correction Model approach (ECM). The data used for the study represents the annual series of 1960-2007 (47 years) for rice supply and demand in Nigeria, ...

  4. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
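
    pyunicorn itself provides classes for recurrence quantification, visibility graphs and climate networks; the snippet below is only a plain-NumPy illustration of one of those ideas (a recurrence matrix and its recurrence rate) and does not use the package's API.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=2, threshold=0.2):
    """Binary recurrence matrix of a time-delay embedded scalar series."""
    x = np.asarray(x, dtype=float)
    n_vec = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n_vec] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    return (dist <= threshold * dist.max()).astype(int)

rng = np.random.default_rng(4)
t = np.linspace(0, 20 * np.pi, 600)
series = np.sin(t) + 0.1 * rng.normal(size=t.size)

R = recurrence_matrix(series)
print("recurrence rate:", round(float(R.mean()), 3))   # fraction of recurrent state pairs
```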

  5. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  6. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lutaif, N.A. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Palazzo, R. Jr [Departamento de Telemática, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, Campinas, SP (Brazil); Gontijo, J.A.R. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil)

    2014-01-17

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.
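
    A small statsmodels sketch of fitting an autoregressive model to a temperature-like series, in the spirit of the AR model used above; the rat core-temperature recordings are not available here, so a synthetic AR(2) trace stands in.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(8)
n = 1000
temp = np.full(n, 37.0)
for t in range(2, n):
    temp[t] = (37.0 + 0.6 * (temp[t - 1] - 37.0)
               + 0.2 * (temp[t - 2] - 37.0) + rng.normal(0, 0.05))

result = AutoReg(temp, lags=2).fit()
print(result.params)                        # intercept and AR coefficients
print(result.predict(start=n, end=n + 9))   # ten-step-ahead forecast
```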

  7. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    International Nuclear Information System (INIS)

    Lutaif, N.A.; Palazzo, R. Jr; Gontijo, J.A.R.

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile

  8. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    Science.gov (United States)

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the frequency standards deviations. For the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, three station coordinates time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
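
    The classical, unweighted, scalar Allan variance discussed above can be computed in a few lines of NumPy; the weighted and multidimensional variants (WAVAR, MAVAR, WMAVAR) mentioned in the record are not shown.

```python
import numpy as np

def allan_variance(x, m):
    """Classical Allan variance of series x at averaging factor m."""
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // m
    block_means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(block_means) ** 2)

# White noise: AVAR falls roughly as 1/m; a random walk behaves differently.
rng = np.random.default_rng(6)
white = rng.normal(0, 1.0, 16384)
for m in (1, 4, 16, 64):
    print(f"m = {m:3d}   AVAR = {allan_variance(white, m):.4f}")
```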

  9. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.

  10. Spatial analysis of precipitation time series over the Upper Indus Basin

    Science.gov (United States)

    Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad

    2018-01-01

    The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides necessary support for agriculture, domestic consumption, and hydropower generation; all critical for a stable economy in Pakistan. This study has identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. This study investigated spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of annual precipitation by determining significant trends at 6 stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul) out of the 15 studied stations, revealing a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the rest of the stations appear insignificant. Moreover, our study found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn; having 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases in precipitation appear robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially including in the higher areas.

  11. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    Science.gov (United States)

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9

  12. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    Science.gov (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduced probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series to estimate and remove Common Mode Error (CME) without the interpolation of missing values. We used data from the International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series, and CME was then estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of stations' velocities estimated from filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. CME series were also analyzed in the context of the influence of environmental mass loading on the filtering results. Subtraction of the environmental loading models from GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
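
    A simplified illustration of common-mode-error estimation by stacking: ordinary PCA on complete residual series, removing the first principal component shared across stations. The paper's pPCA additionally handles missing epochs, which this sketch does not, and all residuals below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(9)
n_epochs, n_stations = 1000, 25
common = np.cumsum(rng.normal(0, 0.3, n_epochs))            # regional common-mode signal
residuals = common[:, None] + rng.normal(0, 1.0, (n_epochs, n_stations))

centered = residuals - residuals.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by PC1:", round(float(explained[0]), 3))

cme = np.outer(U[:, 0] * s[0], Vt[0])                       # rank-1 common-mode field
filtered = centered - cme                                    # spatially filtered residuals
print("residual std before/after filtering:", centered.std(), filtered.std())
```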

  13. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied with some values, such as economic trades, earthquakes, and lightnings. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  14. Fractal time series analysis of postural stability in elderly and control subjects

    Directory of Open Access Journals (Sweden)

    Doussot Michel

    2007-05-01

    Full Text Available Abstract Background The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H), may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time-series that are as short as possible. The present study compares two methods of calculating H: Detrended Fluctuation Analysis (DFA) and Stabilogram Diffusion Analysis (SDA) for elderly and control subjects, as well as evaluating the effect of recording duration. Methods Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC) was used as a measure of reliability. Results Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion Both methods would be well-suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s.
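
    A compact sketch of one of the two estimators compared above (DFA): integrate the mean-removed signal, detrend it in windows of several sizes, and take the slope of log fluctuation versus log window size as the scaling exponent. Window sizes and the test signals are arbitrary choices, not the centre-of-pressure data.

```python
import numpy as np

def dfa_exponent(x, window_sizes):
    """Scaling exponent from detrended fluctuation analysis (linear detrending)."""
    profile = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))
    flucts = []
    for s in window_sizes:
        n_win = len(profile) // s
        sq = []
        for k in range(n_win):
            seg = profile[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)    # local linear trend
            sq.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(10)
white_noise = rng.normal(size=3000)            # expected exponent near 0.5
random_walk = np.cumsum(white_noise)           # expected exponent near 1.5
sizes = [16, 32, 64, 128, 256]
print(dfa_exponent(white_noise, sizes), dfa_exponent(random_walk, sizes))
```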

  15. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    Science.gov (United States)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  16. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured in the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that wind speed measured at 70 ft was the most variable. Further, the turbulence persisted longer at the 70-ft measurement than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence in response to an 'own' shock and led to the greatest persistence in the responses of the other wind series. (author)
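
    A short statsmodels sketch of the workflow described above: fit a vector autoregression to wind-speed-like series from four heights and inspect impulse responses. Note that statsmodels provides orthogonalized impulse responses rather than the generalized responses used in the paper, and the four series below are synthetic stand-ins for the anemometer records.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(12)
n = 2000
coupling = np.array([[0.5, 0.1, 0.0, 0.0],
                     [0.2, 0.4, 0.1, 0.0],
                     [0.0, 0.2, 0.4, 0.1],
                     [0.0, 0.0, 0.2, 0.5]])
data = np.zeros((n, 4))
for t in range(1, n):
    data[t] = coupling @ data[t - 1] + rng.normal(0, 1, 4)   # stable VAR(1) process

df = pd.DataFrame(data, columns=["13ft", "33ft", "70ft", "160ft"])
results = VAR(df).fit(maxlags=4, ic="aic")
irf = results.irf(10)                        # impulse responses over 10 steps
print("selected lag order:", results.k_ar)
print("IRF array shape:", irf.irfs.shape)    # (periods + 1, n_series, n_series)
```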

  17. Time-series analysis of climatologic measurements: a method to distinguish future climatic changes

    International Nuclear Information System (INIS)

    Duband, D.

    1992-01-01

    Time-series analysis of climatic parameters such as air temperature, river flow rates, and lake or sea levels is an indispensable basis for detecting a possible significant climatic change. These observations, when carefully analyzed and criticized, constitute the necessary reference for testing and validating numerical climate models which try to simulate the physical and dynamical processes of the coupled ocean-atmosphere system, taking continents into account. 32 refs., 13 figs

  18. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    International Nuclear Information System (INIS)

    Hultkrantz, L.; Olsson, C.

    1997-01-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident, is estimated to 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs

  19. On-line condition monitoring of nuclear systems via symbolic time series analysis

    International Nuclear Information System (INIS)

    Rajagopalan, V.; Ray, A.; Garcia, H. E.

    2006-01-01

    This paper provides a symbolic time series analysis approach to fault diagnostics and condition monitoring. The proposed technique is built upon concepts from wavelet theory, symbolic dynamics and pattern recognition. Various aspects of the methodology such as wavelet selection, choice of alphabet and determination of depth of D-Markov Machine are explained in the paper. The technique is validated with experiments performed in a Machine Condition Monitoring (MCM) test bed at the Idaho National Laboratory. (authors)

  20. A Time Series Analysis to Asymmetric Marketing Competition Within a Market Structure

    OpenAIRE

    Francisco F. R. Ramos

    1996-01-01

    As a complement to existing studies of competitive market structure analysis, the present paper proposed a time series methodology to provide a more detailed picture of marketing competition in relation to competitive market structure. Two major hypotheses were tested as part of this project. First, it was found that some significant cross-lead and lag effects of marketing variables on sales between brands existed even between different submarkets. Second, it was found that high qual...

  1. Time series analysis in road safety research using state space methods

    OpenAIRE

    BIJLEVELD, FD

    2008-01-01

    In this thesis we present a comprehensive study into novel time series models for aggregated road safety data. The models are mainly intended for analysis of indicators relevant to road safety, with a particular focus on how to measure these factors. Such developments may need to be related to or explained by external influences. It is also possible to make forecasts using the models. Relevant indicators include the number of persons killed per month or year. These statistics are closely watch...

  2. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hultkrantz, L. [Department of Economics, University of Uppsala, Uppsala (Sweden); Olsson, C. [Department of Economics, Umeaa University, Umeaa (Sweden)

    1997-03-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident, is estimated to 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs.

  3. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    Science.gov (United States)

    2013-06-01


  4. Phase correction and error estimation in InSAR time series analysis

    Science.gov (United States)

    Zhang, Y.; Fattahi, H.; Amelung, F.

    2017-12-01

    During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategy of SAR satellites, such as large spatial and temporal baselines with non-regular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least squares inversion of an over-determined system. Such robust inversion allows us to focus more on the understanding of different components in InSAR time-series and their uncertainties. We present an open-source python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time-series, geometrical and atmospheric correction of InSAR data and quantifying the InSAR uncertainty. Our implemented strategy contains several features including: 1) improved spatial coverage using coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of reference date and automatic outlier detection, 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error, and 7) variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by Cosmo-Skymed and TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application on the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our result shows precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b), with a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. (2017, GRL); and a post-eruptive subsidence on the same
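
    A toy NumPy illustration of the "weighted least squares inversion of an over-determined system" mentioned above (not PySAR's API): each small-baseline interferogram constrains the displacement difference between two acquisition dates, and the per-date displacement history is recovered by WLS with the first date held fixed. Dates, pairs and weights are invented.

```python
import numpy as np

dates = np.arange(6)                                   # acquisition indices 0..5
pairs = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5),
         (0, 2), (1, 3), (2, 4), (3, 5)]               # redundant small-baseline network
true_disp = np.array([0.0, 0.3, 0.8, 1.1, 1.6, 2.0])   # cm, relative to date 0

rng = np.random.default_rng(13)
obs = (np.array([true_disp[j] - true_disp[i] for i, j in pairs])
       + rng.normal(0, 0.05, len(pairs)))              # noisy interferogram observations
weights = np.full(len(pairs), 1.0 / 0.05**2)           # e.g. coherence-based weights

# Design matrix over the unknown displacements at dates 1..5 (date 0 held at zero).
A = np.zeros((len(pairs), dates.size - 1))
for row, (i, j) in enumerate(pairs):
    if i > 0:
        A[row, i - 1] = -1.0
    if j > 0:
        A[row, j - 1] = 1.0

W = np.diag(weights)
estimate = np.linalg.solve(A.T @ W @ A, A.T @ W @ obs)
print(np.round(estimate, 2))                           # close to true_disp[1:]
```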

  5. TIME SERIES ANALYSIS ON STOCK MARKET FOR TEXT MINING CORRELATION OF ECONOMY NEWS

    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER

    2014-01-01

    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, with stock market values considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem for most countries. The most well-known approach is applying text mining approaches to the news and some time series analysis techniques over stock market closing values in order to apply classification or clustering algorithms over the extracted features. This study goes further and asks which time series analysis techniques are available for stock market closing values and which one is the most suitable. In this study, the news and their dates are collected into a database and text mining is applied over the news; the text mining part has been kept simple, using only the term frequency – inverse document frequency method. For the time series analysis part, we have studied 10 different methods such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum or relative strength index, and their variations. In this study we have also explained these techniques in a comparative way, and we have applied the methods over Turkish Stock Market closing values for a period of more than 2 years. In addition, we have applied the term frequency – inverse document frequency method to the economy news of one of the highest-circulation newspapers in Turkey.
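
    A hedged sketch combining the two ingredients named above: term frequency – inverse document frequency weights for a handful of toy headlines and a simple price rate-of-change series, aligned day by day. scikit-learn's TfidfVectorizer stands in for the paper's own term-weighting step, and all headlines and closing values are invented.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

headlines = [
    "central bank raises interest rates",
    "exports grow and inflation slows",
    "bank sector posts record losses",
    "inflation fears hit currency markets",
    "strong earnings lift the stock index",
]
closes = np.array([100.0, 101.5, 99.0, 97.5, 100.5, 102.0])   # one extra day for returns

tfidf = TfidfVectorizer().fit_transform(headlines).toarray()
news_weight = tfidf.sum(axis=1)                     # crude per-day news intensity
rate_of_change = np.diff(closes) / closes[:-1]      # daily price rate of change

corr = np.corrcoef(news_weight, rate_of_change)[0, 1]
print("correlation between news weight and next-day ROC:", round(float(corr), 3))
```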

  6. The Fourier decomposition method for nonlinear and non-stationary time series analysis.

    Science.gov (United States)

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-03-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparison is made with the empirical mode decomposition algorithms.

  7. Detrended fluctuation analysis based on higher-order moments of financial time series

    Science.gov (United States)

    Teng, Yue; Shang, Pengjian

    2018-01-01

    In this paper, a generalized method of detrended fluctuation analysis (DFA) is proposed as a new measure to assess the complexity of a complex dynamical system such as a stock market. We extend DFA and local scaling DFA to higher moments such as skewness and kurtosis (labeled SMDFA and KMDFA), so as to investigate the volatility scaling property of financial time series. Simulations are conducted over synthetic and financial data to provide a comparative study. We further report the results of volatility behaviors in three American, three Chinese and three European stock markets by using the DFA and LSDFA methods based on higher moments. They demonstrate the dynamic behaviors of the time series in different aspects, which can quantify the changes of complexity for stock market data and provide us with more meaningful information than a single exponent. The results reveal some higher-moment volatility and higher-moment multiscale volatility details that cannot be obtained using the traditional DFA method.
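
    A minimal sketch of ordinary (second-moment) DFA is given below to make the baseline procedure concrete; the higher-moment variants (SMDFA, KMDFA) introduced in the paper are not reproduced, and the scales and test signal are arbitrary choices.

```python
# Minimal sketch of ordinary detrended fluctuation analysis (DFA); the paper's
# higher-moment variants build on this second-moment version.
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))          # profile (integrated series)
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coeff = np.polyfit(t, seg, 1)   # local linear detrending
            f2.append(np.mean((seg - np.polyval(coeff, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.array(fluct)

rng = np.random.default_rng(1)
x = rng.normal(size=4096)                   # white noise: expected exponent ~0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated scaling exponent: {alpha:.2f}")
```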

  8. Time series analysis of pressure fluctuation in gas-solid fluidized beds

    Directory of Open Access Journals (Sweden)

    C. Alberto S. Felipe

    2004-09-01

    Full Text Available The purpose of the present work was to study the differentiation of states of typical fluidization (single bubble, multiple bubble and slugging) in a gas-solid fluidized bed, using spectral analysis of pressure fluctuation time series. The effects of the method of measuring (differential and absolute pressure fluctuations) and of the axial position of the probes in the fluidization column on the identification of each of the regimes studied were evaluated. The Fast Fourier Transform (FFT) was the mathematical tool used to analyse the pressure fluctuation data, which expresses the behavior of a time series in the frequency domain. Results indicated that the plenum chamber was a place for reliable measurement and that care should be taken with measurements in the dense phase. The method allowed fluid dynamic regimes to be differentiated by their dominant frequency characteristics.
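
    The frequency-domain step can be sketched as follows on a synthetic pressure-fluctuation signal; the sampling rate, the 4 Hz component and the use of Welch's method are illustrative assumptions rather than the paper's actual settings.

```python
# Sketch of identifying the dominant frequency of a pressure-fluctuation
# signal from its power spectral density (synthetic data, not the paper's).
import numpy as np
from scipy import signal

fs = 400.0                                    # sampling frequency (Hz), assumed
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
# synthetic fluctuation: a 4 Hz "bubbling" component plus broadband noise
p = np.sin(2 * np.pi * 4.0 * t) + 0.5 * rng.normal(size=t.size)

f, pxx = signal.welch(p, fs=fs, nperseg=2048)
dominant = f[np.argmax(pxx)]
print(f"dominant frequency: {dominant:.2f} Hz")
```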

  9. Investigation of interfacial wave structure using time-series analysis techniques

    International Nuclear Information System (INIS)

    Jayanti, S.; Hewitt, G.F.; Cliffe, K.A.

    1990-09-01

    The report presents an investigation into the interfacial structure in horizontal annular flow using spectral and time-series analysis techniques. Film thickness measured using conductance probes shows an interesting transition in wave pattern from a continuous low-frequency wave pattern to an intermittent, high-frequency one. From the autospectral density function of the film thickness, it appears that this transition is caused by the breaking up of long waves into smaller ones. To investigate the possibility of the wave structure being represented as a low order chaotic system, phase portraits of the time series were constructed using the technique developed by Broomhead and co-workers (1986, 1987 and 1989). These showed a banded structure when waves of relatively high frequency were filtered out. Although these results are encouraging, further work is needed to characterise the attractor. (Author)
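
    For readers who want to experiment with phase portraits of such signals, the sketch below uses a plain time-delay embedding rather than the singular-system construction of Broomhead and co-workers used in the report; the embedding dimension, delay and test signal are arbitrary.

```python
# Generic time-delay embedding used to build a phase portrait from a scalar
# series (the report itself uses the Broomhead-King singular-system approach).
import numpy as np

def delay_embed(x, dim, tau):
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

t = np.linspace(0, 100, 5000)
x = np.sin(t) + 0.3 * np.sin(2.2 * t)         # stand-in for a film-thickness trace
portrait = delay_embed(x, dim=3, tau=25)      # rows are points in the reconstructed space
print(portrait.shape)
```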

  10. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    Science.gov (United States)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.

  11. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

    One of the important issues that plays a significant role in the hydrological study of a basin is the determination of lag time. Lag time has a significant role in hydrological studies. The lag time associated with rainfall depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Determination of lag time is an important parameter in many projects such as dam design and also water resource studies. The lag time of a basin can be calculated using various methods. One of these methods is time series analysis of spectral density. The analysis is based on Fourier series. The time series is approximated with sine and cosine functions. In this method, harmonically significant quantities with individual frequencies are presented. Spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time could be due to snowmelt as well as melting ice due to rainfall on freezing days. In this research, the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment is subjected to both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 months, respectively.

  12. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
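
    The combination of an ARIMA-type model with an intervention term can be sketched as below; the synthetic series, intervention date and model order are assumptions for illustration and do not reproduce the GTD-based analysis.

```python
# Sketch of an intervention analysis: an ARIMA-type model with a step
# regressor marking a sudden level change (synthetic data, not the GTD series).
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(3)
n = 180
step = np.zeros(n)
step[120:] = 1.0                              # intervention switches on at t = 120
y = 10 + 0.05 * np.arange(n) + 4.0 * step + rng.normal(0, 1, n)

index = pd.date_range("2002-01-01", periods=n, freq="MS")
model = SARIMAX(pd.Series(y, index=index), exog=step, order=(1, 0, 0), trend="ct")
res = model.fit(disp=False)
print(res.params)                             # the exog coefficient estimates the step size

forecast = res.get_forecast(steps=12, exog=np.ones((12, 1)))
print(forecast.predicted_mean.head())
```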

  13. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    Science.gov (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.

  14. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    Science.gov (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). Conclusions The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  15. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences between stock indices, which are caused by the country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
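
    The embedding step of such a pipeline can be sketched with scikit-learn's MDS applied to a precomputed dissimilarity matrix; here an ordinary Euclidean distance matrix over random feature vectors stands in for the cross-sample-entropy dissimilarities, so the numbers are purely illustrative.

```python
# Sketch of the multidimensional-scaling step: embed items described by a
# dissimilarity matrix into 3-D. A Euclidean distance matrix stands in here
# for the cross-sample-entropy dissimilarities used in the paper.
import numpy as np
from sklearn.manifold import MDS
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(4)
features = rng.normal(size=(18, 50))          # stand-in for 18 stock-index series
dissimilarity = squareform(pdist(features))   # any symmetric dissimilarity matrix works

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)     # 3-D perceptual-map coordinates
print(coords.shape)
```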

  16. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis.

    Science.gov (United States)

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan

    2014-01-01

    Background. Malaria still remains a public health problem in developing countries and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, the study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly cases of malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic at Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data of monthly mean rainfall, relative humidity, and mean maximum temperature were taken from Regional Meteorological Centre, Delhi. Expert modeler of SPSS ver. 21 was used for analyzing the time series data. Results. Autoregressive integrated moving average, ARIMA (0,1,1) (0,1,0)(12), was the best fit model and it could explain 72.5% variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors for malaria transmission in the study area. Seasonal adjusted factor (SAF) for malaria cases shows peak during the months of August and September. Conclusion. ARIMA models of time series analysis is a simple and reliable tool for producing reliable forecasts for malaria in Delhi, India.

  17. Capturing Context-Related Change in Emotional Dynamics via Fixed Moderated Time Series Analysis.

    Science.gov (United States)

    Adolf, Janne K; Voelkle, Manuel C; Brose, Annette; Schmiedek, Florian

    2017-01-01

    Much of recent affect research relies on intensive longitudinal studies to assess daily emotional experiences. The resulting data are analyzed with dynamic models to capture regulatory processes involved in emotional functioning. Daily contexts, however, are commonly ignored. This may not only result in biased parameter estimates and wrong conclusions, but also ignores the opportunity to investigate contextual effects on emotional dynamics. With fixed moderated time series analysis, we present an approach that resolves this problem by estimating context-dependent change in dynamic parameters in single-subject time series models. The approach examines parameter changes of known shape and thus addresses the problem of observed intra-individual heterogeneity (e.g., changes in emotional dynamics due to observed changes in daily stress). In comparison to existing approaches to unobserved heterogeneity, model estimation is facilitated and different forms of change can readily be accommodated. We demonstrate the approach's viability given relatively short time series by means of a simulation study. In addition, we present an empirical application, targeting the joint dynamics of affect and stress and how these co-vary with daily events. We discuss potentials and limitations of the approach and close with an outlook on the broader implications for understanding emotional adaption and development.

  18. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    Science.gov (United States)

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
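
    A brute-force sketch of the natural visibility graph and its connectivity (degree) series, the quantity analyzed above, is given below; the O(n^2) loop, the series length and the noise source are illustrative choices only.

```python
# Brute-force construction of a natural visibility graph and its connectivity
# (degree) series, i.e. the number of links attached to each node.
import numpy as np

def visibility_degrees(x):
    n = len(x)
    deg = np.zeros(n, dtype=int)
    for a in range(n):
        for b in range(a + 1, n):
            # nodes a and b are connected if every sample between them lies
            # below the straight line joining (a, x[a]) and (b, x[b])
            t = np.arange(a + 1, b)
            if t.size == 0 or np.all(x[t] < x[a] + (x[b] - x[a]) * (t - a) / (b - a)):
                deg[a] += 1
                deg[b] += 1
    return deg

rng = np.random.default_rng(5)
series = rng.normal(size=300)
connectivity = visibility_degrees(series)     # this series is then analyzed further
print(connectivity[:10])
```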

  19. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    Science.gov (United States)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series with different stochastic characteristics such as white, flicker or random walk noise, each 23 years long. The noise amplitude was assumed at 1 mm/y-/4. Then, we added the deterministic part consisting of a linear trend of 20 mm/y (which represents the averaged horizontal velocity) and accelerations ranging from minus 0.6 to plus 0.6 mm/y2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark to then investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
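
    The deterministic part of such a model (rate, acceleration and seasonal terms) can be fitted by ordinary least squares as sketched below; the noise values, sampling and amplitudes are invented, and the MLE noise analysis done with the Hector package is not reproduced.

```python
# Least-squares fit of a position time series with a rate, an acceleration
# term and annual/semi-annual signals (white noise only, for simplicity).
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0, 23, 1 / 365.25)              # 23 years, daily sampling (years)
truth = 20.0 * t + 0.5 * 0.3 * t**2 + 2.0 * np.sin(2 * np.pi * t)
y = truth + rng.normal(0, 1, t.size)          # synthetic observations

A = np.column_stack([
    np.ones_like(t), t, 0.5 * t**2,           # offset, velocity, acceleration
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),     # annual terms
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),     # semi-annual terms
])
coeff, *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"velocity ~ {coeff[1]:.2f} mm/yr, acceleration ~ {coeff[2]:.2f} mm/yr^2")
```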

  20. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 272 is considered as an example.

  1. Non-linear time series analysis on flow instability of natural circulation under rolling motion condition

    International Nuclear Information System (INIS)

    Zhang, Wenchao; Tan, Sichao; Gao, Puzhen; Wang, Zhanwei; Zhang, Liansheng; Zhang, Hong

    2014-01-01

    Highlights: • Natural circulation flow instabilities in rolling motion are studied. • The method of non-linear time series analysis is used. • The non-linear evolution characteristics of flow instability are analyzed. • Irregular complex flow oscillations are chaotic oscillations. • The effect of rolling parameters on the threshold of chaotic oscillation is studied. - Abstract: Non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were studied by the method of non-linear time series analysis. Experimental flow time series for different dimensionless powers and rolling parameters were analyzed based on phase space reconstruction theory. Attractors were reconstructed in phase space, and the geometric invariants, including correlation dimension, Kolmogorov entropy and largest Lyapunov exponent, were determined. The non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were studied based on the results of the geometric invariant analysis. The results indicated that the values of the geometric invariants first increase and then decrease as the dimensionless power increases, which indicates that the non-linear characteristics of the system first strengthen and then weaken. The irregular complex flow oscillation is a typical chaotic oscillation because it occurs where the geometric invariants reach their maximum values. The threshold of chaotic oscillation becomes larger as the rolling frequency or rolling amplitude increases. The main factors that influence the non-linear characteristics of the natural circulation system under rolling motion are the thermal driving force, the flow resistance and the additional forces caused by the rolling motion. The non-linear characteristics of the natural circulation system under rolling motion change with the feedback and coupling degree among these influencing factors when the dimensionless power or the rolling parameters change.

  2. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    Science.gov (United States)

    Chen, Wei-Shing

    2011-04-01

    The aim of the article is to answer the question of whether the Taiwan unemployment rate dynamics are generated by a non-linear deterministic dynamic process. This paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns of the unemployment rate of Taiwan. The case study uses the time series data of Taiwan's unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of the unemployment transition in Taiwan.
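
    A minimal sketch of a recurrence plot and one recurrence-quantification measure (the recurrence rate) is given below; the test signal and the threshold are arbitrary, and a real analysis would typically work on an embedded, detrended series.

```python
# Sketch of a recurrence plot and one recurrence-quantification measure
# (recurrence rate) for a scalar series; threshold choice is illustrative.
import numpy as np

def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])       # pairwise distances between samples
    return (d < eps).astype(int)

rng = np.random.default_rng(7)
x = np.sin(np.linspace(0, 20 * np.pi, 400)) + 0.1 * rng.normal(size=400)
R = recurrence_matrix(x, eps=0.2)
recurrence_rate = R.sum() / R.size
print(f"recurrence rate: {recurrence_rate:.3f}")
```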

  3. Interpretation of engine cycle-to-cycle variation by chaotic time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Daw, C.S.; Kahl, W.K.

    1990-01-01

    In this paper we summarize preliminary results from applying a new mathematical technique -- chaotic time series analysis (CTSA) -- to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of "deterministic chaos" dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.

  4. Analysis of the development trend of China’s business administration based on time series

    OpenAIRE

    Jiang Rui

    2016-01-01

    On the general direction of the economic system, China is in a crucial period of establishing the modern enterprise system and reforming the macroeconomic system, and a large number of high-quality business administration talents are required for China's economy to develop stably. This paper carries out a time series analysis of the development of China's business administration major: on the whole, society currently shows an upward trend in the demand for the business adm...

  5. Time series analysis of soil Radon-222 recorded at Kutch region, Gujarat, India

    International Nuclear Information System (INIS)

    Madhusudan Rao, K.; Rastogi, B.K.; Barman, Chiranjib; Chaudhuri, Hirok

    2013-01-01

    The Kutch region in Gujarat lies in a seismically vulnerable zone (seismic zone V). After the devastating Bhuj earthquake (7.7M) of January 26, 2001 in the Kutch region, several researchers focused their attention on monitoring geophysical and geochemical precursors for earthquakes in the region. In order to find possible geochemical precursory signals for earthquake events, we monitored the radioactive gas radon-222 in subsurface soil gas in the Kutch region. We have analysed the recorded soil radon-222 time series by means of nonlinear techniques such as FFT power spectral analysis, empirical mode decomposition and multi-fractal analysis, along with other linear statistical methods. Some fascinating and fruitful results originating from the nonlinear analysis of the said time series are discussed in the present paper. The entire analytical approach helped us to recognize the nature and pattern of the soil radon-222 emanation process. Moreover, the recording and the statistical and non-linear analysis of soil radon data in the Kutch region will assist us in understanding the preparation phase of an imminent seismic event in the region. (author)

  6. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination

    Science.gov (United States)

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu

    2013-01-01

    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The data of onchocerciasis cases are mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases for two years ahead. According to the ARIMA models predictions, the cases in very low number (below 1) are expected for the disease between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from high onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low level of expected cases as predicted by ARIMA models for the next two years suggest that the onchocerciasis is being eliminated in Mexico. To our knowledge, it is the first study utilizing time series for predicting case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370

  7. Use of a Principal Components Analysis for the Generation of Daily Time Series.

    Science.gov (United States)

    Dreveton, Christine; Guillou, Yann

    2004-07-01

    A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent the present climate correctly. The results obtained with the generator show that it correctly represents the interannual variance of the observed climate; this is the main result of the work, because one of the main shortcomings of other generators is their inability to represent the observed interannual climate variance accurately, a discrepancy that is not acceptable for an application to weather derivatives. The generator was also tested for calculating conditional probabilities: for example, knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
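
    The general idea (decompose into principal components, simulate each component independently, recombine) can be sketched as below; the toy covariance, dimensionality and Gaussian scores are assumptions, and the operational generator described above differs in its details.

```python
# Toy version of a PCA-based generator: project multivariate data onto
# principal components, simulate each component independently, and map the
# simulated scores back to the original space.
import numpy as np

rng = np.random.default_rng(8)
# stand-in data: correlated daily temperature-like anomalies at 3 locations
data = rng.multivariate_normal(np.zeros(3), [[2, 1, 0.5], [1, 2, 1], [0.5, 1, 2]], 3650)

mean = data.mean(axis=0)
cov = np.cov(data - mean, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)          # principal directions and variances

# generate new, independent scores with the observed variances, then rotate back
scores = rng.normal(0, np.sqrt(eigval), size=(1000, eigval.size))
simulated = scores @ eigvec.T + mean
print(simulated.shape)
```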

  8. Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni

    Science.gov (United States)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina

    2011-01-01

    Time series analysis is critical to the study of land cover/land use changes and climate. Time series studies at local-to-regional scales require data of higher spatial resolution, such as 1 km or finer. MODIS land products of 250 m to 1 km resolution enable such studies. However, such MODIS land data files are distributed in 10°x10° tiles, due to large data volumes. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which enables users to explore, analyze, and download data over an area and time period of interest easily. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1 km land surface temperature and active fire, monthly 1 km vegetation index, and yearly 0.05°, 500 m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to do further analyses of environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.

  9. FREQUENCY ANALYSIS OF MODIS NDVI TIME SERIES FOR DETERMINING HOTSPOT OF LAND DEGRADATION IN MONGOLIA

    Directory of Open Access Journals (Sweden)

    E. Nasanbat

    2018-04-01

    Full Text Available This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across the whole of Mongolia. The Mann-Kendall statistical trend analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons from 2000 to 2016. We performed a frequency analysis so that the resulting NDVI residual trend pattern would enable successful determination of negative and positive changes in photosynthetically healthy vegetation. Our results showed negative and positive values and generated a map of significant trends. We also examined long-term meteorological parameters for the same period. The results showed that positive and negative NDVI trends concurred with land cover type changes representing, respectively, an improvement or a degradation in vegetation. In addition, the climate parameters, namely precipitation and air temperature changes over the same period, appear to have affected large areas of the NDVI trend. The time series trend analysis approach successfully determined hotspots of improvement and degradation due to land degradation and desertification.

  10. Frequency Analysis of Modis Ndvi Time Series for Determining Hotspot of Land Degradation in Mongolia

    Science.gov (United States)

    Nasanbat, E.; Sharav, S.; Sanjaa, T.; Lkhamjav, O.; Magsar, E.; Tuvdendorj, B.

    2018-04-01

    This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across the whole of Mongolia. The Mann-Kendall statistical trend analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons from 2000 to 2016. We performed a frequency analysis so that the resulting NDVI residual trend pattern would enable successful determination of negative and positive changes in photosynthetically healthy vegetation. Our results showed negative and positive values and generated a map of significant trends. We also examined long-term meteorological parameters for the same period. The results showed that positive and negative NDVI trends concurred with land cover type changes representing, respectively, an improvement or a degradation in vegetation. In addition, the climate parameters, namely precipitation and air temperature changes over the same period, appear to have affected large areas of the NDVI trend. The time series trend analysis approach successfully determined hotspots of improvement and degradation due to land degradation and desertification.
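
    The per-pixel trend test used in this kind of analysis can be sketched as below; the implementation omits tie and autocorrelation corrections, and the synthetic NDVI series and its trend are invented for illustration.

```python
# Minimal Mann-Kendall trend test for one pixel's NDVI time series
# (tie corrections and autocorrelation adjustments are omitted).
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return z, p

rng = np.random.default_rng(9)
ndvi = 0.4 + 0.002 * np.arange(170) + rng.normal(0, 0.03, 170)   # 16-day composites
z, p = mann_kendall(ndvi)
print(f"z = {z:.2f}, p = {p:.4f}")   # positive z indicates a greening trend
```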

  11. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    Science.gov (United States)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of the research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service held by the Vienna University of Technology. The resolution of the data is six hours. ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were used for the investigations. Initially the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, predicted ZTW values were computed one day ahead, leaving residuals that form a white-noise process. The accuracy of the prediction can be estimated at about 3 cm.

  12. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
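
    A small sketch of permutation entropy estimation with the order-of-appearance tie-breaking convention discussed above is given below; the embedding order, delay and the rounding used to create ties are illustrative assumptions.

```python
# Sketch of permutation entropy estimation; ties are broken by order of
# appearance (via a stable argsort), the convention whose pitfalls the
# paper analyses.
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    n = len(x) - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + (order - 1) * delay + 1:delay]
        pattern = tuple(np.argsort(window, kind="stable"))   # equal values keep order of appearance
        patterns[pattern] = patterns.get(pattern, 0) + 1
    p = np.array(list(patterns.values()), dtype=float) / n
    return -np.sum(p * np.log2(p)) / np.log2(factorial(order))   # normalized to [0, 1]

rng = np.random.default_rng(10)
continuous = rng.normal(size=5000)
quantized = np.round(continuous, 1)           # low amplitude resolution -> many ties
print(permutation_entropy(continuous), permutation_entropy(quantized))
```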

  13. Visualization of time series statistical data by shape analysis (GDP ratio changes among Asia countries)

    Science.gov (United States)

    Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri

    2018-03-01

    Visualizing time series big data has become very significant. In this paper we discuss a new analysis method, called “statistical shape analysis” or “geometry driven statistics”, applied to time series statistical data in economics. We analyse the changes in agriculture value added and industry value added (as a percentage of GDP) from 2000 to 2010 in Asia. We handle the data as a set of landmarks on a two-dimensional image to see the deformation using the principal components. The point of the analysis method is the principal components of the given formation, which are eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations. The transformations give us information about the local differences between 2000 and 2010. Because the non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology but, in economics, no application can be found. In this paper, we investigate its potential to analyse economic data.

  14. Forecast models for suicide: Time-series analysis with data from Italy.

    Science.gov (United States)

    Preti, Antonio; Lentini, Gianluca

    2016-01-01

    The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101,499 male suicides and 39,681 female suicides, which occurred in Italy from 1969 to 2003, were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models for the monthly number of suicides. These measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, with an increase from 1969 onwards to reach a maximum around 1990 and a decrease thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecasting future trends of suicide with a margin of error around 10%. The finding is clearer in male than in female time series of suicide. The main conclusion of the study is that models taking seasonality into account seem to be able to derive information on deviations from the mean when these occur as a zenith, but they fail to reproduce them when they occur as a nadir. Preventative efforts should concentrate on the factors that influence the occurrence of increases above the main trend in both seasonal and cyclic patterns of suicides.
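
    The four accuracy measures listed above can be written out as in the sketch below; the synthetic training and hold-out series, and the naive mean forecast, are placeholders rather than the study's models.

```python
# The accuracy measures named above (MAE, RMSE, MAPE, MASE) computed for a
# generic forecast against a held-out test set.
import numpy as np

def accuracy(actual, forecast, train):
    err = actual - forecast
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    mape = 100 * np.mean(np.abs(err / actual))
    scale = np.mean(np.abs(np.diff(train)))      # in-sample naive one-step error
    mase = mae / scale
    return mae, rmse, mape, mase

rng = np.random.default_rng(11)
train = 100 + rng.normal(0, 10, 336)             # e.g. monthly counts, 1969-1996
actual = 100 + rng.normal(0, 10, 84)             # 1997-2003 hold-out
forecast = np.full(84, train.mean())             # naive mean forecast for illustration
print(accuracy(actual, forecast, train))
```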

  15. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Czekala, Ian [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, Stanford, CA 94305 (United States); Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Ghosh, Sujit K. [Department of Statistics, NC State University, 2311 Stinson Drive, Raleigh, NC 27695 (United States); Montet, Benjamin T. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Newton, Elisabeth R., E-mail: iczekala@stanford.edu [Massachusetts Institute of Technology, Cambridge, MA 02138 (United States)

    2017-05-01

    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.

  16. Combined use of correlation dimension and entropy as discriminating measures for time series analysis

    Science.gov (United States)

    Harikrishnan, K. P.; Misra, R.; Ambika, G.

    2009-09-01

    We show that the combined use of correlation dimension (D2) and correlation entropy (K2) as discriminating measures can extract more accurate information regarding the different types of noise present in a time series data set. For this, we make use of an algorithmic approach for computing D2 and K2 proposed by us recently [Harikrishnan KP, Misra R, Ambika G, Kembhavi AK. Physica D 2006;215:137; Harikrishnan KP, Ambika G, Misra R. Mod Phys Lett B 2007;21:129; Harikrishnan KP, Misra R, Ambika G. Pramana - J Phys, in press], which is a modification of the standard Grassberger-Procaccia scheme. While the presence of white noise can be easily identified by computing D2 of the data and surrogates, K2 is a better discriminating measure for detecting colored noise in the data. Analysis of a time series from a real world system involving both white and colored noise is presented as evidence. To our knowledge, this is the first time that such a combined analysis has been undertaken on real world data.
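
    The standard Grassberger-Procaccia correlation sum that the above scheme modifies can be sketched as follows; the embedding dimension, delay, radii and quasi-periodic test signal are arbitrary illustrative choices.

```python
# Correlation sum of the standard Grassberger-Procaccia scheme; D2 is the
# slope of log C(r) versus log r over the scaling region.
import numpy as np
from scipy.spatial.distance import pdist

def correlation_sum(x, dim, tau, radii):
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dist = pdist(emb)                            # all pairwise distances
    return np.array([np.mean(dist < r) for r in radii])

t = np.linspace(0, 200, 2000)
x = np.sin(t) + np.sin(np.sqrt(2) * t)           # quasi-periodic test signal
radii = np.logspace(np.log10(0.3), np.log10(2.0), 8)
C = correlation_sum(x, dim=4, tau=10, radii=radii)
D2 = np.polyfit(np.log(radii), np.log(C), 1)[0]
print(f"estimated correlation dimension D2 ~ {D2:.2f}")
```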

  17. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves - numerous valves of the type which are used in PWR plants. Generally the techniques can detect anomalies by failures in the initial stages for which detection is difficult by conventional surveillance of process parameters measured directly. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response as compared with hydraulic control systems, as well as the need to identify anomalies in low level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as of a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)

  18. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure

    Science.gov (United States)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak

    2017-09-01

    The RR-interval time series in congestive heart failure has been studied with different methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased subjects and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences are observed. The quantitative parameter obtained using the visibility graph analysis can thereby be used as a potential bio-marker as well as a subsequent alarm generation mechanism for predicting the onset of congestive heart failure.

  19. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee

    2010-01-01

    Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However many cycles of solar activities can be reflected by the sunspot time series. The 11-year cycle is perhaps the most famous cycle of the sunspot activity. These cycles pose problems for the investigation of the scaling behavior of sunspot time series. Using different methods to handle the 11-year cycle generally creates totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem in the present paper, empirical mode decomposition (EMD), a data-driven adaptive method, is applied to first extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. And on removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the

  20. Analysis of time series for postal shipments in Regional VII East Java Indonesia

    Science.gov (United States)

    Kusrini, DE; Ulama, B. S. S.; Aridinanti, L.

    2018-03-01

    The changing number of goods delivered through PT. Pos Regional VII East Java Indonesia indicates that the trend of increases and decreases in the delivery of documents and non-documents at PT. Pos Regional VII East Java Indonesia is strongly influenced by conditions outside of PT. Pos Regional VII East Java Indonesia, so that predicting the number of documents and non-documents requires a model that can accommodate this. Based on the time series plot, monthly data fluctuations occur over 2013-2016; the modeling is therefore done using ARIMA or seasonal ARIMA, and the best model is selected based on the smallest AIC value. The results of the data analysis on the number of shipments of each product sent through the Sub-Regional Postal Office VII East Java indicate that there are 5 post offices out of the 26 post offices entering the territory. The largest numbers of shipments are found for the PPB (Paket Pos Biasa, regular package shipment/non-document) and SKH (Surat Kilat Khusus, Special Express Mail/document) products. The time series models generated are largely random walk models, meaning that the number of future shipments is influenced by random effects that are difficult to predict. Some are AR and MA models, except for Express shipment products destined for the Malang post office, which follow a seasonal ARIMA model at lags 6 and 12. This means that the number of items in the following month is affected by the number of items in the previous 6 months.

  1. The Relationship between Logistics and Economic Development in Indonesia: Analysis of Time Series Data

    Directory of Open Access Journals (Sweden)

    Mohammad Reza

    2013-01-01

    Full Text Available This paper investigates the relationship between logistics and economic development in Indonesia using time series data on traffic volume and economic growth for the period from 1988 to 2010. Literature reviews were conducted to find the most applicable econometric model. The volume of cargo that travels through sea, air and rail is used as the logistics index, while GDP is used as the economic index. The time series data were tested using stationarity and co-integration tests. Granger causality tests were employed, and then a proposed logistics model is presented. This study showed that logistics plays an important role in supporting and sustaining economic growth, in the sense that economic growth exerts a significant demand-pull effect on logistics. Although the model is developed in the context of Indonesia, the overall statistical analysis can be generalized to other developing economies. Based on the model, this paper presents the importance of sustaining economic development with regard to continuously improving the logistics infrastructure.
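
    The stationarity and causality steps named above can be sketched with statsmodels as below; the annual synthetic growth series, lag order and column layout are assumptions for illustration, not the study's data.

```python
# Sketch of the stationarity and Granger-causality steps described above,
# on synthetic stand-ins for the cargo-volume and GDP series.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

rng = np.random.default_rng(12)
n = 23                                             # annual data, 1988-2010
gdp_growth = rng.normal(5, 1, n)
cargo_growth = 0.6 * gdp_growth + rng.normal(0, 0.5, n)

adf_stat, adf_p, *_ = adfuller(cargo_growth)
print(f"ADF p-value for cargo series: {adf_p:.3f}")

data = pd.DataFrame({"cargo": cargo_growth, "gdp": gdp_growth})
# does lagged GDP help predict cargo volume? (second column tests causing the first)
gc = grangercausalitytests(data[["cargo", "gdp"]], maxlag=2, verbose=False)
f_stat, p_val, _, _ = gc[1][0]["ssr_ftest"]
print(f"Granger causality (GDP -> cargo, lag 1): F = {f_stat:.2f}, p = {p_val:.3f}")
```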

  2. Parametric time series analysis of geoelectrical signals: an application to earthquake forecasting in Southern Italy

    Directory of Open Access Journals (Sweden)

    V. Tramutoli

    1996-06-01

    Full Text Available An autoregressive model was selected to describe geoelectrical time series. An objective technique was subsequently applied to analyze and discriminate values above (below) an a priori fixed threshold possibly related to seismic events. A complete check of the model and the main guidelines to estimate the occurrence probability of extreme events are reported. A first application of the proposed technique is discussed through the analysis of the experimental data recorded by an automatic station located in Tito, a small town on the Apennine chain in Southern Italy. This region was hit by the November 1980 Irpinia-Basilicata earthquake and is one of the most active areas of the Mediterranean region. After a preliminary filtering procedure to reduce the influence of external parameters (i.e. the meteo-climatic effects), it was demonstrated that the geoelectrical residual time series are well described by means of a second order autoregressive model. Our findings outline a statistical methodology to evaluate the efficiency of electrical seismic precursors.

  3. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
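
    One way to make the AR-versus-PSD comparison discussed above concrete is the sketch below: an AR model is fitted to a synthetic noisy signal with the order chosen by AIC, and a power spectral density is derived from the fitted parameters; the data, order range and coefficients are illustrative assumptions.

```python
# Sketch of fitting an AR model to a noisy signal and deriving its power
# spectral density from the AR parameters (order chosen by AIC).
import numpy as np
from statsmodels.tsa.ar_model import AutoReg, ar_select_order

rng = np.random.default_rng(13)
n = 2048
x = np.zeros(n)
for t in range(2, n):                            # synthetic AR(2) "sensor noise"
    x[t] = 1.3 * x[t - 1] - 0.4 * x[t - 2] + rng.normal()

sel = ar_select_order(x, maxlag=10, ic="aic")    # data-driven model order
res = AutoReg(x, lags=sel.ar_lags).fit()
print("selected order:", len(sel.ar_lags), "params:", np.round(res.params, 2))

# AR-based PSD: S(f) = sigma^2 / |1 - sum_k a_k exp(-2*pi*i*f*k)|^2
freqs = np.linspace(0, 0.5, 256)
a = np.asarray(res.params)[1:]                   # AR coefficients (constant excluded)
denom = np.abs(1 - sum(a[k] * np.exp(-2j * np.pi * freqs * (k + 1))
                       for k in range(len(a)))) ** 2
psd = res.sigma2 / denom
print(psd[:5])
```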

  4. Using time-series intervention analysis to understand U.S. Medicaid expenditures on antidepressant agents.

    Science.gov (United States)

    Ferrand, Yann; Kelton, Christina M L; Guo, Jeff J; Levy, Martin S; Yu, Yan

    2011-03-01

    Medicaid programs' spending on antidepressants increased from $159 million in 1991 to $2 billion in 2005. The National Institute for Health Care Management attributed this expenditure growth to increases in drug utilization, entry of newer higher-priced antidepressants, and greater prescription drug insurance coverage. Rising enrollment in Medicaid has also contributed to this expenditure growth. This research examines the impact of specific events, including branded-drug and generic entry, a black box warning, direct-to-consumer advertising (DTCA), and new indication approval, on Medicaid spending on antidepressants. Using quarterly expenditure data for 1991-2005 from the national Medicaid pharmacy claims database maintained by the Centers for Medicare and Medicaid Services, a time-series autoregressive integrated moving average (ARIMA) intervention analysis was performed on 6 specific antidepressant drugs and on overall antidepressant spending. Twenty-nine potentially relevant interventions and their dates of occurrence were identified from the literature. Each was tested for an impact on the time series. Forecasts from the models were compared with a holdout sample of actual expenditure data. Interventions with significant impacts on Medicaid expenditures included the patent expiration of Prozac® (P < 0.05), implying that the expanding market for antidepressants overwhelmed the effect of generic competition. Copyright © 2011 Elsevier Inc. All rights reserved.
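
    The sketch below illustrates an ARIMA-type intervention analysis of the kind described above: a step dummy marking a hypothetical intervention quarter enters a SARIMAX model as an exogenous regressor, fitted to simulated quarterly spending; the series, intervention date and model order are illustrative assumptions rather than the Medicaid data.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(2)
        n, t0 = 60, 40                                    # 60 quarters, intervention at quarter 40
        step = (np.arange(n) >= t0).astype(float)         # 0 before, 1 after the intervention
        noise = np.zeros(n)
        for t in range(1, n):                             # AR(1) disturbances
            noise[t] = 0.5 * noise[t - 1] + rng.normal(0, 1.0)
        y = pd.Series(20 + 0.3 * np.arange(n) - 8.0 * step + noise, name="spending")

        res = SARIMAX(y, exog=pd.DataFrame({"step": step}),
                      order=(1, 0, 0), trend="ct").fit(disp=False)
        print("estimated intervention effect:", res.params["step"])
        print("p-value:", res.pvalues["step"])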

  5. Conflicts in Coalitions: A Stability Analysis of Robust Multi-City Regional Water Supply Portfolios

    Science.gov (United States)

    Gold, D.; Trindade, B. C.; Reed, P. M.; Characklis, G. W.

    2017-12-01

    Regional cooperation among water utilities can improve the robustness of urban water supply portfolios to deeply uncertain future conditions such as those caused by climate change or population growth. Coordination mechanisms such as water transfers, coordinated demand management, and shared infrastructure can improve the efficiency of resource allocation and delay the need for new infrastructure investments. Regionalization does, however, come at a cost. Regionally coordinated water supply plans may be vulnerable to any emerging instabilities in the regional coalition. If one or more regional actors do not cooperate or follow the required regional actions in a time of crisis, the overall system performance may degrade. Furthermore, when crafting regional water supply portfolios, decision makers must choose a framework for measuring the performance of regional policies based on the evaluation of the objective values for each individual actor. Regional evaluations may inherently favor one actor's interests over those of another. This work focuses on four interconnected water utilities in the Research Triangle region of North Carolina for which robust regional water supply portfolios have previously been designed using multi-objective optimization to maximize the robustness of the worst performing utility across several objectives. This study 1) examines the sensitivity of portfolio performance to deviations from prescribed actions by individual utilities, 2) quantifies the implications of the regional formulation used to evaluate robustness for the portfolio performance of each individual utility and 3) elucidates the inherent regional tensions and conflicts that exist between utilities under this regionalization scheme through visual diagnostics of the system under simulated drought scenarios. Results of this analysis will help inform the creation of future regional water supply portfolios and provide insight into the nature of multi-actor water supply systems.

  6. Investigation on Law and Economics Based on Complex Network and Time Series Analysis

    Science.gov (United States)

    Yang, Jian; Qu, Zhao; Chang, Hui

    2015-01-01

    The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing. PMID:26076460

  7. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    Science.gov (United States)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.

  8. Analysis of the Main Factors Influencing Food Production in China Based on Time Series Trend Chart

    Institute of Scientific and Technical Information of China (English)

    Shuangjin; WANG; Jianying; LI

    2014-01-01

    Based on the annual sample data on food production in China since the reform and opening up, we select 8 main factors influencing the total food production (growing area, application rate of chemical fertilizer, effective irrigation area, the affected area, total machinery power, food production cost index, food production price index, financial funds for supporting agriculture, farmers and countryside), and put them into categories of material input, resources and environment, and policy factors. Using the factor analysis, we carry out the multi-angle analysis of these typical influencing factors one by one through the time series trend chart. It is found that application rate of chemical fertilizer, the growing area of food crops and drought-affected area become the key factors affecting food production. On this basis, we set forth the corresponding recommendations for improving the comprehensive food production capacity.

  9. Automated preparation of Kepler time series of planet hosts for asteroseismic analysis

    DEFF Research Database (Denmark)

    Handberg, R.; Lund, M. N.

    2014-01-01

    One of the tasks of the Kepler Asteroseismic Science Operations Center (KASOC) is to provide asteroseismic analyses on Kepler Objects of Interest (KOIs). However, asteroseismic analysis of planetary host stars presents some unique complications with respect to data preprocessing, compared to pure asteroseismic targets. If not accounted for, the presence of planetary transits in the photometric time series often greatly complicates or even hinders these asteroseismic analyses. This drives the need for specialised methods of preprocessing data to make them suitable for asteroseismic analysis. In this paper we present the KASOC Filter, which is used to automatically prepare data from the Kepler/K2 mission for asteroseismic analyses of solar-like planet host stars. The methods are very effective at removing unwanted signals of both instrumental and planetary origins and produce significantly cleaner...

  10. Time series modeling for analysis and control advanced autopilot and monitoring systems

    CERN Document Server

    Ohtsu, Kohei; Kitagawa, Genshiro

    2015-01-01

    This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships’ autopilot systems are considered through the entire book, the methods set forth in this book can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system based on theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX) model and the radial bases function net-type coefficients ARX model. The noise contribution analysis can then be performed through the estimated autoregressive (AR) model and various types of autopilot systems can be designed through the state–space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...

  11. Use of a prototype pulse oximeter for time series analysis of heart rate variability

    Science.gov (United States)

    González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica

    2015-05-01

    This work presents the development of a low cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad spectral photodetector used to register time series of heart rate and oxygen saturation of blood. This platform, besides providing these values, like any other pulse oximeter, processes the signals to compute a power spectrum analysis of the patient heart rate variability in real time and, additionally, the device allows access to all raw and analyzed data if databases construction is required or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data in real life activities, enabling the development of future wearable applications.

  12. Trend analysis using non-stationary time series clustering based on the finite element method

    OpenAIRE

    Gorji Sefidmazgi, M.; Sayemuzzaman, M.; Homaifar, A.; Jha, M. K.; Liess, S.

    2014-01-01

    In order to analyze low-frequency variability of climate, it is useful to model the climatic time series with multiple linear trends and locate the times of significant changes. In this paper, we have used non-stationary time series clustering to find change points in the trends. Clustering in a multi-dimensional non-stationary time series is challenging, since the problem is mathematically ill-posed. Clustering based on the finite element method (FEM) is one of the methods ...

  13. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA)

    Science.gov (United States)

    Curceac, S.; Ternynck, C.; Ouarda, T.

    2015-12-01

    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed

  14. Time Series

    DEFF Research Database (Denmark)

    Johansen, Søren

    An overview of results for the cointegrated VAR model for nonstationary I(1) variables is given. The emphasis is on the analysis of the model and the tools for asymptotic inference. These include: formulation of criteria on the parameters, for the process to be nonstationary and I(1), formulation

  15. Time Series Analysis of the Bacillus subtilis Sporulation Network Reveals Low Dimensional Chaotic Dynamics.

    Science.gov (United States)

    Lecca, Paola; Mura, Ivan; Re, Angela; Barker, Gary C; Ihekwaba, Adaoha E C

    2016-01-01

    Chaotic behavior refers to a behavior which, albeit irregular, is generated by an underlying deterministic process. Therefore, a chaotic behavior is potentially controllable. This possibility becomes practically amenable especially when chaos is shown to be low-dimensional, i.e., to be attributable to a small fraction of the total system's components. In this case, indeed, including the major drivers of chaos in a system into the modeling approach allows us to improve predictability of the system's dynamics. Here, we analyzed the numerical simulations of an accurate ordinary differential equation model of the gene network regulating sporulation initiation in Bacillus subtilis to explore whether the non-linearity underlying time series data is due to low-dimensional chaos. Low-dimensional chaos is expectedly common in systems with few degrees of freedom, but rare in systems with many degrees of freedom such as the B. subtilis sporulation network. The estimation of a number of indices, which reflect the chaotic nature of a system, indicates that the dynamics of this network is affected by deterministic chaos. The neat separation between the indices obtained from the time series simulated from the model and those obtained from time series generated by Gaussian white and colored noise confirmed that the B. subtilis sporulation network dynamics is affected by low-dimensional chaos rather than by noise. Furthermore, our analysis identifies the principal driver of the network's chaotic dynamics to be sporulation initiation phosphotransferase B (Spo0B). We then analyzed the parameters and the phase space of the system to characterize the instability points of the network dynamics, and, in turn, to identify the ranges of values of Spo0B and of the other drivers of the chaotic dynamics, for which the whole system is highly sensitive to minimal perturbation. In summary, we described an unappreciated source of complexity in the B. subtilis sporulation network by gathering

  16. Seasonal and annual precipitation time series trend analysis in North Carolina, United States

    Science.gov (United States)

    Sayemuzzaman, Mohammad; Jha, Manoj K.

    2014-02-01

    The present study performs spatial and temporal trend analysis of the annual and seasonal precipitation time series from a set of 249 uniformly distributed stations across the state of North Carolina, United States, over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA) and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of trend, magnitude of trend, and the trend shift, respectively. Regional (mountain, piedmont and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of statistical tests, the pre-whitening technique was used to eliminate the effect of autocorrelation of the precipitation data series. The application of the above-mentioned procedures has shown a very notable statewide increasing trend for winter and a decreasing trend for fall precipitation. A statewide mixed (increasing/decreasing) trend has been detected in the annual, spring, and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected at only 8, 7, 4 and 10 stations (out of 249) in winter, spring, summer, and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (-4.50 mm/season) in the fall (summer) season. The annual precipitation trend magnitude varied between -5.50 mm/year and 9 mm/year. Regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during the winter. The Piedmont region was found to have increasing trends in summer and fall, but decreasing trends in winter, spring and on an annual basis. The SQMK test on "trend shift analysis" identified a significant shift during 1960-70 in most parts of the state. Finally, the comparison of winter (summer) precipitation with the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
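
    A small sketch of the trend statistics named above, applied to one synthetic seasonal precipitation series: lag-1 pre-whitening, a Mann-Kendall-type test via Kendall's tau against time, and a Theil-Sen slope for the trend magnitude; the simulated series is an assumption and not the North Carolina data.

        import numpy as np
        from scipy.stats import kendalltau, theilslopes

        rng = np.random.default_rng(3)
        years = np.arange(1950, 2010)
        precip = 250 + 0.8 * (years - 1950) + rng.normal(0, 15, years.size)   # mm/season, synthetic

        # Pre-whitening: remove lag-1 autocorrelation before the significance test
        r1 = np.corrcoef(precip[:-1], precip[1:])[0, 1]
        pw = precip[1:] - r1 * precip[:-1]

        tau, p_value = kendalltau(years[1:], pw)                 # Mann-Kendall via Kendall's tau
        slope, intercept, lo, hi = theilslopes(precip, years)    # Theil-Sen trend magnitude

        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3g}")
        print(f"Theil-Sen slope = {slope:.2f} mm/year (95% CI {lo:.2f} to {hi:.2f})")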

  17. Time Series

    DEFF Research Database (Denmark)

    Johansen, Søren

    2015-01-01

    An overview of results for the cointegrated VAR model for nonstationary I(1) variables is given. The emphasis is on the analysis of the model and the tools for asymptotic inference. These include: formulation of criteria on the parameters, for the process to be nonstationary and I(1), formulation...... of hypotheses of interest on the rank, the cointegrating relations and the adjustment coefficients. A discussion of the asymptotic distribution results that are used for inference. The results are illustrated by a few examples. A number of extensions of the theory are pointed out....

  18. Analysis of rhythmic variance - ANORVA. A new simple method for detecting rhythms in biological time series

    Directory of Open Access Journals (Sweden)

    Peter Celec

    2004-01-01

    Full Text Available Cyclic variations of variables are ubiquitous in biomedical science. A number of methods for detecting rhythms have been developed, but they are often difficult to interpret. A simple procedure for detecting cyclic variations in biological time series and quantification of their probability is presented here. Analysis of rhythmic variance (ANORVA) is based on the premise that the variance in groups of data from rhythmic variables is low when a time distance of one period exists between the data entries. A detailed stepwise calculation is presented, including data entry and preparation, variance calculation, and difference testing. An example for the application of the procedure is provided, and a real dataset of the number of papers published per day in January 2003 using selected keywords is compared to randomized datasets. Randomized datasets show no cyclic variations. The number of papers published daily, however, shows a clear and significant (p < 0.03) circaseptan (period of 7 days) rhythm, probably of social origin.
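
    The sketch below illustrates only the premise stated above (variance within groups is low when entries are grouped one period apart) and is not the published ANORVA procedure; the weekly rhythm, noise level and number of shuffled surrogates are assumptions.

        import numpy as np

        rng = np.random.default_rng(4)
        n, period = 365, 7
        t = np.arange(n)
        series = 5 + 2 * np.sin(2 * np.pi * t / period) + rng.normal(0, 1, n)   # circaseptan rhythm

        def mean_group_variance(x, period):
            """Mean variance within groups of entries spaced one candidate period apart."""
            return np.mean([np.var(x[phase::period]) for phase in range(period)])

        observed = mean_group_variance(series, period)

        # Compare against randomized datasets, as the abstract does with shuffled surrogates
        surrogates = np.array([mean_group_variance(rng.permutation(series), period)
                               for _ in range(1000)])
        p = np.mean(surrogates <= observed)
        print(f"grouped variance {observed:.2f}, fraction of surrogates as low: {p:.3f}")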

  19. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    International Nuclear Information System (INIS)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.

    2016-01-01

    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.
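
    Of the three diagnostics listed, permutation entropy is simple to reproduce; the sketch below gives a minimal normalized implementation and evaluates it on white noise and on a noisy oscillation, with the embedding order and the test signals chosen purely for illustration.

        import numpy as np
        from math import factorial
        from itertools import permutations

        def permutation_entropy(x, order=4, delay=1):
            """Normalized Bandt-Pompe permutation entropy of a 1-D series."""
            n = len(x) - (order - 1) * delay
            counts = {p: 0 for p in permutations(range(order))}
            for i in range(n):
                window = x[i:i + order * delay:delay]
                counts[tuple(np.argsort(window))] += 1
            probs = np.array([c for c in counts.values() if c > 0], dtype=float)
            probs /= probs.sum()
            return -np.sum(probs * np.log(probs)) / np.log(factorial(order))

        rng = np.random.default_rng(5)
        t = np.arange(5000)
        noise = rng.normal(size=t.size)                            # fully stochastic: entropy near 1
        tone = np.sin(0.05 * t) + 0.1 * rng.normal(size=t.size)    # regular signal: lower entropy
        print("PE(white noise):", round(permutation_entropy(noise), 3))
        print("PE(noisy sine): ", round(permutation_entropy(tone), 3))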

  20. Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series

    Science.gov (United States)

    Vautard, R.; Ghil, M.

    1989-01-01

    Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.
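
    A bare-bones SSA sketch along the lines described above (trajectory-matrix embedding, singular value decomposition, diagonal averaging of the leading components), applied to a synthetic noisy oscillation; the window length and the test series are assumptions.

        import numpy as np

        def ssa_components(x, window, n_components=3):
            """Basic singular spectrum analysis: embed, decompose, reconstruct leading components."""
            n = len(x)
            k = n - window + 1
            traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            comps = []
            for j in range(n_components):
                elem = s[j] * np.outer(u[:, j], vt[j])                    # rank-1 elementary matrix
                # Anti-diagonal averaging maps the elementary matrix back to a series of length n
                rec = np.array([elem[::-1, :].diagonal(i - window + 1).mean() for i in range(n)])
                comps.append(rec)
            return np.array(comps), s

        rng = np.random.default_rng(6)
        t = np.arange(500)
        x = np.sin(2 * np.pi * t / 50) + 0.5 * rng.normal(size=t.size)
        comps, singular_values = ssa_components(x, window=100)
        print("leading singular values:", np.round(singular_values[:5], 1))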

  1. Evaluating disease management program effectiveness: an introduction to time-series analysis.

    Science.gov (United States)

    Linden, Ariel; Adams, John L; Roberts, Nancy

    2003-01-01

    Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.

  2. Event-sequence time series analysis in ground-based gamma-ray astronomy

    International Nuclear Information System (INIS)

    Barres de Almeida, U.; Chadwick, P.; Daniel, M.; Nolan, S.; McComb, L.

    2008-01-01

    The recent, extreme episodes of variability detected from Blazars by the leading atmospheric Cerenkov experiments motivate the development and application of specialized statistical techniques that enable the study of this rich data set to its furthest extent. The identification of the shortest variability timescales supported by the data and the actual variability structure observed in the light curves of these sources are some of the fundamental aspects being studied; answers to these questions can bring new developments in the understanding of the physics of these objects and of the mechanisms of production of VHE gamma-rays in the Universe. Some of our efforts in studying the time variability of VHE sources involve the application of dynamic programming algorithms to the problem of detecting change-points in a Poisson sequence. In this particular paper we concentrate on the more primary issue of the applicability of counting statistics to the analysis of time series in VHE gamma-ray astronomy.

  3. Studies in astronomical time series analysis: Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1979-01-01

    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.

  4. Fuzzy central tendency measure for time series variability analysis with application to fatigue electromyography signals.

    Science.gov (United States)

    Xie, Hong-Bo; Dokos, Socrates

    2013-01-01

    A new method, namely fuzzy central tendency measure (fCTM) analysis, that could enable measurement of the variability of a time series, is presented in this study. Tests on simulated data sets show that fCTM is superior to the conventional central tendency measure (CTM) in several respects, including improved relative consistency and robustness to noise. The proposed fCTM method was applied to electromyograph (EMG) signals recorded during sustained isometric contraction for tracking local muscle fatigue. The results showed that the fCTM increased significantly during the development of muscle fatigue, and it was more sensitive to the fatigue phenomenon than mean frequency (MNF), the most commonly-used muscle fatigue indicator.

  5. Nonlinear Analysis on Cross-Correlation of Financial Time Series by Continuum Percolation System

    Science.gov (United States)

    Niu, Hongli; Wang, Jun

    We establish a financial price process by continuum percolation system, in which we attribute price fluctuations to the investors’ attitudes towards the financial market, and consider the clusters in continuum percolation as the investors share the same investment opinion. We investigate the cross-correlations in two return time series, and analyze the multifractal behaviors in this relationship. Further, we study the corresponding behaviors for the real stock indexes of SSE and HSI as well as the liquid stocks pair of SPD and PAB by comparison. To quantify the multifractality in cross-correlation relationship, we employ multifractal detrended cross-correlation analysis method to perform an empirical research for the simulation data and the real markets data.

  6. Analysis of the development trend of China’s business administration based on time series

    Directory of Open Access Journals (Sweden)

    Jiang Rui

    2016-01-01

    Full Text Available On the general direction of the economic system, China is in a crucial period of establishing the modern enterprise system and reforming the macroeconomic system, and a large number of high-quality business administration talents are required for China's economy to develop stably. This paper carries out a time series analysis of the development of China's business administration major: on the whole, society currently shows an upward trend in the demand for business administration talents. With the gradually increasing demand for business administration talents, various colleges and universities have also set up business administration majors to train a large number of administration talents, leading to an upward trend in the academic focus on business administration.

  7. The Relative Importance of the Service Sector in the Mexican Economy: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Flores

    2014-01-01

    Full Text Available We conduct a study of the secondary and tertiary sectors with the goal of highlighting the relative importance of services in the Mexican economy. We consider a time series analysis approach designed to identify the stochastic nature of the series, as well as to define their long-run and short-run relationships with Gross Domestic Product (GDP). The results of cointegration tests suggest that, for the most part, activities in the secondary and tertiary sectors share a common trend with GDP. Interestingly, the long-run elasticities of GDP with respect to services are on average larger than those with respect to secondary activities. Common cycle test results identify the existence of common cycles between GDP and the disaggregated sectors, as well as with manufacturing, commerce, real estate and transportation. In this case, the short-run elasticities of secondary activities are on average larger than those corresponding to services.

  8. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows the dimensionality of the data space to be reduced while maintaining most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing a smaller number of constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
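
    The vbICA algorithm itself is not reproduced here; the sketch below only illustrates the blind source separation contrast drawn above, recovering two synthetic mixed "sources" with scikit-learn's FastICA, whereas PCA returns merely decorrelated combinations. The mixing matrix, source waveforms and noise level are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA, FastICA

        rng = np.random.default_rng(7)
        t = np.linspace(0, 10, 2000)
        s1 = np.sign(np.sin(3 * t))                        # a transient-like square-wave source
        s2 = np.sin(7 * t)                                 # a periodic (seasonal-like) source
        sources = np.column_stack([s1, s2])
        mixing = np.array([[1.0, 0.6], [0.4, 1.0]])        # "stations" observe mixtures of the sources
        observed = sources @ mixing.T + 0.05 * rng.normal(size=sources.shape)

        pca_comps = PCA(n_components=2).fit_transform(observed)              # uncorrelated only
        ica_comps = FastICA(n_components=2, random_state=0).fit_transform(observed)

        # Absolute correlation of each true source with its best-matching recovered component
        for name, comps in [("PCA", pca_comps), ("ICA", ica_comps)]:
            c = np.abs(np.corrcoef(np.column_stack([sources, comps]).T)[:2, 2:])
            print(name, "best match per source:", np.round(c.max(axis=1), 2))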

  9. Trend analysis using non-stationary time series clustering based on the finite element method

    Science.gov (United States)

    Gorji Sefidmazgi, M.; Sayemuzzaman, M.; Homaifar, A.; Jha, M. K.; Liess, S.

    2014-05-01

    In order to analyze low-frequency variability of climate, it is useful to model the climatic time series with multiple linear trends and locate the times of significant changes. In this paper, we have used non-stationary time series clustering to find change points in the trends. Clustering in a multi-dimensional non-stationary time series is challenging, since the problem is mathematically ill-posed. Clustering based on the finite element method (FEM) is one of the methods that can analyze multidimensional time series. One important attribute of this method is that it is not dependent on any statistical assumption and does not need local stationarity in the time series. In this paper, it is shown how the FEM-clustering method can be used to locate change points in the trend of temperature time series from in situ observations. This method is applied to the temperature time series of North Carolina (NC) and the results represent region-specific climate variability despite higher frequency harmonics in climatic time series. Next, we investigated the relationship between the climatic indices with the clusters/trends detected based on this clustering method. It appears that the natural variability of climate change in NC during 1950-2009 can be explained mostly by AMO and solar activity.

  10. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with

  11. Time series analysis of the developed financial markets' integration using visibility graphs

    Science.gov (United States)

    Zhuang, Enyu; Small, Michael; Feng, Gang

    2014-09-01

    A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For large cycles, how historical incidents significantly affected market integration is distinguished by density and compactness of the corresponding communities.
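
    A minimal construction of a natural visibility graph from one window of a synthetic series is sketched below: each sample is a node, and two samples are linked when no intermediate sample rises above the straight line joining them. The random-walk segment, its length and the use of networkx are assumptions for illustration.

        import numpy as np
        import networkx as nx

        def visibility_graph(series):
            """Natural visibility graph of a 1-D series (Lacasa-style visibility criterion)."""
            g = nx.Graph()
            n = len(series)
            g.add_nodes_from(range(n))
            for a in range(n):
                for b in range(a + 1, n):
                    ya, yb = series[a], series[b]
                    visible = all(series[c] < yb + (ya - yb) * (b - c) / (b - a)
                                  for c in range(a + 1, b))
                    if visible:
                        g.add_edge(a, b)
            return g

        rng = np.random.default_rng(8)
        window = rng.normal(size=60).cumsum()          # one "window" of an integrated series
        g = visibility_graph(window)
        degrees = np.array([d for _, d in g.degree()])
        print("mean node degree:", degrees.mean(), " max node degree:", degrees.max())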

  12. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care : A Proof-of-Principle Study

    NARCIS (Netherlands)

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    BACKGROUND: Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However,

  13. A knowledge translation tool improved osteoporosis disease management in primary care: an interrupted time series analysis.

    Science.gov (United States)

    Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E

    2014-09-25

    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada representing 5 family physicians with 2840 age eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices, and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.
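
    As a sketch of the segmented (interrupted time series) regression mentioned among the analyses, the code below fits level-change and slope-change terms around a hypothetical implementation point on simulated assessment data; the outcome series, number of time points and effect sizes are assumptions, not the study's data.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        weeks = np.arange(52)                     # evenly spaced assessment points
        t0 = 26                                   # hypothetical implementation point
        post = (weeks >= t0).astype(float)        # level-change indicator
        time_after = post * (weeks - t0)          # slope-change term

        # Simulated screening rate: modest baseline trend, then a jump and a steeper post slope
        rate = 10 + 0.05 * weeks + 3.0 * post + 0.2 * time_after + rng.normal(0, 1, weeks.size)

        X = sm.add_constant(np.column_stack([weeks, post, time_after]))
        fit = sm.OLS(rate, X).fit()
        print("coefficients [const, baseline slope, level change, slope change]:", fit.params)
        print("p-values:", fit.pvalues)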

  14. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    Science.gov (United States)

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA-models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a 2nd step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, where patient's alliance and self-efficacy were linked by a temporal feedback-loop. Furthermore, therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
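
    A toy version of the VAR step that underlies TSPA is sketched below: a first-order vector autoregression is fitted to two simulated session-to-session variables with a built-in feedback loop, and the lagged cross-coefficients play the role of the temporal associations. The variable names, coupling strengths and series length are assumptions.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(10)
        n = 40                                            # e.g. 40 post-session questionnaires
        alliance = np.zeros(n)
        self_eff = np.zeros(n)
        for t in range(1, n):                             # simple mutual feedback between the two
            alliance[t] = 0.5 * alliance[t - 1] + 0.3 * self_eff[t - 1] + rng.normal(0, 0.5)
            self_eff[t] = 0.4 * self_eff[t - 1] + 0.3 * alliance[t - 1] + rng.normal(0, 0.5)

        data = pd.DataFrame({"alliance": alliance, "self_efficacy": self_eff})
        res = VAR(data).fit(1)                            # individual-level VAR(1) model
        print(res.params)                                 # lagged cross-effects = temporal associations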

  15. Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy

    Science.gov (United States)

    Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio

    2017-09-01

    Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.

  16. Learning from environmental data: Methods for analysis of forest nutrition time series

    Energy Technology Data Exchange (ETDEWEB)

    Sulkava, M. (Helsinki Univ. of Technology, Espoo (Finland). Computer and Information Science)

    2008-07-01

    Data analysis methods play an important role in increasing our knowledge of the environment as the amount of data measured from the environment increases. This thesis fits under the scope of environmental informatics and environmental statistics. They are fields, in which data analysis methods are developed and applied for the analysis of environmental data. The environmental data studied in this thesis are time series of nutrient concentration measurements of pine and spruce needles. In addition, there are data of laboratory quality and related environmental factors, such as the weather and atmospheric depositions. The most important methods used for the analysis of the data are based on the self-organizing map and linear regression models. First, a new clustering algorithm of the self-organizing map is proposed. It is found to provide better results than two other methods for clustering of the self-organizing map. The algorithm is used to divide the nutrient concentration data into clusters, and the result is evaluated by environmental scientists. Based on the clustering, the temporal development of the forest nutrition is modeled and the effect of nitrogen and sulfur deposition on the foliar mineral composition is assessed. Second, regression models are used for studying how much environmental factors and properties of the needles affect the changes in the nutrient concentrations of the needles between their first and second year of existence. The aim is to build understandable models with good prediction capabilities. Sparse regression models are found to outperform more traditional regression models in this task. Third, fusion of laboratory quality data from different sources is performed to estimate the precisions of the analytical methods. Weighted regression models are used to quantify how much the precision of observations can affect the time needed to detect a trend in environmental time series. The results of power analysis show that improving the

  17. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users:
    - JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS.
    - JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies.
    - ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused.
    The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  18. Time Series Analysis OF SAR Image Fractal Maps: The Somma-Vesuvio Volcanic Complex Case Study

    Science.gov (United States)

    Pepe, Antonio; De Luca, Claudio; Di Martino, Gerardo; Iodice, Antonio; Manzo, Mariarosaria; Pepe, Susi; Riccio, Daniele; Ruello, Giuseppe; Sansosti, Eugenio; Zinno, Ivana

    2016-04-01

    The fractal dimension is a significant geophysical parameter describing natural surfaces, representing the distribution of roughness over different spatial scales; in the case of volcanic structures, it has been related to the specific nature of materials and to the effects of active geodynamic processes. In this work, we present the analysis of the temporal behavior of fractal dimension estimates generated from multi-pass SAR images relevant to the Somma-Vesuvio volcanic complex (South Italy). To this aim, we consider a Cosmo-SkyMed data-set of 42 stripmap images acquired from ascending orbits between October 2009 and December 2012. Starting from these images, we generate a three-dimensional stack composed of the corresponding fractal maps (ordered according to the acquisition dates), after a proper co-registration. The time series of the pixel-by-pixel estimated fractal dimension values show that, over invariant natural areas, the fractal dimension does not reveal significant changes; on the contrary, over urban areas, it correctly assumes values outside the fractality range of natural surfaces and shows strong fluctuations. As a final result of our analysis, we generate a fractal map that includes only the areas where the fractal dimension is considered reliable and stable (i.e., whose standard deviation computed over the time series is reasonably small). The so-obtained fractal dimension map is then used to identify areas that are homogeneous from a fractal viewpoint. Indeed, the analysis of this map reveals the presence of two distinctive landscape units corresponding to Mt. Vesuvio and Gran Cono. The comparison with the (simplified) geological map clearly shows the presence in these two areas of volcanic products of different age. The presented fractal dimension map analysis demonstrates the ability to obtain an indication of the degree of evolution of the monitored volcanic edifice and can be profitably extended in the future to other volcanic systems with

  19. Nonlinear Analysis of Time Series in Genome-Wide Linkage Disequilibrium Data

    Science.gov (United States)

    Hernández-Lemus, Enrique; Estrada-Gil, Jesús K.; Silva-Zolezzi, Irma; Fernández-López, J. Carlos; Hidalgo-Miranda, Alfredo; Jiménez-Sánchez, Gerardo

    2008-02-01

    The statistical study of large scale genomic data has turned out to be a very important tool in population genetics. Quantitative methods are essential to understand and implement association studies in the biomedical and health sciences. Nevertheless, the characterization of recently admixed populations has been an elusive problem due to the presence of a number of complex phenomena. For example, linkage disequilibrium structures are thought to be more complex than their non-recently admixed population counterparts, presenting the so-called ancestry blocks, admixed regions that are not yet smoothed by the effect of genetic recombination. In order to distinguish characteristic features for various populations we have implemented several methods, some of them borrowed or adapted from the analysis of nonlinear time series in statistical physics and quantitative physiology. We calculate the main fractal dimensions (Kolmogorov's capacity, information dimension and correlation dimension, usually named, D0, D1 and D2). We also have made detrended fluctuation analysis and information based similarity index calculations for the probability distribution of correlations of linkage disequilibrium coefficient of six recently admixed (mestizo) populations within the Mexican Genome Diversity Project [1] and for the non-recently admixed populations in the International HapMap Project [2]. Nonlinear correlations showed up as a consequence of internal structure within the haplotype distributions. The analysis of these correlations as well as the scope and limitations of these procedures within the biomedical sciences are discussed.
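
    Among the measures listed, detrended fluctuation analysis is straightforward to sketch; the implementation below recovers the expected scaling exponents for white noise (about 0.5) and for an integrated random walk (about 1.5). The scale range and the test series are assumptions.

        import numpy as np

        def dfa_exponent(x, scales=None):
            """Detrended fluctuation analysis: slope of log F(s) versus log s."""
            x = np.asarray(x, dtype=float)
            profile = np.cumsum(x - x.mean())                     # integrated, mean-removed series
            if scales is None:
                scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 15).astype(int))
            flucts = []
            for s in scales:
                n_seg = len(profile) // s
                segs = profile[: n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
                       for seg in segs]                           # linear detrending per window
                flucts.append(np.mean(rms))
            return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

        rng = np.random.default_rng(11)
        print("alpha, white noise (expected ~0.5):", round(dfa_exponent(rng.normal(size=4000)), 2))
        print("alpha, random walk (expected ~1.5):",
              round(dfa_exponent(np.cumsum(rng.normal(size=4000))), 2))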

  20. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    Science.gov (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  1. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  2. Chaos analysis of the electrical signal time series evoked by acupuncture

    International Nuclear Information System (INIS)

    Wang Jiang; Sun Li; Fei Xiangyang; Zhu Bing

    2007-01-01

    This paper employs chaos theory to analyze the time series of electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between various acupuncture methods are discussed.

  3. Chaos analysis of the electrical signal time series evoked by acupuncture

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China)]. E-mail: jiangwang@tju.edu.cn; Sun Li [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Fei Xiangyang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Zhu Bing [Institute of Acupuncture and Moxibustion, China Academy of Traditional Chinese Medicine, Beijing 100700 (China)

    2007-08-15

    This paper employs chaos theory to analyze the time series of electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between various acupuncture methods are discussed.
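
    The phase-space reconstruction step mentioned in these abstracts can be sketched as a simple time-delay embedding; in practice the delay and dimension would come from mutual information and Cao's method, whereas the fixed values below, like the synthetic signal, are assumptions for illustration.

        import numpy as np

        def delay_embed(x, dim, tau):
            """Time-delay (Takens) embedding of a scalar series into a dim-dimensional phase space."""
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

        rng = np.random.default_rng(12)
        t = np.arange(5000) * 0.01
        signal = np.sin(t) * np.sin(2.1 * t) + 0.02 * rng.normal(size=t.size)   # stand-in signal

        points = delay_embed(signal, dim=3, tau=25)     # dim and tau fixed here for illustration
        print("reconstructed phase-space cloud:", points.shape)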

  4. Self-potential time series analysis in a seismic area of the Southern Apennines: preliminary results

    OpenAIRE

    Di Bello, G.; Lapenna, V.; Satriano, C.; Tramutoli, V.

    1994-01-01

    The self-potential time series recorded during the period May 1991 - August 1992 by an automatic station, located in a seismic area of the Southern Apennines, is analyzed. We deal with the spectral and the statistical features of the electrotelluric precursors: they can play a major role in the approach to seismic prediction. The time-dynamics of the experimental time series is investigated, and the cyclic components and the time trends are removed. In particular we consider the influence of external...

  5. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis.

    Science.gov (United States)

    Astola, Laura; Molenaar, Jaap

    2014-07-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.

  6. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis

    Directory of Open Access Journals (Sweden)

    Laura Astola

    2014-07-01

    Full Text Available Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.
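
    For reference, the quantile normalization baseline that the abstract argues against can be written in a few lines of numpy, as sketched below on simulated expression data; the array sizes and the injected scale artefact are assumptions.

        import numpy as np

        def quantile_normalize(x):
            """Classic quantile normalization: every column is forced onto the same
            empirical distribution, namely the mean of the sorted columns."""
            x = np.asarray(x, dtype=float)                # shape (genes, arrays)
            ranks = np.argsort(np.argsort(x, axis=0), axis=0)
            mean_sorted = np.sort(x, axis=0).mean(axis=1)
            return mean_sorted[ranks]

        rng = np.random.default_rng(13)
        expr = rng.lognormal(mean=2.0, sigma=1.0, size=(500, 6))   # 500 genes, 6 time points
        expr[:, 3] *= 1.8                                          # one array with a scale artefact
        normalized = quantile_normalize(expr)
        print("column means before:", np.round(expr.mean(axis=0), 1))
        print("column means after: ", np.round(normalized.mean(axis=0), 1))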

  7. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
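
    A minimal numerical sketch of the closed-form estimator quoted in this record, computed from sample covariances with an Euler-forward difference; the coupled autoregressive test series and all coefficients are illustrative, not from the paper.

```python
import numpy as np

def liang_causality(x1, x2, dt=1.0):
    """Estimate T_{2->1}, the information flow from series x2 to series x1 (linear formula)."""
    dx1 = (x1[1:] - x1[:-1]) / dt            # Euler-forward approximation of dX1/dt
    a, b = x1[:-1], x2[:-1]
    C = np.cov(a, b)
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1 = np.cov(a, dx1)[0, 1]              # Cov(X1, dX1/dt)
    c2d1 = np.cov(b, dx1)[0, 1]              # Cov(X2, dX1/dt)
    num = c11 * c12 * c2d1 - c12**2 * c1d1
    den = c11**2 * c22 - c11 * c12**2
    return num / den

# One-way coupled AR(1) processes: x2 drives x1, not the other way round.
rng = np.random.default_rng(0)
n = 5000
x1, x2 = np.zeros(n), np.zeros(n)
for i in range(1, n):
    x2[i] = 0.9 * x2[i - 1] + rng.standard_normal()
    x1[i] = 0.5 * x1[i - 1] + 0.6 * x2[i - 1] + rng.standard_normal()
print("T(2->1) =", liang_causality(x1, x2), " T(1->2) =", liang_causality(x2, x1))
```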

  8. The investigation of Martian dune fields using very high resolution photogrammetric measurements and time series analysis

    Science.gov (United States)

    Kim, J.; Park, M.; Baik, H. S.; Choi, Y.

    2016-12-01

    At the present time, arguments continue regarding the migration speeds of Martian dune fields and their correlation with atmospheric circulation. However, precise measurement of the spatial translation of Martian dunes has rarely been conducted, and only a very few times. Therefore, we developed a generic procedure to precisely measure the migration of dune fields with recently introduced 25-cm resolution High Resolution Imaging Science Experiment (HIRISE) images, employing a high-accuracy photogrammetric processor and a sub-pixel image correlator. The processor was designed to trace estimated dune migration, albeit slight, over the Martian surface by 1) the introduction of very high resolution ortho images and stereo analysis based on hierarchical geodetic control for better initial point settings; 2) positioning error removal throughout the sensor model refinement with a non-rigorous bundle block adjustment, which makes possible the co-alignment of all images in a time series; and 3) improved sub-pixel co-registration algorithms using optical flow with a refinement stage conducted on a pyramidal grid processor and a blunder classifier. Moreover, volumetric changes of Martian dunes were additionally traced by means of stereo analysis and photoclinometry. The established algorithms have been tested using high-resolution HIRISE images over a large number of Martian dune fields covering the whole Mars Global Dune Database. Migrations over well-known crater dune fields appeared to be almost static over considerable temporal periods and were weakly correlated with wind directions estimated by the Mars Climate Database (Millour et al. 2015). Only over a few Martian dune fields, such as Kaiser crater, have meaningful migration speeds (>1 m/year) relative to the photogrammetric error residual been measured. Currently, a technically improved processor that compensates for the error residual using time series observations is under development and is expected to produce long-term migration speeds over Martian dune

  9. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes such as the heart rate generator can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further we conclude that the presence of respiratory coupling can only be detected under low noise conditions and if a data-adaptive respiratory band is used
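
    A minimal sketch of one step discussed in this record, spectral analysis of an unevenly sampled beat-to-beat series with the Lomb periodogram rather than interpolation onto a regular grid; the synthetic RR intervals, the assumed respiratory frequency and the frequency grid are illustrative.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(1)
n_beats = 600
mean_rr = 0.45                                   # mean beat interval in seconds (~133 bpm)
beat_clock = np.cumsum(np.full(n_beats, mean_rr))
resp_freq = 0.9                                  # assumed respiratory modulation, Hz
rr = mean_rr + 0.02 * np.sin(2 * np.pi * resp_freq * beat_clock) \
     + 0.005 * rng.standard_normal(n_beats)      # respiratory sinus arrhythmia + noise
beat_times = np.cumsum(rr)                       # uneven sampling instants (the beats)
heart_rate = 60.0 / rr                           # signal value attached to each beat

# Evaluate the Lomb periodogram below ~half the mean beat rate (the effective Nyquist limit).
freqs_hz = np.linspace(0.05, 1.0, 400)
power = lombscargle(beat_times, heart_rate - heart_rate.mean(), 2 * np.pi * freqs_hz)
print("strongest peak at %.2f Hz" % freqs_hz[np.argmax(power)])
```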

  10. Variability of African Farming Systems from Phenological Analysis of NDVI Time Series

    Science.gov (United States)

    Vrieling, Anton; deBeurs, K. M.; Brown, Molly E.

    2011-01-01

    Food security exists when people have access to sufficient, safe and nutritious food at all times to meet their dietary needs. The natural resource base is one of the many factors affecting food security. Its variability and decline create problems for local food production. In this study we characterize vegetation phenology for sub-Saharan Africa and assess variability and trends of phenological indicators based on NDVI time series from 1982 to 2006. We focus on cumulated NDVI over the season (cumNDVI), which is a proxy for net primary productivity. Results are aggregated at the level of major farming systems, while also determining spatial variability within farming systems. High temporal variability of cumNDVI occurs in semiarid and subhumid regions. The results show a large area of positive cumNDVI trends between Senegal and South Sudan. These correspond to positive CRU rainfall trends and relate to recovery after the 1980s droughts. We find significant negative cumNDVI trends near the south coast of West Africa (Guinea coast) and in Tanzania. For each farming system, causes of change and variability are discussed based on available literature (Appendix A). Although food security comprises more than the local natural resource base, our results can provide an input for food security analysis by identifying zones of high variability or downward trends. Farming systems are found to be a useful level of analysis. Diversity and trends found within farming system boundaries underline that farming systems are dynamic.

  11. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach

    Directory of Open Access Journals (Sweden)

    Martin M Monti

    2011-03-01

    Full Text Available Functional Magnetic Resonance Imaging (fMRI is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a General Linear Model (GLM approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making.

  12. A Time Series Analysis Using R for Understanding Car Sales On The Romanian Market

    Directory of Open Access Journals (Sweden)

    Mihaela Cornelia Sandu

    2015-09-01

    Full Text Available The size of the Romanian automobile industry is relatively small compared to the main car producers in Europe and the world, but an analysis of its structure and dynamics appears to be most relevant given the strong linkages with the main macroeconomic indicators and important microeconomic variables at the level of the household. The paper presents a time series analysis of car sales in Romania in the period 2007-2014, focusing on the sales dynamics of the main national producer, Dacia Pitesti. The aim of the investigation is twofold: to test the impact of macroeconomic variables on this important and underexplored segment of the economy and to emphasize potential differences between the factors influencing the buying decision for domestic versus foreign cars (observed in three regimes: new, registered and reenrolled). While the major influence of the global economic crisis cannot be ignored for the analyzed interval, we believe that it may also help to illustrate the real behaviors of individuals by setting the line between the immediate period after the crisis, treated as a period of scarcity conditions, and the re-installment of normality towards the second half of the time interval. The results confirm the general findings of the literature for the main indicators, but they are not entirely consistent with rational economic models, especially with regard to the nature of the investigated goods (the cars – normal or positional).

  13. Interrupted time-series analysis of regulations to reduce paracetamol (acetaminophen poisoning.

    Directory of Open Access Journals (Sweden)

    Oliver W Morgan

    2007-04-01

    Full Text Available Paracetamol (acetaminophen) poisoning is the leading cause of acute liver failure in Great Britain and the United States. Successful interventions to reduce harm from paracetamol poisoning are needed. To achieve this, the government of the United Kingdom introduced legislation in 1998 limiting the pack size of paracetamol sold in shops. Several studies have reported recent decreases in fatal poisonings involving paracetamol. We use interrupted time-series analysis to evaluate whether the recent fall in the number of paracetamol deaths is different to trends in fatal poisoning involving aspirin, paracetamol compounds, antidepressants, or nondrug poisoning suicide. We calculated directly age-standardised mortality rates for paracetamol poisoning in England and Wales from 1993 to 2004. We used an ordinary least-squares regression model divided into pre- and postintervention segments at 1999. The model included a term for autocorrelation within the time series. We tested for changes in the level and slope between the pre- and postintervention segments. To assess whether observed changes in the time series were unique to paracetamol, we compared against poisoning deaths involving compound paracetamol (not covered by the regulations), aspirin, antidepressants, and nonpoisoning suicide deaths. We did this comparison by calculating a ratio of each comparison series with paracetamol and applying a segmented regression model to the ratios. No change in the ratio level or slope indicated no difference compared to the control series. There were about 2,200 deaths involving paracetamol. The age-standardised mortality rate rose from 8.1 per million in 1993 to 8.8 per million in 1997, subsequently falling to about 5.3 per million in 2004. After the regulations were introduced, deaths dropped by 2.69 per million (p = 0.003). Trends in the age-standardised mortality rate for paracetamol compounds, aspirin, and antidepressants were broadly similar to paracetamol
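
    A minimal sketch of the single-series segmented regression at the core of this record: pre-intervention level and trend plus post-intervention changes in level and slope, with autocorrelation handled here through robust (HAC) standard errors rather than an explicit autocorrelation term; the simulated rates, the number of periods and the change-point are illustrative, and the ratio-based comparison against control series is not reproduced.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, t0 = 48, 24                                   # 48 time periods, intervention at t0
t = np.arange(n)
post = (t >= t0).astype(float)
# Simulated outcome: baseline level and trend, then a level drop and a slope change.
y = 10 + 0.05 * t - 2.0 * post - 0.08 * (t - t0) * post + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([t, post, (t - t0) * post]))
fit = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})  # autocorrelation-robust SEs
print(fit.params)   # [baseline level, baseline trend, level change, slope change]
```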

  14. Risk assessment of environmentally influenced airway diseases based on time-series analysis.

    Science.gov (United States)

    Herbarth, O

    1995-09-01

    Threshold values are of prime importance in providing a sound basis for public health decisions. A key issue is determining threshold or maximum exposure values for pollutants and assessing their potential health risks. Environmental epidemiology could be instrumental in assessing these levels, especially since the assessment of ambient exposures involves relatively low concentrations of pollutants. This paper presents a statistical method that allows the determination of threshold values as well as the assessment of the associated risk using a retrospective, longitudinal study design with a prospective follow-up. Morbidity data were analyzed using the Fourier method, a time-series analysis that is based on the assumption of a high temporal resolution of the data. This method eliminates time-dependent responses like temporal inhomogeneity and pseudocorrelation. The frequency of calls for respiratory distress conditions to the regional Mobile Medical Emergency Service (MMES) in the city of Leipzig was investigated. The entire population of Leipzig served as a pool for data collection. In addition to the collection of morbidity data, air pollution measurements were taken every 30 min for the entire study period using sulfur dioxide as the regional indicator variable. This approach allowed the calculation of a dose-response curve for respiratory diseases and air pollution indices in children and adults. Significantly higher morbidities were observed above a 24-hr mean value of 0.6 mg SO2/m3 air for children and 0.8 mg SO2/m3 for adults.(ABSTRACT TRUNCATED AT 250 WORDS)

  15. On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy.

    Science.gov (United States)

    Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki

    2017-08-01

    In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due to only the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also hold in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario will be revisited too because it is relevant for the applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.

  16. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang

    2016-01-01

    Full Text Available The rapid industrial development has led to intermittent outbreaks of PM2.5, or haze, in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next day's Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches up to 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability on the accuracy of the next 3–7 days' AQI prediction.
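
    A minimal sketch of the reduced model form named in this record, a vector autoregression fitted to daily series and used for a one-week forecast; the AQI/wind toy data, the lag selection and the variable set are illustrative and not the paper's specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
n = 400
wind = rng.normal(3, 1, n)
aqi = np.empty(n)
aqi[0] = 100
for i in range(1, n):                            # AQI persists day to day and drops with wind
    aqi[i] = 0.8 * aqi[i - 1] - 5 * wind[i] + rng.normal(0, 10) + 30

data = pd.DataFrame({"aqi": aqi, "wind": wind})
model = VAR(data)
fit = model.fit(maxlags=7, ic="aic")             # choose the lag order by AIC
forecast = fit.forecast(data.values[-fit.k_ar:], steps=7)
print(forecast[:, 0])                            # next week's AQI forecast
```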

  17. Mapping Mountain Pine Beetle Mortality through Growth Trend Analysis of Time-Series Landsat Data

    Directory of Open Access Journals (Sweden)

    Lu Liang

    2014-06-01

    Full Text Available Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and attributing changes quantitatively are two main challenges in its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures the disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, 86.74% to 94.00% overall accuracy was achieved with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that the advantages of this time series workflow lie in its small training-sample-size requirement. The easily understandable, interpretable and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.

  18. Mental health impacts of flooding: a controlled interrupted time series analysis of prescribing data in England.

    Science.gov (United States)

    Milojevic, Ai; Armstrong, Ben; Wilkinson, Paul

    2017-10-01

    There is emerging evidence that people affected by flooding suffer adverse impacts on their mental well-being, mostly based on self-reports. We examined prescription records for drugs used in the management of common mental disorder among primary care practices located in the vicinity of recent large flood events in England, 2011-2014. A controlled interrupted time series analysis was conducted of the number of prescribing items for antidepressant drugs in the year before and after the flood onset. Pre-post changes were compared by distance of the practice from the inundated boundaries among 930 practices located within 10 km of a flood. After control for deprivation and population density, there was an increase of 0.59% (95% CI 0.24 to 0.94) prescriptions in the postflood year among practices located within 1 km of a flood over and above the change observed in the furthest distance band. The increase was greater in more deprived areas. This study suggests an increase in prescribed antidepressant drugs in the year after flooding in primary care practices close to recent major floods in England. The degree to which the increase is actually concentrated in those flooded can only be determined by more detailed linkage studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Economic feasibility of biogas production in swine farms using time series analysis

    Directory of Open Access Journals (Sweden)

    Felipe Luis Rockenbach

    2016-07-01

    Full Text Available ABSTRACT: This study aimed to measure the economic feasibility of, and the time needed to return the capital invested in, the installation of a swine manure treatment system, where the returns originate from the sale of carbon credits and/or the compensation of electric energy in swine farms, using Box-Jenkins forecast models. It was found that the use of biogas is a viable option on a large scale with machines that operate daily for 10 h or more, with a payback period of between 70 and 80 months. Time series analysis models are important to anticipate the behavior of the series under study, providing the swine breeder/investor with means to reduce the financial investment risk as well as helping to decrease production costs. Moreover, this process can be seen as another source of income and enables the breeder to be self-sufficient in the continuous supply of electric energy, which is very valuable nowadays considering that breeders are increasingly using various technologies.

  20. ANALYSIS OF TIME SERIES FOR THE CURRENCY PAIR CROATIAN KUNA / EURO

    Directory of Open Access Journals (Sweden)

    Marko Martinović

    2017-01-01

    Full Text Available The domestic currency Croatian kuna (HRK) was introduced in May 1995. To date, the Croatian National Bank (HNB), as a regulator and formulator of monetary policy in Croatia, has operated a policy of a stable exchange rate, typically referenced to the formal currency of the European Union, the euro (EUR). From the date of introduction of the euro on 01/01/1999 until 01/01/2016 the value of the currency pair HRK/EUR changed by only 4.25% (HNB). Although the value of the Croatian kuna is relatively stable, there are some fluctuations on an annual level (e.g. in the last few years because of the global crisis) as well as on periodic levels within a year. The aim of this paper is to show the movement of the value of the currency pair from the beginning of 2002 to the present day (the time curve), and to analyze its correctness, trends and periodicity (seasonal behavior), if any exist. The research will be done using the method of Time Series Analysis, assuming that the external (global economy) and internal (economic policy) factors remain similar or the same. According to the results, a further assessment of price developments in the period that follows will be made using the obtained predictive models. In the event that the curve contains a component of periodicity, the observed patterns will be studied further.

  1. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis

    Science.gov (United States)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

    The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe put it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by Time Series Analysis. A selection of univariate forecasting models has been used, namely Triple Exponential Smoothing Model, Grey Model and Linear Regression Model. By utilizing the above three models and their combination, the forecast of container throughput through the Doraleh port was realized. A comparison of the different forecasting results of the three models, in addition to the combination forecast is then undertaken, based on commonly used evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression forecasting Model was the best prediction method for forecasting the container throughput, since its forecast error was the least. Based on the regression model, a ten (10) year forecast for container throughput at DCT has been made.
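
    A minimal sketch of the model-comparison step this record describes: fitting an exponential smoothing model with additive trend (the seasonal component is omitted for the short synthetic series, so this is Holt's method rather than full triple smoothing) and a linear trend regression to a training window, then scoring hold-out forecasts by MAPE. The throughput figures are invented and the grey model GM(1,1) is not reproduced.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical annual container throughput (thousand TEU), last 3 years held out for testing.
throughput = np.array([220, 260, 310, 350, 400, 470, 520, 600, 650, 710], float)
train, test = throughput[:-3], throughput[-3:]

hw = ExponentialSmoothing(train, trend="add").fit()
hw_pred = hw.forecast(3)

t = np.arange(len(train))
slope, intercept = np.polyfit(t, train, 1)
reg_pred = intercept + slope * np.arange(len(train), len(train) + 3)

mape = lambda y, yhat: np.mean(np.abs((y - yhat) / y)) * 100
print("HW MAPE %.1f%%, regression MAPE %.1f%%" % (mape(test, hw_pred), mape(test, reg_pred)))
```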

  2. A population based time series analysis of asthma hospitalisations in Ontario, Canada: 1988 to 2000

    Directory of Open Access Journals (Sweden)

    Upshur Ross EG

    2001-08-01

    Full Text Available Abstract Background Asthma is a common yet incompletely understood health problem associated with a high morbidity burden. A wide variety of seasonally variable environmental stimuli such as viruses and air pollution are believed to influence asthma morbidity. This study set out to examine the seasonal patterns of asthma hospitalisations in relation to age and gender for the province of Ontario over a period of 12 years. Methods A retrospective, population-based study design was used to assess temporal patterns in hospitalisations for asthma from April 1, 1988 to March 31, 2000. Approximately 14 million residents of Ontario eligible for universal healthcare coverage during this time were included for analysis. Time series analyses were conducted on monthly aggregations of hospitalisations. Results There is strong evidence of an autumn peak and summer trough seasonal pattern occurring every year over the 12-year period (Fisher-Kappa (FK = 23.93, p > 0.01; Bartlett Kolmogorov Smirnov (BKS = 0.459, p Conclusions A clear and consistent seasonal pattern was observed in this study for asthma hospitalisations. These findings have important implications for the development of effective management and prevention strategies.

  3. Analysis of cyclical behavior in time series of stock market returns

    Science.gov (United States)

    Stratimirović, Djordje; Sarvan, Darko; Miljković, Vladimir; Blesić, Suzana

    2018-01-01

    In this paper we have analyzed scaling properties and cyclical behavior of three types of stock market index (SMI) time series: data belonging to stock markets of developed economies, emerging economies, and of the underdeveloped or transitional economies. We have used two techniques of data analysis to obtain and verify our findings: the wavelet transform (WT) spectral analysis to identify cycles in the SMI returns data, and the time-dependent detrended moving average (tdDMA) analysis to investigate local behavior around market cycles and trends. We found cyclical behavior in all SMI data sets that we have analyzed. Moreover, the positions and the boundaries of cyclical intervals that we found seem to be common for all markets in our dataset. We list and illustrate the presence of nine such periods in our SMI data. We report on the possibilities to differentiate between the level of growth of the analyzed markets by way of statistical analysis of the properties of wavelet spectra that characterize particular peak behaviors. Our results show that measures like the relative WT energy content and the relative WT amplitude of the peaks in the small scales region could be used to partially differentiate between market economies. Finally, we propose a way to quantify the level of development of a stock market based on estimation of the local complexity of the market's SMI series. From the local scaling exponents calculated for our nine peak regions we have defined what we named the Development Index, which proved, at least in the case of our dataset, to be suitable to rank the SMI series that we have analyzed in three distinct groups.

  4. [Time-series analysis on effect of air pollution on stroke mortality in Tianjin, China].

    Science.gov (United States)

    Wang, De-zheng; Gu, Qing; Jiang, Guo-hong; Yang, De-yi; Zhang, Hui; Song, Gui-de; Zhang, Ying

    2012-12-01

    To investigate the effect of air pollution on stroke mortality in Tianjin, China, and to provide a basis for stroke control and prevention. Total mortality surveillance data were collected by the Tianjin Centers for Disease Control and Prevention. Meteorological data and atmospheric pollution data were from the Tianjin Meteorological Bureau and the Tianjin Environmental Monitoring Center, respectively. A generalized additive Poisson regression model was used in a time-series analysis of the relationship between air pollution and stroke mortality in Tianjin. Single-pollutant analysis and multi-pollutant analysis were performed after adjustment for confounding factors such as meteorological factors, long-term trend of death, "days of the week" effect and population. The crude death rate of stroke in Tianjin rose from 136.67/100000 in 2001 to 160.01/100000 in 2009, an escalating trend (P = 0.000), while the standardized mortality ratio of stroke in Tianjin fell from 138.36 to 99.14/100000, a declining trend (P = 0.000). An increase of 10 µg/m³ in the daily average concentrations of atmospheric SO₂, NO₂ and PM₁₀ corresponded to relative risks of stroke mortality of 1.0105 (95%CI: 1.0060 ∼ 1.0153), 1.0197 (95%CI: 1.0149 ∼ 1.0246) and 1.0064 (95%CI: 1.0052 ∼ 1.0077), respectively. The SO₂ effect peaked after a 1-day exposure lag, while the NO₂ and PM₁₀ effects peaked within 1 day. Air pollution in Tianjin may increase the risk of stroke mortality in the population and induce acute onset of stroke. It is necessary to carry out air pollution control and allocate health resources rationally to reduce the hazard of stroke mortality.
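
    A minimal sketch in the spirit of the model this record describes: a quasi-Poisson regression of daily deaths on a multi-day (lag03) pollutant average, with harmonic terms standing in for the smooth seasonal control of a generalized additive model; the simulated data, coefficients and confounder set are illustrative only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 1500
day = np.arange(n)
so2 = 20 + 10 * np.sin(2 * np.pi * day / 365) + rng.gamma(2, 3, n)       # daily SO2, ug/m3
season = np.column_stack([np.sin(2 * np.pi * k * day / 365) for k in (1, 2)] +
                         [np.cos(2 * np.pi * k * day / 365) for k in (1, 2)])
lam = np.exp(1.5 + 0.004 * so2 + 0.1 * season[:, 0])                     # true daily death rate
deaths = rng.poisson(lam)

so2_lag03 = pd.Series(so2).rolling(4).mean().to_numpy()                  # average of lags 0-3
keep = ~np.isnan(so2_lag03)
X = sm.add_constant(np.column_stack([so2_lag03[keep], season[keep]]))
fit = sm.GLM(deaths[keep], X, family=sm.families.Poisson()).fit(scale="X2")  # quasi-Poisson scale
beta = fit.params[1]
print("percent change in deaths per 10 ug/m3: %.2f" % (100 * (np.exp(10 * beta) - 1)))
```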

  5. ENTREPRENEURIAL ACTIVITY IN ROMANIA – A TIME SERIES CLUSTERING ANALYSIS AT THE NUTS3 LEVEL

    Directory of Open Access Journals (Sweden)

    Sipos-Gug Sebastian

    2013-07-01

    Full Text Available Entrepreneurship is an active field of research, having seen a major increase in interest and publication levels in recent years (Landström et al., 2012). Within this field there has recently been an increasing interest in understanding why some regions seem to have significantly higher entrepreneurship activity compared to others. In line with this research field, we investigate the differences in entrepreneurial activity among the Romanian counties (NUTS 3 regions). While the classical research paradigm in this field is to conduct a temporally stationary analysis, we choose to use a time series clustering analysis to better understand the dynamics of entrepreneurial activity between counties. Our analysis showed that if we use the total number of new privately owned companies founded each year in the last decade (2002-2012), we can distinguish 5 clusters: one with high total entrepreneurial activity (18 counties), one with above-average activity (8 counties), two clusters with average and slightly below-average activity (a total of 18 counties) and one cluster with low and declining activity (2 counties). If we are interested in the entrepreneurial activity rate, that is, the number of new privately owned companies founded each year adjusted by the population of the respective county, we obtain 4 clusters: one with a very high entrepreneurial rate (1 county), one with an average rate (10 counties), and two clusters with a below-average entrepreneurial rate (a total of 31 counties). In conclusion, our research shows that Romania is far from being a homogeneous geographical area with respect to entrepreneurial activity. Depending on what we are interested in, it can be divided into 5 or 4 clusters of counties, which behave differently as a function of time. Further research should be focused on explaining these regional differences, on studying the high-performance clusters and on trying to improve the low-performing ones.

  6. Development of analysis software for radiation time-series data with the use of visual studio 2005

    International Nuclear Information System (INIS)

    Hohara, Sin-ya; Horiguchi, Tetsuo; Ito, Shin

    2008-01-01

    Time-series analysis offers a perspective that conventional analysis methods such as energy spectroscopy have not achieved. However, applying time-series analysis to radiation measurements requires considerable effort in software and hardware development. By taking advantage of Visual Studio 2005, we developed analysis software, 'ListFileConverter', for the time-series radiation measurement system called 'MPA-3'. The software is based on a graphical user interface (GUI) architecture that enables us to save a large amount of operation time in the analysis and, moreover, provides easy access to the special file structure of MPA-3 data. In this paper, the detailed structure of ListFileConverter is fully explained, and experimental results for the counting capability of the MPA-3 hardware system and for neutron measurements with our UTR-KINKI reactor are also given. (author)

  7. New significance test methods for Fourier analysis of geophysical time series

    Directory of Open Access Journals (Sweden)

    Z. Zhang

    2011-09-01

    Full Text Available When one applies the discrete Fourier transform to analyze finite-length time series, discontinuities at the data boundaries will distort its Fourier power spectrum. In this paper, based on a rigorous statistical framework, we present a new significance test method which can extract the intrinsic features of a geophysical time series very well. We show the difference in significance level compared with traditional Fourier tests by analyzing the Arctic Oscillation (AO) and the Nino3.4 time series. In the AO, we find significant peaks at about 2.8, 4.3, and 5.7 yr periods, and in Nino3.4 at about a 12 yr period, in tests against red noise. These peaks are not significant in traditional tests.

  8. Determinants of Egyptian Banking Sector Profitability: Time-Series Analysis from 2004-2014

    Directory of Open Access Journals (Sweden)

    Heba Youssef Hashem

    2016-06-01

    Full Text Available Purpose - The purpose of this paper is to examine the determinants of banking sector profitability in Egypt to shed light on the most influential variables that have a significant impact on the performance of this vital sector. Design/methodology/approach - The analysis includes a time series model of quarterly data from 2004 to 2014. The model utilizes the cointegration technique to investigate the long-run relationship between the return on equity, as a proxy for bank profitability, and several bank-specific variables including liquidity, capital adequacy, and the percentage of non-performing loans. In addition, a Vector Error Correction Model (VECM) is utilized to explore the short-run dynamics of the model and the speed of adjustment to reach the long-run equilibrium. Findings - The main findings of this work show that banking sector profitability is inversely related to capital adequacy, the percentage of loan provisions and the ratio of deposits to total assets. On the other hand, it is positively related to the size of the banking sector, which implies that the banking sector exhibits economies of scale. Research limitations/implications - The implication of this work is that it helps reveal the major factors affecting bank performance in the short run and long run, and hence provides bank managers and monetary policy makers with beneficial insights on how to enhance bank performance. Since the banking sector represents one of the main engines of financing investment, enhancing the efficiency of this sector would contribute to economic growth and prosperity. Originality/value - The vector error correction model showed that about 4% of the disequilibrium is corrected each quarter to reach the long-run equilibrium. In addition, all bank-specific variables were found to affect profitability in the long run only. This study would serve as a base that further work on Egyptian banking sector profitability can build on by incorporating more variables in the

  9. Temporal trend of carpal tunnel release surgery: a population-based time series analysis.

    Directory of Open Access Journals (Sweden)

    Naif Fnais

    Full Text Available BACKGROUND: Carpal tunnel release (CTR) is among the most common hand surgeries, although little is known about its pattern. In this study, we aimed to investigate temporal trends, age and gender variation and current practice patterns in CTR surgeries. METHODS: We conducted a population-based time series analysis among over 13 million residents of Ontario, who underwent operative management for carpal tunnel syndrome (CTS) from April 1, 1992 to March 31, 2010, using administrative claims data. RESULTS: The primary analysis revealed a fairly stable procedure rate of approximately 10 patients per 10,000 population per year receiving CTRs without any significant, consistent temporal trend (p = 0.94). Secondary analyses revealed different trends in procedure rates according to age. The annual procedure rate among those aged >75 years increased from 22 per 10,000 population at the beginning of the study period to over 26 patients per 10,000 population (p<0.01) by the end of the study period. CTR surgical procedures were approximately two-fold more common among females relative to males (64.9% vs. 35.1%, respectively; p<0.01). Lastly, CTR procedures are increasingly being conducted in the outpatient setting while procedures in the inpatient setting have been declining steadily - the proportion of procedures performed in the outpatient setting increased from 13% to over 30% by 2010 (p<0.01). CONCLUSION: Overall, CTR surgical procedures are conducted at a rate of approximately 10 patients per 10,000 population annually, with significant variation with respect to age and gender. CTR surgical procedures in ambulatory-care facilities may soon outpace procedure rates in the in-hospital setting.

  10. Evaluating the impact of flexible alcohol trading hours on violence: an interrupted time series analysis.

    Directory of Open Access Journals (Sweden)

    David K Humphreys

    Full Text Available On November 24th, 2005, the Government of England and Wales removed regulatory restrictions on the times at which licensed premises could sell alcohol. This study tests availability theory by treating the implementation of the Licensing Act (2003) as a natural experiment in alcohol policy. An interrupted time series design was employed to estimate the Act's immediate and delayed impact on violence in the City of Manchester (population 464,200). We collected police recorded rates of violence, robbery, and total crime between the 1st of February 2004 and the 31st of December 2007. Events were aggregated by week, yielding a total of 204 observations (95 pre- and 109 post-intervention). Secondary analysis examined changes in daily patterns of violence. Pre- and post-intervention events were separated into four three-hour segments: 18:00-20:59, 21:00-23:59, 00:00-02:59, 03:00-05:59. Analysis found no evidence that the Licensing Act (2003) affected the overall volume of violence. However, analyses of night-time violence found a gradual and permanent shift of weekend violence into later parts of the night. The results estimated an initial increase of 27.5% between 03:00 and 06:00 (ω = 0.2433, 95% CI = 0.06, 0.42), which increased to 36% by the end of the study period (δ = -0.897, 95% CI = -1.02, -0.77). This study found no evidence that a national policy increasing the physical availability of alcohol affected the overall volume of violence. There was, however, evidence suggesting that the policy may be associated with changes to patterns of violence in the early morning (3 a.m. to 6 a.m.).

  11. The application of time series models to cloud field morphology analysis

    Science.gov (United States)

    Chin, Roland T.; Jau, Jack Y. C.; Weinman, James A.

    1987-01-01

    A modeling method for the quantitative description of remotely sensed cloud field images is presented. A two-dimensional texture modeling scheme based on one-dimensional time series procedures is adopted for this purpose. The time series procedure used is the seasonal autoregressive, moving average (ARMA) process in Box and Jenkins. Cloud field properties such as directionality, clustering and cloud coverage can be retrieved by this method. It has been demonstrated that a cloud field image can be quantitatively defined by a small set of parameters and synthesized surrogates can be reconstructed from these model parameters. This method enables cloud climatology to be studied quantitatively.

  12. Analysis of three amphibian populations with quarter-century long time-series.

    OpenAIRE

    Meyer, A H; Schimidt, B R; Grossenbacher, K

    1998-01-01

    Amphibians are in decline in many parts of the world. Long time-series of amphibian populations are necessary to distinguish declines from the often strong fluctuations observed in natural populations. Time-series may also help to understand the causes of these declines. We analysed 23-28-year long time-series of the frog Rana temporaria. Only one of the three studied populations showed a negative trend, which was probably caused by the introduction of fish. Two populations appeared to be densi...

  13. Time series analysis of Mexico City subsidence constrained by radar interferometry

    Science.gov (United States)

    Doin, Marie-Pierre; Lopez-Quiroz, Penelope; Yan, Yajing; Bascou, Pascale; Pinel, Virginie

    2010-05-01

    unwrapping errors for each pixel and show that they are strongly decreased by iterations in the unwrapping process. (3) Finally, we present a new algorithm for time series analysis that differs from classical SVD decomposition and is best suited to the present data base. Accurate deformation time series are then derived over the metropolitan area of the city with a spatial resolution of 30 × 30 m. We also use the Gamma-PS software on the same data set. The phase differences are unwrapped within small patches with respect to a reference point chosen in each patch, whose phase is in turn unwrapped relative to a reference point common to the whole area of interest. After removing the modelled contribution of the linear displacement rate and DEM error, some residual interferograms, presenting unwrapping errors because of a strong residual orbital ramp or atmospheric phase screen, are spatially unwrapped by a minimum cost-flow algorithm. The next steps are to estimate and remove the residual orbital ramp and to apply a temporal low-pass filter to remove atmospheric contributions. The step-by-step comparison of the SBAS and PS approaches shows the two methods' complementarity. The SBAS analysis provides subsidence rates with an accuracy of a mm/yr over the whole basin in a large area, together with the non-linear behavior of the subsidence through time, however at the expense of some spatial regularization. The PS method provides locally accurate and punctual deformation rates, but fails in this case to yield a good large-scale map and the non-linear temporal behavior of the subsidence. We conclude that the relative contrast in subsidence between individual buildings and infrastructure must be relatively small, on average of the order of 5 mm/yr.

  14. DynPeak: An Algorithm for Pulse Detection and Frequency Analysis in Hormonal Time Series

    Science.gov (United States)

    Vidal, Alexandre; Zhang, Qinghua; Médigue, Claire; Fabre, Stéphane; Clément, Frédérique

    2012-01-01

    The endocrine control of the reproductive function is often studied from the analysis of luteinizing hormone (LH) pulsatile secretion by the pituitary gland. Whereas measurements in the cavernous sinus cumulate anatomical and technical difficulties, LH levels can be easily assessed from jugular blood. However, plasma levels result from a convolution process due to clearance effects when LH enters the general circulation. Simultaneous measurements comparing LH levels in the cavernous sinus and jugular blood have revealed clear differences in the pulse shape, the amplitude and the baseline. Besides, experimental sampling occurs at a relatively low frequency (typically every 10 min) with respect to LH highest frequency release (one pulse per hour) and the resulting LH measurements are noised by both experimental and assay errors. As a result, the pattern of plasma LH may be not so clearly pulsatile. Yet, reliable information on the InterPulse Intervals (IPI) is a prerequisite to study precisely the steroid feedback exerted on the pituitary level. Hence, there is a real need for robust IPI detection algorithms. In this article, we present an algorithm for the monitoring of LH pulse frequency, basing ourselves both on the available endocrinological knowledge on LH pulse (shape and duration with respect to the frequency regime) and synthetic LH data generated by a simple model. We make use of synthetic data to make clear some basic notions underlying our algorithmic choices. We focus on explaining how the process of sampling affects drastically the original pattern of secretion, and especially the amplitude of the detectable pulses. We then describe the algorithm in details and perform it on different sets of both synthetic and experimental LH time series. We further comment on how to diagnose possible outliers from the series of IPIs which is the main output of the algorithm. PMID:22802933

  15. Association between air pollution and cardiovascular mortality in Hefei, China: A time-series analysis.

    Science.gov (United States)

    Zhang, Chao; Ding, Rui; Xiao, Changchun; Xu, Yachun; Cheng, Han; Zhu, Furong; Lei, Ruoqian; Di, Dongsheng; Zhao, Qihong; Cao, Jiyu

    2017-10-01

    In recent years, air pollution has become an alarming problem in China. However, evidence on the effects of air pollution on cardiovascular mortality is still not conclusive to date. This research aimed to assess the short-term effects of air pollution on cardiovascular morbidity in Hefei, China. Data on air pollution, cardiovascular mortality, and meteorological characteristics in Hefei between 2010 and 2015 were collected. Time-series analysis in a generalized additive model was applied to evaluate the association between air pollution and daily cardiovascular mortality. During the study period, the annual average concentrations of PM10, SO2, and NO2 were 105.91, 20.58, and 30.93 μg/m3, respectively. 21,816 people (including 11,876 men, and 14,494 people over 75 years of age) died of cardiovascular diseases. In the single-pollutant model, the effects of multi-day exposure to air pollution were greater than those of single-day exposure. For every increase of 10 μg/m3 in SO2, NO2, and PM10 levels, CVD mortality increased by 5.26% (95%CI: 3.31%-7.23%), 2.71% (95%CI: 1.23%-4.22%), and 0.68% (95%CI: 0.33%-1.04%) at lag03, respectively. The multi-pollutant models showed that PM10 and SO2 remained associated with CVD mortality, although the effect estimates attenuated. However, the effect of NO2 on CVD mortality became statistically insignificant. Subgroup analyses further showed that women were more vulnerable than men to air pollution exposure. These findings showed that air pollution could significantly increase CVD mortality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series

    Science.gov (United States)

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael

    2014-01-01

    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption however, is a major obstacle to the application of these estimators in neuroscience as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data. While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and

  17. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A; Yue Li; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A; Wettergren, Thomas A; Li, Yue; Ray, Asok; Jha, Devesh K

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.

  18. Hybrid analysis for indicating patients with breast cancer using temperature time series.

    Science.gov (United States)

    Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura

    2016-07-01

    Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase cure chances. The temperature of cancerous tissue is generally higher than that of healthy surrounding tissues, making thermography an option to be considered in screening strategies of this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to indicate patients with risk of breast cancer, using unsupervised and supervised machine learning techniques, which characterizes the methodology as hybrid. The dynamic infrared thermography monitors or quantitatively measures temperature changes on the examined surface, after a thermal stress. In the dynamic infrared thermography execution, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied on these series using various values of k. Clustering formed by k-means algorithm, for each k value, is evaluated using clustering validation indices, generating values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision tree were executed on the data set used for evaluation. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an
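
    A minimal sketch of one stage of the hybrid pipeline this record describes: running k-means over per-pixel temperature time series for several values of k and keeping a clustering-validity index per k as a candidate feature. The synthetic cooling curves are illustrative, and the breast-region segmentation, thermogram registration and supervised classifiers are not reproduced.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(5)
# 500 synthetic pixel time series, 20 time points each (cooling curves after a thermal stress).
t = np.linspace(0, 1, 20)
normal = 30 + 2 * np.exp(-3 * t) + rng.normal(0, 0.1, (400, 20))
warm = 32 + 1 * np.exp(-1 * t) + rng.normal(0, 0.1, (100, 20))
series = np.vstack([normal, warm])

features = {}
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(series)
    features[k] = round(silhouette_score(series, labels), 3)   # one validity index per k
print(features)   # these per-k indices would feed the downstream classifier
```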

  19. Time series analysis of embodied interaction: Movement variability and complexity matching as dyadic properties

    Directory of Open Access Journals (Sweden)

    Leonardo Zapata-Fonseca

    2016-12-01

    Full Text Available There is a growing consensus that a fuller understanding of social cognition depends on more systematic studies of real-time social interaction. Such studies require methods that can deal with the complex dynamics taking place at multiple interdependent temporal and spatial scales, spanning sub-personal, personal, and dyadic levels of analysis. We demonstrate the value of adopting an extended multi-scale approach by re-analyzing movement time series generated in a study of embodied dyadic interaction in a minimal virtual reality environment (a perceptual crossing experiment). Reduced movement variability revealed an interdependence between social awareness and social coordination that cannot be accounted for by either subjective or objective factors alone: it picks out interactions in which subjective and objective conditions are convergent (i.e. elevated coordination is perceived as clearly social, and impaired coordination is perceived as socially ambiguous). This finding is consistent with the claim that interpersonal interaction can be partially constitutive of direct social perception. Clustering statistics (Allan Factor) of salient events revealed fractal scaling. Complexity matching, defined as the similarity between these scaling laws, was significantly more pronounced in pairs of participants as compared to surrogate dyads. This further highlights the multi-scale and distributed character of social interaction and extends previous complexity matching results from dyadic conversation to nonverbal social interaction dynamics. Trials with successful joint interaction were also associated with an increase in local coordination. Consequently, a local coordination pattern emerges on the background of complex dyadic interactions in the PCE task and makes successful joint performance possible.

  20. Using forecast modelling to evaluate treatment effects in single-group interrupted time series analysis.

    Science.gov (United States)

    Linden, Ariel

    2018-05-11

    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods which may produce biased results if model assumptions are violated. In this paper, treatment effects are alternatively assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 for reducing cigarette sales. Three forecast models are fit to the preintervention series-linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and autoregressive moving average (ARIMA)-and forecasts are generated into the post-intervention period. The actual observations are then compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA. REG fit the data poorly. The actual post-intervention observations were above the forecasts in HW and ARIMA, suggesting no intervention effect, but below the forecasts in the REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies. © 2018 John Wiley & Sons, Ltd.

  1. Trend analysis of time-series data: A novel method for untargeted metabolite discovery

    NARCIS (Netherlands)

    Peters, S.; Janssen, H.-G.; Vivó-Truyols, G.

    2010-01-01

    A new strategy for biomarker discovery is presented that uses time-series metabolomics data. Data sets from samples analysed at different time points after an intervention are searched for compounds that show a meaningful trend following the intervention. Obviously, this requires new data-analytical

  2. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)

    NARCIS (Netherlands)

    Van Engeland, T.; Soetaert, K.E.R.; Knuijt, A.; Laane, R.W.P.M.; Middelburg, J.J.

    2010-01-01

    Dissolved organic nitrogen (DON) dynamics in the North Sea was explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good with little missing data points. Different imputation methods were used to verify the

  3. A new modified histogram matching normalization for time series microarray analysis

    NARCIS (Netherlands)

    Astola, L.J.; Molenaar, J.

    2014-01-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on

  4. A global evaluation of harmonic analysis of time series under distinct gap conditions

    NARCIS (Netherlands)

    Zhou, J.; Hu, G.; Menenti, M.

    2013-01-01

    Reconstruction of time series of satellite image data to obtain continuous, consistent and accurate data for downstream applications is playing a crucial role in remote sensing applications such as vegetation dynamics, land cover changes, land-atmosphere interactions and climate changes. Among the

  5. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    Frequency domain representation of a short-term heart-rate time series (HRTS) signal is a popular method for evaluating the cardiovascular control system. The spectral parameters, viz. percentage power in low frequency band (%PLF), percentage power in high frequency band (%PHF), power ratio of low frequency to high ...

  6. Analysis of financial time series using multiscale entropy based on skewness and kurtosis

    Science.gov (United States)

    Xu, Meng; Shang, Pengjian

    2018-01-01

    There is great interest in studying the dynamic characteristics of financial time series of daily stock closing prices in different regions. Multi-scale entropy (MSE) is effective mainly in quantifying the complexity of time series on different time scales. This paper applies a new method for assessing financial stability from the perspective of MSE based on skewness and kurtosis. To identify the more suitable coarse-graining method for different kinds of stock indexes, we take into account the developmental characteristics of stock markets on three continents: Asia, North America and Europe. We study the volatility of the different financial time series and analyze the similarities and differences of the coarse-grained time series from the perspective of skewness and kurtosis. A correspondence between the entropy values of stock sequences and the degree of stability of financial markets was observed. Three stocks with particular characteristics among the eight stock sequences are discussed, and their behavior matches the result of applying the MSE method graphically. A comparative study is conducted on synthetic and real-world data. Results show that the modified method is more sensitive to changes in dynamics and carries more valuable information; at the same time, the skewness- and kurtosis-based discrimination is both clear and more stable.
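
    A rough sketch of the multiscale entropy computation referred to above: coarse-grain the series at several scales and compute sample entropy at each scale. The classical procedure coarse-grains with the window mean; the abstract's variant replaces that statistic with skewness or kurtosis, illustrated here by passing scipy's skew function. The synthetic return series and parameter choices are assumptions, not the authors' settings.

    ```python
    # Multiscale entropy (MSE) sketch: coarse-grain, then compute sample entropy.
    import numpy as np
    from scipy.stats import skew

    def sample_entropy(x, m=2, r_factor=0.15):
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()
        def match_count(mm):
            templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
            return np.sum(d <= r) - len(templates)   # drop self-matches on the diagonal
        b, a = match_count(m), match_count(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def coarse_grain(x, scale, stat=np.mean):
        n = len(x) // scale
        return np.array([stat(x[i * scale:(i + 1) * scale]) for i in range(n)])

    returns = np.random.default_rng(1).normal(0, 1, 1000)   # synthetic "returns"
    for scale in (1, 2, 5, 10):
        cg = coarse_grain(returns, scale)                    # classical MSE uses the mean
        print(scale, round(sample_entropy(cg), 3))

    # The skewness-based coarse-graining discussed in the abstract would be, e.g.:
    cg_sk = coarse_grain(returns, 10, stat=skew)
    print("skewness-coarse-grained SampEn:", round(sample_entropy(cg_sk), 3))
    ```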

  7. Software for hydrogeologic time series analysis, interfacing data with physical insight

    NARCIS (Netherlands)

    Asmuth, Jos R. von; Maas, K.; Knotters, M.; Bierkens, M.F.P.; Bakker, M.; Olsthoorn, T.; Cirkel, D.; Leunk, I.; Schaars, F.; Asmuth, Daniel C. von

    2012-01-01

    The program Menyanthes combines a variety of functions for managing, editing, visualizing, analyzing and modeling hydrogeologic time series. Menyanthes was initially developed within the scope of the PhD research of the first author, whose primary aimwas the integration of data and

  8. Analysis of monotonic greening and browning trends from global NDVI time-series

    NARCIS (Netherlands)

    Jong, de R.; Bruin, de S.; Wit, de A.J.W.; Schaepman, M.E.; Dent, D.L.

    2011-01-01

    Remotely sensed vegetation indices are widely used to detect greening and browning trends; especially the global coverage of time-series normalized difference vegetation index (NDVI) data which are available from 1981. Seasonality and serial auto-correlation in the data have previously been dealt

  9. Analysis of rainfall and temperature time series to detect long-term ...

    Indian Academy of Sciences (India)

    ABSTRACT. Arid and semiarid environments have been identified as locations prone to impacts of climate variability and change. Investigating long-term trends is one way of tracing climate change impacts. This study investigates variability through annual and seasonal meteorological time series. Possible ...

  10. Characterization of Land Transitions Patterns from Multivariate Time Series Using Seasonal Trend Analysis and Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Benoit Parmentier

    2014-12-01

    Full Text Available Characterizing biophysical changes in land change areas over large regions with short and noisy multivariate time series and multiple temporal parameters remains a challenging task. Most studies focus on detection rather than characterization, i.e., the manner by which surface state variables are altered by the process of changes. In this study, a procedure is presented to extract and characterize simultaneous temporal changes in MODIS multivariate time series from three surface state variables: the Normalized Difference Vegetation Index (NDVI), land surface temperature (LST) and albedo (ALB). The analysis involves conducting a seasonal trend analysis (STA) to extract three seasonal shape parameters (Amplitude 0, Amplitude 1 and Amplitude 2) and using principal component analysis (PCA) to contrast trends in change and no-change areas. We illustrate the method by characterizing trends in burned and unburned pixels in Alaska over the 2001–2009 time period. Findings show consistent and meaningful extraction of temporal patterns related to fire disturbances. The first principal component (PC1) is characterized by a decrease in mean NDVI (Amplitude 0) with a concurrent increase in albedo (the mean and the annual amplitude) and an increase in LST annual variability (Amplitude 1). These results provide systematic empirical evidence of surface changes associated with one type of land change, fire disturbances, and suggest that STA with PCA may be used to characterize many other types of land transitions over large landscape areas using multivariate Earth observation time series.

  11. Time-series analysis in imatinib-resistant chronic myeloid leukemia K562-cells under different drug treatments.

    Science.gov (United States)

    Zhao, Yan-Hong; Zhang, Xue-Fang; Zhao, Yan-Qiu; Bai, Fan; Qin, Fan; Sun, Jing; Dong, Ying

    2017-08-01

    Chronic myeloid leukemia (CML) is characterized by the accumulation of active BCR-ABL protein. Imatinib is the first-line treatment of CML; however, many patients are resistant to this drug. In this study, we aimed to compare the differences in expression patterns and functions of time-series genes in imatinib-resistant CML cells under different drug treatments. GSE24946 was downloaded from the GEO database, which included 17 samples of K562-r cells with (n=12) or without drug administration (n=5). Three drug treatment groups were considered for this study: arsenic trioxide (ATO), AMN107, and ATO+AMN107. Each group had one sample at each time point (3, 12, 24, and 48 h). Time-series genes with a ratio of standard deviation/average (coefficient of variation) >0.15 were screened, and their expression patterns were revealed based on Short Time-series Expression Miner (STEM). Then, the functional enrichment analysis of time-series genes in each group was performed using DAVID, and the genes enriched in the top ten functional categories were extracted to detect their expression patterns. Different time-series genes were identified in the three groups, and most of them were enriched in the ribosome and oxidative phosphorylation pathways. Time-series genes in the three treatment groups had different expression patterns and functions. Time-series genes in the ATO group (e.g. CCNA2 and DAB2) were significantly associated with cell adhesion, those in the AMN107 group were related to cellular carbohydrate metabolic process, while those in the ATO+AMN107 group (e.g. AP2M1) were significantly related to cell proliferation and antigen processing. In imatinib-resistant CML cells, ATO could influence genes related to cell adhesion, AMN107 might affect genes involved in cellular carbohydrate metabolism, and the combination therapy might regulate genes involved in cell proliferation.

  12. Hyperspectral Time Series Analysis of Native and Invasive Species in Hawaiian Rainforests

    Directory of Open Access Journals (Sweden)

    Gregory P. Asner

    2012-08-01

    Full Text Available The unique ecosystems of the Hawaiian Islands are progressively being threatened following the introduction of exotic species. Operational implementation of remote sensing for the detection, mapping and monitoring of these biological invasions is currently hampered by a lack of knowledge on the spectral separability between native and invasive species. We used spaceborne imaging spectroscopy to analyze the seasonal dynamics of the canopy hyperspectral reflectance properties of four tree species: (i) Metrosideros polymorpha, a keystone native Hawaiian species; (ii) Acacia koa, a native Hawaiian nitrogen fixer; (iii) the highly invasive Psidium cattleianum; and (iv) Morella faya, a highly invasive nitrogen fixer. The species-specific separability of the reflectance and derivative-reflectance signatures, extracted from an Earth Observing-1 Hyperion time series composed of 22 cloud-free images spanning a period of four years, was quantitatively evaluated using the Separability Index (SI). The analysis revealed that the Hawaiian native trees were universally unique from the invasive trees in their near-infrared-1 (700–1,250 nm) reflectance (0.4 > SI > 1.4). Due to their higher leaf area index, invasive trees generally had a higher near-infrared reflectance. To a lesser extent, it could also be demonstrated that nitrogen-fixing trees were spectrally unique from non-fixing trees. The higher leaf nitrogen content of nitrogen-fixing trees was expressed through slightly increased separabilities in visible and shortwave-infrared reflectance wavebands (SI = 0.4). We also found phenology to be key to spectral separability analysis. As such, it was shown that the spectral separability in the near-infrared-1 reflectance between the native and invasive species groups was more expressed in summer (SI > 0.7) than in winter (SI < 0.7). The lowest separability was observed for March-July (SI < 0.3). This could be explained by the

  13. An econometric time-series analysis of global CO2 concentrations and emissions

    International Nuclear Information System (INIS)

    Cohen, B.C.; Labys, W.C.; Eliste, P.

    2001-01-01

    This paper extends previous work on the econometric modelling of CO2 concentrations and emissions. The importance of such work rests in the fact that models of the Cohen-Labys variety represent the only alternative to scientific or physical models of CO2 accumulations whose parameters are inferred rather than estimated. The stimulus for this study derives from the recent discovery of oscillations and cycles in the net biospheric flux of CO2. A variety of time series tests is thus used to search for the presence of normality, stationarity, cyclicality and stochastic processes in global CO2 emissions and concentrations series. Given the evidence for cyclicality of a short-run nature in the spectra of these series, both structural time series and error correction models are applied to confirm the frequency and amplitude of these cycles. Our results suggest new possibilities for determining equilibrium levels of CO2 concentrations and subsequently revising stabilization policies. (Author)

  14. Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction

    Directory of Open Access Journals (Sweden)

    Helmut Thome

    2015-07-01

    Full Text Available Criminological research is often based on time-series data showing some type of trend movement. Trending time-series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as by differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are “difference-stationary” (if the trend movements are stochastic) and “cointegrated” (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
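
    An illustrative sketch of the two ideas summarized above, using statsmodels: an Engle-Granger cointegration test on two trending series, followed by a simple error-correction model whose lagged equilibrium-error term captures the adjustment toward the long-run relation. The series are synthetic stand-ins, not criminological data.

    ```python
    # Cointegration test and a simple error-correction model (ECM) sketch.
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.tsa.stattools import coint

    rng = np.random.default_rng(42)
    common_trend = np.cumsum(rng.normal(size=300))        # shared stochastic trend
    x = common_trend + rng.normal(scale=0.5, size=300)
    y = 0.8 * common_trend + rng.normal(scale=0.5, size=300)

    t_stat, p_value, _ = coint(y, x)                      # Engle-Granger test
    print(f"cointegration p-value: {p_value:.3f}")

    # ECM: regress the change in y on the change in x and the lagged
    # equilibrium error from the long-run (levels) regression.
    long_run = sm.OLS(y, sm.add_constant(x)).fit()
    ec_term = long_run.resid
    dy, dx = np.diff(y), np.diff(x)
    ecm = sm.OLS(dy, sm.add_constant(np.column_stack([dx, ec_term[:-1]]))).fit()
    print(ecm.params)   # coefficient on the error term measures the speed of adjustment
    ```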

  15. Time-Series Analysis of Intermittent Velocity Fluctuations in Turbulent Boundary Layers

    Science.gov (United States)

    Zayernouri, Mohsen; Samiee, Mehdi; Meerschaert, Mark M.; Klewicki, Joseph

    2017-11-01

    Classical turbulence theory is modified under the inhomogeneities produced by the presence of a wall. In this regard, we propose a new time series model for the streamwise velocity fluctuations in the inertial sub-layer of turbulent boundary layers. The new model employs tempered fractional calculus and seamlessly extends the classical 5/3 spectral model of Kolmogorov in the inertial subrange to the whole spectrum from large to small scales. Moreover, the proposed time-series model allows the quantification of data uncertainties in the underlying stochastic cascade of turbulent kinetic energy. The model is tested using well-resolved streamwise velocity measurements up to friction Reynolds numbers of about 20,000. The physics of the energy cascade are briefly described within the context of the determined model parameters. This work was supported by the AFOSR Young Investigator Program (YIP) award (FA9550-17-1-0150) and partially by MURI/ARO (W911NF-15-1-0562).

  16. A Multipixel Time Series Analysis Method Accounting for Ground Motion, Atmospheric Noise, and Orbital Errors

    Science.gov (United States)

    Jolivet, R.; Simons, M.

    2018-02-01

    Interferometric synthetic aperture radar time series methods aim to reconstruct time-dependent ground displacements over large areas from sets of interferograms in order to detect transient, periodic, or small-amplitude deformation. Because of computational limitations, most existing methods consider each pixel independently, ignoring important spatial covariances between observations. We describe a framework to reconstruct time series of ground deformation while considering all pixels simultaneously, allowing us to account for spatial covariances, imprecise orbits, and residual atmospheric perturbations. We describe spatial covariances by an exponential decay function dependent on pixel-to-pixel distance. We approximate the impact of imprecise orbit information and residual long-wavelength atmosphere as a low-order polynomial function. Tests on synthetic data illustrate the importance of incorporating full covariances between pixels in order to avoid biased parameter reconstruction. An example of application to the northern Chilean subduction zone highlights the potential of this method.

  17. Self-potential time series analysis in a seismic area of the Southern Apennines: preliminary results

    Directory of Open Access Journals (Sweden)

    V. Tramutoli

    1994-06-01

    Full Text Available The self-potential time series recorded during the period May 1991 - August 1992 by an automatic station, located in a seismic area of the Southern Apennines, is analyzed. We deal with the spectral and statistical features of the electrotelluric precursors: they can play a major role in the approach to seismic prediction. The time-dynamics of the experimental time series is investigated, and the cyclic components and time trends are removed. In particular we consider the influence of external noise, related to anthropic activities and meteoclimatic parameters, and pick out the anomalies from the residual series. Finally we show the preliminary results of the correlation between the anomalies in the time patterns of self-potential data and the earthquakes which occurred in the area.

  18. Detecting a currency’s dominance using multivariate time series analysis

    Science.gov (United States)

    Syahidah Yusoff, Nur; Sharif, Shamshuritawati

    2017-09-01

    A currency exchange rate is the price of one country’s currency in terms of another country’s currency. Four different prices (opening, closing, highest, and lowest) are generated by daily trading activities. In the past, many studies have been carried out using the closing price only. However, those four prices are interrelated, so a multivariate time series can provide more information than a univariate one. Therefore, the aim of this paper is to compare the results of two different approaches, the mean vector and Escoufier’s RV coefficient, in constructing similarity matrices of 20 world currencies. Both matrices are then used in place of the correlation matrix required by network topology. With the help of the degree centrality measure, we can detect the currency’s dominance in both networks. The pros and cons of both approaches are presented at the end of this paper.
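
    A rough sketch of the network step mentioned above: build a similarity (here, correlation) matrix for a set of currencies, turn it into a graph, and rank currencies by degree centrality. The currency codes, threshold and synthetic returns are placeholders; the paper's mean-vector and Escoufier's RV similarity measures would simply replace the correlation matrix used below.

    ```python
    # Currency-dominance sketch: similarity matrix -> graph -> degree centrality.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(7)
    currencies = ["USD", "EUR", "JPY", "GBP", "AUD"]
    returns = rng.normal(size=(250, len(currencies)))      # synthetic daily returns
    corr = np.corrcoef(returns, rowvar=False)

    # Keep an edge only when the similarity exceeds a (here arbitrary) threshold.
    G = nx.Graph()
    G.add_nodes_from(currencies)
    for i in range(len(currencies)):
        for j in range(i + 1, len(currencies)):
            if abs(corr[i, j]) > 0.05:
                G.add_edge(currencies[i], currencies[j], weight=abs(corr[i, j]))

    centrality = nx.degree_centrality(G)
    print(sorted(centrality.items(), key=lambda kv: -kv[1]))   # most "dominant" first
    ```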

  19. Does Financial Development Reduce CO2 Emissions in Malaysian Economy? A Time Series Analysis

    OpenAIRE

    Shahbaz, Muhammad; Solarin, Sakiru Adebola; Mahmood, Haider

    2012-01-01

    This study deals with the question of whether financial development reduces CO2 emissions in the case of Malaysia. For this purpose, we apply the bounds testing approach to cointegration for long-run relations between the variables. The study uses annual time series data over the period 1971-2008. The Ng-Perron stationarity test is applied to test the unit root properties of the series. Our results validate the presence of cointegration between CO2 emissions, financial development, energy co...

  20. Gender inequality and economic growth: a time series analysis for Pakistan

    OpenAIRE

    Pervaiz, Zahid; Chani, Muhammad Irfan; Jan, Sajjad Ahmad; Chaudhary, Amatul R.

    2011-01-01

    This paper attempts to analyze the impact of gender inequality on economic growth of Pakistan. An annual time series data for the period of 1972-2009 has been used in this study. We have regressed growth rate of real gross domestic product (GDP) per capita on labour force growth, investment, trade openness and a composite index of gender inequality. The results reveal that labour force growth, investment and trade openness have statistically significant and positive impact whereas gender ineq...

  1. The study of coastal groundwater depth and salinity variation using time-series analysis

    International Nuclear Information System (INIS)

    Tularam, G.A.; Keeler, H.P.

    2006-01-01

    A time-series approach is applied to study and model tidal intrusion into coastal aquifers. The authors examine the effect of tidal behaviour on groundwater level and salinity intrusion for the coastal Brisbane region using auto-correlation and spectral analyses. The results show a close relationship between tidal behaviour, groundwater depth and salinity levels for the Brisbane coast. The known effect can be quantified and incorporated into new models in order to more accurately map salinity intrusion into the coastal groundwater table.

  2. Time series analysis of air pollution and mortality: effects by cause, age and socioeconomic status

    OpenAIRE

    Gouveia, N.; Fletcher, T.

    2000-01-01

    OBJECTIVE—To investigate the association between outdoor air pollution and mortality in São Paulo, Brazil.
    DESIGN—Time series study
    METHODS—All causes, respiratory and cardiovascular mortality were analysed and the role of age and socioeconomic status in modifying associations between mortality and air pollution were investigated. Models used Poisson regression and included terms for temporal patterns, meteorology, and autocorrelation.
    MAIN RESULTS—All causes all ages mortality showed much sm...

  3. Characteristics of Articles About Human Papillomavirus Vaccination in Japanese Newspapers: Time-Series Analysis Study.

    Science.gov (United States)

    Ueda, Nao; Yokouchi, Ryoki; Onoda, Taro; Ogihara, Atsushi

    2017-12-19

    Media coverage and reports have a major influence on individual vaccination and other health-related activities. People use the media to seek information and knowledge on health-related behaviors. They obtain health-related information from media such as television and newspapers, and they trust such information. While several studies have examined the relation between media coverage and individual health, there is a lack of studies that have analyzed media reports of health information. In particular, we have found no analyses related to cervical cancer (human papillomavirus [HPV]) vaccine. This study aimed to identify mentions of cervical cancer vaccine in Japan's printed news media and to determine their characteristics. We used the archival databases of 2 Japanese newspapers, Yomiuri Shimbun (Yomidasu Rekishikan) and Asahi Shimbun (Kikuzo II Visual), for text mining. First, we created a database by extracting articles published between January 1, 2007, and December 31, 2014, that matched the terms "cervical cancer" AND "vaccination" in a keyword search. Then, we tallied the extracted articles based on the month of publication and number of characters in order to conduct a time-series analysis. We extracted a total of 219 articles. Of these, 154 (70.3%) were positive and 51 (23.3%) were negative toward HPV vaccination. Of the 51 negative articles, 4 (7.8%) were published before June 2013, when routine vaccination was temporarily discontinued due to concerns regarding side effects, and 47 (92.2%) were published since then. The negative reports commonly cited side effects, although prior to June 2013, these issues were hardly mentioned. Although foreign media reports mentioned side effects before routine vaccination was temporarily discontinued, fewer articles mentioned side effects than recommendations for vaccination. Furthermore, on June 13, 2013, the World Health Organization's advisory body Global Advisory Committee on Vaccine Safety issued a statement

  4. Effect of Environmental Factors on Low Weight in Non-Premature Births: A Time Series Analysis.

    Science.gov (United States)

    Díaz, Julio; Arroyo, Virginia; Ortiz, Cristina; Carmona, Rocío; Linares, Cristina

    2016-01-01

    Exposure to pollutants during pregnancy has been related to adverse birth outcomes. LBW can give rise to lifelong impairments. Prematurity is the leading cause of LBW, yet few studies have attempted to analyse how environmental factors can influence LBW in infants who are not premature. This study therefore sought to analyse the influence of air pollution, noise levels and temperature on LBW in non-premature births in Madrid during the period 2001-2009. An ecological time-series study was used to assess the impact of PM2.5, NO2 and O3 concentrations, noise levels, and temperatures on LBW among non-premature infants across the period 2001-2009. Our analysis extended to infants having birth weights of 1,500 g to 2,500 g (VLBW) and less than 1,500 g (ELBW). Environmental variables were lagged until 37 weeks with respect to the date of birth, and cross-correlation functions were used to identify explaining lags. Results were quantified using Poisson regression models. Across the study period 298,705 births were registered in Madrid, 3,290 of which had LBW; of this latter total, 1,492 were non-premature. PM2.5 was the only pollutant to show an association with the three variables of LBW in non-premature births. This association occurred at around the third month of gestation for LBW and VLBW (LBW: lag 23 and VLBW: lag 25), and at around the eighth month of gestation for ELBW (lag 6). Leqd was linked to LBW at lag zero. The RR of PM2.5 on LBW was 1.01 (1.00–1.03). The RR of Leqd on LBW was 1.09 (0.99–1.19) (p<0.1). The results obtained indicate that PM2.5 had an influence on LBW. The adoption of measures aimed at reducing the number of vehicles would serve to lower pregnant women's exposure. In the case of noise, exposure to high levels should be limited during the final weeks of pregnancy.

  5. All-phase MR angiography using independent component analysis of dynamic contrast enhanced MRI time series. φ-MRA

    International Nuclear Information System (INIS)

    Suzuki, Kiyotaka; Matsuzawa, Hitoshi; Watanabe, Masaki; Nakada, Tsutomu; Nakayama, Naoki; Kwee, I.L.

    2003-01-01

    Dynamic contrast enhanced magnetic resonance imaging (dynamic MRI) represents an MRI version of non-diffusible tracer methods, the main clinical use of which is the physiological construction of what is conventionally referred to as perfusion images. The raw data utilized for constructing MRI perfusion images are time series of pixel signal alterations associated with the passage of a gadolinium-containing contrast agent. Such time series are highly compatible with independent component analysis (ICA), a novel statistical signal processing technique capable of effectively separating a single mixture of multiple signals into their original independent source signals (blind separation). Accordingly, we applied ICA to dynamic MRI time series. The technique was found to be powerful, allowing for hitherto unobtainable assessment of regional cerebral hemodynamics in vivo. (author)

  6. THE ANALYSIS OF THE TIME-SERIES FLUCTUATION OF WATER DEMAND FOR THE SMALL WATER SUPPLY BLOCK

    Science.gov (United States)

    Koizumi, Akira; Suehiro, Miki; Arai, Yasuhiro; Inakazu, Toyono; Masuko, Atushi; Tamura, Satoshi; Ashida, Hiroshi

    The purpose of this study is to define one apartment complex as "the water supply block" and to show the relationship between the amount of water supplied to an apartment house and its time series fluctuation. We examined observation data collected from 33 apartment houses. The water meters were installed at individual observation points for about 20 days in Tokyo. This study used Fourier analysis in order to grasp the irregularity in the time series data. As a result, this paper demonstrates that the smaller the amount of water supply became, the larger the irregularity of the time series fluctuation. We also found that it was difficult to describe the daily cyclical pattern for a small apartment house using the dominant periodic components obtained from a Fourier spectrum. Our research gives useful information for the design of a directional water supply system, such as estimating the hourly fluctuation and the maximum daily water demand.
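
    A minimal sketch of the Fourier step described above: compute the spectrum of an hourly demand series and pick out the dominant periodic component (for a daily cycle, a period near 24 hours). The demand series is synthetic, standing in for the metered data.

    ```python
    # Fourier spectrum of an hourly water-demand series.
    import numpy as np

    hours = np.arange(20 * 24)                               # about 20 days of hourly data
    demand = (10 + 3 * np.sin(2 * np.pi * hours / 24)
              + np.random.default_rng(3).normal(0, 1, hours.size))

    spectrum = np.abs(np.fft.rfft(demand - demand.mean()))   # remove the mean (DC) first
    freqs = np.fft.rfftfreq(hours.size, d=1.0)               # cycles per hour

    dominant = freqs[np.argmax(spectrum)]
    print(f"dominant period: {1 / dominant:.1f} hours")      # should be near 24
    ```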

  7. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes.

    Science.gov (United States)

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun

    2017-12-01

    Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks to search gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting the three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high, compared to the already difficult NP-hard two dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression pattern in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools and only TimesVector detected clusters with differential expression patterns across conditions successfully. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr. Supplementary data are available at

  8. A very British affair six Britons and the development of time series analysis during the 20th century

    CERN Document Server

    Mills, T

    2012-01-01

    This book develops the major themes of time series analysis from its formal beginnings in the early part of the 20th century to the present day through the research of six distinguished British statisticians, all of whose work is characterised by the British traits of pragmatism and the desire to solve practical problems of importance.

  9. Mapping agroecological zones and time lag in vegetation growth by means of Fourier analysis of time series of NDVI images

    Science.gov (United States)

    Menenti, M.; Azzali, S.; Verhoef, W.; Van Swol, R.

    1993-01-01

    Examples are presented of applications of a fast Fourier transform algorithm to analyze time series of images of Normalized Difference Vegetation Index values. The results obtained for a case study on Zambia indicated that differences in vegetation development among map units of an existing agroclimatic map were not significant, while reliable differences were observed among the map units obtained using the Fourier analysis.

  10. Time Series Analysis of Non-Gaussian Observations Based on State Space Models from Both Classical and Bayesian Perspectives

    NARCIS (Netherlands)

    Durbin, J.; Koopman, S.J.M.

    1998-01-01

    The analysis of non-Gaussian time series using state space models is considered from both classical and Bayesian perspectives. The treatment in both cases is based on simulation using importance sampling and antithetic variables; Monte Carlo Markov chain methods are not employed. Non-Gaussian

  11. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    Science.gov (United States)

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights to the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.

  12. Nonlinear time-series analysis of current signal in cathodic contact glow discharge electrolysis

    International Nuclear Information System (INIS)

    Allagui, Anis; Abdelkareem, Mohammad Ali; Rojas, Andrea Espinel; Bonny, Talal; Elwakil, Ahmed S.

    2016-01-01

    In the standard two-electrode configuration employed in electrolytic process, when the control dc voltage is brought to a critical value, the system undergoes a transition from conventional electrolysis to contact glow discharge electrolysis (CGDE), which has also been referred to as liquid-submerged micro-plasma, glow discharge plasma electrolysis, electrode effect, electrolytic plasma, etc. The light-emitting process is associated with the development of an irregular and erratic current time-series which has been arbitrarily labelled as “random,” and thus dissuaded further research in this direction. Here, we examine the current time-series signals measured in cathodic CGDE configuration in a concentrated KOH solution at different dc bias voltages greater than the critical voltage. We show that the signals are, in fact, not random according to the NIST SP. 800-22 test suite definition. We also demonstrate that post-processing low-pass filtered sequences requires less time than the native as-measured sequences, suggesting a superposition of low frequency chaotic fluctuations and high frequency behaviors (which may be produced by more than one possible source of entropy). Using an array of nonlinear time-series analyses for dynamical systems, i.e., the computation of largest Lyapunov exponents and correlation dimensions, and re-construction of phase portraits, we found that low-pass filtered datasets undergo a transition from quasi-periodic to chaotic to quasi-hyper-chaotic behavior, and back again to chaos when the voltage controlling-parameter is increased. The high frequency part of the signals is discussed in terms of highly nonlinear turbulent motion developed around the working electrode.
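
    A small sketch of two of the nonlinear time-series steps mentioned above: delay embedding for phase-portrait reconstruction and a Grassberger-Procaccia correlation sum, whose log-log slope estimates the correlation dimension. The signal, embedding dimension and delay are assumptions standing in for the measured current signals; Lyapunov-exponent estimation is omitted.

    ```python
    # Delay embedding and correlation-dimension estimate via the correlation sum.
    import numpy as np

    def delay_embed(x, dim=3, tau=5):
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

    def correlation_sum(points, r):
        d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
        n = len(points)
        return (np.sum(d < r) - n) / (n * (n - 1))   # exclude self-pairs

    rng = np.random.default_rng(5)
    t = np.linspace(0, 60, 1500)
    signal = np.sin(t) + 0.4 * np.sin(2.7 * t) + 0.05 * rng.normal(size=t.size)

    emb = delay_embed(signal, dim=3, tau=10)          # reconstructed phase portrait
    radii = np.logspace(-2, 0, 10)
    c = np.array([correlation_sum(emb, r) for r in radii])
    mask = c > 0
    slope = np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)[0]
    print(f"estimated correlation dimension: {slope:.2f}")
    ```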

  13. TIME SERIES CHARACTERISTIC ANALYSIS OF RAINFALL, LAND USE AND FLOOD DISCHARGE BASED ON ARIMA BOX-JENKINS MODEL

    Directory of Open Access Journals (Sweden)

    Abror Abror

    2014-01-01

    Full Text Available Indonesia, located in the tropics, has a wet season and a dry season. In the last few years, however, river discharge in the dry season has been very low, while in the wet season the frequency of floods has increased, with sharper peaks and ever greater water elevations. The increased flood discharge may be due to changes in land use or changes in rainfall characteristics; both possibilities need to be clarified. Therefore, research is needed to analyze rainfall characteristics, land use and flood discharge in several watershed areas (DAS) quantitatively from time series data. The research was conducted in DAS Gintung in Parakankidang, DAS Gung in Danawarih, DAS Rambut in Cipero, DAS Kemiri in Sidapurna and DAS Comal in Nambo, located in Tegal Regency and Pemalang Regency in Central Java Province. The research consisted of three main steps: input, the DAS system and output. Input covers DAS determination and selection and the collection of secondary data. The DAS system is the initial secondary data processing, consisting of rainfall analysis, HSS GAMA I parameters, land type analysis and DAS land use. Output is the final processing step, consisting of the Tadashi Tanimoto calculation, USSCS effective rainfall, flood discharge, ARIMA analysis, analysis of results and conclusions. The ARIMA Box-Jenkins time series calculations used the Number Cruncher Statistical Systems and Power Analysis Sample Size software (NCSS-PASS version 2000), which yields time series characteristics in the form of the time series pattern, mean square error (MSE), root mean square (RMS), autocorrelation of residuals and trend. The results indicate that composite CN and flood discharge are proportional: when the composite CN trend increases, the flood discharge trend also increases, and vice versa. Meanwhile, a decrease in the rainfall trend is not always followed by a decrease in the flood discharge trend. The main determinant of flood discharge characteristics is DAS management, not change in

  14. Monitoring Springs in the Mojave Desert Using Landsat Time Series Analysis

    Science.gov (United States)

    Potter, Christopher S.

    2018-01-01

    The purpose of this study, based on Landsat satellite data, was to characterize variations and trends over 30 consecutive years (1985-2016) in perennial vegetation green cover at over 400 confirmed Mojave Desert spring locations. These springs were surveyed between 2015 and 2016 on lands managed in California by the U.S. Bureau of Land Management (BLM) and on several land trusts within the Barstow, Needles, and Ridgecrest BLM Field Offices. The normalized difference vegetation index (NDVI) from July Landsat images was computed at each spring location and a trend model was first fit to the multi-year NDVI time series using least squares linear regression.
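
    A minimal sketch of the per-spring trend step described above: fit a least squares line to a yearly July NDVI series and read the slope as greening (positive) or browning (negative). The NDVI values below are invented for illustration.

    ```python
    # Least squares linear trend of a yearly NDVI series at one spring location.
    import numpy as np

    years = np.arange(1985, 2017)
    ndvi = (0.25 + 0.002 * (years - 1985)
            + np.random.default_rng(9).normal(0, 0.02, years.size))   # synthetic NDVI

    slope, intercept = np.polyfit(years, ndvi, 1)
    print(f"trend: {slope:+.4f} NDVI units per year")   # positive = greening
    ```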

  15. Time series analysis of reference crop evapotranspiration using soft computing techniques for Ganjam District, Odisha, India

    Science.gov (United States)

    Patra, S. R.

    2017-12-01

    Evapotranspiration (ET0) influences water resources and is considered a vital process in arid hydrologic frameworks. It is one of the most important measures for identifying drought conditions. Therefore, time series forecasting of evapotranspiration is very important in order to help decision makers and water system managers build proper systems to sustain and manage water resources. Time series analysis assumes that history repeats itself; hence, by analysing past values, better choices, or forecasts, can be made for the future. Ten years of ET0 data were used in this study to ensure a satisfactory forecast of monthly values. In this study, three models are presented: an autoregressive integrated moving average (ARIMA) mathematical model, an artificial neural network model and a support vector machine model. These three models are used for forecasting monthly reference crop evapotranspiration based on ten years of past historical records (1991-2001) of measured evaporation in the Ganjam region, Odisha, India, without considering climate data. The developed models will allow water resource managers to predict up to 12 months ahead, making these predictions very useful for optimizing the resources needed for effective water resources management. In this study, multistep-ahead prediction is performed, which is more complex and troublesome than one-step-ahead prediction. Our investigation suggests that nonlinear relationships may exist among the monthly indices, so that the ARIMA model might not be able to effectively extract the full relationship hidden in the historical data. Support vector machines are potentially helpful time series forecasting strategies on account of their strong nonlinear mapping capability and resistance to complexity in forecasting data. SVMs have great learning capability in time series modelling compared to ANNs. For instance, SVMs implement the structural risk minimization principle, which allows better generalization compared to neural networks that use the empirical risk

  16. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series

    Science.gov (United States)

    WANG, D.; Wang, Y.; Zeng, X.

    2017-12-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
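
    A sketch of the wavelet de-noising (WD) stage, using PyWavelets: decompose the series, soft-threshold the detail coefficients, and reconstruct a smoothed series to which a forecasting model would then be fitted. The universal-threshold rule and wavelet choice are common defaults assumed here, not necessarily the authors' settings.

    ```python
    # Wavelet de-noising of a hydro-meteorological stand-in series with PyWavelets.
    import numpy as np
    import pywt

    rng = np.random.default_rng(11)
    t = np.linspace(0, 4 * np.pi, 256)
    series = np.sin(t) + 0.3 * rng.normal(size=t.size)        # noisy synthetic series

    coeffs = pywt.wavedec(series, "db4", level=4)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(series.size))         # universal threshold (assumption)
    denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(denoised_coeffs, "db4")[: series.size]

    print("noise std estimate:", round(sigma, 3))
    ```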

  17. Short Term Prediction of PM10 Concentrations Using Seasonal Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Hamid Hazrul Abdul

    2016-01-01

    Full Text Available Air pollution modelling is one of the important tools usually used to make short term and long term predictions. Since air pollution has a big impact especially on human health, prediction of air pollutant concentrations is needed to help local authorities give an early warning to people who are at risk of acute and chronic health effects from air pollution. Finding the best time series model would allow predictions to be made accurately. This research was carried out to find the best time series model to predict the PM10 concentrations in Nilai, Negeri Sembilan, Malaysia. By considering two seasons, the wet season (north east monsoon) and the dry season (south west monsoon), seasonal autoregressive integrated moving average models were used to find the most suitable model to predict the PM10 concentrations in Nilai, Negeri Sembilan, using three error measures. Based on the AIC statistics, results show that ARIMA (1, 1, 1) × (1, 0, 0)12 is the most suitable model to predict PM10 concentrations in Nilai, Negeri Sembilan.
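
    A sketch of fitting the seasonal ARIMA (1, 1, 1) × (1, 0, 0)12 model named above with statsmodels' SARIMAX; the monthly PM10 series below is synthetic, standing in for the Nilai observations.

    ```python
    # Seasonal ARIMA fit and 12-step forecast with statsmodels SARIMAX.
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    months = 120
    seasonal = 10 * np.sin(2 * np.pi * np.arange(months) / 12)
    pm10 = 50 + seasonal + np.cumsum(rng.normal(0, 1, months))   # trend + seasonality

    model = SARIMAX(pm10, order=(1, 1, 1), seasonal_order=(1, 0, 0, 12))
    result = model.fit(disp=False)
    print("AIC:", round(result.aic, 1))
    print(result.forecast(steps=12))          # next 12 months of predicted PM10
    ```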

  18. The Usage of Time Series Control Charts for Financial Process Analysis

    Directory of Open Access Journals (Sweden)

    Kovářík Martin

    2012-09-01

    Full Text Available We deal with the financial processes of a company using methods of SPC (Statistical Process Control), specifically time series control charts. The paper outlines the intersection of two disciplines, econometrics and statistical process control. The theoretical part discusses the methodology of time series control charts, and in the research part this methodology is demonstrated in three case studies. The first study focuses on the regulation of simulated financial flows for a company by a CUSUM control chart. The second study involves the regulation of financial flows for a heteroskedastic financial process by an EWMA control chart. The last case study is devoted to applications of ARIMA, EWMA and CUSUM control charts to financial data that are sensitive to mean shifts while accounting for autocorrelation in the data. In this paper, we highlight the versatility of control charts not only in manufacturing but also in managing the financial stability of cash flows.
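
    A minimal sketch of an EWMA control chart of the kind applied to cash-flow series in the second case study; the smoothing constant, target and 3-sigma limits follow the usual textbook formulation and are assumptions rather than the authors' exact settings.

    ```python
    # EWMA control chart on a synthetic cash-flow series with a mean shift at t = 60.
    import numpy as np

    rng = np.random.default_rng(4)
    cash_flow = np.concatenate([rng.normal(100, 5, 60), rng.normal(108, 5, 20)])

    lam, mu0, sigma = 0.2, 100.0, 5.0        # smoothing constant, target, process std
    z = np.empty_like(cash_flow)
    z[0] = lam * cash_flow[0] + (1 - lam) * mu0
    for i in range(1, len(cash_flow)):
        z[i] = lam * cash_flow[i] + (1 - lam) * z[i - 1]

    i = np.arange(1, len(cash_flow) + 1)
    limit = 3 * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
    out_of_control = np.where((z > mu0 + limit) | (z < mu0 - limit))[0]
    print("first out-of-control point:", out_of_control[0] if out_of_control.size else None)
    ```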

  19. A Bias and Variance Analysis for Multistep-Ahead Time Series Forecasting.

    Science.gov (United States)

    Ben Taieb, Souhaib; Atiya, Amir F

    2016-01-01

    Multistep-ahead forecasts can either be produced recursively by iterating a one-step-ahead time series model or directly by estimating a separate model for each forecast horizon. In addition, there are other strategies; some of them combine aspects of both aforementioned concepts. In this paper, we present a comprehensive investigation into the bias and variance behavior of multistep-ahead forecasting strategies. We provide a detailed review of the different multistep-ahead strategies. Subsequently, we perform a theoretical study that derives the bias and variance for a number of forecasting strategies. Finally, we conduct a Monte Carlo experimental study that compares and evaluates the bias and variance performance of the different strategies. From the theoretical and the simulation studies, we analyze the effect of different factors, such as the forecast horizon and the time series length, on the bias and variance components, and on the different multistep-ahead strategies. Several lessons are learned, and recommendations are given concerning the advantages, disadvantages, and best conditions of use of each strategy.
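
    A toy comparison of the two basic strategies discussed above: recursive forecasting (iterate a one-step model, feeding back its own predictions) versus direct forecasting (fit a separate model per horizon). Plain least-squares AR(2) models on a synthetic series keep the sketch self-contained; it illustrates the strategies, not the paper's bias-variance derivations.

    ```python
    # Recursive vs. direct multistep-ahead forecasting with simple least-squares AR(2) models.
    import numpy as np

    rng = np.random.default_rng(8)
    n, horizon = 300, 5
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.6 * x[t - 1] - 0.2 * x[t - 2] + rng.normal()

    def lagged_matrix(series, lags, lead):
        # rows: [x[t-1], x[t-2]]; target: value `lead` steps after the last lag
        rows = [series[t - lags:t][::-1] for t in range(lags, len(series) - lead + 1)]
        target = series[lags + lead - 1:]
        return np.array(rows), target

    train = x[:-horizon]

    # Recursive: one-step model applied iteratively, feeding back its own forecasts.
    X1, y1 = lagged_matrix(train, 2, 1)
    coef1 = np.linalg.lstsq(X1, y1, rcond=None)[0]
    history, recursive = list(train[-2:]), []
    for _ in range(horizon):
        pred = coef1 @ np.array(history[::-1][:2])
        recursive.append(pred)
        history.append(pred)

    # Direct: a separate model for each forecast horizon h = 1..H.
    direct = []
    for h in range(1, horizon + 1):
        Xh, yh = lagged_matrix(train, 2, h)
        coefh = np.linalg.lstsq(Xh, yh, rcond=None)[0]
        direct.append(coefh @ train[-2:][::-1])

    print("recursive:", np.round(recursive, 2))
    print("direct:   ", np.round(direct, 2))
    print("actual:   ", np.round(x[-horizon:], 2))
    ```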

  20. Beyond Fractals and 1/f Noise: Multifractal Analysis of Complex Physiological Time Series

    Science.gov (United States)

    Ivanov, Plamen Ch.; Amaral, Luis A. N.; Ashkenazy, Yosef; Stanley, H. Eugene; Goldberger, Ary L.; Hausdorff, Jeffrey M.; Yoneyama, Mitsuru; Arai, Kuniharu

    2001-03-01

    We investigate time series with 1/f-like spectra generated by two physiologic control systems --- the human heartbeat and human gait. We show that physiological fluctuations exhibit unexpected "hidden" structures often described by scaling laws. In particular, our studies indicate that when analyzed on different time scales the heartbeat fluctuations exhibit cascades of branching patterns with self-similar (fractal) properties, characterized by long-range power-law anticorrelations. We find that these scaling features change during sleep and wake phases, and with pathological perturbations. Further, by means of a new wavelet-based technique, we find evidence of multifractality in the healthy human heartbeat even under resting conditions, and show that the multifractal character and nonlinear properties of the healthy heart are encoded in the Fourier phases. We uncover a loss of multifractality for a life-threatening condition, congestive heart failure. In contrast to the heartbeat, we find that the interstride interval time series of healthy human gait, a voluntary process under neural regulation, is described by a single fractal dimension (such as classical 1/f noise) indicating monofractal behavior. Thus our approach can help distinguish physiological and physical signals with comparable frequency spectra and two-point correlations, and guide modeling of their control mechanisms.
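
    One standard way to expose the kind of fractal scaling discussed above is detrended fluctuation analysis (DFA); a minimal sketch follows, run on synthetic white noise, for which the expected scaling exponent is near 0.5 (1/f noise would give about 1.0). This illustrates scaling analysis in general and is not the wavelet-based multifractal method used in the abstract.

    ```python
    # Detrended fluctuation analysis (DFA) sketch.
    import numpy as np

    def dfa(x, scales):
        y = np.cumsum(x - np.mean(x))                 # integrated profile
        fluct = []
        for s in scales:
            rms = []
            for w in range(len(y) // s):
                seg = y[w * s:(w + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            fluct.append(np.mean(rms))
        return np.array(fluct)

    signal = np.random.default_rng(6).normal(size=4000)        # white-noise stand-in
    scales = np.array([16, 32, 64, 128, 256])
    F = dfa(signal, scales)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(f"DFA scaling exponent alpha ~ {alpha:.2f}")          # expect ~0.5 for white noise
    ```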

  1. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of a MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.

  2. Local to Global Scale Time Series Analysis of US Dryland Degradation Using Landsat, AVHRR, and MODIS

    Science.gov (United States)

    Washington-Allen, R. A.; Ramsey, R. D.; West, N. E.; Kulawardhana, W.; Reeves, M. C.; Mitchell, J. E.; Van Niel, T. G.

    2011-12-01

    Drylands cover 41% of the terrestrial land surface and annually generate $1 trillion in ecosystem goods and services for 38% of the global population, yet estimates of the global extent of Dryland degradation are uncertain, with a range of 10 - 80%. It is currently understood that Drylands exhibit topological complexity including self-organization of parameters of different levels-of-organization, e.g., ecosystem and landscape parameters such as soil and vegetation pattern and structure, that gradually or discontinuously shift to multiple basins of attraction in response to herbivory, fire, and climatic drivers at multiple spatial and temporal scales. Our research has shown that at large geographic scales, contemporaneous time series of 10 to 20 years for response and driving variables across two or more spatial scales are required to replicate and differentiate between the impact of climate and land use activities such as commercial grazing. For example, the Pacific Decadal Oscillation (PDO) is a major driver of Dryland net primary productivity (NPP), biodiversity, and ecological resilience with a 10-year return interval, thus 20 years of data are required to replicate its impact. Degradation is defined here as a change in physiognomic composition contrary to management goals, a persistent reduction in vegetation response, e.g., NPP, accelerated soil erosion, a decline in soil quality, and changes in landscape configuration and structure that lead to a loss of ecosystem function. Freely available Landsat, Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) archives of satellite imagery exist that provide local to global spatial coverage and time series from 1972 to the present from which proxies of land degradation can be derived. This paper presents time series assessments between 1972 and 2011 of US Dryland degradation including early detection of dynamic regime shifts in the Mojave and landscape pattern and

  3. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling.

    Science.gov (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M

    2009-11-01

    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinate of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to result in decreased statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-) communities.

  4. Phenological response of vegetation to upstream river flow in the Heihe Rive basin by time series analysis of MODIS data

    Directory of Open Access Journals (Sweden)

    L. Jia

    2011-03-01

    Full Text Available Liquid and solid precipitation is abundant in the high elevation, upper reach of the Heihe River basin in northwestern China. The development of modern irrigation schemes in the middle reach of the basin is taking up an increasing share of fresh water resources, endangering the oasis and traditional irrigation systems in the lower reach. In this study, the response of vegetation in the Ejina Oasis in the lower reach of the Heihe River to the water yield of the upper catchment was analyzed by time series analysis of monthly observations of precipitation in the upper and lower catchment, river streamflow downstream of the modern irrigation schemes and satellite observations of vegetation index. Firstly, remotely sensed NDVI data acquired by Terra-MODIS are used to monitor the vegetation dynamic for a seven-year period between 2000 and 2006. Due to cloud-contamination, atmospheric influence and different solar and viewing angles, however, the quality and consistency of time series of remotely sensed NDVI data are degraded. A Fourier Transform method – the Harmonic Analysis of Time Series (HANTS) algorithm – is used to reconstruct cloud- and noise-free NDVI time series data from the Terra-MODIS NDVI dataset. Modifications are made to HANTS by adding additional parameters to deal with large data gaps in yearly time series, in combination with a Temporal-Similarity-Statistics (TSS) method developed in this study to seek initial values for the large gap periods. Secondly, the same Fourier Transform method is used to model time series of the vegetation phenology. The reconstructed cloud-free NDVI time series data are used to study the relationship between the water availability (i.e. the local precipitation and upstream water yield) and the evolution of vegetation conditions in Ejina Oasis from 2000 to 2006. Anomalies in precipitation, streamflow, and vegetation index are detected by comparing each year with the average year. The results showed that
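
    A bare-bones sketch of the harmonic-fitting idea behind HANTS: model an annual NDVI curve as a mean plus a few sine/cosine terms fitted by least squares and evaluate the fit at every composite date to replace contaminated values. The real HANTS algorithm adds iterative outlier rejection, which is omitted here; the NDVI values, harmonic count and cloud dip are synthetic assumptions.

    ```python
    # Harmonic least-squares reconstruction of a gappy/cloudy annual NDVI series.
    import numpy as np

    doy = np.arange(0, 365, 16)                                 # 16-day composites
    rng = np.random.default_rng(10)
    ndvi = 0.4 + 0.25 * np.sin(2 * np.pi * (doy - 120) / 365) + rng.normal(0, 0.03, doy.size)
    ndvi[5:8] = 0.05                                            # simulated cloud-contaminated dip

    n_harmonics = 2
    cols = [np.ones_like(doy, dtype=float)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * doy / 365), np.sin(2 * np.pi * k * doy / 365)]
    A = np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(A, ndvi, rcond=None)
    reconstructed = A @ coef
    # The fit partially corrects the dip; full HANTS would iterate, dropping outliers.
    print(np.round(reconstructed[5:8], 2))
    ```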

  5. Comparing and Contrasting Traditional Membrane Bioreactor Models with Novel Ones Based on Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Parneet Paul

    2013-02-01

    Full Text Available The computer modelling and simulation of wastewater treatment plants and their specific technologies, such as membrane bioreactors (MBRs), are becoming increasingly useful to consultant engineers when designing, upgrading, retrofitting, operating and controlling these plants. This research uses traditional phenomenological mechanistic models based on MBR filtration and biochemical processes to measure the effectiveness of alternative, novel time series models based upon input–output system identification methods. Both model types are calibrated and validated using similar plant layouts and data sets derived for this purpose. Results show that although both approaches have their advantages, each also has specific disadvantages. In conclusion, the MBR plant designer and/or operator who wishes to use good-quality, calibrated models to gain a better understanding of their process should carefully consider which model type to select based on their initial modelling objectives. Each situation usually proves unique.

  6. SHRINKING THE UNCERTAINTY IN ONLINE SALES PREDICTION WITH TIME SERIES ANALYSIS

    Directory of Open Access Journals (Sweden)

    Rashmi Ranjan Dhal

    2014-10-01

    Full Text Available In any production environment, processing is centered on the manufacture of products. It is important to get adequate volumes of orders for those products. However, merely getting orders is not enough for the long-term sustainability of multinationals. They need to know the demand for their products well in advance in order to compete and win in a highly competitive market. To assess the demand for a product we need to track its order behavior and predict the future response of customers based on both present and historical datasets. In this paper we propose a systematic, time-series based scheme to perform this task using the Hadoop framework and the Holt-Winters prediction function in the R environment to produce sales forecasts for forthcoming years.
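
    The abstract uses the Holt-Winters prediction function in R; a rough Python equivalent with statsmodels is sketched below for a hypothetical monthly order-volume series. The data, seasonal settings, and horizon are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Hypothetical monthly order volumes with a trend and yearly seasonality.
idx = pd.date_range("2010-01-01", periods=48, freq="MS")
rng = np.random.default_rng(1)
orders = pd.Series(
    1000 + 10 * np.arange(48) + 100 * np.sin(2 * np.pi * np.arange(48) / 12)
    + rng.normal(0, 20, 48),
    index=idx,
)

# Additive Holt-Winters model with a 12-month season.
model = ExponentialSmoothing(orders, trend="add", seasonal="add",
                             seasonal_periods=12).fit()

# Forecast the next year of sales.
forecast = model.forecast(12)
print(forecast.round(1))
```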

  7. Real-Time Detection of Application-Layer DDoS Attack Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tongguang Ni

    2013-01-01

    Full Text Available Distributed denial of service (DDoS) attacks are one of the major threats to the current Internet, and application-layer DDoS attacks utilizing legitimate HTTP requests to overwhelm victim resources are particularly hard to detect. Consequently, neither intrusion detection systems (IDS) nor the victim server can easily detect the malicious packets. In this paper, a novel approach to detect application-layer DDoS attacks is proposed based on the entropy of HTTP GET requests per source IP address (HRPI). By approximating the adaptive autoregressive (AAR) model, the HRPI time series is transformed into a multidimensional vector series. Then, a trained support vector machine (SVM) classifier is applied to identify the attacks. Experiments with several databases were performed, and the results show that this approach can detect application-layer DDoS attacks effectively.
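
    A highly simplified sketch of the detection pipeline described above: it computes the Shannon entropy of HTTP GET requests per source IP over windows and feeds short lag vectors of the resulting HRPI series to an SVM classifier. The AAR-model transformation used by the authors is replaced here by plain lag vectors, and all traffic data and labels are synthetic.

```python
import numpy as np
from sklearn.svm import SVC

def request_entropy(ip_counts):
    """Shannon entropy of GET-request counts per source IP in one window."""
    p = np.asarray(ip_counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log2(p)).sum())

def lag_vectors(series, n_lags=5):
    """Turn the HRPI series into overlapping lag vectors (stand-in for the AAR step)."""
    return np.array([series[i:i + n_lags]
                     for i in range(len(series) - n_lags)])

rng = np.random.default_rng(2)
# Synthetic HRPI series: normal traffic spreads requests over many sources (high
# entropy); an attack concentrates requests on a few sources (lower entropy).
normal = [request_entropy(rng.poisson(5, size=200)) for _ in range(300)]
attack = [request_entropy(rng.poisson(5, size=200)
                          + np.where(np.arange(200) < 5, 500, 0))
          for _ in range(300)]

X = np.vstack([lag_vectors(np.array(normal)), lag_vectors(np.array(attack))])
y = np.array([0] * (len(normal) - 5) + [1] * (len(attack) - 5))

clf = SVC(kernel="rbf").fit(X, y)
print("training accuracy:", clf.score(X, y))
```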

  8. Studies in astronomical time series analysis. IV - Modeling chaotic and random processes with linear filters

    Science.gov (United States)

    Scargle, Jeffrey D.

    1990-01-01

    While chaos arises only in nonlinear systems, standard linear time series models are nevertheless useful for analyzing data from chaotic processes. This paper introduces such a model, the chaotic moving average. This time-domain model is based on the theorem that any chaotic process can be represented as the convolution of a linear filter with an uncorrelated process called the chaotic innovation. A technique, minimum phase-volume deconvolution, is introduced to estimate the filter and innovation. The algorithm measures the quality of a model using the volume covered by the phase-portrait of the innovation process. Experiments on synthetic data demonstrate that the algorithm accurately recovers the parameters of simple chaotic processes. Though tailored for chaos, the algorithm can detect both chaos and randomness, distinguish them from each other, and separate them if both are present. It can also recover nonminimum-delay pulse shapes in non-Gaussian processes, both random and chaotic.

  9. Foreign Remittances, Foreign Direct Investment, Foreign Imports and Economic Growth in Pakistan: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Muhammad Tahir

    2015-10-01

    Full Text Available This empirical research paper focuses on establishing a relationship between external determinants and the economic growth of the Pakistan economy. Empirical analyses are carried out with time series econometric techniques using data over the period 1977-2013. The main finding is that external determinants such as foreign remittances, foreign direct investment, and foreign imports matter from a growth perspective. Foreign remittances and foreign direct investment have a significant positive role in the growth process of the Pakistan economy. Furthermore, it is found that foreign imports have adversely influenced the economic growth of Pakistan. The study recommends that policy makers should take appropriate steps to increase the inflow of both foreign remittances and foreign direct investment in order to achieve long-run economic growth.

  10. A time-series analysis of flood disaster around Lena river using Landsat TM/ETM+

    Science.gov (United States)

    Sakai, Toru; Hatta, Shigemi; Okumura, Makoto; Takeuchi, Wataru; Hiyama, Tetsuya; Inoue, Gen

    2010-05-01

    The Landsat satellites have provided a continuous record of earth observation since 1972, with gradually improving sensors (i.e. MSS, TM and ETM+). Processed archives of Landsat imagery are now available free of charge on the internet. Landsat imagery, with 30 m spatial resolution and multiple spectral bands between 450 and 2350 nm, is appropriate for detailed mapping of natural resources over wide geographical areas. However, one of the biggest concerns in the use of Landsat imagery is the uncertainty in the timing of acquisitions. Although detection of land cover change usually requires acquisitions before and after the change, suitable Landsat images are often unavailable because of the long revisit interval (16 days) and atmospheric variation. A nearly cloud-free image is acquired at least once per year (out of a total of 22 or 23 scenes per year). Therefore, it may be difficult to acquire appropriate images for monitoring natural disturbances occurring at short time intervals (e.g., floods, forest fires and hurricanes). Our objectives are: (1) to examine whether a time series of Landsat images can be used for monitoring a flood disaster, and (2) to evaluate the impact and timing of the flood disaster around the Lena River in Siberia. A set of Landsat TM/ETM+ satellite images was used to enable acquisition of cloud-free imagery, although Landsat ETM+ images suffer from the Scan Line Corrector (SLC) failure from May 2003 onward. The overlap area of a time series of 20 Landsat TM/ETM+ images (path 120-122, row 17) from April 2007 to August 2007 was clipped (approximately 33 km × 90 km), and the remaining area was excluded from the analyses. Image classification was performed on each image separately using an unsupervised ISODATA method, and each Landsat TM/ETM+ image was classified into three land cover types: (1) ice, (2) water, and (3) land. From these three land cover types, the area of the Lena River was estimated. The area of the Lena River changed dramatically after the spring breakup. The middle part of Lena river around
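
    The classification step above used an unsupervised ISODATA method; as a rough illustration of the same idea (clustering each scene's pixels into ice, water, and land classes), the sketch below uses k-means from scikit-learn as a simpler stand-in for ISODATA on a hypothetical multi-band image array.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical Landsat-like scene: rows x cols x bands reflectance array.
rng = np.random.default_rng(3)
scene = rng.random((200, 300, 6))

# Reshape to (pixels, bands) and cluster into three spectral classes,
# a simplified stand-in for the ISODATA classification used in the study.
pixels = scene.reshape(-1, scene.shape[-1])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
classified = labels.reshape(scene.shape[:2])

# Area (in pixels) assigned to each class; deciding which cluster is ice,
# water, or land would require inspecting the cluster centroids.
print(np.bincount(classified.ravel()))
```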

  11. Time Series Analysis of Photovoltaic Soiling Station Data: Version 1.0, August 2017

    Energy Technology Data Exchange (ETDEWEB)

    Micheli, Leonardo [National Renewable Energy Lab. (NREL), Golden, CO (United States); Muller, Matthew T. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Deceglie, Michael G. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ruth, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-09-11

    The time series data from PV soiling stations operating in the USA at different time periods are analyzed and presented. The current version of the paper includes twenty stations operating between 2013 and 2016, but the paper is intended to be updated periodically as more stations and more data become available. The challenges in working with soiling station data are discussed, including measurement methodology, quality control, and measurement uncertainty. The soiling profiles of the stations are made available so that the PV community can use these data to guide operations and maintenance decisions, estimate soiling derate in performance models, and more generally come to a better understanding of the challenges associated with the variability of PV soiling.

  12. Housefly population density correlates with shigellosis among children in Mirzapur, Bangladesh: a time series analysis.

    Directory of Open Access Journals (Sweden)

    Tamer H Farag

    Full Text Available BACKGROUND: Shigella infections are a public health problem in developing and transitional countries because of high transmissibility, severity of clinical disease, widespread antibiotic resistance and lack of a licensed vaccine. Whereas Shigellae are known to be transmitted primarily by direct fecal-oral contact and less commonly by contaminated food and water, the role of the housefly Musca domestica as a mechanical vector of transmission is less appreciated. We sought to assess the contribution of houseflies to Shigella-associated moderate-to-severe diarrhea (MSD) among children less than five years old in Mirzapur, Bangladesh, a site where shigellosis is hyperendemic, and to model the potential impact of a housefly control intervention. METHODS: Stool samples from 843 children presenting to Kumudini Hospital during 2009-2010 with new episodes of MSD (diarrhea accompanied by dehydration, dysentery or hospitalization) were analyzed. Housefly density was measured twice weekly in six randomly selected sentinel households. Poisson time series regression was performed and autoregression-adjusted attributable fractions (AFs) were calculated using the Bruzzi method, with standard errors obtained via a jackknife procedure. FINDINGS: Dramatic springtime peaks in housefly density in 2009 and 2010 were followed one to two months later by peaks of Shigella-associated MSD among toddlers and pre-school children. Poisson time series regression showed that housefly density was associated with Shigella cases at three lags (six weeks) (Incidence Rate Ratio = 1.39 [95% CI: 1.23 to 1.58] for each log increase in fly count), an association that was not confounded by ambient air temperature. Autocorrelation-adjusted AF calculations showed that a housefly control intervention could have prevented approximately 37% of the Shigella cases over the study period. INTERPRETATION: Houseflies may play an important role in the seasonal transmission of Shigella in some developing
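
    A minimal sketch of a Poisson time-series regression with a lagged exposure, in the spirit of the analysis described above: it regresses weekly case counts on log fly counts lagged by six weeks using statsmodels. All data, the lag choice, and the variable names are illustrative assumptions, not the study's data or code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
weeks = 200

# Synthetic weekly housefly counts with a springtime peak, and case counts whose
# expected value depends on the log fly count six weeks earlier.
flies = 30 + 15 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 2, weeks)
lagged_log_flies = pd.Series(np.log(flies)).shift(6)
cases = rng.poisson(np.exp(0.5 + 0.4 * lagged_log_flies.fillna(lagged_log_flies.mean())))

df = pd.DataFrame({"cases": cases, "log_flies_lag6": lagged_log_flies}).dropna()

# Poisson GLM: the exponentiated coefficient is an incidence rate ratio
# per unit increase in the lagged log fly count.
model = sm.GLM(df["cases"], sm.add_constant(df["log_flies_lag6"]),
               family=sm.families.Poisson()).fit()
print("IRR per log increase in fly count:", np.exp(model.params["log_flies_lag6"]))
```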

  13. Improved efficiency of maximum likelihood analysis of time series with temporally correlated errors

    Science.gov (United States)

    Langbein, John

    2017-08-01

    Most time series of geophysical phenomena have temporally correlated errors. From these measurements, various parameters are estimated. For instance, from geodetic measurements of positions, the rates and changes in rates are often estimated and are used to model tectonic processes. Along with the estimates of the size of the parameters, the error in these parameters needs to be assessed. If temporal correlations are not taken into account, or each observation is assumed to be independent, it is likely that any estimate of the error of these parameters will be too low and the estimated value of the parameter will be biased. Inclusion of better estimates of uncertainties is limited by several factors, including selection of the correct model for the background noise and the computational requirements to estimate the parameters of the selected noise model for cases where there are numerous observations. Here, I address the second problem of computational efficiency using maximum likelihood estimates (MLE). Most geophysical time series have background noise processes that can be represented as a combination of white and power-law noise, 1/f^α with frequency f. With missing data, standard spectral techniques involving FFTs are not appropriate. Instead, time domain techniques involving construction and inversion of large data covariance matrices are employed. Bos et al. (J Geod, 2013. doi: 10.1007/s00190-012-0605-0) demonstrate one technique that substantially increases the efficiency of the MLE methods, yet is only an approximate solution for power-law indices >1.0 since they require the data covariance matrix to be Toeplitz. That restriction can be removed by simply forming a data filter that adds noise processes rather than combining them in quadrature. Consequently, the inversion of the data covariance matrix is simplified yet provides robust results for a wider range of power-law indices.

  14. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005)

    Science.gov (United States)

    Van Engeland, T.; Soetaert, K.; Knuijt, A.; Laane, R. W. P. M.; Middelburg, J. J.

    2010-09-01

    Dissolved organic nitrogen (DON) dynamics in the North Sea were explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good, with few missing data points. Different imputation methods were used to verify the robustness of the patterns against these missing data. No long-term trends in DON concentrations were found over the sampling period (1995-2005). Inter-annual variability in the different time series showed both common and station-specific behavior. The stations could be divided into two regions, based on absolute concentrations and the dominant time scales of variability. Average DON concentrations were 11 μmol l-1 in the coastal region and 5 μmol l-1 in the open sea. Organic fractions of total dissolved nitrogen (TDN) averaged 38 and 71% in the coastal zone and open sea, respectively, but increased over time due to decreasing dissolved inorganic nitrogen (DIN) concentrations. In both regions intra-annual variability dominated over inter-annual variability, but DON variation in the open sea was markedly shifted towards shorter time scales relative to coastal stations. In the coastal zone a consistent seasonal DON cycle existed, with high values in spring-summer and low values in autumn-winter. In the open sea seasonality was weak. A marked shift in the seasonality was found at the Dogger Bank, with DON accumulation towards summer and low values in winter prior to 1999, and accumulation in spring and decline throughout summer after 1999. This study clearly shows that DON is a dynamic actor in the North Sea and should be monitored systematically to enable us to fully understand the functioning of this ecosystem.

  15. Short time series analysis of Didymosphenia geminata blooming in the Oreti River, New Zealand

    Science.gov (United States)

    Garcia, T.; Kilroy, C.; Larned, S.; Packman, A. I.; Kumar, P.

    2010-12-01

    The mat-forming diatom Didymosphenia geminata was introduced to New Zealand in 2004, and subsequently spread to many rivers on the South Island. D. geminata mats are exceptionally dense and thick. Extensive blooms of this introduced organism have substantially modified the benthic environment in many New Zealand rivers, but the factors that contribute to D. geminata blooming are not well understood. We synthesized a sequence of observations of D. geminata areal coverage and thickness to examine physical and chemical controls on the growth and persistence of D. geminata. We analyzed the best available time series on the distribution of this organism in New Zealand: observations in the Oreti River every 15 days spanning April 2006 to May 2007. During this period, mean D. geminata coverage of the river bed was ~52% and the mean mat thickness was ~6 mm. Relationships between time-series observations of D. geminata and 13 different physical and chemical variables were analyzed using linear and nonlinear methods. Areal cover and thickness of D. geminata mats were found to be influenced by both slow and fast dynamic processes. The spread of the organism, in terms of % cover, was highly correlated with conductivity, ammonium, nitrate, dissolved oxygen, and total nitrogen at short time lags (fast dynamics). Moreover, water clarity, cloud cover, and flow were highly correlated with % cover at long time lags, indicating that these conditions exert long-term control on D. geminata growth. Areal coverage and thickness were found to be highly correlated, but the variables associated with slow and fast dynamics of these two measures were not identical. The variables found to be highly correlated with D. geminata thickness and representing fast dynamics were temperature, dissolved oxygen, conductivity, nitrate, and total nitrogen. Additionally, the variables influencing the slow dynamics of D. geminata thickness were flow, water clarity, turbidity and total phosphorus.
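
    A small sketch of the lagged-correlation screening implied above: it computes the Pearson correlation between percent cover and an environmental variable over a range of time lags, separating fast (short-lag) from slow (long-lag) associations. The series, the lag range, and the variable names are synthetic assumptions for illustration only.

```python
import numpy as np

def lagged_correlations(x, y, max_lag=10):
    """Correlation of y with x shifted earlier by 0..max_lag steps."""
    out = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            out[lag] = np.corrcoef(x, y)[0, 1]
        else:
            out[lag] = np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return out

rng = np.random.default_rng(5)
n = 30  # roughly fortnightly observations over a year, as in the study
conductivity = rng.normal(100, 10, n)
# Synthetic percent cover that responds to conductivity with a 2-step delay.
cover = 50 + 0.8 * np.roll(conductivity - 100, 2) + rng.normal(0, 3, n)

for lag, r in lagged_correlations(conductivity, cover, max_lag=5).items():
    print(f"lag {lag}: r = {r:+.2f}")
```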

  16. Trend analysis of air temperature and precipitation time series over Greece: 1955-2010

    Science.gov (United States)

    Marougianni, G.; Melas, D.; Kioutsioukis, I.; Feidas, H.; Zanis, P.; Anandranistakis, E.

    2012-04-01

    In this study, a database of air temperature and precipitation time series from the network of the Hellenic National Meteorological Service has been developed in the framework of the project GEOCLIMA, co-financed by the European Union and Greek national funds through the Operational Program "Competitiveness and Entrepreneurship" of the Research Funding Program COOPERATION 2009. Initially, a quality test was applied to the raw data, and missing observations were then imputed with a regularized spatial-temporal expectation-maximization algorithm to complete the climatic record. Next, a quantile-matching algorithm was applied in order to verify the homogeneity of the data. The processed time series were used for the calculation of annual and seasonal trends of air temperature and precipitation. Monthly maximum and minimum surface air temperature and precipitation means at all available stations in Greece were analyzed for temporal trends and spatial variation patterns for the longest common period of homogeneous data (1955-2010), applying the Mann-Kendall test. The majority of the examined stations showed a significant increase in summer maximum and minimum temperatures; this could possibly be physically linked to the Etesian winds, because of the less frequent expansion of the low over the southeastern Mediterranean. Summer minimum temperatures have been increasing at a faster rate than summer maximum temperatures, reflecting an asymmetric change of the extreme temperature distributions. Total annual precipitation has decreased significantly at the stations located in western Greece, as well as in the southeast, while the remaining areas exhibit a non-significant negative trend. This reduction is very likely linked to the positive phase of the NAO, which resulted in an increase in the frequency and persistence of anticyclones over the Mediterranean.
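
    The Mann-Kendall test applied above can be written compactly; a minimal sketch (without tie corrections or the seasonal variant, and with a synthetic annual temperature series) is shown below.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Basic Mann-Kendall trend test (no tie correction)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise differences.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))  # two-sided p-value
    return z, p

# Synthetic annual summer minimum temperatures, 1955-2010, with a warming trend.
rng = np.random.default_rng(6)
years = np.arange(1955, 2011)
temps = 20 + 0.02 * (years - 1955) + rng.normal(0, 0.4, len(years))

z, p = mann_kendall(temps)
print(f"Z = {z:.2f}, p = {p:.4f}")
```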

  17. Effect of reclassification of cannabis on hospital admissions for cannabis psychosis: a time series analysis.

    Science.gov (United States)

    Hamilton, Ian; Lloyd, Charlie; Hewitt, Catherine; Godfrey, Christine

    2014-01-01

    The UK Misuse of Drugs Act (1971) divided controlled drugs into three classes, A, B and C, with descending criminal sanctions attached to each class. Cannabis was originally assigned by the Act to Class B, but in 2004 it was transferred to the lowest-risk class, Class C. Then in 2009, on the basis of increasing concerns about a link between high-strength cannabis and schizophrenia, it was moved back to Class B. The aim of this study is to test the assumption that changes in classification lead to changes in levels of psychosis. In particular, it explores whether the two changes in 2004 and 2009 were associated with changes in the numbers of people admitted for cannabis psychosis. An interrupted time series was used to investigate the relationship between the two changes in cannabis classification and their impact on hospital admissions for cannabis psychosis. Reflecting the two policy changes, two interruptions to the time series were made. Hospital Episode Statistics admissions data were analysed covering the period 1999 through to 2010. There was a significantly increasing trend in cannabis psychosis admissions from 1999 to 2004. However, following the reclassification of cannabis from B to C in 2004, there was a significant change in the trend such that cannabis psychosis admissions declined to 2009. Following the second reclassification of cannabis back to Class B in 2009, there was a significant change to increasing admissions. This study shows a statistical association between the reclassification of cannabis and hospital admissions for cannabis psychosis in the opposite direction to that predicted by the presumed relationship between the two. However, the reasons for this statistical association are unclear. It is unlikely to be due to changes in cannabis use over this period. Other possible explanations include changes in policing and systemic changes in mental health services unrelated to classification decisions. Copyright © 2013 Elsevier B.V. All rights
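
    A minimal sketch of an interrupted time-series regression with two interruptions, in the spirit of the study above: monthly admissions are modeled with a baseline trend plus slope changes after each policy date. The data are synthetic, the model is plain OLS via statsmodels, and the seasonal and autocorrelation adjustments a full analysis would need are omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
months = pd.date_range("1999-01-01", "2010-12-01", freq="MS")
t = np.arange(len(months))

# Indices of the two policy changes (2004 and 2009 reclassifications).
t1 = months.get_loc("2004-01-01")
t2 = months.get_loc("2009-01-01")

# Synthetic admissions: rising, declining after 2004, rising again after 2009.
trend = 0.5 * t - 0.9 * np.clip(t - t1, 0, None) + 1.2 * np.clip(t - t2, 0, None)
admissions = 100 + trend + rng.normal(0, 5, len(t))

df = pd.DataFrame({
    "admissions": admissions,
    "time": t,
    "after1": np.clip(t - t1, 0, None),  # slope change after the first change
    "after2": np.clip(t - t2, 0, None),  # slope change after the second change
})

# Segmented (interrupted) regression: coefficients on after1/after2 are the
# changes in trend associated with each reclassification.
fit = smf.ols("admissions ~ time + after1 + after2", data=df).fit()
print(fit.params.round(3))
```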

  18. Single event time series analysis in a binary karst catchment evaluated using a groundwater model (Lurbach system, Austria).

    Science.gov (United States)

    Mayaud, C; Wagner, T; Benischke, R; Birk, S

    2014-04-16

    The Lurbach karst system (Styria, Austria) is drained by two major springs and replenished by both autogenic recharge from the karst massif itself and a sinking stream that originates in low permeable schists (allogenic recharge). Detailed data from two events recorded during a tracer experiment in 2008 demonstrate that an overflow from one of the sub-catchments to the other is activated if the discharge of the main spring exceeds a certain threshold. Time series analysis (autocorrelation and cross-correlation) was applied to examine to what extent the various available methods support the identification of the transient inter-catchment flow observed in this binary karst system. As inter-catchment flow is found to be intermittent, the evaluation was focused on single events. In order to support the interpretation of the results from the time series analysis a simplified groundwater flow model was built using MODFLOW. The groundwater model is based on the current conceptual understanding of the karst system and represents a synthetic karst aquifer for which the same methods were applied. Using the wetting capability package of MODFLOW, the model simulated an overflow similar to what has been observed during the tracer experiment. Various intensities of allogenic recharge were employed to generate synthetic discharge data for the time series analysis. In addition, geometric and hydraulic properties of the karst system were varied in several model scenarios. This approach helps to identify effects of allogenic recharge and aquifer properties in the results from the time series analysis. Comparing the results from the time series analysis of the observed data with those of the synthetic data a good agreement was found. For instance, the cross-correlograms show similar patterns with respect to time lags and maximum cross-correlation coefficients if appropriate hydraulic parameters are assigned to the groundwater model. The comparable behaviors of the real and the

  19. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    Science.gov (United States)

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework on time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Landsat time series analysis documents beaver migration into permafrost landscapes of arctic Alaska

    Science.gov (United States)

    Jones, B. M.; Tape, K. D.; Nitze, I.; Arp, C. D.; Grosse, G.; Zimmerman, C. E.

    2017-12-01

    Landscape-scale impacts of climate change in the Arctic include increases in growing season length, shrubby vegetation, winter river discharge, snowfall, summer and winter water temperatures, and decreases in river and lake ice thickness. Combined, these changes may have created conditions that are suitable for beaver colonization of low Arctic tundra regions. We developed a semi-automated workflow that analyzes Landsat imagery time series to determine the extent to which beavers may have colonized permafrost landscapes in arctic Alaska since 1999. We tested this approach on the Lower Noatak, Wulik, and Kivalina river watersheds in northwest Alaska and identified 83 locations representing potential beaver activity. Seventy locations indicated wetting trends and 13 indicated drying trends. Verification of each site using high-resolution satellite imagery showed that 80 % of the wetting locations represented beaver activity (damming and pond formation), 11 % were unrelated to beavers, and 9 % could not readily be distinguished as being beaver related or not. For the drying locations, 31 % represented beaver activity (pond drying due to dam abandonment), 62 % were unrelated to beavers, and 7 % were undetermined. Comparison of the beaver activity database with historic aerial photography from ca. 1950 and ca. 1980 indicates that beavers have recently colonized or recolonized riparian corridors in northwest Alaska. Remote sensing time series observations associated with the migration of beavers in permafrost landscapes in arctic Alaska include thermokarst lake expansion and drainage, thaw slump initiation, ice wedge degradation, thermokarst shore fen development, and possibly development of lake and river taliks. Additionally, beaver colonization in the Arctic may alter channel courses, thermal regimes, hyporheic flow, riparian vegetation, and winter ice regimes that could impact ecosystem structure and function in this region. In particular, the combination of beaver

  1. InSAR Time Series Analysis of Dextral Strain Partitioning Across the Burma Plate

    Science.gov (United States)

    Reitman, N. G.; Wang, Y.; Lin, N.; Lindsey, E. O.; Mueller, K. J.

    2017-12-01

    Oblique convergence between the India and Sunda plates creates partitioning of strike-slip and compressional strain across the Burma plate. GPS data indicate up to 40 mm/yr (Steckler et al 2016) of dextral strain exists between the India and Sunda plates. The Sagaing fault in Myanmar accommodates 20 mm/yr at the eastern boundary of the Burma plate, but the location and magnitude of dextral strain on other faults remains an open question, as does the relative importance of seismic vs aseismic processes. The remaining 20 mm/yr of dextral strain may be accommodated on one or two faults or widely distributed on faults across the Burma plate, scenarios that have a major impact on seismic hazard. However, the dense GPS data necessary for precise determination of which faults accommodate how much strain do not exist yet. Previous studies using GPS data ascribe 10-18 mm/yr dextral strain on the Churachandpur Mao fault in India (Gahaluat et al 2013, Steckler et al 2016) and 18-22 mm/yr on the northern Sagaing fault (Maurin et al 2010, Steckler et al 2016), leaving up to 10 mm/yr unconstrained. Several of the GPS results are suggestive of shallow aseismic slip along parts of these faults, which, if confirmed, would have a significant impact on our understanding of hazard in the area. Here, we use differential InSAR analyzed in time series to investigate dextral strain on the Churachandpur Mao fault and across the Burma plate. Ascending ALOS-1 imagery spanning 2007-2010 were processed in time series for three locations. Offsets in phase and a strong gradient in line-of-sight deformation rate are observed across the Churachandpur Mao fault, and work is ongoing to determine if these are produced by shallow fault movement, topographic effects, or both. The results of this study will provide further constraints for strain rate on the Churachandpur Mao fault, and yield a more complete understanding of strain partitioning across the Burma plate.

  2. Time-series analysis to study the impact of an intersection on dispersion along a street canyon.

    Science.gov (United States)

    Richmond-Bryant, Jennifer; Eisner, Alfred D; Hahn, Intaek; Fortune, Christopher R; Drake-Richman, Zora E; Brixey, Laurie A; Talih, M; Wiener, Russell W; Ellenson, William D

    2009-12-01

    This paper presents data analysis from the Brooklyn Traffic Real-Time Ambient Pollutant Penetration and Environmental Dispersion (B-TRAPPED) study to assess the transport of ultrafine particulate matter (PM) across urban intersections. Experiments were performed in a street canyon perpendicular to a highway in Brooklyn, NY, USA. Real-time ultrafine PM samplers were positioned on either side of an intersection at multiple locations along a street to collect time-series number concentration data. Meteorology equipment was positioned within the street canyon and at an upstream background site to measure wind speed and direction. Time-series analysis was performed on the PM data to compute a transport velocity along the direction of the street for the cases where background winds were parallel and perpendicular to the street. The data were analyzed for sampler pairs located (1) on opposite sides of the intersection and (2) on the same block. The time-series analysis demonstrated along-street transport, including across the intersection when background winds were parallel to the street canyon and there was minimal transport and no communication across the intersection when background winds were perpendicular to the street canyon. Low but significant values of the cross-correlation function (CCF) underscore the turbulent nature of plume transport along the street canyon. The low correlations suggest that flow switching around corners or traffic-induced turbulence at the intersection may have aided dilution of the PM plume from the highway. This observation supports similar findings in the literature. Furthermore, the time-series analysis methodology applied in this study is introduced as a technique for studying spatiotemporal variation in the urban microscale environment.

  3. On the forecast of runoff based on the harmonic analysis of time series of precipitation in the catchment area

    Science.gov (United States)

    Cherednichenko, A. V.; Cherednichenko, A. V.; Cherednichenko, V. S.

    2018-01-01

    It is shown that a significant relationship exists between the most important harmonics, extracted by harmonic analysis of precipitation time series over river catchment areas, and the amount of runoff. This allowed us to predict runoff for a period of up to 20 years, assuming that the main parameters of the harmonics are preserved over the forecast interval. The results of such a forecast for three river basins of Kazakhstan are presented.

  4. Reconstructing disturbance history for an intensively mined region by time-series analysis of Landsat imagery.

    Science.gov (United States)

    Li, Jing; Zipper, Carl E; Donovan, Patricia F; Wynne, Randolph H; Oliphant, Adam J

    2015-09-01

    Surface mining disturbances have attracted attention globally due to their extensive influence on topography, land use, ecosystems, and human populations in mineral-rich regions. We analyzed a time series of Landsat satellite imagery to produce a 28-year disturbance history for surface coal mining in a segment of the eastern USA's central Appalachian coalfield, southwestern Virginia. The method was developed and applied as a three-step sequence: vegetation index selection, persistent vegetation identification, and mined-land delineation by year of disturbance. The overall classification accuracy and kappa coefficient were 0.9350 and 0.9252, respectively. Most surface coal mines were identified correctly by location and by time of initial disturbance. More than 8% of southwestern Virginia's >4000-km² coalfield area was disturbed by surface coal mining over the 28-year period. Approximately 19.5% of the Appalachian coalfield surface within the most intensively mined county (Wise County) has been disturbed by mining. Mining disturbances expanded steadily and progressively over the study period. The information generated can be applied to gain further insight concerning mining influences on ecosystems and other essential environmental features.
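
    For reference, the overall accuracy and kappa coefficient reported above can be computed from paired reference and mapped labels as sketched below (scikit-learn, with hypothetical validation labels; the values are illustrative only).

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Hypothetical validation sample: reference labels vs. mapped labels,
# e.g. 0 = unmined, 1 = mined.
reference = [0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
mapped    = [0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1]

overall_accuracy = accuracy_score(reference, mapped)
kappa = cohen_kappa_score(reference, mapped)
print(f"overall accuracy = {overall_accuracy:.4f}, kappa = {kappa:.4f}")
```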

  5. Anomaly Detection in Smart Metering Infrastructure with the Use of Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Tomasz Andrysiak

    2017-01-01

    Full Text Available The article presents solutions for anomaly detection in network traffic for critical smart metering infrastructure realized with the use of a radio sensor network. The structure of the examined smart meter network and the key security aspects which influence the correct performance of an advanced metering infrastructure (the possibility of passive and active cyberattacks) are described. An effective and quick anomaly detection method is proposed. At its initial stage, Cook's distance was used for detection and elimination of outlier observations. The prepared data were then used to estimate standard statistical models based on exponential smoothing, that is, Brown's, Holt's, and Winters' models. To estimate possible fluctuations in the forecasts of the implemented models, properly parameterized Bollinger Bands were used. Next, statistical relations between the estimated traffic model and its real variability were examined to detect abnormal behavior, which could indicate a cyberattack attempt. An update procedure for the standard models in case of significant real network traffic fluctuations was also proposed. The optimal parameter values of the statistical models were chosen by minimizing the forecast error. The results confirmed the efficiency of the presented method and the accuracy of the choice of the proper statistical model for the analyzed time series.
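
    A simplified sketch of the detection idea described above: a Brown-style simple exponential smoothing forecast of traffic volume with Bollinger-style bands around it, flagging observations whose residual falls outside the bands. The data are synthetic, and the Cook's-distance outlier screening and model-update procedure used by the authors are omitted.

```python
import numpy as np

def exponential_smoothing(series, alpha=0.3):
    """One-step-ahead forecasts from simple (Brown's) exponential smoothing."""
    forecasts = np.empty_like(series)
    forecasts[0] = series[0]
    for i in range(1, len(series)):
        forecasts[i] = alpha * series[i - 1] + (1 - alpha) * forecasts[i - 1]
    return forecasts

rng = np.random.default_rng(8)
traffic = 50 + rng.normal(0, 3, 500)
traffic[400:405] += 25  # injected anomaly mimicking abnormal meter traffic

forecasts = exponential_smoothing(traffic)
residuals = traffic - forecasts

# Bollinger-style bands: flag points whose residual exceeds k rolling standard deviations.
window, k = 30, 3
roll_std = np.array([residuals[max(0, i - window):i + 1].std()
                     for i in range(len(residuals))])
flags = np.abs(residuals) > k * roll_std
flags[:window] = False  # ignore the warm-up period
print("flagged indices:", np.where(flags)[0])
```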

  6. Time series analysis of Carbon Monoxide from MOPITT over the Asian Continent from 2000-2004

    Science.gov (United States)

    Bhattacharjee, P. S.; Roy, P.

    2005-12-01

    The human population continues to grow and large parts of the world are industrializing rapidly, causing changes in global atmospheric chemistry. Carbon monoxide (CO) is a poisonous gas in the troposphere when highly concentrated, and is produced by fossil fuel combustion, biomass burning and natural emissions from plants. It is also an important trace gas in the atmosphere and plays a major role in atmospheric chemistry. We present a study of CO from measurements by the MOPITT (Measurements of Pollution in The Troposphere) instrument (Level 3 gridded data) on the NASA Terra satellite over India and Eastern Asia for the period 2000-2004. Day- and night-time total column CO measurements are considered over selected regions in India, China, Thailand and Japan. The selected regions comprise industrial cities in the Asian continent which form the source of high CO in the atmosphere. The time series data do not show an overall increasing or decreasing trend, but CO is affected by seasonal variations, wind, and precipitation patterns. East Asian regions have higher and wider seasonal fluctuations than the Indian region. CO total column values over the Bay of Bengal are also high and can be explained through wind patterns from the land towards the ocean. Although the sources of CO are mostly confined to the land, it is transported globally through the atmosphere, and has high concentrations over the ocean.

  7. Sample preparation for phosphoproteomic analysis of circadian time series in Arabidopsis thaliana.

    Science.gov (United States)

    Krahmer, Johanna; Hindle, Matthew M; Martin, Sarah F; Le Bihan, Thierry; Millar, Andrew J

    2015-01-01

    Systems biological approaches to study the Arabidopsis thaliana circadian clock have mainly focused on transcriptomics while little is known about the proteome, and even less about posttranslational modifications. Evidence has emerged that posttranslational protein modifications, in particular phosphorylation, play an important role for the clock and its output. Phosphoproteomics is the method of choice for a large-scale approach to gain more knowledge about rhythmic protein phosphorylation. Recent plant phosphoproteomics publications have identified several thousand phosphopeptides. However, the methods used in these studies are very labor-intensive and therefore not suitable to apply to a well-replicated circadian time series. To address this issue, we present and compare different strategies for sample preparation for phosphoproteomics that are compatible with large numbers of samples. Methods are compared regarding number of identifications, variability of quantitation, and functional categorization. We focus on the type of detergent used for protein extraction as well as methods for its removal. We also test a simple two-fraction separation of the protein extract. © 2015 Elsevier Inc. All rights reserved.

  8. Electricity consumption-GDP nexus in Pakistan: A structural time series analysis

    International Nuclear Information System (INIS)

    Javid, Muhammad; Qayyum, Abdul

    2014-01-01

    This study investigates the relationships among electricity consumption, real economic activity, the real price of electricity and the UEDT (underlying energy demand trend) at the aggregate and sectoral levels, namely, for the residential, commercial, industrial, and agricultural sectors. To achieve this goal, an electricity demand function for Pakistan is estimated by applying the structural time series technique to annual data for the period from 1972 to 2012. In addition to identifying the size and significance of the price and income elasticities, this technique also uncovers the UEDT for the whole economy as well as for sub-sectors. The results suggest that the nature of the trend is not linear and deterministic but stochastic in form. The UEDT for the electricity usage of the commercial, agricultural and residential sectors shows an upward slope. This upward slope of the UEDT suggests that either energy-efficient equipment has not been introduced in these sectors or any energy efficiency improvements due to technical progress are outweighed by other exogenous factors. - Highlights: • Electricity demand function is estimated by applying the STSM approach. • The results suggest that the nature of the trend is stochastic in form. • Low price elasticity reflects a weak link between electricity price and demand. • Low price elasticity implies that demand did not react to changes in price
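
    The structural time series (STSM) approach above decomposes demand into a stochastic trend (the UEDT) plus regression effects; a very reduced sketch using statsmodels' UnobservedComponents with synthetic annual data and illustrative variable names is shown below. It is not the authors' specification, only an indication of how such a model can be set up.

```python
import numpy as np
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(9)
years = np.arange(1972, 2013)
n = len(years)

# Synthetic logs of income, electricity price, and electricity demand.
log_income = np.cumsum(rng.normal(0.04, 0.01, n))
log_price = np.cumsum(rng.normal(0.0, 0.02, n))
log_demand = (0.8 * log_income - 0.3 * log_price
              + np.cumsum(rng.normal(0.01, 0.02, n))   # stochastic trend (UEDT)
              + rng.normal(0, 0.01, n))                 # irregular component

# Local linear trend captures the stochastic underlying energy demand trend;
# the exogenous coefficients play the role of income and price elasticities.
exog = np.column_stack([log_income, log_price])
model = UnobservedComponents(log_demand, level="local linear trend", exog=exog)
result = model.fit(disp=False)

for name, value in zip(model.param_names, result.params):
    print(f"{name}: {value:.4f}")
```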

  9. Forecasting of exported volume for brazilian fruits by time series analysis: an arima/garch approach

    Directory of Open Access Journals (Sweden)

    Abdinardo Moreira Barreto de Oliveira

    2015-06-01

    Full Text Available The aim of this paper was to offer econometric forecasting models for Brazilian exported fruit volumes, with a view to assisting planning and production control, motivated also by the small number of published papers dealing with this issue. ARIMA/GARCH models were used, allowing for the occurrence of multiplicative stochastic seasonality in these series. A total of 300 observations of exported net weight (kg) between Jan/1989 and Dec/2013 were collected for the following fruits: pineapple, banana, orange, lemon, apple, papaya, mango, watermelon, melon and grape. The selection criterion was their importance in the exported fruit basket: they represented 97% of total dollars received and 99% of total volume sold in 2010, out of a population of about 28 kinds of exported fruits. The results showed a 12-month multiplicative seasonality in the banana and mango series. In addition, two groups of fruits were identified: (1) those which are continuously exported, and (2) those which have export peaks. Regarding model quality, the models were considered satisfactory for six of the ten fruits analyzed. Regarding volatility, high persistence was seen in the banana and papaya series, pointing to the existence of a structural break in the time series, which could be linked to the economic crises of the last 17 years.
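
    A compact sketch of an ARIMA/GARCH workflow of the kind described above, assuming a monthly exported-volume series: a seasonal ARIMA model for the conditional mean and a GARCH(1,1) model fitted to its residuals with the third-party `arch` package. All data and model orders are illustrative assumptions, not the paper's specifications.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(10)
idx = pd.date_range("1989-01-01", periods=300, freq="MS")

# Synthetic monthly exported volume (log scale) with trend and yearly seasonality.
t = np.arange(300)
log_volume = pd.Series(
    10 + 0.005 * t + 0.3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.1, 300),
    index=idx,
)

# Seasonal ARIMA for the conditional mean.
mean_model = ARIMA(log_volume, order=(1, 1, 1),
                   seasonal_order=(0, 1, 1, 12)).fit()

# GARCH(1,1) on the ARIMA residuals for the conditional variance (volatility).
resid = mean_model.resid.dropna() * 100  # rescaled to help the optimizer
garch = arch_model(resid, vol="GARCH", p=1, q=1, mean="Zero").fit(disp="off")

print(mean_model.forecast(12).round(3))   # 12-month volume forecast (log scale)
print(garch.params.round(4))              # GARCH persistence parameters
```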

  10. LAND COVER DYNAMICS OF OLESHKY SANDS: TIME-SERIES ANALYSIS 1987-2017

    Directory of Open Access Journals (Sweden)

    V. Bogdanets

    2017-11-01

    Full Text Available Oleshky Sands is the largest expanse of sand in Ukraine and the second largest in Europe. At the beginning of the 20th century, sand movement beyond the arenas was almost stopped by planting trees (Pinus nigra ssp. pallasiana and Pinus sylvestris L.), and the territory has had different uses over the years. A 30-year (1987-2017) time series of Landsat imagery obtained via the USGS geoservice was used to reveal the land cover dynamics of the desert landscapes of Oleshky Sands using QGIS software. Heavy sand storms can impact nearby settlements and have harmful effects on local industry and the quality of life of local communities. Forest fire has been another dangerous factor for the protective forest plantations in recent years. Our estimation shows that sandy areas increased during 2000-2017; generally, conservation measures had a constant effect despite the afforestation of recent years. The preventive effect of forest on sand movement at Oleshky Sands can be characterized as stable, provided there is constant care of the forest plantations and proper documentation of land use and ownership.

  11. R-evolution in Time Series Analysis Software Applied on R-omanian Capital Market

    Directory of Open Access Journals (Sweden)

    Ciprian ALEXANDRU

    2014-06-01

    Full Text Available Worldwide and during the last decade, R has developed in a balanced way and nowadays it represents the most powerful tool for computational statistics, data science and visualization. Millions of data scientists use R to face their most challenging problems in topics ranging from economics to engineering and genetics. In this study, R was used to process data on stock market prices in order to build trading models and to estimate the evolution of the quantitative financial market. These models have already been applied on international capital markets. In Romania, quantitative modeling of the capital market is available only to clients of trading brokers because the time series data are collected for commercial purposes; in these circumstances, statistical computing tools meet inertia to change. This paper aims to demonstrate a small part of the capability of R to use mix-and-match models and cutting-edge methods in statistics and quantitative modeling in order to build an alternative way to analyze the capital market in Romania beyond the commercial threshold.

  12. Economic and Sociological Correlates of Suicides: Multilevel Analysis of the Time Series Data in the United Kingdom.

    Science.gov (United States)

    Sun, Bruce Qiang; Zhang, Jie

    2016-03-01

    For the effects of social integration on suicide, there have been different and even contradictory conclusions. In this study, selected economic and social risk factors for suicide in different age groups and genders in the United Kingdom were identified and their effects estimated by multilevel time series analyses. To our knowledge, no previous studies have estimated a dynamic model of suicide on time series data combining multilevel analysis with autoregressive distributed lags. The investigation indicated that the unemployment rate, inflation rate, and divorce rate are all significantly and positively related to national suicide rates in the United Kingdom from 1981 to 2011. Furthermore, the suicide rates of almost all groups above 40 years of age are significantly associated with the risk factors of unemployment and inflation, in comparison with the younger groups. © 2016 American Academy of Forensic Sciences.

  13. Time-Series Analysis of Remotely-Sensed SeaWiFS Chlorophyll in River-Influenced Coastal Regions

    Science.gov (United States)

    Acker, James G.; McMahon, Erin; Shen, Suhung; Hearty, Thomas; Casey, Nancy

    2009-01-01

    The availability of a nearly-continuous record of remotely-sensed chlorophyll a data (chl a) from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) mission, now longer than ten years, enables examination of time-series trends for multiple global locations. Innovative data analysis technology available on the World Wide Web facilitates such analyses. In coastal regions influenced by river outflows, chl a is not always indicative of actual trends in phytoplankton chlorophyll due to the interference of colored dissolved organic matter and suspended sediments; significant chl a time-series trends for coastal regions influenced by river outflows may nonetheless be indicative of important alterations of the hydrologic and coastal environment. Chl a time-series analysis of nine marine regions influenced by river outflows demonstrates the simplicity and usefulness of this technique. The analyses indicate that coastal time series are significantly influenced by unusual flood events. Major river systems in regions with relatively low human impact did not exhibit significant trends. Most river systems with demonstrated human impact exhibited significant negative trends, with the noteworthy exception of the Pearl River in China, which has a positive trend.

  14. Remote sensing time series analysis for crop monitoring with the SPIRITS software: new functionalities and use examples

    Directory of Open Access Journals (Sweden)

    Felix eRembold

    2015-07-01

    Full Text Available Monitoring crop and natural vegetation conditions is highly relevant, particularly in the food-insecure areas of the world. Data from remote sensing image time series at high temporal and medium to low spatial resolution can assist this monitoring as they provide key information about vegetation status in near real-time over large areas. The Software for the Processing and Interpretation of Remotely sensed Image Time Series (SPIRITS) is a stand-alone, flexible analysis environment created to facilitate the processing and analysis of large image time series and, ultimately, to provide clear information about vegetation status in various graphical formats to crop production analysts and decision makers. In this paper we present the latest functional developments of SPIRITS and illustrate recent applications. The main new developments include: an HDF5 importer, image re-projection, additional options for temporal smoothing and periodicity conversion, computation of a rainfall-based probability index (the Standardized Precipitation Index) for drought detection, and extension of the Graph Composer functionalities. The examples of operational analyses are taken from several recent agriculture and food security monitoring reports and bulletins. We conclude with considerations on future SPIRITS developments, also in view of the data processing requirements imposed by the coming generation of remote sensing products at high spatial and temporal resolution, such as those provided by the Sentinel sensors of the European Copernicus programme.

  15. Assessing error sources for Landsat time series analysis for tropical test sites in Viet Nam and Ethiopia

    Science.gov (United States)

    Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio

    2013-10-01

    Researchers who use remotely sensed data can spend half of their total effort on preparing the data prior to analysis. If this data preprocessing does not match the application, the time spent on data analysis can increase considerably and inaccuracies can result. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program, accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of the Breaks For Additive Season and Trend (BFAST) monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the observed forest change drivers was assessed using recently captured ground truth and high-resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources was performed, identifying haze as a major source of commission error in the time series analysis.

  16. A Time Series Analysis of Global Soil Moisture Data Products for Water Cycle Studies

    Science.gov (United States)

    Zhan, X.; Yin, J.; Liu, J.; Fang, L.; Hain, C.; Ferraro, R. R.; Weng, F.

    2017-12-01

    Water is essential for sustaining life on our planet Earth, and the water cycle is one of the most important processes of our weather and climate system. As one of the major components of the water cycle, soil moisture significantly impacts the other water cycle components (e.g. evapotranspiration, runoff, etc.) and the carbon cycle (e.g. plant/crop photosynthesis and respiration). Understanding of soil moisture status and dynamics is crucial for monitoring and predicting weather, climate, hydrology and ecological processes. Satellite remote sensing has been used for soil moisture observation since the launch of the Scanning Multi-channel Microwave Radiometer (SMMR) on NASA's Nimbus-7 satellite in 1978. Many satellite soil moisture data products have been made available to the science community and the general public. The soil moisture operational product system (SMOPS) of NOAA NESDIS has been operationally providing global soil moisture data products from each of the currently available microwave satellite sensors and their blends. This presentation will provide an update on SMOPS products. The time series of each of these soil moisture data products are analyzed against other data products, such as precipitation and evapotranspiration, from independent data sources such as the North America Land Data Assimilation System (NLDAS). Temporal characteristics of these water cycle components are explored against some historical events, such as the 2010 Russia, 2010 China and 2012 United States droughts and the 2015 South Carolina floods. Finally, whether a merged global soil moisture data product can be used as a climate data record is evaluated based on the above analyses.

  17. Alcopops, taxation and harm: a segmented time series analysis of emergency department presentations.

    Science.gov (United States)

    Gale, Marianne; Muscatello, David J; Dinh, Michael; Byrnes, Joshua; Shakeshaft, Anthony; Hayen, Andrew; MacIntyre, Chandini Raina; Haber, Paul; Cretikos, Michelle; Morton, Patricia

    2015-05-06

    In Australia, a Goods and Services Tax (GST) introduced in 2000 led to a decline in the price of ready-to-drink (RTD) beverages relative to other alcohol products. The 2008 RTD ("alcopops") tax increased RTD prices. The objective of this study was to estimate the change in incidence of Emergency Department (ED) presentations for acute alcohol problems associated with each tax. Segmented regression analyses were performed on age- and sex-specific time series of monthly presentation rates for acute alcohol problems to 39 hospital emergency departments across New South Wales, Australia over 15 years, 1997 to 2011. Indicator variables represented the introduction of each tax. Retail liquor turnover controlled for large-scale economic factors, such as the global financial crisis, that may have influenced demand. Under-age (15-17 years) and legal-age (18 years and over) drinkers were included. The GST was associated with a statistically significant increase in ED presentations for acute alcohol problems among 18-24 year old females (0.14/100,000/month, 95% CI 0.05-0.22). The subsequent alcopops tax was associated with a statistically significant decrease in males 15-50 years and females 15-65 years, particularly in 18-24 year old females (-0.37/100,000/month, 95% CI -0.45 to -0.29). An increase in retail turnover of liquor was positively and statistically significantly associated with ED presentations for acute alcohol problems across all age and sex strata. Reduced tax on RTDs was associated with increasing ED presentations for acute alcohol problems among young women. The alcopops tax was associated with declining presentations in young to middle-aged persons of both sexes, including under-age drinkers.

  18. Time Series Analysis for Forecasting Hospital Census: Application to the Neonatal Intensive Care Unit.

    Science.gov (United States)

    Capan, Muge; Hoover, Stephen; Jackson, Eric V; Paul, David; Locke, Robert

    2016-01-01

    Accurate prediction of future patient census in hospital units is essential for patient safety, health outcomes, and resource planning. Forecasting census in the Neonatal Intensive Care Unit (NICU) is particularly challenging due to limited ability to control the census and clinical trajectories. The fixed average census approach, using the average census from the previous year, is a forecasting alternative used in clinical practice, but has limitations due to census variations. Our objectives are to: (i) analyze the daily NICU census at a single health care facility and develop census forecasting models, (ii) explore models with and without patient data characteristics obtained at the time of admission, and (iii) evaluate accuracy of the models compared with the fixed average census approach. We used five years of retrospective daily NICU census data for model development (January 2008 - December 2012, N=1827 observations) and one year of data for validation (January - December 2013, N=365 observations). Best-fitting ARIMA and linear regression models were applied to various 7-day prediction periods and compared using error statistics. The census showed a slightly increasing linear trend. Best-fitting models included a non-seasonal model, ARIMA(1,0,0), seasonal ARIMA models, ARIMA(1,0,0)×(1,1,2)[7] and ARIMA(2,1,4)×(1,1,2)[14], as well as a seasonal linear regression model. The proposed forecasting models resulted on average in a 36.49% improvement in forecasting accuracy compared with the fixed average census approach. Time series models provide higher prediction accuracy under different census conditions compared with the fixed average census approach. The presented methodology is easily applicable in clinical practice, can be generalized to other care settings, supports short- and long-term census forecasting, and informs staff resource planning.

  19. Time-Series Similarity Analysis of Satellite Derived Data to Understand Changes in Forest Biomass.

    Science.gov (United States)

    Singh, N.; Fritz, B.

    2017-12-01

    One of the goals of promoting bioenergy is reducing green-house gas emissions by replacing fossil fuels. However, there are concerns that carbon emissions due to changes in land use resulting from crop production for ethanol will negate the impact of biofuels on the environment. So, the current focus is to use lignocellulose feedstocks, also referred to as second-generation biofuels, as a new source of bioenergy. Wood-based pellets derived from the forests of the southeastern United States are one such source, which is being exported to Europe as a carbon-neutral fuel. These wood pellets meet the EU standard for carbon emissions and are being used to replace coal for energy generation and heating. As a result, US exports of wood-based pellets have increased from nearly zero to over 6 million metric tons over the past 8 years. Wood-based pellets are traditionally produced from softwood trees, which have a relatively shorter life-cycle and propagate easily, and thus are expected to provide a sustainable source of wood chips used for pellet production. However, there are concerns that as the demand and price of wood pellets increase, lumber mills will seek wood chips from other sources as well, particularly from hardwood trees, resulting in higher carbon emissions as well as loss of biodiversity. In this study we use annual stacks of normalized difference vegetation index (NDVI) data at a 16-day temporal resolution to monitor biomass around pellet mills in the southeastern United States. We use a combination of a time series similarity technique and supervised learning to understand whether there have been significant changes in biomass around pellet mills in the southeastern US. We also demonstrate how our method can be used to monitor biomass over large geographic regions using phenological properties of growing vegetation.

  20. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period, 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints showed a significant increase between the pre-CQIP and post-CQIP periods, indicating improved pre-hospital trauma care in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured in a resource-limited setting. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. A hybrid wavelet de-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series.

    Science.gov (United States)

    Wang, Dong; Borthwick, Alistair G; He, Handan; Wang, Yuankun; Zhu, Jieyu; Lu, Yuan; Xu, Pengcheng; Zeng, Xiankui; Wu, Jichun; Wang, Lachun; Zou, Xinqing; Liu, Jiufu; Zou, Ying; He, Ruimin

    2018-01-01

    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, wavelet de-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influence at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. Compared to the three other generic methods, the results generated by the WD-RSPA model invariably showed smaller error measures, which means the forecasting capability of the WD-RSPA model is better than that of the other models. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when extreme events are included within a time series. Copyright © 2017 Elsevier Inc. All rights reserved.
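
    A minimal sketch of the wavelet de-noising (WD) step, assuming PyWavelets is available; the wavelet, decomposition level, and soft-threshold rule are illustrative choices, not necessarily those used by the authors:

        # Wavelet de-noising sketch: decompose, threshold detail coefficients, reconstruct.
        import numpy as np
        import pywt

        def wavelet_denoise(x, wavelet="db4", level=3):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            # Universal threshold estimated from the finest-scale detail coefficients.
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745
            thresh = sigma * np.sqrt(2.0 * np.log(len(x)))
            denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                      for c in coeffs[1:]]
            return pywt.waverec(denoised, wavelet)[: len(x)]

        noisy = np.sin(np.linspace(0, 20, 512)) + 0.3 * np.random.randn(512)
        clean = wavelet_denoise(noisy)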

  2. Separation of spatial-temporal patterns ('climatic modes') by combined analysis of actually measured and numerically generated vector time series

    Science.gov (United States)

    Feigin, A. M.; Mukhin, D.; Volodin, E. M.; Gavrilov, A.; Loskutov, E. M.

    2013-12-01

    The new method of decomposition of the Earth's climate system into well separated spatial-temporal patterns ('climatic modes') is discussed. The method is based on: (i) a generalization of MSSA (Multichannel Singular Spectral Analysis) [1] for expanding vector (space-distributed) time series in a basis of spatial-temporal empirical orthogonal functions (STEOF), which makes allowance for delayed correlations between processes recorded at spatially separated points; (ii) expanding both the real SST data and numerically generated SST data several times longer than the real record in the STEOF basis; (iii) use of the numerically produced STEOF basis for the exclusion of 'too slow' (and thus not represented correctly) processes from the real data. Using vector time series generated numerically by the INM RAS Coupled Climate Model [2], the method separates from real SST anomaly data [3] two climatic modes possessing noticeably different time scales: 3-5 and 9-11 years. Relations of the separated modes to ENSO and PDO are investigated. Possible applications of the spatial-temporal climatic pattern concept to prognosis of climate system evolution are discussed. 1. Ghil, M., R. M. Allen, M. D. Dettinger, K. Ide, D. Kondrashov, et al. (2002) "Advanced spectral methods for climatic time series", Rev. Geophys. 40(1), 3.1-3.41. 2. http://83.149.207.89/GCM_DATA_PLOTTING/GCM_INM_DATA_XY_en.htm 3. http://iridl.ldeo.columbia.edu/SOURCES/.KAPLAN/.EXTENDED/.v2/.ssta/

  3. Regional Land Subsidence Analysis in Eastern Beijing Plain by InSAR Time Series and Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mingliang Gao

    2018-02-01

    Full Text Available Land subsidence is a geological hazard in which the regional surface elevation is lowered by natural or man-made factors. Beijing, the capital city of China, has suffered from land subsidence since the 1950s, and extreme groundwater extraction has led to subsidence rates of more than 100 mm/year. In this study, we employ two SAR datasets acquired by the Envisat and TerraSAR-X satellites to investigate the surface deformation in the Beijing Plain from 2003 to 2013 based on the multi-temporal InSAR technique. Furthermore, we also use observation wells providing in situ hydraulic head levels to characterize the evolution of land subsidence and the spatial-temporal changes of the groundwater level. Then, we analyze the accumulated displacement and hydraulic head level time series using the continuous wavelet transform to separate periodic signal components. Finally, cross wavelet transform (XWT) and wavelet transform coherence (WTC) are implemented to analyze the relationship between the accumulated displacement and hydraulic head level time series. The results show that the subsidence centers in the northern Beijing Plain are spatially consistent with the groundwater drop funnels. According to the analysis of results from wells located in different areas, the long-term groundwater exploitation in the northern subsidence area has led to a continuous decline of the water level, resulting in inelastic and permanent compaction, while for the monitoring wells located outside the subsidence area, the subsidence time series show obvious elastic deformation characteristics (seasonal characteristics) as the groundwater level changes. Moreover, according to the wavelet transformation, the land subsidence time series at the monitoring well sites lag several months behind the groundwater level change.

  4. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  5. Association between air pollution and suicide: a time series analysis in four Colombian cities.

    Science.gov (United States)

    Fernández-Niño, Julián Alfredo; Astudillo-García, Claudia Iveth; Rodríguez-Villamizar, Laura Andrea; Florez-Garcia, Víctor Alfonso

    2018-05-12

    Recent epidemiological studies have suggested that air pollution could be associated with suicide. However, other studies have criticized these results for being analytically weak and not taking into account potential confounding factors. As such, further studies examining the relationship under diverse contexts are necessary to help clarify this issue. This study explored the association between specific air pollutants (NO2, SO2, PM10, PM2.5, CO and O3) and suicide incidence in four Colombian cities after adjusting for climatic variables and holidays. A time series of daily suicides among men and women living in Bogota, Medellin, Cali and Bucaramanga was generated using information from the National Administrative Department of Statistics (DANE) for the years 2011-2014. At the same time, the average daily concentration of each air pollutant for each city was obtained from monitoring stations belonging to the National Air Quality Surveillance System. Using this information together, we generated conditional Poisson models (stratified by day, month and year) for the suicide rate in men and women, with air pollutants as the principal explanatory variable. These models were adjusted for temperature, relative humidity, precipitation and holidays. No association was found between any of the examined pollutants and suicide: NO2 (IRR: 0.99, 95% CI: 0.95-1.04), SO2 (IRR: 0.99, 95% CI: 0.98-1.01), PM10 (IRR: 0.99, 95% CI: 0.95-1.03), PM2.5 (IRR: 1.01, 95% CI: 0.98-1.05), CO (IRR: 1.00, 95% CI: 1.00-1.00) and O3 (IRR: 1.00, 95% CI: 0.96-1.04). Similarly, no association was found in models stratified by sex and age group, nor in lagged and cumulative-effects models. After adjusting for major confounding factors, we found no statistically significant association between air pollution and suicide in Colombia. These "negative" results provide further insight into the current discussion regarding the existence of such a relationship.
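
    A minimal sketch of a time-stratified Poisson model of the kind described, with strata defined by year, month and day of week; for a Poisson likelihood, including the strata as fixed effects yields effectively the same pollutant estimate as the conditional model. The data, variable names and the single pollutant used here are illustrative:

        # Time-stratified Poisson regression sketch for daily counts vs one pollutant.
        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        days = pd.date_range("2011-01-01", "2014-12-31", freq="D")
        df = pd.DataFrame({
            "suicides": np.random.poisson(1.0, len(days)),       # toy daily counts
            "pm25": np.random.gamma(3.0, 8.0, len(days)),        # toy PM2.5 concentrations
            "temp": np.random.normal(18, 4, len(days)),
            "humidity": np.random.uniform(40, 90, len(days)),
        }, index=days)
        # Stratum = year x month x day-of-week, so the pollutant effect is estimated
        # from within-stratum contrasts only.
        df["stratum"] = (df.index.year.astype(str) + "-" +
                         df.index.month.astype(str) + "-" +
                         df.index.dayofweek.astype(str))

        fit = smf.glm("suicides ~ pm25 + temp + humidity + C(stratum)",
                      data=df, family=sm.families.Poisson()).fit()
        irr = np.exp(fit.params["pm25"])      # incidence rate ratio per unit PM2.5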

  6. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    Science.gov (United States)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible using simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i).
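
    A minimal sketch of step (i) only, building a complex-valued data set from a real multichannel series via the Hilbert transform in scipy; the subsequent fourth-order-cumulant complex ICA step (ii) is not shown, and this is not the authors' implementation:

        # Construct the complex data set used by complex ICA: real part = observations,
        # imaginary part = the Hilbert (quadrature) component carrying rate-of-change information.
        import numpy as np
        from scipy.signal import hilbert

        n_channels, n_samples = 5, 1000
        x = np.cumsum(np.random.randn(n_channels, n_samples), axis=1)   # toy channels
        x = x - x.mean(axis=1, keepdims=True)                           # center each channel

        z = hilbert(x, axis=1)       # complex analytic signal, shape (n_channels, n_samples)
        # z.real holds the observed values; z.imag holds the quadrature part.
        # A complex ICA (e.g. one based on fourth-order cumulants) would then be applied to z.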

  7. Multiple Indicator Stationary Time Series Models.

    Science.gov (United States)

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  8. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to the analysis of medical time series data: (1) a classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis also discusses several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.

  9. Forecasting of particulate matter time series using wavelet analysis and wavelet-ARMA/ARIMA model in Taiyuan, China.

    Science.gov (United States)

    Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng

    2017-07-01

    Particulate matter with aerodynamic diameter below 10 μm (PM10) forecasting is difficult because of the uncertainties in describing the emission and meteorological fields. This paper proposed a wavelet-ARMA/ARIMA model to forecast the short-term series of the PM10 concentrations. It was evaluated by experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations of Taiyuan had a decreasing trend during 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that the PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburban areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration of Taiyuan were complicated. (4) The proposed wavelet-ARMA/ARIMA model could be efficiently and successfully applied to the PM10 forecasting field. Compared with the traditional ARMA/ARIMA methods, this wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. Wavelet analysis can filter noisy signals and identify the variation trend and the fluctuation of the PM10 time-series data. Wavelet decomposition and reconstruction reduce the nonstationarity of the PM10 time-series data, and thus improve the accuracy of the prediction.
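
    A minimal sketch of the hybrid idea, decomposing a daily PM10 series into full-length wavelet components and fitting a separate ARIMA model to each component before summing the forecasts; the wavelet, level, ARIMA orders and data are illustrative, not the authors' settings:

        # Wavelet-ARIMA hybrid sketch: forecast each wavelet component separately and sum.
        import numpy as np
        import pywt
        from statsmodels.tsa.arima.model import ARIMA

        pm10 = 80 + 30 * np.sin(np.arange(730) * 2 * np.pi / 365) + np.random.randn(730) * 10

        def full_length_components(x, wavelet="db4", level=3):
            """Reconstruct one full-length series per wavelet coefficient level."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            comps = []
            for i in range(len(coeffs)):
                kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                comps.append(pywt.waverec(kept, wavelet)[: len(x)])
            return comps      # the components sum back (approximately) to x

        horizon = 7
        forecast = np.zeros(horizon)
        for comp in full_length_components(pm10):
            fit = ARIMA(comp, order=(1, 1, 1)).fit()
            forecast += fit.forecast(steps=horizon)
        print(forecast.round(1))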

  10. Comparison of the Performance of Two Advanced Spectral Methods for the Analysis of Time Series in Paleoceanography

    Directory of Open Access Journals (Sweden)

    Eulogio Pardo-Igúzquiza

    2015-08-01

    Full Text Available Many studies have revealed the cyclicity of past ocean/atmosphere dynamics at a wide range of time scales (from decadal to millennial time scales), based on the spectral analysis of time series of climate proxies obtained from deep sea sediment cores. Among the many techniques available for spectral analysis, the maximum entropy method and the Thomson multitaper approach have frequently been used because of their good statistical properties and high resolution with short time series. The novelty of the present study is that we compared the two methods according to the performance of the statistical tests used to assess the statistical significance of their power spectrum estimates. The statistical significance of maximum entropy estimates was assessed by a random permutation test (Pardo-Igúzquiza and Rodríguez-Tovar, 2000), while the statistical significance of the Thomson multitaper method was assessed by an F-test (Thomson, 1982). We compared the results obtained in a case study using simulated data, where the spectral content of the time series was known, and in a case study with real data. In both cases the results are similar: while the cycles identified as significant by maximum entropy and the permutation test have a clear physical interpretation, the F-test with the Thomson multitaper estimator tends to treat the low-frequency peaks as not significant and tends to report more spurious peaks as significant in the middle and high frequencies. Nevertheless, the best strategy is to use both techniques and to exploit the advantages of each of them.
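
    A minimal sketch of a Thomson-style multitaper spectral estimate built from DPSS (Slepian) tapers in scipy, compared with a single-taper periodogram; the time-bandwidth product and number of tapers are illustrative, the scaling is approximate, and neither the F-test nor the permutation test is included:

        # Multitaper spectrum sketch: average periodograms obtained with DPSS tapers.
        import numpy as np
        from scipy.signal import periodogram
        from scipy.signal.windows import dpss

        fs = 1.0
        t = np.arange(2048)
        x = np.sin(2 * np.pi * 0.05 * t) + np.random.randn(t.size)

        nw, k = 4.0, 7                         # time-bandwidth product and number of tapers
        tapers = dpss(x.size, NW=nw, Kmax=k)   # shape (k, len(x))
        spectra = []
        for taper in tapers:
            freqs, pxx = periodogram(x * taper, fs=fs, window="boxcar",
                                     detrend=False, scaling="density")
            spectra.append(pxx)
        multitaper_psd = np.mean(spectra, axis=0)

        # Single-taper periodogram for comparison.
        freqs, single_psd = periodogram(x, fs=fs)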

  11. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data. The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  12. MODVOLC2: A Hybrid Time Series Analysis for Detecting Thermal Anomalies Applied to Thermal Infrared Satellite Data

    Science.gov (United States)

    Koeppen, W. C.; Wright, R.; Pilger, E.

    2009-12-01

    We developed and tested a new, automated algorithm, MODVOLC2, which analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes, fires, and gas flares. MODVOLC2 combines two previously developed algorithms, a simple point operation algorithm (MODVOLC) and a more complex time series analysis (Robust AVHRR Techniques, or RAT) to overcome the limitations of using each approach alone. MODVOLC2 has four main steps: (1) it uses the original MODVOLC algorithm to process the satellite data on a pixel-by-pixel basis and remove thermal outliers, (2) it uses the remaining data to calculate reference and variability images for each calendar month, (3) it compares the original satellite data and any newly acquired data to the reference images normalized by their variability, and it detects pixels that fall outside the envelope of normal thermal behavior, (4) it adds any pixels detected by MODVOLC to those detected in the time series analysis. Using test sites at Anatahan and Kilauea volcanoes, we show that MODVOLC2 was able to detect ~15% more thermal anomalies than using MODVOLC alone, with very few, if any, known false detections. Using gas flares from the Cantarell oil field in the Gulf of Mexico, we show that MODVOLC2 provided results that were unattainable using a time series-only approach. Some thermal anomalies (e.g., Cantarell oil field flares) are so persistent that an additional, semi-automated 12-µm correction must be applied in order to correctly estimate both the number of anomalies and the total excess radiance being emitted by them. Although all available data should be included to make the best possible reference and variability images necessary for the MODVOLC2, we estimate that at least 80 images per calendar month are required to generate relatively good statistics from which to run MODVOLC2, a condition now globally met by a decade of MODIS observations. We also found
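
    A minimal sketch of the time-series part of the approach (steps 2-3): build per-calendar-month reference and variability images from an archive of scenes, then flag pixels in a new scene that fall outside the normal thermal envelope. The array shapes, the 3-sigma threshold and the simulated data are illustrative, not the MODVOLC2 implementation:

        # Monthly reference/variability anomaly detection sketch for a stack of images.
        import numpy as np

        n_times, ny, nx = 120, 50, 50                            # ten years of monthly images
        months = np.tile(np.arange(1, 13), n_times // 12)
        archive = np.random.normal(290, 2, (n_times, ny, nx))    # toy brightness temperatures

        reference = np.zeros((12, ny, nx))
        variability = np.zeros((12, ny, nx))
        for m in range(1, 13):
            sel = archive[months == m]
            reference[m - 1] = sel.mean(axis=0)                  # per-month reference image
            variability[m - 1] = sel.std(axis=0)                 # per-month variability image

        def detect_anomalies(new_image, month, n_sigma=3.0):
            """Return a boolean mask of pixels outside the normal thermal envelope."""
            z = (new_image - reference[month - 1]) / variability[month - 1]
            return z > n_sigma

        new_scene = np.random.normal(290, 2, (ny, nx))
        new_scene[25, 25] = 320.0                                # inject a hot pixel
        mask = detect_anomalies(new_scene, month=7)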

  13. SWToolbox: A surface-water tool-box for statistical analysis of streamflow time series

    Science.gov (United States)

    Kiang, Julie E.; Flynn, Kate; Zhai, Tong; Hummel, Paul; Granato, Gregory

    2018-03-07

    This report is a user guide for the low-flow analysis methods provided with version 1.0 of the Surface Water Toolbox (SWToolbox) computer program. The software combines functionality from two software programs—U.S. Geological Survey (USGS) SWSTAT and U.S. Environmental Protection Agency (EPA) DFLOW. Both of these programs have been used primarily for computation of critical low-flow statistics. The main analysis methods are the computation of hydrologic frequency statistics such as the 7-day minimum flow that occurs on average only once every 10 years (7Q10), computation of design flows including biologically based flows, and computation of flow-duration curves and duration hydrographs. Other annual, monthly, and seasonal statistics can also be computed. The interface facilitates retrieval of streamflow discharge data from the USGS National Water Information System and outputs text reports for a record of the analysis. Tools for graphing data and screening tests are available to assist the analyst in conducting the analysis.
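
    A minimal sketch of the 7Q10 computation that such tools automate: take the annual minima of the 7-day moving-average flow and estimate the flow with a 10-year recurrence interval, here by fitting a log-Pearson Type III distribution with scipy; the synthetic data and the choice of distribution are illustrative, not the SWToolbox procedure:

        # 7Q10 sketch: annual minimum 7-day average flow with a 10-year recurrence.
        import numpy as np
        import pandas as pd
        from scipy import stats

        days = pd.date_range("1980-01-01", "2019-12-31", freq="D")
        flow = pd.Series(np.random.lognormal(mean=3.0, sigma=0.5, size=len(days)), index=days)

        seven_day = flow.rolling(window=7).mean()
        annual_min = seven_day.groupby(seven_day.index.year).min().dropna()

        # Fit log-Pearson III to the logs of the annual minima and take the
        # 10-percent non-exceedance quantile (i.e. the 1-in-10-year low flow).
        params = stats.pearson3.fit(np.log(annual_min))
        q7_10 = np.exp(stats.pearson3.ppf(0.10, *params))
        print(round(q7_10, 2))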

  14. Application of principal component analysis to time series of daily air pollution and mortality

    NARCIS (Netherlands)

    Quant C; Fischer P; Buringh E; Ameling C; Houthuijs D; Cassee F; MGO

    2004-01-01

    We investigated whether cause-specific daily mortality can be attributed to specific sources of air pollution. To construct indicators of source-specific air pollution, we applied a principal component analysis (PCA) on routinely collected air pollution data in the Netherlands during the period

  15. Statistics for Time-Series Spatial Data: Applying Survival Analysis to Study Land-Use Change

    Science.gov (United States)

    Wang, Ninghua Nathan

    2013-01-01

    Traditional spatial analysis and data mining methods fall short of extracting temporal information from data. This inability makes their use difficult to study changes and the associated mechanisms of many geographic phenomena of interest, for example, land-use. On the other hand, the growing availability of land-change data over multiple time…

  16. Model-based time-series analysis of FIA panel data absent re-measurements

    Science.gov (United States)

    Raymond L. Czaplewski; Mike T. Thompson

    2013-01-01

    An epidemic of lodgepole pine (Pinus contorta) mortality from the mountain pine beetle (Dendroctonus ponderosae) has swept across the Interior West. Aerial surveys monitor the areal extent of the epidemic, but only Forest Inventory and Analysis (FIA) field data support a detailed assessment at the tree level. Dynamics of the lodgepole pine population occur at a more...

  17. time series analysis of monthly rainfall in nigeria with emphasis on ...

    African Journals Online (AJOL)

    User

    Monthly rainfall data of twenty-one years (1980 – 2000) were analyzed for the six regions of Nigeria using the rescaled range (R/S) statistic, the standard fluctuation analysis (FA) and the detrended fluctuation analysis (DFA).
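
    A minimal numpy sketch of detrended fluctuation analysis (DFA), one of the scaling methods named above; the window sizes and the simulated monthly series are illustrative:

        # Detrended fluctuation analysis (DFA) sketch: slope of log F(n) vs log n.
        import numpy as np

        def dfa(x, window_sizes):
            y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
            fluctuations = []
            for n in window_sizes:
                n_windows = len(y) // n
                rms = []
                for i in range(n_windows):
                    seg = y[i * n:(i + 1) * n]
                    t = np.arange(n)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
                    rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
                fluctuations.append(np.mean(rms))
            # Scaling exponent alpha: slope of the log-log fluctuation curve.
            alpha = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)[0]
            return alpha

        rainfall = np.random.randn(252)                    # 21 years x 12 months, toy data
        sizes = np.array([4, 8, 16, 32, 64])
        print(dfa(rainfall, sizes))                        # ~0.5 for uncorrelated noise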

  18. Vector Nonlinear Time-Series Analysis of Gamma-Ray Burst Datasets on Heterogeneous Clusters

    Directory of Open Access Journals (Sweden)

    Ioana Banicescu

    2005-01-01

    Full Text Available The simultaneous analysis of a number of related datasets using a single statistical model is an important problem in statistical computing. A parameterized statistical model is to be fitted on multiple datasets and tested for goodness of fit within a fixed analytical framework. Definitive conclusions are hopefully achieved by analyzing the datasets together. This paper proposes a strategy for the efficient execution of this type of analysis on heterogeneous clusters. Based on partitioning processors into groups for efficient communications and a dynamic loop scheduling approach for load balancing, the strategy addresses the variability of the computational loads of the datasets, as well as the unpredictable irregularities of the cluster environment. Results from preliminary tests of using this strategy to fit gamma-ray burst time profiles with vector functional coefficient autoregressive models on 64 processors of a general purpose Linux cluster demonstrate the effectiveness of the strategy.

  19. Analysis and treatment of the Søndersø time series

    DEFF Research Database (Denmark)

    Dorini, Gianluca F.; Thordarson, Fannar Ørn; Madsen, Henrik

    This report deals with grey box modelling applied to the Well Field Optimisation project. The subject is the real case study of Søndersø, located north-west of Copenhagen, DK. This report contains a comprehensive description of how the dataset of measurements taken at Søndersø has been treated and analysed. The purpose of such analysis is twofold. The first is to identify a suitable architecture for the grey-box model. The second is to design a procedure to select values from the dataset that will be used for the calibration of the parameters of the grey-box model. Section 1 describes the Søndersø well field and provides an overview of the dataset. Section 2 describes the numeric treatments that have been applied to the dataset; the result is summarized in Section 3. Section 4 illustrates the analysis performed on the treated dataset. In this section, the fundamental mechanisms of the well field...

  20. Wavelet analysis as a tool to characterise and remove environmental noise from self-potential time series

    OpenAIRE

    Chianese, D.; Colangelo, G.; D'Emilio, M.; Lanfredi, M.; Lapenna, V.; Ragosta, M.; Macchiato, M. F.

    2004-01-01

    Multiresolution wavelet analysis of self-potential signals and rainfall levels is performed for extracting fluctuations in electrical signals which might be attributed to meteorological variability. In the time-scale domain of the wavelet transform, rain data are used as markers to single out those wavelet coefficients of the electric signal which can be considered relevant to the environmental disturbance. Then these coefficients are filtered out and the signal is recovered by anti-transforming the retained coefficients.

  1. Assessment of land degradation using time series trend analysis of vegetation indicators in Otindag Sandy land

    International Nuclear Information System (INIS)

    Wang, H Y; Li, Z Y; Gao, Z H; Wu, J J; Sun, B; Li, C L

    2014-01-01

    Land condition assessment is a basic prerequisite for detecting the degradation of a territory, which might lead to desertification under climatic and human pressures. The temporal change in vegetation productivity is a key indicator of land degradation. In this paper, taking the Otindag Sandy Land as a case study, the dynamic trends of the mean normalized difference vegetation index (NDVIa), net primary production (NPP) and vegetation rain use efficiency (RUE) during 2001-2010 were analysed. The Mann-Kendall test and the correlation analysis method were used and their sensitivities to land degradation were evaluated. The results showed that the three vegetation indicators (NDVIa, NPP and RUE) had a downward trend with both methods over the past 10 years and the land was degraded. The analysis of the three vegetation indicators indicated a decreasing trend in 62.57%, 74.16% and 88.56% of the study area according to the Mann-Kendall test and in 57.85%, 68.38% and 85.29% according to the correlation analysis method. However, most change trends were not significant; trends significant at the 95% confidence level accounted for only a small proportion of the area. Analysis of the NDVIa, NPP and RUE series showed a significant decreasing trend in 9.21%, 4.81% and 6.51% of the area with the Mann-Kendall test. The NPP change trends showed an obvious positive link with precipitation in the study area, while the effect of the inter-annual variation of precipitation on RUE was small; the vegetation RUE can therefore provide valuable insights into the status of land condition and had the best sensitivity to land degradation.
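
    A minimal sketch of the Mann-Kendall trend test applied to a single pixel's annual indicator series; in practice it would be run per pixel over the image stack. The implementation below is the standard S statistic with the normal approximation (no tie correction), and the data are illustrative:

        # Mann-Kendall trend test sketch for one annual vegetation-indicator series.
        import numpy as np
        from scipy import stats

        def mann_kendall(x):
            n = len(x)
            s = 0
            for i in range(n - 1):
                s += np.sum(np.sign(x[i + 1:] - x[i]))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0       # variance without tie correction
            if s > 0:
                z = (s - 1) / np.sqrt(var_s)
            elif s < 0:
                z = (s + 1) / np.sqrt(var_s)
            else:
                z = 0.0
            p = 2 * (1 - stats.norm.cdf(abs(z)))           # two-sided p-value
            return s, z, p

        ndvi = np.array([0.42, 0.40, 0.41, 0.39, 0.38, 0.37, 0.38, 0.35, 0.34, 0.33])
        print(mann_kendall(ndvi))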

  2. How cyanobacteria pose new problems to old methods: challenges in microarray time series analysis

    Czech Academy of Sciences Publication Activity Database

    Lehmann, R.; Machné, R.; Georg, J.; Benary, M.; Axman, I. M.; Steuer, Ralf

    2013-01-01

    Vol. 14, No. 133 (2013) ISSN 1471-2105 R&D Projects: GA MŠk(CZ) EE2.3.20.0256 Institutional support: RVO:67179843 Keywords: gene-expression data * growing neural-network * Scycle-regulated genes * cell-cycle * cluster-analysis * normalization * patterns * identification * calibration * intensities Subject RIV: EH - Ecology, Behaviour Impact factor: 2.672, year: 2013

  3. Time series analysis of reference crop evapotranspiration for Bokaro District, Jharkhand, India

    Directory of Open Access Journals (Sweden)

    Gautam Ratnesh

    2016-09-01

    Full Text Available Evapotranspiration is one of the major elements of the water cycle. More accurate measurement and forecasting of evapotranspiration would enable more efficient water resources management. This study is therefore focused on evapotranspiration modelling and forecasting, since forecasting would provide better information for optimal water resources management. There are numerous techniques for evapotranspiration forecasting, including autoregressive (AR) and moving average (MA) models, autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), Thomas-Fiering, etc. Out of these models, the ARIMA model has been found to be more suitable for analysis and forecasting of hydrological events. Therefore, in this study ARIMA models have been used for forecasting of mean monthly reference crop evapotranspiration by stochastic analysis. The data series of 102 years, i.e. 1224 months, for Bokaro District were used for analysis and forecasting. Different orders of ARIMA model were selected on the basis of the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the data series. The maximum likelihood method was used for determining the parameters of the models. Based on the statistical parameters of the models, the best-fitted model is ARIMA (0, 1, 4)(0, 1, 1)12.
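
    A minimal sketch of the ACF/PACF-based identification step described above, using statsmodels on a differenced monthly series; the data are simulated and the plots only guide the choice of orders:

        # ACF/PACF identification sketch for a monthly series before ARIMA fitting.
        import numpy as np
        import pandas as pd
        import matplotlib.pyplot as plt
        from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

        months = pd.date_range("1901-01", periods=1224, freq="MS")
        et0 = pd.Series(4 + 2 * np.sin(2 * np.pi * np.arange(1224) / 12)
                        + np.random.randn(1224) * 0.3, index=months)

        diff = et0.diff().dropna()                 # non-seasonal differencing (d = 1)
        fig, axes = plt.subplots(2, 1, figsize=(8, 6))
        plot_acf(diff, lags=36, ax=axes[0])        # suggests candidate MA order(s)
        plot_pacf(diff, lags=36, ax=axes[1])       # suggests candidate AR order(s)
        plt.show()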

  4. A Study of Wavelet Analysis and Data Extraction from Second-Order Self-Similar Time Series

    Directory of Open Access Journals (Sweden)

    Leopoldo Estrada Vargas

    2013-01-01

    Full Text Available Statistical analysis and synthesis of self-similar discrete time signals are presented. The analysis equation is formally defined through a special family of basis functions of which the simplest case matches the Haar wavelet. The original discrete time series is synthesized without loss by a linear combination of the basis functions after some scaling, displacement, and phase shift. The decomposition is then used to synthesize a new second-order self-similar signal with a different Hurst index than the original. The components are also used to describe the behavior of the estimated mean and variance of self-similar discrete time series. It is shown that the sample mean, although it is unbiased, provides less information about the process mean as the Hurst index becomes higher. It is also demonstrated that the classical variance estimator is biased and that the widely accepted aggregated variance-based estimator of the Hurst index turns out biased, not due to its nature (it is unbiased and has minimal variance) but due to flaws in its implementation. Using the proposed decomposition, the correct estimation of the Variance Plot is described, as well as its close association with the popular Logscale Diagram.

  5. Time-Series Analysis of Continuously Monitored Blood Glucose: The Impacts of Geographic and Daily Lifestyle Factors

    Directory of Open Access Journals (Sweden)

    Sean T. Doherty

    2015-01-01

    Full Text Available Type 2 diabetes is known to be associated with environmental, behavioral, and lifestyle factors. However, the actual impacts of these factors on blood glucose (BG) variation throughout the day have remained relatively unexplored. Continuous blood glucose monitors combined with human activity tracking technologies afford new opportunities for exploration in a naturalistic setting. Data from a study of 40 patients with diabetes are utilized in this paper, including continuously monitored BG, food/medicine intake, and patient activity/location tracked using global positioning systems over a 4-day period. Standard linear regression and more disaggregated time-series analysis using autoregressive integrated moving average (ARIMA) models are used to explore patient BG variation throughout the day and over space. The ARIMA models revealed a wide variety of BG correlating factors related to specific activity types, locations (especially those far from home), and travel modes, although the impacts were highly personal. Traditional variables related to food intake and medications were less often significant. Overall, the time-series analysis revealed considerable patient-by-patient variation in the effects of geographic and daily lifestyle factors. We would suggest that maps of BG spatial variation or an interactive messaging system could provide new tools to engage patients and highlight potential risk factors.

  6. Explorative analysis of long time series of very high resolution spatial rainfall

    DEFF Research Database (Denmark)

    Thomassen, Emma Dybro; Sørup, Hjalte Jomo Danielsen; Scheibel, Marc

    2017-01-01

    For each method, a set of 17 variables is used to describe the properties of each event, e.g. duration, maximum volumes, spatial coverage and heterogeneity, and movement of cells. A total of 5-9 dimensions can be found in the data, which can be interpreted as a rough indication of how many independent... simple scaling across the set of variables, i.e. the level of each variable varies significantly, but not the overall structure of the spatial precipitation. The analyses show that there is a good potential for making a spatial weather generator for high spatio-temporal resolution precipitation...

  7. Time Series Analysis of Monte Carlo Fission Sources - I: Dominance Ratio Computation

    International Nuclear Information System (INIS)

    Ueki, Taro; Brown, Forrest B.; Parsons, D. Kent; Warsa, James S.

    2004-01-01

    In the nuclear engineering community, the error propagation of the Monte Carlo fission source distribution through cycles is known to be a linear Markov process when the number of histories per cycle is sufficiently large. In the statistics community, linear Markov processes with linear observation functions are known to have an autoregressive moving average (ARMA) representation of orders p and p - 1. Therefore, one can perform ARMA fitting of the binned Monte Carlo fission source in order to compute physical and statistical quantities relevant to nuclear criticality analysis. In this work, the ARMA fitting of a binary Monte Carlo fission source has been successfully developed as a method to compute the dominance ratio, i.e., the ratio of the second-largest to the largest eigenvalues. The method is free of binning mesh refinement and does not require the alteration of the basic source iteration cycle algorithm. Numerical results are presented for problems with one-group isotropic, two-group linearly anisotropic, and continuous-energy cross sections. Also, a strategy for the analysis of eigenmodes higher than the second-largest eigenvalue is demonstrated numerically

  8. Measuring Coupling of Rhythmical Time Series Using Cross Sample Entropy and Cross Recurrence Quantification Analysis

    Directory of Open Access Journals (Sweden)

    John McCamley

    2017-01-01

    Full Text Available The aim of this investigation was to compare and contrast the use of cross sample entropy (xSE) and cross recurrence quantification analysis (cRQA) measures for the assessment of coupling of rhythmical patterns. Measures were assessed using simulated signals with regular, chaotic, and random fluctuations in frequency, amplitude, and a combination of both. Biological data were studied as models of normal and abnormal locomotor-respiratory coupling. Nine signal types were generated for seven frequency ratios. Fifteen patients with COPD (abnormal coupling) and twenty-one healthy controls (normal coupling) walked on a treadmill at three speeds while breathing and walking were recorded. xSE and the cRQA measures of percent determinism, maximum line, mean line, and entropy were quantified for both the simulated and experimental data. In the simulated data, xSE, percent determinism, and entropy were influenced by the frequency manipulation. The 1 : 1 frequency ratio was different from the other frequency ratios for almost all measures and/or manipulations. The patients with COPD used a 2 : 3 ratio more often, and xSE, percent determinism, maximum line, mean line, and cRQA entropy were able to discriminate between the groups. Analysis of the effects of walking speed indicated that all measures were able to discriminate between speeds.
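
    A minimal sketch of cross sample entropy (xSE) for two standardized series, following a common formulation with template length m and tolerance r; the parameter values and signals are illustrative, and the cRQA measures are not shown:

        # Cross sample entropy sketch for two rhythmical signals.
        import numpy as np

        def cross_sample_entropy(u, v, m=2, r=0.2):
            u = (u - u.mean()) / u.std()
            v = (v - v.mean()) / v.std()
            nt = len(u) - m                      # number of templates (same for m and m+1)

            def match_prob(length):
                idx = np.arange(nt)[:, None] + np.arange(length)   # template index matrix
                u_templates = u[idx]             # shape (nt, length)
                v_templates = v[idx]
                count = 0
                for i in range(nt):
                    dists = np.max(np.abs(v_templates - u_templates[i]), axis=1)
                    count += np.sum(dists <= r)
                return count / (nt * nt)

            return -np.log(match_prob(m + 1) / match_prob(m))

        t = np.linspace(0, 20 * np.pi, 2000)
        breathing = np.sin(t)
        walking = np.sin(1.5 * t + 0.2) + 0.05 * np.random.randn(t.size)
        print(cross_sample_entropy(breathing, walking))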

  9. A time-series analysis of energy-related carbon emissions in Korea

    International Nuclear Information System (INIS)

    Ki-Hong Choi; Ang, B.W.

    2001-01-01

    Energy-related carbon emissions and their relationships with energy consumption and GNP in Korea are studied from 1961 to 1998. The ratio of carbon emissions to GNP is expressed as the product of the aggregate carbon factor and the energy intensity. Changes in the aggregate carbon factor are decomposed into the impacts associated with the fuel carbon factor and the fuel mix, using the Divisia index approach. The analysis is carried out using two sets of data, with and without wood consumption as an energy source, and very different results are obtained. This shows that carbon emission studies for developing countries based on commercial energy consumption only may have to be interpreted with caution. Our analysis also reveals that the impact of the energy intensity on carbon emissions is greater than that of the aggregate carbon factor. This finding supports the assertion made in earlier studies that the energy intensity is a more meaningful indicator than the aggregate carbon factor in the study of climate change resulting from energy-related emissions. (author)
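
    A minimal sketch of an additive log-mean Divisia (LMDI) decomposition of the change in the aggregate carbon factor (emissions per unit energy) into a fuel carbon-factor effect and a fuel-mix effect, in the spirit of the decomposition described above; the two-fuel data are invented for illustration:

        # Additive LMDI decomposition sketch of the aggregate carbon factor.
        import numpy as np

        def logmean(a, b):
            return (a - b) / (np.log(a) - np.log(b)) if a != b else a

        # Per-fuel energy use (PJ) and carbon emissions (Mt C) in year 0 and year T (toy data).
        energy_0 = {"coal": 500.0, "gas": 200.0}
        carbon_0 = {"coal": 12.5, "gas": 3.0}
        energy_T = {"coal": 450.0, "gas": 400.0}
        carbon_T = {"coal": 11.0, "gas": 6.0}

        e0, eT = sum(energy_0.values()), sum(energy_T.values())
        effect_factor, effect_mix = 0.0, 0.0
        for i in energy_0:
            w0 = carbon_0[i] / e0                 # fuel i's contribution to C/E in year 0
            wT = carbon_T[i] / eT
            f0, fT = carbon_0[i] / energy_0[i], carbon_T[i] / energy_T[i]   # fuel carbon factors
            s0, sT = energy_0[i] / e0, energy_T[i] / eT                     # fuel shares
            effect_factor += logmean(wT, w0) * np.log(fT / f0)
            effect_mix += logmean(wT, w0) * np.log(sT / s0)

        total_change = sum(carbon_T.values()) / eT - sum(carbon_0.values()) / e0
        print(effect_factor + effect_mix, total_change)    # the two should match exactly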

  10. Time series analysis of ambient air concentrations in Alexandria and Nile delta region, Egypt

    International Nuclear Information System (INIS)

    EI Raev, M.; Shalaby, E.A.; Ghatass, Z.F.; Marey, H.S.

    2007-01-01

    Data collected from the Air Monitoring Network of Alexandria and the Delta (EEAA/EIMP program) were analyzed. Emphasis is given to the indicator pollutants PM10, NO2, SO2, O3 and CO. Two sites were selected in Alexandria (IGSR and Shohada) and three sites in the Delta region (Kafr Elzyat, Mansoura and Mahalla) for analysis of three years from 2000-2002. Box-Jenkins modeling has been used mainly for forecasting and assessing the relative importance of various parameters or pollutants. Results showed that the autoregressive (AR) order for all series ranged from 0-2 except NO2 at the Mansoura site. The moving average order also ranged from 0-2 except CO at the IGSR site. Nitrogen dioxide and ozone at the IGSR site have the same ARIMA model, which is (0, 1, 2). Cross-correlation analysis has revealed important information on the dynamics, chemistry and interpretation of ambient pollution. Cross-correlation functions of SO2 and PM10 at the IGSR site suggest that sulfur dioxide is adsorbed on the surface of particulates, which have an alkaline nature. This enhances the oxidation of sulfur dioxide to sulfate, which results in low levels of SO2 in spite of the presence of sources

  11. Qualitative phase space reconstruction analysis of supply-chain inventory time series

    Directory of Open Access Journals (Sweden)

    Jinliang Wu

    2010-10-01

    Full Text Available Economic systems are usually too complex to be analysed directly, but some advanced methods have been developed to do so, such as system dynamics modelling, multi-agent modelling, complex adaptive system modelling and qualitative modelling. In this paper, we considered a supply-chain (SC) system including several kinds of products. Using historic suppliers' demand data, we first applied the phase space analysis method and then used qualitative analysis to improve the complex system's performance. Quantitative methods can forecast SC demands, but they cannot indicate the qualitative aspects of the SC, so when we apply quantitative methods to an SC system we obtain only large amounts of numerical demand data. By contrast, qualitative methods can show the qualitative change and trend of the SC demand. We therefore used qualitative methods to improve the quantitative forecasting results. Comparing the quantitative-only method and the combined method used in this paper, we found that the combined method is far more accurate. Not only is the inventory cost lower, but the forecasting accuracy is also better.
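
    A minimal sketch of phase space reconstruction by time-delay embedding, the quantitative step underlying the analysis described above; the delay, embedding dimension and demand series are illustrative:

        # Time-delay embedding sketch: reconstruct a phase space from a scalar demand series.
        import numpy as np

        def delay_embed(x, dim=3, tau=2):
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

        demand = np.sin(np.linspace(0, 40, 400)) + 0.1 * np.random.randn(400)
        points = delay_embed(demand, dim=3, tau=5)     # each row is one phase-space point
        print(points.shape)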

  12. Intensive time series data exploitation: the Multi-sensor Evolution Analysis (MEA) platform

    Science.gov (United States)

    Mantovani, Simone; Natali, Stefano; Folegani, Marco; Scremin, Alessandro

    2014-05-01

    The monitoring of the temporal evolution of natural phenomena must be performed in order to ensure their correct description and to allow improvements in modelling and forecast capabilities. This assumption, which is obvious for ground-based measurements, has not always held for data collected from space-based platforms: except for geostationary satellites and sensors, which provide very effective monitoring of phenomena with geometric scales from regional to global, smaller phenomena (with characteristic dimensions below a few kilometres) have been monitored with instruments that could collect data only at intervals of several days, and bi-temporal techniques have for years been the most used ones for characterising temporal changes and identifying specific phenomena. As the number of flying sensors has grown and their performance improved, so has their capability of monitoring natural phenomena at smaller geographic scales: we can now count on tens of years of remotely sensed data, collected by hundreds of sensors and accessible to a wide user community, and data processing techniques have to be adapted to move toward data-intensive exploitation. Starting from 2008, the European Space Agency initiated the development of the Multi-sensor Evolution Analysis (MEA) platform (https://mea.eo.esa.int), whose first aim was to permit the access and exploitation of long-term remotely sensed satellite data from different platforms: 15 years of global (A)ATSR data together with 5 years of regional AVNIR-2 data were loaded into the system and were used, through a web-based graphic user interface, for land cover change analysis. The MEA data availability has grown over the years, integrating multi-disciplinary data that feature spatial and temporal dimensions: so far tens of terabytes of data in the land and atmosphere domains are available and can be visualized and exploited, keeping the

  13. Wavelet analysis as a tool to characterise and remove environmental noise from self-potential time series

    Directory of Open Access Journals (Sweden)

    M. Ragosta

    2004-06-01

    Full Text Available Multiresolution wavelet analysis of self-potential signals and rainfall levels is performed for extracting fluctuations in electrical signals which might be attributed to meteorological variability. In the time-scale domain of the wavelet transform, rain data are used as markers to single out those wavelet coefficients of the electric signal which can be considered relevant to the environmental disturbance. Then these coefficients are filtered out and the signal is recovered by anti-transforming the retained coefficients. Such a methodological approach might be applied to characterise unwanted environmental noise. It can also be considered a practical technique to remove noise that can hamper the correct assessment and use of electrical techniques for the monitoring of geophysical phenomena.

  14. Singular Spectrum Analysis for Astronomical Time Series: Constructing a Parsimonious Hypothesis Test

    Science.gov (United States)

    Greco, G.; Kondrashov, D.; Kobayashi, S.; Ghil, M.; Branchesi, M.; Guidorzi, C.; Stratta, G.; Ciszak, M.; Marino, F.; Ortolan, A.

    We present a data-adaptive spectral method - Monte Carlo Singular Spectrum Analysis (MC-SSA) - and its modification to tackle astrophysical problems. Through numerical simulations we show the ability of MC-SSA to deal with 1/f^β power-law noise affected by photon counting statistics. Such a noise process is simulated by a first-order autoregressive AR(1) process corrupted by intrinsic Poisson noise. In doing so, we statistically estimate a basic stochastic variation of the source and the corresponding fluctuations due to the quantum nature of light. In addition, the MC-SSA test retains its effectiveness even when a significant percentage of the signal falls below a certain level of detection, e.g., caused by the instrument sensitivity. The parsimonious approach presented here may be broadly applied, from the search for extrasolar planets to the extraction of low-intensity coherent phenomena probably hidden in high energy transients.
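
    A minimal sketch of the basic SSA step underlying MC-SSA: embed the series in a trajectory matrix, take its SVD, and reconstruct a component from a chosen group of eigentriples. The window length and grouping are illustrative, and the Monte Carlo significance test against AR(1) surrogates is not shown:

        # Basic singular spectrum analysis (SSA) sketch: embed, decompose, reconstruct.
        import numpy as np

        def ssa_reconstruct(x, window, components):
            n = len(x)
            k = n - window + 1
            traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
            u, s, vt = np.linalg.svd(traj, full_matrices=False)
            approx = sum(s[j] * np.outer(u[:, j], vt[j]) for j in components)
            # Diagonal averaging (Hankelization) back to a series of length n.
            rec = np.zeros(n)
            counts = np.zeros(n)
            for i in range(window):
                for j in range(k):
                    rec[i + j] += approx[i, j]
                    counts[i + j] += 1
            return rec / counts

        t = np.arange(500)
        signal = np.sin(2 * np.pi * t / 50) + 0.5 * np.random.randn(t.size)
        oscillation = ssa_reconstruct(signal, window=100, components=[0, 1])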

  15. An example of utilization of Kalman filters in time series analysis

    International Nuclear Information System (INIS)

    Marseguerra, M.; Porceddu, C.M.

    1987-01-01

    In reactor noise analysis the fluctuation of many interesting signals may be described by linear models such as the AR, ARMA or ARMAX ones. Another interesting approach of increasing importance is the Kalman filter methodology. In this paper a linear system described by an autoregressive AR(2) model is considered, and it is investigated whether the Kalman filter is capable of correctly estimating the parameters, together with their accuracies, both in the stationary state and in the case of a sudden variation of the parameters. In addition, a more complex situation is considered in which the stationary system under investigation feeds a sensor which delivers the observed signal. Assuming the system obeys an AR(2) model and the sensor a simpler AR(1) model, the problem is that of recovering the system output from the measured signal.
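
    A minimal sketch of using a Kalman filter to track the two coefficients of an AR(2) signal, treating the coefficients as a slowly varying state and the pair of previous observations as the measurement matrix; the noise levels and the simulated parameter jump are illustrative, not the paper's setup:

        # Kalman filter sketch: estimate AR(2) coefficients, allowing a sudden change.
        import numpy as np

        np.random.seed(0)
        n = 2000
        x = np.zeros(n)
        for t in range(2, n):
            a1, a2 = (1.2, -0.5) if t < n // 2 else (0.6, -0.3)   # parameters jump mid-way
            x[t] = a1 * x[t - 1] + a2 * x[t - 2] + np.random.randn()

        theta = np.zeros(2)              # state: current estimate of (a1, a2)
        p = np.eye(2)                    # state covariance
        q = np.eye(2) * 1e-4             # random-walk process noise (allows tracking changes)
        r = 1.0                          # observation noise variance

        estimates = np.zeros((n, 2))
        for t in range(2, n):
            h = np.array([x[t - 1], x[t - 2]])       # observation (regression) vector
            p = p + q                                # predict step
            s = h @ p @ h + r                        # innovation variance (scalar)
            k = p @ h / s                            # Kalman gain
            theta = theta + k * (x[t] - h @ theta)   # update parameter estimates
            p = p - np.outer(k, h) @ p
            estimates[t] = theta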

  16. Analysis of time series exposure rates obtained at a monitoring station around nuclear power stations

    International Nuclear Information System (INIS)

    Urabe, I.; Ogawa, Y.; Kimura, Y.; Honda, Y.; Nakashima, Y.; Yoshimoto, T.; Tsujimoto, T.

    1991-01-01

    From the investigation of the variation of air absorbed dose (AAD) rates monitored in the natural environment around a nuclear power station, the following may be concluded: (1) Differences between monthly averaged AAD rates given by all data obtained and those obtained in fine weather become larger in winter (from Dec. to Feb.). (2) Cumulative frequency distributions of AAD rates are very different among the four seasons. Remarkably high AAD rates are observed during heavy rains in summer and snowfalls or rains in winter. (3) Though the hypothesis that the frequency distribution of AAD rates fits the lognormal distribution cannot be accepted by a chi-square test, the upper part of the frequency distribution of AAD rates agrees approximately with the lognormal one. (4) Identification of AAD rates due to plume exposure may be possible by statistical analysis assuming a lognormal distribution of AAD rates, as well as by the discrimination method based on the reference standard using mean values and standard deviations of the data obtained in fine weather. (author)

  17. Time series analysis applied to construct US natural gas price functions for groups of states

    International Nuclear Information System (INIS)

    Kalashnikov, V.V.; Matis, T.I.; Perez-Valdes, G.A.

    2010-01-01

    The study of natural gas markets took a considerably new direction after the liberalization of the natural gas markets during the early 1990s. As a result, several problems and research opportunities arose for those studying the natural gas supply chain, particularly the marketing operations. Consequently, various studies have been undertaken about the econometrics of natural gas. Several models have been developed and used for different purposes, from descriptive analysis to practical applications such as price and consumption forecasting. In this work, we address the problem of finding a pooled regression formula relating the monthly figures of price and consumption volumes for each state of the United States during the last twenty years. The model thus obtained is used as the basis for the development of two methods aimed at classifying the states into groups sharing a similar price/consumption relationship: a dendrogram application, and an heuristic algorithm. The details and further applications of these grouping techniques are discussed, along with the ultimate purpose of using this pooled regression model to validate data employed in the stochastic optimization problem studied by the authors.

  18. Time series analysis applied to construct US natural gas price functions for groups of states

    Energy Technology Data Exchange (ETDEWEB)

    Kalashnikov, V.V. [Departamento de Ingenieria Industrial y de Sistemas, Tecnologico de Monterrey, Av. Eugenio Garza Sada 2501 Sur, Col. Tecnologico, Monterrey, Nuevo Leon, 64849 (Mexico); Matis, T.I. [Deparment of Industrial Engineering, Texas Tech University, 2500 Broadway, Lubbock, TX 79409 (United States); Perez-Valdes, G.A. [Departamento de Ingenieria Industrial y de Sistemas, Tecnologico de Monterrey, Av. Eugenio Garza Sada 2501 Sur, Col. Tecnologico, Monterrey, Nuevo Leon, 64849 (Mexico); Deparment of Industrial Engineering, Texas Tech University, 2500 Broadway, Lubbock, TX 79409 (United States)

    2010-07-15

    The study of natural gas markets took a considerably new direction after the liberalization of the natural gas markets during the early 1990s. As a result, several problems and research opportunities arose for those studying the natural gas supply chain, particularly the marketing operations. Consequently, various studies have been undertaken about the econometrics of natural gas. Several models have been developed and used for different purposes, from descriptive analysis to practical applications such as price and consumption forecasting. In this work, we address the problem of finding a pooled regression formula relating the monthly figures of price and consumption volumes for each state of the United States during the last twenty years. The model thus obtained is used as the basis for the development of two methods aimed at classifying the states into groups sharing a similar price/consumption relationship: a dendrogram application, and an heuristic algorithm. The details and further applications of these grouping techniques are discussed, along with the ultimate purpose of using this pooled regression model to validate data employed in the stochastic optimization problem studied by the authors. (author)

  19. The Tracking Meteogram, an AWIPS II Tool for Time-Series Analysis

    Science.gov (United States)

    Burks, Jason Eric; Sperow, Ken

    2015-01-01

    A new tool has been developed for the National Weather Service (NWS) Advanced Weather Interactive Processing System (AWIPS) II through collaboration between NASA's Short-term Prediction Research and Transition (SPoRT) and the NWS Meteorological Development Laboratory (MDL). Referred to as the "Tracking Meteogram", the tool aids NWS forecasters in assessing meteorological parameters associated with moving phenomena. The tool aids forecasters in severe weather situations by providing valuable satellite and radar derived trends such as cloud top cooling rates, radial velocity couplets, reflectivity, and information from ground-based lightning networks. The Tracking Meteogram tool also aids in synoptic and mesoscale analysis by tracking parameters such as the deepening of surface low pressure systems, changes in surface or upper air temperature, and other properties. The tool provides a valuable new functionality and demonstrates the flexibility and extensibility of the NWS AWIPS II architecture. In 2014, the operational impact of the tool was formally evaluated through participation in the NOAA/NWS Operations Proving Ground (OPG), a risk reduction activity to assess performance and operational impact of new forecasting concepts, tools, and applications. Performance of the Tracking Meteogram Tool during the OPG assessment confirmed that it will be a valuable asset to the operational forecasters. This presentation reviews development of the Tracking Meteogram tool, performance and feedback acquired during the OPG activity, and future goals for continued support and extension to other application areas.

  20. Sustainability of Italian budgetary policies: a time series analysis (1862-2013)

    Directory of Open Access Journals (Sweden)

    Gordon L. Brady

    2017-12-01

    Full Text Available In this paper, we analyze the sustainability of Italian public finances using a unique database covering the period 1862-2013. This paper focuses on empirical tests for the sustainability and solvency of fiscal policies. A necessary but not sufficient condition implies that the growth rate of public debt should in the limit be smaller than the asymptotic rate of interest. In addition, the debt-to-GDP ratio must eventually stabilize at a steady-state level. The results of unit root and stationarity tests show that the variables are non-stationary at levels, but stationary in first-differences form, i.e., they are I(1). However, some breaks in the series emerge, given internal and external crises (wars, oil shocks, regime changes, institutional reforms). Therefore, the empirical analysis is conducted for the entire period, as well as two sub-periods (1862-1913 and 1947-2013). Moreover, anecdotal evidence and visual inspection of the series confirm our results. Furthermore, we conduct cointegration tests, which show that a long-run relationship between public expenditure and revenues holds only for the first sub-period (1862-1913). In essence, the paper's results reveal that Italy has had sustainability problems in the Republican age.
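    The unit-root and cointegration tests mentioned above can be sketched with standard tools. The snippet below is illustrative only, not the authors' code: it builds two artificial I(1) series standing in for expenditure and revenues, applies the augmented Dickey-Fuller test to levels and first differences, and then runs an Engle-Granger cointegration test using statsmodels.

      import numpy as np
      from statsmodels.tsa.stattools import adfuller, coint

      # Illustrative series standing in for public revenues and expenditure.
      rng = np.random.default_rng(1)
      n = 150
      revenues = np.cumsum(rng.normal(size=n))               # I(1) by construction
      expenditure = revenues + rng.normal(scale=0.5, size=n)

      # 1) Unit-root (ADF) test on levels and on first differences.
      for label, series in [("levels", expenditure),
                            ("first differences", np.diff(expenditure))]:
          stat, pvalue, *_ = adfuller(series, autolag="AIC")
          print(f"ADF on {label}: stat={stat:.2f}, p={pvalue:.3f}")

      # 2) Engle-Granger cointegration test between expenditure and revenues.
      stat, pvalue, _ = coint(expenditure, revenues)
      print(f"Engle-Granger cointegration: stat={stat:.2f}, p={pvalue:.3f}")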

  1. Digital signal processing for the Johnson noise thermometry: a time series analysis of the Johnson noise

    International Nuclear Information System (INIS)

    Moon, Byung Soo; Hwang, In Koo; Chung, Chong Eun; Kwon, Kee Choon; David, E. H.; Kisner, R.A.

    2004-06-01

    In this report, we first proved that a random signal obtained by taking the sum of a set of single-frequency signals generates a continuous Markov process. We used this random signal to simulate the Johnson noise and verified that Johnson noise thermometry can improve measurements of the reactor coolant temperature to an accuracy better than 0.14%. Secondly, by using this random signal we determined the optimal sampling rate when the frequency band of the Johnson noise signal is given. We also describe our examination of how good the linearity of the Johnson noise is and how large the relative temperature error can become as the temperature increases. Thirdly, we describe the results of our analysis of a set of Johnson noise signal blocks taken from a simple electric circuit. We showed that the properties of the continuous Markov process are satisfied even when some channel noises are present. Finally, we describe the algorithm we devised to handle the problem of the time lag in the long-term average or the moving average in a transient state. The algorithm is based on the Haar wavelet and estimates the transient temperature with a much smaller time delay. We have shown that the algorithm can track the transient temperature successfully.

  2. Suicide in Greece 1992-2012: A time-series analysis.

    Science.gov (United States)

    Papaslanis, Theodoros; Kontaxakis, Vassilis; Christodoulou, Christos; Konstantakopoulos, George; Kontaxaki, Maria-Irini; Papadimitriou, George N

    2016-08-01

    Since 2008, Greece has entered a long period of economic crisis with adverse effects on various aspects of daily life. In this context, it is important to examine suicide trends in Greece. Our analysis covered the period 1992-2012; 2012 was the last year for which official suicide data were available. The inclusion of data for the pre-crisis period enabled us to assess trends in suicide preceding the economic crisis that started in 2008. Trends in sex- and age-adjusted standardized suicide rates (SSR) were analyzed using joinpoint regression. Total SSR presented a statistically significant annual decrease of 0.89% (95% confidence interval (CI): -1.7, -0.1) during the period 1992-2008. After 2009, total SSR showed a statistically significant annual increase (12.48%; 95% CI: 0.3%, 26.1%). SSR in males presented an initial period of modest annual decrease (-0.84%; 95% CI: -1.6%, -0.1%) during the period 1992-2008. After 2009, an annual increase of 9.25% (95% CI: 2.7%, 16.3%) was revealed. No change in the female SSR trend was observed during the studied period. According to the results of this study, there is clear evidence of an increase in the overall SSR and male SSR in Greece during the period of the current financial crisis. © The Author(s) 2016.

  3. Association of Attorney Advertising and FDA Action with Prescription Claims: A Time Series Segmented Regression Analysis.

    Science.gov (United States)

    Tippett, Elizabeth C; Chen, Brian K

    2015-12-01

    Attorneys sponsor television advertisements that include repeated warnings about adverse drug events to solicit consumers for lawsuits against drug manufacturers. The relationship between such advertising, safety actions by the US Food and Drug Administration (FDA), and healthcare use is unknown. To investigate the relationship between attorney advertising, FDA actions, and prescription drug claims. The study examined total users per month and prescription rates for seven drugs with substantial attorney advertising volume and FDA or other safety interventions during 2009. Segmented regression analysis was used to detect pre-intervention trends, post-intervention level changes, and changes in post-intervention trends relative to the pre-intervention trends in the use of these seven drugs, using advertising volume, media hits, and the number of Medicare enrollees as covariates. Data for these variables were obtained from the Centers for Medicare and Medicaid Services, Kantar Media, and LexisNexis. Several types of safety actions were associated with reductions in drug users and/or prescription rates, particularly for fentanyl, varenicline, and paroxetine. In most cases, attorney advertising volume rose in conjunction with major safety actions. Attorney advertising volume was positively correlated with prescription rates for five of seven drugs, likely because advertising volume began rising before safety actions, when prescription rates were still increasing. On the other hand, attorney advertising had mixed associations with the number of users per month. Regulatory and safety actions likely reduced the number of users and/or prescription rates for some drugs. Attorneys may have strategically chosen to begin advertising adverse drug events prior to major safety actions, but we found little evidence that attorney advertising reduced drug use. Further research is needed to better understand how consumers and physicians respond to attorney advertising.
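    Segmented regression of an interrupted time series, as used in this and several of the following records, can be sketched in a few lines. The example below is a hedged illustration on simulated data, not the study's model: the covariates t, post and t_after capture the pre-intervention trend, the level change at the intervention, and the change in trend after it.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Illustrative monthly user counts around a hypothetical safety action at month 24.
      rng = np.random.default_rng(2)
      n = 48
      df = pd.DataFrame({"t": np.arange(n)})
      df["post"] = (df["t"] >= 24).astype(int)          # 1 after the intervention
      df["t_after"] = np.maximum(df["t"] - 24, 0)       # months elapsed since it
      df["users"] = (1000 + 5 * df["t"] - 80 * df["post"]
                     - 10 * df["t_after"] + rng.normal(scale=20, size=n))

      # Pre-intervention trend (t), level change (post), change in trend (t_after).
      model = smf.ols("users ~ t + post + t_after", data=df).fit()
      print(model.params)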

  4. Time series analysis of cholera in Matlab, Bangladesh, during 1988-2001.

    Science.gov (United States)

    Ali, Mohammad; Kim, Deok Ryun; Yunus, Mohammad; Emch, Michael

    2013-03-01

    The study examined the impact of in-situ climatic and marine environmental variability on cholera incidence in an endemic area of Bangladesh and developed a forecasting model for understanding the magnitude of incidence. Diarrhoea surveillance data collected between 1988 and 2001 were obtained from a field research site in Matlab, Bangladesh. Cholera cases were defined as Vibrio cholerae O1 isolated from faecal specimens of patients who sought care at treatment centres serving the Matlab population. Cholera incidence for 168 months was correlated with remotely-sensed sea-surface temperature (SST) and in-situ environmental data, including rainfall and ambient temperature. A seasonal autoregressive integrated moving average (SARIMA) model was used for determining the impact of climatic and environmental variability on cholera incidence and evaluating the ability of the model to forecast the magnitude of cholera. There were 4,157 cholera cases during the study period, with an average of 1.4 cases per 1,000 people. Since monthly cholera cases varied significantly by month, the variance of cholera incidence was stabilized by taking the natural logarithm before conducting the analysis. The SARIMA model shows temporal clustering of cholera at one- and 12-month lags. There was a 6% increase in cholera incidence per one degree Celsius increase in minimum temperature in the current month. For an increase in SST of one degree Celsius, there was a 25% increase in cholera incidence in the current month and an 18% increase at a two-month lag. Rainfall did not significantly influence variation in cholera incidence during the study period. The model forecast the fluctuation of cholera incidence in Matlab reasonably well (root mean square error, RMSE: 0.108). Thus, the ambient and sea-surface temperature-based model could be used in forecasting cholera outbreaks in Matlab.
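    A SARIMA fit with climate covariates of the kind described can be sketched with statsmodels. The example below is illustrative rather than the study's model: the monthly case series and the temperature/SST covariates are simulated placeholders, the (1, 0, 0)x(1, 0, 0, 12) order is chosen only to mirror the one- and 12-month structure reported above, and the forecast simply reuses the last year of covariates.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      # Illustrative monthly data: log cholera counts plus temperature and SST.
      rng = np.random.default_rng(3)
      n = 168
      idx = pd.date_range("1988-01-01", periods=n, freq="MS")
      month = np.arange(n)
      exog = pd.DataFrame({"min_temp": 20 + 8 * np.sin(2 * np.pi * month / 12),
                           "sst": 27 + 2 * np.sin(2 * np.pi * month / 12)}, index=idx)
      log_cases = pd.Series(np.log1p(rng.poisson(10, size=n)), index=idx)

      # Seasonal ARIMA with 1- and 12-month structure and exogenous climate terms.
      model = sm.tsa.SARIMAX(log_cases, exog=exog,
                             order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
      res = model.fit(disp=False)
      forecast = res.get_forecast(steps=12, exog=exog.iloc[-12:])  # naive covariate reuse
      print(forecast.predicted_mean.head(3))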

  5. Time Series Analysis of Energy Production and Associated Landscape Fragmentation in the Eagle Ford Shale Play

    Science.gov (United States)

    Pierre, Jon Paul; Young, Michael H.; Wolaver, Brad D.; Andrews, John R.; Breton, Caroline L.

    2017-11-01

    Spatio-temporal trends in infrastructure footprints, energy production, and landscape alteration were assessed for the Eagle Ford Shale of Texas. The analysis covered four 2-year periods (2006-2014). Analyses used high-resolution imagery, as well as pipeline data, to map EF infrastructure. Landscape conditions from 2006 were used as the baseline. Results indicate that infrastructure footprints varied from 94.5 km2 in 2008 to 225.0 km2 in 2014. By 2014, decreased land-use intensities (ratio of land alteration to energy production) were noted play-wide. Core-area alteration by period was highest (3331.6 km2) in 2008 at the onset of play development, and increased from 582.3 to 3913.9 km2 by 2014, though substantial revegetation of localized core areas was observed throughout the study (i.e., alteration improved in some areas and worsened in others). Land-use intensity in the eastern portion of the play was consistently lower than that in the western portion, while core alteration remained relatively constant from east to west. Land alteration from pipeline construction was 65 km2 for all time periods, except in 2010 when alteration was recorded at 47 km2. The percentage of total alteration from well-pad construction increased from 27.3% in 2008 to 71.5% in 2014. The average number of wells per pad across all 27 counties increased from 1.15 to 1.7. This study presents a framework for mapping landscape alteration from oil and gas infrastructure development. However, the framework could be applied to other energy development programs, such as wind or solar fields, or any other regional infrastructure development program.

  6. Thermal-Induced Errors Prediction and Compensation for a Coordinate Boring Machine Based on Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2014-01-01

    Full Text Available To improve the precision of CNC machine tools, a thermal error model for the motorized spindle was proposed based on time series analysis, considering the length of cutting tools and thermal declination angles, and real-time error compensation was implemented. A five-point method was applied to measure radial thermal declinations and axial expansion of the spindle with eddy current sensors, solving the problem that a three-point measurement cannot obtain the radial thermal angle errors. Then the stationarity of the thermal error sequences was determined by the Augmented Dickey-Fuller test, and the autocorrelation/partial autocorrelation functions were applied to identify the model pattern. By combining the Yule-Walker equations with information criteria, the order and parameters of the models were determined effectively, which improved the prediction accuracy and generalization ability. The results indicated that the prediction accuracy of the time series model could reach up to 90%. In addition, the maximum axial error decreased from 39.6 μm to 7 μm after error compensation, and the machining accuracy was improved by 89.7%. Moreover, the X/Y-direction accuracy reached 77.4% and 86%, respectively, which demonstrated that the proposed methods of measurement, modeling, and compensation were effective.
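    The order-selection step (information criteria combined with Yule-Walker estimation) can be sketched as follows. This is a hedged, generic example on a simulated AR(2) error sequence, not the paper's implementation; the statsmodels helpers ar_select_order, yule_walker and AutoReg are assumed here as reasonable stand-ins.

      import numpy as np
      from statsmodels.regression.linear_model import yule_walker
      from statsmodels.tsa.ar_model import AutoReg, ar_select_order

      # Illustrative stationary thermal-error sequence: a true AR(2) process.
      rng = np.random.default_rng(4)
      e = np.zeros(500)
      for t in range(2, 500):
          e[t] = 0.6 * e[t - 1] - 0.3 * e[t - 2] + rng.normal(scale=0.1)

      # Select the order by AIC, then estimate parameters via Yule-Walker.
      sel = ar_select_order(e, maxlag=10, ic="aic")
      order = max(sel.ar_lags) if sel.ar_lags else 1
      rho, sigma = yule_walker(e, order=order)
      fit = AutoReg(e, lags=order).fit()
      print("selected order:", order, "Yule-Walker coefficients:", np.round(rho, 3))
      print("next five predicted errors:", fit.predict(start=len(e), end=len(e) + 4))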

  7. [Time-series analysis of ambient PM₁₀ pollution on residential mortality in Beijing].

    Science.gov (United States)

    Xue, Jiang-li; Wang, Qi; Cai, Yue; Zhou, Mai-geng

    2012-05-01

    To explore the short-term impact of ambient PM(10) on daily non-accidental, cardiovascular and respiratory deaths of residents in Beijing. Mortality data for residents of Beijing during 2006 to 2009 were obtained from the public health surveillance and information service center of the Chinese Center for Disease Control and Prevention; contemporaneous data on average daily air concentrations of PM(10), SO(2) and NO(2) were obtained from the Beijing Environment Protection Bureau (2005-2006) and the public website of Beijing environmental protection (2007-2009), respectively; contemporaneous meteorological data were obtained from the China Meteorological Data Sharing Service System. A generalized additive model (GAM) for time-series analysis was applied. In addition to controlling for confounding factors such as long-term trends, day-of-week effects and meteorological factors, lag effects and the effects of other atmospheric pollutants were also analyzed. During 2006 to 2009, the average daily numbers of non-accidental deaths, respiratory deaths, and cardiovascular and cerebrovascular deaths among Beijing residents were 140.1, 15.0 and 65.8, respectively; contemporaneous medians of average daily air concentrations of PM(10), SO(2) and NO(2) were 123.0, 26.0 and 58.0 µg/m(3), respectively; contemporaneous average atmospheric pressure, temperature and relative humidity were 10.1 kPa, 13.5°C and 51.9%, respectively. An exposure-response relationship between ambient PM(10) and increased daily deaths was found: for every 10 µg/m(3) increase in the daily average concentration of PM(10), there was a 0.1267% (95%CI: 0.0824% - 0.1710%) increase in daily non-accidental deaths, a 0.1365% (95%CI: 0.0010% - 0.2720%) increase in respiratory deaths and a 0.1239% (95%CI: 0.0589% - 0.1889%) increase in cardiovascular deaths. Ambient PM(10) had its greatest influence on daily non-accidental and cardiovascular deaths of the same day, while its greatest influence
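    A time-series regression of daily deaths on PM(10) of this general form can be sketched with a Poisson GLM, smooth terms for time and temperature, and an overdispersion (quasi-Poisson style) adjustment. The snippet below is a hedged stand-in for the GAM actually used, on simulated data; the spline basis bs() comes from patsy, and scale="X2" estimates the dispersion rather than fixing it at 1.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      # Illustrative daily data: deaths, PM10, temperature, day of week, time index.
      rng = np.random.default_rng(5)
      n = 1461
      df = pd.DataFrame({
          "time": np.arange(n),
          "dow": np.tile(np.arange(7), n // 7 + 1)[:n],
          "pm10": rng.gamma(4.0, 30.0, size=n),
          "temp": 13 + 15 * np.sin(2 * np.pi * np.arange(n) / 365.25),
      })
      df["deaths"] = rng.poisson(140 * np.exp(0.00012 * df["pm10"]))

      # Poisson GLM with a smooth time trend, day-of-week and temperature terms;
      # scale="X2" gives quasi-Poisson-style standard errors.
      model = smf.glm("deaths ~ bs(time, df=28) + C(dow) + bs(temp, df=4) + pm10",
                      data=df, family=sm.families.Poisson())
      res = model.fit(scale="X2")
      pct = (np.exp(res.params["pm10"] * 10) - 1) * 100
      print(f"estimated % increase in daily deaths per 10 ug/m3 PM10: {pct:.3f}")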

  8. A novel model for Time-Series Data Clustering Based on piecewise SVD and BIRCH for Stock Data Analysis on Hadoop Platform

    Directory of Open Access Journals (Sweden)

    Ibgtc Bowala

    2017-06-01

    Full Text Available With the rapid growth of financial markets, analysts are paying more attention to prediction. Stock data are time series data of enormous volume. A feasible solution for handling the increasing amount of data is to use a cluster for parallel processing, and the Hadoop parallel computing platform is a typical representative. There are various statistical models for forecasting time series data, but accurate clusters are a pre-requirement. Clustering analysis for time series data is one of the main methods for mining time series data for many other analysis processes. However, general clustering algorithms cannot perform well on time series data because such data have a special structure, high dimensionality, and highly correlated values with a high noise level. A novel model for time series clustering is presented using BIRCH, based on piecewise SVD, leading to a novel dimension reduction approach. Highly correlated features are handled using SVD with a novel approach for dimensionality reduction that preserves the correlated behavior, and BIRCH is then used for clustering. The algorithm is a novel model that can handle massive time series data. Finally, this new model is successfully applied to real stock time series data from Yahoo Finance with satisfactory results.
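    The piecewise-SVD-plus-BIRCH idea can be illustrated with a short sketch. This is a hedged reading of the approach, not the authors' code: each series is split into fixed-width segments, each segment block is projected onto its leading right singular vectors to reduce dimensionality, and BIRCH from scikit-learn then clusters the concatenated features; the data are random placeholders.

      import numpy as np
      from sklearn.cluster import Birch

      # Illustrative stock matrix: 200 series, 240 daily returns each (placeholders).
      rng = np.random.default_rng(6)
      X = rng.normal(size=(200, 240))

      def piecewise_svd(X, n_segments=12, k=2):
          """Reduce each of n_segments column blocks to its top-k SVD projections."""
          features = []
          for seg in np.array_split(X, n_segments, axis=1):
              _, _, Vt = np.linalg.svd(seg, full_matrices=False)
              features.append(seg @ Vt[:k].T)       # project onto leading directions
          return np.hstack(features)

      X_reduced = piecewise_svd(X)                  # 200 x (12 * 2) feature matrix
      labels = Birch(n_clusters=5).fit_predict(X_reduced)
      print(np.bincount(labels))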

  9. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that make it possible to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
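    The contrast between PCA and ICA decompositions of a displacement matrix can be sketched as follows. Note the hedge: scikit-learn's FastICA is used here as a generic stand-in for ICA and is not the variational Bayesian ICA (vbICA) method described above; the station data are simulated from a seasonal source plus a transient source.

      import numpy as np
      from sklearn.decomposition import PCA, FastICA

      # Illustrative GPS-like data: 50 stations, 1000 daily epochs, mixing a
      # seasonal source and a transient (tanh-shaped) source plus noise.
      rng = np.random.default_rng(7)
      t = np.arange(1000)
      sources = np.vstack([np.sin(2 * np.pi * t / 365.25),
                           np.tanh((t - 600) / 50.0)])
      mixing = rng.normal(size=(50, 2))
      X = (mixing @ sources).T + 0.1 * rng.normal(size=(1000, 50))

      pca_modes = PCA(n_components=2).fit_transform(X)   # uncorrelated components
      ica = FastICA(n_components=2, random_state=0)
      ica_modes = ica.fit_transform(X)                   # independent components
      print(pca_modes.shape, ica_modes.shape)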

  10. Seasonal trend analysis and ARIMA modeling of relative humidity and wind speed time series around Yamula Dam

    Science.gov (United States)

    Eymen, Abdurrahman; Köylü, Ümran

    2018-02-01

    Local climate change is determined by analysis of long-term recorded meteorological data. In the statistical analysis of the meteorological data, the Mann-Kendall rank test, one of the non-parametric tests, was used; the Theil-Sen method was used to determine the magnitude of the trend in data obtained from 16 meteorological stations. The stations cover the provinces of Kayseri, Sivas, Yozgat, and Nevşehir in the Central Anatolia region of Turkey. Changes in land use affect local climate. Dams are structures that cause major changes on the land. Yamula Dam is located 25 km northwest of Kayseri. The dam impounds a huge water body of approximately 85 km2. The tests were used to detect the presence of any positive or negative trend in the meteorological data. The meteorological data on the seasonal average, maximum, and minimum values of relative humidity and the seasonal average wind speed were organized as time series, and the tests were conducted accordingly. As a result of these tests, the following were identified: an increase was observed in minimum relative humidity values in the spring, summer, and autumn seasons. As for the seasonal average wind speed, a decrease was detected at nine stations in all seasons, whereas an increase was observed at four stations. After the trend analysis, the pre-dam mean relative humidity time series were modeled with an Autoregressive Integrated Moving Average (ARIMA) model, a statistical modeling tool. Post-dam relative humidity values were then predicted by the ARIMA models.
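    The Mann-Kendall test and the Theil-Sen (Sen's) slope can be sketched compactly; the example below is generic and illustrative (normal approximation, no tie correction), not the study's exact procedure, and the annual series is simulated.

      import numpy as np
      from scipy import stats

      def mann_kendall(x):
          """Mann-Kendall trend test: S statistic and two-sided p-value
          (normal approximation, no correction for ties)."""
          x = np.asarray(x)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
          return s, 2 * (1 - stats.norm.cdf(abs(z)))

      # Illustrative 35-year annual minimum-temperature series with a weak trend.
      rng = np.random.default_rng(8)
      years = np.arange(1979, 2014)
      tmin = 12 + 0.03 * (years - 1979) + rng.normal(scale=0.4, size=years.size)

      s, p = mann_kendall(tmin)
      slope, intercept, lo, hi = stats.theilslopes(tmin, years)
      print(f"MK S={s:.0f}, p={p:.3f}; Sen slope={slope:.3f} degC/yr (95% CI {lo:.3f}, {hi:.3f})")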

  11. Data Rods: High Speed, Time-Series Analysis of Massive Cryospheric Data Sets Using Object-Oriented Database Methods

    Science.gov (United States)

    Liang, Y.; Gallaher, D. W.; Grant, G.; Lv, Q.

    2011-12-01

    Change over time is the central driver of climate change detection. The goal is to diagnose the underlying causes and make projections into the future. In an effort to optimize this process we have developed the Data Rod model, an object-oriented approach that provides the ability to query grid-cell changes and their relationships to neighboring grid cells through time. The time series data are organized in time-centric structures called "data rods." A single data rod can be pictured as the multi-spectral data history at one grid cell: a vertical column of data through time. This resolves the long-standing problem of managing time-series data and opens new possibilities for temporal data analysis. This structure enables rapid time-centric analysis at any grid cell across multiple sensors and satellite platforms. Collections of data rods can be spatially and temporally filtered, statistically analyzed, and aggregated for use with pattern matching algorithms. Likewise, individual image pixels can be extracted to generate multi-spectral imagery at any spatial and temporal location. The Data Rods project has created a series of prototype databases to store and analyze massive datasets containing multi-modality remote sensing data. Using object-oriented technology, this method overcomes the operational limitations of traditional relational databases. To demonstrate the speed and efficiency of time-centric analysis using the Data Rods model, we have developed a sea ice detection algorithm. This application determines the concentration of sea ice in a small spatial region across a long temporal window. If performed using traditional analytical techniques, this task would typically require extensive data downloads and spatial filtering. Using Data Rods databases, the exact spatio-temporal data set is immediately available. No extraneous data are downloaded, and all selected data querying occurs transparently on the server side. Moreover, fundamental statistical

  12. Systematic Analysis of Time-Series Gene Expression Data on Tumor Cell-Selective Apoptotic Responses to HDAC Inhibitors

    Directory of Open Access Journals (Sweden)

    Yun-feng Qi

    2014-01-01

    Full Text Available SAHA (suberoylanilide hydroxamic acid, or vorinostat) is the first nonselective histone deacetylase (HDAC) inhibitor approved by the US Food and Drug Administration (FDA). SAHA affects histone acetylation in chromatin and a variety of nonhistone substrates, thus influencing many cellular processes. In particular, SAHA induces selective apoptosis of tumor cells, although the mechanism is not well understood. A series of microarray experiments was recently conducted to investigate tumor cell-selective proapoptotic transcriptional responses induced by SAHA. Based on that gene expression time series, we propose a novel framework for detailed analysis of the mechanism of tumor cell apoptosis selectively induced by SAHA. Our analyses indicated that SAHA selectively disrupted the DNA damage response, cell cycle, p53 expression, and mitochondrial integrity of tumor samples to induce selective tumor cell apoptosis. Our results suggest a possible regulatory network. Our work extends the existing research.

  13. Analysing Stable Time Series

    National Research Council Canada - National Science Library

    Adler, Robert

    1997-01-01

    We describe how to take a stable, ARMA, time series through the various stages of model identification, parameter estimation, and diagnostic checking, and accompany the discussion with a goodly number...

  14. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  15. Improved vertical displacements induced by a refined thermal expansion model and its quantitative analysis in GPS height time series

    Science.gov (United States)

    Wang, Kaihua; Chen, Hua; Jiang, Weiping; Li, Zhao; Ma, Yifang; Deng, Liansheng

    2018-04-01

    There are apparent seasonal variations in GPS height time series, and thermal expansion is considered to be one of the potential geophysical contributors. The displacements introduced by thermal expansion are usually derived without considering the annex height and underground part of the monument (e.g., monuments located on the roof or top of a building), which may bias the geophysical explanation of the seasonal oscillation. In this paper, improved vertical displacements are derived from a refined thermal expansion model in which the annex height and underground depth of the monument are taken into account, and 560 IGS stations are then adopted to validate the modeled thermal expansion (MTE) displacements. In order to evaluate the impact of thermal expansion on GPS heights, the MTE displacements of 80 IGS stations with fewer data discontinuities are selected for comparison with their observed GPS vertical (OGV) displacements, with the modeled surface loading (MSL) displacements removed in advance. Quantitative analysis shows the maximum annual and semiannual amplitudes of the MTE are 6.65 mm (NOVJ) and 0.51 mm (IISC), respectively, and the maximum peak-to-peak oscillation of the MTE displacements can be 19.4 mm. The average annual amplitude reductions are 0.75 mm and 1.05 mm, respectively, after removing the MTE and MSL displacements from the OGV, indicating the seasonal oscillation induced by thermal expansion is equivalent to >75% of the impact of surface loadings. However, there are rarely significant reductions in the semiannual amplitude. Given the result in this study that thermal expansion can explain 17.3% of the annual amplitude in GPS heights on average, it must be precisely modeled both in GPS precise data processing and in GPS time series analysis, especially for stations located in the middle and high latitudes with larger annual temperature oscillations, or stations with taller monuments.
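    The basic calculation behind such a monument thermal-expansion correction can be illustrated with a simple linear-expansion sketch. This is a hypothetical simplification, not the refined model of the paper: it assumes the above-ground pillar and annex expand with the air-temperature anomaly and the buried part with a separate (attenuated) ground-temperature anomaly, all with a single expansion coefficient.

      def monument_thermal_displacement(alpha, pillar_height, buried_depth,
                                        annex_height, dt_air, dt_ground=0.0):
          """Hypothetical linear-expansion sketch (not the paper's refined model).
          Lengths in metres, temperature anomalies in kelvin, result in millimetres."""
          above = alpha * (pillar_height + annex_height) * dt_air
          below = alpha * buried_depth * dt_ground
          return (above + below) * 1000.0

      # Example: concrete pillar (alpha ~ 1.2e-5 /K), 3 m pillar on a 5 m building
      # annex, 2 m buried, +15 K air anomaly and +5 K ground anomaly.
      print(monument_thermal_displacement(1.2e-5, 3.0, 2.0, 5.0, 15.0, 5.0))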

  16. Atmospheric Pressure and Abdominal Aortic Aneurysm Rupture: Results From a Time Series Analysis and Case-Crossover Study.

    Science.gov (United States)

    Penning de Vries, Bas B L; Kolkert, Joé L P; Meerwaldt, Robbert; Groenwold, Rolf H H

    2017-10-01

    Associations between atmospheric pressure and abdominal aortic aneurysm (AAA) rupture risk have been reported, but empirical evidence is inconclusive and largely derived from studies that did not account for possible nonlinearity, seasonality, and confounding by temperature. Associations between atmospheric pressure and AAA rupture risk were investigated using local meteorological data and a case series of 358 patients admitted to hospital for ruptured AAA during the study period, January 2002 to December 2012. Two analyses were performed: a time series analysis and a case-crossover study. Results from the 2 analyses were similar; neither the time series analysis nor the case-crossover study showed a significant association between atmospheric pressure (P = .627 and P = .625, respectively, for mean daily atmospheric pressure) or atmospheric pressure variation (P = .464 and P = .816, respectively, for 24-hour change in mean daily atmospheric pressure) and AAA rupture risk. This study failed to support claims that atmospheric pressure causally affects AAA rupture risk. In interpreting our results, one should be aware that the range of atmospheric pressure observed in this study is not representative of the atmospheric pressure to which patients with AAA may be exposed, for example, during air travel or travel to high altitudes in the mountains. Making firm claims regarding these conditions in relation to AAA rupture risk is difficult at best. Furthermore, despite the fact that we used one of the largest case series to date to investigate the effect of atmospheric pressure on AAA rupture risk, it is possible that this study is simply too small to demonstrate a causal link.

  17. Modeling and forecasting of the under-five mortality rate in Kermanshah province in Iran: a time series analysis

    Directory of Open Access Journals (Sweden)

    Mehran Rostami

    2015-01-01

    Full Text Available OBJECTIVES: The target of the Fourth Millennium Development Goal (MDG-4) is to reduce the rate of under-five mortality by two-thirds between 1990 and 2015. Despite substantial progress towards achieving the target of the MDG-4 in Iran at the national level, differences at the sub-national levels should be taken into consideration. METHODS: The under-five mortality data available from the Deputy of Public Health, Kermanshah University of Medical Sciences, were used in order to perform a time series analysis of the monthly under-five mortality rate (U5MR) from 2005 to 2012 in Kermanshah province in the west of Iran. After primary analysis, a seasonal auto-regressive integrated moving average model was chosen as the best-fitting model based on model selection criteria. RESULTS: The model was assessed and proved to be adequate in describing variations in the data. However, the unexpected presence of a stochastic increasing trend and a seasonal component with a periodicity of six months in the fitted model are very likely to be consequences of poor quality of data collection and reporting systems. CONCLUSIONS: The present work is the first attempt at time series modeling of the U5MR in Iran, and reveals that improvement of under-five mortality data collection in health facilities and their corresponding systems is a major challenge to fully achieving the MDG-4 in Iran. Studies similar to the present work can enhance the understanding of the invisible patterns in U5MR, monitor progress towards the MDG-4, and predict the impact of future variations on the U5MR.

  18. Modeling and forecasting of the under-five mortality rate in Kermanshah province in Iran: a time series analysis.

    Science.gov (United States)

    Rostami, Mehran; Jalilian, Abdollah; Hamzeh, Behrooz; Laghaei, Zahra

    2015-01-01

    The target of the Fourth Millennium Development Goal (MDG-4) is to reduce the rate of under-five mortality by two-thirds between 1990 and 2015. Despite substantial progress towards achieving the target of the MDG-4 in Iran at the national level, differences at the sub-national levels should be taken into consideration. The under-five mortality data available from the Deputy of Public Health, Kermanshah University of Medical Sciences, were used in order to perform a time series analysis of the monthly under-five mortality rate (U5MR) from 2005 to 2012 in Kermanshah province in the west of Iran. After primary analysis, a seasonal auto-regressive integrated moving average model was chosen as the best-fitting model based on model selection criteria. The model was assessed and proved to be adequate in describing variations in the data. However, the unexpected presence of a stochastic increasing trend and a seasonal component with a periodicity of six months in the fitted model are very likely to be consequences of poor quality of data collection and reporting systems. The present work is the first attempt at time series modeling of the U5MR in Iran, and reveals that improvement of under-five mortality data collection in health facilities and their corresponding systems is a major challenge to fully achieving the MDG-4 in Iran. Studies similar to the present work can enhance the understanding of the invisible patterns in U5MR, monitor progress towards the MDG-4, and predict the impact of future variations on the U5MR.

  19. Prediction and Geometry of Chaotic Time Series

    National Research Council Canada - National Science Library

    Leonardi, Mary

    1997-01-01

    This thesis examines the topic of chaotic time series. An overview of chaos, dynamical systems, and traditional approaches to time series analysis is provided, followed by an examination of state space reconstruction...

  20. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  1. [Predicting Incidence of Hepatitis E in China using Fuzzy Time Series Based on Fuzzy C-Means Clustering Analysis].

    Science.gov (United States)

    Luo, Yi; Zhang, Tao; Li, Xiao-song

    2016-05-01

    To explore the application of a fuzzy time series model based on fuzzy c-means clustering in forecasting the monthly incidence of Hepatitis E in mainland China. A predictive model (fuzzy time series method based on fuzzy c-means clustering) was developed using Hepatitis E incidence data in mainland China between January 2004 and July 2014. The incidence data from August 2014 to November 2014 were used to test the fitness of the predictive model. The forecasting results were compared with those resulting from traditional fuzzy time series models. The fuzzy time series model based on fuzzy c-means clustering had a fitting mean squared error (MSE) of 0.0011 and a forecasting MSE of 6.9775 x 10⁻⁴, compared with 0.0017 and 0.0014, respectively, for the traditional forecasting model. The results indicate that the fuzzy time series model based on fuzzy c-means clustering has a better performance in forecasting the incidence of Hepatitis E.

  2. HiTempo: a platform for time-series analysis of remote-sensing satellite data in a high-performance computing environment

    CSIR Research Space (South Africa)

    Van den Bergh, F

    2012-08-01

    Full Text Available Coarse-resolution earth observation satellites offer large data sets with daily observations at global scales. These data sets represent a rich resource that, because of the high acquisition rate, allows the application of time-series analysis...

  3. Change-point analysis of geophysical time-series: application to landslide displacement rate (Séchilienne rock avalanche, France)

    Science.gov (United States)

    Amorese, D.; Grasso, J.-R.; Garambois, S.; Font, M.

    2018-05-01

    increase in mean displacement rate) in the landslide kinematics. This suggests that an increase in rainfall is able to drive an increase in the landslide displacement rate, but that most of the kinematics of the landslide is not directly attributable to rainfall amount. The detailed exploration of the characteristics of the five kinematic stages suggests that the weekly averaged displacement rates are more tied to the frequency of rainy days than to the rainfall rate values. These results suggest the pattern of the Séchilienne rock avalanche is consistent with previous findings that landslide kinematics depends not only on rainfall but also on soil moisture conditions (which are known to be more strongly related to precipitation frequency than to precipitation amount). Finally, our analysis of the displacement rate time-series pinpoints a change in the susceptibility of the slope's response to rainfall, which was slower before the end of 2009 than after. The kinematic history as depicted by statistical tools opens new routes to understanding the apparent complexity of the Séchilienne landslide kinematics.
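    Generic change-point detection on a displacement-rate series can be sketched as follows. This is an illustration under stated assumptions, not the statistical procedure used in the study: it relies on the third-party ruptures package with a PELT search and a mean-shift (L2) cost, applied to a simulated three-regime series.

      import numpy as np
      import ruptures as rpt   # assumed available; pip install ruptures

      # Illustrative weekly displacement-rate series with two mean shifts.
      rng = np.random.default_rng(9)
      signal = np.concatenate([rng.normal(1.0, 0.3, 200),
                               rng.normal(2.5, 0.3, 150),
                               rng.normal(5.0, 0.3, 100)])

      # PELT search with an L2 (mean-shift) cost; the penalty controls how many
      # change points are accepted.
      algo = rpt.Pelt(model="l2", min_size=20).fit(signal)
      breakpoints = algo.predict(pen=10)   # index of the last sample of each regime
      print(breakpoints)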

  4. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as those in the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, described by a power law in the frequency spectrum S(f) = 1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
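    What it means to estimate the spectral exponent β can be illustrated with the simplest estimator, a straight-line fit to the log-log power spectrum. Note the hedge: this periodogram-slope approach is not the averaged wavelet coefficient method the paper recommends, and the 1/f signal is synthesized by frequency-domain shaping of white noise.

      import numpy as np
      from scipy import signal

      def spectral_beta(x, fs=1.0):
          """Estimate beta in S(f) ~ 1/f**beta from the slope of the log-log spectrum."""
          f, pxx = signal.welch(x, fs=fs, nperseg=min(len(x) // 4, 1024))
          keep = f > 0
          slope, _ = np.polyfit(np.log10(f[keep]), np.log10(pxx[keep]), 1)
          return -slope

      # Synthesize a 1/f-like signal with target beta = 1 by shaping white noise.
      rng = np.random.default_rng(10)
      n, beta = 4096, 1.0
      freqs = np.fft.rfftfreq(n)
      spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
      spectrum[1:] /= freqs[1:] ** (beta / 2.0)   # amplitude ~ f**(-beta/2)
      spectrum[0] = 0.0
      x = np.fft.irfft(spectrum, n=n)
      print(f"estimated beta: {spectral_beta(x):.2f}")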

  5. Impact of STROBE statement publication on quality of observational study reporting: interrupted time series versus before-after analysis.

    Directory of Open Access Journals (Sweden)

    Sylvie Bastuji-Garin

    Full Text Available In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series. For this quasi-experimental study, original articles reporting cohort, case-control, and cross-sectional studies published between 2004 and 2010 in the four dermatological journals having the highest 5-year impact factors (≥ 4) were selected. We compared the proportions of STROBE items adequately reported in each article (STROBE score) during three periods: two pre-STROBE periods (2004-2005 and 2006-2007) and one post-STROBE period (2008-2010). Segmented regression analysis of the interrupted time series was also performed. Of the 456 included articles, 187 (41%) reported cohort studies, 166 (36.4%) cross-sectional studies, and 103 (22.6%) case-control studies. The median STROBE score was 57% (range, 18%-98%). Before-after analysis evidenced significant STROBE score increases between the two pre-STROBE periods and between the earliest pre-STROBE period and the post-STROBE period (median score 2004-05 48% versus median score 2008-10 58%, p<0.001) but not between the immediate pre-STROBE period and the post-STROBE period (median score 2006-07 58% versus median score 2008-10 58%, p = 0.42). In the pre-STROBE period, the six-monthly mean STROBE score increased significantly, by 1.19% per six-month period (absolute increase 95% CI, 0.26% to 2.11%, p = 0.016). By segmented analysis, no significant change in STROBE score trend occurred (-0.40%; 95% CI, -2.20 to 1.41; p = 0.64) after STROBE statement publication. The quality of reports increased over time but was not affected by STROBE. Our findings raise concerns about the relevance of uncontrolled before-after analyses.

  6. Decadal strain along creeping faults in the Needles District, Paradox Basin Utah determined with InSAR Time Series Analysis

    Science.gov (United States)

    Kravitz, K.; Furuya, M.; Mueller, K. J.

    2013-12-01

    The Needles District, in Canyonlands National Park, Utah, exposes an array of actively creeping normal faults that accommodate gravity-driven extension above a plastically deforming substrate of evaporite deposits. Previous interferogram stacking and InSAR analysis of faults in the Needles District using 35 ERS satellite scenes from 1992 to 2002 showed line-of-sight deformation rates of ~1-2 mm/yr along active normal faults, with a wide strain gradient along the eastern margin of the deforming region. More rapid subsidence of ~2-2.5 mm/yr was also evident south of the main fault array across a broad platform bounded by the Colorado River and a single fault scarp to the south. In this study, time series analysis was performed on SAR scenes from Envisat, PALSAR, and ERS satellites ranging from 1992 to 2010 to expand upon previous results. Both persistent scatterer and small baseline methods were implemented using StaMPS. Preliminary results from Envisat data indicate equally distributed slip rates along the length of faults within the Needles District and very little subsidence in the broad region further southwest identified in previous work. A phase ramp that appears to be present within the initial interferograms creates uncertainty in the current analysis, and future work is aimed at removing this artifact. Our new results suggest, however, that a clear deformation signal is present along a number of large grabens in the northern part of the region at higher rates of up to 3-4 mm/yr. Little to no creep is evident along the single fault zone that bounds the southern Needles, in spite of the presence of a large and apparently active fault. This includes a segment of this fault that is instrumented by a creepmeter that yields slip rates on the order of ~1 mm/yr. Further time series analysis using a larger sampling of SAR scenes will be undertaken in an effort to determine why differences exist between previous and current results and to test mechanics-based modeling

  7. Time Series Analysis and Forecasting of Wastewater Inflow into Bandar Tun Razak Sewage Treatment Plant in Selangor, Malaysia

    Science.gov (United States)

    Abunama, Taher; Othman, Faridah

    2017-06-01

    Analysing the fluctuations of wastewater inflow rates in sewage treatment plants (STPs) is essential to guarantee sufficient treatment of wastewater before it is discharged to the environment. The main objectives of this study are to statistically analyze and forecast the wastewater inflow rates into the Bandar Tun Razak STP in Kuala Lumpur, Malaysia. A time series analysis of three years of weekly influent data (156 weeks) has been conducted using the Auto-Regressive Integrated Moving Average (ARIMA) model. Various combinations of ARIMA orders (p, d, q) were tried to select the best-fitting model, which was then used to forecast the wastewater inflow rates. Linear regression analysis was applied to test the correlation between the observed and predicted inflows. The ARIMA (3, 1, 3) model was selected, having the highest significant R-squared and the lowest normalized Bayesian Information Criterion (BIC) value, and the wastewater inflow rates were accordingly forecast for an additional 52 weeks. The linear regression analysis between the observed and predicted values of the wastewater inflow rates showed a positive linear correlation with a coefficient of 0.831.

  8. Comparison of data transformation procedures to enhance topographical accuracy in time-series analysis of the human EEG.

    Science.gov (United States)

    Hauk, O; Keil, A; Elbert, T; Müller, M M

    2002-01-30

    We describe a methodology to apply current source density (CSD) and minimum norm (MN) estimation as pre-processing tools for time-series analysis of single trial EEG data. The performance of these methods is compared for the case of wavelet time-frequency analysis of simulated gamma-band activity. A reasonable comparison of CSD and MN on the single trial level requires regularization such that the corresponding transformed data sets have similar signal-to-noise ratios (SNRs). For region-of-interest approaches, it should be possible to optimize the SNR for single estimates rather than for the whole distributed solution. An effective implementation of the MN method is described. Simulated data sets were created by modulating the strengths of a radial and a tangential test dipole with wavelets in the frequency range of the gamma band, superimposed with simulated spatially uncorrelated noise. The MN and CSD transformed data sets as well as the average reference (AR) representation were subjected to wavelet frequency-domain analysis, and power spectra were mapped for relevant frequency bands. For both CSD and MN, the influence of noise can be sufficiently suppressed by regularization to yield meaningful information, but only MN represents both radial and tangential dipole sources appropriately as single peaks. Therefore, when relating wavelet power spectrum topographies to their neuronal generators, MN should be preferred.

  9. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities...
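    A stripped-down version of the trading rule described here (long or short each instrument according to the sign of its own trailing 12-month return, with inverse-volatility position sizing) can be sketched on simulated returns. This is an illustrative simplification, not the paper's exact construction or data.

      import numpy as np
      import pandas as pd

      # Illustrative monthly excess returns for a handful of futures-like instruments.
      rng = np.random.default_rng(11)
      idx = pd.date_range("2000-01-31", periods=240, freq="M")
      returns = pd.DataFrame(rng.normal(0.005, 0.04, size=(240, 4)),
                             index=idx, columns=["equity", "fx", "commodity", "bond"])

      # Sign of the trailing 12-month return, scaled by inverse trailing volatility.
      signal = np.sign(returns.rolling(12).sum().shift(1))
      vol = returns.rolling(36).std().shift(1)
      weights = signal * (0.10 / np.sqrt(12)) / vol     # ~10% annualised vol target per asset
      strategy = (weights * returns).mean(axis=1).dropna()
      print(f"annualised Sharpe: {strategy.mean() / strategy.std() * np.sqrt(12):.2f}")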

  10. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow.
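    Delay-coordinate embedding followed by local approximation can be sketched with a nearest-neighbour regressor. The example below is a hedged illustration, not the authors' algorithm: the logistic map stands in for Mackey-Glass or experimental data, and averaging the one-step images of the nearest neighbours in the reconstructed state space serves as a simple form of local approximation.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor

      # Illustrative chaotic series: the logistic map.
      x = np.empty(2000)
      x[0] = 0.4
      for t in range(1999):
          x[t + 1] = 3.9 * x[t] * (1 - x[t])

      # Delay-coordinate embedding: predict x[t+1] from (x[t], x[t-tau], ..., x[t-(m-1)tau]).
      m, tau = 3, 1
      rows = range((m - 1) * tau, len(x) - 1)
      X = np.array([[x[t - k * tau] for k in range(m)] for t in rows])
      y = np.array([x[t + 1] for t in rows])

      # Local approximation: average the one-step images of the nearest neighbours.
      split = 1500
      model = KNeighborsRegressor(n_neighbors=5).fit(X[:split], y[:split])
      err = np.abs(model.predict(X[split:]) - y[split:]).mean()
      print(f"mean one-step prediction error: {err:.4f}")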

  11. Hierarchical structure of the energy landscape of proteins revisited by time series analysis. II. Investigation of explicit solvent effects

    Science.gov (United States)

    Alakent, Burak; Camurdan, Mehmet C.; Doruker, Pemra

    2005-10-01

    Time series analysis tools are employed on the principal modes obtained from the Cα trajectories from two independent molecular-dynamics simulations of α-amylase inhibitor (tendamistat). Fluctuations inside an energy minimum (intraminimum motions), transitions between minima (interminimum motions), and relaxations in different hierarchical energy levels are investigated and compared with those encountered in vacuum by using different sampling window sizes and intervals. The low-frequency low-indexed mode relationship, established in vacuum, is also encountered in water, which shows the reliability of the important dynamics information offered by principal components analysis in water. It has been shown that examining a short data collection period (100 ps) may result in a high population of overdamped modes, while some of the low-frequency oscillations (memory: future conformations are less dependent on previous conformations due to the lowering of energy barriers in hierarchical levels of the energy landscape. In short-time dynamics (sight contradicts. However, this comes about because water enhances the transitions between minima and forces the protein to reduce its already inherent inability to maintain oscillations observed in vacuum. Some of the frequencies lower than 10 cm-1 are found to be overdamped, while those higher than 20 cm-1 are slightly increased. As for the long-time dynamics in water, it is found that random-walk motion is maintained for approximately 200 ps (about five times that in vacuum) in the low-indexed modes, showing the lowering of energy barriers between the higher-level minima.

  12. Analysis of trend in temperature and rainfall time series of an Indian arid region: comparative evaluation of salient techniques

    Science.gov (United States)

    Machiwal, Deepesh; Gupta, Ankit; Jha, Madan Kumar; Kamble, Trupti

    2018-04-01

    This study investigated trends in 35 years (1979-2013) of temperature (maximum, Tmax, and minimum, Tmin) and rainfall at annual and seasonal (pre-monsoon, monsoon, post-monsoon, and winter) scales for 31 grid points in a coastal arid region of India. Box-whisker plots of the annual temperature and rainfall time series depict systematic spatial gradients. Trends were examined by applying eight tests: Kendall rank correlation (KRC), Spearman rank order correlation (SROC), Mann-Kendall (MK), four modified MK tests, and innovative trend analysis (ITA). Trend magnitudes were quantified by Sen's slope estimator, and a new method was adopted to assess the significance of linear trends in the MK-test statistics. It was found that significant serial correlation is prominent in the annual and post-monsoon Tmax and Tmin, and in the pre-monsoon Tmin. The KRC and MK tests yielded similar results, in close resemblance to the SROC test. The performance of the two modified MK tests based on variance-correction approaches was found to be superior to that of the KRC, MK, modified MK with pre-whitening, and ITA tests. The performance of the original MK test is poor due to the presence of serial correlation, whereas the ITA method is over-sensitive in identifying trends. Significantly increasing trends are more prominent in Tmin than in Tmax. Further, both the annual and monsoon rainfall time series have a significantly increasing trend of 9 mm year-1. The sequential significance of the linear trend in the MK test statistics is very strong (R2 ≥ 0.90) in the annual and pre-monsoon Tmin (90% of grid points), and strong (R2 ≥ 0.75) in monsoon Tmax (68% of grid points), in monsoon, post-monsoon, and winter Tmin (respectively 65, 55, and 48% of grid points), as well as in the annual and monsoon rainfalls (respectively 68 and 61% of grid points). Finally, this study recommends use of the variance-corrected MK test for the precise identification of trends. It is emphasized that the rising Tmax may hamper crop growth due to enhanced

  13. The impact of policy guidelines on hospital antibiotic use over a decade: a segmented time series analysis.

    Directory of Open Access Journals (Sweden)

    Sujith J Chandy

    Full Text Available Antibiotic pressure contributes to rising antibiotic resistance. Policy guidelines encourage rational prescribing behavior, but effectiveness in containing antibiotic use needs further assessment. This study therefore assessed the patterns of antibiotic use over a decade and analyzed the impact of different modes of guideline development and dissemination on inpatient antibiotic use. Antibiotic use was calculated monthly as defined daily doses (DDD) per 100 bed days for nine antibiotic groups and overall. This time series compared trends in antibiotic use in five adjacent time periods identified as 'Segments,' divided based on differing modes of guideline development and implementation: Segment 1--Baseline prior to antibiotic guidelines development; Segment 2--During preparation of guidelines and booklet dissemination; Segment 3--Dormant period with no guidelines dissemination; Segment 4--Booklet dissemination of revised guidelines; Segment 5--Booklet dissemination of revised guidelines with intranet access. Regression analysis adapted for segmented time series and adjusted for seasonality assessed changes in the antibiotic use trend. Overall antibiotic use increased at a monthly rate of 0.95 (SE = 0.18), 0.21 (SE = 0.08) and 0.31 (SE = 0.06) for Segments 1, 2 and 3, stabilized in Segment 4 (0.05; SE = 0.10) and declined in Segment 5 (-0.37; SE = 0.11). Segments 1, 2 and 4 exhibited seasonal fluctuations. Pairwise segmented regression adjusted for seasonality revealed a significant drop in monthly antibiotic use of 0.401 (SE = 0.089; p<0.001) for Segment 5 compared to Segment 4. Most antibiotic groups showed trends similar to overall use. Use of overall and specific antibiotic groups showed varied patterns and seasonal fluctuations. Containment of rising overall antibiotic use was possible during periods of active guideline dissemination. Wider access through the intranet facilitated a significant decline in use. Stakeholders and policy

  14. 55-year (1960-2015) spatiotemporal shoreline change analysis using historical DISP and Landsat time series data in Shanghai

    Science.gov (United States)

    Qiao, Gang; Mi, Huan; Wang, Weian; Tong, Xiaohua; Li, Zhongbin; Li, Tan; Liu, Shijie; Hong, Yang

    2018-06-01

    Shoreline change has been an increasing concern for low-lying and vulnerable coastal zones worldwide, especially in estuarine delta regions, which generally have significant economic development, large human settlements and infrastructures. Thus, long time-series shoreline change data are useful for understanding how shorelines respond to natural and anthropogenic activities, as well as for providing greater insights into coastal protection and sustainable development in the future. For the first time, this study analyzes 55 years of spatiotemporal shoreline changes in Shanghai, China, by integrating the historical Declassified Intelligence Satellite Photography (DISP) and Landsat time series data at five-year intervals from 1960 to 2015. Twelve shorelines were interpreted from DISP and Landsat images. The spatiotemporal changes in the shorelines were explored at five-year intervals within the study period for the Shanghai mainland and islands. The results indicate that shorelines in Shanghai accreted significantly over the last 55 years, but different accretion patterns were observed in Chongming Dongtan. The rate of shoreline change varied in different areas, and the most noticeable expansions were Chongming Beitan, Chongming Dongtan, Hengsha Dongtan, and Nanhuizui. The length of the entire shoreline increased by 25.7% from 472.6 km in 1960 to 594.2 km in 2015. Due to the shoreline changes, the Shanghai area expanded by 1,192.5 km2 by 2015, which was an increase of 19.9% relative to its 1960 area. The Digital Shoreline Analysis System (DSAS) was used to compute rate-of-change statistics. Between 1960 and 2015, 10.6% of the total transects exceeded 3 km of Net Shoreline Movement (NSM), with a maximum value of approximately 20 km at eastern Hengsha Island. The average Weighted Linear Regression Rate (WLR) of the Shanghai shoreline was 52.2 m/yr from 1960 to 2015; there was 94.1% accretion, 3.1% erosion, and 2.8% with no significant change. In addition, the driving

  15. Incidence of infective endocarditis in England, 2000-13: a secular trend, interrupted time-series analysis.

    Science.gov (United States)

    Dayer, Mark J; Jones, Simon; Prendergast, Bernard; Baddour, Larry M; Lockhart, Peter B; Thornhill, Martin H

    2015-03-28

    Antibiotic prophylaxis given before invasive dental procedures in patients at risk of developing infective endocarditis has historically been the focus of infective endocarditis prevention. Recent changes in antibiotic prophylaxis guidelines in the USA and Europe have substantially reduced the number of patients for whom antibiotic prophylaxis is recommended. In the UK, guidelines from the National Institute for Health and Clinical Excellence (NICE) recommended complete cessation of antibiotic prophylaxis for prevention of infective endocarditis in March, 2008. We aimed to investigate changes in the prescribing of antibiotic prophylaxis and the incidence of infective endocarditis since the introduction of these guidelines. We did a retrospective secular trend study, analysed as an interrupted time series, to investigate the effect of antibiotic prophylaxis versus no prophylaxis on the incidence of infective endocarditis in England. We analysed data for the prescription of antibiotic prophylaxis from Jan 1, 2004, to March 31, 2013, and hospital discharge episode statistics for patients with a primary diagnosis of infective endocarditis from Jan 1, 2000, to March 31, 2013. We compared the incidence of infective endocarditis before and after the introduction of the NICE guidelines using segmented regression analysis of the interrupted time series. Prescriptions of antibiotic prophylaxis for the prevention of infective endocarditis fell substantially after introduction of the NICE guidance (mean 10,900 prescriptions per month [Jan 1, 2004, to March 31, 2008] vs 2236 prescriptions per month [April 1, 2008, to March 31, 2013], p<0·0001). The incidence of infective endocarditis increased significantly above the projected historical trend, by 0·11 cases per 10 million people per month (95% CI 0·05-0·16, p<0·0001). The increase in the incidence of infective endocarditis was significant for both individuals at high risk of infective endocarditis and those at lower risk. Although our data do not establish a causal association, prescriptions

  16. The impact of "Option B" on HIV transmission from mother to child in Rwanda: An interrupted time series analysis.

    Science.gov (United States)

    Abimpaye, Monique; Kirk, Catherine M; Iyer, Hari S; Gupta, Neil; Remera, Eric; Mugwaneza, Placidie; Law, Michael R

    2018-01-01

    Nearly a quarter of a million children have acquired HIV, prompting the implementation of new protocols-Option B and B+-for treating HIV+ pregnant women. While efficacy has been demonstrated in randomized trials, there is limited real-world evidence on the impact of these changes. Using longitudinal, routinely collected data, we assessed the impact of the adoption of WHO Option B in Rwanda on mother-to-infant transmission. We used interrupted time series analysis to evaluate the impact of Option B on mother-to-child HIV transmission in Rwanda. Our primary outcome was the proportion of HIV tests in infants with positive results at six weeks of age. We included data for 20 months before and 22 months after the 2010 policy change. Of the 15,830 HIV tests conducted during our study period, 392 tested positive. We found a significant decrease in both the level (-2.08 positive tests per 100 tests conducted, 95% CI: -2.71 to -1.45) and the trend of positive tests after the policy change. These findings suggest that the adoption of Option B in Rwanda contributed to an immediate decrease in the rate of HIV transmission from mother to child, and that other countries may benefit from adopting these WHO guidelines.

  17. Effect of cardiovascular prevention strategies on incident coronary disease hospitalisation rates in Spain; an ecological time series analysis.

    Science.gov (United States)

    Medrano, María José; Alcalde-Cabero, Enrique; Ortíz, Cristina; Galán, Iñaki

    2014-02-17

    To assess the overall population impact of primary prevention strategies (promotion of healthy lifestyles, prevention of smoking and use of vascular risk drug therapy) of coronary disease in Spain. Ecological time series analysis, 1982-2009. All public and private hospitals in Spain. General population. Incident coronary disease hospitalisation as derived from official hospital discharge data. Annual hospitalisation rates were modelled according to nationwide use of statins, antihypertensive, antidiabetic and antiplatelet drugs, and prevalences of smoking, obesity and overweight. Generalised additive models and mixed Poisson regression models were used for the purpose, taking year as the random-effect variable and adjusting for age, sex, prevalence of vascular risk factors and the number of hospital beds in intensive and coronary care units. Across 28 years and 671.5 million person-years of observation, there were 2 986 834 hospitalisations due to coronary disease; of these, 1 441 980 (48.28%) were classified as incident. Hospitalisation rates increased from 1982 to 1996, with an inflection point in 1997 and a subsequent 52% decrease until 2009. Prevalences of smoking, obesity, overweight and use of vascular risk drug therapy were significantly associated with hospitalisation rates. Future strategies ought to lay special stress on excessive body weight prevention.

  18. Time-series analysis of surface deformation at Brady Hot Springs geothermal field (Nevada) using interferometric synthetic aperture radar

    Energy Technology Data Exchange (ETDEWEB)

    Ali, S. T. [Univ. of Wisconsin, Madison, WI (United States); Akerley, J. [Ormat Technologies Inc., Reno, NV (United States); Baluyut, E. C. [Univ. of Wisconsin, Madison, WI (United States); Cardiff, M. [Univ. of Wisconsin, Madison, WI (United States); Davatzes, N. C. [Temple Univ., Philadelphia, PA (United States). Dept. of Earth and Environmental Science; Feigl, K. L. [Univ. of Wisconsin, Madison, WI (United States); Foxall, W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Fratta, D. [Univ. of Wisconsin, Madison, WI (United States); Mellors, R. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Spielman, P. [Ormat Technologies Inc., Reno, NV (United States); Wang, H. F. [Univ. of Wisconsin, Madison, WI (United States); Zemach, E. [Ormat Technologies Inc., Reno, NV (United States)

    2016-05-01

    We analyze interferometric synthetic aperture radar (InSAR) data acquired between 2004 and 2014, by the ERS-2, Envisat, ALOS and TerraSAR-X/TanDEM-X satellite missions to measure and characterize time-dependent deformation at the Brady Hot Springs geothermal field in western Nevada due to extraction of fluids. The long axis of the ~4 km by ~1.5 km elliptical subsiding area coincides with the strike of the dominant normal fault system at Brady. Within this bowl of subsidence, the interference pattern shows several smaller features with length scales of the order of ~1 km. This signature occurs consistently in all of the well-correlated interferometric pairs spanning several months. Results from inverse modeling suggest that the deformation is a result of volumetric contraction in shallow units, no deeper than 600 m, likely associated with damaged regions where fault segments mechanically interact. Such damaged zones are expected to extend downward along steeply dipping fault planes, providing a high permeability conduit to the production wells. Using time series analysis, we test the hypothesis that geothermal production drives the observed deformation. We find a good correlation between the observed deformation rate and the rate of production in the shallow wells. We also explore mechanisms that could potentially cause the observed deformation, including thermal contraction of rock, decline in pore pressure and dissolution of minerals over time.

  19. Forecasting of UV-Vis absorbance time series using artificial neural networks combined with principal component analysis.

    Science.gov (United States)

    Plazas-Nossa, Leonardo; Hofer, Thomas; Gruber, Günter; Torres, Andres

    2017-02-01

    This work proposes a methodology for the forecasting of online water quality data provided by UV-Vis spectrometry. To this end, a combination of principal component analysis (PCA) to reduce the dimensionality of a data set and artificial neural networks (ANNs) for forecasting purposes was used. The results obtained were compared with those obtained by using the discrete Fourier transform (DFT). The proposed methodology was applied to four absorbance time series data sets composed of a total of 5705 UV-Vis spectra. Absolute percentage errors obtained by applying the proposed PCA/ANN methodology vary between 10% and 13% for all four study sites. In general terms, the results obtained were hardly generalizable, as they appeared to be highly dependent on specific dynamics of the water system; however, some trends can be outlined. The PCA/ANN methodology gives better results than the PCA/DFT forecasting procedure when using a specific spectral range under the following conditions: (i) for Salitre wastewater treatment plant (WWTP) (first hour) and Graz West R05 (first 18 min), from the last part of the UV range to the entire visible range; (ii) for Gibraltar pumping station (first 6 min), for the whole UV-Vis absorbance spectrum; and (iii) for San Fernando WWTP (first 24 min), from the whole UV range to the middle part of the visible range.
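
    The PCA/ANN pipeline described above can be reduced to three steps: compress each spectrum to a few principal-component scores, train a neural network to map the scores at one time step to the scores at the next, and project forecasts back to absorbance space. The sketch below uses a synthetic spectra matrix and an arbitrary network size; window lengths, the per-site spectral ranges and the DFT benchmark are omitted.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)
      spectra = rng.random((500, 200)) + 0.1        # synthetic stand-in: 500 time steps x 200 wavelengths

      pca = PCA(n_components=5)
      scores = pca.fit_transform(spectra)           # dimensionality reduction of the spectra

      X, y = scores[:-1], scores[1:]                # one-step-ahead forecasting in score space
      ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)

      next_scores = ann.predict(scores[-1:])        # forecast the next set of scores
      next_spectrum = pca.inverse_transform(next_scores)
      ape = 100 * np.abs(next_spectrum - spectra[-1]) / spectra[-1]
      print(f"mean absolute percentage error vs last observed spectrum: {ape.mean():.1f}%")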

  20. Comparative Time Series Analysis of Aerosol Optical Depth over Sites in United States and China Using ARIMA Modeling

    Science.gov (United States)

    Li, X.; Zhang, C.; Li, W.

    2017-12-01

    Long-term spatiotemporal analysis and modeling of aerosol optical depth (AOD) distribution is of paramount importance for studying radiative forcing, climate change, and human health. This study focuses on the trends and variations of AOD over six stations located in the United States and China from 2003 to 2015, using satellite-retrieved Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 6 retrievals and ground measurements derived from the Aerosol Robotic NETwork (AERONET). An autoregressive integrated moving average (ARIMA) model is applied to simulate and predict AOD values. The R2, adjusted R2, Root Mean Square Error (RMSE), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Bayesian Information Criterion (BIC) are used as indices to select the best fitted model. Results show that there is a persistent decreasing trend in AOD for both MODIS data and AERONET data over three stations. Monthly and seasonal AOD variations reveal consistent aerosol patterns over stations along mid-latitudes. Regional differences impacted by climatology and land cover types are observed for the selected stations. Statistical validation of the time series models indicates that the non-seasonal ARIMA model performs better for AERONET AOD data than for MODIS AOD data over most stations, suggesting the method works better for data with higher quality. By contrast, the seasonal ARIMA model reproduces the seasonal variations of MODIS AOD data much more precisely. Overall, the reasonably predicted results indicate the applicability and feasibility of the stochastic ARIMA modeling technique to forecast future and missing AOD values.
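
    Fitting candidate ARIMA orders and ranking them by an information criterion, as described above, can be sketched with statsmodels; the monthly AOD series below is simulated and the small order grid is an assumption, not the configuration used in the study.

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(2)
      months = np.arange(156)                                        # 13 years of monthly AOD
      aod = 0.3 + 0.05 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 0.02, months.size)

      best = None
      for p in range(3):
          for d in range(2):
              for q in range(3):
                  res = ARIMA(aod, order=(p, d, q)).fit()
                  if best is None or res.bic < best[0]:
                      best = (res.bic, (p, d, q), res)

      bic, order, res = best
      print(f"selected ARIMA{order}, BIC = {bic:.1f}")
      print(res.forecast(steps=12))                                  # 12-month-ahead AOD forecast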

  1. A Time Series Analysis of Macroeconomic Determinants of Corporate Births in Romania in the period 2008-2013

    Directory of Open Access Journals (Sweden)

    Marușa Beca

    2015-06-01

    Full Text Available In this article, we studied the relationship between macroeconomic factors and the observed corporate births for the Romanian economy through the Autoregressive Distributed Lags Model (ADL). We performed a time series analysis that uses monthly data for the period January 2008 – December 2013 in order to establish the impact of the fiscal and monetary policy adopted by the Romanian government in times of economic crisis on the firms’ demography. The corporate birth rate is an endogenous variable in a linear function model with five exogenous macroeconomic variables such as the CPI, the loans ratio to GDP, the FDI, the long term interest rate, tax rate to GDP and the lags of the dependent variable. The main finding is that the variance of the corporate birth rate variable is negatively correlated with the variances of CPI in the current month and the interest rate two months lagged. We also determined that the variance of the dependent variable was positively correlated with the variances of the loans rate two months lagged, the tax rate four months lagged, FDI two months lagged and FDI in the current period.

  2. Seasonality of service provision in hip and knee surgery: a possible contributor to waiting times? A time series analysis.

    Science.gov (United States)

    Upshur, Ross E G; Moineddin, Rahim; Crighton, Eric J; Mamdani, Muhammad

    2006-03-01

    The question of how best to reduce waiting times for health care, particularly surgical procedures such as hip and knee replacements, is among the most pressing concerns of the Canadian health care system. The objective of this study was to test the hypothesis that significant seasonal variation exists in the performance of hip and knee replacement surgery in the province of Ontario. We performed a retrospective, cross-sectional time series analysis examining all hip and knee replacement surgeries in people over the age of 65 in the province of Ontario, Canada between 1992 and 2002. The main outcome measure was monthly hospitalization rates per 100,000 population for all hip and knee replacements. There was a marked increase in the rate of hip and knee replacement surgery over the 10-year period as well as an increasing seasonal variation in surgeries. Highly significant seasonality was identified in the data (Fisher Kappa = 16.05; R2Autoreg = 0.85). Holidays and utilization caps appear to exert a significant influence on the rate of service provision. It is expected that waiting times for hip and knee replacement could be reduced by reducing seasonal fluctuations in service provision and benchmarking services to peak delivery. The results highlight the importance of system behaviour in seasonal fluctuation of service delivery.
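
    Seasonality of the kind reported here can be screened for with a periodogram of the detrended monthly rate series: a dominant spectral peak at a 12-month period indicates an annual cycle. The simulated series and the simple peak check below only illustrate the idea behind the spectral tests mentioned above (such as Fisher's Kappa); they do not reproduce the authors' procedure.

      import numpy as np

      rng = np.random.default_rng(3)
      months = np.arange(132)                                        # 11 years of monthly rates
      rate = 40 + 0.1 * months + 6 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 2, months.size)

      detrended = rate - np.polyval(np.polyfit(months, rate, 1), months)
      power = np.abs(np.fft.rfft(detrended)) ** 2
      freqs = np.fft.rfftfreq(months.size, d=1.0)                    # cycles per month

      peak = freqs[1:][np.argmax(power[1:])]                         # ignore the zero frequency
      print(f"dominant period = {1 / peak:.1f} months")              # ~12 for a seasonal series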

  3. Modeling malaria control intervention effect in KwaZulu-Natal, South Africa using intervention time series analysis.

    Science.gov (United States)

    Ebhuoma, Osadolor; Gebreslasie, Michael; Magubane, Lethumusa

    The change of the malaria control intervention policy in South Africa (SA), the re-introduction of dichlorodiphenyltrichloroethane (DDT), may be responsible for the low and sustained malaria transmission in KwaZulu-Natal (KZN). We evaluated the effect of the re-introduction of DDT on malaria in KZN and suggested practical ways in which the province can strengthen its existing malaria control and elimination efforts to achieve zero malaria transmission. We obtained confirmed monthly malaria cases in KZN from the malaria control program of KZN from 1998 to 2014. The seasonal autoregressive integrated moving average (SARIMA) intervention time series analysis (ITSA) was employed to model the effect of the re-introduction of DDT on confirmed monthly malaria cases. The result was an abrupt and permanent decline in monthly malaria cases (w0 = -1174.781, p-value = 0.003) following the implementation of the intervention policy. The sustained low malaria cases observed over a long period suggest that the continued usage of DDT did not result in insecticide resistance as earlier anticipated. This may be due to exophagic malaria vectors, which render indoor residual spraying not totally effective. Therefore, reducing malaria transmission to zero in KZN requires other reliable and complementary intervention resources to optimize the existing ones.
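
    A SARIMA intervention analysis of this kind amounts to fitting a seasonal ARIMA model with a step regressor switched on at the month the policy changed, so that its coefficient estimates the abrupt, permanent shift in cases. A minimal sketch with statsmodels on simulated monthly counts and an assumed intervention month (not the study's data or fitted orders):

      import numpy as np
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(4)
      n = 204                                        # monthly cases, 1998-2014
      t = np.arange(n)
      step = (t >= 26).astype(float)                 # assumed month of the policy change (level shift)
      cases = 2000 + 300 * np.sin(2 * np.pi * t / 12) - 1200 * step + rng.normal(0, 100, n)

      model = SARIMAX(cases, exog=step.reshape(-1, 1), order=(1, 0, 0), seasonal_order=(1, 0, 0, 12))
      res = model.fit(disp=False)
      print(res.params)                              # the exog coefficient is the estimated permanent shift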

  4. A Big Data and Time Series Analysis Technology-Based Multi-Agent System for Smart Tourism

    Directory of Open Access Journals (Sweden)

    Wei-Chih Chen

    2018-06-01

    Full Text Available This study focuses on presenting a development trend from the perspective of data-oriented evidence, especially open data and technologies, as those numbers can verify and prove current technology trends and user information requirements. Following the practical progress of Dr. What-Info I and II, this paper continues to develop Dr. What-Info III. Moreover, big data technology, the MapReduce parallel decrement mechanism of the cloud information agent CEOntoIAS, which is supported by a Hadoop-like framework, Software R, and time series analysis are adopted to enhance the precision, reliability, and integrity of cloud information. Furthermore, the proposed system app receives a collective satisfaction score of 80% in terms of Quesenbery’s 5Es and Nielsen ratings. In addition, the verification results of the interface design show that the human-machine interface of our proposed system can meet important design preferences and provide an approximately optimal balance. The top-n experiment shows that top-5 recommendations offer a better balance between output quality and processing time. Finally, the system effectiveness experiments indicate that the proposed system achieves an overall up-to-standard function rate of 87.5%, and such recommendations provide this system with high information correctness and user satisfaction. Although there is plenty of room for improvement in the user experience, the feasibility of this service architecture has been proven.

  5. Seasonality of service provision in hip and knee surgery: A possible contributor to waiting times? A time series analysis

    Directory of Open Access Journals (Sweden)

    Upshur Ross EG

    2006-03-01

    Full Text Available Abstract Background The question of how best to reduce waiting times for health care, particularly surgical procedures such as hip and knee replacements, is among the most pressing concerns of the Canadian health care system. The objective of this study was to test the hypothesis that significant seasonal variation exists in the performance of hip and knee replacement surgery in the province of Ontario. Methods We performed a retrospective, cross-sectional time series analysis examining all hip and knee replacement surgeries in people over the age of 65 in the province of Ontario, Canada between 1992 and 2002. The main outcome measure was monthly hospitalization rates per 100 000 population for all hip and knee replacements. Results There was a marked increase in the rate of hip and knee replacement surgery over the 10-year period as well as an increasing seasonal variation in surgeries. Highly significant seasonality was identified in the data (Fisher Kappa = 16.05; R2Autoreg = 0.85). Conclusion Holidays and utilization caps appear to exert a significant influence on the rate of service provision. It is expected that waiting times for hip and knee replacement could be reduced by reducing seasonal fluctuations in service provision and benchmarking services to peak delivery. The results highlight the importance of system behaviour in seasonal fluctuation of service delivery.

  6. Impact of meteorological factors on the incidence of bacillary dysentery in Beijing, China: A time series analysis (1970-2012).

    Directory of Open Access Journals (Sweden)

    Long Yan

    Full Text Available The influence of meteorological variables on the transmission of bacillary dysentery (BD) is an under-investigated topic, and effective forecasting models to serve as public health tools are lacking. This paper aimed to quantify the relationship between meteorological variables and BD cases in Beijing and to establish an effective forecasting model. A time series analysis was conducted in the Beijing area based upon monthly data on weather variables (i.e., temperature, rainfall, relative humidity, vapor pressure, and wind speed) and on the number of BD cases during the period 1970-2012. Autoregressive integrated moving average models with explanatory variables (ARIMAX) were built based on the data from 1970 to 2004. Prediction of monthly BD cases from 2005 to 2012 was made using the established models. The prediction accuracy was evaluated by the mean square error (MSE). Firstly, temperature with 2-month and 7-month lags and rainfall with a 12-month lag were found to be positively correlated with the number of BD cases in Beijing. Secondly, the ARIMAX model with covariates of temperature with a 7-month lag (β = 0.021, 95% confidence interval (CI): 0.004-0.038) and rainfall with a 12-month lag (β = 0.023, 95% CI: 0.009-0.037) displayed the highest prediction accuracy. The ARIMAX model developed in this study showed a good fit and precise short-term prediction accuracy, which would be beneficial for government departments in taking early public health measures to prevent and control possible BD outbreaks.

  7. Health system outcomes and determinants amenable to public health in industrialized countries: a pooled, cross-sectional time series analysis

    Directory of Open Access Journals (Sweden)

    Westert Gert P

    2005-08-01

    Full Text Available Abstract Background Few studies have tried to assess the combined cross-sectional and temporal contributions of a more comprehensive set of amenable factors to population health outcomes for wealthy countries during the last 30 years of the 20th century. We assessed the overall ecological associations between mortality and factors amenable to public health. These amenable factors included addictive and nutritional lifestyle, air quality, public health spending, healthcare coverage, and immunizations. Methods We used a pooled cross-sectional, time series analysis with corrected fixed effects regression models in an ecological design involving eighteen member countries of the Organisation for Economic Cooperation and Development during the period 1970 to 1999. Results Alcohol, tobacco, and fat consumption, and sometimes air pollution, were significantly associated with higher all-cause mortality and premature death. Immunizations, health care coverage, fruit/vegetable and protein consumption, and collective health expenditure had negative effects on mortality and premature death, even after controlling for the proportion of elderly people, the density of practicing physicians, doctor visits and per capita GDP. However, tobacco, air pollution, and fruit/vegetable intake were sometimes sensitive to adjustments. Conclusion Mortality and premature deaths could be reduced by focusing on factors that are amenable to public health policies. Tackling these issues should be reflected in the ongoing assessments of health system performance.

  8. Lake Area Analysis Using Exponential Smoothing Model and Long Time-Series Landsat Images in Wuhan, China

    Directory of Open Access Journals (Sweden)

    Gonghao Duan

    2018-01-01

    Full Text Available The loss of lake area significantly influences climate change in a region, and this loss represents a serious and unavoidable challenge to maintaining ecological sustainability when lakes are being filled in. Therefore, mapping and forecasting changes in lakes is critical for protecting the environment and mitigating ecological problems in the urban district. We created an accessible map displaying area changes for 82 lakes in Wuhan city using remote sensing data in conjunction with visual interpretation, by combining field data with Landsat 2/5/7/8 Thematic Mapper (TM) time-series images for the period 1987–2013. In addition, we applied a quadratic exponential smoothing model to forecast lake area changes in Wuhan city. The map provides, for the first time, estimates of lake development in Wuhan using data required for local-scale studies. The model predicted a lake area reduction of 18.494 km2 in 2015. The average error reached 0.23 with a correlation coefficient of 0.98, indicating that the model is reliable. The paper provides a numerical analysis and forecasting method for a better understanding of lake area changes. The modeling and mapping results can help assess aquatic habitat suitability and property planning for Wuhan lakes.
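
    Quadratic (Brown's) exponential smoothing is not shipped with statsmodels, but the closely related Holt linear-trend smoother illustrates the forecasting step applied above to an annual area series. The lake-area figures below are invented and the smoother is a stand-in, not the model fitted in the study.

      import numpy as np
      from statsmodels.tsa.holtwinters import Holt

      # Invented annual total lake area (km^2) for a shrinking lake system.
      area = np.array([310, 305, 298, 290, 284, 279, 272, 266, 259, 252, 246], dtype=float)

      fit = Holt(area).fit(optimized=True)           # level + trend exponential smoothing
      print(fit.forecast(2))                         # two-year-ahead area forecast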

  9. Did Medicare Part D Affect National Trends in Health Outcomes or Hospitalizations? A Time-Series Analysis.

    Science.gov (United States)

    Briesacher, Becky A; Madden, Jeanne M; Zhang, Fang; Fouayzi, Hassan; Ross-Degnan, Dennis; Gurwitz, Jerry H; Soumerai, Stephen B

    2015-06-16

    Medicare Part D increased economic access to medications, but its effect on population-level health outcomes and use of other medical services remains unclear. To examine changes in health outcomes and medical services in the Medicare population after implementation of Part D. Population-level longitudinal time-series analysis with generalized linear models. Community. Nationally representative sample of Medicare beneficiaries (n = 56,293 [unweighted and unique]) from 2000 to 2010. Changes in self-reported health status, limitations in activities of daily living (ADLs and instrumental ADLs), emergency department visits and hospital admissions (prevalence, counts, and spending), and mortality. Medicare claims data were used for confirmatory analyses. Five years after Part D implementation, no clinically or statistically significant reductions in the prevalence of fair or poor health status or limitations in ADLs or instrumental ADLs, relative to historical trends, were detected. Compared with trends before Part D, no changes in emergency department visits, hospital admissions or days, inpatient costs, or mortality after Part D were seen. Confirmatory analyses were consistent. Only total population-level outcomes were studied. Self-reported measures may lack sensitivity. Five years after implementation, and contrary to previous reports, no evidence was found of Part D's effect on a range of population-level health indicators among Medicare enrollees. Further, there was no clear evidence of gains in medical care efficiencies.

  10. The effect of the late 2000s financial crisis on suicides in Spain: an interrupted time-series analysis.

    Science.gov (United States)

    Lopez Bernal, James A; Gasparrini, Antonio; Artundo, Carlos M; McKee, Martin

    2013-10-01

    The current financial crisis is having a major impact on European economies, especially that of Spain. Past evidence suggests that adverse macro-economic conditions exacerbate mental illness, but evidence from the current crisis is limited. This study analyses the association between the financial crisis and suicide rates in Spain. An interrupted time-series analysis of national suicides data between 2005 and 2010 was used to establish whether there has been any deviation in the underlying trend in suicide rates associated with the financial crisis. Segmented regression with a seasonally adjusted quasi-Poisson model was used for the analysis. Stratified analyses were performed to establish whether the effect of the crisis on suicides varied by region, sex and age group. The mean monthly suicide rate in Spain during the study period was 0.61 per 100 000 with an underlying trend of a 0.3% decrease per month. We found an 8.0% increase in the suicide rate above this underlying trend since the financial crisis (95% CI: 1.009-1.156; P = 0.03); this was robust to sensitivity analysis. A control analysis showed no change in deaths from accidental falls associated with the crisis. Stratified analyses suggested that the association between the crisis and suicide rates is greatest in the Mediterranean and Northern areas, in males and amongst those of working age. The financial crisis in Spain has been associated with a relative increase in suicides. Males and those of working age may be at particular risk of suicide associated with the crisis and may benefit from targeted interventions.

  11. Evaluation of the effects of climate and man intervention on ground waters and their dependent ecosystems using time series analysis

    Science.gov (United States)

    Gemitzi, Alexandra; Stefanopoulos, Kyriakos

    2011-06-01

    Groundwaters and their dependent ecosystems are affected both by meteorological conditions and by human interventions, mainly in the form of groundwater abstractions for irrigation needs. This work aims at investigating the quantitative effects of meteorological conditions and man intervention on groundwater resources and their dependent ecosystems. Various seasonal Auto-Regressive Integrated Moving Average (ARIMA) models with external predictor variables were used in order to model the influence of meteorological conditions and man intervention on the groundwater level time series. Initially, a seasonal ARIMA model that simulates the abstraction time series using temperature (T) as an external predictor variable was prepared. Thereafter, seasonal ARIMA models were developed in order to simulate groundwater level time series in 8 monitoring locations, using the appropriate predictor variables determined for each individual case. The spatial component was introduced through the use of Geographical Information Systems (GIS). Application of the proposed methodology took place in the Neon Sidirochorion alluvial aquifer (Northern Greece), for which a 7-year long time series (i.e., 2003-2010) of piezometric and groundwater abstraction data exists. According to the developed ARIMA models, three distinct groups of groundwater level time series exist; the first proves to be dependent only on the meteorological parameters, the second group demonstrates a mixed dependence both on meteorological conditions and on human intervention, whereas the third group shows a clear influence from man intervention. Moreover, there is evidence that groundwater abstraction has affected an important protected ecosystem.

  12. Radiocarbon dating uncertainty and the reliability of the PEWMA method of time-series analysis for research on long-term human-environment interaction.

    Science.gov (United States)

    Carleton, W Christopher; Campbell, David; Collard, Mark

    2018-01-01

    Statistical time-series analysis has the potential to improve our understanding of human-environment interaction in deep time. However, radiocarbon dating-the most common chronometric technique in archaeological and palaeoenvironmental research-creates challenges for established statistical methods. The methods assume that observations in a time-series are precisely dated, but this assumption is often violated when calibrated radiocarbon dates are used because they usually have highly irregular uncertainties. As a result, it is unclear whether the methods can be reliably used on radiocarbon-dated time-series. With this in mind, we conducted a large simulation study to investigate the impact of chronological uncertainty on a potentially useful time-series method. The method is a type of regression involving a prediction algorithm called the Poisson Exponentially Weighted Moving Average (PEWMA). It is designed for use with count time-series data, which makes it applicable to a wide range of questions about human-environment interaction in deep time. Our simulations suggest that the PEWMA method can often correctly identify relationships between time-series despite chronological uncertainty. When two time-series are correlated with a coefficient of 0.25, the method is able to identify that relationship correctly 20-30% of the time, provided the time-series contain low noise levels. With correlations of around 0.5, it is capable of correctly identifying correlations despite chronological uncertainty more than 90% of the time. While further testing is desirable, these findings indicate that the method can be used to test hypotheses about long-term human-environment interaction with a reasonable degree of confidence.
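
    The full PEWMA model is a state-space Poisson regression, but its core forecasting recursion, an exponentially weighted moving average of past counts used as the Poisson mean for the next observation, can be sketched in a few lines. The discount weight below is arbitrary and the code illustrates the recursion only, not the estimator evaluated in the study.

      import numpy as np

      def ewma_poisson_means(counts, discount=0.8):
          """One-step-ahead Poisson means from an exponentially weighted average of past counts."""
          means = np.empty(len(counts), dtype=float)
          means[0] = counts[0]
          for t in range(1, len(counts)):
              means[t] = discount * means[t - 1] + (1 - discount) * counts[t - 1]
          return means

      rng = np.random.default_rng(5)
      counts = rng.poisson(lam=np.linspace(5, 15, 120))     # simulated count time-series
      print(ewma_poisson_means(counts)[:10])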

  13. Duality between Time Series and Networks

    Science.gov (United States)

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
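
    One concrete way to realize such a pair of maps, shown here as a quantile transition network rather than the authors' specific construction, is to bin the series values into quantiles, treat each bin as a node, count transitions between consecutive observations as weighted edges, and invert approximately by running a random walk on the transition matrix.

      import numpy as np

      def series_to_network(x, n_nodes=10):
          """Map a time series to a row-normalized transition matrix between quantile bins."""
          edges = np.quantile(x, np.linspace(0, 1, n_nodes + 1))
          labels = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_nodes - 1)
          A = np.zeros((n_nodes, n_nodes))
          for a, b in zip(labels[:-1], labels[1:]):
              A[a, b] += 1
          A /= np.maximum(A.sum(axis=1, keepdims=True), 1)
          mids = 0.5 * (edges[:-1] + edges[1:])              # representative value for each node
          return A, mids

      def network_to_series(A, mids, n_steps, rng):
          """Approximate inverse map: a random walk on the transition matrix."""
          state, walk = rng.integers(len(mids)), []
          for _ in range(n_steps):
              state = rng.choice(len(mids), p=A[state])
              walk.append(mids[state])
          return np.array(walk)

      rng = np.random.default_rng(6)
      x = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * rng.normal(size=1000)
      A, mids = series_to_network(x)
      x_back = network_to_series(A, mids, 1000, rng)         # series regenerated from the network
      print(A.round(2))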

  14. Time Series Analysis of the Microbiota of Children Suffering From Acute Infectious Diarrhea and Their Recovery After Treatment

    Directory of Open Access Journals (Sweden)

    Ener C. Dinleyici

    2018-06-01

    Full Text Available Gut microbiota is closely related to acute infectious diarrhea, one of the leading causes of mortality and morbidity in children worldwide. Understanding the dynamics of the recovery from this disease is of clinical interest. This work aims to correlate the dynamics of gut microbiota with the evolution of children who were suffering from acute infectious diarrhea caused by a rotavirus, and their recovery after the administration of a probiotic, Saccharomyces boulardii CNCM I-745. The experiment involved 10 children with acute infectious diarrhea caused by a rotavirus, and six healthy children, all aged between 3 and 4 years. The children who suffered the rotavirus infection received S. boulardii CNCM I-745 twice daily for the first 5 days of the experiment. Fecal samples were collected from each participant at 0, 3, 5, 10, and 30 days after probiotic administration. Microbial composition was characterized by 16S rRNA gene sequencing. Alpha and beta diversity were calculated, along with dynamical analysis based on Taylor's law to assess the temporal stability of the microbiota. All children infected with the rotavirus stopped having diarrhea at day 3 after the intervention. We observed low alpha diversities in the first 5 days (p-value < 0.05, Wilcoxon test), which became larger at 10 and 30 days after probiotic treatment. Canonical correspondence analysis (CCA) showed differences in the gut microbiota of healthy children and of those who suffered from acute diarrhea in the first days (p-value < 0.05, ADONIS test), but not in the last days of the experiment. Temporal variability was larger in children infected with the rotavirus than in healthy ones. In particular, the Gammaproteobacteria class was found to be abundant in children with acute diarrhea. We identified the microbiota transition from a diseased state to a healthy one with time, whose characterization may lead to relevant clinical data. This work highlights the importance of using time series for the
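
    The Taylor's-law part of the dynamical analysis fits a power law between the temporal mean and temporal variance of each taxon's abundance; in log-log space this is a straight line whose slope summarizes temporal stability. A small sketch on simulated abundances (five sampling days, as in the design above, but invented values):

      import numpy as np

      rng = np.random.default_rng(7)
      # Simulated abundances: 30 taxa observed at 5 time points (days 0, 3, 5, 10, 30).
      abundance = rng.gamma(shape=2.0, scale=rng.uniform(0.5, 5.0, size=(30, 1)), size=(30, 5))

      means = abundance.mean(axis=1)
      variances = abundance.var(axis=1)

      # Taylor's law: variance = a * mean**b, i.e. log V = log a + b * log M.
      b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
      print(f"Taylor exponent b = {b:.2f}")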

  15. Using barometric time series of the IMS infrasound network for a global analysis of thermally induced atmospheric tides

    Science.gov (United States)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph

    2018-04-01

    The International Monitoring System (IMS) has been established to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty and comprises four technologies, one of which is infrasound. When fully established, the IMS infrasound network will consist of 60 sites uniformly distributed around the globe. Besides its primary purpose of detecting explosions in the atmosphere, the recorded data reveal information on other anthropogenic and natural infrasound sources. Furthermore, the almost continuous multi-year recordings of differential and absolute air pressure allow for analysing the atmospheric conditions. In this paper, spectral analysis tools are applied to derive atmospheric dynamics from barometric time series. Based on the solar atmospheric tides, a methodology for performing geographic and temporal variability analyses is presented, which is intended to serve upcoming studies related to atmospheric dynamics. The added value of using the IMS infrasound network data for such purposes is demonstrated by comparing the findings on the thermal tides with previous studies and the Modern-Era Retrospective analysis for Research and Applications Version 2 (MERRA-2), which represents the solar tides well in its surface pressure fields. Absolute air pressure recordings reveal geographical characteristics of atmospheric tides related to the solar day and even to the lunar day. We therefore claim the chosen methodology of using the IMS infrasound network to be applicable for global and temporal studies on specific atmospheric dynamics. Given the accuracy and high temporal resolution of the barometric data from the IMS infrasound network, interactions with gravity waves and planetary waves can be examined in future for refining the knowledge of atmospheric dynamics, e.g. the origin of tidal harmonics up to 9 cycles per day as found in the barometric data sets. Data assimilation in empirical models of solar tides would be a valuable application of the IMS infrasound
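
    Extracting the solar tidal harmonics (S1, S2, ... up to several cycles per day) from a barometric record reduces to reading the amplitude spectrum at integer multiples of one cycle per day. The sketch below does this for a simulated hourly surface-pressure series with assumed S1 and S2 amplitudes; it does not reproduce the station processing of the study.

      import numpy as np

      rng = np.random.default_rng(8)
      hours = np.arange(24 * 365)                             # one year of hourly pressure
      pressure = (1013.0
                  + 0.5 * np.cos(2 * np.pi * hours / 24)      # diurnal tide S1 (assumed amplitude)
                  + 1.2 * np.cos(2 * np.pi * hours / 12)      # semidiurnal tide S2 (assumed amplitude)
                  + rng.normal(0, 0.3, hours.size))

      amp = 2 * np.abs(np.fft.rfft(pressure - pressure.mean())) / hours.size
      freq_cpd = np.fft.rfftfreq(hours.size, d=1 / 24)        # frequency axis in cycles per day

      for k in (1, 2, 3):                                     # S1, S2, S3 harmonics
          i = np.argmin(np.abs(freq_cpd - k))
          print(f"S{k} amplitude ~ {amp[i]:.2f} hPa")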

  16. Protective Effect of Dual-Strain Probiotics in Preterm Infants: A Multi-Center Time Series Analysis

    Science.gov (United States)

    Schwab, Frank; Garten, Lars; Geffers, Christine; Gastmeier, Petra; Piening, Brar

    2016-01-01

    Objective To determine the effect of dual-strain probiotics on the development of necrotizing enterocolitis (NEC), mortality and nosocomial bloodstream infections (BSI) in preterm infants in German neonatal intensive care units (NICUs). Design A multi-center interrupted time series analysis. Setting 44 German NICUs with routine use of dual-strain probiotics at the neonatal ward level. Patients Preterm infants documented by NEO-KISS, the German surveillance system for nosocomial infections in preterm infants with birth weights below 1,500 g, between 2004 and 2014. Intervention Routine use of dual-strain probiotics containing Lactobacillus acidophilus and Bifidobacterium spp. (Infloran) at the neonatal ward level. Main outcome measures Incidences of NEC, overall mortality, mortality following NEC and nosocomial BSI. Results Data from 10,890 preterm infants in 44 neonatal wards were included in this study. Incidences of NEC and BSI were 2.5% (n = 274) and 15.0% (n = 1631), respectively. The mortality rate was 6.1% (n = 665). The use of dual-strain probiotics significantly reduced the risk of NEC (HR = 0.48; 95% CI = 0.38–0.62), overall mortality (HR = 0.60, 95% CI = 0.44–0.83), mortality after NEC (HR = 0.51, 95% CI = 0.26–0.999) and nosocomial BSI (HR = 0.89, 95% CI = 0.81–0.98). These effects were even more pronounced in the subgroup analysis of preterm infants with birth weights below 1,000 g. Conclusion In order to reduce NEC and mortality in preterm infants, it is advisable to add routine prophylaxis with dual-strain probiotics to clinical practice in neonatal wards. PMID:27332554

  17. Risk prediction for chronic kidney disease progression using heterogeneous electronic health record data and time series analysis.

    Science.gov (United States)

    Perotte, Adler; Ranganath, Rajesh; Hirsch, Jamie S; Blei, David; Elhadad, Noémie

    2015-07-01

    As adoption of electronic health records continues to increase, there is an opportunity to incorporate clinical documentation as well as laboratory values and demographics into risk prediction modeling. The authors develop a risk prediction model for chronic kidney disease (CKD) progression from stage III to stage IV that includes longitudinal data and features drawn from clinical documentation. The study cohort consisted of 2908 primary-care clinic patients who had at least three visits prior to January 1, 2013 and developed CKD stage III during their documented history. Development and validation cohorts were randomly selected from this cohort, and the study datasets included longitudinal inpatient and outpatient data from these populations. Time series analysis (Kalman filter) and survival analysis (Cox proportional hazards) were combined to produce a range of risk models. These models were evaluated using concordance, a discriminatory statistic. A risk model incorporating longitudinal data on clinical documentation and laboratory test results (concordance 0.849) predicts progression from stage III CKD to stage IV CKD more accurately when compared to a similar model without laboratory test results (concordance 0.733, P<.001), a model that only considers the most recent laboratory test results (concordance 0.819, P < .031) and a model based on estimated glomerular filtration rate (concordance 0.779, P < .001). A risk prediction model that takes longitudinal laboratory test results and clinical documentation into consideration can predict CKD progression from stage III to stage IV more accurately than three models that do not take all of these variables into consideration.
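
    The time-series half of the model, Kalman filtering of longitudinal laboratory values, can be illustrated with a scalar local-level filter whose smoothed states would then feed the Cox survival model. The noise variances and the simulated eGFR trajectory below are assumptions for the sketch, not the study's fitted quantities.

      import numpy as np

      def local_level_kalman(y, obs_var=9.0, state_var=0.5):
          """Scalar local-level Kalman filter; returns the filtered state means."""
          level, p = y[0], 1.0                    # initial state estimate and its variance
          filtered = [level]
          for obs in y[1:]:
              p = p + state_var                   # predict step
              gain = p / (p + obs_var)            # Kalman gain
              level = level + gain * (obs - level)
              p = (1 - gain) * p                  # update step
              filtered.append(level)
          return np.array(filtered)

      rng = np.random.default_rng(9)
      egfr = 55 - 0.2 * np.arange(40) + rng.normal(0, 3, 40)   # simulated eGFR trajectory
      print(local_level_kalman(egfr)[-5:])                     # smoothed values for the survival model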

  18. The importance of alcoholic beverage type for suicide in Japan: a time-series analysis, 1963-2007.

    Science.gov (United States)

    Norström, Thor; Stickley, Andrew; Shibuya, Kenji

    2012-05-01

    Japan has one of the highest suicide rates in the world. Cohort analysis has suggested that alcohol consumption is a risk factor for suicide in Japan. However, this relationship has not been observed at the population level when a measure of per capita total alcohol consumption has been analysed. The present study employed a time-series analysis to examine whether these contradictory findings may be due to the existence of beverage-specific effects on suicide. An autoregressive integrated moving average model was used to assess the relationship between the consumption of different types of alcohol and suicide rates from 1963 to 2007. The data comprised age-adjusted suicide rates for the ages 15-69, and information on beverage-specific alcohol consumption per capita (15+). The unemployment rate was included as a control variable. During 1963-2007, male suicide rates increased substantially whereas female rates decreased slightly. Consumption of distilled spirits was significantly related to male suicide rates (but not in women) with a 1L increase in consumption associated with a 21.4% (95% confidence interval: 3.2-42.9) increase in male suicide rates. There was no statistically significant relationship between suicide and any other form of alcohol consumption (beer, wine, other alcohol). This is the first study that has shown an association between spirits consumption and male suicide in Japan. Potentially beneficial policy changes include increasing spirits prices through taxation, reducing the physical availability of alcohol and discouraging the practice of heavy drinking.

  19. Morbidity and mortality due to malaria in Est Mono district, Togo, from 2005 to 2010: a time series analysis

    Directory of Open Access Journals (Sweden)

    Landoh Essoya D

    2012-11-01

    Full Text Available Abstract Background In 2004, Togo adopted a regional strategy for malaria control that made use of insecticide-treated nets (ITNs), followed by the use of rapid diagnostic tests (RDTs) and artemisinin-based combination therapy (ACT). Community health workers (CHWs) became involved in 2007. In 2010, the impact of the implementation of these new malaria control strategies had not yet been evaluated. This study sought to assess the trends in malaria incidence and mortality due to malaria in Est Mono district from 2005 to 2010. Methods Secondary data on confirmed and suspected malaria cases reported by health facilities from 2005 to 2010 were obtained from the district health information system. Rainfall and temperature data were provided by the national Department of Meteorology. Chi-square tests or independent Student's t-tests were used to compare trends of variables at the 95% confidence level. An interrupted time series analysis was performed to assess the effect of meteorological factors and the use of ACT and CHWs on morbidity and mortality due to malaria. Results From January 2005 to December 2010, 114,654 malaria cases (annual mean 19,109 ± 6,622) were reported, with an increase in all malaria cases from 10,299 in 2005 to 26,678 in 2010. Conclusion This study showed an increase in malaria prevalence despite the implementation of the ACT and CHW strategies. Multicentre data analysis over longer periods should be carried out in similar settings to assess the impact of malaria control strategies on the burden of the disease. Integrated malaria vector control management should be implemented in Togo to reduce malaria transmission.

  20. A comparative analysis of spectral exponent estimation techniques for 1/f^β processes with applications to the analysis of stride interval time series

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f) ∝ 1/f^β. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
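
    A common baseline among the estimators compared above is to take β as the negative slope of the log-log periodogram of the series; the averaged-wavelet-coefficient and DFA estimators themselves are not reproduced here. A minimal periodogram-slope sketch on a synthetic 1/f^β signal with a known exponent:

      import numpy as np

      rng = np.random.default_rng(10)
      n, beta_true = 4096, 1.0

      # Synthesize a 1/f^beta signal by shaping white noise in the frequency domain.
      freqs = np.fft.rfftfreq(n)
      shaped = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)
      shaped[1:] *= freqs[1:] ** (-beta_true / 2)
      shaped[0] = 0
      x = np.fft.irfft(shaped, n)

      # Estimate beta from the slope of the log-log periodogram.
      power = np.abs(np.fft.rfft(x)) ** 2
      slope, _ = np.polyfit(np.log(freqs[1:]), np.log(power[1:]), 1)
      print(f"estimated beta ~ {-slope:.2f} (true value {beta_true})")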

  1. Impact of a COPD discharge care bundle on readmissions following admission with acute exacerbation: interrupted time series analysis.

    Directory of Open Access Journals (Sweden)

    Anthony A Laverty

    Full Text Available We evaluated the impact of a COPD discharge care bundle on readmission rates following hospitalisation with an acute exacerbation. Interrupted time series analysis, comparing readmission rates for COPD exacerbations at nine trusts that introduced the bundle to two comparison groups: (1) other NHS trusts in London and (2) all other NHS trusts in England. Care bundles were implemented at different times for different NHS trusts, ranging from October 2009 to April 2011. Nine NHS acute trusts in London, England. Patients aged 45 years and older admitted to an NHS acute hospital in England for acute exacerbation of COPD. Data come from Hospital Episode Statistics, April 2002 to March 2012. Annual trends in readmission rates (and in total bed days) within 7, 28 and 90 days, before and after implementation. In hospitals introducing the bundle, readmission rates were rising before implementation and falling afterwards (e.g., readmissions within 28 days: +2.13% per annum (pa) before vs. -5.32% pa after; p for difference in trends = 0.012). Following implementation, readmission rates within 7 and 28 days were falling faster than among other trusts in London, although this was not statistically significant (e.g., readmissions within 28 days: -4.6% pa vs. -3.2% pa, p = 0.44). Comparisons with a national control group were similar. The COPD discharge care bundle appeared to be associated with a reduction in readmission rates among hospitals using it. The significance of this is unclear because of changes to background trends in London and nationally.

  2. Trend analysis of GIMMS and MODIS NDVI time series for establishing a land degradation neutrality national baseline

    Science.gov (United States)

    Gichenje, Helene; Godinho, Sergio

    2017-04-01

    Land degradation is a key global environment and development problem that is recognized as a priority by the international development community. The Sustainable Development Goals (SDGs) were adopted by the global community in 2015, and include a goal related to land degradation and the accompanying target to achieve a land degradation-neutral (LDN) world by 2030. The LDN concept encompasses two joint actions of reducing the rate of degradation and increasing the rate of restoration. Using Kenya as the study area, this study aims to develop and test a spatially explicit methodology for assessing and monitoring the operationalization of a land degradation neutrality scheme at the national level. Time series analysis is applied to Normalized Difference Vegetation Index (NDVI) satellite data records, based on the hypothesis that the resulting NDVI residual trend would enable successful detection of changes in vegetation photosynthetic capacity and thus serve as a proxy for land degradation and regeneration processes. Two NDVI data sets are used to identify the spatial and temporal distribution of degraded and regenerated areas: the long-term, coarse-resolution (8 km, 1982-2015) third-generation Global Inventory Modeling and Mapping Studies (GIMMS) NDVI3g data record, and the shorter-term, finer-resolution (250 m, 2001-2015) Moderate Resolution Imaging Spectroradiometer (MODIS) derived NDVI data record. Climate data (rainfall, temperature and soil moisture) are used to separate areas of human-induced vegetation productivity decline from those driven by climate dynamics. Further, weekly vegetation health (VH) indexes (4 km, 1982-2015) developed by the National Oceanic and Atmospheric Administration (NOAA) are assessed as indicators for early detection and monitoring of land degradation by estimating vegetation stress (moisture, thermal and combined conditions).
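
    The residual-trend logic described above (often called RESTREND) first regresses NDVI on rainfall and then tests the temporal trend of the residuals, so that climate-driven variability is removed before degradation or regeneration is inferred. A per-pixel sketch on assumed annual series rather than GIMMS or MODIS values:

      import numpy as np

      rng = np.random.default_rng(11)
      years = np.arange(1982, 2016)
      rainfall = rng.normal(600, 80, years.size)                        # mm/yr, synthetic
      ndvi = 0.0005 * rainfall - 0.002 * (years - 1982) + rng.normal(0, 0.01, years.size)

      # Step 1: remove the rainfall-driven component of NDVI.
      fit = np.polyfit(rainfall, ndvi, 1)
      residuals = ndvi - np.polyval(fit, rainfall)

      # Step 2: the residual trend is the proxy for human-induced change.
      trend_per_yr = np.polyfit(years, residuals, 1)[0]
      print(f"residual NDVI trend = {trend_per_yr:.4f} per year")       # negative suggests degradation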

  3. Impacts of floods on dysentery in Xinxiang city, China, during 2004–2010: a time-series Poisson analysis

    Science.gov (United States)

    Ni, Wei; Ding, Guoyong; Li, Yifei; Li, Hongkai; Jiang, Baofa

    2014-01-01

    Background Xinxiang, a city in Henan Province, suffered from frequent floods due to persistent and heavy precipitation from 2004 to 2010. In the same period, dysentery was a common public health problem in Xinxiang, with the proportion of reported cases being the third highest among all the notified infectious diseases. Objectives We focused on the dysentery disease consequences of different degrees of floods and examined the association between floods and the morbidity of dysentery on the basis of longitudinal data during the study period. Design A time-series Poisson regression model was used to examine the relationship between 10 flood events of different degrees and the monthly morbidity of dysentery from 2004 to 2010 in Xinxiang. Relative risks (RRs) of moderate and severe floods on the morbidity of dysentery were calculated in this paper. In addition, we estimated the attributable contributions of moderate and severe floods to the morbidity of dysentery. Results A total of 7591 cases of dysentery were notified in Xinxiang during the study period. The effect of floods on dysentery was shown with a 0-month lag. Regression analysis showed that the risk of moderate and severe floods on the morbidity of dysentery was 1.55 (95% CI: 1.42–1.670) and 1.74 (95% CI: 1.56–1.94), respectively. The attributable risk proportions (ARPs) of moderate and severe floods to the morbidity of dysentery were 35.53 and 42.48%, respectively. Conclusions This study confirms that floods have significantly increased the risk of dysentery in the study area. In addition, severe floods have a higher proportional contribution to the morbidity of dysentery than moderate floods. Public health action should be taken to avoid and control a potential risk of dysentery epidemics after floods. PMID:25098726
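
    The attributable risk proportions quoted above follow from the relative risks via ARP = (RR - 1) / RR, e.g. (1.55 - 1) / 1.55 ≈ 35.5%, consistent with the reported 35.53% once rounding of the RR is taken into account. A one-line check:

      # Attributable risk proportion from a relative risk: ARP = (RR - 1) / RR.
      for label, rr in (("moderate floods", 1.55), ("severe floods", 1.74)):
          print(f"{label}: ARP = {(rr - 1) / rr:.1%}")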

  4. Impacts of floods on dysentery in Xinxiang city, China, during 2004-2010: a time-series Poisson analysis.

    Science.gov (United States)

    Ni, Wei; Ding, Guoyong; Li, Yifei; Li, Hongkai; Jiang, Baofa

    2014-01-01

    Xinxiang, a city in Henan Province, suffered from frequent floods due to persistent and heavy precipitation from 2004 to 2010. In the same period, dysentery was a common public health problem in Xinxiang, with the proportion of reported cases being the third highest among all the notified infectious diseases. We focused on the dysentery disease consequences of different degrees of floods and examined the association between floods and the morbidity of dysentery on the basis of longitudinal data during the study period. A time-series Poisson regression model was used to examine the relationship between 10 flood events of different degrees and the monthly morbidity of dysentery from 2004 to 2010 in Xinxiang. Relative risks (RRs) of moderate and severe floods on the morbidity of dysentery were calculated in this paper. In addition, we estimated the attributable contributions of moderate and severe floods to the morbidity of dysentery. A total of 7591 cases of dysentery were notified in Xinxiang during the study period. The effect of floods on dysentery was shown with a 0-month lag. Regression analysis showed that the risk of moderate and severe floods on the morbidity of dysentery was 1.55 (95% CI: 1.42-1.670) and 1.74 (95% CI: 1.56-1.94), respectively. The attributable risk propor