Sample records for series specific analysis

  1. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid


    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate...... commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may...... choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  2. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C


    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  3. Total and cause-specific mortality before and after the onset of the Greek economic crisis: an interrupted time-series analysis. (United States)

    Laliotis, Ioannis; Ioannidis, John P A; Stavropoulou, Charitini


    Greece was one of the countries hit hardest by the 2008 financial crisis in Europe. Yet, evidence on the effect of the crisis on total and cause-specific mortality remains unclear. We explored whether the economic crisis affected the trend of overall and cause-specific mortality rates. We used regional panel data from the Hellenic Statistical Authority to assess mortality trends by age, sex, region, and cause in Greece between January, 2001, and December, 2013. We used Eurostat data to calculate monthly age-standardised mortality rates per 100 000 inhabitants for each region. Data were divided into two subperiods: before the crisis (January, 2001, to August, 2008) and after the onset of the crisis (September, 2008, to December, 2013). We tested for changes in the slope of mortality by doing an interrupted time-series analysis. Overall mortality continued to decline after the onset of the financial crisis (-0·065, 95% CI -0·080 to -0·049), but at a slower pace than before the crisis (-0·13, -0·15 to -0·10; trend difference 0·062, 95% CI 0·041 to 0·083). Comparing the period after the onset of the crisis with extrapolated values based on the period before the crisis, we estimate that an extra 242 deaths per month occurred after the onset of the crisis. Mortality trends were interrupted after the onset of the crisis compared with before it, but changes vary by age, sex, and cause of death. The increase in deaths due to adverse events during medical treatment might reflect the effects of deterioration in quality of care during economic recessions. Funding: None. Copyright © 2016 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY license. All rights reserved.
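    The interrupted time-series design used in this record can be sketched as a segmented linear regression: a pre-break level and slope, plus a change in level and slope after the break. The snippet below is an illustrative reconstruction on synthetic data; the helper function, break point and noise level are our assumptions, not the study's code.

```python
import numpy as np

def interrupted_trend(y, break_idx):
    """Segmented linear trend: pre-break level and slope, plus a change
    in level and slope after the break (illustrative helper)."""
    n = len(y)
    t = np.arange(n, dtype=float)
    post = (t >= break_idx).astype(float)               # 1 after the onset
    X = np.column_stack([np.ones(n), t, post, post * (t - break_idx)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [level, pre-slope, level change, slope change]

# synthetic monthly series: slope -0.13 before month 92, -0.065 after,
# echoing the magnitudes reported in the abstract
rng = np.random.default_rng(0)
t = np.arange(156.0)
y = 100 - 0.13 * t + 0.065 * np.clip(t - 92, 0, None) + rng.normal(0, 0.1, 156)
beta = interrupted_trend(y, 92)
pre_slope, post_slope = beta[1], beta[1] + beta[3]
```

The "trend difference" reported in the abstract corresponds to the slope-change coefficient `beta[3]` in this parameterization.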

  4. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter


    A new, revised edition of a yet unrivaled work on frequency domain analysis. Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  5. Searching for the best modeling specification for assessing the effects of temperature and humidity on health: a time series analysis in three European cities. (United States)

    Rodopoulou, Sophia; Samoli, Evangelia; Analitis, Antonis; Atkinson, Richard W; de'Donato, Francesca K; Katsouyanni, Klea


    Epidemiological time series studies suggest daily temperature and humidity are associated with adverse health effects including increased mortality and hospital admissions. However, there is no consensus over which metric or lag best describes the relationships. We investigated which temperature and humidity model specification most adequately predicted mortality in three large European cities. Daily counts of all-cause mortality, minimum, maximum and mean temperature and relative humidity, and apparent temperature (a composite measure of ambient and dew point temperature) were assembled for Athens, London, and Rome for 6 years between 1999 and 2005. City-specific Poisson regression models were fitted separately for warm (April-September) and cold (October-March) periods, adjusting for seasonality, air pollution, and public holidays. We investigated goodness of model fit for each metric for delayed effects up to 13 days using three model-fit criteria: the sum of the partial autocorrelation function, the Akaike information criterion (AIC), and generalized cross-validation (GCV). No uniformly best index for all cities and seasonal periods was observed. The effects of temperature were uniformly shown to be more prolonged during cold periods, and the majority of models suggested separate temperature and humidity variables performed better than apparent temperature in predicting mortality. Our study suggests that the nature of the effects of temperature and humidity on mortality varies between cities for unknown reasons that require further investigation but may relate to city-specific population, socioeconomic, and environmental characteristics. This may have consequences for epidemiological studies and local temperature-related warning systems.
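    The model-comparison step described here, fitting Poisson regressions of daily counts on lagged temperature and ranking the lags by AIC, can be sketched as follows. Everything in the snippet (the synthetic data, the lag grid, the Newton-method fitter) is an illustration of the approach, not the authors' pipeline.

```python
import numpy as np

def poisson_aic(x, y, iters=25):
    """AIC of a Poisson log-linear model y ~ exp(b0 + b1*x), fit by
    Newton's method (illustrative fitter, not the study's software)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.array([np.log(y.mean()), 0.0])        # safe starting point
    for _ in range(iters):
        mu = np.exp(X @ beta)
        beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))
    mu = np.exp(X @ beta)
    loglik = np.sum(y * np.log(mu) - mu)            # up to a constant in y
    return 2 * len(beta) - 2 * loglik

# synthetic daily counts that respond to temperature lagged by 2 days
rng = np.random.default_rng(1)
days = np.arange(400)
temp = 15 + 8 * np.sin(2 * np.pi * days / 365) + rng.normal(0, 2, 400)
y = rng.poisson(np.exp(3.0 + 0.03 * np.roll(temp, 2)))
aic = {lag: poisson_aic(np.roll(temp, lag)[5:], y[5:]) for lag in range(6)}
best_lag = min(aic, key=aic.get)
```

With the generating lag set to 2 days, the AIC ranking recovers that lag; the study's actual models additionally adjust for seasonality, pollution and holidays.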

  6. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo


    In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance chaos theory). However, the mathematics used in NLTS is much simpler than many other subjec...

  7. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren


    An intuition-based approach enables you to master time series analysis with ease. Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  8. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher


    .... A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  9. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R


    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...

  10. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S


    New statistical methods and future directions of research in time series. A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  11. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C


    This book develops the analysis of time series from its formal beginnings in the 1890s through to Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of time series analysis that are in use today.

  12. Analysis of series resonant converter with series-parallel connection (United States)

    Lin, Bor-Ren; Huang, Chien-Lan


    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter, series-connected on the primary side and parallel-connected on the secondary side, is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series, so the two converters carry the same primary current and supply balanced load currents. On the output side, the two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.

  13. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  14. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong


    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  15. Area-specific information processing in prefrontal cortex during a probabilistic inference task: a multivariate fMRI BOLD time series analysis.

    Directory of Open Access Journals (Sweden)

    Charmaine Demanuele

    Discriminating spatiotemporal stages of information processing involved in complex cognitive processes remains a challenge for neuroscience. This is especially so in prefrontal cortex, whose subregions, such as the dorsolateral prefrontal (DLPFC), anterior cingulate (ACC) and orbitofrontal (OFC) cortices, are known to have differentiable roles in cognition. Yet it is much less clear how these subregions contribute to different cognitive processes required by a given task. To investigate this, we use functional MRI data recorded from a group of healthy adults during a "Jumping to Conclusions" probabilistic reasoning task. We used a novel approach combining multivariate test statistics with bootstrap-based procedures to discriminate between different task stages reflected in the fMRI blood oxygenation level dependent signal pattern and to unravel differences in task-related information encoded by these regions. Furthermore, we implemented a new feature extraction algorithm that selects voxels from any set of brain regions that are jointly maximally predictive about specific task stages. Using both the multivariate statistics approach and the algorithm that searches for maximally informative voxels, we show that during the Jumping to Conclusions task, the DLPFC and ACC contribute more to the decision making phase comprising the accumulation of evidence and probabilistic reasoning, while the OFC is more involved in choice evaluation and uncertainty feedback. Moreover, we show that in presumably non-task-related regions (temporal cortices) all information there was about task processing could be extracted from just one voxel (indicating the unspecific nature of that information), while for prefrontal areas a wider multivariate pattern of activity was maximally informative. We present a new approach to reveal the different roles of brain regions during the processing of one task from multivariate activity patterns measured by fMRI. This method can be a valuable

  16. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C


    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  17. Visibility Graph Based Time Series Analysis. (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie


    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of networks of networks.
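    The natural visibility criterion underlying this method (two time points are linked when the straight line between them clears every intermediate observation) is easy to state in code. A minimal sketch, with a toy series of our own choosing:

```python
def visibility_edges(x):
    """Natural visibility graph: time points i < j are linked when every
    intermediate point lies strictly below the line joining them."""
    edges = set()
    for i in range(len(x) - 1):
        for j in range(i + 1, len(x)):
            slope = (x[j] - x[i]) / (j - i)
            # check each point between i and j against the connecting line
            if all(x[k] < x[i] + slope * (k - i) for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

e = visibility_edges([3.0, 1.0, 2.0, 4.0, 1.5])
```

In the toy series the peak at index 3 blocks visibility between its neighbours and index 4, so, for instance, (2, 4) is not an edge while (0, 3) is. The paper's contribution layers on top of this: segments become graphs, and successive graphs are linked into a temporal network.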

  18. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of networks of networks.

  19. Allan deviation analysis of financial return series (United States)

    Hernández-Pérez, R.


    We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), a quantity used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been demonstrated to be robust for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are opening-price daily series for assets from different markets over a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for absolute return series at short scales (the first one or two decades) decrease following approximately a scaling relation up to a point that differs for almost each asset, after which the ADEV deviates from scaling; this suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drives the results for large observation intervals.
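    The (non-overlapped) Allan deviation used in this scaling analysis can be computed in a few lines: average the series in blocks of length m, then take half the mean squared difference of successive block means. The sketch below uses synthetic white-noise "returns" to show the expected m**(-1/2) decay; the function and data are illustrative, not the paper's implementation.

```python
import numpy as np

def allan_deviation(x, m):
    """Non-overlapped Allan deviation at averaging length m (sketch)."""
    n = (len(x) // m) * m                       # drop the ragged tail
    means = x[:n].reshape(-1, m).mean(axis=1)   # block averages
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

# white-noise returns: ADEV should fall roughly as m**-0.5,
# the signature of an uncorrelated series
rng = np.random.default_rng(2)
r = rng.normal(0, 1, 20_000)
adev = {m: allan_deviation(r, m) for m in (1, 4, 16, 64)}
```

For unit-variance white noise the ADEV at m = 1 is close to 1 and at m = 16 close to 1/4; departures from this scaling are what the paper interprets as clustering and long-range dependence.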

  20. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat


    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  1. Entropic Analysis of Electromyography Time Series (United States)

    Kaufman, Miron; Sung, Paul


    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatigue over one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self-organization) at about 0.01 s.
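    The windowed Shannon-entropy computation described here can be sketched as follows. The signal is synthetic (Gaussian noise whose amplitude decays over the minute to mimic fatigue), and the window length, bin count and histogram range are our assumptions, not the study's settings.

```python
import numpy as np

def window_entropy(w, bins=16, lo=-12.0, hi=12.0):
    """Shannon entropy (bits) of one window's amplitude histogram."""
    counts, _ = np.histogram(w, bins=bins, range=(lo, hi))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# 60 s sampled at 1 kHz: amplitude shrinks from 3 to 1, mimicking
# the reduced variability of a fatiguing muscle
rng = np.random.default_rng(3)
n = 60_000
amp = 1.0 + 2.0 * (1 - np.arange(n) / n)
signal = rng.normal(0.0, amp)
entropy = [window_entropy(signal[i:i + 1000]) for i in range(0, n, 1000)]
```

The entropy of the first (high-variability) window exceeds that of the last, matching the abstract's observation that variability, and hence entropy, is higher for healthy muscle activity.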

  2. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat


    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  3. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S


    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  4. Nonlinear Time Series Analysis via Neural Networks (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks for effective forex-market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system's behaviour based on them.

  5. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan


    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  6. A taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.


    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)

  7. Time series analysis of barometric pressure data

    International Nuclear Information System (INIS)

    La Rocca, Paola; Riggi, Francesco; Riggi, Daniele


    Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.
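    The autoregressive half of the comparison described above can be sketched as a least-squares AR(p) fit followed by a one-step-ahead forecast. The pressure-like series below is synthetic (a seasonal cycle plus noise) and the helper is illustrative, not the authors' code.

```python
import numpy as np

def ar_one_step(x, p=3):
    """Least-squares AR(p) fit, then a one-step-ahead forecast (sketch)."""
    # each row holds the p most recent values preceding the target
    X = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a @ x[-1:-p - 1:-1]          # forecast from the last p values

# daily pressure-like series: slow seasonal cycle plus measurement noise
rng = np.random.default_rng(6)
t = np.arange(1000)
pressure = 1013 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, 1000)
forecast = ar_one_step(pressure, p=3)
```

On such a smooth series the one-step forecast stays close to the most recent observations; the article's point is that a neural network can improve slightly on this baseline.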

  8. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W


    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  9. Analysis of the Arabidopsis superman allelic series and the interactions with other genes demonstrate developmental robustness and joint specification of male-female boundary, flower meristem termination and carpel compartmentalization. (United States)

    Breuil-Broyer, Stéphanie; Trehin, Christophe; Morel, Patrice; Boltz, Véronique; Sun, Bo; Chambrier, Pierre; Ito, Toshiro; Negrutiu, Ioan


    SUPERMAN is a cadastral gene controlling the sexual boundary in the flower. The gene's functions and role in flower development and evolution have remained elusive. The analysis of a contrasting SUP allelic series (for which the names superman, superwoman and supersex have been coined) makes it possible to distinguish early vs. late regulatory processes at the flower meristem centre to which SUP is an important contributor. Their understanding is essential in further addressing evolutionary questions linking bisexuality and flower meristem homeostasis. Inter-allelic comparisons were carried out and SUP interactions with other boundary factors and flower meristem patterning and homeostasis regulators (such as CLV, WUS, PAN, CUC, KNU, AG, AP3/PI, CRC and SPT) have been evaluated at genetic, molecular, morphological and histological levels. Early SUP functions include mechanisms of male-female (sexual) boundary specification, flower meristem termination and control of stamen number. A SUP-dependent flower meristem termination pathway is identified and analysed. Late SUP functions play a role in organ morphogenesis by controlling intra-whorl organ separation and carpel medial region formation. By integrating early and late SUP functions, and by analyzing in one single experiment a series of SUP genetic interactions, the concept of meristematic 'transference' (cascade) - a regulatory bridging process redundantly and sequentially co-ordinating the triggering and completion of flower meristem termination, and carpel margin meristem and placenta patterning - is proposed. Taken together, the results strongly support the view that SUP(-type) function(s) have been instrumental in resolving male/female gradients into sharp male and female identities (whorls, organs) and in enforcing flower homeostasis during evolution. This has probably been achieved by incorporating the meristem patterning system of the floral axis into the female/carpel programme. © The Author 2016

  10. Time-Series Analysis: A Cautionary Tale (United States)

    Damadeo, Robert


    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  11. Base compaction specification feasibility analysis. (United States)


    The objective of this research is to establish the technical engineering and cost analysis concepts that will enable WisDOT management to objectively evaluate the feasibility of switching construction specification philosophies for aggregate base...

  12. Is the population level link between drinking and harm similar for women and men?--a time series analysis with focus on gender-specific drinking and alcohol-related hospitalizations in Sweden. (United States)

    Engdahl, Barbro; Ramstedt, Mats


    A question that has not been addressed in the literature is whether the population level association between alcohol and harm differs between men and women. The main aim of this article is to fill this gap by analysing recently collected time series data of male and female self-reported drinking in relation to gender-specific harm indicators in Sweden. Male and female per capita and risk consumption were estimated on the basis of self-reported data from monthly alcohol surveys for the period 2002-07. Overall per capita consumption, including recorded sales and estimates of unrecorded consumption, was also collected for the same period. Alcohol-related hospitalizations were used as indicators of alcohol-related harm. Data were aggregated into quarterly observations and analysed by means of time series analyses (ARIMA-modelling). Overall per capita consumption was significantly related to both male and female alcohol-related hospitalizations. Male per capita consumption and risk consumption were also significantly related to alcohol-related hospitalizations among men. Female per capita consumption and risk consumption also had a positive association with alcohol-related hospitalizations, but statistical significance was only reached for alcohol poisonings, where the association was even stronger than for men. Changes in alcohol consumption in Sweden were associated with changes in male and female alcohol-related hospitalizations also in analyses based on gender-specific consumption measures. There was no clear evidence that the population level association between alcohol and harm differed between men and women.
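    A stripped-down version of the ARIMA-style logic used here (difference the aggregate series to remove trends, then estimate the consumption-harm association on the differences) might look like this; all numbers are synthetic and the single-coefficient regression stands in for the fuller ARIMA transfer-function model.

```python
import numpy as np

# synthetic quarterly aggregates: consumption drifts like a random walk,
# hospitalizations respond to it with a true effect of 2.0
rng = np.random.default_rng(4)
consumption = np.cumsum(rng.normal(0, 1, 24)) + 10     # litres per capita
harm = 5 + 2.0 * consumption + rng.normal(0, 0.5, 24)  # hospitalizations

# first-difference both series, then regress the differences
dc, dh = np.diff(consumption), np.diff(harm)
beta = np.sum(dc * dh) / np.sum(dc * dc)               # effect estimate
```

Differencing is what protects the estimate from the spurious correlation that trending aggregate series would otherwise produce, which is the rationale for ARIMA modelling in this literature.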

  13. Highly comparative time-series analysis: the empirical structure of time series and their methods. (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S


    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
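
    The reduced-representation idea above — summarize each series by the features diverse methods measure, then compare series via those features — can be sketched in miniature (the authors' library computes thousands of features; the three below are purely illustrative):

```python
import numpy as np

def feature_vector(x):
    # Tiny stand-in for a large feature library: mean, spread, and the
    # lag-1 autocorrelation of the z-scored series.
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / x.std()
    acf1 = float(np.mean(z[:-1] * z[1:]))
    return np.array([x.mean(), x.std(), acf1])

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=200))   # strongly autocorrelated series
noise = rng.normal(size=200)             # white noise

# Organizing series by feature vectors separates the two dynamics even
# though the raw values are not directly comparable.
f_walk, f_noise = feature_vector(walk), feature_vector(noise)
```

    Clustering or nearest-neighbour retrieval then operates on these short vectors rather than on the raw, variable-length series.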

  14. Topic Time Series Analysis of Microblogs (United States)


    …may be distributed more globally. Tweets on a specific topic that cluster spatially, temporally or both might be of interest to analysts and marketers. An example extracted topic: Topic 80, Distance: 143.2101; top words: rawr, ˆ0ˆ, kill, jurassic, dinosaur.

  15. Time series analysis of temporal networks (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh


    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
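
    The two-step pipeline the abstract describes — map network snapshots to a time series of a property, then forecast it — can be sketched as follows (toy random snapshots and a plain AR(1) least-squares fit standing in for the paper's ARIMA models; all inputs are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy temporal network: one random edge list among 20 nodes per time step.
snapshots = [set(map(tuple, rng.integers(0, 20, size=(30, 2))))
             for _ in range(50)]

# Step 1: map the temporal network to a time series of one property.
avg_degree = np.array([2 * len(edges) / 20 for edges in snapshots])

# Step 2: forecast the property one step ahead with an AR(1) fit
# (a stand-in for the full ARIMA models used in the paper).
x, y = avg_degree[:-1], avg_degree[1:]
coef = np.linalg.lstsq(np.vstack([np.ones_like(x), x]).T, y, rcond=None)[0]
forecast = coef[0] + coef[1] * avg_degree[-1]
```

    The same recipe applies to any of the eight properties: build its time series from past snapshots, fit a forecast model, and predict the value at the future time point without ever seeing the future network structure.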

  16. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.


    This document is designed to introduce the reader to the applications of time series analysis techniques to nuclear material accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data form a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences, and presents the error structure of inventory differences. Time series analysis techniques discussed include the Shewhart control chart, the cumulative summation (CUSUM) of inventory differences statistic, and the Kalman filter and linear smoother.
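
    A minimal sketch of the CUSUM technique mentioned above, applied to a sequence of standardized inventory differences (the reference value k and decision interval h below are conventional textbook choices, not parameters from the document):

```python
import numpy as np

def cusum_alarms(ids, k=0.5, h=5.0):
    """One-sided tabular CUSUM on standardized inventory differences.
    k is the reference value (allowance) and h the decision interval,
    both in units of the measurement standard deviation."""
    s, alarms = 0.0, []
    for t, x in enumerate(ids):
        s = max(0.0, s + x - k)   # accumulate only upward (loss) evidence
        if s > h:
            alarms.append(t)
    return alarms

rng = np.random.default_rng(2)
in_control = rng.normal(0.0, 1.0, 50)   # no material loss
loss = rng.normal(1.5, 1.0, 30)         # sustained 1.5-sigma loss begins
alarms = cusum_alarms(np.concatenate([in_control, loss]))
```

    Because the statistic accumulates small systematic shifts over time, CUSUM detects a sustained low-level loss that individual inventory differences would not flag.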

  17. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa


    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  18. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)]


    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  19. Nonparametric factor analysis of time series


    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce


    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  20. DIY Solar Market Analysis Webinar Series: Solar Resource and Technical (United States)

    DIY Solar Market Analysis Webinar Series: Solar Resource and Technical Potential. Wednesday, June 11, 2014. Part of NREL's Do-It-Yourself Solar Market Analysis webinar series for state, local, and tribal governments.

  1. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.


    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  2. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop


    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re


    Directory of Open Access Journals (Sweden)

    Goran Klepac


    Full Text Available REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series, and constructing new algorithms for analyzing time series. It is worth mentioning that REFII model is not a closed system, which means that it does not have a finite set of methods. At its core, this is a model for transformation of values of time series, which prepares data used by different sets of methods based on the same model of transformation in a domain of problem space. REFII model offers a new approach to time series analysis based on a unique model of transformation, which is a base for all kinds of time series analysis. The advantage of REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  4. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan


    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  5. Time Series Analysis Using Geometric Template Matching. (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina


    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.

  6. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    Growth And Export Expansion In Mauritius - A Time Series Analysis. RV Sannassee, R Pearce. Using Granger causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...

  7. Stochastic time series analysis of hydrology data for water resources (United States)

    Sathish, S.; Khadar Babu, S. K.


    This paper concerns the prediction of stochastic time series in hydrology with seasonal stages, applying different statistical tests for predicting hydrologic time series with the Thomas-Fiering model. Hydrologic time series of flood flows have received a great deal of consideration worldwide. The areas of time series analysis concerned with stochastic processes are expanding with growing concerns about seasonal periods and global warming, and a recent trend among researchers is testing for seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.
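
    The Thomas-Fiering recursion generates synthetic monthly flows that preserve month-to-month means, standard deviations, and lag-1 correlations. A hedged sketch (all monthly statistics below are made-up illustrative inputs, not data from the paper):

```python
import numpy as np

def thomas_fiering(means, sds, r, n_years, seed=0):
    """Thomas-Fiering synthetic monthly flow generation:
    q[t] = mean[j+1] + b[j]*(q[t-1] - mean[j]) + eps*sd[j+1]*sqrt(1 - r[j]**2)
    with b[j] = r[j]*sd[j+1]/sd[j], where j is the month of q[t-1] and
    r[j] the lag-1 correlation between consecutive months."""
    rng = np.random.default_rng(seed)
    q = [means[0]]
    for t in range(1, 12 * n_years):
        j, jn = (t - 1) % 12, t % 12
        b = r[j] * sds[jn] / sds[j]
        q.append(means[jn] + b * (q[-1] - means[j])
                 + rng.normal() * sds[jn] * np.sqrt(1 - r[j] ** 2))
    return np.array(q)

# Hypothetical monthly flow statistics (e.g. m^3/s) for illustration only.
means = np.array([30, 40, 60, 90, 120, 100, 70, 50, 40, 35, 30, 28.0])
sds = 0.2 * means
r = np.full(12, 0.6)
flows = thomas_fiering(means, sds, r, n_years=10)
```

    Generated traces like this are what the seasonal-period tests in the paper would be run against.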

  8. Economic Analysis in Series-Distillation Desalination

    Directory of Open Access Journals (Sweden)

    Mirna Rahmah Lubis


    Full Text Available The ability to produce potable water economically is the primary purpose of seawater desalination research. Reverse osmosis (RO) and multi-stage flash (MSF) cost more than potable water produced from fresh water resources. Therefore, this research investigates a high-efficiency mechanical vapor-compression distillation system that employs an improved water flow arrangement. The incoming salt concentration was 0.15% salt for brackish water and 3.5% salt for seawater, whereas the outgoing salt concentration was 1.5% and 7%, respectively. Distillation was performed at 439 K and 722 kPa for both brackish water feed and seawater feed. Water costs of the various conditions were calculated for brackish water and seawater feeds using optimum conditions, considered to be 25 and 20 stages, respectively. For brackish water at a temperature difference of 0.96 K, the energy requirement is 2.0 kWh/m3. At this condition, the estimated water cost is $0.39/m3, achieved with 10,000,000 gal/day distillate, a 30-year bond, 5% interest rate, and $0.05/kWh electricity. For seawater at a temperature difference of 0.44 K, the energy requirement is 3.97 kWh/m3 and the estimated water cost is $0.61/m3. Greater efficiency of the vapor compression system is achieved by connecting multiple evaporators in series, rather than the traditional parallel arrangement. The efficiency results from the gradual increase of salinity in each stage of the series arrangement in comparison to parallel. Calculations using various temperature differences between boiling brine and condensing steam show the series arrangement has the greatest improvement at lower temperature differences. Keywords: desalination, dropwise condensation, mechanical-vapor compression

  9. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.


    Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining the control of edge localized modes (ELMs), which should lead to both long plasma pulse times and reasonable divertor life time. In order to control ELMs the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus a standard chaos control methods might be applicable. However, our findings which are based on the nonlinear autoregressive (NAR) model contradict this hypothesis for JET ELMy time-series. In turn, it means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  10. A Standard for Sharing and Accessing Time Series Data: The Heliophysics Application Programmers Interface (HAPI) Specification (United States)

    Vandegriff, J. D.; King, T. A.; Weigel, R. S.; Faden, J.; Roberts, D. A.; Harris, B. T.; Lal, N.; Boardsen, S. A.; Candey, R. M.; Lindholm, D. M.


    We present the Heliophysics Application Programmers Interface (HAPI), a new interface specification that both large and small data centers can use to expose time series data holdings in a standard way. HAPI was inspired by the similarity of existing services at many Heliophysics data centers, and these data centers have collaborated to define a single interface that captures best practices and represents what everyone considers the essential, lowest common denominator for basic data access. This low level access can serve as infrastructure to support greatly enhanced interoperability among analysis tools, with the goal being simplified analysis and comparison of data from any instrument, model, mission or data center. The three main services a HAPI server must perform are 1. list a catalog of datasets (one unique ID per dataset), 2. describe the content of one dataset (JSON metadata), and 3. retrieve numerical content for one dataset (stream the actual data). HAPI defines both the format of the query to the server, and the response from the server. The metadata is lightweight, focusing on use rather than discovery, and the data format is a streaming one, with Comma Separated Values (CSV) being required and binary or JSON streaming being optional. The HAPI specification is available at GitHub, where projects are also underway to develop reference implementation servers that data providers can adapt and use at their own sites. Also in the works are data analysis clients in multiple languages (IDL, Python, Matlab, and Java). Institutions which have agreed to adopt HAPI include Goddard (CDAWeb for data and CCMC for models), LASP at the University of Colorado Boulder, the Particles and Plasma Interactions node of the Planetary Data System (PPI/PDS) at UCLA, the Plasma Wave Group at the University of Iowa, the Space Sector at the Johns Hopkins Applied Physics Lab (APL), and the site maintained at George Mason University. Over the next year, the adoption of a
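
    The three required services can be illustrated with a sketch of the request URLs a client would build and the CSV stream it would parse (the server URL and dataset id below are hypothetical placeholders; the `time.min`/`time.max` query parameters follow the HAPI 2.x convention):

```python
import csv
import io

# The three required HAPI endpoints, relative to a server root.
server = "https://example.org/hapi"
catalog_url = f"{server}/catalog"                      # 1. list datasets
info_url = f"{server}/info?id=some_dataset"            # 2. JSON metadata
data_url = (f"{server}/data?id=some_dataset"           # 3. stream the data
            "&time.min=2016-01-01T00:00:00Z&time.max=2016-01-02T00:00:00Z")

# A HAPI data response streams CSV; the first column is the timestamp.
sample_response = ("2016-01-01T00:00:00Z,1.2,3.4\n"
                   "2016-01-01T00:01:00Z,1.3,3.5\n")
rows = list(csv.reader(io.StringIO(sample_response)))
times = [r[0] for r in rows]
values = [[float(v) for v in r[1:]] for r in rows]
```

    Keeping the response a flat CSV stream is what lets analysis clients in any language consume data from any compliant center with the same few lines of parsing.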

  11. Tool Wear Monitoring Using Time Series Analysis (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through the cutting experiment and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to dynamic model of cutting is introduced as the feature of diagnosis. Consequently, it is found that the early tool wear state (i.e. flank wear under 40µm) can be monitored, and also the optimal tool exchange time and the tool wear state for actual turning machining can be judged by this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8µm can be estimated by the monitoring of the residual error.
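
    The residual-error idea — fit a time series model to vibration from normal cutting, then watch the residual grow as the dynamics change — can be sketched roughly as follows (signals are synthetic and the AR model is a generic stand-in; the paper's actual model, signals, and thresholds differ):

```python
import numpy as np

def ar_fit(x, p=4):
    """Least-squares AR(p) fit: x[t] ~ a1*x[t-1] + ... + ap*x[t-p]."""
    X = np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def residual_rms(x, a):
    """RMS one-step prediction error of the fitted model on a signal."""
    p = len(a)
    X = np.column_stack([x[p - k:len(x) - k] for k in range(1, p + 1)])
    return float(np.sqrt(np.mean((x[p:] - X @ a) ** 2)))

rng = np.random.default_rng(3)
t = np.arange(2000)
sharp = np.sin(0.3 * t) + 0.05 * rng.normal(size=t.size)  # fresh tool
worn = np.sin(0.3 * t) + 0.4 * rng.normal(size=t.size)    # extra broadband noise

a = ar_fit(sharp)                  # model of "normal" cutting vibration
rms_sharp = residual_rms(sharp, a)
rms_worn = residual_rms(worn, a)   # wear shows up as larger residual error
```

    Tracking the residual RMS over successive cuts is then a simple proxy for the wear-state diagnosis the paper performs.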

  12. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.


    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  13. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik


    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  14. Time averaging, ageing and delay analysis of financial time series (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf


    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
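
    The time averaged MSD used above is straightforward to compute. A sketch on a simulated Brownian-like trajectory (geometric Brownian log-prices have Gaussian increments, so a plain random walk stands in here):

```python
import numpy as np

def tamsd(x, lag):
    """Time averaged mean squared displacement at a given lag (delay) time:
    the average of (x[t+lag] - x[t])**2 over the whole trajectory."""
    d = x[lag:] - x[:-lag]
    return float(np.mean(d ** 2))

rng = np.random.default_rng(4)
x = np.cumsum(rng.normal(size=10000))   # Brownian-like trajectory

# For ordinary Brownian motion the time averaged MSD grows linearly in lag.
msd1, msd4 = tamsd(x, 1), tamsd(x, 4)
```

    Repeating this for varying fractions of the series (ageing and delay analysis) probes how the statistic depends on the observation window.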

  15. Biostatistics series module 9: Survival analysis

    Directory of Open Access Journals (Sweden)

    Avijit Hazra


    Full Text Available Survival analysis is concerned with “time to event” data. Conventionally, it dealt with cancer death as the event in question, but it can handle any event occurring over a time frame, and this need not be always adverse in nature. When the outcome of a study is the time to an event, it is often not possible to wait until the event in question has happened to all the subjects, for example, until all are dead. In addition, subjects may leave the study prematurely. Such situations lead to what is called censored observations as complete information is not available for these subjects. The data set is thus an assemblage of times to the event in question and times after which no more information on the individual is available. Survival analysis methods are the only techniques capable of handling censored observations without treating them as missing data. They also make no assumption regarding normal distribution of time to event data. Descriptive methods for exploring survival times in a sample include life table and Kaplan–Meier techniques as well as various kinds of distribution fitting as advanced modeling techniques. The Kaplan–Meier cumulative survival probability over time plot has become the signature plot for biomedical survival analysis. Several techniques are available for comparing the survival experience in two or more groups – the log-rank test is popularly used. This test can also be used to produce an odds ratio as an estimate of risk of the event in the test group; this is called hazard ratio (HR). Limitations of the traditional log-rank test have led to various modifications and enhancements. Finally, survival analysis offers different regression models for estimating the impact of multiple predictors on survival. Cox's proportional hazard model is the most general of the regression methods that allows the hazard function to be modeled on a set of explanatory variables without making restrictive assumptions concerning the
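
    The Kaplan–Meier estimator described above can be sketched in a few lines (toy data; event indicator 1 = event observed, 0 = censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. events[i] = 1 if the event was observed
    at times[i], 0 if the observation was censored there.
    Returns a list of (event_time, survival_probability) steps."""
    data = sorted(zip(times, events))
    at_risk, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk   # survival drops only at event times
            curve.append((t, s))
        at_risk -= deaths + censored    # censored subjects leave the risk set
    return curve

# Toy data: 7 subjects, three of them censored.
curve = kaplan_meier([2, 3, 3, 5, 7, 8, 9], [1, 1, 0, 1, 0, 1, 0])
```

    Note how the censored subjects still contribute to the risk set up to their last observation time, which is exactly what treating them as missing data would throw away.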

  16. Time Series Analysis of Insar Data: Methods and Trends (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique


    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
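
    A one-dimensional illustration of the wrapping ambiguity (real InSAR unwrapping is joint in space and time and far harder; `numpy.unwrap` suffices for this clean synthetic case):

```python
import numpy as np

# Synthetic displacement signal in radians, spanning several radar cycles.
t = np.linspace(0.0, 1.0, 200)
true_phase = 8 * np.pi * t ** 2

# The interferometric measurement only sees the phase wrapped to (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# Unwrapping restores the physically meaningful cumulative phase, provided
# consecutive samples differ by less than half a cycle.
unwrapped = np.unwrap(wrapped)
```

    When sampling is too sparse relative to the motion, successive samples can differ by more than half a cycle and this simple unwrapping fails, which is why the review's algorithms bring in models and spatial consistency constraints.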

  17. Domain specific modeling and analysis

    NARCIS (Netherlands)

    Jacob, Joost Ferdinand


    It is desirable to model software systems in such a way that analysis of the systems, and tool development for such analysis, is readily possible and feasible in the context of large scientific research projects. This thesis emphasizes the methodology that serves as a basis for such developments.

  18. Stochastic Analysis : A Series of Lectures

    CERN Document Server

    Dozzi, Marco; Flandoli, Franco; Russo, Francesco


    This book presents in thirteen refereed survey articles an overview of modern activity in stochastic analysis, written by leading international experts. The topics addressed include stochastic fluid dynamics and regularization by noise of deterministic dynamical systems; stochastic partial differential equations driven by Gaussian or Lévy noise, including the relationship between parabolic equations and particle systems, and wave equations in a geometric framework; Malliavin calculus and applications to stochastic numerics; stochastic integration in Banach spaces; porous media-type equations; stochastic deformations of classical mechanics and Feynman integrals and stochastic differential equations with reflection. The articles are based on short courses given at the Centre Interfacultaire Bernoulli of the Ecole Polytechnique Fédérale de Lausanne, Switzerland, from January to June 2012. They offer a valuable resource not only for specialists, but also for other researchers and Ph.D. students in the fields o...

  19. Analysis of historical series of industrial demand of energy; Analisi delle serie storiche dei consumi energetici dell`industria

    Energy Technology Data Exchange (ETDEWEB)

    Moauro, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia


    This paper reports a short-term analysis of the Italian demand for energy sources and a test of a statistical model that treats the industrial demand for energy sources as a function of prices and production, according to neoclassical microeconomic theory. To this purpose, monthly time series of industrial consumption of the main energy sources in 6 sectors, industrial production indexes in the same sectors, and indexes of energy prices (coal, natural gas, oil products, electricity) have been used. The statistical methodology refers to modern time series analysis and specifically to transfer function models. These permit rigorous identification and representation of the most important dynamic relations between the dependent variables (production and prices), as relations of an input-output system. The results have shown an important positive correlation between energy consumption and prices. Furthermore, the reliability of the forecasts and their usefulness as monthly energy indicators have been shown.

  20. Transition Icons for Time-Series Visualization and Exploratory Analysis. (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa


    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
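
    A rough sketch of the symbolize-then-count-transitions idea (simplified: the breakpoints here are empirical quantiles rather than the Gaussian breakpoints standard SAX uses, and segment/alphabet sizes are arbitrary choices):

```python
import numpy as np

def sax_symbols(x, n_segments=20, alphabet=4):
    """Symbolic aggregate approximation: z-normalize, piecewise-average,
    then bin the segment means into a small alphabet of symbols."""
    z = (x - x.mean()) / x.std()
    paa = z[: n_segments * (len(z) // n_segments)].reshape(n_segments, -1).mean(1)
    breakpoints = np.quantile(paa, np.linspace(0, 1, alphabet + 1)[1:-1])
    return np.searchsorted(breakpoints, paa)

def transition_frequencies(symbols, alphabet=4):
    """Normalized counts of symbol-to-symbol transitions (the 'bag')."""
    counts = np.zeros((alphabet, alphabet))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum()

rng = np.random.default_rng(5)
series = np.cumsum(rng.normal(size=400))
freq = transition_frequencies(sax_symbols(series))
```

    Averaging such matrices over all series in a patient group, then rendering the matrix graphically, is essentially what a transition icon displays.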

  1. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl


    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting in exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. Distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided that shows the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is found narrow at the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is found to be more substantial, reaching as high as above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic return, and the Realized Volatility, the sum of the squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility (Andersen et al., 2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers using 10-day moving window sampling daily return predictors is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximate capture of the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short Term Memory techniques from deep learning may be necessary for higher prediction accuracy.
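
    The Realized Volatility definition used above (the sum of squared intraday log returns) in a short sketch on simulated 5-minute prices (the price path below is synthetic, only a stand-in for BTCUSD data aggregated to the 5-minute scale):

```python
import numpy as np

def realized_volatility(prices):
    """Daily realized volatility: sum of squared intraday log returns."""
    r = np.diff(np.log(prices))
    return float(np.sum(r ** 2))

rng = np.random.default_rng(6)
# One trading day of simulated 5-minute log returns (288 intervals).
log_returns = rng.normal(0.0, 0.01, 288)
prices = 100 * np.exp(np.cumsum(log_returns))
rv = realized_volatility(np.concatenate([[100.0], prices]))
```

    A daily series of such RV values is what the HAR model regresses on its own daily, weekly, and monthly averages.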

  2. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G


    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  3. The Photoplethysmographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia


    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and for estimating the signal's stochastic components.
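    Higuchi's time-domain method, listed as technique (iii), is compact enough to sketch generically. The signals below are synthetic (white noise and a straight line), not PPG recordings; white noise should give a fractal dimension near 2 and a smooth curve near 1:

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=8):
        """Higuchi's time-domain estimate of the fractal dimension of a series."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        ks = np.arange(1, kmax + 1)
        L = []
        for k in ks:
            Lk = []
            for m in range(k):
                idx = np.arange(m, N, k)          # subsampled curve with stride k
                if len(idx) < 2:
                    continue
                # normalized curve length for offset m, as in Higuchi (1988)
                length = (np.abs(np.diff(x[idx])).sum()
                          * (N - 1) / (len(idx) - 1) / k / k)
                Lk.append(length)
            L.append(np.mean(Lk))
        # slope of log L(k) versus log(1/k) estimates the fractal dimension
        slope, _ = np.polyfit(np.log(1.0 / ks), np.log(L), 1)
        return slope

    rng = np.random.default_rng(1)
    fd_noise = higuchi_fd(rng.standard_normal(2000))  # white noise: FD near 2
    fd_line = higuchi_fd(np.linspace(0.0, 1.0, 2000)) # smooth curve: FD near 1
    print(fd_noise, fd_line)
    ```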

  4. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz


    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method

  5. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens


    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de...

  6. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H


    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  7. Multiresolution analysis of Bursa Malaysia KLCI time series (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed


    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset price over time, such as currency, commodity data, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using the time as well as the frequency domain analysis. After that prediction can be executed for the desired system for in sample forecasting. In this study, multiresolution analysis which the assist of discrete wavelet transforms (DWT) and maximal overlap discrete wavelet transform (MODWT) will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how multiresolution approach improves fitting and forecasting results.
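    The usual tools for the DWT/MODWT step described above are PyWavelets' `wavedec`-style routines. As a self-contained illustration of the multiresolution idea, the sketch below hand-rolls an orthonormal Haar DWT on a synthetic random-walk price series standing in for the KLCI closes:

    ```python
    import numpy as np

    def haar_dwt(x):
        """One level of the orthonormal Haar DWT: approximation + detail coefficients."""
        x = np.asarray(x, dtype=float)
        a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # low-pass: trend at the coarser scale
        d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # high-pass: fluctuation at this scale
        return a, d

    def multiresolution(x, levels=3):
        """Split x into per-scale detail coefficients plus a final coarse trend."""
        details = []
        a = np.asarray(x, dtype=float)
        for _ in range(levels):
            a, d = haar_dwt(a)
            details.append(d)
        return a, details

    # Synthetic random-walk "closing prices" standing in for the KLCI series.
    rng = np.random.default_rng(2)
    prices = 1600.0 + np.cumsum(rng.standard_normal(256))

    approx, details = multiresolution(prices, levels=3)
    # Orthonormality: total energy is preserved across the scales (Parseval).
    energy = sum((d ** 2).sum() for d in details) + (approx ** 2).sum()
    print(len(approx), [len(d) for d in details])
    ```

    Each `details[j]` isolates fluctuations at scale 2^(j+1) samples, which is the per-scale view the multiresolution analysis of the paper exploits.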

  8. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)


    In contrast to existing research, which mainly focuses on single futures contracts and lacks comparison across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the past three years. Besides basic statistical analysis, the paper used GARCH and EGARCH models to describe the time series exhibiting the ARCH effect, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, with a leptokurtic and thick-tailed distribution. The study also found that two parts of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. Using the Auto-regressive Distributed Lag model and the GARCH and EGARCH models, the paper demonstrates the persistence of volatility shocks and the leverage effect on the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are on the whole similar to those of mature overseas futures markets; on the other hand, they reflect shortcomings of the Chinese futures market, such as its immaturity and over-regulation by the government.
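    The ARCH effect the paper tests for can be illustrated by simulating a GARCH(1,1) process. This is a generic sketch with arbitrary parameters, not the fitted wheat-futures model; the signature is uncorrelated returns whose squares are positively autocorrelated:

    ```python
    import numpy as np

    def simulate_garch(n, omega=0.05, alpha=0.10, beta=0.85, seed=3):
        """Simulate GARCH(1,1): sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}."""
        rng = np.random.default_rng(seed)
        r = np.empty(n)
        sigma2 = np.empty(n)
        sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
        r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
        for t in range(1, n):
            sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
            r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
        return r

    def autocorr(x, lag=1):
        """Sample autocorrelation at the given lag."""
        x = x - x.mean()
        return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

    r = simulate_garch(20000)
    # ARCH effect: returns are (nearly) uncorrelated, but squared returns are not.
    print(autocorr(r), autocorr(r ** 2))
    ```

    Fitting real data would instead maximize the GARCH likelihood (e.g. with the `arch` package); the simulation here only demonstrates the volatility-clustering signature the abstract refers to.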

  9. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)]


    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated. The corresponding Kolmogorov entropy was also calculated.

  10. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.


    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated. The corresponding Kolmogorov entropy was also calculated.
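    The Grassberger-Procaccia correlation sum can be sketched in a few lines. The Hénon map stands in for the circuit voltage data here (its correlation dimension is known from the literature to be about 1.2); the embedding parameters are illustrative choices:

    ```python
    import numpy as np

    def correlation_sum(x, m, tau, r):
        """Grassberger-Procaccia correlation sum C(r) for a delay embedding."""
        n = len(x) - (m - 1) * tau
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(m)])
        # pairwise Chebyshev distances between embedded vectors
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        iu = np.triu_indices(n, k=1)
        return np.mean(d[iu] < r)

    # Iterate the Henon map and discard the transient.
    a, b = 1.4, 0.3
    x, y = 0.1, 0.1
    traj = []
    for _ in range(1700):
        x, y = 1.0 - a * x * x + y, b * x
        traj.append(x)
    traj = np.array(traj[500:])  # keep 1200 post-transient points

    # nu is the slope of log C(r) vs log r in the scaling region.
    radii = np.array([0.01, 0.02, 0.05, 0.1])
    C = np.array([correlation_sum(traj, m=2, tau=1, r=r) for r in radii])
    nu, _ = np.polyfit(np.log(radii), np.log(C), 1)
    print(nu)
    ```

    In practice one repeats this for increasing m until the slope saturates, which is how m_min is read off.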

  11. Time series analysis of monthly pulpwood use in the Northeast (United States)

    James T. Bones


    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.
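    The record does not state the model form; a common choice for short-term forecasting of monthly series is a least-squares autoregression. The sketch below fits and iterates an AR(2) on synthetic data standing in for the pulpwood series (coefficients and lag order are illustrative assumptions):

    ```python
    import numpy as np

    def fit_ar(x, p=2):
        """Least-squares fit of AR(p): x_t = c + a1*x_{t-1} + ... + ap*x_{t-p}."""
        X = np.column_stack([np.ones(len(x) - p)]
                            + [x[p - k:len(x) - k] for k in range(1, p + 1)])
        y = x[p:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        return coef

    def forecast(x, coef, steps):
        """Iterate the fitted AR model forward for short-term forecasts."""
        p = len(coef) - 1
        hist = list(x[-p:])
        out = []
        for _ in range(steps):
            nxt = coef[0] + sum(coef[k] * hist[-k] for k in range(1, p + 1))
            hist.append(nxt)
            out.append(nxt)
        return np.array(out)

    # Synthetic monthly series: a stationary AR(2), not the 1980 pulpwood data.
    rng = np.random.default_rng(8)
    n = 240
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 5.0 + 1.2 * x[t - 1] - 0.4 * x[t - 2] + rng.standard_normal()

    coef = fit_ar(x[40:], p=2)      # drop the burn-in before fitting
    fc = forecast(x, coef, steps=6)  # six-month-ahead forecast path
    print(coef, fc)
    ```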

  12. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.


    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  13. Time Series Analysis Based on Running Mann Whitney Z Statistics (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
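    The U-to-Z normalization described above uses the normal approximation Z = (U - n1*n2/2) / sqrt(n1*n2*(n1+n2+1)/12). A sketch of the running version over moving windows; the data and window length are illustrative assumptions, and ties are ignored for simplicity:

    ```python
    import numpy as np

    def mann_whitney_z(a, b):
        """Normal-approximation Z for the Mann-Whitney U statistic of samples a, b."""
        n1, n2 = len(a), len(b)
        ranks = np.argsort(np.argsort(np.concatenate([a, b]))) + 1  # ranks 1..n1+n2
        u = ranks[:n1].sum() - n1 * (n1 + 1) / 2                    # U for sample a
        mu = n1 * n2 / 2
        sigma = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
        return (u - mu) / sigma

    def running_z(x, window=30):
        """Slide two adjacent windows along x; large |Z| flags a level shift."""
        return np.array([mann_whitney_z(x[i:i + window], x[i + window:i + 2 * window])
                         for i in range(len(x) - 2 * window + 1)])

    # Synthetic series with a step change at t = 100.
    rng = np.random.default_rng(4)
    x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
    z = running_z(x)
    print(np.abs(z).max(), np.argmax(np.abs(z)))
    ```

    The |Z| peak sits where the two windows straddle the change point, which is how the method localizes shifts objectively.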

  14. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik


    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series (light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  15. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.


    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  16. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink


    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  17. Identification of human operator performance models utilizing time series analysis (United States)

    Holden, F. M.; Shinners, S. M.


    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  18. Analysis and implementation of LLC-T series parallel resonant ...

    African Journals Online (AJOL)

    A prototype 300 W, 100 kHz converter is designed and built to experimentally demonstrate, dynamic and steady state performance for the LLC-T series parallel resonant converter. A comparative study is performed between experimental results and the simulation studies. The analysis shows that the output of converter is ...

  19. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    Utsab Kundu

    domain analysis; frequency domain analysis; critical load resistance.

  20. Power-Cooling-Mismatch Test Series Test PCM-7. Experiment operating specifications

    International Nuclear Information System (INIS)

    Sparks, D.T.; Smith, R.H.; Stanley, C.J.


    The experiment operating specifications for the Power-Cooling-Mismatch (PCM) Test PCM-7 to be conducted in the Power Burst Facility are described. The PCM Test Series was designed on the basis of a parametric evaluation of fuel behavior response with cladding temperature, rod internal pressure, time in film boiling, and test rod power being the variable parameters. The test matrix, defined in the PCM Experiment Requirements Document (ERD), encompasses a wide range of situations extending from pre-CHF (critical heat flux) PCMs to long duration operation in stable film boiling leading to rod failure

  1. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye


    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  2. Time series analysis of ozone data in Isfahan (United States)

    Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.


    Time series analysis was used to investigate the stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reason for extremely high ozone concentrations in each season. Data were converted into a seasonal component and the frequency domain; the latter was evaluated using Fast Fourier Transform (FFT) spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, most fluctuation occurred in 1999 and 2000, while the least fluctuation occurred in 2003. The best correlation between ozone and solar radiation was found in 2000. Other variables, which were not available, caused the fluctuations in 1999 and 2001. The ozone trend is increasing in 1999 and decreasing in the other years.
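    The FFT spectral step can be sketched as follows. The series here is synthetic, with planted 365-day and 7-day cycles rather than the ozone record, so the recovered peak periods are known in advance:

    ```python
    import numpy as np

    # Four years of synthetic daily data: yearly cycle + weekly cycle + noise.
    rng = np.random.default_rng(5)
    n = 365 * 4
    t = np.arange(n)
    x = (10 * np.sin(2 * np.pi * t / 365.0)
         + 3 * np.sin(2 * np.pi * t / 7.0)
         + rng.standard_normal(n))

    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2          # power density spectrum
    freqs = np.fft.rfftfreq(n, d=1.0)            # cycles per day
    periods = 1.0 / freqs[1:]                    # skip the zero frequency

    # The two dominant spectral peaks should sit near 365 and 7 days.
    idx = np.argsort(power[1:])[::-1][:2]
    top = sorted(periods[idx])
    print(top)
    ```

    On real data one would also detrend and window the series before the FFT to reduce leakage; those steps are omitted here.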

  3. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis. (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary


    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.


    Directory of Open Access Journals (Sweden)



    Full Text Available Needs analysis is considered to be the cornerstone of English for Specific Purposes (ESP). The concept of needs analysis has changed over the decades. At the initial stages of ESP (the 1960s and early 1970s), needs analysis consisted of assessing the communicative needs of the learners and the techniques for achieving specific teaching objectives. Nowadays, the task of needs analysis is much more complex: it aims at collecting information about the learners and at defining the target situation and environment of studying ESP.

  5. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.


    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  6. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J


    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/enviromental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  7. Spectral Unmixing Analysis of Time Series Landsat 8 Images (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.


    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantage, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid the discrimination of different natures. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach is to obtain the "purified" pixels in order to achieve optimal unmixing performance. The vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least square (NNLS) is used to estimate abundance maps by using the endmember. Then, the estimated endmember is the mean value of "purified" pixels, which is the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling two main steps (abundance estimation and endmember update) into the iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with "separate unmixing" approach.
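    The abundance-estimation step (NNLS per pixel, given endmembers) can be sketched with `scipy.optimize.nnls`. The endmember matrix below is a made-up stand-in, not VCA output from Landsat 8 imagery:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Hypothetical endmember matrix: 6 spectral bands (rows) x 3 endmembers (cols).
    # These are invented reflectances, not real Landsat 8 signatures.
    E = np.array([[0.10, 0.70, 0.30],
                  [0.20, 0.60, 0.40],
                  [0.40, 0.50, 0.20],
                  [0.60, 0.30, 0.80],
                  [0.80, 0.20, 0.60],
                  [0.90, 0.10, 0.50]])

    # A mixed pixel built from known abundances plus a little sensor noise.
    rng = np.random.default_rng(6)
    true_abund = np.array([0.6, 0.3, 0.1])
    pixel = E @ true_abund + 0.001 * rng.standard_normal(E.shape[0])

    # Abundance estimation: nonnegative least squares, one call per pixel.
    abund, residual = nnls(E, pixel)
    abund = abund / abund.sum()  # optional sum-to-one renormalization
    print(abund, residual)
    ```

    In the full algorithm this per-pixel solve alternates with the endmember update over the "purified" pixels; only the NNLS step is shown here.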

  8. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao


    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  9. Topological data analysis of financial time series: Landscapes of crashes (United States)

    Gidea, Marian; Katz, Yuri


    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000, or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.

  10. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A


    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  11. Analysis of site-specific dispersion conditions

    International Nuclear Information System (INIS)

    Paesler-Sauer, J.


    This report presents an analysis of atmospheric dispersion conditions in the environs of nuclear power stations in the Federal Republic of Germany. The analysis is based on meteorological data measured on the power station sites (KFUe = nuclear reactor remote control records) and by neighbouring stations operated by the German Weather Service. The data are series of hourly mean values of wind and temperature gradient or stability class over the period of one or more years. The aim of the data analysis is to find types of dispersion conditions characterized by the flow field and stratification, and to assess the feasibility of calculating these quantities in the case of an emergency. Influences of terrain structures in the environs of the site are considered. The annual frequencies of types of dispersion situations are assessed, the capability to recognize the dispersion situation from meteorological data measured on the site and the applicability of dispersion models are discussed. (orig.) [de

  12. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming


    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  13. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach on time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
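    The starting point, turning a time series into a recurrence matrix that can be read as a network adjacency matrix, can be sketched as follows. Logistic-map data mirrors the paper's illustration, but the threshold eps and the density measure are simplifications, not the RDE-CN reduction itself:

    ```python
    import numpy as np

    def recurrence_matrix(x, eps):
        """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps (i != j)."""
        x = np.asarray(x, dtype=float)
        R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
        np.fill_diagonal(R, 0)  # drop trivial self-recurrences
        return R

    def logistic(r, n, x0=0.4):
        """Iterate the logistic map x -> r*x*(1-x)."""
        x = np.empty(n)
        x[0] = x0
        for t in range(1, n):
            x[t] = r * x[t - 1] * (1 - x[t - 1])
        return x

    # Read the recurrence matrix as a network adjacency matrix and compare a
    # simple measure (recurrence density = mean normalized degree) across regimes.
    density = {}
    for name, r in [("periodic", 3.5), ("chaotic", 4.0)]:
        x = logistic(r, 500)
        R = recurrence_matrix(x, eps=0.02)
        density[name] = R.sum() / (len(x) * (len(x) - 1))
    print(density)
    ```

    A periodic orbit revisits a few states constantly, so its recurrence density is much higher than that of the chaotic regime; richer network measures (clustering, path lengths) follow the same adjacency-matrix construction.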

  14. Time series analysis for psychological research: examining and forecasting change. (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming


    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  15. Time series analysis for psychological research: examining and forecasting change (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming


    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  16. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar


    The cross correlation coefficient has been widely applied in financial time series analysis, in particular for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MSTs) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture the dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address this issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among time series that relate to one another linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across the two methods: they show similar trends, upward or downward, when comparing selected network measures. Though both methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series.
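As a concrete sketch of the cross-correlation-based MST construction that serves as the baseline here: the correlation matrix is mapped to the standard metric distance d = sqrt(2(1 − ρ)) and a spanning tree of minimum total distance is extracted. Prim's algorithm is used below for brevity, and the data and function names are illustrative, not taken from the paper.

```python
import numpy as np

def correlation_mst(series):
    """MST over time series using the cross-correlation distance
    d_ij = sqrt(2 * (1 - rho_ij)) (Mantegna's metric); Prim's algorithm."""
    rho = np.corrcoef(series)
    dist = np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))  # clip guards rounding
    n = dist.shape[0]
    in_tree, edges = [0], []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i, j] < best[2]):
                    best = (i, j, dist[i, j])
        edges.append(best)
        in_tree.append(best[1])
    return edges  # n - 1 edges, all nodes preserved

rng = np.random.default_rng(0)
base = rng.normal(size=500)
series = np.vstack([base + 0.1 * rng.normal(size=500),   # strongly linked pair
                    base + 0.1 * rng.normal(size=500),
                    rng.normal(size=500)])                # unrelated series
edges = correlation_mst(series)
print(len(edges))  # a tree over 3 nodes has 2 edges
```

The PS-based variant described in the abstract would replace `dist` with a distance derived from pairwise phase synchronization rather than Pearson correlation.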

  17. Patient specific dynamic geometric models from sequential volumetric time series image data. (United States)

    Cameron, B M; Robb, R A


    Generating patient specific dynamic models is complicated by the complexity of the motion intrinsic and extrinsic to the anatomic structures being modeled. Using a physics-based sequentially deforming algorithm, an anatomically accurate dynamic four-dimensional model can be created from a sequence of 3-D volumetric time series data sets. While such algorithms may accurately track the cyclic non-linear motion of the heart, they generally fail to accurately track extrinsic structural and non-cyclic motion. To accurately model these motions, we have modified a physics-based deformation algorithm to use a meta-surface defining the temporal and spatial maxima of the anatomic structure as the base reference surface. A mass-spring physics-based deformable model, which can expand or shrink with the local intrinsic motion, is applied to the meta-surface, deforming this base reference surface to the volumetric data at each time point. As the meta-surface encompasses the temporal maxima of the structure, any extrinsic motion is inherently encoded into the base reference surface and allows the computation of the time point surfaces to be performed in parallel. The resultant 4-D model can be interactively transformed and viewed from different angles, showing the spatial and temporal motion of the anatomic structure. Using texture maps and per-vertex coloring, additional data such as physiological and/or biomechanical variables (e.g., mapping electrical activation sequences onto contracting myocardial surfaces) can be associated with the dynamic model, producing a 5-D model. For acquisition systems that may capture only limited time series data (e.g., only images at end-diastole/end-systole or inhalation/exhalation), this algorithm can provide useful interpolated surfaces between the time points. Such models help minimize the number of time points required to usefully depict the motion of anatomic structures for quantitative assessment of regional dynamics.

  18. Population specific analysis of Yakut exomes

    NARCIS (Netherlands)

    Zlobin, A. S.; Sharapov, S. Sh.; Guryev, V. P.; Bevova, M. R.; Tsepilov, Y. A.; Sivtseva, T. M.; Boyarskih, U. A.; Sokolova, E. A.; Aulchenko, Y. S.; Filipenko, M. L.; Osakovsky, V. L.

    We studied the genetic diversity of the Yakut population using exome sequencing. We performed a comparative analysis of the Yakut population and the populations included in the "1000 Genomes" project, and we identified the alleles specific to the Yakut population. We showed that the Yakuts

  19. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.


    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series from regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail the various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
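The natural visibility algorithm mentioned above maps a time series to a network by linking two samples whenever the straight line between them clears every intermediate sample. A minimal sketch, with an illustrative toy series rather than the experimental jet data:

```python
import numpy as np

def visibility_graph(x):
    """Natural visibility algorithm: samples a and b are linked if every
    intermediate sample lies strictly below the straight line joining
    (a, x[a]) and (b, x[b])."""
    n = len(x)
    adj = np.zeros((n, n), dtype=bool)
    for a in range(n):
        for b in range(a + 1, n):
            ts = np.arange(a + 1, b)
            line = x[a] + (x[b] - x[a]) * (ts - a) / (b - a)
            if np.all(x[ts] < line):        # vacuously true for b == a + 1
                adj[a, b] = adj[b, a] = True
    return adj

x = np.array([1.0, 3.0, 2.0, 4.0, 1.0])     # illustrative toy series
adj = visibility_graph(x)
degrees = adj.sum(axis=0)
print(degrees)  # node degrees form the graph's degree distribution
```

Network measures such as average path length or clustering coefficient can then be computed from `adj`, which is how regime discrimination proceeds in the study.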

  20. Time series clustering analysis of health-promoting behavior (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng


    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
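A minimal sketch of the clustering pipeline described above: each series is reduced to an autocorrelation-based representation, and the resulting feature vectors are grouped with a plain fuzzy c-means implementation. All data and parameter choices here are illustrative, not the ComCare data or settings.

```python
import numpy as np

def acf_features(x, max_lag=5):
    """Autocorrelation-based representation: summarize a series by its
    first few autocorrelation coefficients."""
    x = x - x.mean()
    denom = float(np.dot(x, x))
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means on feature vectors (rows of X)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))            # fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = W.T @ X / W.sum(axis=0)[:, None]        # weighted centroids
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))                       # membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

rng = np.random.default_rng(1)
smooth = [np.convolve(rng.normal(size=300), np.ones(20) / 20, "same") for _ in range(5)]
noisy = [rng.normal(size=300) for _ in range(5)]
X = np.array([acf_features(s) for s in smooth + noisy])
U, _ = fuzzy_cmeans(X)
labels = U.argmax(axis=1)
print(labels)  # smooth vs. noisy series form two clusters
```

The autocorrelation representation makes series with similar temporal dependence (rather than similar raw values) land in the same cluster, which matches the behavioral-pattern framing of the study.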

  1. Time series analysis of gold production in Malaysia (United States)

    Muda, Nora; Hoon, Lee Yuen


    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's gold production in the future. The Box-Jenkins time series method was used to perform the analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA(3,1,1) model was found to be the best fitted model, with a MAPE of 3.704%, indicating that the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
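As a rough illustration of the estimation and forecasting steps, the sketch below fits an autoregressive model by least squares (a simplified stand-in for full ARIMA(3,1,1) maximum-likelihood estimation) and scores one-step-ahead forecasts with MAPE. The synthetic series is illustrative, not the Malaysian production data.

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model with intercept: a simplified
    stand-in for the Box-Jenkins estimation step."""
    rows = np.array([x[t - p:t][::-1] for t in range(p, len(x))])
    A = np.column_stack([np.ones(len(rows)), rows])
    coef, *_ = np.linalg.lstsq(A, x[p:], rcond=None)
    return coef  # [intercept, phi_1, ..., phi_p]

def mape(actual, predicted):
    """Mean absolute percentage error, the accuracy measure cited above."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return 100.0 * np.mean(np.abs((actual - predicted) / actual))

# Synthetic 'production' series: linear trend plus AR(1) noise
rng = np.random.default_rng(2)
e = np.zeros(120)
for t in range(1, 120):
    e[t] = 0.7 * e[t - 1] + rng.normal()
x = 100 + 0.5 * np.arange(120) + e

train, hold = x[:110], x[110:]
coef = fit_ar(train, p=3)
hist, preds = list(train), []
for actual in hold:
    lags = np.array(hist[-3:][::-1])      # most recent value first
    preds.append(coef[0] + coef[1:] @ lags)
    hist.append(actual)                   # one-step-ahead update
print(round(mape(hold, preds), 2), "% MAPE")
```

The identification and diagnostic-checking steps (ACF/PACF inspection, residual tests) would precede and follow this estimation step in a full Box-Jenkins workflow.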

  2. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti


    The aim of this analysis is to forecast a mini-market's sales volume for the twelve-month period from August 2015 to August 2016. The study is based on the monthly sales, in Iraqi Dinar, of a private local mini-market from April 2014 to July 2015. As the graph reveals, and assuming the stagnant economic conditions continue, the trend of future sales is downward. Based on the time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, low profit margins and operating expenses, the revenues may not be adequate to produce positive net income, and the business may not be able to operate afterward. The principal conclusion is that forecasting sales in the region will be difficult where the business cycle is so dynamic and volatile owing to systematic risks and an unforeseeable future.

  3. Pipe-anchor discontinuity analysis utilizing power series solutions, Bessel functions, and Fourier series

    International Nuclear Information System (INIS)

    Williams, Dennis K.; Ranson, William F.


    One of the paradigmatic classes of problems that frequently arises in the piping stress analysis discipline is the effect of local stresses created by support and restraint attachments. Over the past 20 years, concerns have been identified both by regulatory agencies in the nuclear power industry and by others in the process and chemicals industries concerning the effect of various stiff clamping arrangements on the expected life of the pipe and its various piping components. In many of the commonly utilized geometries and arrangements of pipe clamps, the elasticity problem becomes the axisymmetric stress and deformation determination in a hollow cylinder (pipe) subjected to the appropriate boundary conditions and respective loads. One of the geometries that serves as a pipe anchor comprises two pipe clamps that are bolted tightly to the pipe and affixed to a modified shoe-type arrangement. The shoe is employed for the purpose of providing an immovable base that can be easily attached, either by bolting or welding, to a structural steel pipe rack. Over the past 50 years, the computational tools available to the piping analyst have changed dramatically and have thereby caused the implementation of solutions to the basic problems of elasticity to change likewise. The need to obtain closed form elasticity solutions, however, has always been a driving force in engineering. The employment of symbolic calculus, currently available through numerous software packages, makes closed form solutions very economical. This paper briefly traces the solutions over the past 50 years to a variety of axisymmetric stress problems involving hollow circular cylinders employing a Fourier series representation. In the present example, a properly chosen Fourier series represents the mathematical simulation of the imposed axial displacements on the outside diametrical surface.
A general solution technique is introduced for the axisymmetric discontinuity stresses resulting from an

  4. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld


    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e., very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of the interpolation and Gaussian kernel methods by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
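The Gaussian kernel estimator favored above can be sketched as follows: every pair of observations contributes to the correlation at a target lag, weighted by how close its actual time offset is to that lag, so no interpolation is needed. Function names and the test signal are illustrative, not from the paper.

```python
import numpy as np

def kernel_corr(tx, x, ty, y, lag, h):
    """Gaussian-kernel correlation estimate at a target lag for irregularly
    sampled series: observation pairs are weighted by how close their time
    offset is to the requested lag (bandwidth h), avoiding interpolation."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None] - lag      # offsets relative to target lag
    w = np.exp(-0.5 * (dt / h) ** 2)          # Gaussian kernel weights
    return np.sum(w * x[:, None] * y[None, :]) / np.sum(w)

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 400))           # irregular sampling times
x = np.sin(2 * np.pi * t / 20) + 0.1 * rng.normal(size=400)
r0 = kernel_corr(t, x, t, x, lag=0.0, h=1.0)    # autocorrelation at lag 0
r10 = kernel_corr(t, x, t, x, lag=10.0, h=1.0)  # half a period later
print(round(r0, 2), round(r10, 2))
```

For the 20-unit-period sinusoid the estimate is strongly positive at lag 0 and strongly negative at lag 10, recovered directly from the unevenly spaced samples.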

  5. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall


    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata are used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating.
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  6. Inorganic chemical analysis of environmental materials—A lecture series (United States)

    Crock, J.G.; Lamothe, P.J.


    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentrations (above "action" levels) for the majority of the study's samples, and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.


  7. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)


    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
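A compact sketch of the dynamic programming at the heart of Bayesian Blocks for event data, following the published algorithm's structure: candidate block edges at midpoints between events, a per-block fitness of N(log N − log T), and a constant prior penalty per change point. The event data and prior value here are illustrative.

```python
import numpy as np

def bayesian_blocks(t, ncp_prior=4.0):
    """Optimal piecewise-constant segmentation of event times t by dynamic
    programming: block fitness N*(log N - log T) for N events in a block of
    duration T, with a constant penalty per added change point."""
    t = np.sort(np.asarray(t, float))
    n = len(t)
    # candidate block edges: data extremes plus midpoints between events
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for r in range(n):
        widths = edges[r + 1] - edges[:r + 1]        # duration of block k..r
        counts = r + 1 - np.arange(r + 1)            # events in block k..r
        fit = counts * (np.log(counts) - np.log(widths)) - ncp_prior
        fit[1:] += best[:r]                          # best partition of prefix
        last[r] = fit.argmax()
        best[r] = fit.max()
    cps, r = [], n                                   # backtrack change points
    while r > 0:
        cps.append(last[r - 1])
        r = last[r - 1]
    return edges[np.array(cps[::-1])]                # left edge of each block

rng = np.random.default_rng(4)
# Poisson-like events: low rate on [0, 50), tenfold higher on [50, 60)
t = np.concatenate([rng.uniform(0, 50, 50), rng.uniform(50, 60, 100)])
block_edges = bayesian_blocks(t)
print(block_edges)  # a change point should land near t = 50
```

The paper's fitness functions for binned counts and point measurements slot into the same dynamic-programming skeleton; only the per-block `fit` term changes.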

  8. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji


    The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because volumetric reports along the course of an individual's aging have scarcely been presented. The subjects analyzed were brain images of 2 healthy males and 18 females of average age 69.0 y, for which T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a time series of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As the template for the regions, the standard gray matter atlas (icbn452_atlas_probability_gray) and its labeled version (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for individual standardization. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Volumes of regions were calculated as their voxel ratio to the whole brain voxel count in percent. It was found that the regional volumes decreased with aging in all the lobes examined and the cerebellum, at average percentages per year of -0.11, -0.07, -0.04, -0.02, and -0.03, respectively. The procedure for calculation of the regional volumes, which has hitherto been operated manually, can be conducted automatically for an individual brain using the standard atlases above. (T.T.)

  9. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James


    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks [Scargle 1998]-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.


  10. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James


    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  11. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A


    We analyzed databases with gait time series of adults and persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths and found that the spectra of the healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to the interbeat time series, where pathology implies loss of multifractality, in the gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, so we have one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared both results, with direct comparison and with a cross correlation analysis. We tried to find differences in the two time series that can be used as indicators of equilibrium problems.
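The multifractal analysis above generalizes ordinary detrended fluctuation analysis (DFA). As a minimal, monofractal sketch of the underlying machinery (with illustrative signals, not the gait databases):

```python
import numpy as np

def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
    """Monofractal DFA: integrate the series, detrend it in windows of
    size s, and read the scaling exponent off log F(s) vs. log s."""
    y = np.cumsum(x - np.mean(x))                    # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        t = np.arange(s)
        msq = 0.0
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear fit
            msq += np.mean((seg - trend) ** 2)
        fluct.append(np.sqrt(msq / n_seg))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(5)
white = rng.normal(size=4096)                        # uncorrelated: exponent ~ 0.5
walk_increments = np.cumsum(rng.normal(size=4096))   # persistent: exponent ~ 1.5
print(round(dfa_hurst(white), 2))
```

The multifractal variant used in the study repeats this with q-th-order fluctuation averages, yielding a spectrum of exponents whose width distinguishes the healthy from the ill subjects.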

  12. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)


    We analyzed databases with gait time series of adults and persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths and found that the spectra of the healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to the interbeat time series, where pathology implies loss of multifractality, in the gait time series multifractal behavior emerges with the pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, so we have one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared both results, with direct comparison and with a cross correlation analysis. We tried to find differences in the two time series that can be used as indicators of equilibrium problems.

  13. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    modulated dc–dc series resonant converter (SRC) operating in discontinuous conduction mode (DCM). The conventional fundamental harmonic approximation technique is extended for a non-ideal series resonant tank to clarify the limitations of ...

  14. Centrality measures in temporal networks with time series analysis (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun


    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering important nodes than the common aggregating method.
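The eigenvector centrality computation that the supra-evolution approach accelerates reduces, for a single layer, to power iteration on the adjacency matrix. A minimal sketch follows; the identity shift is a standard convergence safeguard, not a detail from the paper.

```python
import numpy as np

def eigenvector_centrality(A, iters=200, tol=1e-10):
    """Eigenvector centrality by power iteration: repeatedly apply the
    (shifted) adjacency matrix and renormalize; the fixed point is the
    leading eigenvector. The identity shift guards convergence on
    bipartite graphs, whose spectrum is symmetric about zero."""
    n = A.shape[0]
    M = A + np.eye(n)
    v = np.ones(n) / np.sqrt(n)
    for _ in range(iters):
        w = M @ v
        w /= np.linalg.norm(w)
        if np.linalg.norm(w - v) < tol:
            break
        v = w
    return v / v.sum()        # normalize scores to sum to 1

# Star graph: the hub (node 0) should dominate the centrality scores
A = np.zeros((5, 5))
A[0, 1:] = A[1:, 0] = 1.0
c = eigenvector_centrality(A)
print(c.argmax())  # → 0
```

The paper's contribution is to avoid running this iteration on the full supra-matrix by exploiting its block structure across time layers.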

  15. Adult Craniopharyngioma: Case Series, Systematic Review, and Meta-Analysis. (United States)

    Dandurand, Charlotte; Sepehry, Amir Ali; Asadi Lari, Mohammad Hossein; Akagami, Ryojo; Gooderham, Peter


    The optimal therapeutic approach for adult craniopharyngioma remains controversial. Some advocate gross total resection (GTR), while others advocate subtotal resection followed by adjuvant radiotherapy (STR + XRT). Our aim was to conduct a systematic review and meta-analysis assessing the rate of recurrence at 3 yr of follow-up in adult craniopharyngioma, stratified by extent of resection and presence of adjuvant radiotherapy. MEDLINE (1946-July 1, 2016) and EMBASE (1980-June 30, 2016) were systematically reviewed. From 1975 to 2013, 33 patients were treated with initial surgical resection for adult-onset craniopharyngioma at our center and were reviewed for inclusion in this study. Data from 22 patients were available for inclusion as a case series in the systematic review. Eligible studies (n = 21) were identified from the literature in addition to a case series of our institutional experience. Three groups were available for analysis: GTR, STR + XRT, and STR. The rates of recurrence were 17%, 27%, and 45%, respectively. The risk of developing recurrence was significant for GTR vs STR (odds ratio [OR]: 0.24, 95% confidence interval [CI]: 0.15-0.38) and STR + XRT vs STR (OR: 0.20, 95% CI: 0.10-0.41). The risk of recurrence after GTR vs STR + XRT did not reach significance (OR: 0.63, 95% CI: 0.33-1.24, P = .18). This is the first and largest systematic review focusing on the rate of recurrence in adult craniopharyngioma. Although the rates of recurrence favor GTR, the difference in risk of recurrence did not reach significance. This study provides guidance to clinicians and directions for future research, with the need to stratify outcomes per treatment modality. Copyright © 2017 by the Congress of Neurological Surgeons
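The odds ratios and confidence intervals reported above follow standard 2x2-table arithmetic, sketched below with purely hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Woolf
    (log-normal approximation) 95% confidence interval."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: recurrence yes/no in two treatment arms
or_, lo, hi = odds_ratio_ci(5, 20, 15, 12)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval that excludes 1 indicates a statistically significant difference in recurrence odds, which is how the abstract's GTR vs STR and GTR vs STR + XRT comparisons are read.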

  16. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis (United States)

    Eberhart, C. J.; Casiano, M. J.


    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. The analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  17. Interrupted time-series analysis: studying trends in neurosurgery. (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K


    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
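
    A standard way to operationalize ITSA is segmented regression, with indicator terms for the post-intervention level change and slope change. The sketch below is a minimal illustration on synthetic data; the intervention point, effect sizes, and noise level are invented, not taken from the article.

```python
import numpy as np

# Hypothetical monthly outcome with an intervention at t = 24:
# a level drop of 5 units and a slope change of -0.3 per month.
rng = np.random.default_rng(0)
t = np.arange(48)
post = (t >= 24).astype(float)          # 1 after the intervention
t_post = post * (t - 24)                # months elapsed since intervention
y = 10 + 0.5 * t - 5 * post - 0.3 * t_post + rng.normal(0, 0.5, t.size)

# Segmented (interrupted) regression:
#   y = b0 + b1*t + b2*post + b3*t_post
X = np.column_stack([np.ones_like(t, dtype=float), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta
print(f"baseline slope {b1:.2f}, level change {b2:.2f}, slope change {b3:.2f}")
```

    The fitted b2 and b3 isolate the intervention's immediate impact and its effect on the subsequent trend, which is exactly the separation the abstract credits ITSA with.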

  18. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review. (United States)

    Malkin, Zinovy


    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics of geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station-coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
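
    For readers unfamiliar with AVAR, a minimal sketch of the classical (unweighted, scalar, non-overlapping) estimator follows; the weighted and multidimensional variants reviewed in the paper extend this basic form. The test signal and averaging factors are illustrative.

```python
import numpy as np

def allan_variance(x, m):
    """Classical (non-overlapping) Allan variance at averaging factor m:
    half the mean squared difference of successive cluster means."""
    K = len(x) // m                              # number of clusters
    means = x[:K * m].reshape(K, m).mean(axis=1)
    return 0.5 * np.mean(np.diff(means) ** 2)

# For white noise, AVAR falls roughly as 1/m.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 100000)
a1, a10 = allan_variance(x, 1), allan_variance(x, 10)
print(a1, a10)
```

    The slope of AVAR versus averaging factor on a log-log plot is what identifies the noise type (white, flicker, random walk) in the series.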

  19. BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data

    Directory of Open Access Journals (Sweden)

    Madeira Sara C


    Full Text Available Abstract Background The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard. In the case of time series this problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, both from a computational and a biological perspective. Findings BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion BiGGEsTS is a free open source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available online. We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress.

  20. Modeling activity patterns of wildlife using time-series analysis. (United States)

    Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo


    The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda ( Ailuropoda melanoleuca ). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.

  1. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs. (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew


    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.

  2. Specifications for surface reaction analysis apparatus

    International Nuclear Information System (INIS)

    Teraoka, Yuden; Yoshigoe, Akitaka


    A surface reaction analysis apparatus was installed at the JAERI soft x-ray beamline in the SPring-8 as an experimental end-station for the study of surface chemistry. The apparatus is devoted to the study concerning the influence of translational kinetic energy of incident molecules to chemical reactions on solid surfaces with gas molecules. In order to achieve the research purpose, reactive molecular scattering experiments and photoemission spectroscopic measurements using synchrotron radiation are performed in that apparatus via a supersonic molecular beam generator, an electron energy analyzer and a quadrupole mass analyzer. The detail specifications for the apparatus are described in this report. (author)

  3. Time Series Analysis of the Quasar PKS 1749+096 (United States)

    Lam, Michael T.; Balonek, T. J.


    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
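
    Two of the four time series functions mentioned, the autocorrelation function and the first-order structure function, can be sketched as below. This is a generic illustration on a synthetic noisy sinusoid, not the QUI implementation.

```python
import numpy as np

def autocorrelation(x, max_lag):
    """Sample autocorrelation function for lags 0..max_lag (biased estimator)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    var = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / var
                     for k in range(max_lag + 1)])

def structure_function(x, max_lag):
    """First-order structure function SF(k) = <(x[t+k] - x[t])^2>, lags 1..max_lag."""
    x = np.asarray(x, dtype=float)
    return np.array([np.mean((x[k:] - x[:len(x) - k]) ** 2)
                     for k in range(1, max_lag + 1)])

# Noisy sinusoid with a 50-sample period: the ACF peaks again near lag 50,
# and the structure function dips there.
t = np.arange(500)
x = np.sin(2 * np.pi * t / 50) + np.random.default_rng(2).normal(0, 0.2, t.size)
acf = autocorrelation(x, 60)
sf = structure_function(x, 60)
print(round(acf[50], 2), round(sf[49], 2))
```

    A candidate variability timescale shows up consistently across both functions, which is why computing several of them on the same light curve, as the abstract describes, helps separate real timescales from sampling artifacts.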

  4. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge


    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify sub-periods with a similar structure in the data series. Moreover, in this article we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. This article presents two examples, the first one corresponding to seven simulated series with autoregressive structure of first order, and the second corresponding to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions

  5. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data. (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques


    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  6. Interglacial climate dynamics and advanced time series analysis (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit


    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties and the requirements to data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  7. The Prediction of Teacher Turnover Employing Time Series Analysis. (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…


    International Nuclear Information System (INIS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.


    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.



  10. Prediction of solar cycle 24 using fourier series analysis

    International Nuclear Information System (INIS)

    Khalid, M.; Sultana, M.; Zaidi, F.


    Predicting the behavior of solar activity has become very significant due to its influence on Earth and the surrounding environment. Apt predictions of the amplitude and timing of the next solar cycle will aid in estimating the several consequences of space weather. In the past, many prediction procedures have been used, with varying degrees of success, in the field of solar activity forecasting. In this study, solar cycle 24 is forecasted by the Fourier series method, and a comparative analysis is made with the autoregressive integrated moving average method. Taking January 2008 as the minimum preceding solar cycle 24, the amplitude and shape of solar cycle 24 are estimated from monthly sunspot numbers. This forecast framework approximates a mean solar cycle 24, with the maximum appearing during May 2014 (± 8 months) and a maximum sunspot number of 98 ± 10. Solar cycle 24 will end in June 2020 (± 7 months). The difference between the two consecutive peaks of solar cycles 23 and 24 is 165 months (± 6 months). (author)
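
    A Fourier-series fit with extrapolation can be sketched as below. This is a generic illustration on synthetic data with an assumed base period, not the authors' exact procedure; the period, harmonic count, and amplitudes are invented.

```python
import numpy as np

# Synthetic stand-in for a monthly cyclic record (e.g. sunspot numbers).
rng = np.random.default_rng(3)
t = np.arange(240, dtype=float)                 # 240 "months" of history
period = 132.0                                  # assumed ~11-year base cycle
y = 60 + 50 * np.sin(2 * np.pi * t / period) + rng.normal(0, 5, t.size)

def fourier_design(t, period, n_harmonics):
    """Design matrix for a truncated Fourier series with a known base period."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(2 * np.pi * k * t / period),
                 np.sin(2 * np.pi * k * t / period)]
    return np.column_stack(cols)

# Least-squares fit of the harmonic coefficients, then a 24-month forecast.
X = fourier_design(t, period, 3)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
t_future = np.arange(240, 264, dtype=float)
y_hat = fourier_design(t_future, period, 3) @ coef
print(y_hat[:3])
```

    In practice the base period itself must also be estimated (e.g. from a periodogram), which is a major source of uncertainty in cycle forecasts.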

  11. Beyond the Hofmeister Series: Ion-Specific Effects on Proteins and Their Biological Functions

    Czech Academy of Sciences Publication Activity Database

    Okur, H. I.; Hladílková, Jana; Rembert, K. B.; Cho, Y.; Heyda, J.; Dzubiella, J.; Cremer, P. S.; Jungwirth, Pavel


    Roč. 121, č. 9 (2017), s. 1997-2014 ISSN 1520-6106 R&D Projects: GA ČR(CZ) GA16-01074S Institutional support: RVO:61388963 Keywords : Hofmeister series * ions * proteins * molecular dynamics Subject RIV: CF - Physical ; Theoretical Chemistry OBOR OECD: Physical chemistry Impact factor: 3.177, year: 2016

  12. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.


    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.

  13. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang


    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
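
    The coarse-graining step and the RCMSE idea (accumulate template-match counts over all coarse-graining offsets of a scale before taking the logarithm, rather than averaging per-offset entropies) can be sketched as below. The parameters m = 2 and r = 0.15·SD are conventional choices assumed here, and the sample-entropy kernel is a straightforward reference implementation, not the authors' code.

```python
import numpy as np

def coarse_grain(x, scale, offset=0):
    """Average consecutive non-overlapping windows of length `scale`."""
    n = (len(x) - offset) // scale
    return x[offset:offset + n * scale].reshape(n, scale).mean(axis=1)

def match_counts(x, m, r):
    """Counts of template matches of lengths m and m+1 (Chebyshev distance < r)."""
    n = len(x)
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(n - m)])
        c = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += np.sum(d < r)
        return c
    return count(m), count(m + 1)

def rcmse(x, scale, m=2, r_factor=0.15):
    """Refined composite MSE: pool match counts over all offsets, then -log."""
    r = r_factor * np.std(x)          # tolerance fixed from the original series
    a = b = 0
    for k in range(scale):
        y = coarse_grain(x, scale, k)
        cm, cm1 = match_counts(y, m, r)
        a, b = a + cm, b + cm1
    return -np.log(b / a)

rng = np.random.default_rng(4)
white = rng.normal(size=2000)
e1, e5 = rcmse(white, 1), rcmse(white, 5)
print(round(e1, 2), round(e5, 2))
```

    Because the counts are pooled before the logarithm, the ratio stays defined even when an individual offset yields no length-(m+1) matches, which is precisely the failure mode of CMSE that the abstract says RCMSE removes.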

  14. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP


    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  15. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui


    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained
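
    A minimal DFA sketch follows, recovering the expected scaling exponent α ≈ 0.5 for white noise; the window sizes are illustrative choices, not taken from the paper.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: slope of log F(n) vs log n,
    with linear detrending in each non-overlapping window of length n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    logs, logF = [], []
    for n in scales:
        k = len(y) // n
        t = np.arange(n)
        F2 = 0.0
        for seg in y[:k * n].reshape(k, n):
            a, b = np.polyfit(t, seg, 1)     # local linear trend
            F2 += np.mean((seg - (a * t + b)) ** 2)
        logs.append(np.log(n))
        logF.append(0.5 * np.log(F2 / k))    # log of RMS fluctuation
    return np.polyfit(logs, logF, 1)[0]

# White noise has alpha ≈ 0.5; persistent (correlated) noise gives alpha > 0.5.
rng = np.random.default_rng(5)
alpha = dfa_alpha(rng.normal(size=10000), [16, 32, 64, 128, 256])
print(round(alpha, 2))
```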

  16. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley


    Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract)

  17. Data imputation analysis for Cosmic Rays time series (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.


    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since losses result from mechanical and human failures, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple-dataset imputation in order to reconstruct the observational dataset. The study used the monthly time series of the GCR stations Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data relative to the observed ROME series, with 50 replicates, using the CLMX station as a proxy for allocating these scenarios. Three different methods of monthly dataset imputation were selected: Amelia II, which runs a bootstrap expectation-maximization algorithm; MICE, which runs an algorithm for multivariate imputation by chained equations; and MTSDI, an expectation-maximization-based method for imputing missing values in multivariate normal time series. The synthetic time series were evaluated against the observed ROME series using several skill measures, such as RMSE, NRMSE, the agreement index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. Increases in the number of gaps generate loss of quality in the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed the reconstruction of 43 time series.
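
    The proxy-station idea can be illustrated with a much simpler scheme than the three packages compared in the study: regress the target series on a correlated proxy over the observed overlap, then fill the gaps from the regression. All numbers below are synthetic.

```python
import numpy as np

# Synthetic target station linearly related to a proxy station, plus noise.
rng = np.random.default_rng(8)
proxy = rng.normal(100, 10, 500)
target = 0.8 * proxy + 20 + rng.normal(0, 2, 500)
mask = rng.random(500) < 0.3                 # ~30% simulated gaps

# Fit target ~ proxy on the observed overlap, impute the gaps.
obs = ~mask
a, b = np.polyfit(proxy[obs], target[obs], 1)
filled = np.where(mask, a * proxy + b, target)

# Skill of the imputation on the (known) gap values.
rmse = np.sqrt(np.mean((filled[mask] - target[mask]) ** 2))
print(round(rmse, 2))
```

    The residual RMSE is bounded below by the noise not shared between the stations, which is why imputation quality degrades as the gap fraction grows and the regression itself becomes less constrained.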

  18. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta


    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012

  19. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU. (United States)

    Kennedy, Curtis E; Turley, James P


    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9
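
    Step 6 of the listed method (calculating time series features as latent variables) can be sketched as below; the window width and the particular features (mean, spread, slope) are illustrative assumptions, not the study's chosen set.

```python
import numpy as np

def window_features(x, width):
    """Partition x into consecutive non-overlapping windows and return
    per-window summary features: mean, standard deviation, linear slope."""
    t = np.arange(width)
    feats = []
    for start in range(0, len(x) - width + 1, width):
        w = x[start:start + width]
        slope = np.polyfit(t, w, 1)[0]       # within-window trend
        feats.append((w.mean(), w.std(), slope))
    return np.array(feats)

# Example: a steadily rising vital sign yields positive slope features,
# the kind of deterioration signal a multivariable snapshot would miss.
x = np.linspace(80, 120, 60) + np.random.default_rng(6).normal(0, 1, 60)
F = window_features(x, 20)
print(F[:, 2])                               # slope feature per window
```

    Feature rows like these, one per time window, are what a downstream classifier would consume in place of raw waveforms.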

  20. Time Series Analysis of Wheat flour Price Shocks in Pakistan: A Case Analysis


    Asad Raza Abdi; Ali Hassan Halepoto; Aisha Bashir Shah; Faiz M. Shaikh


    The current research investigates the wheat flour Price Shocks in Pakistan: A case analysis. Data was collected by using secondary sources by using Time series Analysis, and data were analyzed by using SPSS-20 version. It was revealed that the price of wheat flour increases from last four decades, and trend of price shocks shows that due to certain market variation and supply and demand shocks also play a positive relationship in price shocks in the wheat prices. It was further revealed th...

  1. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong


    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the latent relationships within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit-section level. Both the long-term and short-term predictions show that the models are effective and can achieve the expected accuracy.
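
    A minimal GM(1,1) sketch follows, using the standard grey-model construction: accumulated generating operation, background values, least-squares fit of the development coefficients, then restoration by differencing. The test series is synthetic; the paper's adaptive improvement and error correction are not reproduced.

```python
import numpy as np

def gm11_forecast(x0, steps):
    """GM(1,1) grey model: fit dx1/dt + a*x1 = b on the accumulated series x1,
    then restore forecasts on the original scale by first differences."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated series
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # development coeffs
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a # time-response function
    return np.diff(np.concatenate([[0], x1_hat]))     # back to original scale

# On smooth quasi-exponential data, GM(1,1) extrapolates closely.
x0 = 2 * 1.05 ** np.arange(8)
pred = gm11_forecast(x0, 2)
print(pred[-2:])
```

    GM(1,1) is attractive for track-irregularity work precisely because it needs only a handful of observations, at the cost of assuming near-exponential growth.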

  2. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.


    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0, and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  3. Minimum entropy density method for the time series analysis (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae


    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor’s 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.

  4. Quantitative analysis of patient-specific dosimetric IMRT verification

    International Nuclear Information System (INIS)

    Budgell, G J; Perrin, B A; Mott, J H L; Fairfoul, J; Mackay, R I


    Patient-specific dosimetric verification methods for IMRT treatments are variable, time-consuming and frequently qualitative, preventing evidence-based reduction in the amount of verification performed. This paper addresses some of these issues by applying a quantitative analysis parameter to the dosimetric verification procedure. Film measurements in different planes were acquired for a series of ten IMRT prostate patients, analysed using the quantitative parameter, and compared to determine the most suitable verification plane. Film and ion chamber verification results for 61 patients were analysed to determine long-term accuracy, reproducibility and stability of the planning and delivery system. The reproducibility of the measurement and analysis system was also studied. The results show that verification results are strongly dependent on the plane chosen, with the coronal plane particularly insensitive to delivery error. Unexpectedly, no correlation could be found between the levels of error in different verification planes. Longer term verification results showed consistent patterns which suggest that the amount of patient-specific verification can be safely reduced, provided proper caution is exercised: an evidence-based model for such reduction is proposed. It is concluded that dose/distance to agreement (e.g., 3%/3 mm) should be used as a criterion of acceptability. Quantitative parameters calculated for a given criterion of acceptability should be adopted in conjunction with displays that show where discrepancies occur. Planning and delivery systems which cannot meet the required standards of accuracy, reproducibility and stability to reduce verification will not be accepted by the radiotherapy community.

  5. Arab drama series content analysis from a transnational Arab identity perspective

    Directory of Open Access Journals (Sweden)

    Joelle Chamieh


    Full Text Available The scientific contribution of deciphering drama series lies in understanding the narratology of distinctive cultures and traditions within the specific contexts of certain societies. This article examines how values function in modeling societies as those values are projected through the transmission of media. The proposed operational model provides an à priori design of common Arab values, assimilated into an innovative grid-analysis code book that enables a systematic and reliable quantitative content analysis. In addition, a more thorough qualitative content analysis has been carried out in narratological terms, with actions evaluated against the grid-analysis code book for a clearer perception of Arab values as depicted in context within the Arab drama milieu. This approach is applied to four Arab drama series, covering the transnational/national and non-divisive/divisive media aspects, with the intention of extracting the transmitted values from a common-identity perspective and thereby revealing Arab audiences' expectations.

  6. Dynamic Factor Analysis of Nonstationary Multivariate Time Series. (United States)

    Molenaar, Peter C. M.; And Others


    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  7. Koopman Operator Framework for Time Series Modeling and Analysis (United States)

    Surana, Amit


    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with these distances, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
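
    The abstract does not spell out the identification procedure itself; a standard data-driven route to Koopman spectral properties is dynamic mode decomposition (DMD), sketched below under that assumption (snapshot matrix, POD projection, eigenvalues of the reduced one-step map):

```python
import numpy as np

def dmd_eigs(snapshots, rank=None):
    """Estimate Koopman (DMD) eigenvalues from a snapshot matrix.

    snapshots: array of shape (n_features, n_times), columns ordered in
    time. Illustrative only; the paper's exact identification procedure
    is not given in the abstract.
    """
    X1, X2 = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if rank is not None:                     # optional POD truncation
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Project the one-step linear map X2 = A @ X1 onto the POD subspace
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    return np.linalg.eigvals(A_tilde)
```

    For data generated by a linear system the recovered eigenvalues match the system's spectrum exactly, which is the invariance property the abstract exploits for model comparison.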

  8. Spatially adaptive mixture modeling for analysis of FMRI time series. (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe


    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework was proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently using first path sampling for a small subset of fields and then using a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  9. Loss-of-coolant accident test series TC-1 experiment operating specifications

    International Nuclear Information System (INIS)

    Yackle, T.R.


    The purpose of this document is to specify the experiment operating procedure for the test series TC-1. The effects of externally mounted cladding thermocouples on the fuel rod thermal behavior during LOCA blowdown and reflood cycles will be investigated in the test. Potential thermocouple effects include: (a) delayed DNB, (b) momentary cladding rewets following DNB, (c) premature cladding rewet during a blowdown two-phase slug period, and (d) early cladding rewet during reflood. The two-phase slug period will be controlled by momentarily opening the hot leg valve. The slug will consist of lower plenum liquid that is sent through the flow shrouds and will be designed to quench the fuel rods at a rate that is similar to the slug experienced early in the LOFT L2-2 and L2-3 tests

  10. Complexity analysis of the turbulent environmental fluid flow time series (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.


    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the degree of randomness in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values of the two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have also explored the sensitivity of the considered measures to the length of the time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965 there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
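
    The Kolmogorov complexity (KL) used above is commonly estimated by Lempel-Ziv (LZ76) parsing of the series binarized about its median; the sketch below uses the Kaspar-Schuster phrase counter with an n/log2(n) normalization (an illustrative convention, not necessarily the authors' exact lower/upper variants):

```python
import numpy as np

def lz76_phrase_count(binary_string):
    """Count the phrases in the Lempel-Ziv (LZ76) parsing of a 0/1 string
    (Kaspar-Schuster algorithm)."""
    s, n = binary_string, len(binary_string)
    c, u, v, i, vmax = 1, 1, 1, 0, 1
    while u + v <= n:
        if s[i + v - 1] == s[u + v - 1]:
            v += 1                       # current substring still reproducible
        else:
            vmax = max(v, vmax)
            i += 1
            if i == u:                   # all starting points tried: new phrase
                c += 1
                u += vmax
                v, i, vmax = 1, 0, 1
            else:
                v = 1
    if v != 1:                           # unfinished last phrase
        c += 1
    return c

def kolmogorov_complexity(series):
    """Normalized LZ complexity: binarize about the median, parse,
    and normalize the phrase count by n / log2(n)."""
    x = np.asarray(series, dtype=float)
    s = ''.join('1' if v > np.median(x) else '0' for v in x)
    n = len(s)
    return lz76_phrase_count(s) * np.log2(n) / n
```

    A strongly periodic record scores near zero while a random one scores near one, which is the contrast behind the reported complexity loss in 1946-1965.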

  11. Evaluating disease management program effectiveness: an introduction to time-series analysis. (United States)

    Linden, Ariel; Adams, John L; Roberts, Nancy


    Currently, the most widely used method in the disease management (DM) industry for evaluating program effectiveness is referred to as the "total population approach." This model is a pretest-posttest design, with the most basic limitation being that without a control group, there may be sources of bias and/or competing extraneous confounding factors that offer a plausible rationale explaining the change from baseline. Furthermore, with the current inclination of DM programs to use financial indicators rather than program-specific utilization indicators as the principal measure of program success, additional biases are introduced that may cloud evaluation results. This paper presents a non-technical introduction to time-series analysis (using disease-specific utilization measures) as an alternative, and more appropriate, approach to evaluating DM program effectiveness than the current total population approach.
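
    A minimal single-group interrupted time-series model of the kind advocated here is a segmented regression with a level change and a slope change at the intervention point; the sketch below is generic (plain OLS, no autocorrelation correction) and is not the paper's own specification:

```python
import numpy as np

def its_segmented_fit(y, t0):
    """Segmented-regression fit for a single-group interrupted time series.

    Model: y_t = b0 + b1*t + b2*step_t + b3*(t - t0)*step_t + e_t,
    where step_t = 1 from the intervention point t0 onward. b2 is the
    immediate level change, b3 the change in slope after intervention.
    """
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta          # b0, b1 (pre-trend), b2 (level shift), b3 (slope change)
```

    The pre-intervention trend (b1) plays the role of the missing control: the counterfactual is its extrapolation, against which b2 and b3 quantify the program effect.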

  12. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze short-range and long-range characteristics of financial time series, and then applies this method to the time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent for the five properties of the four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the Rényi entropy curves and the logarithm-distribution profiles of MFDFA for the five properties of the four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information in the financial time series by comparing the profiles of the five properties across the four stock indices. In this paper, we focus not only on the multifractality of the time series but also on the fluctuation characteristics of the financial time series and the subtle differences among the time series of different properties. We find that financial time series are far more complex than reported in research works that use a single property of the time series.
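
    The Rényi-entropy ingredient of the 3MPAR method can be sketched from a histogram estimate of the series distribution; the bin count below is an illustrative choice, not taken from the paper:

```python
import numpy as np

def renyi_entropy(series, q, bins=20):
    """Rényi entropy of order q from a histogram of the series.

    H_q = log(sum_i p_i**q) / (1 - q); the limit q -> 1 recovers the
    Shannon entropy, handled as a special case below.
    """
    counts, _ = np.histogram(series, bins=bins)
    p = counts[counts > 0] / counts.sum()    # normalized, empty bins dropped
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))        # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)
```

    Sweeping q traces out the entropy curve the method studies: H_q is constant in q only for a uniform distribution, and its decay with q reflects how heavy-tailed or concentrated the fluctuations are.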

  13. Integrating Hypnosis with Other Therapies for Treating Specific Phobias: A Case Series. (United States)

    Hirsch, Joseph A


    There is a high prevalence of anxiety disorders including specific phobias and panic disorder in the United States and Europe. A variety of therapeutic modalities including pharmacotherapy, cognitive behavioral therapy, systematic desensitization, hypnosis, in vivo exposure, and virtual reality exposure therapy have been applied. No one modality has been entirely successful. There has been only a limited attempt to combine psychological therapies in the treatment of specific phobias and panic disorder and what has been done has been primarily with systematic desensitization or cognitive behavioral therapy along with hypnotherapy. I present two cases of multiple specific phobias that were successfully treated with hypnotherapy combined with virtual reality exposure therapy or in vivo exposure therapy. The rationale for this integrative therapy and the neurobiological constructs are considered.

  14. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.


    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers’ Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Also, the Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the Philippine Stock Exchange Composite Index.

  15. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira


    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of certain forest species, chiefly Hevea brasiliensis. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. As with countless other products, forecasting the future prices of natural rubber has been the object of many studies. Univariate time-series forecasting models stand out as the most accurate and useful for reducing uncertainty in the economic decision-making process. This study analyzed the historical series of Brazilian natural rubber prices (R$/kg) over the period Jan/1999-Jun/2006 in order to characterize rubber price behavior in the domestic market, estimated a model for the time series of monthly natural rubber prices, and forecast domestic natural rubber prices for the period Jul/2006-Jun/2007 based on the estimated models. The models studied belong to the ARIMA family. The main results were: the domestic natural rubber market is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best fit to the time series of natural rubber prices (R$/kg); and the forecasts produced for the series were statistically adequate.
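
    As a simplified stand-in for the ARIMA(1,1,1) fit reported above, the sketch below fits an AR(1) model to the first differences (an ARIMA(1,1,0)-style model, omitting the MA term) and integrates the forecasts back to price levels:

```python
import numpy as np

def ar1_diff_forecast(y, steps):
    """Forecast via an AR(1) model on first differences.

    Simplified stand-in for an ARIMA(1,1,1) fit: difference the series
    once, estimate the AR coefficient phi by least squares, iterate the
    recursion forward, and cumulate back to the original level.
    """
    y = np.asarray(y, dtype=float)
    d = np.diff(y)                 # the "I" part: work on first differences
    mu = d.mean()
    z = d - mu
    phi = (z[:-1] @ z[1:]) / (z[:-1] @ z[:-1])   # lag-1 LS estimate
    preds, level, dz = [], y[-1], z[-1]
    for _ in range(steps):
        dz = phi * dz              # AR(1) recursion on the centered differences
        level += mu + dz           # integrate back to the price level
        preds.append(level)
    return np.array(preds)
```

    A full ARIMA(1,1,1) fit (e.g., by maximum likelihood in a statistics package) would add the moving-average term this sketch leaves out.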

  16. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  17. Dynamical analysis and visualization of tornadoes time series. (United States)

    Lopes, António M; Tenreiro Machado, J A


    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
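
    The frequency-domain step described above, approximating the amplitude spectrum by a power law, can be sketched as:

```python
import numpy as np

def spectrum_powerlaw_exponent(x, dt=1.0):
    """Fit |F(f)| ~ C * f**(-beta) to the amplitude spectrum of a series.

    Fourier-transform the series, drop the DC bin, and regress
    log-amplitude on log-frequency. Returns the exponent beta.
    """
    x = np.asarray(x, dtype=float)
    freqs = np.fft.rfftfreq(len(x), d=dt)[1:]     # skip the zero-frequency bin
    amps = np.abs(np.fft.rfft(x))[1:]
    mask = amps > 0                               # log() needs positive amplitudes
    slope, _ = np.polyfit(np.log(freqs[mask]), np.log(amps[mask]), 1)
    return -slope
```

    The fitted exponent is the parameter the paper reads as a signature of long-range memory in the event record.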

  18. Financial time series analysis based on information categorization method (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen


    The paper mainly applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and we report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three regions, showing that the method can distinguish the markets of different regions via the phylogenetic trees. The results show that we can extract satisfactory information from financial markets by this method, which can be used not only for physiologic time series but also for financial time series.

  19. Using time-series intervention analysis to understand U.S. Medicaid expenditures on antidepressant agents. (United States)

    Ferrand, Yann; Kelton, Christina M L; Guo, Jeff J; Levy, Martin S; Yu, Yan


    Medicaid programs' spending on antidepressants increased from $159 million in 1991 to $2 billion in 2005. The National Institute for Health Care Management attributed this expenditure growth to increases in drug utilization, entry of newer higher-priced antidepressants, and greater prescription drug insurance coverage. Rising enrollment in Medicaid has also contributed to this expenditure growth. This research examines the impact of specific events, including branded-drug and generic entry, a black box warning, direct-to-consumer advertising (DTCA), and new indication approval, on Medicaid spending on antidepressants. Using quarterly expenditure data for 1991-2005 from the national Medicaid pharmacy claims database maintained by the Centers for Medicare and Medicaid Services, a time-series autoregressive integrated moving average (ARIMA) intervention analysis was performed on 6 specific antidepressant drugs and on overall antidepressant spending. Twenty-nine potentially relevant interventions and their dates of occurrence were identified from the literature. Each was tested for an impact on the time series. Forecasts from the models were compared with a holdout sample of actual expenditure data. Interventions with significant impacts on Medicaid expenditures included the patent expiration of Prozac® (P < 0.05), implying that the expanding market for antidepressants overwhelmed the effect of generic competition. Copyright © 2011 Elsevier Inc. All rights reserved.

  20. Time-Series Analysis of Supergranule Characteristics at Solar Minimum (United States)

    Williams, Peter E.; Pesnell, W. Dean


    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
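
    The lag-resolved correlations reported above (e.g., size versus velocity with a lag under 12 hours) can be computed with a sketch like the following; the variable names and sampling step are generic, not the paper's:

```python
import numpy as np

def lagged_crosscorr(a, b, max_lag):
    """Pearson correlation of two equal-length series at integer lags.

    Returns (lags, r) where r[k] correlates a[t] with b[t + lags[k]];
    the lag at which |r| peaks estimates the delay between the series.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    lags = np.arange(-max_lag, max_lag + 1)
    r = []
    for lag in lags:
        if lag < 0:
            x, y = a[-lag:], b[:lag]     # b leads a
        elif lag > 0:
            x, y = a[:-lag], b[lag:]     # a leads b
        else:
            x, y = a, b
        r.append(np.corrcoef(x, y)[0, 1])
    return lags, np.array(r)
```

    Applied to the supergranule size and velocity series, the sign of the peak correlation and its lag position give the "moderate, negative correlation with a small time lag" type of statement quoted above.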

  1. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory (United States)

    Wang, Na; Li, Dong; Wang, Qiwen


    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series is further explored in this paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations show that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function differ across series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of the three industry series and the growth rates of the Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and that the degree distributions of the networks associated with the growth rates of the GDP series are scale free. We also discuss the assortativity and disassortativity of the four associated networks as they relate to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes in the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
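
    The natural visibility algorithm used above to convert a series into a graph can be sketched as follows (a direct O(n^2) implementation, adequate for quarterly macroeconomic series):

```python
import numpy as np

def visibility_graph_edges(series):
    """Natural visibility graph of a time series.

    Nodes are time points; (i, j) is an edge when every sample strictly
    between them lies below the straight line joining (i, y_i) and
    (j, y_j), i.e. the two bars "see" each other.
    """
    y = np.asarray(series, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))            # adjacent points always see each other
        for j in range(i + 2, n):
            k = np.arange(i + 1, j)
            line = y[i] + (y[j] - y[i]) * (k - i) / (j - i)
            if np.all(y[k] < line):      # strict visibility criterion
                edges.add((i, j))
    return edges
```

    The degree sequence of the resulting graph is what the paper fits to exponential or power-law (scale-free) forms.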

  2. Anatomy of the ICDS series: A bibliometric analysis

    International Nuclear Information System (INIS)

    Cardona, Manuel; Marx, Werner


    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called 'source journals' covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations as well as the time-dependent impact of these papers, of single conferences, and of the complete series, was established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories

  3. From Safety Analysis to Formal Specification

    DEFF Research Database (Denmark)

    Hansen, Kirsten Mark; Ravn, Anders P.; Stavridou, Victoria


    Software for safety critical systems must deal with the hazards identified by safety analysis. This paper investigates how the results of one safety analysis technique, fault trees, are interpreted as software safety requirements to be used in the program design process. We propose that fault tree analysis and program development use the same system model. This model is formalized in a real-time, interval logic, based on a conventional dynamic systems model with state evolving over time. Fault trees are interpreted as temporal formulas, and it is shown how such formulas can be used for deriving safety...

  4. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.


    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves, a type of valve used in large numbers in PWR plants. Generally, such techniques can detect anomalies in the initial stages of failure, when detection through conventional surveillance of directly measured process parameters is difficult. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulty in applying diagnostic techniques to air-operated control valves seems to stem from the reduced sensitivity of their response compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for the simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)

  5. The Analysis Of Personality Disorder On Two Characters In The Animation Series Black Rock Shooter


    Ramadhana, Rizki Andrian


    The title of this thesis is The Analysis of Personality Disorder on Two Characters in the Animation Series “Black Rock Shooter”, which discusses the personality disorders of two characters from the series, Kagari Izuriha and Yomi Takanashi. The animation series Black Rock Shooter was chosen as the source of data because it has a psychological genre and represents the complexity of human relationships, especially in building friendships. This is because humans are social...

  6. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter


    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v is provided. The results show that the temporal scales of the current climate (1960–2014 are different from the long-term average (1850–1960. For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
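
    The MSE procedure described above, coarse-graining followed by sample entropy at each scale, can be sketched as below. This uses a common simplified SampEn estimator, with the tolerance fixed at r times the standard deviation of the original series, as in Costa-style MSE:

```python
import numpy as np

def sample_entropy(x, m=2, tol=0.2):
    """SampEn(m, tol) with an absolute tolerance, via a simple
    all-pairs template-matching estimator (self-matches excluded).
    Returns inf if no (m+1)-length templates match within tol."""
    x = np.asarray(x, dtype=float)
    def matches(mm):
        t = np.lib.stride_tricks.sliding_window_view(x, mm)
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        return np.sum(d <= tol) - len(t)     # drop self-comparisons
    return -np.log(matches(m + 1) / matches(m))

def multiscale_entropy(x, scales, m=2, r=0.2):
    """Coarse-grain x by non-overlapping means at each scale, then
    compute SampEn with the tolerance fixed from the original series."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)                      # fixed across scales
    ent = []
    for s in scales:
        n = (len(x) // s) * s
        cg = x[:n].reshape(-1, s).mean(axis=1)
        ent.append(sample_entropy(cg, m, tol))
    return ent
```

    For white noise the curve falls with scale, while correlated signals stay flat or rise; a rise at scale factors above 12 months is the kind of change the paper reports for the post-1960 record.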

  7. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki


    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed the Coarse Mesh Projection Method (CMPM) and can be implemented using coarse statistical bins for the acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit from the signal processing discipline with the neutron multiplication eigenvalue problem of the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous-energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three-dimensional modeling of the initial core of a pressurized water reactor.

  8. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    Directory of Open Access Journals (Sweden)

    Dennis A Dean

    Full Text Available We present a novel approach for analyzing biological time-series data using a context-free language (CFL representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  9. Time Series in Education: The Analysis of Daily Attendance in Two High Schools (United States)

    Koopmans, Matthijs


    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  10. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.


    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  11. Time-series analysis of Nigeria rice supply and demand: Error ...

    African Journals Online (AJOL)

    The study examined a time-series analysis of Nigeria rice supply and demand with a view to determining any long-run equilibrium between them using the Error Correction Model approach (ECM). The data used for the study represents the annual series of 1960-2007 (47 years) for rice supply and demand in Nigeria, ...

  12. Taxation in Public Education. Analysis and Bibliography Series, No. 12. (United States)

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…


    Institute of Scientific and Technical Information of China (English)

    RONG Yan-shu; TU Qi-pu


    It is important and necessary to obtain a much longer precipitation series in order to research features of drought/flood and climate change. Based on dryness and wetness grade series of 18 stations in Northern China covering 533 years, from 1470 to 2002, the Moving Cumulative Frequency Method (MCFM) was developed; moving average precipitation series from 1499 to 2002 were reconstructed by testing three kinds of average precipitation, and the features of climate change and dry and wet periods were researched using the reconstructed precipitation series. The results showed a good relationship between the reconstructed precipitation series and the observed precipitation series since 1954, with relative root-mean-square errors below 1.89%. The relation between the reconstructed series and the dryness and wetness grade series was nonlinear, and this nonlinearity implied that the reconstructed series were reliable and could become foundation data for researching the evolution of drought and flood. Analysis of climate change based on the reconstructed precipitation series revealed that, although the drought intensity of the recent dry period, from the mid-1970s until the early 21st century, was not the strongest in the historical climate of Northern China, the intensity and duration of wet periods have decreased and shortened considerably, and the climate of Northern China is evolving toward aridification.

  14. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.


    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g., interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
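    The kind of pre-processing mentioned here (interpolating gaps, aggregating to coarser time steps) can be sketched in a few lines. This is a generic illustration, not the Toolbox's actual code or API:

```python
import numpy as np

def interpolate_gaps(values):
    """Linearly interpolate NaN gaps in a regularly sampled series."""
    values = np.asarray(values, dtype=float)
    idx = np.arange(len(values))
    ok = ~np.isnan(values)
    return np.interp(idx, idx[ok], values[ok])

def aggregate(values, window, func=np.mean):
    """Aggregate a series into non-overlapping windows (e.g. daily -> weekly)."""
    values = np.asarray(values, dtype=float)
    n = len(values) // window
    return func(values[:n * window].reshape(n, window), axis=1)
```

    Gap filling must happen before aggregation, since a single NaN would otherwise propagate into the window statistic.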

  15. Financial time series analysis based on effective phase transfer entropy (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing


    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
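    For readers unfamiliar with the base quantity the authors extend, a minimal histogram-based estimator of ordinary transfer entropy (not the proposed effective phase transfer entropy) might look like this; discretizing into quantile bins is one common choice:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Estimate TE_{X->Y} in bits from quantile-discretized series:
    how much knowing x_t reduces uncertainty about y_{t+1} beyond y_t."""
    x = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    y = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    singles = Counter(y[:-1])                       # y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]             # p(y_{t+1} | y_t, x_t)
        p_self = pairs_yy[(y1, y0)] / singles[y0]   # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_full / p_self)
    return te
```

    On finite samples the estimate carries a small positive bias, which is part of what motivates "effective" variants that subtract a surrogate-based baseline.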

  16. Stock price forecasting based on time series analysis (United States)

    Chi, Wan Le


    Using historical stock price data to set up a sequence model that explains the intrinsic relationships in the data, future stock prices can be forecast. The models used are the autoregressive model, the moving-average model and the autoregressive moving-average (ARMA) model. A unit root test was applied to the original data sequence to judge whether it was stationary. A non-stationary original sequence needed further processing by first-order differencing, after which the stationarity of the differenced sequence was re-inspected; if it was still non-stationary, second-order differencing was carried out. The autocorrelation diagram and partial autocorrelation diagram were used to estimate the parameters of the identified ARMA model, including the coefficients and the model order. Finally, the model was used to forecast the Shanghai Composite Index daily closing price. Results showed that the non-stationary original data series was stationary after the second-order difference. The forecast values of the Shanghai Composite Index daily closing price were close to the actual values, indicating that the ARMA model in the paper achieved a certain accuracy.
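    The differencing and model-fitting loop described here can be illustrated with a bare-bones AR fit by least squares. This is a sketch only: full ARMA estimation, unit-root testing and ACF/PACF-based order selection require more machinery (e.g. a statistics package), and the function names are mine:

```python
import numpy as np

def difference(x, d=1):
    """Apply d-th order differencing to move toward stationarity."""
    for _ in range(d):
        x = np.diff(x)
    return x

def fit_ar(x, p):
    """Least-squares fit of an AR(p) model x_t = c + sum_i a_i * x_{t-i} + e_t.
    Returns [c, a_1, ..., a_p]."""
    rows = [x[i:len(x) - p + i] for i in range(p)]
    X = np.column_stack([np.ones(len(x) - p)] + rows[::-1])  # lag 1 first
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def forecast_one(x, coef):
    """One-step-ahead forecast from the last p observations."""
    p = len(coef) - 1
    return coef[0] + coef[1:] @ x[-1:-p - 1:-1]
```

    In practice the differenced series is modeled, and forecasts are integrated back to the price level by cumulative summation from the last observed closing price.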

  17. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.


    This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do this. The results suggest that output and real electricity prices and a UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and - 0.16 respectively, with an increasing (but at a decreasing rate) UEDT and based on the estimated equation, and different forecast assumptions, it is predicted that Turkish industrial electricity demand will be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.

  18. A unified nonlinear stochastic time series analysis for climate science. (United States)

    Moon, Woosok; Wettlaufer, John S


    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  19. Methodology Series Module 6: Systematic Reviews and Meta-analysis. (United States)

    Setia, Maninder Singh


    Systematic reviews and meta-analyses have become an important part of biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher searches for articles in more than one database. It will also be useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage the readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist.

  20. A novel water quality data analysis framework based on time-series data mining. (United States)

    Deng, Weihui; Wang, Guoyin


    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks on the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It discovered the relationship between water quality in the mainstream and tributaries as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
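    The two-part framework can be caricatured with simple stand-ins: summarizing windows by (mean, std) in place of the paper's two-dimensional normal clouds, then running anomaly detection on the resulting similarity matrix. The function names and summary statistics are my simplifications, not the authors' method:

```python
import numpy as np

def granulate(x, window):
    """Summarize each non-overlapping window by (mean, std) - a crude
    stand-in for cloud-model granulation."""
    n = len(x) // window
    w = np.reshape(np.asarray(x, dtype=float)[:n * window], (n, window))
    return np.column_stack([w.mean(axis=1), w.std(axis=1)])

def similarity_matrix(g):
    """Pairwise Euclidean distances between granules (smaller = more similar)."""
    d = g[:, None, :] - g[None, :, :]
    return np.sqrt((d ** 2).sum(axis=2))

def most_anomalous(g):
    """Index of the granule with the largest average distance to all others."""
    return int(similarity_matrix(g).mean(axis=1).argmax())
```

    Similarity search and pattern discovery then reduce to sorting rows of the same matrix, which is the appeal of computing it once at the granulated level.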

  1. WTF! Taboo Language in TV Series: An Analysis of Professional and Amateur Translation

    Directory of Open Access Journals (Sweden)

    Micòl Beseghi


    Full Text Available This paper focuses on the topic of censorship associated with the use of strong language and swear words in the translation of contemporary American TV series. In AVT, more specifically in Italian dubbing, the practice of censorship, in the form of suppression or toning down of what might be perceived as offensive, disturbing, too explicit or inconvenient, still remains a problematic issue. By focusing on two recent successful TV series - Girls and Orange is the New Black – which are characterized by the use of strong language (swear words, politically incorrect references and the presence of taboo subjects (homosexuality, sex, drugs, violence – this study will consider the different translation choices applied in dubbing and fansubbing. Previous academic studies have underlined the fact that professional translators tend to remove, more or less consciously, the disturbing elements from the source text, while fansubbers try to adhere as much as possible to the original text, not only in terms of linguistic contents but also in terms of register and style. The results of this analysis seem on the one hand to confirm that there is still not a systematic set of rules that govern the translation of strong language in dubbing, and on the other to indicate that the gap between professional and amateur translation is perhaps becoming less pronounced.

  2. Time series analysis of diverse extreme phenomena: universal features (United States)

    Eftaxias, K.; Balasis, G.


    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that the dynamics of earthquakes, epileptic seizures, solar flares, and magnetic storms can be analyzed within similar mathematical frameworks. A central property of the generation of the aforementioned extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it proves an appropriate framework in which to investigate universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of the transition to a significant shock. By monitoring the temporal evolution of the degree of organization in time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from a nonextensive Tsallis formalism, starting from first principles, has recently been introduced. This approach leads to an energy distribution function (Gutenberg-Richter type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, epileptic and earthquake shocks. The above-mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.
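    The Tsallis entropy monitored here is defined by S_q = (1 - sum_i p_i^q) / (q - 1), which recovers the Shannon entropy (in nats) as q -> 1. A direct implementation (illustrative; in practice the probabilities p would come from a histogram of the analyzed signal):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) of a discrete
    distribution p; tends to the Shannon entropy (nats) as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 and 0^q contribute nothing
    if abs(q - 1.0) < 1e-12:
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)
```

    Tracking S_q over sliding windows of the signal is how a "reduction of complexity" before a shock would show up as a drop in the entropy curve.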

  3. Real analysis series, functions of several variables, and applications

    CERN Document Server

    Laczkovich, Miklós


    This book develops the theory of multivariable analysis, building on the single variable foundations established in the companion volume, Real Analysis: Foundations and Functions of One Variable. Together, these volumes form the first English edition of the popular Hungarian original, Valós Analízis I & II, based on courses taught by the authors at Eötvös Loránd University, Hungary, for more than 30 years. Numerous exercises are included throughout, offering ample opportunities to master topics by progressing from routine to difficult problems. Hints or solutions to many of the more challenging exercises make this book ideal for independent study, or further reading. Intended as a sequel to a course in single variable analysis, this book builds upon and expands these ideas into higher dimensions. The modular organization makes this text adaptable for either a semester or year-long introductory course. Topics include: differentiation and integration of functions of several variables; infinite numerica...

  4. Analysis of engineering cycles thermodynamics and fluid mechanics series

    CERN Document Server

    Haywood, R W


    Analysis of Engineering Cycles, Third Edition, deals principally with an analysis of the overall performance, under design conditions, of work-producing power plants and work-absorbing refrigerating and gas-liquefaction plants, most of which are either cyclic or closely related thereto. The book is organized into two parts, dealing first with simple power and refrigerating plants and then moving on to more complex plants. The principal modifications in this Third Edition arise from the updating and expansion of material on nuclear plants and on combined and binary plants. In view of increased

  5. Time series analysis of aerobic bacterial flora during Miso fermentation. (United States)

    Onda, T; Yanagida, F; Tsuji, M; Shinohara, T; Yokotsuka, K


    This article reports a microbiological study of the aerobic mesophilic bacteria present during the fermentation process of Miso. Aerobic bacteria were enumerated and isolated from Miso during fermentation and divided into nine groups using traditional phenotypic tests. The strains were identified by biochemical analysis and 16S rRNA sequence analysis. They were identified as Bacillus subtilis, B. amyloliquefaciens, Kocuria kristinae, Staphylococcus gallinarum and S. kloosii. All strains were sensitive to the bacteriocins produced by the lactic acid bacteria isolated from Miso. The dominant species among the undesirable species throughout the fermentation process were B. subtilis and B. amyloliquefaciens. It is suggested that bacteriocin-producing lactic acid bacteria are effective in preventing the growth of aerobic bacteria in Miso. This study has provided useful information for controlling bacterial flora during Miso fermentation.

  6. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series. (United States)

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin


    The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales via a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
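    Of the techniques compared, detrended fluctuation analysis (DFA) is the easiest to sketch; for 1/f^β noise, the DFA slope α relates to the spectral exponent via β ≈ 2α - 1. This is a minimal first-order DFA, not the paper's code:

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """Detrended fluctuation analysis: slope alpha of log F(n) vs log n,
    where F(n) is the RMS of linearly detrended profile segments of length n.
    White noise gives alpha ~ 0.5; a random walk gives alpha ~ 1.5."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())            # integrate the series
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 10).astype(int))
    flucts = []
    for n in scales:
        t = np.arange(n)
        f = []
        for s in range(len(profile) // n):       # non-overlapping segments
            seg = profile[s * n:(s + 1) * n]
            a, b = np.polyfit(t, seg, 1)         # local linear trend
            f.append(np.mean((seg - (a * t + b)) ** 2))
        flucts.append(np.sqrt(np.mean(f)))
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha
```

    The large estimation variance the authors criticize shows up here in the few long segments available at the largest scales, which makes the log-log slope noisy for short records.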

  7. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.


    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  8. Trend analysis using non-stationary time series clustering based on the finite element method (United States)

    Gorji Sefidmazgi, M.; Sayemuzzaman, M.; Homaifar, A.; Jha, M. K.; Liess, S.


    In order to analyze low-frequency variability of climate, it is useful to model the climatic time series with multiple linear trends and locate the times of significant changes. In this paper, we have used non-stationary time series clustering to find change points in the trends. Clustering in a multi-dimensional non-stationary time series is challenging, since the problem is mathematically ill-posed. Clustering based on the finite element method (FEM) is one of the methods that can analyze multidimensional time series. One important attribute of this method is that it is not dependent on any statistical assumption and does not need local stationarity in the time series. In this paper, it is shown how the FEM-clustering method can be used to locate change points in the trend of temperature time series from in situ observations. This method is applied to the temperature time series of North Carolina (NC) and the results represent region-specific climate variability despite higher frequency harmonics in climatic time series. Next, we investigated the relationship between the climatic indices with the clusters/trends detected based on this clustering method. It appears that the natural variability of climate change in NC during 1950-2009 can be explained mostly by AMO and solar activity.
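    The FEM-based clustering itself is involved, but the underlying goal, locating where a piecewise-linear trend changes, can be illustrated by a brute-force two-segment least-squares search (a simplified stand-in, not the paper's method):

```python
import numpy as np

def best_breakpoint(t, y, margin=10):
    """Return the index k minimizing the total squared error of two
    independent linear fits to (t, y)[:k] and (t, y)[k:] - i.e. the most
    likely single change point in a piecewise-linear trend."""
    def sse(tt, yy):
        coef = np.polyfit(tt, yy, 1)
        return np.sum((yy - np.polyval(coef, tt)) ** 2)
    errors = {k: sse(t[:k], y[:k]) + sse(t[k:], y[k:])
              for k in range(margin, len(y) - margin)}   # keep segments fittable
    return min(errors, key=errors.get)
```

    Multiple change points require repeating the search recursively or solving a global segmentation problem, which is where regularized approaches such as the FEM formulation come in.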

  9. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.


    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return price of gold, West Texas Intermediate and Brent crude oil, foreign exchange rate data, over a period of 18 years. The cross correlation has been measured from the Hurst scaling exponents and the singularity spectrum quantitatively. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possess uncorrelated behavior and the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q0 and for one bivariate series the cross-correlation exponent is greater than GHE for all q values.

  10. A Reception Analysis on the Youth Audiences of TV Series in Marivan

    Directory of Open Access Journals (Sweden)

    Omid Karimi


    Full Text Available The aim of this article is to describe the role of foreign media as agitators of popular culture. Using reception analysis, it describes how youth audiences decode these series. Globalization theory and reception theory in communication form the theoretical framework of the article. The methodology of this research is qualitative; two techniques, in-depth interview and observation, were used for data collection. The results show that different people, based on individual features and social and cultural backgrounds, have an inclination toward particular characters and identify with them. This inclination goes so far that audiences follow the series because of their favorite character. There is also great compatibility between audiences' backgrounds and their receptions. A number of audience members criticized the series and pointed out its negative consequences for society; however, they continue watching, preferring the enjoyment of the series to its risks.

  11. Cerebral venous sinus thrombosis on MRI: A case series analysis

    Directory of Open Access Journals (Sweden)

    Sanjay M Khaladkar


    Full Text Available Background: Cerebral venous sinus thrombosis (CVST) is a rare form of stroke seen in young and middle-aged people, especially women, caused by thrombosis of the dural venous sinuses; it can cause acute neurological deterioration with increased morbidity and mortality if not diagnosed at an early stage. Neurological deficit occurs due to focal or diffuse cerebral edema and venous non-hemorrhagic or hemorrhagic infarct. Aim and Objectives: To evaluate the role of Magnetic Resonance Imaging (MRI) and Magnetic Resonance Venography (MRV) as imaging modalities for early diagnosis of CVST, and to study patterns of venous thrombosis, changes in brain parenchyma and residual effects of CVST using MRI. Materials and Methods: A retrospective descriptive analysis of 40 patients with CVST diagnosed on MRI brain and MRV was done. Results: 29/40 (72.5%) were males and 11/40 (27.5%) were females. Most of the patients were in the age group of 21-40 years (23/40; 57.5%). Most of the patients (16/40; 40%) presented within 7 days. No definite cause of CVST was found in 24 (60%) patients in spite of detailed history. In 36/40 (90%) of cases major sinuses were involved, the deep venous system was involved in 7/40 (17.5%) cases, and a superficial cortical vein was involved in 1/40 (2.5%) cases. Analysis of the stage of thrombus (acute, subacute, chronic) was done based on its appearance on T1- and T2-weighted images. 31/40 (77.5%) patients showed complete absence of flow on MRV, while 9/40 (22.5%) cases showed partial flow on the MR venogram. Brain parenchyma was normal in 20/40 (50%) patients, while 6/40 (15%) cases had non-hemorrhagic infarct and 14/40 (35%) patients presented with hemorrhagic infarct. Conclusion: Our study concluded that MRI brain with MRV is sensitive in diagnosing both direct signs (evidence of thrombus inside the affected veins) and indirect signs (parenchymal changes) of CVST and in their follow-up.

  12. Chaos in Electronic Circuits: Nonlinear Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wheat, Jr., Robert M. [Kennedy Western Univ., Cheyenne, WY (United States)


    Chaos in electronic circuits is a phenomenon that has been largely ignored by engineers, manufacturers, and researchers until the early 1990’s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices becomes more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. Also, a better understanding of chaotic behavior, in a general sense, could potentially lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step in this improvement of understanding, and as part of the research being reported here, methods of computer modeling, identifying and analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques have been implemented using the C and C++ programming languages, and the chaotically behaving circuits developed using “off the shelf” electronic components.

  13. Financing Human Development for Sectorial Growth: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Shobande Abdul Olatunji


    Full Text Available The role that financing human development plays in fostering the sectorial growth of an economy cannot be overstated. It is a key instrument that can be used to alleviate poverty, create employment, and sustain economic growth and development. Financing human development for sectorial growth has therefore taken center stage in the economic growth and development strategies of most countries. To examine the in-depth relationship between the variables in the Nigerian context, this paper provides evidence on the impact of financing human development on sectorial growth in Nigeria between 1982 and 2016, using the Johansen co-integration technique to test for co-integration among the variables and the Vector Error Correction Model (VECM) to ascertain the speed of adjustment of the variables to their long-run equilibrium position. The analysis shows that both long- and short-run relationships exist between financing human capital development and sectorial growth during the period reviewed. The paper therefore argues that, to build a solid foundation for sustainable sectorial growth and development, financing of human capital development across each unit is urgently required through increased budgetary allocations to both the health and educational sectors, since they are key components of human capital development in a nation.

  14. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro


    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the 220Rn → 216Po (T1/2 = 145 ms) decay (Th-series) and the 219Rn → 215Po (T1/2 = 1.78 ms) decay (Ac-series). By using TIA in addition to measurement of 226Ra (U-series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N2 gas. The U- and Th-series together with the Ac-series were determined, respectively, from alpha spectra and TIA carried out immediately after Ra-extraction. Using the 221Fr → 217At (T1/2 = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to 225Ra (Np-decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
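    The delayed-coincidence idea can be illustrated with a short Monte Carlo sketch: for each parent decay, the daughter decays after an exponentially distributed delay, and the fraction of pairs falling inside a fixed gate follows 1 - 2^(-gate/T1/2). The gate width below is an illustrative choice, not a value from the paper.

```python
import numpy as np

# Monte Carlo sketch of time-interval analysis (TIA): each 220Rn decay is
# followed by a 216Po decay after an exponential delay with T1/2 = 145 ms.
rng = np.random.default_rng(42)
t_half = 0.145           # 216Po half-life, s
gate = 0.600             # coincidence gate, s (illustrative)
tau = t_half / np.log(2) # mean life

delays = rng.exponential(tau, size=100_000)
efficiency = np.mean(delays < gate)       # simulated coincidence efficiency
expected = 1 - 2 ** (-gate / t_half)      # analytic value
print(efficiency, expected)
```

    The same calculation with T1/2 = 1.78 ms shows why the Ac-series gate can be far narrower, which is what makes the two series separable by timing alone.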

  15. Route-specific analysis for radioactive materials transportation

    International Nuclear Information System (INIS)


    This report addresses a methodology for route-specific analysis, of which route-selection is one aspect. Identification and mitigation of specific hazards along a chosen route is another important facet of route-specific analysis. Route-selection and route-specific mitigation are two tools for minimizing the risk of radioactive materials transportation and promoting public confidence. Other tools exist to improve the safety of transportation under the Nuclear Waste Policy Act. Selection of a transportation mode and other non-route-specific measures, such as improved driver training and improved cask designs, are additional tools to minimize transportation risk and promote public confidence. This report addresses the route-specific analysis tool and does not attempt to evaluate its relative usefulness as compared to other available tools. It represents a preliminary attempt to develop a route-specific analysis methodology. The Western Interstate Energy Board High-Level Waste Committee has formed a Route-Specific Analysis Task Force which will build upon the methodology proposed in this Staff Report. As western states continue to investigate route-specific analysis issues, it is expected that the methodology will evolve into a more refined product representing the views of a larger group of interested parties in the West.

  16. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)


    This research examines the interdependence in time series wind speed data measured in the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that the wind speed measured at 70 ft was the most variable, and that turbulence persisted longer at the 70-ft measurement than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence to an 'own' shock and led to the greatest persistence in the responses of the other wind series. (author)

  17. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis. (United States)

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio


    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations, and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that parameters can be obtained appropriately as network coefficients according to a backpropagation-through-time-based training algorithm. The network is designed to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  18. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.


    Fractal analyses of indoor 222Rn time series were performed using different chaos-theory-based measures such as the time delay method, Hurst's rescaled range analysis, the capacity (fractal) dimension, and the Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents, which hints at chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)
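    Hurst's rescaled range analysis, one of the measures used above, can be implemented in a few lines. This is a minimal textbook sketch (fixed window sizes, no small-sample bias correction), not the authors' exact procedure.

```python
import numpy as np

# Minimal rescaled-range (R/S) estimate of the Hurst exponent: for each
# window size n, average R/S over non-overlapping windows, then fit the
# slope of log(R/S) against log(n).
def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative deviations from mean
            r = dev.max() - dev.min()       # range of the deviations
            s = w.std()                     # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    return np.polyfit(log_n, log_rs, 1)[0]  # slope = Hurst exponent estimate

rng = np.random.default_rng(0)
h_noise = hurst_rs(rng.normal(size=4096))   # white noise: H near 0.5
print(h_noise)
```

    An estimate well below 0.5 on a measured radon series would indicate the antipersistent behaviour reported in the abstract (though small-sample R/S estimates are biased slightly upward).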

  19. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.


    The program CROSAT computes, directly from two discrete time series, auto- and cross-spectra and transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
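    The quantities CROSAT computes can be reproduced today with scipy's FFT-based estimators; the sketch below (synthetic signals, not CROSAT's code) shows auto-spectrum, cross-spectrum, and coherence for two series sharing a 5 Hz component.

```python
import numpy as np
from scipy import signal

# Two discrete time series sharing a deterministic 5 Hz component plus
# independent noise; Welch estimators give auto-/cross-spectra and coherence.
rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(0, 60, 1 / fs)
common = np.sin(2 * np.pi * 5.0 * t)
x = common + 0.3 * rng.normal(size=t.size)
y = 0.8 * common + 0.3 * rng.normal(size=t.size)

f, pxx = signal.welch(x, fs=fs, nperseg=1024)     # auto-spectrum of x
f, pxy = signal.csd(x, y, fs=fs, nperseg=1024)    # cross-spectrum of x and y
f, coh = signal.coherence(x, y, fs=fs, nperseg=1024)
idx = np.argmin(np.abs(f - 5.0))
print(coh[idx])                                    # near 1 at the shared 5 Hz
```

    The transfer function estimate is simply `pxy / pxx` at each frequency, which is the remaining quantity in CROSAT's output list.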

  20. Analysis of the employee Benefits in Specific Organization


    Procházková, Petra


    The main subject of my Bachelor's Thesis, called "Analysis of Employee Benefits in a Specific Organization", is to analyze the system of employee benefits used in the company RWE Transgas, a.s. in 2010. The theoretical part specifies the basic terms that are important for coping with this issue, in particular the importance, classification, risks, and trends of benefits. In the practical part, an analysis of employee benefits in the specific joint-stock company is made. Part of this analysis is a survey done...

  1. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes (United States)

    Sert, Olcay


    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  2. Multiple Indicator Stationary Time Series Models. (United States)

    Sivo, Stephen A.


    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  3. Statistical analysis of yearly series of maximum daily rainfall in Spain. Análisis estadístico de las series anuales de máximas lluvias diarias en España

    Energy Technology Data Exchange (ETDEWEB)

    Ferrer Polo, J.; Ardiles Lopez, K. L. (CEDEX, Ministerio de Obras Publicas, Transportes y Medio ambiente, Madrid (Spain))


    Work on the statistical modelling of maximum daily rainfalls is presented, with a view to estimating the quantiles for different return periods. An index-flood approach has been adopted in which the local quantiles result from rescaling a regional law using the mean of each series of values, utilized as a local scale factor. The annual maximum series have been taken from 1,545 meteorological stations over a 30-year period, and these have been classified into 26 regions defined according to meteorological criteria, the homogeneity of which has been checked by means of a statistical analysis of the coefficients of variation of the samples. Estimates have been made of the parameters for the following four distribution models: Two-Component Extreme Value (TCEV); General Extreme Value (GEV); Log-Pearson III (LP3); and the SQRT-Exponential Type Distribution of Maximum. The analysis of the quantiles obtained reveals only slight differences in the results, thus detracting from the importance of model selection. The last of the above-mentioned distributions has finally been chosen, on the following basis: it is defined with fewer parameters; it is the only one that was proposed specifically for the analysis of daily rainfall maxima; it yields more conservative results than the traditional Gumbel distribution for high return periods; and it is capable of providing a good description of the main sampling statistics concerning the right-hand tail of the distribution, a fact that has been checked with Monte Carlo simulation techniques. The choice of a distribution model with only two parameters has led to the selection of the regional coefficient of variation as the only determining parameter for the regional quantiles. This has permitted the elimination of the quantile discontinuity of the classical regional approach, by smoothing the values of that coefficient by means of an isoline plan on a national scale.
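    The local-quantile step can be sketched with scipy by fitting one of the study's candidate models (here the GEV) to annual maxima and reading off return-period quantiles. The data are synthetic stand-ins, not the 1,545-station Spanish records.

```python
import numpy as np
from scipy import stats

# Synthetic annual daily-rainfall maxima (mm), drawn from a Gumbel parent,
# then fitted with the more general GEV (one of the study's four candidates).
rng = np.random.default_rng(0)
annual_maxima = stats.gumbel_r.rvs(loc=40, scale=12, size=60, random_state=rng)

shape, loc, scale = stats.genextreme.fit(annual_maxima)
quantiles = []
for T in (10, 100, 500):                          # return periods in years
    q = stats.genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    quantiles.append(q)
    print(T, round(q, 1))
```

    In the index-flood scheme described above, such a fit would be made once per homogeneous region to the rescaled (dimensionless) maxima, and local quantiles recovered by multiplying back by each station's mean.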

  4. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes. (United States)

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun


    Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of both time and sample dimensions. Thus, the analysis of such time series data seeks gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting three-dimensional data, i.e. gene-time-condition. The computational complexity of analyzing such data is very high, compared to the already difficult NP-hard two-dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression patterns in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high-throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools, and only TimesVector successfully detected clusters with differential expression patterns across conditions. The TimesVector software is available at Supplementary data are available at

  5. Time Series Analysis OF SAR Image Fractal Maps: The Somma-Vesuvio Volcanic Complex Case Study (United States)

    Pepe, Antonio; De Luca, Claudio; Di Martino, Gerardo; Iodice, Antonio; Manzo, Mariarosaria; Pepe, Susi; Riccio, Daniele; Ruello, Giuseppe; Sansosti, Eugenio; Zinno, Ivana


    The fractal dimension is a significant geophysical parameter describing natural surfaces, representing the distribution of roughness over different spatial scales; in the case of volcanic structures, it has been related to the specific nature of materials and to the effects of active geodynamic processes. In this work, we present the analysis of the temporal behavior of the fractal dimension estimates generated from multi-pass SAR images relevant to the Somma-Vesuvio volcanic complex (South Italy). To this aim, we consider a Cosmo-SkyMed data-set of 42 stripmap images acquired from ascending orbits between October 2009 and December 2012. Starting from these images, we generate a three-dimensional stack composed of the corresponding fractal maps (ordered according to the acquisition dates), after a proper co-registration. The time-series of the pixel-by-pixel estimated fractal dimension values show that, over invariant natural areas, the fractal dimension values do not reveal significant changes; on the contrary, over urban areas, the estimate correctly assumes values outside the natural-surface fractality range and shows strong fluctuations. As a final result of our analysis, we generate a fractal map that includes only the areas where the fractal dimension is considered reliable and stable (i.e., whose standard deviation computed over the time series is reasonably small). The so-obtained fractal dimension map is then used to identify areas that are homogeneous from a fractal viewpoint. Indeed, the analysis of this map reveals the presence of two distinctive landscape units corresponding to the Mt. Vesuvio and Gran Cono. The comparison with the (simplified) geological map clearly shows the presence in these two areas of volcanic products of different age. The presented fractal dimension map analysis demonstrates the ability to get a figure about the evolution degree of the monitored volcanic edifice and can be profitably extended in the future to other volcanic systems with

  6. What is the fundamental ion-specific series for anions and cations? Ion specificity in standard partial molar volumes of electrolytes and electrostriction in water and non-aqueous solvents. (United States)

    Mazzini, Virginia; Craig, Vincent S J


    The importance of electrolyte solutions cannot be overstated. Beyond the ionic strength of electrolyte solutions, the specific nature of the ions present is vital in controlling a host of properties. Therefore ion specificity is fundamentally important in physical chemistry, engineering and biology. The observation that the strengths of the effects of ions often follow well-established series suggests that a single predictive and quantitative description of specific-ion effects covering a wide range of systems is possible. Such a theory would revolutionise applications of physical chemistry from polymer precipitation to drug design. Current approaches to understanding specific-ion effects involve consideration of the ions themselves, the solvent and relevant interfaces, and the interactions between them. Here we investigate the specific-ion effects trends of standard partial molar volumes and electrostrictive volumes of electrolytes in water and eleven non-aqueous solvents. We choose these measures as they relate to bulk properties at infinite dilution, and are therefore the simplest electrolyte systems. This is done to test the hypothesis that the ions alone exhibit a specific-ion effect series that is independent of the solvent and unrelated to surface properties. The specific-ion effects trends of standard partial molar volumes and normalised electrostrictive volumes examined in this work show a fundamental ion-specific series that is reproduced across the solvents, which is the Hofmeister series for anions and the reverse lyotropic series for cations, supporting the hypothesis. This outcome is important in demonstrating that ion specificity is observed at infinite dilution, and demonstrates that the complexity observed in the manifestation of specific-ion effects in a very wide range of systems is due to perturbations of solvent, surfaces and concentration on the underlying fundamental series. This knowledge will guide a general understanding of specific

  7. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series. (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S


    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity, and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
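    A measure in the spirit described above can be sketched briefly: discretize the trace into amplitude states, estimate the state-transition matrix, and compute the entropy rate. The bin count and normalization below are illustrative choices, not the authors' exact settings.

```python
import numpy as np

# Markov-process entropy of a 1-D trace: assign samples to amplitude bins
# (states), estimate transition probabilities, and return the entropy rate
# sum_i pi_i * H(row_i) in bits per transition.
def markov_entropy(trace, n_states=4):
    trace = np.asarray(trace, dtype=float)
    edges = np.linspace(trace.min(), trace.max() + 1e-12, n_states + 1)
    states = np.clip(np.digitize(trace, edges) - 1, 0, n_states - 1)
    counts = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    probs = np.divide(counts, row_sums, out=np.zeros_like(counts),
                      where=row_sums > 0)
    pi = counts.sum(axis=1) / counts.sum()    # empirical state occupancy
    with np.errstate(divide="ignore", invalid="ignore"):
        logp = np.log2(probs)
    row_h = -np.sum(np.where(probs > 0, probs * logp, 0.0), axis=1)
    return float(np.dot(pi, row_h))

rng = np.random.default_rng(0)
irregular = rng.normal(size=2000)                  # unpredictable trace
smooth = np.sin(np.linspace(0, 20 * np.pi, 2000))  # highly predictable trace
print(markov_entropy(irregular), markov_entropy(smooth))
```

    A higher value for the irregular trace reflects less predictable state transitions, which is the property the measure exploits to separate populations.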

  8. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

    Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method that represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our method and other commonly used calcium analysis methods to a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity, and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  9. Determinants of Egyptian Banking Sector Profitability: Time-Series Analysis from 2004-2014

    Directory of Open Access Journals (Sweden)

    Heba Youssef Hashem


    Full Text Available Purpose - The purpose of this paper is to examine the determinants of banking sector profitability in Egypt, to shed light on the most influential variables that have a significant impact on the performance of this vital sector. Design/methodology/approach - The analysis includes a time series model of quarterly data from 2004 to 2014. The model utilizes cointegration techniques to investigate the long-run relationship between the return on equity, as a proxy for bank profitability, and several bank-specific variables including liquidity, capital adequacy, and the percentage of non-performing loans. In addition, a Vector Error Correction Model (VECM) is utilized to explore the short-run dynamics of the model and the speed of adjustment to the long-run equilibrium. Findings - The main findings of this work show that banking sector profitability is inversely related to capital adequacy, the percentage of loan provisions and the ratio of deposits to total assets. On the other hand, it is positively related to the size of the banking sector, which implies that the banking sector exhibits economies of scale. Research limitations/implications - This work helps reveal the major factors affecting bank performance in the short and long run, and hence provides bank managers and monetary policy makers with beneficial insights on how to enhance bank performance. Since the banking sector represents one of the main engines of financing investment, enhancing the efficiency of this sector would contribute to economic growth and prosperity. Originality/value - The Vector Error Correction Model showed that about 4% of the disequilibrium is corrected each quarter to reach the long-run equilibrium. In addition, all bank-specific variables were found to affect profitability in the long run only. This study would serve as a base that further work on Egyptian banking sector profitability can build on by incorporating more variables in the


    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou


    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover — caused by, for example, forest fire, flood, deforestation, and plant diseases — occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, with most of the present methods the detection results are only labelled with “Change/No change”, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.
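    The forecast-and-flag idea behind steps (2) and (3) can be sketched without BFAST itself (which is an R package): fit a trend-plus-harmonic model to the history, forecast forward, and flag new observations falling outside the prediction interval. The data, model form, and threshold below are illustrative assumptions.

```python
import numpy as np

# Synthetic 10-year monthly "NDVI-like" history: trend + annual harmonic + noise.
rng = np.random.default_rng(0)
t_hist = np.arange(120)
hist = (0.6 + 0.001 * t_hist
        + 0.2 * np.sin(2 * np.pi * t_hist / 12)
        + rng.normal(scale=0.02, size=120))

def design(t):
    # least-squares design matrix: intercept, linear trend, annual harmonic
    return np.column_stack([np.ones_like(t, dtype=float), t,
                            np.sin(2 * np.pi * t / 12),
                            np.cos(2 * np.pi * t / 12)])

coef, *_ = np.linalg.lstsq(design(t_hist), hist, rcond=None)
resid_sd = np.std(hist - design(t_hist) @ coef)

# Six new observations; a flood-like disturbance is injected in the last three.
t_new = np.arange(120, 126)
new = 0.6 + 0.001 * t_new + 0.2 * np.sin(2 * np.pi * t_new / 12)
new[3:] -= 0.3
z = np.abs(new - design(t_new) @ coef) / resid_sd
flags = z > 1.96            # outside the ~95% prediction interval
print(flags)
```

    The z-scores themselves carry the confidence-level information: the further an observation falls outside the interval, the higher the reliability assigned to the detected disturbance.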

  11. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli


    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If one occurs, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change-point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These detection points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to the significant trends of the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those in maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked to the urban heat island effect in addition to the El Niño event.
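    The Pettitt test used above is short enough to implement directly. This is a minimal sketch on synthetic temperature-like data with an injected mean shift, using the standard approximate significance formula.

```python
import numpy as np

# Pettitt change-point test: U_t counts sign differences across each
# candidate split; the change point maximizes |U_t|, and the approximate
# two-sided significance is p = 2*exp(-6*K^2 / (n^3 + n^2)).
def pettitt(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    u = np.array([np.sign(x[t + 1:, None] - x[:t + 1][None, :]).sum()
                  for t in range(n - 1)])
    k = np.abs(u).max()
    change_point = int(np.abs(u).argmax()) + 1   # first index of new regime
    p = 2.0 * np.exp(-6.0 * k ** 2 / (n ** 3 + n ** 2))
    return change_point, min(p, 1.0)

# Synthetic annual mean temperatures with an abrupt +1 degC shift at year 50.
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(26.0, 0.3, 50),
                         rng.normal(27.0, 0.3, 50)])
cp, p = pettitt(series)
print(cp, p)
```

    A small p-value together with a change point near a known climatic event (such as the 1997-98 El Niño) is exactly the evidence pattern the study reports.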

  12. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin (United States)

    zhang, L.


    The copula has become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as a stationary signal and assumed to consist of independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also under question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. Regarding the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
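    The copula starting point (separating dependence from margins) can be sketched briefly: map two dependent series to rank-based pseudo-observations, then use Kendall's tau to fix the parameter of a one-parameter Archimedean copula such as the Gumbel-Hougaard, for which theta = 1/(1 - tau). The data below are synthetic stand-ins, not monthly Cuyahoga records.

```python
import numpy as np
from scipy import stats

# Two dependent series with very different (skewed) margins, driven by a
# common latent factor z.
rng = np.random.default_rng(0)
z = rng.normal(size=500)
flow = np.exp(z + 0.3 * rng.normal(size=500))                       # lognormal margin
rain = stats.gamma.ppf(stats.norm.cdf(z + 0.3 * rng.normal(size=500)), a=2.0)

# Pseudo-observations: ranks rescaled into (0, 1) remove the margins and
# leave only the dependence structure.
u = stats.rankdata(flow) / (len(flow) + 1)
v = stats.rankdata(rain) / (len(rain) + 1)

tau, _ = stats.kendalltau(u, v)
theta = 1.0 / (1.0 - tau)        # Gumbel-Hougaard parameter from tau
print(round(tau, 2), round(theta, 2))
```

    Because Kendall's tau is rank-based, this estimate is unaffected by the skewed margins, which is precisely why copulas suit hydrological variables with differing univariate distributions.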

  13. Ratio analysis specifics of the family dairies' financial statements


    Mitrović Aleksandra; Knežević Snežana; Veličković Milica


    The subject of this paper is the evaluation of the specifics of financial analysis for dairy enterprises, with a focus on the implementation of ratio analysis of financial statements. Ratio analysis is a central part of financial analysis, since it is based on investigating the relationships between logically related items in the financial statements in order to assess the financial position of the observed enterprise and its earning capacity. Speaking about the reporting of financial performanc...

  14. TIPPtool: Compositional Specification and Analysis of Markovian Performance Models

    NARCIS (Netherlands)

    Hermanns, H.; Halbwachs, N.; Peled, D.; Mertsiotakis, V.; Siegle, M.


    In this short paper we briefly describe a tool which is based on a Markovian stochastic process algebra. The tool offers both model specification and quantitative model analysis in a compositional fashion, wrapped in a user-friendly graphical front-end.

  15. Sensitivity and specificity of coherence and phase synchronization analysis

    International Nuclear Information System (INIS)

    Winterhalder, Matthias; Schelter, Bjoern; Kurths, Juergen; Schulze-Bonhage, Andreas; Timmer, Jens


    In this Letter, we show that coherence and phase synchronization analysis are sensitive but not specific in detecting the correct class of underlying dynamics. We propose procedures to increase specificity and demonstrate the power of the approach by application to paradigmatic dynamic model systems.

  16. Appendix S-NH-1 and S-NH-2 of the experiment operating specification for the semiscale MOD-2C small break LOCA without HPI experiment series

    International Nuclear Information System (INIS)

    Owca, W.A.


    This document is Appendix S-NH-1 and S-NH-2 of the Experiment Operating Specification (EOS) for the Small Break LOCA without high pressure injection (HPI) experiment series. It contains detailed information on the S-NH-1 and S-NH-2 experiment operation and facility configuration necessary to meet the series objectives stated in the main EOS body. 14 refs., 17 figs.

  17. Plant specific PTS analysis of Kori Unit 1

    Energy Technology Data Exchange (ETDEWEB)

    Sung-Yull, Hong; Changheui, Jang; Ill-Seok, Jeong [Korea Electric Power Research Inst., Daejon (Korea, Republic of)]; Tae-Eun, Jin [Korea Power Engineering Company, Yongin (Korea, Republic of)]


    Currently, a nuclear PLIM (Plant Lifetime Management) program is underway in Korea to extend the operating life of Kori-1, which was originally licensed for 30 years. For the life extension of nuclear power plants, the residual lives of major components should be evaluated for the extended operation period. In the residual life evaluation of the reactor pressure vessel (RPV), which was classified as one of the major components crucial to life extension, it was found by screening analysis that the reference PTS temperature would exceed the screening criteria before the target extended operation years. In order to deal with this problem, a plant-specific PTS analysis for the Kori-1 RPV has been initiated. In this paper, the relationship between the PTS analysis and the Kori-1 PLIM program is briefly described. The plant-specific PTS analysis covers system transient analysis, downcomer mixing analysis, and probabilistic fracture mechanics analysis to check the integrity of the RPV during various PTS transients. The step-by-step procedure of the analysis will be described in detail. Finally, various issues regarding RPV materials and their integrity will be briefly mentioned, and their implications for the Kori-1 PTS analysis will be discussed. Despite the concern raised by the screening analysis result, it is now expected that the Kori-1 PTS issues can be handled through the plant-specific PTS analysis. (author). 14 refs, 4 figs, 2 tabs.

  18. Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature. (United States)

    Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav


    Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time-series analysis of the measured temperature data. Unlike previous works, which have concentrated on suppressing the transmission of some data samples by time-series analysis while still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake-up rate) and examines the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
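The evaluation idea (subsample, reconstruct intermediate states by interpolation, score by RMSE against the full-rate signal) can be sketched in a few lines of numpy. The synthetic diurnal signal and its noise level are assumptions standing in for the empirical sensor datasets used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "temperature" sampled every minute for 7 days: a diurnal
# sine plus small sensor noise (illustrative stand-in for real data).
t = np.arange(7 * 24 * 60)                        # minutes
truth = 20 + 5 * np.sin(2 * np.pi * t / (24 * 60))
obs = truth + rng.normal(0, 0.05, t.size)

# Reduce the sampling rate to one reading per hour, then reconstruct
# the intermediate states by linear interpolation.
keep = t[::60]
reconstructed = np.interp(t, keep, obs[::60])

rmse = np.sqrt(np.mean((reconstructed - truth) ** 2))
print(rmse)  # well below the 0.2 deg C threshold for this smooth signal
```

Real temperature traces are less smooth than a pure sinusoid, which is why the paper evaluates the trade-off on empirical datasets rather than on idealized signals like this one.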

  19. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics. (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina


    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.
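The base indicators combined by such ensembles are typically entropy- or fractality-type complexity measures computed on short physiological series. As a concrete illustration of one conventional base measure (not the authors' ensemble method), here is a minimal sample entropy implementation; the test signals and the parameters m = 2 and tolerance 0.2·SD are common defaults, not values taken from the paper:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -ln(A/B), where B counts template matches of
    length m and A counts matches of length m+1 (Chebyshev distance <= r)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b)

rng = np.random.default_rng(2)
regular = np.sin(np.linspace(0, 60 * np.pi, 3000))   # highly predictable
irregular = rng.normal(size=3000)                     # unpredictable
print(sample_entropy(regular) < sample_entropy(irregular))  # True
```

Lower values indicate a more regular, predictable signal; ensemble approaches like the one in the abstract combine many such measures because no single one is robust on short, noisy recordings.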

  20. Harmonic Analysis of a Nonstationary Series of Temperature Paleoreconstruction for the Central Part of Greenland

    Directory of Open Access Journals (Sweden)

    T.E. Danova


    Full Text Available The results of the investigation of a transformed series of reconstructed air temperature data for the central part of Greenland with an increment of 30 years are presented. Stationarization of the ~50,000-year series of reconstructed air temperature in the central part of Greenland, based on ice core data, has been performed using the mathematical expectation. To estimate the mathematical expectation, smoothing by the methods of moving average and wavelet analysis has been carried out. The Fourier transform has been applied repeatedly to the stationarized series while changing the averaging time used in the smoothing. Three averaging times were selected for the investigation: ~400–500 years, ~2,000 years, and ~4,000 years. Stationarization of the reconstructed temperature series with the help of the wavelet transform showed the best results for averaging times of ~400 and ~2,000 years: the trends characterize the initial temperature series well, thereby revealing the main patterns of its dynamics. The averaging time of ~4,000 years showed the worst result: significant events of the main temperature series were lost in the averaging. The obtained results correspond well to the cycling known to be inherent to the climatic system of the planet; the detected modes of 1,470 ± 500 years are comparable to the Dansgaard–Oeschger and Bond oscillations.
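The stationarize-then-transform workflow can be sketched as follows. The synthetic series, window length, and oscillation period (chosen near the 1,470-year mode) are illustrative assumptions, and a simple moving average stands in for the paper's moving-average/wavelet smoothing:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-in for the reconstructed temperature series:
# a slow trend plus a ~1470-step oscillation plus noise.
n = 20_000
t = np.arange(n)
series = 0.0001 * t + 2.0 * np.sin(2 * np.pi * t / 1470) + rng.normal(0, 1, n)

# Stationarize by subtracting a moving-average estimate of the
# mathematical expectation (the window plays the role of the
# "averaging time" discussed in the abstract).
window = 2001
kernel = np.ones(window) / window
trend = np.convolve(series, kernel, mode="same")
stationary = series - trend

# Fourier transform of the stationarized series; the dominant
# spectral peak should sit near the 1470-step cycle.
spectrum = np.abs(np.fft.rfft(stationary - stationary.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)
dominant_period = 1.0 / freqs[np.argmax(spectrum[1:]) + 1]
print(round(dominant_period))
```

As the abstract notes, the choice of averaging time matters: a window much longer than the oscillation of interest preserves it in the stationarized residual, while a window comparable to it would smooth the cycle away along with the trend.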

  1. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.


    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same direction. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30-min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both the global and local versions of this method. In this study, global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features, with the rising-trend exponent α+ differing from the falling-trend exponent; this asymmetry is not observed in the physiological data after shuffling or in a group of symmetric synthetic time series.
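For orientation, here is a minimal sketch of plain (symmetric) DFA, the method that the asymmetric variant extends by computing separate exponents over windows with rising and falling local trends. The white-noise input and scale list are illustrative, not the paper's RR-interval data:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Plain (symmetric) DFA; the asymmetric variant in the paper
    additionally splits windows by the sign of the local trend."""
    profile = np.cumsum(x - np.mean(x))
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        rms = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, 1)           # local linear trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # Scaling exponent = slope of log F(s) versus log s.
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(7)
white = rng.normal(size=30_000)
alpha = dfa_exponent(white, scales=[16, 32, 64, 128, 256])
print(round(alpha, 2))   # close to 0.5 for uncorrelated noise
```

The asymmetric extension would fit the same local trends, classify each window as rising or falling by the trend's sign, and pool the fluctuation functions separately to obtain α+ and α−.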

  2. Statistical attribution analysis of the nonstationarity of the annual runoff series of the Weihe River. (United States)

    Xiong, Lihua; Jiang, Cong; Du, Tao


    Time-varying moments models based on Pearson Type III and normal distributions respectively are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in leading to the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.

  3. High Specificity of Quantitative Methylation-Specific PCR Analysis for MGMT Promoter Hypermethylation Detection in Gliomas

    Directory of Open Access Journals (Sweden)

    Paola Parrella


    Full Text Available Normal brain tissue from 28 individuals and 50 glioma samples were analyzed by real-time Quantitative Methylation-Specific PCR (QMSP). Data from this analysis were compared with results obtained on the same samples by MSP. QMSP analysis demonstrated a statistically significant difference in both methylation level (P = .000009, Mann–Whitney test) and frequency (P = .0000007, Z-test) in tumour samples as compared with normal brain tissues. Although QMSP and MSP showed similar sensitivity, the specificity of QMSP analysis was significantly higher (93%; 95% CI: 84%–100%) as compared with MSP (64%; 95% CI: 46%–82%). Our results suggest that QMSP analysis may represent a powerful tool to identify glioma patients that will benefit from alkylating agent chemotherapy.
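The reported specificity and its interval are consistent with a simple binomial proportion plus a normal-approximation (Wald) confidence interval. The count of 26 of 28 correctly classified normal samples below is an assumption chosen to match the reported 93%; the paper's exact CI method is not stated, so the interval here is only approximately reproduced:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Binomial proportion with a normal-approximation 95% CI,
    clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# Assumed counts: 26 of the 28 normal brain samples unmethylated by QMSP.
spec, lo, hi = wald_ci(26, 28)
print(f"specificity = {spec:.0%}, 95% CI: {lo:.0%}-{hi:.0%}")
```

With small n and proportions near 1, the Wald interval hits the upper bound (hence the 100% endpoint); score-based intervals such as Wilson's behave better in that regime.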

  4. Specific classification of financial analysis of enterprise activity

    Directory of Open Access Journals (Sweden)

    Synkevych Nadiia I.


    Full Text Available Despite the wide variety of classifications of types of financial analysis of enterprise activity found in modern scientific literature, which differ in their approach to classification and in the number and content of classification features, a comprehensive comparison and analysis of the existing classifications has not been done. This explains the urgency of this study. The article examines the classifications of types of financial analysis proposed by various scientists and presents its own approach to this problem. Based on the results of this analysis, the article refines and builds up a specific classification of financial analysis of enterprise activity and offers a classification by the following features: objects, subjects, goals of study, automation level, time period of the analytical base, scope of study, organisation system, classification features of the subject, spatial belonging, sufficiency, information sources, periodicity, criterial base, method of data selection for analysis, and time direction. All types of financial analysis differ significantly in their inherent properties and parameters depending on the goals of the financial analysis. The developed specific classification enables subjects of financial analysis of enterprise activity to identify the specific type of financial analysis that correctly meets the set goals.

  5. On statistical inference in time series analysis of the evolution of road safety. (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora


    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
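The core point, that serial dependency makes naive OLS standard errors misleading, can be demonstrated with a small Monte Carlo: regress AR(1) disturbances (with no true trend) on time, and compare the actual spread of the estimated slopes with the standard error OLS reports. All parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
n, phi, n_sims = 100, 0.8, 2000
t = np.arange(n, dtype=float)
X = np.column_stack([np.ones(n), t])

slopes, reported_se = [], []
for _ in range(n_sims):
    # Serially dependent disturbances (AR(1)), no true trend.
    e = np.zeros(n)
    eps = rng.normal(size=n)
    for i in range(1, n):
        e[i] = phi * e[i - 1] + eps[i]
    beta, *_ = np.linalg.lstsq(X, e, rcond=None)
    resid = e - X @ beta
    sigma2 = resid @ resid / (n - 2)               # naive OLS variance
    cov = sigma2 * np.linalg.inv(X.T @ X)
    slopes.append(beta[1])
    reported_se.append(np.sqrt(cov[1, 1]))

# Ratio > 1 means OLS understates the true sampling uncertainty.
print(np.std(slopes) / np.mean(reported_se))
```

With positively autocorrelated disturbances the ratio is well above 1, i.e. a naive trend test would find spurious "significant" trends far too often, which is exactly the failure mode the ARMA-type and state space approaches in the paper are designed to avoid.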

  6. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz


    For the first time, we introduced probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, to estimate and remove the Common Mode Error (CME) without interpolation of missing values. We used data from International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series, and then the CME was estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of resolving the problem of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which compared to the variances of the other PCs (less than 8%) means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, which is reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of the stations' velocities estimated for filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of the influence of environmental mass loading on the filtering results. Subtraction of the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
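For complete series, the CME-removal step reduces to classical PCA: estimate the leading principal component across stations and subtract it. The sketch below shows that complete-data baseline (the paper's pPCA extends it to series with missing epochs, which plain SVD cannot handle); the station count, epoch count, and noise levels are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Residual position series for 25 stations over 1000 epochs: a shared
# Common Mode Error plus station-specific noise (illustrative).
n_epochs, n_stations = 1000, 25
cme = rng.normal(0, 2.0, n_epochs)                    # common signal
resid = cme[:, None] + rng.normal(0, 1.0, (n_epochs, n_stations))

# Classical (complete-data) PCA via SVD.
centered = resid - resid.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
var_explained = S[0] ** 2 / np.sum(S ** 2)

# Reconstruct and subtract the first PC (the CME estimate).
cme_estimate = np.outer(U[:, 0] * S[0], Vt[0])
filtered = centered - cme_estimate
print(round(var_explained, 2))
```

Because the simulated common signal dominates, the first PC carries most of the variance and subtracting it sharply reduces the residual scatter, mirroring the role the first PC plays in the real GNSS residuals described above.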

  7. Ratio analysis specifics of the family dairies' financial statements

    Directory of Open Access Journals (Sweden)

    Mitrović Aleksandra


    Full Text Available The subject of this paper is the evaluation of the financial analysis specifics of dairy enterprises, with a focus on the implementation of ratio analysis of financial statements. Ratio analysis is a central part of financial analysis, since it is based on investigating the relationships between logically related items in the financial statements to assess the financial position of the observed enterprise and its earning capacity. Starting from the reporting of financial performance in family dairies, the basis is created for presenting the techniques of financial analysis, with special attention to the specifics of their application in agricultural enterprises, focusing on companies engaged in dairying. The paper applies ratio analysis to the example of a dairy enterprise, i.e. a family dairy operating in Serbia. The ratio indicators form the basis for identifying relationships; by comparing actual performance with certain business standards, differences or variations are identified.

  8. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis. (United States)

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio


    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  9. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.


    Various statistical techniques were used on five-year data from 1998-2002 of average humidity, rainfall, and maximum and minimum temperatures, respectively. The relationships for regression analysis time series (RATS) were developed for determining the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit to our polynomial regression analysis time series (PRATS). Multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) models were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year data of rainfall and humidity, respectively, which showed that the variances in the rainfall data were not homogeneous, while in the case of humidity they were homogeneous. Our results on regression and regression analysis time series show the best fit to prediction modeling on the climatic data of Quetta, Pakistan. (author)
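A polynomial-regression-with-R² step of the kind described (PRATS) can be sketched in a few lines. The synthetic five-year monthly "humidity" series and its quadratic trend below are assumptions, not the Quetta data:

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative monthly series (60 points, i.e. five years as in the
# study) with a quadratic trend plus noise.
months = np.arange(60, dtype=float)
humidity = 40 + 0.3 * months - 0.004 * months ** 2 + rng.normal(0, 1.5, 60)

# Polynomial regression fit (degree-2 PRATS-style model).
coeffs = np.polyfit(months, humidity, deg=2)
fitted = np.polyval(coeffs, months)

# Coefficient of determination as the goodness-of-fit measure.
ss_res = np.sum((humidity - fitted) ** 2)
ss_tot = np.sum((humidity - humidity.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(r_squared, 2))
```

The residuals `humidity - fitted` are what the heteroscedasticity checks mentioned in the abstract (Goldfeld-Quandt, Breusch-Pagan) would then be applied to.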

  10. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)


    Several lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic…

  11. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis (United States)

    Velicer, Wayne F.; Colby, Suzanne M.


    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…

  12. Using trajectory sensitivity analysis to find suitable locations of series compensators for improving rotor angle stability

    DEFF Research Database (Denmark)

    Nasri, Amin; Eriksson, Robert; Ghandhar, Mehrdad


    This paper proposes an approach based on trajectory sensitivity analysis (TSA) to find the most suitable placement of series compensators in the power system. The main objective is to maximize the benefit of these devices in order to enhance rotor angle stability. This approach is formulated...

  13. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene


    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
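As a minimal sketch of the detrended cross-correlation (DCCA) coefficient that DPXA generalizes: DPXA would additionally regress out the common-force series before detrending, a step omitted here. The two synthetic series sharing a common driver are illustrative:

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient at scale s (disjoint
    windows, linear detrending). DPXA additionally regresses out a
    third 'common force' series before this step."""
    px, py = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
    n_win = len(px) // s
    t = np.arange(s)
    f_xy = f_xx = f_yy = 0.0
    for i in range(n_win):
        sx = px[i * s:(i + 1) * s]
        sy = py[i * s:(i + 1) * s]
        rx = sx - np.polyval(np.polyfit(t, sx, 1), t)
        ry = sy - np.polyval(np.polyfit(t, sy, 1), t)
        f_xy += np.mean(rx * ry)
        f_xx += np.mean(rx * rx)
        f_yy += np.mean(ry * ry)
    return f_xy / np.sqrt(f_xx * f_yy)

rng = np.random.default_rng(9)
common = rng.normal(size=20_000)          # shared external force
x = common + 0.5 * rng.normal(size=20_000)
y = common + 0.5 * rng.normal(size=20_000)
rho = dcca_coefficient(x, y, s=100)
print(round(rho, 2))
```

Here the strong DCCA coefficient is driven entirely by the shared force; the DPXA step described in the abstract would partial that force out and reveal that the intrinsic cross-correlation between x and y is near zero.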

  14. Operation States Analysis of the Series-Parallel resonant Converter Working Above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko


    Full Text Available An operation states analysis of a series-parallel converter working above resonance frequency is described in the paper. Principal equations are derived for the individual operation states, and on the basis of these, diagrams are constructed. The diagrams give a complete picture of the converter's behaviour for the individual circuit parameters. The waveforms may be used in designing the individual parts of the inverter.

  15. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice. (United States)

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler


    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  16. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.


    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  17. Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions (United States)

    Barry Tyler. Wilson


    This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...

  18. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States. (United States)

    South, Scott J.


    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  19. Operation Analysis of the Series-Parallel Resonant Converter Working above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko


    Full Text Available The present article deals with a theoretical analysis of the operation of a series-parallel converter working above resonance frequency. Principal equations are derived for the individual operation intervals, and based on these, waveforms of the individual quantities are constructed for both inverter operation at load and no-load operation. The waveforms may be used in designing the individual parts of the inverter.

  20. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.


    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. The biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of the target individual compounds (>50 μgC). Yields of the target compounds were high for n-alkanes from C14 to C30 and approximately 80% for higher-molecular-weight compounds above C30. Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on the origins and carbon cycling in the marine system. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed the possibility of a useful chronology tool for estimating the age of sediments using organic matter in paleoceanographic studies, in areas where amounts of planktonic foraminifera sufficient for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  1. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen


    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554.
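Of the methods listed, the visibility graph is the easiest to sketch from first principles. The snippet below is a conceptual pure-numpy construction of the natural visibility graph, not pyunicorn's API or its optimized implementation:

```python
import numpy as np

def natural_visibility_graph(x):
    """O(n^2) construction of the natural visibility graph: two
    samples are linked if the straight line between them stays
    strictly above every intermediate sample."""
    n = len(x)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
            if np.all(x[k] < line):   # vacuously true for neighbours
                edges.add((i, j))
    return edges

series = np.array([1.0, 0.5, 2.0, 0.3, 0.8])
print(sorted(natural_visibility_graph(series)))
# -> [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]
```

The resulting graph inherits properties of the series (e.g., local maxima such as the 2.0 here become hubs), which is what makes network measures on visibility graphs useful proxies for time series structure.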

  2. Independent component analysis: A new possibility for analysing series of electron energy loss spectra

    International Nuclear Information System (INIS)

    Bonnet, Noël; Nuzillard, Danielle


    A complementary approach is proposed for analysing series of electron energy-loss spectra that can be recorded with the spectrum-line technique, across an interface for instance. This approach, called blind source separation (BSS) or independent component analysis (ICA), complements two existing methods: the spatial difference approach and multivariate statistical analysis. The principle of the technique is presented and illustrations are given through one simulated example and one real example

  3. A comparative analysis of spectral exponent estimation techniques for 1/fβ processes with applications to the analysis of stride interval time series (United States)

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin


    Background: The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales, described by a power law in the frequency spectrum S(f) ∝ 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
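
Detrended fluctuation analysis itself is short enough to sketch (an illustrative NumPy implementation under common conventions, not the paper's code):

```python
import numpy as np

def dfa_exponent(x, box_sizes):
    """DFA: integrate the centered signal, detrend it linearly in boxes
    of size n, and return the slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    log_n, log_f = [], []
    for n in box_sizes:
        t = np.arange(n)
        sq = []
        for b in range(len(y) // n):
            seg = y[b * n:(b + 1) * n]
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        log_n.append(np.log(n))
        log_f.append(0.5 * np.log(np.mean(sq)))
    return np.polyfit(log_n, log_f, 1)[0]      # scaling exponent alpha

rng = np.random.default_rng(1)
sizes = [8, 16, 32, 64, 128, 256]
alpha_white = dfa_exponent(rng.standard_normal(8192), sizes)              # ~0.5
alpha_brown = dfa_exponent(np.cumsum(rng.standard_normal(8192)), sizes)   # ~1.5
```

White noise should give α ≈ 0.5 and integrated (Brownian) noise α ≈ 1.5; for this class of signals, β in S(f) ∝ 1/f^β relates to α via β = 2α − 1.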

  4. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem that many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing-value imputation in time series is proposed. The main novelty of this research is the application of the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
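
The ordinary (L2) SSA reconstruction underlying such imputation schemes can be sketched with NumPy; missing points are filled iteratively with a low-rank reconstruction (a simplified illustration, not the paper's L1-SSA algorithm):

```python
import numpy as np

def ssa_reconstruct(x, L, rank):
    """Rank-r SSA: embed into an L x K trajectory matrix, truncate its
    SVD, and diagonal-average back to a series of the original length."""
    N, K = len(x), len(x) - L + 1
    traj = np.column_stack([x[i:i + L] for i in range(K)])
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    rec, cnt = np.zeros(N), np.zeros(N)
    for j in range(K):                          # diagonal averaging
        rec[j:j + L] += approx[:, j]
        cnt[j:j + L] += 1
    return rec / cnt

# Iterative imputation: start gaps at zero, reconstruct, refill, repeat
true = np.sin(2 * np.pi * np.arange(200) / 25)
missing = np.array([40, 41, 90, 141])
filled = true.copy()
filled[missing] = 0.0
for _ in range(30):
    filled[missing] = ssa_reconstruct(filled, L=50, rank=2)[missing]
```

On this clean sinusoid a rank-2 reconstruction recovers the gapped values almost exactly; the L1 variant replaces the SVD-based fit with one that is robust to outliers.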

  5. Analysis of Land Subsidence Monitoring in Mining Area with Time-Series InSAR Technology (United States)

    Sun, N.; Wang, Y. J.


    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages, such as high accuracy, wide coverage, low cost, dense monitoring points, and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods by time-series InSAR technology. By analyzing the deformation range, rate, and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can monitor land subsidence over large areas and meet the demand for subsidence monitoring in mining areas.

  6. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.


    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code, developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a thorough analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are performed efficiently and results are clearly displayed.

  7. A study and meta-analysis of lay attributions of cures for overcoming specific psychological problems. (United States)

    Furnham, A; Hayward, R


    Lay beliefs about the importance of 24 different contributors to overcoming 4 disorders that constitute primarily cognitive deficits were studied. A meta-analysis of previous programmatic studies in the area was performed so that 22 different psychological problems could be compared. In the present study, 107 participants completed a questionnaire indicating how effective 24 factors were in overcoming 4 specific problems: dyslexia, fear of flying, amnesia, and learning difficulties. Factor analysis revealed almost identical clusters (inner control, social consequences, understanding, receiving help, and fate) for each problem. The perceived relevance of those factors differed significantly between problems. Some individual difference factors (sex and religion) were found to predict certain factor attributions for specific disorders. A meta-analysis of the 5 studies in this series yielded a 6-factor structure comparable to those of the individual studies and provided results indicating the benefits and limitations of this kind of investigation. The clinical relevance of studying attributions for cure is considered.

  8. Sensitivity analysis of machine-learning models of hydrologic time series (United States)

    O'Reilly, A. M.


    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing the forcing time series and computing the change in the response time series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
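
The perturbation idea is model-agnostic and can be sketched for any black-box forcing-response model (the toy model below stands in for the study's MWA-ANN models; all names are illustrative):

```python
import numpy as np

def sensitivity(model, forcing, eps=1e-4):
    """Finite-difference sensitivity time series: change in response at
    step k per unit perturbation of the forcing at step k."""
    base = model(forcing)
    out = np.empty_like(forcing)
    for k in range(len(forcing)):
        bumped = forcing.copy()
        bumped[k] += eps
        out[k] = (model(bumped)[k] - base[k]) / eps
    return out

def toy_model(rain):
    """Toy black box: saturating response to a 5-step moving average."""
    return np.tanh(np.convolve(rain, np.ones(5) / 5, mode="same"))

rain = np.abs(np.random.default_rng(3).standard_normal(50))
s = sensitivity(toy_model, rain)
```

Because the toy response saturates, the computed sensitivities vary with system state, mirroring the state-dependent (e.g., drought-period) effects described in the abstract.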

  9. Social network analysis of character interaction in the Stargate and Star Trek television series (United States)

    Tan, Melody Shi Ai; Ujum, Ephrance Abu; Ratnavelu, Kuru

    This paper undertakes a social network analysis of two science fiction television series, Stargate and Star Trek. Television series convey stories in the form of character interaction, which can be represented as “character networks”. We connect each pair of characters that exchanged spoken dialogue in any given scene demarcated in the television series transcripts. These networks are then used to characterize the overall structure and topology of each series. We find that the character networks of both series have similar structure and topology to that found in previous work on mythological and fictional networks. The character networks exhibit small-world effects, but we found no significant support for a power-law degree distribution. Since the progression of an episode depends to a large extent on the interaction between each of its characters, the underlying network structure tells us something about the complexity of that episode’s storyline. We assessed the complexity using techniques from spectral graph theory. We found that the episode networks are structured either as (1) closed networks, (2) those containing bottlenecks that connect otherwise disconnected clusters or (3) a mixture of both.
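
Building such a character network from scene-level co-occurrence takes only a few lines (a sketch with made-up scene data, not the authors' transcripts):

```python
from collections import defaultdict
from itertools import combinations

def character_network(scenes):
    """Link every pair of characters appearing in the same scene;
    edge weights count the number of shared scenes."""
    weight = defaultdict(int)
    for scene in scenes:
        for a, b in combinations(sorted(set(scene)), 2):
            weight[(a, b)] += 1
    return dict(weight)

def degrees(edges):
    """Number of distinct interaction partners per character."""
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return dict(deg)

scenes = [["O'Neill", "Carter"],
          ["Carter", "Jackson", "Teal'c"],
          ["O'Neill", "Carter", "Jackson"]]
net = character_network(scenes)
```

On full transcripts, the resulting weighted graph is the object on which the small-world and spectral analyses operate.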

  10. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004) (United States)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.


    We present here an implementation of a least-squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method seems to represent a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination, followed by the least-squares iterative regression method, was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the analyzed series, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need for deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
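
The core extraction step, fitting the most significant sine by least squares and subtracting it, can be sketched as follows (an illustrative NumPy version on a frequency grid; the paper's Scilab implementation differs in detail):

```python
import numpy as np

def fit_dominant_sine(t, y, freqs):
    """Among trial frequencies, return the one whose sin/cos least-squares
    fit leaves the smallest residual, together with the fitted component."""
    best = None
    for f in freqs:
        A = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t),
                             np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        rss = np.sum((y - A @ coef) ** 2)
        if best is None or rss < best[0]:
            best = (rss, f, A @ coef)
    return best[1], best[2]

t = np.arange(256) / 256.0
y = 2.0 * np.sin(2 * np.pi * 8 * t) + 0.7 * np.sin(2 * np.pi * 21 * t + 0.4)
found, resid = [], y.copy()
for _ in range(2):               # extract sines in decreasing order of significance
    f, comp = fit_dominant_sine(t, resid, np.arange(1, 40))
    found.append(int(f))
    resid = resid - comp
```

The dominant component (here the amplitude-2 sine) is recovered first, then the weaker one, leaving a negligible residual.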

  11. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations. (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M


    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
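
Segmented regression, the most common of the reviewed methods, fits an intercept, a baseline trend, a level change, and a trend change at the interruption. A minimal sketch on simulated prescribing data (assuming NumPy; it ignores autocorrelation and seasonality, which a full analysis must address):

```python
import numpy as np

rng = np.random.default_rng(4)
n, change = 72, 36                      # 72 months, policy change at month 36
t = np.arange(n, dtype=float)
post = (t >= change).astype(float)

# Simulated outcome: baseline trend +0.5/month, then a level drop of 8
# and a slope change of -0.3/month after the intervention, plus noise
y = 50 + 0.5 * t - 8 * post - 0.3 * post * (t - change) + rng.normal(0, 0.5, n)

# Segmented regression design: intercept, baseline trend,
# level change at the interruption, post-interruption trend change
X = np.column_stack([np.ones(n), t, post, post * (t - change)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change, trend_change = coef
```

The fitted coefficients recover the simulated level and slope changes; in practice one would also model autocorrelation (e.g., with ARIMA errors), which only about two thirds of the reviewed studies did.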

  12. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)


    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  13. Application of principal component analysis to time series of daily air pollution and mortality

    NARCIS (Netherlands)

    Quant C; Fischer P; Buringh E; Ameling C; Houthuijs D; Cassee F; MGO


    We investigated whether cause-specific daily mortality can be attributed to specific sources of air pollution. To construct indicators of source-specific air pollution, we applied a principal component analysis (PCA) on routinely collected air pollution data in the Netherlands during the period

  14. Hyperspectral Time Series Analysis of Native and Invasive Species in Hawaiian Rainforests

    Directory of Open Access Journals (Sweden)

    Gregory P. Asner


    The unique ecosystems of the Hawaiian Islands are progressively being threatened following the introduction of exotic species. Operational implementation of remote sensing for the detection, mapping and monitoring of these biological invasions is currently hampered by a lack of knowledge on the spectral separability between native and invasive species. We used spaceborne imaging spectroscopy to analyze the seasonal dynamics of the canopy hyperspectral reflectance properties of four tree species: (i) Metrosideros polymorpha, a keystone native Hawaiian species; (ii) Acacia koa, a native Hawaiian nitrogen fixer; (iii) the highly invasive Psidium cattleianum; and (iv) Morella faya, a highly invasive nitrogen fixer. The species-specific separability of the reflectance and derivative-reflectance signatures, extracted from an Earth Observing-1 Hyperion time series composed of 22 cloud-free images spanning a period of four years, was quantitatively evaluated using the Separability Index (SI). The analysis revealed that the Hawaiian native trees were universally unique from the invasive trees in their near-infrared-1 (700–1,250 nm) reflectance (0.4 < SI < 1.4). Due to their higher leaf area index, invasive trees generally had a higher near-infrared reflectance. To a lesser extent, it could also be demonstrated that nitrogen-fixing trees were spectrally unique from non-fixing trees. The higher leaf nitrogen content of nitrogen-fixing trees was expressed through slightly increased separabilities in visible and shortwave-infrared reflectance wavebands (SI = 0.4). We also found phenology to be key to spectral separability analysis. As such, it was shown that the spectral separability in the near-infrared-1 reflectance between the native and invasive species groups was more expressed in summer (SI > 0.7) than in winter (SI < 0.7). The lowest separability was observed for March-July (SI < 0.3). This could be explained by the
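
A per-band separability score of this kind can be sketched on synthetic reflectance data (a hedged illustration: the formula below, |mean difference| divided by the sum of the standard deviations, is one common formulation of a separability index and may differ from the paper's exact definition of SI):

```python
import numpy as np

def separability_index(class_a, class_b):
    """Per-band separability: |mean difference| / (sum of std devs).
    One common formulation; an assumption here, not necessarily the
    paper's exact SI definition."""
    return (np.abs(class_a.mean(axis=0) - class_b.mean(axis=0))
            / (class_a.std(axis=0) + class_b.std(axis=0)))

# Synthetic reflectance samples (rows = pixels, columns = bands): the two
# groups overlap in the visible band but separate in the near-infrared
rng = np.random.default_rng(5)
native = np.column_stack([rng.normal(0.050, 0.010, 100),   # visible
                          rng.normal(0.30, 0.04, 100)])    # NIR
invasive = np.column_stack([rng.normal(0.055, 0.010, 100),
                            rng.normal(0.45, 0.04, 100)])
si = separability_index(native, invasive)    # NIR band scores much higher
```

The NIR band yields a much larger score than the visible band, matching the pattern the study reports for native versus invasive canopies.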

  15. Tissue-type-specific transcriptome analysis identifies developing xylem-specific promoters in poplar. (United States)

    Ko, Jae-Heung; Kim, Hyun-Tae; Hwang, Ildoo; Han, Kyung-Hwan


    Plant biotechnology offers a means to create novel phenotypes. However, commercial application of biotechnology in crop improvement programmes is severely hindered by the lack of utility promoters (or freedom to operate the existing ones) that can drive gene expression in a tissue-specific or temporally controlled manner. Woody biomass is gaining popularity as a source of fermentable sugars for liquid fuel production. To improve the quantity and quality of woody biomass, developing xylem (DX)-specific modification of the feedstock is highly desirable. To develop utility promoters that can drive transgene expression in a DX-specific manner, we used the Affymetrix Poplar Genome Arrays to obtain tissue-type-specific transcriptomes from poplar stems. Subsequent bioinformatics analysis identified 37 transcripts that are specifically or strongly expressed in DX cells of poplar. After further confirmation of their DX-specific expression using semi-quantitative PCR, we selected four genes (DX5, DX8, DX11 and DX15) for in vivo confirmation of their tissue-specific expression in transgenic poplars. The promoter regions of the selected DX genes were isolated and fused to a β-glucuronidase (GUS) reporter gene in a binary vector. This construct was used to produce transgenic poplars via Agrobacterium-mediated transformation. The GUS expression patterns of the resulting transgenic plants showed that these promoters were active in the xylem cells at early seedling growth and had the strongest expression in the developing xylem cells at later growth stages of poplar. We conclude that these DX promoters can be used as a utility promoter for DX-specific biomass engineering. © 2012 The Authors. Plant Biotechnology Journal © 2012 Society for Experimental Biology, Association of Applied Biologists and Blackwell Publishing Ltd.

  16. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee


    One of the important issues that plays a significant role in basin hydrology studies is the determination of lag time. The rainfall-related lag time depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Lag time is an important parameter in many projects, such as dam design, as well as in water resource studies. The lag time of a basin can be calculated using various methods, one of which is time series analysis of spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions, and harmonically significant quantities with individual frequencies are identified. Spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time could be due to snowmelt, as well as melting ice due to rainfall on freezing days. In this research, the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment is subject to both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 months, respectively.
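
The spectral idea, reading the lag off the cross-spectrum phase at the dominant shared frequency, can be sketched on synthetic monthly data (a NumPy illustration, not the study's computation):

```python
import numpy as np

# Monthly rainfall with an annual cycle; runoff lags it by 2 months
months = np.arange(240)
rain = np.sin(2 * np.pi * months / 12)
runoff = np.sin(2 * np.pi * (months - 2) / 12)

# Cross-spectrum: its phase at the dominant frequency encodes the lag
F_rain = np.fft.rfft(rain)
F_run = np.fft.rfft(runoff)
cross = F_rain * np.conj(F_run)
k = np.argmax(np.abs(cross))             # dominant shared frequency bin
freq = k / len(months)                   # cycles per month
lag_months = np.angle(cross[k]) / (2 * np.pi * freq)
```

Here the dominant bin corresponds to the annual cycle (1/12 cycles per month) and the recovered lag is 2 months, as constructed.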

  17. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.


    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  18. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document (United States)

    Nicholson, Mark; Markley, F.; Seidewitz, E.


    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.

  19. Fractal time series analysis of postural stability in elderly and control subjects

    Directory of Open Access Journals (Sweden)

    Doussot Michel


    Background: The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H), may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time series that are as short as possible. The present study compares two methods of calculating H, Detrended Fluctuation Analysis (DFA) and Stabilogram Diffusion Analysis (SDA), for elderly and control subjects, as well as evaluating the effect of recording duration. Methods: Centre-of-pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC) was used as a measure of reliability. Results: Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion: Both methods would be well suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s.

  20. Temporal Statistical Analysis of Degree Distributions in an Undirected Landline Phone Call Network Graph Series

    Directory of Open Access Journals (Sweden)

    Orgeta Gjermëni


    This article aims to provide new results about the intraday degree sequence distribution considering phone call network graph evolution in time. More specifically, it tackles the following problem: given a large amount of landline phone call data records, what is the best way to summarize the distinct number of calling partners per client per day? In order to answer this question, a series of undirected phone call network graphs is constructed based on data from a local telecommunication source in Albania. All network graphs of the series are simplified. Further, a longitudinal temporal study is made on this network graph series with respect to the degree distributions. Power-law and log-normal distribution fittings on the degree sequence are compared on each of the network graphs of the series. The maximum likelihood method is used to estimate the parameters of the distributions, and a Kolmogorov–Smirnov test associated with a p-value is used to define the plausible models. A direct distribution comparison is made through a Vuong test in the case that both distributions are plausible. Another goal was to describe the shape of the parameters' distributions. A Shapiro-Wilk test is used to test the normality of the data, and measures of shape are used to define the distributions' shape. Study findings suggest that the log-normal distribution better models the intraday degree sequence data of the network graphs. It is not possible to say that the distributions of the log-normal parameters are normal.

  1. Compound-specific radiocarbon analysis - Analytical challenges and applications (United States)

    Mollenhauer, G.; Rethemeyer, J.


    Within the last decades, techniques have become available that allow measurement of isotopic compositions of individual organic compounds (compound-specific isotope measurements). Most often the carbon isotopic composition of these compounds is studied, including stable carbon (δ13C) and radiocarbon (Δ14C) measurements. While compound-specific stable carbon isotope measurements are fairly simple, and well-established techniques are widely available, radiocarbon analysis of specific organic compounds is a more challenging method. Analytical challenges include difficulty obtaining adequate quantities of sample, tedious and complicated laboratory separations, the lack of authentic standards for measuring realistic processing blanks, and large uncertainties in values of Δ14C at small sample sizes. The challenges associated with sample preparation for compound-specific Δ14C measurements will be discussed in this contribution. Several years of compound-specific radiocarbon analysis have revealed that in most natural samples, purified organic compounds consist of heterogeneous mixtures of the same compound. These mixtures could derive from multiple sources, each having a different initial reservoir age but mixed in the same terminal reservoir, from a single source but mixed after deposition, or from a prokaryotic organism using variable carbon sources including mobilization of ancient carbon. These processes not only represent challenges to the interpretation of compound-specific radiocarbon data, but provide unique tools for the understanding of biogeochemical and sedimentological processes influencing the preserved organic geochemical records in marine sediments. We will discuss some examples where compound-specific radiocarbon analysis has provided new insights for the understanding of carbon source utilization and carbon cycling.

  2. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond (United States)

    Scargle, Jeffrey


    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei, and gamma-ray bursts will be displayed.

  3. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin (United States)

    Zhangli, Sun; Xiufang, Zhu; Yaozhong, Pan


    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
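
The PDS thresholding step can be sketched as peaks-over-threshold extraction with a simple declustering rule for independence (an illustrative NumPy version; operational criteria for flood independence are more involved):

```python
import numpy as np

def peaks_over_threshold(flow, q=0.95, min_gap=7):
    """Partial duration series: threshold at the q-th quantile, then keep
    one peak per exceedance cluster, clusters being separated by more
    than `min_gap` time steps to encourage independence."""
    thresh = np.quantile(flow, q)
    idx = np.flatnonzero(flow > thresh)
    peaks, cluster = [], [idx[0]]
    for i in idx[1:]:
        if i - cluster[-1] <= min_gap:
            cluster.append(i)
        else:
            peaks.append(max(cluster, key=lambda j: flow[j]))
            cluster = [i]
    peaks.append(max(cluster, key=lambda j: flow[j]))
    return thresh, np.array(peaks)

# Toy daily series: flat base flow with three flood spikes, two of them
# close enough in time to be treated as a single event
flow = np.ones(365)
flow[100], flow[103], flow[200] = 10.0, 8.0, 9.0
thresh, peaks = peaks_over_threshold(flow)
```

The two nearby spikes collapse into one event (keeping the larger peak), while the distant spike is retained as an independent event, unlike an annual-maximum series, which would keep only the single largest value.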

  4. Time-series analysis of climatologic measurements: a method to distinguish future climatic changes

    International Nuclear Information System (INIS)

    Duband, D.


    Time-series analysis of climatic parameters such as air temperature, river flow rates, and lake or sea levels is an indispensable basis for detecting a possible significant climatic change. These observations, when carefully analyzed and criticized, constitute the necessary reference for testing and validating numerical climate models that try to simulate the physical and dynamical processes of the coupled ocean-atmosphere system, taking continents into account. 32 refs., 13 figs

  5. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    International Nuclear Information System (INIS)

    Hultkrantz, L.; Olsson, C.


    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs

  6. On-line condition monitoring of nuclear systems via symbolic time series analysis

    International Nuclear Information System (INIS)

    Rajagopalan, V.; Ray, A.; Garcia, H. E.


    This paper provides a symbolic time series analysis approach to fault diagnostics and condition monitoring. The proposed technique is built upon concepts from wavelet theory, symbolic dynamics, and pattern recognition. Various aspects of the methodology, such as wavelet selection, choice of alphabet, and determination of the depth of the D-Markov machine, are explained in the paper. The technique is validated with experiments performed in a Machine Condition Monitoring (MCM) test bed at the Idaho National Laboratory. (authors)
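A minimal sketch of the symbolization and D-Markov state-probability steps described above, assuming an equal-frequency (maximum-entropy) partition, depth 2, and a Euclidean anomaly measure; the paper's wavelet preprocessing is omitted and all signals are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
nominal = rng.normal(size=5000)            # signal from the healthy condition

# Equal-frequency (maximum-entropy) partition learned on nominal data:
# quartile edges give a 4-symbol alphabet with equal occupation.
edges = np.quantile(nominal, [0.25, 0.5, 0.75])

def state_probabilities(x, edges, depth=2):
    """State-occupation probabilities of a D-Markov machine: symbols are
    bin indices, states are windows of `depth` consecutive symbols."""
    symbols = np.digitize(x, edges)
    n_sym = len(edges) + 1
    counts = np.zeros(n_sym ** depth)
    for t in range(len(symbols) - depth + 1):
        state = 0
        for s in symbols[t:t + depth]:
            state = state * n_sym + s
        counts[state] += 1
    return counts / counts.sum()

p_nominal = state_probabilities(nominal, edges)
# A drifted (faulty) signal, re-symbolized with the *nominal* partition:
drifted = rng.normal(loc=0.8, size=5000)
anomaly = np.linalg.norm(state_probabilities(drifted, edges) - p_nominal)
```

The key design point is that the partition is fixed from the nominal condition, so any drift in the signal shows up as a change in the state-probability vector.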

  7. A Time Series Analysis to Asymmetric Marketing Competition Within a Market Structure


    Francisco F. R. Ramos


    As a complement to existing studies of competitive market structure analysis, the present paper proposes a time series methodology to provide a more detailed picture of marketing competition within a competitive market structure. Two major hypotheses were tested as part of this project. First, it was found that significant cross-lead and lag effects of marketing variables on sales between brands existed even between different submarkets. Second, it was found that high qual...

  8. Time series analysis in road safety research using state space methods




    In this thesis we present a comprehensive study of novel time series models for aggregated road safety data. The models are mainly intended for the analysis of indicators relevant to road safety, with a particular focus on how to measure these factors. Such developments may need to be related to or explained by external influences. It is also possible to make forecasts using the models. Relevant indicators include the number of persons killed per month or year. These statistics are closely watch...

  9. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hultkrantz, L. [Department of Economics, University of Uppsala, Uppsala (Sweden); Olsson, C. [Department of Economics, Umeå University, Umeå (Sweden)


    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs.

  10. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis (United States)


    Lehmann, D. R. (1984). How advertising affects sales: Meta-analysis of econometric results. Journal of Marketing Research, 21, 65-74. Barabási, A. L... 317-357. Leone, R. P. (1983). Modeling sales-advertising relationships: An integrated time series-econometric approach. Journal of Marketing Research, 20, 291-295. McGrath, J. E., & Kravitz, D. A. (1982). Group research. Annual Review of Psychology, 33, 195-230. Monge, P. R., & Contractor...

  11. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)


    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis for comparison with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.


    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER


    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, and stock market values are considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem in most countries. The most well-known approach is applying text mining approaches to the news and some time series analysis techniques over stock market closing values in order to apply classification or clustering algorithms over the extracted features. This study goes further and asks the question: what are the available time series analysis techniques for stock market closing values, and which one is the most suitable? In this study, the news and their dates are collected into a database and text mining is applied over the news; the text mining part has been kept simple, with only the term frequency-inverse document frequency method. For the time series analysis part, we have studied 10 different methods, such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum, relative strength index, and their variations. In this study we have also explained these techniques in a comparative way, and we have applied the methods over Turkish stock market closing values for a period of more than 2 years. On the other hand, we have applied the term frequency-inverse document frequency method on the economy news of one of the highest-circulation newspapers in Turkey.
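A few of the price indicators listed above can be sketched as follows; parameter choices such as the 14-day RSI window and 20-day Bollinger window are conventional defaults, not values from the study, and the price series is synthetic.

```python
import numpy as np

def moving_average(p, w):
    return np.convolve(p, np.ones(w) / w, mode="valid")

def momentum(p, k):
    """Price change over a k-step horizon."""
    return p[k:] - p[:-k]

def bollinger(p, w=20, k=2.0):
    """Lower band, moving average, upper band (mean ± k standard deviations)."""
    ma = moving_average(p, w)
    sd = np.array([p[i:i + w].std() for i in range(len(p) - w + 1)])
    return ma - k * sd, ma, ma + k * sd

def rsi(p, w=14):
    """Relative strength index from averaged gains and losses (0..100)."""
    d = np.diff(p)
    gain = np.convolve(np.clip(d, 0, None), np.ones(w) / w, mode="valid")
    loss = np.convolve(np.clip(-d, 0, None), np.ones(w) / w, mode="valid")
    return 100 - 100 / (1 + gain / (loss + 1e-12))

closes = 100 + np.cumsum(np.sin(np.linspace(0, 20, 250)))  # synthetic closing prices
lower, middle, upper = bollinger(closes)
```

Each indicator turns the raw closing-value series into a feature series that can then be correlated with the tf-idf features extracted from the news.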

  13. The Fourier decomposition method for nonlinear and non-stationary time series analysis. (United States)

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik


    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparison is made with empirical mode decomposition algorithms.

  14. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter. (United States)

    Visser, H.; Molenaar, J.


    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding with the explanatory variables may be time dependent to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of...
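As a minimal instance of the structural-model-plus-Kalman-filter machinery described above, the following sketches the filter for the local level model (a random-walk trend observed in noise); the full trend-regression model in the paper adds slope and regression components on top of this. All values are illustrative.

```python
import numpy as np

def local_level_filter(y, var_eps, var_eta, a0=0.0, p0=1e7):
    """Kalman filter for the local level (random-walk-plus-noise) model:
        y_t  = mu_t + eps_t,      eps_t ~ N(0, var_eps)
        mu_t = mu_{t-1} + eta_t,  eta_t ~ N(0, var_eta)
    Returns the filtered estimates of the level mu_t."""
    a, p = a0, p0                      # near-diffuse initialisation
    out = np.empty(len(y))
    for t, obs in enumerate(y):
        p = p + var_eta                # predict: level follows a random walk
        f = p + var_eps                # innovation variance
        k = p / f                      # Kalman gain
        a = a + k * (obs - a)          # update with the new observation
        p = (1 - k) * p
        out[t] = a
    return out

rng = np.random.default_rng(5)
temps = 5.0 + rng.normal(0, 0.5, 200)  # noisy observations of a stable level
level = local_level_filter(temps, var_eps=0.25, var_eta=1e-6)
```

With a tiny trend variance `var_eta` the filtered level approaches a simple running mean; a larger `var_eta` lets the trend follow the data more closely, which is the trade-off the structural model estimates from the data.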

  15. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lutaif, N.A. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Palazzo, R. Jr [Departamento de Telemática, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, Campinas, SP (Brazil); Gontijo, J.A.R. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil)


    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  16. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    International Nuclear Information System (INIS)

    Lutaif, N.A.; Palazzo, R. Jr; Gontijo, J.A.R.


    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile

  17. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.


    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.

  18. Spatial analysis of precipitation time series over the Upper Indus Basin (United States)

    Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad


    The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides necessary support for agriculture, domestic consumption, and hydropower generation; all critical for a stable economy in Pakistan. This study has identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. This study investigated spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of annual precipitation by determining significant trends at 6 stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul) out of the 15 studied stations, revealing a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the rest of the stations appear insignificant. Moreover, our study found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn; having 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases in precipitation appear robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially including in the higher areas.
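The record above does not name its trend test, but the nonparametric Mann-Kendall test is the standard choice for significance of trends in precipitation series; a minimal sketch, without the tie correction, on an illustrative series:

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns the S statistic,
    the normal-approximation Z score, and a two-sided p-value."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z, 2 * (1 - norm.cdf(abs(z)))

# A clearly decreasing (illustrative) annual-precipitation series:
s, z, p = mann_kendall(np.array([3.1, 2.9, 2.7, 2.8, 2.4, 2.2, 2.3, 1.9, 1.8, 1.6]))
```

A negative Z with p below the chosen significance level corresponds to the "significant negative trend" language used in the abstract.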

  19. Spectral analysis of uneven time series of geological variables; Analisis espectral de series temporales de variables geologicas con muestreo irregular

    Energy Technology Data Exchange (ETDEWEB)

    Pardo-Iguzquiza, E.; Rodriguez-Tovar, F. J.


    In the geosciences the sampling of a time series tends to be uneven, sometimes because the sampling itself is random, because of hiatuses or even completely missing data, or because of the difficulties involved in converting data from a spatial to a time scale when the sedimentation rate was not constant. Whatever the case, the best solution does not lie in interpolation but rather in resorting to a method that deals with the irregular data. We show here how the use of the smoothed Lomb-Scargle periodogram is both a practical and efficient choice. We describe the effects on the estimated power spectrum of the type of irregular sampling, the number of data, interpolation, and the presence of drift. We propose the permutation test as an efficient way of calculating statistical confidence levels. By applying the Lomb-Scargle periodogram to a synthetic series with a known spectral content we are able to confirm the validity of this method in the face of the difficulties mentioned above. A case study with real data, including hiatuses, representing the thickness of the annual banding in a stalagmite, is chosen to demonstrate an application using the statistical and physical interpretation of spectral peaks. (Author)
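The Lomb-Scargle periodogram with permutation-based confidence levels can be sketched with SciPy; the sampling pattern, signal frequency, and number of permutations below are illustrative, and the smoothing step of the paper is omitted.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 100, 200))               # irregular sampling times
y = np.sin(2 * np.pi * 0.1 * t) + rng.normal(0, 0.5, 200)
y -= y.mean()

freqs = np.linspace(0.01, 0.5, 400)                 # cycles per unit time
pgram = lombscargle(t, y, 2 * np.pi * freqs)        # scipy expects angular frequencies

# Permutation test: shuffling y over the fixed sampling times destroys any
# periodicity, giving the null distribution of the periodogram maximum.
null_max = np.array([
    lombscargle(t, rng.permutation(y), 2 * np.pi * freqs).max()
    for _ in range(200)
])
conf95 = np.quantile(null_max, 0.95)                # 95% confidence level
peak_freq = freqs[pgram.argmax()]
```

A spectral peak is accepted as significant only if it exceeds the permutation-derived confidence level, which is exactly the logic the abstract advocates for irregularly sampled geological series.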

  20. Time series analysis of Mexico City subsidence constrained by radar interferometry (United States)

    Doin, Marie-Pierre; Lopez-Quiroz, Penelope; Yan, Yajing; Bascou, Pascale; Pinel, Virginie


    unwrapping errors for each pixel and show that they are strongly decreased by iterations in the unwrapping process. (3) Finally, we present a new algorithm for time series analysis that differs from classical SVD decomposition and is best suited to the present data base. Accurate deformation time series are then derived over the metropolitan area of the city with a spatial resolution of 30 × 30 m. We also use the Gamma-PS software on the same data set. The phase differences are unwrapped within small patches with respect to a reference point chosen in each patch, whose phase is in turn unwrapped relative to a reference point common to the whole area of interest. After removing the modelled contribution of the linear displacement rate and DEM error, some residual interferograms, presenting unwrapping errors because of a strong residual orbital ramp or atmospheric phase screen, are spatially unwrapped by a minimum cost-flow algorithm. The next steps are to estimate and remove the residual orbital ramp and to apply a temporal low-pass filter to remove atmospheric contributions. The step-by-step comparison of the SBAS and PS approaches shows the complementarity of both methods. The SBAS analysis provides subsidence rates with mm/yr accuracy over the whole basin in a large area, together with the non-linear behavior of the subsidence through time, although at the expense of some spatial regularization. The PS method provides locally accurate and punctual deformation rates, but fails in this case to yield a good large-scale map and the non-linear temporal behavior of the subsidence. We conclude that the relative contrast in subsidence between individual buildings and infrastructure must be relatively small, on average of the order of 5 mm/yr.

  1. Studies in astronomical time series analysis. I - Modeling random processes in the time domain (United States)

    Scargle, J. D.


    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
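The moving average and autoregressive models discussed above, and their combination, can be simulated directly; a sketch for a general ARMA(p, q) process, with arbitrary illustrative coefficients:

```python
import numpy as np

def simulate_arma(phi, theta, n, sigma=1.0, burn=200, seed=0):
    """Simulate an ARMA(p, q) process
        x_t = sum_i phi[i]*x_{t-1-i} + e_t + sum_j theta[j]*e_{t-1-j},
    discarding `burn` initial samples to remove the start-up transient."""
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    e = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        ar = sum(phi[i] * x[t - 1 - i] for i in range(p))
        ma = sum(theta[j] * e[t - 1 - j] for j in range(q))
        x[t] = ar + e[t] + ma
    return x[burn:]

x = simulate_arma([0.7], [0.3], 5000)
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
# Theoretical lag-1 autocorrelation of this ARMA(1,1):
# (1 + phi*theta)(phi + theta) / (1 + 2*phi*theta + theta^2) ≈ 0.80
```

Comparing sample autocorrelations of such simulations against their theoretical values is a standard way to check a time-domain model implementation before applying it to real light curves.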

  2. The Real-time Frequency Spectrum Analysis of Neutron Pulse Signal Series

    International Nuclear Information System (INIS)

    Tang Yuelin; Ren Yong; Wei Biao; Feng Peng; Mi Deling; Pan Yingjun; Li Jiansheng; Ye Cenming


    The frequency spectrum analysis of neutron pulse signals is a very important method in nuclear stochastic signal processing. Focused on the special '0' and '1' structure of neutron pulse signal series, this paper proposes a new rotation table and realizes a real-time frequency spectrum algorithm at a 1 GHz sample rate on a PC using add, address, and SSE operations. The numerical experimental results show that, at a count rate of 3×10^6 s^-1, this algorithm is superior to FFTW in time consumption and can meet the real-time requirement of frequency spectrum analysis. (authors)

  3. Validation of non-stationary precipitation series for site-specific impact assessment: comparison of two statistical downscaling techniques (United States)

    Mullan, Donal; Chen, Jie; Zhang, Xunchang John


    Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various different SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
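The wet/dry-spell frequency comparison used in the validation can be sketched as follows; the 0.1 mm wet-day threshold is a common convention, assumed here rather than taken from the paper.

```python
import numpy as np

def spell_lengths(precip, wet_threshold=0.1):
    """Lengths of consecutive wet and dry runs in a daily precipitation
    series (wet day: precip >= wet_threshold, in mm)."""
    wet = precip >= wet_threshold
    spells = {"wet": [], "dry": []}
    run, state = 1, wet[0]
    for today in wet[1:]:
        if today == state:
            run += 1
        else:
            spells["wet" if state else "dry"].append(run)
            run, state = 1, today
    spells["wet" if state else "dry"].append(run)   # close the final run
    return spells

daily_mm = np.array([0.0, 5.2, 3.1, 0.0, 0.0, 0.0, 12.4, 0.0])
spells = spell_lengths(daily_mm)
```

Comparing the cumulative frequency distributions of these spell lengths between downscaled and observed series is the validation statistic the abstract refers to.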

  4. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series (United States)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman


    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series, 23 years in length, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed at 1 mm/y-/4. Then, we added the deterministic part consisting of a linear trend of 20 mm/y (representing the averaged horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/y^2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark to then investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for...
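For white noise, fitting the linear-plus-quadratic deterministic model described above reduces to ordinary least squares; a sketch with synthetic values (the noise level and the velocity/acceleration values are illustrative, and the coloured-noise MLE step of the study is omitted):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(0, 23, 1 / 365.25)              # 23 years of daily epochs, in years
# position = x0 + v*t + (a/2)*t^2, with v = 20 mm/y and a = 0.4 mm/y^2
truth = 3.0 + 20.0 * t + 0.5 * 0.4 * t**2
pos = truth + rng.normal(0.0, 2.0, t.size)    # white noise only, for illustration

coef = np.polyfit(t, pos, deg=2)              # returns [a/2, v, x0]
accel_hat, vel_hat = 2 * coef[0], coef[1]
```

With coloured (flicker or random-walk) noise, as in the study, the simple least-squares uncertainties become far too optimistic, which is why the noise parameters have to be estimated jointly by MLE.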

  5. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data (United States)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol


    Large-scale wild fires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time-based and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators were different for different vegetation types and dependent on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.

  6. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis. (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei


    In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.

  7. Comparative performance analysis of shunt and series passive filter for LED lamp (United States)

    Sarwono, Edi; Facta, Mochammad; Handoko, Susatyo


    Light Emitting Diode (LED) lamps are nowadays widely used by consumers as a new innovation in lighting technology, owing to the energy saving of low-power-consumption lamps with brighter light intensity. However, LED lamps produce an electric pollutant known as harmonics. The harmonics are generated by the rectifier that is part of the LED lamp circuit. The presence of harmonics in the current or voltage distorts the source waveform from the grid. This distortion may cause inaccurate measurement, malfunction, and excessive heating of any element in the grid. This paper presents an analysis of shunt and series filters to suppress the harmonics generated by the LED lamp circuit. The work was initiated by conducting several tests to investigate the harmonic content of voltages and currents. The measurements in this work were carried out using a HIOKI Power Quality Analyzer 3197. The measurement results showed that the harmonic currents of the tested LED lamps were above the limit of IEEE Standard 519-2014. Based on the measurement results, shunt and series filters were constructed as low-pass filters. Bode analysis was applied during the construction and prediction of the filter performance. Based on the experimental results, the application of the shunt filter at the input side of the LED lamp reduced the THD of the current by up to 88%. On the other hand, the series filter significantly reduced the THD of the current by up to 92%.
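The THD figures quoted above can be computed from sampled waveforms via the FFT; a sketch on a synthetic 50 Hz current with added 3rd and 5th harmonics (the harmonic amplitudes are illustrative, not measurements from the paper):

```python
import numpy as np

def thd(samples, fs, f0):
    """Total harmonic distortion: RMS of harmonics 2..N relative to the
    fundamental, from an FFT over an integer number of cycles."""
    n = len(samples)
    spec = np.abs(np.fft.rfft(samples)) / n
    k0 = int(round(f0 * n / fs))          # FFT bin of the fundamental
    harmonics = spec[2 * k0::k0]          # bins at 2*f0, 3*f0, ...
    return np.sqrt(np.sum(harmonics ** 2)) / spec[k0]

fs, f0, n = 10_000, 50, 2000              # exactly 10 cycles of 50 Hz at 10 kHz
t = np.arange(n) / fs
current = (np.sin(2 * np.pi * f0 * t)
           + 0.2 * np.sin(2 * np.pi * 3 * f0 * t)
           + 0.1 * np.sin(2 * np.pi * 5 * f0 * t))
distortion = thd(current, fs, f0)         # sqrt(0.2^2 + 0.1^2) / 1 ≈ 0.224
```

Sampling an exact integer number of cycles keeps each harmonic in a single FFT bin; in practice a power quality analyzer handles windowing and synchronization internally.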

  8. Work-related accidents among the Iranian population: a time series analysis, 2000-2011. (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood


    Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
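    The seasonal Box–Jenkins structure above can be made concrete with a short sketch. An ARIMA(1,1,1)×(0,1,1)12 model is fitted to the series after one regular and one lag-12 seasonal difference; the monthly counts below are synthetic stand-ins, not the ISSO data:

```python
# Sketch of the differencing implied by ARIMA(1,1,1)x(0,1,1)_12: the model is
# fit to w_t = (1 - B)(1 - B^12) y_t, i.e. the series after one regular and
# one seasonal (lag-12) difference. The monthly counts here are synthetic.

def difference(series, lag=1):
    """Difference a series at the given lag: x_t - x_{t-lag}."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

# Synthetic monthly accident counts with a linear trend and annual seasonality.
y = [1400 + 5 * t + (50 if t % 12 in (5, 6, 7) else 0) for t in range(48)]

w = difference(difference(y, lag=12), lag=1)  # (1 - B)(1 - B^12) y_t
print(len(w), max(abs(v) for v in w))  # trend and seasonality are removed
```

    With real counts, the remaining ARMA(1,1)×(0,0,1)12 structure would then be estimated on w, for example with a SARIMA routine.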

  9. Work-related accidents among the Iranian population: a time series analysis, 2000–2011 (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood


    Background Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). Conclusions The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  10. Case studies: Risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.


    The SOCRATES computer program uses the results of a Probabilistic Risk Assessment (PRA) or a system level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at a plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns with no adverse impacts on risk. Three summaries of case study applications are included to demonstrate the types of results that can be achieved through risk-based evaluation of technical specifications. (orig.)

  11. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee


    Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However, many cycles of solar activity are reflected in the sunspot time series; the 11-year cycle is perhaps the most famous. These cycles pose problems for the investigation of the scaling behavior of sunspot time series, and different methods of handling the 11-year cycle generally produce very different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem, in the present paper empirical mode decomposition (EMD), a data-driven adaptive method, is applied first to extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. On removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the

  12. Using Argumentation Logic for Firewall Policy Specification and Analysis


    Bandara, Arosha K.; Kakas, Antonis; Lupu, Emil C.; Russo, Alessandra


    Firewalls are important perimeter security mechanisms that implement an organisation's network security requirements and can be notoriously difficult to configure correctly. Given their widespread use, it is crucial that network administrators have tools to translate their security requirements into firewall configuration rules and to ensure that these rules are consistent with each other. In this paper we propose an approach to firewall policy specification and analysis that uses a formal fram...

  13. Detrended fluctuation analysis based on higher-order moments of financial time series (United States)

    Teng, Yue; Shang, Pengjian


    In this paper, a generalized method of detrended fluctuation analysis (DFA) is proposed as a new measure to assess the complexity of a complex dynamical system such as a stock market. We extend DFA and local scaling DFA to higher moments such as skewness and kurtosis (labeled SMDFA and KMDFA) in order to investigate the volatility scaling properties of financial time series. Simulations are conducted over synthetic and financial data to provide a comparative study. We further report the volatility behaviors of three American, three Chinese and three European stock markets using the DFA and LSDFA methods based on higher moments. These capture the dynamic behavior of the time series in different aspects, quantify changes in the complexity of stock market data, and provide more meaningful information than a single exponent. The results also reveal higher-moment volatility and higher-moment multiscale volatility details that cannot be obtained using the traditional DFA method.
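    For reference, the ordinary variance-based DFA that these higher-moment variants generalize can be sketched in a few lines. The scales and the test signal below are assumptions for illustration; uncorrelated noise should give a scaling exponent near 0.5:

```python
import numpy as np

def dfa(x, scales):
    """Ordinary DFA: scaling exponent from the log-log slope of F(s) vs s."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for w in range(n_win):
            seg = y[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear fit
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))    # RMS fluctuation F(s)
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(42)
noise = rng.standard_normal(8192)              # uncorrelated noise: H near 0.5
print(round(dfa(noise, [16, 32, 64, 128, 256]), 2))
```

    The higher-moment variants in the abstract replace the second-moment statistic inside each window with skewness- or kurtosis-based fluctuation measures.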

  14. Time series analysis of pressure fluctuation in gas-solid fluidized beds

    Directory of Open Access Journals (Sweden)

    C. Alberto S. Felipe


    Full Text Available The purpose of the present work was to study the differentiation of typical fluidization states (single bubble, multiple bubble and slugging) in a gas-solid fluidized bed, using spectral analysis of pressure fluctuation time series. The effects of the measurement method (differential or absolute pressure fluctuations) and of the axial position of the probes in the fluidization column on the identification of each regime were evaluated. The Fast Fourier Transform (FFT), which expresses the behavior of a time series in the frequency domain, was the mathematical tool used to analyze the pressure fluctuation data. Results indicated that the plenum chamber was a reliable measurement location and that care should be taken with measurements in the dense phase. The method allowed fluid dynamic regimes to be differentiated by their dominant frequency characteristics.
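    The dominant-frequency identification used to separate the regimes can be sketched with a synthetic record; the 100 Hz sampling rate and the 5 Hz component are assumed values, not measurements from the study:

```python
import numpy as np

fs = 100.0                          # sampling frequency in Hz (assumed)
t = np.arange(0, 20, 1 / fs)        # a 20 s pressure record
rng = np.random.default_rng(0)
# Synthetic fluctuation: a 5 Hz dominant component (e.g. bubbling) plus noise.
p = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.standard_normal(t.size)

power = np.abs(np.fft.rfft(p - p.mean())) ** 2   # one-sided power spectrum
freqs = np.fft.rfftfreq(p.size, d=1 / fs)
dominant = freqs[np.argmax(power)]               # regime-characteristic peak
print(dominant)
```

    In practice the spectra of records taken at different probe positions would be compared, and the shift in the dominant peak used to separate single-bubble, multiple-bubble and slugging regimes.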

  15. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu


    Geometric process was first introduced by Lam[10,11]. A stochastic process {Xi, i = 1, 2,…} is called a geometric process (GP) if, for some a > 0, {a^(i-1) Xi, i = 1, 2,…} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best among these four models in analyzing the data from a series of events.
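    A minimal sketch of the GP definition, together with one simple nonparametric way to estimate the ratio a: since E[log Xi] is linear in i with slope −log a, a can be recovered by regressing log Xi on i. The exponential distribution and the parameter values are assumptions for illustration, not the paper's estimator in full:

```python
import math
import random

random.seed(1)
a_true = 1.05
n = 500
# Geometric process: X_i = Y_i / a**(i-1) with Y_i i.i.d. exponential(mean 10),
# so that {a**(i-1) X_i} is a renewal process (the Y_i themselves).
x = [random.expovariate(1 / 10.0) / a_true ** (i - 1) for i in range(1, n + 1)]

# E[log X_i] = const - (i-1) log a, so fit log X_i on i by ordinary
# least squares and read a off the slope.
idx = range(1, n + 1)
logs = [math.log(v) for v in x]
mi = sum(idx) / n
ml = sum(logs) / n
sxy = sum((i - mi) * (l - ml) for i, l in zip(idx, logs))
sxx = sum((i - mi) ** 2 for i in idx)
a_hat = math.exp(-sxy / sxx)
print(round(a_hat, 3))
```

    A ratio a > 1 corresponds to stochastically decreasing inter-event times, the typical signature of a deteriorating system.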

  16. The Relative Importance of the Service Sector in the Mexican Economy: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Flores


    Full Text Available We conduct a study of the secondary and tertiary sectors with the goal of highlighting the relative importance of services in the Mexican economy. We consider a time series analysis approach designed to identify the stochastic nature of the series, as well as to define their long-run and short-run relationships with Gross Domestic Product (GDP). The results of cointegration tests suggest that, for the most part, activities in the secondary and tertiary sectors share a common trend with GDP. Interestingly, the long-run elasticities of GDP with respect to services are on average larger than those with respect to secondary activities. Common cycle tests identify the existence of common cycles between GDP and the disaggregated sectors, as well as with manufacturing, commerce, real estate and transportation. In this case, the short-run elasticities of secondary activities are on average larger than those corresponding to services.

  17. Investigation of interfacial wave structure using time-series analysis techniques

    International Nuclear Information System (INIS)

    Jayanti, S.; Hewitt, G.F.; Cliffe, K.A.


    The report presents an investigation into the interfacial structure in horizontal annular flow using spectral and time-series analysis techniques. Film thickness measured using conductance probes shows an interesting transition in wave pattern from a continuous low-frequency wave pattern to an intermittent, high-frequency one. From the autospectral density function of the film thickness, it appears that this transition is caused by the breaking up of long waves into smaller ones. To investigate the possibility of the wave structure being represented as a low order chaotic system, phase portraits of the time series were constructed using the technique developed by Broomhead and co-workers (1986, 1987 and 1989). These showed a banded structure when waves of relatively high frequency were filtered out. Although these results are encouraging, further work is needed to characterise the attractor. (Author)

  18. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series (United States)

    Liang, X. S.


    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximum likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from the IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
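    The estimator quoted above can be implemented directly from sample covariances. The sketch below exercises it on a toy one-way coupled linear system in which X2 drives X1; the coefficients are illustrative assumptions, not from the paper:

```python
import numpy as np

def liang_T(x_from, x_to, dt=1.0):
    """Information flow T_{from->to} via the linear maximum likelihood
    estimator quoted in the abstract: sample covariances plus the
    Euler-forward difference approximation of dX/dt."""
    d_to = np.diff(x_to) / dt
    C = np.cov(np.vstack([x_to[:-1], x_from[:-1], d_to]))
    c11, c22, c12 = C[0, 0], C[1, 1], C[0, 1]
    c1d, c2d = C[0, 2], C[1, 2]
    return (c11 * c12 * c2d - c12 ** 2 * c1d) / (c11 ** 2 * c22 - c11 * c12 ** 2)

# Toy one-way coupled system (coefficients assumed): x2 drives x1 only.
rng = np.random.default_rng(7)
n = 20000
x1 = np.zeros(n)
x2 = np.zeros(n)
for t in range(n - 1):
    x2[t + 1] = 0.9 * x2[t] + 0.5 * rng.standard_normal()
    x1[t + 1] = 0.5 * x1[t] + 0.5 * x2[t] + 0.3 * rng.standard_normal()

T21 = liang_T(x2, x1)   # substantial flow from the driver x2 to x1
T12 = liang_T(x1, x2)   # flow in the non-causal direction stays near zero
print(round(T21, 3), round(T12, 3))
```

    The asymmetry T2→1 ≫ T1→2 is the signature of one-way causality that, per the abstract, distinguishes this estimator from a symmetric correlation measure.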

  19. Phase correction and error estimation in InSAR time series analysis (United States)

    Zhang, Y.; Fattahi, H.; Amelung, F.


    During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategies of SAR satellites, such as large spatial and temporal baselines with irregular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least squares inversion of an over-determined system. Such a robust inversion allows us to focus on understanding the different components of the InSAR time series and their uncertainties. We present an open-source Python-based package for InSAR time series analysis, called PySAR, with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data, and quantifying InSAR uncertainty. Our implemented strategy contains several features including: 1) improved spatial coverage using a coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of the reference date and automatic outlier detection, 6) InSAR uncertainty due to residual tropospheric delay, decorrelation and residual DEM error, and 7) the variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by Cosmo-Skymed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our results show precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b), with a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. (2017, GRL); and a post-eruptive subsidence on the same

  20. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui


    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through multidimensional scaling. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups which, on analysis, correspond to five regions: Europe, North America, South America, the Asia-Pacific (excluding mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.
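    Classical MDS, the component these methods modify, can be sketched as follows. Here the dissimilarity matrix is built from ordinary Euclidean distances as a sanity check, whereas the paper's variants would fill it with cross-sample entropy values:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed points so that Euclidean distances
    approximate the dissimilarities in D. In the paper's variants, D would
    hold cross-sample entropy values rather than Chebyshev distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                 # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:k]            # k largest eigenvalues
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

# Sanity check with true 2-D points: exact distances should be reproduced.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
emb = classical_mds(D, k=2)
D_hat = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
print(np.allclose(D, D_hat))   # distances are preserved (up to rotation)
```

    With an entropy-based dissimilarity matrix the embedding is only approximate, but nearby points in the perceptual map still correspond to series with similar irregularity.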

  1. Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series. (United States)

    Jiang, Zhixing; Zhang, David; Lu, Guangming


    Radial artery pulse diagnosis has long played an important role in traditional Chinese medicine (TCM). Because it is non-invasive and convenient, pulse diagnosis also has great significance for disease analysis in modern medicine. Practitioners sense the pulse waveforms at the patient's wrist and make diagnoses based on subjective personal experience. With research into pulse acquisition platforms and computerized analysis methods, the objective study of pulse diagnosis can help TCM keep up with the development of modern medicine. In this paper, we propose a new method to extract features from the pulse waveform based on a discrete Fourier series (DFS). It regards the waveform as a signal consisting of a series of sub-components represented by sine and cosine signals with different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform for each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the discrete Fourier series function. Compared with fitting using a Gaussian mixture function, the fitting errors of the proposed method are smaller, indicating that our method represents the original signal better. The classification performance of the proposed feature is superior to other features extracted from the waveform, such as the auto-regression model and the Gaussian mixture model. The coefficients of the optimized DFS function used to fit the arterial pressure waveforms achieve better performance in modeling the waveforms and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
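    The least-squares DFS fit can be sketched with a trigonometric design matrix; the harmonic count and the pulse-like test signal below are assumptions for illustration:

```python
import numpy as np

def dfs_fit(signal, n_harmonics=5):
    """Least-squares fit of a truncated discrete Fourier series; the fitted
    coefficients form the feature vector (harmonic count is assumed)."""
    n = len(signal)
    t = np.arange(n) * 2 * np.pi / n
    cols = [np.ones(n)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * t), np.sin(k * t)]      # one cos/sin pair per harmonic
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, signal, rcond=None)
    return coef, A @ coef                           # features, fitted waveform

# Pulse-like test signal built from two harmonics around a constant level.
n = 200
t = np.arange(n) * 2 * np.pi / n
wave = 1.0 + 0.8 * np.sin(t) + 0.3 * np.cos(3 * t)
coef, fitted = dfs_fit(wave, n_harmonics=5)
print(round(float(np.max(np.abs(wave - fitted))), 8))   # near-zero residual
```

    For a signal that truly lies in the span of the chosen harmonics the fit is exact; real averaged pulse waveforms would leave a small residual, and the coefficient vector coef is what the classifier would consume.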

  2. Time-Series Analysis of Continuously Monitored Blood Glucose: The Impacts of Geographic and Daily Lifestyle Factors

    Directory of Open Access Journals (Sweden)

    Sean T. Doherty


    Full Text Available Type 2 diabetes is known to be associated with environmental, behavioral, and lifestyle factors. However, the actual impacts of these factors on blood glucose (BG variation throughout the day have remained relatively unexplored. Continuous blood glucose monitors combined with human activity tracking technologies afford new opportunities for exploration in a naturalistic setting. Data from a study of 40 patients with diabetes is utilized in this paper, including continuously monitored BG, food/medicine intake, and patient activity/location tracked using global positioning systems over a 4-day period. Standard linear regression and more disaggregated time-series analysis using autoregressive integrated moving average (ARIMA are used to explore patient BG variation throughout the day and over space. The ARIMA models revealed a wide variety of BG correlating factors related to specific activity types, locations (especially those far from home, and travel modes, although the impacts were highly personal. Traditional variables related to food intake and medications were less often significant. Overall, the time-series analysis revealed considerable patient-by-patient variation in the effects of geographic and daily lifestyle factors. We would suggest that maps of BG spatial variation or an interactive messaging system could provide new tools to engage patients and highlight potential risk factors.

  3. Analysis of cyclical behavior in time series of stock market returns (United States)

    Stratimirović, Djordje; Sarvan, Darko; Miljković, Vladimir; Blesić, Suzana


    In this paper we have analyzed the scaling properties and cyclical behavior of three types of stock market index (SMI) time series: data belonging to stock markets of developed economies, emerging economies, and underdeveloped or transitional economies. We have used two techniques of data analysis to obtain and verify our findings: wavelet transform (WT) spectral analysis to identify cycles in the SMI returns data, and time-dependent detrended moving average (tdDMA) analysis to investigate local behavior around market cycles and trends. We found cyclical behavior in all the SMI data sets that we analyzed. Moreover, the positions and boundaries of the cyclical intervals that we found seem to be common to all markets in our dataset. We list and illustrate the presence of nine such periods in our SMI data. We report on the possibility of differentiating the level of growth of the analyzed markets by statistical analysis of the properties of the wavelet spectra that characterize particular peak behaviors. Our results show that measures like the relative WT energy content and the relative WT amplitude of the peaks in the small-scales region can partially differentiate between market economies. Finally, we propose a way to quantify the level of development of a stock market based on estimation of the local complexity of the market's SMI series. From the local scaling exponents calculated for our nine peak regions we defined what we named the Development Index, which proved, at least for our dataset, suitable for ranking the SMI series that we analyzed into three distinct groups.

  4. Time series analysis of soil Radon-222 recorded at Kutch region, Gujarat, India

    International Nuclear Information System (INIS)

    Madhusudan Rao, K.; Rastogi, B.K.; Barman, Chiranjib; Chaudhuri, Hirok


    Kutch region in Gujarat lies in a seismically vulnerable zone (seismic zone V). After the devastating Bhuj earthquake (7.7M) of January 26, 2001 in the Kutch region, several researchers focused their attention on monitoring geophysical and geochemical precursors for earthquakes in the region. In order to find possible geochemical precursory signals for earthquake events, we monitored the radioactive gas radon-222 in subsurface soil gas in the Kutch region. We analysed the recorded soil radon-222 time series by means of nonlinear techniques such as FFT power spectral analysis, empirical mode decomposition and multifractal analysis, along with other linear statistical methods. Some fascinating and fruitful results arising from the nonlinear analysis of this time series are discussed in the present paper. The overall analytical approach helped us recognize the nature and pattern of the soil radon-222 emanation process. Moreover, the recording and the statistical and nonlinear analysis of soil radon data in the Kutch region will assist us in understanding the preparation phase of an imminent seismic event in the region. (author)

  5. Comparing and Contrasting Traditional Membrane Bioreactor Models with Novel Ones Based on Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Parneet Paul


    Full Text Available Computer modelling and simulation of wastewater treatment plants and their specific technologies, such as membrane bioreactors (MBRs), are becoming increasingly useful to consultant engineers when designing, upgrading, retrofitting, operating and controlling these plants. This research uses traditional phenomenological mechanistic models based on MBR filtration and biochemical processes to measure the effectiveness of alternative, novel time series models based upon input–output system identification methods. Both model types are calibrated and validated using similar plant layouts and data sets derived for this purpose. Results prove that although both approaches have their advantages, they also have specific disadvantages. In conclusion, the MBR plant designer and/or operator who wishes to use good-quality, calibrated models to gain a better understanding of the process should carefully consider which model type to select based on their initial modelling objectives. Each situation usually proves unique.


    Bacskay, A.


    , specific heat, density, and viscosity) is generated at user-selected output intervals and stored for reference. The Integrated Plot Utility (IPU) provides plotting capability for all data output. System utility commands are provided to enable the user to operate more efficiently in the CASE/A environment. The user is able to customize a simulation through optional operations FORTRAN logic. This user-developed code is compiled and linked with a CASE/A model and enables the user to control and timeline component operating parameters during various phases of the iterative solution process. CASE/A provides for transient tracking of the flow stream constituents and determination of their thermodynamic state throughout an ECLSS/ATCS simulation, performing heat transfer, chemical reaction, mass/energy balance, and system pressure drop analysis based on user-specified operating conditions. The program tracks each constituent through all combination and decomposition states while maintaining a mass and energy balance on the overall system. This allows rapid assessment of ECLSS designs, the impact of alternate technologies, and impacts due to changes in metabolic forcing functions, consumables usage, and system control considerations. CASE/A is written in FORTRAN 77 for the DEC VAX/VMS computer series, and requires 12Mb of disk storage and a minimum paging file quota of 20,000 pages. The program operates on the Tektronix 4014 graphics standard and VT100 text standard. The program requires a Tektronix 4014 or later graphics terminal, third party composite graphics/text terminal, or personal computer loaded with appropriate VT100/TEK 4014 emulator software. The use of composite terminals or personal computers with popular emulation software is recommended for enhanced CASE/A operations and general ease of use. The program is available on an unlabeled 9-track 6250 BPI DEC VAX BACKUP format magnetic tape. 
CASE/A development began in 1985 under contract to NASA/Marshall Space Flight

  7. Patient-specific coronary blood supply territories for quantitative perfusion analysis (United States)

    Zakkaroff, Constantine; Biglands, John D.; Greenwood, John P.; Plein, Sven; Boyle, Roger D.; Radjenovic, Aleksandra; Magee, Derek R.


    Abstract Myocardial perfusion imaging, coupled with quantitative perfusion analysis, provides an important diagnostic tool for the identification of ischaemic heart disease caused by coronary stenoses. The accurate mapping between coronary anatomy and under-perfused areas of the myocardium is important for diagnosis and treatment. However, in the absence of the actual coronary anatomy during the reporting of perfusion images, areas of ischaemia are allocated to a coronary territory based on a population-derived 17-segment American Heart Association (AHA) model of coronary blood supply. This work presents a solution for the fusion of 2D Magnetic Resonance (MR) myocardial perfusion images and 3D MR angiography data with the aim of improving the detection of ischaemic heart disease. The key contributions of this work are a novel method for the mediated spatiotemporal registration of perfusion and angiography data and a novel method for the calculation of patient-specific coronary supply territories. The registration method uses 4D cardiac MR cine series spanning the complete cardiac cycle in order to overcome the under-constrained nature of non-rigid slice-to-volume perfusion-to-angiography registration. This is achieved by separating out the deformable registration problem and solving it through phase-to-phase registration of the cine series. The use of patient-specific blood supply territories in quantitative perfusion analysis (instead of the population-based model of coronary blood supply) has the potential to increase the accuracy of perfusion analysis. Evaluation of quantitative perfusion analysis diagnostic accuracy with patient-specific territories against the AHA model demonstrates the value of the mediated spatiotemporal registration in the context of ischaemic heart disease diagnosis. PMID:29392098

  8. Functional Time Series Models to Estimate Future Age-Specific Breast Cancer Incidence Rates for Women in Karachi, Pakistan

    Institute of Scientific and Technical Information of China (English)

    Farah Yasmeen; Sidra Zaheer


    Background: Breast cancer is the most common female cancer in Pakistan. The incidence of breast cancer in Pakistan is about 2.5 times higher than that in the neighboring countries India and Iran. In Karachi, the most populated city of Pakistan, the age-standardized rate of breast cancer was 69.1 per 100,000 women during 1998-2002, the highest recorded rate in Asia. Carcinoma of the breast is an enormous public health concern in Pakistan. In this study, we examined recent trends in breast cancer incidence rates among women in Karachi. Methods: We obtained secondary data on breast cancer incidence from various hospitals, including Jinnah Hospital, KIRAN (Karachi Institute of Radiotherapy and Nuclear Medicine) and Civil Hospital, where data were available for the years 2004-2011. A total of 5331 new cases of female breast cancer were registered during this period. We analyzed the data in 5-year age groups: 15-19, 20-24, 25-29, 30-34, 35-39, 40-44, 45-49, 50-54, 55-59, 60-64, 65-69, 70-74, 75+. Nonparametric smoothing was used to obtain age-specific incidence curves, which were then decomposed using principal components analysis to fit a functional time series (FTS) model. We then used exponential smoothing state space models to forecast the incidence curves and construct prediction intervals. Results: The breast cancer incidence rates in Karachi increased with age for all available years. The rates increased monotonically and relatively sharply from age 15 to 50 years, and showed variability after the age of 50 years. Ten-year forecasts for female breast cancer incidence rates in Karachi show that future rates are expected to remain stable for the age groups 15-50 years, but will increase for females aged 50 years and over. Hence, the number of newly diagnosed breast cancer cases in older women in Karachi is expected to increase. Conclusion: Prediction of age

  9. Characterization of Land Transitions Patterns from Multivariate Time Series Using Seasonal Trend Analysis and Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Benoit Parmentier


    Full Text Available Characterizing biophysical changes in land change areas over large regions with short and noisy multivariate time series and multiple temporal parameters remains a challenging task. Most studies focus on detection rather than characterization, i.e., the manner by which surface state variables are altered by the process of change. In this study, a procedure is presented to extract and characterize simultaneous temporal changes in MODIS multivariate time series from three surface state variables: the Normalized Difference Vegetation Index (NDVI), land surface temperature (LST), and albedo (ALB). The analysis involves conducting a seasonal trend analysis (STA) to extract three seasonal shape parameters (Amplitude 0, Amplitude 1, and Amplitude 2) and using principal component analysis (PCA) to contrast trends in change and no-change areas. We illustrate the method by characterizing trends in burned and unburned pixels in Alaska over the 2001–2009 time period. Findings show consistent and meaningful extraction of temporal patterns related to fire disturbances. The first principal component (PC1) is characterized by a decrease in mean NDVI (Amplitude 0) with a concurrent increase in albedo (both the mean and the annual amplitude) and an increase in LST annual variability (Amplitude 1). These results provide systematic empirical evidence of surface changes associated with one type of land change, fire disturbance, and suggest that STA with PCA may be used to characterize many other types of land transitions over large landscape areas using multivariate Earth observation time series.
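    The harmonic-regression idea behind the STA shape parameters can be sketched as follows; the series and coefficients are synthetic, and a full analysis would compute such amplitudes per pixel and year before applying PCA.

```python
import numpy as np

# Sketch of the harmonic-regression step behind seasonal trend analysis
# (STA): fit the mean level (Amplitude 0) plus annual (Amplitude 1) and
# semi-annual (Amplitude 2) harmonics to one pixel's time series.
rng = np.random.default_rng(1)
n_per_year, n_years = 23, 9          # e.g. 16-day composites, 2001-2009
t = np.arange(n_per_year * n_years)
omega = 2 * np.pi / n_per_year

# Synthetic NDVI-like series: mean 0.5, strong annual, weak semi-annual cycle
y = (0.5 + 0.3 * np.sin(omega * t) + 0.05 * np.sin(2 * omega * t)
     + 0.02 * rng.standard_normal(t.size))

# Design matrix: intercept + annual and semi-annual sine/cosine pairs
X = np.column_stack([np.ones(t.size),
                     np.cos(omega * t), np.sin(omega * t),
                     np.cos(2 * omega * t), np.sin(2 * omega * t)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

amp0 = beta[0]                        # mean level (Amplitude 0)
amp1 = np.hypot(beta[1], beta[2])     # annual amplitude (Amplitude 1)
amp2 = np.hypot(beta[3], beta[4])     # semi-annual amplitude (Amplitude 2)
print(round(amp0, 2), round(amp1, 2), round(amp2, 2))
```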

  10. ALEA: a toolbox for allele-specific epigenomics analysis. (United States)

    Younesy, Hamid; Möller, Torsten; Heravi-Moussavi, Alireza; Cheng, Jeffrey B; Costello, Joseph F; Lorincz, Matthew C; Karimi, Mohammad M; Jones, Steven J M


    The assessment of expression and epigenomic status using sequencing-based methods provides an unprecedented opportunity to identify and correlate allelic differences with epigenomic status. We present ALEA, a computational toolbox for allele-specific epigenomics analysis, which incorporates allelic variation data within existing resources, allowing for the identification of significant associations between epigenetic modifications and specific allelic variants in human and mouse cells. ALEA provides a customizable pipeline of command line tools for allele-specific analysis of next-generation sequencing data (ChIP-seq, RNA-seq, etc.) that takes the raw sequencing data and produces separate allelic tracks ready to be viewed on genome browsers. The pipeline has been validated using human and hybrid mouse ChIP-seq and RNA-seq data. The package, test data, and usage instructions are available online. Supplementary data are available at Bioinformatics online. © The Author 2013. Published by Oxford University Press. All rights reserved.

  11. Forecasting malaria cases using climatic factors in delhi, India: a time series analysis. (United States)

    Kumar, Varun; Mangal, Abha; Panesar, Sanjeet; Yadav, Geeta; Talwar, Richa; Raut, Deepak; Singh, Saudan


    Background. Malaria still remains a public health problem in developing countries, and changing environmental and climatic factors pose the biggest challenge in fighting against the scourge of malaria. Therefore, this study was designed to forecast malaria cases using climatic factors as predictors in Delhi, India. Methods. The total number of monthly malaria slide positives occurring from January 2006 to December 2013 was taken from the register maintained at the malaria clinic of the Rural Health Training Centre (RHTC), Najafgarh, Delhi. Climatic data on monthly mean rainfall, relative humidity, and mean maximum temperature were taken from the Regional Meteorological Centre, Delhi. The Expert Modeler of SPSS ver. 21 was used for analyzing the time series data. Results. The autoregressive integrated moving average model ARIMA(0,1,1)(0,1,0)12 was the best fit and could explain 72.5% of the variability in the time series data. Rainfall (P value = 0.004) and relative humidity (P value = 0.001) were found to be significant predictors of malaria transmission in the study area. The seasonal adjusted factor (SAF) for malaria cases shows a peak during the months of August and September. Conclusion. ARIMA time series models are a simple and reliable tool for producing forecasts of malaria in Delhi, India.
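    A minimal sketch of how a fitted ARIMA(0,1,1)(0,1,0)12 model generates a one-step forecast; the data and the MA coefficient theta are synthetic and hypothetical, not the study's estimates (a full fit would normally use a library such as statsmodels).

```python
import numpy as np

# After one regular and one seasonal (lag-12) difference, the
# ARIMA(0,1,1)(0,1,0)12 model reduces to an MA(1):
#   w_t = (1 - B)(1 - B^12) y_t = e_t + theta * e_{t-1}
rng = np.random.default_rng(2)
theta = -0.6                                    # hypothetical MA coefficient

t = np.arange(96)                               # 8 years of monthly counts
y = 50 + 0.2 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.standard_normal(96)

d = np.diff(y)                                  # regular difference
w = d[12:] - d[:-12]                            # seasonal difference

# Innovations recursion recovering the MA(1) shocks
e = np.empty_like(w)
e[0] = w[0]
for i in range(1, w.size):
    e[i] = w[i] - theta * e[i - 1]

# One-step-ahead forecast, undoing both differences
w_next = theta * e[-1]
y_next = w_next + y[-1] + y[-12] - y[-13]
print(round(y_next, 1))
```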

  12. Capturing Context-Related Change in Emotional Dynamics via Fixed Moderated Time Series Analysis. (United States)

    Adolf, Janne K; Voelkle, Manuel C; Brose, Annette; Schmiedek, Florian


    Much of recent affect research relies on intensive longitudinal studies to assess daily emotional experiences. The resulting data are analyzed with dynamic models to capture regulatory processes involved in emotional functioning. Daily contexts, however, are commonly ignored. This may not only result in biased parameter estimates and wrong conclusions, but also forgoes the opportunity to investigate contextual effects on emotional dynamics. With fixed moderated time series analysis, we present an approach that resolves this problem by estimating context-dependent change in dynamic parameters in single-subject time series models. The approach examines parameter changes of known shape and thus addresses the problem of observed intra-individual heterogeneity (e.g., changes in emotional dynamics due to observed changes in daily stress). In comparison to existing approaches to unobserved heterogeneity, model estimation is facilitated and different forms of change can readily be accommodated. We demonstrate the approach's viability given relatively short time series by means of a simulation study. In addition, we present an empirical application targeting the joint dynamics of affect and stress and how these co-vary with daily events. We discuss the potential and limitations of the approach and close with an outlook on the broader implications for understanding emotional adaptation and development.

  13. Non-linear time series analysis on flow instability of natural circulation under rolling motion condition

    International Nuclear Information System (INIS)

    Zhang, Wenchao; Tan, Sichao; Gao, Puzhen; Wang, Zhanwei; Zhang, Liansheng; Zhang, Hong


    Highlights: • Natural circulation flow instabilities in rolling motion are studied. • The method of non-linear time series analysis is used. • The non-linear evolution characteristic of flow instability is analyzed. • Irregular complex flow oscillations are chaotic oscillations. • The effect of rolling parameters on the threshold of chaotic oscillation is studied. - Abstract: Non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were studied by the method of non-linear time series analysis. Experimental flow time series for different dimensionless powers and rolling parameters were analyzed based on phase space reconstruction theory. Attractors were reconstructed in phase space, and the geometric invariants, including the correlation dimension, Kolmogorov entropy, and largest Lyapunov exponent, were determined. The non-linear characteristics of natural circulation flow instabilities under rolling motion conditions were then studied based on the results of the geometric invariant analysis. The results indicated that the values of the geometric invariants first increase and then decrease as dimensionless power increases, indicating that the non-linear characteristics of the system first strengthen and then weaken. The irregular complex flow oscillation is typical chaotic oscillation because the values of the geometric invariants are at their maximum. The threshold of chaotic oscillation becomes larger as the rolling frequency or rolling amplitude increases. The main factors that influence the non-linear characteristics of the natural circulation system under rolling motion are the thermal driving force, flow resistance, and the additional forces caused by rolling motion. The non-linear characteristics of the natural circulation system under rolling motion change with the feedback and coupling among these influencing factors as the dimensionless power or rolling parameters change.
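    The phase-space reconstruction step can be sketched with a delay-coordinate embedding and the Grassberger-Procaccia correlation sum; here the logistic map, a known chaotic signal, stands in for the experimental flow series.

```python
import numpy as np

# Delay-coordinate embedding and correlation sum, the ingredients used to
# estimate the correlation dimension of a reconstructed attractor.
x = np.empty(2000)
x[0] = 0.4
for i in range(1, x.size):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])   # logistic map, r = 4 (chaotic)

def embed(series, dim, delay):
    """Delay-coordinate embedding: rows are reconstructed state vectors."""
    n = series.size - (dim - 1) * delay
    return np.column_stack([series[i * delay:i * delay + n]
                            for i in range(dim)])

def correlation_sum(points, r):
    """Fraction of distinct point pairs closer than radius r."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    n = points.shape[0]
    return (np.sum(d < r) - n) / (n * (n - 1))  # exclude self-pairs

pts = embed(x[:500], dim=2, delay=1)
c_small, c_large = correlation_sum(pts, 0.05), correlation_sum(pts, 0.2)
# Slope of log C(r) vs log r approximates the correlation dimension
dim_est = (np.log(c_large) - np.log(c_small)) / (np.log(0.2) - np.log(0.05))
print(round(dim_est, 2))
```

    The logistic-map attractor is (close to) one-dimensional, so the two-radius slope estimate lands near 1; a careful analysis would fit the scaling region over many radii and embedding dimensions.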

  14. Seasonal and annual precipitation time series trend analysis in North Carolina, United States (United States)

    Sayemuzzaman, Mohammad; Jha, Manoj K.


    The present study performs spatial and temporal trend analysis of the annual and seasonal time series of precipitation data from a set of 249 uniformly distributed stations across the state of North Carolina, United States, over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA), and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of trend, the magnitude of trend, and the trend shift, respectively. Regional (mountain, piedmont, and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of the statistical tests, a pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. The application of these procedures revealed a very notable statewide increasing trend for winter precipitation and a decreasing trend for fall precipitation. Statewide mixed (increasing/decreasing) trends were detected in the annual, spring, and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected at only 8, 7, 4, and 10 of the 249 stations in winter, spring, summer, and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (-4.50 mm/season) in the fall (summer) season. The annual precipitation trend magnitude varied between -5.50 mm/year and 9 mm/year. Regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during winter. The piedmont region was found to have increasing trends in summer and fall but decreasing trends in winter, spring, and on an annual basis. The SQMK trend shift analysis identified a significant shift during 1960-1970 in most parts of the state. Finally, the comparison between winter (summer) precipitation and the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
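    A minimal sketch of the MK test and Theil-Sen slope on a synthetic trended series; the tie correction and the pre-whitening step used in the study are omitted.

```python
import numpy as np

# Mann-Kendall trend test and Theil-Sen slope, the two statistics applied
# to each precipitation series in the study (no-ties version).
def mann_kendall(y):
    """Return the MK S statistic and its normal-approximation Z score."""
    n = y.size
    s = np.sum(np.sign(y[None, :] - y[:, None])[np.triu_indices(n, 1)])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S (no ties)
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return s, z

def theil_sen(y):
    """Median of all pairwise slopes."""
    i, j = np.triu_indices(y.size, 1)
    return np.median((y[j] - y[i]) / (j - i))

rng = np.random.default_rng(3)
y = 0.5 * np.arange(60) + 5 * rng.standard_normal(60)  # trend + noise
s, z = mann_kendall(y)
slope = theil_sen(y)
print(s > 0, abs(z) > 1.96, round(slope, 2))
```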

  15. Analysis of radiation-induced microchemical evolution in 300 series stainless steel

    International Nuclear Information System (INIS)

    Brager, H.R.; Garner, F.A.


    The irradiation of 300 series stainless steel by fast neutrons leads to an evolution of alloy microstructures that involves not only the formation of voids and dislocations, but also an extensive repartitioning of elements between various phases. This latter evolution has been shown to be the primary determinant of the alloy behavior in response to the large number of variables which influence void swelling and irradiation creep. The combined use of scanning transmission electron microscopy and energy-dispersive x-ray analysis has been the key element in the study of this phenomenon. Problems associated with the analysis of radioactive specimens are resolved by minor equipment modifications. Problems associated with spatial resolution limitations and the complexity and heterogeneity of the microchemical evolution have been overcome by using several data acquisition techniques. These include the measurement of compositional profiles near sinks, the use of foil-edge analysis, and the statistical sampling of many matrix and precipitate volumes

  16. Analysis and specification tools in relation to the APSE (United States)

    Hendricks, John W.


    Ada and the Ada Programming Support Environment (APSE) specifically address the phases of the system/software life cycle which follow after the user's problem has been translated into system and software development specifications. The waterfall model of the life cycle identifies the analysis and requirements definition phases as preceding program design and coding. Since Ada is a programming language and the APSE is a programming support environment, they are primarily targeted to support program (code) development, testing, and maintenance. The use of Ada-based or Ada-related specification languages (SLs) and program design languages (PDLs) can extend the use of Ada back into the software design phases of the life cycle. Recall that the standardization of the APSE as a programming support environment is only now happening after many years of evolutionary experience with diverse sets of programming support tools. Restricting consideration to one, or even a few, chosen specification and design tools could be a real mistake for an organization or a major project such as the Space Station, which will need to deal with an increasingly complex level of system problems. To require that everything be Ada-like, be implemented in Ada, run directly under the APSE, and fit into a rigid waterfall model of the life cycle would turn a promising support environment into a straitjacket for progress.

  17. Thermal Desorption Analysis of Effective Specific Soil Surface Area (United States)

    Smagin, A. V.; Bashina, A. S.; Klyueva, V. V.; Kubareva, A. V.


    A new method of assessing the effective specific surface area, based on the successive thermal desorption of water vapor at different temperature stages of sample drying, is analyzed in comparison with the conventional static adsorption method using a representative set of soil samples of different genesis and degrees of dispersion. The theory of the method uses the fundamental relationship between the thermodynamic water potential (Ψ) and the absolute temperature of drying (T): Ψ = Q - aT, where Q is the specific heat of vaporization and a is a physically based parameter related to the initial temperature and relative humidity of the air in the external thermodynamic reservoir (laboratory). From gravimetric data on the mass fraction of water (W) and the Ψ value, Polyanyi potential curves (W(Ψ)) for the studied samples are plotted. Water sorption isotherms are then calculated, from which the monolayer capacity and the target effective specific surface area are determined using the BET theory. Comparative analysis shows that the new method agrees well with the conventional estimation of the degree of dispersion by the BET and Kutilek methods over a wide range of specific surface area values between 10 and 250 m2/g.
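    The two computational steps can be sketched as follows; all numbers are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Step 1: convert drying temperatures to water potentials via Psi = Q - a*T.
Q = 2.45e6        # J/kg, specific heat of vaporization (approximate)
a = 8.0e3         # J/(kg*K), hypothetical parameter from lab conditions
T = np.array([293.0, 313.0, 333.0, 353.0])    # drying temperatures, K
psi = Q - a * T                                # potential at each drying stage
assert np.all(np.diff(psi) < 0)                # drying is harder at higher T

# Step 2: fit the BET linearization to a sorption isotherm to recover the
# monolayer capacity Wm:  x / (W (1 - x)) = 1/(Wm C) + (C - 1)/(Wm C) * x
x = np.array([0.05, 0.10, 0.20, 0.30])         # relative vapor pressure p/p0
Wm_true, C = 0.03, 50.0                        # synthetic "true" values
W = Wm_true * C * x / ((1 - x) * (1 + (C - 1) * x))   # BET isotherm

lhs = x / (W * (1 - x))
slope, intercept = np.polyfit(x, lhs, 1)
Wm = 1.0 / (slope + intercept)                 # recovered monolayer capacity
print(round(Wm, 3))
```

    The effective specific surface area then follows from Wm and the cross-sectional area of a sorbed water molecule, as in the standard BET treatment.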

  18. Analysis specifications for the CC3 biosphere model BIOTRAC

    International Nuclear Information System (INIS)

    Szekely, J.G.; Wojciechowski, L.C.; Stephens, M.E.; Halliday, H.A.


    AECL Research is assessing a concept for disposing of Canada's nuclear fuel waste in a vault deep in plutonic rock of the Canadian Shield. A computer program called the Systems Variability Analysis Code (SYVAC) has been developed as an analytical tool for the postclosure (long-term) assessment of the concept. SYVAC3, the third generation of the code, is an executive program that directs repeated simulation of the disposal system to take into account parameter variation. For the postclosure assessment, the system model, CC3 (Canadian Concept, generation 3), was developed to describe a hypothetical disposal system that includes a disposal vault, the local geosphere and the biosphere in the vicinity of any discharge zones. BIOTRAC (BIOsphere TRansport And Consequences) is the biosphere model in the CC3 system model. The specifications for BIOTRAC, which were developed over a period of seven years, were subjected to numerous walkthrough examinations by the Biosphere Model Working Group to ensure that the intent of the model developers would be correctly specified for transformation into FORTRAN code. The FORTRAN version of BIOTRAC was written from interim versions of these specifications. Improvements to the code are based on revised versions of these specifications. The specifications consist of a data dictionary; sets of synopses, data flow diagrams and mini specs for the component models of BIOTRAC (surface water, soil, atmosphere, and food chain and dose); and supporting calculations (interface to the geosphere, consequences, and mass balance). (author). 20 refs., tabs., figs


  19. Frequency Analysis of Modis Ndvi Time Series for Determining Hotspot of Land Degradation in Mongolia

    Directory of Open Access Journals (Sweden)

    E. Nasanbat


    Full Text Available This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across the whole of Mongolia. The Mann-Kendall statistical trend analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons of 2000 to 2016. We performed a frequency analysis in which the resulting NDVI residual trend pattern enabled successful determination of negative and positive changes in photosynthetically healthy vegetation. Our results showed negative and positive values and generated a map of significant trends. We also examined long-term meteorological parameters for the same period. The results showed that positive and negative NDVI trends concurred with changes in land cover types, representing an improvement or a degradation in vegetation, respectively. In addition, changes in the climate parameters precipitation and air temperature over the same period appear to have affected large NDVI trend areas. The time series trend analysis approach successfully determined hotspots of improvement and degradation due to land degradation and desertification.

  20. Frequency Analysis of Modis Ndvi Time Series for Determining Hotspot of Land Degradation in Mongolia (United States)

    Nasanbat, E.; Sharav, S.; Sanjaa, T.; Lkhamjav, O.; Magsar, E.; Tuvdendorj, B.


    This study examines whether MODIS NDVI satellite imagery time series can be used to determine hotspots of land degradation across the whole of Mongolia. The Mann-Kendall statistical trend analysis was applied to a 16-year MODIS NDVI satellite imagery record, based on 16-day composited temporal data (from May to September) for the growing seasons of 2000 to 2016. We performed a frequency analysis in which the resulting NDVI residual trend pattern enabled successful determination of negative and positive changes in photosynthetically healthy vegetation. Our results showed negative and positive values and generated a map of significant trends. We also examined long-term meteorological parameters for the same period. The results showed that positive and negative NDVI trends concurred with changes in land cover types, representing an improvement or a degradation in vegetation, respectively. In addition, changes in the climate parameters precipitation and air temperature over the same period appear to have affected large NDVI trend areas. The time series trend analysis approach successfully determined hotspots of improvement and degradation due to land degradation and desertification.

  1. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis (United States)

    Rzepecka, Zofia; Kalita, Jakub


    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of this research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the studies were obtained from the GGOS service maintained by the Vienna University of Technology. The resolution of the data is six hours, and ZTW values for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were selected for the investigation. Initially, the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal remains significant for several days (20-30 samples). The residuals of this fit were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
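    The two-stage modelling described above can be sketched on synthetic data: fit and remove annual and semi-annual harmonics from a ZTW-like series, then model the residuals with a low-order AR process (here AR(1), estimated from the lag-1 autocorrelation) and predict one step ahead.

```python
import numpy as np

# With 6-hour sampling there are 1460 epochs per year; four years of data
# are simulated to mirror the 2010-2013 study period.
rng = np.random.default_rng(4)
per_year = 1460
n = 4 * per_year
t = np.arange(n)
w = 2 * np.pi / per_year

resid_true = np.zeros(n)                 # persistent AR(1) "weather" process
for i in range(1, n):
    resid_true[i] = 0.9 * resid_true[i - 1] + rng.standard_normal()
ztw = 10 + 4 * np.sin(w * t) + 1.5 * np.sin(2 * w * t) + 0.5 * resid_true

# Stage 1: least-squares fit of annual and semi-annual sine/cosine terms
X = np.column_stack([np.ones(n), np.sin(w * t), np.cos(w * t),
                     np.sin(2 * w * t), np.cos(2 * w * t)])
beta, *_ = np.linalg.lstsq(X, ztw, rcond=None)
resid = ztw - X @ beta

# Stage 2: AR(1) coefficient via the lag-1 autocorrelation (Yule-Walker)
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]

# One-epoch-ahead prediction: periodic part at the next epoch + AR residual
x_next = np.array([1.0, np.sin(w * n), np.cos(w * n),
                   np.sin(2 * w * n), np.cos(2 * w * n)])
forecast = x_next @ beta + phi * resid[-1]
print(round(phi, 2))
```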

  2. Time series analysis of reference crop evapotranspiration for Bokaro District, Jharkhand, India

    Directory of Open Access Journals (Sweden)

    Gautam Ratnesh


    Full Text Available Evapotranspiration is one of the major elements of the water cycle. More accurate measurement and forecasting of evapotranspiration would enable more efficient water resources management. This study is therefore particularly focused on evapotranspiration modelling and forecasting, since forecasting would provide better information for optimal water resources management. There are numerous techniques for evapotranspiration forecasting, including autoregressive (AR) and moving average (MA), autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), Thomas-Fiering, etc. Among these, the ARIMA model has been found to be more suitable for the analysis and forecasting of hydrological events. Therefore, in this study ARIMA models have been used for forecasting mean monthly reference crop evapotranspiration by stochastic analysis. A data series of 102 years, i.e. 1224 months, for Bokaro District was used for analysis and forecasting. Different orders of the ARIMA model were selected on the basis of the autocorrelation function (ACF) and partial autocorrelation function (PACF) of the data series. The maximum likelihood method was used for determining the parameters of the models. Based on the statistical parameters of the models, the best-fitted model is ARIMA(0,1,4)(0,1,1)12.
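    The ACF/PACF order-identification step can be sketched as follows; the PACF comes from the Durbin-Levinson recursion, and the series is a synthetic AR(1), not the Bokaro evapotranspiration record.

```python
import numpy as np

# Sample autocorrelation (ACF) and partial autocorrelation (PACF), the two
# diagnostics used to choose ARIMA orders from a stationary series.
def acf(y, nlags):
    y = y - y.mean()
    c0 = np.dot(y, y) / y.size
    return np.array([1.0] + [np.dot(y[:-k], y[k:]) / (y.size * c0)
                             for k in range(1, nlags + 1)])

def pacf(y, nlags):
    """PACF via the Durbin-Levinson recursion on the sample ACF."""
    r = acf(y, nlags)
    p = [1.0]
    phi_prev = np.array([])
    for k in range(1, nlags + 1):
        if k == 1:
            phi_k = np.array([r[1]])
        else:
            phi_kk = ((r[k] - phi_prev @ r[1:k][::-1])
                      / (1 - phi_prev @ r[1:k]))
            phi_k = np.append(phi_prev - phi_kk * phi_prev[::-1], phi_kk)
        p.append(phi_k[-1])
        phi_prev = phi_k
    return np.array(p)

rng = np.random.default_rng(5)
y = np.empty(2000)
y[0] = 0.0
for i in range(1, y.size):
    y[i] = 0.7 * y[i - 1] + rng.standard_normal()   # AR(1), phi = 0.7

# For an AR(1): the ACF decays geometrically, the PACF cuts off after lag 1
print(round(acf(y, 3)[1], 2), round(pacf(y, 3)[2], 2))
```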

  3. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)


    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
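    A minimal permutation-entropy sketch illustrating the effect the paper warns about: when ties are symbolized by order of appearance, heavily digitized white noise yields a lower entropy than the same noise at full resolution, mimicking temporal structure that is not really there.

```python
import numpy as np
from math import factorial

# Permutation entropy with ties broken by order of appearance: a stable
# argsort ranks equal values in the order they occur in the window.
def permutation_entropy(x, order=3, delay=1):
    n = x.size - (order - 1) * delay
    patterns = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pat = tuple(np.argsort(window, kind='stable'))
        patterns[pat] = patterns.get(pat, 0) + 1
    p = np.array(list(patterns.values()), dtype=float) / n
    return -np.sum(p * np.log(p)) / np.log(factorial(order))  # normalized

rng = np.random.default_rng(6)
continuous = rng.standard_normal(5000)      # full amplitude resolution
coarse = np.round(continuous)               # heavy digitization: many ties
h_cont = permutation_entropy(continuous)
h_coarse = permutation_entropy(coarse)
# The tie convention biases coarse noise toward a few ordinal patterns,
# depressing its entropy even though the data are temporally uncorrelated.
print(round(h_cont, 2), round(h_coarse, 2))
```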

  4. Visualization of time series statistical data by shape analysis (GDP ratio changes among Asia countries) (United States)

    Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri


    Visualizing time series big data has become very important. In this paper we discuss a new analysis method called “statistical shape analysis” or “geometry driven statistics” applied to time series statistical data in economics. We analyse changes in agriculture value added and industry value added (as a percentage of GDP) from 2000 to 2010 in Asia. We handle the data as a set of landmarks on a two-dimensional image and examine the deformation using principal components. The key point of the method is that the principal components of a given formation are eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations, which give us information about the local differences between 2000 and 2010. Because a non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology, but no application can be found in economics. In this paper, we investigate its potential for analysing economic data.

  5. Renal transplant lithiasis: analysis of our series and review of the literature. (United States)

    Stravodimos, Konstantinos G; Adamis, Stefanos; Tyritzis, Stavros; Georgios, Zavos; Constantinides, Constantinos A


    Renal transplant lithiasis represents a rather uncommon complication. Even though rare, it can result in significant morbidity and a devastating loss of renal function if obstruction occurs. We present our experience with graft lithiasis in our series of renal transplantations and review the literature regarding the epidemiology, pathophysiology, and current therapeutic strategies in the management of renal transplant lithiasis. In a retrospective analysis of a consecutive series of 1525 renal transplantations performed between January 1983 and March 2007, 7 patients were found to have allograft lithiasis. In five cases, the calculi were localized in the renal unit, and in two cases, in the ureter. A review of the English-language literature was also performed in the Medline and PubMed databases using the keywords renal transplant lithiasis, donor-gifted lithiasis, and urological complications after kidney transplantation. Several retrospective studies regarding the incidence, etiology, and predisposing factors for graft lithiasis were reviewed. Data regarding current therapeutic strategies for graft lithiasis were also evaluated, and outcomes were compared with the results of our series. Most studies report a renal transplant lithiasis incidence of 0.4% to 1%. In our series, the incidence of graft lithiasis was 0.46% (n=7). Of the seven patients, three were treated via percutaneous nephrolithotripsy (PCNL); in three patients, shockwave lithotripsy (SWL) was performed; and in a single case, spontaneous passage of a urinary calculus was observed. All patients are currently stone free but still remain under close urologic surveillance. Renal transplant lithiasis requires vigilance, a high index of suspicion, prompt recognition, and management. Treatment protocols should mimic those for solitary kidneys. Minimally invasive techniques are available to remove graft calculi. Long-term follow-up is essential to determine the outcome, as well as to prevent recurrence.

  6. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series (United States)

    Chen, Wei-Shing


    The aim of this article is to answer the question of whether the Taiwan unemployment rate dynamics are generated by a non-linear deterministic dynamic process. The paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses time series data on Taiwan’s unemployment rate during the period from 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of unemployment transitions in Taiwan.
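    A minimal sketch of a recurrence matrix and two common recurrence quantification measures, contrasting a periodic signal with noise; the data are synthetic, not the Taiwan unemployment series.

```python
import numpy as np

# Recurrence plot for a one-dimensional series: threshold the pairwise
# distance matrix, then compute recurrence rate (RR) and determinism (DET).
# DET is high for deterministic/periodic dynamics and low for noise.
def recurrence_matrix(x, eps):
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def recurrence_rate(R):
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))         # main diagonal excluded

def determinism(R, lmin=2):
    """Share of recurrent points on diagonal lines of length >= lmin."""
    n = R.shape[0]
    on_lines = 0
    for k in range(1, n):                        # diagonals above the main one
        diag = np.diagonal(R, offset=k)
        runs = np.split(diag, np.where(np.diff(diag) != 0)[0] + 1)
        on_lines += sum(r.size for r in runs if r[0] == 1 and r.size >= lmin)
    total = (R.sum() - n) / 2                    # recurrent points above diag
    return on_lines / total if total else 0.0

rng = np.random.default_rng(7)
t = np.arange(200)
periodic = np.sin(2 * np.pi * t / 25)            # deterministic, periodic
noise = rng.uniform(-1, 1, t.size)               # no temporal structure
R_per = recurrence_matrix(periodic, 0.1)
R_noise = recurrence_matrix(noise, 0.1)
print(round(determinism(R_per), 2), round(determinism(R_noise), 2))
```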

  7. Interpretation of engine cycle-to-cycle variation by chaotic time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Daw, C.S.; Kahl, W.K.


    In this paper we summarize preliminary results from applying a new mathematical technique -- chaotic time series analysis (CTSA) -- to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of "deterministic chaos" dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.

  8. Analysis of the gamma spectra of the uranium, actinium, and thorium decay series

    International Nuclear Information System (INIS)

    Momeni, M.H.


    This report describes the identification of radionuclides in the uranium, actinium, and thorium series by analysis of gamma spectra in the energy range of 40 to 1400 keV. Energies and absolute efficiencies for each gamma line were measured by means of a high-resolution germanium detector and compared with those in the literature. A gamma spectroscopy method, which utilizes an on-line computer for deconvolution of spectra, search for and identification of each line, and estimation of the activity of each radionuclide, was used to analyze soil, uranium tailings, and ore

  9. Analysis of the development trend of China’s business administration based on time series


    Jiang Rui


    Regarding the general direction of the economic system, China is in a crucial period of establishing the modern enterprise system and reforming the macroeconomic system, and many high-quality business administration professionals are required for China’s economy to develop stably. This paper carries out a time series analysis of the development of China’s business administration major: on the whole, society currently shows an upward trend in the demand for the business adm...

  10. Impact of public programs on fertility and gender specific investment in human capital of children in rural India: cross sectional and time series analyses. (United States)

    Duraisamy, P; Malathy, R


    Cross sectional and time series analyses are conducted with 1971 and 1981 rural district level data for India in order to estimate variations in program impacts on household decisionmaking concerning fertility, child mortality, and schooling; to analyze how the variation in public program subsidies and services influences sex specific investments in schooling; and to examine the bias in cross sectional estimates by employing fixed effects methodology. The theory of household production uses the framework development by Rosenzweig and Wolpin. The utility function is expressed as a function of families' desired number of children, sex specific investment in human capital of children measured by schooling of males and females, and a composite consumption good. Budget constraints are characterized in terms of the biological supply of births or natural fertility, the number of births averted by fertility control, exogenous money income, the prices of number of children, contraceptives, child schooling, and consumption of goods. Demand functions are constructed from maximizing the utility function subject to the budget constraint. Data constitute 40% of the total districts and 50% of the rural population. The empirical specification of the linear model and variable description are provided. Other explanatory variables included are adult educational attainment; % of scheduled castes and tribes and % Muslim; and % rural population. Estimation methods are described and justification is provided for the use of ordinary least squares and fixed effects methods. The results of the cross sectional analysis reveal that own-program effects of family planning and primary health centers reduced family size in 1971 and 81. The increase in secondary school enrollment is evidenced in only 1971. There is a significant effect of family planning (FP) clinics on the demand for surviving children only in 1971. 
The presence of a secondary school in a village reduces the demand for children in

  11. Time-series analysis of air pollution and cause-specific mortality

    NARCIS (Netherlands)

    Zmirou, D; Schwartz, J; Saez, M; Zanobetti, A; Wojtyniak, B; Touloumi, G; Spix, C; de Leon, AP; Le Moullec, Y; Bacharova, L; Schouten, J; Ponka, A; Katsouyanni, K

    Ten large European cities provided data on daily air pollution as well as daily mortality from respiratory and cardiovascular causes. We used Poisson autoregressive models that controlled for trend, season, influenza epidemics, and meteorologic influences to assess the short-term effects of air

  12. A Framework for RFID Survivability Requirement Analysis and Specification (United States)

    Zuo, Yanjun; Pimple, Malvika; Lande, Suhas

    Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID are used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable, and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, survivability requirements analysis is an important first step in survivability specification, compliance formulation, and proof verification.

  13. Analysis of Urine as Indicators of Specific Body Conditions (United States)

    Dey, Souradeep; Saha, Triya; Narendrakumar, Uttamchand


    Urinalysis can be defined as a procedure for examining various factors of urine, which include physical properties, particulate matter, cells, casts, crystals, organisms, and solutes. Urinalysis is recommended as part of the initial examination of all patients because it is cheap, feasible, and gives productive results. This paper focuses on the analysis of urine collected at specific body conditions. Here we illustrate the urine profiles of different persons with various body conditions, including urinary tract infection, strenuous exercise, regular back pain, very low urine output, and a person who is on a 24-hour diet. Examination of urine collected from different persons with specific body conditions usually helps in the diagnosis of the various diseases it indicates.

  14. Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination (United States)

    Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu


    Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data are mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases for two years ahead. According to the ARIMA model predictions, a very low number of cases (below 1) is expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from high onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low level of expected cases as predicted by ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series for predicting case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370
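    The forecasting step described above can be sketched in a few lines. The following is a hedged, minimal ARIMA(1,1,0)-style forecast in pure Python on synthetic monthly counts; it is not the authors' R code, and the series below is invented for illustration only.

```python
# Minimal sketch: fit AR(1) on first differences by least squares,
# then forecast 24 months ahead (an ARIMA(1,1,0)-style model).
# The monthly counts are synthetic, not the Mexican surveillance data.

def arima_110_forecast(series, steps):
    """Forecast `steps` values ahead from an AR(1) fit on first differences."""
    diffs = [b - a for a, b in zip(series, series[1:])]
    x, y = diffs[:-1], diffs[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var if var else 0.0   # AR(1) coefficient
    c = my - phi * mx                 # intercept
    level, d = series[-1], diffs[-1]
    out = []
    for _ in range(steps):
        d = c + phi * d               # next predicted difference
        level += d                    # integrate back to the level
        out.append(level)
    return out

# Synthetic declining monthly counts with a mild seasonal ramp.
history = [max(0.0, 120 - 0.9 * t + 10 * ((t % 12) / 12)) for t in range(120)]
forecast = arima_110_forecast(history, 24)
print(len(forecast), round(forecast[-1], 1))
```

    In practice one would compare several (p, d, q) orders by AIC, as the study does; the fixed (1,1,0) order here is only a placeholder.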

  15. Use of a Principal Components Analysis for the Generation of Daily Time Series. (United States)

    Dreveton, Christine; Guillou, Yann


    A new approach for generating daily time series is considered in response to the weather-derivatives market. This approach consists of performing a principal components analysis to create independent variables, the values of which are then generated separately with a random process. Weather derivatives are financial or insurance products that give companies the opportunity to cover themselves against adverse climate conditions. The aim of a generator is to provide a wider range of feasible situations to be used in an assessment of risk. Generation of a temperature time series is required by insurers or bankers for pricing weather options. The provision of conditional probabilities and a good representation of the interannual variance are the main challenges of a generator when used for weather derivatives. The generator was developed according to this new approach using a principal components analysis and was applied to the daily average temperature time series of the Paris-Montsouris station in France. The observed dataset was homogenized and the trend was removed to represent correctly the present climate. The results obtained with the generator show that it represents correctly the interannual variance of the observed climate; this is the main result of the work, because one of the main discrepancies of other generators is their inability to represent accurately the observed interannual climate variance, and this discrepancy is not acceptable for an application to weather derivatives. The generator was also tested to calculate conditional probabilities: for example, the knowledge of the aggregated value of heating degree-days in the middle of the heating season allows one to estimate the probability of reaching a threshold at the end of the heating season. This represents the main application of a climate generator for use with weather derivatives.
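    The core idea of the generator, decorrelating the variables with a principal components analysis and then sampling each component score independently, can be sketched as follows. This is a toy two-variable illustration with synthetic data, not the Paris-Montsouris generator; the hand-rolled 2x2 eigendecomposition is an assumption made to keep the example dependency-free.

```python
# Sketch: PCA-based generation. (1) estimate the covariance of the observed
# variables, (2) find its principal axes, (3) sample each (independent)
# component score from a normal with the matching eigenvalue variance,
# (4) rotate back to the original variables.
import math, random

random.seed(0)

# Toy "observed" data: two correlated daily anomalies (e.g. Tmin, Tmax).
n = 365
tmin = [random.gauss(0, 2) for _ in range(n)]
tmax = [x + random.gauss(0, 1) for x in tmin]   # correlated with tmin

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

# 2x2 covariance matrix and its eigen-decomposition (principal axes).
c11, c22, c12 = cov(tmin, tmin), cov(tmax, tmax), cov(tmin, tmax)
tr, det = c11 + c22, c11 * c22 - c12 * c12
lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)    # larger eigenvalue
lam2 = tr / 2 - math.sqrt(tr * tr / 4 - det)
v1 = (c12, lam1 - c11)                          # eigenvector for lam1
norm = math.hypot(*v1)
v1 = (v1[0] / norm, v1[1] / norm)
v2 = (-v1[1], v1[0])                            # orthogonal complement

# Generate: sample the two independent component scores, rotate back.
gen = []
for _ in range(n):
    s1 = random.gauss(0, math.sqrt(max(lam1, 0.0)))
    s2 = random.gauss(0, math.sqrt(max(lam2, 0.0)))
    gen.append((s1 * v1[0] + s2 * v2[0], s1 * v1[1] + s2 * v2[1]))

gen_tmin = [g[0] for g in gen]
gen_tmax = [g[1] for g in gen]
# The generated cross-covariance should roughly match the observed one.
print(round(cov(gen_tmin, gen_tmax), 2), round(c12, 2))
```

    The real generator works on many more variables and restores the seasonal cycle and trend after generation; those steps are omitted here.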

  16. Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni (United States)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina


    Time series analysis is critical to the study of land cover/land use changes and climate. Time series studies at local-to-regional scales require higher spatial resolution, such as 1km or less, data. MODIS land products of 250m to 1km resolution enable such studies. However, such MODIS land data files are distributed in 10°x10° tiles, due to large data volumes. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure) that enables users to explore, analyze, and download data over an area and time period of interest easily. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1km land surface temperature and active fire, monthly 1km vegetation index, and yearly 0.05°, 500m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to do further analyses of environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.

  17. Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis. (United States)

    Moser, Albine; Korstjens, Irene


    In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By 'novice' we mean Master's students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research.

  18. Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Czekala, Ian [Kavli Institute for Particle Astrophysics and Cosmology, Stanford University, Stanford, CA 94305 (United States); Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Ghosh, Sujit K. [Department of Statistics, NC State University, 2311 Stinson Drive, Raleigh, NC 27695 (United States); Montet, Benjamin T. [Department of Astronomy and Astrophysics, University of Chicago, 5640 S. Ellis Avenue, Chicago, IL 60637 (United States); Newton, Elisabeth R., E-mail: [Massachusetts Institute of Technology, Cambridge, MA 02138 (United States)


    Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
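    To illustrate the kind of prior the abstract describes, the sketch below draws a random smooth function from a Gaussian process with a squared-exponential kernel. This is a generic GP illustration, not the authors' disentangling code; the kernel parameters, grid, and jitter are arbitrary assumptions.

```python
# Sketch: sampling from a GP prior. Build the kernel matrix, take its
# Cholesky factor, and multiply by standard-normal draws. A small jitter
# on the diagonal keeps the factorization numerically stable.
import math, random

def sq_exp_kernel(x1, x2, amp=1.0, ell=0.5):
    """Squared-exponential covariance between inputs x1 and x2."""
    return amp * math.exp(-((x1 - x2) ** 2) / (2 * ell ** 2))

def cholesky(a):
    """Plain Cholesky factorization of a symmetric positive-definite matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

random.seed(2)
xs = [0.3 * i for i in range(10)]                 # arbitrary input grid
K = [[sq_exp_kernel(a, b) + (1e-6 if i == j else 0.0)
      for j, b in enumerate(xs)] for i, a in enumerate(xs)]
L = cholesky(K)
z = [random.gauss(0, 1) for _ in xs]
sample = [sum(L[i][k] * z[k] for k in range(len(xs))) for i in range(len(xs))]
print([round(s, 2) for s in sample])
```

    In the paper's setting, functions like this one model rest-frame stellar spectra, and the GP hyperparameters are inferred jointly with the orbital parameters.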

  19. Combined use of correlation dimension and entropy as discriminating measures for time series analysis (United States)

    Harikrishnan, K. P.; Misra, R.; Ambika, G.


    We show that the combined use of correlation dimension (D2) and correlation entropy (K2) as discriminating measures can extract more accurate information regarding the different types of noise present in time series data. For this, we make use of an algorithmic approach for computing D2 and K2 proposed by us recently [Harikrishnan KP, Misra R, Ambika G, Kembhavi AK. Physica D 2006;215:137; Harikrishnan KP, Ambika G, Misra R. Mod Phys Lett B 2007;21:129; Harikrishnan KP, Misra R, Ambika G. Pramana - J Phys, in press], which is a modification of the standard Grassberger-Procaccia scheme. While the presence of white noise can be easily identified by computing D2 of the data and surrogates, K2 is a better discriminating measure for detecting colored noise in the data. Analysis of a time series from a real-world system involving both white and colored noise is presented as evidence. To our knowledge, this is the first time that such a combined analysis has been undertaken on real-world data.
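    The correlation sum underlying both D2 and K2 estimates can be illustrated compactly. The sketch below is a naive Grassberger-Procaccia-style estimate of D2 on a synthetic one-dimensional set; the authors' algorithm is a modified, automated version of this scheme, so treat this only as the basic idea.

```python
# Sketch: the correlation sum C(r) is the fraction of point pairs closer
# than r; D2 is the slope of log C(r) versus log r at small r.
import math, random

def correlation_sum(points, r):
    """Fraction of point pairs closer than r (Euclidean distance)."""
    n, count = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2 * count / (n * (n - 1))

random.seed(1)
# Points on a line segment embedded in 2-D: true correlation dimension is 1.
pts = [(t, 2 * t) for t in (random.random() for _ in range(300))]
r1, r2 = 0.1, 0.2
c1, c2 = correlation_sum(pts, r1), correlation_sum(pts, r2)
d2 = (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))
print(round(d2, 2))
```

    K2 is estimated analogously from how the correlation sum decays with embedding dimension; that extension is omitted here for brevity.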

  20. Visibility graph analysis of heart rate time series and bio-marker of congestive heart failure (United States)

    Bhaduri, Anirban; Bhaduri, Susmita; Ghosh, Dipak


    RR-interval time series in congestive heart failure have been studied with a variety of methods, including non-linear ones. In this article the cardiac dynamics of the heart beat are explored in the light of complex network analysis, viz. the visibility graph method. Heart beat (RR interval) time series data taken from the Physionet database [46, 47], belonging to two groups of subjects, diseased (congestive heart failure; 29 in number) and normal (54 in number), are analyzed with the technique. The overall results show that a quantitative parameter can significantly differentiate between the diseased and the normal subjects as well as between different stages of the disease. Further, when the data are split into periods of around 1 hour each and analyzed separately, the same consistent differences appear. This quantitative parameter obtained using visibility graph analysis can thereby be used as a potential bio-marker as well as a subsequent alarm-generation mechanism for predicting the onset of congestive heart failure.
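    The natural visibility graph used in the study connects two samples whenever the straight line between them passes above all intermediate samples. A minimal sketch on a toy RR-interval series (values invented for illustration) follows:

```python
# Sketch: natural visibility graph. Samples i and j are connected if every
# sample k between them lies strictly below the line joining (i, y_i) and
# (j, y_j). Degree statistics of this graph are the kind of quantitative
# parameter derived from RR-interval series.

def visibility_edges(series):
    """Return the edges (i, j) of the natural visibility graph of `series`."""
    edges = []
    n = len(series)
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[i] + (series[j] - series[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

rr = [0.8, 1.2, 0.7, 1.0, 0.9, 1.3, 0.6]   # toy RR intervals (seconds)
edges = visibility_edges(rr)
degree = [sum(1 for e in edges if i in e) for i in range(len(rr))]
print(edges)
print(degree)   # peaks (high values) "see" many neighbours
```

    Note how the local maxima at indices 1, 3 and 5 collect the highest degrees; summarizing such degree distributions is what turns a time series into a network measure.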

  1. Multifractal analysis of visibility graph-based Ito-related connectivity time series. (United States)

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano


    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing from any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of the visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.

  2. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care : A Proof-of-Principle Study

    NARCIS (Netherlands)

    van der Krieke, Lian; Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith Gm; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter


    BACKGROUND: Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However,

  3. Practical analysis of specificity-determining residues in protein families. (United States)

    Chagoyen, Mónica; García-Martín, Juan A; Pazos, Florencio


    Determining the residues that are important for the molecular activity of a protein is a topic of broad interest in biomedicine and biotechnology. This knowledge can help in understanding the protein's molecular mechanism as well as in fine-tuning its natural function, eventually with biotechnological or therapeutic implications. Some protein residues are essential for the function common to all members of a family of proteins, while others explain the particular specificities of certain subfamilies (such as binding to different substrates or cofactors, or distinct binding affinities). Owing to the difficulty of determining them experimentally, a number of computational methods have been developed to detect these functional residues, generally known as 'specificity-determining positions' (SDPs), from a collection of homologous protein sequences. These methods are mature enough to be routinely used by molecular biologists in directing experiments aimed at gaining insight into the functional specificity of a family of proteins and eventually modifying it. In this review, we summarize some of the recent discoveries achieved through computational SDP identification in a number of relevant protein families, as well as the main approaches and software tools available to perform this type of analysis. © The Author 2015. Published by Oxford University Press. For Permissions, please email:
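    The intuition behind SDP detection can be shown on a toy alignment: a position conserved within each subfamily but different between subfamilies is a candidate SDP. The sketch below is a deliberately naive rule, not one of the statistical methods reviewed in the article, and the sequences are invented.

```python
# Sketch: flag alignment columns that are fully conserved within each
# subfamily but differ between subfamilies (candidate SDPs). Real methods
# score conservation statistically over large alignments.

subfamily_a = ["MKVL", "MKVL", "MKIL"]
subfamily_b = ["MRVL", "MRVL", "MRVL"]

def column(seqs, i):
    return [s[i] for s in seqs]

sdp_candidates = []
for i in range(len(subfamily_a[0])):
    col_a, col_b = column(subfamily_a, i), column(subfamily_b, i)
    conserved_a = len(set(col_a)) == 1
    conserved_b = len(set(col_b)) == 1
    if conserved_a and conserved_b and col_a[0] != col_b[0]:
        sdp_candidates.append(i)

print(sdp_candidates)   # only position 1 (K in subfamily A, R in B) qualifies
```

    Position 0 is conserved across both subfamilies (family-wide functional residue), position 2 varies within subfamily A, and position 3 is identical everywhere, so none of them is flagged.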

  4. Analysis specifications for the CC3 geosphere model GEONET

    International Nuclear Information System (INIS)

    Melnyk, T.W.


    AECL is assessing a concept for disposing of Canada's nuclear fuel waste in a sealed vault deep in plutonic rock of the Canadian Shield. As an analytical tool for the postclosure assessment case study, a system model, CC3 (Canadian Concept, generation 3), has been developed to describe a hypothetical disposal system. This system model includes separate models for the engineered barriers within the disposal vault, the geosphere in which the vault is emplaced, and the biosphere in the vicinity of any discharge zones. The system model is embedded within the computer code SYVAC3 (SYstems Variability Analysis Code, generation 3), which takes parameter uncertainty into account by repeated simulation of the system. GEONET (GEOsphere NETwork) is the geosphere model component of this system model. It simulates contaminant transport from the vault to the biosphere along a transport network composed of one-dimensional transport segments that are connected together in three-dimensional space. This document is a set of specifications for GEONET that were developed over a number of years. Improvements to the code will be based on revisions to these specifications. The specifications consist of a model synopsis, describing all the relevant equations and assumptions used in the model, a set of formal data flow diagrams and minispecifications, and a data dictionary. (author). 26 refs., 20 figs

  5. Investigation on Law and Economics Based on Complex Network and Time Series Analysis (United States)

    Yang, Jian; Qu, Zhao; Chang, Hui


    The research focuses on the cooperative relationships and strategic tendencies among three mutually interacting parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to obtain quantitative evidence. Moreover, this paper builds up a fundamental model describing the particular interaction among them through an evolutionary game. Combining the results of the data analysis and the current situation, it is justifiable to put forward reasonable legislative recommendations for regulating lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary games to the issue of corporate financing. PMID:26076460

  6. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position) (United States)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.


    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.

  7. Analysis of the Main Factors Influencing Food Production in China Based on Time Series Trend Chart

    Institute of Scientific and Technical Information of China (English)

    Shuangjin WANG; Jianying LI


    Based on the annual sample data on food production in China since the reform and opening up, we select 8 main factors influencing total food production (growing area, application rate of chemical fertilizer, effective irrigation area, the affected area, total machinery power, food production cost index, food production price index, and financial funds for supporting agriculture, farmers and the countryside), and put them into the categories of material input, resources and environment, and policy factors. Using factor analysis, we carry out a multi-angle analysis of these typical influencing factors one by one through the time series trend chart. It is found that the application rate of chemical fertilizer, the growing area of food crops and the drought-affected area are the key factors affecting food production. On this basis, we set forth corresponding recommendations for improving comprehensive food production capacity.

  8. Automated preparation of Kepler time series of planet hosts for asteroseismic analysis

    DEFF Research Database (Denmark)

    Handberg, R.; Lund, M. N.


    One of the tasks of the Kepler Asteroseismic Science Operations Center (KASOC) is to provide asteroseismic analyses on Kepler Objects of Interest (KOIs). However, asteroseismic analysis of planetary host stars presents some unique complications with respect to data preprocessing, compared to pure asteroseismic targets. If not accounted for, the presence of planetary transits in the photometric time series often greatly complicates or even hinders these asteroseismic analyses. This drives the need for specialised methods of preprocessing data to make them suitable for asteroseismic analysis. In this paper we present the KASOC Filter, which is used to automatically prepare data from the Kepler/K2 mission for asteroseismic analyses of solar-like planet host stars. The methods are very effective at removing unwanted signals of both instrumental and planetary origins and produce significantly cleaner...

  9. Time series modeling for analysis and control advanced autopilot and monitoring systems

    CERN Document Server

    Ohtsu, Kohei; Kitagawa, Genshiro


    This book presents multivariate time series methods for the analysis and optimal control of feedback systems. Although ships’ autopilot systems are considered through the entire book, the methods set forth in this book can be applied to many other complicated, large, or noisy feedback control systems for which it is difficult to derive a model of the entire system based on theory in that subject area. The basic models used in this method are the multivariate autoregressive model with exogenous variables (ARX) model and the radial bases function net-type coefficients ARX model. The noise contribution analysis can then be performed through the estimated autoregressive (AR) model and various types of autopilot systems can be designed through the state–space representation of the models. The marine autopilot systems addressed in this book include optimal controllers for course-keeping motion, rolling reduction controllers with rudder motion, engine governor controllers, noise adaptive autopilots, route-tracki...

  10. Use of a prototype pulse oximeter for time series analysis of heart rate variability (United States)

    González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica


    This work presents the development of a low-cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad-spectrum photodetector used to register time series of heart rate and oxygen saturation of blood. Besides providing these values like any other pulse oximeter, the platform processes the signals to compute a power spectrum analysis of the patient's heart rate variability in real time; additionally, the device allows access to all raw and analyzed data if database construction or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data in real-life activities, enabling the development of future wearable applications.

  11. The Usage of Time Series Control Charts for Financial Process Analysis

    Directory of Open Access Journals (Sweden)

    Kovářík Martin


    Full Text Available We deal with the financial processes of a company using methods of SPC (Statistical Process Control), specifically time series control charts. The paper outlines the intersection of two disciplines: econometrics and statistical process control. The theoretical part discusses the methodology of time series control charts, and in the research part this methodology is demonstrated in three case studies. The first study focuses on the regulation of simulated financial flows for a company by a CUSUM control chart. The second study involves the regulation of financial flows for a heteroskedastic financial process by an EWMA control chart. The last case study is devoted to applications of ARIMA, EWMA and CUSUM control charts to financial data that are sensitive to mean shifts while exhibiting autocorrelation. In this paper, we highlight the versatility of control charts not only in manufacturing but also in managing the financial stability of cash flows.
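    A tabular CUSUM chart of the kind applied in the first case study can be sketched as follows. The reference value k and decision interval h follow common SPC conventions, and the cash-flow numbers are synthetic; none of this is taken from the paper itself.

```python
# Sketch: tabular (two-sided) CUSUM. The upper/lower statistics accumulate
# deviations from the target beyond a slack k; crossing the decision
# interval h signals an out-of-control process.

def cusum(series, target, k, h):
    """Return upper/lower CUSUM statistics and indices of signalling points."""
    cp = cm = 0.0
    upper, lower, signals = [], [], []
    for i, x in enumerate(series):
        cp = max(0.0, cp + (x - target - k))   # accumulates upward shifts
        cm = max(0.0, cm + (target - x - k))   # accumulates downward shifts
        upper.append(cp)
        lower.append(cm)
        if cp > h or cm > h:
            signals.append(i)
    return upper, lower, signals

# In-control flows around 100, then a sustained upward shift of ~3 units.
flows = [100, 99, 101, 100, 98, 102, 100, 103, 104, 103, 105, 104]
upper, lower, signals = cusum(flows, target=100, k=0.5, h=5)
print(signals)   # the chart signals from index 8 onward
```

    CUSUM charts accumulate small persistent deviations, which is why they detect the modest sustained shift here that a plain Shewhart chart might miss.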

  12. Short-term forecasting of meteorological time series using Nonparametric Functional Data Analysis (NPFDA) (United States)

    Curceac, S.; Ternynck, C.; Ouarda, T.


    Over the past decades, a substantial amount of research has been conducted to model and forecast climatic variables. In this study, Nonparametric Functional Data Analysis (NPFDA) methods are applied to forecast air temperature and wind speed time series in Abu Dhabi, UAE. The dataset consists of hourly measurements recorded for a period of 29 years, 1982-2010. The novelty of the Functional Data Analysis approach is in expressing the data as curves. In the present work, the focus is on daily forecasting and the functional observations (curves) express the daily measurements of the above-mentioned variables. We apply a non-linear regression model with a functional non-parametric kernel estimator. The computation of the estimator is performed using an asymmetrical quadratic kernel function for local weighting based on the bandwidth obtained by a cross-validation procedure. The proximities between functional objects are calculated by families of semi-metrics based on derivatives and Functional Principal Component Analysis (FPCA). Additionally, functional conditional mode and functional conditional median estimators are applied and the advantages of combining their results are analysed. A different approach employs a SARIMA model selected according to the minimum Akaike (AIC) and Bayesian (BIC) Information Criteria and based on the residuals of the model. The performance of the models is assessed by calculating error indices such as the root mean square error (RMSE), relative RMSE, BIAS and relative BIAS. The results indicate that the NPFDA models provide more accurate forecasts than the SARIMA models. Key words: Nonparametric functional data analysis, SARIMA, time series forecast, air temperature, wind speed


    Directory of Open Access Journals (Sweden)

    I. P. Stamat


    Full Text Available In the Sanitary Rules SR and SR, the classification of industrial waste containing naturally occurring radioactive materials is adopted in accordance with the values of their effective specific activity Aeff. In the case of disturbed equilibrium in the 238U and 232Th series, it is necessary to take into consideration the actual contribution of the separate natural radionuclides of the mentioned series to the gamma dose rate of the waste. This will permit avoiding unjustified overestimation or understatement of the waste category, preventing both unjustified expenditures on waste treatment and the omission of the measures necessary to provide radiation safety.

  14. QuASAR: quantitative allele-specific analysis of reads. (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger


    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail:
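    The basic ASE test the abstract describes, testing the null of a 1:1 allelic ratio from per-allele read counts, can be sketched with an exact binomial test. QuASAR itself goes further (genotype uncertainty, base-call error, over-dispersion); the read counts below are invented.

```python
# Sketch: exact two-sided binomial test of a 1:1 allelic ratio at one
# heterozygous site, given reference/alternate read counts.
from math import comb

def binom_two_sided_p(ref_reads, alt_reads):
    """Two-sided exact binomial p-value for p = 0.5."""
    n = ref_reads + alt_reads
    observed = comb(n, ref_reads) * 0.5 ** n
    # Sum the probabilities of all outcomes at least as extreme as observed.
    return sum(comb(n, k) * 0.5 ** n for k in range(n + 1)
               if comb(n, k) * 0.5 ** n <= observed + 1e-12)

balanced = binom_two_sided_p(12, 10)    # counts consistent with 1:1
imbalanced = binom_two_sided_p(20, 3)   # strong allelic imbalance
print(round(balanced, 3), round(imbalanced, 5))
```

    At real sites, read counts are over-dispersed relative to the binomial, which is one of the effects QuASAR's model explicitly accounts for.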

  15. Forecast models for suicide: Time-series analysis with data from Italy. (United States)

    Preti, Antonio; Lentini, Gianluca


    The prediction of suicidal behavior is a complex task. To fine-tune targeted preventative interventions, predictive analytics (i.e. forecasting future risk of suicide) is more important than exploratory data analysis (pattern recognition, e.g. detection of seasonality in suicide time series). This study sets out to investigate the accuracy of forecasting models of suicide for men and women. A total of 101,499 male suicides and 39,681 female suicides, which occurred in Italy from 1969 to 2003, were investigated. In order to apply the forecasting model and test its accuracy, the time series were split into a training set (1969 to 1996; 336 months) and a test set (1997 to 2003; 84 months). The main outcome was the accuracy of forecasting models of the monthly number of suicides. Four measures of accuracy were used: mean absolute error; root mean squared error; mean absolute percentage error; and mean absolute scaled error. In both male and female suicides a change in the trend pattern was observed, with an increase from 1969 onwards, a maximum around 1990 and a decrease thereafter. The variances attributable to the seasonal and trend components were, respectively, 24% and 64% in male suicides, and 28% and 41% in female ones. Both annual and seasonal historical trends of monthly data contributed to forecasting future trends of suicide with a margin of error around 10%. The finding is clearer in the male than in the female time series of suicide. The main conclusion of the study is that models taking seasonality into account seem able to derive information on deviation from the mean when it occurs as a zenith, but fail to reproduce it when it occurs as a nadir. Preventative efforts should concentrate on the factors that influence increases above the main trend in both seasonal and cyclic patterns of suicides.
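
    The four accuracy measures named above are standard and easy to state precisely. A minimal sketch (function and variable names are illustrative; MASE here follows the common definition that scales the out-of-sample MAE by the in-sample MAE of the naive one-step forecast):

    ```python
    from math import sqrt

    def forecast_accuracy(actual, forecast, train):
        """MAE, RMSE, MAPE (%) and MASE for a test-set forecast,
        with MASE scaled by the naive forecast's in-sample MAE on `train`."""
        n = len(actual)
        errors = [a - f for a, f in zip(actual, forecast)]
        mae = sum(abs(e) for e in errors) / n
        rmse = sqrt(sum(e * e for e in errors) / n)
        mape = 100.0 * sum(abs(e / a) for e, a in zip(errors, actual)) / n
        # naive one-step forecast on the training set: predict the previous value
        naive_mae = (sum(abs(train[i] - train[i - 1]) for i in range(1, len(train)))
                     / (len(train) - 1))
        mase = mae / naive_mae
        return mae, rmse, mape, mase
    ```

    A MASE below 1 means the model beats the naive forecast on average, which makes it a convenient scale-free companion to MAE and RMSE.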

  16. Delimiting Allelic Imbalance of TYMS by Allele-Specific Analysis. (United States)

    Balboa-Beltrán, Emilia; Cruz, Raquel; Carracedo, Angel; Barros, Francisco


    Allelic imbalance of thymidylate synthase (TYMS) is attributed to polymorphisms in the 5'- and 3'-untranslated regions (UTRs). These polymorphisms have been related to the risk of suffering different cancers, for example leukemia, breast or gastric cancer, and to the response to different drugs, among which are methotrexate glutamates, stavudine, and especially 5-fluorouracil (5-FU), as TYMS is its direct target. A vast literature has been published in relation to 5-FU, even suggesting the sole use of these polymorphisms to effectively manage 5-FU dosage. Estimates of the extent to which these polymorphisms influence TYMS expression have in the past been based on functional analysis by luciferase assays and quantification of TYMS mRNA, but both types of study, like the association studies of cancer risk or of toxicity or response to 5-FU, are highly contradictory. Regarding functional assays, the artificial genetic environment created in the luciferase assay and the problems inherent in quantitative polymerase chain reactions (qPCRs), for example the use of a reference gene, may have distorted the results. To avoid these sources of interference, we analyzed the allelic imbalance of TYMS by allele-specific analysis in peripheral blood mononuclear cells (PBMCs) from patients. Allelic imbalance in PBMCs, taken from 40 patients with suspected myeloproliferative haematological diseases, was determined by fluorescent fragment analysis (for the 3'-UTR polymorphism), Sanger sequencing and allele-specific qPCR in multiplex (for the 5'-UTR polymorphisms). For neither the 3'- nor the 5'-UTR polymorphisms did the observed allelic imbalance exceed 1.5-fold. None of the TYMS polymorphisms is statistically associated with allelic imbalance. The results acquired allow us to reject the previously established assertion of a 2- to 4-fold influence of the rs45445694 and rs2853542 polymorphisms on the expression of TYMS, and to narrow its allelic imbalance to 1.5-fold in our population

  17. Accuracy evaluation of Fourier series analysis and singular spectrum analysis for predicting the volume of motorcycle sales in Indonesia (United States)

    Sasmita, Yoga; Darmawan, Gumgum


    This research aims to evaluate the forecasting performance of Fourier Series Analysis (FSA) and Singular Spectrum Analysis (SSA), which are more explorative and do not require parametric assumptions. Both methods are applied to predicting the volume of motorcycle sales in Indonesia from January 2005 to December 2016 (monthly), and both are suitable for data with seasonal and trend components. Technically, FSA represents the series as trend and seasonal components at different frequencies, which are difficult to identify in a time-domain analysis. With a hidden period of 2.918 ≈ 3 and a significant model order of 3, the FSA model is used to predict the testing data. Meanwhile, SSA has two main processes, decomposition and reconstruction. SSA decomposes the time series data into different components. The reconstruction process starts by grouping the decomposition results based on the similar periods of the components in the trajectory matrix. With the optimum window length (L = 53) and grouping effect (r = 4), SSA predicts the testing data. Forecasting accuracy is evaluated using the Mean Absolute Percentage Error (MAPE), Mean Absolute Error (MAE) and Root Mean Square Error (RMSE). The results show that over the next 12 months, SSA has MAPE = 13.54 percent, MAE = 61,168.43 and RMSE = 75,244.92, while FSA has MAPE = 28.19 percent, MAE = 119,718.43 and RMSE = 142,511.17. Therefore, to predict the volume of motorcycle sales in the next period, the SSA method should be used, as it has better performance in terms of accuracy.
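
    The SSA decomposition-reconstruction pipeline described above can be sketched compactly. This is a generic basic-SSA illustration, assuming NumPy is available; the window length and grouping are the analyst's choice (the L = 53, r = 4 values in the abstract are specific to that dataset).

    ```python
    import numpy as np

    def ssa_components(series, L):
        """Basic SSA: embed the series in an L x K trajectory matrix, take its SVD,
        and Hankelize each rank-one term back into a length-N additive component."""
        x = np.asarray(series, dtype=float)
        N = len(x)
        K = N - L + 1
        X = np.column_stack([x[i:i + L] for i in range(K)])  # trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        components = []
        for i in range(len(s)):
            Xi = s[i] * np.outer(U[:, i], Vt[i])
            # diagonal (Hankel) averaging: average each anti-diagonal of Xi
            comp = np.array([Xi[::-1].diagonal(k).mean() for k in range(-(L - 1), K)])
            components.append(comp)
        return components  # grouping/summing selected components gives trend/seasonality
    ```

    Summing all components reconstructs the original series exactly; forecasting then proceeds from the grouped trend and seasonal components.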

  18. Time Series Analysis of the Bacillus subtilis Sporulation Network Reveals Low Dimensional Chaotic Dynamics. (United States)

    Lecca, Paola; Mura, Ivan; Re, Angela; Barker, Gary C; Ihekwaba, Adaoha E C


    Chaotic behavior refers to behavior which, albeit irregular, is generated by an underlying deterministic process, and is therefore potentially controllable. This possibility becomes practically amenable especially when chaos is shown to be low-dimensional, i.e., attributable to a small fraction of the total system's components. In this case, indeed, including the major drivers of chaos into the modeling approach allows us to improve the predictability of the system's dynamics. Here, we analyzed numerical simulations of an accurate ordinary differential equation model of the gene network regulating sporulation initiation in Bacillus subtilis to explore whether the non-linearity underlying the time series data is due to low-dimensional chaos. Low-dimensional chaos is expectedly common in systems with few degrees of freedom, but rare in systems with many degrees of freedom such as the B. subtilis sporulation network. The estimation of a number of indices that reflect the chaotic nature of a system indicates that the dynamics of this network is affected by deterministic chaos. The neat separation between the indices obtained from the time series simulated from the model and those obtained from time series generated by Gaussian white and colored noise confirmed that the B. subtilis sporulation network dynamics is affected by low-dimensional chaos rather than by noise. Furthermore, our analysis identifies the principal driver of the network's chaotic dynamics to be sporulation initiation phosphotransferase B (Spo0B). We then analyzed the parameters and the phase space of the system to characterize the instability points of the network dynamics and, in turn, to identify the ranges of values of Spo0B and of the other drivers of the chaotic dynamics for which the whole system is highly sensitive to minimal perturbation. In summary, we described an unappreciated source of complexity in the B. subtilis sporulation network by gathering
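
    One standard index of the kind used to diagnose deterministic chaos is the largest Lyapunov exponent: positive for chaotic dynamics, negative for stable ones. A minimal sketch on a toy system (the one-dimensional logistic map, not the sporulation model, which is far larger):

    ```python
    from math import log

    def logistic_lyapunov(r, x0=0.3, n=20000, burn=200):
        """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
        estimated by averaging log|f'(x)| = log|r*(1-2x)| along an orbit."""
        x = x0
        for _ in range(burn):          # discard the transient
            x = r * x * (1 - x)
        total = 0.0
        for _ in range(n):
            total += log(abs(r * (1 - 2 * x)))
            x = r * x * (1 - x)
        return total / n
    ```

    At r = 4 the estimate approaches ln 2 ≈ 0.693 (chaotic), while at r = 3.2 it is negative (a stable 2-cycle), which is the kind of separation between deterministic chaos and regular dynamics that such indices provide.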

  19. Religious Activities and Suicide Prevention: A Gender Specific Analysis

    Directory of Open Access Journals (Sweden)

    Steven Stack


    Full Text Available The present analysis contributes to the existing literature on religion and suicide in three interrelated ways: (1) providing an analysis of suicide completions, whereas most research is based on non-lethal levels of suicidality; (2) assessing the relationship with concrete individual-level data on completed suicides instead of aggregated data marked by the ecological fallacy issue; and (3) providing gender-specific analyses to determine if the relationship is gendered. METHODS. Data come from the U.S. Public Health Service, National Mortality Followback Survey. They refer to 16,795 deaths including 1,385 suicides. Significant others of the deceased were interviewed to measure all variables. The dependent variable is a binary variable where 1 = death by suicide and 0 = all other causes. The central independent variable is an index of religious activities. Controls are included for five categories of confounders: (1) psychiatric morbidity; (2) help-seeking behavior; (3) opportunity factors such as firearms; (4) social integration; and (5) demographics. RESULTS. Multivariate logistic regression analysis determined that, controlling for 16 predictors of suicide, a one-unit increase in religious activities reduced the odds of a suicide death by 17% for males and by 15% for females. The difference in coefficients is not significant (Z = 0.51). Other significant predictors of suicide deaths included suicide ideation (OR = 8.87, males; OR = 11.48, females) and firearm availability (OR = 4.21, males; OR = 2.83, females). DISCUSSION. Religious activities were found to lower suicide risk equally for both men and women. Further work is needed to assess pathways, including suicide ideation, between religious activities and lowered suicide risk. This is the first U.S.-based study to test for a gendered association between religion and suicide at the individual level of analysis.
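
    The reported effect sizes connect back to the logistic model in a standard way: a coefficient β corresponds to an odds ratio exp(β), and comparing male and female coefficients uses a Z-test on their difference. A small sketch (the standard errors below are illustrative placeholders, not values from the study):

    ```python
    from math import exp, log, sqrt

    def percent_odds_change(beta):
        """Percent change in the odds of the outcome per one-unit predictor increase."""
        return 100.0 * (exp(beta) - 1.0)

    def coef_difference_z(b1, se1, b2, se2):
        """Z statistic for the difference between two independent coefficients."""
        return (b1 - b2) / sqrt(se1 ** 2 + se2 ** 2)

    # the abstract's 17% reduction in odds corresponds to beta = ln(0.83)
    beta_male = log(0.83)
    ```

    A |Z| well below 1.96, as with the reported Z = 0.51, indicates no significant male-female difference at the 5% level.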

  20. Tract specific analysis in patients with sickle cell disease (United States)

    Chai, Yaqiong; Coloigner, Julie; Qu, Xiaoping; Choi, Soyoung; Bush, Adam; Borzage, Matt; Vu, Chau; Lepore, Natasha; Wood, John


    Sickle cell disease (SCD) is a hereditary blood disorder in which the oxygen-carrying hemoglobin molecule in red blood cells is abnormal. It affects numerous people worldwide and leads to a shorter life span, pain, anemia, serious infections and neurocognitive decline. Tract-Specific Analysis (TSA) is a statistical method for evaluating white matter alterations due to neurocognitive diseases using diffusion tensor magnetic resonance images. Here, for the first time, TSA is used to compare 11 major brain white matter (WM) tracts between SCD patients and age-matched healthy subjects. Alterations are found in the corpus callosum (CC), the cortico-spinal tract (CST), the inferior fronto-occipital fasciculus (IFO), the inferior longitudinal fasciculus (ILF), the superior longitudinal fasciculus (SLF), and the uncinate fasciculus (UNC). Based on previous studies of the neurocognitive functions of these tracts, the significant areas found in this paper may be related to several cognitive impairments and depression, both of which are observed in SCD patients.

  1. Analysis of time series for postal shipments in Regional VII East Java Indonesia (United States)

    Kusrini, DE; Ulama, B. S. S.; Aridinanti, L.


    The changes in the number of goods delivered through PT. Pos Regional VII East Java Indonesia indicate that the increasing and decreasing trends in the delivery of documents and non-documents are strongly influenced by conditions outside of PT. Pos Regional VII East Java Indonesia, so that predicting the number of documents and non-documents requires a model that can accommodate this. Based on the time series plot, monthly data fluctuations occur from 2013-2016, so modeling is done using ARIMA or seasonal ARIMA, and the best model is selected based on the smallest AIC value. The analysis of the number of shipments of each product sent through the Sub-Regional Postal Office VII East Java indicates that 5 of the 26 post offices fall within the territory. The largest numbers of shipments are on the PPB (Paket Pos Biasa, regular package shipment/non-document) and SKH (Surat Kilat Khusus, Special Express Mail/document) products. The time series models generated are largely random walk models, meaning that the number of future shipments is influenced by random effects that are difficult to predict. Some are AR and MA models, except for Express shipment products with the Malang post office destination, which has a seasonal ARIMA model at lags 6 and 12. This means that the number of items in the following month is affected by the number of items in the previous 6 months.

  2. The Relationship between Logistics and Economic Development in Indonesia: Analysis of Time Series Data

    Directory of Open Access Journals (Sweden)

    Mohammad Reza


    Full Text Available This paper investigates the relationship between logistics and economic development in Indonesia using time series data on traffic volume and economic growth for the period from 1988 to 2010. Literature reviews were conducted to find the most applicable econometric model. The volume of cargo travelling by sea, air and rail is used as the logistics index, while GDP is used as the economic index. The time series data were tested using stationarity and co-integration tests. Granger causality tests were employed, and a logistics model is then proposed. This study showed that logistics plays an important role in supporting and sustaining economic growth, with economic growth exerting a significant demand-pull effect on logistics. Although the model is developed in the context of Indonesia, the overall statistical analysis can be generalized to other developing economies. Based on the model, this paper presents the importance of sustaining economic development by continuously improving the logistics infrastructure.

  3. Sustainability of Italian budgetary policies: a time series analysis (1862-2013

    Directory of Open Access Journals (Sweden)

    Gordon L. Brady


    Full Text Available In this paper, we analyze the sustainability of Italian public finances using a unique database covering the period 1862-2013. The paper focuses on empirical tests for the sustainability and solvency of fiscal policies. A necessary but not sufficient condition implies that the growth rate of public debt should in the limit be smaller than the asymptotic rate of interest; in addition, the debt-to-GDP ratio must eventually stabilize at a steady-state level. The results of unit root and stationarity tests show that the variables are non-stationary in levels, but stationary in first differences, i.e. I(1). However, some breaks in the series emerge, given internal and external crises (wars, oil shocks, regime changes, institutional reforms). Therefore, the empirical analysis is conducted for the entire period as well as for two sub-periods (1862-1913 and 1947-2013). Anecdotal evidence and visual inspection of the series confirm our results. Furthermore, we conduct cointegration tests, which show that a long-run relationship between public expenditure and revenues holds only for the first sub-period (1862-1913). In essence, the paper's results reveal that Italy has had sustainability problems in the Republican age.

  4. Free vibration characteristics analysis of rectangular plate with rectangular opening based on Fourier series method

    Directory of Open Access Journals (Sweden)

    WANG Minhao


    Full Text Available Plate structures with openings are common in many engineering structures, and the study of the vibration characteristics of such structures is directly related to the vibration reduction, noise reduction and stability analysis of the overall structure. This paper investigates the free vibration characteristics of a thin elastic plate with a rectangular opening at an arbitrary position, with edges parallel to those of the plate. We use an improved Fourier series to represent the admissible displacement function of the rectangular plate with an opening, dividing the plate into eight zones to simplify the calculation. Linear springs, uniformly distributed along the boundaries, are used to simulate the classical boundary conditions and the conditions on the boundaries between the regions. From the energy functional and the variational method, we obtain the overall energy functional, and then the generalized eigenvalue matrix equation by seeking the extremum over the unknown improved Fourier series expansion coefficients. Solving this equation yields the natural frequencies and corresponding mode shapes of the rectangular plate with an opening. We compare the calculated results with the finite element method to verify the accuracy and effectiveness of the proposed method. Finally, we investigate the influence of the boundary conditions, opening size and opening position on the vibration characteristics of a plate with an opening. This provides a theoretical reference for practical engineering applications.

  5. Parametric time series analysis of geoelectrical signals: an application to earthquake forecasting in Southern Italy

    Directory of Open Access Journals (Sweden)

    V. Tramutoli


    Full Text Available An autoregressive model was selected to describe geoelectrical time series. An objective technique was subsequently applied to analyze and discriminate values above (below) an a priori fixed threshold, possibly related to seismic events. A complete check of the model and the main guidelines for estimating the occurrence probability of extreme events are reported. A first application of the proposed technique is discussed through the analysis of experimental data recorded by an automatic station located in Tito, a small town on the Apennine chain in Southern Italy. This region was hit by the November 1980 Irpinia-Basilicata earthquake and is one of the most active areas of the Mediterranean region. After a preliminary filtering procedure to reduce the influence of external parameters (i.e., meteo-climatic effects), it was demonstrated that the geoelectrical residual time series are well described by a second-order autoregressive model. Our findings outline a statistical methodology for evaluating the efficiency of electrical seismic precursors.
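
    A second-order autoregressive model like the one fitted to the geoelectrical residuals can be estimated from sample autocovariances via the Yule-Walker equations. A minimal sketch of generic AR(2) fitting, not the paper's full pipeline:

    ```python
    import random

    def autocovariance(x, lag):
        n = len(x)
        mean = sum(x) / n
        return sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n - lag)) / n

    def fit_ar2(x):
        """Yule-Walker estimates (a1, a2) for x_t = a1*x_{t-1} + a2*x_{t-2} + e_t."""
        rho1 = autocovariance(x, 1) / autocovariance(x, 0)
        rho2 = autocovariance(x, 2) / autocovariance(x, 0)
        a1 = rho1 * (1 - rho2) / (1 - rho1 ** 2)
        a2 = (rho2 - rho1 ** 2) / (1 - rho1 ** 2)
        return a1, a2

    # quick check on a simulated AR(2) series with known coefficients
    random.seed(42)
    x = [0.0, 0.0]
    for _ in range(5000):
        x.append(0.5 * x[-1] - 0.3 * x[-2] + random.gauss(0.0, 1.0))
    ```

    On the simulated series the estimates recover the true coefficients (0.5, -0.3) to within sampling error, which is the same consistency check one would run before trusting the fitted geoelectrical model.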

  6. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.


    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal, be consistent with the signal's frequency spectrum, and be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to the AR model's uncertainties for continuous noisy signals; yet such an evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others, since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used as continuous noisy data. (author)

  7. Economic feasibility of biogas production in swine farms using time series analysis

    Directory of Open Access Journals (Sweden)

    Felipe Luis Rockenbach


    Full Text Available ABSTRACT: This study aimed to measure the economic feasibility of, and the time needed to recover the capital invested in, the installation of a swine manure treatment system, with revenues originating from the sale of carbon credits and/or the offsetting of electric energy on swine farms, using Box-Jenkins forecast models. It was found that the use of biogas is a viable option on a large scale with machines that operate daily for 10 h or more, with a payback period of between 70 and 80 months. Time series analysis models are important for anticipating the behavior of the series under study, providing the swine breeder/investor with means to reduce the financial investment risk as well as helping to decrease production costs. Moreover, this process can be seen as an additional source of income and can enable the breeder to be self-sufficient in the continuous supply of electric energy, which is very valuable nowadays considering that breeders are increasingly using various technologies.

  8. River catchment rainfall series analysis using additive Holt-Winters method (United States)

    Puah, Yan Jun; Huang, Yuk Feng; Chua, Kuan Chin; Lee, Teang Shui


    Climate change is receiving more attention from researchers as the frequency of severe natural disasters increases. Tropical countries like Malaysia have no distinct four seasons, so rainfall has become the popular parameter for assessing climate change. Conventional ways of determining rainfall trends can only provide a general result, in a single direction, for the whole study period. In this study, rainfall series were modelled using the additive Holt-Winters method to examine the rainfall pattern in the Langat River Basin, Malaysia. Nine homogeneous series with more than 25 years of data and less than 10% missing data were selected. The goodness of fit of the forecast models was measured. It was found that the seasonal rainfall model forecasts are generally better than the monthly rainfall model forecasts. Three stations in the western region exhibited an increasing trend. Rainfall in the southern region showed fluctuation. Increasing trends were discovered at stations in the south-eastern region, except in the seasonal analysis at station 45253. A decreasing trend was found at station 2818110 in the east, while an increasing trend was shown at station 44320, which represents the north-eastern region. The accuracy of both rainfall model forecasts was tested using the recorded data of years 2010-2012. Most of the forecasts are acceptable.
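
    The additive Holt-Winters method used above maintains a level, a trend and m seasonal indices via exponential smoothing. A minimal sketch under simple initialization assumptions (the smoothing constants are illustrative defaults; a study like this one would optimize them):

    ```python
    def holt_winters_additive(x, m, alpha=0.3, beta=0.1, gamma=0.1, h=1):
        """Additive Holt-Winters smoothing for a series x with season length m;
        returns forecasts for the next h steps."""
        # crude initialization from the first two seasons
        level = sum(x[:m]) / m
        trend = (sum(x[m:2 * m]) - sum(x[:m])) / m ** 2
        season = [x[i] - level for i in range(m)]
        for t in range(m, len(x)):
            prev_level = level
            level = alpha * (x[t] - season[t % m]) + (1 - alpha) * (level + trend)
            trend = beta * (level - prev_level) + (1 - beta) * trend
            season[t % m] = gamma * (x[t] - level) + (1 - gamma) * season[t % m]
        n = len(x)
        return [level + (k + 1) * trend + season[(n + k) % m] for k in range(h)]
    ```

    On a purely linear-plus-seasonal series the recursions converge to the exact decomposition, so the one-step forecast approaches the true next value; on noisy rainfall data the same recursions produce smoothed trend and seasonal estimates.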

  9. Specification, Design, and Analysis of Advanced HUMS Architectures (United States)

    Mukkamala, Ravi


    During the two-year project period, we have worked on several aspects of domain-specific architectures for HUMS. In particular, we looked at using a scenario-based approach for the design and designed a language for describing such architectures. The language is now being used in all aspects of our HUMS design. In particular, we have made contributions in the following areas. 1) We have employed scenarios in the development of HUMS in three main areas: (a) to improve reusability by using scenarios as a library indexing tool and as a domain analysis tool; (b) to improve maintainability by recording design rationales from two perspectives - problem domain and solution domain; (c) to evaluate the software architecture. 2) We have defined a new architectural language called HADL, or HUMS Architectural Definition Language. It is a customized version of xArch/xADL. It is based on XML and, hence, is easily portable from domain to domain, application to application, and machine to machine. Specifications written in HADL can be easily read and parsed using currently available XML parsers; thus, there is no need to develop a plethora of software to support HADL. 3) We have developed an automated design process that involves two main techniques: (a) selection of solutions from a large space of designs; (b) synthesis of designs. However, the automation process is not an absolute Artificial Intelligence (AI) approach, though it uses a knowledge-based system that epitomizes a specific HUMS domain. The process uses a database of solutions as an aid to solve problems rather than creating a new design in the literal sense. Since searching is adopted as the main technique, the challenges involved are: (a) to minimize the effort in searching the database, where a very large number of possibilities exists; (b) to develop representations that conveniently allow us to depict design knowledge evolved over many years; (c) to capture the required information that aid the

  10. Simple and complicated rectal diverticula: endoscopic analysis of a case series from Brazil

    Directory of Open Access Journals (Sweden)

    Guilherme Lang Motta


    Full Text Available INTRODUCTION: Diverticular disease of the colon is a very common condition, present in most of the elderly population. However, the occurrence of rectal diverticula is extremely unusual; it is typically an incidental finding at colonoscopy. OBJECTIVE: To describe the epidemiological, clinical, surgical and endoscopic characteristics of a case series of rectal diverticula in Brazil. METHODS: Four patients with rectal diverticula were analyzed in terms of symptomatology, associated conditions and colonoscopy findings. Endoscopic findings were discussed individually. RESULTS: The prevalence of rectal diverticula at our endoscopy unit was 0.15% of all colonoscopies, affecting 0.74% of patients with colonic diverticulosis. The endoscopic analysis showed diverticulum ostia with a mean size of 2.3 cm, depth of 2.8 cm and anal margin distance of 6.8 cm. Colonoscopy also demonstrated a simple rectal diverticulum in all patients. Diverticula were located in the anterior, right lateral and posterior walls of the rectum. One patient developed diverticulitis as a complication and underwent diverticulectomy. CONCLUSIONS: Rectal diverticulum is an incidental finding at colonoscopy and is associated with diverticulosis. Its rarity and specific colonoscopic characteristics make it a unique entity. Asymptomatic in most cases, it rarely needs intervention; surgery is reserved for complicated cases.

  11. Learning from environmental data: Methods for analysis of forest nutrition time series

    Energy Technology Data Exchange (ETDEWEB)

    Sulkava, M. (Helsinki Univ. of Technology, Espoo (Finland). Computer and Information Science)


    Data analysis methods play an important role in increasing our knowledge of the environment as the amount of data measured from the environment increases. This thesis falls within the scope of environmental informatics and environmental statistics, fields in which data analysis methods are developed and applied to the analysis of environmental data. The environmental data studied in this thesis are time series of nutrient concentration measurements of pine and spruce needles. In addition, there are data on laboratory quality and on related environmental factors, such as the weather and atmospheric deposition. The most important methods used for the analysis of the data are based on the self-organizing map and linear regression models. First, a new clustering algorithm for the self-organizing map is proposed. It is found to provide better results than two other methods for clustering the self-organizing map. The algorithm is used to divide the nutrient concentration data into clusters, and the result is evaluated by environmental scientists. Based on the clustering, the temporal development of the forest nutrition is modeled and the effect of nitrogen and sulfur deposition on the foliar mineral composition is assessed. Second, regression models are used to study how much environmental factors and properties of the needles affect the changes in the nutrient concentrations of the needles between their first and second year of existence. The aim is to build understandable models with good prediction capabilities. Sparse regression models are found to outperform more traditional regression models in this task. Third, fusion of laboratory quality data from different sources is performed to estimate the precisions of the analytical methods. Weighted regression models are used to quantify how much the precision of observations can affect the time needed to detect a trend in environmental time series. The results of power analysis show that improving the

  12. Experiment Operating Specification for the Semiscale MOD-2C feedwater and steam line break experiment series. Appendix S-FS-6 and 7

    International Nuclear Information System (INIS)

    Boucher, T.J.; Owca, W.A.


    This document is the Experiment Operating Specification Appendix for tests S-FS-6 and S-FS-7 of the Semiscale MOD-2C feedwater and steam line break experiment series. Test S-FS-6 is the third test in the series and simulates a 100% break in a steam generator bottom feedwater line downstream of the check valve, accompanied by compounding factors (such as check valve failure, loss-of-offsite power at SIS, and SIS delayed until the low steam generator pressure signal). The test is terminated after plant stabilization and recovery procedures including unaffected loop steam and feed, pressurizer heater operation, pressurizer auxiliary spray operation, and normal charging/letdown operation. Test S-FS-7 is the fourth test in the series and simulates a 14.3% break in a steam generator bottom feedwater line downstream of the check valve, accompanied by compounding factors. The test is terminated after plant stabilization procedures including unaffected loop steam and feed, pressurizer heater operation, and normal charging/letdown operation, and is followed by an affected loop secondary refill after isolating the break. The Appendix contains information on the major fluid systems, initial experiment conditions, experiment boundary conditions, and the sequence of experiment events. Also included is a discussion of the scaling criteria and philosophy used to develop the experiment initial and boundary conditions and system configuration

  13. Development of analysis software for radiation time-series data with the use of visual studio 2005

    International Nuclear Information System (INIS)

    Hohara, Sin-ya; Horiguchi, Tetsuo; Ito, Shin


    Time-series analysis supplies a new vision that conventional analysis methods, such as energy spectroscopy, have never achieved. However, applying time-series analysis to radiation measurements requires considerable software and hardware development. By taking advantage of Visual Studio 2005, we developed analysis software, 'ListFileConverter', for the time-series radiation measurement system known as 'MPA-3'. The software is based on a graphical user interface (GUI) architecture that enables us to save a large amount of operation time in the analysis and, moreover, provides easy access to the special file structure of MPA-3 data. In this paper, the detailed structure of ListFileConverter is fully explained, and experimental results for the counting capability of the MPA-3 hardware system and for neutron measurements with our UTR-KINKI reactor are also given. (author)

  14. Analysis of rhythmic variance - ANORVA. A new simple method for detecting rhythms in biological time series

    Directory of Open Access Journals (Sweden)

    Peter Celec


    Cyclic variations of variables are ubiquitous in biomedical science. A number of methods for detecting rhythms have been developed, but they are often difficult to interpret. A simple procedure for detecting cyclic variations in biological time series and quantifying their probability is presented here. Analysis of rhythmic variance (ANORVA) is based on the premise that the variance in groups of data from rhythmic variables is low when a time distance of one period exists between the data entries. A detailed stepwise calculation is presented, including data entry and preparation, variance calculation, and difference testing. An example of the application of the procedure is provided, and a real dataset of the number of papers published per day in January 2003 using selected keywords is compared to randomized datasets. Randomized datasets show no cyclic variations. The number of papers published daily, however, shows a clear and significant (p < 0.03) circaseptan (period of 7 days) rhythm, probably of social origin.
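
    The grouping idea behind ANORVA can be sketched in a few lines of Python (the helper names here are illustrative, and the published procedure additionally uses randomization to attach a probability, which is omitted): for each candidate period p, entries one period apart are grouped together, and a true rhythm of period p makes those groups nearly constant, so their mean within-group variance dips.

```python
from statistics import pvariance, mean

def anorva_profile(series, max_period):
    """Mean within-group variance for each candidate period.

    For a candidate period p, entries one period apart are grouped
    together; a genuine rhythm of period p makes these groups nearly
    constant, so their variance is low at p.
    """
    profile = {}
    for p in range(2, max_period + 1):
        groups = [series[start::p] for start in range(p)]
        # only groups with at least two entries contribute a variance
        variances = [pvariance(g) for g in groups if len(g) >= 2]
        profile[p] = mean(variances)
    return profile

# Toy series: constant baseline with a spike every 7th day (a circaseptan rhythm).
data = [10 + (3 if i % 7 == 0 else 0) for i in range(70)]
profile = anorva_profile(data, 10)
best_period = min(profile, key=profile.get)  # lowest variance marks the rhythm
```

In this toy case the variance profile reaches zero exactly at the built-in period of 7; with noisy data one would compare the observed minimum against profiles from randomized (shuffled) copies of the series, as the abstract describes.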

  15. Possible signatures of dissipation from time-series analysis techniques using a turbulent laboratory magnetohydrodynamic plasma

    International Nuclear Information System (INIS)

    Schaffner, D. A.; Brown, M. R.; Rock, A. B.


    The frequency spectrum of magnetic fluctuations as measured on the Swarthmore Spheromak Experiment is broadband and exhibits a nearly Kolmogorov 5/3 scaling. It features a steepening region which is indicative of dissipation of magnetic fluctuation energy similar to that observed in fluid and magnetohydrodynamic turbulence systems. Two non-spectrum based time-series analysis techniques are implemented on this data set in order to seek other possible signatures of turbulent dissipation beyond just the steepening of fluctuation spectra. Presented here are results for the flatness, permutation entropy, and statistical complexity, each of which exhibits a particular character at spectral steepening scales which can then be compared to the behavior of the frequency spectrum.

  16. Singular spectrum analysis in nonlinear dynamics, with applications to paleoclimatic time series (United States)

    Vautard, R.; Ghil, M.


    Two dimensions of a dynamical system given by experimental time series are distinguished. Statistical dimension gives a theoretical upper bound for the minimal number of degrees of freedom required to describe the attractor up to the accuracy of the data, taking into account sampling and noise problems. The dynamical dimension is the intrinsic dimension of the attractor and does not depend on the quality of the data. Singular Spectrum Analysis (SSA) provides estimates of the statistical dimension. SSA also describes the main physical phenomena reflected by the data. It gives adaptive spectral filters associated with the dominant oscillations of the system and clarifies the noise characteristics of the data. SSA is applied to four paleoclimatic records. The principal climatic oscillations and the regime changes in their amplitude are detected. About 10 degrees of freedom are statistically significant in the data. Large noise and insufficient sample length do not allow reliable estimates of the dynamical dimension.

  17. Analysis of hohlraum energetics of the SG series and the NIF experiments with energy balance model

    Directory of Open Access Journals (Sweden)

    Guoli Ren


    The basic energy balance model is applied to analyze the hohlraum energetics data from the Shenguang (SG) series laser facilities and the National Ignition Facility (NIF) experiments published in the past few years. The analysis shows that the overall hohlraum energetics data are in agreement with the energy balance model within 20% deviation. The 20% deviation might be caused by the diversity in hohlraum parameters, such as material, laser pulse, gas filling density, etc. In addition, the NIF's ignition target designs and our ignition target designs given by simulations are also in accordance with the energy balance model. This work confirms the value of the energy balance model for ignition target design and experimental data assessment, and demonstrates that the NIF energy is enough to achieve ignition if a 1D spherical radiation drive could be created, provided that both the laser plasma instabilities and the hydrodynamic instabilities could be suppressed.

  18. Event-sequence time series analysis in ground-based gamma-ray astronomy

    International Nuclear Information System (INIS)

    Barres de Almeida, U.; Chadwick, P.; Daniel, M.; Nolan, S.; McComb, L.


    The recent, extreme episodes of variability detected from Blazars by the leading atmospheric Cerenkov experiments motivate the development and application of specialized statistical techniques that enable the study of this rich data set to its furthest extent. The identification of the shortest variability timescales supported by the data and the actual variability structure observed in the light curves of these sources are some of the fundamental aspects being studied, whose answers can bring new developments in the understanding of the physics of these objects and of the mechanisms of production of VHE gamma-rays in the Universe. Some of our efforts in studying the time variability of VHE sources involve the application of dynamic programming algorithms to the problem of detecting change-points in a Poisson sequence. In this particular paper we concentrate on the more primary issue of the applicability of counting statistics to the analysis of time series in VHE gamma-ray astronomy.

  19. Studies in astronomical time series analysis: Modeling random processes in the time domain (United States)

    Scargle, J. D.


    Random process models phrased in the time domain are used to analyze astrophysical time series data produced by random processes. A moving average (MA) model represents the data as a sequence of pulses occurring randomly in time, with random amplitudes. An autoregressive (AR) model represents the correlations in the process in terms of a linear function of past values. The best AR model is determined from sampled data and transformed to an MA model for interpretation. The randomness of the pulse amplitudes is maximized by a FORTRAN algorithm which is relatively stable numerically. Results of test cases are given to study the effects of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the optical light curve of the quasar 3C 273 is given.
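
    The AR-fitting step described above can be illustrated with a short sketch: estimating AR coefficients from sample autocovariances by solving the Yule-Walker equations with the Levinson-Durbin recursion. This is the generic textbook route, not Scargle's FORTRAN implementation, and all names are illustrative.

```python
import random

def autocovariance(x, lag):
    """Biased sample autocovariance at the given lag."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t - lag] - m) for t in range(lag, n)) / n

def yule_walker_ar(x, order):
    """AR coefficients a[1..order] via the Levinson-Durbin recursion."""
    r = [autocovariance(x, k) for k in range(order + 1)]
    a = [0.0] * (order + 1)
    err = r[0]
    for k in range(1, order + 1):
        acc = r[k] - sum(a[j] * r[k - j] for j in range(1, k))
        reflection = acc / err
        new_a = a[:]
        new_a[k] = reflection
        for j in range(1, k):
            new_a[j] = a[j] - reflection * a[k - j]
        a = new_a
        err *= 1.0 - reflection ** 2   # prediction-error variance shrinks
    return a[1:]

# Toy AR(1) process x_t = 0.6 * x_{t-1} + white noise.
random.seed(42)
x = [0.0]
for _ in range(5000):
    x.append(0.6 * x[-1] + random.gauss(0.0, 1.0))
coeffs = yule_walker_ar(x, 1)  # should recover a coefficient near 0.6
```

For order 1 the recursion reduces to the lag-1 autocorrelation, which is why the recovered coefficient tracks the 0.6 used to generate the toy series; the fitted AR model could then be inverted to an MA representation for interpretation, as the abstract describes.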

  20. Fuzzy central tendency measure for time series variability analysis with application to fatigue electromyography signals. (United States)

    Xie, Hong-Bo; Dokos, Socrates


    A new method, namely fuzzy central tendency measure (fCTM) analysis, that could enable measurement of the variability of a time series, is presented in this study. Tests on simulated data sets show that fCTM is superior to the conventional central tendency measure (CTM) in several respects, including improved relative consistency and robustness to noise. The proposed fCTM method was applied to electromyograph (EMG) signals recorded during sustained isometric contraction for tracking local muscle fatigue. The results showed that the fCTM increased significantly during the development of muscle fatigue, and it was more sensitive to the fatigue phenomenon than mean frequency (MNF), the most commonly-used muscle fatigue indicator.
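
    The contrast between the conventional and fuzzy measures can be sketched as follows. This is an illustrative reconstruction of the general idea, not the authors' exact membership function: CTM counts the fraction of points of the second-order difference plot falling inside a radius r, while the fuzzy variant replaces the hard 0/1 indicator with a graded membership.

```python
import math

def second_differences(x):
    """Successive differences, the coordinates of the second-order difference plot."""
    return [x[i + 1] - x[i] for i in range(len(x) - 1)]

def ctm(x, r):
    """Conventional CTM: fraction of (d_i, d_{i+1}) points inside radius r."""
    d = second_differences(x)
    pts = list(zip(d[:-1], d[1:]))
    return sum(1 for a, b in pts if math.hypot(a, b) < r) / len(pts)

def fuzzy_ctm(x, r, gradient=2.0):
    """Fuzzy CTM: the hard 0/1 count becomes an exponential membership grade."""
    d = second_differences(x)
    pts = list(zip(d[:-1], d[1:]))
    return sum(math.exp(-((math.hypot(a, b) / r) ** gradient)) for a, b in pts) / len(pts)

smooth = [math.sin(0.1 * i) for i in range(200)]     # low variability signal
erratic = [(-1) ** i * (i % 7) for i in range(200)]  # high variability signal
```

Both measures score the smooth series higher than the erratic one; the fuzzy version varies continuously with the radius r, which is the source of the improved relative consistency reported in the abstract.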

  1. Nonlinear Analysis on Cross-Correlation of Financial Time Series by Continuum Percolation System (United States)

    Niu, Hongli; Wang, Jun

    We establish a financial price process by a continuum percolation system, in which we attribute price fluctuations to the investors' attitudes towards the financial market, and treat the clusters in continuum percolation as groups of investors sharing the same investment opinion. We investigate the cross-correlations in two return time series, and analyze the multifractal behaviors in this relationship. Further, we study the corresponding behaviors for the real stock indexes of SSE and HSI as well as the liquid stocks pair of SPD and PAB by comparison. To quantify the multifractality in the cross-correlation relationship, we employ the multifractal detrended cross-correlation analysis method to perform an empirical study on the simulation data and the real market data.
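
    The detrended cross-correlation idea underlying the method can be sketched at a single scale. This is a simplified, monofractal illustration only (the paper uses the full multifractal MF-DCCA machinery, which generalizes the detrended covariances to q-th order moments across many scales); all function names are invented for the sketch.

```python
import random
from itertools import accumulate

def _linear_detrend(y):
    """Residuals of a least-squares line fitted to y against its index."""
    n = len(y)
    mx = (n - 1) / 2.0
    my = sum(y) / n
    sxx = sum((i - mx) ** 2 for i in range(n))
    b = sum((i - mx) * (y[i] - my) for i in range(n)) / sxx
    a = my - b * mx
    return [y[i] - (a + b * i) for i in range(n)]

def _detrended_cov(px, py, s):
    """Average detrended covariance over non-overlapping windows of length s."""
    total, count = 0.0, 0
    for start in range(0, len(px) - s + 1, s):
        rx = _linear_detrend(px[start:start + s])
        ry = _linear_detrend(py[start:start + s])
        total += sum(u * v for u, v in zip(rx, ry)) / s
        count += 1
    return total / count

def dcca_coefficient(x, y, s):
    """rho_DCCA at window size s: detrended covariance over detrended variances."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    px = list(accumulate(v - mx for v in x))  # integrated profiles
    py = list(accumulate(v - my for v in y))
    fxy = _detrended_cov(px, py, s)
    return fxy / (_detrended_cov(px, px, s) * _detrended_cov(py, py, s)) ** 0.5

random.seed(1)
returns = [random.gauss(0.0, 1.0) for _ in range(512)]
anti = [-v for v in returns]  # perfectly anti-correlated partner series
```

A series is perfectly cross-correlated with itself (rho of 1) and perfectly anti-correlated with its negation (rho of -1), which makes a convenient sanity check for the implementation.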

  2. Analysis of the development trend of China’s business administration based on time series

    Directory of Open Access Journals (Sweden)

    Jiang Rui


    In the general direction of the economic system, China is in a crucial period of establishing the modern enterprise system and reforming the macroeconomic system, and many high-quality business administration talents are required for China's economy to develop stably. This paper carries out a time series analysis of the development of China's business administration major: on the whole, society currently presents an upward trend in the demand for business administration talents. With this gradually increasing demand, colleges and universities have also set up business administration majors to train large numbers of administration talents, thus leading to an upward trend in the academic focus on business administration.

  3. Site-specific meteorology identification for DOE facility accident analysis

    International Nuclear Information System (INIS)

    Rabin, S.B.


    Currently, chemical dispersion calculations performed for safety analysis of DOE facilities assume a Pasquill D stability class with a 4.5 m/s windspeed. These meteorological conditions are assumed to conservatively address the source term generation mechanism as well as the dispersion mechanism, thereby resulting in a net conservative downwind consequence. While choosing this stability class/windspeed combination may result in an overall conservative consequence, the level of conservatism cannot be quantified. The intent of this paper is to document a methodology which incorporates site-specific meteorology to determine a quantifiable consequence of a chemical release. A five-year meteorological database, appropriate for the facility location, is utilized for these chemical consequence calculations, consistent with the approach used for radiological releases. The hourly averages of meteorological conditions have been binned into 21 groups for the chemical consequence calculations. These 21 cases each have a probability of occurrence based on the number of times each case has occurred over the five-year sampling period. A code has been developed which automates the running of all the cases with a commercially available air modeling code. The 21 cases are sorted by concentration. A concentration may be selected by the user for a quantified level of conservatism. The methodology presented is intended to improve the technical accuracy and defensibility of chemical source term/dispersion safety analysis work. The result improves the quality of safety analysis products without significantly increasing the cost.

  4. Series: Practical guidance to qualitative research. Part 3: Sampling, data collection and analysis (United States)

    Moser, Albine; Korstjens, Irene


    Abstract In the course of our supervisory work over the years, we have noticed that qualitative research tends to evoke a lot of questions and worries, so-called frequently asked questions (FAQs). This series of four articles intends to provide novice researchers with practical guidance for conducting high-quality qualitative research in primary care. By ‘novice’ we mean Master’s students and junior researchers, as well as experienced quantitative researchers who are engaging in qualitative research for the first time. This series addresses their questions and provides researchers, readers, reviewers and editors with references to criteria and tools for judging the quality of qualitative research papers. The second article focused on context, research questions and designs, and referred to publications for further reading. This third article addresses FAQs about sampling, data collection and analysis. The data collection plan needs to be broadly defined and open at first, and become flexible during data collection. Sampling strategies should be chosen in such a way that they yield rich information and are consistent with the methodological approach used. Data saturation determines sample size and will be different for each study. The most commonly used data collection methods are participant observation, face-to-face in-depth interviews and focus group discussions. Analyses in ethnographic, phenomenological, grounded theory, and content analysis studies yield different narrative findings: a detailed description of a culture, the essence of the lived experience, a theory, and a descriptive summary, respectively. The fourth and final article will focus on trustworthiness and publishing qualitative research. PMID:29199486

  5. Ontology-based specification, identification and analysis of perioperative risks. (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich


    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and identification of perioperative risks. An agent system was developed which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  6. Fourier series analysis of a cylindrical pressure vessel subjected to axial end load and external pressure

    International Nuclear Information System (INIS)

    Brar, Gurinder Singh; Hari, Yogeshwar; Williams, Dennis K.


    Pressure Vessel Code, Section VIII, Division 2 and ASME STS-1. -- Highlights: • Fourier series is used to predict the load-carrying capacity of a cylindrical vessel. • A reliability approach is used for the analysis, as against the deterministic approach. • The cylindrical pressure vessel is subjected to axial end load and external pressure. • Axisymmetric and asymmetric analyses are carried out for imperfect pressure vessels. • Results are compared to the recommendations laid out in the ASME B&PV Code.

  7. Dissolved organic nitrogen dynamics in the North Sea: A time series analysis (1995-2005) (United States)

    Van Engeland, T.; Soetaert, K.; Knuijt, A.; Laane, R. W. P. M.; Middelburg, J. J.


    Dissolved organic nitrogen (DON) dynamics in the North Sea was explored by means of long-term time series of nitrogen parameters from the Dutch national monitoring program. Generally, the data quality was good with little missing data points. Different imputation methods were used to verify the robustness of the patterns against these missing data. No long-term trends in DON concentrations were found over the sampling period (1995-2005). Inter-annual variability in the different time series showed both common and station-specific behavior. The stations could be divided into two regions, based on absolute concentrations and the dominant times scales of variability. Average DON concentrations were 11 μmol l⁻¹ in the coastal region and 5 μmol l⁻¹ in the open sea. Organic fractions of total dissolved nitrogen (TDN) averaged 38 and 71% in the coastal zone and open sea, respectively, but increased over time due to decreasing dissolved inorganic nitrogen (DIN) concentrations. In both regions intra-annual variability dominated over inter-annual variability, but DON variation in the open sea was markedly shifted towards shorter time scales relative to coastal stations. In the coastal zone a consistent seasonal DON cycle existed with high values in spring-summer and low values in autumn-winter. In the open sea seasonality was weak. A marked shift in the seasonality was found at the Dogger Bank, with DON accumulation towards summer and low values in winter prior to 1999, and accumulation in spring and decline throughout summer after 1999. This study clearly shows that DON is a dynamic actor in the North Sea and should be monitored systematically to enable us to understand fully the functioning of this ecosystem.

  8. Nonlinear Analysis of Time Series in Genome-Wide Linkage Disequilibrium Data (United States)

    Hernández-Lemus, Enrique; Estrada-Gil, Jesús K.; Silva-Zolezzi, Irma; Fernández-López, J. Carlos; Hidalgo-Miranda, Alfredo; Jiménez-Sánchez, Gerardo


    The statistical study of large-scale genomic data has turned out to be a very important tool in population genetics. Quantitative methods are essential to understand and implement association studies in the biomedical and health sciences. Nevertheless, the characterization of recently admixed populations has been an elusive problem due to the presence of a number of complex phenomena. For example, linkage disequilibrium structures are thought to be more complex than their non-recently admixed population counterparts, presenting the so-called ancestry blocks, admixed regions that are not yet smoothed by the effect of genetic recombination. In order to distinguish characteristic features of various populations we have implemented several methods, some of them borrowed or adapted from the analysis of nonlinear time series in statistical physics and quantitative physiology. We calculate the main fractal dimensions (Kolmogorov's capacity, information dimension and correlation dimension, usually denoted D0, D1 and D2). We have also performed detrended fluctuation analysis and information-based similarity index calculations for the probability distribution of correlations of the linkage disequilibrium coefficient of six recently admixed (mestizo) populations within the Mexican Genome Diversity Project [1] and for the non-recently admixed populations in the International HapMap Project [2]. Nonlinear correlations showed up as a consequence of internal structure within the haplotype distributions. The analysis of these correlations as well as the scope and limitations of these procedures within the biomedical sciences are discussed.
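
    One of the borrowed techniques mentioned above, detrended fluctuation analysis, can be sketched compactly. This is an illustrative, monofractal DFA-1 toy (the paper's fractal-dimension and information-based measures are more involved), with all names invented for the sketch: integrate the mean-centered series into a profile, linearly detrend it in windows of size s, and read the scaling exponent alpha from the log-log slope of the fluctuation function F(s).

```python
import math
import random

def dfa_fluctuation(x, s):
    """Root-mean-square fluctuation F(s) of the linearly detrended profile."""
    m = sum(x) / len(x)
    profile, total = [], 0.0
    for v in x:                      # integrate the mean-centered series
        total += v - m
        profile.append(total)
    sq_sum, count = 0.0, 0
    for start in range(0, len(profile) - s + 1, s):
        w = profile[start:start + s]
        mx = (s - 1) / 2.0
        mw = sum(w) / s
        sxx = sum((i - mx) ** 2 for i in range(s))
        b = sum((i - mx) * (w[i] - mw) for i in range(s)) / sxx
        a = mw - b * mx
        sq_sum += sum((w[i] - (a + b * i)) ** 2 for i in range(s))
        count += s
    return math.sqrt(sq_sum / count)

def dfa_exponent(x, scales):
    """Slope of log F(s) versus log s, i.e. the scaling exponent alpha."""
    pts = [(math.log(s), math.log(dfa_fluctuation(x, s))) for s in scales]
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    return sum((p - mx) * (q - my) for p, q in pts) / sum((p - mx) ** 2 for p, _ in pts)

random.seed(7)
noise = [random.gauss(0.0, 1.0) for _ in range(4096)]
alpha = dfa_exponent(noise, [4, 8, 16, 32, 64])  # near 0.5 for white noise
```

Uncorrelated noise yields alpha near 0.5, while long-range correlated series push alpha towards 1, which is what makes DFA useful for detecting the nonlinear correlations the abstract reports.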

  9. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano


    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelatedness condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources

  10. Forecasting of UV-Vis absorbance time series using artificial neural networks combined with principal component analysis. (United States)

    Plazas-Nossa, Leonardo; Hofer, Thomas; Gruber, Günter; Torres, Andres


    This work proposes a methodology for the forecasting of online water quality data provided by UV-Vis spectrometry. Therefore, a combination of principal component analysis (PCA) to reduce the dimensionality of a data set and artificial neural networks (ANNs) for forecasting purposes was used. The results obtained were compared with those obtained by using the discrete Fourier transform (DFT). The proposed methodology was applied to four absorbance time series data sets composed of a total of 5705 UV-Vis spectra. Absolute percentage errors obtained by applying the proposed PCA/ANN methodology vary between 10% and 13% for all four study sites. In general terms, the results obtained were hardly generalizable, as they appeared to be highly dependent on the specific dynamics of the water system; however, some trends can be outlined. The PCA/ANN methodology gives better results than the PCA/DFT forecasting procedure using a specific spectra range for the following conditions: (i) for the Salitre wastewater treatment plant (WWTP) (first hour) and Graz West R05 (first 18 min), from the last part of the UV range to all of the visible range; (ii) for the Gibraltar pumping station (first 6 min), for all UV-Vis absorbance spectra; and (iii) for the San Fernando WWTP (first 24 min), for all of the UV range to the middle part of the visible range.

  11. Statistical performance and information content of time lag analysis and redundancy analysis in time series modeling. (United States)

    Angeler, David G; Viedma, Olga; Moreno, José M


    Time lag analysis (TLA) is a distance-based approach used to study temporal dynamics of ecological communities by measuring community dissimilarity over increasing time lags. Despite its increased use in recent years, its performance in comparison with other more direct methods (i.e., canonical ordination) has not been evaluated. This study fills this gap using extensive simulations and real data sets from experimental temporary ponds (true zooplankton communities) and landscape studies (landscape categories as pseudo-communities) that differ in community structure and anthropogenic stress history. Modeling time with a principal coordinates of neighborhood matrices (PCNM) approach, the canonical ordination technique (redundancy analysis; RDA) consistently outperformed the other statistical tests (i.e., TLAs, the Mantel test, and RDA based on linear time trends) using all real data. In addition, the RDA-PCNM revealed different patterns of temporal change, and the strength of each individual time pattern, in terms of adjusted variance explained, could be evaluated. It also identified species contributions to these patterns of temporal change. This additional information is not provided by distance-based methods. The simulation study revealed better Type I error properties of the canonical ordination techniques compared with the distance-based approaches when no deterministic component of change was imposed on the communities. The simulation also revealed that strong emphasis on uniform deterministic change and low variability at other temporal scales is needed to decrease the statistical power of the RDA-PCNM approach relative to the other methods. Based on the statistical performance of and information content provided by RDA-PCNM models, this technique serves ecologists as a powerful tool for modeling temporal change of ecological (pseudo-)communities.
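
    The distance-based TLA step is simple to sketch: average the dissimilarity between all sample pairs at each time lag and regress it on the square-root-transformed lag, a significant positive slope indicating directional change. This illustration uses Euclidean distance and invented names; published TLAs typically use ecological dissimilarities such as Bray-Curtis and a permutation test for significance.

```python
import math

def euclidean(a, b):
    """Dissimilarity between two community samples (species abundance vectors)."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

def time_lag_slope(samples):
    """Slope of mean dissimilarity regressed on sqrt(time lag)."""
    n = len(samples)
    pts = []
    for lag in range(1, n):
        dists = [euclidean(samples[i], samples[i + lag]) for i in range(n - lag)]
        pts.append((math.sqrt(lag), sum(dists) / len(dists)))
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    return sum((p - mx) * (q - my) for p, q in pts) / sum((p - mx) ** 2 for p, _ in pts)

# A community drifting directionally versus one that never changes.
trending = [[float(t), 2.0 * t] for t in range(20)]
static = [[5.0, 1.0] for _ in range(20)]
```

The drifting community yields a clearly positive slope, while the unchanging one yields zero; the abstract's point is that, unlike the canonical RDA-PCNM approach, this summary discards which species drive the change.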

  12. Is Case-Specificity Content-Specificity? An Analysis of Data from Extended-Matching Questions (United States)

    Dory, Valerie; Gagnon, Robert; Charlin, Bernard


    Case-specificity, i.e., variability of a subject's performance across cases, has been a consistent finding in medical education. It has important implications for assessment validity and reliability. Its root causes remain a matter of discussion. One hypothesis, content-specificity, links variability of performance to variable levels of relevant…

  13. Interrupted time-series analysis of regulations to reduce paracetamol (acetaminophen) poisoning.

    Directory of Open Access Journals (Sweden)

    Oliver W Morgan


    Paracetamol (acetaminophen) poisoning is the leading cause of acute liver failure in Great Britain and the United States. Successful interventions to reduce harm from paracetamol poisoning are needed. To achieve this, the government of the United Kingdom introduced legislation in 1998 limiting the pack size of paracetamol sold in shops. Several studies have reported recent decreases in fatal poisonings involving paracetamol. We use interrupted time-series analysis to evaluate whether the recent fall in the number of paracetamol deaths is different to trends in fatal poisoning involving aspirin, paracetamol compounds, antidepressants, or non-drug poisoning suicide. We calculated directly age-standardised mortality rates for paracetamol poisoning in England and Wales from 1993 to 2004. We used an ordinary least-squares regression model divided into pre- and post-intervention segments at 1999. The model included a term for autocorrelation within the time series. We tested for changes in the level and slope between the pre- and post-intervention segments. To assess whether observed changes in the time series were unique to paracetamol, we compared against poisoning deaths involving compound paracetamol (not covered by the regulations), aspirin, antidepressants, and non-poisoning suicide deaths. We did this comparison by calculating a ratio of each comparison series with paracetamol and applying a segmented regression model to the ratios. No change in the ratio level or slope indicated no difference compared to the control series. There were about 2,200 deaths involving paracetamol. The age-standardised mortality rate rose from 8.1 per million in 1993 to 8.8 per million in 1997, subsequently falling to about 5.3 per million in 2004. After the regulations were introduced, deaths dropped by 2.69 per million (p = 0.003). Trends in the age-standardised mortality rate for paracetamol compounds, aspirin, and antidepressants were broadly similar to paracetamol
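
    The pre/post segmented-regression idea can be sketched as follows. This is a deliberately simplified illustration that fits the two segments separately and omits the autocorrelation term the authors included; the data are invented to mimic the shape of the series described above.

```python
def ols(xs, ys):
    """Closed-form simple linear regression; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def interrupted_series(years, rates, break_year):
    """Change in level and slope at the intervention year."""
    pre = [(x, y) for x, y in zip(years, rates) if x < break_year]
    post = [(x, y) for x, y in zip(years, rates) if x >= break_year]
    a0, b0 = ols([x for x, _ in pre], [y for _, y in pre])
    a1, b1 = ols([x for x, _ in post], [y for _, y in post])
    # level change: post-fit minus pre-fit extrapolation at the break year
    level_change = (a1 + b1 * break_year) - (a0 + b0 * break_year)
    return level_change, b1 - b0

# Synthetic mortality-like series: rising 0.2/yr before 1999, then a
# 2.0 drop in level and a slope change of -0.7 at the post-1998 segment.
years = list(range(1993, 2005))
rates = [8.0 + 0.2 * (y - 1993) if y < 1999 else 7.2 - 0.5 * (y - 1999)
         for y in years]
level_change, slope_change = interrupted_series(years, rates, 1999)
```

On this noiseless toy series the fit recovers the built-in level drop of 2.0 and slope change of -0.7 exactly; with real mortality data one would add the autocorrelation term and significance tests, as in the study.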

  14. Investigating cardiorespiratory interaction by cross-spectral analysis of event series (United States)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen


    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.

  15. Development of Nuclear Plant Specific Analysis Simulators with ATLAS

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Draeger, P.; Horche, W.; Pointner, W.


    The simulation software ATLAS, based on the best-estimate code ATHLET, has been developed by GRS for a range of applications in the field of nuclear plant safety analysis. Through application of versatile simulation tools and graphical interfaces, the user should be able to analyse with ATLAS all essential accident scenarios. Detailed analysis simulators for several German and Russian NPPs are being constructed on the basis of ATLAS. An overview of ATLAS is presented in the paper, describing its configuration, the functions performed by the main components, and the relationships among them. A significant part of any power plant simulator is the balance-of-plant (BOP) models, not only because all plant transients and non-LOCA accidents can be initiated by operation of BOP systems, but also because the response of the plant to transients or accidents is strongly influenced by the automatic operation of BOP systems. Modelling aspects of BOP systems are shown in detail, as is the interface between the process model and the BOP systems. Special emphasis has been put on the BOP model builder, based on a methodology developed at GRS. The BOP modeller, called GCSM-Generator, is an object-oriented tool which runs on the online expert system G2. It is equipped with utilities to edit the BOP models, to verify them, and to generate GCSM code specific to ATLAS. The communication system of ATLAS graphically presents the results of the simulation and allows the user to interactively influence the execution of the simulation process (malfunctions, manual control). Displays for communication with the simulated processes and presentation of calculation results are also presented. In the framework of the verification of simulation models, different tools are used, e.g. the PC code MATHCAD for calculation and documentation, ATHLET-Input-Graphic for checking geometry data, and the expert system G2 for development of BOP models. The validation procedure and selected analysis results

  16. A time-series approach to random number generation: Using recurrence quantification analysis to capture executive behavior

    Directory of Open Access Journals (Sweden)

    Wouter Oomens


    The concept of executive functions plays a prominent role in contemporary experimental and clinical studies on cognition. One paradigm used in this framework is the random number generation (RNG) task, the execution of which demands aspects of executive functioning, specifically inhibition and working memory. Data from the RNG task are best seen as a series of successive events. However, traditional RNG measures that are used to quantify executive functioning are mostly summary statistics referring to deviations from mathematical randomness. In the current study, we explore the utility of recurrence quantification analysis (RQA), a nonlinear method that keeps the entire sequence intact, as a better way to describe executive functioning compared to traditional measures. To this aim, 242 first- and second-year students completed a non-paced RNG task. Principal component analysis of their data showed that traditional and RQA measures convey more or less the same information. However, RQA measures do so more parsimoniously and have a better interpretation.
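
    Two of the most common RQA measures, recurrence rate and determinism, can be sketched in plain Python. This is an illustrative toy, not the analysis pipeline used in the study: a recurrence matrix marks pairs of time points with similar values, and determinism measures how many recurrent points fall on diagonal line structures (repeated subsequences).

```python
import random

def recurrence_matrix(x, eps):
    """R[i][j] = 1 when x[i] and x[j] are within eps of each other."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent points, excluding the trivial main diagonal."""
    n = len(R)
    hits = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    return hits / (n * (n - 1))

def determinism(R, lmin=2):
    """Share of off-diagonal recurrent points on diagonal lines of length >= lmin."""
    n = len(R)
    total, on_lines = 0, 0
    for d in range(1, n):                  # diagonal offsets (matrix is symmetric)
        run = 0
        for v in [R[i][i + d] for i in range(n - d)] + [0]:  # sentinel flushes run
            if v:
                run += 1
            else:
                total += run
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / total if total else 0.0

periodic = [i % 4 for i in range(40)]                  # fully deterministic sequence
random.seed(3)
noisy = [random.random() for _ in range(40)]           # irregular sequence
R_per = recurrence_matrix(periodic, 0.1)
R_noise = recurrence_matrix(noisy, 0.05)
```

A strictly periodic sequence produces unbroken diagonals and thus a determinism of 1.0, whereas an irregular sequence scatters its recurrences; in the RNG context, overly patterned responding by a participant would likewise raise determinism.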

  17. Analysis of Seasonal Signal in GPS Short-Baseline Time Series (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen


    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with

  18. DynPeak: An Algorithm for Pulse Detection and Frequency Analysis in Hormonal Time Series (United States)

    Vidal, Alexandre; Zhang, Qinghua; Médigue, Claire; Fabre, Stéphane; Clément, Frédérique


    The endocrine control of the reproductive function is often studied through the analysis of luteinizing hormone (LH) pulsatile secretion by the pituitary gland. Whereas measurements in the cavernous sinus cumulate anatomical and technical difficulties, LH levels can easily be assessed from jugular blood. However, plasma levels result from a convolution process due to clearance effects when LH enters the general circulation. Simultaneous measurements comparing LH levels in the cavernous sinus and jugular blood have revealed clear differences in pulse shape, amplitude, and baseline. Moreover, experimental sampling occurs at a relatively low frequency (typically every 10 min) with respect to the highest LH release frequency (one pulse per hour), and the resulting LH measurements are corrupted by both experimental and assay errors. As a result, the pattern of plasma LH may not appear clearly pulsatile. Yet, reliable information on the InterPulse Intervals (IPI) is a prerequisite for precisely studying the steroid feedback exerted at the pituitary level. Hence, there is a real need for robust IPI detection algorithms. In this article, we present an algorithm for monitoring LH pulse frequency, drawing both on the available endocrinological knowledge of LH pulses (shape and duration with respect to the frequency regime) and on synthetic LH data generated by a simple model. We use the synthetic data to clarify some basic notions underlying our algorithmic choices. We focus on explaining how the sampling process drastically affects the original pattern of secretion, and especially the amplitude of the detectable pulses. We then describe the algorithm in detail and apply it to different sets of both synthetic and experimental LH time series. We further comment on how to diagnose possible outliers from the series of IPIs, which is the main output of the algorithm. PMID:22802933
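The core of any IPI pipeline, detecting pulses in a sampled, noisy hormone series, can be sketched as a thresholded local-maximum search with a refractory gap. This is a generic illustration, not the DynPeak algorithm itself; `threshold` and `min_gap` are assumed tuning parameters.

```python
def detect_pulses(levels, threshold, min_gap):
    """Flag sample i as a pulse when it is a local maximum at or above
    `threshold` and at least `min_gap` samples after the previous pulse."""
    pulses = []
    for i in range(1, len(levels) - 1):
        is_peak = levels[i - 1] < levels[i] >= levels[i + 1]
        if is_peak and levels[i] >= threshold:
            if not pulses or i - pulses[-1] >= min_gap:
                pulses.append(i)
    return pulses

def interpulse_intervals(pulse_indices, dt):
    """Convert pulse sample indices into IPIs, given the sampling step dt."""
    return [dt * (b - a) for a, b in zip(pulse_indices, pulse_indices[1:])]
```

With 10-minute sampling, `dt=10` yields IPIs in minutes; the article's point is precisely that such detection must be robust to the amplitude loss that coarse sampling induces.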

  19. A knowledge translation tool improved osteoporosis disease management in primary care: an interrupted time series analysis. (United States)

    Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E


    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada representing 5 family physicians with 2840 age eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices, and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.
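The segmented linear regression used in this interrupted time series design fits one intercept and slope before implementation and estimates a level and trend change afterwards. A minimal least-squares sketch follows; it is illustrative only (the study's models also involve variance analysis and repeated outcome assessments), and the variable names are assumptions.

```python
import numpy as np

def its_segmented_fit(y, t0):
    """Fit y_t = b0 + b1*t + b2*D_t + b3*(t - t0)*D_t by least squares,
    where D_t = 1 for t >= t0 (post-implementation)."""
    y = np.asarray(y, dtype=float)
    t = np.arange(len(y), dtype=float)
    d = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, d, (t - t0) * d])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, pre-trend, level change, trend change]
```

Here `b2` captures the immediate jump at tool implementation and `b3` the change in slope, which is exactly what a 52-point before/after series is assessed for.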

  20. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process. (United States)

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang


    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. TSPA allowed a prototypical process pattern to be identified, in which the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
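TSPA's building block, vector autoregression, can be estimated column-by-column with ordinary least squares. A first-order sketch is below; it is illustrative only (the study's per-patient models and variables are richer), and the shapes are assumptions.

```python
import numpy as np

def fit_var1(Y):
    """Least-squares estimate of c and A in Y[t] = c + A @ Y[t-1] + e[t].
    Y has shape (T, k): T sessions, k process variables."""
    X = np.hstack([np.ones((len(Y) - 1, 1)), Y[:-1]])  # intercept + lagged values
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T  # c with shape (k,), A with shape (k, k)
```

The off-diagonal entries of `A` are the session-to-session cross-effects, e.g. alliance at session t-1 predicting self-efficacy at session t, which is the kind of temporal feedback loop the abstract reports.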

  1. Hybrid analysis for indicating patients with breast cancer using temperature time series. (United States)

    Silva, Lincoln F; Santos, Alair Augusto S M D; Bravo, Renato S; Silva, Aristófanes C; Muchaluat-Saade, Débora C; Conci, Aura


    Breast cancer is the most common cancer among women worldwide. Diagnosis and treatment in early stages increase the chances of cure. The temperature of cancerous tissue is generally higher than that of the healthy surrounding tissue, making thermography an option to consider in screening strategies for this cancer type. This paper proposes a hybrid methodology for analyzing dynamic infrared thermography in order to identify patients at risk of breast cancer, using both unsupervised and supervised machine learning techniques, which is what makes the methodology hybrid. Dynamic infrared thermography monitors, or quantitatively measures, temperature changes on the examined surface after a thermal stress. During a dynamic infrared thermography examination, a sequence of breast thermograms is generated. In the proposed methodology, this sequence is processed and analyzed by several techniques. First, the region of the breasts is segmented and the thermograms of the sequence are registered. Then, temperature time series are built and the k-means algorithm is applied to these series using various values of k. The clustering formed by the k-means algorithm, for each value of k, is evaluated using clustering validation indices, producing values treated as features in the classification model construction step. A data mining tool was used to solve the combined algorithm selection and hyperparameter optimization (CASH) problem in classification tasks. Besides the classification algorithm recommended by the data mining tool, classifiers based on Bayesian networks, neural networks, decision rules and decision trees were executed on the evaluation data set. Test results support that the proposed analysis methodology is able to indicate patients with breast cancer. Among the 39 tested classification algorithms, K-Star and Bayes Net presented 100% classification accuracy. Furthermore, among the Bayes Net, multi-layer perceptron, decision table and random forest classification algorithms, an
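The unsupervised step, k-means applied to per-pixel temperature time series, can be sketched in a few lines. This is a naive Lloyd-style implementation with fixed seeding, for illustration only; the study additionally scores each value of k with validation indices and feeds those scores to the supervised classifiers.

```python
def kmeans(series_list, k, iters=20):
    """Naive k-means on equal-length time series (squared Euclidean
    distance), seeded with the first k series as initial centroids."""
    centroids = [list(s) for s in series_list[:k]]
    labels = [0] * len(series_list)
    for _ in range(iters):
        # assignment step: nearest centroid for each series
        for idx, s in enumerate(series_list):
            dists = [sum((a - b) ** 2 for a, b in zip(s, c)) for c in centroids]
            labels[idx] = dists.index(min(dists))
        # update step: centroid = element-wise mean of its members
        for j in range(k):
            members = [s for s, lab in zip(series_list, labels) if lab == j]
            if members:
                centroids[j] = [sum(col) / len(col) for col in zip(*members)]
    return labels, centroids
```

In the paper's pipeline the cluster structure itself is not the end product; how well the series separate into k groups (measured by validation indices) becomes a feature vector per examination.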

  2. Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy (United States)

    Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio


    Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.
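The principal component decomposition applied to the displacement time series amounts to an eigendecomposition of the data covariance. A minimal sketch of generic PCA follows (not the authors' MSBAS processing chain; the toy data are an assumption).

```python
import numpy as np

def pca(X, n_components=2):
    """PCA via eigendecomposition of the covariance of X (samples x features).
    Returns component scores, loading vectors, and explained-variance ratios."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:n_components]
    components = vecs[:, order]               # one column per component
    scores = Xc @ components
    explained = vals[order] / vals.sum()
    return scores, components, explained
```

For deformation data of this kind, the leading modes can separate contributions such as a deep, slowly varying source from shallower, faster activity, which is how the decomposition aids source delineation here.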

  3. Using forecast modelling to evaluate treatment effects in single-group interrupted time series analysis. (United States)

    Linden, Ariel


    Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods which may produce biased results if model assumptions are violated. In this paper, treatment effects are instead assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 on reducing cigarette sales. Three forecast models are fit to the preintervention series: linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and an autoregressive integrated moving average (ARIMA) model; forecasts are then generated into the post-intervention period. The actual observations are compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA; REG fit the data poorly. The actual post-intervention observations were above the forecasts of HW and ARIMA, suggesting no intervention effect, but below the forecasts of REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies. © 2018 John Wiley & Sons, Ltd.
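Of the three forecast models compared, Holt-Winters non-seasonal smoothing is the easiest to sketch. Below is a minimal implementation of Holt's linear method; the smoothing weights and initialization are illustrative assumptions, not the values selected in the paper.

```python
def holt_forecast(y, alpha, beta, horizon):
    """Holt's (non-seasonal) linear exponential smoothing: update a level
    and a trend over y, then extrapolate `horizon` steps ahead."""
    level, trend = y[0], y[1] - y[0]          # a common simple initialization
    for obs in y[1:]:
        level_prev = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - level_prev) + (1 - beta) * trend
    return [level + (h + 1) * trend for h in range(horizon)]
```

On a perfectly linear preintervention series the method extrapolates the trend exactly; on real data the forecasts serve as the counterfactual against which post-intervention observations are compared.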

  4. Oxytetracycline analysis in honey using a specific portable analyzer (United States)

    Chen, Guoying; Schwartz, Daniel; Braden, S.; Nunez, Alberto


    Oxytetracycline (OTC) residue in honey is detected using a portable analyzer designed to specifically target tetracycline (TC) drugs based on europium-sensitized luminescence (ESL). A 385 nm light-emitting diode (LED) is used as the excitation source and a photomultiplier tube as the light detector. OTC is extracted from honey and cleaned up by solid-phase extraction (SPE) using Strata X-WC weak cation exchange cartridges. Eu(III) is added to the eluate to form a Eu-TC chelate at pH 8.5. Efficient intrachelate energy transfer allows sensitive OTC detection at λex = 385 nm and λem = 610 nm. After a 25 µs time delay, the ESL signal is integrated over a 25-1000 µs interval. The signal intensity shows a linear relationship (R² = 0.972) with OTC concentration in the 10-200 ng/g range. The limit of detection is 6.7 ng/g with an average 5.8% relative standard deviation. The background signal corresponds to ~10 ppb. This combination of instrumentation and method enables field analysis that is especially useful for the beekeeping industry.

  5. A time-series analysis of flood disaster around Lena river using Landsat TM/ETM+ (United States)

    Sakai, Toru; Hatta, Shigemi; Okumura, Makoto; Takeuchi, Wataru; Hiyama, Tetsuya; Inoue, Gen


    The Lena river at Tabaga (61.83°N, 129.60°E) was frozen hard until early May 2007. River-ice breakup began in patches on 13 May 2007. The area of the Lena river then increased rapidly due to overbank flooding on 14 May 2007 and reached its peak on 15 May 2007. Within the brief period of one or two days, the area of the Lena river more than doubled. After this, the area decreased exponentially over three months and was quite stable by late August 2007. A time series of Landsat TM/ETM+ images could detect these large temporal variations. In addition, the temporal variations in the area of the Lena river were synchronized with the water stage measured in the field. These results indicate that a time series of Landsat TM/ETM+ images makes it possible to monitor natural disturbances occurring at short-term intervals, although limited to local scales. The required spatial and temporal resolution is often application specific, in the context of the desired measurement goals. This type of research and the resulting information are critical for utilizing remote sensing data to the fullest extent.

  6. An Analysis of the Factors Impacting Employee's Specific Investment

    Institute of Scientific and Technical Information of China (English)

    WU Ai-hua; GE Wen-lei


    The amount of specific investment from employees is limited, and this paper analyses the reasons for employees' under-investment. Based on the relationship between specific investment and employee resignation, an empirical study was conducted on the factors influencing employee turnover and specific investment. A theoretical model of the factors influencing employees' specific investment is given.

  7. Assessment of land degradation using time series trend analysis of vegetation indicators in Otindag Sandy Land

    International Nuclear Information System (INIS)

    Wang, H Y; Li, Z Y; Gao, Z H; Wu, J J; Sun, B; Li, C L


    Land condition assessment is a basic prerequisite for detecting the degradation of a territory, which might lead to desertification under climatic and human pressures. The temporal change in vegetation productivity is a key indicator of land degradation. In this paper, taking the Otindag Sandy Land as a case, dynamic trends during 2001–2010 in the mean normalized difference vegetation index (NDVIa), net primary production (NPP) and vegetation rain use efficiency (RUE) were analysed. The Mann-Kendall test and the correlation analysis method were used, and their sensitivities to land degradation were evaluated. The results showed that all three vegetation indicators (NDVIa, NPP and RUE) displayed a downward trend under both methods over the past 10 years, indicating that the land was degraded. The three indicators showed a decreasing trend in 62.57%, 74.16% and 88.56% of the study area according to the Mann-Kendall test and in 57.85%, 68.38% and 85.29% according to the correlation analysis method. However, most of these trends were not significant; trends significant at the 95% confidence level accounted for only a small proportion of the area. Analysis of the NDVIa, NPP and RUE series showed a significant decreasing trend in 9.21%, 4.81% and 6.51% of the area with the Mann-Kendall test. The NPP trends showed an obvious positive link with precipitation in the study area. Because the effect of inter-annual precipitation variation on RUE was small, vegetation RUE can provide valuable insight into the status of land condition and had the best sensitivity to land degradation
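The Mann-Kendall test applied above is a rank-based trend test. A minimal sketch for a series without ties follows (the tie-corrected variance used on real NDVI data is slightly more involved, and the 95% threshold of 1.96 assumes a two-sided normal approximation):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no-ties variance): returns (S, z).
    |z| > 1.96 indicates a trend significant at the 95% level."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

Applied per pixel to a 10-year indicator series, the sign of S gives the trend direction and z its significance, which is how the percentages of decreasing versus significantly decreasing area above are obtained.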

  8. Testing Homeopathy in Mouse Emotional Response Models: Pooled Data Analysis of Two Series of Studies

    Directory of Open Access Journals (Sweden)

    Paolo Bellavite


    Full Text Available Two previous investigations were performed to assess the activity of Gelsemium sempervirens (Gelsemium s.) in mice, using emotional response models. These two series are pooled and analysed here. Gelsemium s. in various homeopathic centesimal dilutions/dynamizations (4C, 5C, 7C, 9C, and 30C), a placebo (solvent vehicle), and the reference drugs diazepam (1 mg/kg body weight) or buspirone (5 mg/kg body weight) were delivered intraperitoneally to groups of albino CD1 mice, and their effects on animal behaviour were assessed by the light-dark (LD) choice test and the open-field (OF) exploration test. Up to 14 separate replications were carried out in fully blind and randomised conditions. Pooled analysis demonstrated highly significant effects of Gelsemium s. 5C, 7C, and 30C on the OF parameter “time spent in central area” and of Gelsemium s. 5C, 9C, and 30C on the LD parameters “time spent in lit area” and “number of light-dark transitions,” without any sedative action or adverse effects on locomotion. This pooled data analysis confirms and reinforces the evidence that Gelsemium s. regulates emotional responses and behaviour of laboratory mice in a nonlinear fashion with dilution/dynamization.

  9. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E


    Certain types of biomedical processes such as the heart rate generator can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further we conclude that the presence of respiratory coupling can only be detected under low noise conditions and if a data-adaptive respiratory band is used.
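For the unevenly sampled beat-to-beat series discussed above, Lomb's algorithm avoids interpolation altogether. Below is a compact sketch of the classic Lomb-Scargle periodogram in its textbook form, not the authors' exact implementation; the test signal and frequency grid are assumptions.

```python
import numpy as np

def lomb(t, y, freqs):
    """Lomb-Scargle periodogram of unevenly sampled data.
    t: sample times, y: values, freqs: trial frequencies (cycles per unit t)."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    yc = y - y.mean()
    power = []
    for f in freqs:
        w = 2 * np.pi * f
        # time offset tau that decouples the sine and cosine terms
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power.append(0.5 * ((yc @ c) ** 2 / (c @ c) + (yc @ s) ** 2 / (s @ s)))
    return np.array(power)
```

A peak in the respiratory band below the Nyquist limit of half the mean heart rate is the signature of respiratory coupling; the mirror artifact the authors describe appears when power folds around that limit.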

  10. Variability of African Farming Systems from Phenological Analysis of NDVI Time Series (United States)

    Vrieling, Anton; deBeurs, K. M.; Brown, Molly E.


    Food security exists when people have access to sufficient, safe and nutritious food at all times to meet their dietary needs. The natural resource base is one of the many factors affecting food security; its variability and decline create problems for local food production. In this study we characterize vegetation phenology for sub-Saharan Africa and assess variability and trends of phenological indicators based on NDVI time series from 1982 to 2006. We focus on NDVI cumulated over the season (cumNDVI), which is a proxy for net primary productivity. Results are aggregated at the level of major farming systems, while also determining spatial variability within farming systems. High temporal variability of cumNDVI occurs in semiarid and subhumid regions. The results show a large area of positive cumNDVI trends between Senegal and South Sudan. These correspond to positive CRU rainfall trends and relate to recovery after the 1980s droughts. We find significant negative cumNDVI trends near the south coast of West Africa (Guinea coast) and in Tanzania. For each farming system, causes of change and variability are discussed based on the available literature (Appendix A). Although food security comprises more than the local natural resource base, our results can serve as an input for food security analysis by identifying zones of high variability or downward trends. Farming systems are found to be a useful level of analysis. The diversity and trends found within farming system boundaries underline that farming systems are dynamic.

  11. Time series analysis of ambient air concentrations in Alexandria and Nile delta region, Egypt

    International Nuclear Information System (INIS)

    EI Raev, M.; Shalaby, E.A.; Ghatass, Z.F.; Marey, H.S.


    Data collected from the Air Monitoring Network of Alexandria and Delta (EEAA/EIMP program) were analyzed. Emphasis is given to the indicator pollutants PM10, NO2, SO2, O3 and CO. Two sites were selected in Alexandria (IGSR and Shohada) and three sites in the Delta region (Kafr Elzyat, Mansoura and Mahalla) for analysis of the three years 2000-2002. Box-Jenkins modeling has been used mainly for forecasting and for assessing the relative importance of various parameters or pollutants. Results showed that the autoregressive (AR) order for all series ranged from 0-2 except for NO2 at the Mansoura site. The moving average order also ranged from 0-2 except for CO at the IGSR site. Nitrogen dioxide and ozone at the IGSR site have the same ARIMA model, namely (0, 1, 2). Cross-correlation analysis has revealed important information on the dynamics, chemistry and interpretation of ambient pollution. Cross-correlation functions of SO2 and PM10 at the IGSR site suggest that sulfur dioxide has been adsorbed on the surface of particulates, which have an alkaline nature. This enhances the oxidation of sulfur dioxide to sulfate, which results in low levels of SO2 in spite of the presence of sources
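The cross-correlation analysis between pollutant series can be sketched as a lagged sample correlation. A minimal illustration follows (lag conventions and normalization vary across packages; the toy series are assumptions):

```python
def cross_correlation(x, y, max_lag):
    """Sample cross-correlation r_xy(k) for lags -max_lag..max_lag.
    A peak at positive k suggests y follows x by k steps."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) ** 0.5
    sy = sum((v - my) ** 2 for v in y) ** 0.5
    out = {}
    for k in range(-max_lag, max_lag + 1):
        pairs = [(x[i] - mx) * (y[i + k] - my)
                 for i in range(n) if 0 <= i + k < n]
        out[k] = sum(pairs) / (sx * sy)
    return out
```

A strong peak at a nonzero lag between two pollutant series, e.g. SO2 and PM10, is the kind of evidence used above to infer an interaction such as adsorption onto alkaline particulates.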

  12. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach

    Directory of Open Access Journals (Sweden)

    Martin M Monti


    Full Text Available Functional Magnetic Resonance Imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a General Linear Model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abides by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making.

  13. A Time Series Analysis Using R for Understanding Car Sales On The Romanian Market

    Directory of Open Access Journals (Sweden)

    Mihaela Cornelia Sandu


    Full Text Available The size of the Romanian automobile industry is relatively small compared to the main car producers in Europe and the world, but an analysis of its structure and dynamics appears highly relevant given the strong linkages with the main macroeconomic indicators and important microeconomic variables at the household level. The paper presents a time series analysis of car sales in Romania over the period 2007-2014, focusing on the sales dynamics of the main national producer, Dacia Pitesti. The aim of the investigation is twofold: to test the impact of macroeconomic variables on this important and underexplored segment of the economy, and to highlight potential differences between the factors influencing the buying decision for domestic versus foreign cars (observed in three regimes: new, registered and re-enrolled). While the major influence of the global economic crisis cannot be ignored for the analyzed interval, it may also help to illustrate the real behavior of individuals, by treating the period immediately after the crisis as behavior under scarcity conditions and the second half of the interval as a return to normality. The results confirm the general findings of the literature for the main indicators, but they are not entirely consistent with rational economic models, especially with regard to the nature of the investigated goods (cars as normal or positional goods).

  14. Heat flux measurements of Tb{sub 3}M series (M=Co, Rh and Ru): Specific heat and magnetocaloric properties

    Energy Technology Data Exchange (ETDEWEB)

    Monteiro, J.C.B., E-mail: [Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin, Campinas, SP 13083-859 (Brazil); Lombardi, G.A. [Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin, Campinas, SP 13083-859 (Brazil); Reis, R.D. dos [Max-Planck Institute for Chemical Physics of Solids, Nöthnitzer Str. 40, 01187 Dresden (Germany); Freitas, H.E.; Cardoso, L.P.; Mansanares, A.M.; Gandra, F.G. [Universidade Estadual de Campinas, Instituto de Física Gleb Wataghin, Campinas, SP 13083-859 (Brazil)


    We report on the magnetic properties and magnetocaloric effect (MCE) for the Tb{sub 3}M series, with M=Co, Rh and Ru, obtained using a heat flux technique. The specific heat of Tb{sub 3}Co and Tb{sub 3}Rh are very similar, with a first order type transition occurring around 6 K below the magnetic ordering temperature without any corresponding feature on the magnetization. The slightly enhanced electronic specific heat, the Debye temperature around 150 K and the presence of the magnetic specific heat well above the ordering temperature are also characteristic of many other compounds of the R{sub 3}M family (R=Rare Earth). The specific heat for Tb{sub 3}Ru, however, presents two peaks at 37 K and 74 K. The magnetization shows that below the first peak the system presents an antiferromagnetic behavior and is paramagnetic above 74 K. We obtained a magnetocaloric effect for M=Co and Rh, −∆S=12 J/kg K, but for Tb{sub 3}Ru it is less than 3 J/kg K (μ{sub 0}∆H=5 T). We believe that the experimental results show that the MCE is directly related with the process of hybridization of the (R)5d-(M)d electrons that occurs in the R{sub 3}M materials.

  15. case series

    African Journals Online (AJOL)


    Key words: Case report, case series, concept analysis, research design. African Health Sciences 2012; (4): 557 - 562. According to the latest version of the Dictionary of Epidemiology ...


    Directory of Open Access Journals (Sweden)

    Patrik Drid


    Full Text Available Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. Selecting the proper specific exercises for a target motor ability must be preceded by a study of the structure of specific judo techniques and of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the execution of individual techniques, which serves as a starting point for selecting the particular complex of specific exercises that produces the greatest effect. In addition to developing particular muscle groups, the means of specific preparation also develop those motor abilities judged indispensable for the qualities characteristic of judo. This paper analyses the relationship between the field of judo techniques and specific motor abilities.

  17. Adjuvant radiotherapy after extrapleural pneumonectomy for mesothelioma. Prospective analysis of a multi-institutional series

    International Nuclear Information System (INIS)

    Tonoli, Sandro; Vitali, Paola; Scotti, Vieri; Bertoni, Filippo; Spiazzi, Luigi; Ghedi, Barbara; Buonamici, Fabrizio Banci; Marrazzo, Livia; Guidi, Gabriele; Meattini, Icro; Bastiani, Paolo; Amichetti, Maurizio; Schwarz, Marco; Magrini, Stefano Maria


    Background and purpose: To evaluate survival, locoregional control and toxicity in a series of 56 mesothelioma patients treated from May 2005 to May 2010 with post-operative radiotherapy after extrapleural pneumonectomy (EPP) in three Italian institutions (Brescia, Florence, and Modena). Material and methods: Fifty-six patients treated with adjuvant radiotherapy (RT) after EPP were analyzed. Four patients were treated with 3DCRT, 50 with IMRT and two with helical tomotherapy. Forty-five to 50 Gy in 25 fractions was given to the affected hemithorax and to the ipsilateral mediastinum, with a simultaneous integrated boost up to 60 Gy to the sites of microscopically involved margins in 20/56 cases. Results: Three-year locoregional control (LRC), distant metastasis-free (DMF), disease-free (DF), disease-specific (DSS) and overall survival (OS) rates were 90%, 66%, 57%, 62%, and 60%, respectively. Conclusion: Postoperative RT with modern techniques is an effective method to obtain excellent local control and cure rates in mesothelioma patients who underwent EPP.

  18. Trend analysis using non-stationary time series clustering based on the finite element method


    Gorji Sefidmazgi, M.; Sayemuzzaman, M.; Homaifar, A.; Jha, M. K.; Liess, S.


    In order to analyze low-frequency variability of climate, it is useful to model the climatic time series with multiple linear trends and locate the times of significant changes. In this paper, we have used non-stationary time series clustering to find change points in the trends. Clustering in a multi-dimensional non-stationary time series is challenging, since the problem is mathematically ill-posed. Clustering based on the finite element method (FEM) is one of the methods ...
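    The piecewise-trend idea above can be illustrated without the FEM machinery. The following is a minimal sketch, not the authors' FEM-based clustering: scan candidate breakpoints and keep the split that minimizes the total two-segment least-squares residual (data are synthetic).

```python
import numpy as np

def best_breakpoint(t, y, margin=5):
    """Scan candidate breakpoints; fit an independent linear trend to each
    side and keep the split with the lowest total squared residual."""
    best_k, best_sse = None, np.inf
    for k in range(margin, len(t) - margin):
        sse = 0.0
        for ts, ys in ((t[:k], y[:k]), (t[k:], y[k:])):
            A = np.column_stack([ts, np.ones_like(ts)])
            coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
            sse += float(np.sum((ys - A @ coef) ** 2))
        if sse < best_sse:
            best_k, best_sse = k, sse
    return best_k

# Synthetic climate-like series whose trend regime changes at t = 60
t = np.arange(100.0)
y = np.where(t < 60, 0.5 * t, 40.0 + 2.0 * (t - 60))
k = best_breakpoint(t, y)
```

A real multi-trend analysis would extend this to several breakpoints and penalize model complexity; the single-break scan only conveys the idea.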

  19. The importance of alcoholic beverage type for suicide in Japan: a time-series analysis, 1963-2007. (United States)

    Norström, Thor; Stickley, Andrew; Shibuya, Kenji


    Japan has one of the highest suicide rates in the world. Cohort analysis has suggested that alcohol consumption is a risk factor for suicide in Japan. However, this relationship has not been observed at the population level when a measure of per capita total alcohol consumption has been analysed. The present study employed a time-series analysis to examine whether these contradictory findings may be due to the existence of beverage-specific effects on suicide. An autoregressive integrated moving average (ARIMA) model was used to assess the relationship between the consumption of different types of alcohol and suicide rates from 1963 to 2007. The data comprised age-adjusted suicide rates for ages 15-69 and beverage-specific per capita (15+) alcohol consumption. The unemployment rate was included as a control variable. During 1963-2007, male suicide rates increased substantially whereas female rates decreased slightly. Consumption of distilled spirits was significantly related to male suicide rates (but not to female rates), with a 1 L increase in consumption associated with a 21.4% (95% confidence interval: 3.2-42.9) increase in male suicide rates. There was no statistically significant relationship between suicide and any other form of alcohol consumption (beer, wine, other alcohol). This is the first study to show an association between spirits consumption and male suicide in Japan. Potentially beneficial policy changes include increasing spirits prices through taxation, reducing the physical availability of alcohol and discouraging the practice of heavy drinking. © 2011 Australasian Professional Society on Alcohol and other Drugs.
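    The beverage-specific association can be sketched crudely. This is not the authors' full ARIMA specification; it is a first-difference regression of log suicide rates on spirits consumption, with synthetic data standing in for the Japanese series and a built-in semi-elasticity near the reported 21.4% per litre.

```python
import numpy as np

def beverage_effect(suicide_rate, spirits_litres):
    """Regress yearly changes in log suicide rate on changes in spirits
    consumption (a crude stand-in for a differenced ARIMA model).
    Returns the estimated % change in the rate per 1 L/capita increase."""
    dy = np.diff(np.log(suicide_rate))
    dx = np.diff(spirits_litres)
    A = np.column_stack([dx, np.ones_like(dx)])
    (b, _), *_ = np.linalg.lstsq(A, dy, rcond=None)
    return (np.exp(b) - 1.0) * 100.0

# Synthetic 45-year series with a true effect of ~21.4% per litre
rng = np.random.default_rng(0)
spirits = np.cumsum(rng.normal(0, 0.1, 45)) + 2.0
log_rate = 3.0 + np.log(1.214) * spirits + rng.normal(0, 0.01, 45)
effect = beverage_effect(np.exp(log_rate), spirits)
```

Differencing both series is what removes the shared long-term trends that make raw level-on-level regressions of this kind spurious.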

  20. The investigation of Martian dune fields using very high resolution photogrammetric measurements and time series analysis (United States)

    Kim, J.; Park, M.; Baik, H. S.; Choi, Y.


    At the present time, arguments continue regarding the migration speeds of Martian dune fields and their correlation with atmospheric circulation. However, the spatial translation of Martian dunes has been precisely measured only a very few times. We therefore developed a generic procedure to precisely measure the migration of dune fields with the recently introduced 25-cm resolution High Resolution Imaging Science Experiment (HiRISE), employing a high-accuracy photogrammetric processor and a sub-pixel image correlator. The processor was designed to trace estimated dune migration, albeit slight, over the Martian surface by 1) the introduction of very high resolution ortho images and stereo analysis based on hierarchical geodetic control for better initial point settings; 2) positioning-error removal through sensor model refinement with a non-rigorous bundle block adjustment, which makes possible the co-alignment of all images in a time series; and 3) improved sub-pixel co-registration algorithms using optical flow with a refinement stage conducted on a pyramidal grid processor and a blunder classifier. Moreover, volumetric changes of Martian dunes were additionally traced by means of stereo analysis and photoclinometry. The established algorithms have been tested using high-resolution HiRISE images over a large number of Martian dune fields covering the whole Mars Global Dune Database. Migration over well-known crater dune fields appeared to be almost static over considerable time periods and was weakly correlated with wind directions estimated by the Mars Climate Database (Millour et al. 2015). Only over a few Martian dune fields, such as Kaiser crater, have meaningful migration speeds (>1 m/year) relative to the photogrammetric error residual been measured. A technically improved processor that compensates for the error residual using time-series observations is currently under development and is expected to produce long-term migration speeds over Martian dune fields.
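    The sub-pixel correlation step can be illustrated with a toy version: FFT cross-correlation with a parabolic fit around the peak. This is a simplification of the optical-flow correlator described above, and the scene and shift are synthetic.

```python
import numpy as np

def subpixel_shift(ref, moved):
    """Estimate the (dy, dx) translation of `moved` relative to `ref`
    using FFT cross-correlation with parabolic refinement of the peak."""
    corr = np.fft.ifft2(np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))).real
    ny, nx = corr.shape
    py, px = np.unravel_index(np.argmax(corr), corr.shape)

    def refine(cm, cc, cp, p, n):
        d = p + 0.5 * (cm - cp) / (cm - 2 * cc + cp)  # parabola vertex
        return d - n if d > n / 2 else d              # unwrap cyclic shift

    dy = refine(corr[(py - 1) % ny, px], corr[py, px],
                corr[(py + 1) % ny, px], py, ny)
    dx = refine(corr[py, (px - 1) % nx], corr[py, px],
                corr[py, (px + 1) % nx], px, nx)
    return dy, dx

# A synthetic scene shifted by a known amount (np.roll is cyclic, so exact)
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
shifted = np.roll(scene, (3, 5), axis=(0, 1))
dy, dx = subpixel_shift(scene, shifted)
```

Real dune tracking must first co-align the images geometrically (the bundle-adjustment step in the abstract); correlation alone only measures residual displacement between already-registered patches.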

  1. [Time-series analysis on effect of air pollution on stroke mortality in Tianjin, China]. (United States)

    Wang, De-zheng; Gu, Qing; Jiang, Guo-hong; Yang, De-yi; Zhang, Hui; Song, Gui-de; Zhang, Ying


    To investigate the effect of air pollution on stroke mortality in Tianjin, China, and to provide a basis for stroke control and prevention. Mortality surveillance data were collected by the Tianjin Centers for Disease Control and Prevention; meteorological data and atmospheric pollution data were from the Tianjin Meteorological Bureau and the Tianjin Environmental Monitoring Center, respectively. A generalized additive Poisson regression model was used in a time-series analysis of the relationship between air pollution and stroke mortality in Tianjin. Single-pollutant and multi-pollutant analyses were performed after adjustment for confounding factors such as meteorological factors, the long-term trend of death, the "day of the week" effect and population. The crude death rate of stroke in Tianjin rose from 136.67/100 000 in 2001 to 160.01/100 000 in 2009, an escalating trend (P = 0.000), while the standardized mortality rate of stroke declined from 138.36/100 000 to 99.14/100 000 (P = 0.000). An increase of 10 µg/m³ in the daily average concentrations of atmospheric SO₂, NO₂ and PM₁₀ corresponded to relative risks of stroke mortality of 1.0105 (95%CI: 1.0060 ∼ 1.0153), 1.0197 (95%CI: 1.0149 ∼ 1.0246) and 1.0064 (95%CI: 1.0052 ∼ 1.0077), respectively. The SO₂ effect peaked after one day, while the NO₂ and PM₁₀ effects peaked within one day. Air pollution in Tianjin may increase the risk of stroke mortality in the population and induce acute onset of stroke. It is necessary to control air pollution and allocate health resources rationally to reduce the hazard of stroke mortality.
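    A full generalized additive model is beyond a short sketch, but its core, Poisson regression fitted by iteratively reweighted least squares (IRLS), can be shown compactly. The data are synthetic and the smooth confounder terms of the actual GAM are omitted; the built-in relative risk mimics the SO₂ figure above.

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Fit log E[y] = X @ beta by iteratively reweighted least squares,
    the workhorse behind Poisson time-series regression."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        W = mu                      # Poisson variance equals the mean
        z = X @ beta + (y - mu) / mu  # working response
        WX = X * W[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (W * z))
    return beta

# Synthetic daily stroke deaths driven by a pollutant
rng = np.random.default_rng(1)
so2 = rng.uniform(0, 20, 5000)            # units of 10 µg/m³
deaths = rng.poisson(np.exp(1.0 + 0.0105 * so2))
X = np.column_stack([np.ones_like(so2), so2])
beta = poisson_irls(X, deaths)
rr = np.exp(beta[1])                      # relative risk per 10 µg/m³
```

In the published analyses the design matrix would additionally carry spline terms for temperature, season and long-term trend, plus day-of-week indicators.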


    Directory of Open Access Journals (Sweden)

    Sipos-Gug Sebastian


    Full Text Available Entrepreneurship is an active field of research, having seen a major increase in interest and publication levels in recent years (Landström et al., 2012). Within this field there has recently been increasing interest in understanding why some regions have significantly higher entrepreneurial activity than others. In line with this research, we investigate the differences in entrepreneurial activity among the Romanian counties (NUTS 3 regions). While the classical research paradigm in this field is a temporally stationary analysis, we use time-series clustering to better understand the dynamics of entrepreneurial activity across counties. Our analysis showed that, using the total number of new privately owned companies founded each year in the last decade (2002-2012), we can distinguish 5 clusters: one with high total entrepreneurial activity (18 counties), one with above-average activity (8 counties), two clusters with average and slightly below-average activity (18 counties in total) and one cluster with low and declining activity (2 counties). If we are instead interested in the entrepreneurial activity rate, that is, the number of new privately owned companies founded each year adjusted by the population of the respective county, we obtain 4 clusters: one with a very high entrepreneurial rate (1 county), one with an average rate (10 counties), and two clusters with a below-average rate (31 counties in total). In conclusion, our research shows that Romania is far from being a homogeneous geographical area with respect to entrepreneurial activity. Depending on the measure of interest, it can be divided into 5 or 4 clusters of counties, which behave differently over time. Further research should focus on explaining these regional differences, studying the high-performance clusters and trying to improve the low-performing ones.
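    A minimal version of clustering counties by their yearly series can be sketched with plain k-means. The abstract does not state which clustering algorithm was used, so this is an illustration, not the authors' method; the data are synthetic and the initialisation is deterministic for reproducibility.

```python
import numpy as np

def kmeans(X, k, init_idx, iters=20):
    """Plain k-means on rows of X (one row = one county's yearly series),
    with a fixed, deterministic initialisation."""
    centers = X[list(init_idx)].astype(float)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Two synthetic groups of "counties": high flat activity vs low, declining
rng = np.random.default_rng(42)
high = 100 + rng.normal(0, 3, (10, 11))            # 10 counties, 11 years
low = 40 - 2 * np.arange(11) + rng.normal(0, 3, (8, 11))
X = np.vstack([high, low])
labels = kmeans(X, 2, init_idx=(0, len(X) - 1))    # one seed from each group
```

Clustering whole trajectories, rather than a single-year snapshot, is exactly what lets the "low and declining" group be separated from a merely low one.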

  3. Genotype-Specific Measles Transmissibility: A Branching Process Analysis. (United States)

    Ackley, Sarah F; Hacker, Jill K; Enanoria, Wayne T A; Worden, Lee; Blumberg, Seth; Porco, Travis C; Zipprich, Jennifer


    Substantial heterogeneity in measles outbreak sizes may be due to genotype-specific transmissibility. Using a branching process analysis, we characterize differences in measles transmission by estimating the association between genotype and the reproduction number R among postelimination California measles cases during 2000-2015 (400 cases, 165 outbreaks). Assuming a negative binomial secondary case distribution, we fit a branching process model to the distribution of outbreak sizes using maximum likelihood and estimated the reproduction number R for a multigenotype model. Genotype B3 is found to be significantly more transmissible than other genotypes (P = .01) with an R of 0.64 (95% confidence interval [CI], .48-.71), while the R for all other genotypes combined is 0.43 (95% CI, .28-.54). This result is robust to excluding the 2014-2015 outbreak linked to Disneyland theme parks (referred to as "outbreak A" for conciseness and clarity) (P = .04) and modeling genotype as a random effect (P = .004 including outbreak A and P = .02 excluding outbreak A). This result was not accounted for by season of introduction, age of index case, or vaccination of the index case. The R for outbreaks with a school-aged index case is 0.69 (95% CI, .52-.78), while the R for outbreaks with a non-school-aged index case is 0.28 (95% CI, .19-.35), but this cannot account for differences between genotypes. Variability in measles transmissibility may have important implications for measles control; the vaccination threshold required for elimination may not be the same for all genotypes or age groups.
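    The branching-process model can be simulated directly: each case draws a negative-binomial number of secondary cases with mean R and dispersion k, and outbreak sizes accumulate until extinction. A sketch using the fitted R for genotype B3 from the abstract; the dispersion k = 0.5 is illustrative only, not a value from the paper.

```python
import numpy as np

def outbreak_size(R, k, rng, cap=10_000):
    """Total cases in one outbreak of a subcritical branching process with
    negative-binomial offspring (mean R, dispersion k)."""
    total, active = 1, 1
    while active and total < cap:
        # NumPy's parameterisation: n = k, p = k / (k + R) gives mean R
        offspring = rng.negative_binomial(k, k / (k + R), size=active).sum()
        total += offspring
        active = offspring
    return total

rng = np.random.default_rng(7)
R, k = 0.64, 0.5   # R from the abstract; k is an assumed dispersion
sizes = [outbreak_size(R, k, rng) for _ in range(20_000)]
mean_size = np.mean(sizes)   # theory: 1 / (1 - R) for subcritical R
```

Fitting works in the opposite direction: the likelihood of the observed outbreak-size distribution is maximized over R (and k), which is how the genotype-specific estimates above were obtained.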

  4. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B


    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results that may arise from the characterization methods being used. This paper therefore presents a task-specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task-specific uncertainty analysis method has been evaluated through a series of experiments. The results show that it can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement.
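    The task-specific, Monte Carlo flavour of such an uncertainty analysis can be sketched for the simplest case, a least-squares plane fit to a nominally flat surface: repeat the characterization under simulated measurement noise for a fixed surface and sampling strategy, and report the spread of the form-error result. All numbers below are illustrative, not from the paper.

```python
import numpy as np

def form_error_pv(z, x, y):
    """Least-squares plane removal; return peak-to-valley of residuals,
    a common form-error figure for nominally flat surfaces."""
    A = np.column_stack([x, y, np.ones_like(x)])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    r = z - A @ coef
    return r.max() - r.min()

rng = np.random.default_rng(3)
x, y = rng.uniform(-1, 1, (2, 400))      # a fixed sampling strategy
true_surface = 0.05 * np.sin(3 * x)      # the specific surface (um, say)
noise_sigma = 0.002                      # instrument noise (um)
pv = [form_error_pv(true_surface + rng.normal(0, noise_sigma, 400), x, y)
      for _ in range(500)]
uncertainty = np.std(pv)                 # task-specific uncertainty of PV
```

Because the same surface and sample positions are reused in every trial, the resulting standard deviation is specific to that measurement task, which is precisely the point of the method described above.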

  5. Series: The research agenda for general practice/family medicine and primary health care in Europe. Part 4. Results: specific problem solving skills. (United States)

    Hummers-Pradier, Eva; Beyer, Martin; Chevallier, Patrick; Eilat-Tsanani, Sophia; Lionis, Christos; Peremans, Lieve; Petek, Davorina; Rurik, Imre; Soler, Jean Karl; Stoffers, Henri Ejh; Topsever, Pinar; Ungan, Mehmet; van Royen, Paul


    The 'Research Agenda for General Practice/Family Medicine and Primary Health Care in Europe' summarizes the evidence relating to the core competencies and characteristics of the Wonca Europe definition of GP/FM, and its implications for general practitioners/family doctors, researchers and policy makers. The European Journal of General Practice publishes a series of articles based on this document. The previous articles presented the background, objectives and methodology, as well as results on 'primary care management' and 'community orientation' and the person-related core competencies of GP/FM. This article reflects on the general practitioner's 'specific problem solving skills'. These include decision making on the diagnosis and therapy of specific diseases, accounting for the properties of primary care, but also research questions related to quality management and resource use, shared decision making, and professional education and development. Clinical research covers most specific diseases, but often lacks pragmatism and primary care relevance. Quality management is a stronghold of GP/FM research. Educational interventions can be effective when well designed for a specific setting and situation. However, their message that 'usual care' by general practitioners is insufficient may be problematic. GPs and their patients need more research into diagnostic reasoning with a step-wise approach to increase predictive values in a setting characterized by uncertainty and a low prevalence of specific diseases. Pragmatic comparative effectiveness studies of new and established drugs or non-pharmaceutical therapies are needed. Multi-morbidity and complexity should be addressed. Studies on therapy, communication strategies and educational interventions should consider their impact on health and the sustainability of effects.

  6. Risk assessment of environmentally influenced airway diseases based on time-series analysis. (United States)

    Herbarth, O


    Threshold values are of prime importance in providing a sound basis for public health decisions. A key issue is determining threshold or maximum exposure values for pollutants and assessing their potential health risks. Environmental epidemiology could be instrumental in assessing these levels, especially since the assessment of ambient exposures involves relatively low concentrations of pollutants. This paper presents a statistical method that allows the determination of threshold values as well as the assessment of the associated risk, using a retrospective, longitudinal study design with a prospective follow-up. Morbidity data were analyzed using the Fourier method, a time-series analysis that is based on the assumption of a high temporal resolution of the data. This method eliminates time-dependent responses like temporal inhomogeneity and pseudocorrelation. The frequency of calls for respiratory distress conditions to the regional Mobile Medical Emergency Service (MMES) in the city of Leipzig was investigated. The entire population of Leipzig served as a pool for data collection. In addition to the collection of morbidity data, air pollution measurements were taken every 30 min for the entire study period, using sulfur dioxide as the regional indicator variable. This approach allowed the calculation of a dose-response curve for respiratory diseases and air pollution indices in children and adults. Significantly higher morbidities were observed above a 24-hr mean value of 0.6 mg SO2/m3 air for children and 0.8 mg SO2/m3 for adults.(ABSTRACT TRUNCATED AT 250 WORDS)
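    The Fourier treatment of a high-temporal-resolution morbidity series can be illustrated with a basic periodogram: find the dominant cycle in a seasonal series. The daily call counts below are synthetic, not the Leipzig MMES data.

```python
import numpy as np

def dominant_period(x):
    """Return the period (in samples) of the strongest spectral peak,
    excluding the zero-frequency (mean) term."""
    x = np.asarray(x, float) - np.mean(x)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x))
    j = 1 + np.argmax(power[1:])   # skip the DC bin
    return 1.0 / freqs[j]

# 4 years of daily call counts with a yearly cycle plus noise
rng = np.random.default_rng(11)
t = np.arange(4 * 365)
calls = 50 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 3, t.size)
period = dominant_period(calls)
```

Identifying and removing such periodic components is what lets the residual morbidity be related to exposure without seasonal pseudocorrelation.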

  7. Tetrodotoxin poisoning caused by Goby fish consumption in southeast China: a retrospective case series analysis

    Directory of Open Access Journals (Sweden)

    Jie You


    Full Text Available OBJECTIVES: To investigate an unusual outbreak of tetrodotoxin poisoning in Leizhou, southeast China, a case series analysis was conducted to identify the source of the illness. METHODS: A total of 22 individuals experienced symptoms of poisoning, including tongue numbness, dizziness, nausea and limb numbness and weakness. Two toxic species, Amoya caninus and Yongeichthys nebulosus, were morphologically identified from the batches of gobies consumed by the patients. Tetrodotoxin levels in the blood and in Goby fish samples were detected using liquid chromatography-tandem mass spectrometry. RESULTS: The tetrodotoxin level in the remaining cooked Goby fish was determined to be 2090.12 µg/kg. For Amoya caninus, the toxicity levels were 1858.29 µg/kg in the muscle and 1997.19 µg/kg in the viscera, and for Yongeichthys nebulosus they were 2783.00 µg/kg in the muscle and 2966.21 µg/kg in the viscera. CONCLUSION: This outbreak demonstrates an underestimation of the risk of Goby fish poisoning. Furthermore, the relationships among the toxic species, climate and the marine algae present should be clarified in the future.

  8. On the limits of probabilistic forecasting in nonlinear time series analysis II: Differential entropy. (United States)

    Amigó, José M; Hirata, Yoshito; Aihara, Kazuyuki


    In a previous paper, the authors studied the limits of probabilistic prediction in nonlinear time series analysis in a perfect model scenario, i.e., in the ideal case that the uncertainty of an otherwise deterministic model is due to only the finite precision of the observations. The model consisted of the symbolic dynamics of a measure-preserving transformation with respect to a finite partition of the state space, and the quality of the predictions was measured by the so-called ignorance score, which is a conditional entropy. In practice, though, partitions are dispensed with by considering numerical and experimental data to be continuous, which prompts us to trade off in this paper the Shannon entropy for the differential entropy. Despite technical differences, we show that the core of the previous results also hold in this extended scenario for sufficiently high precision. The corresponding imperfect model scenario will be revisited too because it is relevant for the applications. The theoretical part and its application to probabilistic forecasting are illustrated with numerical simulations and a new prediction algorithm.

  9. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang


    Full Text Available Rapid industrial development has led to intermittent outbreaks of PM2.5 pollution, or haze, in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigated the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next-day Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy rate of AQI prediction even reaches 87.8%. The experiment of one-week prediction shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability in the accuracy of the next 3-7 days' AQI prediction.
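    The reduction to a vector autoregressive model can be illustrated with a minimal VAR(1) fitted by least squares. The structural measurement model and the real AQI inputs are not reproduced here; the two-variable data are synthetic with a known coefficient matrix.

```python
import numpy as np

def fit_var1(Y):
    """Least-squares VAR(1): Y[t] = c + A @ Y[t-1] + noise.
    Rows of Y are observations in time order."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T   # intercept c, coefficient matrix A

# Synthetic two-variable system (e.g. haze increment and a driver)
rng = np.random.default_rng(5)
A_true = np.array([[0.7, 0.1],
                   [0.0, 0.5]])
Y = np.zeros((3000, 2))
for t in range(1, 3000):
    Y[t] = A_true @ Y[t - 1] + rng.normal(0, 1, 2)
c, A_hat = fit_var1(Y)
```

Multi-day forecasts then come from iterating the fitted map, Y[t+h] ≈ c + A @ Y[t+h-1], which is how a next-day model extends to the 3-7 day horizon mentioned above (with error compounding at each step).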

  10. Mapping Mountain Pine Beetle Mortality through Growth Trend Analysis of Time-Series Landsat Data

    Directory of Open Access Journals (Sweden)

    Lu Liang


    Full Text Available Disturbances are key processes in the carbon cycle of forests and other ecosystems. In recent decades, mountain pine beetle (MPB; Dendroctonus ponderosae) outbreaks have become more frequent and extensive in western North America. Remote sensing has the ability to fill the data gaps of long-term infestation monitoring, but the elimination of observational noise and attributing changes quantitatively are two main challenges to its effective application. Here, we present a forest growth trend analysis method that integrates Landsat temporal trajectories and decision tree techniques to derive annual forest disturbance maps over an 11-year period. The temporal trajectory component successfully captures disturbance events as represented by spectral segments, whereas decision tree modeling efficiently recognizes and attributes events based upon the characteristics of the segments. Validated against a point set sampled across a gradient of MPB mortality, 86.74% to 94.00% overall accuracy was achieved, with small variability in accuracy among years. In contrast, the overall accuracies of single-date classifications ranged from 37.20% to 75.20% and only became comparable with our approach when the training sample size was increased at least four-fold. This demonstrates that a key advantage of this time-series workflow is its small training-sample requirement. The easily understandable, interpretable and modifiable characteristics of our approach suggest that it could be applicable to other ecoregions.

  11. Personal identity narratives of therapeutic songwriting participants following Spinal Cord Injury: A case series analysis. (United States)

    Roddy, Chantal; Rickard, Nikki; Tamplin, Jeanette; Baker, Felicity Anne


    Spinal Cord Injury (SCI) patients face unique identity challenges associated with physical limitations, higher comorbid depression, increased suicidality and reduced subjective well-being. Post-injury identity is often unaddressed in subacute rehabilitation environments, where critical physical and functional rehabilitation goals are prioritized. Therapeutic songwriting has demonstrated prior efficacy in promoting healthy adjustment and as a means of expression for post-injury narratives. The current study sought to examine the identity narratives of therapeutic songwriting participants. Design: Case-series analysis of the individual identity trajectories of eight individuals. Setting: Subacute rehabilitation facility, Victoria, Australia. Participants: Eight individuals with an SCI; 7 males and 1 female. Intervention: Six-week therapeutic songwriting intervention facilitated by a music therapist to promote identity rehabilitation. Outcome measures: Identity, subjective well-being and distress, emotional state. Results: Three participants demonstrated positive trajectories and a further three showed negative trajectories; the remaining participants were ambiguous in their response. Injury severity differentiated those with positive trajectories from those with negative trajectories, with greater injury severity apparent for those showing negative trends. Self-concept also improved more in those with positive trajectories. Core demographic variables did not, however, meaningfully predict the direction of change in core identity or well-being indices. Identity-focused songwriting holds promise as a means of promoting healthy identity reintegration. Further research on benefits for those with less severe spinal injuries is warranted.

  12. Mental health impacts of flooding: a controlled interrupted time series analysis of prescribing data in England. (United States)

    Milojevic, Ai; Armstrong, Ben; Wilkinson, Paul


    There is emerging evidence that people affected by flooding suffer adverse impacts on their mental well-being, mostly based on self-reports. We examined prescription records for drugs used in the management of common mental disorder among primary care practices located in the vicinity of recent large flood events in England, 2011-2014. A controlled interrupted time series analysis was conducted of the number of prescription items for antidepressant drugs in the year before and after the flood onset. Pre-post changes were compared by distance of the practice from the inundated boundaries among 930 practices located within 10 km of a flood. After control for deprivation and population density, there was a 0.59% (95% CI 0.24% to 0.94%) increase in prescriptions in the post-flood year among practices located within 1 km of a flood, over and above the change observed in the furthest distance band. The increase was greater in more deprived areas. This study suggests an increase in prescribed antidepressant drugs in the year after flooding in primary care practices close to recent major floods in England. The degree to which the increase is actually concentrated in those flooded can only be determined by more detailed linkage studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
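    The interrupted time-series design can be sketched as a segmented regression: baseline level and trend terms, plus post-onset level-change and trend-change terms. The monthly counts below are synthetic, and the paper's distance bands and deprivation controls are omitted.

```python
import numpy as np

def its_fit(y, onset):
    """Segmented (interrupted time series) regression: baseline level and
    trend, plus a level change and trend change after the intervention."""
    t = np.arange(len(y), dtype=float)
    post = (t >= onset).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, post * (t - onset)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta   # [level, trend, level change, trend change]

# Monthly prescription counts with a +0.59% level shift after month 12
rng = np.random.default_rng(9)
base = 1000 + 0.5 * np.arange(24)
y = base * np.where(np.arange(24) >= 12, 1.0059, 1.0) + rng.normal(0, 1, 24)
beta = its_fit(y, onset=12)
level_change = beta[2]   # estimated step at the flood onset
```

The "controlled" part of the published design contrasts this post-onset change across distance bands, so that secular changes common to all practices cancel out.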

  13. Comparison on the Analysis on PM10 Data based on Average and Extreme Series

    Directory of Open Access Journals (Sweden)

    Mohd Amin Nor Azrita


    Full Text Available The main concern in environmental issues is with extreme (catastrophic) phenomena rather than common events. However, most statistical approaches are concerned primarily with the centre of a distribution, or the average value, rather than the tail of the distribution, which contains the extreme observations. Extreme value theory focuses attention on the tails of distributions, where standard models prove unreliable for analysing extreme series. A high level of particulate matter (PM10) is a common environmental problem that causes various impacts on human health and material damage. If the main concern is extreme events, then extreme value analysis provides the best results with significant evidence. The monthly average and monthly maxima PM10 data for Perlis from 2003 to 2014 were analysed. Forecasting for the average data is made by the Holt-Winters method, while the return level determines the predicted value of extreme events that occur on average once in a certain period. Forecasting from January 2015 to December 2016 for the average data found that the highest forecasted value is 58.18 (standard deviation 18.45) in February 2016, while the return level reached 253.76 units for a 24-month (2015-2016) return period.
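    The return-level side of the analysis can be sketched with a Gumbel (extreme-value) fit to block maxima; a method-of-moments version is used here for brevity, and the maxima are synthetic, not the Perlis PM10 data (the abstract does not state the fitting method).

```python
import numpy as np

def gumbel_return_level(maxima, T):
    """Method-of-moments Gumbel fit to block maxima, then the level
    exceeded on average once every T blocks."""
    maxima = np.asarray(maxima, float)
    sigma = maxima.std(ddof=1) * np.sqrt(6) / np.pi
    mu = maxima.mean() - 0.5772 * sigma   # 0.5772: Euler-Mascheroni constant
    return mu - sigma * np.log(-np.log(1 - 1.0 / T))

# 12 years of synthetic monthly PM10 maxima from a Gumbel(mu=120, sigma=25)
rng = np.random.default_rng(2)
maxima = rng.gumbel(120, 25, size=144)
level_24 = gumbel_return_level(maxima, T=24)   # 24-month return level
```

This is why the extreme-series figure (253.76 in the abstract) sits far above the Holt-Winters forecast of the averages: the two methods answer different questions about the same pollutant.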


    Directory of Open Access Journals (Sweden)

    Marko Martinović


    Full Text Available The domestic currency, the Croatian kuna (HRK), was introduced in May 1995. To date, the Croatian National Bank (HNB), as the regulator and formulator of monetary policy in Croatia, has operated a policy of a stable exchange rate, typically referenced to the formal currency of the European Union, the euro (EUR). From the introduction of the euro on 01/01/1999 until 01/01/2016, the value of the currency pair HRK/EUR changed by only 4.25% (HNB). Although the value of the Croatian kuna is relatively stable, there are some fluctuations on an annual level (e.g. in the last few years because of the global crisis) as well as on periodic levels within a year. The aim of this paper is to show the movement of the value of the currency pair from the beginning of 2002 to the present day (the time curve) and to analyze its regularities, trends and periodicity (seasonal behavior), if any exist. The research will be done using the method of Time Series Analysis, assuming that the external (global economy) and internal (economic policy) factors remain similar or the same. Based on the results, a further assessment of price developments in the following period will be made using the obtained predictive models. In the event that the curve contains a periodic component, the observed patterns will be studied further.

  15. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis (United States)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

    The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput at the Doraleh Container Terminal in Djibouti through time series analysis. A selection of univariate forecasting models was used, namely the Triple Exponential Smoothing Model, the Grey Model and the Linear Regression Model. Using these three models and their combination, forecasts of container throughput at the Doraleh terminal were produced and then compared on the commonly used evaluation criteria of Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression Model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast for container throughput at DCT has been made.
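    The evaluation criteria named above are easy to make concrete: fit a linear-regression forecaster on a training window and score the held-out years by MAD and MAPE. The throughput figures below are synthetic, not DCT data.

```python
import numpy as np

def mad(actual, forecast):
    """Mean Absolute Deviation."""
    return float(np.mean(np.abs(actual - forecast)))

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100.0)

# Synthetic annual throughput (thousand TEU), steadily growing
years = np.arange(2008, 2018, dtype=float)
teu = 200.0 + 35.0 * (years - 2008) + np.array(
    [3.0, -4.0, 2.0, -1.0, 5.0, -3.0, 1.0, -2.0, 4.0, -1.0])

# Linear-regression forecaster fitted on the first 8 years,
# scored on the 2 held-out years
A = np.column_stack([years[:8], np.ones(8)])
coef, *_ = np.linalg.lstsq(A, teu[:8], rcond=None)
pred = coef[0] * years[8:] + coef[1]
m, p = mad(teu[8:], pred), mape(teu[8:], pred)
```

MAD is in the units of the series while MAPE is scale-free, which is why studies like this one usually report both when ranking competing models.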

  16. Accuracy analysis of measurements on a stable power-law distributed series of events

    International Nuclear Information System (INIS)

    Matthews, J O; Hopcraft, K I; Jakeman, E; Siviour, G B


    We investigate how finite measurement time limits the accuracy with which the parameters of a stably distributed random series of events can be determined. The model process is generated by timing the emigration of individuals from a population that is subject to deaths and a particular choice of multiple immigration events. This leads to a scale-free discrete random process where customary measures, such as mean value and variance, do not exist. However, converting the number of events occurring in fixed time intervals to a 1-bit 'clipped' process allows the construction of well-behaved statistics that still retain vestiges of the original power-law and fluctuation properties. These statistics include the clipped mean and correlation function, from measurements of which both the power-law index of the distribution of events and the time constant of its fluctuations can be deduced. We report here a theoretical analysis of the accuracy of measurements of the mean of the clipped process. This indicates that, for a fixed experiment time, the error on measurements of the sample mean is minimized by an optimum choice of the number of samples. It is shown furthermore that this choice is sensitive to the power-law index and that the approach to Poisson statistics is dominated by rare events or 'outliers'. Our results are supported by numerical simulation
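    The 1-bit "clipping" idea is simple to demonstrate: replace each interval's event count by an indicator of whether any events occurred. For heavy-tailed counts the raw sample mean is dominated by outliers, while the clipped mean stays well behaved. The synthetic Zipf counts below are an assumption for illustration, not the authors' death/multiple-immigration process.

```python
import numpy as np

def clipped_mean(counts):
    """1-bit 'clipped' mean: the fraction of non-empty intervals.
    Finite and stable even when the raw counts have no finite moments."""
    return float(np.mean(np.asarray(counts) > 0))

# Heavy-tailed counts via the Zipf (discrete power-law) distribution,
# randomly thinned so that roughly half the intervals are empty.
# Note: for Zipf exponent a <= 2 the raw mean is infinite, so
# np.mean(counts) here is dominated by rare huge values.
rng = np.random.default_rng(4)
counts = rng.zipf(1.5, size=100_000) * rng.integers(0, 2, size=100_000)
cm = clipped_mean(counts)
```

The paper's point is that statistics built from such clipped data (mean and correlation function) still retain enough of the power-law and fluctuation structure to recover the index and the fluctuation time constant.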

  17. A population based time series analysis of asthma hospitalisations in Ontario, Canada: 1988 to 2000

    Directory of Open Access Journals (Sweden)

    Upshur Ross EG


    Full Text Available Abstract Background: Asthma is a common yet incompletely understood health problem associated with a high morbidity burden. A wide variety of seasonally variable environmental stimuli, such as viruses and air pollution, are believed to influence asthma morbidity. This study set out to examine the seasonal patterns of asthma hospitalisations in relation to age and gender for the province of Ontario over a period of 12 years. Methods: A retrospective, population-based study design was used to assess temporal patterns in hospitalisations for asthma from April 1, 1988 to March 31, 2000. Approximately 14 million residents of Ontario eligible for universal healthcare coverage during this time were included in the analysis. Time series analyses were conducted on monthly aggregations of hospitalisations. Results: There was strong evidence of an autumn peak and summer trough seasonal pattern occurring every year over the 12-year period (Fisher-Kappa (FK) = 23.93, p < 0.01; Bartlett-Kolmogorov-Smirnov (BKS) = 0.459, p < 0.01). Conclusions: A clear and consistent seasonal pattern was observed in this study for asthma hospitalisations. These findings have important implications for the development of effective management and prevention strategies.
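The Fisher-Kappa statistic reported above compares the largest periodogram ordinate with the mean ordinate, so a dominant seasonal cycle produces a large value. A rough sketch on synthetic monthly counts (not the Ontario data):

```python
import numpy as np

def fisher_kappa(x):
    """Fisher-Kappa style statistic: largest periodogram ordinate divided by
    the mean ordinate; large values indicate a dominant periodic component."""
    x = np.asarray(x, float) - np.mean(x)
    pgram = np.abs(np.fft.rfft(x)[1:]) ** 2   # drop the zero-frequency term
    return pgram.max() / pgram.mean()

rng = np.random.default_rng(4)
months = np.arange(144)                       # 12 years of monthly counts
seasonal = 100 + 20 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 5, 144)

print(fisher_kappa(seasonal), fisher_kappa(rng.normal(100, 5, 144)))
```

The seasonal series yields a statistic far above that of pure noise, which is the kind of evidence the study cites for the autumn-peak pattern.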

  18. Absolute high-resolution Se+ photoionization cross-section measurements with Rydberg-series analysis

    International Nuclear Information System (INIS)

    Esteves, D. A.; Bilodeau, R. C.; Sterling, N. C.; Phaneuf, R. A.; Kilcoyne, A. L. D.; Red, E. C.; Aguilar, A.


    Absolute single photoionization cross-section measurements for Se + ions were performed at the Advanced Light Source (ALS) at Lawrence Berkeley National Laboratory using the photo-ion merged-beams technique. Measurements were made at a photon energy resolution of 5.5 meV from 17.75 to 21.85 eV spanning the 4s 2 4p 3 4 S 3/2 o ground-state ionization threshold and the 2 P 3/2 o , 2 P 1/2 o , 2 D 5/2 o , and 2 D 3/2 o metastable state thresholds. Extensive analysis of the complex resonant structure in this region identified numerous Rydberg series of resonances and obtained the Se 2+ 4s 2 4p 2 3 P 2 and 4s 2 4p 2 1 S 0 state energies. In addition, particular attention was given to removing significant effects in the measurements due to a small percentage of higher-order undulator radiation.

  19. Time series analysis of the developed financial markets' integration using visibility graphs (United States)

    Zhuang, Enyu; Small, Michael; Feng, Gang


    A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide a means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For the large cycles, the density and compactness of the corresponding communities distinguish how strongly historical incidents affected market integration.
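The natural visibility criterion underlying such graphs links two observations when the straight line between them passes above every intermediate point. A small illustrative implementation of that criterion (following the standard definition, not the authors' own code):

```python
def visibility_edges(y):
    """Natural visibility graph of a time series: nodes i and j are linked if
    the straight line between (i, y[i]) and (j, y[j]) passes above every
    intermediate point (the visibility criterion of Lacasa et al.)."""
    n = len(y)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

series = [3, 1, 4, 1, 5, 2]
g = visibility_edges(series)
# Node degree is the quantity the abstract relates to historical incidents.
degree = {i: sum(i in e for e in g) for i in range(len(series))}
print(sorted(g))
print(degree)
```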

  20. Temporal trend of carpal tunnel release surgery: a population-based time series analysis.

    Directory of Open Access Journals (Sweden)

    Naif Fnais

    Full Text Available BACKGROUND: Carpal tunnel release (CTR) is among the most common hand surgeries, although little is known about its pattern. In this study, we aimed to investigate temporal trends, age and gender variation and current practice patterns in CTR surgeries. METHODS: We conducted a population-based time series analysis among over 13 million residents of Ontario who underwent operative management for carpal tunnel syndrome (CTS) from April 1, 1992 to March 31, 2010, using administrative claims data. RESULTS: The primary analysis revealed a fairly stable procedure rate of approximately 10 patients per 10,000 population per year receiving CTRs, without any significant, consistent temporal trend (p = 0.94). Secondary analyses revealed different trends in procedure rates according to age. The annual procedure rate among those aged >75 years increased from 22 per 10,000 population at the beginning of the study period to over 26 patients per 10,000 population (p<0.01) by the end of the study period. CTR surgical procedures were approximately two-fold more common among females relative to males (64.9% vs. 35.1%, respectively; p<0.01). Lastly, CTR procedures are increasingly being conducted in the outpatient setting while procedures in the inpatient setting have been declining steadily: the proportion of procedures performed in the outpatient setting increased from 13% to over 30% by 2010 (p<0.01). CONCLUSION: Overall, CTR surgical procedures are conducted at a rate of approximately 10 patients per 10,000 population annually, with significant variation with respect to age and gender. CTR surgical procedures in ambulatory-care facilities may soon outpace procedure rates in the in-hospital setting.

  1. Evaluating the impact of flexible alcohol trading hours on violence: an interrupted time series analysis.

    Directory of Open Access Journals (Sweden)

    David K Humphreys

    Full Text Available On November 24th, 2005, the Government of England and Wales removed regulatory restrictions on the times at which licensed premises could sell alcohol. This study tests availability theory by treating the implementation of the Licensing Act (2003) as a natural experiment in alcohol policy. An interrupted time series design was employed to estimate the Act's immediate and delayed impact on violence in the City of Manchester (population 464,200). We collected police-recorded rates of violence, robbery, and total crime between the 1st of February 2004 and the 31st of December 2007. Events were aggregated by week, yielding a total of 204 observations (95 pre- and 109 post-intervention). Secondary analysis examined changes in daily patterns of violence. Pre- and post-intervention events were separated into four three-hour segments: 18:00-20:59, 21:00-23:59, 00:00-02:59 and 03:00-05:59. Analysis found no evidence that the Licensing Act (2003) affected the overall volume of violence. However, analyses of night-time violence found a gradual and permanent shift of weekend violence into later parts of the night. The results estimated an initial increase of 27.5% between 03:00 and 06:00 (ω = 0.2433, 95% CI = 0.06, 0.42), which increased to 36% by the end of the study period (δ = -0.897, 95% CI = -1.02, -0.77). This study found no evidence that a national policy increasing the physical availability of alcohol affected the overall volume of violence. There was, however, evidence suggesting that the policy may be associated with changes to patterns of violence in the early morning (3 a.m. to 6 a.m.).
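The interrupted time series design described above is commonly estimated by segmented regression with a level-change indicator and a post-intervention trend term. A sketch on synthetic weekly counts mirroring the 95 pre- / 109 post-intervention layout (the real Manchester data are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(204)                        # 204 weekly observations
post = (t >= 95).astype(float)            # indicator: 1 after the intervention
time_since = np.where(post == 1, t - 95, 0.0)

# Synthetic series: baseline level 50, flat pre-intervention trend, then a
# +5 level shift and +0.1 slope change after the intervention, plus noise.
y = 50 + 0.0 * t + 5 * post + 0.1 * time_since + rng.normal(0, 2, t.size)

# Segmented regression: y = b0 + b1*t + b2*post + b3*time_since
X = np.column_stack([np.ones_like(t, dtype=float), t, post, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b3 = beta
print(f"level change: {b2:.2f}, slope change: {b3:.3f}")
```

b2 estimates the immediate (level) effect and b3 the gradual (trend) effect, the two quantities an interrupted time series analysis reports.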

  2. Case Series Analysis of New Zealand Reports of Rapid Intense Potentiation of Warfarin by Roxithromycin. (United States)

    Savage, Ruth L; Tatley, Michael V


    We undertook an analysis of all reports to the New Zealand Centre for Adverse Reactions Monitoring of a roxithromycin/warfarin interaction after two recent reports described intense, rapid warfarin potentiation. The interaction was first published in 1995. Cytochrome P450 3A4 inhibition has been the proposed mechanism but has limited biologic plausibility. There are suggestions that the clinical significance of the interaction may be increased by severe illness, polypharmacy, renal dysfunction, older age and increased warfarin sensitivity. To investigate the potentiating effect of roxithromycin on warfarin in this New Zealand case series, the reports were reviewed to identify patients at risk, compare the reporting pattern with published Australian data and evaluate the appropriateness of current prescribing advice. Thirty patient reports were identified. The age range was 23-88 years (mean 66.8, median 73.0, standard deviation 17.7), and the international normalised ratios after roxithromycin commencement ranged from 3.6 to 16.7 (mean 7.6, median 7.6, standard deviation 3.6). For eight patients with measurements on day 3, international normalised ratios were 4.3-16.7 (mean 10.4, median 8.8, standard deviation 4.4). Four patients had serious haemorrhage. Indications for roxithromycin were a range of respiratory tract infections. Anticoagulation was stable for most patients prior to acute infection. Serious infection occurred in 54.5% (12 of 22 patients with information). Polypharmacy (five or more medicines daily) was used by 36.7% of patients long term, increasing acutely to 83.3%, including additional potentially interacting medicines. The warfarin daily dose (1.5-13.0 mg, mean 4.4, median 4.0, standard deviation 2.2) was moderate to low. Pre-roxithromycin international normalised ratio values ranged from 1.4 to 3.7 (mean and median 2.5, standard deviation 0.5). A high proportion of interactions were observed between warfarin and roxithromycin compared with other

  3. User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

    CERN Document Server

    Wiley, R A


    User's guide for the implementation of level one of the proposed American National Standard Specifications for an information interchange data descriptive file on control data 6000/7000 series computers

  4. Plastics piping systems for industrial applications – Acrylonitrile-butadiene-styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) – Specifications for components and the system – Metric series

    CERN Document Server

    Deutsches Institut für Normung. Berlin


    Plastics piping systems for industrial applications – Acrylonitrile-butadiene-styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) – Specifications for components and the system – Metric series

  5. Plastics piping systems for industrial applications : acrylonitrile-butadiene- styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) : specifications for components and the system : metric series

    CERN Document Server

    International Organization for Standardization. Geneva


    Plastics piping systems for industrial applications : acrylonitrile-butadiene- styrene (ABS), unplasticized poly(vinyl chloride) (PVC-U) and chlorinated poly(vinyl chloride) (PVC-C) : specifications for components and the system : metric series

  6. The generation of hourly diffuse irradiation: A model from the analysis of the fluctuation of global irradiance series

    Energy Technology Data Exchange (ETDEWEB)

    Posadillo, R.; Lopez Luque, R. [Grupo de Investigacion de Fisica para las Energias y Recursos Renovables, Dpto. de Fisica Aplicada, UCO, Edificio C2 Campus de Rabanales, 14071 Cordoba (Spain)


    An analysis of models for the estimation of hourly diffuse irradiation based on the interrelations between the hourly diffuse fraction k{sub d} and the hourly clearness index k{sub t} has concluded that k{sub t} is not a sufficient variable for parametrizing the effect of clouds on diffuse irradiation. A detailed study of the dispersion recorded by this diffuse component for a specific clearness index under partly cloudy sky conditions has led to analyzing how the variability in the instantaneous clearness index influences this dispersion. The data sets correspond to 10 years of hourly and instantaneous value records of global and diffuse radiation collected in Cordoba, Spain. In addition to the inclusion of the sine of solar elevation as a variable in the k{sub d}-k{sub t} correlations, the model proposes the inclusion of other parameters related to the variability of the normalized clearness index within the hour and to the fluctuations presented by the time series of the instantaneous values of that index. Also presented is the implementation of an algorithm permitting both the determination of the hourly diffuse irradiation and the discrimination between the different sky conditions in those situations known by the designation 'partly cloudy sky'. (author)

  7. The generation of hourly diffuse irradiation: A model from the analysis of the fluctuation of global irradiance series

    International Nuclear Information System (INIS)

    Posadillo, R.; Lopez Luque, R.


    An analysis of models for the estimation of hourly diffuse irradiation based on the interrelations between the hourly diffuse fraction k d and the hourly clearness index k t has concluded that k t is not a sufficient variable for parametrizing the effect of clouds on diffuse irradiation. A detailed study of the dispersion recorded by this diffuse component for a specific clearness index under partly cloudy sky conditions has led to analyzing how the variability in the instantaneous clearness index influences this dispersion. The data sets correspond to 10 years of hourly and instantaneous value records of global and diffuse radiation collected in Cordoba, Spain. In addition to the inclusion of the sine of solar elevation as a variable in the k d -k t correlations, the model proposes the inclusion of other parameters related to the variability of the normalized clearness index within the hour and to the fluctuations presented by the time series of the instantaneous values of that index. Also presented is the implementation of an algorithm permitting both the determination of the hourly diffuse irradiation and the discrimination between the different sky conditions in those situations known by the designation 'partly cloudy sky'.

  8. [Plant-specific pressurized thermal shock safety analysis report]

    International Nuclear Information System (INIS)

    Selby, D.L.


    Information is presented concerning plant data; determination of detailed PTS sequences for analysis; fracture mechanics analysis; integration of analysis; sensitivity and uncertainty analyses of through-wall crack frequencies; and effect of corrective actions on vessel through-wall crack frequency

  9. Measuring Quality Improvement in Acute Ischemic Stroke Care: Interrupted Time Series Analysis of Door-to-Needle Time

    Directory of Open Access Journals (Sweden)

    Anne Margreet van Dishoeck


    Full Text Available Background: In patients with acute ischemic stroke, early treatment with recombinant tissue plasminogen activator (rtPA) improves functional outcome by effectively reducing disability and dependency. Timely thrombolysis, within 1 h, is a vital aspect of acute stroke treatment, and is reflected in the widely used performance indicator 'door-to-needle time' (DNT). DNT measures the time from the moment the patient enters the emergency department until he/she receives intravenous rtPA. The purpose of the study was to measure quality improvement from the first implementation of thrombolysis in stroke patients in a university hospital in the Netherlands. We further aimed to identify specific interventions that affect DNT. Methods: We included all patients with acute ischemic stroke consecutively admitted to a large university hospital in the Netherlands between January 2006 and December 2012, and focused on those treated with thrombolytic therapy on admission. Data were collected routinely for research purposes and internal quality measurement (the Erasmus Stroke Study). We used a retrospective interrupted time series design to study the trend in DNT, analyzed by means of segmented regression. Results: Between January 2006 and December 2012, 1,703 patients with ischemic stroke were admitted and 262 (17%) were treated with rtPA. Patients treated with thrombolysis were on average 63 years old at the time of the stroke and 52% were male. Mean age (p = 0.58) and sex distribution (p = 0.98) did not change over the years. The proportion treated with thrombolysis increased from 5% in 2006 to 22% in 2012. In 2006, none of the patients were treated within 1 h; by 2012, this had increased to 81%. In a logistic regression analysis, this trend was significant (OR 1.6 per year, CI 1.4-1.8). The median DNT was reduced from 75 min in 2006 to 45 min in 2012. Conclusion and Implications: The DNT steadily improved from the first implementation of thrombolysis. Specific

  10. Experimental analysis of specification language impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik; Seong, Poong Hyun


    When redundancy and diversity are applied in an NPP digital computer system, diversification of the system software may be a critical point for the dependability of the entire system. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis for the impact of specification language on common errors, and an experiment based on an NPP protection system application was performed. The experimental results showed that this hypothesis could be justified and that specification language diversity is effective in overcoming the software common-mode failure problem.

  11. Characterization of phenols biodegradation by compound specific stable isotope analysis (United States)

    Wei, Xi; Gilevska, Tetyana; Wenzig, Felix; Hans, Richnow; Vogt, Carsten


    -cresol degradation and 2.2±0.3‰ for m-cresol degradation, respectively. The carbon isotope fractionation patterns of phenol degradation differed more profoundly. Oxygen-dependent monooxygenation of phenol by A. calcoaceticus as the initial reaction yielded ƐC values of -1.5±0.02‰. In contrast, the anaerobic degradation initiated by ATP-dependent carboxylation, performed by Thauera aromatica DSM 6984, produced no detectable fractionation (ƐC 0±0.1‰). D. cetonica showed a slight inverse carbon isotope fractionation (ƐC 0.4±0.1‰). In conclusion, a validated method for compound-specific stable isotope analysis was developed for phenolic compounds, and the first data set of carbon enrichment factors for the biodegradation of phenol and cresols with different activation mechanisms has been obtained in the present study. Carbon isotope fractionation analysis is a potentially powerful tool for monitoring the degradation of phenolic compounds in the environment.

  12. Alcopops, taxation and harm: a segmented time series analysis of emergency department presentations. (United States)

    Gale, Marianne; Muscatello, David J; Dinh, Michael; Byrnes, Joshua; Shakeshaft, Anthony; Hayen, Andrew; MacIntyre, Chandini Raina; Haber, Paul; Cretikos, Michelle; Morton, Patricia


    In Australia, a Goods and Services Tax (GST) introduced in 2000 led to a decline in the price of ready-to-drink (RTD) beverages relative to other alcohol products. The 2008 RTD ("alcopops") tax increased RTD prices. The objective of this study was to estimate the change in incidence of Emergency Department (ED) presentations for acute alcohol problems associated with each tax. Segmented regression analyses were performed on age- and sex-specific time series of monthly presentation rates for acute alcohol problems to 39 hospital emergency departments across New South Wales, Australia over 15 years, 1997 to 2011. Indicator variables represented the introduction of each tax. Retail liquor turnover controlled for large-scale economic factors such as the global financial crisis that may have influenced demand. Under-age (15-17 years) and legal age (18 years and over) drinkers were included. The GST was associated with a statistically significant increase in ED presentations for acute alcohol problems among 18-24 year old females (0.14/100,000/month, 95% CI 0.05-0.22). The subsequent alcopops tax was associated with a statistically significant decrease in males 15-50 years, and females 15-65 years, particularly in 18-24 year old females (-0.37/100,000/month, 95% CI -0.45 to -0.29). An increase in retail turnover of liquor was positively and statistically significantly associated with ED presentations for acute alcohol problems across all age and sex strata. Reduced tax on RTDs was associated with increasing ED presentations for acute alcohol problems among young women. The alcopops tax was associated with declining presentations in young to middle-aged persons of both sexes, including under-age drinkers.

  13. Analysis of isoelectron isonuclear series of holovalent tetraelectron compounds as a system of bicomponent chemical compounds

    Energy Technology Data Exchange (ETDEWEB)

    Vigdorovich, V.N.; Dzhuraev, T.D.


    Analogs and prototypes of the compounds supplementing the system of isoelectron isonuclear series of holovalent tetraelectron compounds by Gorunova are revealed. The investigation of all series of holovalent tetraelectron compounds allows one to supplement the variety of known series used for regular tracing and forecasting of compound properties (series of cation and anion substitutions by isonuclear series of compounds of the A⁴B⁴, A³B⁵, A¹B⁷ and other types). The above series for mean ordinal numbers Z̄ = 10, 14, 18, 23 and 36 illustrate the possibility of the existence of such analogs or series, for example for compounds of the A³B⁵ type: AlN-BP (for Z̄=10), AlP-ScN-BV (for Z̄=14), ScP-AlV (for Z̄=18), GaP-AlAs-YN-BNb (for Z̄=23) and YAs-GaNb-InV-ScSb-LaP-AlPr (for Z̄=36).

  14. Analysis of isoelectron isonuclear series of holovalent tetraelectron compounds as a system of bicomponent chemical compounds

    International Nuclear Information System (INIS)

    Vigdorovich, V.N.; Dzhuraev, T.D.


    Analogs and prototypes of the compounds supplementing the system of isoelectron isonuclear series of holovalent tetraelectron compounds by Gorunova are revealed. The investigation of all series of holovalent tetraelectron compounds allows one to supplement the variety of known series used for regular tracing and forecasting of compound properties (series of cation and anion substitutions by isonuclear series of compounds of the A⁴B⁴, A³B⁵, A¹B⁷ and other types). The above series for mean ordinal numbers Z̄ = 10, 14, 18, 23 and 36 illustrate the possibility of the existence of such analogs or series, for example for compounds of the A³B⁵ type: AlN-BP (for Z̄=10), AlP-ScN-BV (for Z̄=14), ScP-AlV (for Z̄=18), GaP-AlAs-YN-BNb (for Z̄=23) and YAs-GaNb-InV-ScSb-LaP-AlPr (for Z̄=36)

  15. Analysis of series resistance effects on forward I - V and C - V characteristics of mis type diodes

    International Nuclear Information System (INIS)

    Altindal, S.; Tekeli, Z.; Karadeniz, S.; Tugluoglu, N.; Ercan, I.


    In order to determine the series resistance R s , we have followed Lie et al., Cheung et al. and Kang et al., using the plot of I vs dV/dLn(I), which is linear over a wide range of current values at each temperature. The values of R s were obtained from the slope of the linear parts of the curves, and the series resistance at each temperature was then evaluated from the Ln(I) vs (V-IR s ) curves, which are linear over a wide range of voltage. The most reliable values of the ideality factor n and the reverse saturation current I s were then determined. In addition, the role of the series resistance in the C-V and G-V characteristics of the diode has been investigated. Both C-V and G-V measurements show that the measured capacitance and conductance vary strongly with applied bias and frequency due to the presence of R s . The density of interface states, barrier height and series resistance obtained from the forward bias I-V characteristics using this method agree very well with those obtained from the capacitance technique. It is clear that ignoring the series resistance (for a device with high series resistance) can lead to significant errors in the analysis of the I-V-T, C-V-f and G-V-f characteristics
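The dV/dLn(I) method used above rests on the relation dV/dln(I) = n·kT/q + I·R s for a diode with series resistance, so a straight-line fit of dV/dln(I) against I yields R s from the slope and n from the intercept. A sketch on synthetic I-V data with illustrative (made-up) parameter values:

```python
import numpy as np

# Synthetic forward I-V data for a diode with series resistance, generated
# from I = Is * exp(q(V - I*Rs)/(n k T)); all parameter values illustrative.
q_kT = 1 / 0.02585                  # q/kT at room temperature (1/V)
n_true, Rs_true, Is = 1.8, 120.0, 1e-9

I = np.logspace(-6, -3, 60)         # forward current (A)
V = (n_true / q_kT) * np.log(I / Is) + I * Rs_true

# dV/dln(I) = n*kT/q + Rs*I: fit a straight line against I to recover
# Rs (slope) and the ideality factor n (intercept times q/kT).
dV_dlnI = np.gradient(V, np.log(I))
Rs_fit, intercept = np.polyfit(I, dV_dlnI, 1)
n_fit = intercept * q_kT

print(f"Rs = {Rs_fit:.0f} ohm, n = {n_fit:.2f}")
```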

  16. Fuel specific consumption and emission analysis in a cycle diesel ...

    African Journals Online (AJOL)



    ICE) to produce electrical energy. The used motor .... steady rotation rate and they have observed that the ... was run by the ASSISTAT “free software”. ..... Specification for Highway Weighin-Motion (WIM) Systems with User.

  17. Analysis of three amphibian populations with quarter-century long time-series.


    Meyer, A H; Schmidt, B R; Grossenbacher, K


    Amphibians are in decline in many parts of the world. Long time-series of amphibian populations are necessary to distinguish declines from the often strong fluctuations observed in natural populations. Time-series may also help to understand the causes of these declines. We analysed 23-28-year long time-series of the frog Rana temporaria. Only one of the three studied populations showed a negative trend, which was probably caused by the introduction of fish. Two populations appeared to be densi...

  18. Functional Analysis of Maize Silk-Specific ZmbZIP25 Promoter

    Directory of Open Access Journals (Sweden)

    Wanying Li


    Full Text Available ZmbZIP25 (Zea mays bZIP (basic leucine zipper) transcription factor 25) is a function-unknown protein that belongs to the D group of the bZIP transcription factor family. RNA-seq data showed that the expression of ZmbZIP25 was tissue-specific in maize silks, and this specificity was confirmed by RT-PCR (reverse transcription-polymerase chain reaction). In situ RNA hybridization showed that ZmbZIP25 was expressed exclusively in the xylem of maize silks. A 5′ RACE (rapid amplification of cDNA ends) assay identified an adenine residue as the transcription start site of the ZmbZIP25 gene. To characterize this silk-specific promoter, we isolated and analyzed a 2450 bp (from −2083 to +367) and a 2600 bp (from −2083 to +517) sequence of ZmbZIP25 (the transcription start site was denoted +1). Stable expression assays in Arabidopsis showed that expression of the reporter gene GUS driven by the 2450 bp ZmbZIP25 5′-flanking fragment occurred exclusively in the papillae of Arabidopsis stigmas. Furthermore, transient expression assays in maize indicated that GUS and GFP expression driven by the 2450 bp ZmbZIP25 5′-flanking sequence occurred only in maize silks and not in other tissues. However, no GUS or GFP expression was driven by the 2600 bp ZmbZIP25 5′-flanking sequence in either stable or transient expression assays. A series of deletion analyses of the 2450 bp ZmbZIP25 5′-flanking sequence was performed in transgenic Arabidopsis plants, and element prediction analysis revealed the possible presence of negative regulatory elements within the 161 bp region from −1117 to −957 that were responsible for the specificity of the ZmbZIP25 5′-flanking sequence.

  19. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates. (United States)

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu


    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of available information. With replicates, it is possible to understand the variability of local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique into an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond that of ordinary correlation analysis. These statistically significant associations can provide insights to the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at
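The local similarity (LS) score at the heart of LSA can be computed with a Smith-Waterman-like dynamic program over normalized series. The sketch below is a simplified reimplementation for intuition only, not the eLSA package itself (in particular, it omits replicates and significance evaluation):

```python
import numpy as np

def local_similarity(x, y, max_delay=3):
    """Simplified local similarity score in the spirit of LSA: a dynamic
    program that finds the strongest positively or negatively associated,
    possibly time-delayed, subinterval of two normalized series."""
    x = (x - np.mean(x)) / np.std(x)
    y = (y - np.mean(y)) / np.std(y)
    n = len(x)
    P = np.zeros((n + 1, n + 1))    # running positive-association scores
    N = np.zeros((n + 1, n + 1))    # running negative-association scores
    best = 0.0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            if abs(i - j) > max_delay:
                continue            # only alignments within the delay limit
            P[i, j] = max(0.0, P[i - 1, j - 1] + x[i - 1] * y[j - 1])
            N[i, j] = max(0.0, N[i - 1, j - 1] - x[i - 1] * y[j - 1])
            best = max(best, P[i, j], N[i, j])
    return best / n

t = np.arange(30)
a = np.sin(t / 3)
b = np.roll(a, 2)               # b lags a by two time steps
print(local_similarity(a, b))   # high score despite the delay
```

Ordinary correlation at lag zero would miss this delayed association, which is exactly the case the abstract highlights.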

  20. Hybrid Wavelet De-noising and Rank-Set Pair Analysis approach for forecasting hydro-meteorological time series (United States)

    WANG, D.; Wang, Y.; Zeng, X.


    Accurate, fast forecasting of hydro-meteorological time series is presently a major challenge in drought and flood mitigation. This paper proposes a hybrid approach, Wavelet De-noising (WD) and Rank-Set Pair Analysis (RSPA), that takes full advantage of a combination of the two approaches to improve forecasts of hydro-meteorological time series. WD allows decomposition and reconstruction of a time series by the wavelet transform, and hence separation of the noise from the original series. RSPA, a more reliable and efficient version of Set Pair Analysis, is integrated with WD to form the hybrid WD-RSPA approach. Two types of hydro-meteorological data sets with different characteristics and different levels of human influences at some representative stations are used to illustrate the WD-RSPA approach. The approach is also compared to three other generic methods: the conventional Auto Regressive Integrated Moving Average (ARIMA) method, Artificial Neural Networks (ANNs) (BP-error Back Propagation, MLP-Multilayer Perceptron and RBF-Radial Basis Function), and RSPA alone. Nine error metrics are used to evaluate the model performance. The results show that WD-RSPA is accurate, feasible, and effective. In particular, WD-RSPA is found to be the best among the various generic methods compared in this paper, even when the extreme events are included within a time series.
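The WD step can be illustrated with a one-level Haar transform and soft thresholding. This is only a minimal stand-in: the paper's pipeline combines a deeper wavelet decomposition with RSPA, which is not implemented here.

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet de-noising: transform, soft-threshold the
    detail (high-frequency) coefficients, and reconstruct."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-frequency content
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-frequency, noise-bearing
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse Haar transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)
denoised = haar_denoise(noisy, threshold=0.3)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```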

  1. A very British affair: six Britons and the development of time series analysis during the 20th century

    CERN Document Server

    Mills, T


    This book develops the major themes of time series analysis from its formal beginnings in the early part of the 20th century to the present day through the research of six distinguished British statisticians, all of whose work is characterised by the British traits of pragmatism and the desire to solve practical problems of importance.

  2. Mapping agroecological zones and time lag in vegetation growth by means of Fourier analysis of time series of NDVI images (United States)

    Menenti, M.; Azzali, S.; Verhoef, W.; Van Swol, R.


    Examples are presented of applications of a fast Fourier transform algorithm to analyze time series of images of Normalized Difference Vegetation Index values. The results obtained for a case study on Zambia indicated that differences in vegetation development among map units of an existing agroclimatic map were not significant, while reliable differences were observed among the map units obtained using the Fourier analysis.
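Extracting the amplitude and phase of the annual harmonic from an NDVI-like series, as in the Fourier analysis above, is a small FFT exercise. The series below is synthetic, not the Zambia imagery:

```python
import numpy as np

# Hypothetical 3 years of monthly NDVI-like values: an annual cycle
# peaking in month 7, plus noise.
rng = np.random.default_rng(3)
months = np.arange(36)
ndvi = 0.4 + 0.25 * np.cos(2 * np.pi * (months - 7) / 12) + rng.normal(0, 0.02, 36)

spectrum = np.fft.rfft(ndvi)
freqs = np.fft.rfftfreq(36, d=1.0)          # cycles per month

annual = np.argmin(np.abs(freqs - 1 / 12))  # index of the annual harmonic
amplitude = 2 * np.abs(spectrum[annual]) / 36
phase_month = (-np.angle(spectrum[annual]) / (2 * np.pi)) * 12 % 12

print(f"annual amplitude = {amplitude:.2f}, peak month = {phase_month:.1f}")
```

Maps of such per-pixel amplitudes and phases (time lags) are the quantities the abstract compares across agroclimatic map units.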

  3. An entropy based analysis of the relationship between the DOW JONES Index and the TRNA Sentiment series

    NARCIS (Netherlands)

    D.E. Allen (David); M.J. McAleer (Michael); A.K. Singh (Abhay)


    This paper features an analysis of the relationship between the DOW JONES Industrial Average Index (DJIA) and a sentiment news series using daily data obtained from the Thomson Reuters News Analytics (TRNA) provided by SIRCA (The Securities Industry Research Centre of the Asia Pacific).

  4. Time Series Analysis of Non-Gaussian Observations Based on State Space Models from Both Classical and Bayesian Perspectives

    NARCIS (Netherlands)

    Durbin, J.; Koopman, S.J.M.


    The analysis of non-Gaussian time series using state space models is considered from both classical and Bayesian perspectives. The treatment in both cases is based on simulation using importance sampling and antithetic variables; Monte Carlo Markov chain methods are not employed. Non-Gaussian

  5. An econometric time-series analysis of global CO2 concentrations and emissions

    International Nuclear Information System (INIS)

    Cohen, B.C.; Labys, W.C.; Eliste, P.


    This paper extends previous work on the econometric modelling of CO2 concentrations and emissions. The importance of such work rests in the fact that models of the Cohen-Labys variety represent the only alternative to scientific or physical models of CO2 accumulations whose parameters are inferred rather than estimated. The motivation for this study derives from the recent discovery of oscillations and cycles in the net biospheric flux of CO2. A variety of time series tests is thus used to search for the presence of normality, stationarity, cyclicality and stochastic processes in global CO2 emissions and concentrations series. Given the evidence for cyclicality of a short-run nature in the spectra of these series, both structural time series and error correction models are applied to confirm the frequency and amplitude of these cycles. Our results suggest new possibilities for determining equilibrium levels of CO2 concentrations and subsequently revising stabilization policies. (Author)

  6. Cointegration and Error Correction Modelling in Time-Series Analysis: A Brief Introduction

    Directory of Open Access Journals (Sweden)

    Helmut Thome


    Criminological research is often based on time-series data showing some type of trend movement. Trending time series may correlate strongly even in cases where no causal relationship exists (spurious causality). To avoid this problem researchers often apply some technique of detrending their data, such as differencing the series. This approach, however, may bring up another problem: that of spurious non-causality. Both problems can, in principle, be avoided if the series under investigation are "difference-stationary" (if the trend movements are stochastic) and "cointegrated" (if the stochastically changing trend movements in different variables correspond to each other). The article gives a brief introduction to key instruments and interpretative tools applied in cointegration modelling.
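The two-step logic of cointegration testing, fit a long-run relation by OLS, then check that its residuals are stationary, can be sketched in a few lines. The simulated series and the crude AR(1) residual check below are illustrative stand-ins for a proper Engle-Granger procedure with an ADF test:

```python
import numpy as np

# Two series sharing one stochastic trend: each is non-stationary,
# but a linear combination of them is stationary (cointegration)
rng = np.random.default_rng(2)
n = 2000
trend = np.cumsum(rng.normal(size=n))      # common random-walk trend
x = trend + rng.normal(size=n)
y = 2.0 * trend + 1.0 + rng.normal(size=n)

# Step 1: OLS of y on x estimates the candidate long-run relation
beta, alpha = np.polyfit(x, y, 1)
resid = y - (alpha + beta * x)

# Step 2: if the series are cointegrated, the residual mean-reverts;
# an AR(1) coefficient well below 1 is the (crude) indication here
rho = np.polyfit(resid[:-1], resid[1:], 1)[0]
print(round(beta, 1), abs(rho) < 0.5)
```

If the residual instead behaved like a random walk (rho near 1), the regression would be spurious and an error correction model would not be warranted.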

  7. Chaos analysis of the electrical signal time series evoked by acupuncture

    International Nuclear Information System (INIS)

    Wang Jiang; Sun Li; Fei Xiangyang; Zhu Bing


    This paper employs chaos theory to analyze time series of the electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between the various acupuncture methods are discussed.
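The phase-space reconstruction step is standard time-delay embedding. A minimal sketch follows, with an arbitrary embedding dimension and delay; in practice these would come from the mutual-information and Cao criteria the abstract mentions:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding: each row is the delay vector
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    x = np.asarray(x, dtype=float)
    n = x.size - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# A sampled sine unfolds into a closed loop in the reconstructed space
t = np.linspace(0, 20 * np.pi, 2000)
emb = delay_embed(np.sin(t), dim=3, tau=25)
print(emb.shape)   # (1950, 3)
```

Quantities such as the largest Lyapunov exponent are then estimated from nearest-neighbour divergence rates in this reconstructed space.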

  8. Chaos analysis of the electrical signal time series evoked by acupuncture

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China)]. E-mail:; Sun Li [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Fei Xiangyang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Zhu Bing [Institute of Acupuncture and Moxibustion, China Academy of Traditional Chinese Medicine, Beijing 100700 (China)


    This paper employs chaos theory to analyze time series of the electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between the various acupuncture methods are discussed.

  9. Does Financial Development Reduce CO2 Emissions in Malaysian Economy? A Time Series Analysis


    Shahbaz, Muhammad; Solarin, Sakiru Adebola; Mahmood, Haider


    This study deals with the question of whether financial development reduces CO2 emissions in the case of Malaysia. For this purpose, we apply the bounds testing approach to cointegration to examine long-run relations between the variables. The study uses annual time series data over the period 1971-2008. The Ng-Perron test is applied to examine the unit root properties of the series. Our results validate the presence of cointegration between CO2 emissions, financial development, energy co...

  10. Self-potential time series analysis in a seismic area of the Southern Apennines: preliminary results


    Di Bello, G.; Lapenna, V.; Satriano, C.; Tramutoli, V.


    The self-potential time series recorded during the period May 1991 - August 1992 by an automatic station located in a seismic area of the Southern Apennines is analyzed. We deal with the spectral and statistical features of electrotelluric precursors, which can play a major role in approaches to seismic prediction. The time dynamics of the experimental series is investigated, and the cyclic components and time trends are removed. In particular we consider the influence of external...

  11. Decentring the Wizard: An Analysis of the Constructed Discourses of Animality in the Harry Potter Series


    Haugann, Martine Juritzen


    This thesis seeks to explore the way constructions of animality present problematic discourses of race, gender and human ethnic groups in the Harry Potter series. This is done with special emphasis on the third novel, Harry Potter and the Prisoner of Azkaban, the fifth, Harry Potter and the Order of the Phoenix, and the seventh, Harry Potter and the Deathly Hallows. The overall claim is that problematic representations of animality constructions in the series reinforce, rather than resist, st...

  12. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis. (United States)

    Astola, Laura; Molenaar, Jaap


    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.
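For reference, quantile normalization itself, the method the authors argue against for time series, can be sketched in a few lines of NumPy. This is a basic version without tie handling, not the authors' proposed alternative:

```python
import numpy as np

def quantile_normalize(arrays):
    """Quantile normalization: force every array (column) to share the
    same empirical distribution, the mean of the sorted columns."""
    data = np.asarray(arrays, dtype=float)          # genes x arrays
    order = np.argsort(data, axis=0)
    ranks = np.argsort(order, axis=0)               # rank of each entry per column
    reference = np.sort(data, axis=0).mean(axis=1)  # mean quantile profile
    return reference[ranks]

# Two 'arrays' on different scales end up with identical sorted values
data = np.array([[5.0, 4.0],
                 [2.0, 1.0],
                 [3.0, 4.5],
                 [4.0, 2.0]])
qn = quantile_normalize(data)
print(np.allclose(np.sort(qn[:, 0]), np.sort(qn[:, 1])))  # shared distribution
```

The forced identity of distributions across time points is exactly what can distort the temporal dynamics that an ODE-based network inference relies on.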

  13. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis

    Directory of Open Access Journals (Sweden)

    Laura Astola


    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous-time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.

  14. Electromotive force analysis of current transformer during lightning surge inflow using Fourier series expansion

    Directory of Open Access Journals (Sweden)

    Youngsun Kim


    The most common structure used for current transformers (CTs) consists of secondary windings around a ferromagnetic core past the primary current being measured. A CT used as a surge protection device (SPD) may experience large inrushes of current, like surges. However, when a large current flows into the primary winding, measuring the magnitude of the current is difficult because the ferromagnetic core becomes magnetically saturated. Several approaches to reduce the saturation effect are described in the literature. A Rogowski coil is representative of several devices that measure large currents. It is an electrical device that measures alternating current (AC) or high-frequency current. However, such devices are very expensive in application. In addition, the volume of a CT must be increased to measure sufficiently large currents, but for installation spaces that are too small, other methods must be used. To solve this problem, it is necessary to analyze the magnetic field and electromotive force (EMF) characteristics when designing a CT. Thus, we proposed an analysis method for the CT under an inrush current using the time-domain finite element method (TDFEM). The input source current of a surge waveform is expanded by a Fourier series to obtain an instantaneous value. An FEM model of the device is derived in a two-dimensional system and coupled with EMF circuits. The time-derivative term in the differential equation is solved in each time step by the finite difference method. It is concluded that the proposed algorithm is useful for analyzing CT characteristics, including the field distribution. Consequently, the proposed algorithm yields a reference for obtaining the effects of design parameters and magnetic materials for special shapes and sizes before the CT is designed and manufactured.

  15. Electromotive force analysis of current transformer during lightning surge inflow using Fourier series expansion (United States)

    Kim, Youngsun


    The most common structure used for current transformers (CTs) consists of secondary windings around a ferromagnetic core past the primary current being measured. A CT used as a surge protection device (SPD) may experience large inrushes of current, like surges. However, when a large current flows into the primary winding, measuring the magnitude of the current is difficult because the ferromagnetic core becomes magnetically saturated. Several approaches to reduce the saturation effect are described in the literature. A Rogowski coil is representative of several devices that measure large currents. It is an electrical device that measures alternating current (AC) or high-frequency current. However, such devices are very expensive in application. In addition, the volume of a CT must be increased to measure sufficiently large currents, but for installation spaces that are too small, other methods must be used. To solve this problem, it is necessary to analyze the magnetic field and electromotive force (EMF) characteristics when designing a CT. Thus, we proposed an analysis method for the CT under an inrush current using the time-domain finite element method (TDFEM). The input source current of a surge waveform is expanded by a Fourier series to obtain an instantaneous value. An FEM model of the device is derived in a two-dimensional system and coupled with EMF circuits. The time-derivative term in the differential equation is solved in each time step by the finite difference method. It is concluded that the proposed algorithm is useful for analyzing CT characteristics, including the field distribution. Consequently, the proposed algorithm yields a reference for obtaining the effects of design parameters and magnetic materials for special shapes and sizes before the CT is designed and manufactured.
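The Fourier-series expansion of the surge source current can be sketched numerically. The double-exponential waveform and the harmonic count below are illustrative assumptions, not the paper's parameters; truncating the FFT and inverting it recovers the instantaneous values that would drive each time step of the coupled field-circuit computation:

```python
import numpy as np

# Double-exponential surge waveform (illustrative parameters):
# fast rise, slow decay, sampled over one analysis period
T = 400e-6
n = 4096
t = np.linspace(0.0, T, n, endpoint=False)
alpha, beta = 1.5e4, 2.0e6
surge = np.exp(-alpha * t) - np.exp(-beta * t)

# Expand into a Fourier series and keep the first 500 harmonics;
# the inverse transform gives the instantaneous source value
coeffs = np.fft.rfft(surge)
coeffs[501:] = 0.0
recon = np.fft.irfft(coeffs, n)

# The truncated series reproduces the surge closely in an L2 sense
rel_err = np.linalg.norm(recon - surge) / np.linalg.norm(surge)
print(rel_err < 0.05)
```

The number of harmonics retained trades accuracy on the fast rising edge against the cost of evaluating the series at every time step of the TDFEM loop.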

  16. Association between air pollution and cardiovascular mortality in Hefei, China: A time-series analysis. (United States)

    Zhang, Chao; Ding, Rui; Xiao, Changchun; Xu, Yachun; Cheng, Han; Zhu, Furong; Lei, Ruoqian; Di, Dongsheng; Zhao, Qihong; Cao, Jiyu


    In recent years, air pollution has become an alarming problem in China. However, evidence on the effects of air pollution on cardiovascular mortality is still not conclusive to date. This research aimed to assess the short-term effects of air pollution on cardiovascular mortality in Hefei, China. Data on air pollution, cardiovascular mortality, and meteorological characteristics in Hefei between 2010 and 2015 were collected. Time-series analysis in a generalized additive model was applied to evaluate the association between air pollution and daily cardiovascular mortality. During the study period, the annual average concentrations of PM10, SO2, and NO2 were 105.91, 20.58, and 30.93 μg/m3, respectively. 21,816 people (including 11,876 men, and 14,494 people over 75 years of age) died of cardiovascular diseases. In the single-pollutant model, the effects of multi-day exposure were greater than those of single-day exposure. For every increase of 10 μg/m3 in SO2, NO2, and PM10 levels, CVD mortality increased by 5.26% (95%CI: 3.31%-7.23%), 2.71% (95%CI: 1.23%-4.22%), and 0.68% (95%CI: 0.33%-1.04%) at lag03, respectively. The multi-pollutant models showed that PM10 and SO2 remained associated with CVD mortality, although the effect estimates attenuated. However, the effect of NO2 on CVD mortality became statistically insignificant. Subgroup analyses further showed that women were more vulnerable than men to air pollution exposure. These findings show that air pollution can significantly increase CVD mortality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. [Insulinoma of the pancreas: analysis of a clinical series of 30 cases]. (United States)

    Andronesi, D; Andronesi, A; Tonea, A; Andrei, S; Herlea, V; Lupescu, I; Ionescu-Târgovişte, C; Coculescu, M; Fica, S; Ionescu, M; Gheorghe, C; Popescu, I


    Insulinoma is the most frequent neuroendocrine pancreatic tumor and the main cause of hypoglycemia due to endogenous hyperinsulinism. We performed an analysis of a clinical series in order to study the clinical and biological spectrum of presentation, the preoperative imaging diagnosis and the results of the surgical approach. Between 1986 and 2009, 30 patients with symptoms suggesting an insulinoma were hospitalized in our department. Preoperative localization of the insulinomas was possible in 16 patients. The most sensitive imaging methods were endoscopic ultrasound and magnetic resonance imaging. Intraoperative ultrasound was performed in 16 patients and its sensitivity in the detection of insulinomas was 93%; the combination of intraoperative ultrasound and manual exploration of the pancreas by the surgeon reached 100% sensitivity. Before intraoperative ultrasound was used, tumor excision was predominantly done by extensive pancreatic resection, whereas after it became available in our centre more conservative procedures (enucleo-resection) were chosen. In 1 patient the resection was done by laparoscopy, and in 1 patient by robotic surgery. The dimensions of the tumor were less than 2 cm in most of the patients; 2 had nesidioblastosis and 2 had multiple insulinomas; all 28 patients proved to have benign insulinomas on histological specimens. Following surgery, the symptoms disappeared in all patients. The most common complication following extensive pancreatic resections was acute pancreatitis, while after enucleation pancreatic fistula occurred more frequently. Due to their small dimensions, the preoperative diagnosis of insulinomas is usually difficult, endoscopic ultrasound being the most sensitive method. Intraoperative ultrasound is essential for insulinoma localization and for choosing the optimal type of excision. Enucleation is the resection method of choice whenever it is technically possible. In benign insulinomas the prognosis is excellent, surgical resection being curative in

  18. Efficient Transfer Entropy Analysis of Non-Stationary Neural Time Series (United States)

    Vicente, Raul; Díaz-Pernas, Francisco J.; Wibral, Michael


    Information theory allows us to investigate information processing in neural systems in terms of information transfer, storage and modification. Especially the measure of information transfer, transfer entropy, has seen a dramatic surge of interest in neuroscience. Estimating transfer entropy from two processes requires the observation of multiple realizations of these processes to estimate associated probability density functions. To obtain these necessary observations, available estimators typically assume stationarity of processes to allow pooling of observations over time. This assumption, however, is a major obstacle to the application of these estimators in neuroscience, as observed processes are often non-stationary. As a solution, Gomez-Herrero and colleagues theoretically showed that the stationarity assumption may be avoided by estimating transfer entropy from an ensemble of realizations. Such an ensemble of realizations is often readily available in neuroscience experiments in the form of experimental trials. Thus, in this work we combine the ensemble method with a recently proposed transfer entropy estimator to make transfer entropy estimation applicable to non-stationary time series. We present an efficient implementation of the approach that is suitable for the increased computational demand of the ensemble method's practical application. In particular, we use a massively parallel implementation for a graphics processing unit to handle the computationally most heavy aspects of the ensemble method for transfer entropy estimation. We test the performance and robustness of our implementation on data from numerical simulations of stochastic processes. We also demonstrate the applicability of the ensemble method to magnetoencephalographic data.
While we mainly evaluate the proposed method for neuroscience data, we expect it to be applicable in a variety of fields that are concerned with the analysis of information transfer in complex biological, social, and
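The quantity being estimated can be conveyed by a simple plug-in transfer entropy for discretized series. This sketch uses binary discretization, history length one, and synthetic data; the estimators discussed in the record (e.g. nearest-neighbour based, ensemble pooled) are far more sophisticated:

```python
import numpy as np

def transfer_entropy(source, target):
    """Plug-in transfer entropy TE(source -> target) in bits, with
    median-split discretization and history length one:
    TE = H(T_t | T_{t-1}) - H(T_t | T_{t-1}, S_{t-1})."""
    s = (source > np.median(source)).astype(int)
    t = (target > np.median(target)).astype(int)
    trip = np.stack([t[1:], t[:-1], s[:-1]])   # (T_t, T_{t-1}, S_{t-1})

    def entropy(rows):
        _, counts = np.unique(rows.T, axis=0, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    # Expand the two conditional entropies into joint entropies
    return (entropy(trip[:2]) - entropy(trip[1:2])
            - entropy(trip) + entropy(trip[1:]))

# Synthetic driver-driven pair: tgt follows s with a one-step lag
rng = np.random.default_rng(3)
s = rng.normal(size=5000)
tgt = np.concatenate([[0.0], 0.8 * s[:-1]]) + 0.2 * rng.normal(size=5000)
te_fwd = transfer_entropy(s, tgt)   # driver -> driven: clearly positive
te_rev = transfer_entropy(tgt, s)   # reverse direction: near zero
print(te_fwd > te_rev)
```

The ensemble method replaces the pooling over time seen here with pooling over trials, which is what makes the estimate valid for non-stationary processes.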

  19. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data. (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A


    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.

  20. Time series analysis of embodied interaction: Movement variability and complexity matching as dyadic properties

    Directory of Open Access Journals (Sweden)

    Leonardo Zapata-Fonseca


    There is a growing consensus that a fuller understanding of social cognition depends on more systematic studies of real-time social interaction. Such studies require methods that can deal with the complex dynamics taking place at multiple interdependent temporal and spatial scales, spanning sub-personal, personal, and dyadic levels of analysis. We demonstrate the value of adopting an extended multi-scale approach by re-analyzing movement time series generated in a study of embodied dyadic interaction in a minimal virtual reality environment (a perceptual crossing experiment). Reduced movement variability revealed an interdependence between social awareness and social coordination that cannot be accounted for by either subjective or objective factors alone: it picks out interactions in which subjective and objective conditions are convergent (i.e., elevated coordination is perceived as clearly social, and impaired coordination is perceived as socially ambiguous). This finding is consistent with the claim that interpersonal interaction can be partially constitutive of direct social perception. Clustering statistics (Allan Factor) of salient events revealed fractal scaling. Complexity matching, defined as the similarity between these scaling laws, was significantly more pronounced in pairs of participants as compared to surrogate dyads. This further highlights the multi-scale and distributed character of social interaction and extends previous complexity matching results from dyadic conversation to nonverbal social interaction dynamics. Trials with successful joint interaction were also associated with an increase in local coordination. Consequently, a local coordination pattern emerges on the background of complex dyadic interactions in the PCE task and makes joint successful performance possible.

  1. Using barometric time series of the IMS infrasound network for a global analysis of thermally induced atmospheric tides (United States)

    Hupe, Patrick; Ceranna, Lars; Pilger, Christoph


    The International Monitoring System (IMS) has been established to monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty and comprises four technologies, one of which is infrasound. When fully established, the IMS infrasound network consists of 60 sites uniformly distributed around the globe. Besides its primary purpose of determining explosions in the atmosphere, the recorded data reveal information on other anthropogenic and natural infrasound sources. Furthermore, the almost continuous multi-year recordings of differential and absolute air pressure allow for analysing the atmospheric conditions. In this paper, spectral analysis tools are applied to derive atmospheric dynamics from barometric time series. Based on the solar atmospheric tides, a methodology for performing geographic and temporal variability analyses is presented, which is supposed to serve for upcoming studies related to atmospheric dynamics. The surplus value of using the IMS infrasound network data for such purposes is demonstrated by comparing the findings on the thermal tides with previous studies and the Modern-Era Retrospective analysis for Research and Applications Version 2 (MERRA-2), which represents the solar tides well in its surface pressure fields. Absolute air pressure recordings reveal geographical characteristics of atmospheric tides related to the solar day and even to the lunar day. We therefore claim the chosen methodology of using the IMS infrasound network to be applicable for global and temporal studies on specific atmospheric dynamics. Given the accuracy and high temporal resolution of the barometric data from the IMS infrasound network, interactions with gravity waves and planetary waves can be examined in future for refining the knowledge of atmospheric dynamics, e.g. the origin of tidal harmonics up to 9 cycles per day as found in the barometric data sets. 
Data assimilation in empirical models of solar tides would be a valuable application of the IMS infrasound


    Directory of Open Access Journals (Sweden)

    Abror Abror


    Indonesia, located in the tropics, has a wet season and a dry season. In recent years, however, river discharge in the dry season has been very low, while in the wet season the frequency of floods has increased, with sharper peaks and ever greater water elevations. The increased flood discharge may be due to changes in land use or changes in rainfall characteristics, and both possibilities need clarification. A study was therefore carried out to quantitatively analyze rainfall characteristics, land use and flood discharge in several watersheds (DAS) from time series data. The research was conducted in DAS Gintung at Parakankidang, DAS Gung at Danawarih, DAS Rambut at Cipero, DAS Kemiri at Sidapurna and DAS Comal at Nambo, located in Tegal Regency and Pemalang Regency in Central Java Province. The research consisted of three main steps: input, DAS system and output. Input covers DAS determination and selection and the collection of secondary data. The DAS system step is the initial processing of the secondary data, consisting of rainfall analysis, HSS GAMA I parameters, soil type analysis and DAS land use. Output is the final processing step, consisting of Tadashi Tanimoto calculations, USSCS effective rainfall, flood discharge, ARIMA analysis, result analysis and conclusions. The ARIMA Box-Jenkins time series calculations used the Number Cruncher Statistical Systems and Power Analysis Sample Size software (NCSS-PASS version 2000), which yields time series characteristics in the form of time series patterns, mean square error (MSE), root mean square (RMS), autocorrelation of residuals and trends. The results indicate that composite CN and flood discharge are proportional: when the composite CN trend increases, the flood discharge trend also increases, and vice versa. Meanwhile, a decreasing rainfall trend is not always followed by a decrease in the flood discharge trend. The main determinant of flood discharge characteristics is DAS management, not change in

  3. Economic cycles and their synchronization: Spectral analysis of macroeconomic series from Italy, The Netherlands, and the UK (United States)

    Sella, Lisa; Vivaldo, Gianna; Ghil, Michael; Groth, Andreas


    The present work applies several advanced spectral methods (Ghil et al., Rev. Geophys., 2002) to the analysis of macroeconomic fluctuations in Italy, The Netherlands, and the United Kingdom. These methods provide valuable time-and-frequency-domain tools that complement traditional time-domain analysis, and are thus fairly well known by now in the geosciences and life sciences, but not yet widespread in quantitative economics. In particular, they enable the identification and characterization of nonlinear trends and dominant cycles --- including low-frequency and seasonal components --- that characterize the behavior of each time series. We explore five fundamental indicators of the real (i.e., non-monetary), aggregate economy --- namely gross domestic product (GDP), consumption, fixed investments, exports and imports --- in a univariate as well as multivariate setting. A single-channel analysis by means of three independent spectral methods --- singular spectrum analysis (SSA), the multi-taper method (MTM), and the maximum-entropy method (MEM) --- reveals very similar near-annual cycles, as well as several longer periodicities, in the macroeconomic indicators of all the countries analyzed. Since each indicator represents different features of an economic system, we combine them to infer if common oscillatory modes are present, either among different indicators within the same country or among the same indicators across different countries. Multichannel-SSA (M-SSA) reinforces the previous results, and shows that the common modes agree in character with solutions of a non-equilibrium dynamic model (NEDyM) that produces endogenous business cycles (Hallegatte et al., JEBO, 2008). The presence of these modes in NEDyM results from adjustment delays and other nonequilibrium effects that were added to a neoclassical Solow (Q. J. Econ., 1956) growth model. Their confirmation by the present analysis has important consequences for the net impact of natural disasters on the
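Of the spectral methods listed, singular spectrum analysis is the easiest to sketch: embed the series in a trajectory matrix, take its SVD, and reconstruct components by anti-diagonal averaging. The toy series below (not the macroeconomic data) is a trend plus a cycle, which is exactly rank four in trajectory space, so four elementary components reconstruct it to machine precision:

```python
import numpy as np

def ssa_components(x, window, k):
    """Basic SSA: trajectory matrix -> SVD -> first k elementary
    components, each mapped back to a series by diagonal averaging."""
    x = np.asarray(x, dtype=float)
    n = x.size
    cols = n - window + 1
    # traj[r, c] = x[r + c]: lagged copies of the series as columns
    traj = np.column_stack([x[i: i + window] for i in range(cols)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    parts = []
    for i in range(k):
        elem = s[i] * np.outer(u[:, i], vt[i])
        # Average each anti-diagonal (entries with row + col = j)
        rec = np.array([np.mean(elem[::-1].diagonal(j - window + 1))
                        for j in range(n)])
        parts.append(rec)
    return np.array(parts)

# Linear trend (2 components) + sinusoid (2 components)
t = np.arange(200)
x = 0.02 * t + np.sin(2 * np.pi * t / 12)
recon = ssa_components(x, window=24, k=4).sum(axis=0)
print(np.allclose(recon, x))
```

On real, noisy indicators the leading components separate nonlinear trends from dominant cycles, which is how the near-annual and longer periodicities reported here are extracted.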

  4. Electron microscopic analysis of the specific granule content of ...

    African Journals Online (AJOL)

    Knowledge about the stimulus for the release of atrial natriuretic peptide (ANP) from human atria is incomplete. Atrial stretch is known to be a stimulus and atrial tachyarrhythmias are thought to be another. The effects of atrial size (by twodimensional echocardiography) and atrial fibrillation on the atrial specific granule ...

  5. Multinetwork of international trade: A commodity-specific analysis (United States)

    Barigozzi, Matteo; Fagiolo, Giorgio; Garlaschelli, Diego


    We study the topological properties of the multinetwork of commodity-specific trade relations among world countries over the 1992-2003 period, comparing them with those of the aggregate-trade network, known in the literature as the international-trade network (ITN). We show that link-weight distributions of commodity-specific networks are extremely heterogeneous and (quasi) log normality of aggregate link-weight distribution is generated as a sheer outcome of aggregation. Commodity-specific networks also display average connectivity, clustering, and centrality levels very different from their aggregate counterpart. We also find that ITN complete connectivity is mainly achieved through the presence of many weak links that keep commodity-specific networks together and that the correlation structure existing between topological statistics within each single network is fairly robust and mimics that of the aggregate network. Finally, we employ cross-commodity correlations between link weights to build hierarchies of commodities. Our results suggest that on the top of a relatively time-invariant “intrinsic” taxonomy (based on inherent between-commodity similarities), the roles played by different commodities in the ITN have become more and more dissimilar, possibly as the result of an increased trade specialization. Our approach is general and can be used to characterize any multinetwork emerging as a nontrivial aggregation of several interdependent layers.

  6. Implementation of a Time Series Analysis for the Assessment of the Role of Climate Variability in a Post-Disturbance Savanna System (United States)

    Gibbes, C.; Southworth, J.; Waylen, P. R.


    How do climate variability and climate change influence vegetation cover and vegetation change in savannas? A landscape-scale investigation of the effect of changes in precipitation on vegetation is undertaken through a time series analysis. The multi-national study region is located within the Kavango-Zambezi region and is delineated by the Okavango, Kwando, and Zambezi watersheds. A mean-variance time-series analysis quantifies vegetation dynamics and characterizes vegetation response to climate. The spatially explicit approach used to quantify the persistence of vegetation productivity permits the extraction of information regarding long-term climate-landscape dynamics. Results show a pattern of reduced mean annual precipitation and increased precipitation variability across key social and ecological areas within the study region. Despite decreased mean annual precipitation since the mid to late 1970s, vegetation trends predominantly indicate increasing biomass. The limited areas with diminished vegetative cover relate to specific vegetation types and are associated with declines in precipitation variability. Results indicate that in addition to short-term changes in vegetation cover, long-term trends in productive biomass are apparent, relate to spatial differences in precipitation variability, and potentially represent shifts in vegetation composition. This work highlights the importance of time-series analyses for examining climate-vegetation linkages in a spatially explicit manner within a highly vulnerable region of the world.

  7. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    International Nuclear Information System (INIS)

    Glasscock, J.A.; Flanagan, M.J.


    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  8. Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments

    Energy Technology Data Exchange (ETDEWEB)

    Vrba, Vlastimil, E-mail: [Department of Experimental Physics, Faculty of Science, Palacký University Olomouc, 17. listopadu 12, 771 46 Olomouc (Czech Republic); Procházka, Vít, E-mail: [Department of Experimental Physics, Faculty of Science, Palacký University Olomouc, 17. listopadu 12, 771 46 Olomouc (Czech Republic); Smrčka, David, E-mail: [Department of Experimental Physics, Faculty of Science, Palacký University Olomouc, 17. listopadu 12, 771 46 Olomouc (Czech Republic); Miglierini, Marcel, E-mail: [Slovak University of Technology in Bratislava, Faculty of Electrical Engineering and Information Technology, Institute of Nuclear and Physical Engineering, Ilkovicova 3, 812 19 Bratislava (Slovakia); Department of Nuclear Reactors, Faculty of Nuclear Science and Physical Engineering, Czech Technical University in Prague, V Holešovičkách 2, 180 00 Prague (Czech Republic)


    This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail, and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for the data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of one data set as the input parameters for the next, and the model-suitability crosscheck option of applying the procedure in both ascending and descending directions through the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.
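
    The sequential seeding idea can be sketched generically; the damped-exponential model and drifting decay times below are toy stand-ins, not the actual NFS theory:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(t, a, tau):
    """Toy stand-in for a time-spectrum model: a damped exponential."""
    return a * np.exp(-t / tau)

rng = np.random.default_rng(2)
t = np.linspace(0.1, 10, 200)

# A series of synthetic "spectra" whose decay time drifts slowly,
# mimicking an in-situ run of consecutive measurements.
true_taus = np.linspace(2.0, 4.0, 10)
spectra = [model(t, 1.0, tau) + rng.normal(0, 0.01, t.size)
           for tau in true_taus]

def sequential_fit(data_sets, p0):
    """Fit each data set, seeding it with the previous converged result."""
    params, results = np.asarray(p0, dtype=float), []
    for y in data_sets:
        params, _ = curve_fit(model, t, y, p0=params)
        results.append(params)
    return np.array(results)

# Crosscheck: run the procedure in ascending and descending order and
# compare; consistent parameters support the model's suitability.
ascending = sequential_fit(spectra, p0=[1.0, 1.0])
descending = sequential_fit(spectra[::-1], p0=[1.0, 5.0])[::-1]
print(np.max(np.abs(ascending - descending)))
```

    Because consecutive spectra differ only slightly, the previous fit is an excellent starting point, which is what cuts the evaluation time for runs of hundreds of spectra.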

  9. Advanced approach to the analysis of a series of in-situ nuclear forward scattering experiments

    International Nuclear Information System (INIS)

    Vrba, Vlastimil; Procházka, Vít; Smrčka, David; Miglierini, Marcel


    This study introduces a sequential fitting procedure as a specific approach to nuclear forward scattering (NFS) data evaluation. The principles and usage of this advanced evaluation method are described in detail, and its utilization is demonstrated on NFS in-situ investigations of fast processes. Such experiments frequently consist of hundreds of time spectra which need to be evaluated. The introduced procedure allows the analysis of these experiments and significantly decreases the time needed for the data evaluation. The key contributions of the study are the sequential use of the output fitting parameters of one data set as the input parameters for the next, and the model-suitability crosscheck option of applying the procedure in both ascending and descending directions through the data sets. The described fitting methodology is beneficial for checking model validity and the reliability of the obtained results.

  10. Association between air pollution and suicide: a time series analysis in four Colombian cities. (United States)

    Fernández-Niño, Julián Alfredo; Astudillo-García, Claudia Iveth; Rodríguez-Villamizar, Laura Andrea; Florez-Garcia, Víctor Alfonso


    Recent epidemiological studies have suggested that air pollution could be associated with suicide. However, other studies have criticized these results for being analytically weak and not taking into account potential confounding factors. As such, further studies examining the relationship under diverse contexts are necessary to help clarify this issue. This study explored the association between specific air pollutants (NO2, SO2, PM10, PM2.5, CO and O3) and suicide incidence in four Colombian cities after adjusting for climatic variables and holidays. A time series of daily suicides among men and women living in Bogota, Medellin, Cali and Bucaramanga was generated using information from the National Administrative Department of Statistics (DANE) for the years 2011-2014. At the same time, the average daily concentration of each air pollutant for each city was obtained from monitoring stations belonging to the National Air Quality Surveillance System. Using this information together, we generated conditional Poisson models (stratified by day, month and year) for the suicide rate in men and women, with air pollutants as the principal explanatory variables. These models were adjusted for temperature, relative humidity, precipitation and holidays. No association was found between any of the examined pollutants and suicide: NO2 (IRR: 0.99, 95% CI: 0.95-1.04), SO2 (IRR: 0.99, 95% CI: 0.98-1.01), PM10 (IRR: 0.99, 95% CI: 0.95-1.03), PM2.5 (IRR: 1.01, 95% CI: 0.98-1.05), CO (IRR: 1.00, 95% CI: 1.00-1.00) and O3 (IRR: 1.00, 95% CI: 0.96-1.04). Likewise, no association was found in models stratified by sex and age group, nor in lagged and cumulative-effects models. After adjusting for major confounding factors, we found no statistically significant association between air pollution and suicide in Colombia. These "negative" results provide further insight into the current discussion regarding the existence of such a relationship.
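
    The stratified Poisson approach can be sketched with simulated data. Everything below is hypothetical: the counts are generated under a true null effect (IRR = 1), and month dummies stand in for the study's day/month/year strata (stratum fixed effects are the unconditional analogue of a conditional Poisson model):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000  # observation days in one hypothetical city

pm10 = rng.normal(50, 15, n)    # hypothetical daily PM10 (ug/m3)
month = rng.integers(0, 12, n)  # stratum label (month of year)

# Simulate daily suicide counts with NO pollutant effect (true IRR = 1),
# only month-specific baseline rates, mirroring a null association.
baseline = rng.normal(-1.0, 0.2, 12)
counts = rng.poisson(np.exp(baseline[month]))

# Design matrix: month dummies (stratum fixed effects) plus the
# pollutant scaled so the coefficient is per 10 ug/m3.
X = np.column_stack([np.eye(12)[month], (pm10 - pm10.mean()) / 10.0])

beta = np.zeros(X.shape[1])
for _ in range(25):             # Newton-Raphson for the Poisson MLE
    mu = np.exp(X @ beta)
    beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (counts - mu))

irr = np.exp(beta[-1])          # incidence rate ratio per 10 ug/m3
print(f"IRR = {irr:.3f}")
```

    Under this null simulation the estimated IRR hovers near 1, which is the shape of the "negative" results the study reports.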

  11. Time-series analysis in imatinib-resistant chronic myeloid leukemia K562-cells under different drug treatments. (United States)

    Zhao, Yan-Hong; Zhang, Xue-Fang; Zhao, Yan-Qiu; Bai, Fan; Qin, Fan; Sun, Jing; Dong, Ying


    Chronic myeloid leukemia (CML) is characterized by the accumulation of active BCR-ABL protein. Imatinib is the first-line treatment of CML; however, many patients are resistant to this drug. In this study, we aimed to compare the differences in expression patterns and functions of time-series genes in imatinib-resistant CML cells under different drug treatments. GSE24946 was downloaded from the GEO database, which included 17 samples of K562-r cells with (n=12) or without drug administration (n=5). Three drug treatment groups were considered for this study: arsenic trioxide (ATO), AMN107, and ATO+AMN107. Each group had one sample at each time point (3, 12, 24, and 48 h). Time-series genes with a ratio of standard deviation/average (coefficient of variation) >0.15 were screened, and their expression patterns were revealed based on Short Time-series Expression Miner (STEM). Then, the functional enrichment analysis of time-series genes in each group was performed using DAVID, and the genes enriched in the top ten functional categories were extracted to detect their expression patterns. Different time-series genes were identified in the three groups, and most of them were enriched in the ribosome and oxidative phosphorylation pathways. Time-series genes in the three treatment groups had different expression patterns and functions. Time-series genes in the ATO group (e.g. CCNA2 and DAB2) were significantly associated with cell adhesion, those in the AMN107 group were related to cellular carbohydrate metabolic process, while those in the ATO+AMN107 group (e.g. AP2M1) were significantly related to cell proliferation and antigen processing. In imatinib-resistant CML cells, ATO could influence genes related to cell adhesion, AMN107 might affect genes involved in cellular carbohydrate metabolism, and the combination therapy might regulate genes involved in cell proliferation.
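
    The screening step for time-series genes (coefficient of variation above 0.15) is simple to sketch; the expression matrix and gene labels below are synthetic placeholders for the GSE24946 data:

```python
import numpy as np

rng = np.random.default_rng(4)
genes = [f"GENE{i}" for i in range(100)]  # hypothetical gene labels

# Hypothetical expression matrix: 100 genes x 4 time points (3/12/24/48 h).
expr = rng.lognormal(mean=2.0, sigma=0.3, size=(100, 4))

# Screen "time-series genes": standard deviation / mean (the coefficient
# of variation) above 0.15 across the four time points.
cv = expr.std(axis=1) / expr.mean(axis=1)
time_series_genes = [g for g, c in zip(genes, cv) if c > 0.15]

print(f"{len(time_series_genes)} of {len(genes)} genes pass CV > 0.15")
```

    The genes passing this screen are the ones the study then clusters with STEM and annotates with DAVID.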

  12. All-phase MR angiography using independent component analysis of dynamic contrast enhanced MRI time series. φ-MRA

    International Nuclear Information System (INIS)

    Suzuki, Kiyotaka; Matsuzawa, Hitoshi; Watanabe, Masaki; Nakada, Tsutomu; Nakayama, Naoki; Kwee, I.L.


    Dynamic contrast enhanced magnetic resonance imaging (dynamic MRI) represents an MRI version of non-diffusible tracer methods, whose main clinical use is the physiological construction of what are conventionally referred to as perfusion images. The raw data utilized for constructing MRI perfusion images are time series of pixel signal alterations associated with the passage of a gadolinium-containing contrast agent. Such time series are highly compatible with independent component analysis (ICA), a novel statistical signal processing technique capable of effectively separating a single mixture of multiple signals into their original independent source signals (blind separation). Accordingly, we applied ICA to dynamic MRI time series. The technique was found to be powerful, allowing for hitherto unobtainable assessment of regional cerebral hemodynamics in vivo. (author)
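
    Blind separation of pixel time series can be sketched with scikit-learn's FastICA. The two "hemodynamic" source curves, the mixing matrix, and the pixel count below are all synthetic assumptions, not the paper's data or model:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 60, 300)  # hypothetical scan time in seconds

# Two "source" curves: an arterial-like first-pass bolus and a broader,
# delayed tissue-like response (toy Gaussian shapes).
bolus = np.exp(-((t - 15.0) / 4.0) ** 2)
tissue = np.exp(-((t - 25.0) / 8.0) ** 2)
sources = np.column_stack([bolus, tissue])

# Each pixel's time series is a mixture of the sources plus noise.
mixing = rng.random((50, 2))  # 50 pixels
pixels = sources @ mixing.T + rng.normal(0, 0.01, (t.size, 50))

# Blind separation: ICA recovers source-like components from mixtures.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(pixels)
print(recovered.shape)
```

    The recovered components correlate strongly (up to sign and scale, which ICA cannot fix) with the original source curves, which is the essence of the blind-separation property the paper exploits.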


    Koizumi, Akira; Suehiro, Miki; Arai, Yasuhiro; Inakazu, Toyono; Masuko, Atushi; Tamura, Satoshi; Ashida, Hiroshi

    The purpose of this study is to define one apartment complex as "the water supply block" and to show the relationship between the amount of water supplied to an apartment house and its time-series fluctuation. We examined observation data collected from 33 apartment houses. The water meters were installed at individual observation points for about 20 days in Tokyo. This study used Fourier analysis in order to grasp the irregularity in a time-series data set. As a result, this paper demonstrates that the smaller the amount of water supplied became, the larger the irregularity in the time-series fluctuation. We also found that it was difficult to describe the daily cyclical pattern for a small apartment house using the dominant periodic components obtained from a Fourier spectrum. Our research gives useful information about the design of a directional water supply system, with regard to estimating the hourly fluctuation and the maximum daily water demand.
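
    Extracting the dominant periodic component from a demand record can be sketched as follows; the demand series is synthetic (a clean 24 h cycle plus noise), chosen only to match the roughly 20-day hourly observation window described above:

```python
import numpy as np

rng = np.random.default_rng(6)
hours = np.arange(24 * 20)  # about 20 days of hourly readings

# Hypothetical hourly demand: a 24 h cycle plus noise. For a small
# building the noise term would dominate, blurring the daily cycle.
demand = (10 + 4 * np.sin(2 * np.pi * hours / 24)
          + rng.normal(0, 1.0, hours.size))

# Fourier spectrum of the mean-removed series.
spectrum = np.abs(np.fft.rfft(demand - demand.mean()))
freqs = np.fft.rfftfreq(hours.size, d=1.0)  # cycles per hour

dominant_period = 1.0 / freqs[np.argmax(spectrum)]
print(f"dominant period = {dominant_period:.1f} h")
```

    When the noise amplitude rivals the cyclic amplitude, as in small buildings, the spectral peak at 24 h becomes hard to distinguish from the noise floor, matching the study's finding.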

  14. Analysis of owner design specifications for snubbers. Report 2

    Energy Technology Data Exchange (ETDEWEB)

    Butler, J.H.


    The report discusses and evaluates the adequacy of buyer specifications for snubbers. Technical specifications for snubbers (hydraulic and mechanical) were studied in detail in an effort to define the "fundamental needs" of the industry with respect to characteristics of snubbers during operation. In the course of this study, it was determined that there is insufficient consensus among users to make such a definition. Authorities in the fields of structural dynamics, structural systems design, and snubber design were also consulted for additional information. Information from these sources is incorporated to identify the fundamental areas of concern, areas where consensus is lacking, and problems to be resolved in order to establish meaningful standards for this equipment.

  15. Analysis of owner design specifications for snubbers. Report 2

    International Nuclear Information System (INIS)

    Butler, J.H.


    The report discusses and evaluates the adequacy of buyer specifications for snubbers. Technical specifications for snubbers (hydraulic and mechanical) were studied in detail in an effort to define the "fundamental needs" of the industry with respect to characteristics of snubbers during operation. In the course of this study, it was determined that there is insufficient consensus among users to make such a definition. Authorities in the fields of structural dynamics, structural systems design, and snubber design were also consulted for additional information. Information from these sources is incorporated to identify the fundamental areas of concern, areas where consensus is lacking, and problems to be resolved in order to establish meaningful standards for this equipment.

  16. Computer Support of Semantic Text Analysis of a Technical Specification on Designing Software


    Zaboleeva-Zotova, Alla; Orlova, Yulia


    The given work is devoted to the development of a computer-aided system for the semantic text analysis of a technical specification. The purpose of this work is to increase the efficiency of software engineering based on the automation of semantic text analysis of a technical specification. This work proposes and investigates a technique for the text analysis of a technical specification, and presents an expanded fuzzy attribute grammar of a technical specification, intended for formaliza...

  17. The need for detailed gender-specific occupational safety analysis. (United States)

    Cruz Rios, Fernanda; Chong, Wai K; Grau, David


    The female workforce in the United States is growing; therefore, occupational health and safety entities must begin to analyze gender-specific data for every industry, especially for nontraditional occupations. Women working in nontraditional jobs are often exposed to extreme workplace hazards. These women have their safety and health threatened because there are no adequate policies to mitigate gender-specific risks such as discrimination and harassment. Employers tend to aggravate this situation because they often fail to provide proper reporting infrastructure and support. According to past studies, women suffered from workplace injuries and illnesses that were less prominent among men. Statistics also confirmed that men and women faced different levels of risk in distinct work environments. For example, the rates of workplace violence and murders by personal acquaintances were significantly higher among women. In this paper, the authors analyze prior public data on fatal and nonfatal injuries to understand why we need to differentiate genders when analyzing occupational safety and health issues. The analyses confirmed that women dealt with unique workplace hazards compared to men. It is urgent that public agencies, such as the U.S. Department of Labor, record gender-specific data in detail, by occupation and industry. The reader will become aware of the current lack - and need - of data and knowledge about injuries and illnesses separated by gender and industry. Finally, safety and health researchers are encouraged to investigate the gender-specific data in all industries and occupations as soon as they become available. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.

  18. Probabilistic analysis of 900 MWe PWR. Shutdown technical specifications

    International Nuclear Information System (INIS)

    Mattei, J.M.; Bars, G.


    During annual shutdown, preventive maintenance and modifications made on PWRs cause scheduled unavailabilities of equipment or systems which might harm the safety of the installation, in spite of the low level of decay heat during this period. The pumps in the auxiliary feedwater system, component cooling water system, service water system, the water injection arrays (LPIS, HPIS, CVCS), and the containment spray system may be subject to scheduled unavailability, as may the power supplies of the electricity boards. The EDF utility is aware of the risks related to these situations, for which accident procedures have been set up, and has therefore proposed limiting downtime for this equipment during the shutdown period, through technical specifications. The project defines the equipment required to ensure the functions important for safety during the various shutdown phases (criticality, water inventory, evacuation of decay heat, containment). In order to be able to judge the acceptability of these specifications, the IPSN, the technical support of the Service Central de Surete des Installations Nucleaires, has used probabilistic methodology to analyse the impact of these specifications on the core melt probability, for a French 900 MWe PWR.

  19. The impact of policy guidelines on hospital antibiotic use over a decade: a segmented time series analysis.

    Directory of Open Access Journals (Sweden)

    Sujith J Chandy

    Full Text Available Antibiotic pressure contributes to rising antibiotic resistance. Policy guidelines encourage rational prescribing behavior, but their effectiveness in containing antibiotic use needs further assessment. This study therefore assessed the patterns of antibiotic use over a decade and analyzed the impact of different modes of guideline development and dissemination on inpatient antibiotic use. Antibiotic use was calculated monthly as defined daily doses (DDD) per 100 bed-days for nine antibiotic groups and overall. This time series compared trends in antibiotic use across five adjacent time periods, identified as 'Segments', divided according to differing modes of guideline development and implementation: Segment 1--baseline prior to antibiotic guideline development; Segment 2--during preparation of guidelines and booklet dissemination; Segment 3--dormant period with no guideline dissemination; Segment 4--booklet dissemination of revised guidelines; Segment 5--booklet dissemination of revised guidelines with intranet access. Regression analysis adapted for segmented time series and adjusted for seasonality assessed changes in the antibiotic use trend. Overall antibiotic use increased at a monthly rate of 0.95 (SE = 0.18), 0.21 (SE = 0.08) and 0.31 (SE = 0.06) for Segments 1, 2 and 3, stabilized in Segment 4 (0.05; SE = 0.10) and declined in Segment 5 (-0.37; SE = 0.11). Segments 1, 2 and 4 exhibited seasonal fluctuations. Pairwise segmented regression adjusted for seasonality revealed a significant drop in monthly antibiotic use of 0.401 (SE = 0.089; p < 0.001) for Segment 5 compared to Segment 4. Most antibiotic groups showed trends similar to overall use. Use of overall and specific antibiotic groups showed varied patterns and seasonal fluctuations. Containment of rising overall antibiotic use was possible during periods of active guideline dissemination. Wider access through the intranet facilitated a significant decline in use. Stakeholders and policy
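
    A segmented regression with seasonal adjustment, of the kind described above, can be sketched on simulated data. The monthly series, the single breakpoint, and the slopes below are all hypothetical (the study used five segments; one break is enough to show the mechanics):

```python
import numpy as np

rng = np.random.default_rng(7)
months = np.arange(120)  # ten years of monthly DDD per 100 bed-days
bp = 84                  # hypothetical segment boundary (month index)

# Simulated use: rising trend before the boundary, decline after it,
# plus an annual seasonal component and noise.
trend = np.where(months < bp, 0.3 * months,
                 0.3 * bp - 0.4 * (months - bp))
use = (40 + trend + 3 * np.sin(2 * np.pi * months / 12)
       + rng.normal(0, 1.0, months.size))

# Segmented OLS adjusted for seasonality: intercept, pre-boundary slope,
# post-boundary change in slope, and sine/cosine annual terms.
X = np.column_stack([
    np.ones(months.size),
    months,
    np.clip(months - bp, 0, None),  # extra slope after the boundary
    np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 12),
])
coef, *_ = np.linalg.lstsq(X, use, rcond=None)
pre_slope, slope_change = coef[1], coef[2]
print(f"pre: {pre_slope:.2f}/month, post: {pre_slope + slope_change:.2f}/month")
```

    The coefficient on the clipped term is the change in monthly trend at the boundary, the quantity the study tests when comparing adjacent segments.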

  20. Self-potential time series analysis in a seismic area of the Southern Apennines: preliminary results

    Directory of Open Access Journals (Sweden)

    V. Tramutoli


    Full Text Available The self-potential time series recorded during the period May 1991 - August 1992 by an automatic station located in a seismic area of the Southern Apennines is analyzed. We deal with the spectral and statistical features of the electrotelluric precursors: they can play a major role in the approach to seismic prediction. The time dynamics of the experimental time series is investigated, and the cyclic components and time trends are removed. In particular, we consider the influence of external noise related to anthropic activities and meteoclimatic parameters, and pick out the anomalies from the residual series. Finally, we show the preliminary results of the correlation between the anomalies in the time patterns of the self-potential data and the earthquakes which occurred in the area.
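
    The detrend/deseasonalize/flag-residuals pipeline described above can be sketched as follows; the signal, the drift and cycle amplitudes, and the two injected anomalies are synthetic assumptions matched only to the roughly 15-month daily record:

```python
import numpy as np

rng = np.random.default_rng(8)
days = np.arange(450)  # roughly May 1991 - August 1992, daily samples

# Hypothetical self-potential record: linear drift, a yearly cycle,
# noise, and two injected spike-like anomalies.
sp = (0.05 * days + 8 * np.sin(2 * np.pi * days / 365.25)
      + rng.normal(0, 1.0, days.size))
sp[[100, 300]] += 12.0  # injected anomalies

# Remove the time trend and cyclic component by regressing them out.
X = np.column_stack([np.ones(days.size), days,
                     np.sin(2 * np.pi * days / 365.25),
                     np.cos(2 * np.pi * days / 365.25)])
coef, *_ = np.linalg.lstsq(X, sp, rcond=None)
residual = sp - X @ coef

# Pick out anomalies: residuals beyond three standard deviations.
anomalies = np.flatnonzero(np.abs(residual) > 3 * residual.std())
print(anomalies)
```

    Only after the deterministic trend and cycle are removed do the spikes stand out against the residual noise; these flagged days are the candidates one would then correlate with local seismicity.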