WorldWideScience

Sample records for series signal analysis

  1. The Photoplethysmographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia

    2001-01-01

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method, and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and also for estimating the signal's stochastic components.
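
    Of the techniques listed above, Higuchi's time-domain fractal dimension estimator is simple enough to sketch directly. The following Python/NumPy fragment is a minimal, generic implementation of Higuchi's method (not code from the paper; variable names and the k_max default are illustrative):

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    ks = np.arange(1, k_max + 1)
    L = []
    for k in ks:
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)              # sub-sampled indices for this offset
            if len(idx) < 2:
                continue
            # mean absolute increment, rescaled per Higuchi's normalization
            norm = (N - 1) / ((len(idx) - 1) * k)
            Lk.append(np.sum(np.abs(np.diff(x[idx]))) * norm / k)
        L.append(np.mean(Lk))
    # slope of log L(k) versus log(1/k) gives the fractal dimension
    return np.polyfit(np.log(1.0 / ks), np.log(L), 1)[0]

# Example: the fractal dimension of white noise is close to 2
rng = np.random.default_rng(0)
print(higuchi_fd(rng.standard_normal(2000)))
```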

  2. The Real-time Frequency Spectrum Analysis of Neutron Pulse Signal Series

    International Nuclear Information System (INIS)

    Tang Yuelin; Ren Yong; Wei Biao; Feng Peng; Mi Deling; Pan Yingjun; Li Jiansheng; Ye Cenming

    2009-01-01

    The frequency spectrum analysis of neutron pulse signals is a very important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of neutron pulse signal series, this paper proposes a new rotation-table approach and realizes a real-time frequency spectrum algorithm at a 1 GHz sample rate on a PC using add, address and SSE operations. The numerical experimental results show that, at a count rate of 3×10⁶ s⁻¹, this algorithm is superior to FFTW in time consumption and can meet the real-time requirement of frequency spectrum analysis. (authors)
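
    The record does not detail the rotation-table algorithm itself, but the simplification it can exploit is that a 0/1 pulse series has a DFT that reduces to a sum of complex exponentials over the '1' positions only. A hedged NumPy sketch of that idea (not the authors' implementation, and nowhere near 1 GHz real-time performance):

```python
import numpy as np

def pulse_train_spectrum(event_indices, n_samples, n_freqs=None):
    """DFT of a 0/1 pulse train, summing only over the '1' positions.

    Because the signal takes only the values 0 and 1, X[k] is the sum over event
    positions n of exp(-2j*pi*k*n/N); no multiplication by sample values is
    needed, which is the kind of shortcut a precomputed table can exploit.
    """
    event_indices = np.asarray(event_indices)
    K = n_freqs if n_freqs is not None else n_samples
    k = np.arange(K)[:, None]                     # frequency bins as a column
    return np.exp(-2j * np.pi * k * event_indices[None, :] / n_samples).sum(axis=1)

# Example: 50 random pulses in a 4096-sample window
rng = np.random.default_rng(1)
events = np.sort(rng.choice(4096, size=50, replace=False))
X = pulse_train_spectrum(events, 4096, n_freqs=256)
print(np.abs(X[:5]))          # X[0] equals the number of pulses
```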

  3. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio

    2013-01-01

    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that parameters can be obtained appropriately as network coefficients according to a backpropagation-through-time-based training algorithm. The network is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  4. Evaluation of the autoregression time-series model for analysis of a noisy signal

    International Nuclear Information System (INIS)

    Allen, J.W.

    1977-01-01

    The autoregression (AR) time-series model of a continuous noisy signal was statistically evaluated to determine quantitatively the uncertainties of the model order, the model parameters, and the model's power spectral density (PSD). The result of such a statistical evaluation enables an experimenter to decide whether an AR model can adequately represent a continuous noisy signal and be consistent with the signal's frequency spectrum, and whether it can be used for on-line monitoring. Although evaluations of other types of signals have been reported in the literature, no direct reference has been found to an AR model's uncertainties for continuous noisy signals; yet the evaluation is necessary to decide the usefulness of AR models of typical reactor signals (e.g., neutron detector output or thermocouple output) and the potential of AR models for on-line monitoring applications. AR and other time-series models for noisy data representation are being investigated by others since such models require fewer parameters than the traditional PSD model. For this study, the AR model was selected for its simplicity and conduciveness to uncertainty analysis, and controlled laboratory bench signals were used for continuous noisy data. (author)
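
    As a concrete illustration of the kind of AR modelling evaluated here, the sketch below fits an AR(p) model by the Yule-Walker equations and evaluates the corresponding model PSD. It is a generic textbook procedure in plain NumPy, not the report's code:

```python
import numpy as np

def ar_yule_walker(x, order):
    """Fit AR(p) coefficients by solving the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    N = len(x)
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(order + 1)])  # autocovariances
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:])               # AR coefficients
    sigma2 = r[0] - np.dot(a, r[1:])            # driving-noise variance
    return a, sigma2

def ar_psd(a, sigma2, freqs, fs=1.0):
    """Power spectral density of the fitted AR model."""
    w = 2 * np.pi * freqs / fs
    phases = np.outer(w, np.arange(1, len(a) + 1))
    denom = np.abs(1 - np.sum(a[None, :] * np.exp(-1j * phases), axis=1)) ** 2
    return sigma2 / fs / denom

# Example: simulate an AR(2) process and recover its coefficients and spectrum
rng = np.random.default_rng(2)
x = np.zeros(5000)
for n in range(2, 5000):
    x[n] = 1.3 * x[n - 1] - 0.6 * x[n - 2] + rng.standard_normal()
a, s2 = ar_yule_walker(x, order=2)
f = np.linspace(0.0, 0.5, 256)
psd = ar_psd(a, s2, f)
print(np.round(a, 2), f[np.argmax(psd)])   # coefficients close to [1.3, -0.6]; PSD peaks at the resonance
```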

  5. Analysis of Seasonal Signal in GPS Short-Baseline Time Series

    Science.gov (United States)

    Wang, Kaihua; Jiang, Weiping; Chen, Hua; An, Xiangdong; Zhou, Xiaohui; Yuan, Peng; Chen, Qusen

    2018-04-01

    Proper modeling of seasonal signals and their quantitative analysis are of interest in geoscience applications, which are based on position time series of permanent GPS stations. Seasonal signals in GPS short-baseline (paper, to better understand the seasonal signal in GPS short-baseline time series, we adopted and processed six different short-baselines with data span that varies from 2 to 14 years and baseline length that varies from 6 to 1100 m. To avoid seasonal signals that are overwhelmed by noise, each of the station pairs is chosen with significant differences in their height (> 5 m) or type of the monument. For comparison, we also processed an approximately zero baseline with a distance of pass-filtered (BP) noise is valid for approximately 40% of the baseline components, and another 20% of the components can be best modeled by a combination of the first-order Gauss-Markov (FOGM) process plus white noise (WN). The TEM displacements are then modeled by considering the monument height of the building structure beneath the GPS antenna. The median contributions of TEM to the annual amplitude in the vertical direction are 84% and 46% with and without additional parts of the monument, respectively. Obvious annual signals with amplitude > 0.4 mm in the horizontal direction are observed in five short-baselines, and the amplitudes exceed 1 mm in four of them. These horizontal seasonal signals are likely related to the propagation of daily/sub-daily TEM displacement or other signals related to the site environment. Mismodeling of the tropospheric delay may also introduce spurious seasonal signals with annual amplitudes of 5 and 2 mm, respectively, for two short-baselines with elevation differences greater than 100 m. The results suggest that the monument height of the additional part of a typical GPS station should be considered when estimating the TEM displacement and that the tropospheric delay should be modeled cautiously, especially with station pairs with
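
    Annual and semi-annual terms of the kind quantified above are usually estimated by least squares against sinusoidal basis functions. A minimal sketch, assuming time in years and a purely illustrative 0.4 mm annual amplitude; this is not the processing chain used in the paper:

```python
import numpy as np

def fit_seasonal(t_years, y):
    """Least-squares fit of offset, trend, annual and semi-annual terms.

    Returns the parameter vector and the annual amplitude in the units of y.
    """
    w = 2 * np.pi                                   # one cycle per year when t is in years
    A = np.column_stack([
        np.ones_like(t_years), t_years,
        np.sin(w * t_years), np.cos(w * t_years),          # annual
        np.sin(2 * w * t_years), np.cos(2 * w * t_years),  # semi-annual
    ])
    p, *_ = np.linalg.lstsq(A, y, rcond=None)
    annual_amp = np.hypot(p[2], p[3])
    return p, annual_amp

# Example: a daily series with a 0.4 mm annual signal plus noise
rng = np.random.default_rng(3)
t = np.arange(0, 6, 1 / 365.25)
y = 0.4 * np.sin(2 * np.pi * t + 0.7) + 0.2 * rng.standard_normal(t.size)
_, amp = fit_seasonal(t, y)
print(round(amp, 2))   # ~0.4
```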

  6. Permutation entropy based time series analysis: Equalities in the input signal can lead to false conclusions

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Scholkmann, Felix, E-mail: Felix.Scholkmann@gmail.com [Research Office for Complex Physical and Biological Systems (ROCoS), Mutschellenstr. 179, 8038 Zurich (Switzerland); Biomedical Optics Research Laboratory, Department of Neonatology, University Hospital Zurich, University of Zurich, 8091 Zurich (Switzerland); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-06-15

    A symbolic encoding scheme, based on the ordinal relation between the amplitude of neighboring values of a given data sequence, should be implemented before estimating the permutation entropy. Consequently, equalities in the analyzed signal, i.e. repeated equal values, deserve special attention and treatment. In this work, we carefully study the effect that the presence of equalities has on permutation entropy estimated values when these ties are symbolized, as it is commonly done, according to their order of appearance. On the one hand, the analysis of computer-generated time series is initially developed to understand the incidence of repeated values on permutation entropy estimations in controlled scenarios. The presence of temporal correlations is erroneously concluded when true pseudorandom time series with low amplitude resolutions are considered. On the other hand, the analysis of real-world data is included to illustrate how the presence of a significant number of equal values can give rise to false conclusions regarding the underlying temporal structures in practical contexts. - Highlights: • Impact of repeated values in a signal when estimating permutation entropy is studied. • Numerical and experimental tests are included for characterizing this limitation. • Non-negligible temporal correlations can be spuriously concluded by repeated values. • Data digitized with low amplitude resolutions could be especially affected. • Analysis with shuffled realizations can help to overcome this limitation.
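
    The tie-handling convention discussed here, ranking equal values by their order of appearance, is easy to reproduce, because a stable argsort does exactly that. A short sketch showing how coarse amplitude quantization can bias the estimate (generic implementation, not the authors' code; the rounding step is only an illustration of low amplitude resolution):

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy; ties are ranked by their order of appearance,
    which is the common convention whose pitfalls the paper studies."""
    x = np.asarray(x, dtype=float)
    counts = {}
    for i in range(len(x) - (order - 1) * delay):
        window = x[i:i + order * delay:delay]
        # np.argsort with kind='stable' keeps equal values in appearance order
        pattern = tuple(np.argsort(window, kind='stable'))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float)
    p /= p.sum()
    H = -np.sum(p * np.log2(p))
    return H / np.log2(factorial(order)) if normalize else H

# Coarse rounding creates ties; the estimate typically drops below the
# near-1 value expected for white noise, suggesting spurious structure.
rng = np.random.default_rng(4)
x = rng.standard_normal(10000)
print(permutation_entropy(x, order=4))
print(permutation_entropy(np.round(x, 1), order=4))
```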

  7. Fuzzy central tendency measure for time series variability analysis with application to fatigue electromyography signals.

    Science.gov (United States)

    Xie, Hong-Bo; Dokos, Socrates

    2013-01-01

    A new method, namely fuzzy central tendency measure (fCTM) analysis, that could enable measurement of the variability of a time series, is presented in this study. Tests on simulated data sets show that fCTM is superior to the conventional central tendency measure (CTM) in several respects, including improved relative consistency and robustness to noise. The proposed fCTM method was applied to electromyograph (EMG) signals recorded during sustained isometric contraction for tracking local muscle fatigue. The results showed that the fCTM increased significantly during the development of muscle fatigue, and it was more sensitive to the fatigue phenomenon than mean frequency (MNF), the most commonly-used muscle fatigue indicator.

  8. Chaos analysis of the electrical signal time series evoked by acupuncture

    International Nuclear Information System (INIS)

    Wang Jiang; Sun Li; Fei Xiangyang; Zhu Bing

    2007-01-01

    This paper employs chaos theory to analyze the time series of electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between various acupuncture methods are discussed.

  9. Chaos analysis of the electrical signal time series evoked by acupuncture

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China)]. E-mail: jiangwang@tju.edu.cn; Sun Li [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Fei Xiangyang [School of Electrical Engineering, Tianjin University, Tianjin 300072 (China); Zhu Bing [Institute of Acupuncture and Moxibustion, China Academy of Traditional Chinese Medicine, Beijing 100700 (China)

    2007-08-15

    This paper employs chaos theory to analyze the time series of electrical signals evoked by different acupuncture methods applied to the Zusanli point. The phase space is reconstructed and the embedding parameters are obtained by the mutual information and Cao's methods. Subsequently, the largest Lyapunov exponent is calculated. From the analyses we can conclude that the time series are chaotic. In addition, differences between various acupuncture methods are discussed.
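
    The phase-space reconstruction step described here (delay embedding, with the delay chosen by the mutual information method) can be sketched as follows. This is a generic illustration on a noisy sine, not the acupuncture data, and it uses the global minimum of a scanned MI curve as a crude stand-in for the usual first-local-minimum rule:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct a phase-space trajectory by time-delay embedding."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def delayed_mutual_information(x, tau, bins=32):
    """Histogram estimate of the mutual information between x(t) and x(t+tau)."""
    a, b = x[:-tau], x[tau:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

# Noisy sine: scan delays, embed with the delay that minimizes MI over the range
rng = np.random.default_rng(5)
t = np.linspace(0, 200 * np.pi, 8000)
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)
mi = [delayed_mutual_information(x, tau) for tau in range(1, 60)]
tau_star = int(np.argmin(mi)) + 1           # roughly a quarter period, as expected
trajectory = delay_embed(x, dim=3, tau=tau_star)
print(tau_star, trajectory.shape)
```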

  10. Analysis of the signal transfer and folding in N-path filters with a series inductance

    NARCIS (Netherlands)

    Duipmans, L.; Struiksma, R.E.; Klumperink, E.A.M.; Nauta, B.; Vliet, F.E. van

    2015-01-01

    N-path filters exploiting switched-series-R-C networks can realize high-Q blocking-tolerant band-pass filters. Moreover, their center frequency is flexibly programmable by a digital clock. Unfortunately, the time variant nature of these circuits also results in unwanted signal folding. This paper

  11. Nonlinear time-series analysis of current signal in cathodic contact glow discharge electrolysis

    International Nuclear Information System (INIS)

    Allagui, Anis; Abdelkareem, Mohammad Ali; Rojas, Andrea Espinel; Bonny, Talal; Elwakil, Ahmed S.

    2016-01-01

    In the standard two-electrode configuration employed in electrolytic process, when the control dc voltage is brought to a critical value, the system undergoes a transition from conventional electrolysis to contact glow discharge electrolysis (CGDE), which has also been referred to as liquid-submerged micro-plasma, glow discharge plasma electrolysis, electrode effect, electrolytic plasma, etc. The light-emitting process is associated with the development of an irregular and erratic current time-series which has been arbitrarily labelled as “random,” and thus dissuaded further research in this direction. Here, we examine the current time-series signals measured in cathodic CGDE configuration in a concentrated KOH solution at different dc bias voltages greater than the critical voltage. We show that the signals are, in fact, not random according to the NIST SP. 800-22 test suite definition. We also demonstrate that post-processing low-pass filtered sequences requires less time than the native as-measured sequences, suggesting a superposition of low frequency chaotic fluctuations and high frequency behaviors (which may be produced by more than one possible source of entropy). Using an array of nonlinear time-series analyses for dynamical systems, i.e., the computation of largest Lyapunov exponents and correlation dimensions, and re-construction of phase portraits, we found that low-pass filtered datasets undergo a transition from quasi-periodic to chaotic to quasi-hyper-chaotic behavior, and back again to chaos when the voltage controlling-parameter is increased. The high frequency part of the signals is discussed in terms of highly nonlinear turbulent motion developed around the working electrode.

  12. Digital signal processing for the Johnson noise thermometry: a time series analysis of the Johnson noise

    International Nuclear Information System (INIS)

    Moon, Byung Soo; Hwang, In Koo; Chung, Chong Eun; Kwon, Kee Choon; David, E. H.; Kisner, R.A.

    2004-06-01

    In this report, we first proved that a random signal obtained by taking the sum of a set of single-frequency signals generates a continuous Markov process. We used this random signal to simulate the Johnson noise and verified that Johnson noise thermometry can be used to improve the measurements of the reactor coolant temperature to an accuracy of better than 0.14%. Secondly, by using this random signal we determined the optimal sampling rate when the frequency band of the Johnson noise signal is given. We also describe the results of our examination of how good the linearity of the Johnson noise is and how large the relative error of the temperature could become as the temperature increases. Thirdly, the results of our analysis on a set of Johnson noise signal blocks taken from a simple electric circuit are described. We showed that the properties of the continuous Markov process are satisfied even when some channel noise is present. Finally, we describe the algorithm we devised to handle the problem of the time lag in the long-term average or the moving average in a transient state. The algorithm is based on the Haar wavelet and estimates the transient temperature with a much smaller time delay. We have shown that the algorithm can track the transient temperature successfully.
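
    The physical basis of Johnson noise thermometry is the Nyquist relation <V²> = 4·k_B·T·R·Δf, so temperature follows from the measured noise variance once the resistance and bandwidth are known. A toy sketch, assuming ideal conditions and using a crude sum-of-sinusoids surrogate loosely inspired by the report's construction (none of the numbers or the surrogate details are from the report):

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def johnson_noise_temperature(v_rms, resistance, bandwidth):
    """Temperature from the Nyquist relation <V^2> = 4 k_B T R (delta f)."""
    return v_rms ** 2 / (4 * K_B * resistance * bandwidth)

def synthetic_johnson_noise(temperature, resistance, bandwidth, n_samples, rng):
    """Toy surrogate: a sum of equal-amplitude single-frequency components with
    random phases, rescaled to the Johnson-noise RMS voltage."""
    v_rms = np.sqrt(4 * K_B * temperature * resistance * bandwidth)
    freqs = np.linspace(1.0, bandwidth, 100)
    t = np.arange(n_samples) / (4.0 * bandwidth)          # sample at 4x the bandwidth
    phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    x = np.sum(np.sin(2 * np.pi * freqs[:, None] * t[None, :] + phases[:, None]), axis=0)
    return x / np.std(x) * v_rms

rng = np.random.default_rng(6)
v = synthetic_johnson_noise(300.0, 1e3, 1e5, 50_000, rng)
print(johnson_noise_temperature(np.std(v), 1e3, 1e5))      # ~300 K by construction
```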

  13. Parametric time series analysis of geoelectrical signals: an application to earthquake forecasting in Southern Italy

    Directory of Open Access Journals (Sweden)

    V. Tramutoli

    1996-06-01

    Full Text Available An autoregressive model was selected to describe geoelectrical time series. An objective technique was subsequently applied to analyze and discriminate values above (below) an a priori fixed threshold possibly related to seismic events. A complete check of the model and the main guidelines to estimate the occurrence probability of extreme events are reported. A first application of the proposed technique is discussed through the analysis of the experimental data recorded by an automatic station located in Tito, a small town on the Apennine chain in Southern Italy. This region was hit by the November 1980 Irpinia-Basilicata earthquake and it is one of the most active areas of the Mediterranean region. After a preliminary filtering procedure to reduce the influence of external parameters (i.e., the meteo-climatic effects), it was demonstrated that the geoelectrical residual time series are well described by means of a second order autoregressive model. Our findings outline a statistical methodology to evaluate the efficiency of electrical seismic precursors.

  14. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate … commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may … choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  15. Multiscale Signal Analysis and Modeling

    CERN Document Server

    Zayed, Ahmed

    2013-01-01

    Multiscale Signal Analysis and Modeling presents recent advances in multiscale analysis and modeling using wavelets and other systems. This book also presents applications in digital signal processing using sampling theory and techniques from various function spaces, filter design, feature extraction and classification, signal and image representation/transmission, coding, nonparametric statistical signal processing, and statistical learning theory. This book also: Discusses recently developed signal modeling techniques, such as the multiscale method for complex time series modeling, multiscale positive density estimations, Bayesian Shrinkage Strategies, and algorithms for data adaptive statistics Introduces new sampling algorithms for multidimensional signal processing Provides comprehensive coverage of wavelets with presentations on waveform design and modeling, wavelet analysis of ECG signals and wavelet filters Reviews features extraction and classification algorithms for multiscale signal and image proce...

  16. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  17. A new method based on fractal variance function for analysis and quantification of sympathetic and vagal activity in variability of R-R time series in ECG signals

    Energy Technology Data Exchange (ETDEWEB)

    Conte, Elio [Department of Pharmacology and Human Physiology and Tires, Center for Innovative Technologies for Signal Detection and Processing, University of Bari, Bari (Italy); School of Advanced International Studies on Nuclear, Theoretical and Nonlinear Methodologies-Bari (Italy)], E-mail: fisio2@fisiol.uniba.it; Federici, Antonio [Department of Pharmacology and Human Physiology and Tires, Center for Innovative Technologies for Signal Detection and Processing, University of Bari, Bari (Italy); Zbilut, Joseph P. [Department of Molecular Biophysics and Physiology, Rush University Medical Center, 1653W Congress, Chicago, IL 60612 (United States)

    2009-08-15

    It is known that R-R time series calculated from a recorded ECG are strongly correlated to sympathetic and vagal regulation of the sinus pacemaker activity. In human physiology it is a crucial question to estimate such components with accuracy. Fourier analysis still dominates the analysis of such data to this day, ignoring that the FFT is valid under crucial restrictions, such as linearity and stationarity, that are largely violated in R-R time series data. In order to go beyond such an approach, we introduce a new method, called CZF. It is based on variogram analysis. It arises from a profound link with Recurrence Quantification Analysis, which is a basic tool for the investigation of nonlinear and non-stationary time series. Therefore, a relevant feature of the method is that it may also be applied to nonlinear and non-stationary time series. In addition, the method also enables analysis of the fractal variance function, the Generalized Fractal Dimension and, finally, the relative probability density function of the data. The CZF gives very satisfactory results. In the present paper it has been applied to direct experimental cases of normal subjects, patients with hypertension before and after therapy and in children under some different conditions of experimentation.

  18. A new method based on fractal variance function for analysis and quantification of sympathetic and vagal activity in variability of R-R time series in ECG signals

    International Nuclear Information System (INIS)

    Conte, Elio; Federici, Antonio; Zbilut, Joseph P.

    2009-01-01

    It is known that R-R time series calculated from a recorded ECG are strongly correlated to sympathetic and vagal regulation of the sinus pacemaker activity. In human physiology it is a crucial question to estimate such components with accuracy. Fourier analysis still dominates the analysis of such data to this day, ignoring that the FFT is valid under crucial restrictions, such as linearity and stationarity, that are largely violated in R-R time series data. In order to go beyond such an approach, we introduce a new method, called CZF. It is based on variogram analysis. It arises from a profound link with Recurrence Quantification Analysis, which is a basic tool for the investigation of nonlinear and non-stationary time series. Therefore, a relevant feature of the method is that it may also be applied to nonlinear and non-stationary time series. In addition, the method also enables analysis of the fractal variance function, the Generalized Fractal Dimension and, finally, the relative probability density function of the data. The CZF gives very satisfactory results. In the present paper it has been applied to direct experimental cases of normal subjects, patients with hypertension before and after therapy and in children under some different conditions of experimentation.
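
    The variogram on which the CZF method is built is itself a simple quantity, γ(h) = ½·mean[(x(t+h) − x(t))²]. A minimal empirical-variogram sketch on a synthetic R-R-like series (illustrative only; the CZF fractal variance function and its link to Recurrence Quantification Analysis are not reproduced here):

```python
import numpy as np

def empirical_variogram(x, max_lag):
    """Empirical variogram gamma(h) = 0.5 * mean((x[t+h] - x[t])^2)."""
    x = np.asarray(x, dtype=float)
    lags = np.arange(1, max_lag + 1)
    gamma = np.array([0.5 * np.mean((x[h:] - x[:-h]) ** 2) for h in lags])
    return lags, gamma

# Toy R-R-like series: a slow oscillation plus beat-to-beat noise (seconds)
rng = np.random.default_rng(13)
n = 2000
rr = 0.8 + 0.05 * np.sin(2 * np.pi * np.arange(n) / 40.0) + 0.02 * rng.standard_normal(n)
lags, gamma = empirical_variogram(rr, max_lag=100)
print(gamma[:3].round(5), gamma[40:43].round(5))   # variance grows with lag, then saturates
```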

  19. Methods for removal of unwanted signals from gravity time-series: Comparison using linear techniques complemented with analysis of system dynamics

    Science.gov (United States)

    Valencio, Arthur; Grebogi, Celso; Baptista, Murilo S.

    2017-10-01

    The presence of undesirable dominating signals in geophysical experimental data is a challenge in many subfields. One remarkable example is surface gravimetry, where frequencies from Earth tides correspond to time-series fluctuations up to a thousand times larger than the phenomena of major interest, such as hydrological gravity effects or co-seismic gravity changes. This work discusses general methods for the removal of unwanted dominating signals by applying them to 8 long-period gravity time-series of the International Geodynamics and Earth Tides Service, equivalent to the acquisition from 8 instruments in 5 locations representative of the network. We compare three different conceptual approaches for tide removal: frequency filtering, physical modelling, and data-based modelling. Each approach reveals a different limitation to be considered depending on the intended application. Vestiges of tides remain in the residues for the modelling procedures, whereas the signal was distorted in different ways by the filtering and data-based procedures. The linear techniques employed were power spectral density, spectrogram, cross-correlation, and classical harmonics decomposition, while the system dynamics was analysed by state-space reconstruction and estimation of the largest Lyapunov exponent. Although the tides could not be completely eliminated, they were sufficiently reduced to allow observation of geophysical events of interest above the 10 nm s⁻² level, exemplified by a hydrology-related event of 60 nm s⁻². The implementations adopted for each conceptual approach are general, so that their principles could be applied to other kinds of data affected by undesired signals composed mainly by periodic or quasi-periodic components.
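
    Of the three approaches compared, frequency filtering is the easiest to sketch. The fragment below applies a zero-phase Butterworth band-stop around the diurnal/semidiurnal band to a synthetic 1-minute series; all numbers (sampling, amplitudes, the 60 nm s⁻² pulse) are illustrative, and the distortion of in-band signal content is exactly the limitation the paper notes for this approach:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def remove_band(x, fs, f_lo, f_hi, order=2):
    """Zero-phase Butterworth band-stop filter (the 'frequency filtering' idea)."""
    sos = butter(order, [f_lo, f_hi], btype='bandstop', fs=fs, output='sos')
    return sosfiltfilt(sos, x)

# 30 days of 1-minute samples: diurnal + semidiurnal "tides" plus a small,
# hydrology-like 60-unit pulse hidden underneath (all values hypothetical).
fs = 1.0 / 60.0                              # Hz, one sample per minute
t = np.arange(0.0, 30 * 86400.0, 60.0)       # seconds
cpd = 1.0 / 86400.0                          # 1 cycle per day, in Hz
tides = 800.0 * np.sin(2 * np.pi * 1.0 * cpd * t) + 400.0 * np.sin(2 * np.pi * 1.93 * cpd * t)
event = 60.0 * np.exp(-((t - 15 * 86400.0) / 43200.0) ** 2)
x = tides + event + 5.0 * np.random.default_rng(7).standard_normal(t.size)

residual = remove_band(x, fs, 0.8 * cpd, 2.2 * cpd)
print(x.std().round(1), residual.std().round(1))   # tides dominate before filtering, not after
```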

  20. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and the frequency time laws of a certain kind of geophysical signals is described here. This method is based upon the analytical signal idea of Gabor and Ville, which is constructed either in the time domain by adding an imaginary part to the real signal (in-quadrature signal), or in the frequency domain by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytical signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to the examination of the principle of the method and to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the analysis system properties than the usual representation, and which moreover has the advantage of being obtainable practically in real time [fr]
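
    In discrete time, the analytic signal described here is readily obtained with a Hilbert transform, after which the envelope and instantaneous frequency follow directly. A minimal sketch using SciPy (a generic illustration on a chirp, not the magnetospheric signals of the paper):

```python
import numpy as np
from scipy.signal import hilbert

# A chirp whose instantaneous frequency rises from 5 Hz to 15 Hz over 10 s,
# with a slow amplitude modulation playing the role of the envelope
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
phase = 2 * np.pi * (5 * t + 0.5 * t ** 2)        # instantaneous frequency = 5 + t Hz
x = (1 + 0.3 * np.sin(2 * np.pi * 0.2 * t)) * np.cos(phase)

z = hilbert(x)                                    # analytic signal: x + j*H{x}
envelope = np.abs(z)                              # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)   # d(phase)/dt

print(inst_freq[100].round(1), inst_freq[-100].round(1))   # ~5.1 Hz early, ~14.9 Hz late
```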

  1. Signal flow analysis

    CERN Document Server

    Abrahams, J R; Hiller, N

    1965-01-01

    Signal Flow Analysis provides information pertinent to the fundamental aspects of signal flow analysis. This book discusses the basic theory of signal flow graphs and shows their relation to the usual algebraic equations.Organized into seven chapters, this book begins with an overview of properties of a flow graph. This text then demonstrates how flow graphs can be applied to a wide range of electrical circuits that do not involve amplification. Other chapters deal with the parameters as well as circuit applications of transistors. This book discusses as well the variety of circuits using ther

  2. Signal Processing for Time-Series Functions on a Graph

    Science.gov (United States)

    2018-02-01

    US Army Research Laboratory technical report ARL-TR-8276, February 2018: Signal Processing for Time-Series Functions on a Graph, by Humberto Muñoz-Barona, Jean Vettel, and … Approved for public release; distribution is unlimited. Fig. 1: Time-series function on a fixed graph.

  3. Biomedical signal analysis

    CERN Document Server

    Rangayyan, Rangaraj M

    2015-01-01

    The book assists the reader in developing techniques for the analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications. 800 mathematical expressions and equations. Practical questions, problems and laboratory exercises. Includes fractals and chaos theory with biomedical applications.

  4. Thematic minireview series: cell biology of G protein signaling.

    Science.gov (United States)

    Dohlman, Henrik G

    2015-03-13

    This thematic series is on the topic of cell signaling from a cell biology perspective, with a particular focus on G proteins. G protein-coupled receptors (GPCRs, also known as seven-transmembrane receptors) are typically found at the cell surface. Upon agonist binding, these receptors will activate a GTP-binding G protein at the cytoplasmic face of the plasma membrane. Additionally, there is growing evidence that G proteins can also be activated by non-receptor binding partners, and they can signal from non-plasma membrane compartments. The production of second messengers at multiple, spatially distinct locations represents a type of signal encoding that has been largely neglected. The first minireview in the series describes biosensors that are being used to monitor G protein signaling events in live cells. The second describes the implementation of antibody-based biosensors to dissect endosome signaling by G proteins and their receptors. The third describes the function of a non-receptor, cytoplasmic activator of G protein signaling, called GIV (Girdin). Collectively, the advances described in these articles provide a deeper understanding and emerging opportunities for new pharmacology. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. Traffic dispersion through a series of signals with irregular split

    Science.gov (United States)

    Nagatani, Takashi

    2016-01-01

    We study the traffic behavior of a group of vehicles moving through a sequence of signals with irregular splits on a roadway. We present a stochastic model of vehicular traffic controlled by signals. The dynamic behavior of vehicular traffic is clarified by analyzing the traffic pattern and travel time numerically. The group of vehicles breaks up more and more because of the irregularity of the signals' splits. The traffic dispersion is induced by the irregular split. We show that the traffic dispersion depends highly on the cycle time and the strength of the split's irregularity. We also study the traffic behavior through the series of signals under the green-wave strategy. The dependence of the travel time on offset time is derived for various values of cycle time. The region map of the traffic dispersion is shown in (cycle time, offset time)-space.

  6. A KST framework for correlation network construction from time series signals

    Science.gov (United States)

    Qi, Jin-Peng; Gu, Quan; Zhu, Ying; Zhang, Ping

    2018-04-01

    A KST (Kolmogorov-Smirnov test and T statistic) method is used for construction of a correlation network based on the fluctuation of each time series within the multivariate time signals. In this method, each time series is divided equally into multiple segments, and the maximal data fluctuation in each segment is calculated by a KST change detection procedure. Connections between each time series are derived from the data fluctuation matrix, and are used for construction of the fluctuation correlation network (FCN). The method was tested with synthetic simulations and the results were compared with those from using KS or T only for detection of data fluctuation. The novelty of this study is that the correlation analysis was based on the data fluctuation in each segment of each time series rather than on the original time signals, which would be more meaningful for many real world applications and for analysis of large-scale time signals where prior knowledge is uncertain.

  7. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.

  8. Small-signal model for the series resonant converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1985-01-01

    The results of a previous discrete-time model of the series resonant dc-dc converter are reviewed and from these a small signal dynamic model is derived. This model is valid for low frequencies and is based on the modulation of the diode conduction angle for control. The basic converter is modeled separately from its output filter to facilitate the use of these results for design purposes. Experimental results are presented.

  9. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  10. Two-dimensional signal analysis

    CERN Document Server

    Garello, René

    2010-01-01

    This title sets out to show that 2-D signal analysis has its own role to play alongside signal processing and image processing. Concentrating its coverage on those 2-D signals coming from physical sensors (such as radars and sonars), the discussion explores a 2-D spectral approach but develops the modeling of 2-D signals and proposes several data-oriented analysis techniques for dealing with them. Coverage is also given to potential future developments in this area.

  11. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher

    1989-01-01

    .... A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  12. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...

  13. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  14. Wavelet analysis for nonstationary signals

    International Nuclear Information System (INIS)

    Penha, Rosani Maria Libardi da

    1999-01-01

    Mechanical vibration signals play an important role in identifying anomalies resulting from equipment malfunctioning. Traditionally, Fourier spectral analysis is used where the signals are assumed to be stationary. However, occasional transient impulses and start-up processes are examples of nonstationary signals that can be found in mechanical vibrations. These signals can provide important information about the equipment condition, such as early fault detection. Fourier analysis cannot adequately be applied to nonstationary signals because the results provide data about the frequency composition averaged over the duration of the signal. In this work, two methods for nonstationary signal analysis are used: the Short Time Fourier Transform (STFT) and the wavelet transform. The STFT is a method of adapting Fourier spectral analysis for nonstationary application to the time-frequency domain. Its main limitation is having a single resolution throughout the entire time-frequency domain. The wavelet transform is a newer analysis technique suitable for nonstationary signals, which handles the STFT drawbacks by providing multi-resolution frequency analysis and time localization in a single time-scale graphic. The multiple frequency resolutions are obtained by scaling (dilation/compression) the wavelet function. A comparison of the conventional Fourier transform, STFT and wavelet transform is made by applying these techniques to simulated signals, a rotor rig vibration signal and a rotating machine vibration signal. A Hanning window was used for the STFT analysis. Daubechies and harmonic wavelets were used for the continuous, discrete and multi-resolution wavelet analyses. The results show that the Fourier analysis was not able to detect changes in the signal frequencies or discontinuities. The STFT analysis detected the changes in the signal frequencies, but with time-frequency resolution problems. The continuous and discrete wavelet transforms demonstrated to be highly efficient tools to detect
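
    The STFT part of this comparison is straightforward to reproduce. A small sketch with a Hann(ing) window on a signal whose frequency jumps halfway through, showing the time-frequency localization that plain Fourier analysis misses (illustrative parameters, not those of the thesis):

```python
import numpy as np
from scipy.signal import stft

# Nonstationary test signal: a 50 Hz tone that jumps to 120 Hz halfway through
fs = 1000.0
t = np.arange(0, 4, 1 / fs)
x = np.where(t < 2, np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 120 * t))

# Short Time Fourier Transform with a Hann window, as in the comparison above
f, tt, Z = stft(x, fs=fs, window='hann', nperseg=256, noverlap=192)
power = np.abs(Z) ** 2

# Dominant frequency per time slice: ~50 Hz early, ~120 Hz late
print(f[np.argmax(power[:, 2])], f[np.argmax(power[:, -3])])
```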

  15. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2012-09-30

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum of this operator for the analysis of the signal. We present some numerical examples and the first results obtained with this method on the analysis of arterial blood pressure waveforms. © 2012 Springer-Verlag London Limited.
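
    The core idea, treating a pulse-shaped signal as the potential of a Schrödinger operator and working with its discrete (negative) spectrum, can be sketched with a finite-difference discretization. This is only a rough illustration of the construction, with arbitrary grid and parameter choices; it is not the reconstruction formula or the arterial-pressure application of the paper:

```python
import numpy as np

def schrodinger_discrete_spectrum(y, dx, h=1.0):
    """Negative eigenvalues of -h^2 d^2/dx^2 - y(x), with the pulse-shaped signal
    y acting as a potential well (finite differences, Dirichlet boundaries)."""
    n = len(y)
    lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
           + np.diag(np.ones(n - 1), -1)) / dx ** 2
    H = -h ** 2 * lap - np.diag(np.asarray(y, dtype=float))
    ev = np.linalg.eigvalsh(H)
    return ev[ev < 0]

# Pulse-shaped test signal; fewer eigenvalues are retained as the
# semi-classical parameter h grows, which is how the representation is tuned
x = np.linspace(-10.0, 10.0, 400)
pulse = 3.0 * np.exp(-x ** 2 / 4.0)
dx = x[1] - x[0]
print(len(schrodinger_discrete_spectrum(pulse, dx, h=0.5)),
      len(schrodinger_discrete_spectrum(pulse, dx, h=2.0)))
```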

  16. Biological signals classification and analysis

    CERN Document Server

    Kiasaleh, Kamran

    2015-01-01

    This authored monograph presents key aspects of signal processing analysis in the biomedical arena. Unlike wireless communication systems, biological entities produce signals with underlying nonlinear, chaotic nature that elude classification using the standard signal processing techniques, which have been developed over the past several decades for dealing primarily with standard communication systems. This book separates what is random from that which appears to be random, and yet is truly deterministic with random appearance. At its core, this work gives the reader a perspective on biomedical signals and the means to classify and process such signals. In particular, a review of random processes along with means to assess the behavior of random signals is also provided. The book also includes a general discussion of biological signals in order to demonstrate the inefficacy of the well-known techniques to correctly extract meaningful information from such signals. Finally, a thorough discussion of recently ...

  17. Sensitivity of Hurst parameter estimation to periodic signals in time series and filtering approaches

    Science.gov (United States)

    Marković, D.; Koch, M.

    2005-09-01

    The influence of the periodic signals in time series on the Hurst parameter estimate is investigated with temporal, spectral and time-scale methods. The Hurst parameter estimates of the simulated periodic time series with a white noise background show a high sensitivity on the signal to noise ratio and for some methods, also on the data length used. The analysis is then carried on to the investigation of extreme monthly river flows of the Elbe River (Dresden) and of the Rhine River (Kaub). Effects of removing the periodic components employing different filtering approaches are discussed and it is shown that such procedures are a prerequisite for an unbiased estimation of H. In summary, our results imply that the first step in a time series long-correlation study should be the separation of the deterministic components from the stochastic ones. Otherwise wrong conclusions concerning possible memory effects may be drawn.
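
    One standard way to see the sensitivity described here is to estimate the Hurst exponent with detrended fluctuation analysis before and after adding a strong periodic component. A minimal order-1 DFA sketch (generic method, synthetic data; not the estimators or the river-flow series used in the paper):

```python
import numpy as np

def dfa_hurst(x, scales=None):
    """Hurst exponent via order-1 detrended fluctuation analysis.
    For stationary noise-like series the DFA exponent approximates H."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 20).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2) for seg in segs]
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

# White noise (H ~ 0.5), with and without a strong annual-like sine added
rng = np.random.default_rng(8)
noise = rng.standard_normal(4096)
periodic = 2.0 * np.sin(2 * np.pi * np.arange(4096) / 365.0)
print(dfa_hurst(noise))              # ~0.5
print(dfa_hurst(noise + periodic))   # typically biased upward by the periodic component
```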

  18. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  19. Signal analysis of ventricular fibrillation

    NARCIS (Netherlands)

    Herbschleb, J.N.; Heethaar, R.M.; Tweel, L.H. van der; Zimmerman, A.N.E.; Meijler, F.L.

    Signal analysis of electro(cardio)grams during ventricular fibrillation (VF) in dogs and human patients indicates more organization and regularity than the official WHO definition suggests. The majority of the signal is characterized by a power spectrum with narrow, equidistant peaks. In a further

  20. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Crépeau, Emmanuelle; Sorine, Michel

    2012-01-01

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum

  1. Analysis of series resonant converter with series-parallel connection

    Science.gov (United States)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter series-connected on the primary side and parallel-connected on the secondary side is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series. Thus, the two converters have the same primary currents to ensure that they can supply the balance load current. On the output side, two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.

  2. Signals and transforms in linear systems analysis

    CERN Document Server

    Wasylkiwskyj, Wasyl

    2013-01-01

    Signals and Transforms in Linear Systems Analysis covers the subject of signals and transforms, particularly in the context of linear systems theory. Chapter 2 provides the theoretical background for the remainder of the text. Chapter 3 treats Fourier series and integrals. Particular attention is paid to convergence properties at step discontinuities. This includes the Gibbs phenomenon and its amelioration via the Fejer summation techniques. Special topics include modulation and analytic signal representation, Fourier transforms and analytic function theory, time-frequency analysis and frequency dispersion. Fundamentals of linear system theory for LTI analogue systems, with a brief account of time-varying systems, are covered in Chapter 4 . Discrete systems are covered in Chapters 6 and 7.  The Laplace transform treatment in Chapter 5 relies heavily on analytic function theory as does Chapter 8 on Z -transforms. The necessary background on complex variables is provided in Appendix A. This book is intended to...

  3. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true … are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found … and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  4. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  5. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  6. Analog and digital signal analysis from basics to applications

    CERN Document Server

    Cohen Tenoudji, Frédéric

    2016-01-01

    This book provides comprehensive, graduate-level treatment of analog and digital signal analysis suitable for course use and self-guided learning. This expert text guides the reader from the basics of signal theory through a range of application tools for use in acoustic analysis, geophysics, and data compression. Each concept is introduced and explained step by step, and the necessary mathematical formulae are integrated in an accessible and intuitive way. The first part of the book explores how analog systems and signals form the basics of signal analysis. This section covers Fourier series and integral transforms of analog signals, Laplace and Hilbert transforms, the main analog filter classes, and signal modulations. Part II covers digital signals, demonstrating their key advantages. It presents z and Fourier transforms, digital filtering, inverse filters, deconvolution, and parametric modeling for deterministic signals. Wavelet decomposition and reconstruction of non-stationary signals are also discussed...

  7. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in the recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.

  8. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in the recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.
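
    The mapping underlying this record is the natural visibility graph: two time points are linked when the straight line between them clears every intermediate sample. A naive O(n²) sketch (illustrative; real applications use faster constructions and the network-of-networks layering described above):

```python
import numpy as np

def visibility_graph_edges(x):
    """Natural visibility graph: nodes are time points, and an edge (i, j) exists
    when every intermediate sample lies strictly below the line joining
    (i, x[i]) and (j, x[j])."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            line = x[i] + (x[j] - x[i]) * (k - i) / (j - i)
            if k.size == 0 or np.all(x[k] < line):
                edges.append((i, j))
    return edges

# Short random-walk segment: the degree sequence reflects the series' roughness
rng = np.random.default_rng(9)
series = np.cumsum(rng.standard_normal(200))
edges = visibility_graph_edges(series)
degree = np.bincount(np.array(edges).ravel(), minlength=200)
print(len(edges), degree.mean())
```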

  9. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
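
    The pipeline described above starts from a symbolic aggregate approXimation (SAX) of each series and then counts transitions between symbols. A much-simplified sketch of that front end, counting one-step transitions only (the icon rendering and the group-level bag-of-patterns alignment are not reproduced; parameters are illustrative):

```python
import numpy as np
from scipy.stats import norm

def sax_symbols(x, n_segments=32, alphabet_size=4):
    """Symbolic Aggregate approXimation: z-normalize, piecewise-average,
    then discretize against equiprobable Gaussian breakpoints."""
    x = np.asarray(x, dtype=float)
    z = (x - x.mean()) / (x.std() + 1e-12)
    paa = z[:len(z) // n_segments * n_segments].reshape(n_segments, -1).mean(axis=1)
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.digitize(paa, breakpoints)

def transition_frequencies(symbols, alphabet_size=4):
    """Normalized frequencies of one-step symbol transitions (a simplified
    stand-in for the per-group 'bag of patterns' described above)."""
    counts = np.zeros((alphabet_size, alphabet_size))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum()

rng = np.random.default_rng(10)
x = np.cumsum(rng.standard_normal(1024))          # a toy "activity count" trace
syms = sax_symbols(x)
print(transition_frequencies(syms).round(2))
```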

  10. Allan deviation analysis of financial return series

    Science.gov (United States)

    Hernández-Pérez, R.

    2012-05-01

    We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been demonstrated to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening-price series for assets from different markets during a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for absolute return series at short scales (the first one or two decades) decrease following approximately a scaling relation up to a point that is different for almost each asset, after which the ADEV deviates from scaling, which suggests that the presence of clustering, long-range dependence and non-stationarity signatures in the series drive the results for large observation intervals.
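
    The non-overlapping Allan deviation itself is a short computation: average the series in blocks of length tau and take half the mean squared difference of consecutive block means. A sketch verifying the tau^(-1/2) behaviour expected for an uncorrelated series (synthetic data, not the asset returns analysed in the paper):

```python
import numpy as np

def allan_deviation(x, taus):
    """Non-overlapping Allan deviation for a set of averaging lengths (in samples):
    sigma^2(tau) = 0.5 * <(ybar_{k+1} - ybar_k)^2>."""
    x = np.asarray(x, dtype=float)
    out = []
    for m in taus:
        n_blocks = len(x) // m
        means = x[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        out.append(np.sqrt(0.5 * np.mean(np.diff(means) ** 2)))
    return np.array(out)

# White noise: the ADEV falls roughly as tau^(-1/2), as for an uncorrelated series
rng = np.random.default_rng(11)
returns = rng.standard_normal(2 ** 14)
taus = np.array([1, 2, 4, 8, 16, 32, 64, 128])
adev = allan_deviation(returns, taus)
slope = np.polyfit(np.log(taus), np.log(adev), 1)[0]
print(round(slope, 2))    # close to -0.5
```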

  11. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  12. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back-muscle fatigue over one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
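
    A sketch of tracking the Shannon entropy of an EMG-like signal over time. The growing-window scheme and the 1 kHz sampling rate mirror the description above; the bin count and the surrogate signal are assumptions.

```python
# Shannon entropy of an EMG-like signal computed on growing time windows.
import numpy as np

def shannon_entropy(x, bins=32):
    hist, _ = np.histogram(x, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -(p * np.log2(p)).sum()

def entropy_vs_time(signal, fs=1000, step_s=0.5):
    """Entropy of the signal from t = 0 up to each successive time point."""
    step = int(step_s * fs)
    times, entropies = [], []
    for end in range(step, len(signal) + 1, step):
        times.append(end / fs)
        entropies.append(shannon_entropy(signal[:end]))
    return np.array(times), np.array(entropies)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Surrogate 60 s EMG record sampled at 1 kHz (60,000 samples).
    emg = rng.standard_normal(60_000) * (1 + 0.5 * np.sin(np.arange(60_000) / 5000))
    t, h = entropy_vs_time(emg)
    print("entropy at 1 s, 10 s, 60 s:", h[1], h[19], h[-1])
```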

  13. Signal and image multiresolution analysis

    CERN Document Server

    Ouahabi, Abdeldjalil

    2012-01-01

    Multiresolution analysis using the wavelet transform has received considerable attention in recent years by researchers in various fields. It is a powerful tool for efficiently representing signals and images at multiple levels of detail with many inherent advantages, including compression, level-of-detail display, progressive transmission, level-of-detail editing, filtering, modeling, fractals and multifractals, etc.This book aims to provide a simple formalization and new clarity on multiresolution analysis, rendering accessible obscure techniques, and merging, unifying or completing

  14. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  15. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  16. Signal analysis for failure detection

    International Nuclear Information System (INIS)

    Parpaglione, M.C.; Perez, L.V.; Rubio, D.A.; Czibener, D.; D'Attellis, C.E.; Brudny, P.I.; Ruzzante, J.E.

    1994-01-01

    Several methods for the analysis of acoustic emission signals are presented. They are mainly oriented to the detection of changes in noisy signals and the characterization of higher-amplitude discrete pulses or bursts. The aim was to relate changes and events to failure, cracking or wear in materials, the final goal being to obtain automatic means of detecting such changes and/or events. Performance evaluation was made using both simulated and laboratory test signals. The methods presented are the following: 1. Application of the Hopfield Neural Network (NN) model for classifying faults in pipes and detecting wear of a bearing. 2. Application of the Kohonen and Back Propagation Neural Network models for the same problem. 3. Application of Kalman filtering to determine the time occurrence of bursts. 4. Application of a bank of Kalman filters (KF) for failure detection in pipes. 5. Study of the amplitude distribution of signals for detecting changes in their shape. 6. Application of the entropy distance to measure differences between signals. (author). 10 refs, 11 figs

  17. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e., flank wear under 40 µm) can be monitored, and also that the optimal tool exchange time and the tool wear state for actual turning machining can be judged from this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
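
    A hedged sketch of the residual-based diagnostic feature: fit an autoregressive model to vibration data from a sharp tool, then track the RMS of the one-step prediction residual on new data. The AR order, the least-squares fitting and the surrogate signals are assumptions, not the paper's exact model.

```python
# One-step AR prediction residual as a tool-wear monitoring feature.
import numpy as np

def fit_ar(x, order=8):
    """Least-squares AR(order) coefficients for one-step prediction."""
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def residual_rms(x, coeffs):
    """RMS of the one-step prediction residual of x under the AR model."""
    order = len(coeffs)
    X = np.column_stack([x[order - k - 1:len(x) - k - 1] for k in range(order)])
    return np.sqrt(np.mean((x[order:] - X @ coeffs) ** 2))

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    sharp = np.sin(np.arange(5000) * 0.2) + 0.1 * rng.standard_normal(5000)
    worn = sharp + 0.3 * np.abs(rng.standard_normal(5000))  # extra nonlinear content
    coeffs = fit_ar(sharp)
    print("residual RMS, sharp tool:", residual_rms(sharp, coeffs))
    print("residual RMS, worn tool :", residual_rms(worn, coeffs))
```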

  18. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with time series analysis based on neural networks for effective forex-market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system behaviour based on them.

  19. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  20. A Taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)

  1. Time series analysis of barometric pressure data

    International Nuclear Information System (INIS)

    La Rocca, Paola; Riggi, Francesco; Riggi, Daniele

    2010-01-01

    Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.

  2. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  3. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  4. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator is often faced with highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Non Linear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic non linear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by non linear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance Chaos theory). However, mathematics used in NLTS is much simpler than many other subjec...

  5. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  6. Signal Processing for Nondifferentiable Data Defined on Cantor Sets: A Local Fractional Fourier Series Approach

    Directory of Open Access Journals (Sweden)

    Zhi-Yong Chen

    2014-01-01

    Full Text Available From the signal processing point of view, the nondifferentiable data defined on the Cantor sets are investigated in this paper. The local fractional Fourier series is used to process the signals, which are the local fractional continuous functions. Our results can be observed as significant extensions of the previously known results for the Fourier series in the framework of the local fractional calculus. Some examples are given to illustrate the efficiency and implementation of the present method.

  7. Mathematical principles of signal processing Fourier and wavelet analysis

    CERN Document Server

    Brémaud, Pierre

    2002-01-01

    Fourier analysis is one of the most useful tools in many applied sciences. The recent developments of wavelet analysis indicates that in spite of its long history and well-established applications, the field is still one of active research. This text bridges the gap between engineering and mathematics, providing a rigorously mathematical introduction of Fourier analysis, wavelet analysis and related mathematical methods, while emphasizing their uses in signal processing and other applications in communications engineering. The interplay between Fourier series and Fourier transforms is at the heart of signal processing, which is couched most naturally in terms of the Dirac delta function and Lebesgue integrals. The exposition is organized into four parts. The first is a discussion of one-dimensional Fourier theory, including the classical results on convergence and the Poisson sum formula. The second part is devoted to the mathematical foundations of signal processing - sampling, filtering, digital signal proc...

  8. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    Full Text Available This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert time series of the digital signal into a string of (spatially) discrete symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
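
    An illustrative, simplified pipeline for the symbolization step: an equal-frequency (maximum-entropy) partition of the signal space followed by estimation of a one-step PFSA (Markov) transition matrix over the symbols. The reported method selects the partition by a mutual-information criterion; the quantile-based partition and the alphabet size used here are assumptions for the sketch.

```python
# Symbolize a signal by quantile partitioning and estimate a PFSA
# (one-step symbol transition matrix) from the resulting symbol string.
import numpy as np

def symbolize(x, alphabet_size=6):
    """Equal-frequency partition of the signal range into symbols 0..A-1."""
    edges = np.quantile(x, np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.searchsorted(edges, x)

def pfsa_transition_matrix(symbols, alphabet_size=6):
    counts = np.full((alphabet_size, alphabet_size), 1e-9)   # avoid empty rows
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    signal = np.sin(np.arange(4000) * 0.05) + 0.2 * rng.standard_normal(4000)
    syms = symbolize(signal)
    P = pfsa_transition_matrix(syms)
    print("transition matrix (rows sum to 1):\n", P.round(2))
```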

  9. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  10. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
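
    A heavily hedged sketch of the pipeline: sliding-window point clouds from a multidimensional return series, 1-dimensional persistence diagrams, a persistence-landscape function, and its L^p norm per window. It assumes the third-party `ripser` package is available for the persistence computation; the window size, the landscape resolution and p are illustrative choices, not the paper's settings.

```python
# Sliding-window TDA: persistence diagrams -> landscape -> L^p norm per window.
import numpy as np
from ripser import ripser   # assumed dependency (pip install ripser)

def landscape_lp_norm(diagram, k=1, p=2, grid_size=200):
    """L^p norm of the k-th persistence landscape of a (birth, death) diagram."""
    diagram = diagram[np.isfinite(diagram).all(axis=1)]
    if len(diagram) == 0:
        return 0.0
    grid = np.linspace(diagram.min(), diagram.max(), grid_size)
    # Tent functions min(t - birth, death - t)_+ for every diagram point.
    tents = np.maximum(
        np.minimum(grid[None, :] - diagram[:, [0]], diagram[:, [1]] - grid[None, :]),
        0.0,
    )
    lam_k = np.sort(tents, axis=0)[-k, :] if len(diagram) >= k else np.zeros(grid_size)
    return (np.trapz(lam_k ** p, grid)) ** (1.0 / p)

def rolling_norms(returns, window=50, p=2):
    """returns: array of shape (T, n_assets); one landscape norm per window."""
    norms = []
    for start in range(len(returns) - window + 1):
        cloud = returns[start:start + window]
        dgm1 = ripser(cloud, maxdim=1)['dgms'][1]      # 1-dimensional diagram
        norms.append(landscape_lp_norm(dgm1, p=p))
    return np.array(norms)

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    fake_returns = 0.01 * rng.standard_normal((300, 4))   # 4 toy index return series
    print(rolling_norms(fake_returns)[:5])
```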

  11. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard forecast model of time series, try to predict the properties of a temporal network at a later time instance. To this end, we consider eight properties such as number of active nodes, average degree, clustering coefficient etc., and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on an average across all datasets. As an application we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
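
    A minimal sketch of the forecasting step: treat one network property (e.g., number of active nodes per snapshot) as a time series and predict its next values with an ARIMA model. The (1, 1, 1) order, the synthetic property series, the 20% error criterion and statsmodels as the fitting library are assumptions for illustration.

```python
# Forecast a temporal-network property series with an ARIMA model.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA   # assumed dependency

rng = np.random.default_rng(7)
# Toy "active nodes per snapshot" series with memory plus slow oscillation.
active_nodes = (100 + np.cumsum(0.2 * rng.standard_normal(200))
                + 5 * np.sin(np.arange(200) / 10))

train, test = active_nodes[:180], active_nodes[180:]
model = ARIMA(train, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=len(test))

relative_error = np.abs(forecast - test) / np.abs(test)
print("share of forecasts within 20% error:", np.mean(relative_error <= 0.20))
```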

  12. Sentiment analysis for PTSD signals

    CERN Document Server

    Kagan, Vadim; Sapounas, Demetrios

    2013-01-01

    This book describes a computational framework for real-time detection of psychological signals related to Post-Traumatic Stress Disorder (PTSD) in online text-based posts, including blogs and web forums. Further, it explores how emerging computational techniques such as sentiment mining can be used in real-time to identify posts that contain PTSD-related signals, flag those posts, and bring them to the attention of psychologists, thus providing an automated flag and referral capability.

  13. Correcting orbital drift signal in the time series of AVHRR derived convective cloud fraction using rotated empirical orthogonal function

    Directory of Open Access Journals (Sweden)

    A. Devasthale

    2012-02-01

    Full Text Available The Advanced Very High Resolution Radiometer (AVHRR) instruments onboard the series of National Oceanic and Atmospheric Administration (NOAA) satellites offer the longest available meteorological data records from space. These satellites have drifted in orbit, resulting in shifts in the local time sampling during the life span of the sensors onboard. Depending upon the amplitude of the diurnal cycle of the geophysical parameters derived, orbital drift may cause spurious trends in their time series. We investigate tropical deep convective clouds, which show pronounced diurnal cycle amplitude, to estimate an upper bound of the impact of orbital drift on their time series. We carry out a rotated empirical orthogonal function analysis (REOF) and show that the REOFs are useful in delineating the orbital drift signal and, more importantly, in subtracting this signal from the time series of convective cloud amount. These results will help facilitate the derivation of homogenized data series of cloud amount from NOAA satellite sensors and ultimately analyzing trends from them. However, we suggest detailed comparison of various methods and rigorous testing thereof before applying final orbital drift corrections.
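
    A hedged sketch of a rotated EOF analysis: EOFs from an SVD of the anomaly field followed by a varimax rotation of the leading modes. The synthetic field, the number of retained modes and the standard varimax recipe are assumptions; identifying which rotated mode carries the orbital-drift signal (and subtracting it) is the application-specific step not shown here.

```python
# Rotated EOF (REOF) analysis: SVD-based EOFs plus a varimax rotation.
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of an (n_features, n_modes) loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    var_old = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - (gamma / p) * rotated @ np.diag(np.sum(rotated ** 2, axis=0)))
        )
        rotation = u @ vt
        if s.sum() < var_old * (1 + tol):
            break
        var_old = s.sum()
    return loadings @ rotation

def reof(field, n_modes=4):
    """field: (time, space) matrix -> rotated EOF loadings and their PC series."""
    anomalies = field - field.mean(axis=0)
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = vt[:n_modes].T * s[:n_modes]          # loading-scaled EOFs
    reofs = varimax(eofs)
    pcs = anomalies @ reofs / np.sum(reofs ** 2, axis=0)
    return reofs, pcs

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    field = rng.standard_normal((240, 50))       # e.g. 20 years of monthly maps
    loadings, pcs = reof(field)
    print("rotated EOF loadings:", loadings.shape, "PC series:", pcs.shape)
```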

  14. Amplitude calibration of an acoustic backscattered signal from a bottom-moored ADCP based on long-term measurement series

    Science.gov (United States)

    Piotukh, V. B.; Zatsepin, A. G.; Kuklev, S. B.

    2017-05-01

    A possible approach to, and preliminary results of, amplitude calibration of acoustic signals backscattered from an ADCP moored at the bottom of the near-shelf zone of the Black Sea are considered. The aim of this work is to obtain vertical profiles of acoustic scattering signal levels, showing the real characteristics of the volume content of suspended sediments in sea water in units of conventional acoustic turbidity for a given signal frequency. In this case, the assumption about the intervals of maximum acoustic transparency and vertical homogeneity of the marine environment in long-term series of ADCP measurements is used. According to this hypothesis, the intervals of the least values of acoustic backscattered signals are detected, an empirical transfer function of the ADCP reception path is constructed, and it is calibrated. Normalized sets of acoustic backscattered signals relative to a signal from a level of conventionally clear water are obtained. New features in the behavior of vertical profiles of an acoustic echo-signal are revealed due to the calibration. The results of this work will be used in subsequent analysis of the vertical and time variations in suspended sediment content in the near-shelf zone of the Black Sea.

  15. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

    We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis in this paper. Typical dynamic systems including a logistic map and a Duffing model are investigated. Moreover, the influence of Gaussian random noise on both the DFA and LZC is analyzed. The results show a high correlation between the DFA and LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained
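
    A sketch of the two measures being compared above: a basic DFA scaling exponent α and a normalized Lempel-Ziv complexity C. The window sizes, the median-based binarization, the simplified dictionary-style LZ parse and the test signals are assumptions made for illustration.

```python
# Detrended fluctuation analysis exponent and a normalized Lempel-Ziv complexity.
import numpy as np

def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
    y = np.cumsum(x - np.mean(x))                       # integrated profile
    flucts = []
    for s in scales:
        f2 = []
        for i in range(len(y) // s):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrending
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

def lempel_ziv_complexity(x):
    """Normalized complexity of the median-binarized sequence (simplified parse)."""
    s = ''.join('1' if v > np.median(x) else '0' for v in x)
    i, c, phrases = 0, 0, set()
    while i < len(s):
        j = i + 1
        while j <= len(s) and s[i:j] in phrases:        # extend until a new phrase
            j += 1
        phrases.add(s[i:j])
        c += 1
        i = j
    return c * np.log2(len(s)) / len(s)

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    noise = rng.standard_normal(4096)
    logistic = np.empty(4096)
    logistic[0] = 0.4
    for k in range(1, 4096):
        logistic[k] = 4.0 * logistic[k - 1] * (1 - logistic[k - 1])
    for name, sig in [("white noise", noise), ("logistic map", logistic)]:
        print(f"{name:12s}  alpha = {dfa_alpha(sig):.2f}  "
              f"C = {lempel_ziv_complexity(sig):.2f}")
```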

  16. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  17. DIY Solar Market Analysis Webinar Series: Solar Resource and Technical

    Science.gov (United States)

    As part of the Do-It-Yourself (DIY) Solar Market Analysis webinar series for state, local, and tribal governments (NREL), this session on solar resource and technical potential was held on Wednesday, June 11, 2014.

  18. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  19. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  20. Complex motion of a vehicle through a series of signals controlled by power-law phase

    Science.gov (United States)

    Nagatani, Takashi

    2017-07-01

    We study the dynamic motion of a vehicle moving through a series of traffic signals controlled by a position-dependent phase that follows a power law. All signals are controlled by both the cycle time and the position-dependent phase. The dynamic model of the vehicular motion is described in terms of a nonlinear map. The vehicular motion varies in a complex manner as the cycle time is varied, for various values of the power of the position-dependent phase. The vehicle displays periodic motion with a long cycle for an integer power of the phase, while the vehicular motion exhibits very complex behavior for a non-integer power of the phase.
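
    A heavily hedged sketch of the kind of nonlinear map described above: a single vehicle travelling at free speed between equally spaced signals and waiting whenever it arrives during red. Here signal n has cycle time T, a 50% green split and a position-dependent offset c·n^p; this offset rule and all parameter values are assumptions, not the exact functional form used in the paper.

```python
# Arrival-time map for a vehicle passing a series of signals whose phase
# offset grows as a power of the signal index.
import numpy as np

def arrival_times(n_signals=200, spacing=1.0, speed=1.0,
                  cycle=2.0, split=0.5, c=0.1, p=1.5):
    t = 0.0
    times = []
    for n in range(1, n_signals + 1):
        t += spacing / speed                      # free travel to signal n
        phase = (t - c * n ** p) % cycle          # local clock of signal n
        if phase >= split * cycle:                # arrived during red
            t += cycle - phase                    # wait for the next green
        times.append(t)
    return np.array(times)

if __name__ == "__main__":
    for power in (1.0, 1.5):                      # integer vs. non-integer power
        headways = np.diff(arrival_times(p=power))
        print(f"p = {power}: mean headway {headways.mean():.2f}, "
              f"std {headways.std():.2f}")
```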

  1. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly more complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g., time, displacement, voltage, wavelength) that is varied over a range of values. In signals the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of the multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal based corrections have the potential to be highly reproducible, and highly adaptable and are applicable in situations where the data is noisy or

  2. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis: linking traditional data mining tools to time series, and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, so we are not limited to a fixed set of methods. First of all, it is a model for the transformation of time series values, which prepares the data used by different sets of methods based on the same transformation model in a domain of problem space. The REFII model offers a new approach to time series analysis based on a unique model of transformation, which is a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  3. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  4. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  5. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  6. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  7. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  8. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
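
    A sketch of a Gaussian-kernel autocorrelation estimator for an irregularly sampled series: every sample pair contributes to the ACF at a given lag with a weight that depends on how close its time separation is to that lag, so no interpolation is needed. The kernel bandwidth rule and the toy sampling scheme are assumptions, not the benchmark's exact settings.

```python
# Kernel-based autocorrelation estimate for an irregularly sampled series.
import numpy as np

def kernel_acf(t, x, lags, bandwidth=None):
    x = (x - x.mean()) / x.std()
    if bandwidth is None:
        bandwidth = 0.25 * np.mean(np.diff(np.sort(t)))   # fraction of mean spacing
    dt = t[None, :] - t[:, None]            # all pairwise time separations
    prod = x[None, :] * x[:, None]          # all pairwise products
    acf = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / bandwidth) ** 2)  # Gaussian weights
        acf.append(np.sum(w * prod) / np.sum(w))
    return np.array(acf)

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    t = np.sort(rng.uniform(0, 500, 400))                 # irregular sampling times
    x = np.sin(2 * np.pi * t / 20) + 0.5 * rng.standard_normal(len(t))
    lags = np.arange(0, 40, 2.0)
    print(np.round(kernel_acf(t, x, lags), 2))
```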

  9. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

    The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  10. Time Series Analysis Using Geometric Template Matching.

    Science.gov (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.

  11. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    Growth And Export Expansion In Mauritius - A Time Series Analysis. ... RV Sannassee, R Pearce ... Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...

  12. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field, and is introduced here as a data analysis approach complementary to linear ones. Advancing further, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.

  13. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This work applies stochastic time series analysis to hydrologic data for water resources, with emphasis on seasonal stages. Different statistical tests for predicting hydrologic time series with the Thomas-Fiering model are considered. Hydrologic time series of flood flow have received a great deal of consideration worldwide, and interest in stochastic-process methods of time series analysis is expanding with growing concerns about seasonality and global warming. A recent trend among researchers is to test seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.

  14. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    Science.gov (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection, such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis, to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including the length of local segments and the dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. Economic Analysis in Series-Distillation Desalination

    Directory of Open Access Journals (Sweden)

    Mirna Rahmah Lubis

    2010-06-01

    Full Text Available The ability to produce potable water economically is the primary purpose of seawater desalination research. Reverse osmosis (RO) and multi-stage flash (MSF) cost more than potable water produced from fresh water resources. Therefore, this research investigates a high-efficiency mechanical vapor-compression distillation system that employs an improved water flow arrangement. The incoming salt concentration was 0.15% salt for brackish water and 3.5% salt for seawater, whereas the outgoing salt concentration was 1.5% and 7%, respectively. Distillation was performed at 439 K and 722 kPa for both brackish water feed and seawater feed. Water costs of the various conditions were calculated for brackish water and seawater feeds using optimum conditions considered as 25 and 20 stages, respectively. For brackish water at a temperature difference of 0.96 K, the energy requirement is 2.0 kWh/m3. At this condition, the estimated water cost is $0.39/m3 achieved with 10,000,000 gal/day distillate, 30-year bond, 5% interest rate, and $0.05/kWh electricity. For seawater at a temperature difference of 0.44 K, the energy requirement is 3.97 kWh/m3 and the estimated water cost is $0.61/m3. Greater efficiency of the vapor compression system is achieved by connecting multiple evaporators in series, rather than the traditional parallel arrangement. The efficiency results from the gradual increase of salinity in each stage of the series arrangement in comparison to parallel. Calculations using various temperature differences between boiling brine and condensing steam show the series arrangement has the greatest improvement at lower temperature differences. Keywords: desalination, dropwise condensation, mechanical-vapor compression

  16. Finding hidden periodic signals in time series - an application to stock prices

    Science.gov (United States)

    O'Shea, Michael

    2014-03-01

    Data in the form of time series appear in many areas of science. In cases where the periodicity is apparent and the only other contribution to the time series is stochastic in origin, the data can be 'folded' to improve signal to noise, and this has been done for light curves of variable stars with the folding resulting in a cleaner light curve signal. Stock index prices versus time are classic examples of time series. Repeating patterns have been claimed by many workers and include unusually large returns on small-cap stocks during the month of January, and small returns on the Dow Jones Industrial average (DJIA) in the months June through September compared to the rest of the year. Such observations imply that these prices have a periodic component. We investigate this for the DJIA. If such a component exists it is hidden in a large non-periodic variation and a large stochastic variation. We show how to extract this periodic component and for the first time reveal its yearly (averaged) shape. This periodic component leads directly to the 'Sell in May and buy at Halloween' adage. We also drill down and show that this yearly variation emerges from approximately half of the underlying stocks making up the DJIA index.
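
    A sketch of the folding step described above: phase-fold a return series on a candidate period and average within phase bins to expose a hidden periodic component. The one-year period in trading days, the bin count and the synthetic data are assumptions for illustration.

```python
# Phase-fold a noisy series on a known period and average within phase bins.
import numpy as np

def fold(series, period, n_bins=12):
    """Average the series as a function of phase = (index mod period)."""
    idx = np.arange(len(series))
    phase = (idx % period) / period
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    return np.array([series[bins == b].mean() for b in range(n_bins)])

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    n_days, period = 252 * 40, 252                 # 40 "years" of daily returns
    seasonal = 0.0005 * np.sin(2 * np.pi * np.arange(n_days) / period)
    returns = seasonal + 0.01 * rng.standard_normal(n_days)   # noise >> signal
    print("mean return per phase bin (x1e4):",
          np.round(fold(returns, period) * 1e4, 1))
```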

  17. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

    Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining the control of edge localized modes (ELMs), which should lead to both long plasma pulse times and reasonable divertor life time. In order to control ELMs the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus a standard chaos control methods might be applicable. However, our findings which are based on the nonlinear autoregressive (NAR) model contradict this hypothesis for JET ELMy time-series. In turn, it means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  18. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    Directory of Open Access Journals (Sweden)

    Dennis A Dean

    Full Text Available We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enable rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  19. Book: Marine Bioacoustic Signal Processing and Analysis

    Science.gov (United States)

    2011-09-30

    …physicists and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques. The book project must be finished by Dec. 31; the author has started setting aside two hours of uninterrupted time per workday to work on it.

  20. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  1. Signal analysis of Hindustani classical music

    CERN Document Server

    Datta, Asoke Kumar; Sengupta, Ranjan; Chakraborty, Soubhik; Mahto, Kartik; Patranabis, Anirban

    2017-01-01

    This book presents a comprehensive overview of the basics of Hindustani music and the associated signal analysis and technological developments. It begins with an in-depth introduction to musical signal analysis and its current applications, and then moves on to a detailed discussion of the features involved in understanding the musical meaning of the signal in the context of Hindustani music. The components consist of tones, shruti, scales, pitch duration and stability, raga, gharana and musical instruments. The book covers the various technological developments in this field, supplemented with a number of case studies and their analysis. The book offers new music researchers essential insights into the use of the automatic concept for finding and testing the musical features for their applications. Intended primarily for postgraduate and PhD students working in the area of scientific research on Hindustani music, as well as other genres where the concepts are applicable, it is also a valuable resource for p...

  2. Research on Healthy Anomaly Detection Model Based on Deep Learning from Multiple Time-Series Physiological Signals

    Directory of Open Access Journals (Sweden)

    Kai Wang

    2016-01-01

    Full Text Available Health is vital to every human being. To further improve its already respectable medical technology, the medical community is transitioning towards a proactive approach that anticipates and mitigates risks before people become ill. This approach requires measuring human physiological signals and analyzing these data at regular intervals. In this paper, we present a novel approach that applies deep learning to physiological signal analysis and allows doctors to identify latent risks. However, extracting high-level information from physiological time-series data is a hard problem faced by the machine learning community. Therefore, in this approach, we apply a model based on a convolutional neural network that can automatically learn features from raw physiological signals in an unsupervised manner, and then, based on the learned features, use a multivariate Gaussian distribution anomaly detection method to detect anomalous data. Our experiments show significant performance in physiological signal anomaly detection, so the approach is a promising tool for doctors to identify early signs of illness even if the criteria are unknown a priori.
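
    The anomaly-detection stage described above can be illustrated with a minimal sketch: assuming a feature matrix has already been produced by some encoder (here replaced by random placeholders), a multivariate Gaussian is fitted to features from healthy recordings and new samples are flagged when their log-density falls below a percentile threshold. All names, sizes and the threshold are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
healthy_features = rng.normal(size=(500, 8))            # stand-in for CNN-learned features
test_features = np.vstack([rng.normal(size=(50, 8)),    # normal test samples
                           rng.normal(3.0, 1.0, size=(5, 8))])  # injected anomalies

# fit a multivariate Gaussian to the "healthy" feature distribution
mu = healthy_features.mean(axis=0)
cov = np.cov(healthy_features, rowvar=False)
model = multivariate_normal(mean=mu, cov=cov)

# flag test samples whose log-density falls below the 1st percentile of healthy densities
threshold = np.quantile(model.logpdf(healthy_features), 0.01)
anomalies = np.where(model.logpdf(test_features) < threshold)[0]
print("flagged sample indices:", anomalies)
```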

  3. EXTRACTING PERIODIC TRANSIT SIGNALS FROM NOISY LIGHT CURVES USING FOURIER SERIES

    Energy Technology Data Exchange (ETDEWEB)

    Samsing, Johan [Department of Astrophysical Sciences, Princeton University, Peyton Hall, 4 Ivy Lane, Princeton, NJ 08544 (United States)

    2015-07-01

    We present a simple and powerful method for extracting transit signals associated with a known transiting planet from noisy light curves. Assuming the orbital period of the planet is known and the signal is periodic, we illustrate that systematic noise can be removed in Fourier space at all frequencies by only using data within a fixed time frame with a width equal to an integer number of orbital periods. This results in a reconstruction of the full transit signal, which on average is unbiased despite no prior knowledge of either the noise or the transit signal itself being used in the analysis. The method therefore has clear advantages over standard phase folding, which normally requires external input such as nearby stars or noise models for removing systematic components. In addition, we can extract the full orbital transit signal (360°) simultaneously, and Kepler-like data can be analyzed in just a few seconds. We illustrate the performance of our method by applying it to a dataset composed of light curves from Kepler with a fake injected signal emulating a planet with rings. For extracting periodic transit signals, our presented method is in general the optimal and least biased estimator and could therefore lead the way toward the first detections of, e.g., planet rings and exo-trojan asteroids.
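
    A hedged sketch of the core idea, under the assumption that the orbital period is known and the analysis window spans an integer number of periods: in the discrete Fourier transform of such a window, a strictly periodic transit contributes only to bins that are multiples of the number of periods, so keeping those bins and inverting reconstructs the periodic component. The synthetic light curve and all parameters below are illustrative, not the paper's pipeline.

```python
import numpy as np

n_per_period = 200                 # samples per orbital period (assumed known)
k_periods = 20                     # integer number of periods in the window
n = n_per_period * k_periods
t = np.arange(n)
rng = np.random.default_rng(1)

# synthetic light curve: periodic transit dips + slow systematic drift + noise
phase = (t % n_per_period) / n_per_period
transit = np.where(np.abs(phase - 0.5) < 0.02, -0.01, 0.0)
systematics = 0.005 * np.sin(2 * np.pi * t / 3170.0)
flux = 1.0 + transit + systematics + 0.002 * rng.normal(size=n)

# the periodic transit lives only in DFT bins that are multiples of k_periods
spec = np.fft.rfft(flux - flux.mean())
periodic_spec = np.zeros_like(spec)
periodic_spec[::k_periods] = spec[::k_periods]
reconstructed = np.fft.irfft(periodic_spec, n=n) + flux.mean()

# fold the reconstruction to inspect the recovered transit shape
folded = reconstructed.reshape(k_periods, n_per_period).mean(axis=0)
print("recovered transit depth ~", folded.min() - np.median(folded))
```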

  4. Topic Time Series Analysis of Microblogs

    Science.gov (United States)

    2014-10-01

    may be distributed more globally. Tweets on a specific topic that cluster spatially, temporally or both might be of interest to analysts, marketers ... of $ and @, with the latter only in the case that it is the only character in the token (the @ symbol is significant in its usage by Instagram in ... is generated by Instagram. Topic 80, Distance: 143.2101. Top words: 1. rawr 2. ^0^ 3. kill 4. jurassic 5. dinosaur. Analysis: This topic is quite

  5. Multitaper spectral analysis of atmospheric radar signals

    Directory of Open Access Journals (Sweden)

    V. K. Anandan

    2004-11-01

    Full Text Available Multitaper spectral analysis using sinusoidal tapers has been carried out on the backscattered signals received from the troposphere and lower stratosphere by the Gadanki Mesosphere-Stratosphere-Troposphere (MST) radar under various conditions of signal-to-noise ratio. A comparative study is made between the sinusoidal multitaper of order three and the single Hanning and rectangular tapers, to understand the relative merits of processing under each scheme. Power spectral plots show that echoes are better identified with multitaper estimation, especially in regions of weak signal-to-noise ratio. Further analysis is carried out to obtain the three lower-order moments from the three estimation techniques. The results show that multitaper analysis gives a better signal-to-noise ratio, or higher detectability. The spectral estimates from the multitaper and single tapers are also examined for consistency of measurement. The results show that the multitaper estimate is more consistent in Doppler measurements than the single-taper estimates. Doppler width measurements with the different approaches were also studied, and the results show that estimation was better with the multitaper technique in terms of temporal resolution and estimation accuracy.
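
    A minimal sketch of sinusoidal-taper multitaper estimation (three tapers) compared against a single Hanning-taper periodogram; the test signal is a weak sinusoid in noise rather than radar data, and all parameters are illustrative.

```python
import numpy as np

def sine_tapers(n, k):
    """Return k orthonormal sinusoidal tapers of length n."""
    i = np.arange(1, n + 1)
    return np.array([np.sqrt(2.0 / (n + 1)) * np.sin(np.pi * j * i / (n + 1))
                     for j in range(1, k + 1)])

def multitaper_psd(x, n_tapers=3):
    """Average the eigenspectra obtained with each sinusoidal taper."""
    tapers = sine_tapers(len(x), n_tapers)
    return (np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2).mean(axis=0)

rng = np.random.default_rng(2)
n = 1024
t = np.arange(n)
signal = np.cos(2 * np.pi * 0.12 * t) + 2.0 * rng.normal(size=n)   # weak line in noise

psd_mt = multitaper_psd(signal, n_tapers=3)
psd_hann = np.abs(np.fft.rfft(np.hanning(n) * signal)) ** 2
freqs = np.fft.rfftfreq(n)
print("multitaper peak at f =", freqs[np.argmax(psd_mt)])
```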

  6. Hilbert-Schmidt and Sobol sensitivity indices for static and time series Wnt signaling measurements in colorectal cancer - part A.

    Science.gov (United States)

    Sinha, Shriprakash

    2017-12-04

    Ever since the accidental discovery of Wingless [Sharma R.P., Drosophila information service, 1973, 50, p 134], research in the field of the Wnt signaling pathway has taken significant strides in wet lab experiments and various cancer clinical trials, augmented by recent developments in advanced computational modeling of the pathway. Information-rich gene expression profiles reveal various aspects of the signaling pathway and help in studying different issues simultaneously. Hitherto, not many computational studies exist which incorporate the simultaneous study of these issues. This manuscript ∙ explores the strength of contributing factors in the signaling pathway, ∙ analyzes the existing causal relations among the inter/extracellular factors affecting the pathway based on prior biological knowledge and ∙ investigates the deviations in fold changes in the recently found prevalence of psychophysical laws working in the pathway. To achieve this goal, local and global sensitivity analysis is conducted on the (non)linear responses between the factors obtained from static and time series expression profiles using the density (Hilbert-Schmidt Information Criterion) and variance (Sobol) based sensitivity indices. The results show the advantage of using density-based indices over variance-based indices, mainly due to the former's employment of distance measures & the kernel trick via the Reproducing kernel Hilbert space (RKHS), which capture nonlinear relations among various intra/extracellular factors of the pathway in a higher dimensional space. In time series data, using these indices it is now possible to observe where in time which factors get influenced & contribute to the pathway, as changes in concentration of the other factors are made. This synergy of prior biological knowledge, sensitivity analysis & representations in higher dimensional spaces can facilitate time-based administration of targeted therapeutic drugs & reveal hidden biological information within

  7. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  8. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
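
    The time-averaged mean squared displacement used above can be sketched in a few lines; here it is evaluated on a simulated geometric Brownian motion path (the Black-Scholes-Merton benchmark mentioned in the abstract) rather than on the Dow Jones data, and the drift and volatility values are arbitrary.

```python
import numpy as np

def time_averaged_msd(x, lags):
    """TAMSD: delta^2(lag) = average over t of (x[t + lag] - x[t])**2."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(3)
n = 5000
log_returns = 0.0002 + 0.01 * rng.normal(size=n)       # illustrative drift and volatility
price = 100.0 * np.exp(np.cumsum(log_returns))         # geometric Brownian motion path

lags = np.arange(1, 101)
tamsd = time_averaged_msd(np.log(price), lags)         # computed here on the log-price
print("TAMSD at the first few lags:", tamsd[:5])
```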

  9. Biostatistics series module 9: Survival analysis

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2017-01-01

    Full Text Available Survival analysis is concerned with “time to event“ data. Conventionally, it dealt with cancer death as the event in question, but it can handle any event occurring over a time frame, and this need not be always adverse in nature. When the outcome of a study is the time to an event, it is often not possible to wait until the event in question has happened to all the subjects, for example, until all are dead. In addition, subjects may leave the study prematurely. Such situations lead to what is called censored observations as complete information is not available for these subjects. The data set is thus an assemblage of times to the event in question and times after which no more information on the individual is available. Survival analysis methods are the only techniques capable of handling censored observations without treating them as missing data. They also make no assumption regarding normal distribution of time to event data. Descriptive methods for exploring survival times in a sample include life table and Kaplan–Meier techniques as well as various kinds of distribution fitting as advanced modeling techniques. The Kaplan–Meier cumulative survival probability over time plot has become the signature plot for biomedical survival analysis. Several techniques are available for comparing the survival experience in two or more groups – the log-rank test is popularly used. This test can also be used to produce an odds ratio as an estimate of risk of the event in the test group; this is called hazard ratio (HR. Limitations of the traditional log-rank test have led to various modifications and enhancements. Finally, survival analysis offers different regression models for estimating the impact of multiple predictors on survival. Cox's proportional hazard model is the most general of the regression methods that allows the hazard function to be modeled on a set of explanatory variables without making restrictive assumptions concerning the

  10. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  11. Stochastic Analysis : A Series of Lectures

    CERN Document Server

    Dozzi, Marco; Flandoli, Franco; Russo, Francesco

    2015-01-01

    This book presents in thirteen refereed survey articles an overview of modern activity in stochastic analysis, written by leading international experts. The topics addressed include stochastic fluid dynamics and regularization by noise of deterministic dynamical systems; stochastic partial differential equations driven by Gaussian or Lévy noise, including the relationship between parabolic equations and particle systems, and wave equations in a geometric framework; Malliavin calculus and applications to stochastic numerics; stochastic integration in Banach spaces; porous media-type equations; stochastic deformations of classical mechanics and Feynman integrals and stochastic differential equations with reflection. The articles are based on short courses given at the Centre Interfacultaire Bernoulli of the Ecole Polytechnique Fédérale de Lausanne, Switzerland, from January to June 2012. They offer a valuable resource not only for specialists, but also for other researchers and Ph.D. students in the fields o...

  12. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl

    2017-12-01

    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets, currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting to exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. The distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided, which shows the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of the BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is found to be narrow, on the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is found to be more substantial, reaching as high as above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic returns and the Realized Volatility, the sum of the squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility (Andersen et al., 2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers, using 10-day moving window sampling of daily return predictors, is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximate capture of the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short Term Memory) techniques from deep learning, may be necessary for higher prediction accuracy.
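
    A hedged sketch of the realized-volatility part of that analysis: daily realized variance is the sum of squared 5-minute log returns, and the HAR-RV regression explains next-day RV by its daily, weekly (5-day) and monthly (22-day) trailing averages via ordinary least squares. The intraday returns are simulated, so nothing here reproduces the BTCUSD results.

```python
import numpy as np

rng = np.random.default_rng(4)
n_days, bars_per_day = 400, 288                         # 288 five-minute bars per day
vol = 0.01 * np.exp(np.cumsum(rng.normal(0, 0.02, n_days)))   # slowly varying volatility
returns = rng.normal(0.0, vol[:, None] / np.sqrt(bars_per_day),
                     size=(n_days, bars_per_day))
rv = (returns ** 2).sum(axis=1)                         # daily realized variance

def trailing_mean(x, w):
    return np.array([x[max(0, i - w + 1):i + 1].mean() for i in range(len(x))])

rv_w, rv_m = trailing_mean(rv, 5), trailing_mean(rv, 22)
X = np.column_stack([np.ones(n_days - 1), rv[:-1], rv_w[:-1], rv_m[:-1]])
y = rv[1:]                                              # next-day realized variance
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("HAR-RV coefficients (const, daily, weekly, monthly):", beta)
```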

  13. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  14. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  15. Assessment of the dynamics of atrial signals and local atrial period series during atrial fibrillation: effects of isoproterenol administration

    Directory of Open Access Journals (Sweden)

    Mantica Massimo

    2004-10-01

    Full Text Available Abstract Background The autonomic nervous system (ANS) plays an important role in the genesis and maintenance of atrial fibrillation (AF), but quantification of its electrophysiologic effects is extremely complex and difficult. The aim of the study was to evaluate the capability of linear and non-linear indexes to capture the fine changing dynamics of atrial signals and local atrial period (LAP) series during adrenergic activation induced by isoproterenol (a sympathomimetic drug) infusion. Methods Nine patients with paroxysmal or persistent AF (aged 60 ± 6) underwent an electrophysiological study in which isoproterenol was administered. Atrial electrograms were acquired during (i) sinus rhythm (SR); (ii) sinus rhythm during isoproterenol administration (SRISO); (iii) atrial fibrillation (AF); and (iv) atrial fibrillation during isoproterenol administration (AFISO). The level of organization between two electrograms was assessed by the synchronization index (S), whereas the degree of recurrence of a pattern in a signal was defined by the regularity index (R). In addition, the level of predictability (LP) and regularity of the LAP series were computed. Results LAP series analysis shows a reduction of both the LP and R indexes during isoproterenol infusion in SR and AF (LAP series: R_SR = 0.75 ± 0.07 vs. R_SRISO = 0.69 ± 0.10, p < ...; R_AF = 0.31 ± 0.08 vs. R_AFISO = 0.26 ± 0.09, p < ...; LP_SR = 99.99 ± 0.001 vs. LP_SRISO = 99.97 ± 0.03, p < ...; LP_AF = 69.46 ± 21.55 vs. LP_AFISO = 55 ± 24.75; atrial electrograms: R_SR = 0.49 ± 0.08 vs. R_SRISO = 0.46 ± 0.09, R_AF = 0.29 ± 0.09 vs. R_AFISO = 0.28 ± 0.08, n.s.). Conclusions The proposed parameters succeeded in discriminating the subtle changes due to isoproterenol infusion during both rhythms, especially when considering LAP series analysis. The reduced value of the analyzed parameters after isoproterenol administration could reflect an important pro-arrhythmic influence of adrenergic activation favoring the maintenance of AF.

  16. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz

    2005-01-01

    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method

  17. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  18. Cellular signaling identifiability analysis: a case study.

    Science.gov (United States)

    Roper, Ryan T; Pia Saccomani, Maria; Vicini, Paolo

    2010-05-21

    Two primary purposes for mathematical modeling in cell biology are (1) simulation for making predictions of experimental outcomes and (2) parameter estimation for drawing inferences from experimental data about unobserved aspects of biological systems. While the former purpose has become common in the biological sciences, the latter is less common, particularly when studying cellular and subcellular phenomena such as signaling-the focus of the current study. Data are difficult to obtain at this level. Therefore, even models of only modest complexity can contain parameters for which the available data are insufficient for estimation. In the present study, we use a set of published cellular signaling models to address issues related to global parameter identifiability. That is, we address the following question: assuming known time courses for some model variables, which parameters is it theoretically impossible to estimate, even with continuous, noise-free data? Following an introduction to this problem and its relevance, we perform a full identifiability analysis on a set of cellular signaling models using DAISY (Differential Algebra for the Identifiability of SYstems). We use our analysis to bring to light important issues related to parameter identifiability in ordinary differential equation (ODE) models. We contend that this is, as of yet, an under-appreciated issue in biological modeling and, more particularly, cell biology. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  19. Refined generalized multiscale entropy analysis for physiological signals

    Science.gov (United States)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measurement and been successfully applied in various fields. However, it only takes into account the information of mean values (first moment) in coarse-graining procedure. Then generalized multiscale entropy (MSEn) considering higher moments to coarse-grain a time series was proposed and MSEσ2 has been implemented. However, the MSEσ2 sometimes may yield an imprecise estimation of entropy or undefined entropy, and reduce statistical reliability of sample entropy estimation as scale factor increases. For this purpose, we developed the refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1 / f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, especially suitable for short time series. Besides, we discuss the effect on RMSEσ2 analysis from outliers, data loss and other concepts in signal processing. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation respectively, compared to several popular complexity metrics. The results demonstrate that RMSEσ2 measured complexity (a) decreases with aging and diseases, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
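
    A generic sketch of the second-moment (variance) coarse-graining idea behind MSEσ2, followed by a textbook sample-entropy estimate on the coarse-grained series; this is not the authors' refined RMSEσ2 estimator, and the tolerance and embedding settings are the usual defaults.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.15):
    """SampEn(m, r) with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r, n = r_factor * x.std(), len(x)

    def match_pairs(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(dist <= r) - len(templates)) / 2.0   # exclude self-matches

    b, a = match_pairs(m), match_pairs(m + 1)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

def coarse_grain_variance(x, scale):
    """Second-moment coarse-graining: variance of non-overlapping blocks."""
    n_blocks = len(x) // scale
    return np.asarray(x[:n_blocks * scale]).reshape(n_blocks, scale).var(axis=1)

rng = np.random.default_rng(5)
white = rng.normal(size=2000)
for scale in (2, 5, 10):
    print(scale, sample_entropy(coarse_grain_variance(white, scale)))
```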

  20. Assessing ionospheric activity by long time series of GNSS signals: the search of possible connection with seismicity

    Science.gov (United States)

    Galeandro, Angelo; Mancini, Francesco; De Giglio, Michaela; Barbarella, Maurizio

    2014-05-01

    The modifications of some atmospheric physical properties prior to a high-magnitude earthquake have recently been debated in the frame of the Lithosphere-Atmosphere-Ionosphere (LAI) Coupling model. Among this variety of phenomena, this work investigates the ionization of air at ionospheric levels due to gases leaking from the Earth's crust, through the analysis of long time series of GNSS (Global Navigation Satellite System) signals. Several authors have used the dispersive properties of the ionospheric strata towards GNSS signals to detect possible ionospheric anomalies over areas affected by earthquakes, and some evidence was encountered. However, the spatial scale and temporal domains over which such disturbances come into evidence are still a controversial item. Furthermore, the correspondence by chance between ionospheric disturbances and relevant seismic activity is even more difficult to model whenever the reference time period and spatial extent of the investigation are confined. Problems could also arise from phenomena due to solar activity (currently at its culmination within the 11-year solar cycle), because such global effects could reduce the ability to detect disturbances at regional or local spatial scales. In this work, two case studies were investigated. The first focuses on the M = 6.3 earthquake that occurred on April 6, 2009, close to the city of L'Aquila (Abruzzo, Italy). The second concerns the M = 5.9 earthquake that occurred on May 20, 2012, between the cities of Ferrara and Modena (Emilia Romagna, Italy). To investigate possible connections between ionospheric activity and seismicity for these events, a five-year (2008-2012) series of high-resolution ionospheric maps was used. These maps were produced by the authors from GNSS data collected by permanent stations uniformly distributed around the epicenters and allowed the ionospheric activity to be assessed through the analysis of the TEC (Total Electron Content). To avoid the influence of solar activity

  1. Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.

    Science.gov (United States)

    Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav

    2017-05-26

    Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature time series. Unlike previous works which have concentrated on suppressing the transmission of some data samples by time-series analysis but still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake up rate) and looks at the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
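
    The sampling-rate experiment can be sketched simply: thin a temperature series to a one-hour period, reconstruct the skipped samples by linear interpolation, and report the RMSE. The synthetic diurnal series below is only a placeholder for the sensor-network datasets used in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
minutes = np.arange(0, 14 * 24 * 60, 5)                 # 14 days sampled every 5 minutes
temperature = (20 + 5 * np.sin(2 * np.pi * minutes / (24 * 60))
               + 0.1 * rng.normal(size=minutes.size))

period = 60                                             # increased sampling period (minutes)
keep = slice(None, None, period // 5)                   # keep every 12th original sample
coarse_t, coarse_y = minutes[keep], temperature[keep]

reconstructed = np.interp(minutes, coarse_t, coarse_y)  # estimate intermediate states
rmse = np.sqrt(np.mean((reconstructed - temperature) ** 2))
print(f"interpolation RMSE at {period}-minute sampling: {rmse:.3f} degC")
```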

  2. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  3. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time-domain as well as frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of the Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
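
    A hedged sketch of a multiresolution decomposition of a daily closing-price series with the discrete wavelet transform, using PyWavelets' wavedec/waverec; the KLCI data are not included here, so a random-walk stand-in is used, and the wavelet and decomposition level are arbitrary choices.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
prices = 1700 + np.cumsum(rng.normal(0, 5, 1024))       # stand-in for daily closing prices
returns = np.diff(np.log(prices))

coeffs = pywt.wavedec(returns, wavelet='db4', level=4)  # [cA4, cD4, cD3, cD2, cD1]

# reconstruct the contribution of each scale separately by zeroing the other coefficients
for i, name in enumerate(['A4', 'D4', 'D3', 'D2', 'D1']):
    selected = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    component = pywt.waverec(selected, wavelet='db4')[:len(returns)]
    print(f"{name}: share of return variance = {component.var() / returns.var():.3f}")
```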

  4. Alleviating Border Effects in Wavelet Transforms for Nonlinear Time-varying Signal Analysis

    Directory of Open Access Journals (Sweden)

    SU, H.

    2011-08-01

    Full Text Available Border effects are very common in the analysis and processing of finite-length signals with approaches that use convolution operations. Alleviating the border effects that can occur in the processing of finite-length signals using the wavelet transform is considered in this paper. Traditional methods for alleviating border effects are suited to compression or coding applications. We propose an algorithm based on Fourier series which is shown to be appropriate for time-frequency analysis of nonlinear signals. The Fourier series extension method preserves the time-varying characteristics of the signals. A modified signal duration expression for measuring the extent of the border-effect region is presented. Numerical examples confirm that the proposed algorithm alleviates border effects more efficiently than current methods.

  5. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most existing research, which focuses on single futures contracts and lacks comparisons across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Besides basic statistical analysis, the paper used the GARCH and EGARCH models to describe the series that exhibited the ARCH effect, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, with leptokurtic and heavy-tailed distributions. The study also found that two parts of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. Using the Auto-regressive Distributed Lag Model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are on the whole similar to those of mature futures markets abroad; on the other hand, the results reflect some shortcomings, such as the immaturity of the Chinese futures market and over-regulation by the government.
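
    As a hedged illustration of this kind of volatility modelling, the sketch below fits GARCH(1,1) and EGARCH(1,1,1) models to a simulated heavy-tailed return series with the Python arch package; the data are placeholders for the wheat futures rewards, and the model orders are common defaults rather than the paper's specification.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(8)
returns = rng.standard_t(df=5, size=750)   # heavy-tailed daily returns (roughly percent scale)

garch = arch_model(returns, mean='Constant', vol='GARCH', p=1, q=1).fit(disp='off')
egarch = arch_model(returns, mean='Constant', vol='EGARCH', p=1, o=1, q=1).fit(disp='off')

print(garch.params)    # alpha[1] + beta[1] close to 1 indicates persistent volatility shocks
print(egarch.params)   # the o-term (gamma[1]) captures asymmetric/leverage effects
```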

  6. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)] e-mail: mhanias@teihal.gr; Giannaris, G. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Spyridakis, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension and the minimum embedding dimension, ν and m_min respectively, were calculated. The corresponding Kolmogorov entropy was also calculated.

  7. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension and the minimum embedding dimension, ν and m_min respectively, were calculated. The corresponding Kolmogorov entropy was also calculated.

  8. Time series analysis of monthly pulpwood use in the Northeast

    Science.gov (United States)

    James T. Bones

    1980-01-01

    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.

  9. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  10. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
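
    A sketch of the running-window idea under stated assumptions: each window is compared with the preceding window using scipy's mannwhitneyu, and U is normalized to Z with the large-sample normal approximation rather than the Monte Carlo normalization the abstract refers to; the window length and the injected level shift are illustrative.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(x, window):
    """Z statistic comparing each window with the preceding window of equal length."""
    z = []
    for start in range(window, len(x) - window + 1):
        a, b = x[start - window:start], x[start:start + window]
        u = mannwhitneyu(a, b, alternative='two-sided').statistic
        mu_u = window * window / 2.0
        sigma_u = np.sqrt(window * window * (2 * window + 1) / 12.0)
        z.append((u - mu_u) / sigma_u)
    return np.array(z)

rng = np.random.default_rng(9)
series = np.concatenate([rng.normal(0, 1, 300), rng.normal(0.8, 1, 300)])   # shift at t = 300
z = running_mw_z(series, window=50)
print("largest |Z| near index", np.argmax(np.abs(z)) + 50)
```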

  11. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  12. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.

    2016-01-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  13. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink

    2016-01-01

    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  14. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  15. Analysis and implementation of LLC-T series parallel resonant ...

    African Journals Online (AJOL)

    A prototype 300 W, 100 kHz converter is designed and built to experimentally demonstrate the dynamic and steady-state performance of the LLC-T series parallel resonant converter. A comparative study is performed between the experimental results and the simulation studies. The analysis shows that the output of the converter is ...

  16. All-phase MR angiography using independent component analysis of dynamic contrast enhanced MRI time series. φ-MRA

    International Nuclear Information System (INIS)

    Suzuki, Kiyotaka; Matsuzawa, Hitoshi; Watanabe, Masaki; Nakada, Tsutomu; Nakayama, Naoki; Kwee, I.L.

    2003-01-01

    Dynamic contrast enhanced magnetic resonance imaging (dynamic MRI) represents a MRI version of non-diffusible tracer methods, the main clinical use of which is the physiological construction of what is conventionally referred to as perfusion images. The raw data utilized for constructing MRI perfusion images are time series of pixel signal alterations associated with the passage of a gadolinium containing contrast agent. Such time series are highly compatible with independent component analysis (ICA), a novel statistical signal processing technique capable of effectively separating a single mixture of multiple signals into their original independent source signals (blind separation). Accordingly, we applied ICA to dynamic MRI time series. The technique was found to be powerful, allowing for hitherto unobtainable assessment of regional cerebral hemodynamics in vivo. (author)
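
    A hedged sketch of the blind-separation step: pixel time courses are simulated as mixtures of a fast "arterial" curve and a delayed "tissue" curve, and scikit-learn's FastICA recovers the two source curves from the mixtures; the curves, sizes and component count are illustrative, not a dynamic MRI protocol.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(10)
t = np.linspace(0, 60, 120)                             # 120 dynamic frames
arterial = np.exp(-((t - 15) / 4.0) ** 2)               # fast first-pass curve
tissue = np.exp(-((t - 25) / 10.0) ** 2)                # delayed, dispersed curve

n_pixels = 2000
mixing = rng.uniform(0, 1, size=(n_pixels, 2))          # per-pixel contribution weights
signals = (mixing @ np.vstack([arterial, tissue])
           + 0.02 * rng.normal(size=(n_pixels, len(t))))

# treat time points as samples and pixels as mixtures; ICA returns the source time courses
ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(signals.T)                  # shape: (time, components)
print("recovered source curves:", sources.shape)
```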

  17. Teaching Earth Signals Analysis Using the Java-DSP Earth Systems Edition: Modern and Past Climate Change

    Science.gov (United States)

    Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.

    2014-01-01

    Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…

  18. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  19. Time series analysis of ozone data in Isfahan

    Science.gov (United States)

    Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.

    2008-07-01

    Time series analysis was used to investigate stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reasons for extremely high ozone concentrations in each season. The data were converted into seasonal components and into the frequency domain; the latter was evaluated using the Fast Fourier Transform (FFT) for spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, the largest fluctuations were in 1999 and 2000, and the smallest in 2003. The best correlation between ozone and solar radiation was found in 2000. Other variables, which are not available, caused the fluctuations in 1999 and 2001. The ozone trend is increasing in 1999 and decreasing in the other years.

  20. Generalized sample entropy analysis for traffic signals based on similarity measure

    Science.gov (United States)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study, which shows the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.

  1. Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series.

    Science.gov (United States)

    Jiang, Zhixing; Zhang, David; Lu, Guangming

    2018-04-19

    Radial artery pulse diagnosis has been playing an important role in traditional Chinese medicine (TCM). Because it is non-invasive and convenient, pulse diagnosis also has great significance for disease analysis in modern medicine. Practitioners sense the pulse waveforms at the patient's wrist and make diagnoses based on their subjective personal experience. With research into pulse acquisition platforms and computerized analysis methods, the objective study of pulse diagnosis can help TCM keep up with the development of modern medicine. In this paper, we propose a new method to extract features from the pulse waveform based on the discrete Fourier series (DFS). It regards the waveform as a signal that consists of a series of sub-components represented by sine and cosine (SC) signals with different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform for each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the discrete Fourier series function. Compared with a fitting method using a Gaussian mixture function, the fitting errors of the proposed method are smaller, which indicates that our method represents the original signal better. The classification performance of the proposed feature is superior to other features extracted from the waveform, such as the auto-regressive model and the Gaussian mixture model. The coefficients of the optimized DFS function used to fit the arterial pressure waveforms achieve better performance in modeling the waveforms and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
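
    A minimal sketch of the fitting step, assuming a single averaged pulse period: a truncated Fourier series (constant plus K sine/cosine harmonics) is fitted by least squares and its coefficients serve as the feature vector. The synthetic waveform and the harmonic order are illustrative, not the authors' settings.

```python
import numpy as np

def fourier_series_fit(y, n_harmonics):
    """Least-squares fit of one period of y with K harmonics; returns (coefficients, fit)."""
    n = len(y)
    t = np.arange(n) / n                                    # normalized time in [0, 1)
    columns = [np.ones(n)]
    for k in range(1, n_harmonics + 1):
        columns += [np.cos(2 * np.pi * k * t), np.sin(2 * np.pi * k * t)]
    design = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coeffs, design @ coeffs

# synthetic single-period pulse waveform (systolic peak plus a dicrotic wave)
n = 200
t = np.arange(n) / n
pulse = np.exp(-((t - 0.2) / 0.05) ** 2) + 0.4 * np.exp(-((t - 0.45) / 0.07) ** 2)

features, fitted = fourier_series_fit(pulse, n_harmonics=8)
rmse = np.sqrt(np.mean((fitted - pulse) ** 2))
print(f"{len(features)} DFS coefficients, fitting RMSE = {rmse:.4f}")
```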

  2. Neutron noise analysis of BWR using time series analysis

    International Nuclear Information System (INIS)

    Fukunishi, Kohyu

    1976-01-01

    The main purpose of this paper is to give a more quantitative understanding of noise sources in neutron flux and to provide a useful tool for the detection and diagnosis of reactor anomalies. The space-dependent effects of distributed neutron flux signals in the axial direction of two different strings are investigated through the power contribution ratio among neutron fluxes and the incoherent noise spectra of neutron fluxes derived from autoregressive spectra. The signals were measured on a medium-sized 460 MWe commercial BWR in Japan. From the obtained results, local and global noise sources in neutron flux are discussed. The method is shown to be a useful tool for the detection and diagnosis of anomalous phenomena in BWR. (orig./RW)

  3. Mathematical properties of a semi-classical signal analysis method: Noisy signal case

    KAUST Repository

    Liu, Dayan

    2012-08-01

    Recently, a new signal analysis method based on a semi-classical approach has been proposed [1]. The main idea in this method is to interpret a signal as a potential of a Schrodinger operator and then to use the discrete spectrum of this operator to analyze the signal. In this paper, we are interested in a mathematical analysis of this method in discrete case considering noisy signals. © 2012 IEEE.

  4. Mathematical properties of a semi-classical signal analysis method: Noisy signal case

    KAUST Repository

    Liu, Dayan; Laleg-Kirati, Taous-Meriem

    2012-01-01

    Recently, a new signal analysis method based on a semi-classical approach has been proposed [1]. The main idea in this method is to interpret a signal as a potential of a Schrodinger operator and then to use the discrete spectrum of this operator to analyze the signal. In this paper, we are interested in a mathematical analysis of this method in discrete case considering noisy signals. © 2012 IEEE.

  5. : Signal Decomposition of High Resolution Time Series River data to Separate Local and Regional Components of Conductivity

    Science.gov (United States)

    Signal processing techniques were applied to high-resolution time series data obtained from conductivity loggers placed upstream and downstream of a wastewater treatment facility along a river. Data was collected over 14-60 days, and several seasons. The power spectral densit...

  6. Signal Decomposition of High Resolution Time Series River Data to Separate Local and Regional Components of Conductivity

    Science.gov (United States)

    Signal processing techniques were applied to high-resolution time series data obtained from conductivity loggers placed upstream and downstream of an oil and gas wastewater treatment facility along a river. Data was collected over 14-60 days. The power spectral density was us...

  7. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide-range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine whether there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  8. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  9. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas, e.g. the Gumbel-Hougaard copula, Cook-Johnson copula and Frank copula, and the meta-elliptical copulas, e.g. the Gaussian copula and Student-t copula, have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been treated as stationary signals whose values are assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also questionable due to climate change and land use and land cover (LULC) change in the past years. It is therefore necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach, and, for the study of the dependence structure, by relaxing the assumption of identical univariate distributions through copula theory. In this paper, the univariate monthly hydrological time series will be studied through a nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  10. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    Science.gov (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid the discrimination of their different natures. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach is to obtain "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps by using the endmembers. Then, the estimated endmember is the mean value of the "purified" pixels, i.e., the residual of the mixed pixel after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with the "separate unmixing" approach.
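
    The abundance-estimation step can be sketched with nonnegative least squares: given endmember spectra, each pixel's abundances are the NNLS solution for that pixel. Endmember extraction (VCA) and the iterative endmember update are omitted, and the synthetic spectra below are placeholders for Landsat 8 bands.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(11)
n_bands, n_endmembers, n_pixels = 8, 3, 100
endmembers = rng.uniform(0.05, 0.6, size=(n_bands, n_endmembers))   # columns = endmember spectra

true_abund = rng.dirichlet(np.ones(n_endmembers), size=n_pixels)    # sum-to-one abundances
pixels = true_abund @ endmembers.T + 0.005 * rng.normal(size=(n_pixels, n_bands))

# per-pixel nonnegative least-squares abundance estimates
est_abund = np.array([nnls(endmembers, p)[0] for p in pixels])
print("mean absolute abundance error:", np.mean(np.abs(est_abund - true_abund)))
```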

  11. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reor­ ganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  12. The enhanced greenhouse signal versus natural variations in observed climate time series: a statistical approach

    Energy Technology Data Exchange (ETDEWEB)

    Schoenwiese, C D [J.W. Goethe Univ., Frankfurt (Germany). Inst. for Meteorology and Geophysics

    1996-12-31

    It is a well-known fact that human activities lead to an atmospheric concentration increase of some IR-active trace gases (greenhouse gases, GHG) and that this influence enhances the 'greenhouse effect'. However, there are major quantitative and regional uncertainties in the related climate model projections, and the observational data reflect the whole complex of both anthropogenic and natural forcing of the climate system. This contribution aims at the separation of the anthropogenic enhanced greenhouse signal in observed global surface air temperature data versus other forcings using statistical methods such as multiple (multiforced) regressions and neural networks. The competing natural forcings considered are volcanic and solar activity, and in addition the ENSO (El Niño/Southern Oscillation) mechanism. This analysis will also be extended to the NAO (North Atlantic Oscillation) and anthropogenic sulfate formation in the troposphere

  13. The enhanced greenhouse signal versus natural variations in observed climate time series: a statistical approach

    Energy Technology Data Exchange (ETDEWEB)

    Schoenwiese, C.D. [J.W. Goethe Univ., Frankfurt (Germany). Inst. for Meteorology and Geophysics

    1995-12-31

    It is a well-known fact that human activities lead to an atmospheric concentration increase of some IR-active trace gases (greenhouse gases GHG) and that this influence enhances the 'greenhouse effect'. However, there are major quantitative and regional uncertainties in the related climate model projections and the observational data reflect the whole complex of both anthropogenic and natural forcing of the climate system. This contribution aims at the separation of the anthropogenic enhanced greenhouse signal in observed global surface air temperature data from other forcings using statistical methods such as multiple (multiforced) regressions and neural networks. The competing natural forcings considered are volcanic and solar activity and, in addition, the ENSO (El Nino/Southern Oscillation) mechanism. This analysis will also be extended to the NAO (North Atlantic Oscillation) and anthropogenic sulfate formation in the troposphere.

  14. Wet tropospheric delays forecast based on Vienna Mapping Function time series analysis

    Science.gov (United States)

    Rzepecka, Zofia; Kalita, Jakub

    2016-04-01

    It is well known that the dry part of the zenith tropospheric delay (ZTD) is much easier to model than the wet part (ZTW). The aim of the research is to apply stochastic modeling and prediction of ZTW using time series analysis tools. Application of time series analysis enables a closer understanding of ZTW behavior as well as short-term prediction of future ZTW values. The ZTW data used for the study were obtained from the GGOS service held by the Vienna University of Technology. The resolution of the data is six hours. ZTW data for the years 2010-2013 were adopted for the study. The International GNSS Service (IGS) permanent stations LAMA and GOPE, located in mid-latitudes, were selected for the investigations. Initially the seasonal part was separated and modeled using periodic signals and frequency analysis. The prominent annual and semi-annual signals were removed using sine and cosine functions. The autocorrelation of the resulting signal is significant for several days (20-30 samples). The residuals of this fitting were further analyzed and modeled with ARIMA processes. For both stations, optimal ARMA processes were obtained based on several criteria. On this basis, ZTW values were predicted one day ahead, leaving white-noise residuals. The accuracy of the prediction can be estimated at about 3 cm.
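
    A minimal sketch of the workflow described above (removal of the annual and semi-annual harmonics followed by ARMA modeling of the residuals and a one-day-ahead forecast); the synthetic six-hourly series and the ARIMA order used here are assumptions for illustration, not the authors' fitted models (Python with NumPy and statsmodels).

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Synthetic stand-in for a six-hourly ZTW series (metres), four years long
      rng = np.random.default_rng(0)
      n = 4 * 365 * 4
      t = np.arange(n) * 0.25                       # time in days
      ztw = (0.12 + 0.05 * np.sin(2 * np.pi * t / 365.25)
             + 0.01 * np.cos(2 * np.pi * t / 182.625)
             + 0.01 * rng.standard_normal(n))

      # Design matrix of annual and semi-annual sine/cosine terms plus a constant
      def harmonics(tt):
          cols = [np.ones_like(tt)]
          for p in (365.25, 182.625):
              cols += [np.sin(2 * np.pi * tt / p), np.cos(2 * np.pi * tt / p)]
          return np.column_stack(cols)

      X = harmonics(t)
      coef, *_ = np.linalg.lstsq(X, ztw, rcond=None)
      resid = ztw - X @ coef                        # seasonal part separated

      # Model the residuals with a low-order ARMA process and predict one day ahead
      model = ARIMA(resid, order=(2, 0, 1)).fit()
      t_next = t[-1] + 0.25 * np.arange(1, 5)       # four 6-hour steps = 1 day
      ztw_forecast = harmonics(t_next) @ coef + model.forecast(steps=4)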

  15. Programmable delay circuit for sparker signal analysis

    Digital Repository Service at National Institute of Oceanography (India)

    Pathak, D.

    The sparker echo signal had been recorded along with the EPC recorder trigger on audio cassettes in a dual channel analog recorder. The sparker signal in the analog form had to be digitised for further signal processing techniques to be performed...

  16. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki

    2008-01-01

    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed the Coarse Mesh Projection Method (CMPM) and can be implemented using coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit in the signal processing discipline and the neutron multiplication eigenvalue problem in the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous-energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three-dimensional modeling of the initial core of a pressurized water reactor.

  17. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  18. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals

    Science.gov (United States)

    Hedayatifar, L.; Vahabi, M.; Jafari, G. R.

    2011-08-01

    When many variables are coupled to each other, a single case study could not give us thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used. But, in nonstationary cases, the multifractal-detrended-cross-correlation-analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we have extended the MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated to each other. Here, we have calculated the multifractal properties of the coupled time series, and by comparing CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which our series are coupled to each other. We illustrate the method by selected examples from air pollution and foreign exchange rates.
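
    The building block of the methods mentioned above is ordinary detrended fluctuation analysis; a minimal sketch follows (plain NumPy, synthetic white noise as input). The coupling variant described in the record generalizes the fluctuation function to several detrended series jointly; that extension is not reproduced here.

      import numpy as np

      def dfa(x, scales):
          """Ordinary detrended fluctuation analysis: F(s) for each window size s."""
          y = np.cumsum(x - np.mean(x))                 # profile of the series
          F = []
          for s in scales:
              n_seg = len(y) // s
              sq = []
              for i in range(n_seg):
                  seg = y[i * s:(i + 1) * s]
                  tt = np.arange(s)
                  trend = np.polyval(np.polyfit(tt, seg, 1), tt)   # local linear detrending
                  sq.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(sq)))
          return np.array(F)

      # The slope of log F(s) versus log s estimates the scaling exponent
      x = np.random.default_rng(1).standard_normal(10000)
      scales = np.array([16, 32, 64, 128, 256, 512])
      alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]   # ~0.5 for white noise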

  19. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    Science.gov (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduced the probabilistic principal component analysis (pPCA) regarding the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series to estimate and remove the Common Mode Error (CME) without interpolation of missing values. We used data from the International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; then CME was estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of resolving the problem of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance of the time series residuals (series with the deterministic model removed), which, compared with the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of station velocities estimated from filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. CME series were also analyzed in the context of environmental mass loading influences on the filtering results. Subtraction of the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
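
    A minimal sketch of common-mode filtering with a single principal component on a complete (gap-free) residual matrix; standard PCA is used here as a stand-in, whereas the pPCA formulation of the record additionally handles missing epochs through an EM fit. The synthetic data and the single-component choice are assumptions for illustration (Python with NumPy and scikit-learn).

      import numpy as np
      from sklearn.decomposition import PCA

      # Synthetic stand-in: 25 stations sharing one common-mode signal plus local noise
      rng = np.random.default_rng(0)
      common = rng.standard_normal((1000, 1))                   # CME temporal pattern
      residuals = (common @ rng.uniform(0.5, 1.5, (1, 25))
                   + 0.5 * rng.standard_normal((1000, 25)))

      pca = PCA(n_components=1)
      scores = pca.fit_transform(residuals)                     # 1st PC = common mode
      cme = pca.inverse_transform(scores)                       # rank-1 reconstruction
      filtered = residuals - cme                                # spatially filtered series
      share = pca.explained_variance_ratio_[0]                  # cf. >36% reported above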

  20. Characteristics of official and experimental GRACE time series by GFZ and CSR - with applications to polar signals

    Science.gov (United States)

    Horvath, Alexander; Horwath, Martin; Pail, Roland

    2014-05-01

    The Release-05 monthly solutions by the three centers of the GRACE Science and Data System are a significant improvement with respect to the previous Release 4. Meanwhile, previous assessments have revealed different noise levels between the solutions by CSR, GFZ and JPL, and also different amplitudes of interannual signal in the solutions by GFZ as compared to the two other centers. Encouraged by the science community, GFZ and CSR have kindly provided additional sets of time series. GFZ has reprocessed the RL05 monthly solutions (up to degree and order 90) with revised processing. CSR has made available monthly solutions with standard processing up to degree and order 96, in addition to their solutions up to degree and order 60. We compare these different time series with respect to their signal and noise content and analyze them on global and regional scales. On the regional scale, special attention is paid to Antarctica and to revealing polar signals such as ice mass trends and GIA. Following the necessity of destriping, an optimal choice for the setup of the Swenson & Wahr filter approach is evaluated to adapt to the specific signal and noise levels in Antarctica. Furthermore, we analyze the potential benefit of mixed time series solutions in order to combine the strengths of the available solutions. Concerning the question of an optimal maximum degree, we suggest that for resolving large polar ice mass changes it would be beneficial to provide gravity field variations even beyond degree 90.

  1. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume aims to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  2. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    Science.gov (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. The resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
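
    A minimal sketch of the underlying idea, thresholding a recurrence matrix of a logistic-map series and reading it as the adjacency matrix of a network; the density-enhancement (point-reduction) step specific to RDE-CN is omitted, and the threshold value and one-dimensional (non-embedded) state are assumptions for illustration (plain NumPy).

      import numpy as np

      # Logistic map time series in the chaotic regime
      x = np.empty(2000)
      x[0] = 0.4
      for i in range(1, len(x)):
          x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

      # Recurrence matrix: two states are "recurrent" if closer than a threshold eps
      eps = 0.05
      dist = np.abs(x[:, None] - x[None, :])
      R = (dist < eps).astype(int)
      np.fill_diagonal(R, 0)

      # Reinterpret R as the adjacency matrix of a complex network
      degree = R.sum(axis=1)                          # degree of each time point
      density = R.sum() / (len(x) * (len(x) - 1))     # recurrence rate / edge density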

  3. A large-signal dynamic simulation for the series resonant converter

    Science.gov (United States)

    King, R. J.; Stuart, T. A.

    1983-01-01

    A simple nonlinear discrete-time dynamic model for the series resonant dc-dc converter is derived using approximations appropriate to most power converters. This model is useful for the dynamic simulation of a series resonant converter using only a desktop calculator. The model is compared with a laboratory converter for a large transient event.

  4. On-line diagnostic techniques for air-operated control valves based on time series analysis

    International Nuclear Information System (INIS)

    Ito, Kenji; Matsuoka, Yoshinori; Minamikawa, Shigeru; Komatsu, Yasuki; Satoh, Takeshi.

    1996-01-01

    The objective of this research is to study the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves - numerous valves of this type are used in PWR plants. Generally, these techniques can detect anomalies caused by failures in their initial stages, which are difficult to detect by conventional surveillance of directly measured process parameters. However, the effectiveness of these techniques depends on the system being diagnosed. The difficulties in applying diagnostic techniques to air-operated control valves seem to come from the reduced sensitivity of their response as compared with hydraulic control systems, as well as the need to identify anomalies in low-level signals that fluctuate only slightly but continuously. In this research, simulation tests were performed by setting various kinds of failure modes for a test valve with the same specifications as a valve actually used in the plants. Actual control signals recorded from an operating plant were then used as input signals for the simulation. The results of the tests confirmed the feasibility of applying on-line diagnostic techniques based on time series analysis to air-operated control valves. (author)
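
    One common way to realize such time-series-based diagnostics is to fit an autoregressive model to a signal recorded under normal operation and to flag departures of the residual variance on newly monitored data; this generic sketch is an assumption about the approach, not the diagnostic system described in the record, and all signals are synthetic stand-ins (plain NumPy).

      import numpy as np

      def fit_ar(x, p):
          """Least-squares fit of an AR(p) model; returns the p coefficients."""
          X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
          return np.linalg.lstsq(X, x[p:], rcond=None)[0]

      def ar_residuals(x, coef):
          p = len(coef)
          X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
          return x[p:] - X @ coef

      # Train on a signal recorded under normal operation, then monitor new data:
      # a sustained rise of the residual variance above the baseline flags an anomaly.
      rng = np.random.default_rng(1)
      normal = rng.standard_normal(5000)
      coef = fit_ar(normal, p=8)
      baseline = ar_residuals(normal, coef).var()
      monitored = rng.standard_normal(1000) * 1.5      # stand-in for an on-line record
      alarm = ar_residuals(monitored, coef).var() > 2.0 * baseline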

  5. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.

  6. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  7. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  8. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  9. Application of spectral decomposition of 222Rn activity concentration signal series measured in Niedźwiedzia Cave to identification of mechanisms responsible for different time-period variations

    International Nuclear Information System (INIS)

    Przylibski, Tadeusz Andrzej; Wyłomańska, Agnieszka; Zimroz, Radosław; Fijałkowska-Lichwa, Lidia

    2015-01-01

    The authors present an application of spectral decomposition of 222Rn activity concentration signal series as a mathematical tool used for distinguishing processes determining temporal changes of radon concentration in cave air. The authors demonstrate that decomposition of a monitored signal such as 222Rn activity concentration in cave air facilitates characterizing the processes affecting changes in the measured concentration of this gas. Thanks to this, one can better correlate and characterize the influence of various processes on radon behaviour in cave air. Distinguishing and characterising these processes enables the understanding of radon behaviour in the cave environment and it may also enable and facilitate using radon as a precursor of geodynamic phenomena in the lithosphere. Thanks to the conducted analyses, the authors confirmed the unquestionable influence of convective air exchange between the cave and the atmosphere on seasonal and short-term (diurnal) changes in 222Rn activity concentration in cave air. Thanks to the applied methodology of signal analysis and decomposition, the authors also identified a third process affecting 222Rn activity concentration changes in cave air. This is a deterministic process causing changes in radon concentration, with a distribution different from the Gaussian one. The authors consider these changes to be the effect of turbulent air movements caused by the movement of visitors in caves. This movement is heterogeneous in terms of the number of visitors per group and the number of groups visiting a cave per day and per year. Such a process perfectly elucidates the observed character of the registered changes in 222Rn activity concentration in one of the decomposed components of the analysed signal. The obtained results encourage further research into precise relationships between the registered 222Rn activity concentration changes and factors causing them, as well as into using radon as a precursor of geodynamic

  10. Signals analysis of fluxgate array for wire rope defaults

    International Nuclear Information System (INIS)

    Gu Wei; Chu Jianxin

    2005-01-01

    In order to detect the magnetic leakage fields of wire rope defects, a transducer made up of a fluxgate array is designed, and a series of characteristic values of wire rope defect signals are defined. By processing the characteristic signals, the LF or LMA of the wire rope is distinguished, and the defect extent is estimated. The experimental results of the new method for detecting wire rope faults are presented

  11. Acoustic signal analysis in the creeping discharge

    International Nuclear Information System (INIS)

    Nakamiya, T; Sonoda, Y; Tsuda, R; Ebihara, K; Ikegami, T

    2008-01-01

    We have previously succeeded in measuring the acoustic signal due to the dielectric barrier discharge and discriminating the dominant frequency components of the acoustic signal. The dominant frequency components of the acoustic signal generated by the dielectric barrier discharge appear above 20 kHz. Recently, surface discharge control technology has attracted attention for practical applications such as ozonizers, NOx reactors, light sources and displays. Fundamental experiments were carried out to examine the creeping discharge using the acoustic signal. When a high voltage (6 kV, f = 10 kHz) is applied to the electrode, the discharge current flows and acoustic sound is generated. The current and voltage waveforms of the creeping discharge and the sound signal detected by a condenser microphone are stored in a digital memory scope. In this scheme, the Continuous Wavelet Transform (CWT) is applied to discriminate the acoustic sound of the micro discharge and the dominant frequency components are studied. CWT results of the sound signal show a wideband frequency spectrum up to 100 kHz. In addition, the energy distributions of the acoustic signal are examined by CWT
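
    A minimal sketch of a continuous wavelet transform of a microphone-like signal; the sampling rate, the Morlet wavelet, the scale range and the synthetic high-frequency burst are assumptions for illustration, not the measurement setup of the record (Python with NumPy and PyWavelets).

      import numpy as np
      import pywt

      fs = 500_000                 # 500 kHz sampling, enough to resolve ~100 kHz content
      t = np.arange(0, 0.01, 1 / fs)
      # Stand-in for the microphone record: a 10 kHz tone plus a late 60 kHz burst
      signal = (np.sin(2 * np.pi * 10e3 * t)
                + 0.3 * np.sin(2 * np.pi * 60e3 * t) * (t > 0.005))

      scales = np.geomspace(2, 200, 100)
      coefs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1 / fs)
      energy = np.abs(coefs) ** 2   # time-frequency energy distribution of the sound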

  12. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis result can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology and has been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
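
    A minimal sketch of the two ingredients named above, an autocorrelation-based representation of each usage series and a plain fuzzy c-means clustering of those feature vectors; the synthetic series, the number of lags and the fuzzifier m are assumptions for illustration, not the study's configuration (plain NumPy).

      import numpy as np

      def autocorr_features(x, n_lags=20):
          """Autocorrelation-based representation of one usage time series."""
          x = x - x.mean()
          acf = np.correlate(x, x, mode='full')[len(x) - 1:]
          return acf[1:n_lags + 1] / acf[0]

      def fuzzy_cmeans(X, c, m=2.0, n_iter=100):
          """Plain fuzzy c-means on feature vectors X of shape (n_samples, n_features)."""
          rng = np.random.default_rng(0)
          U = rng.dirichlet(np.ones(c), size=len(X))           # membership matrix
          for _ in range(n_iter):
              W = U ** m
              centers = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
              U = 1.0 / (d ** (2 / (m - 1)))
              U /= U.sum(axis=1, keepdims=True)
          return centers, U

      # Hypothetical usage: 249 daily-usage series, clustered into four groups
      rng = np.random.default_rng(0)
      series = rng.standard_normal((249, 365))
      X = np.array([autocorr_features(s) for s in series])
      centers, U = fuzzy_cmeans(X, c=4)   # U[i, k]: membership of elder i in cluster k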

  13. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

    Gold is a soft, malleable, bright yellow metallic element that is unaffected by air or most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medical applications. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's gold production in the future. The Box-Jenkins time series method was used to perform the time series analysis with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of the prediction is tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA (3,1,1) model was found to be the best fitted model with a MAPE equal to 3.704%, indicating the prediction is very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors to understand the gold production scenario and later plan gold mining activities in Malaysia.
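
    A minimal sketch of fitting an ARIMA(3,1,1) model to a monthly series and checking the forecast with MAPE, in the spirit of the Box-Jenkins steps above; the synthetic production series and the 12-month hold-out are assumptions for illustration, not the study's data (Python with NumPy and statsmodels).

      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      # Hypothetical monthly production series (the study's best model was ARIMA(3,1,1))
      rng = np.random.default_rng(42)
      production = 100 + np.cumsum(rng.normal(0.5, 2.0, size=120))

      train, test = production[:-12], production[-12:]
      model = ARIMA(train, order=(3, 1, 1)).fit()
      forecast = model.forecast(steps=12)

      # Mean absolute percentage error of the 12-month forecast
      mape = np.mean(np.abs((test - forecast) / test)) * 100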

  14. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

    Full Text Available The aim of this analysis is to forecast a mini-market sales volume for the period of twelve months from August 2015 to August 2016. The study is based on the monthly sales in Iraqi Dinar for a private local mini-market for the months of April 2014 to July 2015. As revealed by the graph, and if the stagnant economic conditions continue, the trend of future sales is downward. Based on time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, low profit margin and operating expenses, the revenues may not be adequate enough to produce positive net income and the business may not be able to operate afterward. The principal question arising from this is that forecasting sales in the region will be difficult where the business cycle is so dynamic and volatile due to systematic risks and an unforeseeable future.

  15. Signal Analysis for Radiation Event Identification

    Energy Technology Data Exchange (ETDEWEB)

    Steven A. Wallace

    2004-12-30

    The method of digitizing the scintillation output signals from a lithiated sol-gel based glass is described. The design considerations for using the lithiated scintillator for the detection of Special Nuclear Material (SNM) are presented.

  16. Pipe-anchor discontinuity analysis utilizing power series solutions, Bessel functions, and Fourier series

    International Nuclear Information System (INIS)

    Williams, Dennis K.; Ranson, William F.

    2003-01-01

    One of the paradigmatic classes of problems that frequently arise in the piping stress analysis discipline is the effect of local stresses created by supports and restraints attachments. Over the past 20 years, concerns have been identified by both regulatory agencies in the nuclear power industry and others in the process and chemicals industries concerning the effect of various stiff clamping arrangements on the expected life of the pipe and its various piping components. In many of the commonly utilized geometries and arrangements of pipe clamps, the elasticity problem becomes the axisymmetric stress and deformation determination in a hollow cylinder (pipe) subjected to the appropriate boundary conditions and respective loads per se. One of the geometries that serves as a pipe anchor comprises two pipe clamps that are bolted tightly to the pipe and affixed to a modified shoe-type arrangement. The shoe is employed for the purpose of providing an immovable base that can be easily attached either by bolting or welding to a structural steel pipe rack. Over the past 50 years, the computational tools available to the piping analyst have changed dramatically and thereby have caused the implementation of solutions to the basic problems of elasticity to change likewise. The need to obtain closed form elasticity solutions, however, has always been a driving force in engineering. The employment of symbolic calculus that is currently available through numerous software packages makes closed form solutions very economical. This paper briefly traces the solutions over the past 50 years to a variety of axisymmetric stress problems involving hollow circular cylinders employing a Fourier series representation. In the present example, a properly chosen Fourier series represents the mathematical simulation of the imposed axial displacements on the outside diametrical surface. A general solution technique is introduced for the axisymmetric discontinuity stresses resulting from an

  17. Schottky signal analysis: tune and chromaticity computation

    CERN Document Server

    Chanon, Ondine

    2016-01-01

    Schottky monitors are used to determine important beam parameters in a non-destructive way. The Schottky signal is due to the internal statistical fluctuations of the particles inside the beam. In this report, after explaining the different components of a Schottky signal, an algorithm to compute the betatron tune is presented, followed by some ideas to compute machine chromaticity. The tests have been performed with offline and/or online LHC data.

  18. Fractal time series analysis of postural stability in elderly and control subjects

    Directory of Open Access Journals (Sweden)

    Doussot Michel

    2007-05-01

    Full Text Available Abstract Background The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H, may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time-series that are as short as possible. The present study compares two methods of calculating H: Detrended Fluctuation Analysis (DFA and Stabilogram Diffusion Analysis (SDA for elderly and control subjects, as well as evaluating the effect of recording duration. Methods Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC was used as a measure of reliability. Results Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion Both methods would be well-suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s.

  19. Inorganic chemical analysis of environmental materials—A lecture series

    Science.gov (United States)

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

    At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answer the research question-total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  20. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji

    2010-01-01

    The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because such volumetric reports along the course of individual aging have scarcely been presented. The subjects analyzed were brain images of 2 healthy males and 18 females with an average age of 69.0 y, for which T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a time series of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As templates for the regions, the standard gray matter atlas (icbn452 atlas probability gray) and its labeled version (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for standardization of individual brains. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Volumes of regions were calculated as the ratio of their voxels to the whole-brain voxels, in percent. It was found that the regional volumes decreased with aging in all of the above lobes and the cerebellum, at average rates per year of -0.11, -0.07, -0.04, -0.02, and -0.03 percent, respectively. The procedure for calculation of the regional volumes, which has hitherto been performed manually, can be conducted automatically for individual brains using the standard atlases above. (T.T.)

  1. An Optimization-Driven Analysis Pipeline to Uncover Biomarkers and Signaling Paths: Cervix Cancer

    Directory of Open Access Journals (Sweden)

    Enery Lorenzo

    2015-05-01

    Full Text Available Establishing how a series of potentially important genes might relate to each other is relevant to understanding the origin and evolution of illnesses such as cancer. High-throughput biological experiments have played a critical role in providing information in this regard. A special challenge, however, is that of trying to reconcile information from separate microarray experiments to build a potential genetic signaling path. This work proposes a two-step analysis pipeline, based on optimization, to approach meta-analysis aiming to build a proxy for a genetic signaling path.

  2. Orbiter CCTV video signal noise analysis

    Science.gov (United States)

    Lawton, R. M.; Blanke, L. R.; Pannett, R. F.

    1977-01-01

    The amount of steady state and transient noise which will couple to orbiter CCTV video signal wiring is predicted. The primary emphasis is on the interim system, however, some predictions are made concerning the operational system wiring in the cabin area. Noise sources considered are RF fields from on board transmitters, precipitation static, induced lightning currents, and induced noise from adjacent wiring. The most significant source is noise coupled to video circuits from associated circuits in common connectors. Video signal crosstalk is the primary cause of steady state interference, and mechanically switched control functions cause the largest induced transients.

  3. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A

    2005-01-01

    We analyzed databases with gait time series of adults and persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponent of these series does not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path, with sensors on both feet, so there is one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We tried to find differences between the two time series that can be used as indicators of equilibrium problems

  4. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

    We analyzed databases with gait time series of adults and persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponent of these series does not show clear tendencies; we contend that this is because some gait time series have monofractal behavior and others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path, with sensors on both feet, so there is one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross-correlation analysis. We tried to find differences between the two time series that can be used as indicators of equilibrium problems.

  5. Automatic analysis of signals during Eddy currents controls

    International Nuclear Information System (INIS)

    Chiron, D.

    1983-06-01

    A method and the corresponding instrument have been developed for the automatic analysis of eddy current testing signals. This apparatus enables the simultaneous analysis, every 2 milliseconds, of two signals at two different frequencies. It can be used either on-line with an eddy current testing instrument or with a magnetic tape recorder [fr]

  6. Compressive Sensing: Analysis of Signals in Radio Astronomy

    Directory of Open Access Journals (Sweden)

    Gaigals G.

    2013-12-01

    Full Text Available The compressive sensing (CS) theory says that for some kinds of signals there is no need to keep or transfer all the data acquired according to the Nyquist criterion. In this work we investigate whether the CS approach is applicable for recording and analysis of radio astronomy (RA) signals. Since CS methods are applicable to signals with sparse (and compressible) representations, the compressibility of RA signals is verified. As a result, we identify which RA signals can be processed using CS, find the parameters which can improve or degrade CS application to RA results, and describe the optimum way to perform signal filtering in CS applications. Also, a range of virtual LabVIEW instruments is created for signal analysis with the CS theory.

  7. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    modulated dc–dc series resonant converter (SRC) operating in discontinuous conduction mode (DCM). The conventional fundamental harmonic approximation technique is extended for a non-ideal series resonant tank to clarify the limitations of ...

  8. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and independent component analysis is not very clear, and the solution is not unique. Aiming at these disadvantages, a new linear and instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear and instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and uncorrelatedness of the source signals. Based on this theoretical link, the source signal separation and reconstruction problem is then converted into PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both waveform and amplitude information of the uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only waveform information can be separated and reconstructed when the linear mixing matrix is column-orthogonal but not normalized; and the source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or not linear.

  9. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. By using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering important nodes than the common aggregating method.
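
    A minimal sketch of eigenvector centrality by power iteration on a toy supra-adjacency matrix in which two time layers are coupled by identity inter-layer links; the paper's supra-evolution matrix instead learns the inter-layer couplings from time series and exploits the block structure to work with low-dimensional matrices, which is not reproduced here (plain NumPy).

      import numpy as np

      def eigenvector_centrality(A, n_iter=200, tol=1e-10):
          """Power iteration for the leading eigenvector of a nonnegative adjacency matrix."""
          v = np.ones(A.shape[0]) / A.shape[0]
          for _ in range(n_iter):
              v_new = A @ v
              v_new /= np.linalg.norm(v_new)
              if np.linalg.norm(v_new - v) < tol:
                  break
              v = v_new
          return v

      # Toy supra-adjacency: two time layers of the same 4-node network, coupled
      # node-to-node by identity blocks (a simplification of learned couplings).
      rng = np.random.default_rng(3)
      layer1 = (rng.random((4, 4)) < 0.5).astype(float)
      layer2 = (rng.random((4, 4)) < 0.5).astype(float)
      np.fill_diagonal(layer1, 0)
      np.fill_diagonal(layer2, 0)
      supra = np.block([[layer1, np.eye(4)], [np.eye(4), layer2]])
      centrality = eigenvector_centrality(supra)       # one score per node per layer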

  10. Adult Craniopharyngioma: Case Series, Systematic Review, and Meta-Analysis.

    Science.gov (United States)

    Dandurand, Charlotte; Sepehry, Amir Ali; Asadi Lari, Mohammad Hossein; Akagami, Ryojo; Gooderham, Peter

    2017-12-18

    The optimal therapeutic approach for adult craniopharyngioma remains controversial. Some advocate for gross total resection (GTR), while others advocate for subtotal resection followed by adjuvant radiotherapy (STR + XRT). To conduct a systematic review and meta-analysis assessing the rate of recurrence over a follow-up of 3 yr in adult craniopharyngioma stratified by extent of resection and presence of adjuvant radiotherapy. MEDLINE (1946-July 1, 2016) and EMBASE (1980-June 30, 2016) were systematically reviewed. From 1975 to 2013, 33 patients were treated with initial surgical resection for adult-onset craniopharyngioma at our center and were reviewed for inclusion in this study. Data from 22 patients were available for inclusion as a case series in the systematic review. Eligible studies (n = 21) were identified from the literature in addition to a case series of our institutional experience. Three groups were available for analysis: GTR, STR + XRT, and STR. The rates of recurrence were 17%, 27%, and 45%, respectively. The risk of developing recurrence was significant for GTR vs STR (odds ratio [OR]: 0.24, 95% confidence interval [CI]: 0.15-0.38) and STR + XRT vs STR (OR: 0.20, 95% CI: 0.10-0.41). Risk of recurrence after GTR vs STR + XRT did not reach significance (OR: 0.63, 95% CI: 0.33-1.24, P = .18). This is the first and largest systematic review focusing on the rate of recurrence in adult craniopharyngioma. Although the rates of recurrence favor GTR, the difference in risk of recurrence did not reach significance. This study provides guidance to clinicians and directions for future research with the need to stratify outcomes by treatment modality. Copyright © 2017 by the Congress of Neurological Surgeons

  11. Series-nonuniform rational B-spline signal feedback: From chaos to any embedded periodic orbit or target point

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Chenxi, E-mail: cxshao@ustc.edu.cn; Xue, Yong; Fang, Fang; Bai, Fangzhou [Department of Computer Science and Technology, University of Science and Technology of China, Hefei 230027 (China); Yin, Peifeng [Department of Computer Science and Engineering, Pennsylvania State University, State College, Pennsylvania 16801 (United States); Wang, Binghong [Department of Modern Physics, University of Science and Technology of China, Hefei 230026 (China)

    2015-07-15

    The self-controlling feedback control method requires an external periodic oscillator with special design, which is technically challenging. This paper proposes a chaos control method based on time series non-uniform rational B-splines (SNURBS for short) signal feedback. It first builds the chaos phase diagram or chaotic attractor with the sampled chaotic time series and any target orbit can then be explicitly chosen according to the actual demand. Second, we use the discrete timing sequence selected from the specific target orbit to build the corresponding external SNURBS chaos periodic signal, whose difference from the system current output is used as the feedback control signal. Finally, by properly adjusting the feedback weight, we can quickly lead the system to an expected status. We demonstrate both the effectiveness and efficiency of our method by applying it to two classic chaotic systems, i.e., the Van der Pol oscillator and the Lorenz chaotic system. Further, our experimental results show that compared with delayed feedback control, our method takes less time to obtain the target point or periodic orbit (from the starting point) and that its parameters can be fine-tuned more easily.

  12. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Science.gov (United States)

    Lawhern, Vernon; Hairston, W David; Robbins, Kay

    2013-01-01

    Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  13. Series-nonuniform rational B-spline signal feedback: From chaos to any embedded periodic orbit or target point.

    Science.gov (United States)

    Shao, Chenxi; Xue, Yong; Fang, Fang; Bai, Fangzhou; Yin, Peifeng; Wang, Binghong

    2015-07-01

    The self-controlling feedback control method requires an external periodic oscillator with special design, which is technically challenging. This paper proposes a chaos control method based on time series non-uniform rational B-splines (SNURBS for short) signal feedback. It first builds the chaos phase diagram or chaotic attractor with the sampled chaotic time series and any target orbit can then be explicitly chosen according to the actual demand. Second, we use the discrete timing sequence selected from the specific target orbit to build the corresponding external SNURBS chaos periodic signal, whose difference from the system current output is used as the feedback control signal. Finally, by properly adjusting the feedback weight, we can quickly lead the system to an expected status. We demonstrate both the effectiveness and efficiency of our method by applying it to two classic chaotic systems, i.e., the Van der Pol oscillator and the Lorenz chaotic system. Further, our experimental results show that compared with delayed feedback control, our method takes less time to obtain the target point or periodic orbit (from the starting point) and that its parameters can be fine-tuned more easily.

  14. DETECT: a MATLAB toolbox for event detection and identification in time series, with applications to artifact detection in EEG signals.

    Directory of Open Access Journals (Sweden)

    Vernon Lawhern

    Full Text Available Recent advances in sensor and recording technology have allowed scientists to acquire very large time-series datasets. Researchers often analyze these datasets in the context of events, which are intervals of time where the properties of the signal change relative to a baseline signal. We have developed DETECT, a MATLAB toolbox for detecting event time intervals in long, multi-channel time series. Our primary goal is to produce a toolbox that is simple for researchers to use, allowing them to quickly train a model on multiple classes of events, assess the accuracy of the model, and determine how closely the results agree with their own manual identification of events without requiring extensive programming knowledge or machine learning experience. As an illustration, we discuss application of the DETECT toolbox for detecting signal artifacts found in continuous multi-channel EEG recordings and show the functionality of the tools found in the toolbox. We also discuss the application of DETECT for identifying irregular heartbeat waveforms found in electrocardiogram (ECG) data as an additional illustration.

  15. Interrupted time-series analysis: studying trends in neurosurgery.

    Science.gov (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
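
    A minimal sketch of the segmented-regression form commonly used for ITSA, with terms for the pre-intervention trend, the immediate level change, and the change in trend after the intervention; the synthetic monthly outcome and the intervention month are assumptions for illustration (Python with NumPy and statsmodels).

      import numpy as np
      import statsmodels.api as sm

      # Synthetic monthly outcome with an intervention at month 36
      rng = np.random.default_rng(7)
      t = np.arange(72)
      post = (t >= 36).astype(float)            # level change after the intervention
      t_post = np.where(t >= 36, t - 36, 0)     # slope change after the intervention
      y = 10 + 0.2 * t + 3.0 * post - 0.4 * t_post + rng.normal(0, 1, 72)

      X = sm.add_constant(np.column_stack([t, post, t_post]))
      fit = sm.OLS(y, X).fit()
      # fit.params: [baseline level, pre-trend, immediate level shift, change in trend]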

  16. Performance analysis of NOAA tropospheric signal delay model

    International Nuclear Information System (INIS)

    Ibrahim, Hassan E; El-Rabbany, Ahmed

    2011-01-01

    Tropospheric delay is one of the dominant global positioning system (GPS) errors, which degrades the positioning accuracy. Recent development in tropospheric modeling relies on implementation of more accurate numerical weather prediction (NWP) models. In North America one of the NWP-based tropospheric correction models is the NOAA Tropospheric Signal Delay Model (NOAATrop), which was developed by the US National Oceanic and Atmospheric Administration (NOAA). Because of its potential to improve the GPS positioning accuracy, the NOAATrop model became the focus of many researchers. In this paper, we analyzed the performance of the NOAATrop model and examined its effect on ionosphere-free-based precise point positioning (PPP) solution. We generated 3 year long tropospheric zenith total delay (ZTD) data series for the NOAATrop model, Hopfield model, and the International GNSS Services (IGS) final tropospheric correction product, respectively. These data sets were generated at ten IGS reference stations spanning Canada and the United States. We analyzed the NOAATrop ZTD data series and compared them with those of the Hopfield model. The IGS final tropospheric product was used as a reference. The analysis shows that the performance of the NOAATrop model is a function of both season (time of the year) and geographical location. However, its performance was superior to the Hopfield model in all cases. We further investigated the effect of implementing the NOAATrop model on the ionosphere-free-based PPP solution convergence and accuracy. It is shown that the use of the NOAATrop model improved the PPP solution convergence by 1%, 10% and 15% for the latitude, longitude and height components, respectively

  17. Modeling activity patterns of wildlife using time-series analysis.

    Science.gov (United States)

    Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo

    2017-04-01

    The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda ( Ailuropoda melanoleuca ). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.
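
    A hedged sketch of the frequency-based approach described above, using the continuous wavelet transform from PyWavelets on a simulated hourly activity series; the signal, scales, and 24-hour band limits are assumptions, not collar data.

    ```python
    # Continuous wavelet transform of an activity series and time-resolved power
    # in the ~24 h band.
    import numpy as np
    import pywt

    hours = np.arange(0, 24 * 60)                            # 60 days of hourly samples
    activity = (1 + 0.5 * np.sin(2 * np.pi * hours / 24)     # daily cycle
                + 0.3 * np.sin(2 * np.pi * hours / 12)       # weaker 12 h harmonic
                + 0.2 * np.random.default_rng(1).normal(size=hours.size))

    scales = np.arange(1, 128)
    coefs, freqs = pywt.cwt(activity, scales, 'morl', sampling_period=1.0)  # freqs in cycles/hour
    power = np.abs(coefs) ** 2

    periods = 1.0 / freqs
    band_24h = power[(periods > 20) & (periods < 28)].mean(axis=0)  # time-resolved 24 h band power
    print(band_24h[:5])
    ```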

  18. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
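
    The wrap-around idea can be sketched with an ordinary polar plot: month-of-year is mapped to angle so successive years overlay. The monthly series below is simulated; the RRose/WATS-specific smoothing and error bands are not reproduced.

    ```python
    # Circular (wrap-around) display of a seasonal monthly series.
    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    monthly = 100 + 10 * np.sin(2 * np.pi * np.arange(120) / 12) + rng.normal(0, 3, 120)  # 10 years

    theta = 2 * np.pi * (np.arange(monthly.size) % 12) / 12   # month of year -> angle
    ax = plt.subplot(projection='polar')
    for year in range(10):
        sl = slice(12 * year, 12 * (year + 1))
        ax.plot(np.append(theta[sl], theta[sl][0]),           # close each yearly loop
                np.append(monthly[sl], monthly[sl][0]), alpha=0.6)
    ax.set_xticks(theta[:12])
    ax.set_xticklabels(['J', 'F', 'M', 'A', 'M', 'J', 'J', 'A', 'S', 'O', 'N', 'D'])
    plt.show()
    ```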

  19. Pattern theory the stochastic analysis of real-world signals

    CERN Document Server

    Mumford, David

    2010-01-01

    Pattern theory is a distinctive approach to the analysis of all forms of real-world signals. At its core is the design of a large variety of probabilistic models whose samples reproduce the look and feel of the real signals, their patterns, and their variability. Bayesian statistical inference then allows you to apply these models in the analysis of new signals. This book treats the mathematical tools, the models themselves, and the computational algorithms for applying statistics to analyze six representative classes of signals of increasing complexity. The book covers patterns in text, sound

  20. Analysis and prediction of leucine-rich nuclear export signals

    DEFF Research Database (Denmark)

    La Cour, T.; Kiemer, Lars; Mølgaard, Anne

    2004-01-01

    We present a thorough analysis of nuclear export signals and a prediction server, which we have made publicly available. The machine learning prediction method is a significant improvement over the generally used consensus patterns. Nuclear export signals (NESs) are extremely important regulators ... this analysis is that the most important properties of NESs are accessibility and flexibility, allowing relevant proteins to interact with the signal. Furthermore, we show that not only the known hydrophobic residues are important in defining a nuclear export signal. We employ both neural networks and hidden Markov models ...

  1. Integrating a Linear Signal Model with Groundwater and Rainfall time-series on the Characteristic Identification of Groundwater Systems

    Science.gov (United States)

    Chen, Yu-Wen; Wang, Yetmen; Chang, Liang-Cheng

    2017-04-01

    Groundwater resources play a vital role in regional supply. To avoid irreversible environmental impacts such as land subsidence, characteristic identification of the groundwater system is crucial before sustainable management of the groundwater resource. This study proposes a signal-processing approach to identify the character of groundwater systems based on long-term hydrologic observations, namely groundwater levels and rainfall. The study contains two steps. First, a linear signal model (LSM) is constructed and calibrated to simulate the variation of underground hydrology based on the time series of groundwater levels and rainfall. The mass-balance equation of the proposed LSM contains three major terms, the net rate of horizontal exchange, the rate of rainfall recharge, and the rate of pumpage, and four parameters require calibration. Because reliable pumpage records are rare, the time-variant groundwater amplitudes at daily frequency (P) calculated by STFT are assumed to be linear indicators of pumpage and used instead of pumpage records. Time series obtained from 39 observation wells and 50 rainfall stations in and around the study area, Pintung Plain, are paired for model construction. Second, the well-calibrated parameters of the linear signal model can be used to interpret the characteristics of the groundwater system. For example, the rainfall recharge coefficient (γ) is the transfer ratio between rainfall intensity and groundwater-level rise. An area around an observation well with higher γ indicates that the saturated zone there is easily affected by rainfall events and that the material of the unsaturated zone is likely gravel or coarse sand with a high infiltration ratio. Considering the spatial distribution of γ, its values decrease from the upstream to the downstream reaches of the major rivers and are also correlated with the spatial distribution of the grain size of the surface soil. Via the time series of groundwater levels and rainfall, the well-calibrated parameters of the LSM thus characterize the groundwater system.
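
    A minimal sketch of the kind of linear mass-balance model described above, assuming a discrete daily update with hypothetical parameters (k, h_b, gamma, c) and synthetic forcing; calibration against observed levels (e.g. by least squares) is omitted.

    ```python
    # Discrete groundwater-level mass balance driven by rainfall recharge,
    # lateral exchange and a pumpage proxy (all inputs are synthetic).
    import numpy as np

    def simulate_lsm(rain, pump_proxy, k=0.05, h_b=10.0, gamma=0.02, c=0.5, h0=9.0):
        """Daily groundwater-level simulation h[t] from rainfall and a pumpage indicator."""
        h = np.empty(rain.size)
        h[0] = h0
        for t in range(1, rain.size):
            exchange = k * (h_b - h[t - 1])      # net horizontal exchange toward a boundary head
            recharge = gamma * rain[t]           # rainfall recharge
            pumping = c * pump_proxy[t]          # pumpage proxy (e.g. daily-frequency amplitude)
            h[t] = h[t - 1] + exchange + recharge - pumping
        return h

    rng = np.random.default_rng(3)
    rain = rng.gamma(0.5, 10.0, 365)             # synthetic daily rainfall (mm)
    pump = 0.1 + 0.05 * rng.random(365)          # synthetic pumpage indicator
    print(simulate_lsm(rain, pump)[:5])
    ```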

  2. Small-signal analysis of granular semiconductors

    Energy Technology Data Exchange (ETDEWEB)

    Varpula, Aapo; Sinkkonen, Juha; Novikov, Sergey, E-mail: aapo.varpula@tkk.f [Department of Micro and Nanosciences, Aalto University, PO Box 13500, FI-00076 Aalto, Espoo (Finland)

    2010-11-01

    The small-signal ac response of granular n-type semiconductors is calculated analytically using the drift-diffusion theory when electronic trapping at grain boundaries is present. An electrical equivalent circuit (EEC) model of a granular n-type semiconductor is presented. The analytical model is verified with numerical simulation performed by SILVACO ATLAS. The agreement between the analytical and numerical results is very good in a broad frequency range at low dc bias voltages.

  3. Small-signal analysis of granular semiconductors

    International Nuclear Information System (INIS)

    Varpula, Aapo; Sinkkonen, Juha; Novikov, Sergey

    2010-01-01

    The small-signal ac response of granular n-type semiconductors is calculated analytically using the drift-diffusion theory when electronic trapping at grain boundaries is present. An electrical equivalent circuit (EEC) model of a granular n-type semiconductor is presented. The analytical model is verified with numerical simulation performed by SILVACO ATLAS. The agreement between the analytical and numerical results is very good in a broad frequency range at low dc bias voltages.

  4. Analysis of transient signals by Wavelet transform

    International Nuclear Information System (INIS)

    Penha, Rosani Libardi da; Silva, Aucyone A. da; Ting, Daniel K.S.; Oliveira Neto, Jose Messias de

    2000-01-01

    The objective of this work is to apply the Wavelet Transform to transient signals. The wavelet technique can outline short-time events that are not easily detected using traditional techniques. In this work, the Wavelet Transform is compared with the Fourier Transform using simulated data and rotor rig data. These data contain known transients. The wavelet transform could follow all the transients, which the Fourier techniques could not. (author)
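
    A small illustration in the spirit of the comparison above: a simulated signal containing a brief transient is examined with the FFT (global) and with a discrete wavelet decomposition, whose detail coefficients localize the event. The signal and wavelet choice are assumptions.

    ```python
    # FFT vs. wavelet detail coefficients for localizing a short transient.
    import numpy as np
    import pywt

    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    x = np.sin(2 * np.pi * 50 * t)
    x[1000:1020] += 2.0 * np.sin(2 * np.pi * 200 * t[1000:1020])   # 20 ms transient at t = 1 s

    spectrum = np.abs(np.fft.rfft(x))             # global: the transient barely changes the spectrum
    coeffs = pywt.wavedec(x, 'db4', level=4)
    d1 = coeffs[-1]                               # finest-scale detail coefficients
    burst_index = np.argmax(np.abs(d1))           # large detail coefficients localize the transient
    print('transient near t =', burst_index * (x.size / d1.size) / fs, 's')
    ```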

  5. Large scale analysis of signal reachability.

    Science.gov (United States)

    Todor, Andrei; Gabr, Haitham; Dobra, Alin; Kahveci, Tamer

    2014-06-15

    Major disorders, such as leukemia, have been shown to alter the transcription of genes. Understanding how gene regulation is affected by such aberrations is of utmost importance. One promising strategy toward this objective is to compute whether signals can reach the transcription factors through the transcription regulatory network (TRN). Due to the uncertainty of the regulatory interactions, this is a #P-complete problem and thus solving it for very large TRNs remains a challenge. We develop a novel and scalable method to compute the probability that a signal originating at any given set of source genes can arrive at any given set of target genes (i.e., transcription factors) when the topology of the underlying signaling network is uncertain. Our method tackles this problem for large networks while providing a provably accurate result. Our method follows a divide-and-conquer strategy. We break down the given network into a sequence of non-overlapping subnetworks such that reachability can be computed autonomously and sequentially on each subnetwork. We represent each interaction using a small polynomial. The product of these polynomials expresses the different scenarios in which a signal can or cannot reach the target genes from the source genes. We introduce polynomial collapsing operators for each subnetwork. These operators reduce the size of the resulting polynomial and thus reduce the computational complexity dramatically. We show that our method scales to entire human regulatory networks in only seconds, while existing methods fail beyond a few tens of genes and interactions. We demonstrate that our method can successfully characterize key reachability characteristics of the entire transcription regulatory networks of patients affected by eight different subtypes of leukemia, as well as those from healthy control samples. All the datasets and code used in this article are available at bioinformatics.cise.ufl.edu/PReach/scalable.htm.

  6. Time series analysis of soil Radon-222 recorded at Kutch region, Gujarat, India

    International Nuclear Information System (INIS)

    Madhusudan Rao, K.; Rastogi, B.K.; Barman, Chiranjib; Chaudhuri, Hirok

    2013-01-01

    The Kutch region in Gujarat lies in a seismically vulnerable zone (seismic zone V). After the devastating Bhuj earthquake (7.7M) of January 26, 2001 in the Kutch region, several researchers focused their attention on monitoring geophysical and geochemical precursors for earthquakes in the region. In order to find possible geochemical precursory signals for earthquake events, we monitored the radioactive gas radon-222 in subsurface soil gas in the Kutch region. We analysed the recorded soil radon-222 time series by means of nonlinear techniques such as FFT power spectral analysis, empirical mode decomposition, and multifractal analysis, along with other linear statistical methods. Some noteworthy results arising from the nonlinear analysis of this time series are discussed in the present paper. The analysis helped us to recognize the nature and pattern of the soil radon-222 emanation process. Moreover, the recording and the statistical and nonlinear analysis of soil radon data in the Kutch region will assist us in understanding the preparation phase of an imminent seismic event in the region. (author)

  7. Signal analysis of steam line acoustics

    International Nuclear Information System (INIS)

    Martin, C. Samuel

    2003-01-01

    The vibration of nuclear steam piping is usually associated with pressure fluctuations emanating from flow disturbances such as steam generator nozzles, bends, or other pipe fittings. Flow separation at pipe tees and within steam chest manifolds or headers generates pressure fluctuations that propagate both upstream to the steam generators and downstream to the steam turbine. Steady-state acoustic oscillations at various frequencies occur within the piping, possibly exciting structural vibrations. This paper focuses on assessing the origin of the disturbances using signal analyses of two dynamic pressure recordings from transducers located along straight runs in the steam piping. The technique involves computing the cross spectrum of two dynamic pressure signals in the piping (1) between the steam generator and the steam chest header, and (2) between the header and the steam turbine outlet. If, at a specified frequency, no causality exists between the two signals, then the cross-spectrum magnitude will be negligible. Of interest here is the value of the phase between the two signals at frequencies for which the magnitude of the cross spectrum is not negligible. It is shown in the paper that the direction of the dominant waves at all frequencies can be related to the phase angle of the cross spectrum. It has to be realized that pressure waves emanating from one source, such as a steam generator, will propagate along uniform steam pipes with little transformation or attenuation, but will be reflected at fittings and at inlets and outlets. Hence, the eventual steady-state time record at a given location in the piping is a result not only of the disturbance, but also of reflections of earlier pulsations. Cross-spectral analysis has been employed to determine the direction of the dominant acoustic waves in the piping at the various frequencies for which there are signals. To prove the technique, synthetic spectra are generated comprising harmonic waves moving both upstream and downstream.
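
    A hedged sketch of the cross-spectral technique: the phase of the cross spectrum between two pressure records, one a delayed copy of the other, carries the propagation direction. The sampling rate, delay, and noise level are illustrative only.

    ```python
    # Cross spectral density and phase between two simulated pressure signals.
    import numpy as np
    from scipy import signal

    fs = 2000.0
    t = np.arange(0, 10, 1 / fs)
    rng = np.random.default_rng(4)
    source = rng.normal(size=t.size)
    delay = 25                                   # samples of acoustic travel time between transducers
    p1 = source
    p2 = np.roll(source, delay) + 0.1 * rng.normal(size=t.size)   # p2 lags p1

    f, Pxy = signal.csd(p1, p2, fs=fs, nperseg=4096)
    phase = np.angle(Pxy)        # a phase sloping down with frequency indicates waves traveling 1 -> 2
    print(phase[1:6])
    ```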

  8. Spectral analysis of time series of events: effect of respiration on heart rate in neonates

    International Nuclear Information System (INIS)

    Van Drongelen, Wim; Williams, Amber L; Lasky, Robert E

    2009-01-01

    Certain types of biomedical processes such as the heart rate generator can be considered as signals that are sampled by the occurring events, i.e. QRS complexes. This sampling property generates problems for the evaluation of spectral parameters of such signals. First, the irregular occurrence of heart beats creates an unevenly sampled data set which must either be pre-processed (e.g. by using trace binning or interpolation) prior to spectral analysis, or analyzed with specialized methods (e.g. Lomb's algorithm). Second, the average occurrence of events determines the Nyquist limit for the sampled time series. Here we evaluate different types of spectral analysis of recordings of neonatal heart rate. Coupling between respiration and heart rate and the detection of heart rate itself are emphasized. We examine both standard and data adaptive frequency bands of heart rate signals generated by models of coupled oscillators and recorded data sets from neonates. We find that an important spectral artifact occurs due to a mirror effect around the Nyquist limit of half the average heart rate. Further we conclude that the presence of respiratory coupling can only be detected under low noise conditions and if a data-adaptive respiratory band is used
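
    A small example of one specialized method mentioned above, the Lomb-Scargle periodogram, applied to a simulated, unevenly sampled beat-interval series with an assumed respiratory modulation below the event-rate Nyquist limit.

    ```python
    # Lomb-Scargle spectrum of irregularly sampled RR intervals.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(5)
    beat_times = np.cumsum(0.5 + 0.05 * rng.normal(size=600))   # ~120 bpm, uneven event sampling
    resp_rate = 0.7                                             # assumed respiratory frequency (Hz)
    rr = np.diff(beat_times)                                    # RR intervals (s)
    t = beat_times[1:]
    rr = rr + 0.02 * np.sin(2 * np.pi * resp_rate * t)          # respiratory coupling on the intervals

    freqs = np.linspace(0.1, 1.0, 300)                          # stay below ~half the mean event rate
    pgram = lombscargle(t, rr - rr.mean(), 2 * np.pi * freqs)   # angular frequencies expected
    print('peak frequency ~', freqs[np.argmax(pgram)], 'Hz')
    ```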

  9. Time Series Analysis of the Quasar PKS 1749+096

    Science.gov (United States)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
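
    A compact sketch of two of the time-series functions named above, the autocorrelation function and a first-order structure function, applied to a simulated, regularly sampled light curve; the QUI package itself handles generic discretely sampled data, which real photometry rarely samples this evenly.

    ```python
    # Autocorrelation function and first-order structure function of a light curve.
    import numpy as np

    rng = np.random.default_rng(6)
    flux = np.cumsum(rng.normal(size=500))          # random-walk stand-in for a light curve
    flux -= flux.mean()

    def acf(x, max_lag):
        return np.array([np.corrcoef(x[:-k], x[k:])[0, 1] if k else 1.0 for k in range(max_lag)])

    def structure_function(x, max_lag):
        return np.array([np.mean((x[k:] - x[:-k]) ** 2) for k in range(1, max_lag)])

    print(acf(flux, 10))
    print(structure_function(flux, 10))
    ```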

  10. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    This paper presents a critical analysis of the signal-processing flow carried out in GPS receiver software, which serves as a basis for a critical comparison of different signal-processing architectures within the GPS receiver. Increased flexibility and reduced commercial cost of GPS devices, including mobile devices, can be achieved by using software-defined radio (SDR) technology. The SDR approach can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (digital signal processing) or FPGA (field-programmable gate array) circuit, which allows a simple change of digital signal-processing algorithms and of the receiver parameters. The starting point of the research is the signal generated on the satellite, the structure of which is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be at the carrier frequencies L1 and L2. Since the SPS is used in the civil service, all the tests shown in the work were performed on the L1 signal. The signal coming to the receiver is generated with spread-spectrum technology and lies below the noise level. Such signals often interfere with signals from the environment, which makes it difficult for a receiver to perform proper detection and signal processing. Therefore, signal-processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post-processing, i.e.
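
    A simplified sketch of the acquisition stage: circular correlation of received samples with a local code replica over a grid of Doppler bins, implemented with FFTs. A random ±1 sequence stands in for a real C/A code, and the sample rate, code length, and search grid are assumptions.

    ```python
    # FFT-based acquisition: search code delay and Doppler of a spread-spectrum signal.
    import numpy as np

    fs = 4.096e6                        # sample rate (assumed)
    n = 4096                            # one code period worth of samples (assumed)
    rng = np.random.default_rng(7)
    code = rng.choice([-1.0, 1.0], n)   # stand-in pseudorandom spreading code

    true_doppler, true_shift = 2500.0, 1234
    t = np.arange(n) / fs
    received = np.roll(code, true_shift) * np.exp(2j * np.pi * true_doppler * t)
    received += 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))

    code_fft = np.conj(np.fft.fft(code))
    best = (0.0, None, None)
    for doppler in np.arange(-5000, 5001, 500):                 # Doppler search bins
        wiped = received * np.exp(-2j * np.pi * doppler * t)    # carrier wipe-off
        corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))
        if corr.max() > best[0]:
            best = (corr.max(), doppler, int(np.argmax(corr)))
    print('Doppler ~', best[1], 'Hz, code shift ~', best[2], 'samples')
    ```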

  11. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence, or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify sub-periods with a similar structure within a data series. Moreover, this article examines the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. Two examples are presented: the first corresponds to seven simulated series with a first-order autoregressive structure, and the second to seven meteorological series of surface air temperature anomalies in two Colombian regions.
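
    A minimal numerical example of the equivalent (effective) sample size idea for a first-order autoregressive series, using the classical n(1-ρ)/(1+ρ) approximation; the AR(1) coefficient is arbitrary.

    ```python
    # Equivalent sample size of an autocorrelated AR(1) series.
    import numpy as np

    rng = np.random.default_rng(8)
    phi, n = 0.6, 1000
    x = np.empty(n)
    x[0] = rng.normal()
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()

    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # lag-1 autocorrelation estimate
    n_eff = n * (1 - r1) / (1 + r1)                # classical AR(1) equivalent sample size
    print(round(r1, 3), round(n_eff, 1))           # roughly 1000 * 0.4 / 1.6 = 250
    ```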

  12. Automated preparation of Kepler time series of planet hosts for asteroseismic analysis

    DEFF Research Database (Denmark)

    Handberg, R.; Lund, M. N.

    2014-01-01

    One of the tasks of the Kepler Asteroseismic Science Operations Center (KASOC) is to provide asteroseismic analyses on Kepler Objects of Interest (KOIs). However, asteroseismic analysis of planetary host stars presents some unique complications with respect to data preprocessing, compared to pure asteroseismic targets. If not accounted for, the presence of planetary transits in the photometric time series often greatly complicates or even hinders these asteroseismic analyses. This drives the need for specialised methods of preprocessing data to make them suitable for asteroseismic analysis. In this paper we present the KASOC Filter, which is used to automatically prepare data from the Kepler/K2 mission for asteroseismic analyses of solar-like planet host stars. The methods are very effective at removing unwanted signals of both instrumental and planetary origins and produce significantly cleaner ...

  13. Use of a prototype pulse oximeter for time series analysis of heart rate variability

    Science.gov (United States)

    González, Erika; López, Jehú; Hautefeuille, Mathieu; Velázquez, Víctor; Del Moral, Jésica

    2015-05-01

    This work presents the development of a low cost pulse oximeter prototype consisting of pulsed red and infrared commercial LEDs and a broad spectral photodetector used to register time series of heart rate and oxygen saturation of blood. This platform, besides providing these values, like any other pulse oximeter, processes the signals to compute a power spectrum analysis of the patient heart rate variability in real time and, additionally, the device allows access to all raw and analyzed data if databases construction is required or another kind of further analysis is desired. Since the prototype is capable of acquiring data for long periods of time, it is suitable for collecting data in real life activities, enabling the development of future wearable applications.
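
    A hedged sketch of the kind of heart-rate-variability spectrum such a platform can compute: RR intervals are resampled onto an even grid and a Welch power spectrum is taken. The simulated RR series, resampling rate, and LF/HF band limits are assumptions, not the prototype's implementation.

    ```python
    # Welch power spectrum of a resampled RR-interval series and LF/HF band powers.
    import numpy as np
    from scipy import signal, interpolate

    rng = np.random.default_rng(9)
    rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.25 * np.arange(300) * 0.8) + 0.01 * rng.normal(size=300)
    beat_times = np.cumsum(rr)

    fs = 4.0                                                    # even resampling rate (Hz)
    grid = np.arange(beat_times[0], beat_times[-1], 1 / fs)
    rr_even = interpolate.interp1d(beat_times, rr, kind='cubic')(grid)

    f, pxx = signal.welch(rr_even - rr_even.mean(), fs=fs, nperseg=256)
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf, hf = np.trapz(pxx[lf_band], f[lf_band]), np.trapz(pxx[hf_band], f[hf_band])
    print('LF/HF ratio ~', round(lf / hf, 2))
    ```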

  14. Signal integrity analysis on discontinuous microstrip line

    International Nuclear Information System (INIS)

    Qiao, Qingyang; Dai, Yawen; Chen, Zipeng

    2013-01-01

    In high-speed PCB design, microstrip lines are used to control the impedance; however, discontinuous microstrip lines can cause signal-integrity problems. In this paper, we use transmission-line theory to study the characteristics of microstrip lines. Research results indicate that discontinuities such as truncation, gaps and size changes result in problems such as radiation, reflection, delay and ground bounce. We model the discontinuities as distributed-parameter circuits and analyse the steady-state response, the transient response and the phase delay. The transient response causes radiation and voltage jumps.

  15. Applications of wavelet transforms for nuclear power plant signal analysis

    International Nuclear Information System (INIS)

    Seker, S.; Turkcan, E.; Upadhyaya, B.R.; Erbay, A.S.

    1998-01-01

    The safety of Nuclear Power Plants (NPPs) may be enhanced by the timely processing of information derived from multiple process signals from NPPs. The most widely used technique in signal analysis applications is the Fourier transform in the frequency domain to generate power spectral densities (PSD). However, the Fourier transform is global in nature and will obscure any non-stationary signal feature. Lately, a powerful technique called the Wavelet Transform, has been developed. This transform uses certain basis functions for representing the data in an effective manner, with capability for sub-band analysis and providing time-frequency localization as needed. This paper presents a brief overview of wavelets applied to the nuclear industry for signal processing and plant monitoring. The basic theory of Wavelets is also summarized. In order to illustrate the application of wavelet transforms data were acquired from the operating nuclear power plant Borssele in the Netherlands. The experimental data consist of various signals in the power plant and are selected from a stationary power operation. Their frequency characteristics and the mutual relations were investigated using MATLAB signal processing and wavelet toolbox for computing their PSDs and coherence functions by multi-resolution analysis. The results indicate that the sub-band PSD matches with the original signal PSD and enhances the estimation of coherence functions. The Wavelet analysis demonstrates the feasibility of application to stationary signals to provide better estimates in the frequency band of interest as compared to the classical FFT approach. (author)

  16. Social Signals, their function, and automatic analysis: A survey

    NARCIS (Netherlands)

    Vinciarelli, Alessandro; Pantic, Maja; Bourlard, Hervé; Pentland, Alex

    2008-01-01

    Social Signal Processing (SSP) aims at the analysis of social behaviour in both Human-Human and Human-Computer interactions. SSP revolves around automatic sensing and interpretation of social signals, complex aggregates of nonverbal behaviours through which individuals express their attitudes

  17. Signal-dependent independent component analysis by tunable mother wavelets

    International Nuclear Information System (INIS)

    Seo, Kyung Ho

    2006-02-01

    The objective of this study is to improve standard independent component analysis when applied to real-world signals. Independent component analysis starts from the assumption that signals from different physical sources are statistically independent. However, real-world signals such as EEG, ECG, MEG, and fMRI signals are not perfectly statistically independent. By definition, standard independent component analysis algorithms are not able to estimate statistically dependent sources, that is, when the assumption of independence does not hold. Therefore, some preprocessing stage is needed before independent component analysis. This paper starts from the simple intuition that source signals wavelet-transformed with a 'well-tuned' mother wavelet will be sufficiently simplified, so that the source separation will show better results. The tuning between the source signal and the tunable mother wavelet was carried out using the correlation coefficient method. The gamma component of a raw EEG signal was set as the target signal, and the wavelet transform was performed with the tuned mother wavelet and with standard mother wavelets. Simulation results for these wavelets are shown.

  18. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  19. Time-causal decomposition of geomagnetic time series into secular variation, solar quiet, and disturbance signals

    Science.gov (United States)

    Rigler, E. Joshua

    2017-04-26

    A theoretical basis and prototype numerical algorithm are provided that decompose regular time series of geomagnetic observations into three components: secular variation, solar quiet, and disturbance. Respectively, these three components correspond roughly to slow changes in the Earth’s internal magnetic field, periodic daily variations caused by quasi-stationary (with respect to the sun) electrical current systems in the Earth’s magnetosphere, and episodic perturbations to the geomagnetic baseline that are typically driven by fluctuations in a solar wind that interacts electromagnetically with the Earth’s magnetosphere. In contrast to similar algorithms applied to geomagnetic data in the past, this one addresses the issue of real-time data acquisition directly by applying a time-causal, exponential smoother with “seasonal corrections” to the data as soon as they become available.
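
    A much-simplified, time-causal sketch of the decomposition idea: an exponential smoother tracks the slow baseline (secular variation), a slowly adapted daily template tracks solar quiet, and the residual is treated as disturbance. The smoothing constants and one-minute cadence are assumptions, not the prototype algorithm.

    ```python
    # Causal decomposition of a synthetic geomagnetic record into SV, SQ and disturbance.
    import numpy as np

    def decompose(x, samples_per_day=1440, alpha_sv=1e-4, alpha_sq=0.2):
        sv = np.empty_like(x)
        sq = np.zeros_like(x)
        template = np.zeros(samples_per_day)       # running estimate of the daily (solar quiet) curve
        sv_level = x[0]
        for i, value in enumerate(x):
            sv_level += alpha_sv * (value - sv_level)            # slow baseline update
            sv[i] = sv_level
            phase = i % samples_per_day
            template[phase] += alpha_sq * ((value - sv_level) - template[phase])
            sq[i] = template[phase]
        return sv, sq, x - sv - sq                                # disturbance = residual

    rng = np.random.default_rng(10)
    minutes = np.arange(10 * 1440)
    field = 20000 + 0.001 * minutes + 15 * np.sin(2 * np.pi * minutes / 1440) + rng.normal(0, 2, minutes.size)
    sv, sq, dist = decompose(field)
    print(round(dist.std(), 2))
    ```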

  20. Damage localization of marine risers using time series of vibration signals

    Science.gov (United States)

    Liu, Hao; Yang, Hezhen; Liu, Fushun

    2014-10-01

    A damage detection algorithm based on dynamic response signals is developed for marine risers. Damage detection methods based on modal properties have encountered issues in offshore oil research; for example, a significant increase in structural mass due to marine plant/animal growth, or changes in modal properties caused by equipment noise, are not the result of damage to riser structures. In an attempt to eliminate the need to determine modal parameters, a data-based method is developed. The implementation of the method requires that vibration data first be standardized to remove the influence of different loading conditions; the autoregressive moving average (ARMA) model is then used to fit the vibration response signals. In addition, a damage feature factor is introduced based on the autoregressive (AR) parameters. After that, the Euclidean distance between ARMA models is used as a damage indicator for damage detection and localization, and a top-tensioned riser simulation model with different damage scenarios is analyzed using the proposed method, with dynamic acceleration responses of the marine riser as sensor data. Finally, the influence of measurement noise is analyzed. According to the damage localization results, the proposed method provides accurate damage locations for risers and is robust against noise.
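
    A condensed sketch of the data-based approach: fit AR models to standardized vibration records from a baseline and a test state and use the Euclidean distance between the coefficient vectors as a damage indicator. The model order and signals are hypothetical.

    ```python
    # AR-coefficient distance as a simple damage-sensitive feature.
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    def ar_features(x, order=8):
        x = (x - x.mean()) / x.std()                  # standardize to remove loading differences
        return AutoReg(x, lags=order).fit().params[1:]  # drop the intercept, keep AR coefficients

    rng = np.random.default_rng(11)
    t = np.arange(0, 20, 0.01)
    baseline = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.normal(size=t.size)
    damaged = np.sin(2 * np.pi * 1.0 * t) + 0.3 * rng.normal(size=t.size)   # shifted dynamics

    d = np.linalg.norm(ar_features(baseline) - ar_features(damaged))
    print('damage indicator:', round(d, 3))
    ```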

  1. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  2. A comparative analysis of spectral exponent estimation techniques for 1/f(β) processes with applications to the analysis of stride interval time series.

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin

    2014-01-30

    The time evolution and complex interactions of many nonlinear systems, such as the human body, result in fractal types of parameter outcomes that exhibit self-similarity over long time scales by a power law in the frequency spectrum S(f)=1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class-dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series.
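
    A basic spectral-index estimate for a 1/f(β) series, fitting the slope of the log-log power spectrum of a synthetic signal; the frequency range of the fit is an assumption, and the averaged-wavelet-coefficient method favored by the study is not reproduced here.

    ```python
    # Estimate beta from the slope of the log-log power spectrum of a synthetic 1/f signal.
    import numpy as np

    rng = np.random.default_rng(12)
    n, beta_true = 4096, 1.0
    freqs = np.fft.rfftfreq(n, d=1.0)
    amplitude = np.zeros(freqs.size)
    amplitude[1:] = freqs[1:] ** (-beta_true / 2)                  # shape the target spectrum
    phases = np.exp(2j * np.pi * rng.random(freqs.size))
    x = np.fft.irfft(amplitude * phases, n)

    pxx = np.abs(np.fft.rfft(x)) ** 2
    keep = (freqs > 1 / n) & (freqs < 0.25)                        # exclude lowest and highest bins
    slope, _ = np.polyfit(np.log(freqs[keep]), np.log(pxx[keep]), 1)
    print('estimated beta ~', round(-slope, 2))
    ```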

  3. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    International Nuclear Information System (INIS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-01-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  4. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J., E-mail: frederic.auchere@ias.u-psud.fr [Institut d’Astrophysique Spatiale, CNRS, Univ. Paris-Sud, Université Paris-Saclay, Bât. 121, F-91405 Orsay (France)

    2016-07-10

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  5. Prediction of solar cycle 24 using fourier series analysis

    International Nuclear Information System (INIS)

    Khalid, M.; Sultana, M.; Zaidi, F.

    2014-01-01

    Predicting the behavior of solar activity has become very significant because of its influence on Earth and the surrounding environment. Accurate predictions of the amplitude and timing of the next solar cycle will aid in estimating its various consequences for space weather. In the past, many prediction procedures have been used, with varying degrees of success, in the field of solar activity forecasting. In this study, solar cycle 24 is forecast by the Fourier series method, and a comparative analysis is made with the autoregressive integrated moving average method. Taking January 2008 as the minimum preceding solar cycle 24, the amplitude and shape of solar cycle 24 are approximated from the monthly sunspot numbers. This forecast framework approximates a mean solar cycle 24, with the maximum appearing during May 2014 (± 8 months) and a maximum sunspot number of 98 ± 10. Solar cycle 24 will end in June 2020 (± 7 months). The difference between the two consecutive peaks of solar cycles 23 and 24 is 165 months (± 6 months). (author)
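
    A schematic Fourier-series fit in the spirit of the method named above: a truncated Fourier series is least-squares fitted to a synthetic monthly record and extrapolated forward. The assumed period, number of harmonics, and data are placeholders, not the authors' configuration.

    ```python
    # Least-squares Fourier-series fit and extrapolation of a cyclic monthly series.
    import numpy as np

    rng = np.random.default_rng(13)
    months = np.arange(400)
    period = 132.0                                        # ~11-year cycle in months (assumed)
    sunspots = 60 + 50 * np.sin(2 * np.pi * months / period) + 8 * rng.normal(size=months.size)

    def design(t, n_harm=3, period=132.0):
        cols = [np.ones_like(t, dtype=float)]
        for k in range(1, n_harm + 1):
            cols += [np.cos(2 * np.pi * k * t / period), np.sin(2 * np.pi * k * t / period)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(design(months), sunspots, rcond=None)
    future = np.arange(400, 500)
    forecast = design(future) @ coef                      # extrapolated mean cycle
    print(forecast[:5].round(1))
    ```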

  6. Transistor Small Signal Analysis under Radiation Effects

    International Nuclear Information System (INIS)

    Sharshar, K.A.A.

    2004-01-01

    Small-signal transistor parameters dictate the operation of a bipolar transistor before and after exposure to gamma radiation (1 Mrad up to 5 Mrad) and to an electron beam (1 MeV, 25 mA) at the same doses; after irradiation, the electrical parameters of the device are changed. The circuit model has been discussed, with parameters such as the internal emitter resistance (re), the internal base resistance, the internal collector resistance, the emitter-base photocurrent and the base-collector photocurrent. These parameters affect the operation of the device in its applications as an active element; for example, degradation of the current gain (hFE ≡ β) is an effective parameter in device operation. The leakage currents (ICBO) and (IEBO) are also most important parameters, which increase with radiation dose. A theoretical representation of the change in the equivalent circuit for NPN and PNP bipolar transistors is discussed, and the input and output parameters of the two types are discussed in terms of the change in the small-signal input resistance. The emitter resistance (re) is changed by gamma and electron-beam irradiation, which alters the impedance matching between transistor stages. The transistor stability factors S(Ico), S(VBE) and S(β) are also determined to characterize transistor operation after exposure to radiation fields. At low doses the gain stability is modified due to recombination of induced charge generated during device fabrication; also, the load resistance values are connected to compensate for the effect.

  7. Photoacoustic signal and noise analysis for Si thin plate: signal correction in frequency domain.

    Science.gov (United States)

    Markushev, D D; Rabasović, M D; Todorović, D M; Galović, S; Bialkowski, S E

    2015-03-01

    Methods for photoacoustic signal measurement, rectification, and analysis for 85 μm thin Si samples in the 20-20 000 Hz modulation frequency range are presented. Methods for frequency-dependent amplitude and phase signal rectification in the presence of coherent and incoherent noise as well as distortion due to microphone characteristics are presented. Signal correction is accomplished using inverse system response functions deduced by comparing real to ideal signals for a sample with well-known bulk parameters and dimensions. The system response is a piece-wise construction, each component being due to a particular effect of the measurement system. Heat transfer and elastic effects are modeled using standard Rosencweig-Gersho and elastic-bending theories. Thermal diffusion, thermoelastic, and plasmaelastic signal components are calculated and compared to measurements. The differences between theory and experiment are used to detect and correct signal distortion and to determine detector and sound-card characteristics. Corrected signal analysis is found to faithfully reflect known sample parameters.
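
    A toy version of the correction idea: a frequency-dependent system response is deduced as the ratio of a measured to an ideal reference signal, and its inverse is applied to subsequent measurements. All spectra here are synthetic placeholders.

    ```python
    # Inverse-system-response rectification of a frequency-domain amplitude signal.
    import numpy as np

    freqs = np.logspace(np.log10(20), np.log10(20000), 200)          # 20 Hz - 20 kHz
    ideal_ref = 1.0 / np.sqrt(freqs)                                 # assumed theoretical amplitude
    system = 1.0 / (1 + 1j * freqs / 5000.0)                         # unknown instrument response
    measured_ref = np.abs(system) * ideal_ref

    response = measured_ref / ideal_ref                              # deduced system response
    new_measurement = np.abs(system) * (2.0 / np.sqrt(freqs))        # another sample, same instrument
    corrected = new_measurement / response                           # rectified amplitude
    print(np.allclose(corrected, 2.0 / np.sqrt(freqs)))
    ```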

  8. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
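
    A small example of one technique named above, a one-sided CUSUM applied to a sequence of inventory differences to flag a protracted loss; the slack and threshold values are illustrative.

    ```python
    # One-sided CUSUM on simulated inventory differences.
    import numpy as np

    rng = np.random.default_rng(14)
    inv_diff = rng.normal(0.0, 1.0, 60)
    inv_diff[40:] += 1.5                      # simulated protracted loss starting at period 40

    k, h = 0.5, 5.0                           # slack and decision threshold (in sigma units)
    s = 0.0
    for i, d in enumerate(inv_diff):
        s = max(0.0, s + d - k)               # accumulate evidence of a positive shift
        if s > h:
            print('alarm at period', i)
            break
    ```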

  9. A review of intelligent systems for heart sound signal analysis.

    Science.gov (United States)

    Nabih-Ali, Mohammed; El-Dahshan, El-Sayed A; Yahia, Ashraf S

    2017-10-01

    Intelligent computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of physicians and reduce the time required for accurate diagnosis. CAD systems could provide physicians with a suggestion about the diagnostic of heart diseases. The objective of this paper is to review the recent published preprocessing, feature extraction and classification techniques and their state of the art of phonocardiogram (PCG) signal analysis. Published literature reviewed in this paper shows the potential of machine learning techniques as a design tool in PCG CAD systems and reveals that the CAD systems for PCG signal analysis are still an open problem. Related studies are compared to their datasets, feature extraction techniques and the classifiers they used. Current achievements and limitations in developing CAD systems for PCG signal analysis using machine learning techniques are presented and discussed. In the light of this review, a number of future research directions for PCG signal analysis are provided.

  10. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method allowing the use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency-domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high-speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives. The number of calculations per iteration varies linearly
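
    For orientation, a brief example of ARMA fitting to a noise-like signal with statsmodels (general-purpose maximum-likelihood estimation, not the special-purpose correlation-based algorithm described above); the simulated ARMA(2,1) process and chosen orders are placeholders.

    ```python
    # Fit an ARMA(2,1) model to a simulated noise record.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(15)
    n = 2000
    e = rng.normal(size=n)
    x = np.zeros(n)
    for t in range(2, n):
        x[t] = 0.75 * x[t - 1] - 0.3 * x[t - 2] + e[t] + 0.4 * e[t - 1]   # ARMA(2,1)

    fit = ARIMA(x, order=(2, 0, 1)).fit()
    print(fit.arparams.round(2), fit.maparams.round(2))
    ```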

  11. On semi-classical questions related to signal analysis

    KAUST Repository

    Helffer, Bernard

    2011-12-01

    This study explores the reconstruction of a signal using spectral quantities associated with some self-adjoint realization of an h-dependent Schrödinger operator -h²(d²/dx²) - y(x), h > 0, when the parameter h tends to 0. Theoretical results in semi-classical analysis are proved. Some numerical results are also presented. We first consider as a toy model the sech² function. Then we study a real signal given by arterial blood pressure measurements. This approach seems to be very promising in signal analysis. Indeed it provides new spectral quantities that can give relevant information on some signals, as is the case for the arterial blood pressure signal.

  12. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card is able to input, store and display signals of biomedical interest. These signals can additionally be submitted to ad-hoc software for analysis and diagnosis. Data acquisition is based on the sampling of a signal at a given rate and amplitude resolution. The automation of signal processing conveys syntactic aspects (data transduction, conditioning and reduction); and semantic aspects (feature extraction to describe and characterize the signal and diagnostic classification). The analytical approach that is at the basis of computer programming allows for the successful resolution of apparently complex tasks. Two basic principles involved are the definition of simple fundamental functions that are then iterated and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.

  13. The Signal and Noise Analysis of Direct Conversion EHM Transceivers

    Directory of Open Access Journals (Sweden)

    Shayegh

    2006-01-01

    A direct-conversion modulator-demodulator with even harmonic mixers, with emphasis on noise analysis, is presented. The circuits consist of even harmonic mixers (EHMs) realized with antiparallel diode pairs (APDPs). We evaluate the different levels of I/Q imbalances and DC offsets and use signal-space concepts to analyze the bit error rate (BER) of the proposed transceiver using M-ary QAM schemes. Moreover, a simultaneous analysis of the signal and noise is presented.

  14. Efficient algorithm for baseline wander and powerline noise removal from ECG signals based on discrete Fourier series.

    Science.gov (United States)

    Bahaz, Mohamed; Benzid, Redha

    2018-03-01

    Electrocardiogram (ECG) signals are often contaminated with artefacts and noise which can lead to incorrect diagnosis when they are visually inspected by cardiologists. In this paper, the well-known discrete Fourier series (DFS) is re-explored and an efficient DFS-based method is proposed to reduce the contribution of both baseline wander (BW) and powerline interference (PLI) noise in ECG records. In the first step, the exact number of low-frequency harmonics contributing to BW is determined. Next, the baseline drift is estimated as the sum of all the associated Fourier sinusoid components. Then, the baseline shift is discarded efficiently by subtracting its approximated version from the original biased ECG signal. Concerning the PLI, the subtraction of the contributing harmonics, calculated in the same manner, efficiently reduces this type of noise. In addition to visual quality results, the proposed algorithm shows superior performance in terms of higher signal-to-noise ratio and smaller mean square error when compared with the DCT-based algorithm.
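
    A simplified frequency-domain sketch of the BW-removal idea: the baseline is estimated from the lowest Fourier harmonics of the record and subtracted. The harmonic cut-off and the crude synthetic ECG-like signal are assumptions, not the paper's DFS procedure.

    ```python
    # Estimate baseline wander from low-frequency Fourier harmonics and subtract it.
    import numpy as np

    fs = 360.0
    t = np.arange(0, 10, 1 / fs)
    ecg_like = np.sin(2 * np.pi * 1.2 * t) ** 63             # crude periodic spikes as a stand-in
    drift = 0.5 * np.sin(2 * np.pi * 0.15 * t)                # baseline wander
    contaminated = ecg_like + drift

    spectrum = np.fft.rfft(contaminated)
    freqs = np.fft.rfftfreq(contaminated.size, 1 / fs)
    baseline_spec = np.where(freqs <= 0.5, spectrum, 0)       # keep only harmonics below 0.5 Hz
    baseline = np.fft.irfft(baseline_spec, contaminated.size)
    cleaned = contaminated - baseline
    print(round(np.corrcoef(cleaned, ecg_like)[0, 1], 3))
    ```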

  15. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
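
    A compact sketch of the composite coarse-graining idea behind CMSE/RCMSE: at scale tau, matching-template counts are accumulated over all tau starting offsets before the entropy is taken. The simplified O(N²) sample-entropy routine below is for illustration only.

    ```python
    # Refined composite multiscale entropy (simplified illustration).
    import numpy as np

    def match_counts(x, m, tol):
        """Number of template pairs of length m whose Chebyshev distance is <= tol."""
        templ = np.array([x[i:i + m] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templ) - 1):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            count += int(np.sum(d <= tol))
        return count

    def rcmse(x, scale, m=2, r=0.15):
        tol = r * np.std(x)
        a = b = 0
        for offset in range(scale):                            # all coarse-grained series at this scale
            n_seg = (len(x) - offset) // scale
            cg = x[offset:offset + n_seg * scale].reshape(n_seg, scale).mean(axis=1)
            a += match_counts(cg, m + 1, tol)
            b += match_counts(cg, m, tol)
        return -np.log(a / b) if a > 0 and b > 0 else np.nan   # entropy from the summed counts

    rng = np.random.default_rng(16)
    noise = rng.normal(size=3000)
    print([round(rcmse(noise, s), 2) for s in (1, 2, 5)])
    ```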

  16. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  17. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley

    2016-01-01

    The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant.

  18. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data concerning Galactic Cosmic Rays time series (GCR) is inevitable since loss of data is due to mechanical and human failure or technical problems and different periods of operation of GCR stations. The aim of this study was to perform multiple dataset imputation in order to depict the observational dataset. The study has used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data compared to observed ROME series, with 50 replicates. Then, the CLMX station as a proxy for allocation of these scenarios was used. Three different methods for monthly dataset imputation were selected: AMÉLIA II - runs the bootstrap Expectation Maximization algorithm, MICE - runs an algorithm via Multivariate Imputation by Chained Equations and MTSDI - an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series compared with the observed ROME series has also been evaluated using several skill measures as such as RMSE, NRMSE, Agreement Index, R, R2, F-test and t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increases in the number of gaps generate loss of quality of the time series. Data imputation was more efficient with MTSDI method, with negligible errors and best skill coefficients. The results suggest a limit of about 60% of missing data for imputation, for monthly averages, no more than this. It is noteworthy that CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed reconstructing 43 time series.

  19. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  20. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    ... on speech production characteristics, but also helps in accurate analysis of speech. ... include time delay estimation, speech enhancement from single and multi-... log(E[k] / Σ_{l=0}^{K-1} E[l]) (7), where K is the number of samples in the ...

  1. Smoke Signals: Adolescent Smoking and School Continuation. Working Papers Series. SAN06-05

    Science.gov (United States)

    Cook, Philip J.; Hutchinson, Rebecca

    2006-01-01

    This paper presents an exploratory analysis using NLSY97 data of the relationship between the likelihood of school continuation and the choices of whether to smoke or drink. We demonstrate that in the United States as of the late 1990s, smoking in 11th-grade was a uniquely powerful predictor of whether the student finished high school, and if so…

  2. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  3. Music Structure Analysis from Acoustic Signals

    Science.gov (United States)

    Dannenberg, Roger B.; Goto, Masataka

    Music is full of structure, including sections, sequences of distinct musical textures, and the repetition of phrases or entire sections. The analysis of music audio relies upon feature vectors that convey information about music texture or pitch content. Texture generally refers to the average spectral shape and statistical fluctuation, often reflecting the set of sounding instruments, e.g., strings, vocal, or drums. Pitch content reflects melody and harmony, which is often independent of texture. Structure is found in several ways. Segment boundaries can be detected by observing marked changes in locally averaged texture.

  4. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al. (2005) and Makni et al. (2008), a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently using first path sampling for a small subset of fields and then using a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  5. siGnum: graphical user interface for EMG signal analysis.

    Science.gov (United States)

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals that represent the electrical activity of muscles can be used for various clinical and biomedical applications. These are complicated and highly varying signals that are dependent on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes a novel Graphical User Interface (GUI), siGnum, developed in MATLAB, which applies efficient and effective techniques to the processing of raw EMG signals and decomposes them in a simpler manner. It could be used independently of the MATLAB software by employing a deploy tool. This would enable researchers to gain a good understanding of the EMG signal and its analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.

  6. Time Series Analysis of Wheat flour Price Shocks in Pakistan: A Case Analysis

    OpenAIRE

    Asad Raza Abdi; Ali Hassan Halepoto; Aisha Bashir Shah; Faiz M. Shaikh

    2013-01-01

    The current research investigates wheat flour price shocks in Pakistan as a case analysis. Data were collected from secondary sources and analyzed using time series analysis in SPSS version 20. It was revealed that the price of wheat flour has increased over the last four decades, and the trend of price shocks shows that market variation and supply-and-demand shocks contribute positively to shocks in wheat prices. It was further revealed th...

  7. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose Luis

    1996-01-01

    Atucha II is a 745 MW Argentine power nuclear reactor constructed by ENACE SA, Nuclear Argentine Company for Electrical Power Generation, and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software named Risk-Spectrum, taking as a basis a reference signal coded as JR17ER003, which commands the two moderator loop valves. From the reliability and behavior knowledge of this reference signal, an estimation of the reliability of the other 97 RPS signals follows. Because of the preliminary character of this analysis, main importance measures are not performed at this stage. Reliability is predicted by the statistical value named unavailability. The scope of this analysis is restricted from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so in the Instrumentation and Control area there are no CCFs (Common Cause Failures) present for signals. Finally, those unavailability values could be introduced in the failure domain for the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed

  8. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2000-01-01

    Atucha II is a 745 MW Argentine power nuclear reactor constructed by Nuclear Argentine Company for Electric Power Generation S.A. (ENACE S.A.) and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software named Risk-Spectrum, taking as a basis a reference signal coded as JR17ER003, which commands the two moderator loop valves. From the reliability and behavior knowledge of this reference signal, an estimation of the reliability of the other 97 RPS signals follows. Because of the preliminary character of this analysis, main importance measures are not performed at this stage. Reliability is predicted by the statistical value named unavailability. The scope of this analysis is restricted from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so in the Instrumentation and Control area there are no CCFs (Common Cause Failures) present for signals. Finally, those unavailability values could be introduced in the failure domain for the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed. (author)

  9. Massive Signal Analysis with Hadoop (Invited)

    Science.gov (United States)

    Addair, T.

    2013-12-01

    The Geophysical Monitoring Program (GMP) at Lawrence Livermore National Laboratory is in the process of transitioning from a primarily human-driven analysis pipeline to a more automated and exploratory system. Waveform correlation represents a significant part of this effort, and the results that come out of this processing could lead to the development of more sophisticated event detection and analysis systems that require less human interaction, and address fundamental shortcomings in existing systems. Furthermore, use of distributed IO systems fundamentally addresses a scalability concern for the GMP as our data holdings continue to grow rapidly. As the data volume increases, it becomes less reasonable to rely upon human analysts to sift through all the information. Not only is more automation essential to keeping up with the ingestion rate, but so too do we require faster and more sophisticated tools for visualizing and interacting with the data. These issues of scalability are not unique to GMP or the seismic domain. All across the lab, and throughout industry, we hear about the promise of 'big data' to address the need of quickly analyzing vast amounts of data in fundamentally new ways. Our waveform correlation system finds and correlates nearby seismic events across the entire Earth. In our original implementation of the system, we processed some 50 TB of data on an in-house traditional HPC cluster (44 cores, 1 filesystem) over the span of 42 days. Having determined the primary bottleneck in the performance to be reading waveforms off a single BlueArc file server, we began investigating distributed IO solutions like Hadoop. As a test case, we took a 1 TB subset of our data and ported it to Livermore Computing's development Hadoop cluster. Through a pilot project sponsored by Livermore Computing (LC), the GMP successfully implemented the waveform correlation system in the Hadoop distributed MapReduce computing framework. Hadoop is an open source

  10. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship between the time series data. In this paper, GM(1,1) is based on first-order, single-variable linear differential equations; after an adaptive improvement and error correction, it is used to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both long-term and short-term changes prove that the model is effective and can achieve the expected accuracy.
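
    For orientation only, a minimal sketch of the basic GM(1,1) forecasting step mentioned above (before the paper's adaptive improvement and error correction) is given below; the function name and the use of numpy are assumptions of this sketch, not the authors' implementation.

        import numpy as np

        def gm11_forecast(x0, steps=1):
            """Grey GM(1,1) forecast of a positive-valued series (illustrative sketch)."""
            x0 = np.asarray(x0, float)
            x1 = np.cumsum(x0)                                    # accumulated generating operation (AGO)
            z1 = 0.5 * (x1[1:] + x1[:-1])                         # background values
            B = np.column_stack((-z1, np.ones_like(z1)))
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]      # developing coefficient and grey input
            n = len(x0)
            k = np.arange(n + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a     # response of the whitened equation
            x0_hat = np.concatenate(([x0[0]], np.diff(x1_hat)))   # inverse AGO back to the original scale
            return x0_hat[n:]                                     # the forecast values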

  11. Analysis of historical series of industrial demand of energy; Analisi delle serie storiche dei consumi energetici dell`industria

    Energy Technology Data Exchange (ETDEWEB)

    Moauro, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1995-03-01

    This paper reports a short-term analysis of the Italian demand for energy sources and a check of a statistical model that represents the industrial demand for energy sources as a function of prices and production, according to neoclassical microeconomic theory. For this purpose, monthly time series of industrial consumption of the main energy sources in 6 sectors, industrial production indexes in the same sectors, and indexes of energy prices (coal, natural gas, oil products, electricity) have been used. The statistical methodology refers to modern time series analysis and specifically to transfer function models. These permit rigorous identification and representation of the most important dynamic relations between the dependent variables (production and prices), as relations of an input-output system. The results have shown an important positive correlation between energy consumption and prices. Furthermore, the reliability of the forecasts and their use as monthly energy indicators has been shown.

  12. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

    Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  13. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.

  14. Postprocessing algorithm for automated analysis of pelvic intraoperative neuromonitoring signals

    Directory of Open Access Journals (Sweden)

    Wegner Celine

    2016-09-01

    Full Text Available Two-dimensional pelvic intraoperative neuromonitoring (pIONM®) is based on electric stimulation of autonomic nerves under observation of electromyography of the internal anal sphincter (IAS) and manometry of the urinary bladder. The method provides nerve identification and verification of its functional integrity. pIONM® is currently gaining increased attention at a time when preservation of function is becoming more and more important. Ongoing technical and methodological developments in experimental and clinical settings require further analysis of the obtained signals. This work describes a postprocessing algorithm for pIONM® signals, developed for automated analysis of large amounts of recorded data. The analysis routine includes a graphical representation of the recorded signals in the time and frequency domains, as well as a quantitative evaluation by means of features calculated from the time and frequency domains. The produced plots are summarized automatically in a PowerPoint presentation. The calculated features are filled into a standardized Excel sheet, ready for statistical analysis.

  15. The speech signal segmentation algorithm using pitch synchronous analysis

    Directory of Open Access Journals (Sweden)

    Amirgaliyev Yedilkhan

    2017-03-01

    Full Text Available Parameterization of the speech signal using the algorithms of analysis synchronized with the pitch frequency is discussed. Speech parameterization is performed by the average number of zero transitions function and the signal energy function. Parameterization results are used to segment the speech signal and to isolate the segments with stable spectral characteristics. Segmentation results can be used to generate a digital voice pattern of a person or be applied in the automatic speech recognition. Stages needed for continuous speech segmentation are described.
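
    As a purely illustrative sketch (the paper's own algorithm is pitch-synchronous and more involved), the two parameterization functions mentioned above, the average number of zero transitions and the signal energy, can be computed frame by frame as follows; the frame length, hop size and function name are assumptions.

        import numpy as np

        def frame_features(signal, frame_len=400, hop=160):
            """Frame-wise zero-crossing count and short-time energy (illustrative)."""
            zcr, energy = [], []
            for start in range(0, len(signal) - frame_len + 1, hop):
                frame = np.asarray(signal[start:start + frame_len], float)
                signs = np.signbit(frame).astype(np.int8)
                zcr.append(int(np.count_nonzero(np.diff(signs))))  # zero transitions in the frame
                energy.append(float(np.sum(frame ** 2)))           # short-time energy
            return np.array(zcr), np.array(energy)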

  16. Time-Frequency Analysis of Signals Generated by Rotating Machines

    Directory of Open Access Journals (Sweden)

    R. Zetik

    1999-06-01

    Full Text Available This contribution is devoted to higher-order time-frequency analyses of signals. Firstly, time-frequency representations of higher order (TFRHO) are defined. Then the L-Wigner distribution (LWD) is given as a special case of TFRHO. Basic properties of the LWD are illustrated based on the analysis of mono-component and multi-component synthetic signals and acoustical signals generated by a rotating machine. The obtained results confirm the usefulness of LWD application for the purpose of rotating machine condition monitoring.

  17. Analysis of musical expression in audio signals

    Science.gov (United States)

    Dixon, Simon

    2003-01-01

    In western art music, composers communicate their work to performers via a standard notation which specifies the musical pitches and relative timings of notes. This notation may also include some higher level information such as variations in the dynamics, tempo and timing. Famous performers are characterised by their expressive interpretation, the ability to convey structural and emotive information within the given framework. The majority of work on audio content analysis focusses on retrieving score-level information; this paper reports on the extraction of parameters describing the performance, a task which requires a much higher degree of accuracy. Two systems are presented: BeatRoot, an off-line beat tracking system which finds the times of musical beats and tracks changes in tempo throughout a performance, and the Performance Worm, a system which provides a real-time visualisation of the two most important expressive dimensions, tempo and dynamics. Both of these systems are being used to process data for a large-scale study of musical expression in classical and romantic piano performance, which uses artificial intelligence (machine learning) techniques to discover fundamental patterns or principles governing expressive performance.

  18. Analysis of the finescale timing of repeated signals: does shell rapping in hermit crabs signal stamina?

    Science.gov (United States)

    Briffa; Elwood

    2000-01-01

    Hermit crabs, Pagurus bernhardus, sometimes exchange shells after a period of shell rapping, when the initiating or attacking crab brings its shell rapidly and repeatedly into contact with the shell of the noninitiator or defender in a series of bouts. Bouts are separated by pauses, and raps within bouts are separated by very short periods called 'gaps'. Since within-contest variation is missed when signals are studied by averaging performance rates over entire contests, we analysed the fine within-bout structure of this repeated, aggressive signal. We found that the pattern is consistent with high levels of fatigue in initiators. The duration of the gaps between individual raps increased both within bouts and from bout to bout, and we conclude that this activity is costly to perform. Furthermore, long pauses between bouts are correlated with increased vigour of rapping in the subsequent bout, which suggests that the pause allows for recovery from fatigue induced by rapping. These between-bout pauses may be assessed by noninitiators and provide a signal of stamina. Copyright 2000 The Association for the Study of Animal Behaviour.

  19. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    Science.gov (United States)

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, like medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as, for example, electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  1. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
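
    As a hedged illustration of the data-driven identification step described above (not the authors' exact formulation), Koopman spectral properties are commonly approximated from snapshot data with dynamic mode decomposition; the truncation rank r and all names below are assumptions of this sketch.

        import numpy as np

        def dmd(X, r=10):
            """Exact DMD: approximate Koopman eigenvalues and modes from snapshots.
            X is an (n_observables, n_snapshots) data matrix."""
            X1, X2 = X[:, :-1], X[:, 1:]                   # consecutive snapshot pairs
            U, s, Vh = np.linalg.svd(X1, full_matrices=False)
            U, s, V = U[:, :r], s[:r], Vh[:r].conj().T     # rank-r truncation
            A_tilde = U.conj().T @ X2 @ V / s              # projected linear operator
            eigvals, W = np.linalg.eig(A_tilde)            # DMD (Koopman) eigenvalues
            modes = (X2 @ V / s) @ W                       # exact DMD modes
            return eigvals, modes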

  2. Looking for very low tectonic deformation in GNSS time series impacted by strong hydrological signal in the Okavango Delta, Botswana

    Science.gov (United States)

    Pastier, Anne-Morwenn; Dauteuil, Olivier; Murray-Hudson, Michael; Makati, Kaelo; Moreau, Frédérique; Crave, Alain; Longuevergne, Laurent; Walpersdorf, Andrea

    2017-04-01

    Located in northern Botswana, the Okavango Delta is a vast wetland, fed from the Angolan highlands and constrained by a half-graben in the Kalahari depression. Since the 1970s, the Okavango graben has usually been considered the terminus of the East African Rift System. But a recent geodetic study showed there has been no extension on the tectonic structure over the past 5 years, and recent geophysical studies began to call this hypothesis into question. The deformation in the area could instead be related to far-field deformation accommodation due to the motion of the Kalahari craton relative to the rest of the Nubian plate and to the opening of the Rift Valley. Extracting the vertical deformation is not trivial. The GNSS time series show a strong annual deformation of the ground surface (3 cm amplitude). On the vertical component, this periodic signal is so strong that it hides the long-term tectonic deformation, while this information would give a crucial insight into the geodynamic process at play. This periodic signal is related to the seasonal loading of water due to the rainy season. This hypothesis is corroborated by the modeling of the surface deformation based on GRACE satellite data, interpreted as the variation of groundwater amount. In the Okavango Delta, the peak of the water level is not in phase with local precipitation, but is driven by a flood pulse coming from the Angolan Highlands. The migration of this massive water body is not visible at first order in GRACE data. Yet, local precipitation is supposed to undergo too much evapotranspiration to be significant in the hydrological balance. Thus this latter water body is not supposed to produce a mass anomaly in GRACE time series. This paradox could highlight a relationship not yet defined between groundwater and local rainfall. The coarse spatial resolution of GRACE data (about 300 km) does not allow modeling accurate enough to give access to the slow tectonic deformation, nor to determine the

  3. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
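
    For illustration only, the permutation entropy used above can be computed from ordinal (Bandt-Pompe) patterns as sketched below; the embedding dimension, delay and function name are assumptions of this sketch.

        import numpy as np
        from math import factorial

        def permutation_entropy(x, m=3, tau=1):
            """Normalised permutation entropy of a 1-D series (illustrative sketch)."""
            x = np.asarray(x, float)
            n = len(x) - (m - 1) * tau
            counts = {}
            for i in range(n):
                pattern = tuple(np.argsort(x[i:i + m * tau:tau]))  # ordinal pattern of the window
                counts[pattern] = counts.get(pattern, 0) + 1
            p = np.array(list(counts.values()), float) / n
            H = -np.sum(p * np.log(p))                             # Shannon entropy of the patterns
            return H / np.log(factorial(m))                        # normalised to [0, 1]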

  4. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) method to analyze short-range and long-range characteristics of financial time series, and then applies this method to the five time series of five properties in four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithm distribution of MFDFA of five properties of four stock indices, the 3MPAR method shows some fluctuation characteristics of the financial time series and the stock markets. Then, it also shows a richer information of the financial time series by comparing the profile of five properties of four stock indices. In this paper, we not only focus on the multifractality of time series but also the fluctuation characteristics of the financial time series and subtle differences in the time series of different properties. We find that financial time series is far more complex than reported in some research works using one property of time series.
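
    For reference, the Rényi entropy that underlies the 3MPAR curves has the standard form (a textbook definition, not quoted from this record):

        \[
          H_q \;=\; \frac{1}{1-q}\,\log \sum_{i} p_i^{\,q}, \qquad q \neq 1,
          \qquad \lim_{q \to 1} H_q \;=\; -\sum_{i} p_i \log p_i ,
        \]

    so that varying q weights large and small probabilities differently, which is what lets the curves over q expose different parts of the fluctuation spectrum.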

  5. A method of signal transmission path analysis for multivariate random processes

    International Nuclear Information System (INIS)

    Oguma, Ritsuo

    1984-04-01

    A method for noise analysis called ''STP (signal transmission path) analysis'' is presented as a tool to identify noise sources and their propagation paths in multivariate random processes. The basic idea of the analysis is to identify, via time series analysis, the effective network for signal power transmission among variables in the system and to make use of this information in the noise analysis. In the present paper, we accomplish this through two steps of signal processing; first, we estimate, using noise power contribution analysis, the variables which have a large contribution to the power spectrum of interest, and then evaluate the STPs for each pair of variables to identify the STPs which play a significant role in transmitting the generated noise to the variable under evaluation. The latter part of the analysis is executed through comparison of the partial coherence function and the newly introduced partial noise power contribution function. This paper presents the procedure of the STP analysis and demonstrates, using simulation data as well as Borssele PWR noise data, its effectiveness for the investigation of noise generation and propagation mechanisms. (author)

  6. Frames and operator theory in analysis and signal processing

    CERN Document Server

    Larson, David R; Nashed, Zuhair; Nguyen, Minh Chuong; Papadakis, Manos

    2008-01-01

    This volume contains articles based on talks presented at the Special Session Frames and Operator Theory in Analysis and Signal Processing, held in San Antonio, Texas, in January of 2006. Recently, the field of frames has undergone tremendous advancement. Most of the work in this field is focused on the design and construction of more versatile frames and frames tailored towards specific applications, e.g., finite dimensional uniform frames for cellular communication. In addition, frames are now becoming a hot topic in mathematical research as a part of many engineering applications, e.g., matching pursuits and greedy algorithms for image and signal processing. Topics covered in this book include: Application of several branches of analysis (e.g., PDEs; Fourier, wavelet, and harmonic analysis; transform techniques; data representations) to industrial and engineering problems, specifically image and signal processing. Theoretical and applied aspects of frames and wavelets. Pure aspects of operator theory empha...

  7. Differentiating BOLD and non-BOLD signals in fMRI time series using multi-echo EPI.

    Science.gov (United States)

    Kundu, Prantik; Inati, Souheil J; Evans, Jennifer W; Luh, Wen-Ming; Bandettini, Peter A

    2012-04-15

    A central challenge in the fMRI based study of functional connectivity is distinguishing neuronally related signal fluctuations from the effects of motion, physiology, and other nuisance sources. Conventional techniques for removing nuisance effects include modeling of noise time courses based on external measurements followed by temporal filtering. These techniques have limited effectiveness. Previous studies have shown using multi-echo fMRI that neuronally related fluctuations are Blood Oxygen Level Dependent (BOLD) signals that can be characterized in terms of changes in R2* and initial signal intensity (S0) based on the analysis of echo-time (TE) dependence. We hypothesized that if TE-dependence could be used to differentiate BOLD and non-BOLD signals, non-BOLD signal could be removed to denoise data without conventional noise modeling. To test this hypothesis, whole brain multi-echo data were acquired at 3 TEs and decomposed with Independent Components Analysis (ICA) after spatially concatenating data across space and TE. Components were analyzed for the degree to which their signal changes fit models for R2* and S0 change, and summary scores were developed to characterize each component as BOLD-like or not BOLD-like. These scores clearly differentiated BOLD-like "functional network" components from non BOLD-like components related to motion, pulsatility, and other nuisance effects. Using non BOLD-like component time courses as noise regressors dramatically improved seed-based correlation mapping by reducing the effects of high and low frequency non-BOLD fluctuations. A comparison with seed-based correlation mapping using conventional noise regressors demonstrated the superiority of the proposed technique for both individual and group level seed-based connectivity analysis, especially in mapping subcortical-cortical connectivity. The differentiation of BOLD and non-BOLD components based on TE-dependence was highly robust, which allowed for the

  8. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5) - ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
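
    As a hedged example of the causality screening step described above (the authors' data, lag order and settings are not given in this record), a pairwise Granger causality check between PSEi and one candidate variable could be run with statsmodels as follows; the column names and lag order are hypothetical.

        import pandas as pd
        from statsmodels.tsa.stattools import grangercausalitytests

        def granger_screen(df, cause="cpi", effect="psei", maxlag=4):
            """p-values of the test that `cause` Granger-causes `effect` (illustrative)."""
            # df is assumed to hold stationary (e.g., differenced) monthly series
            res = grangercausalitytests(df[[effect, cause]].dropna(), maxlag=maxlag)
            return {lag: round(out[0]["ssr_ftest"][1], 4) for lag, out in res.items()}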

  9. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding to the explanatory variables may be time dependent to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed. Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases. The prediction performance of
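
    As a minimal, hedged sketch of the filtering machinery referred to above (a local level model with a random-walk trend rather than the paper's full trend-regression model), the Kalman recursions can be written as follows; the variance values and names are illustrative.

        import numpy as np

        def local_level_filter(y, var_eps=1.0, var_eta=0.1):
            """Kalman filter for y_t = mu_t + eps_t, mu_t = mu_{t-1} + eta_t (sketch)."""
            n = len(y)
            mu = np.zeros(n)                       # filtered level (trend) estimates
            P = np.zeros(n)                        # filtered level variances
            mu_pred, P_pred = float(y[0]), 1e7     # near-diffuse initialisation
            for t in range(n):
                F = P_pred + var_eps               # innovation variance
                K = P_pred / F                     # Kalman gain
                mu[t] = mu_pred + K * (y[t] - mu_pred)
                P[t] = (1.0 - K) * P_pred
                mu_pred, P_pred = mu[t], P[t] + var_eta   # one-step prediction
            return mu, P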

  10. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. Similarly to what happens with countless other products, forecasting future prices of natural rubber has been the object of many studies. The use of univariate time-series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of Brazilian natural rubber prices (R$/kg) over the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The studied models were those belonging to the ARIMA family. The main results were: the domestic market for natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best adjustment of the time series of natural rubber prices (R$/kg); the forecasts made for the series supplied statistically adequate fittings.

  11. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series covering 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  12. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long-range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series covering 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  13. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze the financial time series. The method is used to examine the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets. And we report the results of similarity in US and Chinese stock markets in periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after global financial crisis) by using this method. The results show the difference of similarity between different stock markets in different time periods and the similarity of the two stock markets become larger after these two crises. Also we acquire the results of similarity of 10 stock indices in three areas; it means the method can distinguish different areas' markets from the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can not only be used in physiologic time series, but also in financial time series.

  14. Power system small signal stability analysis and control

    CERN Document Server

    Mondal, Debasish; Sengupta, Aparajita

    2014-01-01

    Power System Small Signal Stability Analysis and Control presents a detailed analysis of the problem of severe outages due to the sustained growth of small signal oscillations in modern interconnected power systems. The ever-expanding nature of power systems and the rapid upgrade to smart grid technologies call for the implementation of robust and optimal controls. Power systems that are forced to operate close to their stability limit have resulted in the use of control devices by utility companies to improve the performance of the transmission system against commonly occurring power system

  15. Study of interhemispheric asymmetries in electroencephalographic signals by frequency analysis

    International Nuclear Information System (INIS)

    Zapata, J F; Garzon, J

    2011-01-01

    This study provides a new method for the detection of interhemispheric asymmetries in patients under continuous video-electroencephalography (EEG) monitoring in the Intensive Care Unit (ICU), using wavelet energy. We recorded EEG signals from 42 patients with different pathologies and then performed signal processing using the Matlab program; we compared the abnormalities recorded in the neurophysiologist's report, the images of each patient, and the result of signal analysis with the Discrete Wavelet Transform (DWT). Conclusions: there is correspondence between the abnormalities found in the signal processing and the clinical reports of findings in patients; according to this conclusion, the methodology used can be a useful tool for the diagnosis and early quantitative detection of interhemispheric asymmetries.
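
    Purely as an illustration of the kind of computation involved (PyWavelets is assumed here, and the record does not state the paper's wavelet, decomposition depth or asymmetry metric), relative wavelet energies per channel and a simple left/right asymmetry index could be obtained as follows.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        def relative_wavelet_energies(channel, wavelet="db4", level=5):
            """Relative energy per DWT subband for one EEG channel (illustrative)."""
            coeffs = pywt.wavedec(np.asarray(channel, float), wavelet, level=level)
            energies = np.array([np.sum(c ** 2) for c in coeffs])
            return energies / energies.sum()

        def asymmetry_index(left, right):
            """Hypothetical left/right asymmetry of the relative subband energies."""
            el, er = relative_wavelet_energies(left), relative_wavelet_energies(right)
            return (el - er) / (el + er)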

  16. Hybrid soft computing systems for electromyographic signals analysis: a review

    Science.gov (United States)

    2014-01-01

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  17. Hybrid soft computing systems for electromyographic signals analysis: a review.

    Science.gov (United States)

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    Electromyographic (EMG) is a bio-signal collected on human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for such purpose. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  18. Gear wear monitoring by modulation signal bispectrum based on motor current signal analysis

    Science.gov (United States)

    Zhang, Ruiliang; Gu, Fengshou; Mansaf, Haram; Wang, Tie; Ball, Andrew D.

    2017-09-01

    Gears are important mechanical components for power transmissions. Tooth wear is one of the most common failure modes, which can present throughout a gear's lifetime. It is significant to accurately monitor gear wear progression in order to take timely predictive maintenance actions. Motor current signature analysis (MCSA) is an effective and non-intrusive approach which is able to monitor faults from both electrical and mechanical systems. However, little research has been reported on monitoring gear wear and estimating its severity based on MCSA. This paper presents a novel gear wear monitoring method through a modulation signal bispectrum based motor current signal analysis (MSB-MCSA). For a steady gear transmission, load and speed oscillations inevitably exist due to various errors, including wear. These oscillations can induce small modulations in the current signals of the driving motor. MSB is particularly effective in characterising such small modulation signals. Based on these understandings, the monitoring process was implemented based on the current signals from a run-to-failure test of an industrial two-stage helical gearbox under a moderately accelerated fatigue process. At the initial operation of the test, MSB analysis results showed that the peak values at the bifrequencies of the gear rotations and the power supply can be effective monitoring features for identifying faulty gears and wear severity, as they exhibit agreeable changes with gear loads. A monotonically increasing trend established by these features allows a clear indication of the gear wear progression. The dismantling inspection at 477 h of operation, made when one of the monitored features was about 123% higher than its baseline, found severe scuffing wear marks on a number of tooth surfaces of the driving gear, showing that the gear endured a gradual wear process during its long test operation. Therefore, it is affirmed that the proposed MSB-MCSA approach is reliable

  19. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    Science.gov (United States)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.

  20. Wavelet analysis as a tool to characterise and remove environmental noise from self-potential time series

    OpenAIRE

    Chianese, D.; Colangelo, G.; D'Emilio, M.; Lanfredi, M.; Lapenna, V.; Ragosta, M.; Macchiato, M. F.

    2004-01-01

    Multiresolution wavelet analysis of self-potential signals and rainfall levels is performed for extracting fluctuations in the electrical signals which might be attributed to meteorological variability. In the time-scale domain of the wavelet transform, rain data are used as markers to single out those wavelet coefficients of the electric signal which can be considered relevant to the environmental disturbance. Then these coefficients are filtered out and the signal is recovered by anti...

  1. DEAP: A Database for Emotion Analysis Using Physiological Signals

    NARCIS (Netherlands)

    Koelstra, Sander; Mühl, C.; Soleymani, Mohammad; Lee, Jung Seok; Yazdani, Ashkan; Ebrahimi, Touradj; Pun, Thierry; Nijholt, Antinus; Patras, Ioannis

    2012-01-01

    We present a multimodal dataset for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute long excerpts of music videos. Participants rated each video in terms of the levels of

  2. Joint time frequency analysis in digital signal processing

    DEFF Research Database (Denmark)

    Pedersen, Flemming

    with this technique is that the resolution is limited because of distortion. To overcome the resolution limitations of the Fourier Spectrogram, many new distributions have been developed. In spite of this, the Fourier Spectrogram is by far the prime method for the analysis of signals whose spectral content is varying...

  3. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    Science.gov (United States)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series was further explored in the paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations presented that the quantitative relations between the Hurst indexes and the exponents of degree distribution function are different for different series and the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China to graphs by the visibility algorithm and explore the topological properties of graphs associated from the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis we find degree distributions of associated networks from the growth rates of value-added of three industry series are almost exponential and the degree distributions of associated networks from the growth rates of GDP series are scale free. We also discussed the assortativity and disassortativity of the four associated networks as they are related to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
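
    For orientation only, the natural visibility criterion used to convert a series into a graph can be implemented directly as in the naive sketch below (the authors' algorithm and tooling are not specified in this record); the degree sequence of the resulting adjacency matrix gives the degree distribution discussed above.

        import numpy as np

        def natural_visibility_graph(y):
            """Adjacency matrix of the natural visibility graph of a series (naive sketch)."""
            y = np.asarray(y, float)
            n = len(y)
            A = np.zeros((n, n), dtype=int)
            for a in range(n - 1):
                for b in range(a + 1, n):
                    # a and b are linked if every intermediate point lies below the line a-b
                    visible = all(
                        y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                        for c in range(a + 1, b)
                    )
                    if visible:
                        A[a, b] = A[b, a] = 1
            return A

        # degrees = natural_visibility_graph(series).sum(axis=0)   # degree sequence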

  4. Anatomy of the ICDS series: A bibliometric analysis

    International Nuclear Information System (INIS)

    Cardona, Manuel; Marx, Werner

    2007-01-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called 'source journals' covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories

  5. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    Frequency domain representation of a short-term heart-rate time series (HRTS) signal is a popular method for evaluating the cardiovascular control system. The spectral parameters, viz. percentage power in low frequency band (%PLF), percentage power in high frequency band (%PHF), power ratio of low frequency to high ...
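
    As a hedged sketch of how such spectral parameters are typically obtained (scipy and the conventional 0.04-0.15 Hz and 0.15-0.4 Hz band limits are assumptions here, not taken from this record):

        import numpy as np
        from scipy.signal import welch

        def hrv_band_powers(series, fs=4.0):
            """%PLF, %PHF and the LF/HF power ratio from an evenly resampled HRTS (sketch)."""
            freqs, psd = welch(series, fs=fs, nperseg=min(256, len(series)))
            df = freqs[1] - freqs[0]
            def band_power(lo, hi):
                return psd[(freqs >= lo) & (freqs < hi)].sum() * df
            lf = band_power(0.04, 0.15)      # low-frequency band
            hf = band_power(0.15, 0.40)      # high-frequency band
            total = band_power(0.003, 0.40)  # total power reference
            return {"%PLF": 100 * lf / total, "%PHF": 100 * hf / total, "LF/HF": lf / hf}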

  6. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    Utsab Kundu

    … domain analysis; frequency domain analysis; critical load resistance. … The DCMSRC design process requires repeated circuit simulations for design … A structured derivation of Av is presented … System specifications: L, C, r, Lm.

  7. Signal Adaptive System for Space/Spatial-Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Veselin N. Ivanović

    2009-01-01

    Full Text Available This paper outlines the development of a multiple-clock-cycle implementation (MCI) of a signal adaptive two-dimensional (2D) system for space/spatial-frequency (S/SF) signal analysis. The design is based on a method for improved S/SF representation of the analyzed 2D signals, also proposed here. The proposed MCI design optimizes critical design performance related to hardware complexity, making it a suitable system for real-time implementation on an integrated chip. Additionally, the design allows the implemented system to take a variable number of clock cycles (CLKs) during execution, using only those needed to obtain the desired 2D Wigner distribution presentation of auto-terms at different frequency-frequency points. This ability is a major advantage of the proposed design: it helps to optimize the execution time and produces an improved, cross-terms-free S/SF signal representation. The design has been verified by a field-programmable gate array (FPGA) circuit design, capable of performing S/SF analysis of 2D signals in real time.

  8. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on the widespread association of signaling molecules has revealed the essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances in mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate the generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which were first applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened the door to systems analysis of signaling networks involved in immune response and cancer.

  9. The Analysis Of Personality Disorder On Two Characters In The Animation Series Black Rock Shooter

    OpenAIRE

    Ramadhana, Rizki Andrian

    2015-01-01

    The title of this thesis is The Analysis of Personality Disorder on Two Characters in the Animation Series “Black Rock Shooter”, which discusses the personality disorders of two characters from this series: Kagari Izuriha and Yomi Takanashi. The animation series Black Rock Shooter is chosen as the source of data because this animation has a psychological genre and represents the complexity of human relationships, especially when building up a friendship. It is because a human is a social...

  10. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is the Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. The Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
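
    A minimal illustration of why ICA rather than PCA is used for blind source separation is sketched below. It uses scikit-learn's plain FastICA on synthetic mixtures (the mixing matrix, the two sources and the three "stations" are invented for the example), not the variational Bayesian ICA developed in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Two synthetic "sources": a seasonal oscillation and a transient-like ramp.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * np.pi * t)                      # periodic source
s2 = np.where(t > 4, t - 4, 0.0)                # transient ramp source
S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

# Mix the sources as if they were observed at three hypothetical stations.
A = np.array([[1.0, 0.5], [0.6, 1.2], [1.5, 0.3]])   # mixing matrix (unknown in practice)
X = S @ A.T

pca_comps = PCA(n_components=2).fit_transform(X)                       # uncorrelated components
ica_comps = FastICA(n_components=2, random_state=0).fit_transform(X)   # independent components

# Absolute correlation of each recovered component with the true sources:
for name, comps in [("PCA", pca_comps), ("ICA", ica_comps)]:
    c = np.corrcoef(np.c_[S, comps], rowvar=False)[:2, 2:]
    print(name, np.round(np.abs(c), 2))
```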

  11. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-03-01

    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE) analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v) is provided. The results show that the temporal scales of the current climate (1960–2014) are different from the long-term average (1850–1960). For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
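
    The core MSE recipe (coarse-grain the series, then compute sample entropy at each scale) can be written compactly. The tolerance r = 0.2·SD, the template length m = 2 and the synthetic test series below are conventional illustrative choices, not the CRUTEM4v analysis itself.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series (naive O(N^2) version)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(dist <= r) - 1        # exclude the self-match
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 13)):
    """Coarse-grain the series at each scale and compute its sample entropy."""
    mse = []
    for s in scales:
        n = (len(x) // s) * s
        coarse = np.asarray(x[:n]).reshape(-1, s).mean(axis=1)
        mse.append(sample_entropy(coarse))
    return list(scales), mse

rng = np.random.default_rng(0)
white = rng.standard_normal(2000)                               # uncorrelated reference series
walk = np.cumsum(white)
correlated = walk - np.linspace(walk[0], walk[-1], walk.size)   # crude detrended random walk
print([round(v, 2) for v in multiscale_entropy(white)[1][:4]])
print([round(v, 2) for v in multiscale_entropy(correlated)[1][:4]])
```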

  12. Statistical Analysis of fMRI Time-Series: A Critical Review of the GLM Approach

    Directory of Open Access Journals (Sweden)

    Martin M Monti

    2011-03-01

    Full Text Available Functional Magnetic Resonance Imaging (fMRI) is one of the most widely used tools to study the neural underpinnings of human cognition. Standard analysis of fMRI data relies on a General Linear Model (GLM) approach to separate stimulus induced signals from noise. Crucially, this approach relies on a number of assumptions about the data which, for inferences to be valid, must be met. The current paper reviews the GLM approach to analysis of fMRI time-series, focusing in particular on the degree to which such data abide by the assumptions of the GLM framework, and on the methods that have been developed to correct for any violation of those assumptions. Rather than biasing estimates of effect size, the major consequence of non-conformity to the assumptions is to introduce bias into estimates of the variance, thus affecting test statistics, power and false positive rates. Furthermore, this bias can have pervasive effects on both individual subject and group-level statistics, potentially yielding qualitatively different results across replications, especially after the thresholding procedures commonly used for inference-making.
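
    The GLM step described above amounts to ordinary least squares on a design matrix. The toy sketch below fits a single voxel with a boxcar task regressor, a drift term and an intercept; it omits HRF convolution and the autocorrelation corrections the review discusses, and all parameter values are invented for illustration.

```python
import numpy as np

# Toy "fMRI" voxel time series: 200 volumes, TR = 2 s, a 20 s on/off block design.
rng = np.random.default_rng(0)
n_vols = 200
stimulus = (np.arange(n_vols) // 10) % 2            # boxcar regressor (1 = task on)
y = 0.8 * stimulus + 0.1 * np.arange(n_vols) / n_vols + rng.standard_normal(n_vols)

# Design matrix: task regressor, linear drift, and an intercept column.
X = np.c_[stimulus, np.arange(n_vols) / n_vols, np.ones(n_vols)]

# Ordinary least squares estimate of the GLM parameters: beta = (X'X)^-1 X'y
beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta
dof = n_vols - np.linalg.matrix_rank(X)
sigma2 = residuals @ residuals / dof

# t-statistic for the task effect (contrast c = [1, 0, 0]).
c = np.array([1.0, 0.0, 0.0])
t_stat = (c @ beta) / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
print(round(float(c @ beta), 3), round(float(t_stat), 2))
```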

  13. A comparative analysis of spectral exponent estimation techniques for 1/f^β processes with applications to the analysis of stride interval time series

    Science.gov (United States)

    Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin

    2013-01-01

    Background The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a “biomarker” of relative health and decline. New Method This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis complements previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, the most prevalent method in the literature, exhibited large estimation variances. Conclusions The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
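
    As a concrete example of one of the compared estimators, the sketch below implements detrended fluctuation analysis (DFA), the method the paper identifies as most prevalent in the literature (and criticizes for its estimation variance). The scale range and the test series are illustrative choices only.

```python
import numpy as np

def dfa_exponent(x, scales=None):
    """Detrended fluctuation analysis: slope of log F(n) versus log n.

    For 1/f^beta noise the DFA exponent alpha relates to the spectral
    exponent roughly as beta = 2 * alpha - 1."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
    fluct = []
    for n in scales:
        n_seg = len(y) // n
        segments = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        f2 = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)             # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

rng = np.random.default_rng(0)
print(round(dfa_exponent(rng.standard_normal(4096)), 2))              # white noise, alpha ~ 0.5
print(round(dfa_exponent(np.cumsum(rng.standard_normal(4096))), 2))   # random walk, alpha ~ 1.5
```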

  14. Analysis of a dynamic model of guard cell signaling reveals the stability of signal propagation

    Science.gov (United States)

    Gan, Xiao; Albert, RéKa

    Analyzing the long-term behaviors (attractors) of dynamic models of biological systems can provide valuable insight into biological phenotypes and their stability. We identified the long-term behaviors of a multi-level, 70-node discrete dynamic model of the stomatal opening process in plants. We reduce the model's huge state space by reducing unregulated nodes and simple mediator nodes, and by simplifying the regulatory functions of selected nodes while keeping the model consistent with experimental observations. We perform attractor analysis on the resulting 32-node reduced model by two methods: 1. converting it into a Boolean model, then applying two attractor-finding algorithms; 2. theoretical analysis of the regulatory functions. We conclude that all nodes except two in the reduced model have a single attractor; and only two nodes can admit oscillations. The multistability or oscillations do not affect the stomatal opening level in any situation. This conclusion applies to the original model as well in all the biologically meaningful cases. We further demonstrate the robustness of signal propagation by showing that a large percentage of single-node knockouts does not affect the stomatal opening level. Thus, we conclude that the complex structure of this signal transduction network provides multiple information propagation pathways while not allowing extensive multistability or oscillations, resulting in robust signal propagation. Our innovative combination of methods offers a promising way to analyze multi-level models.
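
    Attractor finding for a synchronous Boolean model can be illustrated on a toy network by exhaustive state-space search. The three nodes and update rules below are hypothetical and far simpler than the 70-node (or reduced 32-node) guard cell model analyzed in the abstract.

```python
from itertools import product

# Toy 3-node Boolean network (hypothetical rules, not the guard cell model):
# A is a self-sustaining input, B is activated by A or C, C is inhibited by B.
def update(state):
    a, b, c = state
    return (a, a or c, not b)

def find_attractors():
    attractors = set()
    for start in product([False, True], repeat=3):
        seen, state = [], start
        while state not in seen:              # iterate synchronous updates until a repeat
            seen.append(state)
            state = update(state)
        cycle = seen[seen.index(state):]      # states from the first repeat onward form the attractor
        i = cycle.index(min(cycle))           # canonical rotation so identical cycles compare equal
        attractors.add(tuple(cycle[i:] + cycle[:i]))
    return attractors

for att in find_attractors():
    kind = "fixed point" if len(att) == 1 else f"cycle of length {len(att)}"
    print(kind, [tuple(int(v) for v in s) for s in att])
```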

  15. SURFACE ELECTROMYOGRAPHY IN BIOMECHANICS: APPLICATIONS AND SIGNAL ANALYSIS ASPECTS

    Directory of Open Access Journals (Sweden)

    DEAK GRAŢIELA-FLAVIA

    2009-12-01

    Full Text Available Surface electromyography (SEMG) is a technique for detecting and recording the electrical activity of the muscles using surface electrodes. The EMG signal is used in biomechanics mainly as an indicator of the initiation of muscle activation, as an indicator of the force produced by a contracting muscle, and as an index of the fatigue occurring within a muscle. EMG, used as a method of investigation, can tell us if the muscle is active or not, if the muscle is more or less active, when it is on or off, how active it is, and finally, if it fatigues. The purpose of this article is to discuss some specific EMG signal analysis aspects with emphasis on comparison-type analysis and frequency-based fatigue analysis.

  16. Analysis of room transfer function and reverberant signal statistics

    DEFF Research Database (Denmark)

    Georganti, Eleftheria; Mourjopoulos, John; Jacobsen, Finn

    2008-01-01

    For some time now, statistical analysis has been a valuable tool in analyzing room transfer functions (RTFs). This work examines existing statistical time-frequency models and techniques for RTF analysis (e.g., Schroeder's stochastic model and the standard deviation over frequency bands for the RTF...... magnitude and phase). RTF fractional octave smoothing, as with 1/3-octave analysis, may lead to RTF simplifications that can be useful for several audio applications, such as room compensation, room modeling, and auralisation purposes. The aim of this work is to identify the relationship of optimal response...... and the corresponding ratio of the direct and reverberant signal. In addition, this work examines the statistical quantities for speech and audio signals prior to their reproduction within rooms and when recorded in rooms. Histograms and other statistical distributions are used to compare RTF minima of typical...

  17. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    Science.gov (United States)

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  18. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  19. Time-series analysis of Nigeria rice supply and demand: Error ...

    African Journals Online (AJOL)

    The study examined a time-series analysis of Nigeria rice supply and demand with a view to determining any long-run equilibrium between them using the Error Correction Model approach (ECM). The data used for the study represents the annual series of 1960-2007 (47 years) for rice supply and demand in Nigeria, ...

  20. Taxation in Public Education. Analysis and Bibliography Series, No. 12.

    Science.gov (United States)

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…

  1. Correlation analysis of respiratory signals by using parallel coordinate plots.

    Science.gov (United States)

    Saatci, Esra

    2018-01-01

    The understanding of the bonds and the relationships between the respiratory signals, i.e. the airflow, the mouth pressure, the relative temperature and the relative humidity during breathing, may help improve measurement methods of respiratory mechanics and sensor designs, or open up several possible applications in the analysis of respiratory disorders. Therefore, the main objective of this study was to propose a new combination of methods in order to determine the relationship between respiratory signals as multidimensional data. In order to reveal the coupling between the processes, two very different methods were used: the well-known statistical correlation analysis (i.e. Pearson's correlation and cross-correlation coefficient) and parallel coordinate plots (PCPs). Curve bundling with the number of intersections for the correlation analysis, the Least Mean Square Time Delay Estimator (LMS-TDE) for the point delay detection, and visual metrics for the recognition of the visual structures were proposed and utilized in PCP. The number of intersections increased when the correlation coefficient changed from high positive to high negative correlation between the respiratory signals, especially if the whole breath was processed. LMS-TDE coefficients plotted in PCP indicated point delay results that matched the findings of the correlation analysis well. Visual inspection of PCP by visual metrics showed range, dispersion, entropy comparisons and linear and sinusoidal-like relationships between the respiratory signals. It is demonstrated that the basic correlation analysis together with the parallel coordinate plots perceptually motivates the visual metrics in the display and thus can be considered as an aid to user analysis by providing meaningful views of the data. Copyright © 2017 Elsevier B.V. All rights reserved.
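
    A small sketch of the two ingredients named above, a Pearson correlation matrix and a parallel coordinate plot, using pandas. The four "respiratory" signals are synthetic stand-ins and the breath-phase labelling is an invented way to colour the PCP, not the authors' protocol.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Toy respiratory-like signals over one 4 s breath (hypothetical, for illustration only).
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)
airflow = np.sin(2 * np.pi * t / 4)
pressure = 0.8 * np.sin(2 * np.pi * t / 4 + 0.3) + 0.05 * rng.standard_normal(t.size)
temperature = 0.5 * (1 - np.cos(2 * np.pi * t / 4)) + 0.05 * rng.standard_normal(t.size)
humidity = 0.6 * (1 - np.cos(2 * np.pi * t / 4 + 0.2)) + 0.05 * rng.standard_normal(t.size)

df = pd.DataFrame({"airflow": airflow, "pressure": pressure,
                   "temperature": temperature, "humidity": humidity})

# Pearson correlation matrix between the four signals.
print(df.corr().round(2))

# Parallel coordinate plot: label samples by breath phase (inspiration/expiration).
df["phase"] = np.where(airflow >= 0, "inspiration", "expiration")
parallel_coordinates(df.iloc[::10], class_column="phase", alpha=0.3)
plt.show()
```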

  2. Variability of signal-to-noise ratio and the network analysis of gravitational wave burst signals

    International Nuclear Information System (INIS)

    Mohanty, S D; Rakhmanov, M; Klimenko, S; Mitselmakher, G

    2006-01-01

    The detection and estimation of gravitational wave burst signals, with a priori unknown polarization waveforms, requires the use of data from a network of detectors. Maximizing the network likelihood functional over all waveforms and sky positions yields point estimates for them as well as a detection statistic. However, the transformation from the data to estimates can become ill-conditioned over parts of the sky, resulting in significant errors in estimation. We modify the likelihood procedure by introducing a penalty functional which suppresses candidate solutions that display large signal-to-noise ratio (SNR) variability as the source is displaced on the sky. Simulations show that the resulting network analysis method performs significantly better in estimating the sky position of a source. Further, this method can be applied to any network, irrespective of the number or mutual alignment of detectors

  3. Modelling and Analysis of Biochemical Signalling Pathway Cross-talk

    Directory of Open Access Journals (Sweden)

    Robin Donaldson

    2010-02-01

    Full Text Available Signalling pathways are abstractions that help life scientists structure the coordination of cellular activity. Cross-talk between pathways accounts for many of the complex behaviours exhibited by signalling pathways and is often critical in producing the correct signal-response relationship. Formal models of signalling pathways and cross-talk in particular can aid understanding and drive experimentation. We define an approach to modelling based on the concept that a pathway is the (synchronising) parallel composition of instances of generic modules (with internal and external labels). Pathways are then composed by (synchronising) parallel composition and renaming; different types of cross-talk result from different combinations of synchronisation and renaming. We define a number of generic modules in PRISM and five types of cross-talk: signal flow, substrate availability, receptor function, gene expression and intracellular communication. We show that Continuous Stochastic Logic properties can both detect and distinguish the types of cross-talk. The approach is illustrated with small examples and an analysis of the cross-talk between the TGF-β/BMP, WNT and MAPK pathways.

  4. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  5. RECONSTRUCTION OF PRECIPITATION SERIES AND ANALYSIS OF CLIMATE CHANGE OVER PAST 500 YEARS IN NORTHERN CHINA

    Institute of Scientific and Technical Information of China (English)

    RONG Yan-shu; TU Qi-pu

    2005-01-01

    It is important and necessary to obtain a much longer precipitation series in order to study features of drought/flood and climate change. Based on dryness and wetness grade series of 18 stations in Northern China covering 533 years, from 1470 to 2002, the Moving Cumulative Frequency Method (MCFM) was developed, moving-average precipitation series from 1499 to 2002 were reconstructed by testing three kinds of average precipitation, and the features of climate change and of dry and wet periods were studied using the reconstructed precipitation series in the present paper. The results showed a good relationship between the reconstructed precipitation series and the observed precipitation series since 1954, with relative root-mean-square errors below 1.89%, and that the relation between the reconstructed series and the dryness and wetness grade series was nonlinear; this nonlinear relation implied that the reconstructed series were reliable and could become foundation data for researching the evolution of drought and flood. Analysis of climate change based on the reconstructed precipitation series revealed that, although the drought intensity of the recent dry period from the mid-1970s until the early 21st century was not the strongest in the historical climate of Northern China, the intensity and duration of wet periods have decreased and shortened considerably, and the climate of Northern China has evolved toward aridification.

  6. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e., time series data. This data lends itself to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U. S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.

  7. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.

  8. Stock price forecasting based on time series analysis

    Science.gov (United States)

    Chi, Wan Le

    2018-05-01

    Using historical stock price data to set up a sequence model that explains the intrinsic relationships within the data, future stock prices can be forecasted. The models used are the autoregressive model, the moving-average model and the autoregressive moving-average (ARMA) model. A unit root test was applied to the original data sequence to judge whether it was stationary. A non-stationary original sequence needed further processing as a first order difference. Then the stationarity of the differenced sequence was re-inspected. If it was still non-stationary, second order differencing of the sequence was carried out. The autocorrelation diagram and partial autocorrelation diagram were used to estimate the parameters of the identified ARMA model, including the coefficients and the model order. Finally, the model was used to forecast the Shanghai Composite Index daily closing price with precision. Results showed that the non-stationary original data series was stationary after the second order difference. The forecast value of the Shanghai Composite Index daily closing price was close to the actual value, indicating that the ARMA model in the paper achieved a certain accuracy.
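
    The workflow described (unit root test, differencing until stationary, ARMA/ARIMA fit, forecast) maps directly onto statsmodels. The random-walk series, the 5% significance cut-off and the (1, d, 1) order below are illustrative choices, not the paper's fitted model of the Shanghai Composite Index.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA

# Synthetic stand-in for a daily closing-price series (a random walk).
rng = np.random.default_rng(0)
price = pd.Series(100 + np.cumsum(rng.standard_normal(500)))

# Step 1: unit root (ADF) test; difference until the p-value drops below 5%.
d = 0
series = price.copy()
while adfuller(series.dropna())[1] > 0.05 and d < 2:
    series = series.diff()      # difference and test again
    d += 1
print(f"order of differencing d = {d}")

# Step 2: fit an ARIMA(p, d, q) model; (1, d, 1) is an illustrative choice,
# in practice p and q are read off the ACF/PACF diagrams.
model = ARIMA(price, order=(1, d, 1)).fit()
print(model.params)

# Step 3: forecast the next 5 closing prices.
print(model.forecast(steps=5))
```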

  9. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.

    2011-01-01

    This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do this. The results suggest that output and real electricity prices and a UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and - 0.16 respectively, with an increasing (but at a decreasing rate) UEDT and based on the estimated equation, and different forecast assumptions, it is predicted that Turkish industrial electricity demand will be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.

  10. A unified nonlinear stochastic time series analysis for climate science.

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S

    2017-03-13

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  11. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analysis have become an important part of biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher search for articles in more than one database. It will also be useful to form a group of researchers and statisticians that have expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage the readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-analysis checklist.

  12. Principal component analysis of MSBAS DInSAR time series from Campi Flegrei, Italy

    Science.gov (United States)

    Tiampo, Kristy F.; González, Pablo J.; Samsonov, Sergey; Fernández, Jose; Camacho, Antonio

    2017-09-01

    Because of its proximity to the city of Naples and with a population of nearly 1 million people within its caldera, Campi Flegrei is one of the highest risk volcanic areas in the world. Since the last major eruption in 1538, the caldera has undergone frequent episodes of ground subsidence and uplift accompanied by seismic activity that has been interpreted as the result of a stationary, deeper source below the caldera that feeds shallower eruptions. However, the location and depth of the deeper source is not well-characterized and its relationship to current activity is poorly understood. Recently, a significant increase in the uplift rate has occurred, resulting in almost 13 cm of uplift by 2013 (De Martino et al., 2014; Samsonov et al., 2014b; Di Vito et al., 2016). Here we apply a principal component decomposition to high resolution time series from the region produced by the advanced Multidimensional SBAS DInSAR technique in order to better delineate both the deeper source and the recent shallow activity. We analyzed both a period of substantial subsidence (1993-1999) and a second of significant uplift (2007-2013) and inverted the associated vertical surface displacement for the most likely source models. Results suggest that the underlying dynamics of the caldera changed in the late 1990s, from one in which the primary signal arises from a shallow deflating source above a deeper, expanding source to one dominated by a shallow inflating source. In general, the shallow source lies between 2700 and 3400 m below the caldera while the deeper source lies at 7600 m or more in depth. The combination of principal component analysis with high resolution MSBAS time series data allows for these new insights and confirms the applicability of both to areas at risk from dynamic natural hazards.

  13. A signal processing analysis of Purkinje cells in vitro

    Directory of Open Access Journals (Sweden)

    Ze'ev R Abrams

    2010-05-01

    Full Text Available Cerebellar Purkinje cells in vitro fire recurrent sequences of Sodium and Calcium spikes. Here, we analyze the Purkinje cell using harmonic analysis, and our experiments reveal that its output signal is comprised of three distinct frequency bands, which are combined using Amplitude and Frequency Modulation (AM/FM). We find that the three characteristic frequencies - Sodium, Calcium and Switching - occur in various combinations in all waveforms observed using whole-cell current clamp recordings. We found that the Calcium frequency can display a frequency doubling of its frequency mode, and the Switching frequency can act as a possible generator of pauses that are typically seen in Purkinje output recordings. Using a reversibly photo-switchable kainate receptor agonist, we demonstrate the external modulation of the Calcium and Switching frequencies. These experiments and Fourier analysis suggest that the Purkinje cell can be understood as a harmonic signal oscillator, enabling a higher level of interpretation of Purkinje signaling based on modern signal processing techniques.

  14. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Time series analysis of diverse extreme phenomena: universal features

    Science.gov (United States)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. We suggest that earthquake, epileptic seizure, solar flare, and magnetic storm dynamics can be analyzed within similar mathematical frameworks. A central property of the generation of the aforementioned extreme events is the occurrence of coherent large-scale collective behavior with very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics as it provides an appropriate framework in which to investigate universal principles of their generation. First, we examine the data in terms of Tsallis entropy aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in time series we observe similar distinctive features revealing significant reduction of complexity during their emergence. Second, a model for earthquake dynamics coming from a nonextensive Tsallis formalism, starting from first principles, has recently been introduced. This approach leads to an energy distribution function (Gutenberg-Richter type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with similar non-extensive q-parameter) of solar flares, magnetic storms, epileptic and earthquake shocks. The above mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.

  16. Real analysis series, functions of several variables, and applications

    CERN Document Server

    Laczkovich, Miklós

    2017-01-01

    This book develops the theory of multivariable analysis, building on the single variable foundations established in the companion volume, Real Analysis: Foundations and Functions of One Variable. Together, these volumes form the first English edition of the popular Hungarian original, Valós Analízis I & II, based on courses taught by the authors at Eötvös Loránd University, Hungary, for more than 30 years. Numerous exercises are included throughout, offering ample opportunities to master topics by progressing from routine to difficult problems. Hints or solutions to many of the more challenging exercises make this book ideal for independent study, or further reading. Intended as a sequel to a course in single variable analysis, this book builds upon and expands these ideas into higher dimensions. The modular organization makes this text adaptable for either a semester or year-long introductory course. Topics include: differentiation and integration of functions of several variables; infinite numerica...

  17. Developing a complex independent component analysis technique to extract non-stationary patterns from geophysical time-series

    Science.gov (United States)

    Forootan, Ehsan; Kusche, Jürgen

    2016-04-01

    Geodetic/geophysical observations, such as the time series of global terrestrial water storage change or sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In the last decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and more recently independent component analysis (ICA) are common techniques to extract statistical orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationary assumption is obviously not justifiable for many geophysical and climate variables even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, the complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation. The complex time series contain the observed values in their real part, and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i
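
    Step (i) of the recipe, building a complex series whose imaginary part carries the temporal variability, can be sketched with a Hilbert transform for a single channel. The monthly sampling and the synthetic signal below are assumptions made for illustration, not the terrestrial water storage data discussed in the abstract.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic "monthly" geophysical series: annual cycle + slower mode + noise.
rng = np.random.default_rng(0)
t = np.arange(0, 20, 1 / 12.0)                 # 20 years, 12 samples per year
x = np.sin(2 * np.pi * t) + 0.3 * np.sin(2 * np.pi * t / 6) + 0.1 * rng.standard_normal(t.size)

# Analytic (complex) signal: real part holds the observations, the imaginary
# part is the 90-degree-shifted quadrature that tracks their temporal variability.
analytic = hilbert(x - x.mean())

amplitude = np.abs(analytic)                   # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * 12  # cycles per year (12 samples per year)
print(amplitude[:3].round(3), inst_freq[:3].round(3))
```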

  18. Analysis of engineering cycles thermodynamics and fluid mechanics series

    CERN Document Server

    Haywood, R W

    1980-01-01

    Analysis of Engineering Cycles, Third Edition, deals principally with an analysis of the overall performance, under design conditions, of work-producing power plants and work-absorbing refrigerating and gas-liquefaction plants, most of which are either cyclic or closely related thereto. The book is organized into two parts, dealing first with simple power and refrigerating plants and then moving on to more complex plants. The principal modifications in this Third Edition arise from the updating and expansion of material on nuclear plants and on combined and binary plants. In view of increased

  19. Signal correlations in biomass combustion. An information theoretic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruusunen, M.

    2013-09-01

    Increasing environmental and economic awareness are driving the development of combustion technologies to efficient biomass use and clean burning. To accomplish these goals, quantitative information about combustion variables is needed. However, for small-scale combustion units the existing monitoring methods are often expensive or complex. This study aimed to quantify correlations between flue gas temperatures and combustion variables, namely typical emission components, heat output, and efficiency. For this, data acquired from four small-scale combustion units and a large circulating fluidised bed boiler was studied. The fuel range varied from wood logs, wood chips, and wood pellets to biomass residue. Original signals and a defined set of their mathematical transformations were applied to data analysis. In order to evaluate the strength of the correlations, a multivariate distance measure based on information theory was derived. The analysis further assessed time-varying signal correlations and relative time delays. Ranking of the analysis results was based on the distance measure. The uniformity of the correlations in the different data sets was studied by comparing the 10-quantiles of the measured signal. The method was validated with two benchmark data sets. The flue gas temperatures and the combustion variables measured carried similar information. The strongest correlations were mainly linear with the transformed signal combinations and explicable by the combustion theory. Remarkably, the results showed uniformity of the correlations across the data sets with several signal transformations. This was also indicated by simulations using a linear model with constant structure to monitor carbon dioxide in flue gas. Acceptable performance was observed according to three validation criteria used to quantify modelling error in each data set. In general, the findings demonstrate that the presented signal transformations enable real-time approximation of the studied
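
    The abstract does not spell out its multivariate information-theoretic distance measure, but a common pairwise building block is the histogram estimate of mutual information between a flue gas temperature signal and a combustion variable, sketched below with synthetic stand-in signals.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    """Histogram estimate of the mutual information I(X;Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Toy signals standing in for a flue gas temperature and a CO2 concentration.
rng = np.random.default_rng(0)
temp = np.cumsum(rng.standard_normal(5000)) * 0.1 + 400
co2 = 0.02 * (temp - 400) + 0.05 * rng.standard_normal(5000)   # variable coupled to temperature
unrelated = rng.standard_normal(5000)                           # independent variable

print(round(mutual_information(temp, co2), 3))        # noticeably above zero
print(round(mutual_information(temp, unrelated), 3))  # close to zero (estimation bias aside)
```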

  20. A hybrid symplectic principal component analysis and central tendency measure method for detection of determinism in noisy time series with application to mechanomyography.

    Science.gov (United States)

    Xie, Hong-Bo; Dokos, Socrates

    2013-06-01

    We present a hybrid symplectic geometry and central tendency measure (CTM) method for detection of determinism in noisy time series. CTM is effective for detecting determinism in short time series and has been applied in many areas of nonlinear analysis. However, its performance significantly degrades in the presence of strong noise. In order to circumvent this difficulty, we propose to use symplectic principal component analysis (SPCA), a new chaotic signal de-noising method, as the first step to recover the system dynamics. CTM is then applied to determine whether the time series arises from a stochastic process or has a deterministic component. Results from numerical experiments, ranging from six benchmark deterministic models to 1/f noise, suggest that the hybrid method can significantly improve detection of determinism in noisy time series by about 20 dB when the data are contaminated by Gaussian noise. Furthermore, we apply our algorithm to study the mechanomyographic (MMG) signals arising from contraction of human skeletal muscle. Results obtained from the hybrid symplectic principal component analysis and central tendency measure demonstrate that the skeletal muscle motor unit dynamics can indeed be deterministic, in agreement with previous studies. However, the conventional CTM method was not able to definitely detect the underlying deterministic dynamics. This result on MMG signal analysis is helpful in understanding neuromuscular control mechanisms and developing MMG-based engineering control applications.

  1. Develop advanced nonlinear signal analysis topographical mapping system

    Science.gov (United States)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow minimal storage requirements, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB database with a compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of

  2. Time series analysis of aerobic bacterial flora during Miso fermentation.

    Science.gov (United States)

    Onda, T; Yanagida, F; Tsuji, M; Shinohara, T; Yokotsuka, K

    2003-01-01

    This article reports a microbiological study of aerobic mesophilic bacteria that are present during the fermentation process of Miso. Aerobic bacteria were enumerated and isolated from Miso during fermentation and divided into nine groups using traditional phenotypic tests. The strains were identified by biochemical analysis and 16S rRNA sequence analysis. They were identified as Bacillus subtilis, B. amyloliquefaciens, Kocuria kristinae, Staphylococcus gallinarum and S. kloosii. All strains were sensitive to the bacteriocins produced by the lactic acid bacteria isolated from Miso. The dominant species among the undesirable species throughout the fermentation process were B. subtilis and B. amyloliquefaciens. It is suggested that bacteriocin-producing lactic acid bacteria are effective in the growth prevention of aerobic bacteria in Miso. This study has provided useful information for controlling of bacterial flora during Miso fermentation.

  3. Analysis of cross-correlations in electroencephalogram signals as an approach to proactive diagnosis of schizophrenia

    Science.gov (United States)

    Timashev, Serge F.; Panischev, Oleg Yu.; Polyakov, Yuriy S.; Demin, Sergey A.; Kaplan, Alexander Ya.

    2012-02-01

    We apply flicker-noise spectroscopy (FNS), a time series analysis method operating on structure functions and power spectrum estimates, to study the clinical electroencephalogram (EEG) signals recorded in children/adolescents (11 to 14 years of age) with diagnosed schizophrenia-spectrum symptoms at the National Center for Psychiatric Health (NCPH) of the Russian Academy of Medical Sciences. The EEG signals for these subjects were compared with the signals for a control sample of chronically depressed children/adolescents. The purpose of the study is to look for diagnostic signs of subjects' susceptibility to schizophrenia in the FNS parameters for specific electrodes and cross-correlations between the signals simultaneously measured at different points on the scalp. Our analysis of EEG signals from scalp-mounted electrodes at locations F3 and F4, which are symmetrically positioned in the left and right frontal areas of cerebral cortex, respectively, demonstrates an essential role of frequency-phase synchronization, a phenomenon representing specific correlations between the characteristic frequencies and phases of excitations in the brain. We introduce quantitative measures of frequency-phase synchronization and systematize the values of FNS parameters for the EEG data. The comparison of our results with the medical diagnoses for 84 subjects performed at NCPH makes it possible to group the EEG signals into 4 categories corresponding to different risk levels of subjects' susceptibility to schizophrenia. We suggest that the introduced quantitative characteristics and classification of cross-correlations may be used for the diagnosis of schizophrenia at the early stages of its development.

  4. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    International Nuclear Information System (INIS)

    Yeh, J-R; Lin, T-Y; Shieh, J-S; Chen, Y; Huang, N E; Wu, Z; Peng, C-K

    2008-01-01

    In this investigation, surgical operations of blocked intestinal artery have been conducted on pigs to simulate the condition of acute mesenteric arterial occlusion. The empirical mode decomposition method and the algorithm of linguistic analysis were applied to verify the blood pressure signals in the simulated situation. We assumed that there was some information hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition method (EMD) has been applied to decompose the intrinsic mode functions (IMF) from a complex time series. However, the end effects and the phenomenon of intermittence damage the consistency of each IMF. Thus, we proposed the complementary ensemble empirical mode decomposition method (CEEMD) to solve the problems of end effects and the phenomenon of intermittence. The main wave of blood pressure signals can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design the blocking index to verify the pattern of the riding wave of blood pressure using measurements of dissimilarity. The blocking index works well to identify the situation in which the sampled time series of the blood pressure signal was recorded. Here, these two totally different algorithms are successfully integrated and the existence of information hidden in the high-frequency part of the blood pressure signal has been proven

  5. Investigating complex patterns of blocked intestinal artery blood pressure signals by empirical mode decomposition and linguistic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yeh, J-R; Lin, T-Y; Shieh, J-S [Department of Mechanical Engineering, Yuan Ze University, 135 Far-East Road, Chung-Li, Taoyuan, Taiwan (China); Chen, Y [Far Eastern Memorial Hospital, Taiwan (China); Huang, N E [Research Center for Adaptive Data Analysis, National Central University, Taiwan (China); Wu, Z [Center for Ocean-Land-Atmosphere Studies (United States); Peng, C-K [Beth Israel Deaconess Medical Center, Harvard Medical School (United States)], E-mail: s939205@ mail.yzu.edu.tw

    2008-02-15

    In this investigation, surgical operations of blocked intestinal artery have been conducted on pigs to simulate the condition of acute mesenteric arterial occlusion. The empirical mode decomposition method and the algorithm of linguistic analysis were applied to verify the blood pressure signals in the simulated situation. We assumed that there was some information hidden in the high-frequency part of the blood pressure signal when an intestinal artery is blocked. The empirical mode decomposition method (EMD) has been applied to decompose the intrinsic mode functions (IMF) from a complex time series. However, the end effects and the phenomenon of intermittence damage the consistency of each IMF. Thus, we proposed the complementary ensemble empirical mode decomposition method (CEEMD) to solve the problems of end effects and the phenomenon of intermittence. The main wave of blood pressure signals can be reconstructed from the main components, identified by Monte Carlo verification, and removed from the original signal to derive a riding wave. Furthermore, the concept of linguistic analysis was applied to design the blocking index to verify the pattern of the riding wave of blood pressure using measurements of dissimilarity. The blocking index works well to identify the situation in which the sampled time series of the blood pressure signal was recorded. Here, these two totally different algorithms are successfully integrated and the existence of information hidden in the high-frequency part of the blood pressure signal has been proven.

  6. Advanced Signal Analysis for Forensic Applications of Ground Penetrating Radar

    Energy Technology Data Exchange (ETDEWEB)

    Steven Koppenjan; Matthew Streeton; Hua Lee; Michael Lee; Sashi Ono

    2004-06-01

    Ground penetrating radar (GPR) systems have traditionally been used to image subsurface objects. The main focus of this paper is to evaluate an advanced signal analysis technique. Instead of compiling spatial data for the analysis, this technique conducts object recognition procedures based on spectral statistics. The identification feature of an object type is formed from the training vectors by a singular-value decomposition procedure. To illustrate its capability, this procedure is applied to experimental data and compared to the performance of the neural-network approach.
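
    The sketch below illustrates, under simple assumptions, how an identification feature can be formed from training spectra by singular-value decomposition and how a test spectrum can be scored against it by subspace projection. The array names, the rank r and the synthetic data are illustrative only, not the paper's procedure.

```python
# Sketch: build an object-type "signature" subspace from training spectra
# with an SVD, then score a test spectrum by how much of its energy lies
# in that subspace.
import numpy as np

def signature_subspace(training_spectra, r=3):
    # rows = training vectors (e.g. magnitude spectra of GPR returns)
    X = np.asarray(training_spectra, dtype=float)
    X = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:r]                     # top-r right singular vectors span the signature

def match_score(test_spectrum, basis):
    v = np.asarray(test_spectrum, dtype=float)
    v = v - v.mean()
    proj = basis @ v
    return float(np.dot(proj, proj) / np.dot(v, v))   # fraction of energy captured

rng = np.random.default_rng(1)
train = rng.normal(size=(20, 128)) + np.sin(np.linspace(0, 6, 128))
basis = signature_subspace(train)
print(match_score(train[0], basis))
```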

  7. Signal analysis and processing for SmartPET

    International Nuclear Information System (INIS)

    Scraggs, David; Boston, Andrew; Boston, Helen; Cooper, Reynold; Hall, Chris; Mather, Andy; Nolan, Paul; Turk, Gerard

    2007-01-01

    Measurement of induced transient charges on spectator electrodes is a critical requirement of the SmartPET project. Such a task requires the precise measurement of small amplitude pulses. Induced charge magnitudes on the SmartPET detectors were therefore studied and the suitability of wavelet analysis for de-noising the signals was investigated. It was found that the absolute net maximum induced charge magnitude from the two electrodes adjacent to the collecting electrode is 17% of the real charge magnitude for the AC side and 20% for the DC side. It was also found that wavelet analysis could identify induced charges of comparable magnitude to the system noise.
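
    As a generic illustration of wavelet de-noising of small pulses, the sketch below uses PyWavelets with a universal soft threshold. The wavelet choice, decomposition level and threshold rule are assumptions for the example, not the SmartPET analysis chain.

```python
# Wavelet de-noising sketch: soft-threshold the detail coefficients of a
# small pulse buried in noise (universal threshold from the finest level).
import numpy as np
import pywt

def wavelet_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))             # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

t = np.arange(2048)
pulse = 0.2 * np.exp(-((t - 1000) / 40.0) ** 2)           # small induced charge
noisy = pulse + 0.05 * np.random.default_rng(2).standard_normal(t.size)
print(np.max(wavelet_denoise(noisy)))
```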

  8. Analysis and logical modeling of biological signaling transduction networks

    Science.gov (United States)

    Sun, Zhongyao

    The study of network theory and its applications spans a multitude of seemingly disparate fields of science and technology: computer science, biology, social science, linguistics, etc. It is the intrinsic similarities embedded in the entities and the way they interact with one another in these systems that link them together. In this dissertation, I present, from both the aspect of theoretical analysis and the aspect of application, three projects which primarily focus on signal transduction networks in biology. In these projects, I assembled a network model through extensively perusing the literature, performed model-based simulations and validation, analyzed network topology, and proposed a novel network measure. The application of network modeling to the system of stomatal opening in plants revealed a fundamental question about the process that has been left unanswered for decades. The novel measure of the redundancy of signal transduction networks with Boolean dynamics, obtained by calculating the maximum node-independent elementary signaling mode set, accurately predicts the effect of single-node knockout in such signaling processes. The three projects as an organic whole advance the understanding of a real system as well as the behavior of such network models, giving me an opportunity to take a glimpse at the dazzling facets of the immense world of network science.

  9. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  10. Repeatability study of replicate crash tests: A signal analysis approach.

    Science.gov (United States)

    Seppi, Jeremy; Toczyski, Jacek; Crandall, Jeff R; Kerrigan, Jason

    2017-10-03

    To provide an objective basis on which to evaluate the repeatability of vehicle crash test methods, a recently developed signal analysis method was used to evaluate correlation of sensor time history data between replicate vehicle crash tests. The goal of this study was to evaluate the repeatability of rollover crash tests performed with the Dynamic Rollover Test System (DRoTS) relative to other vehicle crash test methods. Test data from DRoTS tests, deceleration rollover sled (DRS) tests, frontal crash tests, frontal offset crash tests, small overlap crash tests, small overlap impact (SOI) crash tests, and oblique crash tests were obtained from the literature and publicly available databases (the NHTSA vehicle database and the Insurance Institute for Highway Safety TechData) to examine crash test repeatability. Signal analysis of the DRoTS tests showed that force and deformation time histories had good to excellent repeatability, whereas vehicle kinematics showed only fair repeatability due to the vehicle mounting method for one pair of tests and slightly dissimilar mass properties (2.2%) in a second pair of tests. Relative to the DRS, the DRoTS tests showed very similar or higher levels of repeatability in nearly all vehicle kinematic data signals with the exception of global X' (road direction of travel) velocity and displacement due to the functionality of the DRoTS fixture. Based on the average overall scoring metric of the dominant acceleration, DRoTS was found to be as repeatable as all other crash tests analyzed. Vertical force measures showed good repeatability and were on par with frontal crash barrier forces. Dynamic deformation measures showed good to excellent repeatability as opposed to poor repeatability seen in SOI and oblique deformation measures. Using the signal analysis method as outlined in this article, the DRoTS was shown to have the same or better repeatability of crash test methods used in government regulatory and consumer evaluation test

  11. A Review of Sleep Disorder Diagnosis by Electromyogram Signal Analysis.

    Science.gov (United States)

    Shokrollahi, Mehrnaz; Krishnan, Sridhar

    2015-01-01

    Sleep and sleep-related problems play a role in a large number of human disorders and affect every field of medicine. It is estimated that 50 to 70 million Americans suffer from a chronic sleep disorder, which hinders their daily life, affects their health, and confers a significant economic burden to society. The negative public health consequences of sleep disorders are enormous and could have long-term effects, including increased risk of hypertension, diabetes, obesity, heart attack, stroke and, in some cases, death. Polysomnographic modalities can monitor sleep cycles to identify disrupted sleep patterns, adjust treatments, increase therapeutic options and enhance quality of life by recording the electroencephalogram (EEG), electromyogram (EMG) and electrocardiogram (ECG). Although the skills acquired by medical facilitators are quite extensive, it is just as important for them to have access to an assortment of technologies and to further improve their monitoring and treatment capabilities. Computer-aided analysis is one advantageous technique that could provide quantitative indices for sleep disorder screening. Evolving evidence suggests that Parkinson's disease may be associated with rapid eye movement sleep behavior disorder (RBD). With this article, we review studies related to EMG signal analysis for the detection of neuromuscular diseases that result from sleep movement disorders. The article also describes recent progress in the analysis of EMG signals using temporal analysis, frequency-domain analysis, time-frequency analysis, and sparse representations, followed by a comparison of the recent research.

  12. A Novel Method for Detection of Epilepsy in Short and Noisy EEG Signals Using Ordinal Pattern Analysis

    Directory of Open Access Journals (Sweden)

    Iman Veisi

    2010-03-01

    Full Text Available Introduction: In this paper, a novel complexity measure is proposed to detect dynamical changes in nonlinear systems using ordinal pattern analysis of time series data taken from the system. Epilepsy is considered as a dynamical change in nonlinear and complex brain system. The ability of the proposed measure for characterizing the normal and epileptic EEG signals when the signal is short or is contaminated with noise is investigated and compared with some traditional chaos-based measures. Materials and Methods: In the proposed method, the phase space of the time series is reconstructed and then partitioned using ordinal patterns. The partitions can be labeled using a set of symbols. Therefore, the state trajectory is converted to a symbol sequence. A finite state machine is then constructed to model the sequence. A new complexity measure is proposed to detect dynamical changes using the state transition matrix of the state machine. The proposed complexity measure was applied to detect epilepsy in short and noisy EEG signals and the results were compared with some chaotic measures. Results: The results indicate that this complexity measure can distinguish normal and epileptic EEG signals with an accuracy of more than 97% for clean EEG and more than 75% for highly noised EEG signals. Discussion and Conclusion: The complexity measure can be computed in a very fast and easy way and, unlike traditional chaotic measures, is robust with respect to noise corrupting the data. This measure is also capable of dynamical change detection in short time series data.
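
    The sketch below illustrates the general idea of ordinal pattern analysis in the spirit of the abstract: the series is mapped to ordinal-pattern symbols, a state-transition matrix of the symbol sequence is built, and the predictability of transitions is summarised by a conditional entropy. The pattern order, state count and statistic are illustrative assumptions, not the paper's exact measure.

```python
# Ordinal-pattern symbolization plus transition-matrix entropy (sketch).
import numpy as np
from itertools import permutations

def ordinal_symbols(x, order=3):
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    windows = np.lib.stride_tricks.sliding_window_view(x, order)
    return np.array([patterns[tuple(np.argsort(w))] for w in windows])

def transition_entropy(symbols, n_states):
    T = np.zeros((n_states, n_states))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[a, b] += 1
    row = T.sum(axis=1, keepdims=True)
    P = np.divide(T, row, out=np.zeros_like(T), where=row > 0)
    pi = row.ravel() / row.sum()                     # state occupancy
    logP = np.log2(P, out=np.zeros_like(P), where=P > 0)
    h = -np.sum(P * logP, axis=1)                    # entropy of each row
    return float(np.dot(pi, h))                      # bits per transition

x = np.random.default_rng(3).standard_normal(5000)
s = ordinal_symbols(x)
print(transition_entropy(s, n_states=6))
```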

  13. A Study of Wavelet Analysis and Data Extraction from Second-Order Self-Similar Time Series

    Directory of Open Access Journals (Sweden)

    Leopoldo Estrada Vargas

    2013-01-01

    Full Text Available Statistical analysis and synthesis of self-similar discrete time signals are presented. The analysis equation is formally defined through a special family of basis functions of which the simplest case matches the Haar wavelet. The original discrete time series is synthesized without loss by a linear combination of the basis functions after some scaling, displacement, and phase shift. The decomposition is then used to synthesize a new second-order self-similar signal with a different Hurst index than the original. The components are also used to describe the behavior of the estimated mean and variance of self-similar discrete time series. It is shown that the sample mean, although unbiased, provides less information about the process mean as the Hurst index increases. It is also demonstrated that the classical variance estimator is biased and that the widely accepted aggregated-variance-based estimator of the Hurst index is biased not due to its nature (it is unbiased and has minimal variance) but due to flaws in its implementation. Using the proposed decomposition, the correct estimation of the Variance Plot is described, as well as its close association with the popular Logscale Diagram.
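
    For orientation, the sketch below shows the textbook aggregated-variance ("Variance Plot") estimate of the Hurst index that the abstract refers to: block means are formed for several block sizes m and the slope of log Var versus log m, approximately 2H - 2, gives H. Block sizes and the test signal are illustrative; this is the classical estimator, not the paper's corrected procedure.

```python
# Classical aggregated-variance Hurst estimator (sketch).
import numpy as np

def hurst_aggregated_variance(x, block_sizes=(4, 8, 16, 32, 64, 128)):
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in block_sizes:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var(ddof=1)))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0            # Var ~ m**(2H - 2)

increments = np.random.default_rng(4).standard_normal(2**14)  # white noise, H ~ 0.5
print(hurst_aggregated_variance(increments))
```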

  14. Random signal tomographical analysis of two-phase flow

    International Nuclear Information System (INIS)

    Han, P.; Wesser, U.

    1990-01-01

    This paper reports on radiation tomography, which is a useful tool for studying the internal structures of two-phase flow. However, general tomography analysis gives only time-averaged results, hence much information is lost. As a result, it is sometimes difficult to identify the flow regime; for example, the time-averaged picture does not significantly change as an annular flow develops from a slug flow. A two-phase flow diagnostic technique based on random signal tomographical analysis is developed. It extracts more information by studying the statistical variation of the measured signal with time. Local statistical parameters, including mean value, variance, skewness and flatness, are reconstructed from the information obtained by a general tomography technique. More important information is provided by the results: not only can the void fraction be easily calculated, but the flow pattern can also be identified more objectively and more accurately. The experimental setup is introduced. It consisted of a two-phase flow loop, an X-ray system, a fan-like five-beam detector system and a signal acquisition and processing system. In the experiment, for both horizontal and vertical test sections (aluminum and steel tubes with Di/Do = 40/45 mm), different flow situations were realized by independently adjusting the air and water mass flows. Through a glass tube connected to the test section, some typical flow patterns were visualized and used for comparison with the reconstruction results.

  15. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return price of gold, West Texas Intermediate and Brent crude oil, foreign exchange rate data, over a period of 18 years. The cross correlation has been measured from the Hurst scaling exponents and the singularity spectrum quantitatively. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possess uncorrelated behavior and the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q0 and for one bivariate series the cross-correlation exponent is greater than GHE for all q values.

  16. A Reception Analysis on the Youth Audiences of TV Series in Marivan

    Directory of Open Access Journals (Sweden)

    Omid Karimi

    2014-03-01

    Full Text Available The aim of this article is to describe the role of foreign media as agitators of popular culture. To that end, reception analysis is used to describe how young audiences decode these series. Globalization theory and reception theory in communication form the theoretical framework of the article. The methodology of this research is qualitative, and two techniques, in-depth interview and observation, are used for data collection. The results show that different people, based on individual features and social and cultural backgrounds, have inclinations toward particular characters and identify with them. This inclination goes so far that the audience follows the series because of his or her favorite character. Also, there is a great compatibility between audience backgrounds and their receptions. A number of audience members criticized the series and pointed out its negative consequences for society; however, they continue watching, in effect preferring the enjoyment of watching the series to its risks.

  17. Cerebral venous sinus thrombosis on MRI: A case series analysis

    Directory of Open Access Journals (Sweden)

    Sanjay M Khaladkar

    2014-01-01

    Full Text Available Background: Cerebral venous sinus thrombosis (CVST) is a rare form of stroke seen in the young and middle-aged group, especially in women, due to thrombosis of the dural venous sinuses, and can cause acute neurological deterioration with increased morbidity and mortality if not diagnosed at an early stage. Neurological deficit occurs due to focal or diffuse cerebral edema and venous non-hemorrhagic or hemorrhagic infarct. Aim and Objectives: To evaluate the role of Magnetic Resonance Imaging (MRI) and Magnetic Resonance Venography (MRV) as imaging modalities for early diagnosis of CVST, and to study patterns of venous thrombosis, changes in brain parenchyma and residual effects of CVST using MRI. Materials and Methods: Retrospective descriptive analysis of 40 patients with CVST diagnosed on MRI brain and MRV was done. Results: 29/40 (72.5%) were males and 11/40 (27.5%) were females. Most of the patients were in the age group of 21-40 years (23/40, 57.5%). Most of the patients, 16/40 (40%), presented within 7 days. No definite cause of CVST was found in 24 (60%) patients in spite of detailed history. In 36/40 (90%) of cases major sinuses were involved, the deep venous system was involved in 7/40 (17.5%) cases, and a superficial cortical vein was involved in 1/40 (2.5%) of cases. Analysis of the stage of thrombus (acute, subacute, chronic) was done based on its appearance on T1 and T2WI. 31/40 (77.5%) patients showed complete absence of flow on MRV, while 9/40 (22.5%) cases showed partial flow on the MR venogram. Brain parenchyma was normal in 20/40 (50%) patients, while 6/40 (15%) cases had non-hemorrhagic infarct and 14/40 (35%) patients presented with hemorrhagic infarct. Conclusion: Our study concluded that MRI brain with MRV is sensitive in diagnosing both direct signs (evidence of thrombus inside the affected veins) and indirect signs (parenchymal changes) of CVST and in their follow-up.

  18. Chaos in Electronic Circuits: Nonlinear Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wheat, Jr., Robert M. [Kennedy Western Univ., Cheyenne, WY (United States)

    2003-07-01

    Chaos in electronic circuits is a phenomenon that was largely ignored by engineers, manufacturers, and researchers until the early 1990s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices become more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. Also, a better understanding of chaotic behavior, in a general sense, could potentially lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step in this improvement of understanding, and as part of the research being reported here, methods of computer modeling, identifying and analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques have been implemented using the C and C++ programming languages, and the chaotically behaving circuits were developed using “off the shelf” electronic components.

  19. Financing Human Development for Sectorial Growth: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Shobande Abdul Olatunji

    2017-06-01

    Full Text Available The role which financing human development plays in fostering the sectorial growth of an economy cannot be undermined. It is a key instrument which can be utilized to alleviate poverty, create employment and ensure the sustenance of economic growth and development. Thus financing human development for sectorial growth has taken the center stage of economic growth and development strategies in most countries. In a constructive effort to examine the in-depth relationship between the variables in the Nigerian space, this paper provides evidence on the impact of financing human development and sectorial growth in Nigeria between 1982 and 2016, using the Johansen co-integration techniques to test for co-integration among the variables and the Vector Error Correction Model (VECM) to ascertain the speed of adjustment of the variables to their long run equilibrium position. The analysis shows that a long and short run relationship exists between financing human capital development and sectorial growth during the period reviewed. Therefore, the paper argues that for an active foundation for sustainable sectorial growth and development, financing human capital development across each unit is urgently required through increased budgetary allocation for both health and educational sectors since they are key components of human capital development in a nation.

  20. Unveiling Hidden Dynamics of Hippo Signalling: A Systems Analysis

    Directory of Open Access Journals (Sweden)

    Sung-Young Shin

    2016-08-01

    Full Text Available The Hippo signalling pathway has recently emerged as an important regulator of cell apoptosis and proliferation with significant implications in human diseases. In mammals, the pathway contains the core kinases MST1/2, which phosphorylate and activate LATS1/2 kinases. The pro-apoptotic function of the MST/LATS signalling axis was previously linked to the Akt and ERK MAPK pathways, demonstrating that the Hippo pathway does not act alone but crosstalks with other signalling pathways to coordinate network dynamics and cellular outcomes. These crosstalks were characterised by a multitude of complex regulatory mechanisms involving competitive protein-protein interactions and phosphorylation mediated feedback loops. However, how these different mechanisms interplay in different cellular contexts to drive the context-specific network dynamics of Hippo-ERK signalling remains elusive. Using mathematical modelling and computational analysis, we uncovered that the Hippo-ERK network can generate highly diverse dynamical profiles that can be clustered into distinct dose-response patterns. For each pattern, we offered mechanistic explanation that defines when and how the observed phenomenon can arise. We demonstrated that Akt displays opposing, dose-dependent functions towards ERK, which are mediated by the balance between the Raf-1/MST2 protein interaction module and the LATS1 mediated feedback regulation. Moreover, Ras displays a multi-functional role and drives biphasic responses of both MST2 and ERK activities; which are critically governed by the competitive protein interaction between MST2 and Raf-1. Our study represents the first in-depth and systematic analysis of the Hippo-ERK network dynamics and provides a concrete foundation for future studies.

  1. Assessing error sources for Landsat time series analysis for tropical test sites in Viet Nam and Ethiopia

    Science.gov (United States)

    Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio

    2013-10-01

    Researchers who use remotely sensed data can spend half of their total effort on analysing prior data. If this data preprocessing does not match the application, the time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Deforestation and Forest Degradation (REDD) program, accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of the Breaks For Additive Season and Trend (BFAST) monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the occurring forest change drivers was assessed using recently captured ground truth and high resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources was performed, identifying haze as a major source of commission error in the time series analysis.

  2. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-01-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn → ²¹⁶Po (T½ = 145 ms) (Th-series) and ²¹⁹Rn → ²¹⁵Po (T½ = 1.78 ms) (Ac-series). By using TIA in addition to measurement of ²²⁶Ra (U-series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U- and Th-series together with the Ac-series were determined, respectively, from alpha spectra and TIA carried out immediately after Ra-extraction. Using the ²²¹Fr → ²¹⁷At (T½ = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to the ²²⁵Ra (Np-decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
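
    A minimal sketch of the delayed-coincidence idea follows: intervals between successive pulses are examined and those short enough to be parent-daughter decay pairs (a few half-lives of the Po daughter) are counted and compared with the chance-coincidence expectation. The window length, simulated rates and the simple chance estimate are illustrative assumptions, not the published procedure.

```python
# Time-interval analysis sketch: count successive-pulse intervals that fall
# inside a delayed-coincidence window and compare with random expectation.
import numpy as np

def count_fast_pairs(arrival_times_s, window_s):
    dt = np.diff(np.sort(arrival_times_s))
    return int(np.sum(dt <= window_s))

rng = np.random.default_rng(5)
background = np.cumsum(rng.exponential(1.0, size=2000))        # ~1 cps random pulses
parents = np.sort(rng.uniform(0, background[-1], size=200))    # 220Rn-like decays
daughters = parents + rng.exponential(0.145 / np.log(2), 200)  # 216Po, T1/2 = 145 ms
events = np.concatenate([background, parents, daughters])

window = 3 * 0.145                                             # ~3 half-lives
observed = count_fast_pairs(events, window)
rate = len(events) / events.max()
chance = (len(events) - 1) * (1 - np.exp(-rate * window))      # expected random pairs
print(observed, round(chance, 1))
```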

  3. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured at the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that the wind speed measured at 70 ft was the most variable. Further, the turbulence persisted longer in the 70-ft measurement than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft led to the greatest persistence to an 'own' shock and to the greatest persistence in the responses of the other wind series. (author)
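
    The sketch below shows what fitting a vector autoregression to multi-height wind-speed series and inspecting impulse responses can look like with statsmodels. Note that statsmodels reports (orthogonalized) impulse responses out of the box; the generalized impulse responses used in the paper would need to be derived separately from the fitted coefficients and residual covariance. The column names and synthetic data are illustrative.

```python
# VAR model of four wind-speed series plus impulse-response analysis (sketch).
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
n = 2000
base = np.cumsum(rng.standard_normal(n)) * 0.01 + 10.0
data = pd.DataFrame({
    "u13ft": base + rng.standard_normal(n) * 0.8,
    "u33ft": base + rng.standard_normal(n) * 0.6,
    "u70ft": base + rng.standard_normal(n) * 1.0,
    "u160ft": base + rng.standard_normal(n) * 0.5,
})

model = VAR(data)
results = model.fit(maxlags=10, ic="aic")    # lag order chosen by AIC
irf = results.irf(20)                        # responses over 20 steps
print(results.k_ar, irf.irfs.shape)          # chosen lag, (steps+1, k, k)
```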

  4. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.

    1998-01-01

    Fractal analyses of indoor 222 Rn time series were performed using different chaos theory based measurements such as time delay method, Hurst's rescaled range analysis, capacity (fractal) dimension, and Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents which is a hint to chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)

  5. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes directly from two discrete time series auto- and cross-spectra, transfer and coherence functions, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith, and perhaps with minor modifications, with any other hardware system. (author)
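
    For comparison with this FFT-based program, the sketch below computes auto-/cross-spectra, a transfer-function estimate and the coherence of two discrete series with Welch averaging in SciPy. The sampling rate, segment length and test signals are illustrative assumptions, not CROSAT itself.

```python
# Auto-/cross-spectra, transfer function and coherence of two series (sketch).
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(7)
x = np.sin(2 * np.pi * 5 * t) + rng.standard_normal(t.size)
y = 0.5 * np.sin(2 * np.pi * 5 * t + 0.8) + rng.standard_normal(t.size)

f, Pxx = signal.welch(x, fs=fs, nperseg=1024)          # auto-spectrum of x
_, Pyy = signal.welch(y, fs=fs, nperseg=1024)          # auto-spectrum of y
_, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)         # cross-spectrum (complex)
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=1024)   # magnitude-squared coherence

H = Pxy / Pxx                                          # simple transfer-function estimate
peak = np.argmax(Cxy)
print(f[peak], np.abs(H[peak]))
```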

  6. Electromagnetic modeling method for eddy current signal analysis

    International Nuclear Information System (INIS)

    Lee, D. H.; Jung, H. K.; Cheong, Y. M.; Lee, Y. S.; Huh, H.; Yang, D. J.

    2004-10-01

    An electromagnetic modeling method for eddy current signal analysis is necessary before an experiment is performed. Electromagnetic modeling methods consist of the analytical method and the numerical method. The numerical methods can be divided into the Finite Element Method (FEM), the Boundary Element Method (BEM) and the Volume Integral Method (VIM). Each modeling method has its merits and demerits. Therefore, a suitable modeling method can be chosen by considering the characteristics of each. This report explains the principle and application of each modeling method and compares the corresponding modeling programs.

  7. Interleukin-2 signaling pathway analysis by quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Osinalde, Nerea; Moss, Helle; Arrizabalaga, Onetsine

    2011-01-01

    among which 79 were found with increased abundance in the tyrosine-phosphorylated complexes, including several previously not reported IL-2 downstream effectors. Combinatorial site-specific phosphoproteomic analysis resulted in identification of 99 phosphorylated sites mapping to the identified proteins...... with increased abundance in the tyrosine-phosphorylated complexes, of which 34 were not previously described. In addition, chemical inhibition of the identified IL-2-mediated JAK, PI3K and MAPK signaling pathways, resulted in distinct alteration on the IL-2 dependent proliferation....

  8. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes

    Science.gov (United States)

    Sert, Olcay

    2008-01-01

    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  9. Applications of Some Classes of Sequences on Approximation of Functions (Signals) by Almost Generalized Nörlund Means of Their Fourier Series

    Directory of Open Access Journals (Sweden)

    Xhevat Z. Krasniqi

    2015-11-01

    Full Text Available In this paper, using rest bounded variation sequences and head bounded variation sequences, some new results on the approximation of functions (signals) by almost generalized Nörlund means of their Fourier series are obtained. To the best of our knowledge, this is the first time such classes of sequences have been used for approximations of the type treated in this paper. In addition, several corollaries are derived from our results as well as from those obtained previously by others.

  10. Nonlinear analysis of magnetospheric data Part I. Geometric characteristics of the AE index time series and comparison with nonlinear surrogate data

    Directory of Open Access Journals (Sweden)

    G. P. Pavlos

    1999-01-01

    Full Text Available A long AE index time series is used as a crucial magnetospheric quantity in order to study the underlying dynamics. For this purpose we utilize methods of nonlinear and chaotic analysis of time series. Two basic components of this analysis are the reconstruction of the experimental time series state space trajectory of the underlying process and the statistical testing of a null hypothesis. The null hypothesis against which the experimental time series are tested is that the observed AE index signal is generated by a linear stochastic signal possibly perturbed by a static nonlinear distortion. As discriminating statistics we use geometrical characteristics of the reconstructed state space (Part I, which is the work of this paper) and dynamical characteristics (Part II, which is the work of a separate paper), and "nonlinear" surrogate data, generated by two different techniques which can mimic the original (AE index) signal. The null hypothesis is tested for geometrical characteristics, which are the dimension of the reconstructed trajectory and some new geometrical parameters introduced in this work for the efficient discrimination between the nonlinear stochastic surrogate data and the AE index. Finally, the estimated geometric characteristics of the magnetospheric AE index present new evidence about the nonlinear and low dimensional character of the underlying magnetospheric dynamics for the AE index.
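
    As an illustration of the surrogate-data testing mentioned above, the sketch below generates phase-randomized (Fourier-transform) surrogates, which preserve the power spectrum, and hence the linear correlations, of the original series while destroying nonlinear structure, so that a discriminating statistic can be tested against the linear-stochastic null hypothesis. The surrogate count and test signal are illustrative; the paper's own surrogate schemes may differ.

```python
# Phase-randomized (FT) surrogate generation for null-hypothesis testing (sketch).
import numpy as np

def ft_surrogate(x, rng):
    X = np.fft.rfft(x - np.mean(x))
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                       # keep the DC term real
    if len(x) % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist term real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=len(x)) + np.mean(x)

rng = np.random.default_rng(8)
x = np.sin(np.linspace(0, 40 * np.pi, 4096)) ** 3 + 0.2 * rng.standard_normal(4096)
surrogates = np.array([ft_surrogate(x, rng) for _ in range(39)])
# a one-sided rank test at ~97.5% requires the statistic of x to exceed all 39 surrogates
print(surrogates.shape)
```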

  11. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Science.gov (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  12. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

    Full Text Available Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
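
    The sketch below illustrates the general idea of such a measure: the calcium trace is discretized into amplitude states (quantile bins here), the Markov transition matrix is estimated, and the predictability of transitions is summarised by the entropy rate. The bin count, binning rule and statistic are illustrative assumptions, not the authors' exact formulation.

```python
# Markov-process view of a calcium trace: quantile states, transition matrix,
# entropy rate in bits per step (sketch).
import numpy as np

def markov_entropy_rate(trace, n_states=8):
    edges = np.quantile(trace, np.linspace(0, 1, n_states + 1)[1:-1])
    states = np.digitize(trace, edges)                 # 0 .. n_states-1
    T = np.zeros((n_states, n_states))
    for a, b in zip(states[:-1], states[1:]):
        T[a, b] += 1
    row = T.sum(axis=1, keepdims=True)
    P = np.divide(T, row, out=np.zeros_like(T), where=row > 0)
    pi = row.ravel() / row.sum()                       # state occupancy
    logP = np.log2(P, out=np.zeros_like(P), where=P > 0)
    return float(-np.sum(pi[:, None] * P * logP))

rng = np.random.default_rng(9)
irregular = np.cumsum(rng.standard_normal(3000)) * 0.1 + rng.standard_normal(3000)
spiky = np.where(rng.random(3000) < 0.02, 5.0, 0.0) + 0.1 * rng.standard_normal(3000)
print(markov_entropy_rate(irregular), markov_entropy_rate(spiky))
```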

  13. Selecting the optimal anti-aliasing filter for multichannel biosignal acquisition intended for inter-signal phase shift analysis

    International Nuclear Information System (INIS)

    Keresnyei, Róbert; Hejjel, László; Megyeri, Péter; Zidarics, Zoltán

    2015-01-01

    The availability of microcomputer-based portable devices facilitates high-volume multichannel biosignal acquisition and the analysis of instantaneous oscillations and inter-signal temporal correlations. These new, non-invasively obtained parameters can have considerable prognostic or diagnostic roles. The present study investigates the inherent signal delay of the obligatory anti-aliasing filters. One cycle of each of the 8 electrocardiogram (ECG) and 4 photoplethysmogram signals from healthy volunteers, or artificially synthesised series, was passed through 100–80–60–40–20 Hz 2–4–6–8th order Bessel and Butterworth filters digitally synthesized by bilinear transformation, which resulted in a negligible error in signal delay compared to the mathematical model of the impulse and step responses of the filters. The investigated filters have signal delays as diverse as 2–46 ms depending on the filter parameters and the signal slew rate, which is difficult to predict in biological systems and thus difficult to compensate for. Its magnitude can be comparable to the examined phase shifts, deteriorating the accuracy of the measurement. In conclusion, identical or very similar anti-aliasing filters with lower orders and higher corner frequencies, oversampling, and digital low-pass filtering are recommended for biosignal acquisition intended for inter-signal phase shift analysis. (note)
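
    The sketch below compares the passband group delay of digital Bessel and Butterworth low-pass filters designed by bilinear transformation in SciPy, which is the kind of filter-dependent delay the study quantifies. The orders, corner frequency and sampling rate are illustrative assumptions, not the paper's measurement setup.

```python
# Group-delay comparison of Bessel vs Butterworth anti-aliasing filters (sketch).
import numpy as np
from scipy import signal

fs = 1000.0          # sampling rate, Hz
fc = 40.0            # corner frequency, Hz

for name, design in (("bessel", signal.bessel), ("butter", signal.butter)):
    for order in (2, 4, 8):
        b, a = design(order, fc, btype="low", fs=fs)       # bilinear (digital) design
        w, gd = signal.group_delay((b, a), w=512, fs=fs)   # gd in samples
        passband = w < fc
        delay_ms = np.mean(gd[passband]) / fs * 1000.0
        print(f"{name} order {order}: ~{delay_ms:.1f} ms mean passband delay")
```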

  14. Advanced Time-Frequency Representation in Voice Signal Analysis

    Directory of Open Access Journals (Sweden)

    Dariusz Mika

    2018-03-01

    Full Text Available The most commonly used time-frequency representation in voice signal analysis is the spectrogram. This representation belongs to Cohen's class, the class of time-frequency energy distributions. From the standpoint of resolution properties, the spectrogram is not optimal. Within the Cohen class, representations are known which have better resolution properties. All of them are created by smoothing the Wigner-Ville distribution (WVD), which is characterized by the best resolution but also by the strongest harmful interference terms. The smoothing functions used determine a compromise between resolution properties and the elimination of the harmful interference terms. Another class of time-frequency energy distributions is the affine class of distributions. From the point of view of the readability of the analysis, the best properties are obtained by the redistribution of energy achieved with a general methodology referred to as reassignment, which can be applied to any time-frequency representation. Reassigned distributions efficiently combine a reduction of the interference terms, provided by a well-adapted smoothing kernel, with an increased concentration of the signal components.
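
    As a baseline illustration, the sketch below computes a plain spectrogram of a synthetic voice-like signal with SciPy; the window length sets the usual time-frequency resolution trade-off that smoothed Cohen-class distributions and reassignment try to improve on. All parameters and the test signal are illustrative.

```python
# Plain spectrogram of a synthetic rising two-harmonic "voice" signal (sketch).
import numpy as np
from scipy import signal

fs = 16000
t = np.arange(0, 1.0, 1 / fs)
x = signal.chirp(t, f0=120, f1=400, t1=1.0, method="linear")      # rising "pitch"
x += 0.3 * signal.chirp(t, f0=240, f1=800, t1=1.0, method="linear")

f, tt, Sxx = signal.spectrogram(x, fs=fs, window="hann",
                                nperseg=512, noverlap=384)
print(Sxx.shape)                                          # (frequency bins, time frames)
print(f[np.argmax(Sxx[:, 0])], f[np.argmax(Sxx[:, -1])])  # dominant bin: start vs end
```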

  15. Advances in Photopletysmography Signal Analysis for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Jermana L. Moraes

    2018-06-01

    Full Text Available Heart Rate Variability (HRV) is an important tool for the analysis of a patient’s physiological conditions, as well as a method aiding the diagnosis of cardiopathies. Photoplethysmography (PPG) is an optical technique applied in the monitoring of HRV, and its adoption has been growing significantly compared to the most commonly used method in medicine, Electrocardiography (ECG). In this survey, definitions of these techniques are presented, the different types of sensors used are explained, and the methods for the study and analysis of the PPG signal (linear and nonlinear methods) are described. Moreover, the progress and the clinical and practical applicability of the PPG technique in the diagnosis of cardiovascular diseases are evaluated. In addition, the latest technologies utilized in the development of new tools for medical diagnosis are presented, such as the Internet of Things, the Internet of Health Things, genetic algorithms, artificial intelligence and biosensors, which result in personalized advances in e-health and health care. After the study of these technologies, it can be noted that PPG associated with them is an important tool for the diagnosis of some diseases, due to its simplicity, its cost–benefit ratio, the ease of signal acquisition, and especially because it is a non-invasive technique.

  16. Continuous EEG signal analysis for asynchronous BCI application.

    Science.gov (United States)

    Hsu, Wei-Yen

    2011-08-01

    In this study, we propose a two-stage recognition system for the continuous analysis of electroencephalogram (EEG) signals. An independent component analysis (ICA) and correlation coefficient are used to automatically eliminate the electrooculography (EOG) artifacts. Based on the continuous wavelet transform (CWT) and Student's two-sample t-statistics, active segment selection then detects the location of the active segment in the time-frequency domain. Next, multiresolution fractal feature vectors (MFFVs) are extracted with the proposed modified fractal dimension from wavelet data. Finally, the support vector machine (SVM) is adopted for the robust classification of MFFVs. The EEG signals are continuously analyzed in 1-s segments, moving forward every 0.5 s to simulate asynchronous BCI operation in the two-stage recognition architecture. The segment is first recognized as lifted or not in the first stage, and is then classified as left or right finger lifting at stage two if the segment is recognized as lifting in the first stage. Several statistical analyses are used to evaluate the performance of the proposed system. The results indicate that it is a promising system for asynchronous BCI applications.

  17. Artifact suppression and analysis of brain activities with electroencephalography signals.

    Science.gov (United States)

    Rashed-Al-Mahfuz, Md; Islam, Md Rabiul; Hirose, Keikichi; Molla, Md Khademul Islam

    2013-06-05

    Brain-computer interface is a communication system that connects the brain with computer (or other devices) but is not dependent on the normal output of the brain (i.e., peripheral nerve and muscle). Electro-oculogram is a dominant artifact which has a significant negative influence on further analysis of real electroencephalography data. This paper presented a data adaptive technique for artifact suppression and brain wave extraction from electroencephalography signals to detect regional brain activities. Empirical mode decomposition based adaptive thresholding approach was employed here to suppress the electro-oculogram artifact. Fractional Gaussian noise was used to determine the threshold level derived from the analysis data without any training. The purified electroencephalography signal was composed of the brain waves also called rhythmic components which represent the brain activities. The rhythmic components were extracted from each electroencephalography channel using adaptive wiener filter with the original scale. The regional brain activities were mapped on the basis of the spatial distribution of rhythmic components, and the results showed that different regions of the brain are activated in response to different stimuli. This research analyzed the activities of a single rhythmic component, alpha with respect to different motor imaginations. The experimental results showed that the proposed method is very efficient in artifact suppression and identifying individual motor imagery based on the activities of alpha component.

  18. Improved signal analysis for motional Stark effect data

    International Nuclear Information System (INIS)

    Makowski, M.A.; Allen, S.L.; Ellis, R.; Geer, R.; Jayakumar, R.J.; Moller, J.M.; Rice, B.W.

    2005-01-01

    Nonideal effects in the optical train of the motional Stark effect diagnostic have been modeled using the Mueller matrix formalism. The effects examined are birefringence in the vacuum windows, an imperfect reflective mirror, and signal pollution due to the presence of a circularly polarized light component. Relations for the measured intensity ratio are developed for each case. These relations suggest fitting functions to more accurately model the calibration data. One particular function, termed the tangent offset model, is found to fit the data for all channels better than the currently used tangent slope function. Careful analysis of the calibration data with the fitting functions reveals that a nonideal effect is present in the edge array and is attributed to nonideal performance of a mirror in that system. The result of applying the fitting function to the analysis of our data has been to improve the equilibrium reconstruction

  19. Signal analysis of accelerometry data using gravity-based modeling

    Science.gov (United States)

    Davey, Neil P.; James, Daniel A.; Anderson, Megan E.

    2004-03-01

    Triaxial accelerometers have been used to measure human movement parameters in swimming. Interpretation of data is difficult due to interference sources including interaction of external bodies. In this investigation the authors developed a model to simulate the physical movement of the lower back. Theoretical accelerometery outputs were derived thus giving an ideal, or noiseless dataset. An experimental data collection apparatus was developed by adapting a system to the aquatic environment for investigation of swimming. Model data was compared against recorded data and showed strong correlation. Comparison of recorded and modeled data can be used to identify changes in body movement, this is especially useful when cyclic patterns are present in the activity. Strong correlations between data sets allowed development of signal processing algorithms for swimming stroke analysis using first the pure noiseless data set which were then applied to performance data. Video analysis was also used to validate study results and has shown potential to provide acceptable results.

  20. Leak detection in pipelines through spectral analysis of pressure signals

    Directory of Open Access Journals (Sweden)

    Souza A.L.

    2000-01-01

    Full Text Available The development and test of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leakage detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions of the pipeline (liquid flow rate and leakage configuration). Pressure transients were obtained by four transducers connected to a PC computer. The obtained results show that the spectral analysis of pressure transients, together with the knowledge of reflection points, provides a simple and efficient way of identifying leaks during the start/stop of pumps in pipelines.

  1. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on the temporal scale. However, disturbances or abnormal changes of land cover (caused, for example, by forest fire, flood, deforestation, and plant diseases) occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, the detection results are only labelled with "Change/No change" by most of the present methods, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information laid in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on the Confidence Interval (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by recent severe flooding that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.

  2. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the time series. Furthermore, the lack of discussion on the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These detection points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to the significant trends of the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those of the maximum temperatures for most of the studied stations, particularly at the urban stations. These increases are suspected to be linked with the effect of the urban heat island in addition to the El Niño event.
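
    For reference, the sketch below implements the standard (non-seasonal) Mann-Kendall trend test: the S statistic counts concordant minus discordant pairs, its variance follows the no-ties formula, and the normalized Z score is compared with the standard normal distribution. The ties correction and the sequential (SQ-MK) and Pettitt variants used in the paper are omitted for brevity, and the temperature data are synthetic.

```python
# Mann-Kendall trend test (no ties correction), with a synthetic warming series.
import numpy as np
from scipy import stats

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))    # two-sided p-value
    return z, p

years = np.arange(1980, 2013)
temps = 26.5 + 0.03 * (years - 1980) + np.random.default_rng(10).normal(0, 0.3, len(years))
print(mann_kendall(temps))
```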

  3. Analysis of climatic variations in seasonal precipitation and temperature in Salamanca (Spain); Analisis de las variaciones climaticas en series estacionales de temperatura y precipitacion en Salamanca (Espana)

    Energy Technology Data Exchange (ETDEWEB)

    Garcia Casado, A.; Encinas, A.H.; Rodriguez Puebla, C. [Dpto. de Fisica General y de la Atmosfera Universidad de Salamanca, Salamanca (Spain)

    1996-12-31

    This paper describes the seasonal precipitation and temperature variability in Salamanca. The objectives of the study are: to determine the climate signals on the interannual time-scale within the time series; to redefine the series as a function of the significant oscillation components; and to predict the local precipitation and temperature variables. The methods used are spectral analysis, to obtain the periods of the significant components, and linear and nonlinear regression models, to obtain the analytical functions that best fit the data. (Author) 14 refs.

  4. Forecasting of particulate matter time series using wavelet analysis and wavelet-ARMA/ARIMA model in Taiyuan, China.

    Science.gov (United States)

    Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng

    2017-07-01

    Particulate matter with aerodynamic diameter below 10 μm (PM10) forecasting is difficult because of the uncertainties in describing the emission and meteorological fields. This paper proposed a wavelet-ARMA/ARIMA model to forecast the short-term series of the PM10 concentrations. It was evaluated by experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations of Taiyuan had a decreasing trend during 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that the PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburb areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration of Taiyuan were complicated. (4) The proposed wavelet-ARIMA model could be efficiently and successfully applied to the PM10 forecasting field. Compared with the traditional ARMA/ARIMA methods, this wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. Wavelet analysis can filter noisy signals and identify the variation trend and the fluctuation of the PM10 time-series data. Wavelet decomposition and reconstruction reduce the nonstationarity of the PM10 time-series data, and thus improve the accuracy of the prediction. This paper proposed a wavelet-ARMA/ARIMA model to forecast the PM10 time series. Compared with the traditional ARMA/ARIMA method, this wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. The proposed model could be efficiently and successfully applied to the PM10 forecasting field.
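
    The sketch below outlines one common form of such a wavelet-ARIMA hybrid: the series is split into an approximation and detail components with a discrete wavelet transform, a separate ARIMA model is fitted to each reconstructed component, and the component forecasts are summed. The wavelet, decomposition level, ARIMA orders and synthetic PM10-like data are illustrative assumptions, not the paper's tuned configuration.

```python
# Wavelet-ARIMA hybrid forecast sketch (PyWavelets + statsmodels).
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

def wavelet_components(x, wavelet="db4", level=3):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(keep, wavelet)[: len(x)])
    return comps                                    # approximation + detail components

def hybrid_forecast(x, steps=7):
    forecast = np.zeros(steps)
    for comp in wavelet_components(x):
        fit = ARIMA(comp, order=(1, 1, 1)).fit()    # one model per component
        forecast += np.asarray(fit.forecast(steps))
    return forecast

rng = np.random.default_rng(11)
t = np.arange(1000)
pm10 = 80 + 30 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 10, t.size)
print(hybrid_forecast(pm10))
```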

  5. Information-Theoretic Performance Analysis of Sensor Networks via Markov Modeling of Time Series Data.

    Science.gov (United States)

    Li, Yue; Jha, Devesh K; Ray, Asok; Wettergren, Thomas A

    2018-06-01

    This paper presents information-theoretic performance analysis of passive sensor networks for detection of moving targets. The proposed method falls largely under the category of data-level information fusion in sensor networks. To this end, a measure of information contribution for sensors is formulated in a symbolic dynamics framework. The network information state is approximately represented as the largest principal component of the time series collected across the network. To quantify each sensor's contribution for generation of the information content, Markov machine models as well as x-Markov (pronounced as cross-Markov) machine models, conditioned on the network information state, are constructed; the difference between the conditional entropies of these machines is then treated as an approximate measure of information contribution by the respective sensors. The x-Markov models represent the conditional temporal statistics given the network information state. The proposed method has been validated on experimental data collected from a local area network of passive sensors for target detection, where the statistical characteristics of environmental disturbances are similar to those of the target signal in the sense of time scale and texture. A distinctive feature of the proposed algorithm is that the network decisions are independent of the behavior and identity of the individual sensors, which is desirable from computational perspectives. Results are presented to demonstrate the proposed method's efficacy to correctly identify the presence of a target with very low false-alarm rates. The performance of the underlying algorithm is compared with that of a recent data-driven, feature-level information fusion algorithm. It is shown that the proposed algorithm outperforms the other algorithm.

  6. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of phase synchronization based MST with cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures. Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time series.
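
    A minimal sketch of the pipeline described above, assuming a matrix of time series in the rows: the phase of each series is obtained with the Hilbert transform, pairwise phase synchronization is measured with the phase-locking value, and a minimum spanning tree is built from a distance derived from it (the specific distance mapping is an illustrative assumption):

```python
import numpy as np
import networkx as nx
from scipy.signal import hilbert

def phase_sync_mst(series):
    """series: array of shape (n_series, n_samples). Returns an MST whose
    edge weights are distances derived from pairwise phase synchronization."""
    phases = np.angle(hilbert(series, axis=1))   # instantaneous phases
    n = series.shape[0]
    g = nx.Graph()
    for i in range(n):
        for j in range(i + 1, n):
            # Phase-locking value in [0, 1]; high synchronization -> short distance.
            plv = np.abs(np.mean(np.exp(1j * (phases[i] - phases[j]))))
            g.add_edge(i, j, weight=np.sqrt(2.0 * (1.0 - plv)))
    return nx.minimum_spanning_tree(g)

rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal((6, 1000)), axis=1)   # toy "financial" series
mst = phase_sync_mst(x)
print(sorted(mst.edges(data="weight")))
```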

  7. Using bivariate signal analysis to characterize the epileptic focus: the benefit of surrogates.

    Science.gov (United States)

    Andrzejak, R G; Chicharro, D; Lehnertz, K; Mormann, F

    2011-04-01

    The disease epilepsy is related to hypersynchronous activity of networks of neurons. While acute epileptic seizures are the most extreme manifestation of this hypersynchronous activity, an elevated level of interdependence of neuronal dynamics is thought to persist also during the seizure-free interval. In multichannel recordings from brain areas involved in the epileptic process, this interdependence can be reflected in an increased linear cross correlation but also in signal properties of higher order. Bivariate time series analysis comprises a variety of approaches, each with different degrees of sensitivity and specificity for interdependencies reflected in lower- or higher-order properties of pairs of simultaneously recorded signals. Here we investigate which approach is best suited to detect putatively elevated interdependence levels in signals recorded from brain areas involved in the epileptic process. For this purpose, we use the linear cross correlation that is sensitive to lower-order signatures of interdependence, a nonlinear interdependence measure that integrates both lower- and higher-order properties, and a surrogate-corrected nonlinear interdependence measure that aims to specifically characterize higher-order properties. We analyze intracranial electroencephalographic recordings of the seizure-free interval from 29 patients with an epileptic focus located in the medial temporal lobe. Our results show that all three approaches detect higher levels of interdependence for signals recorded from the brain hemisphere containing the epileptic focus as compared to signals recorded from the opposite hemisphere. For the linear cross correlation, however, these differences are not significant. For the nonlinear interdependence measure, results are significant but only of moderate accuracy with regard to the discriminative power for the focal and nonfocal hemispheres. The highest significance and accuracy are obtained for the surrogate-corrected nonlinear interdependence measure.

  8. Spectral analysis of highly aliased sea-level signals

    Science.gov (United States)

    Ray, Richard D.

    1998-10-01

    Observing high-wavenumber ocean phenomena with a satellite altimeter generally calls for "along-track" analyses of the data: measurements along a repeating satellite ground track are analyzed in a point-by-point fashion, as opposed to spatially averaging data over multiple tracks. The sea-level aliasing problems encountered in such analyses can be especially challenging. For TOPEX/POSEIDON, all signals with frequency greater than 18 cycles per year (cpy), including both tidal and subdiurnal signals, are folded into the 0-18 cpy band. Because the tidal bands are wider than 18 cpy, residual tidal cusp energy, plus any subdiurnal energy, is capable of corrupting any low-frequency signal of interest. The practical consequences of this are explored here by using real sea-level measurements from conventional tide gauges, for which the true oceanographic spectrum is known and to which a simulated "satellite-measured" spectrum, based on coarsely subsampled data, may be compared. At many locations the spectrum is sufficiently red that interannual frequencies remain unaffected. Intra-annual frequencies, however, must be interpreted with greater caution, and even interannual frequencies can be corrupted if the spectrum is flat. The results also suggest that whenever tides must be estimated directly from the altimetry, response methods of analysis are preferable to harmonic methods, even in nonlinear regimes; this will remain so for the foreseeable future. We concentrate on three example tide gauges: two coastal stations on the Malay Peninsula, where the closely aliased K1 and Ssa tides are strong, and Canton Island, where trapped equatorial waves are aliased.
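
    The folding of high-frequency tidal lines into the low-frequency band can be illustrated with a short alias-frequency calculation; the sketch below assumes the nominal TOPEX/POSEIDON repeat period of about 9.9156 days and a few illustrative tidal periods:

```python
import numpy as np

def alias_period_days(signal_period_hours, sample_interval_days=9.9156):
    """Return the period (in days) at which a signal of the given period
    appears when sampled once every sample_interval_days."""
    f = 24.0 / signal_period_hours            # signal frequency, cycles per day
    fs = 1.0 / sample_interval_days           # sampling frequency, cycles per day
    f_alias = abs(f - np.round(f / fs) * fs)  # fold into [0, fs/2]
    return np.inf if f_alias == 0 else 1.0 / f_alias

# Approximate periods of a few tidal constituents (hours).
for name, period in [("M2", 12.4206), ("S2", 12.0000), ("K1", 23.9345)]:
    print(f"{name}: alias period ~ {alias_period_days(period):.1f} days")
```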

  9. Performance analysis of signaling protocols on OBS switches

    Science.gov (United States)

    Kirci, Pinar; Zaim, A. Halim

    2005-10-01

    In this paper, Just-In-Time (JIT), Just-Enough-Time (JET) and Horizon signalling schemes for Optical Burst Switched (OBS) networks are presented. These signalling schemes run over a core dWDM network, and a network architecture based on optical burst switches is proposed to support IP, ATM and burst traffic. In IP and ATM traffic, several packets are assembled into a single packet called a burst, and burst contention is handled by burst dropping. The burst length distribution is arbitrary between 0 and 1 in IP traffic, fixed at 0.5 in ATM traffic, and arbitrary between 1 and 5 for burst traffic. The Setup and Setup ack length distributions are arbitrary. We apply the Poisson model with rate λ and the self-similar model with Pareto-distributed rate α to characterize inter-arrival times in these protocols. We consider a communication between a source client node and a destination client node over an ingress switch and one or more intermediate switches. We use buffering only in the ingress node. The communication is based on single-burst connections, in which the connection is set up just before sending a burst and then closed as soon as the burst is sent. Our analysis accounts for several important parameters, including the burst setup, burst setup ack, keepalive messages and the optical switching protocol. We compare the performance of the three signalling schemes in terms of burst dropping probability under a range of network scenarios.

  10. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed the robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such an ensemble-based approach could also be effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have a significantly wider application scope, ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  11. Analysis of acoustic sound signal for ONB measurement

    International Nuclear Information System (INIS)

    Park, S. J.; Kim, H. I.; Han, K. Y.; Chai, H. T.; Park, C.

    2003-01-01

    The onset of nucleate boiling (ONB) was measured in a test fuel bundle composed of several fuel element simulators (FES) by analysing the acoustic sound signals. In order to measure ONB, a hydrophone, a pre-amplifier, and a data acquisition system to acquire and process the acoustic signal were prepared. The acoustic signal generated in the coolant is converted to a current signal by the hydrophone. When the signal is analyzed in the frequency domain, each sound signal can be identified according to its sound source. As the power is increased beyond a certain level, nucleate boiling starts. The frequent formation and collapse of the void bubbles produce a sound signal, and by measuring this signal one can pinpoint the ONB. Since the signal characteristics are identical for different mass flow rates, this method is applicable for ascertaining ONB.
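
    A minimal sketch of the frequency-domain step described above, assuming the hydrophone output has been digitized into a NumPy array: the power spectral density at successive power levels could be compared, and a rise in band-limited acoustic energy taken as a candidate ONB indicator. The sampling rate, frequency band, and synthetic test signal are illustrative assumptions:

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs=100_000, band=(5_000, 20_000)):
    """Mean power spectral density of `signal` within `band` (Hz), via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=4096)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Toy example: background noise versus noise plus bubble-like gated transients.
rng = np.random.default_rng(1)
n, fs = 200_000, 100_000
quiet = rng.standard_normal(n)
boiling = quiet + 10.0 * (rng.random(n) < 0.05) * np.sin(
    2 * np.pi * 12_000 * np.arange(n) / fs)
print(band_power(quiet), band_power(boiling))   # band power rises once "bubbles" appear
```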

  12. Harmonic Analysis of a Nonstationary Series of Temperature Paleoreconstruction for the Central Part of Greenland

    Directory of Open Access Journals (Sweden)

    T.E. Danova

    2016-06-01

    Full Text Available The results of an investigation of a transformed series of reconstructed air temperature data for the central part of Greenland with an increment of 30 years are presented. A ~50,000-year series of air temperature in the central part of Greenland, reconstructed from ice core data, was stationarized using the mathematical expectation. To obtain an estimate of the mathematical expectation, smoothing by the moving-average method and by wavelet analysis was carried out. The Fourier transform was applied repeatedly to the stationarized series while changing the averaging time used in the smoothing. Three averaging times were selected for the investigation: ~400–500 years, ~2,000 years, and ~4,000 years. Stationarization of the reconstructed temperature series with the help of the wavelet transformation showed the best results for averaging times of ~400 and ~2,000 years: the trends characterize the initial temperature series well, thereby revealing the main patterns of its dynamics. Using the averaging time of ~4,000 years showed the worst result: significant events of the main temperature series were lost in the averaging. The obtained results correspond well to the cycling known to be inherent to the climatic system of the planet; the detected modes of 1,470 ± 500 years are comparable to the Dansgaard–Oeschger and Bond oscillations.

  13. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same direction. So far, it has been established for variance- and runs-based descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both the global and local versions of this method. In this study, global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features; this asymmetry is not observed in the physiological data after shuffling or in a group of symmetric synthetic time series.
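
    A minimal sketch of asymmetric DFA in the spirit of the method above, assuming an array of RR intervals: windows of the integrated profile are detrended as in ordinary DFA, but the fluctuations are pooled separately according to the sign of a linear trend fitted to the original series in each window, yielding separate scaling exponents for rising and falling trends. The window sizes and implementation details are illustrative assumptions, not the authors' exact procedure:

```python
import numpy as np

def asymmetric_dfa(x, scales=(8, 16, 32, 64, 128)):
    """Return (alpha_plus, alpha_minus): scaling exponents estimated from
    windows where the original series has a rising / falling linear trend."""
    y = np.cumsum(x - np.mean(x))                 # integrated profile
    t = np.arange(len(x))
    f_plus, f_minus = [], []
    for s in scales:
        up, down = [], []
        for start in range(0, len(x) - s + 1, s):
            seg = slice(start, start + s)
            slope = np.polyfit(t[seg], x[seg], 1)[0]             # trend of the original series
            trend = np.polyval(np.polyfit(t[seg], y[seg], 1), t[seg])
            f2 = np.mean((y[seg] - trend) ** 2)                  # detrended variance of the profile
            (up if slope > 0 else down).append(f2)
        f_plus.append(np.sqrt(np.mean(up)))
        f_minus.append(np.sqrt(np.mean(down)))
    log_s = np.log(scales)
    alpha_plus = np.polyfit(log_s, np.log(f_plus), 1)[0]
    alpha_minus = np.polyfit(log_s, np.log(f_minus), 1)[0]
    return alpha_plus, alpha_minus

rr = 800 + np.cumsum(np.random.default_rng(2).standard_normal(3000))  # toy RR series (ms)
print(asymmetric_dfa(rr))
```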

  14. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series corresponding to regions close to the jet axis from time series originating at regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated the topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
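
    A minimal sketch of one of the network constructions mentioned above, the natural visibility graph, which maps a time series onto a network whose topological properties (degree distribution, clustering, path length) can then be compared across flow regions; the O(n²) construction and synthetic signal below are for illustration only:

```python
import numpy as np
import networkx as nx

def visibility_graph(x):
    """Natural visibility graph: nodes are time points, and two points are linked
    if the straight line between them stays above every sample in between."""
    n = len(x)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                x[c] < x[b] + (x[a] - x[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                g.add_edge(a, b)
    return g

temp = np.sin(np.linspace(0, 20, 200)) + 0.3 * np.random.default_rng(3).standard_normal(200)
g = visibility_graph(temp)
print(nx.average_clustering(g), nx.average_shortest_path_length(g))
```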

  15. Statistical attribution analysis of the nonstationarity of the annual runoff series of the Weihe River.

    Science.gov (United States)

    Xiong, Lihua; Jiang, Cong; Du, Tao

    2014-01-01

    Time-varying moments models based on Pearson Type III and normal distributions respectively are built under the generalized additive model in location, scale and shape (GAMLSS) framework to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the attribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) temperature increase plays the main role in leading to the reduction of the annual runoff series in the study basin, followed by the decrease of precipitation and the increase of irrigated area.

  16. Acoustic cardiac signals analysis: a Kalman filter–based approach

    Directory of Open Access Journals (Sweden)

    Salleh SH

    2012-06-01

    Full Text Available Auscultation of the heart is accompanied by both electrical activity and sound. Heart auscultation provides clues to diagnose many cardiac abnormalities. Unfortunately, detection of relevant symptoms and diagnosis based on heart sound through a stethoscope is difficult. The reason GPs find this difficult is that the heart sounds are of short duration and separated from one another by less than 30 ms. In addition, the cost of false positives constitutes wasted time and emotional anxiety for both patient and GP. Many heart diseases cause changes in heart sound, waveform, and additional murmurs before other signs and symptoms appear. Heart-sound auscultation is the primary test conducted by GPs. These sounds are generated primarily by turbulent flow of blood in the heart. Analysis of heart sounds requires a quiet environment with minimum ambient noise. In order to address such issues, a technique for denoising and estimating the biomedical heart signal is proposed in this investigation. Normally, the performance of the filter naturally depends on prior information related to the statistical properties of the signal and the background noise. This paper proposes Kalman filtering for denoising the heart sound signal. The cycles of heart sounds are certain to follow a first-order Gauss–Markov process. These cycles are observed with additional noise.
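
    A minimal sketch of the denoising idea described above, assuming the heart-sound signal is modeled as a first-order Gauss–Markov process observed in additive white noise; the state-transition coefficient, noise variances, and synthetic test signal are illustrative assumptions, not the authors' values:

```python
import numpy as np

def kalman_denoise(z, a=0.98, q=1e-3, r=4e-2):
    """Scalar Kalman filter for x[k+1] = a*x[k] + w (var q), z[k] = x[k] + v (var r)."""
    x_hat = np.zeros_like(z)
    x, p = 0.0, 1.0
    for k, zk in enumerate(z):
        # Predict.
        x, p = a * x, a * a * p + q
        # Update with the new observation.
        k_gain = p / (p + r)
        x = x + k_gain * (zk - x)
        p = (1.0 - k_gain) * p
        x_hat[k] = x
    return x_hat

t = np.linspace(0, 1, 2000)
clean = np.sin(2 * np.pi * 40 * t) * np.exp(-((t - 0.3) ** 2) / 0.002)   # toy S1-like burst
noisy = clean + 0.2 * np.random.default_rng(4).standard_normal(t.size)
denoised = kalman_denoise(noisy)
print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
```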

  17. On statistical inference in time series analysis of the evolution of road safety.

    Science.gov (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating the accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether they are linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. A new LMS algorithm for analysis of atrial fibrillation signals.

    Science.gov (United States)

    Ciaccio, Edward J; Biviano, Angelo B; Whang, William; Garan, Hasan

    2012-03-26

    A biomedical signal can be defined by its extrinsic features (x-axis and y-axis shift and scale) and intrinsic features (shape after normalization of extrinsic features). In this study, an LMS algorithm utilizing the method of differential steepest descent is developed, and is tested by normalization of extrinsic features in complex fractionated atrial electrograms (CFAE). Equations for normalization of x-axis and y-axis shift and scale are first derived. The algorithm is implemented for real-time analysis of CFAE acquired during atrial fibrillation (AF). Data was acquired at a 977 Hz sampling rate from 10 paroxysmal and 10 persistent AF patients undergoing clinical electrophysiologic study and catheter ablation therapy. Over 24 trials, normalization characteristics using the new algorithm with four weights were compared to the Widrow-Hoff LMS algorithm with four tapped delays. The time for convergence, and the mean squared error (MSE) after convergence, were compared. The new LMS algorithm was also applied to lead aVF of the electrocardiogram in one patient with longstanding persistent AF, to enhance the F wave and to monitor extrinsic changes in signal shape. The average waveform over a 25 s interval was used as a prototypical reference signal for matching with the aVF lead. Based on the derivation equations, the y-shift and y-scale adjustments of the new LMS algorithm were shown to be equivalent to the scalar form of the Widrow-Hoff LMS algorithm. For x-shift and x-scale adjustments, rather than implementing a long tapped delay as in Widrow-Hoff LMS, the new method uses only two weights. After convergence, the MSE for matching paroxysmal CFAE averaged 0.46 ± 0.49 μV²/sample for the new LMS algorithm versus 0.72 ± 0.35 μV²/sample for Widrow-Hoff LMS. The MSE for matching persistent CFAE averaged 0.55 ± 0.95 μV²/sample for the new LMS algorithm versus 0.62 ± 0.55 μV²/sample for Widrow-Hoff LMS. There were no significant differences in estimation
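
    A minimal sketch of the y-shift and y-scale part of the adaptive normalization described above (the x-axis adjustments, which are the paper's main novelty, are omitted here); the step size, reference signal, and number of passes are illustrative assumptions:

```python
import numpy as np

def lms_y_normalize(signal, reference, mu=0.01, n_passes=50):
    """Adapt a gain and an offset so that gain*signal + offset approaches the
    reference in the least-mean-square sense (y-scale and y-shift normalization)."""
    gain, offset = 1.0, 0.0
    for _ in range(n_passes):
        for s, r in zip(signal, reference):
            err = r - (gain * s + offset)        # instantaneous error
            gain += mu * err * s                 # steepest-descent updates
            offset += mu * err
    return gain, offset

rng = np.random.default_rng(5)
ref = np.sin(np.linspace(0, 4 * np.pi, 400))               # prototype reference waveform
raw = 2.5 * ref - 0.7 + 0.05 * rng.standard_normal(400)    # scaled, shifted, noisy copy
gain, offset = lms_y_normalize(raw, ref)
print(gain, offset, np.mean((gain * raw + offset - ref) ** 2))
```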

  19. A new LMS algorithm for analysis of atrial fibrillation signals

    Directory of Open Access Journals (Sweden)

    Ciaccio Edward J

    2012-03-01

    Full Text Available Abstract Background A biomedical signal can be defined by its extrinsic features (x-axis and y-axis shift and scale) and intrinsic features (shape after normalization of extrinsic features). In this study, an LMS algorithm utilizing the method of differential steepest descent is developed, and is tested by normalization of extrinsic features in complex fractionated atrial electrograms (CFAE). Method Equations for normalization of x-axis and y-axis shift and scale are first derived. The algorithm is implemented for real-time analysis of CFAE acquired during atrial fibrillation (AF). Data was acquired at a 977 Hz sampling rate from 10 paroxysmal and 10 persistent AF patients undergoing clinical electrophysiologic study and catheter ablation therapy. Over 24 trials, normalization characteristics using the new algorithm with four weights were compared to the Widrow-Hoff LMS algorithm with four tapped delays. The time for convergence, and the mean squared error (MSE) after convergence, were compared. The new LMS algorithm was also applied to lead aVF of the electrocardiogram in one patient with longstanding persistent AF, to enhance the F wave and to monitor extrinsic changes in signal shape. The average waveform over a 25 s interval was used as a prototypical reference signal for matching with the aVF lead. Results Based on the derivation equations, the y-shift and y-scale adjustments of the new LMS algorithm were shown to be equivalent to the scalar form of the Widrow-Hoff LMS algorithm. For x-shift and x-scale adjustments, rather than implementing a long tapped delay as in Widrow-Hoff LMS, the new method uses only two weights. After convergence, the MSE for matching paroxysmal CFAE averaged 0.46 ± 0.49 μV²/sample for the new LMS algorithm versus 0.72 ± 0.35 μV²/sample for Widrow-Hoff LMS. The MSE for matching persistent CFAE averaged 0.55 ± 0.95 μV²/sample for the new LMS algorithm versus 0.62 ± 0.55 μV²/sample for Widrow-Hoff LMS.

  20. Regression and regression analysis time series prediction modeling on climate data of Quetta, Pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were used on five-year data (1998-2002) of average humidity, rainfall, and maximum and minimum temperatures. Relationships for regression analysis time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination as a measure of goodness of fit for our polynomial regression analysis time series (PRATS). The correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, and yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on the five-year rainfall and humidity data, which showed that the variances in the rainfall data were not homogeneous, while those of the humidity data were homogeneous. Our results on regression and regression analysis time series show the best fit for prediction modeling on climatic data of Quetta, Pakistan. (author)

  1. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  2. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    Science.gov (United States)

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…

  3. Using trajectory sensitivity analysis to find suitable locations of series compensators for improving rotor angle stability

    DEFF Research Database (Denmark)

    Nasri, Amin; Eriksson, Robert; Ghandhar, Mehrdad

    2014-01-01

    This paper proposes an approach based on trajectory sensitivity analysis (TSA) to find the most suitable placement of series compensators in the power system. The main objective is to maximize the benefit of these devices in order to enhance rotor angle stability. This approach is formulated...

  4. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    Science.gov (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.

  5. Operation States Analysis of the Series-Parallel Resonant Converter Working Above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2007-01-01

    Full Text Available Operation states analysis of a series-parallel converter working above resonance frequency is described in the paper. Principal equations are derived for the individual operation states, and on the basis of these, diagrams are constructed. The diagrams give a comprehensive picture of the converter behaviour for the individual circuit parameters. The waveforms may be utilised when designing the individual parts of the inverter.

  6. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.

    Science.gov (United States)

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

    2001-01-01

    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  7. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  8. Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions

    Science.gov (United States)

    Barry Tyler Wilson

    2015-01-01

    This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...
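
    A minimal sketch of harmonic regression on a dense time series, of the kind hinted at above: a mean, a linear trend, and one annual sine/cosine pair are fitted by least squares to per-pixel index values observed on irregular dates (the single harmonic and the synthetic data are illustrative assumptions):

```python
import numpy as np

def fit_annual_harmonic(days, values):
    """Least-squares fit of mean + trend + one annual harmonic.
    days: observation dates in days; values: observations (e.g., NDVI)."""
    w = 2 * np.pi / 365.25
    design = np.column_stack([np.ones_like(days), days, np.cos(w * days), np.sin(w * days)])
    coef, *_ = np.linalg.lstsq(design, values, rcond=None)
    amplitude = np.hypot(coef[2], coef[3])
    phase = np.arctan2(coef[3], coef[2])
    return coef, amplitude, phase

days = np.sort(np.random.default_rng(6).uniform(0, 3650, 300))      # irregular Landsat-like sampling
ndvi = 0.5 + 1e-5 * days + 0.25 * np.cos(2 * np.pi * days / 365.25 - 1.0) \
       + 0.03 * np.random.default_rng(7).standard_normal(days.size)
coef, amp, phase = fit_annual_harmonic(days, ndvi)
print(amp, phase)   # should recover roughly 0.25 and 1.0
```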

  9. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    Science.gov (United States)

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  10. Operation Analysis of the Series-Parallel Resonant Converter Working above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2006-01-01

    Full Text Available The present article deals with a theoretical analysis of the operation of a series-parallel converter working above resonance frequency. Principal equations are derived for the individual operation intervals. Based on these, the waveforms of the individual quantities are constructed for both loaded and no-load operation of the inverter. The waveforms may be utilised when designing the individual parts of the inverter.

  11. The Recording and Quantification of Event-Related Potentials: II. Signal Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Paniz Tavakoli

    2015-06-01

    Full Text Available Event-related potentials are an informative method for measuring the extent of information processing in the brain. The voltage deflections in an ERP waveform reflect the processing of sensory information as well as higher-level processing that involves selective attention, memory, semantic comprehension, and other types of cognitive activity. ERPs provide a non-invasive method of studying, with exceptional temporal resolution, cognitive processes in the human brain. ERPs are extracted from scalp-recorded electroencephalography by a series of signal processing steps. The present tutorial will highlight several of the analysis techniques required to obtain event-related potentials. Some methodological issues that may be encountered will also be discussed.

  12. Regional Land Subsidence Analysis in Eastern Beijing Plain by InSAR Time Series and Wavelet Transforms

    Directory of Open Access Journals (Sweden)

    Mingliang Gao

    2018-02-01

    Full Text Available Land subsidence is an environmental geological disaster in which the regional surface altitude is lowered by natural or man-made factors. Beijing, the capital city of China, has suffered from land subsidence since the 1950s, and extreme groundwater extraction has led to subsidence rates of more than 100 mm/year. In this study, we employ two SAR datasets acquired by the Envisat and TerraSAR-X satellites to investigate the surface deformation in the Beijing Plain from 2003 to 2013 based on the multi-temporal InSAR technique. Furthermore, we also use observation wells providing in situ hydraulic head levels to characterize the evolution of land subsidence and the spatial-temporal changes of the groundwater level. Then, we analyze the accumulated displacement and hydraulic head level time series using the continuous wavelet transform to separate periodic signal components. Finally, the cross wavelet transform (XWT) and wavelet transform coherence (WTC) are implemented to analyze the relationship between the accumulated displacement and hydraulic head level time series. The results show that the subsidence centers in the northern Beijing Plain are spatially consistent with the groundwater drop funnels. According to the analysis of well-based results located in different areas, the long-term groundwater exploitation in the northern subsidence area has led to a continuous decline of the water level, resulting in inelastic and permanent compaction, while for the monitoring wells located outside the subsidence area, the subsidence time series show obvious elastic deformation characteristics (seasonal characteristics) as the groundwater level changes. Moreover, according to the wavelet transformation, the land subsidence time series at the monitoring well sites lags several months behind the groundwater level change.

  13. On semi-classical questions related to signal analysis

    KAUST Repository

    Helffer, Bernard; Laleg-Kirati, Taous-Meriem

    2011-01-01

    Indeed, it provides new spectral quantities that can give relevant information on some signals, as is the case for the arterial blood pressure signal. © 2011 - IOS Press and the authors. All rights reserved.

  14. TSLP signaling pathway map: a platform for analysis of TSLP-mediated signaling.

    Science.gov (United States)

    Zhong, Jun; Sharma, Jyoti; Raju, Rajesh; Palapetta, Shyam Mohan; Prasad, T S Keshava; Huang, Tai-Chung; Yoda, Akinori; Tyner, Jeffrey W; van Bodegom, Diederik; Weinstock, David M; Ziegler, Steven F; Pandey, Akhilesh

    2014-01-01

    Thymic stromal lymphopoietin (TSLP) is a four-helix bundle cytokine that plays a critical role in the regulation of immune responses and in the differentiation of hematopoietic cells. TSLP signals through a heterodimeric receptor complex consisting of an interleukin-7 receptor α chain and a unique TSLP receptor (TSLPR) [also known as cytokine receptor-like factor 2 (CRLF2)]. Cellular targets of TSLP include dendritic cells, B cells, mast cells, regulatory T (Treg) cells and CD4+ and CD8+ T cells. The TSLP/TSLPR axis can activate multiple signaling transduction pathways including the JAK/STAT pathway and the PI-3 kinase pathway. Aberrant TSLP/TSLPR signaling has been associated with a variety of human diseases including asthma, atopic dermatitis, nasal polyposis, inflammatory bowel disease, eosinophilic esophagitis and, most recently, acute lymphoblastic leukemia. A centralized resource of the TSLP signaling pathway cataloging signaling events is not yet available. In this study, we present a literature-annotated resource of reactions in the TSLP signaling pathway. This pathway map is publicly available through NetPath (http://www.netpath.org/), an open access signal transduction pathway resource developed previously by our group. This map includes 236 molecules and 252 reactions that are involved in the TSLP/TSLPR signaling pathway. We expect that the TSLP signaling pathway map will provide a rich resource to study the biology of this important cytokine as well as to identify novel therapeutic targets for diseases associated with dysregulated TSLP/TSLPR signaling. Database URL: http://www.netpath.org/pathways?path_id=NetPath_24.

  15. Analysis of the influence of memory content of auditory stimuli on the memory content of EEG signal.

    Science.gov (United States)

    Namazi, Hamidreza; Khosrowabadi, Reza; Hussaini, Jamal; Habibi, Shaghayegh; Farid, Ali Akhavan; Kulish, Vladimir V

    2016-08-30

    One of the major challenges in brain research is to relate the structural features of an auditory stimulus to the structural features of the Electroencephalogram (EEG) signal. Memory content is an important feature of the EEG signal and, accordingly, of the brain. On the other hand, memory content can also be considered for the stimulus. Despite all the work done on the analysis of the effect of stimuli on the human EEG and brain memory, no work has discussed the memory content of the stimulus or the relationship that may exist between the memory content of the stimulus and the memory content of the EEG signal. For this purpose we consider the Hurst exponent as the measure of memory. This study reveals the plasticity of human EEG signals in relation to auditory stimuli. For the first time we demonstrate that the memory content of an EEG signal shifts towards the memory content of the auditory stimulus used. The results of this analysis showed that an auditory stimulus with higher memory content causes a larger increment in the memory content of the EEG signal. For the verification of this result, we use approximate entropy as an indicator of time-series randomness. The capability observed in this research can be further investigated in relation to human memory.
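
    A minimal sketch of one common way to estimate the Hurst exponent used as the memory measure above, the rescaled-range (R/S) method; the window sizes and the synthetic test signals are illustrative assumptions:

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent from the slope of log(R/S) versus log(window size)."""
    rs_means = []
    for w in window_sizes:
        rs = []
        for start in range(0, len(x) - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()            # range of cumulative deviations
            s = seg.std()
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(8)
white = rng.standard_normal(4096)                                # uncorrelated -> H near 0.5
persistent = np.convolve(white, np.ones(20) / 20, mode="same")   # smoothed -> higher H
print(hurst_rs(white), hurst_rs(persistent))
```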

  16. Scattering Analysis of a Compact Dipole Array with Series and Parallel Feed Network including Mutual Coupling Effect

    Directory of Open Access Journals (Sweden)

    H. L. Sneha

    2013-01-01

    Full Text Available The current focus in the defense arena is on stealth technology, with an emphasis on controlling the radar cross-section (RCS). The scattering from the antennas mounted over the platform is of prime importance, especially for a low-observable aerospace vehicle. This paper presents the analysis of the scattering cross section of a uniformly spaced linear dipole array. Two types of feed networks, that is, series and parallel feed networks, are considered. The total RCS of a phased array with either kind of feed network is obtained by following the signal as it enters through the aperture and travels through the feed network. The RCS estimation of the array includes the mutual coupling effect between the dipole elements in three configurations, that is, side-by-side, collinear, and parallel-in-echelon. The results presented can be useful while designing a phased array with optimum performance towards low observability.

  17. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].

  18. Independent component analysis: A new possibility for analysing series of electron energy loss spectra

    International Nuclear Information System (INIS)

    Bonnet, Noël; Nuzillard, Danielle

    2005-01-01

    A complementary approach is proposed for analysing series of electron energy-loss spectra that can be recorded with the spectrum-line technique, across an interface for instance. This approach, called blind source separation (BSS) or independent component analysis (ICA), complements two existing methods: the spatial difference approach and multivariate statistical analysis. The principle of the technique is presented and illustrations are given through one simulated example and one real example
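
    A minimal sketch of the blind source separation step described above, assuming the spectrum-line data are arranged as a matrix with one spectrum per row; scikit-learn's FastICA is used as the ICA implementation, and the synthetic mixing across an "interface" is an illustrative assumption:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(9)
energy = np.linspace(0, 100, 500)

# Two "elemental" source spectra (edge-like and peak-like shapes) mixed with
# position-dependent weights, mimicking a spectrum-line recorded across an interface.
source_a = 1.0 / (1.0 + np.exp(-(energy - 40)))          # edge-like spectrum
source_b = np.exp(-0.5 * ((energy - 70) / 5.0) ** 2)     # peak-like spectrum
weights = np.linspace(0, 1, 60)
spectra = np.outer(1 - weights, source_a) + np.outer(weights, source_b)
spectra += 0.01 * rng.standard_normal(spectra.shape)

ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(spectra)          # per-spectrum abundances of each component
components = ica.components_                 # estimated source spectra (up to sign/scale)
print(scores.shape, components.shape)
```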

  19. Wavelet analysis as a tool to characterise and remove environmental noise from self-potential time series

    Directory of Open Access Journals (Sweden)

    M. Ragosta

    2004-06-01

    Full Text Available Multiresolution wavelet analysis of self-potential signals and rainfall levels is performed for extracting fluctuations in electrical signals, which might be attributed to meteorological variability. In the time-scale domain of the wavelet transform, rain data are used as markers to single out those wavelet coefficients of the electric signal which can be considered relevant to the environmental disturbance. Then these coefficients are filtered out and the signal is recovered by anti-transforming the retained coefficients. Such a methodological approach might be applied to characterise unwanted environmental noise. It can also be considered as a practical technique to remove noise that can hamper the correct assessment and use of electrical techniques for the monitoring of geophysical phenomena.

  20. Monitoring vegetation recovery in fire-affected areas using temporal profiles of spectral signal from time series MODIS and LANDSAT satellite images

    Science.gov (United States)

    Georgopoulou, Danai; Koutsias, Nikos

    2015-04-01

    Vegetation phenology is an important element of vegetation characteristics that can be useful in vegetation monitoring, especially when satellite remote sensing observations are used. In that sense, temporal profiles extracted from the spectral signal of time-series MODIS and LANDSAT satellite images can be used to characterize vegetation phenology and thus be helpful for monitoring vegetation recovery in fire-affected areas. The aim of this study is to explore the vegetation recovery pattern of the catastrophic wildfires that occurred in Peloponnisos, southern Greece, in 2007. These fires caused the loss of 67 lives and were recognized as the most extreme natural disaster in the country's recent history. Satellite remote sensing data from the MODIS and LANDSAT satellites in the period from 2000 to 2014 were acquired and processed to extract the temporal profiles of the spectral signal for selected areas within the fire-affected areas. This dataset and the time period analyzed, together with the time that these fires occurred, gave the opportunity to create temporal profiles seven years before and seven years after the fire. The different scales of the data used gave us the chance to understand how vegetation phenology, and therefore the recovery patterns, are influenced by the spatial resolution of the satellite data used. Different metrics linked to key phenological events have been created and used to assess vegetation recovery in the fire-affected areas. Our analysis was focused on the main land cover types that were mostly affected by the 2007 wildland fires. Based on CORINE land-cover maps, these were agricultural lands highly interspersed with large areas of natural vegetation, followed by sclerophyllous vegetation, transitional woodland shrubs, complex cultivation patterns and olive groves. Apart from the original spectral data, we estimated and used vegetation indices commonly found in vegetation studies as well as in burned area mapping studies. In this study we

  1. A review of signals used in sleep analysis

    International Nuclear Information System (INIS)

    Roebuck, A; Monasterio, V; Gederi, E; Osipov, M; Behar, J; Clifford, G D; Malhotra, A; Penzel, T

    2014-01-01

    This article presents a review of signals used for measuring physiology and activity during sleep and techniques for extracting information from these signals. We examine both clinical needs and biomedical signal processing approaches across a range of sensor types. Issues with recording and analysing the signals are discussed, together with their applicability to various clinical disorders. Both univariate and data fusion (exploiting the diverse characteristics of the primary recorded signals) approaches are discussed, together with a comparison of automated methods for analysing sleep. (topical review)

  2. Symbolic transfer entropy-based premature signal analysis

    International Nuclear Information System (INIS)

    Wang Jun; Yu Zheng-Feng

    2012-01-01

    In this paper, we use symbolic transfer entropy to study the coupling strength between premature signals. Numerical experiments show that three types of signal couplings are in the same direction. Among them, normal signal coupling is the strongest, followed by that of premature ventricular contractions, and that of atrial premature beats is the weakest. The T test shows that the entropies of the three signals are distinct. Symbolic transfer entropy requires less data, can distinguish the three types of signals and has very good computational efficiency. (interdisciplinary physics and related areas of science and technology)
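
    A minimal sketch of symbolic transfer entropy in the spirit of the approach above: both series are first symbolized with ordinal patterns, and the transfer entropy from Y to X is the reduction in uncertainty about the next symbol of X gained by also knowing the current symbol of Y. The embedding dimension and the synthetic driver/response signals are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Map each length-m window to its ordinal pattern (rank order) as a tuple."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def symbolic_transfer_entropy(x, y, m=3):
    """Transfer entropy (bits) from y to x, estimated on ordinal-pattern symbols."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    triples = Counter(zip(sx[1:], sx[:-1], sy[:-1]))     # (x_next, x_now, y_now)
    pairs_xx = Counter(zip(sx[1:], sx[:-1]))
    pairs_xy = Counter(zip(sx[:-1], sy[:-1]))
    singles_x = Counter(sx[:-1])
    n = len(sx) - 1
    te = 0.0
    for (x_next, x_now, y_now), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x_now, y_now)]
        p_cond_x = pairs_xx[(x_next, x_now)] / singles_x[x_now]
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

rng = np.random.default_rng(10)
driver = rng.standard_normal(5000)
response = np.roll(driver, 1) + 0.5 * rng.standard_normal(5000)   # driver leads response
print(symbolic_transfer_entropy(response, driver),   # driver -> response: larger
      symbolic_transfer_entropy(driver, response))   # response -> driver: smaller
```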

  3. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data is a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA can provide better imputation in comparison to other methods.

  4. Analysis of Land Subsidence Monitoring in Mining Area with Time-Series InSAR Technology

    Science.gov (United States)

    Sun, N.; Wang, Y. J.

    2018-04-01

    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages such as high accuracy, wide coverage, low expenditure, dense monitoring points and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain the subsidence monitoring results of the study area in two time periods by time-series InSAR technology. By analyzing the deformation range, rate and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can be used to monitor land subsidence over a large area and meet the demands of subsidence monitoring in mining areas.

  5. An electromagnetic signals monitoring and analysis wireless platform employing personal digital assistants and pattern analysis techniques

    Science.gov (United States)

    Ninos, K.; Georgiadis, P.; Cavouras, D.; Nomicos, C.

    2010-05-01

    This study presents the design and development of a mobile wireless platform to be used for monitoring and analysis of seismic events and related electromagnetic (EM) signals, employing Personal Digital Assistants (PDAs). A prototype custom-developed application was deployed on a 3G enabled PDA that could connect to the FTP server of the Institute of Geodynamics of the National Observatory of Athens and receive and display EM signals at 4 receiver frequencies (3 KHz (E-W, N-S), 10 KHz (E-W, N-S), 41 MHz and 46 MHz). Signals may originate from any one of the 16 field-stations located around the Greek territory. Employing continuous recordings of EM signals gathered from January 2003 till December 2007, a Support Vector Machines (SVM)-based classification system was designed to distinguish EM precursor signals within noisy background. EM-signals corresponding to recordings preceding major seismic events (Ms≥5R) were segmented by an experienced scientist, and five features (mean, variance, skewness, kurtosis, and a wavelet-based feature) derived from the EM-signals were calculated. These features were used to train the SVM-based classification scheme. The performance of the system was evaluated by the exhaustive search and leave-one-out methods, giving 87.2% overall classification accuracy in correctly identifying EM precursor signals within a noisy background when employing all calculated features. Due to the insufficient processing power of the PDAs, this task was performed on a typical desktop computer. The optimally trained SVM classifier was then integrated into the PDA-based application, rendering the platform capable of discriminating between EM precursor signals and noise. The system's efficiency was evaluated by an expert who reviewed 1/ multiple EM-signals up to 18 days prior to corresponding past seismic events, and 2/ the possible EM-activity of a specific region, employing the trained SVM classifier. Additionally, the proposed architecture can form a
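
    A minimal sketch of the classification stage described above: statistical and wavelet-based features are computed for fixed-length signal segments and fed to an SVM. The feature set mirrors the five features named in the abstract, but the wavelet detail level, kernel, cross-validation scheme, and synthetic data are illustrative assumptions:

```python
import numpy as np
import pywt
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

def features(segment):
    """Mean, variance, skewness, kurtosis and a wavelet-based feature (detail energy)."""
    detail = pywt.wavedec(segment, "db4", level=3)[-1]
    return [segment.mean(), segment.var(), skew(segment), kurtosis(segment),
            np.sum(detail ** 2)]

rng = np.random.default_rng(11)
noise = [rng.standard_normal(512) for _ in range(60)]                           # "background"
precursor = [rng.standard_normal(512) + np.sin(0.2 * np.arange(512)) for _ in range(60)]

X = np.array([features(s) for s in noise + precursor])
y = np.array([0] * 60 + [1] * 60)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())   # leave-one-out could be used as in the study
```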

  6. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.

  7. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    Science.gov (United States)

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the frequency standards deviations. For the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, three station coordinates time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
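
    A minimal sketch of the classical (unweighted, one-dimensional, non-overlapping) Allan variance for an evenly sampled series, as a starting point for the weighted and multidimensional variants discussed above; the cluster sizes and test signals are illustrative assumptions:

```python
import numpy as np

def allan_variance(y, m_values=(1, 2, 4, 8, 16, 32)):
    """Classical Allan variance: for each cluster size m, average the series over
    non-overlapping bins and take 0.5 * mean of squared successive differences."""
    out = {}
    for m in m_values:
        n_bins = len(y) // m
        means = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        out[m] = 0.5 * np.mean(np.diff(means) ** 2)
    return out

rng = np.random.default_rng(12)
white = rng.standard_normal(4096)                            # AVAR falls roughly as 1/m
random_walk = np.cumsum(0.05 * rng.standard_normal(4096))    # AVAR grows with m
print(allan_variance(white))
print(allan_variance(random_walk))
```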

  8. Social network analysis of character interaction in the Stargate and Star Trek television series

    Science.gov (United States)

    Tan, Melody Shi Ai; Ujum, Ephrance Abu; Ratnavelu, Kuru

    This paper undertakes a social network analysis of two science fiction television series, Stargate and Star Trek. Television series convey stories in the form of character interaction, which can be represented as “character networks”. We connect each pair of characters that exchanged spoken dialogue in any given scene demarcated in the television series transcripts. These networks are then used to characterize the overall structure and topology of each series. We find that the character networks of both series have similar structure and topology to that found in previous work on mythological and fictional networks. The character networks exhibit small-world effects, but we found no significant support for a power-law degree distribution. Since the progression of an episode depends to a large extent on the interaction between each of its characters, the underlying network structure tells us something about the complexity of that episode’s storyline. We assessed this complexity using techniques from spectral graph theory. We found that the episode networks are structured either as (1) closed networks, (2) networks containing bottlenecks that connect otherwise disconnected clusters, or (3) a mixture of both.
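
    The network construction described above (linking every pair of characters that share dialogue in a scene) and the small-world comparison against a random graph can be sketched in a few lines. The scene data below are hypothetical, and the use of networkx is an assumption, not the authors' tool chain.

```python
import networkx as nx

# toy example: characters that exchanged dialogue in each scene (hypothetical data)
scenes = [
    {"ONEILL", "CARTER", "JACKSON"},
    {"CARTER", "TEALC"},
    {"ONEILL", "TEALC", "HAMMOND"},
    {"JACKSON", "HAMMOND"},
    {"ONEILL", "CARTER"},
]

G = nx.Graph()
for chars in scenes:
    chars = sorted(chars)
    for i, a in enumerate(chars):
        for b in chars[i + 1:]:
            G.add_edge(a, b)        # link every pair sharing dialogue in a scene

C = nx.average_clustering(G)
L = nx.average_shortest_path_length(G)
print(f"character network: C={C:.2f}, L={L:.2f}")

# compare against an Erdos-Renyi random graph of the same size and density
R = nx.gnm_random_graph(G.number_of_nodes(), G.number_of_edges(), seed=1)
if nx.is_connected(R):
    print(f"random reference:  C={nx.average_clustering(R):.2f}, "
          f"L={nx.average_shortest_path_length(R):.2f}")
```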

  9. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750-2004)

    Science.gov (United States)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method appears to be a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the analyzed series, obtained in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the least important ones, which represent noise components. Taking into account the need for deeper knowledge of the Sun's past history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
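
    A rough sketch of the iterative least squares extraction of sine components is shown below, assuming Python/SciPy rather than the Scilab implementation used by the authors. The synthetic sunspot-like series, the FFT-based initial period guess, and the stopping rule are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def sine(t, amp, period, phase):
    return amp * np.sin(2 * np.pi * t / period + phase)

def iterative_sine_regression(t, y, n_components=2):
    """Extract sine components one at a time by least squares on the residual,
    so that components come out in decreasing order of importance."""
    residual = y - y.mean()
    components = []
    for _ in range(n_components):
        freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
        spec = np.abs(np.fft.rfft(residual))
        k = 1 + np.argmax(spec[1:])                 # dominant non-zero frequency
        p0 = [residual.std(), 1.0 / freqs[k], 0.0]  # initial amplitude, period, phase
        params, _ = curve_fit(sine, t, residual, p0=p0, maxfev=20000)
        components.append(params)
        residual = residual - sine(t, *params)
    return components, residual

rng = np.random.default_rng(2)
t = np.arange(0, 255, 1.0)                           # one value per year
y = (60 + 50 * np.sin(2 * np.pi * t / 11.0)          # Schwabe-like cycle
     + 20 * np.sin(2 * np.pi * t / 85.0 + 1.0)       # longer secular cycle
     + rng.normal(0, 5, t.size))
components, _ = iterative_sine_regression(t, y)
print([f"amp={a:.1f}, period={p:.1f} yr" for a, p, _ in components])
```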

  10. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    Science.gov (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
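
    For readers unfamiliar with segmented regression, the most common interrupted time series method cited above, the sketch below fits a level change and a slope change at a known intervention point. It is a generic illustration on simulated monthly rates, not drawn from any of the reviewed studies; in practice autocorrelation would also be addressed, for example with HAC standard errors or ARIMA errors.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# simulated monthly prescribing rate with a policy change at month 36
rng = np.random.default_rng(3)
n, change = 72, 36
t = np.arange(n)
rate = (100 + 0.5 * t                          # pre-existing secular trend
        - 15 * (t >= change)                   # immediate drop at the interruption
        - 0.8 * np.clip(t - change, 0, None)   # additional downward slope afterwards
        + rng.normal(0, 3, n))

X = pd.DataFrame({
    "time": t,                                  # secular trend
    "level": (t >= change).astype(int),         # step change in level
    "trend": np.clip(t - change, 0, None),      # change in slope after the interruption
})
# HAC (Newey-West) errors are one simple way to acknowledge autocorrelation
model = sm.OLS(rate, sm.add_constant(X)).fit(cov_type="HAC", cov_kwds={"maxlags": 3})
print(model.params)
```

    Here `level` estimates the immediate change and `trend` the change in slope attributable to the intervention.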

  11. BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data

    Directory of Open Access Journals (Sweden)

    Madeira Sara C

    2009-07-01

    Full Text Available Abstract Background The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard. In the case of time series this problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, both from a computational and a biological perspective. Findings BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion BiGGEsTS is a free open source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available at: http://kdbio.inesc-id.pt/software/biggests. We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress.

  12. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku Univeristy, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed some numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • A method to optimize the parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  13. Bearing defect signature analysis using advanced nonlinear signal analysis in a controlled environment

    Science.gov (United States)

    Zoladz, T.; Earhart, E.; Fiorucci, T.

    1995-01-01

    Utilizing high-frequency data from a highly instrumented rotor assembly, seeded bearing defect signatures are characterized using both conventional linear approaches, such as power spectral density analysis, and recently developed nonlinear techniques such as bicoherence analysis. Traditional low-frequency (less than 20 kHz) analysis and high-frequency envelope analysis of both accelerometer and acoustic emission data are used to recover characteristic bearing distress information buried deeply in acquired data. The successful coupling of newly developed nonlinear signal analysis with recovered wideband envelope data from accelerometers and acoustic emission sensors is the innovative focus of this research.
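
    High-frequency envelope analysis of the kind mentioned above can be sketched as band-pass filtering around a structural resonance followed by Hilbert demodulation and a spectrum of the envelope. The sample rate, resonance, and 120 Hz defect rate below are invented for illustration and do not reproduce the instrumented rotor data.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, welch

fs = 100_000                                    # assumed sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# toy signal: a 12 kHz resonance excited by impacts at a 120 Hz defect rate
impacts = (np.sin(2 * np.pi * 120 * t) > 0.999).astype(float)
ringdown = np.exp(-t[:300] * 2000) * np.sin(2 * np.pi * 12_000 * t[:300])
signal = np.convolve(impacts, ringdown, "same")
signal += 0.05 * np.random.default_rng(4).normal(size=t.size)

# band-pass around the structural resonance, then demodulate with the Hilbert envelope
b, a = butter(4, [8_000, 16_000], btype="bandpass", fs=fs)
band = filtfilt(b, a, signal)
envelope = np.abs(hilbert(band))

f, pxx = welch(envelope - envelope.mean(), fs=fs, nperseg=8192)
# expect a line at the defect rate (~120 Hz) or one of its low harmonics
print("strongest envelope line near", f[np.argmax(pxx)], "Hz")
```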

  14. Analysis of optical bleaching of OSL signal in sediment quartz

    International Nuclear Information System (INIS)

    Przegiętka, K.R.; Chruścińska, A.

    2013-01-01

    The aim of this work was to study the effect of the quality of optical bleaching on the results of the OSL (Optically Stimulated Luminescence) dating method. Large aliquots of coarse quartz grains extracted from a fluvial deposit were used in the study. Poor, medium and good bleaching were simulated in the laboratory with the help of a blue LED light source in a series of experiments. The samples were then irradiated with a common laboratory dose. The equivalent doses (DE) were measured with the standard Single Aliquot Regeneration (SAR) technique, but the obtained DE distributions are analyzed in a new way. A method for recognizing and compensating for partial bleaching is proposed. The conclusions for dating sediment quartz samples are presented and discussed. -- Highlights: ► Bleaching experiments on sediment quartz are performed. ► A blue LED light source incorporated in the luminescence reader is used. ► A new analysis of data measured by the standard SAR OSL technique is proposed. ► The results are promising for recognizing and compensating for partial bleaching

  15. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow us to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows the dimensionality of the data space to be reduced while maintaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle
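
    The PCA-versus-ICA contrast discussed above can be illustrated on synthetic mixed sources. The sketch uses scikit-learn's FastICA as a convenient stand-in; it is not the variational Bayesian ICA (vbICA) used by the authors, and the two-source mixing model is purely illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(5)
t = np.arange(1000) / 365.25

# two hypothetical ground-displacement sources: a seasonal signal and a transient
seasonal = np.sin(2 * np.pi * t)
transient = np.tanh((t - 1.5) * 8)
sources = np.column_stack([seasonal, transient])
mixing = rng.normal(size=(20, 2))               # 20 GPS components, 2 sources
data = sources @ mixing.T + 0.05 * rng.normal(size=(1000, 20))

pca_sources = PCA(n_components=2).fit_transform(data)
ica_sources = FastICA(n_components=2, random_state=0).fit_transform(data)

# correlation of each recovered component with each true source
for name, S in [("PCA", pca_sources), ("ICA", ica_sources)]:
    corr = np.corrcoef(S.T, sources.T)[:2, 2:]
    print(name, np.round(np.abs(corr), 2))
```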

  16. Real-time determination of the signal-to-noise ratio of partly coherent seismic time series

    DEFF Research Database (Denmark)

    Kjeldsen, Peter Møller

    1994-01-01

    As it is of great practical interest to be able to monitor the S/N while the traces are recorded, an approach for fast real-time determination of the S/N of seismic time series is proposed. The described method is based on an iterative procedure utilizing the trace-to-trace coherence but, unlike procedures known so far, it uses calculated initial guesses and stopping criteria. This significantly reduces the computational burden of the procedure so that real-time capabilities are obtained...

  17. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

        One of the important issues that has a significant role in the study of basin hydrology is the determination of lag time. Lag time plays a significant role in hydrological studies. The rainfall-related lag time depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Lag time is an important parameter in many projects, such as dam design, and also in water resource studies. The lag time of a basin can be calculated using various methods. One of these methods is time series analysis of spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions. In this method, harmonically significant quantities with individual frequencies are presented. Spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time could be due to snowmelt, as well as melting ice due to rainfall on freezing days. In this research, the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment is subject to both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 months, respectively.

  18. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  19. Fourier and wavelet analysis of skin laser doppler flowmetry signals

    OpenAIRE

    Qi, Wei

    2011-01-01

    Objective: This thesis examines the measurement of skin microvascular blood flows from Laser Doppler Flowmetry (LDF) signals. Both healthy subjects and those with features of the metabolic syndrome are studied using signal processing techniques such as the Fourier and Wavelet transforms. An aim of this study is to investigate whether changes in blood flow at rest can be detected from the spectral content of the processed signals in the different subject groups. Additionally the effect of insulin ...

  20. An Analysis of the Influence of Signals Intelligence Through Wargaming

    National Research Council Canada - National Science Library

    McCaffrey, Charles

    2000-01-01

    Signals intelligence (SIGINT), information derived from the monitoring, interception, decryption and evaluation of an adversary's electronic communications, has long been viewed as a significant factor in modern warfare...

  1. Brain Network Analysis from High-Resolution EEG Signals

    Science.gov (United States)

    de Vico Fallani, Fabrizio; Babiloni, Fabio

    lattice and a random structure. Such a model has been designated as "small-world" network in analogy with the concept of the small-world phenomenon observed more than 30 years ago in social systems. In a similar way, many types of functional brain networks have been analyzed according to this mathematical approach. In particular, several studies based on different imaging techniques (fMRI, MEG and EEG) have found that the estimated functional networks showed small-world characteristics. In the functional brain connectivity context, these properties have been demonstrated to reflect an optimal architecture for the information processing and propagation among the involved cerebral structures. However, the performance of cognitive and motor tasks as well as the presence of neural diseases has been demonstrated to affect such a small-world topology, as revealed by the significant changes of L and C. Moreover, some functional brain networks have been mostly found to be very unlike the random graphs in their degree-distribution, which gives information about the allocation of the functional links within the connectivity pattern. It was demonstrated that the degree distributions of these networks follow a power-law trend. For this reason those networks are called "scale-free". They still exhibit the small-world phenomenon but tend to contain few nodes that act as highly connected "hubs". Scale-free networks are known to show resistance to failure, facility of synchronization and fast signal processing. Hence, it would be important to see whether the scaling properties of the functional brain networks are altered under various pathologies or experimental tasks. The present Chapter proposes a theoretical graph approach in order to evaluate the functional connectivity patterns obtained from high-resolution EEG signals. In this way, the "Brain Network Analysis" (in analogy with the Social Network Analysis that has emerged as a key technique in modern sociology) represents an

  2. Reconstruction and signal propagation analysis of the Syk signaling network in breast cancer cells.

    Directory of Open Access Journals (Sweden)

    Aurélien Naldi

    2017-03-01

    Full Text Available The ability to build in-depth cell signaling networks from vast experimental data is a key objective of computational biology. The spleen tyrosine kinase (Syk) protein, a well-characterized key player in immune cell signaling, was surprisingly first shown by our group to exhibit an onco-suppressive function in mammary epithelial cells, a finding corroborated by many other studies, but the molecular mechanisms of this function remain largely unsolved. Based on existing proteomic data, we report here the generation of an interaction-based network of signaling pathways controlled by Syk in breast cancer cells. Pathway enrichment of the Syk targets previously identified by quantitative phospho-proteomics indicated that Syk is engaged in cell adhesion, motility, growth and death. Using the components and interactions of these pathways, we bootstrapped the reconstruction of a comprehensive network covering Syk signaling in breast cancer cells. To generate in silico hypotheses on Syk signaling propagation, we developed a method allowing us to rank paths between Syk and its targets. We first annotated the network according to experimental datasets. We then combined shortest path computation with random walk processes to estimate the importance of individual interactions and selected biologically relevant pathways in the network. Molecular and cell biology experiments allowed us to distinguish candidate mechanisms that underlie the impact of Syk on the regulation of cortactin and ezrin, both involved in actin-mediated cell adhesion and motility. The Syk network was further completed with the results of our biological validation experiments. The resulting Syk signaling sub-networks can be explored via an online visualization platform.

  3. Spectrogram Image Analysis of Error Signals for Minimizing Impulse Noise

    Directory of Open Access Journals (Sweden)

    Jeakwan Kim

    2016-01-01

    Full Text Available This paper presents a theoretical and experimental study on the spectrogram image analysis of error signals for minimizing impulse input noises in the active suppression of noise. Impulse inputs with specific wave patterns are applied as primary noises to a one-dimensional duct with a length of 1800 mm. The convergence speed of the adaptive feedforward algorithm, based on the least mean square approach, is controlled by a normalized step size incorporated into the algorithm. Variations of the step size govern the stability as well as the convergence speed; for this reason, a normalized step size is introduced as a new method for the control of impulse noise. Spectrogram images, which indicate the degree of attenuation of the impulse input noises, are used to assess the attenuation achieved with the new method. The algorithm is extensively investigated in both simulation and real-time control experiments. It is demonstrated that the suggested algorithm works stably and performs well against impulse noises. The results of this study can be used for practical active noise control systems.
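
    A minimal normalized-LMS (NLMS) sketch is given below to show how dividing the step size by the instantaneous input power keeps the adaptation stable when an impulse arrives. The duct path, filter length, and step size are assumptions for illustration, not the experimental configuration of the 1800 mm duct.

```python
import numpy as np

def nlms(reference, desired, n_taps=8, mu=0.5, eps=1e-6):
    """Normalized LMS adaptive filter: the update step is divided by the
    instantaneous input power, limiting divergence on impulse-like inputs."""
    w = np.zeros(n_taps)
    error = np.zeros(len(desired))
    for n in range(n_taps - 1, len(desired)):
        x = reference[n - n_taps + 1 : n + 1][::-1]   # newest sample first
        y = w @ x
        error[n] = desired[n] - y
        w += (mu / (eps + x @ x)) * error[n] * x      # normalized update
    return error

rng = np.random.default_rng(6)
noise = rng.normal(size=5000)
noise[2500] += 40.0                                    # impulse in the primary noise
path = np.array([0.6, -0.3, 0.1])                      # assumed (unknown) duct response
primary = np.convolve(noise, path)[: len(noise)]
residual = nlms(noise, primary)
print("residual power over the last 1000 samples:", np.mean(residual[-1000:] ** 2))
```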

  4. Signal Integrity Analysis in Single and Bundled Carbon Nanotube Interconnects

    International Nuclear Information System (INIS)

    Majumder, M.K.; Pandya, N.D.; Kaushik, B.K.; Manhas, S.K.

    2013-01-01

    Carbon nanotube (CNT) can be considered as an emerging interconnect material in the current nanoscale regime. CNTs are more promising than other interconnect materials such as Al or Cu because of their robustness to electromigration. This research paper aims to address the crosstalk-related issues (signal integrity) in interconnect lines. Different analytical models of single- (SWCNT), double- (DWCNT), and multiwalled CNTs (MWCNT) are studied to analyze the crosstalk delay at global interconnect lengths. A capacitively coupled three-line bus architecture employing CMOS drivers is used for accurate estimation of crosstalk delay. Each line in the bus architecture is represented with the equivalent RLC models of single and bundled SWCNT, DWCNT, and MWCNT interconnects. Crosstalk delay is observed at the middle line (victim) when it switches in the opposite direction with respect to the other two lines (aggressors). Using the data predicted by ITRS 2012, a comparative analysis on the basis of crosstalk delay is performed for bundled SWCNT/DWCNT and single MWCNT interconnects. It is observed that the overall crosstalk delay is improved by 40.92% and 21.37% for single MWCNT in comparison to bundled SWCNT and bundled DWCNT interconnects, respectively.

  5. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    Science.gov (United States)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  6. Analysis of small-signal intensity modulation of semiconductor ...

    Indian Academy of Sciences (India)

    This paper demonstrates theoretical characterization of intensity modulation of semiconductor lasers (SL's). The study is based on a small-signal model to solve the laser rate equations taking into account suppression of optical gain. Analytical forms of the small-signal modulation response and modulation bandwidth are ...

  7. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    Science.gov (United States)

    zhangli, Sun; xiufang, Zhu; yaozhong, Pan

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
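
    The PDS workflow outlined above (percentile threshold, declustering, and a Pearson type III fit) can be sketched as follows on synthetic daily flows. The threshold choice, the 7-day independence rule, and the interpretation of the fitted quantile are illustrative assumptions; converting a per-event quantile to an annual return period additionally requires the mean number of peaks per year.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# synthetic 56-year daily flow record (arbitrary units, stand-in for Wanzhou data)
daily_flow = stats.gamma.rvs(a=2.0, scale=1500.0, size=56 * 365, random_state=rng)

threshold = np.percentile(daily_flow, 99)         # PDS threshold by percentile rank
idx = np.flatnonzero(daily_flow > threshold)

# simple declustering: keep one peak per cluster of exceedances separated by > 7 days
clusters = np.split(idx, np.flatnonzero(np.diff(idx) > 7) + 1)
peaks = [daily_flow[c].max() for c in clusters]

params = stats.pearson3.fit(peaks)                # Pearson type III distribution
q = stats.pearson3.ppf(0.99, *params)             # flow exceeded by 1% of peak events
print(f"threshold={threshold:.0f}, n_peaks={len(peaks)}, q(0.01 per event)={q:.0f}")
```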

  8. Time-series analysis of climatologic measurements: a method to distinguish future climatic changes

    International Nuclear Information System (INIS)

    Duband, D.

    1992-01-01

    Time-series analysis of climatic parameters such as air temperature, river flow rates, and lake or sea levels is an indispensable basis for detecting a possible significant climatic change. These observations, when they are carefully analyzed and criticized, constitute the necessary reference for testing and validating numerical climate models which try to simulate the physical and dynamical processes of the coupled ocean-atmosphere system, taking continents into account. 32 refs., 13 figs

  9. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    International Nuclear Information System (INIS)

    Hultkrantz, L.; Olsson, C.

    1997-01-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs

  10. On-line condition monitoring of nuclear systems via symbolic time series analysis

    International Nuclear Information System (INIS)

    Rajagopalan, V.; Ray, A.; Garcia, H. E.

    2006-01-01

    This paper provides a symbolic time series analysis approach to fault diagnostics and condition monitoring. The proposed technique is built upon concepts from wavelet theory, symbolic dynamics and pattern recognition. Various aspects of the methodology such as wavelet selection, choice of alphabet and determination of depth of D-Markov Machine are explained in the paper. The technique is validated with experiments performed in a Machine Condition Monitoring (MCM) test bed at the Idaho National Laboratory. (authors)
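
    A bare-bones version of the symbolic approach, partitioning the signal into a small alphabet and comparing the state-visit probability vectors of a depth-D Markov machine, is sketched below. The quantile partition stands in for the wavelet-domain partition used in the paper, and the fault signature is simulated.

```python
import numpy as np

def state_probabilities(symbols, depth, n_symbols):
    """State-visit probability vector of a depth-D Markov machine whose
    states are all possible length-D symbol blocks."""
    counts = np.zeros(n_symbols ** depth)
    for i in range(len(symbols) - depth + 1):
        state = 0
        for s in symbols[i : i + depth]:
            state = state * n_symbols + s
        counts[state] += 1
    return counts / counts.sum()

rng = np.random.default_rng(8)
nominal = rng.normal(size=5000)                                         # healthy operation
faulty = rng.normal(size=5000) + 0.5 * np.sin(np.arange(5000) / 20)     # drifting component

# partition (alphabet) fixed from the nominal data; a wavelet-domain partition
# could be substituted here, mirroring the paper's wavelet pre-processing step
edges = np.quantile(nominal, [0.25, 0.5, 0.75])
p_nominal = state_probabilities(np.digitize(nominal, edges), depth=2, n_symbols=4)
p_faulty = state_probabilities(np.digitize(faulty, edges), depth=2, n_symbols=4)

print("anomaly measure (distance between state vectors):",
      round(float(np.linalg.norm(p_faulty - p_nominal)), 4))
```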

  11. A Time Series Analysis to Asymmetric Marketing Competition Within a Market Structure

    OpenAIRE

    Francisco F. R. Ramos

    1996-01-01

    As a complement to existing studies of competitive market structure analysis, the present paper proposes a time series methodology to provide a more detailed picture of marketing competition in relation to competitive market structure. Two major hypotheses were tested as part of this project. First, it was found that some significant cross-lead and lag effects of marketing variables on sales between brands existed even between different submarkets. Second, it was found that high qual...

  12. Time series analysis in road safety research using state space methods

    OpenAIRE

    BIJLEVELD, FD

    2008-01-01

    In this thesis we present a comprehensive study into novel time series models for aggregated road safety data. The models are mainly intended for analysis of indicators relevant to road safety, with a particular focus on how to measure these factors. Such developments may need to be related to or explained by external influences. It is also possible to make forecasts using the models. Relevant indicators include the number of persons killed per month or year. These statistics are closely watch...

  13. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hultkrantz, L. [Department of Economics, University of Uppsala, Uppsala (Sweden); Olsson, C. [Department of Economics, Umeaa University, Umeaa (Sweden)

    1997-03-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs.

  14. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    Science.gov (United States)

    2013-06-01


  15. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    Science.gov (United States)

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  16. Signal analysis approach to ultrasonic evaluation of diffusion bond quality

    International Nuclear Information System (INIS)

    Thomas, Graham; Chinn, Diane

    1999-01-01

    Solid state bonds like the diffusion bond are attractive techniques for joining dissimilar materials since they are not prone to the defects that occur with fusion welding. Ultrasonic methods can detect the presence of totally unbonded regions but have difficulty sensing poorly bonded areas where the substrates are in intimate contact. Standard ultrasonic imaging is based on amplitude changes in the signal reflected from the bond interface. Unfortunately, amplitude alone is not sensitive to bond quality. We demonstrated that there is additional information in the ultrasonic signal that correlates with bond quality. In our approach, we interrogated a set of dissimilar diffusion bonded samples with broadband ultrasonic signals. The signals were digitally processed and the characteristics of the signals that corresponded to bond quality were determined. These characteristics, or features, were processed with pattern recognition algorithms to produce predictions of bond quality. The predicted bond quality was then compared with destructive measurements to assess the classification capability of the ultrasonic technique

  17. Comparative Analysis of the Clinical Significance of Oscillatory Components in the Rhythmic Structure of Pulse Signal in the Diagnostics of Psychosomatic Disorders in School Age Children.

    Science.gov (United States)

    Desova, A A; Dorofeyuk, A A; Anokhin, A M

    2017-01-01

    We performed a comparative analysis of the types of spectral density typical of various parameters of the pulse signal. The experimental material was obtained during the examination of school age children with various psychosomatic disorders. We also performed a typological analysis of the spectral density functions corresponding to the time series of different parameters of a single oscillation of pulse signals; the results of their comparative analysis are presented. We determined the most significant spectral components for two disorders in children: arterial hypertension and mitral valve prolapse.

  18. TIME SERIES ANALYSIS ON STOCK MARKET FOR TEXT MINING CORRELATION OF ECONOMY NEWS

    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER

    2014-01-01

    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, and stock market values are considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem for most countries. The most well-known approach is applying text mining approaches to the news and some time series analysis techniques to stock market closing values, in order to apply classification or clustering algorithms over the extracted features. This study goes further and asks the question: what are the available time series analysis techniques for stock market closing values, and which one is the most suitable? In this study, the news and their dates are collected into a database and text mining is applied over the news; the text mining part has been kept simple, using only the term frequency-inverse document frequency method. For the time series analysis part, we have studied 10 different methods such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum or relative strength index, and their variations. We also explain these techniques in a comparative way, and we have applied the methods to Turkish Stock Market closing values over a period of more than 2 years. On the other hand, we have applied the term frequency-inverse document frequency method to the economy news of one of the high-circulation newspapers in Turkey.
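
    A few of the price-based indicators listed above can be computed directly with pandas, as in the hedged sketch below; the synthetic closing prices, window lengths, and indicator variants are illustrative and do not reproduce the Turkish Stock Market data.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(9)
close = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500))))   # synthetic closes

window = 20
features = pd.DataFrame({
    "moving_average": close.rolling(window).mean(),
    "momentum": close.diff(window),                     # price change over the window
    "rate_of_change": close.pct_change(window),
})
# Bollinger bands: moving average +/- 2 rolling standard deviations
std = close.rolling(window).std()
features["bollinger_upper"] = features["moving_average"] + 2 * std
features["bollinger_lower"] = features["moving_average"] - 2 * std

# relative strength index (RSI) over a 14-sample window
delta = close.diff()
gain = delta.clip(lower=0).rolling(14).mean()
loss = (-delta.clip(upper=0)).rolling(14).mean()
features["rsi"] = 100 - 100 / (1 + gain / loss)

print(features.dropna().tail(3).round(2))
```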

  19. The Fourier decomposition method for nonlinear and non-stationary time series analysis.

    Science.gov (United States)

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-03-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparisons are made with empirical mode decomposition algorithms.

  20. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    Science.gov (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.

  1. Carbon price signal. Impact Analysis on the European Electricity System

    International Nuclear Information System (INIS)

    2016-03-01

    The Paris Agreement signed by 195 countries late in December 2015, after COP 21, created a new basis for efficient cooperation between countries in the fight against climate change. The technologies being rolled out by the electricity sector will have very different impacts on climate change and, for the time being, investments other than public aid for renewable energies are being guided primarily by prices. To shed more light on the issue of greenhouse gas emissions, which is closely related to the challenges addressed at COP21, RTE initiated a study in 2015 based on the models used in its Generation Adequacy Report. ADEME wanted to contribute to this effort and offer its support. The present document outlines the approach taken to assessing the impact of the carbon price signal on emissions from the European electric power system, its production costs and its structural evolution over the medium term. This approach was discussed with members of the 'Network Outlook Committee' of the Transmission System Users' Committee which includes environmental NGOs as well as the main economic actors from the power sector. Key findings resulting from the analysis developed in this report include: Simulations conducted with the current generation fleet show that the carbon price would have to be close to euro 30/tonne at the European level to drive a significant reduction in emissions (about 100 million tonnes a year, or 15%) from the European power sector. A higher price of about euro 100/tonne would help drive an emissions reduction of close to 30%. Over the medium and long terms, beyond an impact on the number of hours fossil fuel power plants would be run, having a high carbon price would send a signal encouraging investment in renewable energies and could incentivise the development of flexible and storage capacity. It would notably guarantee the profitability of gas-fired plants and renewable power development. The following assumptions are factored into the study

  2. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    Science.gov (United States)

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9
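
    Steps 4-6 of the workflow above (choosing a window, then deriving time series features as latent variables) can be sketched with rolling-window statistics, as below. The heart-rate series, window length, and feature set are hypothetical; they simply illustrate how a monitored variable is turned into deterioration-sensitive features for a prediction model.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(10)
# toy minute-resolution heart-rate series for one patient
hr = pd.Series(80 + np.cumsum(rng.normal(0, 0.5, 720)),
               index=pd.date_range("2024-01-01", periods=720, freq="min"),
               name="heart_rate")

# window duration and resolution (step 4), then latent time series features (steps 5-6)
window, min_obs = "60min", 30
roll = hr.rolling(window, min_periods=min_obs)
features = pd.DataFrame({
    "hr_mean": roll.mean(),
    "hr_std": roll.std(),
    # within-window linear slope as a simple "deterioration" feature
    "hr_slope": roll.apply(lambda w: np.polyfit(np.arange(len(w)), w, 1)[0], raw=True),
})
print(features.dropna().tail(3))
```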

  3. An example of utilization of Kalman filters in time series analysis

    International Nuclear Information System (INIS)

    Marseguerra, M.; Porceddu, C.M.

    1987-01-01

    In reactor noise analysis, the fluctuations of many interesting signals may be described by linear models such as the AR, ARMA or ARMAX ones. Another interesting approach of increasing importance is the Kalman filter methodology. In this paper, a linear system described by an autoregressive AR(2) model is considered, and it is investigated whether the Kalman filter is capable of correctly estimating the parameters, together with their accuracies, both in the stationary state and in the case of a sudden variation of the parameters. In addition, a more complex situation is considered in which a stationary system under investigation feeds the sensor that delivers the observed signal. Assuming the system obeys an AR(2) model and the sensor a simpler AR(1) model, the problem is that of recovering the system output from the measured signal
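
    One common way to pose the first problem, estimating AR(2) parameters and their accuracies with a Kalman filter, is to treat the coefficients as a random-walk state and the lagged outputs as the observation matrix. The sketch below follows that formulation with assumed noise levels; it is an illustration, not the paper's exact estimator, and it includes a mid-series parameter jump to show the tracking behaviour.

```python
import numpy as np

rng = np.random.default_rng(11)

# simulate an AR(2) process whose parameters jump halfway through
n = 2000
phi_true = np.where(np.arange(n)[:, None] < n // 2, [1.2, -0.5], [0.6, -0.3])
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi_true[t] @ y[t - 2:t][::-1] + rng.normal(0, 0.5)

# Kalman filter with the AR coefficients as a slowly varying (random-walk) state
theta = np.zeros(2)                  # estimated [phi1, phi2]
P = np.eye(2) * 10.0                 # state covariance (parameter uncertainty)
Q, R = np.eye(2) * 1e-4, 0.25        # assumed process and measurement noise
history = np.zeros((n, 2))
for t in range(2, n):
    H = y[t - 2:t][::-1]             # regressor: [y_{t-1}, y_{t-2}]
    P = P + Q                        # predict (random-walk parameter model)
    S = H @ P @ H + R                # innovation variance
    K = P @ H / S                    # Kalman gain
    theta = theta + K * (y[t] - H @ theta)
    P = P - np.outer(K, H @ P)
    history[t] = theta

print("estimate before jump:", history[n // 2 - 1].round(2))
print("estimate at end:     ", history[-1].round(2))
```

    The diagonal of P provides the parameter accuracies mentioned in the abstract; after the jump, the estimates move toward the new coefficients at a rate governed by the assumed process noise Q.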

  4. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will coast through the practical introduction and move on to signal analysis techniques, commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  5. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lutaif, N.A. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Palazzo, R. Jr [Departamento de Telemática, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, Campinas, SP (Brazil); Gontijo, J.A.R. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil)

    2014-01-17

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  6. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    International Nuclear Information System (INIS)

    Lutaif, N.A.; Palazzo, R. Jr; Gontijo, J.A.R.

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile

  7. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.

  8. Spatial analysis of precipitation time series over the Upper Indus Basin

    Science.gov (United States)

    Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad

    2018-01-01

    The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides necessary support for agriculture, domestic consumption, and hydropower generation; all critical for a stable economy in Pakistan. This study has identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. This study investigated spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of changes in annual precipitation by determining significant trends at 6 stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul) out of the 15 studied stations, revealing a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the rest of the stations appear insignificant. Moreover, our study found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn, with 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases in precipitation appear robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially, including in the higher areas.
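
    Trend detection of the kind reported above is commonly done with a rank-based test plus a robust slope estimate. The sketch below uses Kendall's tau (closely related to the Mann-Kendall test) and Sen's slope on a synthetic annual precipitation series; the data and the specific test choice are assumptions, since the abstract does not prescribe an implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(12)
years = np.arange(1991, 2014)                          # the "fourth" 1991-2013 window
precip = 600 - 4.0 * (years - years[0]) + rng.normal(0, 25, years.size)   # mm, declining

tau, p_value = stats.kendalltau(years, precip)         # rank-based trend test

# Sen's slope: median of all pairwise slopes, a robust estimate of trend magnitude
i, j = np.triu_indices(years.size, k=1)
sen_slope = np.median((precip[j] - precip[i]) / (years[j] - years[i]))

print(f"tau={tau:.2f}, p={p_value:.4f}, Sen's slope={sen_slope:.1f} mm/yr")
```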

  9. Applications of Hilbert Spectral Analysis for Speech and Sound Signals

    Science.gov (United States)

    Huang, Norden E.

    2003-01-01

    A new method for analyzing nonlinear and nonstationary data has been developed, and its natural applications are to speech and sound signals. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same number of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima respectively. The IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the Intrinsic Mode Functions yield instantaneous frequencies as functions of time, which give sharp identifications of embedded structures. This invention can be used to process all acoustic signals. Specifically, it can process speech signals for speech synthesis, speaker identification and verification, speech recognition, and sound signal enhancement and filtering. Additionally, the acoustic signals from machinery are essentially the way the machines talk to us; these signals, whether transmitted as sound through the air or as vibration on the machines, can tell us the operating conditions of the machines. Thus, we can use the acoustic signal to diagnose machine problems.
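
    The Hilbert step of the method, turning an intrinsic mode function into instantaneous amplitude and frequency, can be sketched as follows. The chirp-like input stands in for an IMF produced by Empirical Mode Decomposition, which is not reimplemented here.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
# a chirp-like "intrinsic mode": frequency glides from 50 Hz to 100 Hz
imf = np.sin(2 * np.pi * (50 * t + 12.5 * t ** 2))

analytic = hilbert(imf)
amplitude = np.abs(analytic)                      # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs     # instantaneous frequency in Hz

print("frequency at t=0.5 s:", round(inst_freq[int(0.5 * fs)], 1), "Hz")   # ~62.5 Hz
print("frequency at t=1.5 s:", round(inst_freq[int(1.5 * fs)], 1), "Hz")   # ~87.5 Hz
```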

  10. Spectral analysis of uneven time series of geological variables; Analisis espectral de series temporales de variables geologicas con muestreo irregular

    Energy Technology Data Exchange (ETDEWEB)

    Pardo-Iguzquiza, E.; Rodriguez-Tovar, F. J.

    2013-06-01

    In the geosciences, the sampling of a time series often yields unevenly spaced data, sometimes because the sampling itself is random or because of hiatuses or even completely missing data, or due to difficulties involved in converting data from a spatial to a time scale when the sedimentation rate was not constant. Whatever the case, the best solution does not lie in interpolation but rather in resorting to a method that deals with the irregular data directly. We show here how the use of the smoothed Lomb-Scargle periodogram is both a practical and an efficient choice. We describe the effects on the estimated power spectrum of the type of irregular sampling, the number of data, interpolation, and the presence of drift. We propose the permutation test as an efficient way of calculating statistical confidence levels. By applying the Lomb-Scargle periodogram to a synthetic series with a known spectral content we are able to confirm the validity of this method in the face of the difficulties mentioned above. A case study with real data, including hiatuses, representing the thickness of the annual banding in a stalagmite, is chosen to demonstrate an application using the statistical and physical interpretation of spectral peaks. (Author)
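
    A minimal Lomb-Scargle sketch for unevenly sampled data is shown below, using SciPy's implementation (the smoothing and permutation-test confidence levels described in the paper are not included). The synthetic banding series and its 23-unit period are invented for illustration.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(13)
# unevenly sampled record, e.g. annual band thickness with random gaps
t = np.sort(rng.uniform(0, 500, 300))                    # 300 samples over 500 "years"
y = 1.0 + 0.4 * np.sin(2 * np.pi * t / 23.0) + 0.2 * rng.normal(size=t.size)

periods = np.linspace(5, 100, 2000)
ang_freqs = 2 * np.pi / periods                          # lombscargle expects angular freqs
power = lombscargle(t, y - y.mean(), ang_freqs)

best = periods[np.argmax(power)]
print(f"dominant period ~ {best:.1f} (true value 23.0)")
```

    The significance of the resulting peak could then be assessed with the permutation test advocated in the paper, by recomputing the periodogram on shuffled values.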

  11. Early Detection of Amyotrophic Lateral Sclerosis (ALS) using the Gait Motor Signal Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Behzad Abedi

    2016-06-01

    Background: ALS is a progressive neuro-muscular disease characterized by motor neuron loss in the Central Nervous System (CNS) and the Peripheral Nervous System (PNS). To date, no accurate clinical method for diagnosis of the disease has been provided. In most cases, ALS patients are unable to walk normally due to abnormalities in the nervous system. For this reason, one of the most appropriate methods for distinguishing ALS from other neurological diseases or from healthy volunteers is gait motor signal analysis. Materials and Methods: In this study, gait signals available in the Physionet database were used. The database consists of 13 patients with ALS (ALS1, ALS2, …, ALS13) and 16 normal subjects (CO1, CO2, …, CO16). The patients participating in this study had no history of psychiatric disorders and did not use any assistive device for walking, such as a wheelchair. The power spectrum of stride, swing, and stance of normal subjects and patients was computed for both left and right legs. To provide appropriate inputs for the classifier, the frequency band of the power spectrum of each signal was divided into eight equal parts and the area of each region was computed. The three frequency bands in the lower range of the power spectra were selected as inputs to the classifier. Results: In this study, power spectra were used as frequency attributes to explore probable differences between the time series of patients and healthy subjects. Conclusion: An Artificial Neural Network was used to classify normal and ALS groups with an accuracy of 83% on the test data set. It seems that the present algorithm can be used to discriminate patients from normal subjects in the early stages of the disease.
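
    A sketch of the feature-extraction step described above: the power spectrum of a stride-interval series is split into eight equal bands and the band areas are used as features. The Physionet file parsing and the neural-network classifier are omitted; the stride series below is synthetic.

        import numpy as np

        def band_area_features(stride_intervals, n_bands=8):
            """Split the power spectrum of a stride-interval series into equal bands
            and return the area (integrated power) of each band."""
            x = np.asarray(stride_intervals, dtype=float)
            x = x - x.mean()
            psd = np.abs(np.fft.rfft(x)) ** 2          # one-sided power spectrum
            bands = np.array_split(psd, n_bands)       # eight (near-)equal frequency bands
            return np.array([band.sum() for band in bands])

        # Synthetic stride-interval series (seconds) standing in for a Physionet gait record
        rng = np.random.default_rng(2)
        strides = 1.1 + 0.05 * rng.normal(size=256)

        features = band_area_features(strides)
        low_freq_inputs = features[:3]                  # three lowest bands, as used for the classifier
        print(low_freq_inputs)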

  12. Application of wavelet analysis to signal processing methods for eddy-current test

    International Nuclear Information System (INIS)

    Chen, G.; Yoneyama, H.; Yamaguchi, A.; Uesugi, N.

    1998-01-01

    This study deals with the application of wavelet analysis to detection and characterization of defects from eddy-current and ultrasonic testing signals of a low signal-to-noise ratio. Presented in this paper are the methods for processing eddy-current testing signals of heat exchanger tubes of a steam generator in a nuclear power plant. The results of processing eddy-current testing signals of tube testpieces with artificial flaws show that the flaw signals corrupted by noise and/or non-defect signals can be effectively detected and characterized by using the wavelet methods. (author)
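
    A minimal wavelet-denoising sketch in the spirit of the paper, assuming the PyWavelets package; the wavelet family, threshold rule, and the synthetic "defect" signal are illustrative choices, not the authors' settings.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        rng = np.random.default_rng(3)
        n = 1024
        t = np.linspace(0, 1, n)
        flaw = np.exp(-((t - 0.5) ** 2) / 0.0005) * np.sin(2 * np.pi * 80 * t)   # localized defect echo
        signal = flaw + 0.4 * rng.normal(size=n)                                  # low signal-to-noise ratio

        # Multilevel discrete wavelet decomposition
        coeffs = pywt.wavedec(signal, "db4", level=5)
        # Universal threshold estimated from the finest-scale coefficients
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(n))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")

        print("noise std before: %.3f, after: %.3f"
              % (np.std(signal - flaw), np.std(denoised[:n] - flaw)))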

  13. A signal detection theory analysis of an unconscious perception effect.

    Science.gov (United States)

    Haase, S J; Theios, J; Jenison, R

    1999-07-01

    The independent observation model (Macmillan & Creelman, 1991) is fitted to detection-identification data collected under conditions of heavy masking. The model accurately predicts a quantitative relationship between stimulus detection and stimulus identification over a wide range of detection performance. This model can also be used to offer a signal detection interpretation of the common finding of above-chance identification following a missed signal. While our finding is not a new one, the stimuli used in this experiment (redundant three-letter strings) differ slightly from those used in traditional signal detection work. Also, the stimuli were presented very briefly and heavily masked, conditions typical in the study of unconscious perception effects.

  14. Analysis of Ultrasonic Transmitted Signal for Apple using Wavelet Transform

    International Nuclear Information System (INIS)

    Kim, Ki Bok; Lee, Sang Dae; Choi, Man Yong; Kim, Man Soo

    2005-01-01

    This study was conducted to analyze the ultrasonic signal transmitted through apples using the wavelet transform. Fruit consists of components with nonlinear visco-elastic properties, such as the flesh, ovary, and rind, and hence most of the ultrasonic wave is attenuated and its frequency shifted while passing through the fruit. It is therefore not easy to evaluate the internal quality of the fruit using typical ultrasonic parameters such as wave velocity, attenuation, and frequency spectrum. The discrete wavelet transform was applied to the ultrasonic signal transmitted through the apple. The magnitude of the first peak frequency of the wavelet basis from the transmitted ultrasonic signal showed a close correlation with the storage time of the apple.

  15. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm of time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
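
    A minimal sketch of fitting an autoregressive model in the time domain, as discussed above, via ordinary least squares on lagged values. The original FORTRAN algorithm is not reproduced; the model order and the test signal are illustrative.

        import numpy as np

        def fit_ar(x, order):
            """Least-squares fit of an AR(order) model: x[t] = sum_k a[k] x[t-k] + e[t]."""
            x = np.asarray(x, dtype=float) - np.mean(x)
            # Design matrix of lagged values: column k holds x[t-(k+1)]
            X = np.column_stack([x[order - k - 1: len(x) - k - 1] for k in range(order)])
            y = x[order:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ coeffs
            return coeffs, np.var(resid)

        # Synthetic AR(2) process as a test case
        rng = np.random.default_rng(4)
        n, true_a = 2000, np.array([0.75, -0.5])
        x = np.zeros(n)
        for t in range(2, n):
            x[t] = true_a[0] * x[t - 1] + true_a[1] * x[t - 2] + rng.normal()

        a_hat, noise_var = fit_ar(x, order=2)
        print("estimated AR coefficients:", np.round(a_hat, 2), "noise variance: %.2f" % noise_var)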

  16. Novel Signal Noise Reduction Method through Cluster Analysis, Applied to Photoplethysmography.

    Science.gov (United States)

    Waugh, William; Allen, John; Wightman, James; Sims, Andrew J; Beale, Thomas A W

    2018-01-01

    Physiological signals can often become contaminated by noise from a variety of origins. In this paper, an algorithm is described for the reduction of sporadic noise from a continuous periodic signal. The design can be used where a sample of a periodic signal is required, for example, when an average pulse is needed for pulse wave analysis and characterization. The algorithm is based on cluster analysis for selecting similar repetitions or pulses from a periodic signal. This method selects individual pulses without noise, returns a clean pulse signal, and terminates when a sufficiently clean and representative signal is received. The algorithm is designed to be sufficiently compact to be implemented on a microcontroller embedded within a medical device. It has been validated through the removal of noise from an exemplar photoplethysmography (PPG) signal, showing increasing benefit as the noise contamination of the signal increases. The algorithm design is generalised to be applicable to a wide range of physiological (physical) signals.
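
    A rough sketch of the idea described above, not the authors' embedded implementation: segment a periodic signal into individual pulses, cluster them by shape, and average the largest cluster as the clean representative pulse. Segmenting by a fixed pulse length, using k-means from scikit-learn, and modelling the artifacts as baseline shifts are simplifying assumptions.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(5)
        pulse_len, n_pulses = 100, 40
        t = np.linspace(0, 1, pulse_len)
        template = np.exp(-((t - 0.3) ** 2) / 0.01) + 0.4 * np.exp(-((t - 0.6) ** 2) / 0.02)  # PPG-like pulse

        # Build a record of repeated pulses, a few of them corrupted by sporadic artifacts
        pulses = np.tile(template, (n_pulses, 1)) + 0.02 * rng.normal(size=(n_pulses, pulse_len))
        bad = rng.choice(n_pulses, size=6, replace=False)
        pulses[bad] += rng.uniform(0.5, 1.5, size=(len(bad), 1))   # baseline-wander-like artifacts

        # Cluster pulse shapes and keep the biggest (most self-consistent) cluster
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pulses)
        keep = labels == np.bincount(labels).argmax()
        clean_pulse = pulses[keep].mean(axis=0)

        print("kept %d of %d pulses; mean deviation from template: %.3f"
              % (keep.sum(), n_pulses, np.abs(clean_pulse - template).mean()))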

  17. Comparison of data transformation procedures to enhance topographical accuracy in time-series analysis of the human EEG.

    Science.gov (United States)

    Hauk, O; Keil, A; Elbert, T; Müller, M M

    2002-01-30

    We describe a methodology to apply current source density (CSD) and minimum norm (MN) estimation as pre-processing tools for time-series analysis of single trial EEG data. The performance of these methods is compared for the case of wavelet time-frequency analysis of simulated gamma-band activity. A reasonable comparison of CSD and MN on the single trial level requires regularization such that the corresponding transformed data sets have similar signal-to-noise ratios (SNRs). For region-of-interest approaches, it should be possible to optimize the SNR for single estimates rather than for the whole distributed solution. An effective implementation of the MN method is described. Simulated data sets were created by modulating the strengths of a radial and a tangential test dipole with wavelets in the frequency range of the gamma band, superimposed with simulated spatially uncorrelated noise. The MN and CSD transformed data sets as well as the average reference (AR) representation were subjected to wavelet frequency-domain analysis, and power spectra were mapped for relevant frequency bands. For both CSD and MN, the influence of noise can be sufficiently suppressed by regularization to yield meaningful information, but only MN represents both radial and tangential dipole sources appropriately as single peaks. Therefore, when relating wavelet power spectrum topographies to their neuronal generators, MN should be preferred.
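
    A minimal sketch of the wavelet time-frequency step applied after the CSD/MN/AR pre-processing: single-trial gamma-band power obtained by convolution with complex Morlet wavelets. The source estimation itself is not shown, and the single trial here is synthetic.

        import numpy as np

        def morlet_power(x, fs, freqs, n_cycles=7):
            """Time-frequency power via convolution with complex Morlet wavelets."""
            x = np.asarray(x, dtype=float)
            power = np.empty((len(freqs), len(x)))
            for i, f in enumerate(freqs):
                sigma_t = n_cycles / (2 * np.pi * f)                 # wavelet width in time
                t = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
                wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
                wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))     # energy normalization
                power[i] = np.abs(np.convolve(x, wavelet, mode="same")) ** 2
            return power

        fs = 500.0
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(6)
        # Simulated single trial: 40 Hz burst between 0.8 and 1.2 s plus noise
        trial = rng.normal(size=t.size)
        trial += 2.0 * np.sin(2 * np.pi * 40 * t) * ((t > 0.8) & (t < 1.2))

        gamma_freqs = np.arange(30, 61, 5)
        power = morlet_power(trial, fs, gamma_freqs)
        peak_f, peak_t = np.unravel_index(power.argmax(), power.shape)
        print("peak gamma power at %d Hz, t = %.2f s" % (gamma_freqs[peak_f], t[peak_t]))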

  18. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    Science.gov (United States)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series of 23 years in length with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed at 1 mm/y-/4. Then, we added the deterministic part consisting of a linear trend of 20 mm/y (which represents the averaged horizontal velocity) and accelerations ranging from minus 0.6 to plus 0.6 mm/y². For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark to then investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
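
    A minimal sketch of the deterministic part of the model discussed above: fitting offset, velocity, and an acceleration (quadratic) term to a position time series by least squares. The colored-noise simulation and the MLE noise analysis done with Hector are not reproduced; the series and its white noise are synthetic, so the formal uncertainties below are optimistic.

        import numpy as np

        rng = np.random.default_rng(7)
        t = np.arange(0, 23, 1 / 365.25)                        # daily epochs over 23 years
        true_vel, true_acc = 20.0, 0.3                          # mm/yr and mm/yr^2 (illustrative)
        pos = 5.0 + true_vel * t + 0.5 * true_acc * t**2 + rng.normal(0, 3.0, t.size)  # white noise only

        # Design matrix: offset, velocity, quadratic acceleration term
        A = np.column_stack([np.ones_like(t), t, 0.5 * t**2])
        params, *_ = np.linalg.lstsq(A, pos, rcond=None)
        resid = pos - A @ params
        # Formal (white-noise) parameter uncertainties; colored noise would inflate these
        cov = np.var(resid) * np.linalg.inv(A.T @ A)

        print("velocity = %.2f +/- %.2f mm/yr, acceleration = %.2f +/- %.2f mm/yr^2"
              % (params[1], np.sqrt(cov[1, 1]), params[2], np.sqrt(cov[2, 2])))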

  19. Measurement of transient two-phase flow velocity using statistical signal analysis of impedance probe signals

    International Nuclear Information System (INIS)

    Leavell, W.H.; Mullens, J.A.

    1981-01-01

    A computational algorithm has been developed to measure transient, phase-interface velocity in two-phase, steam-water systems. The algorithm will be used to measure the transient velocity of the steam-water mixture during simulated PWR reflood experiments. By utilizing signals produced by two spatially separated impedance probes immersed in a two-phase mixture, the algorithm computes the average transit time of mixture fluctuations moving between the two probes. This transit time is computed by first measuring the phase shift between the two probe signals after transformation to the frequency domain, and then computing the slope of the phase shift versus frequency by a weighted least-squares fitting technique. Our algorithm, which has been tested with both simulated and real data, is able to accurately track velocity transients as fast as 4 m/s/s.
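
    A minimal sketch of the transit-time estimate described above: cross-spectral phase between two probe signals, with a coherence-weighted least-squares fit of phase versus frequency whose slope gives the delay. Probe geometry, windowing choices, and the transient tracking are omitted; the two signals here are synthetic.

        import numpy as np
        from scipy.signal import csd, coherence

        fs = 1000.0
        rng = np.random.default_rng(8)
        n = 20000
        delay_samples = 25                                        # true transit time = 25 ms
        x = rng.normal(size=n)                                    # upstream probe fluctuation
        y = np.roll(x, delay_samples) + 0.5 * rng.normal(size=n)  # downstream probe: delayed + noise

        f, Pxy = csd(x, y, fs=fs, nperseg=1024)
        _, Cxy = coherence(x, y, fs=fs, nperseg=1024)
        phase = np.unwrap(np.angle(Pxy))

        # Weighted least-squares fit of phase = -2*pi*f*tau over a well-coherent band
        band = (f > 5) & (f < 100)
        w = Cxy[band]
        slope = np.sum(w * f[band] * phase[band]) / np.sum(w * f[band] ** 2)
        tau = -slope / (2 * np.pi)
        print("estimated transit time: %.1f ms (true %.1f ms)" % (1e3 * tau, 1e3 * delay_samples / fs))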

  20. Analysis of Design of Mixed-Signal Electronic Packaging

    National Research Council Canada - National Science Library

    Pileggi, Lawrence

    1999-01-01

    The objective of this project is to develop innovative algorithms and prototype tools that will help facilitate the design of mixed signal multi-chip modules and packaging in a manner that is similar...

  1. Research Article Genetic Analysis of Signal Peptides in Amphibian ...

    Indian Academy of Sciences (India)

    USUARIO

    Depending on the genus, the whole structure could be duplicated in tandem or scattered ..... signalling, presumably for translocating the lipid bilayer during synthesis. .... revised classification of extant frogs, salamanders, and caecilians.

  2. small signal analysis of load angle governing and excitation control

    African Journals Online (AJOL)

    Dr Obe

    system stabilizers (PSS) or using terminal voltage for control of exciter and speed signal for governor. ... Vfd= generator field voltage. Xd, Xq ... each other in the frequency domain, and therefore ..... angle sensing equipment, relays and.

  3. Emg Signal Analysis of Healthy and Neuropathic Individuals

    Science.gov (United States)

    Gupta, Ashutosh; Sayed, Tabassum; Garg, Ridhi; Shreyam, Richa

    2017-08-01

    Electromyography is a method to evaluate levels of muscle activity. When a muscle contracts, an action potential is generated and circulates along the muscular fibers. In electromyography, electrodes are attached to the skin, the electrical activity of the muscles is measured, and a graph is plotted. The surface EMG signals picked up during muscular activity are interfaced with a system. The EMG signals obtained in this way from an individual suffering from neuropathy and from a healthy individual are processed and analyzed using signal processing techniques. This project includes the investigation and interpretation of EMG signals of healthy and neuropathic individuals using MATLAB. The prospective use of this study is in developing prosthetic devices for people with neuropathic disabilities.

  4. Analysis of small-signal intensity modulation of semiconductor ...

    Indian Academy of Sciences (India)

    Computer simulation of the model is applied to 1.55-µm ... Semiconductor laser; small-signal modulation; modulation response; gain suppression. ... originates from intraband relaxation processes of charge carriers that extend for times as ...

  5. Analysis and Simulation of Multi-target Echo Signals from a Phased Array Radar

    OpenAIRE

    Jia Zhen; Zhou Rui

    2017-01-01

    The construction of digital radar simulation systems has been a research hotspot of the radar field. This paper focuses on theoretical analysis and simulation of multi-target echo signals produced in a phased array radar system, and constructs an array antenna element and a signal generation environment. The antenna element is able to simulate planar arrays and optimizes these arrays by adding window functions. And the signal environment can model and simulate radar transmission signals, rada...

  6. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal.

    Science.gov (United States)

    Nayak, Suraj K; Bit, Arindam; Dey, Anilesh; Mohapatra, Biswajit; Pal, Kunal

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis.
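
    As one concrete example of the nonlinear measures listed above, here is a straightforward, unoptimized sample entropy implementation. The embedding dimension m = 2 and tolerance r = 0.2·SD are conventional choices, and the input is a synthetic RR-interval-like series rather than a real ECG-derived one.

        import numpy as np

        def sample_entropy(x, m=2, r_factor=0.2):
            """Sample entropy SampEn(m, r, N) of a 1-D series (brute-force O(N^2) version)."""
            x = np.asarray(x, dtype=float)
            r = r_factor * np.std(x)
            N = len(x)

            def count_matches(templates):
                count = 0
                for i in range(len(templates) - 1):
                    # Chebyshev distance to all later template vectors (self-matches excluded)
                    d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                    count += np.sum(d <= r)
                return count

            emb_m = np.array([x[i:i + m] for i in range(N - m)])        # N-m templates of length m
            emb_m1 = np.array([x[i:i + m + 1] for i in range(N - m)])   # N-m templates of length m+1
            B = count_matches(emb_m)
            A = count_matches(emb_m1)
            return -np.log(A / B) if A > 0 and B > 0 else np.inf

        rng = np.random.default_rng(9)
        rr = 0.8 + 0.05 * np.sin(np.arange(300) * 0.1) + 0.02 * rng.normal(size=300)  # RR-like series (s)
        print("SampEn =", round(sample_entropy(rr), 3))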

  7. A Review on the Nonlinear Dynamical System Analysis of Electrocardiogram Signal

    Science.gov (United States)

    Mohapatra, Biswajit

    2018-01-01

    Electrocardiogram (ECG) signal analysis has received special attention of the researchers in the recent past because of its ability to divulge crucial information about the electrophysiology of the heart and the autonomic nervous system activity in a noninvasive manner. Analysis of the ECG signals has been explored using both linear and nonlinear methods. However, the nonlinear methods of ECG signal analysis are gaining popularity because of their robustness in feature extraction and classification. The current study presents a review of the nonlinear signal analysis methods, namely, reconstructed phase space analysis, Lyapunov exponents, correlation dimension, detrended fluctuation analysis (DFA), recurrence plot, Poincaré plot, approximate entropy, and sample entropy along with their recent applications in the ECG signal analysis. PMID:29854361

  8. Conditional Random Fields for Morphological Analysis of Wireless ECG Signals

    OpenAIRE

    Natarajan, Annamalai; Gaiser, Edward; Angarita, Gustavo; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin

    2014-01-01

    Thanks to advances in mobile sensing technologies, it has recently become practical to deploy wireless electrocardiograph sensors for continuous recording of ECG signals. This capability has diverse applications in the study of human health and behavior, but to realize its full potential, new computational tools are required to effectively deal with the uncertainty that results from the noisy and highly non-stationary signals collected using these devices. In this work, we present a novel app...

  9. Enhanced Phosphoproteomic Profiling Workflow For Growth Factor Signaling Analysis

    DEFF Research Database (Denmark)

    Sylvester, Marc; Burbridge, Mike; Leclerc, Gregory

    2010-01-01

    Background Our understanding of complex signaling networks is still fragmentary. Isolated processes have been studied extensively but cross-talk is omnipresent and precludes intuitive predictions of signaling outcomes. The need for quantitative data on dynamic systems is apparent especially for our...... understanding of pathological processes. In our study we create and integrate data on phosphorylations that are initiated by several growth factor receptors. We present an approach for quantitative, time-resolved phosphoproteomic profiling that integrates the important contributions by phosphotyrosines. Methods...

  10. Brain Signal Analysis Using Different Types of Music

    OpenAIRE

    Siti Ayuni Mohd Nasir; Wan Mahani Hafizah Wan Mahmud

    2015-01-01

    Music is able to improve certain functions of the human body both physiologically and psychologically. Listening to music before performing a task can also improve attention, memory, and even mental math ability. The purpose of this study is to examine the relation between types of music and brainwave signals, that is, differences between relaxation and attention states. The Electroencephalography (EEG) signal was recorded using PowerLab, Dual Bio Amp and a computer to observe and record...

  11. Large-signal stability analysis of PWM converters

    Energy Technology Data Exchange (ETDEWEB)

    Huynh, P.T. [Philips Labs., Briarcliff Manor, NY (United States); Cho, B.H. [Seoul National Univ. (Korea, Republic of). Dept. of Electrical Engineering

    1995-12-31

    Investigation of the effects of existing nonlinearities on the stability of PWM converters is performed. The bilinear structure, the duty cycle saturation, and the opamp saturation are the principal nonlinearities in PWM converters. These nonlinearities are incorporated in the large-signal analytical models of PWM converters, and the basic input-output stability theory is applied to analyze their stability. Design and optimization of the small-signal loop gains to counteract the undesirable nonlinear effects are also discussed.

  12. Note for the Mirnov signal analysis in tokamaks

    International Nuclear Information System (INIS)

    Kikuchi, M.

    1985-05-01

    The relation between Mirnov coil signals and the current perturbation on the rational surface is examined analytically by using the approximate Green's function for the case of large aspect ratio circular tokamaks. Satellite island formation, phase modulation effect due to the poloidal variation of the field line pitch, and the shift effect of the plasma column with respect to the center of the vacuum chamber are examined. The detectability of these effects from Mirnov coil signals is discussed for TFTR

  13. Beyond Fractals and 1/f Noise: Multifractal Analysis of Complex Physiological Time Series

    Science.gov (United States)

    Ivanov, Plamen Ch.; Amaral, Luis A. N.; Ashkenazy, Yosef; Stanley, H. Eugene; Goldberger, Ary L.; Hausdorff, Jeffrey M.; Yoneyama, Mitsuru; Arai, Kuniharu

    2001-03-01

    We investigate time series with 1/f-like spectra generated by two physiologic control systems --- the human heartbeat and human gait. We show that physiological fluctuations exhibit unexpected ``hidden'' structures often described by scaling laws. In particular, our studies indicate that when analyzed on different time scales the heartbeat fluctuations exhibit cascades of branching patterns with self-similar (fractal) properties, characterized by long-range power-law anticorrelations. We find that these scaling features change during sleep and wake phases, and with pathological perturbations. Further, by means of a new wavelet-based technique, we find evidence of multifractality in the healthy human heartbeat even under resting conditions, and show that the multifractal character and nonlinear properties of the healthy heart are encoded in the Fourier phases. We uncover a loss of multifractality for a life-threatening condition, congestive heart failure. In contrast to the heartbeat, we find that the interstride interval time series of healthy human gait, a voluntary process under neural regulation, is described by a single fractal dimension (such as classical 1/f noise) indicating monofractal behavior. Thus our approach can help distinguish physiological and physical signals with comparable frequency spectra and two-point correlations, and guide modeling of their control mechanisms.

  14. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    Science.gov (United States)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wild fires have direct impacts on natural ecosystems and play a major role in the vegetation ecology and carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time and value- based) were tested for RI's of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators were different for different vegetation types and dependent on timing of the burning season. Results highlighted the importance of appropriate reference areas and the importance of correct normalization of the SPOT-VGT data.

  15. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.

  16. Comparative performance analysis of shunt and series passive filter for LED lamp

    Science.gov (United States)

    Sarwono, Edi; Facta, Mochammad; Handoko, Susatyo

    2018-03-01

    The Light Emitting Diode (LED) lamp is nowadays widely used by consumers as a new innovation in lighting technology, owing to its energy savings: low power consumption with bright light intensity. However, the LED lamp produces an electrical pollutant known as harmonics. The harmonics are generated by the rectifier that is part of the LED lamp circuit. The presence of harmonics in the current or voltage distorts the source waveform from the grid. This distortion may cause inaccurate measurement, malfunction, and excessive heating of elements on the grid. This paper presents an analysis of shunt and series filters to suppress the harmonics generated by the LED lamp circuit. The work was initiated by conducting several tests to investigate the harmonic content of the voltage and currents. The measurements in this work were carried out using a HIOKI 3197 Power Quality Analyzer. The measurement results showed that the harmonic currents of the tested LED lamps were above the limit of IEEE Standard 519-2014. Based on the measurement results, shunt and series filters were constructed as low-pass filters. Bode analysis was applied during construction and to predict the filters' performance. Based on the experimental results, the application of a shunt filter at the input side of the LED lamp reduced the current THD by up to 88%, while the series filter reduced the current THD by up to 92%.
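
    A minimal sketch of the kind of Bode analysis mentioned above for a series-L, shunt-C low-pass filter placed ahead of the lamp rectifier; the component values and damping resistance are illustrative assumptions, not the paper's design.

        import numpy as np
        from scipy import signal

        # Illustrative LC low-pass filter: series inductor L, shunt capacitor C, series resistance R
        L, C, R = 0.5, 10e-6, 20.0                   # H, F, ohm (assumed values, not the paper's)
        # Transfer function Vout/Vin = 1 / (L*C*s^2 + R*C*s + 1)
        sys = signal.TransferFunction([1.0], [L * C, R * C, 1.0])

        f_cut = 1.0 / (2 * np.pi * np.sqrt(L * C))   # corner frequency
        print("corner frequency: %.0f Hz" % f_cut)

        # Attenuation at the low-order harmonics of a 50 Hz supply
        harmonics = np.array([3, 5, 7, 9]) * 50.0
        w, mag, phase = sys.bode(w=2 * np.pi * harmonics)
        for h, m in zip(harmonics, mag):
            print("gain at %4.0f Hz: %6.1f dB" % (h, m))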

  17. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    Science.gov (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
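
    A minimal sketch of fitting the seasonal ARIMA(1,1,1)×(0,1,1)12 model reported above with statsmodels and producing a 12-month forecast. The monthly counts below are simulated stand-ins, since the ISSO data are not reproduced here; pandas and statsmodels are assumed to be available.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        rng = np.random.default_rng(10)
        # Simulated monthly accident counts with trend and yearly seasonality
        months = pd.date_range("2000-01", periods=132, freq="MS")
        t = np.arange(len(months))
        counts = 1200 + 2 * t + 150 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 60, len(t))
        series = pd.Series(counts, index=months)

        # ARIMA(1,1,1)x(0,1,1)_12, as in the paper
        model = SARIMAX(series, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
        result = model.fit(disp=False)

        forecast = result.forecast(steps=12)          # forecast the next 12 months
        mape = np.mean(np.abs(result.resid[13:] / series[13:])) * 100
        print(forecast.round(0))
        print("in-sample MAPE: %.1f%%" % mape)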

  18. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    Science.gov (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered as a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model for fitting to data was: ARIMA(1,1,1)×(0,1,1)12 consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters with 20·942 mean absolute percentage error (MAPE). Conclusions The final model showed that time series analysis of ARIMA models was useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 explained the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  19. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee

    2010-01-01

    Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However many cycles of solar activities can be reflected by the sunspot time series. The 11-year cycle is perhaps the most famous cycle of the sunspot activity. These cycles pose problems for the investigation of the scaling behavior of sunspot time series. Using different methods to handle the 11-year cycle generally creates totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem in the present paper, empirical mode decomposition (EMD), a data-driven adaptive method, is applied to first extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. And on removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the

  20. Detrended fluctuation analysis based on higher-order moments of financial time series

    Science.gov (United States)

    Teng, Yue; Shang, Pengjian

    2018-01-01

    In this paper, a generalized method of detrended fluctuation analysis (DFA) is proposed as a new measure to assess the complexity of a complex dynamical system such as stock market. We extend DFA and local scaling DFA to higher moments such as skewness and kurtosis (labeled SMDFA and KMDFA), so as to investigate the volatility scaling property of financial time series. Simulations are conducted over synthetic and financial data for providing the comparative study. We further report the results of volatility behaviors in three American countries, three Chinese and three European stock markets by using DFA and LSDFA method based on higher moments. They demonstrate the dynamics behaviors of time series in different aspects, which can quantify the changes of complexity for stock market data and provide us with more meaningful information than single exponent. And the results reveal some higher moments volatility and higher moments multiscale volatility details that cannot be obtained using the traditional DFA method.
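
    For reference, a compact implementation of standard (second-moment) DFA, which the paper generalizes by replacing the variance of the detrended segments with higher moments such as skewness and kurtosis. The white-noise test input and scale range are illustrative.

        import numpy as np

        def dfa(x, scales):
            """Standard DFA: returns the fluctuation function F(s) for each scale s."""
            x = np.asarray(x, dtype=float)
            profile = np.cumsum(x - x.mean())                      # integrated (profile) series
            F = []
            for s in scales:
                n_seg = len(profile) // s
                ms = []
                for i in range(n_seg):
                    seg = profile[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                    ms.append(np.mean((seg - trend) ** 2))         # second moment of residuals
                F.append(np.sqrt(np.mean(ms)))
            return np.array(F)

        rng = np.random.default_rng(11)
        x = rng.normal(size=10000)                                 # white noise: expected alpha ~ 0.5
        scales = np.unique(np.logspace(1, 3, 20).astype(int))
        F = dfa(x, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]        # scaling exponent from log-log slope
        print("DFA exponent alpha = %.2f" % alpha)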

  1. Time series analysis of pressure fluctuation in gas-solid fluidized beds

    Directory of Open Access Journals (Sweden)

    C. Alberto S. Felipe

    2004-09-01

    The purpose of the present work was to study the differentiation of typical fluidization states (single bubble, multiple bubble, and slugging) in a gas-solid fluidized bed, using spectral analysis of pressure fluctuation time series. The effects of the measurement method (differential and absolute pressure fluctuations) and of the axial position of the probes in the fluidization column on the identification of each of the regimes studied were evaluated. The Fast Fourier Transform (FFT), which expresses the behavior of a time series in the frequency domain, was the mathematical tool used to analyse the pressure fluctuation data. Results indicated that the plenum chamber was a reliable measurement location and that care should be taken with measurements in the dense phase. The method allowed fluid dynamic regimes to be differentiated by their dominant frequency characteristics.
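
    A minimal sketch of the spectral step described above: estimate the power spectrum of a pressure-fluctuation record and read off the dominant frequency that characterizes the fluidization regime. Welch averaging is used here instead of a single raw FFT to stabilize the spectrum; the sampling rate and the synthetic record are assumptions.

        import numpy as np
        from scipy.signal import welch

        fs = 200.0                                   # sampling rate of the pressure transducer (Hz), assumed
        rng = np.random.default_rng(12)
        t = np.arange(0, 60, 1 / fs)
        # Synthetic single-bubble-like regime: coherent oscillation near 2 Hz plus broadband noise
        pressure = 1.5 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0, 1.0, t.size)

        f, psd = welch(pressure - pressure.mean(), fs=fs, nperseg=2048)
        dominant = f[np.argmax(psd)]
        print("dominant frequency: %.2f Hz" % dominant)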

  2. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

    Geometric process was first introduced by Lam [10,11]. A stochastic process {Xi, i = 1, 2,…} is called a geometric process (GP) if, for some a > 0, {a^(i-1) Xi, i = 1, 2,…} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best model among these four models in analyzing the data from a series of events.

  3. The Relative Importance of the Service Sector in the Mexican Economy: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Flores

    2014-01-01

    We conduct a study of the secondary and tertiary sectors with the goal of highlighting the relative importance of services in the Mexican economy. We consider a time series analysis approach designed to identify the stochastic nature of the series, as well as to define their long-run and short-run relationships with Gross Domestic Product (GDP). The results of cointegration tests suggest that, for the most part, activities in the secondary and tertiary sectors share a common trend with GDP. Interestingly, the long-run elasticities of GDP with respect to services are on average larger than those with respect to secondary activities. Common cycle test results identify the existence of common cycles between GDP and the disaggregated sectors, as well as with manufacturing, commerce, real estate and transportation. In this case, the short-run elasticities of secondary activities are on average larger than those corresponding to services.

  4. Investigation of interfacial wave structure using time-series analysis techniques

    International Nuclear Information System (INIS)

    Jayanti, S.; Hewitt, G.F.; Cliffe, K.A.

    1990-09-01

    The report presents an investigation into the interfacial structure in horizontal annular flow using spectral and time-series analysis techniques. Film thickness measured using conductance probes shows an interesting transition in wave pattern from a continuous low-frequency wave pattern to an intermittent, high-frequency one. From the autospectral density function of the film thickness, it appears that this transition is caused by the breaking up of long waves into smaller ones. To investigate the possibility of the wave structure being represented as a low order chaotic system, phase portraits of the time series were constructed using the technique developed by Broomhead and co-workers (1986, 1987 and 1989). These showed a banded structure when waves of relatively high frequency were filtered out. Although these results are encouraging, further work is needed to characterise the attractor. (Author)

  5. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j=1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades the classical approaches such as Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
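
    A direct numerical transcription of the estimator quoted above (under the stated linear assumptions), using Euler forward differences for dX1/dt; the coupled test series are synthetic, with X2 driving X1 one-way.

        import numpy as np

        def liang_T21(x1, x2, dt=1.0):
            """Estimator of the information flow T_{2->1} from series x2 to series x1."""
            x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
            dx1 = (x1[1:] - x1[:-1]) / dt                 # Euler forward difference of X1
            x1, x2 = x1[:-1], x2[:-1]
            C = np.cov(x1, x2)                            # sample covariances C11, C12, C22
            C11, C12, C22 = C[0, 0], C[0, 1], C[1, 1]
            C1d1 = np.cov(x1, dx1)[0, 1]                  # Cov(X1, dX1/dt)
            C2d1 = np.cov(x2, dx1)[0, 1]                  # Cov(X2, dX1/dt)
            return (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11**2 * C22 - C11 * C12**2)

        # Synthetic one-way coupled system: X2 drives X1, not the other way round
        rng = np.random.default_rng(13)
        n = 20000
        x1, x2 = np.zeros(n), np.zeros(n)
        for t in range(1, n):
            x2[t] = 0.7 * x2[t - 1] + rng.normal()
            x1[t] = 0.5 * x1[t - 1] + 0.4 * x2[t - 1] + rng.normal()

        print("T_2->1 = %.3f   T_1->2 = %.3f" % (liang_T21(x1, x2), liang_T21(x2, x1)))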

  6. Phase correction and error estimation in InSAR time series analysis

    Science.gov (United States)

    Zhang, Y.; Fattahi, H.; Amelung, F.

    2017-12-01

    During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategy of SAR satellites, such as large spatial and temporal baselines with non-regular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least squares inversion of an over-determined system. Such robust inversion allows us to focus more on the understanding of the different components in InSAR time series and their uncertainties. We present an open-source python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data and quantifying the InSAR uncertainty. Our implemented strategy contains several features including: 1) improved spatial coverage using a coherence-based network of interferograms, 2) unwrapping error correction using phase closure or bridging, 3) tropospheric delay correction using weather models and empirical approaches, 4) DEM error correction, 5) optimal selection of reference date and automatic outlier detection, 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error, and 7) variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by Cosmo-SkyMed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our results show precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b), with a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. (2017, GRL); and a post-eruptive subsidence on the same

  7. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series containing plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. This implies that time series generated by the same model are more likely to share similar irregularity than others, and that differences between stock indices, which are caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through the analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in the experiments than MDSC.
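
    A minimal sketch of the final embedding step: given a precomputed dissimilarity matrix (which in the paper comes from the modified cross-sample entropies between index return series), MDS produces the three-dimensional perceptual map. The random dissimilarities below merely stand in for the entropy-based ones, and scikit-learn is assumed.

        import numpy as np
        from sklearn.manifold import MDS

        rng = np.random.default_rng(14)
        n_markets = 18
        # Stand-in dissimilarity matrix: symmetric, zero diagonal (would come from cross-sample entropy)
        D = rng.uniform(0.1, 1.0, size=(n_markets, n_markets))
        D = (D + D.T) / 2
        np.fill_diagonal(D, 0.0)

        # Three-dimensional perceptual map from the precomputed dissimilarities
        mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
        coords = mds.fit_transform(D)
        print(coords.shape, "stress = %.3f" % mds.stress_)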

  8. Interpretation of the auto-mutual information rate of decrease in the context of biomedical signal analysis. Application to electroencephalogram recordings

    International Nuclear Information System (INIS)

    Escudero, Javier; Hornero, Roberto; Abásolo, Daniel

    2009-01-01

    The mutual information (MI) is a measure of both linear and nonlinear dependences. It can be applied to a time series and a time-delayed version of the same sequence to compute the auto-mutual information function (AMIF). Moreover, the AMIF rate of decrease (AMIFRD) with increasing time delay in a signal is correlated with its entropy and has been used to characterize biomedical data. In this paper, we aimed at gaining insight into the dependence of the AMIFRD on several signal processing concepts and at illustrating its application to biomedical time series analysis. Thus, we have analysed a set of synthetic sequences with the AMIFRD. The results show that the AMIF decreases more quickly as bandwidth increases and that the AMIFRD becomes more negative as there is more white noise contaminating the time series. Additionally, this metric detected changes in the nonlinear dynamics of a signal. Finally, in order to illustrate the analysis of real biomedical signals with the AMIFRD, this metric was applied to electroencephalogram (EEG) signals acquired with eyes open and closed and to ictal and non-ictal intracranial EEG recordings
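
    A minimal sketch of the auto-mutual information function and its rate of decrease: MI between a signal and delayed copies of itself, estimated with a simple histogram estimator, followed by a linear fit over the first few lags. The bin count, lag range, fit length, and the synthetic series are illustrative choices, not those of the paper.

        import numpy as np

        def mutual_information(x, y, bins=16):
            """Histogram estimate of the mutual information (in nats) between two series."""
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0
            return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

        def amif(x, max_lag=20):
            """Auto-mutual information function for lags 1..max_lag."""
            return np.array([mutual_information(x[:-k], x[k:]) for k in range(1, max_lag + 1)])

        rng = np.random.default_rng(15)
        t = np.arange(5000)
        eeg_like = np.sin(0.1 * t) + 0.5 * rng.normal(size=t.size)   # synthetic, vaguely EEG-like series

        curve = amif(eeg_like)
        # Rate of decrease: slope of a linear fit of the AMIF over the first few lags
        rate = np.polyfit(np.arange(1, 6), curve[:5], 1)[0]
        print("AMIF rate of decrease over lags 1-5: %.3f nats/lag" % rate)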

  9. Interpretation of the auto-mutual information rate of decrease in the context of biomedical signal analysis. Application to electroencephalogram recordings.

    Science.gov (United States)

    Escudero, Javier; Hornero, Roberto; Abásolo, Daniel

    2009-02-01

    The mutual information (MI) is a measure of both linear and nonlinear dependences. It can be applied to a time series and a time-delayed version of the same sequence to compute the auto-mutual information function (AMIF). Moreover, the AMIF rate of decrease (AMIFRD) with increasing time delay in a signal is correlated with its entropy and has been used to characterize biomedical data. In this paper, we aimed at gaining insight into the dependence of the AMIFRD on several signal processing concepts and at illustrating its application to biomedical time series analysis. Thus, we have analysed a set of synthetic sequences with the AMIFRD. The results show that the AMIF decreases more quickly as bandwidth increases and that the AMIFRD becomes more negative as there is more white noise contaminating the time series. Additionally, this metric detected changes in the nonlinear dynamics of a signal. Finally, in order to illustrate the analysis of real biomedical signals with the AMIFRD, this metric was applied to electroencephalogram (EEG) signals acquired with eyes open and closed and to ictal and non-ictal intracranial EEG recordings.

  10. Statistical 21-cm Signal Separation via Gaussian Process Regression Analysis

    Science.gov (United States)

    Mertens, F. G.; Ghosh, A.; Koopmans, L. V. E.

    2018-05-01

    Detecting and characterizing the Epoch of Reionization and Cosmic Dawn via the redshifted 21-cm hyperfine line of neutral hydrogen will revolutionize the study of the formation of the first stars, galaxies, black holes and intergalactic gas in the infant Universe. The wealth of information encoded in this signal is, however, buried under foregrounds that are many orders of magnitude brighter. These must be removed accurately and precisely in order to reveal the feeble 21-cm signal. This requires not only the modeling of the Galactic and extra-galactic emission, but also of the often stochastic residuals due to imperfect calibration of the data caused by ionospheric and instrumental distortions. To stochastically model these effects, we introduce a new method based on `Gaussian Process Regression' (GPR) which is able to statistically separate the 21-cm signal from most of the foregrounds and other contaminants. Using simulated LOFAR-EoR data that include strong instrumental mode-mixing, we show that this method is capable of recovering the 21-cm signal power spectrum across the entire range k = 0.07 - 0.3 {h cMpc^{-1}}. The GPR method is most optimal, having minimal and controllable impact on the 21-cm signal, when the foregrounds are correlated on frequency scales ≳ 3 MHz and the rms of the signal has σ21cm ≳ 0.1 σnoise. This signal separation improves the 21-cm power-spectrum sensitivity by a factor ≳ 3 compared to foreground avoidance strategies and enables the sensitivity of current and future 21-cm instruments such as the Square Kilometre Array to be fully exploited.
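
    A toy one-dimensional sketch of the GPR idea: the spectrally smooth foreground along frequency is modeled with a long-coherence-scale RBF kernel plus a white-noise term, and the fitted smooth component is subtracted to leave a residual containing the (here synthetic) small-scale 21-cm-like signal. The real analysis is multi-baseline and uses carefully chosen covariance functions; scikit-learn and all values below are assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(16)
        freq = np.linspace(120.0, 160.0, 200)                        # MHz
        # Smooth foreground (coherent over many MHz) + faint small-scale "21-cm" signal + noise
        foreground = 50.0 * (freq / 140.0) ** (-2.5)
        signal_21cm = 0.05 * np.sin(2 * np.pi * freq / 1.5)
        data = foreground + signal_21cm + 0.02 * rng.normal(size=freq.size)

        # GP with a long frequency-coherence-scale kernel for the foreground, plus white noise
        kernel = 1.0 * RBF(length_scale=10.0, length_scale_bounds=(5.0, 50.0)) \
                 + WhiteKernel(noise_level=1e-3)
        gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gpr.fit(freq[:, None], data)

        fg_model = gpr.predict(freq[:, None])
        residual = data - fg_model                                   # should retain the small-scale signal
        print("rms of residual: %.3f  (injected 21-cm rms: %.3f)"
              % (residual.std(), signal_21cm.std()))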

  11. Analysis of cyclical behavior in time series of stock market returns

    Science.gov (United States)

    Stratimirović, Djordje; Sarvan, Darko; Miljković, Vladimir; Blesić, Suzana

    2018-01-01

    In this paper we have analyzed scaling properties and cyclical behavior of the three types of stock market indexes (SMI) time series: data belonging to stock markets of developed economies, emerging economies, and of the underdeveloped or transitional economies. We have used two techniques of data analysis to obtain and verify our findings: the wavelet transform (WT) spectral analysis to identify cycles in the SMI returns data, and the time-dependent detrended moving average (tdDMA) analysis to investigate local behavior around market cycles and trends. We found cyclical behavior in all SMI data sets that we have analyzed. Moreover, the positions and the boundaries of the cyclical intervals that we found seem to be common for all markets in our dataset. We list and illustrate the presence of nine such periods in our SMI data. We report on the possibilities to differentiate between the level of growth of the analyzed markets by way of statistical analysis of the properties of wavelet spectra that characterize particular peak behaviors. Our results show that measures like the relative WT energy content and the relative WT amplitude of the peaks in the small scales region could be used to partially differentiate between market economies. Finally, we propose a way to quantify the level of development of a stock market based on estimation of local complexity of the market's SMI series. From the local scaling exponents calculated for our nine peak regions we have defined what we named the Development Index, which proved, at least in the case of our dataset, to be suitable to rank the SMI series that we have analyzed in three distinct groups.

  12. Arab drama series content analysis from a transnational Arab identity perspective

    Directory of Open Access Journals (Sweden)

    Joelle Chamieh

    2016-04-01

    The scientific contribution of deciphering drama series falls under the discipline of understanding the narratology of distinctive cultures and traditions within the specific contexts of certain societies. This article spells out the interferences deployed by the provocations that are induced through the functions of values in modeling societies, as projected through the transmission of media. The proposed operational model consists of providing an a priori design of common Arab values assimilated into an innovative grid-analysis code book, which has enabled the execution of a systematic and reliable quantitative content analysis. Additionally, a more thorough qualitative content analysis has been implemented in terms of narratology, where actions have been evaluated based on the grid-analysis code book for a clearer perception of Arab values depicted within their context in the Arab drama milieu. This approach has been deployed on four Arab drama series covering the transnational/national and non-divisive/divisive media aspects, with the intention of extracting the transmitted values from a common identity perspective in order to reveal Arab people's expectations.

  13. Performance Improvement of Power Analysis Attacks on AES with Encryption-Related Signals

    Science.gov (United States)

    Lee, You-Seok; Lee, Young-Jun; Han, Dong-Guk; Kim, Ho-Won; Kim, Hyoung-Nam

    A power analysis attack is a well-known side-channel attack, but the efficiency of the attack is frequently degraded by the presence of power components in the measured signals that are unrelated to the encryption. To enhance the performance of the power analysis attack, we propose a preprocessing method based on extracting the encryption-related parts from the measured power signals. Experimental results show that attacks with the preprocessed signals detect correct keys with far fewer signals, compared to conventional power analysis attacks.

  14. Safety analysis of urban signalized intersections under mixed traffic.

    Science.gov (United States)

    S, Anjana; M V L R, Anjaneyulu

    2015-02-01

    This study examined the crash causative factors of signalized intersections under mixed traffic using advanced statistical models. Hierarchical Poisson regression and logistic regression models were developed to predict the crash frequency and severity of signalized intersection approaches. The prediction models helped to develop general safety countermeasures for signalized intersections. The study shows that exclusive left turn lanes and countdown timers are beneficial for improving the safety of signalized intersections. Safety is also influenced by the presence of a surveillance camera, green time, median width, traffic volume, and proportion of two wheelers in the traffic stream. The factors that influence the severity of crashes were also identified in this study. As a practical application, the safe values of deviation of green time provided from design green time, with varying traffic volume, is presented in this study. This is a useful tool for setting the appropriate green time for a signalized intersection approach with variations in the traffic volume. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Comparison of causality analysis on simultaneously measured fMRI and NIRS signals during motor tasks.

    Science.gov (United States)

    Anwar, Abdul Rauf; Muthalib, Makii; Perrey, Stephane; Galka, Andreas; Granert, Oliver; Wolff, Stephan; Deuschl, Guenther; Raethjen, Jan; Heute, Ulrich; Muthuraman, Muthuraman

    2013-01-01

    Brain activity can be measured using different modalities. Since most of the modalities tend to complement each other, it seems promising to measure them simultaneously. In the research presented here, the data recorded simultaneously from Functional Magnetic Resonance Imaging (fMRI) and Near Infrared Spectroscopy (NIRS) are subjected to causality analysis using time-resolved partial directed coherence (tPDC). Time-resolved partial directed coherence uses the principle of state space modelling to estimate Multivariate Autoregressive (MVAR) coefficients. This method is useful to visualize both the frequency and time dynamics of causality between the time series. Afterwards, the causality results from the different modalities are compared by estimating the Spearman correlation. In the present study, we used directionality vectors to analyze correlation, rather than the actual signal vectors. Results show that the causality analysis of the fMRI correlates more closely with the causality results of oxy-NIRS than with those of deoxy-NIRS in the case of a finger sequencing task. However, in the case of simple finger tapping, no clear difference between oxy-fMRI and deoxy-fMRI correlation is identified.

  16. Application on technique of joint time-frequency analysis of seismic signal's first arrival estimation

    International Nuclear Information System (INIS)

    Xu Chaoyang; Liu Junmin; Fan Yanfang; Ji Guohua

    2008-01-01

    Joint time-frequency analysis constructs a joint density function of time and frequency. It can reveal a signal's frequency components and their evolution over time, and is a natural development of Fourier analysis. In this paper, taking into account the noise characteristics of seismic signals, an estimation method for a seismic signal's first arrival based on the triple correlation of the joint time-frequency spectrum is introduced, and experimental results and conclusions are presented. (authors)

  17. Analysis of degree of nonlinearity and stochastic nature of HRV signal during meditation using delay vector variance method.

    Science.gov (United States)

    Reddy, L Ram Gopal; Kuntamalla, Srinivas

    2011-01-01

    Heart rate variability analysis is fast gaining acceptance as a potential non-invasive means of autonomic nervous system assessment in research as well as clinical domains. In this study, a new nonlinear analysis method is used to detect the degree of nonlinearity and the stochastic nature of heart rate variability signals during two forms of meditation (Chi and Kundalini). The data are taken from a widely used public online database (the MIT-BIH PhysioNet database). The method used is the delay vector variance (DVV) method, a unified approach for detecting the presence of determinism and nonlinearity in a time series, based on examining the local predictability of the signal. The results show a significant change in the nonlinearity and stochastic nature of the signal before and during meditation (p value > 0.01). During Chi meditation there is an increase in the stochastic nature and a decrease in the nonlinear nature of the signal. During Kundalini meditation there is a significant decrease in both the degree of nonlinearity and the stochastic nature.
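
    The core of the DVV method can be sketched compactly: embed the series in delay vectors and track how the variance of the delay vectors' targets shrinks as the neighbourhood around each vector is tightened. The minimal implementation below uses illustrative parameters and an AR(1) toy series in place of RR-interval data, and omits the surrogate comparison that the full method uses to judge nonlinearity.

```python
# Minimal sketch of the delay vector variance (DVV) idea for a single time
# series (no surrogate comparison shown): embed the signal in delay vectors,
# then measure how the variance of the "targets" of neighbouring delay
# vectors shrinks as the neighbourhood gets tighter. Parameters are illustrative.
import numpy as np

def dvv(x, m=3, n_spans=25, min_neighbours=30):
    x = np.asarray(x, dtype=float)
    # Delay vectors and their targets.
    dv = np.array([x[k - m:k] for k in range(m, len(x))])
    targets = x[m:]
    # Pairwise distances between delay vectors.
    d = np.sqrt(((dv[:, None, :] - dv[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(dv), k=1)
    mu, sigma = d[iu].mean(), d[iu].std()
    spans = np.linspace(mu - 2 * sigma, mu + 2 * sigma, n_spans)
    total_var = targets.var()
    tv = []
    for tau in spans:
        variances = []
        for k in range(len(dv)):
            idx = np.where(d[k] <= tau)[0]
            if len(idx) >= min_neighbours:
                variances.append(targets[idx].var())
        tv.append(np.mean(variances) / total_var if variances else np.nan)
    return spans, np.array(tv)

# Toy example: an AR(1) series stands in for an RR-interval (HRV) sequence.
rng = np.random.default_rng(3)
x = np.zeros(600)
for t in range(1, len(x)):
    x[t] = 0.8 * x[t - 1] + rng.normal()

spans, target_variance = dvv(x)
print(np.round(target_variance, 3))
```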

  18. Conditional Random Fields for Morphological Analysis of Wireless ECG Signals

    Science.gov (United States)

    Natarajan, Annamalai; Gaiser, Edward; Angarita, Gustavo; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin

    2015-01-01

    Thanks to advances in mobile sensing technologies, it has recently become practical to deploy wireless electrocardiograph sensors for continuous recording of ECG signals. This capability has diverse applications in the study of human health and behavior, but to realize its full potential, new computational tools are required to deal effectively with the uncertainty that results from the noisy and highly non-stationary signals collected by these devices. In this work, we present a novel approach to extracting the morphological structure of ECG signals based on dynamically structured conditional random field (CRF) models. We apply this framework to wireless ECG sensor data collected in a lab-based study of habituated cocaine users. Our results show that the proposed CRF-based approach significantly outperforms independent prediction models using the same features, as well as a widely cited open-source toolkit. PMID:26726321
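
    The sequence-labeling setup behind this kind of model can be illustrated with a plain linear-chain CRF, a simpler relative of the dynamically structured CRFs used in the paper. The sketch below assumes the third-party sklearn-crfsuite package and labels synthetic, hand-constructed "beats" with illustrative wave labels; it shows only how per-sample features and labels are wired into a CRF, not the authors' model.

```python
# Sketch of ECG sequence labeling with a plain linear-chain CRF
# (sklearn-crfsuite assumed installed). Signals, features and labels are
# synthetic and illustrative only.
import numpy as np
import sklearn_crfsuite

def sample_features(sig, i):
    """Simple per-sample features: amplitude and local slope."""
    prev = sig[i - 1] if i > 0 else sig[i]
    return {
        "amp": float(sig[i]),
        "slope": float(sig[i] - prev),
        "abs_amp": float(abs(sig[i])),
    }

def to_features(sig):
    return [sample_features(sig, i) for i in range(len(sig))]

# Toy training data: synthetic "beats" with hand-assigned wave labels.
rng = np.random.default_rng(4)
def toy_beat():
    sig = np.concatenate([
        0.1 * np.sin(np.linspace(0, np.pi, 20)),      # P wave
        1.0 * np.sin(np.linspace(0, np.pi, 10)),      # QRS (crude)
        0.3 * np.sin(np.linspace(0, np.pi, 30)),      # T wave
        np.zeros(20),                                 # baseline
    ]) + 0.02 * rng.normal(size=80)
    labels = ["P"] * 20 + ["QRS"] * 10 + ["T"] * 30 + ["baseline"] * 20
    return sig, labels

train = [toy_beat() for _ in range(50)]
X_train = [to_features(sig) for sig, _ in train]
y_train = [labels for _, labels in train]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
crf.fit(X_train, y_train)

test_sig, test_labels = toy_beat()
pred = crf.predict([to_features(test_sig)])[0]
accuracy = np.mean([p == t for p, t in zip(pred, test_labels)])
print(f"toy per-sample accuracy: {accuracy:.2f}")
```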

  19. Characterization of Land Transitions Patterns from Multivariate Time Series Using Seasonal Trend Analysis and Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Benoit Parmentier

    2014-12-01

    Characterizing biophysical changes in land change areas over large regions with short and noisy multivariate time series and multiple temporal parameters remains a challenging task. Most studies focus on detection rather than characterization, i.e., the manner in which surface state variables are altered by the change process. In this study, a procedure is presented to extract and characterize simultaneous temporal changes in MODIS multivariate time series of three surface state variables: the Normalized Difference Vegetation Index (NDVI), land surface temperature (LST) and albedo (ALB). The analysis involves conducting a seasonal trend analysis (STA) to extract three seasonal shape parameters (Amplitude 0, Amplitude 1 and Amplitude 2) and using principal component analysis (PCA) to contrast trends in change and no-change areas. We illustrate the method by characterizing trends in burned and unburned pixels in Alaska over the 2001–2009 time period. Findings show consistent and meaningful extraction of temporal patterns related to fire disturbances. The first principal component (PC1) is characterized by a decrease in mean NDVI (Amplitude 0) with a concurrent increase in albedo (both the mean and the annual amplitude) and an increase in LST annual variability (Amplitude 1). These results provide systematic empirical evidence of surface changes associated with one type of land change, fire disturbance, and suggest that STA combined with PCA may be used to characterize many other types of land transitions over large landscape areas using multivariate Earth observation time series.
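
    A compressed version of this workflow, under stated simplifications, is sketched below: a two-harmonic seasonal model is fitted per pixel to recover Amplitude 0 (mean level), Amplitude 1 (annual) and Amplitude 2 (semi-annual), and a PCA is then run over the stacked amplitudes. The synthetic series stand in for MODIS observations, and the per-year trend component of the paper's seasonal trend analysis is not reproduced.

```python
# Simplified sketch: per-pixel harmonic fit for Amplitude 0/1/2, followed by a
# PCA over the amplitude features. Data are synthetic stand-ins for MODIS
# NDVI-like series; the yearly trend step of the paper's STA is omitted.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n_pixels, n_obs = 300, 9 * 23                       # ~23 composites/year for 9 years
t = np.arange(n_obs) / 23.0                         # time in years

# Design matrix for mean + annual + semi-annual harmonics.
G = np.column_stack([
    np.ones(n_obs),
    np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
    np.cos(4 * np.pi * t), np.sin(4 * np.pi * t),
])

# Synthetic "NDVI-like" series: half the pixels get a post-fire drop in the mean.
burned = np.arange(n_pixels) < n_pixels // 2
series = 0.6 + 0.2 * np.sin(2 * np.pi * t) + 0.05 * rng.normal(size=(n_pixels, n_obs))
series[burned] -= 0.2 * (t >= 4.0)                  # step change after a "fire"

coef, *_ = np.linalg.lstsq(G, series.T, rcond=None) # shape (5, n_pixels)
amp0 = coef[0]                                      # mean level
amp1 = np.hypot(coef[1], coef[2])                   # annual amplitude
amp2 = np.hypot(coef[3], coef[4])                   # semi-annual amplitude

features = np.column_stack([amp0, amp1, amp2])
pca = PCA(n_components=2)
scores = pca.fit_transform((features - features.mean(0)) / features.std(0))
print("PC1 separation (burned vs unburned):",
      scores[burned, 0].mean() - scores[~burned, 0].mean())
```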

  20. Reference analysis of the signal + background model in counting experiments

    Science.gov (United States)

    Casadei, D.

    2012-01-01

    The model representing two independent Poisson processes, labelled "signal" and "background", both contributing additively to the total number of counted events, is considered from a Bayesian point of view. This is a widely used model for searches for rare or exotic events in the presence of a background source, for example in the searches performed by high-energy physics experiments. Under the assumption of prior knowledge about the background yield, a reference prior is obtained for the signal alone and its properties are studied. Finally, the properties of the full solution, the marginal reference posterior, are illustrated with a few examples.
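
    The simplest limiting case of this setup is easy to sketch numerically: n counts observed from Poisson(s + b) with the background yield b treated as exactly known. In that one-parameter case the reference prior coincides with the Jeffreys prior, proportional to 1/sqrt(s + b); the paper's full treatment with an uncertain background and the resulting marginal reference posterior is not reproduced here, and all numbers below are illustrative.

```python
# Numerical sketch of the exactly-known-background case: n ~ Poisson(s + b),
# reference (= Jeffreys) prior pi(s) ~ 1/sqrt(s + b), posterior on a grid.
import numpy as np
from scipy.stats import poisson

n_obs = 10          # observed counts (illustrative)
b = 3.2             # assumed known background yield

s = np.linspace(0, 30, 3001)                  # signal grid
prior = 1.0 / np.sqrt(s + b)
likelihood = poisson.pmf(n_obs, s + b)
post = prior * likelihood
post /= np.trapz(post, s)                     # normalize on the grid

mean = np.trapz(s * post, s)
cdf = np.cumsum(post) * (s[1] - s[0])
upper95 = s[np.searchsorted(cdf, 0.95)]
print(f"posterior mean ~ {mean:.2f}, 95% upper bound ~ {upper95:.2f}")
```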