WorldWideScience

Sample records for series analysis visualizing

  1. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize or look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  2. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets: postoperative pain scores and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
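
    The core pipeline described above (symbolic discretization followed by transition counting) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the segment count, alphabet size, and breakpoint table are assumptions chosen for the example.

```python
import numpy as np

# Gaussian breakpoints for an alphabet of 4 equiprobable symbols
# (the standard SAX lookup table for alphabet size 4).
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])

def sax_symbols(x, n_segments=8):
    """Z-normalize, piecewise-aggregate into segment means, and map each
    mean to an integer symbol 0..3 via the Gaussian breakpoints."""
    x = (x - np.mean(x)) / (np.std(x) + 1e-12)
    means = np.array([s.mean() for s in np.array_split(x, n_segments)])
    return np.searchsorted(BREAKPOINTS, means)

def transition_frequencies(symbols, alphabet_size=4):
    """Count symbol-to-symbol transitions and normalize -- one 'bag of
    transition patterns' per series, which an icon can then display."""
    counts = np.zeros((alphabet_size, alphabet_size))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / max(counts.sum(), 1.0)

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.1 * rng.standard_normal(200)
freq = transition_frequencies(sax_symbols(series))
```

    Averaging such matrices over the series of one patient group gives the group-level frequencies that the icons visualize.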

  3. Visualizing Series

    Science.gov (United States)

    Unal, Hasan

    2008-01-01

    The importance of visualisation and multiple representations in mathematics has been stressed, especially in a context of problem solving. Hanna and Sidoli comment that "Diagrams and other visual representations have long been welcomed as heuristic accompaniments to proof, where they not only facilitate the understanding of theorems and their…

  4. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available. In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  5. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
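
    The first analysis step, fitting a power law to the Fourier amplitude spectrum of an impulse train, can be sketched as follows. The signal, event count, and Pareto-distributed amplitudes are illustrative assumptions for the example, not the tornado data.

```python
import numpy as np

def amplitude_spectrum_powerlaw(signal, dt=1.0):
    """Fit |F(f)| ~ c * f**b by least squares in log-log space, a common
    way to read a power-law signature off an amplitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    f, a = freqs[1:], spectrum[1:]          # drop the DC bin
    keep = a > 0                            # guard against log(0)
    b, log_c = np.polyfit(np.log(f[keep]), np.log(a[keep]), 1)
    return b, np.exp(log_c)

# An impulse train with random amplitudes as a stand-in for an event sequence.
rng = np.random.default_rng(3)
signal = np.zeros(1024)
signal[rng.choice(1024, size=64, replace=False)] = rng.pareto(2.0, size=64)
slope, scale = amplitude_spectrum_powerlaw(signal)
```

    The fitted exponent `b` plays the role of the spectral signature that the paper interprets across the 64 years of records.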

  6. Visualizing the Geometric Series.

    Science.gov (United States)

    Bennett, Albert B., Jr.

    1989-01-01

    Mathematical proofs often leave students unconvinced or without understanding of what has been proved, because they provide no visual-geometric representation. Presented are geometric models for the finite geometric series when r is a whole number, and the infinite geometric series when r is the reciprocal of a whole number. (MNS)
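
    The geometric models correspond to the standard algebraic identities, with r a whole number in the finite case and r = 1/k (k a whole number greater than 1) in the infinite case:

```latex
\sum_{i=0}^{n-1} r^i = \frac{r^n - 1}{r - 1} \quad (r \neq 1),
\qquad
\sum_{i=1}^{\infty} \left(\frac{1}{k}\right)^i = \frac{1}{k - 1} \quad (k > 1).
```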

  7. Visualization of time series statistical data by shape analysis (GDP ratio changes among Asia countries)

    Science.gov (United States)

    Shirota, Yukari; Hashimoto, Takako; Fitri Sari, Riri

    2018-03-01

    Visualizing time-series big data has become highly significant. In this paper we discuss a new analysis method called “statistical shape analysis” or “geometry driven statistics” applied to time-series statistical data in economics. We analyse changes in agriculture value added and industry value added (as a percentage of GDP) from 2000 to 2010 in Asia. We handle the data as a set of landmarks on a two-dimensional image and examine the deformation using principal components. The key point of the analysis method is that the principal components of a given formation are eigenvectors of its bending energy matrix. The local deformation can be expressed as a set of non-affine transformations, which give us information about the local differences between 2000 and 2010. Because a non-affine transformation can be decomposed into a set of partial warps, we present the partial warps visually. Statistical shape analysis is widely used in biology but, in economics, no application can be found. In this paper, we investigate its potential for analysing economic data.

  8. Development of analysis software for radiation time-series data with the use of visual studio 2005

    International Nuclear Information System (INIS)

    Hohara, Sin-ya; Horiguchi, Tetsuo; Ito, Shin

    2008-01-01

    Time-series analysis supplies a new vision that conventional analysis methods such as energy spectroscopy have never achieved. However, applying time-series analysis to radiation measurements requires considerable software and hardware development effort. By taking advantage of Visual Studio 2005, we developed analysis software, 'ListFileConverter', for the time-series radiation measurement system called 'MPA-3'. The software is based on a graphical user interface (GUI) architecture that enables us to save a large amount of operation time in the analysis and, moreover, provides easy access to the special file structure of MPA-3 data. In this paper, the detailed structure of ListFileConverter is fully explained, and experimental results for the counting capability of the MPA-3 hardware system and for neutron measurements with our UTR-KINKI reactor are also given. (author)

  9. IDP camp evolvement analysis in Darfur using VHSR optical satellite image time series and scientific visualization on virtual globes

    Science.gov (United States)

    Tiede, Dirk; Lang, Stefan

    2010-11-01

    In this paper we focus on the application of transferable, object-based image analysis algorithms for dwelling extraction in a camp for internally displaced people (IDP) in Darfur, Sudan, along with innovative means for scientific visualisation of the results. Three very high spatial resolution satellite images (QuickBird: 2002, 2004, 2008) were used for: (1) extracting different types of dwellings and (2) calculating and visualizing added-value products such as dwelling density and camp structure. The results were visualized on virtual globes (Google Earth and ArcGIS Explorer), revealing the analysis results as analytical 3D views with values transformed into the third dimension (z-value). Data formats depend on the virtual globe software and include KML/KMZ (keyhole mark-up language) and ESRI 3D shapefiles streamed as an ArcGIS Server-based globe service. In addition, means for improving the overall performance of automated extraction of dwelling structures using grid computing techniques are discussed, using examples from a similar study.

  10. Visualization analysis and design

    CERN Document Server

    Munzner, Tamara

    2015-01-01

    Visualization Analysis and Design provides a systematic, comprehensive framework for thinking about visualization in terms of principles and design choices. The book features a unified approach encompassing information visualization techniques for abstract data, scientific visualization techniques for spatial data, and visual analytics techniques for interweaving data transformation and analysis with interactive visual exploration. It emphasizes the careful validation of effectiveness and the consideration of function before form. The book breaks down visualization design according to three questions: what data users need to see, why users need to carry out their tasks, and how the visual representations proposed can be constructed and manipulated. It walks readers through the use of space and color to visually encode data in a view, the trade-offs between changing a single view and using multiple linked views, and the ways to reduce the amount of data shown in each view. The book concludes with six case stu...

  11. FTSPlot: fast time series visualization for large datasets.

    Directory of Open Access Journals (Sweden)

    Michael Riss

    Full Text Available. The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n log n); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with delays of < 20 ms. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes; on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes (1 TiB), or 1.3 x 10^11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
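
    The hierarchic level-of-detail idea behind this kind of constant-time browsing can be sketched with a min/max pyramid: summaries are precomputed once, and each screen refresh reads only about as many blocks as there are pixels. This is a simplified illustration of the general technique, not FTSPlot's actual data format.

```python
import numpy as np

def build_minmax_pyramid(data):
    """Precompute hierarchic min/max summaries; each level halves the
    block count. Built once, analogous to the preprocessing step."""
    levels = [np.stack([data, data])]  # level 0: min == max == raw samples
    cur_min, cur_max = data, data
    while len(cur_min) > 1:
        n = len(cur_min) // 2 * 2      # drop a trailing odd sample
        cur_min = np.minimum(cur_min[:n:2], cur_min[1:n:2])
        cur_max = np.maximum(cur_max[:n:2], cur_max[1:n:2])
        levels.append(np.stack([cur_min, cur_max]))
    return levels

def query(levels, start, stop, width):
    """Return a min/max envelope of [start, stop) using roughly `width`
    blocks; the work depends on `width`, not on the series length."""
    span = stop - start
    level = int(np.floor(np.log2(max(span // width, 1))))
    level = min(max(level, 0), len(levels) - 1)
    lo = levels[level][0][start >> level: stop >> level]
    hi = levels[level][1][start >> level: stop >> level]
    return lo, hi

rng = np.random.default_rng(1)
data = rng.standard_normal(1 << 16)
levels = build_minmax_pyramid(data)
lo, hi = query(levels, 0, 1 << 16, 512)
```

    Drawing the (lo, hi) envelope per pixel column reproduces the familiar zoomed-out waveform view at any scale.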

  12. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  13. Visual physics analysis VISPA

    International Nuclear Information System (INIS)

    Actis, Oxana; Brodski, Michael; Erdmann, Martin; Fischer, Robert; Hinzmann, Andreas; Mueller, Gero; Muenzer, Thomas; Plum, Matthias; Steggemann, Jan; Winchen, Tobias; Klimkovich, Tatsiana

    2010-01-01

    VISPA is a development environment for high energy physics analyses which enables physicists to combine graphical and textual work. A physics analysis cycle consists of prototyping, performing, and verifying the analysis. The main feature of VISPA is a multipurpose window for visual steering of analysis steps, creation of analysis templates, and browsing physics event data at different steps of an analysis. VISPA follows an experiment-independent approach and incorporates various tools for steering and controlling required in a typical analysis. Connection to different frameworks of high energy physics experiments is achieved by using different types of interfaces. We present the look-and-feel for an example physics analysis at the LHC and explain the underlying software concepts of VISPA.

  14. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  15. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    Science.gov (United States)

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
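
    The two building blocks of the method, coarse-graining and the Poincaré delay map, are simple to sketch. The synthetic RR series and the scales below are illustrative assumptions for the example, not data from the study.

```python
import numpy as np

def coarse_grain(rr, scale):
    """Average non-overlapping windows of length `scale` -- the multiscale
    step that produces one series per time scale."""
    n = len(rr) // scale
    return rr[:n * scale].reshape(n, scale).mean(axis=1)

def poincare_points(rr):
    """Return the (RR[i], RR[i+1]) delay-map coordinates for plotting."""
    return np.column_stack([rr[:-1], rr[1:]])

# Synthetic interbeat intervals in seconds (illustrative only).
rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * rng.standard_normal(1000)
points_by_scale = {s: poincare_points(coarse_grain(rr, s)) for s in (1, 2, 4, 8)}
```

    Plotting each scale's point cloud side by side and comparing the occupied phase-space area is the visual comparison the abstract describes for 1/f versus white noise.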

  16. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  17. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher

    1989-01-01

    A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  18. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respect...

  19. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  20. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  1. Visual Interactive Analysis

    DEFF Research Database (Denmark)

    Kirchmeier-Andersen, Sabine; Møller Christensen, Jakob; Lihn Jensen, Bente

    2004-01-01

    This article presents the latest version of VIA (version 3.0). The development of the program was initiated by a demand for more systematic training of language analysis in high schools and universities. The system is now web-based, which enables teachers and students to share exercises across...

  2. Visual Analysis of Humans

    CERN Document Server

    Moeslund, Thomas B

    2011-01-01

    This unique text/reference provides a coherent and comprehensive overview of all aspects of video analysis of humans. Broad in coverage and accessible in style, the text presents original perspectives collected from preeminent researchers gathered from across the world. In addition to presenting state-of-the-art research, the book reviews the historical origins of the different existing methods, and predicts future trends and challenges. This title: features a Foreword by Professor Larry Davis; contains contributions from an international selection of leading authorities in the field; includes

  3. Consistent two-dimensional visualization of protein-ligand complex series

    Directory of Open Access Journals (Sweden)

    Stierand Katrin

    2011-06-01

    Full Text Available. Background: The comparative two-dimensional graphical representation of protein-ligand complex series featuring different ligands bound to the same active site offers a quick insight into their binding mode differences. Compared to arbitrary orientations of the residue molecules in the individual complex depictions, a consistent placement improves the legibility and comparability within the series. The automatic generation of such consistent layouts makes it possible to apply the approach to large data sets originating from computer-aided drug design methods. Results: We developed a new approach which automatically generates a consistent layout of interacting residues for a given series of complexes. Based on the three-dimensional structural input information, a global two-dimensional layout for all residues of the complex ensemble is computed. The algorithm incorporates the three-dimensional adjacencies of the active site residues in order to find a universally valid circular arrangement of the residues around the ligand. Following a two-dimensional ligand superimposition step, a global placement for each residue is derived from the set of already placed ligands. The method generates high-quality layouts, showing mostly overlap-free solutions, with molecules displayed as structure diagrams providing interaction information in atomic detail. Application examples document an improved legibility compared to series of diagrams whose layouts are calculated independently from each other. Conclusions: The presented method extends the field of complex series visualizations. A series of molecules binding to the same protein active site is drawn in a graphically consistent way. Compared to existing approaches, these drawings substantially simplify the visual analysis of large compound series.

  4. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  5. A System for Acquisition, Processing and Visualization of Image Time Series from Multiple Camera Networks

    Directory of Open Access Journals (Sweden)

    Cemal Melih Tanis

    2018-06-01

    Full Text Available. A system for multiple camera networks is proposed for continuous monitoring of ecosystems by processing image time series. The system is built around the Finnish Meteorological Image PROcessing Toolbox (FMIPROT), which includes data acquisition, processing and visualization from multiple camera networks. The toolbox has a user-friendly graphical user interface (GUI) that requires only minimal computer knowledge and skills. Images from camera networks are acquired and handled automatically according to common communication protocols, e.g., File Transfer Protocol (FTP). Processing features include GUI-based selection of the region of interest (ROI), an automatic analysis chain, and extraction of ROI-based indices such as the green fraction index (GF), red fraction index (RF), blue fraction index (BF), green-red vegetation index (GRVI), and green excess index (GEI), as well as a custom index defined by a user-provided mathematical formula. Analysis results are visualized on interactive plots both in the GUI and in hypertext markup language (HTML) reports. Users can implement their own algorithms to extract information from digital image series for any purpose. The toolbox can also be run in non-GUI mode, which allows series of analyses to run on servers unattended and on a schedule. The system is demonstrated using an environmental camera network in Finland.
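
    The ROI-based color indices named above follow standard definitions from the phenology-camera literature; a minimal sketch (with a made-up toy patch, not FMIPROT code) might look like:

```python
import numpy as np

def roi_indices(rgb):
    """Greenness indices over a region of interest given as an (H, W, 3)
    array of RGB digital numbers."""
    r, g, b = (rgb[..., i].astype(float).mean() for i in range(3))
    total = r + g + b
    return {
        "GF": g / total,            # green fraction
        "RF": r / total,            # red fraction
        "BF": b / total,            # blue fraction
        "GRVI": (g - r) / (g + r),  # green-red vegetation index
    }

# A toy 2x2 green-dominated patch standing in for a camera-image ROI.
patch = np.array([[[60, 120, 40], [50, 130, 30]],
                  [[70, 110, 50], [55, 125, 45]]], dtype=np.uint8)
idx = roi_indices(patch)
```

    Computing such indices per image and plotting them against time yields the seasonal greenness curves the toolbox visualizes.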

  6. SUBSURFACE VISUAL ALARM SYSTEM ANALYSIS

    International Nuclear Information System (INIS)

    D.W. Markman

    2001-01-01

    The "Subsurface Fire Hazard Analysis" (CRWMS M&O 1998, page 61) and the "Title III Evaluation Report for the Surface and Subsurface Communication System" (CRWMS M&O 1999a, pages 21 and 23) both indicate that the installed communication system is adequate to support Exploratory Studies Facility (ESF) activities, with the exception of the mine phone system for emergency notification purposes. They recommend the installation of a visual alarm system to supplement the page/party phone system. The purpose of this analysis is to identify data communication highway design approaches and to provide justification for the selected or recommended alternatives for the data communication of the subsurface visual alarm system. This analysis is being prepared to document a basis for the design selection of the data communication method. It will briefly describe existing data or voice communication or monitoring systems within the ESF and look at how these may be revised or adapted to support the needed data highway of the subsurface visual alarm system. The existing PLC communication system installed in the subsurface provides data communication for the alcove No. 5 ventilation fans, south portal ventilation fans, bulkhead doors, and the generator monitoring system. It is given that the data communication of the subsurface visual alarm system will be a digital system. It is also given that it is most feasible to take advantage of existing systems and equipment rather than consider an entirely new data communication system design and installation. The scope and primary objectives of this analysis are to: (1) Briefly review and describe existing available data communication highways or systems within the ESF. (2) Examine the technical characteristics of an existing system to disqualify a design alternative, which is paramount in minimizing the number and depth of system reviews.
    (3) Apply general engineering design practices or criteria such as relative cost, and degree

  7. Analysis of series resonant converter with series-parallel connection

    Science.gov (United States)

    Lin, Bor-Ren; Huang, Chien-Lan

    2011-02-01

    In this study, a parallel inductor-inductor-capacitor (LLC) resonant converter series-connected on the primary side and parallel-connected on the secondary side is presented for server power supply systems. Based on series resonant behaviour, the power metal-oxide-semiconductor field-effect transistors are turned on at zero voltage switching and the rectifier diodes are turned off at zero current switching. Thus, the switching losses on the power semiconductors are reduced. In the proposed converter, the primary windings of the two LLC converters are connected in series. Thus, the two converters have the same primary currents to ensure that they can supply the balance load current. On the output side, two LLC converters are connected in parallel to share the load current and to reduce the current stress on the secondary windings and the rectifier diodes. In this article, the principle of operation, steady-state analysis and design considerations of the proposed converter are provided and discussed. Experiments with a laboratory prototype with a 24 V/21 A output for server power supply were performed to verify the effectiveness of the proposed converter.
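
    The series resonant behaviour referred to above is governed by the standard tank resonance relation f_r = 1/(2π√(L_r·C_r)); the component values below are illustrative assumptions, not taken from the article.

```python
import math

def series_resonant_frequency(L_r, C_r):
    """Series resonant frequency of an L_r-C_r tank:
    f_r = 1 / (2 * pi * sqrt(L_r * C_r))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_r * C_r))

# Hypothetical component values for illustration: 60 uH, 24 nF.
f_r = series_resonant_frequency(60e-6, 24e-9)  # ~132 kHz
```

    Operating the switching frequency near f_r is what lets the MOSFETs turn on at zero voltage and the rectifier diodes turn off at zero current.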

  8. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about the analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...

  9. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  10. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  11. Metoprolol-induced visual hallucinations: a case series

    Directory of Open Access Journals (Sweden)

    Goldner Jonathan A

    2012-02-01

    Full Text Available. Introduction: Metoprolol is a widely used beta-adrenergic blocker that is commonly prescribed for a variety of cardiovascular syndromes and conditions. While central nervous system adverse effects have been well described with most beta-blockers (especially lipophilic agents such as propranolol), visual hallucinations have only rarely been described with metoprolol. Case Presentations: Case 1 was an 84-year-old Caucasian woman with a history of hypertension and osteoarthritis who suffered from visual hallucinations, which she described as people in her bedroom at night. They would be standing in front of the bed or sitting on chairs watching her when she slept. Numerous medications were stopped before her physician realized that metoprolol was the causative agent. The hallucinations resolved only after discontinuation of this medication. Case 2 was a 62-year-old Caucasian man with an inferior wall myocardial infarction complicated by cardiac arrest, who was successfully resuscitated and discharged from the hospital on metoprolol. About 18 months after discharge, he related to his physician that he had been seeing dead people at night. He related his belief that since he 'had died and was brought back to life', he was now seeing people from the after-life. Upon discontinuation of the metoprolol, the visual disturbances resolved within several days. Case 3 was a 68-year-old Caucasian woman with a history of severe hypertension and depression who reported visual hallucinations at night for years while taking metoprolol. These included awakening during the night with people in her bedroom and seeing objects in her room turn into animals. After a new physician switched her from metoprolol to atenolol, the visual hallucinations ceased within four days. Conclusion: We suspect that metoprolol-induced visual hallucinations may be under-recognized and under-reported.
    Patients may frequently fail to acknowledge this adverse effect believing that they

  12. Visual Circular Analysis of 266 Years of Sunspot Counts.

    Science.gov (United States)

    Buelens, Bart

    2016-06-01

    Sunspots, colder areas that are visible as dark spots on the surface of the Sun, have been observed for centuries. Their number varies with a period of ∼11 years, a phenomenon closely related to the solar activity cycle. Recently, observation records dating back to 1749 have been reassessed, resulting in the release of a time series of sunspot numbers covering 266 years of observations. This series is analyzed using circular analysis to determine the periodicity of the occurrence of solar maxima. The circular analysis is combined with spiral graphs to provide a single visualization, simultaneously showing the periodicity of the series, the degree to which individual cycle lengths deviate from the average period, and differences in levels reached during the different maxima. This type of visualization of cyclic time series with varying cycle lengths in which significant events occur periodically is broadly applicable. It is aimed particularly at science communication, education, and public outreach.
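The circular approach described here can be sketched in a few lines: map each event (here, the epoch of a solar maximum) onto a circle whose circumference is a trial period, then compute the circular mean and the resultant length R, where R near 1 indicates strong periodicity at that period. This is an illustrative sketch of standard circular statistics, not the author's code; `circular_stats` and its arguments are hypothetical names.

```python
import cmath
import math

def circular_stats(event_times, period):
    """Project event times onto a circle of the given trial period.

    Returns (mean_phase, R): the circular mean expressed back in the
    original time units, and the resultant length R in [0, 1].
    R close to 1 means the events cluster at one phase of the period.
    """
    z = sum(cmath.exp(2j * math.pi * (t % period) / period)
            for t in event_times) / len(event_times)
    mean_phase = (cmath.phase(z) % (2 * math.pi)) * period / (2 * math.pi)
    return mean_phase, abs(z)

# Perfectly periodic maxima at an 11-year trial period give R = 1.
phase, r = circular_stats([0.0, 11.0, 22.0, 33.0], 11.0)
```

Sweeping the trial period and plotting R against it is one way to recover the dominant cycle length before drawing the spiral graph.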

  13. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

Network based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
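The natural visibility criterion underlying such methods (two samples are connected when the straight "line of sight" between them clears every intermediate sample) can be sketched as follows; this is a generic O(n²)-per-pair illustration of the standard construction, not the authors' implementation.

```python
def visibility_graph(series):
    """Natural visibility graph of a scalar time series.

    Nodes are sample indices; an edge (i, j) exists when every sample
    strictly between i and j lies below the line of sight joining
    (i, series[i]) and (j, series[j]).
    """
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = True
            for k in range(i + 1, j):
                # height of the i -> j sight line at index k
                sight = series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                if series[k] >= sight:
                    visible = False
                    break
            if visible:
                edges.add((i, j))
    return edges
```

In the temporal-network variant the abstract describes, each series segment would be mapped to one such graph and the successive segment graphs linked in time.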

  14. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  15. Allan deviation analysis of financial return series

    Science.gov (United States)

    Hernández-Pérez, R.

    2012-05-01

We perform a scaling analysis for the return series of different financial assets applying the Allan deviation (ADEV), which is used in time and frequency metrology to characterize quantitatively the stability of frequency standards, since it has been shown to be a robust quantity for analyzing fluctuations of non-stationary time series over different observation intervals. The data used are daily opening-price series for assets from different markets over a time span of around ten years. We found that the ADEV results for the return series at short scales resemble those expected for an uncorrelated series, consistent with the efficient market hypothesis. On the other hand, the ADEV results for the absolute return series at short scales (the first one or two decades) decrease approximately following a scaling relation up to a point that differs for almost every asset, after which the ADEV deviates from scaling; this suggests that the presence of clustering, long-range dependence, and non-stationarity signatures in the series drives the results for large observation intervals.
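For reference, the non-overlapping Allan deviation at averaging length m is sigma_A(m) = sqrt(0.5 * <(ybar_{k+1} - ybar_k)^2>), where ybar_k are block averages over m consecutive samples. A minimal sketch, with a hypothetical function name:

```python
import numpy as np

def allan_deviation(y, m):
    """Non-overlapping Allan deviation at an averaging length of m samples."""
    y = np.asarray(y, dtype=float)
    nblocks = len(y) // m
    # block averages over m consecutive samples, trailing remainder dropped
    means = y[:nblocks * m].reshape(nblocks, m).mean(axis=1)
    diffs = np.diff(means)
    return np.sqrt(0.5 * np.mean(diffs ** 2))
```

Evaluating this over a range of m (observation intervals) yields the scaling curves the abstract discusses.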

  16. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  17. Neuron analysis of visual perception

    Science.gov (United States)

    Chow, K. L.

    1980-01-01

The receptive fields of single cells in the visual system of the cat and squirrel monkey were studied, with attention to the vestibular input affecting the cells and the cells' responses during the visual discrimination learning process. Also studied were the receptive field characteristics of the rabbit visual system, its normal development, its abnormal development following visual deprivation, and the structural and functional re-organization of the visual system following neonatal and prenatal surgery. The results of each individual part of each investigation are detailed.

  18. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series, which are used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond; we observed back-muscle fatiguing during one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of time series from relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
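The entropy-versus-time computation described here can be approximated by estimating the Shannon entropy of the amplitude distribution in consecutive windows of the signal; a rough sketch, where the window length and bin count are arbitrary illustrative choices, not the authors' parameters:

```python
import numpy as np

def window_entropy(signal, window, nbins=32):
    """Shannon entropy (in bits) of the amplitude histogram in each
    consecutive non-overlapping window of the signal."""
    signal = np.asarray(signal, dtype=float)
    entropies = []
    for start in range(0, len(signal) - window + 1, window):
        seg = signal[start:start + window]
        counts, _ = np.histogram(seg, bins=nbins)
        p = counts[counts > 0] / counts.sum()  # empirical bin probabilities
        entropies.append(float(-np.sum(p * np.log2(p))))
    return np.array(entropies)
```

A constant segment gives zero entropy; a segment spread evenly over all 32 bins gives the maximum of 5 bits.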

  19. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  20. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

This article deals with a time series analysis based on neural networks for effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system behaviour based on them.

  1. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  2. A Taylor series approach to survival analysis

    International Nuclear Information System (INIS)

    Brodsky, J.B.; Groer, P.G.

    1984-09-01

    A method of survival analysis using hazard functions is developed. The method uses the well known mathematical theory for Taylor Series. Hypothesis tests of the adequacy of many statistical models, including proportional hazards and linear and/or quadratic dose responses, are obtained. A partial analysis of leukemia mortality in the Life Span Study cohort is used as an example. Furthermore, a relatively robust estimation procedure for the proportional hazards model is proposed. (author)

  3. Time series analysis of barometric pressure data

    International Nuclear Information System (INIS)

    La Rocca, Paola; Riggi, Francesco; Riggi, Daniele

    2010-01-01

    Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.
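An autoregressive forecast of the kind compared in that study can be sketched by fitting AR(p) coefficients with ordinary least squares; the function names below are illustrative, not the article's code:

```python
import numpy as np

def fit_ar(x, p):
    """Fit x_t = c + a_1*x_{t-1} + ... + a_p*x_{t-p} by least squares."""
    x = np.asarray(x, dtype=float)
    # each row holds the p most recent values preceding the target
    rows = [x[t - p:t][::-1] for t in range(p, len(x))]
    design = np.column_stack([np.ones(len(rows)), np.array(rows)])
    coef, *_ = np.linalg.lstsq(design, x[p:], rcond=None)
    return coef  # [c, a_1, ..., a_p]

def forecast_one(x, coef):
    """One-step-ahead forecast from the fitted coefficients."""
    p = len(coef) - 1
    recent = np.asarray(x[-p:], dtype=float)[::-1]
    return coef[0] + coef[1:] @ recent
```

A neural-network forecaster, as in the comparison above, would replace the linear map with a small nonlinear regressor trained on the same lagged inputs.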

  4. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  5. Visual Analysis of Air Traffic Data

    Science.gov (United States)

    Albrecht, George Hans; Pang, Alex

    2012-01-01

    In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.

  6. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  7. Visualizing data for environmental analysis

    Energy Technology Data Exchange (ETDEWEB)

    Benson, J.

    1997-04-01

The Environmental Restoration Project at Los Alamos National Laboratory (LANL) has over 11,000 sampling locations in a 44 square mile area. The sample analyses contain raw analytical chemistry values for over 2,300 analytes and compounds used to define and remediate contaminated areas at LANL. The data consist of 2.5 million records in an Oracle database. Maps are often used to visualize the data. Problems arise when a client specifies a particular kind of map without fully understanding the limitations of the data or the map. The ability of maps to convey information depends on many factors, though all maps are data dependent. The quantity, spatial distribution, and numerical range of the data can limit use with certain kinds of maps. To address these issues and educate the clients, several types of statistical maps (e.g., choropleth, isarithm, and graduated symbol such as bubble and spike) used for environmental analysis were chosen to show the advantages, disadvantages, and data limitations of each. By examining both the complexity of the analytical data and the limitations of the map type, it is possible to consider how reality has been transformed through the map, and whether that transformation accurately conveys the information present.

  8. Visualizing data for environmental analysis

    International Nuclear Information System (INIS)

    Benson, J.

    1997-01-01

The Environmental Restoration Project at Los Alamos National Laboratory (LANL) has over 11,000 sampling locations in a 44 square mile area. The sample analyses contain raw analytical chemistry values for over 2,300 analytes and compounds used to define and remediate contaminated areas at LANL. The data consist of 2.5 million records in an Oracle database. Maps are often used to visualize the data. Problems arise when a client specifies a particular kind of map without fully understanding the limitations of the data or the map. The ability of maps to convey information depends on many factors, though all maps are data dependent. The quantity, spatial distribution, and numerical range of the data can limit use with certain kinds of maps. To address these issues and educate the clients, several types of statistical maps (e.g., choropleth, isarithm, and graduated symbol such as bubble and spike) used for environmental analysis were chosen to show the advantages, disadvantages, and data limitations of each. By examining both the complexity of the analytical data and the limitations of the map type, it is possible to consider how reality has been transformed through the map, and whether that transformation accurately conveys the information present.

  9. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproduce it. There is often a misconception regarding the complexity of the mathematics needed to understand and utilize the tools of NLTS (for instance chaos theory). However, the mathematics used in NLTS is much simpler than in many other subjec...

  10. Query-Driven Visualization and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Bethel, E. Wes; Prabhat, Mr.; Wu, Kesheng

    2012-11-01

This report focuses on an approach to high performance visualization and analysis, termed query-driven visualization and analysis (QDV). QDV aims to reduce the amount of data that needs to be processed by the visualization, analysis, and rendering pipelines. The goal of the data reduction process is to separate out data that is "scientifically interesting" and to focus visualization, analysis, and rendering on that interesting subset. The premise is that for any given visualization or analysis task, the data subset of interest is much smaller than the larger, complete data set. This strategy, extracting smaller data subsets of interest and focusing the visualization processing on these subsets, is complementary to the approach of increasing the capacity of the visualization, analysis, and rendering pipelines through parallelism. This report discusses the fundamental concepts in QDV, their relationship to different stages in the visualization and analysis pipelines, and presents QDV's application to problems in diverse areas, ranging from forensic cybersecurity to high energy physics.
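The premise can be illustrated with a boolean-query reduction over a record array before any rendering pass touches the data; the field names below are hypothetical, and this is only a toy sketch of the QDV idea, not the report's software:

```python
import numpy as np

def query_driven_subset(data, predicate):
    """Reduce a large record array to the 'scientifically interesting'
    subset selected by a boolean query, before visualization/analysis."""
    return data[predicate(data)]

# Toy example with hypothetical particle-event fields.
records = np.array([(1.0, 0.2), (9.5, 0.8), (0.3, 0.1)],
                   dtype=[("energy", "f8"), ("charge", "f8")])
interesting = query_driven_subset(records, lambda r: r["energy"] > 5.0)
```

In practice the report's approach couples such queries to indexing structures so the full data set never needs a linear scan; this sketch shows only the reduction step.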

  11. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  12. KfK seminar series on supercomputing and visualization from May to September 1992

    International Nuclear Information System (INIS)

    Hohenhinnebusch, W.

    1993-05-01

From May 1992 to September 1992 a series of seminars was held at KfK on several topics of supercomputing in different fields of application. The aim was to demonstrate the importance of supercomputing and visualization in numerical simulations of complex physical and technical phenomena. This report contains the collection of all submitted seminar papers. (orig./HP) [de

  13. Interactive visualization of gene regulatory networks with associated gene expression time series data

    NARCIS (Netherlands)

    Westenberg, M.A.; Hijum, van S.A.F.T.; Lulko, A.T.; Kuipers, O.P.; Roerdink, J.B.T.M.; Linsen, L.; Hagen, H.; Hamann, B.

    2008-01-01

    We present GENeVis, an application to visualize gene expression time series data in a gene regulatory network context. This is a network of regulator proteins that regulate the expression of their respective target genes. The networks are represented as graphs, in which the nodes represent genes,

  14. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    Science.gov (United States)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain a better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses), and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require multi-variable plotting, heat-map stacks, interactive synchronization, and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches to visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  15. Topic Visualization and Survival Analysis

    OpenAIRE

    Wang, Ping Jr

    2017-01-01

    Latent semantic structure in a text collection is called a topic. In this thesis, we aim to visualize topics in the scientific literature and detect active or inactive research areas based on their lifetime. Topics were extracted from over 1 million abstracts from the arXiv.org database using Latent Dirichlet Allocation (LDA). Hellinger distance measures similarity between two topics. Topics are determined to be relevant if their pairwise distances are smaller than the threshold of Hellinger ...
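The Hellinger distance mentioned here, for two discrete topic distributions p and q, is H(p, q) = sqrt(0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2), ranging from 0 (identical) to 1 (disjoint support). A minimal sketch:

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions,
    e.g. the topic-word vectors produced by LDA."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))
```

In the thesis's setting, topic pairs whose distance falls below a chosen threshold would be treated as related.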

  16. RankExplorer: Visualization of Ranking Changes in Large Time Series Data.

    Science.gov (United States)

    Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin

    2012-12-01

    For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.
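Component 3 above (a trend curve showing the degree of ranking change over time) can be approximated by the normalized count of discordant item pairs between consecutive rankings, i.e. a Kendall-style distance; this is a generic illustration, not RankExplorer's actual algorithm:

```python
import numpy as np

def rank_change_curve(values):
    """values: (T, N) array of N item values at T timesteps.

    Returns T-1 scores in [0, 1]: the fraction of item pairs whose
    relative order flips between consecutive timesteps
    (0 = same ranking, 1 = fully reversed).
    """
    values = np.asarray(values, dtype=float)
    T, N = values.shape
    ranks = (-values).argsort(axis=1).argsort(axis=1)  # rank 0 = largest
    npairs = N * (N - 1) // 2
    curve = []
    for t in range(T - 1):
        flips = sum(
            (ranks[t, i] - ranks[t, j]) * (ranks[t + 1, i] - ranks[t + 1, j]) < 0
            for i in range(N) for j in range(i + 1, N))
        curve.append(flips / npairs)
    return np.array(curve)
```

For the thousands of items mentioned above, the O(N^2) pair loop would need a faster inversion-counting scheme, but the quantity computed is the same.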

  17. Modeling, analysis, and visualization of anisotropy

    CERN Document Server

    Özarslan, Evren; Hotz, Ingrid

    2017-01-01

    This book focuses on the modeling, processing and visualization of anisotropy, irrespective of the context in which it emerges, using state-of-the-art mathematical tools. As such, it differs substantially from conventional reference works, which are centered on a particular application. It covers the following topics: (i) the geometric structure of tensors, (ii) statistical methods for tensor field processing, (iii) challenges in mapping neural connectivity and structural mechanics, (iv) processing of uncertainty, and (v) visualizing higher-order representations. In addition to original research contributions, it provides insightful reviews. This multidisciplinary book is the sixth in a series that aims to foster scientific exchange between communities employing tensors and other higher-order representations of directionally dependent data. A significant number of the chapters were co-authored by the participants of the workshop titled Multidisciplinary Approaches to Multivalued Data: Modeling, Visualization,...

  18. Visual Analysis of Weblog Content

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, Michelle L.; Payne, Deborah A.; McColgin, Dave; Cramer, Nick O.; Love, Douglas V.

    2007-03-26

    In recent years, one of the advances of the World Wide Web is social media and one of the fastest growing aspects of social media is the blogosphere. Blogs make content creation easy and are highly accessible through web pages and syndication. With their growing influence, a need has arisen to be able to monitor the opinions and insight revealed within their content. In this paper we describe a technical approach for analyzing the content of blog data using a visual analytic tool, IN-SPIRE, developed by Pacific Northwest National Laboratory. We highlight the capabilities of this tool that are particularly useful for information gathering from blog data.

  19. Solar Image Analysis and Visualization

    CERN Document Server

    Ireland, J

    2009-01-01

This volume presents a selection of papers on the state of the art of image enhancement, automated feature detection, machine learning, and visualization tools in support of solar physics, focusing on the challenges presented by new ground-based and space-based instrumentation. The articles and topics were inspired by the Third Solar Image Processing Workshop, held at Trinity College Dublin, Ireland, but contributions from other experts have been included as well. This book is mainly aimed at researchers and graduate students working on image processing and computer vision in astronomy and solar physics.

  20. Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools

    Science.gov (United States)

    Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.

    2017-12-01

    For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of interfaces that define the system, describe the relationship between the Das2 effort and Autoplot and will examine handling Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments including improved catalogs to support 'no-software' data sources and redundant multi-server fail over, as well as new adapters for CSV (Comma Separated Values) and JSON (Javascript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative are outlined.

  1. Topological data analysis for scientific visualization

    CERN Document Server

    Tierny, Julien

    2017-01-01

Combining theoretical and practical aspects of topology, this book delivers a comprehensive and self-contained introduction to topological methods for the analysis and visualization of scientific data. Theoretical concepts are presented in a thorough but intuitive manner, with many high-quality color illustrations. Key algorithms for the computation and simplification of topological data representations are described in detail, and their application is carefully illustrated in a chapter dedicated to concrete use cases. With its fine balance between theory and practice, "Topological Data Analysis for Scientific Visualization" constitutes an appealing introduction to the increasingly important topic of topological data analysis, for lecturers, students and researchers.

  2. Data and Visualization Corridors: Report on the 1998 DVC Workshop Series

    International Nuclear Information System (INIS)

    Smith, Paul H.; van Rosendale, John

    1998-01-01

    The Department of Energy and the National Science Foundation sponsored a series of workshops on data manipulation and visualization of large-scale scientific datasets. Three workshops were held in 1998, bringing together experts in high-performance computing, scientific visualization, emerging computer technologies, physics, chemistry, materials science, and engineering. These workshops were followed by two writing and review sessions, as well as numerous electronic collaborations, to synthesize the results. The results of these efforts are reported here. Across the government, mission agencies are charged with understanding scientific and engineering problems of unprecedented complexity. The DOE Accelerated Strategic Computing Initiative, for example, will soon be faced with the problem of understanding the enormous datasets created by teraops simulations, while NASA already has a severe problem in coping with the flood of data captured by earth observation satellites. Unfortunately, scientific visualization algorithms, and high-performance display hardware and software on which they depend, have not kept pace with the sheer size of emerging datasets, which threaten to overwhelm our ability to conduct research. Our capability to manipulate and explore large datasets is growing only slowly, while human cognitive and visual perception are an absolutely fixed resource. Thus, there is a pressing need for new methods of handling truly massive datasets, of exploring and visualizing them, and of communicating them over geographic distances. This report, written by representatives from academia, industry, national laboratories, and the government, is intended as a first step toward the timely creation of a comprehensive federal program in data manipulation and scientific visualization. There is, at this time, an exciting confluence of ideas on data handling, compression, telepresence, and scientific visualization. 
The combination of these new ideas, which we refer to as

  3. Visualization of and Software for Omnibus Test Based Change Detected in a Time Series of Polarimetric SAR Data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Conradsen, Knut; Skriver, Henning

    2017-01-01

Based on an omnibus likelihood ratio test statistic for the equality of several variance-covariance matrices following the complex Wishart distribution, and a factorization of this test statistic with associated p-values, change analysis in a time series of multilook polarimetric SAR data in the covariance matrix representation is carried out. The omnibus test statistic and its factorization detect if and when change occurs. Using airborne EMISAR and spaceborne RADARSAT-2 data, this paper focuses on change detection based on the p-values, on visualization of change at pixel as well as segment level, and on computer software.

  4. Exclusively visual analysis of classroom group interactions

    Science.gov (United States)

    Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric

    2016-12-01

    Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.
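Interrater reliability of the kind reported above is usually quantified with a chance-corrected agreement statistic. As a hedged illustration (the coding categories and labels below are invented, and the abstract does not say which statistic the authors used), Cohen's kappa between two coders can be computed as:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical labels: coder 1 used video+audio, coder 2 video only.
audio_visual = ["group", "group", "solo", "group", "solo", "solo"]
visual_only  = ["group", "group", "solo", "solo", "solo", "solo"]
print(round(cohens_kappa(audio_visual, visual_only), 3))  # → 0.667
```

A kappa near 1 indicates agreement well above chance; values in this range are what "interrater reliability is as high without audio" would look like numerically.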

  5. Exclusively visual analysis of classroom group interactions

    Directory of Open Access Journals (Sweden)

    Laura Tucker

    2016-11-01

Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data only—without audio—as when using both visual and audio data to code. Also, interrater reliability is high when comparing use of visual and audio data to visual-only data. We see a small bias to code interactions as group discussion when visual and audio data are used compared with video-only data. This work establishes that meaningful educational observation can be made through visual information alone. Further, it suggests that after initial work to create a coding scheme and validate it in each environment, computer-automated visual coding could drastically increase the breadth of qualitative studies and allow for meaningful educational analysis on a far greater scale.

  6. Data management, archiving, visualization and analysis of space physics data

    Science.gov (United States)

    Russell, C. T.

    1995-01-01

    A series of programs for the visualization and analysis of space physics data has been developed at UCLA. In the course of those developments, a number of lessons have been learned regarding data management and data archiving, as well as data analysis. The issues now facing those wishing to develop such software, as well as the lessons learned, are reviewed. Modern media have eased many of the earlier problems of the physical volume required to store data, the speed of access, and the permanence of the records. However, the ultimate longevity of these media is still a question of debate. Finally, while software development has become easier, cost is still a limiting factor in developing visualization and analysis software.

  7. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
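The reduced representations described here can be approximated in miniature: summarize each series by a small feature vector, then retrieve neighbours in feature space. A minimal sketch with invented series and only three features (the actual work uses thousands of features and methods):

```python
import math

def features(ts):
    """A tiny feature vector: mean, standard deviation, lag-1 autocorrelation."""
    n = len(ts)
    mean = sum(ts) / n
    var = sum((x - mean) ** 2 for x in ts) / n
    ac1_num = sum((ts[i] - mean) * (ts[i + 1] - mean) for i in range(n - 1))
    ac1 = ac1_num / (n * var) if var else 0.0
    return (mean, math.sqrt(var), ac1)

def nearest(query, library):
    """Retrieve the library series whose features are closest to the query's."""
    fq = features(query)
    return min(library, key=lambda name: math.dist(fq, features(library[name])))

library = {
    "noise": [0.3, -0.1, 0.2, -0.4, 0.1, -0.2, 0.3, -0.3],
    "trend": [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0],
}
print(nearest([0.1, 1.1, 2.2, 2.9, 4.1, 5.0, 5.9, 7.1], library))  # → trend
```

Organizing datasets "automatically according to their properties" is, at heart, clustering or retrieval in such a feature space.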

  8. Visual interface for space and terrestrial analysis

    Science.gov (United States)

    Dombrowski, Edmund G.; Williams, Jason R.; George, Arthur A.; Heckathorn, Harry M.; Snyder, William A.

    1995-01-01

    The management of large geophysical and celestial data bases is now, more than ever, the most critical path to timely data analysis. With today's large volume data sets from multiple satellite missions, analysts face the task of defining useful data bases from which data and metadata (information about data) can be extracted readily in a meaningful way. Visualization, following an object-oriented design, is a fundamental method of organizing and handling data. Humans, by nature, easily accept pictorial representations of data. Therefore graphically oriented user interfaces are appealing, as long as they remain simple to produce and use. The Visual Interface for Space and Terrestrial Analysis (VISTA) system, currently under development at the Naval Research Laboratory's Backgrounds Data Center (BDC), has been designed with these goals in mind. Its graphical user interface (GUI) allows the user to perform queries, visualization, and analysis of atmospheric and celestial backgrounds data.

  9. Visualization and Analysis of Complex Covert Networks

    DEFF Research Database (Denmark)

    Memon, Bisharat

This report discusses and summarizes the results of my work so far in relation to my Ph.D. project entitled "Visualization and Analysis of Complex Covert Networks". The focus of my research is primarily on the development of methods and supporting tools for visualization and analysis of networked systems that are covert and hence inherently complex. My Ph.D. is positioned within the wider framework of the CrimeFighter project. The framework envisions a number of key knowledge management processes that are involved in the workflow, and the toolbox provides supporting tools to assist human end-users (intelligence analysts) in harvesting, filtering, storing, managing, structuring, mining, analyzing, interpreting, and visualizing data about offensive networks. The methods and tools proposed and discussed in this work can also be applied to the analysis of more generic complex networks.

  10. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
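The trend/seasonality/anomaly parameters EPIPOI extracts can be illustrated with a much simpler stand-in: per-season means and deviations from them. This is not EPIPOI's algorithm (which the abstract does not specify), and the monthly case counts are synthetic:

```python
def seasonal_profile(series, period=12):
    """Average value at each position within the period (e.g., month of year)."""
    sums = [0.0] * period
    counts = [0] * period
    for i, x in enumerate(series):
        sums[i % period] += x
        counts[i % period] += 1
    return [s / c for s, c in zip(sums, counts)]

def anomalies(series, period=12):
    """Deviation of each observation from its seasonal mean."""
    prof = seasonal_profile(series, period)
    return [x - prof[i % period] for i, x in enumerate(series)]

# Two years of synthetic monthly case counts with a winter peak.
cases = [30, 25, 18, 12, 8, 5, 4, 6, 10, 15, 22, 28,
         32, 27, 20, 12, 9, 6, 5, 7, 11, 16, 24, 80]  # December outbreak
anom = anomalies(cases)
print(max(range(len(anom)), key=lambda i: anom[i]))  # → 23, the outbreak month
```

An outbreak shows up as an observation far above its seasonal mean, which is the intuition behind anomaly detection in seasonal surveillance data.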

  11. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method that maps a temporal network to a set of time series instances, analyzes them, and, using a standard time series forecast model, predicts the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree, and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive Integrated Moving Average (ARIMA). We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks.
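The forecasting step can be sketched with an AR(1) fit, the simplest special case of the ARIMA family the paper uses; the property values below are invented:

```python
def ar1_fit(series):
    """Least-squares fit of x[t+1] = a + b*x[t] (a minimal stand-in for ARIMA)."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    b = cov / var
    return my - b * mx, b

# Average degree of a temporal contact network at successive snapshots (synthetic).
avg_degree = [4.0, 4.4, 4.8, 5.2, 5.6, 6.0]
a, b = ar1_fit(avg_degree)
print(round(a + b * avg_degree[-1], 2))  # one-step-ahead prediction → 6.4
```

In the paper's setting each network property yields one such series, and the fitted model forecasts that property at a future snapshot without knowing the future network structure.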

  12. Scientific & Intelligence Exascale Visualization Analysis System

    Energy Technology Data Exchange (ETDEWEB)

    2017-07-14

SIEVAS provides an immersive visualization framework for connecting multiple systems in real time for data science. It can connect multiple COTS and GOTS products in a seamless fashion for data fusion, data analysis, and viewing. It provides this capability through a combination of microservices, real-time messaging, and a web-service-compliant back-end system.

  13. Analysis and visualization of animal movement

    NARCIS (Netherlands)

    Shamoun-Baranes, J.; van Loon, E.E.; Purves, R.S.; Speckmann, B.; Weiskopf, D.; Camphuysen, C.J.

    2012-01-01

    The interdisciplinary workshop ‘Analysis and Visualization of Moving Objects’ was held at the Lorentz Centre in Leiden, The Netherlands, from 27 June to 1 July 2011. It brought together international specialists from ecology, computer science and geographical information science actively involved in

  14. Exploratory Visual Analysis for Animal Movement Ecology

    NARCIS (Netherlands)

    Slingsby, A.; van Loon, E.

    2016-01-01

    Movement ecologists study animals' movement to help understand their behaviours and interactions with each other and the environment. Data from GPS loggers are increasingly important for this. These data need to be processed, segmented and summarised for further visual and statistical analysis,

  15. Spatial analysis statistics, visualization, and computational methods

    CERN Document Server

    Oyana, Tonny J

    2015-01-01

    An introductory text for the next generation of geospatial analysts and data scientists, Spatial Analysis: Statistics, Visualization, and Computational Methods focuses on the fundamentals of spatial analysis using traditional, contemporary, and computational methods. Outlining both non-spatial and spatial statistical concepts, the authors present practical applications of geospatial data tools, techniques, and strategies in geographic studies. They offer a problem-based learning (PBL) approach to spatial analysis-containing hands-on problem-sets that can be worked out in MS Excel or ArcGIS-as well as detailed illustrations and numerous case studies. The book enables readers to: Identify types and characterize non-spatial and spatial data Demonstrate their competence to explore, visualize, summarize, analyze, optimize, and clearly present statistical data and results Construct testable hypotheses that require inferential statistical analysis Process spatial data, extract explanatory variables, conduct statisti...

  16. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  17. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for analysis and visualization of graphs, describes the overall architecture of the application and the basic principles of construction and operation of the main modules. Furthermore, a description of the developed graph store, designed for storage and processing of large-scale graphs, is presented. The developed algorithm for finding communities and the implemented automatic graph-layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.
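The community-finding functionality mentioned above can be illustrated, under assumptions (the article does not describe its algorithm), with a classic label-propagation sketch:

```python
from collections import Counter

def label_propagation(adj):
    """Community detection by label propagation: every node repeatedly adopts
    the most frequent label among its neighbours (ties broken by largest label)
    until no label changes."""
    labels = {v: v for v in adj}
    while True:
        changed = False
        for v in sorted(adj):
            counts = Counter(labels[u] for u in adj[v])
            best = max(counts.values())
            new = max(l for l, c in counts.items() if c == best)
            if new != labels[v]:
                labels[v], changed = new, True
        if not changed:
            return labels

# Two 4-cliques joined by a single bridge edge (3-4).
adj = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
       4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6]}
communities = label_propagation(adj)
print(sorted(set(communities.values())))  # → [3, 7]: the two cliques
```

Label propagation is popular for large graphs precisely because each pass is linear in the number of edges, which matches the record's emphasis on million-node networks.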

  18. DIY Solar Market Analysis Webinar Series: Solar Resource and Technical

    Science.gov (United States)

DIY Solar Market Analysis Webinar Series: Solar Resource and Technical Potential. Wednesday, June 11, 2014. Part of NREL's Do-It-Yourself Solar Market Analysis webinar series for state, local, and tribal governments.

  19. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  20. Interactive Visual Analysis within Dynamic Ocean Models

    Science.gov (United States)

    Butkiewicz, T.

    2012-12-01

    The many observation and simulation based ocean models available today can provide crucial insights for all fields of marine research and can serve as valuable references when planning data collection missions. However, the increasing size and complexity of these models makes leveraging their contents difficult for end users. Through a combination of data visualization techniques, interactive analysis tools, and new hardware technologies, the data within these models can be made more accessible to domain scientists. We present an interactive system that supports exploratory visual analysis within large-scale ocean flow models. The currents and eddies within the models are illustrated using effective, particle-based flow visualization techniques. Stereoscopic displays and rendering methods are employed to ensure that the user can correctly perceive the complex 3D structures of depth-dependent flow patterns. Interactive analysis tools are provided which allow the user to experiment through the introduction of their customizable virtual dye particles into the models to explore regions of interest. A multi-touch interface provides natural, efficient interaction, with custom multi-touch gestures simplifying the otherwise challenging tasks of navigating and positioning tools within a 3D environment. We demonstrate the potential applications of our visual analysis environment with two examples of real-world significance: Firstly, an example of using customized particles with physics-based behaviors to simulate pollutant release scenarios, including predicting the oil plume path for the 2010 Deepwater Horizon oil spill disaster. Secondly, an interactive tool for plotting and revising proposed autonomous underwater vehicle mission pathlines with respect to the surrounding flow patterns predicted by the model; as these survey vessels have extremely limited energy budgets, designing more efficient paths allows for greater survey areas.
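The virtual dye particles described above amount to advecting points through a velocity field. A minimal sketch with a synthetic circular eddy and forward-Euler integration (the actual system presumably uses a real model's currents and a more accurate integrator):

```python
import math

def velocity(x, y):
    """A steady circular eddy centred at the origin (synthetic flow field)."""
    return -y, x

def advect(x, y, dt=0.01, steps=628):
    """Trace a dye particle through the flow with forward-Euler steps."""
    for _ in range(steps):
        u, v = velocity(x, y)
        x, y = x + dt * u, y + dt * v
    return x, y

x, y = advect(1.0, 0.0)  # roughly one full revolution around the eddy
print(round(math.hypot(x, y), 2))  # → 1.03: slight radial drift, Euler error
```

The small outward drift illustrates why production flow-visualization codes prefer higher-order integrators (e.g., Runge-Kutta) for long particle traces.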

  1. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach is that it links different methods for time series analysis, connects traditional data mining tools to time series, and supports the construction of new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, so it is not limited to a fixed set of methods. At its core, it is a model for transforming the values of a time series, which prepares data for different sets of methods based on the same transformation model in the domain of the problem space. The REFII model thus offers a new approach to time series analysis based on a unique transformation model that can serve as a basis for many kinds of time series analysis. A further advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.
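The abstract does not define REFII's transformation itself; as a loose stand-in, one common way to turn a raw series into a symbolic, analysis-ready form is to encode the direction of successive changes:

```python
def trend_code(series):
    """Encode successive differences as R (rise), F (fall), E (equal) —
    a simplified, hypothetical stand-in for a transformation step like REFII's."""
    symbols = []
    for prev, cur in zip(series, series[1:]):
        symbols.append("R" if cur > prev else "F" if cur < prev else "E")
    return "".join(symbols)

print(trend_code([10, 12, 12, 9, 11]))  # → REFR
```

Once a series is in a symbolic form like this, standard data mining tools (pattern matching, association rules, classification) can be applied uniformly, which is the linkage the abstract describes.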

  2. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  3. Time Series Analysis Using Geometric Template Matching.

    Science.gov (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.
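The similarity-based classification step can be sketched with plain Euclidean distance standing in for the GeTeM measure (which the abstract does not define); the labels and accelerometer-like segments below are invented:

```python
import math

def euclid(a, b):
    """Euclidean distance between two equal-length segments."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nn_classify(query, training):
    """1-nearest-neighbour classification of a time-series segment,
    with Euclidean distance as a stand-in similarity measure."""
    label, _ = min(training, key=lambda item: euclid(query, item[1]))
    return label

# Invented accelerometer-like segments.
training = [
    ("walking", [0.1, 0.9, 0.2, 0.8, 0.1, 0.9]),
    ("resting", [0.0, 0.1, 0.0, 0.1, 0.0, 0.1]),
]
print(nn_classify([0.2, 0.8, 0.1, 0.9, 0.2, 0.8], training))  # → walking
```

The paper's contribution is a better similarity measure and ensemble on top of exactly this nearest-neighbour skeleton.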

  4. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  5. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  6. Analysis and visualization of citation networks

    CERN Document Server

    Zhao, Dangzhi

    2015-01-01

    Citation analysis-the exploration of reference patterns in the scholarly and scientific literature-has long been applied in a number of social sciences to study research impact, knowledge flows, and knowledge networks. It has important information science applications as well, particularly in knowledge representation and in information retrieval.Recent years have seen a burgeoning interest in citation analysis to help address research, management, or information service issues such as university rankings, research evaluation, or knowledge domain visualization. This renewed and growing interest

  7. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    Growth And Export Expansion In Mauritius - A Time Series Analysis. ... RV Sannassee, R Pearce ... Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...

  8. Visual behaviour analysis and driver cognitive model

    Energy Technology Data Exchange (ETDEWEB)

    Baujon, J.; Basset, M.; Gissinger, G.L. [Mulhouse Univ., (France). MIPS/MIAM Lab.

    2001-07-01

Recent studies on driver behaviour have shown that perception - mainly visual but also proprioceptive perception - plays a key role in the "driver-vehicle-road" system and so considerably affects the driver's decision making. Within the framework of the behaviour analysis and studies low-cost system (BASIL), this paper presents a correlative, qualitative and quantitative study, comparing the information given by visual perception and by the trajectory followed. This information will help to obtain a cognitive model of the Rasmussen type according to different driver classes. Many experiments in real driving situations have been carried out for different driver classes and for a given trajectory profile, using a test vehicle and innovative, specially designed, real-time tools, such as the vision system or the positioning module. (orig.)

  9. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

This paper applies stochastic time series analysis to hydrological data at the seasonal stage. Different statistical tests for predicting hydrological time series with the Thomas-Fiering model are considered. Hydrological time series of flood flows have received a great deal of attention worldwide, and interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict seasonal periods in hydrology using the Thomas-Fiering model.
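The Thomas-Fiering model generates synthetic seasonal flows from per-season means, standard deviations, and lag-1 correlations. A sketch with invented seasonal statistics (a real study would estimate these from gauge records):

```python
import math
import random

def thomas_fiering(seasons, n_years, seed=1):
    """Generate synthetic seasonal flows with the Thomas-Fiering recursion:
    Q[j+1] = m[j+1] + b[j]*(Q[j] - m[j]) + t*s[j+1]*sqrt(1 - r[j]**2),
    where b[j] = r[j]*s[j+1]/s[j] and t is standard normal noise.
    `seasons` is a list of (mean, std, lag-1 correlation to next season)."""
    rng = random.Random(seed)
    flows = [seasons[0][0]]  # start at the first season's mean
    for k in range(1, n_years * len(seasons)):
        m_prev, s_prev, r = seasons[(k - 1) % len(seasons)]
        m_cur, s_cur, _ = seasons[k % len(seasons)]
        b = r * s_cur / s_prev
        q = m_cur + b * (flows[-1] - m_prev) \
            + rng.gauss(0, 1) * s_cur * math.sqrt(1 - r * r)
        flows.append(max(q, 0.0))  # flows cannot be negative
    return flows

# Hypothetical quarterly statistics: (mean flow, std dev, lag-1 correlation).
seasons = [(100, 20, 0.6), (80, 15, 0.5), (60, 10, 0.7), (90, 25, 0.6)]
flows = thomas_fiering(seasons, n_years=5)
print(len(flows), all(f >= 0 for f in flows))
```

Each generated year preserves the seasonal means and the season-to-season correlation structure, which is what makes the model useful for simulating reservoir inflows.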

  10. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Science.gov (United States)

    Marken, John P; Halleran, Andrew D; Rahman, Atiqur; Odorizzi, Laura; LeFew, Michael C; Golino, Caroline A; Kemper, Peter; Saha, Margaret S

    2016-01-01

    Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.
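The measure's core ingredients — discretize the trace into states, then quantify the predictability of transitions between states — can be sketched as follows. This is a simplified illustration, not the authors' exact formulation:

```python
import math
from collections import Counter

def markov_entropy(series, n_states=3):
    """Discretize a trace into equal-width amplitude states, then measure
    the Shannon entropy (bits) of the state-transition distribution."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_states or 1.0  # guard against a flat trace
    states = [min(int((x - lo) / width), n_states - 1) for x in series]
    transitions = Counter(zip(states, states[1:]))
    total = sum(transitions.values())
    return -sum((c / total) * math.log2(c / total) for c in transitions.values())

# A perfectly periodic trace is highly predictable; an irregular one is not.
print(round(markov_entropy([0, 1, 2, 0, 1, 2, 0, 1, 2]), 3))  # periodic
print(round(markov_entropy([0, 2, 1, 2, 0, 0, 2, 1, 0]), 3))  # irregular, higher
```

The irregular trace yields higher transition entropy than the periodic one, mirroring how the measure separates populations whose calcium activity lacks visually distinguishable features.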

  11. A Markovian Entropy Measure for the Analysis of Calcium Activity Time Series.

    Directory of Open Access Journals (Sweden)

    John P Marken

Methods to analyze the dynamics of calcium activity often rely on visually distinguishable features in time series data such as spikes, waves, or oscillations. However, systems such as the developing nervous system display a complex, irregular type of calcium activity which makes the use of such methods less appropriate. Instead, for such systems there exists a class of methods (including information theoretic, power spectral, and fractal analysis approaches) which use more fundamental properties of the time series to analyze the observed calcium dynamics. We present a new analysis method in this class, the Markovian Entropy measure, which is an easily implementable calcium time series analysis method which represents the observed calcium activity as a realization of a Markov Process and describes its dynamics in terms of the level of predictability underlying the transitions between the states of the process. We applied our and other commonly used calcium analysis methods on a dataset from Xenopus laevis neural progenitors which displays irregular calcium activity and a dataset from murine synaptic neurons which displays activity time series that are well-described by visually-distinguishable features. We find that the Markovian Entropy measure is able to distinguish between biologically distinct populations in both datasets, and that it can separate biologically distinct populations to a greater extent than other methods in the dataset exhibiting irregular calcium activity. These results support the benefit of using the Markovian Entropy measure to analyze calcium dynamics, particularly for studies using time series data which do not exhibit easily distinguishable features.

  12. Economic Analysis in Series-Distillation Desalination

    Directory of Open Access Journals (Sweden)

    Mirna Rahmah Lubis

    2010-06-01

The ability to produce potable water economically is the primary purpose of seawater desalination research. Reverse osmosis (RO) and multi-stage flash (MSF) cost more than potable water produced from fresh water resources. Therefore, this research investigates a high-efficiency mechanical vapor-compression distillation system that employs an improved water flow arrangement. The incoming salt concentration was 0.15% salt for brackish water and 3.5% salt for seawater, whereas the outgoing salt concentration was 1.5% and 7%, respectively. Distillation was performed at 439 K and 722 kPa for both brackish water feed and seawater feed. Water costs of the various conditions were calculated for brackish water and seawater feeds using optimum conditions considered as 25 and 20 stages, respectively. For brackish water at a temperature difference of 0.96 K, the energy requirement is 2.0 kWh/m3. At this condition, the estimated water cost is $0.39/m3 achieved with 10,000,000 gal/day distillate, 30-year bond, 5% interest rate, and $0.05/kWh electricity. For seawater at a temperature difference of 0.44 K, the energy requirement is 3.97 kWh/m3 and the estimated water cost is $0.61/m3. Greater efficiency of the vapor compression system is achieved by connecting multiple evaporators in series, rather than the traditional parallel arrangement. The efficiency results from the gradual increase of salinity in each stage of the series arrangement in comparison to parallel. Calculations using various temperature differences between boiling brine and condensing steam show the series arrangement has the greatest improvement at lower temperature differences. Keywords: desalination, dropwise condensation, mechanical-vapor compression
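The water-cost figures above combine an energy cost with capital amortized over a 30-year bond. The amortization side follows the standard capital recovery factor; the cost split below is a hypothetical illustration consistent with the abstract's $0.05/kWh and 2.0 kWh/m3 figures:

```python
def capital_recovery_factor(rate, years):
    """Annual payment per unit of borrowed capital: r(1+r)^n / ((1+r)^n - 1)."""
    g = (1 + rate) ** years
    return rate * g / (g - 1)

# Hypothetical split for the brackish-water case quoted in the abstract:
# 2.0 kWh/m3 of electricity at $0.05/kWh gives the energy portion; the
# remainder of the $0.39/m3 total would be amortized capital and O&M,
# which the abstract does not itemize.
energy_cost = 2.0 * 0.05                 # $/m3 for compression energy
crf = capital_recovery_factor(0.05, 30)  # fraction of capital repaid per year
print(round(energy_cost, 2), round(crf, 4))  # → 0.1 0.0651
```

With a 5% rate over 30 years, each dollar of plant capital costs about 6.5 cents per year, which is then divided by annual distillate volume to get the capital component of $/m3.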

  13. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, it means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  14. Interactive Web-based Visualization of Atomic Position-time Series Data

    Science.gov (United States)

    Thapa, S.; Karki, B. B.

    2017-12-01

    Extracting and interpreting the information contained in large sets of time-varying three-dimensional positional data for the constituent atoms of simulated material is a challenging task. We have recently implemented a web-based visualization system to analyze the position-time series data extracted from local or remote hosts. It involves a pre-processing step for data reduction, which skips uninteresting parts of the data uniformly (at full atomic configuration level) or non-uniformly (at atomic species level or individual atom level). An atomic configuration snapshot is rendered using the ball-and-stick representation and can be animated by rendering successive configurations. The entire atomic dynamics can be captured as trajectories by rendering the atomic positions at all time steps together as points. The trajectories can be manipulated at both species and atomic levels so that we can focus on one or more trajectories of interest, and can also be superimposed with the instantaneous atomic structure. The implementation was done using WebGL and Three.js for graphical rendering, HTML5 and Javascript for GUI, and Elasticsearch and JSON for data storage and retrieval within the Grails Framework. We have applied our visualization system to the simulation datasets for proton-bearing forsterite (Mg2SiO4), an abundant mineral of Earth's upper mantle. Visualization reveals that protons (hydrogen ions) incorporated as interstitials are much more mobile than protons substituting the host Mg and Si cation sites. The proton diffusion appears to be anisotropic with high mobility along the x-direction, showing limited discrete jumps in the other two directions.
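    A quantitative companion to such trajectory visualization is a per-axis mean squared displacement, which makes anisotropic mobility (such as the reported fast proton motion along x) measurable. A minimal numpy sketch on a synthetic trajectory; the diffusion scales below are invented for illustration, not taken from the forsterite simulations:

```python
import numpy as np

def per_axis_msd(positions, lag):
    """Mean squared displacement along each axis for one trajectory.
    positions: (T, 3) array of x, y, z coordinates over time."""
    d = positions[lag:] - positions[:-lag]
    return (d ** 2).mean(axis=0)

# Synthetic random walk: fast diffusion along x, slow along y and z
rng = np.random.default_rng(1)
steps = rng.standard_normal((1000, 3)) * np.array([1.0, 0.1, 0.1])
traj = np.cumsum(steps, axis=0)
msd = per_axis_msd(traj, lag=10)   # expect the x component to dominate
```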

  15. Online social media analysis and visualization

    CERN Document Server

    Kawash, Jalal

    2015-01-01

    This edited volume addresses the vast challenges of adapting Online Social Media (OSM) to developing research methods and applications. The topics cover generating realistic social network topologies, awareness of user activities, topic and trend generation, estimation of user attributes from their social content, behavior detection, mining social content for common trends, identifying and ranking social content sources, building friend-comprehension tools, and many others. Each of the ten chapters tackles one or more of these issues by proposing new analysis methods or new visualization techniques.

  16. Tool Wear Monitoring Using Time Series Analysis

    Science.gov (United States)

    Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu

    A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to the dynamic model of cutting is introduced as the feature of diagnosis. Consequently, it is found that the early tool wear state (i.e., flank wear under 40 µm) can be monitored, and also the optimal tool exchange time and the tool wear state for actual turning machining can be judged by this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by the monitoring of the residual error.
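    The residual-error feature described above can be sketched as follows: fit an autoregressive model to vibration measured with a fresh tool, then track the RMS residual of new signals under that reference model. The AR order and the synthetic signals are illustrative assumptions, not the paper's actual time series model of cutting dynamics:

```python
import numpy as np

def fit_ar(x, order=4):
    """Fit an AR(order) model x[t] ~ sum_k a_k * x[t-k] by least squares."""
    A = np.column_stack([x[order - k:len(x) - k] for k in range(1, order + 1)])
    coef, *_ = np.linalg.lstsq(A, x[order:], rcond=None)
    return coef

def residual_rms(x, coef):
    """RMS one-step prediction error of x under a reference AR model."""
    order = len(coef)
    A = np.column_stack([x[order - k:len(x) - k] for k in range(1, order + 1)])
    return float(np.sqrt(np.mean((x[order:] - A @ coef) ** 2)))

rng = np.random.default_rng(2)
t = np.arange(2000)
healthy = np.sin(0.2 * t) + 0.05 * rng.standard_normal(2000)
worn = (np.sin(0.2 * t) + 0.05 * rng.standard_normal(2000)
        + 0.3 * (np.sin(0.9 * t) > 0.95))          # intermittent chipping bursts
coef = fit_ar(healthy)                              # reference model: fresh tool
```

    A sustained rise of the residual RMS above its fresh-tool baseline would flag wear or chipping and suggest the tool exchange time.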

  17. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  18. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    Science.gov (United States)

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology to Big Data analytics and services. A cloud computing system often comprises a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues. But profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual-based analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual-based approach is effective in identifying trends and anomalies of the systems.
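    The behavioral-lines layout rests on a similarity measure between nodes' multivariate time series. A minimal sketch of one such measure, plain Euclidean distance between flattened profiles; the paper's actual measure and any per-metric normalization are not specified here, so this is an illustrative stand-in:

```python
import numpy as np

def pairwise_behavior_distance(profiles):
    """profiles: (nodes, time, metrics) array of sampled system metrics.
    Returns Euclidean distances between every pair of flattened node
    profiles (a real system would first normalize each metric's scale)."""
    flat = profiles.reshape(profiles.shape[0], -1)
    return np.linalg.norm(flat[:, None, :] - flat[None, :, :], axis=2)

# Nine ordinary nodes plus one whose metrics sit at an elevated level
rng = np.random.default_rng(3)
normal = rng.standard_normal((9, 100, 4))
hot = rng.standard_normal((1, 100, 4)) + 5.0
d = pairwise_behavior_distance(np.concatenate([normal, hot]))
outlier = int(d.sum(axis=1).argmax())   # node most dissimilar from the rest
```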

  19. Malware Analysis Using Visualized Image Matrices

    Directory of Open Access Journals (Sweden)

    KyoungSoo Han

    2014-01-01

    Full Text Available This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. Particularly, our proposed methods are available for packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively.

  20. Malware analysis using visualized image matrices.

    Science.gov (United States)

    Han, KyoungSoo; Kang, BooJoong; Im, Eul Gyu

    2014-01-01

    This paper proposes a novel malware visual analysis method that contains not only a visualization method to convert binary files into images, but also a similarity calculation method between these images. The proposed method generates RGB-colored pixels on image matrices using the opcode sequences extracted from malware samples and calculates the similarities for the image matrices. Particularly, our proposed methods are available for packed malware samples by applying them to the execution traces extracted through dynamic analysis. When the images are generated, we can reduce the overheads by extracting the opcode sequences only from the blocks that include the instructions related to staple behaviors such as functions and application programming interface (API) calls. In addition, we propose a technique that generates a representative image for each malware family in order to reduce the number of comparisons for the classification of unknown samples and the colored pixel information in the image matrices is used to calculate the similarities between the images. Our experimental results show that the image matrices of malware can effectively be used to classify malware families both statically and dynamically with accuracy of 0.9896 and 0.9732, respectively.
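    A toy version of the opcode-to-image idea can be sketched as follows. The hash-based coloring and the pixel-match similarity below are illustrative stand-ins; the paper's actual pixel generation and similarity calculation may differ:

```python
import hashlib
import numpy as np

def opcode_image(opcodes, width=8):
    """Map an opcode sequence to an RGB image matrix: each opcode is
    hashed to a 3-byte color and pixels are laid out row by row."""
    pixels = [list(hashlib.md5(op.encode()).digest()[:3]) for op in opcodes]
    rows = -(-len(pixels) // width)                 # ceil division
    img = np.zeros((rows * width, 3), dtype=np.uint8)
    img[:len(pixels)] = pixels
    return img.reshape(rows, width, 3)

def image_similarity(a, b):
    """Fraction of exactly matching pixels over the overlapping rows."""
    n = min(a.shape[0], b.shape[0])
    return float((a[:n] == b[:n]).all(axis=2).mean())

# Hypothetical opcode traces: a sample and a near-identical variant
benign = ["push", "mov", "call", "ret"] * 16
variant = ["push", "mov", "call", "ret"] * 15 + ["nop"] * 4
sim = image_similarity(opcode_image(benign), opcode_image(variant))
```

    Samples from the same family share long opcode runs and therefore large regions of identical pixels, which is what the similarity score captures.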

  1. Visualization studies on evidence-based medicine domain knowledge (series 3): visualization for dissemination of evidence based medicine information.

    Science.gov (United States)

    Shen, Jiantong; Yao, Leye; Li, Youping; Clarke, Mike; Gan, Qi; Li, Yifei; Fan, Yi; Gou, Yongchao; Wang, Li

    2011-05-01

    To identify patterns in information sharing between a series of Chinese evidence based medicine (EBM) journals and the Cochrane Database of Systematic Reviews, to determine key evidence dissemination areas for EBM and to provide a scientific basis for improving the dissemination of EBM research. Citing and cited data were collected from the Chinese Journal of Evidence-Based Medicine (CJEBM), Journal of Evidence-Based Medicine (JEBMc), Chinese Journal of Evidence Based Pediatrics (CJEBP), and the Cochrane Database of Systematic Reviews (CDSR). Relationships between citations were visualized. High-frequency key words from these sources were identified, to build a word co-occurrence matrix and to map research subjects. CDSR contains a large collection of information of relevance to EBM and its contents are widely cited across many journals, suggesting a well-developed citation environment. The content and citation of the Chinese journals have been increasing in recent years. However, their citation environments are much less developed, and there is a wide variation in the breadth and strength of their knowledge communication, with the ranking from highest to lowest being CJEBM, JEBMc and CJEBP. The content of CDSR is almost exclusively Cochrane intervention reviews examining the effects of healthcare interventions, so its contribution to EBM is mostly in disease control and treatment. On the other hand, the Chinese journals on evidence-based medicine and practice focused more on areas such as education and research, design and quality of clinical trials, evidence based policymaking, evidence based clinical practice, tumor treatment, and pediatrics. Knowledge and findings of EBM are widely communicated and disseminated. However, citation environments and range of knowledge communication differ greatly between the journals examined in this study. This study finds that Chinese EBM has focused mainly on clinical medicine, Traditional Chinese Medicine, pediatrics, tumor

  2. Visualization of Time-Series Sensor Data to Inform the Design of Just-In-Time Adaptive Stress Interventions.

    Science.gov (United States)

    Sharmin, Moushumi; Raij, Andrew; Epstien, David; Nahum-Shani, Inbal; Beck, J Gayle; Vhaduri, Sudip; Preston, Kenzie; Kumar, Santosh

    2015-09-01

    We investigate needs, challenges, and opportunities in visualizing time-series sensor data on stress to inform the design of just-in-time adaptive interventions (JITAIs). We identify seven key challenges: massive volume and variety of data, complexity in identifying stressors, scalability of space, multifaceted relationship between stress and time, a need for representation at multiple granularities, interperson variability, and limited understanding of JITAI design requirements due to its novelty. We propose four new visualizations based on one million minutes of sensor data (n=70). We evaluate our visualizations with stress researchers (n=6) to gain first insights into its usability and usefulness in JITAI design. Our results indicate that spatio-temporal visualizations help identify and explain between- and within-person variability in stress patterns and contextual visualizations enable decisions regarding the timing, content, and modality of intervention. Interestingly, a granular representation is considered informative but noise-prone; an abstract representation is the preferred starting point for designing JITAIs.

  3. Topic Time Series Analysis of Microblogs

    Science.gov (United States)

    2014-10-01

    Tweets on a specific topic that cluster spatially, temporally, or both might be of interest to analysts and marketers. The report also examines tokens containing $ and @ (the latter only when it is the only character in the token; the @ symbol is significant in its usage by Instagram), and presents example topics, such as Topic 80 (distance: 143.2101; top words: rawr, ^0^, kill, jurassic, dinosaur), which is generated by Instagram.

  4. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  5. A Hierarchical Visualization Analysis Model of Power Big Data

    Science.gov (United States)

    Li, Yongjie; Wang, Zheng; Hao, Yang

    2018-01-01

    Based on the concept of integrating VR scenes with power big data analysis, a hierarchical visualization analysis model of power big data is proposed, in which levels are designed to target different abstract modules such as transaction, engine, computation, control, and storage. The normally separate modules for power data storage, data mining and analysis, and data visualization are integrated into one platform by this model, providing a visual analysis solution for power big data.

  6. Avogadro: an advanced semantic chemical editor, visualization, and analysis platform.

    Science.gov (United States)

    Hanwell, Marcus D; Curtis, Donald E; Lonie, David C; Vandermeersch, Tim; Zurek, Eva; Hutchison, Geoffrey R

    2012-08-13

    The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high quality rendering, and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. The work presented here details the Avogadro library, which is a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; and has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extracting chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format. For developers, it can be easily extended via a powerful

  7. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
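    The time averaged MSD at the center of this analysis is straightforward to compute for a single trajectory. A sketch on a simulated geometric Brownian motion log-price, the process underlying the Black-Scholes-Merton model cited above; all parameter values are illustrative:

```python
import numpy as np

def time_averaged_msd(x, lag):
    """Time averaged MSD of one trajectory: <(x(t+lag) - x(t))^2>_t."""
    d = x[lag:] - x[:-lag]
    return float((d ** 2).mean())

# Geometric Brownian motion log-price, an illustrative stand-in for an index
rng = np.random.default_rng(4)
mu, sigma, dt, n = 0.05, 0.2, 1 / 252, 252 * 10
log_price = np.cumsum((mu - sigma ** 2 / 2) * dt
                      + sigma * np.sqrt(dt) * rng.standard_normal(n))
lags = [1, 5, 10, 20]
tamsd = [time_averaged_msd(log_price, lag) for lag in lags]
```

    For Brownian-type dynamics the time averaged MSD grows roughly linearly in the lag time, which is the kind of universal scaling the ageing and delay-time analyses probe.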

  8. Biostatistics series module 9: Survival analysis

    Directory of Open Access Journals (Sweden)

    Avijit Hazra

    2017-01-01

    Full Text Available Survival analysis is concerned with “time to event” data. Conventionally, it dealt with cancer death as the event in question, but it can handle any event occurring over a time frame, and this need not be always adverse in nature. When the outcome of a study is the time to an event, it is often not possible to wait until the event in question has happened to all the subjects, for example, until all are dead. In addition, subjects may leave the study prematurely. Such situations lead to what is called censored observations as complete information is not available for these subjects. The data set is thus an assemblage of times to the event in question and times after which no more information on the individual is available. Survival analysis methods are the only techniques capable of handling censored observations without treating them as missing data. They also make no assumption regarding normal distribution of time to event data. Descriptive methods for exploring survival times in a sample include life table and Kaplan–Meier techniques as well as various kinds of distribution fitting as advanced modeling techniques. The Kaplan–Meier cumulative survival probability over time plot has become the signature plot for biomedical survival analysis. Several techniques are available for comparing the survival experience in two or more groups – the log-rank test is popularly used. This test can also be used to produce an odds ratio as an estimate of risk of the event in the test group; this is called hazard ratio (HR). Limitations of the traditional log-rank test have led to various modifications and enhancements. Finally, survival analysis offers different regression models for estimating the impact of multiple predictors on survival. Cox's proportional hazard model is the most general of the regression methods that allows the hazard function to be modeled on a set of explanatory variables without making restrictive assumptions concerning the
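    The Kaplan–Meier estimator described above multiplies, at each distinct event time, the conditional probability of surviving that time among subjects still at risk. A self-contained sketch; the toy data, with events flagged 1 and censored observations flagged 0, are invented for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate: at each distinct event time t with d events
    among n subjects still at risk, survival is multiplied by (1 - d/n).
    events: 1 = event observed, 0 = censored."""
    data = sorted(zip(times, events))
    at_risk, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk
            curve.append((t, s))
        at_risk -= removed                 # events and censorings leave the risk set
    return curve

# Invented toy sample: events at times 1, 2, 4, 5; the subject at 3 is censored
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1])
```

    Note how the censored subject still shrinks the risk set at time 3 without producing a step in the curve, which is exactly how censoring is handled without discarding the observation.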

  9. Signs over time: Statistical and visual analysis of a longitudinal signed network

    NARCIS (Netherlands)

    de Nooy, W.

    2008-01-01

    This paper presents the design and results of a statistical and visual analysis of a dynamic signed network. In addition to prevalent approaches to longitudinal networks, which analyze series of cross-sectional data, this paper focuses on network data measured in continuous time in order to explain

  10. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
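    The wrap-around of interferometric phase, and its unwrapping, can be illustrated in one (temporal) dimension with numpy.unwrap; real InSAR algorithms must resolve the ambiguity jointly in space and time, which this sketch does not attempt:

```python
import numpy as np

# True phase history: steadily accumulating phase (monotonic deformation)
true_phase = np.linspace(0, 6 * np.pi, 50)

# An interferometer observes phase only modulo 2*pi, wrapped into (-pi, pi]
wrapped = np.angle(np.exp(1j * true_phase))

# Temporal unwrapping recovers the continuous history, valid as long as
# successive samples differ by less than pi (here the step is about 0.38 rad)
unwrapped = np.unwrap(wrapped)
```

    The recovered continuous phase is then scaled by the radar wavelength to obtain displacement, which is why unwrapping errors translate directly into deformation-rate errors.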

  11. Comparing Visual and Statistical Analysis of Multiple Baseline Design Graphs.

    Science.gov (United States)

    Wolfe, Katie; Dickenson, Tammiee S; Miller, Bridget; McGrath, Kathleen V

    2018-04-01

    A growing number of statistical analyses are being developed for single-case research. One important factor in evaluating these methods is the extent to which each corresponds to visual analysis. Few studies have compared statistical and visual analysis, and information about more recently developed statistics is scarce. Therefore, our purpose was to evaluate the agreement between visual analysis and four statistical analyses: improvement rate difference (IRD); Tau-U; Hedges, Pustejovsky, Shadish (HPS) effect size; and between-case standardized mean difference (BC-SMD). Results indicate that IRD and BC-SMD had the strongest overall agreement with visual analysis. Although Tau-U had strong agreement with visual analysis on raw values, it had poorer agreement when those values were dichotomized to represent the presence or absence of a functional relation. Overall, visual analysis appeared to be more conservative than statistical analysis, but further research is needed to evaluate the nature of these disagreements.

  12. Stochastic Analysis : A Series of Lectures

    CERN Document Server

    Dozzi, Marco; Flandoli, Franco; Russo, Francesco

    2015-01-01

    This book presents in thirteen refereed survey articles an overview of modern activity in stochastic analysis, written by leading international experts. The topics addressed include stochastic fluid dynamics and regularization by noise of deterministic dynamical systems; stochastic partial differential equations driven by Gaussian or Lévy noise, including the relationship between parabolic equations and particle systems, and wave equations in a geometric framework; Malliavin calculus and applications to stochastic numerics; stochastic integration in Banach spaces; porous media-type equations; stochastic deformations of classical mechanics and Feynman integrals and stochastic differential equations with reflection. The articles are based on short courses given at the Centre Interfacultaire Bernoulli of the Ecole Polytechnique Fédérale de Lausanne, Switzerland, from January to June 2012. They offer a valuable resource not only for specialists, but also for other researchers and Ph.D. students in the fields o...

  13. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    Science.gov (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
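    The segments-as-words step of this framework can be sketched simply: cut each channel into local segments, quantize them into a small alphabet, and count symbols into a document vector. The segment length, alphabet size, and mean-based quantization below are illustrative choices, not the paper's actual word-extraction procedure, and the hierarchical pLSA itself is omitted:

```python
import numpy as np

def segment_words(x, seg_len=8, alphabet=4):
    """Cut a 1-D channel into non-overlapping segments and quantize each
    segment's mean into one of `alphabet` symbols: the 'words'."""
    n = len(x) // seg_len
    means = x[:n * seg_len].reshape(n, seg_len).mean(axis=1)
    edges = np.quantile(x, np.linspace(0, 1, alphabet + 1)[1:-1])
    return np.digitize(means, edges)

def bag_of_words(words, alphabet=4):
    """Normalized symbol histogram: the channel's document vector."""
    counts = np.bincount(words, minlength=alphabet).astype(float)
    return counts / counts.sum()

# One synthetic channel standing in for an ECG lead
rng = np.random.default_rng(5)
chan = np.sin(np.arange(512) * 0.1) + 0.1 * rng.standard_normal(512)
vec = bag_of_words(segment_words(chan))
```

    In the paper's two-layer design, vectors like `vec` would feed a per-channel local pLSA, whose topics in turn feed the global pLSA that clusters the multichannel recordings.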

  14. VisIO: enabling interactive visualization of ultra-scale, time-series data via high-bandwidth distributed I/O systems

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Christopher J [Los Alamos National Laboratory; Ahrens, James P [Los Alamos National Laboratory; Wang, Jun [UCF

    2010-10-15

    Petascale simulations compute at resolutions ranging into billions of cells and write terabytes of data for visualization and analysis. Interactive visualization of this time series is a desired step before starting a new run. The I/O subsystem and associated network often are a significant impediment to interactive visualization of time-varying data, as they are not configured or provisioned to provide necessary I/O read rates. In this paper, we propose a new I/O library for visualization applications: VisIO. Visualization applications commonly use N-to-N reads within their parallel-enabled readers, which provides an incentive for a shared-nothing approach to I/O, similar to other data-intensive approaches such as Hadoop. However, unlike other data-intensive applications, visualization requires: (1) interactive performance for large data volumes, (2) compatibility with MPI and POSIX file system semantics for compatibility with existing infrastructure, and (3) use of existing file formats and their stipulated data partitioning rules. VisIO provides a mechanism for using a non-POSIX distributed file system to provide linear scaling of I/O bandwidth. In addition, we introduce a novel scheduling algorithm that helps to co-locate visualization processes on nodes with the requested data. Testing using VisIO integrated into ParaView was conducted using the Hadoop Distributed File System (HDFS) on TACC's Longhorn cluster. A representative dataset, VPIC, across 128 nodes showed a 64.4% read performance improvement compared to the provided Lustre installation. Also tested was a dataset representing a global ocean salinity simulation that showed a 51.4% improvement in read performance over Lustre when using our VisIO system. VisIO provides powerful high-performance I/O services to visualization applications, allowing for interactive performance with ultra-scale, time-series data.

  15. Volatility Analysis of Bitcoin Price Time Series

    Directory of Open Access Journals (Sweden)

    Lukáš Pichl

    2017-12-01

    Full Text Available Bitcoin has the largest share in the total capitalization of cryptocurrency markets currently reaching above 70 billion USD. In this work we focus on the price of Bitcoin in terms of standard currencies and their volatility over the last five years. The average day-to-day return throughout this period is 0.328%, amounting to exponential growth from 6 USD to over 4,000 USD per 1 BTC at present. Multi-scale analysis is performed from the level of the tick data, through the 5 min, 1 hour and 1 day scales. Distribution of trading volumes (1 sec, 1 min, 1 hour and 1 day) aggregated from the Kraken BTCEUR tick data is provided that shows the artifacts of algorithmic trading (selling transactions with volume peaks distributed at integer multiples of the BTC unit). Arbitrage opportunities are studied using the EUR, USD and CNY currencies. Whereas the arbitrage spread for the EUR-USD currency pair is found narrow at the order of a percent, at the 1 hour sampling period the arbitrage spread for USD-CNY (and similarly EUR-CNY) is found to be more substantial, reaching as high as above 5 percent on rare occasions. The volatility of BTC exchange rates is modeled using the day-to-day distribution of logarithmic return, and the Realized Volatility, sum of the squared logarithmic returns on a 5-minute basis. In this work we demonstrate that the Heterogeneous Autoregressive model for Realized Volatility Andersen et al. (2007) applies reasonably well to the BTCUSD dataset. Finally, a feed-forward neural network with 2 hidden layers using 10-day moving window sampling daily return predictors is applied to estimate the next-day logarithmic return. The results show that such an artificial neural network prediction is capable of approximate capture of the actual log return distribution; more sophisticated methods, such as recurrent neural networks and LSTM (Long Short Term Memory) techniques from deep learning may be necessary for higher prediction accuracy.
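    The Realized Volatility and HAR steps can be sketched with numpy: daily realized variance as the sum of squared 5-minute log returns, and the heterogeneous autoregressive regression on daily, weekly, and monthly averages estimated by ordinary least squares. The simulated returns are illustrative stand-ins, not Bitcoin data:

```python
import numpy as np

def realized_volatility(intraday_log_returns):
    """Daily realized variance: sum of squared intraday log returns."""
    return (intraday_log_returns ** 2).sum(axis=1)

def fit_har(rv):
    """HAR-RV regression: RV[t+1] ~ c + b_d*RV[t] + b_w*mean(RV[t-4..t])
    + b_m*mean(RV[t-21..t]), estimated by ordinary least squares."""
    X, y = [], []
    for t in range(21, len(rv) - 1):
        X.append([1.0, rv[t], rv[t - 4:t + 1].mean(), rv[t - 21:t + 1].mean()])
        y.append(rv[t + 1])
    coef, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
    return coef

# 500 simulated days of 78 five-minute log returns each (illustrative data)
rng = np.random.default_rng(6)
rv = realized_volatility(0.01 * rng.standard_normal((500, 78)))
coef = fit_har(rv)
```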

  16. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-world

  17. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  18. The Photoplethismographic Signal Processed with Nonlinear Time Series Analysis Tools

    International Nuclear Information System (INIS)

    Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia

    2001-01-01

    Finger photoplethysmography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and for estimating the signal's stochastic components.
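
    Higuchi's time-domain method named in (iii) is compact enough to sketch directly; the implementation below is a generic one on synthetic data, not the authors' code.

    ```python
    import numpy as np

    def higuchi_fd(x, kmax=8):
        """Estimate the fractal dimension of a 1-D signal with Higuchi's method."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        log_inv_k, log_len = [], []
        for k in range(1, kmax + 1):
            lengths = []
            for m in range(k):                    # one coarse-grained curve per offset m
                idx = np.arange(m, n, k)
                norm = (n - 1) / ((len(idx) - 1) * k)
                lengths.append(np.abs(np.diff(x[idx])).sum() * norm / k)
            log_inv_k.append(np.log(1.0 / k))
            log_len.append(np.log(np.mean(lengths)))
        # fractal dimension = slope of log L(k) versus log(1/k)
        slope, _ = np.polyfit(log_inv_k, log_len, 1)
        return slope
    ```

    As a sanity check, a straight line yields a dimension of 1, while white noise approaches 2; physiological signals such as PPG typically fall in between.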

  19. SPATIOTEMPORAL VISUALIZATION OF TIME-SERIES SATELLITE-DERIVED CO2 FLUX DATA USING VOLUME RENDERING AND GPU-BASED INTERPOLATION ON A CLOUD-DRIVEN DIGITAL EARTH

    Directory of Open Access Journals (Sweden)

    S. Wu

    2017-10-01

    The ocean carbon cycle has a significant influence on global climate, and is commonly evaluated using time-series satellite-derived CO2 flux data. Location-aware and globe-based visualization is an important technique for analyzing and presenting the evolution of climate change. To achieve realistic simulation of the spatiotemporal dynamics of ocean carbon, a cloud-driven digital earth platform is developed to support the interactive analysis and display of multi-geospatial data, and an original visualization method based on our digital earth is proposed to demonstrate the spatiotemporal variations of carbon sinks and sources using time-series satellite data. Specifically, a volume rendering technique using half-angle slicing and a particle system is implemented to dynamically display the released or absorbed CO2 gas. To enable location-aware visualization within the virtual globe, we present a 3D particle-mapping algorithm to render particle-slicing textures onto geospace. In addition, a GPU-based interpolation framework using CUDA during real-time rendering is designed to obtain smooth effects in both spatial and temporal dimensions. To demonstrate the capabilities of the proposed method, a series of satellite data is applied to simulate the air-sea carbon cycle in the China Sea. The results show that the suggested strategies provide realistic simulation effects and acceptable interactive performance on the digital earth.

  20. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz

    2005-01-01

    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method
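
    The maximal Lyapunov exponent calculation referred to above can be sketched with a Rosenstein-style nearest-neighbour divergence estimate (an assumption on my part; the record does not specify the exact algorithm). The test signal here is the chaotic logistic map rather than an electrocardiogram.

    ```python
    import numpy as np

    def max_lyapunov(x, dim=2, tau=1, steps=5, theiler=5):
        """Rosenstein-style estimate of the maximal Lyapunov exponent."""
        n = len(x) - (dim - 1) * tau
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
        m = n - steps                            # leave room to follow neighbours forward
        d = np.linalg.norm(emb[:m, None, :] - emb[None, :m, :], axis=2)
        for i in range(m):                       # Theiler window: exclude close-in-time pairs
            d[i, max(0, i - theiler):i + theiler + 1] = np.inf
        nn = d.argmin(axis=1)                    # nearest neighbour of each point
        div = []
        for j in range(1, steps + 1):            # mean log-separation after j steps
            sep = np.linalg.norm(emb[np.arange(m) + j] - emb[nn + j], axis=1)
            div.append(np.log(sep[sep > 0]).mean())
        slope, _ = np.polyfit(np.arange(1, steps + 1), div, 1)
        return slope

    x = np.empty(1500)
    x[0] = 0.1
    for t in range(1, len(x)):                   # logistic map at r = 4 (chaotic)
        x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])
    lam = max_lyapunov(x)                        # theory gives ln 2 ~ 0.693 for this map
    ```

    A positive slope indicates exponential divergence of nearby trajectories, i.e. deterministic chaos; the estimate is only meaningful once determinism and stationarity have been established, as the record stresses.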

  1. Developing a functioning visualization and analysis system for performance assessment

    International Nuclear Information System (INIS)

    Jones, M.L.

    1992-01-01

    Various commercial software packages and customized programs provide the ability to analyze and visualize the geology of Yucca Mountain. Starting with sparse, irregularly spaced data, a series of gridded models has been developed representing the thermal/mechanical units within the mountain. Using computer-aided design (CAD) software and scientific visualization software, the units can be manipulated, analyzed, and graphically displayed. The outputs are typically gridded terrain models, along with files of three-dimensional coordinates, distances, and other dimensional values. Contour maps, profiles, and shaded surfaces are the outputs for visualization.

  2. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that exhibit great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with visualization. Additionally, visualization of the drawer masters and the core configuration is necessary to minimize human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools is described, and various visualization samples for both drawer masters and ZPPR-15 cores are demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for the ZPPR-15 analysis and are expected to be useful for understanding ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is straightforward, and their application is powerful with the aid of the VisIt program.

  3. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de…

  4. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  5. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity data, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time-domain as well as frequency-domain analysis. After that, prediction can be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the assistance of discrete wavelet transforms (DWT) and the maximal overlap discrete wavelet transform (MODWT), will be used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
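
    A minimal illustration of the multiresolution idea, using only the Haar wavelet (the study's analysis uses DWT and MODWT with standard wavelet filters, for which a library such as PyWavelets would normally be used):

    ```python
    import numpy as np

    def haar_dwt(x):
        """One level of the orthonormal Haar DWT: approximation + detail."""
        x = np.asarray(x, dtype=float)
        approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-pass: smooth trend
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-pass: local changes
        return approx, detail

    def multiresolution(x, levels=3):
        """Split a series (length divisible by 2**levels) into trend + details."""
        details, approx = [], np.asarray(x, dtype=float)
        for _ in range(levels):
            approx, d = haar_dwt(approx)
            details.append(d)                          # finest scale first
        return approx, details
    ```

    Because the transform is orthonormal, the energy of the series is preserved across scales, which is what makes decomposing variance by scale (e.g. daily versus weekly fluctuations in an index) meaningful.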

  6. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most research, which focuses on a single futures contract and lacks comparison across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Beyond basic statistical analysis, the paper used GARCH and EGARCH models to describe the time series exhibiting ARCH effects, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series are leptokurtic and thick-tailed. The study also found that two of the reward series had no autocorrelation. Among the six correlated series, three presented ARCH effects. Using an autoregressive distributed lag model together with GARCH and EGARCH models, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of wheat futures rewards are broadly similar to those of mature futures markets abroad; on the other hand, they reflect shortcomings such as immaturity and over-regulation by the government in the Chinese futures market.
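
    A GARCH(1,1) fit of the kind used in the paper can be sketched by direct maximum likelihood. This is a generic sketch on placeholder returns, not the Zhengzhou wheat data, and the EGARCH leverage term is omitted.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def garch11_nll(params, r):
        """Negative Gaussian log-likelihood of a GARCH(1,1) volatility model."""
        omega, alpha, beta = params
        if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
            return np.inf                        # reject non-stationary parameters
        s2 = np.empty_like(r)
        s2[0] = r.var()                          # initialize at the sample variance
        for t in range(1, len(r)):
            s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]
        return 0.5 * np.sum(np.log(2 * np.pi * s2) + r ** 2 / s2)

    rng = np.random.default_rng(1)
    r = 0.01 * rng.standard_normal(500)          # placeholder daily return series
    fit = minimize(garch11_nll, x0=[r.var() * 0.1, 0.05, 0.90],
                   args=(r,), method="Nelder-Mead")
    omega, alpha, beta = fit.x                   # fitted volatility parameters
    ```

    Persistence of volatility shocks is read off from alpha + beta: the closer the sum is to 1, the longer a shock to volatility takes to die out.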

  7. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)] e-mail: mhanias@teihal.gr; Giannaris, G. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Spyridakis, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.
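
    The Grassberger-Procaccia correlation sum that underlies these estimates can be sketched as follows; the correlation dimension ν is the slope of log C(r) versus log r in the scaling region. This is a generic implementation, not the circuit data.

    ```python
    import numpy as np

    def correlation_sum(x, dim=2, tau=1, n_radii=10):
        """Correlation sums C(r) of a delay embedding (Grassberger-Procaccia)."""
        n = len(x) - (dim - 1) * tau
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
        d = d[np.triu_indices(n, k=1)]           # each pair of points counted once
        radii = np.geomspace(np.percentile(d, 1), np.percentile(d, 20), n_radii)
        c = np.array([(d < r).mean() for r in radii])
        return radii, c

    # sanity check: uniform noise embedded in 1-D has correlation dimension ~ 1
    rng = np.random.default_rng(0)
    radii, c = correlation_sum(rng.uniform(size=500), dim=1)
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1)
    ```

    In practice the slope is estimated for increasing embedding dimensions until it saturates, which gives both ν and the minimum embedding dimension.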

  8. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.

  9. Time series analysis of monthly pulpwood use in the Northeast

    Science.gov (United States)

    James T. Bones

    1980-01-01

    Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.

  10. Time Series Analysis Based on Running Mann Whitney Z Statistics

    Science.gov (United States)

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
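
    The running-window construction described above can be sketched with SciPy's Mann-Whitney U test. Here the Monte Carlo normalization mentioned in the abstract is replaced by the standard analytic mean and variance of U under the null hypothesis (an assumption for illustration).

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    def running_mw_z(series, window=30):
        """Z statistics from Mann-Whitney U tests on adjacent moving windows."""
        z = []
        for t in range(window, len(series) - window + 1):
            before, after = series[t - window:t], series[t:t + window]
            u = mannwhitneyu(before, after, alternative="two-sided").statistic
            mu = window * window / 2.0                     # mean of U under H0
            sd = np.sqrt(window * window * (2 * window + 1) / 12.0)
            z.append((u - mu) / sd)                        # normal approximation
        return np.array(z)

    rng = np.random.default_rng(0)
    series = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
    z = running_mw_z(series)       # |z| peaks where the two windows straddle the shift
    ```

    Because the statistic is rank-based, the resulting change-point signal is robust to outliers and makes no distributional assumptions, which matches the method's stated objectivity.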

  11. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  12. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Velsink, H.

    2016-01-01

    Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on

  13. Time Series Analysis of 3D Coordinates Using Nonstochastic Observations

    NARCIS (Netherlands)

    Hiddo Velsink

    2016-01-01

    From the article: Abstract Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to

  14. Identification of human operator performance models utilizing time series analysis

    Science.gov (United States)

    Holden, F. M.; Shinners, S. M.

    1973-01-01

    The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.

  15. Analysis and implementation of LLC-T series parallel resonant ...

    African Journals Online (AJOL)

    A prototype 300 W, 100 kHz converter was designed and built to experimentally demonstrate the dynamic and steady-state performance of the LLC-T series parallel resonant converter. A comparative study is performed between the experimental results and the simulation studies. The analysis shows that the output of the converter is ...

  16. Visual modeling in an analysis of multidimensional data

    Science.gov (United States)

    Zakharova, A. A.; Vekhter, E. V.; Shklyar, A. V.; Pak, A. J.

    2018-01-01

    The article proposes an approach to solving visualization problems and the subsequent analysis of multidimensional data. Requirements for the properties of visual models created to solve analysis problems are described. As a promising direction for the development of visual analysis tools for multidimensional and voluminous data, active use of factors of subjective perception and dynamic visualization is suggested. Practical results of solving the problem of multidimensional data analysis are shown using the example of a visual model of empirical data on the current state of research into processes of obtaining silicon carbide by an electric arc method. The results include, first, an idea of the possibilities of determining a development strategy for the domain; second, an assessment of the reliability of the published data on this subject; and third, changes in the areas of attention of researchers over time.

  17. Avogadro: an advanced semantic chemical editor, visualization, and analysis platform

    Directory of Open Access Journals (Sweden)

    Hanwell Marcus D

    2012-08-01

    Background: The Avogadro project has developed an advanced molecule editor and visualizer designed for cross-platform use in computational chemistry, molecular modeling, bioinformatics, materials science, and related areas. It offers flexible, high-quality rendering and a powerful plugin architecture. Typical uses include building molecular structures, formatting input files, and analyzing output of a wide variety of computational chemistry packages. By using the CML file format as its native document type, Avogadro seeks to enhance the semantic accessibility of chemical data types. Results: The work presented here details the Avogadro library, a framework providing a code library and application programming interface (API) with three-dimensional visualization capabilities; it has direct applications to research and education in the fields of chemistry, physics, materials science, and biology. The Avogadro application provides a rich graphical interface using dynamically loaded plugins through the library itself. The application and library can each be extended by implementing a plugin module in C++ or Python to explore different visualization techniques, build/manipulate molecular structures, and interact with other programs. We describe some example extensions, one which uses a genetic algorithm to find stable crystal structures, and one which interfaces with the PackMol program to create packed, solvated structures for molecular dynamics simulations. The 1.0 release series of Avogadro is the main focus of the results discussed here. Conclusions: Avogadro offers a semantic chemical builder and platform for visualization and analysis. For users, it offers an easy-to-use builder, integrated support for downloading from common databases such as PubChem and the Protein Data Bank, extraction of chemical data from a wide variety of formats, including computational chemistry output, and native, semantic support for the CML file format.

  18. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  19. Time series analysis of ozone data in Isfahan

    Science.gov (United States)

    Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.

    2008-07-01

    Time series analysis was used to investigate stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reasons for extremely high ozone concentrations in each season. The data were converted into a seasonal component and into the frequency domain; the latter was evaluated using the Fast Fourier Transform (FFT) for spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, the largest fluctuation occurred in 1999 and 2000, and the smallest in 2003. The best correlation between ozone and solar radiation was found in 2000. Other variables, which are not available, caused the fluctuations in 1999 and 2001. The ozone trend is increasing in 1999 and decreasing in the other years.
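
    The FFT-based peak detection the authors describe can be sketched generically. The example below uses synthetic data with known 365- and 22-day cycles, not the Isfahan ozone record.

    ```python
    import numpy as np

    def dominant_periods(x, dt=1.0, k=3):
        """Return the k strongest cycle durations of a detrended series via the FFT."""
        x = np.asarray(x, dtype=float)
        power = np.abs(np.fft.rfft(x - x.mean())) ** 2   # power density spectrum
        freqs = np.fft.rfftfreq(len(x), d=dt)
        top = np.argsort(power[1:])[::-1][:k] + 1        # strongest bins, skipping DC
        return 1.0 / freqs[top]

    t = np.arange(730)                                   # two years of daily values
    x = np.sin(2 * np.pi * t / 365) + 0.5 * np.sin(2 * np.pi * t / 22)
    periods = dominant_periods(x, k=2)                   # strongest period first
    ```

    Note that the frequency resolution is 1/(N·dt), so a 22-day cycle in a 730-day record lands between bins and is recovered only approximately; the annual cycle falls on an exact bin.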

  20. Navigating nuclear science: Enhancing analysis through visualization

    Energy Technology Data Exchange (ETDEWEB)

    Irwin, N.H.; Berkel, J. van; Johnson, D.K.; Wylie, B.N.

    1997-09-01

    Data visualization is an emerging technology with high potential for addressing the information overload problem. This project extends the data visualization work of the Navigating Science project by coupling it with more traditional information retrieval methods. A citation-derived landscape was augmented with documents using a text-based similarity measure to show viability of extension into datasets where citation lists do not exist. Landscapes, showing hills where clusters of similar documents occur, can be navigated, manipulated and queried in this environment. The capabilities of this tool provide users with an intuitive explore-by-navigation method not currently available in today's retrieval systems.

  1. Occam's razor and petascale visual data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E Wes [Lawrence Berkeley National Laboratory (LBNL); Johnson, Chris [University of Utah; Ahern, Sean [ORNL; Bell, J. [Lawrence Berkeley National Laboratory (LBNL); Bremer, Peer-Timo [Lawrence Livermore National Laboratory (LLNL); Childs, Hank [Lawrence Livermore National Laboratory (LLNL); Cormier-Michel, E [unknown; Day, M. [Lawrence Berkeley National Laboratory (LBNL); Deines, E. [University of California, Davis; Fogal, T. [University of Utah; Garth, Christoph [unknown; Geddes, C.G.R. [unknown; Hagen, H [unknown; Hamann, Bernd [unknown; Hansen, Charles [University of Utah; Jacobsen, Janet [Lawrence Berkeley National Laboratory (LBNL); Joy, Kenneth [University of California, Davis; Kruger, J. [University of Utah; Meredith, Jeremy S [ORNL; Messmer, P [unknown; Ostrouchov, George [ORNL; Pascucci, Valerio [Lawrence Livermore National Laboratory (LLNL); Potter, K. [University of Utah; Prabhat, [Lawrence Berkeley National Laboratory (LBNL); Pugmire, Dave [ORNL; Ruebel, O [unknown; Sanderson, Allen [University of Utah; Silva, C. [University of Utah; Ushizima, D. [Lawrence Berkeley National Laboratory (LBNL); Weber, G. [Lawrence Berkeley National Laboratory (LBNL); Whitlock, B. [Lawrence Livermore National Laboratory (LLNL); Wu, K. [Lawrence Berkeley National Laboratory (LBNL)

    2009-01-01

    One of the central challenges facing visualization research is how to effectively enable knowledge discovery. An effective approach will likely combine application architectures that are capable of running on today's largest platforms to address the challenges posed by large data with visual data analysis techniques that help find, represent, and effectively convey scientifically interesting features and phenomena.

  2. Simulation and formal analysis of visual attention

    NARCIS (Netherlands)

    Bosse, T.; Maanen, P.P. van; Treur, J.

    2009-01-01

    In this paper a simulation model for visual attention is discussed and formally analysed. The model is part of the design of an agent-based system that supports a naval officer in its task to compile a tactical picture of the situation in the field. A case study is described in which the model is

  3. Exclusively Visual Analysis of Classroom Group Interactions

    Science.gov (United States)

    Tucker, Laura; Scherr, Rachel E.; Zickler, Todd; Mazur, Eric

    2016-01-01

    Large-scale audiovisual data that measure group learning are time consuming to collect and analyze. As an initial step towards scaling qualitative classroom observation, we qualitatively coded classroom video using an established coding scheme with and without its audio cues. We find that interrater reliability is as high when using visual data…

  4. BiGGEsTS: integrated environment for biclustering analysis of time series gene expression data

    Directory of Open Access Journals (Sweden)

    Madeira Sara C

    2009-07-01

    Background: The ability to monitor changes in expression patterns over time, and to observe the emergence of coherent temporal responses using expression time series, is critical to advance our understanding of complex biological processes. Biclustering has been recognized as an effective method for discovering local temporal expression patterns and unraveling potential regulatory mechanisms. The general biclustering problem is NP-hard; in the case of time series the problem is tractable, and efficient algorithms can be used. However, there is still a need for specialized applications able to take advantage of the temporal properties inherent to expression time series, from both a computational and a biological perspective. Findings: BiGGEsTS makes available state-of-the-art biclustering algorithms for analyzing expression time series. Gene Ontology (GO) annotations are used to assess the biological relevance of the biclusters. Methods for preprocessing expression time series and post-processing results are also included. The analysis is additionally supported by a visualization module capable of displaying informative representations of the data, including heatmaps, dendrograms, expression charts and graphs of enriched GO terms. Conclusion: BiGGEsTS is a free open-source graphical software tool for revealing local coexpression of genes in specific intervals of time, while integrating meaningful information on gene annotations. It is freely available at: http://kdbio.inesc-id.pt/software/biggests. We present a case study on the discovery of transcriptional regulatory modules in the response of Saccharomyces cerevisiae to heat stress.

  5. Interactive Correlation Analysis and Visualization of Climate Data

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Kwan-Liu [Univ. of California, Davis, CA (United States)

    2016-09-21

    The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, the number of different simulations performed with a climate model, or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models. While traditional climate images will continue to be useful, there is a need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.

  6. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide-range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine whether there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after the detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  7. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  8. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.

  9. Spectral Unmixing Analysis of Time Series Landsat 8 Images

    Science.gov (United States)

    Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.

    2018-05-01

    Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied. Nevertheless, using the temporal information can provide improved unmixing performance when compared to independent image analyses. Moreover, different land cover types may demonstrate different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach is to obtain "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, each estimated endmember is updated as the mean value of its "purified" pixels, i.e., the residual of the mixed pixels after excluding the contribution of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework yields the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimation results compared with the "separate unmixing" approach.
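
    The NNLS abundance-estimation step can be sketched with SciPy. This is a generic per-pixel solver with sum-to-one renormalization; the K-P-Means iteration and VCA initialization are not reproduced.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    def estimate_abundances(pixels, endmembers):
        """Per-pixel NNLS abundances, renormalized to sum to one."""
        # pixels: (n_pixels, n_bands); endmembers: (n_endmembers, n_bands)
        ab = np.array([nnls(endmembers.T, p)[0] for p in pixels])
        return ab / ab.sum(axis=1, keepdims=True)

    # two toy endmember spectra over four bands (illustrative values only)
    E = np.array([[1.0, 0.8, 0.1, 0.0],
                  [0.0, 0.2, 0.9, 1.0]])
    true_ab = np.array([[0.3, 0.7], [0.9, 0.1]])
    pixels = true_ab @ E                         # noiseless linearly mixed pixels
    ab = estimate_abundances(pixels, E)
    ```

    For noiseless mixtures with linearly independent endmembers the solver recovers the abundances exactly; with real imagery the nonnegativity constraint is what keeps the estimated fractions physically interpretable.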

  10. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  11. Analysis of Visual Interpretation of Satellite Data

    Science.gov (United States)

    Svatonova, H.

    2016-06-01

    Millions of people of all ages and expertise are using satellite and aerial data as an important input for their work in many different fields. Satellite data are also gradually finding a new place in education, especially in the fields of geography and environmental issues. The article presents the results of extensive research in the area of visual interpretation of image data carried out in the years 2013 - 2015 in the Czech Republic. The research was aimed at comparing the success rate of the interpretation of satellite data in relation to a) the substrates (the selected colourfulness, the type of depicted landscape, or special elements in the landscape) and b) selected characteristics of users (expertise, gender, age). The results of the research showed that (1) false colour images have a slightly higher percentage of successful interpretation than natural colour images, (2) colourfulness of an element expected or rehearsed by the user (regardless of the real natural colour) increases the success rate of identifying the element, (3) experts are faster in interpreting visual data than non-experts, with the same degree of accuracy in solving the task, and (4) men and women are equally successful in the interpretation of visual image data.

  12. ANALYSIS OF VISUAL INTERPRETATION OF SATELLITE DATA

    Directory of Open Access Journals (Sweden)

    H. Svatonova

    2016-06-01

    Full Text Available Millions of people of all ages and expertise are using satellite and aerial data as an important input for their work in many different fields. Satellite data are also gradually finding a new place in education, especially in the fields of geography and environmental issues. The article presents the results of extensive research in the area of visual interpretation of image data carried out in the years 2013 - 2015 in the Czech Republic. The research was aimed at comparing the success rate of the interpretation of satellite data in relation to a) the substrates (the selected colourfulness, the type of depicted landscape, or special elements in the landscape) and b) selected characteristics of users (expertise, gender, age). The results of the research showed that (1) false colour images have a slightly higher percentage of successful interpretation than natural colour images, (2) colourfulness of an element expected or rehearsed by the user (regardless of the real natural colour) increases the success rate of identifying the element, (3) experts are faster in interpreting visual data than non-experts, with the same degree of accuracy in solving the task, and (4) men and women are equally successful in the interpretation of visual image data.

  13. Visual Analysis of Inclusion Dynamics in Two-Phase Flow.

    Science.gov (United States)

    Karch, Grzegorz Karol; Beck, Fabian; Ertl, Moritz; Meister, Christian; Schulte, Kathrin; Weigand, Bernhard; Ertl, Thomas; Sadlo, Filip

    2018-05-01

    In single-phase flow visualization, research focuses on the analysis of vector field properties. In two-phase flow, in contrast, analysis of the phase components is typically of major interest. So far, visualization research on two-phase flow has concentrated on proper interface reconstruction and its analysis. In this paper, we present a novel visualization technique that enables the investigation of complex two-phase flow phenomena with respect to the physics of breakup and coalescence of inclusions. On the one hand, we adapt dimensionless quantities for a localized analysis of phase instability and breakup, and provide detailed inspection of breakup dynamics with emphasis on oscillation and its interplay with rotational motion. On the other hand, we present a parametric, tightly linked space-time visualization approach for an effective interactive representation of the overall dynamics. We demonstrate the utility of our approach using several two-phase CFD datasets.

  14. Enabling dynamic network analysis through visualization in TVNViewer

    Directory of Open Access Journals (Sweden)

    Curtis Ross E

    2012-08-01

    Full Text Available Abstract Background Many biological processes are context-dependent or temporally specific. As a result, relationships between molecular constituents evolve across time and environments. While cutting-edge machine learning techniques can recover these networks, exploring and interpreting the rewiring behavior is challenging. Information visualization shines in this type of exploratory analysis, motivating the development of TVNViewer (http://sailing.cs.cmu.edu/tvnviewer), a visualization tool for dynamic network analysis. Results In this paper, we demonstrate visualization techniques for dynamic network analysis by using TVNViewer to analyze yeast cell cycle and breast cancer progression datasets. Conclusions TVNViewer is a powerful new visualization tool for the analysis of biological networks that change across time or space.

  15. Enabling dynamic network analysis through visualization in TVNViewer

    Science.gov (United States)

    2012-01-01

    Background Many biological processes are context-dependent or temporally specific. As a result, relationships between molecular constituents evolve across time and environments. While cutting-edge machine learning techniques can recover these networks, exploring and interpreting the rewiring behavior is challenging. Information visualization shines in this type of exploratory analysis, motivating the development of TVNViewer (http://sailing.cs.cmu.edu/tvnviewer), a visualization tool for dynamic network analysis. Results In this paper, we demonstrate visualization techniques for dynamic network analysis by using TVNViewer to analyze yeast cell cycle and breast cancer progression datasets. Conclusions TVNViewer is a powerful new visualization tool for the analysis of biological networks that change across time or space. PMID:22897913

  16. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  17. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000 and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of the financial time series presented here.
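
    The sliding-window extraction and the Lp-norm computation can be sketched as below. The persistent-homology step itself (computing landscapes from each point cloud) would rely on a TDA library and is omitted; the function names here are ours, not the authors'.

```python
import numpy as np

def sliding_point_clouds(returns, w):
    """Slide a length-w window over a (T, d) array of daily returns of
    d indices; each window becomes one point cloud of w points in R^d."""
    T = returns.shape[0]
    return [returns[t:t + w] for t in range(T - w + 1)]

def lp_norm(values, dx, p=2):
    """L^p norm of a persistence landscape sampled on a uniform grid
    with spacing dx, via the trapezoid rule."""
    v = np.abs(np.asarray(values, dtype=float)) ** p
    return (0.5 * dx * (v[1:] + v[:-1]).sum()) ** (1.0 / p)
```

    In the paper's setting, each window's point cloud feeds the persistence computation, and the resulting landscape norms form a new univariate time series whose growth is monitored.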

  18. Visualization and Data Analysis for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Sewell, Christopher Meyer [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, "big data", and then an analysis example.

  19. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  20. The physiology analysis system: an integrated approach for warehousing, management and analysis of time-series physiology data.

    Science.gov (United States)

    McKenna, Thomas M; Bawa, Gagandeep; Kumar, Kamal; Reifman, Jaques

    2007-04-01

    The physiology analysis system (PAS) was developed as a resource to support the efficient warehousing, management, and analysis of physiology data, particularly, continuous time-series data that may be extensive, of variable quality, and distributed across many files. The PAS incorporates time-series data collected by many types of data-acquisition devices, and it is designed to free users from data management burdens. This Web-based system allows both discrete (attribute) and time-series (ordered) data to be manipulated, visualized, and analyzed via a client's Web browser. All processes occur on a server, so that the client does not have to download data or any application programs, and the PAS is independent of the client's computer operating system. The PAS contains a library of functions, written in different computer languages that the client can add to and use to perform specific data operations. Functions from the library are sequentially inserted into a function chain-based logical structure to construct sophisticated data operators from simple function building blocks, affording ad hoc query and analysis of time-series data. These features support advanced mining of physiology data.
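
    The function chain-based structure described here can be illustrated with a minimal sketch; the function names and building blocks below are hypothetical, not part of the PAS library.

```python
from functools import reduce

def chain(*functions):
    """Compose simple function building blocks, applied left to right,
    into one data operator, mirroring the PAS function-chain idea."""
    return reduce(lambda f, g: lambda x: g(f(x)), functions, lambda x: x)

# Example blocks: drop missing samples, smooth, then summarize.
drop_missing = lambda xs: [x for x in xs if x is not None]
moving_mean3 = lambda xs: [sum(xs[i:i + 3]) / 3 for i in range(len(xs) - 2)]
peak = lambda xs: max(xs)

# Ad hoc operator built from the blocks above.
operator = chain(drop_missing, moving_mean3, peak)
```

    Because each block has the same list-in, value-out shape, new operators can be assembled ad hoc from the library without modifying the blocks themselves.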

  1. Infants’ Visual Recognition Memory for a Series of Categorically Related Items

    Science.gov (United States)

    Oakes, Lisa M.; Kovack-Lesh, Kristine A.

    2013-01-01

    Six-month-old infants' (N = 168) memory for individual items in a categorized list (e.g., images of dogs or cats) was examined to investigate the interactions between visual recognition memory, working memory, and categorization. In Experiments 1 and 2, infants were familiarized with six different cats or dogs, presented one at a time…

  2. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying its behavior is important for identification and classification. Using SSDT hooking, we can capture a malware sample's behavior by running it in a controlled environment and recording its interactions with the target operating system regarding file, process, registry, network, and mutex activities. This generates a chain of events that can be compared with those of other known malware. In this paper we present a simple approach to converting malware behavior into activity graphs, and we show some visualization techniques that can be used to analyze malware behavior, individually or in groups.
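
    The chain-of-events idea lends itself to a simple sketch: counting ordered transitions between captured events yields an activity graph, and edge sets can then be compared across samples. This is an illustrative reconstruction under our own naming, not the paper's implementation.

```python
from collections import defaultdict

def activity_graph(events):
    """Convert a captured chain of (subsystem, action) events into a
    directed activity graph: keys are (event, next_event) edges, values
    are how often that transition occurred."""
    edges = defaultdict(int)
    for a, b in zip(events, events[1:]):
        edges[(a, b)] += 1
    return dict(edges)

def similarity(g1, g2):
    """Jaccard similarity of two activity graphs' edge sets, a crude
    stand-in for comparing a sample against known malware."""
    e1, e2 = set(g1), set(g2)
    return len(e1 & e2) / len(e1 | e2)
```
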

  3. Network graph analysis and visualization with Gephi

    CERN Document Server

    Cherven, Ken

    2013-01-01

    A practical, hands-on guide that provides you with all the tools you need to visualize and analyze your data using network graphs with Gephi. This book is for data analysts who want to intuitively reveal patterns and trends, highlight outliers, and tell stories with their data using Gephi. It is great for anyone looking to explore interactions within network datasets, whether the data comes from social media or elsewhere. It is also a valuable resource for those seeking to learn more about Gephi without being overwhelmed by technical details.

  4. Analysis and visualization of social user communities

    Directory of Open Access Journals (Sweden)

    Daniel LÓPEZ SÁNCHEZ

    2016-06-01

    Full Text Available In this paper, a novel framework for social user clustering is proposed. Given a current controversial political topic, the Louvain Modularity algorithm is used to detect communities of users sharing the same political preferences. The political alignment of a set of users is labeled manually by a human expert and then the quality of the community detection is evaluated against this gold standard. In the last section, we propose a novel force-directed graph algorithm to generate a visual representation of the detected communities.   
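
    The Louvain Modularity algorithm used above greedily maximizes Newman-Girvan modularity. As a minimal, library-free sketch (the adjacency-set format and function name are our own, not the authors' code), the quantity being optimized can be computed directly for a given partition:

```python
def modularity(adj, communities):
    """Newman-Girvan modularity Q of a partition of an undirected,
    unweighted graph; Louvain greedily maximizes this quantity.

    adj         : dict node -> set of neighbour nodes
    communities : list of sets of nodes (the partition)
    """
    m = sum(len(nbrs) for nbrs in adj.values()) / 2.0  # edge count
    deg = {v: len(nbrs) for v, nbrs in adj.items()}
    comm = {v: i for i, c in enumerate(communities) for v in c}
    q = 0.0
    for v in adj:
        for u in adj:
            if comm[v] == comm[u]:
                a = 1.0 if u in adj[v] else 0.0
                q += a - deg[v] * deg[u] / (2.0 * m)
    return q / (2.0 * m)
```

    A partition that keeps densely connected users together scores close to 1; Louvain repeatedly moves nodes between communities while this score improves.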

  5. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  6. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    Science.gov (United States)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach on time series from both the logistic map and experimental fluid flows, showing that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately while using a reduced set of points from the original recurrence plots.
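
    A minimal recurrence-matrix construction for the logistic map, the starting point that RDE-CN then reduces and reinterprets as a network, might look like this. It is an independent sketch, not the authors' code.

```python
import numpy as np

def logistic_series(r=4.0, x0=0.4, n=200):
    """Iterate the logistic map x -> r*x*(1-x) to produce a test series."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        xs[i] = x
        x = r * x * (1 - x)
    return xs

def recurrence_matrix(xs, eps=0.1):
    """R[i, j] = 1 when states i and j are closer than threshold eps,
    i.e. the binary recurrence plot of the (scalar) time series."""
    d = np.abs(xs[:, None] - xs[None, :])
    return (d < eps).astype(int)
```

    RDE-CN would then thin this matrix to a reduced point set and treat the result as an adjacency structure for network statistics.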

  7. Network analysis for the visualization and analysis of qualitative data.

    Science.gov (United States)

    Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D

    2018-03-01

    We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. The Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT): Data Analysis and Visualization for Geoscience Data

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Doutriaux, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Patchett, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Ross [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steed, Chad [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Krishnan, Harinarayan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Silva, Claudio [NYU Polytechnic School of Engineering, New York, NY (United States); Chaudhary, Aashish [Kitware, Inc., Clifton Park, NY (United States); Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Childs, Hank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Mr. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Bauer, Andrew [Kitware, Inc., Clifton Park, NY (United States); Pletzer, Alexander [Tech-X Corp., Boulder, CO (United States); Poco, Jorge [NYU Polytechnic School of Engineering, New York, NY (United States); Ellqvist, Tommy [NYU Polytechnic School of Engineering, New York, NY (United States); Santos, Emanuele [Federal Univ. of Ceara, Fortaleza (Brazil); Potter, Gerald [NASA Johnson Space Center, Houston, TX (United States); Smith, Brian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Maxwell, Thomas [NASA Johnson Space Center, Houston, TX (United States); Kindig, David [Tech-X Corp., Boulder, CO (United States); Koop, David [NYU Polytechnic School of Engineering, New York, NY (United States)

    2013-05-01

    To support interactive visualization and analysis of complex, large-scale climate data sets, UV-CDAT integrates a powerful set of scientific computing libraries and applications to foster more efficient knowledge discovery. Connected through a provenance framework, the UV-CDAT components can be loosely coupled for fast integration or tightly coupled for greater functionality and communication with other components. This framework addresses many challenges in the interactive visual analysis of distributed large-scale data for the climate community.

  9. SoccerStories: a kick-off for visual soccer analysis.

    Science.gov (United States)

    Perin, Charles; Vuillemot, Romain; Fekete, Jean-Daniel

    2013-12-01

    This article presents SoccerStories, a visualization interface to support analysts in exploring soccer data and communicating interesting insights. Currently, most analyses of such data concern statistics on individual players or teams. However, the soccer analysts we collaborated with consider that quantitative analysis alone does not convey the right picture of the game, as context, player positions, and phases of player actions are the most relevant aspects. We designed SoccerStories to support and enrich the current practice of soccer analysts, in both the analysis and communication stages. Our system provides an overview+detail interface of game phases and their aggregation into a series of connected visualizations, each tailored for actions such as a series of passes or a goal attempt. To evaluate our tool, we ran two qualitative user studies on recent games using SoccerStories with data from one of the world's leading live sports data providers. The first study resulted in a series of four articles on soccer tactics by a tactics analyst, who said he would not have been able to write them otherwise. The second study was an exploratory follow-up to investigate design alternatives for embedding soccer phases into word-sized graphics. In both experiments, we received very enthusiastic feedback, and participants considered further use of SoccerStories to enhance their current workflow.

  10. Complex Visual Data Analysis, Uncertainty, and Representation

    National Research Council Canada - National Science Library

    Schunn, Christian D; Saner, Lelyn D; Kirschenbaum, Susan K; Trafton, J. G; Littleton, Eliza B

    2007-01-01

    ... (weather forecasting, submarine target motion analysis, and fMRI data analysis). Internal spatial representations are coded from spontaneous gestures made during cued-recall summaries of problem solving activities...

  11. Visual function and ocular status of children with hearing impairment in Oman: A case series

    Directory of Open Access Journals (Sweden)

    Khandekar Rajiv

    2009-01-01

    Full Text Available Visual functions of children with hearing disability were evaluated at a school in Muscat, Oman in 2006. Two hundred and twenty-three children were tested for near vision, distant vision, contrast sensitivity, color vision, field of vision, motion perception, and crowding. Profound and severe hearing loss were noted in 161 and 63 students, respectively. Thirty-five (81%) students with refractive error were using spectacles. Color vision and field of vision were each defective in one student. In 286 (64.1%) eyes, contrast sensitivity was defective. Abnormal contrast sensitivity was not associated with the severity of hearing loss [RR = 1.04 (95% CI 0.91 to 1.29)]. Children with hearing impairment should be assessed for visual functions. Refractive error and defects in contrast sensitivity were unusually high among these children. In addition to visual aids, we recommend environmental changes to improve illumination and contrast, to improve the quality of life of such children with double disability.

  12. A REVIEW OF CITY IMAGE ELEMENTS AS FORMING THE SERIAL VISION OF JAYAPURA CITY

    Directory of Open Access Journals (Sweden)

    Alfini Baharudin

    2016-01-01

    Full Text Available A town ought to have clarity in its planning structure so that it conveys a strong picture and image to its public. This requires a town planning structure whose functional and visual aspects are interrelated so as to provide clarity. The structure of a town plan can, among other means, be observed visually by moving from one point in the town to another. In this study, the town planning structure of Jayapura was examined by visually observing its structural elements along the main road connecting the centres of Jayapura, Abepura, and Waena. The results show that the town planning structure of Jayapura has one organizing element: the curvilinear main road connecting the centres of Jayapura, Abepura, and Waena. Some of the structural elements convey a strong image, but others need rearrangement in order to provide optimal clarity. This study recommends a serial-vision arrangement concept for the town planning structure of Jayapura according to its hierarchy.

  13. RESEARCH ON THE INTENSITY ANALYSIS AND RESULT VISUALIZATION OF CONSTRUCTION LAND IN URBAN PLANNING

    Directory of Open Access Journals (Sweden)

    J. Cui

    2017-09-01

    Full Text Available As fundamental work in urban planning, the intensity analysis of construction land involves many repetitive data processing tasks that are prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool was developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for these tasks. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and related maps and graphs, including the intensity values, the zoning map, the skyline analysis map, etc., are produced automatically. Finally, the tool is installation-free and can be dispatched quickly between planning teams.

  14. Time series analysis for psychological research: examining and forecasting change.

    Science.gov (United States)

    Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.
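
    As a minimal illustration of the forecasting topic surveyed here (the paper's own tutorial is in R; this Python sketch is independent of it), an AR(1) model can be fit by least squares and iterated forward to produce forecasts:

```python
import numpy as np

def fit_ar1(y):
    """Least-squares fit of an AR(1) model: y[t] = c + phi * y[t-1] + e[t]."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    c, phi = np.linalg.lstsq(X, y[1:], rcond=None)[0]
    return c, phi

def forecast(c, phi, last, h):
    """Iterate the fitted recursion h steps past the last observation."""
    out = []
    for _ in range(h):
        last = c + phi * last
        out.append(last)
    return out
```

    Real psychological series would also call for the diagnostics the paper discusses (stationarity checks, seasonal terms, intervention effects) before trusting such forecasts.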

  15. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  16. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  17. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  18. Visual exploration and analysis of human-robot interaction rules

    Science.gov (United States)

    Zhang, Hui; Boyles, Michael J.

    2013-01-01

    We present a novel interaction paradigm for the visual exploration, manipulation and analysis of human-robot interaction (HRI) rules; our development is implemented using a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate the interaction design and knowledge discovery process. HRI is often concerned with manipulations of multi-modal signals, events, and commands that form various kinds of interaction rules. Depicting, manipulating and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle. This, in turn, makes design-level verification nearly impossible to perform in an earlier phase. In our work, we exploit a drag-and-drop user interface and visual languages to support depicting responsive behaviors from social participants when they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to provide still further improvement to our programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow the researchers to generate, trace and study the perception-action dynamics with a social interaction simulation to verify and refine their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena such as joint attention and sequential behavioral patterns from multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. As far as we are aware, this paper reports the first program manipulation paradigm that integrates visual programming

  19. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  20. Teaching Data Analysis with Interactive Visual Narratives

    Science.gov (United States)

    Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha

    2016-01-01

    Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…

  1. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

The Open Visualization Tool (OVITO) is a new 3D visualization software package designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, is controllable via Python scripts, and is easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/

  2. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30,638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with an autocorrelation-based representation scheme. The analysis showed that the time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The results of the analysis can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
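
    The record's pipeline, an autocorrelation-based representation followed by fuzzy c-means clustering, can be sketched in a few lines. The data below are synthetic stand-ins rather than the ComCare records, and the parameter choices (10 lags, 2 clusters) are illustrative assumptions only:

```python
import numpy as np

def acf_features(x, nlags=10):
    """Autocorrelation-based representation: the first nlags autocorrelations."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, nlags + 1)])

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns the membership matrix U (n x c) and centers."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # random fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))              # standard membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

# Synthetic stand-ins for behavioral series: 10 periodic users, 10 erratic ones
rng = np.random.default_rng(1)
series = [np.sin(np.linspace(0, 8, 200)) + 0.1 * rng.standard_normal(200)
          for _ in range(10)]
series += [rng.standard_normal(200) for _ in range(10)]
X = np.array([acf_features(s) for s in series])
U, _ = fuzzy_cmeans(X, c=2)
labels = U.argmax(axis=1)   # hard labels from the fuzzy memberships
```

    The autocorrelation features make the clustering sensitive to the temporal shape of each user's behavior rather than its raw values, which is the point of the representation scheme named in the abstract.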

  3. Time series analysis of gold production in Malaysia

    Science.gov (United States)

    Muda, Nora; Hoon, Lee Yuen

    2012-05-01

Gold is a soft, malleable, bright yellow metallic element that is unaffected by air and most reagents. It is highly valued as an asset or investment commodity and is extensively used in jewellery, industrial applications, dentistry and medicine. In Malaysia, gold mining is limited to several areas such as Pahang, Kelantan, Terengganu, Johor and Sarawak. The main purpose of this case study is to obtain a suitable model for the production of gold in Malaysia. The model can also be used to predict Malaysia's future gold production. The Box-Jenkins time series method was used to perform the analysis, with the following steps: identification, estimation, diagnostic checking and forecasting. In addition, the accuracy of prediction was tested using the mean absolute percentage error (MAPE). From the analysis, the ARIMA (3,1,1) model was found to be the best-fitting model, with a MAPE of 3.704%, indicating that the predictions are very accurate. Hence, this model can be used for forecasting. This study is expected to help the private and public sectors understand the gold production scenario and plan gold mining activities in Malaysia.
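
    As a rough illustration of the Box-Jenkins workflow and the MAPE criterion, the sketch below differences a synthetic series once (the "I" in ARIMA(3,1,1)), fits an AR(3) on the differences by least squares, and scores held-out forecasts with MAPE. The MA(1) term and the identification/diagnostic steps are omitted for brevity, and the data are hypothetical, not Malaysia's gold production figures:

```python
import numpy as np

def fit_ar(x, p=3):
    """Least-squares AR(p) fit: x_t ~ c + a1*x_(t-1) + ... + ap*x_(t-p)."""
    X = np.column_stack([x[p - k - 1 : len(x) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return coef

def forecast_diff_ar(y, steps, p=3):
    """Difference once, fit AR(p) on the differences, then integrate the
    forecast increments back to the original level."""
    d = np.diff(y)
    c, *a = fit_ar(d, p)
    hist = list(d)
    level = y[-1]
    out = []
    for _ in range(steps):
        nxt = c + sum(ai * hist[-i - 1] for i, ai in enumerate(a))
        hist.append(nxt)
        level = level + nxt
        out.append(level)
    return np.array(out)

def mape(actual, pred):
    """Mean absolute percentage error, in percent."""
    return 100.0 * np.mean(np.abs((actual - pred) / actual))

# Hypothetical monthly production figures: trend plus noise (not real gold data)
rng = np.random.default_rng(0)
y = 100.0 + 2.0 * np.arange(60) + rng.normal(0.0, 1.0, 60)
pred = forecast_diff_ar(y[:48], steps=12)
err = mape(y[48:], pred)
```

    A MAPE of a few percent, as reported in the record, indicates the forecasts track the held-out values closely on a relative scale.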

  4. An Ideal Observer Analysis of Visual Working Memory

    Science.gov (United States)

    Sims, Chris R.; Jacobs, Robert A.; Knill, David C.

    2012-01-01

    Limits in visual working memory (VWM) strongly constrain human performance across many tasks. However, the nature of these limits is not well understood. In this article we develop an ideal observer analysis of human VWM by deriving the expected behavior of an optimally performing but limited-capacity memory system. This analysis is framed around…

  5. Remote Sensing Data Visualization, Fusion and Analysis via Giovanni

    Science.gov (United States)

    Leptoukh, G.; Zubko, V.; Gopalan, A.; Khayat, M.

    2007-01-01

We describe Giovanni, the NASA Goddard-developed online visualization and analysis tool that allows users to explore various phenomena without learning remote sensing data formats or downloading voluminous data. Using MODIS aerosol data as an example, we formulate a data fusion approach for Giovanni to further enrich online multi-sensor remote sensing data comparison and analysis.

  6. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

Full Text Available The aim of this analysis is to forecast a mini-market's sales volume for the twelve-month period from August 2015 to August 2016. The study is based on the monthly sales, in Iraqi Dinar, of a private local mini-market from April 2014 to July 2015. As the graph reveals, if the stagnant economic conditions continue, the trend of future sales is downward. Based on the time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, low profit margins and operating expenses, the revenues may not be adequate to produce positive net income, and the business may not be able to operate afterward. The principal issue arising from this is that forecasting sales in the region will be difficult, as the business cycle is highly dynamic and volatile due to systematic risks and an unforeseeable future.

  7. Pipe-anchor discontinuity analysis utilizing power series solutions, Bessel functions, and Fourier series

    International Nuclear Information System (INIS)

    Williams, Dennis K.; Ranson, William F.

    2003-01-01

One of the paradigmatic classes of problems that frequently arises in the piping stress analysis discipline is the effect of local stresses created by support and restraint attachments. Over the past 20 years, concerns have been raised by both regulatory agencies in the nuclear power industry and others in the process and chemical industries regarding the effect of various stiff clamping arrangements on the expected life of the pipe and its various piping components. In many of the commonly utilized geometries and arrangements of pipe clamps, the elasticity problem becomes the determination of axisymmetric stress and deformation in a hollow cylinder (pipe) subjected to the appropriate boundary conditions and respective loads. One of the geometries that serves as a pipe anchor consists of two pipe clamps that are bolted tightly to the pipe and affixed to a modified shoe-type arrangement. The shoe is employed for the purpose of providing an immovable base that can be easily attached, either by bolting or welding, to a structural steel pipe rack. Over the past 50 years, the computational tools available to the piping analyst have changed dramatically and have thereby caused the implementation of solutions to the basic problems of elasticity to change likewise. The need to obtain closed form elasticity solutions, however, has always been a driving force in engineering. The symbolic calculus currently available through numerous software packages makes closed form solutions very economical. This paper briefly traces the solutions over the past 50 years to a variety of axisymmetric stress problems involving hollow circular cylinders, employing a Fourier series representation. In the present example, a properly chosen Fourier series represents the mathematical simulation of the imposed axial displacements on the outside diametrical surface. 
A general solution technique is introduced for the axisymmetric discontinuity stresses resulting from an

  8. A Componential Analysis of Visual Attention in Children With ADHD.

    Science.gov (United States)

    McAvinue, Laura P; Vangkilde, Signe; Johnson, Katherine A; Habekost, Thomas; Kyllingsbæk, Søren; Bundesen, Claus; Robertson, Ian H

    2015-10-01

    Inattentive behaviour is a defining characteristic of ADHD. Researchers have wondered about the nature of the attentional deficit underlying these symptoms. The primary purpose of the current study was to examine this attentional deficit using a novel paradigm based upon the Theory of Visual Attention (TVA). The TVA paradigm enabled a componential analysis of visual attention through the use of a mathematical model to estimate parameters relating to attentional selectivity and capacity. Children's ability to sustain attention was also assessed using the Sustained Attention to Response Task. The sample included a comparison between 25 children with ADHD and 25 control children aged 9-13. Children with ADHD had significantly impaired sustained attention and visual processing speed but intact attentional selectivity, perceptual threshold and visual short-term memory capacity. The results of this study lend support to the notion of differential impairment of attentional functions in children with ADHD. © 2012 SAGE Publications.

  9. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

As terrorism events occur more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse events' spatio-temporal distribution characteristics, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.

  10. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e., very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
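
    The kernel idea compared above can be sketched minimally: each observation pair (i, j) contributes to the correlation at lag tau with a Gaussian weight in (t_j - t_i - tau), so no interpolation onto a regular grid is needed. This is a simplified reading of the kernel estimator, with an assumed bandwidth equal to the mean sampling interval:

```python
import numpy as np

def gaussian_kernel_acf(t, x, lags, h=None):
    """Kernel-based correlation estimate for an irregularly sampled series (t, x)."""
    x = (x - x.mean()) / x.std()
    if h is None:
        h = np.mean(np.diff(np.sort(t)))   # assumed bandwidth: mean sampling interval
    dt = t[None, :] - t[:, None]           # all pairwise time differences
    prod = x[None, :] * x[:, None]         # pairwise products of standardized values
    out = []
    for tau in lags:
        w = np.exp(-0.5 * ((dt - tau) / h) ** 2)   # Gaussian weight around lag tau
        out.append(float((w * prod).sum() / w.sum()))
    return np.array(out)

# Irregularly sampled period-10 signal: ACF near +1 at lag 0, near -1 at lag 5
rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 20.0, 150))
x = np.sin(2 * np.pi * t / 10.0)
acf = gaussian_kernel_acf(t, x, lags=[0.0, 5.0])
```

    Narrower bandwidths reduce bias at high frequencies but use fewer pairs per lag, which is the trade-off behind the benchmark results quoted above.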

  11. Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool

    Science.gov (United States)

    McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall

    2008-01-01

The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata are used to remove and/or correct bad and suspect data. Bad pixel removal, multiple-satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions become widespread and devastating. 
The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify

  12. Toward a Shared Vocabulary for Visual Analysis: An Analytic Toolkit for Deconstructing the Visual Design of Graphic Novels

    Science.gov (United States)

    Connors, Sean P.

    2012-01-01

    Literacy educators might advocate using graphic novels to develop students' visual literacy skills, but teachers who lack a vocabulary for engaging in close analysis of visual texts may be reluctant to teach them. Recognizing this, teacher educators should equip preservice teachers with a vocabulary for analyzing visual texts. This article…

  13. Visual Analysis in Traffic & Re-identification

    DEFF Research Database (Denmark)

    Møgelmose, Andreas

    and analysis, and person re-identification. In traffic sign detection, the work comprises a thorough survey of the state of the art, assembly of the worlds largest public dataset with U.S. traffic signs, and work in machine learning based detection algorithms. It was shown that detection of U.S. traffic signs...

  14. Inorganic chemical analysis of environmental materials—A lecture series

    Science.gov (United States)

    Crock, J.G.; Lamothe, P.J.

    2011-01-01

At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answer the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.

  15. A New Conceptual Model for Business Ecosystem Visualization and Analysis

    Directory of Open Access Journals (Sweden)

    Luiz Felipe Hupsel Vaz

    2013-01-01

Full Text Available This study has the objective of mapping the effects of network externalities and superstar software through the visualization and analysis of industry ecosystems. The output is made possible by gathering sales data from a tracking website, associating each sale with a single consumer, and using network visualization software. The result is a graph that shows the strategic positioning of publishers and platforms, serving as a strategic tool for both academics and professionals. The approach is scalable to other industries and can be used to support analysis of mergers, acquisitions and alliances.

  16. Data-driven security analysis, visualization and dashboards

    CERN Document Server

    Jacobs, Jay

    2014-01-01

Uncover hidden patterns of data and respond with countermeasures. Security professionals need all the tools at their disposal to increase their visibility in order to prevent security breaches and attacks. This careful guide explores two of the most powerful: data analysis and visualization. You'll soon understand how to harness and wield data, from collection and storage to management and analysis, as well as visualization and presentation. Using a hands-on approach with real-world examples, this book shows you how to gather feedback, measure the effectiveness of your security methods, and ma

  17. Low-rank and sparse modeling for visual analysis

    CERN Document Server

    Fu, Yun

    2014-01-01

    This book provides a view of low-rank and sparse computing, especially approximation, recovery, representation, scaling, coding, embedding and learning among unconstrained visual data. The book includes chapters covering multiple emerging topics in this new field. It links multiple popular research fields in Human-Centered Computing, Social Media, Image Classification, Pattern Recognition, Computer Vision, Big Data, and Human-Computer Interaction. Contains an overview of the low-rank and sparse modeling techniques for visual analysis by examining both theoretical analysis and real-world applic

  18. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2013-02-20

This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks, that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
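
    The dynamic-programming structure of the algorithm can be sketched for the event-data fitness function. This is a bare-bones reading of the published recurrence, using the paper's empirical prior calibration, and not the authors' full implementation:

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Minimal Bayesian Blocks for event data: returns the start edges of the
    optimal blocks. p0 tunes the prior penalty on the number of blocks."""
    t = np.sort(np.asarray(t, float))
    n = len(t)
    # cell edges: midpoints between events, padded by the data range
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    block_length = t[-1] - edges
    ncp_prior = 4 - np.log(73.53 * p0 * n ** -0.478)   # empirical prior calibration
    best = np.zeros(n)
    last = np.zeros(n, int)
    for k in range(n):
        widths = block_length[: k + 1] - block_length[k + 1]  # candidate block widths
        counts = np.arange(k + 1, 0, -1)                      # events per candidate block
        fit = counts * (np.log(counts) - np.log(widths)) - ncp_prior
        fit[1:] += best[:k]                                   # add best partition so far
        last[k] = np.argmax(fit)
        best[k] = fit[last[k]]
    # backtrack the optimal change points
    cps, k = [], n
    while k > 0:
        cps.append(last[k - 1])
        k = last[k - 1]
    return edges[np.array(cps[::-1])]

# Two-rate event stream: a sharp rate change at t = 10 should be recovered
rng = np.random.default_rng(3)
t_events = np.concatenate([rng.uniform(0, 10, 40), rng.uniform(10, 12, 200)])
edges_found = bayesian_blocks(t_events)
```

    Because `best` and `last` are filled left to right, the same pass can run in the real-time trigger mode mentioned in the abstract: a change point is reported as soon as the backtrack for the newest datum moves.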

  19. Time series analysis of brain regional volume by MR image

    International Nuclear Information System (INIS)

    Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji

    2010-01-01

The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because such volumetric reports along the course of an individual's aging have scarcely been presented. The subjects analyzed were brain images of 2 healthy males and 18 females of average age 69.0 y, for which T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a time series of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As the template for the regions, the standard gray matter atlas (icbn452 atlas probability gray) and its labeled version (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for individual standardization. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Volumes of regions were calculated as their voxel ratio to the whole brain voxel count, in percent. It was found that the regional volumes decreased with aging in all of the above lobes and the cerebellum, at average percentages per year of -0.11, -0.07, -0.04, -0.02, and -0.03, respectively. The procedure for calculating the regional volumes, which has hitherto been operated manually, can be conducted automatically for an individual brain using the standard atlases above. (T.T.)

  20. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it, an improved and generalized version of Bayesian Blocks [Scargle 1998], that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.

  1. STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS

    International Nuclear Information System (INIS)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  2. Computerized diagnostic data analysis and 3-D visualization

    International Nuclear Information System (INIS)

    Schuhmann, D.; Haubner, M.; Krapichler, C.; Englmeier, K.H.; Seemann, M.; Schoepf, U.J.; Gebicke, K.; Reiser, M.

    1998-01-01

Purpose: To survey methods for 3D data visualization and image analysis which can be used for computer-based diagnostics. Material and methods: The available methods are explained briefly and links to the literature are presented. Methods which allow basic manipulation of 3D data are windowing, rotation and clipping. More complex methods for visualization of 3D data are multiplanar reformation, volume projections (MIP, semi-transparent projections) and surface projections. Methods for image analysis comprise local data transformation (e.g. filtering) and the definition and application of complex models (e.g. deformable models). Results: Volume projections produce an impression of the 3D data set without reducing the amount of data. This supports the interpretation of the 3D data set and saves time in comparison to any investigation which requires examination of all slice images. More advanced techniques for visualization, e.g. surface projections and hybrid rendering, visualize anatomical information in great detail, but both techniques require segmentation of the structures of interest. Image analysis methods can be used to extract these structures (e.g. an organ) from the image data. Discussion: At the present time volume projections are robust and fast enough to be used routinely. Surface projections can be used to visualize complex and presegmented anatomical features. (orig.) [de
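
    Of the volume projection methods listed, maximum intensity projection (MIP) is the simplest: keep the brightest voxel along the viewing axis. A toy numpy sketch with a hypothetical volume:

```python
import numpy as np

def mip(volume, axis=0):
    """Maximum intensity projection: brightest voxel along the viewing axis."""
    return volume.max(axis=axis)

# Toy volume: one bright "vessel" running along axis 0 in a dark background
vol = np.zeros((32, 32, 32))
vol[:, 16, 16] = 1.0
proj = mip(vol, axis=0)   # 2D image with a single bright pixel at (16, 16)
```

    This illustrates why MIP needs no segmentation: the projection is a pure reduction of the volume, which is what makes it fast and robust enough for routine use.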

  3. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    International Nuclear Information System (INIS)

    Munoz-Diosdado, A

    2005-01-01

We analyzed databases with gait time series from healthy adults and persons with Parkinson's disease, Huntington's disease and amyotrophic lateral sclerosis (ALS). We obtained the staircase graphs of accumulated events, which can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we believe this is because some gait time series have monofractal behavior while others have multifractal behavior, so they cannot be characterized with a single Hurst exponent. We calculated the multifractal spectra, obtained the spectra widths, and found that the spectra of healthy young persons are almost monofractal. The spectra of ill persons are wider than the spectra of healthy persons. In contrast to interbeat time series, where pathology implies loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked in a roughly circular path with sensors on both feet, so we have one time series for the left foot and another for the right foot. First, we analyzed these time series separately, and then we compared the results, both directly and with a cross correlation analysis. We sought differences between the two time series that could be used as indicators of equilibrium problems
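
    The global Hurst exponent the record refers to can be estimated, for example, by rescaled-range (R/S) analysis. The sketch below applies it to synthetic white noise (expected H near 0.5); it is illustrative only, and the chosen window sizes are an assumption:

```python
import numpy as np

def hurst_rs(x):
    """Rescaled-range (R/S) estimate of the global Hurst exponent:
    slope of log(R/S) against log(window size)."""
    x = np.asarray(x, float)
    n = len(x)
    sizes = 2 ** np.arange(3, int(np.log2(n)))      # window sizes 8, 16, ..., n/2
    rs = []
    for s in sizes:
        chunks = x[: n // s * s].reshape(-1, s)
        y = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        r = y.max(axis=1) - y.min(axis=1)           # range of cumulative deviations
        rs.append(np.mean(r / chunks.std(axis=1)))  # rescale by each window's std
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
h_noise = hurst_rs(rng.standard_normal(4096))       # white noise: H near 0.5
```

    A multifractal series, as described in the abstract, would yield scale-dependent scaling that no single slope of this fit can summarize, which motivates the full multifractal spectrum.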

  4. A non linear analysis of human gait time series based on multifractal analysis and cross correlations

    Energy Technology Data Exchange (ETDEWEB)

    Munoz-Diosdado, A [Department of Mathematics, Unidad Profesional Interdisciplinaria de Biotecnologia, Instituto Politecnico Nacional, Av. Acueducto s/n, 07340, Mexico City (Mexico)

    2005-01-01

    We analyzed databases with gait time series of healthy adults and of persons with Parkinson, Huntington and amyotrophic lateral sclerosis (ALS) diseases. We obtained staircase graphs of accumulated events that can be bounded by a straight line whose slope can be used to distinguish between gait time series from healthy and ill persons. The global Hurst exponents of these series do not show clear tendencies; we contend that this is because some gait time series have monofractal behavior while others have multifractal behavior, so they cannot be characterized by a single Hurst exponent. We calculated the multifractal spectra and obtained the spectral widths, finding that the spectra of healthy young persons are almost monofractal, whereas the spectra of ill persons are wider than those of healthy persons. In contrast to interbeat time series, where pathology implies a loss of multifractality, in gait time series multifractal behavior emerges with pathology. Data were collected from healthy and ill subjects as they walked along a roughly circular path with sensors on both feet, yielding one time series for the left foot and another for the right foot. We first analyzed these time series separately and then compared the results, both directly and with a cross-correlation analysis, seeking differences between the two series that could serve as indicators of equilibrium problems.
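The global Hurst exponent discussed in this record can be estimated in several ways; a common textbook choice (not necessarily the author's method) is rescaled-range (R/S) analysis, sketched here on synthetic data:

```python
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64)):
    """Estimate a global Hurst exponent via rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):   # non-overlapping windows
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())           # cumulative deviation
            r = dev.max() - dev.min()               # range of the deviation
            s = w.std()                             # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    # H is the slope of log(R/S) versus log(window size)
    return np.polyfit(log_n, log_rs, 1)[0]

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(1024))
assert 0.3 < h < 0.8   # uncorrelated noise should give H near 0.5
```

As the record notes, a single slope like this is insufficient for multifractal series; there the full multifractal spectrum must be computed instead.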

  5. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    modulated dc–dc series resonant converter (SRC) operating in discontinuous conduction mode (DCM). The conventional fundamental harmonic approximation technique is extended for a non-ideal series resonant tank to clarify the limitations of ...

  6. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation is turned into the computation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, and the results show that our method is more efficient at discovering important nodes than the common aggregation method.
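The supra-evolution matrix construction is specific to this paper, but the eigenvector-centrality computation it builds on can be illustrated with plain power iteration (a generic sketch, not the paper's reduced-dimension iteration scheme):

```python
import numpy as np

def eigenvector_centrality(adj, iters=200, tol=1e-12):
    """Eigenvector centrality of a symmetric adjacency matrix by power iteration."""
    v = np.ones(adj.shape[0]) / adj.shape[0]
    for _ in range(iters):
        w = adj @ v
        w = w / np.linalg.norm(w)
        if np.linalg.norm(w - v) < tol:   # converged to the dominant eigenvector
            break
        v = w
    return w

# Small test graph: node 0 touches every other node, so it should rank highest.
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
c = eigenvector_centrality(A)
assert c[0] == max(c)
```

The paper's contribution is precisely to avoid running such an iteration on one huge supra-matrix by exploiting its block structure; the sketch shows only the underlying centrality definition.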

  7. Multiresolution and Explicit Methods for Vector Field Analysis and Visualization

    Science.gov (United States)

    Nielson, Gregory M.

    1997-01-01

    This is a request for a second renewal (third year of funding) of a research project on the topic of multiresolution and explicit methods for vector field analysis and visualization. In this report, we describe the progress made on this research project during the second year and give a statement of the planned research for the third year. There are two aspects to this research project. The first is concerned with the development of techniques for computing tangent curves for use in visualizing flow fields. The second aspect of the research project is concerned with the development of multiresolution methods for curvilinear grids and their use as tools for visualization, analysis and archiving of flow data. We report on our work on the development of numerical methods for tangent curve computation first.

  8. Adult Craniopharyngioma: Case Series, Systematic Review, and Meta-Analysis.

    Science.gov (United States)

    Dandurand, Charlotte; Sepehry, Amir Ali; Asadi Lari, Mohammad Hossein; Akagami, Ryojo; Gooderham, Peter

    2017-12-18

    The optimal therapeutic approach for adult craniopharyngioma remains controversial. Some advocate gross total resection (GTR), while others advocate subtotal resection followed by adjuvant radiotherapy (STR + XRT). To conduct a systematic review and meta-analysis assessing the rate of recurrence over 3 yr of follow-up in adult craniopharyngioma, stratified by extent of resection and presence of adjuvant radiotherapy. MEDLINE (1946-July 1, 2016) and EMBASE (1980-June 30, 2016) were systematically reviewed. From 1975 to 2013, 33 patients were treated with initial surgical resection for adult-onset craniopharyngioma at our center and were reviewed for inclusion in this study. Data from 22 patients were available for inclusion as a case series in the systematic review. Eligible studies (n = 21) were identified from the literature in addition to the case series of our institutional experience. Three groups were available for analysis: GTR, STR + XRT, and STR. The rates of recurrence were 17%, 27%, and 45%, respectively. The risk of recurrence was significantly lower for GTR vs STR (odds ratio [OR]: 0.24, 95% confidence interval [CI]: 0.15-0.38) and for STR + XRT vs STR (OR: 0.20, 95% CI: 0.10-0.41). The difference in risk of recurrence between GTR and STR + XRT did not reach significance (OR: 0.63, 95% CI: 0.33-1.24, P = .18). This is the first and largest systematic review focusing on the rate of recurrence in adult craniopharyngioma. Although the rates of recurrence favor GTR, the difference in risk of recurrence did not reach significance. This study provides guidance to clinicians and directions for future research, with the need to stratify outcomes per treatment modality. Copyright © 2017 by the Congress of Neurological Surgeons

  9. Data analysis and visualization in MCNP trademark

    International Nuclear Information System (INIS)

    Waters, L.S.

    1994-01-01

    There are many situations where the user may wish to go beyond current MCNP capabilities. For example, data produced by the code may need formatting for input into an external graphics package. Limitations on disk space may hinder writing out large PTRAK files. Specialized data analysis routines may be needed to model complex experimental results. One may wish to produce particle histories in a format not currently available in the code. To address these and other similar concerns, a new capability in MCNP is being tested. A number of real, integer, logical and character variables describing the current and past characteristics of a particle are made available online to the user in three subroutines. The type of data passed can be controlled by cards in the INP file. The subroutines otherwise are empty, and the user may code in any desired analysis. A new MCNP executable is produced by compiling these subroutines and linking to a library which contains the object files for the rest of the code.

  10. Interactive Visual Analysis for Organic Photovoltaic Solar Cells

    KAUST Repository

    Abouelhassan, Amal A.

    2017-12-05

    Organic Photovoltaic (OPV) solar cells provide a promising alternative for harnessing solar energy. However, the efficient design of OPV materials that achieve better performance requires support by better-tailored visualization tools than are currently available, which is the goal of this thesis. One promising approach in the OPV field is to control the effective material of the OPV device, which is known as the Bulk-Heterojunction (BHJ) morphology. The BHJ morphology has a complex composition. Current BHJ exploration techniques deal with the morphologies as black boxes with no perception of the photoelectric current in the BHJ morphology. Therefore, these techniques depend on a trial-and-error approach and do not efficiently characterize complex BHJ morphologies. On the other hand, current state-of-the-art methods for assessing the performance of BHJ morphologies are based on the global quantification of morphological features. Accordingly, scientists in OPV research are still lacking a sufficient understanding of the best material design. To remove these limitations, we propose a new approach for knowledge-assisted visual exploration and analysis in the OPV domain. We develop new techniques for enabling efficient OPV charge transport path analysis. We employ, adapt, and develop techniques from scientific visualization, geometric modeling, clustering, and visual interaction to obtain new designs of visualization tools that are specifically tailored for the needs of OPV scientists. At the molecular scale, the user can use semantic rules to define clusters of atoms with certain geometric properties. At the nanoscale, we propose a novel framework for visual characterization and exploration of local structure-performance correlations. We also propose a new approach for correlating structural features to performance bottlenecks. We employ a visual feedback strategy that allows scientists to make intuitive choices about fabrication parameters. We furthermore propose a

  11. Assessing Spontaneous Combustion Instability with Nonlinear Time Series Analysis

    Science.gov (United States)

    Eberhart, C. J.; Casiano, M. J.

    2015-01-01

    Considerable interest lies in the ability to characterize the onset of spontaneous instabilities within liquid propellant rocket engine (LPRE) combustion devices. Linear techniques, such as fast Fourier transforms, various correlation parameters, and critical damping parameters, have been used at great length for over fifty years. Recently, nonlinear time series methods have been applied to deduce information pertaining to instability incipiency hidden in seemingly stochastic combustion noise. A technique commonly used in the biological sciences, known as Multifractal Detrended Fluctuation Analysis, has been extended to the combustion dynamics field and is introduced here as a data analysis approach complementary to linear ones. Building on this, a modified technique is leveraged to extract artifacts of impending combustion instability that present themselves prior to growth to limit-cycle amplitudes. Analysis is demonstrated on data from J-2X gas generator testing during which a distinct spontaneous instability was observed. Comparisons are made to previous work wherein the data were characterized using linear approaches. Verification of the technique is performed by examining idealized signals and comparing two separate, independently developed tools.
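Multifractal Detrended Fluctuation Analysis generalizes ordinary DFA, whose single-exponent form is easy to sketch. The following is the standard monofractal DFA on synthetic data, not the record's modified technique:

```python
import numpy as np

def dfa(x, scales=(8, 16, 32, 64)):
    """Ordinary (monofractal) detrended fluctuation analysis.

    Returns the scaling exponent alpha: ~0.5 for white noise,
    ~1.5 for Brownian motion (integrated white noise).
    """
    y = np.cumsum(np.asarray(x, float) - np.mean(x))   # the "profile"
    log_s, log_f = [], []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)               # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        log_s.append(np.log(s))
        log_f.append(0.5 * np.log(np.mean(f2)))        # log of RMS fluctuation
    # alpha is the slope of log F(s) versus log s
    return np.polyfit(log_s, log_f, 1)[0]

rng = np.random.default_rng(1)
alpha = dfa(rng.standard_normal(2048))
assert 0.3 < alpha < 0.7   # uncorrelated noise: alpha near 0.5
```

The multifractal variant repeats this computation for a range of moment orders q and reports how the exponent varies with q; a q-dependence is what signals multifractality.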

  12. Interrupted time-series analysis: studying trends in neurosurgery.

    Science.gov (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
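The core of ITSA is a segmented regression with terms for the baseline level, the baseline trend, the level change at the intervention, and the trend change after it. A minimal sketch on synthetic data (the model form is the standard segmented-regression one; the data are invented for illustration):

```python
import numpy as np

def itsa_fit(y, t0):
    """Segmented regression for interrupted time-series analysis.

    Model: y = b0 + b1*t + b2*post + b3*(t - t0)*post, where post = 1 for
    t >= t0. Returns (b0, b1, b2, b3): baseline level, baseline trend,
    level change at the interruption, and change in trend after it.
    """
    t = np.arange(len(y), dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic outcome: slope 0.5, then a level jump of +4 and a slope
# change of -0.3 starting at t0 = 50 (no noise, so the fit is exact).
t = np.arange(100, dtype=float)
t0 = 50
y = 2.0 + 0.5 * t + 4.0 * (t >= t0) - 0.3 * (t - t0) * (t >= t0)
b0, b1, b2, b3 = itsa_fit(y, t0)
assert np.allclose([b0, b1, b2, b3], [2.0, 0.5, 4.0, -0.3])
```

In applied work the residuals are rarely independent, so published ITSA studies typically add autocorrelation-robust standard errors on top of this basic fit.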

  13. Immersive Visual Data Analysis For Geoscience Using Commodity VR Hardware

    Science.gov (United States)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

    Immersive visualization using virtual reality (VR) display technology offers tremendous benefits for the visual analysis of complex three-dimensional data like those commonly obtained from geophysical and geological observations and models. Unlike "traditional" visualization, which has to project 3D data onto a 2D screen for display, VR can side-step this projection and display 3D data directly, in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection. As a result, researchers can apply their spatial reasoning skills to virtual data in the same way they can to real objects or environments. The UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES, http://keckcaves.org) has been developing VR methods for data analysis since 2005, but the high cost of VR displays has been preventing large-scale deployment and adoption of KeckCAVES technology. The recent emergence of high-quality commodity VR, spearheaded by the Oculus Rift and HTC Vive, has fundamentally changed the field. With KeckCAVES' foundational VR operating system, Vrui, now running natively on the HTC Vive, all KeckCAVES visualization software, including 3D Visualizer, LiDAR Viewer, Crusta, Nanotech Construction Kit, and ProtoShop, are now available to small labs, single researchers, and even home users. LiDAR Viewer and Crusta have been used for rapid response to geologic events including earthquakes and landslides, to visualize the impacts of sea-level rise, to investigate reconstructed paleo-oceanographic masses, and for exploration of the surface of Mars. The Nanotech Construction Kit is being used to explore the phases of carbon in Earth's deep interior, while ProtoShop can be used to construct and investigate protein structures.

  14. Modeling activity patterns of wildlife using time-series analysis.

    Science.gov (United States)

    Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo

    2017-04-01

    The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda ( Ailuropoda melanoleuca ). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.
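Wavelet analysis of activity data, as used in this record, measures how much power a series carries at a given period. A minimal Morlet-wavelet sketch on synthetic hourly data (a generic illustration, not the authors' pipeline):

```python
import numpy as np

def morlet_power(x, period, w0=6.0):
    """Mean wavelet power of a series at one target period (Morlet wavelet)."""
    x = np.asarray(x, float) - np.mean(x)
    # Torrence & Compo relation between Fourier period and wavelet scale:
    scale = period * (w0 + np.sqrt(2 + w0 ** 2)) / (4 * np.pi)
    t = np.arange(len(x)) - len(x) / 2
    wavelet = (np.pi ** -0.25) / np.sqrt(scale) \
        * np.exp(1j * w0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2)
    coef = np.convolve(x, np.conj(wavelet)[::-1], mode='same')  # correlation
    return np.mean(np.abs(coef) ** 2)

# Hourly "activity counts" with a clean 24-hr cycle: power in the 24-hr
# band should dominate power in an unrelated 7-hr band.
hours = np.arange(512)
activity = np.sin(2 * np.pi * hours / 24)
p24 = morlet_power(activity, 24)
p7 = morlet_power(activity, 7)
assert p24 > 10 * p7
```

Repeating this over a grid of periods and time windows yields the scalogram from which period-specific patterns, like the pandas' seasonal shift in cycle frequency, can be read off.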

  15. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  16. In situ visualization and data analysis for turbidity currents simulation

    Science.gov (United States)

    Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.

    2018-01-01

    Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest to the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a provenance-based solution that extracts and relates strategic simulation data in transit for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the appearance of sediments at runtime and steering the simulation based on solver convergence and visual information on the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

  17. GRIZ: Visualization of finite element analysis results on unstructured grids

    International Nuclear Information System (INIS)

    Dovey, D.; Loomis, M.D.

    1994-01-01

    GRIZ is a general-purpose post-processing application that supports interactive visualization of finite element analysis results on three-dimensional unstructured grids. GRIZ includes direct-to-videodisc animation capabilities and is being used as a production tool for creating engineering animations.

  18. Predictive Factors in OCT Analysis for Visual Outcome in Exudative AMD

    Directory of Open Access Journals (Sweden)

    Maria-Andreea Gamulescu

    2012-01-01

    Background. Reliable predictive factors for therapy outcome may enable treating physicians to counsel their patients more efficiently concerning the probability of improvement or the time point for discontinuation of a given therapy. Methods. This is a retrospective analysis of 87 patients with exudative age-related macular degeneration who received three monthly intravitreal ranibizumab injections. Visual acuity before initiation of intravitreal therapy and 4-6 weeks after the last intravitreal injection was compared and related to the preoperative visualisation of the continuity of the outer retinal layers as assessed by OCT: external limiting membrane (ELM), inner photoreceptor segments (IPS), junction between inner and outer segments (IS/OS), and outer photoreceptor segments (OPS). Results. Visual acuity increased in 40 of 87 (46.0%) patients, remained stable in 25 (28.7%), and decreased in 22 (25.3%) four to six weeks after the triple intravitreal ranibizumab injections. No statistically significant predictive value could be demonstrated for the grade of continuity of the outer retinal layers with respect to visual acuity development. Conclusions. In our series of AMD patients, the grade of continuity of the outer retinal layers was not a significant predictor of visual acuity development after triple ranibizumab injections.

  19. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
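The wrap-around idea behind WATS Plots is to fold a linear time series onto a circle whose circumference is one seasonal cycle. The core coordinate mapping can be sketched as follows (the plotting itself is omitted; the period and data are illustrative, and this is not the RRose/WATS software):

```python
import numpy as np

def wrap_series(values, period):
    """Fold a regularly sampled series onto a circle of the given period.

    Returns (angles_rad, cycle_index) so each observation can be drawn at
    its within-cycle angle, with one ring (or color) per completed cycle.
    """
    t = np.arange(len(values))
    phase = (t % period) / period      # position within the cycle, in [0, 1)
    angles = 2 * np.pi * phase         # angle on the circle
    cycle = t // period                # which revolution the point lies on
    return angles, cycle

# 36 monthly observations folded onto a 12-month circle: 3 full cycles,
# and the first month of every cycle maps to angle 0.
monthly = np.arange(36.0)
angles, cycle = wrap_series(monthly, 12)
assert cycle.max() == 2
assert np.allclose(angles[[0, 12, 24]], 0.0)
```

A rose diagram is then just a histogram of the `angles` array weighted by the observations, drawn in polar coordinates.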

  20. Visualization and Analysis-Oriented Reconstruction of Material Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Childs, Henry R.

    2010-03-05

    Reconstructing boundaries along material interfaces from volume fractions is a difficult problem, especially because the under-resolved nature of the input data allows for many correct interpretations. Worse, algorithms widely accepted as appropriate for simulation are inappropriate for visualization. In this paper, we describe a new algorithm that is specifically intended for reconstructing material interfaces for visualization and analysis requirements. The algorithm performs well with respect to memory footprint and execution time, has desirable properties in various accuracy metrics, and also produces smooth surfaces with few artifacts, even when faced with more than two materials per cell.

  1. Time Series Analysis of the Quasar PKS 1749+096

    Science.gov (United States)

    Lam, Michael T.; Balonek, T. J.

    2011-01-01

    Multiple timescales of variability are observed in quasars at a variety of wavelengths, the nature of which is not fully understood. In 2007 and 2008, the quasar 1749+096 underwent two unprecedented optical outbursts, reaching a brightness never before seen in our twenty years of monitoring. Much lower level activity had been seen prior to these two outbursts. We present an analysis of the timescales of variability over the two regimes using a variety of statistical techniques. An IDL software package developed at Colgate University over the summer of 2010, the Quasar User Interface (QUI), provides effective computation of four time series functions for analyzing underlying trends present in generic, discretely sampled data sets. Using the Autocorrelation Function, Structure Function, and Power Spectrum, we are able to quickly identify possible variability timescales. QUI is also capable of computing the Cross-Correlation Function for comparing variability at different wavelengths. We apply these algorithms to 1749+096 and present our analysis of the timescales for this object. Funding for this project was received from Colgate University, the Justus and Jayne Schlichting Student Research Fund, and the NASA / New York Space Grant.
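The time series functions computed by QUI have simple discrete-time forms; the autocorrelation function and first-order structure function can be sketched as follows (QUI itself is IDL software and is not reproduced here; this is a generic illustration on a synthetic periodic light curve):

```python
import numpy as np

def acf(x, max_lag):
    """Biased sample autocorrelation function for lags 0..max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    var = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / (len(x) * var)
                     for k in range(max_lag + 1)])

def structure_function(x, max_lag):
    """First-order structure function D(k) = <(x[t+k] - x[t])^2>, lags 1..max_lag."""
    x = np.asarray(x, float)
    return np.array([np.mean((x[k:] - x[:len(x) - k]) ** 2)
                     for k in range(1, max_lag + 1)])

# A strictly periodic "light curve" with a 50-sample period.
t = np.arange(400)
flux = np.sin(2 * np.pi * t / 50)
r = acf(flux, 60)
assert np.isclose(r[0], 1.0)
assert r[50] > 0.8 and r[50] > r[40]   # correlation peak one period apart
d = structure_function(flux, 60)
assert d[24] > d[49]                   # d[k-1] is lag k: lag 25 >> lag 50
```

On real quasar photometry the series is unevenly sampled, so production tools bin the lags rather than indexing them directly as above.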

  2. Assessing Visualization: An analysis of Chilean teachers’ guidelines

    DEFF Research Database (Denmark)

    Andrade-Molina, Melissa; Díaz, Leonora

    2018-01-01

    , this importance seems to fade when it comes to assessing students while learning school mathematics and geometry. We conducted an analysis of the official guidelines for the assessment of school mathematics in Chile; the analysis of two of those guides is considered here. The results revealed that these guidelines...... do not help teachers in assessing visualization in schools; rather, their focus is embedded in a tradition of training that leads to a reduction of space....

  3. Prospects of the use of new visualization methods in developing countries (WHO technical report series 723)

    International Nuclear Information System (INIS)

    Zedgenidze, G.A.; Narkevich, K.Ya.

    1987-01-01

    To substantiate the expediency of choosing either visualization method, a comparative analysis of the advantages and disadvantages of roentgen computerized tomography (CT) and ultrasonic investigation (UI) has been performed. The conditions for using CT and UI in developing countries have also been analysed, with emphasis on the organizational, technical and especially clinical-diagnostic aspects of the problem. A manual on clinical indications and the methodologically correct use of CT and UI has been compiled.

  4. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to modeling time series is to use models of autoregressive type. This allows one to take into account meteorological persistence, or temporal behavior, thereby identifying the memory of the analyzed process. This article presents the concept of the size of an equivalent sample, which helps to identify subperiods with a similar structure within a data series. Moreover, we examine the alternative of adjusting the variance of a series while keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. Two examples are presented: the first corresponds to seven simulated series with a first-order autoregressive structure, and the second to seven meteorological series of surface air temperature anomalies in two Colombian regions.
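For a series with lag-1 autocorrelation r1, a common approximation of the equivalent (effective) sample size is n' = n(1 - r1)/(1 + r1): persistence shrinks the number of effectively independent observations. A sketch on a synthetic AR(1) series (the formula is the standard AR(1) approximation; whether the article uses exactly this form is an assumption):

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation."""
    x = np.asarray(x, float) - np.mean(x)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def equivalent_sample_size(x):
    """Effective number of independent observations for an AR(1)-like
    series: n' = n * (1 - r1) / (1 + r1)."""
    r1 = lag1_autocorr(x)
    return len(x) * (1 - r1) / (1 + r1)

# AR(1) series with phi = 0.6: theory gives n' = n * 0.4 / 1.6 = n / 4.
rng = np.random.default_rng(2)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
n_eff = equivalent_sample_size(x)
assert 200 < n_eff < 800   # far fewer than the 2000 raw observations
```

The reduced n' is what should enter variance estimates and significance tests for persistent meteorological series.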

  5. Visual Data Analysis as an Integral Part of Environmental Management

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Joerg; Bethel, E. Wes; Horsman, Jennifer L.; Hubbard, Susan S.; Krishnan, Harinarayan; Romosan,, Alexandru; Keating, Elizabeth H.; Monroe, Laura; Strelitz, Richard; Moore, Phil; Taylor, Glenn; Torkian, Ben; Johnson, Timothy C.; Gorton, Ian

    2012-10-01

    The U.S. Department of Energy's (DOE) Office of Environmental Management (DOE/EM) currently supports an effort to understand and predict the fate of nuclear contaminants and their transport in natural and engineered systems. Geologists, hydrologists, physicists and computer scientists are working together to create models of existing nuclear waste sites, to simulate their behavior and to extrapolate it into the future. We use visualization as an integral part in each step of this process. In the first step, visualization is used to verify model setup and to estimate critical parameters. High-performance computing simulations of contaminant transport produces massive amounts of data, which is then analyzed using visualization software specifically designed for parallel processing of large amounts of structured and unstructured data. Finally, simulation results are validated by comparing simulation results to measured current and historical field data. We describe in this article how visual analysis is used as an integral part of the decision-making process in the planning of ongoing and future treatment options for the contaminated nuclear waste sites. Lessons learned from visually analyzing our large-scale simulation runs will also have an impact on deciding on treatment measures for other contaminated sites.

  6. SDSS Log Viewer: visual exploratory analysis of large-volume SQL log data

    Science.gov (United States)

    Zhang, Jian; Chen, Chaomei; Vogeley, Michael S.; Pan, Danny; Thakar, Ani; Raddick, Jordan

    2012-01-01

    User-generated Structured Query Language (SQL) queries are a rich source of information for database analysts, information scientists, and the end users of databases. In this study a group of scientists in astronomy and computer and information scientists work together to analyze a large volume of SQL log data generated by users of the Sloan Digital Sky Survey (SDSS) data archive in order to better understand users' data seeking behavior. While statistical analysis of such logs is useful at aggregated levels, efficiently exploring specific patterns of queries is often a challenging task due to the typically large volume of the data, multivariate features, and data requirements specified in SQL queries. To enable and facilitate effective and efficient exploration of the SDSS log data, we designed an interactive visualization tool, called the SDSS Log Viewer, which integrates time series visualization, text visualization, and dynamic query techniques. We describe two analysis scenarios of visual exploration of SDSS log data, including understanding unusually high daily query traffic and modeling the types of data seeking behaviors of massive query generators. The two scenarios demonstrate that the SDSS Log Viewer provides a novel and potentially valuable approach to support these targeted tasks.

  7. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret the results in a climate context, and assess uncertainties and the requirements on data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  8. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  9. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    International Nuclear Information System (INIS)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J.

    2016-01-01

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.
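The red-noise null model discussed above can be illustrated with a minimal sketch (not the authors' code): the periodogram of a synthetic AR(1) series is compared against the theoretical AR(1) spectrum scaled to the observed mean power, with a chi-squared 95% confidence level. The AR(1) coefficient is assumed known here rather than fitted, and the series length and seed are arbitrary illustrative choices.

```python
import cmath, math, random

def periodogram(x):
    # Power at the positive Fourier frequencies; O(n^2) DFT, fine for short series
    n = len(x)
    mean = sum(x) / n
    x = [v - mean for v in x]
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(1, n // 2)]

def ar1_spectrum(phi, n):
    # Theoretical AR(1) ("red noise") power spectrum, up to a constant;
    # this is the null model against which candidate peaks are tested
    return [(1 - phi ** 2) /
            (1 - 2 * phi * math.cos(2 * math.pi * k / n) + phi ** 2)
            for k in range(1, n // 2)]

random.seed(42)
n, phi = 512, 0.7
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0, 1))

p = periodogram(x)
null = ar1_spectrum(phi, n)
scale = sum(p) / sum(null)  # scale the null to the observed mean power
# Each ordinate ~ (null/2) * chi-squared with 2 dof; the 95% level is null * 2.996
exceed = sum(pw > 2.996 * scale * nv for pw, nv in zip(p, null)) / len(p)
assert exceed < 0.2  # pure red noise exceeds the 95% level only ~5% of the time
```

Detrending before this comparison would reshape the spectrum near the filter cut-off, which is exactly the spurious-peak mechanism the abstract warns about.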

  10. ON THE FOURIER AND WAVELET ANALYSIS OF CORONAL TIME SERIES

    Energy Technology Data Exchange (ETDEWEB)

    Auchère, F.; Froment, C.; Bocchialini, K.; Buchlin, E.; Solomon, J., E-mail: frederic.auchere@ias.u-psud.fr [Institut d’Astrophysique Spatiale, CNRS, Univ. Paris-Sud, Université Paris-Saclay, Bât. 121, F-91405 Orsay (France)

    2016-07-10

    Using Fourier and wavelet analysis, we critically re-assess the significance of our detection of periodic pulsations in coronal loops. We show that the proper identification of the frequency dependence and statistical properties of the different components of the power spectra provides a strong argument against the common practice of data detrending, which tends to produce spurious detections around the cut-off frequency of the filter. In addition, the white and red noise models built into the widely used wavelet code of Torrence and Compo cannot, in most cases, adequately represent the power spectra of coronal time series, thus also possibly causing false positives. Both effects suggest that several reports of periodic phenomena should be re-examined. The Torrence and Compo code nonetheless effectively computes rigorous confidence levels if provided with pertinent models of mean power spectra, and we describe the appropriate manner in which to call its core routines. We recall the meaning of the default confidence levels output from the code, and we propose new Monte-Carlo-derived levels that take into account the total number of degrees of freedom in the wavelet spectra. These improvements allow us to confirm that the power peaks that we detected have a very low probability of being caused by noise.

  11. Prediction of solar cycle 24 using fourier series analysis

    International Nuclear Information System (INIS)

    Khalid, M.; Sultana, M.; Zaidi, F.

    2014-01-01

    Predicting the behavior of solar activity has become very significant, owing to its influence on Earth and the surrounding environment. Accurate predictions of the amplitude and timing of the next solar cycle will aid in the estimation of the various consequences of space weather. In the past, many prediction procedures have been used and have been successful to various degrees in the field of solar activity forecasting. In this study, solar cycle 24 is forecasted by the Fourier series method, and a comparative analysis is made with the autoregressive integrated moving average (ARIMA) method. Taking January 2008 as the minimum preceding solar cycle 24, the amplitude and shape of solar cycle 24 are estimated from monthly sunspot numbers. This forecast framework approximates a mean solar cycle 24, with the maximum appearing during May 2014 (± 8 months) and a maximum sunspot number of 98 ± 10. Solar cycle 24 will end in June 2020 (± 7 months). The difference between the two consecutive peaks of solar cycles 23 and 24 is 165 months (± 6 months). (author)
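The Fourier-series fitting step can be sketched as follows (a hypothetical illustration on synthetic data, not the authors' code): a truncated Fourier series at an assumed cycle period is fitted by least squares and then extrapolated. With evenly spaced samples over whole periods, the least-squares coefficients reduce to simple projections onto the trigonometric basis.

```python
import math

def fourier_fit(y, period, n_harmonics):
    # Least-squares Fourier coefficients; over whole periods of evenly
    # spaced samples the trigonometric basis is orthogonal, so the
    # normal equations reduce to projections
    n = len(y)
    a0 = sum(y) / n
    coeffs = []
    for k in range(1, n_harmonics + 1):
        ak = 2 / n * sum(y[t] * math.cos(2 * math.pi * k * t / period) for t in range(n))
        bk = 2 / n * sum(y[t] * math.sin(2 * math.pi * k * t / period) for t in range(n))
        coeffs.append((ak, bk))
    return a0, coeffs

def fourier_eval(t, period, a0, coeffs):
    return a0 + sum(a * math.cos(2 * math.pi * (k + 1) * t / period) +
                    b * math.sin(2 * math.pi * (k + 1) * t / period)
                    for k, (a, b) in enumerate(coeffs))

# Synthetic "sunspot-like" series: an 11-year cycle sampled monthly
period = 132  # months
y = [60 + 50 * math.sin(2 * math.pi * t / period) +
     10 * math.cos(4 * math.pi * t / period)
     for t in range(3 * period)]

a0, coeffs = fourier_fit(y, period, 3)
# Extrapolate one month past the training window
pred = fourier_eval(len(y), period, a0, coeffs)
assert abs(pred - 70.0) < 1e-6  # true value of the noiseless signal there
```

Real sunspot series are noisy and the period itself is uncertain, which is why the authors quote ± ranges on both amplitude and timing.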

  12. Online Time Series Analysis of Land Products over Asia Monsoon Region via Giovanni

    Science.gov (United States)

    Shen, Suhung; Leptoukh, Gregory G.; Gerasimov, Irina

    2011-01-01

    Time series analysis is critical to the study of land cover/land use changes and climate. Time series studies at local-to-regional scales require data of higher spatial resolution, such as 1 km or finer. MODIS land products of 250 m to 1 km resolution enable such studies. However, such MODIS land data files are distributed in 10°x10° tiles, due to large data volumes. Conducting a time series study requires downloading all tiles that include the study area for the time period of interest, and mosaicking the tiles spatially. This can be an extremely time-consuming process. In support of the Monsoon Asia Integrated Regional Study (MAIRS) program, NASA GES DISC (Goddard Earth Sciences Data and Information Services Center) has processed MODIS land products at 1 km resolution over the Asia monsoon region (0°-60°N, 60°-150°E) with a common data structure and format. The processed data have been integrated into the Giovanni system (Goddard Interactive Online Visualization ANd aNalysis Infrastructure), which enables users to explore, analyze, and download data over an area and time period of interest easily. Currently, the following regional MODIS land products are available in Giovanni: 8-day 1 km land surface temperature and active fire, monthly 1 km vegetation index, and yearly 0.05° and 500 m land cover types. More data will be added in the near future. By combining atmospheric and oceanic data products in the Giovanni system, it is possible to do further analyses of environmental and climate changes associated with the land, ocean, and atmosphere. This presentation demonstrates exploring land products in the Giovanni system with sample case scenarios.

  13. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother.
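The CUSUM statistic mentioned above can be illustrated with a minimal sketch: a generic one-sided CUSUM applied to synthetic inventory differences, not the report's implementation. The reference value k and decision limit h below are conventional illustrative choices.

```python
import random

def cusum(diffs, k, h):
    # One-sided CUSUM for a sustained positive shift (material loss) in
    # inventory differences; k is the reference value, h the decision limit
    s, alarms = 0.0, []
    for i, d in enumerate(diffs):
        s = max(0.0, s + d - k)
        if s > h:
            alarms.append(i)
            s = 0.0  # restart accumulation after an alarm
    return alarms

random.seed(1)
sigma = 1.0
# 50 in-control inventory differences, then a sustained 1.5-sigma loss
diffs = [random.gauss(0, sigma) for _ in range(50)]
diffs += [random.gauss(1.5 * sigma, sigma) for _ in range(30)]

alarms = cusum(diffs, k=0.5 * sigma, h=5.0 * sigma)
assert any(a >= 50 for a in alarms)  # the loss period triggers an alarm
```

Unlike a Shewhart chart, which looks at one inventory difference at a time, the CUSUM accumulates small persistent deviations, which is what makes it sensitive to protracted losses.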

  14. Visual Analysis for Nowcasting of Multidimensional Lightning Data

    Directory of Open Access Journals (Sweden)

    Stefan Peters

    2013-08-01

    Full Text Available Globally, most weather-related damages are caused by thunderstorms. Besides floods, strong wind, and hail, one of the major thunderstorm ground effects is lightning. Therefore, lightning investigations, including detection, cluster identification, tracking, and nowcasting are essential. To enable reliable decisions, current and predicted lightning cluster- and track features as well as analysis results have to be represented in the most appropriate way. Our paper introduces a framework which includes identification, tracking, nowcasting, and in particular visualization and statistical analysis of dynamic lightning data in three-dimensional space. The paper is specifically focused on enabling users to conduct the visual analysis of lightning data for the purpose of identification and interpretation of spatial-temporal patterns embedded in lightning data, and their dynamics. A graphic user interface (GUI is developed, wherein lightning tracks and predicted lightning clusters, including their prediction certainty, can be investigated within a 3D view or within a Space-Time-Cube. In contrast to previous work, our approach provides insight into the dynamics of past and predicted 3D lightning clusters and cluster features over time. We conclude that an interactive visual exploration in combination with a statistical analysis can provide new knowledge within lightning investigations and, thus, support decision-making in weather forecast or lightning damage prevention.

  15. Geoscience data visualization and analysis using GeoMapApp

    Science.gov (United States)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and to quantitatively interrogate data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). 
Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D

  16. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high speed, general purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.
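The report's special-purpose ARMA fitting algorithm is not reproduced here, but the key idea, using correlation functions in place of numerically estimated derivatives, can be illustrated with a simpler, related sketch: for a pure AR model (an assumption made here for brevity), the Yule-Walker equations yield the parameters directly from sample autocorrelations, with no iterative derivative estimation at all.

```python
import random

def autocorr(x, lag):
    # Sample autocorrelation at the given lag
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    ck = sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n
    return ck / c0

def yule_walker_ar2(x):
    # Solve the Yule-Walker equations for an AR(2) model: the parameters
    # follow in closed form from the first two autocorrelations
    r1, r2 = autocorr(x, 1), autocorr(x, 2)
    phi2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    phi1 = r1 * (1 - phi2)
    return phi1, phi2

random.seed(7)
phi1_true, phi2_true = 0.6, -0.3
x = [0.0, 0.0]
for _ in range(20000):
    x.append(phi1_true * x[-1] + phi2_true * x[-2] + random.gauss(0, 1))

phi1, phi2 = yule_walker_ar2(x[100:])  # discard burn-in
assert abs(phi1 - phi1_true) < 0.05 and abs(phi2 - phi2_true) < 0.05
```

Adding a moving-average part makes the problem non-linear in the parameters, which is exactly why the report needed its iterative steepest-descent scheme.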

  17. Sustainability of Italian budgetary policies: a time series analysis (1862-2013)

    Directory of Open Access Journals (Sweden)

    Gordon L. Brady

    2017-12-01

    Full Text Available In this paper, we analyze the sustainability of Italian public finances using a unique database covering the period 1862-2013. This paper focuses on empirical tests for the sustainability and solvency of fiscal policies. A necessary but not sufficient condition implies that the growth rate of public debt should in the limit be smaller than the asymptotic rate of interest. In addition, the debt-to-GDP ratio must eventually stabilize at a steady-state level. The results of unit root and stationarity tests show that the variables are non-stationary in levels, but stationary in first differences, i.e., I(1). However, some breaks in the series emerge, given internal and external crises (wars, oil shocks, regime changes, institutional reforms). Therefore, the empirical analysis is conducted for the entire period, as well as for two sub-periods (1862-1913 and 1947-2013). Moreover, anecdotal evidence and visual inspection of the series confirm our results. Furthermore, we conduct cointegration tests, which show that a long-run relationship between public expenditure and revenues is found only for the first sub-period (1862-1913). In essence, the paper's results reveal that Italy has sustainability problems in the Republican age.

  18. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
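The refinement can be sketched compactly (a simplified illustration, not the authors' code; the template length m = 2 and tolerance r = 0.15 of the series' standard deviation are conventional choices): RCMSE pools the template-match counts over all coarse-graining offsets before taking the logarithm, whereas CMSE averages per-offset entropies and therefore fails whenever any single offset yields zero matches.

```python
import math, random

def coarse_grain(x, scale, offset=0):
    # Average non-overlapping windows of length `scale`, starting at `offset`
    return [sum(x[i:i + scale]) / scale
            for i in range(offset, len(x) - scale + 1, scale)]

def match_counts(x, m, r):
    # Count template matches of length m and m+1 (Chebyshev distance <= r)
    n = len(x)
    nm = nm1 = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                nm += 1
                if abs(x[i + m] - x[j + m]) <= r:
                    nm1 += 1
    return nm, nm1

def rcmse(x, scale, m=2, r=0.15):
    # Refined composite MSE: pool match counts over all coarse-graining
    # offsets *before* taking the log, avoiding the undefined entropies
    # CMSE produces when a single offset yields zero matches
    total_m = total_m1 = 0
    for off in range(scale):
        nm, nm1 = match_counts(coarse_grain(x, scale, off), m, r)
        total_m += nm
        total_m1 += nm1
    return -math.log(total_m1 / total_m)

random.seed(3)
white = [random.gauss(0, 1) for _ in range(600)]  # unit SD, so r = 0.15*SD
e1 = rcmse(white, 1)
e5 = rcmse(white, 5)
assert e1 > e5  # for white noise, entropy falls with increasing scale
```

The declining entropy of white noise across scales is the classic MSE signature distinguishing it from genuinely complex signals, whose entropy stays high at coarse scales.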

  19. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    Science.gov (United States)

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included into this study. They were independently classified by two experienced investigators. The results of such a classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF
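The trend-analysis baseline compared in the study can be illustrated with a minimal, hypothetical sketch (synthetic sensitivities, not patient data): each visual field test location's sensitivity is regressed on exam number, and a significantly negative slope flags progression. The noise level, decline rate, and t threshold below are illustrative assumptions.

```python
import random

def slope_t(y):
    # OLS slope of sensitivity against exam index, plus its t statistic
    n = len(y)
    xs = range(n)
    mx, my = sum(xs) / n, sum(y) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (v - my) for x, v in zip(xs, y)) / sxx
    a = my - b * mx
    sse = sum((v - (a + b * x)) ** 2 for x, v in zip(xs, y))
    se = (sse / (n - 2) / sxx) ** 0.5
    return b, (b / se if se > 0 else float("-inf"))

random.seed(5)
# Hypothetical data: 9 exams at one test location, sensitivities in dB,
# declining 0.8 dB/exam (progressing) vs. flat (stable); noise SD 0.5 dB
progressing = [30 - 0.8 * i + random.gauss(0, 0.5) for i in range(9)]
stable = [30 + random.gauss(0, 0.5) for _ in range(9)]

b1, t1 = slope_t(progressing)
b2, t2 = slope_t(stable)
# One-sided test at roughly p = 0.01 with 7 degrees of freedom: t < -3.0
assert t1 < -3.0  # flagged as progressing
```

The study's point is that such point-wise trends are specific but insensitive at short series lengths, which is where the population-based OFA approach gains its advantage.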

  20. Preprint WebVRGIS Based Traffic Analysis and Visualization System

    OpenAIRE

    Li, Xiaoming; Lv, Zhihan; Wang, Weixi; Zhang, Baoyun; Hu, Jinxing; Yin, Ling; Feng, Shengzhong

    2015-01-01

    This is the preprint version of our paper in Advances in Engineering Software. With characteristics such as large scale, diverse predictability, and timeliness, city traffic data falls within the definition of Big Data. A Virtual Reality GIS based traffic analysis and visualization system is proposed as a promising and inspiring approach to manage and develop traffic big data. In addition to the basic GIS interaction functions, the proposed system also includes some intellige...

  1. Interactive visual analysis of nuclear data with ZVView

    International Nuclear Information System (INIS)

    Zerkin, Viktor

    2002-01-01

    This paper describes the cross section graphics software package ZVView that was developed to enable evaluators to perform efficient interactive visual analysis of experimental and theoretical nuclear data. ZVView is a very powerful and complete package that simplifies the presentation of nuclear cross section data. A CD-ROM version of this computer package is available from the IAEA-NDS on request. (a.n.)

  2. Visual Linguistic Analysis of Political Discussions : Measuring Deliberative Quality

    OpenAIRE

    Gold, Valentin; El-Assady, Mennatallah; Hautli-Janisz, Annette; Bögel, Tina; Rohrdantz, Christian; Butt, Miriam; Holzinger, Katharina; Keim, Daniel

    2017-01-01

    This article reports on a Digital Humanities research project which is concerned with the automated linguistic and visual analysis of political discourses with a particular focus on the concept of deliberative communication. According to the theory of deliberative communication as discussed within political science, political debates should be inclusive and stakeholders participating in these debates are required to justify their positions rationally and respectfully and should eventually def...

  3. Aspects of scientific visualization for HEP analysis at Fermilab

    International Nuclear Information System (INIS)

    Kallenbach, Jeff; Lebrun, Paul

    1996-01-01

    Based on the workshop on scientific visualization held on Aug 7-9, 1995 at Fermilab, and practical experience with IRIS Explorer, we comment on the use of OpenGL-based systems for event displays and related HEP data analysis. We wish to compare the pros and cons of such systems on technical grounds, ease of use, and most of all, application interfaces, as the programmer and the user are often the same person. Costs and educational considerations will also be briefly discussed. (author)

  4. Real-time visualization and analysis of airflow field by use of digital holography

    Science.gov (United States)

    Di, Jianglei; Wu, Bingjing; Chen, Xin; Liu, Junjiang; Wang, Jun; Zhao, Jianlin

    2013-04-01

    The measurement and analysis of airflow fields is very important in fluid dynamics. For airflow, smoke particles can be added to visually observe turbulence phenomena by particle tracking technology, but the limited ability of smoke particles to follow high-speed airflow reduces the measurement accuracy. In recent years, with the advantages of non-contact, nondestructive, fast and full-field measurement, digital holography has been widely applied in many fields, such as deformation and vibration analysis, particle characterization, refractive index measurement, and so on. In this paper, we present a method to measure the airflow field by use of digital holography. A small wind tunnel model made of acrylic glass is built to control the velocity and direction of airflow. Different shapes of samples such as an aircraft wing and a cylinder are placed in the wind tunnel model to produce different forms of flow field. With a Mach-Zehnder interferometer setup, a series of digital holograms carrying the information of airflow field distributions in different states are recorded by a CCD camera and the corresponding holographic images are numerically reconstructed from the holograms by computer. Then we can conveniently obtain the velocity or pressure information of the airflow, deduced from the quantitative phase information of the holographic images, and visually display the airflow field and its evolution in the form of a movie. The theory and experimental results show that digital holography is a robust and feasible approach for real-time visualization and analysis of airflow fields.

  5. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  6. Quantitative analysis of gait in the visually impaired.

    Science.gov (United States)

    Nakamura, T

    1997-05-01

    In this comparative study concerning characteristics of independent walking by visually impaired persons, we used a motion analyser system to perform gait analysis of 15 late blind (age 36-54, mean 44.3 years), 15 congenitally blind (age 39-48, mean 43.8 years) and 15 sighted persons (age 40-50, mean 44.4 years) while walking a 10-m walkway. All subjects were male. Compared to the sighted, late blind and congenitally blind persons had a significantly slower walking speed, shorter stride length and longer time in the stance phase of gait. However, the relationships between gait parameters in the late and congenitally blind groups were maintained, as in the sighted group. In addition, the gait of the late blind showed a tendency to approximate the gait patterns of the congenitally blind as the duration of visual loss progressed. Based on these results we concluded that the gait of visually impaired persons, through its active use of non-visual sensory input, represents an attempt to adapt to various environmental conditions in order to maintain a more stable posture and to effect safe walking.

  7. Fermeuse wind power project Newfoundland : noise and visual analysis studies

    Energy Technology Data Exchange (ETDEWEB)

    Henn, P.; Turgeon, J.; Heraud, P.; Belanger, S.; Dakousian, S.; Lamontagne, C.; Soares, D. [Helimax Energy Inc., Montreal, PQ (Canada); Basil, C.; Boulianne, S.; Salacup, S.; Thompson, C. [Skypower, Toronto, ON (Canada)

    2008-03-15

    This paper discussed the noise and visual analyses used to assess the potential impacts of a wind energy project on the east coast of the Avalon Peninsula near St. John's, Newfoundland. The proposed farm will be located approximately 1 km away from the town of Fermeuse, and will have an installed capacity of 27 MW from 9 turbines. The paper provided details of the consultation process conducted to determine acceptable distance and site locations for the wind turbines from the community. Stakeholders were identified during meetings, events, and discussions with local authorities. Consultations were also held with government agencies and municipal councils. A baseline acoustic environment study was conducted, and details of anticipated environmental impacts during the project's construction, operation, and decommissioning phases were presented. The visual analysis study was divided into the following landscape units: town, shoreline, forest, open land and lacustrine landscapes. The effect of the turbines on the landscapes was assessed from different viewpoints using visual simulation programs. The study showed that the visual effects of the project are not considered significant because of the low number of turbines. It was concluded that the effect of construction on ambient noise levels is of low concern as all permanent dwellings are located at least 1 km away from the turbines. 2 refs., 4 tabs., 4 figs.

  8. Fractal Analysis of Radiologists' Visual Scanning Patterns in Screening Mammography

    Energy Technology Data Exchange (ETDEWEB)

    Alamudun, Folami T [ORNL; Yoon, Hong-Jun [ORNL; Hudson, Kathy [University of Tennessee, Knoxville (UTK); Morin-Ducote, Garnetta [University of Tennessee, Knoxville (UTK); Tourassi, Georgia [ORNL

    2015-01-01

    Several investigators have studied radiologists' visual scanning patterns with respect to features such as total time examining a case, time to initially hit true lesions, number of hits, etc. The purpose of this study was to examine the complexity of radiologists' visual scanning patterns when viewing 4-view mammographic cases, as they typically do in clinical practice. Gaze data were collected from 10 readers (3 breast imaging experts and 7 radiology residents) while reviewing 100 screening mammograms (24 normal, 26 benign, 50 malignant). The radiologists' scanpaths across the 4 mammographic views were mapped to a single 2-D image plane. Then, fractal analysis was applied to the derived scanpaths using the box counting method. For each case, the complexity of each radiologist's scanpath was estimated using fractal dimension. The association between gaze complexity, case pathology, case density, and radiologist experience was evaluated using a 3-factor fixed-effects ANOVA. ANOVA showed that case pathology, breast density, and experience level are all independent predictors of visual scanning pattern complexity. Visual scanning patterns are significantly different for benign and malignant cases than for normal cases, as well as when breast parenchyma density changes.
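The box-counting estimate of fractal dimension can be sketched as follows (a generic illustration on synthetic point sets, not the study's gaze data): occupied boxes are counted at several box sizes, and the dimension is the slope of log(count) versus log(1/size). A space-filling cloud should approach dimension 2 and a straight path dimension 1.

```python
import math, random

def box_count_dimension(points, sizes):
    # Estimate the fractal dimension of a 2-D point set (e.g., a gaze
    # scanpath mapped onto the image plane) by box counting
    counts = []
    for s in sizes:
        occupied = {(int(x / s), int(y / s)) for x, y in points}
        counts.append(len(occupied))
    # Least-squares slope of log(count) against log(1/size)
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
            sum((x - mx) ** 2 for x in xs))

random.seed(9)
cloud = [(random.random(), random.random()) for _ in range(20000)]
line = [(t / 20000.0, t / 20000.0) for t in range(20000)]
sizes = [0.2, 0.1, 0.05, 0.025]
d_cloud = box_count_dimension(cloud, sizes)
d_line = box_count_dimension(line, sizes)
assert d_cloud > 1.7                 # cloud fills the plane: dimension near 2
assert abs(d_line - 1.0) < 0.1       # straight path: dimension near 1
```

A more tortuous, repeatedly revisiting scanpath occupies more boxes at fine scales and therefore yields a higher fractal dimension, which is the complexity measure the study relates to pathology, density, and experience.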

  9. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise features such as text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features, and if the similarity is greater than the predefined threshold value then it is declared phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.

  10. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D.; Hadwiger, Markus

    2014-01-01

    We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height that is used in ocean forecasting. The position of eddies can be derived directly from the sea surface height and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of placement as well as operation of off-shore structures using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances and with it also heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists to study simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  11. Emotion processing in the visual brain: a MEG analysis.

    Science.gov (United States)

    Peyk, Peter; Schupp, Harald T; Elbert, Thomas; Junghöfer, Markus

    2008-06-01

    Recent functional magnetic resonance imaging (fMRI) and event-related brain potential (ERP) studies provide empirical support for the notion that emotional cues guide selective attention. Extending this line of research, whole head magneto-encephalogram (MEG) was measured while participants viewed in separate experimental blocks a continuous stream of either pleasant and neutral or unpleasant and neutral pictures, presented for 330 ms each. Event-related magnetic fields (ERF) were analyzed after intersubject sensor coregistration, complemented by minimum norm estimates (MNE) to explore neural generator sources. Both streams of analysis converge by demonstrating the selective emotion processing in an early (120-170 ms) and a late time interval (220-310 ms). ERF analysis revealed that the polarity of the emotion difference fields was reversed across early and late intervals suggesting distinct patterns of activation in the visual processing stream. Source analysis revealed the amplified processing of emotional pictures in visual processing areas with more pronounced occipito-parieto-temporal activation in the early time interval, and a stronger engagement of more anterior, temporal, regions in the later interval. Confirming previous ERP studies showing facilitated emotion processing, the present data suggest that MEG provides a complementary look at the spread of activation in the visual processing stream.

  12. Ovis: A Framework for Visual Analysis of Ocean Forecast Ensembles.

    Science.gov (United States)

    Höllt, Thomas; Magdy, Ahmed; Zhan, Peng; Chen, Guoning; Gopalakrishnan, Ganesh; Hoteit, Ibrahim; Hansen, Charles D; Hadwiger, Markus

    2014-08-01

We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height used in ocean forecasting. The position of eddies can be derived directly from the sea surface height, and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of the placement as well as the operation of off-shore structures, using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances, and with it heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists studying simulation data of the largely unexplored Red Sea.

  13. Ovis: A framework for visual analysis of ocean forecast ensembles

    KAUST Repository

    Hollt, Thomas

    2014-08-01

We present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations of the sea surface height used in ocean forecasting. The position of eddies can be derived directly from the sea surface height, and our visualization approach enables their interactive exploration and analysis. The behavior of eddies is important in different application settings, of which we present two in this paper. First, we show an application for interactive planning of the placement as well as the operation of off-shore structures, using real-world ensemble simulation data of the Gulf of Mexico. Off-shore structures, such as those used for oil exploration, are vulnerable to hazards caused by eddies, and the oil and gas industry relies on ocean forecasts for efficient operations. We enable analysis of the spatial domain, as well as the temporal evolution, for planning the placement and operation of structures. Eddies are also important for marine life. They transport water over large distances, and with it heat and other physical properties as well as biological organisms. In the second application we present the usefulness of our tool, which could be used for planning the paths of autonomous underwater vehicles, so-called gliders, for marine scientists studying simulation data of the largely unexplored Red Sea. © 1995-2012 IEEE.

  14. Correlation between detrended fluctuation analysis and the Lempel-Ziv complexity in nonlinear time series analysis

    International Nuclear Information System (INIS)

    Tang You-Fu; Liu Shu-Lin; Jiang Rui-Hong; Liu Ying-Hui

    2013-01-01

We study the correlation between detrended fluctuation analysis (DFA) and the Lempel-Ziv complexity (LZC) in nonlinear time series analysis. Typical dynamic systems, including a logistic map and a Duffing model, are investigated. Moreover, the influence of Gaussian random noise on both the DFA and the LZC is analyzed. The results show a high correlation between the DFA and the LZC, which can quantify the non-stationarity and the nonlinearity of the time series, respectively. With the enhancement of the random component, the exponent α and the normalized complexity index C show increasing trends. In addition, C is found to be more sensitive to the fluctuation in the nonlinear time series than α. Finally, the correlation between the DFA and the LZC is applied to the extraction of vibration signals for a reciprocating compressor gas valve, and an effective fault diagnosis result is obtained.
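The two measures paired in this record are both compact to compute. The sketch below assumes nothing beyond the standard DFA and LZ76 definitions; the scale list, the median binarization, and the n/log2(n) normalization are illustrative choices, not taken from the paper:

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32, 64)):
    """DFA(1): slope of log F(n) versus log n for the integrated series."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        segs = y[:m * n].reshape(m, n)
        t = np.arange(n)
        var = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t)) ** 2)
               for s in segs]                  # linear detrend per window
        F.append(np.sqrt(np.mean(var)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

def lz_complexity(x):
    """Normalized Lempel-Ziv (LZ76) complexity of the median-binarized series."""
    med = np.median(x)
    s = ''.join('1' if v > med else '0' for v in x)
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1                             # extend while the word was seen before
        c += 1                                 # one new LZ76 phrase
        i += l
    return c * np.log2(n) / n                  # normalize by n / log2(n)
```

For white noise α is near 0.5 and C is near 1, while for a random walk α rises toward 1.5, matching the qualitative roles the abstract assigns to the two measures.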

  15. Advances in Antithetic Time Series Analysis : Separating Fact from Artifact

    Directory of Open Access Journals (Sweden)

    Dennis Ridley

    2016-01-01

Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract)

  16. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data loss results from mechanical failure, human error, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple-dataset imputation in order to reconstruct the observational dataset. The study used the monthly time series of the GCR stations Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates; the CLMX station was used as a proxy for allocating these scenarios. Three methods for monthly dataset imputation were selected: AMELIA II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm based on Multivariate Imputation by Chained Equations; and MTSDI, an Expectation-Maximization-based method for imputing missing values in multivariate normal time series. The synthetic time series were compared with the observed ROME series using several skill measures, such as RMSE, NRMSE, the agreement index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increasing the number of gaps degrades the quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% missing data for the imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed reconstructing 43 time series.
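Several of the skill measures listed in this record take only a few lines to compute. A minimal sketch; the agreement index is implemented here as Willmott's (1981) index of agreement, which is an assumption, since the abstract does not name the variant:

```python
import numpy as np

def skill_scores(obs, sim):
    """RMSE, range-normalized RMSE, Willmott's agreement index d, Pearson R and R^2."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    rmse = np.sqrt(np.mean(err ** 2))
    nrmse = rmse / (obs.max() - obs.min())            # normalized by observed range
    # Willmott's index of agreement: 1 = perfect match
    dev = np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())
    d = 1.0 - np.sum(err ** 2) / np.sum(dev ** 2)
    r = np.corrcoef(obs, sim)[0, 1]
    return {"RMSE": rmse, "NRMSE": nrmse, "d": d, "R": r, "R2": r ** 2}
```

A perfect imputation yields RMSE = 0, d = 1 and R = 1, which makes the function easy to sanity-check before applying it to real gap-filled series.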

  17. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  18. A visual interface to computer programs for linkage analysis.

    Science.gov (United States)

    Chapman, C J

    1990-06-01

    This paper describes a visual approach to the input of information about human families into computer data bases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.

  19. Software for hydrogeologic time series analysis, interfacing data with physical insight

    NARCIS (Netherlands)

    Asmuth, Jos R. von; Maas, K.; Knotters, M.; Bierkens, M.F.P.; Bakker, M.; Olsthoorn, T.; Cirkel, D.; Leunk, I.; Schaars, F.; Asmuth, Daniel C. von

    2012-01-01

The program Menyanthes combines a variety of functions for managing, editing, visualizing, analyzing and modeling hydrogeologic time series. Menyanthes was initially developed within the scope of the PhD research of the first author, whose primary aim was the integration of data and

  20. Visual accumulation tube for size analysis of sands

    Science.gov (United States)

    Colby, B.C.; Christensen, R.P.

    1956-01-01

The visual-accumulation-tube method was developed primarily for making size analyses of the sand fractions of suspended-sediment and bed-material samples. Because the fundamental property governing the motion of a sediment particle in a fluid is believed to be its fall velocity, the analysis is designed to determine the fall-velocity-frequency distribution of the individual particles of the sample. The analysis is based on a stratified sedimentation system in which the sample is introduced at the top of a transparent settling tube containing distilled water. The procedure involves the direct visual tracing of the height of sediment accumulation in a contracted section at the bottom of the tube. A pen records the height on a moving chart. The method is simple and fast, provides a continuous and permanent record, gives highly reproducible results, and accurately determines the fall-velocity characteristics of the sample. The apparatus, procedure, results, and accuracy of the visual-accumulation-tube method for determining the sedimentation-size distribution of sands are presented in this paper.

  1. Time Series Analysis of Wheat flour Price Shocks in Pakistan: A Case Analysis

    OpenAIRE

    Asad Raza Abdi; Ali Hassan Halepoto; Aisha Bashir Shah; Faiz M. Shaikh

    2013-01-01

The current research investigates wheat flour price shocks in Pakistan as a case analysis. Data were collected from secondary sources, subjected to time series analysis, and analyzed using SPSS version 20. It was revealed that the price of wheat flour has increased over the last four decades, and the trend of price shocks shows that market variation as well as supply and demand shocks play a positive role in wheat price shocks. It was further revealed th...

  2. Content and user-based music visual analysis

    Science.gov (United States)

    Guo, Xiaochun; Tang, Lei

    2015-12-01

In recent years, people's ability to collect music has been greatly enhanced. Many people who prefer listening to music offline store thousands of songs on their local storage or portable devices. However, their ability to manage music information has not improved accordingly, which results in two problems. One is how to find favourite songs in a large music dataset while satisfying different individuals. The other is how to compose a playlist quickly. To solve these problems, the authors propose a content- and user-based music visual analysis approach. We first developed a new recommendation algorithm based on the content of music and user behaviour, which satisfies individual preferences. Then, we make use of visualization and interaction tools to illustrate the relationships between songs and help people compose a suitable playlist. At the end of this paper, a survey is presented to show that our system is usable and effective.

  3. Visualization of fuel rod burnup analysis by Scilab

    International Nuclear Information System (INIS)

    Tsai, Chiung-Wen

    2013-01-01

The goal of this technical note is to provide an alternative, the freeware Scilab, by which means we may construct custom GUIs and distribute them without extra constraints and cost. A post-processor has been constructed in Scilab to visualize the fuel rod burnup analysis data calculated by FRAPCON-3.4. This post-processor incorporates a graphical user interface (GUI), providing users a rapid overview of the characteristics of the numerical results with 2-D and 3-D graphs, as well as animations of the fuel temperature distribution. An assessment case input file provided by the FRAPCON user group was applied to demonstrate the construction of a post-processor with a GUI by the object-oriented GUI tool, as well as the visualization capabilities of Scilab.

  4. Visualization of fuel rod burnup analysis by Scilab

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Chiung-Wen, E-mail: d937121@oz.nthu.edu.tw

    2013-12-15

The goal of this technical note is to provide an alternative, the freeware Scilab, by which means we may construct custom GUIs and distribute them without extra constraints and cost. A post-processor has been constructed in Scilab to visualize the fuel rod burnup analysis data calculated by FRAPCON-3.4. This post-processor incorporates a graphical user interface (GUI), providing users a rapid overview of the characteristics of the numerical results with 2-D and 3-D graphs, as well as animations of the fuel temperature distribution. An assessment case input file provided by the FRAPCON user group was applied to demonstrate the construction of a post-processor with a GUI by the object-oriented GUI tool, as well as the visualization capabilities of Scilab.

  5. A Visual Analytics Approach for Correlation, Classification, and Regression Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; SwanII, J. Edward [Mississippi State University (MSU); Fitzpatrick, Patrick J. [Mississippi State University (MSU); Jankun-Kelly, T.J. [Mississippi State University (MSU)

    2012-02-01

New approaches that combine the strengths of humans and machines are necessary to equip analysts with the proper tools for exploring today's increasingly complex, multivariate data sets. In this paper, a novel visual data mining framework, called the Multidimensional Data eXplorer (MDX), is described that addresses the challenges of today's data by combining automated statistical analytics with a highly interactive parallel-coordinates-based canvas. In addition to several intuitive interaction capabilities, this framework offers a rich set of graphical statistical indicators, interactive regression analysis, visual correlation mining, automated axis arrangements and filtering, and data classification techniques. The current work provides a detailed description of the system as well as a discussion of key design aspects and critical feedback from domain experts.

  6. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both long-term and short-term predictions prove that the model is effective and can achieve the expected accuracy.
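The GM(1,1) grey model mentioned above has a compact closed-form solution: accumulate the series, fit the grey differential equation by least squares, and extrapolate the whitenized exponential. A sketch under the standard GM(1,1) conventions only, without the adaptive improvement and error correction the paper adds on top:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Grey model GM(1,1): fit a first-order grey differential equation
    to a short positive series and extrapolate `steps` values ahead."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)                                # accumulated (1-AGO) series
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # developing coeff., grey input
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a # whitenized response
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
    return x0_hat[len(x0):]                           # forecast tail
```

On a nearly exponential trend, as typically assumed for grey models, the one-step forecast tracks the true continuation closely.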

  7. A visual analysis of gender bias in contemporary anatomy textbooks.

    Science.gov (United States)

    Parker, Rhiannon; Larkin, Theresa; Cockburn, Jon

    2017-05-01

Empirical research has linked gender bias in medical education with negative attitudes and behaviors in healthcare providers. Yet it has been more than 20 years since research has considered the degree to which women and men are equally represented in anatomy textbooks. Furthermore, previous research has not explored beyond quantity of representation to also examine visual gender stereotypes and, in light of theoretical advancements in the area of intersectional research, the relationship between representations of gender and representations of ethnicity, body type, health, and age. This study aimed to determine the existence and representation of gender bias in the major anatomy textbooks used at Australian Medical Schools. A systematic visual content analysis was conducted on 6044 images in which sex/gender could be identified, sourced from 17 major anatomy textbooks published from 2008 to 2013. Further content analysis was performed on the 521 narrative images, which represent an unfolding story, found within the same textbooks. Results indicate that the representation of gender in images from anatomy textbooks remains predominantly male except within sex-specific sections. Further, other forms of bias were found to exist in: the visualization of stereotypical gendered emotions, roles and settings; the lack of ethnic, age, and body type diversity; and in the almost complete adherence to a sex/gender binary. Despite increased attention to gender issues in medicine, the visual representation of gender in medical curricula continues to be biased. The biased construction of gender in anatomy textbooks designed for medical education provides future healthcare providers with inadequate and unrealistic information about patients. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Analysis of historical series of industrial demand of energy; Analisi delle serie storiche dei consumi energetici dell`industria

    Energy Technology Data Exchange (ETDEWEB)

    Moauro, F. [ENEA, Centro Ricerche Casaccia, Rome (Italy). Dip. Energia

    1995-03-01

This paper reports a short-term analysis of the Italian demand for energy sources and tests a statistical model in which the industrial demand for energy sources is a function of prices and production, according to neoclassical microeconomic theory. For this purpose, monthly time series of the industrial consumption of the main energy sources in 6 sectors, industrial production indexes in the same sectors, and indexes of energy prices (coal, natural gas, oil products, electricity) have been used. The statistical methodology draws on the modern analysis of time series, and specifically on transfer function models. These permit rigorous identification and representation of the most important dynamic relations between the dependent variables (production and prices), as relations of an input-output system. The results have shown an important positive correlation between energy consumption and prices. Furthermore, the reliability of the forecasts and their use as monthly energy indicators have been demonstrated.

  9. Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series

    International Nuclear Information System (INIS)

    Zoldi, S.M.

    1998-01-01

Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0, and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society

  10. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard and Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
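A minimal sketch of the idea, under two assumptions not stated in the abstract: the series is symbolized by the sign of its increments, and the entropy density is taken as block entropy per symbol; the structure scale is then the word length minimizing that density:

```python
from collections import Counter
import numpy as np

def structure_scale(x, max_scale=6):
    """Scan block-entropy density h(s) = H_s / s over word lengths s and
    return the scale where it is smallest, plus the full density profile."""
    bits = ''.join('1' if d > 0 else '0' for d in np.diff(x))  # up/down symbols
    dens = {}
    for s in range(1, max_scale + 1):
        words = Counter(bits[i:i + s] for i in range(len(bits) - s + 1))
        p = np.array(list(words.values()), float)
        p /= p.sum()
        dens[s] = float(-np.sum(p * np.log2(p)) / s)           # entropy per symbol
    return min(dens, key=dens.get), dens
```

A strongly patterned series (e.g., a period-2 oscillation) drives the density well below 1 bit per symbol at the scale matching its structure, while an uncorrelated series stays near 1 bit per symbol at every scale.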

  11. The Review of Visual Analysis Methods of Multi-modal Spatio-temporal Big Data

    Directory of Open Access Journals (Sweden)

    ZHU Qing

    2017-10-01

Full Text Available The visual analysis of spatio-temporal big data is not only a state-of-the-art research direction in both big data analysis and data visualization, but also a core module of pan-spatial information systems. This paper reviews existing visual analysis methods at three levels: descriptive visual analysis, explanatory visual analysis, and exploratory visual analysis, focusing on spatio-temporal big data's characteristics of multiple sources, granularities, and modalities and of complex associations. The technical difficulties and development tendencies of multi-modal feature selection, innovative human-computer interaction analysis, and exploratory visual reasoning in the visual analysis of spatio-temporal big data are discussed. Research shows that the study of descriptive visual analysis for data visualization is relatively mature. Explanatory visual analysis has become the focus of big data analysis; it is mainly based on interactive data mining in a visual environment to diagnose the implicit causes of problems. Exploratory visual analysis methods still need a major breakthrough.

  12. Interactive Visualization and Analysis of Geospatial Data Sets - TrikeND-iGlobe

    Science.gov (United States)

    Rosebrock, Uwe; Hogan, Patrick; Chandola, Varun

    2013-04-01

The visualization of scientific datasets is becoming an ever-increasing challenge as advances in computing technologies have enabled scientists to build high-resolution climate models that have produced petabytes of climate data. To interrogate and analyze these large datasets in real time is a task that pushes the boundaries of computing hardware and software. But integration of climate datasets with geospatial data requires a considerable amount of effort and close familiarity with various data formats and projection systems, which has prevented widespread utilization outside of the climate community. TrikeND-iGlobe is a sophisticated software tool that bridges this gap, allows easy integration of climate datasets with geospatial datasets, and provides sophisticated visualization and analysis capabilities. The objective of TrikeND-iGlobe is the continued building of an open source 4D virtual globe application using NASA World Wind technology that integrates analysis of climate model outputs with remote sensing observations as well as demographic and environmental data sets. This will facilitate a better understanding of global and regional phenomena, and the impact analysis of climate extreme events. The critical aim is real-time interactive interrogation. At the data-centric level the primary aim is to enable the user to interact with the data in real time for the purpose of analysis, locally or remotely. TrikeND-iGlobe provides the basis for the incorporation of modular tools that provide extended interactions with the data, including sub-setting, aggregation, re-shaping, time series analysis methods and animation to produce publication-quality imagery. TrikeND-iGlobe may be run locally or can be accessed via a web interface supported by high-performance visualization compute nodes placed close to the data. It supports visualizing heterogeneous data formats: traditional geospatial datasets along with scientific data sets with geographic coordinates (NetCDF, HDF, etc

  13. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

In general edge detection evaluation, edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission, and display processes that impact the quality of the acquired image and thus the resulting edge image. We propose a new information-theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. An edge detection algorithm is here considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift-variant, we perform the estimation for a set of different system environment conditions using simulations. In this paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using information-theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.
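The simulation-based PSD estimation described above can be sketched generically: drive the (possibly nonlinear, shift-variant) operator with white noise and average the output periodograms. The function below is an illustrative Monte-Carlo sketch in one dimension, not the authors' procedure:

```python
import numpy as np

def empirical_psd(operator, n=256, trials=200, seed=0):
    """Estimate an operator's effective 1-D PSD by driving it with
    unit-variance white noise and averaging output periodograms."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(n // 2 + 1)
    for _ in range(trials):
        x = rng.standard_normal(n)
        y = operator(x)                        # operator may be nonlinear
        acc += np.abs(np.fft.rfft(y)) ** 2 / n # periodogram of the response
    return acc / trials
```

For the identity operator the estimate is flat near 1 across frequencies, and a smoothing operator shows the expected low-pass roll-off, which makes the estimator easy to validate before applying it to a nonlinear operator such as Canny.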

  14. Dynamic Factor Analysis of Nonstationary Multivariate Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; And Others

    1992-01-01

    The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)

  15. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties, without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance, in conjunction with classical machine learning techniques, to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
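Koopman spectral properties can indeed be identified directly from snapshot data, as the abstract notes; the standard workhorse for this is dynamic mode decomposition (DMD), sketched here as one possible identification route (the paper's exact procedure may differ):

```python
import numpy as np

def dmd_koopman(X, r=None):
    """Exact DMD: estimate Koopman (DMD) eigenvalues and modes from data.
    Columns of X are successive state snapshots x_0 ... x_m."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:                          # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)        # Koopman eigenvalues (discrete time)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes
```

On data generated by a linear map the recovered eigenvalues match the map's spectrum exactly, which is the sanity check usually run before applying DMD to nonlinear time series.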

  16. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

We have used the Kolmogorov complexity, sample entropy and permutation entropy to quantify the degree of randomness in the river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series of two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower bound KLL and upper bound KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values of the two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of the considered measures in dependence on the length of the time series. In addition, we have divided the period 1926-1990 into three subintervals, (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE and PE values for the various time series in these subintervals. It is found that during the period 1946-1965 there is a decrease in their complexities, with corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions on these two rivers after the Second World War, because of their use for water consumption, and (ii) climate change in recent times.
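Of the measures used in this record, permutation entropy is the most compact to implement. A sketch following the Bandt-Pompe definition (ordinal patterns of a fixed order, Shannon entropy normalized by log(order!)):

```python
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=3):
    """Normalized Bandt-Pompe permutation entropy of a 1-D series."""
    counts = Counter(
        tuple(np.argsort(x[i:i + order]))      # ordinal pattern of each window
        for i in range(len(x) - order + 1)
    )
    p = np.array(list(counts.values()), float)
    p /= p.sum()
    return float(-np.sum(p * np.log(p)) / math.log(math.factorial(order)))
```

A perfectly ordered (monotone) series yields 0, while an uncorrelated random series approaches 1, the two extremes between which river flow series fall.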

  17. Visual Tracking via Feature Tensor Multimanifold Discriminate Analysis

    Directory of Open Access Journals (Sweden)

    Ting-quan Deng

    2014-01-01

Full Text Available In visual tracking scenarios with multiple objects, interference from similar objects means tracking may fail during the progression from occlusion to separation. To address this problem, this paper proposes a visual tracking algorithm with discrimination through multi-manifold learning. A color-gradient-based feature tensor is used to describe object appearance, to accommodate partial occlusion. A prior multi-manifold tensor dataset is established through a template-matching tracking algorithm. For the purpose of discrimination, a tensor distance is defined to determine the intra-manifold and inter-manifold neighborhood relationships in the multi-manifold space. Multi-manifold discriminant analysis is then employed to construct the multilinear projection matrices of the submanifolds. Finally, object states are obtained by combining these with sequence inference. Meanwhile, the multi-manifold dataset and the manifold-learning embedded projection are updated online. Experiments were conducted on two real visual surveillance sequences to evaluate the proposed algorithm against three state-of-the-art tracking methods, qualitatively and quantitatively. Experimental results show that the proposed algorithm is effective and robust in scenarios where multiple similar objects mutually occlude each other.

  18. Visual artificial grammar learning in dyslexia: A meta-analysis.

    Science.gov (United States)

    van Witteloostuijn, Merel; Boersma, Paul; Wijnen, Frank; Rispens, Judith

    2017-11-01

Literacy impairments in dyslexia have been hypothesized to be (partly) due to an implicit learning deficit. However, studies of implicit visual artificial grammar learning (AGL) have often yielded null results. The aim of this study is to weigh the evidence collected thus far by performing a meta-analysis of studies on implicit visual AGL in dyslexia. Thirteen studies were selected through a systematic literature search, representing data from 255 participants with dyslexia and 292 control participants (mean age range: 8.5-36.8 years old). If the 13 selected studies constitute a random sample, individuals with dyslexia perform worse on average than non-dyslexic individuals (average weighted effect size=0.46, 95% CI [0.14 … 0.77], p=0.008), with a larger effect in children than in adults (p=0.041; average weighted effect sizes 0.71 [sig.] versus 0.16 [non-sig.]). However, the presence of a publication bias indicates the existence of missing studies that may well null the effect. While the studies under investigation demonstrate that implicit visual AGL is impaired in dyslexia (more so in children than in adults, if in adults at all), the detected publication bias suggests that the effect might in fact be zero. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.

  19. Gender Stereotypes in Science Education Resources: A Visual Content Analysis.

    Science.gov (United States)

    Kerkhoven, Anne H; Russo, Pedro; Land-Zandstra, Anne M; Saxena, Aayush; Rodenburg, Frans J

    2016-01-01

    More men are studying and working in science fields than women. This could be an effect of the prevalence of gender stereotypes (e.g., science is for men, not for women). Aside from the media and people's social lives, such stereotypes can also occur in education. Ways in which stereotypes are visible in education include the use of gender-biased visuals, language, teaching methods, and teachers' attitudes. The goal of this study was to determine whether science education resources for primary school contained gender-biased visuals. Specifically, the total number of men and women depicted, and the profession and activity of each person in the visuals were noted. The analysis showed that there were more men than women depicted with a science profession and that more women than men were depicted as teachers. This study shows that there is a stereotypical representation of men and women in online science education resources, highlighting the changes needed to create a balanced representation of men and women. Even if the stereotypical representation of men and women in science is a true reflection of the gender distribution in science, we should aim for a more balanced representation. Such a balance is an essential first step towards showing children that both men and women can do science, which will contribute to more gender-balanced science and technology fields.

  20. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze the short-range and long-range characteristics of financial time series, and applies the method to five time series of five properties in four stock indices. Combining two analysis techniques, Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of the Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows more universal and subtle fluctuation characteristics of financial time series to be studied. By analyzing the Rényi entropy curves and the profiles of the logarithmic distribution of MFDFA for the five properties of the four stock indices, the 3MPAR method reveals fluctuation characteristics of the financial time series and the stock markets. It also yields richer information about the financial time series by comparing the profiles of the five properties across the four indices. In this paper, we focus not only on the multifractality of the time series but also on their fluctuation characteristics and on subtle differences among time series of different properties. We find that financial time series are far more complex than reported in studies that use only one property of the series.
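
    The Rényi entropy at the core of 3MPAR generalizes Shannon entropy with an order parameter q, recovering Shannon entropy as q approaches 1. A minimal sketch on discrete distributions (the distributions are illustrative, not derived from stock data):

```python
import math

def renyi_entropy(probs, q):
    """Rényi entropy of order q for a discrete distribution.
    As q -> 1 it reduces to Shannon entropy (handled as a special case)."""
    if abs(q - 1.0) < 1e-9:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return math.log(sum(p ** q for p in probs if p > 0)) / (1.0 - q)

# A uniform distribution maximizes entropy at every order q;
# a peaked distribution scores lower, more so at larger q.
uniform = [0.25] * 4
peaked = [0.85, 0.05, 0.05, 0.05]
h_uniform = renyi_entropy(uniform, 2)
h_peaked = renyi_entropy(peaked, 2)
```

Scanning q over a range and plotting the resulting entropy curve is the kind of "Rényi entropy curve" the abstract refers to; the multifractal side (generalized Hurst exponents) comes from MFDFA and is not sketched here.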

  1. Experiment archive, analysis, and visualization at the National Ignition Facility

    International Nuclear Information System (INIS)

    Hutton, Matthew S.; Azevedo, Stephen; Beeler, Richard; Bettenhausen, Rita; Bond, Essex; Casey, Allan; Liebman, Judith; Marsh, Amber; Pannell, Thomas; Warrick, Abbie

    2012-01-01

    Highlights: ► We show the computing architecture to manage scientific data from NIF experiments. ► NIF laser “shots” generate GBs of data for sub-microsec events separated by hours. ► Results are archived, analyzed and displayed with parallel and scalable code. ► Data quality and pedigree, based on calibration of each part, are tracked. ► Web-based visualization tools present data across shots and diagnostics. - Abstract: The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is the world's most energetic laser, providing a scientific research center to study inertial confinement fusion and matter at extreme energy densities and pressures. A target shot involves over 30 specialized diagnostics measuring critical x-ray, optical and nuclear phenomena to quantify ignition results for comparison with computational models. The Shot Analysis and Visualization System (SAVI) acquires and analyzes target diagnostic data for display within a time-budget of 30 min. Laser and target diagnostic data are automatically loaded into the NIF archive database through clustered software data collection agents. The SAVI Analysis Engine distributes signal and image processing tasks to a Linux cluster where computation is performed. Intermediate results are archived at each step of the analysis pipeline. Data is archived with metadata and pedigree. Experiment results are visualized through a web-based user interface in interactive dashboards tailored to single or multiple shot perspectives. The SAVI system integrates open-source software, commercial workflow tools, relational database and messaging technologies into a service-oriented and distributed software architecture that is highly parallel, scalable, and flexible. The architecture and functionality of the SAVI system will be presented along with examples.

  2. Flow boiling in microgap channels experiment, visualization and analysis

    CERN Document Server

    Alam, Tamanna; Jin, Li-Wen

    2013-01-01

    Flow Boiling in Microgap Channels: Experiment, Visualization and Analysis presents an up-to-date summary of the details of the confined to unconfined flow boiling transition criteria, flow boiling heat transfer and pressure drop characteristics, instability characteristics, two phase flow pattern and flow regime map and the parametric study of microgap dimension. Advantages of flow boiling in microgaps over microchannels are also highlighted. The objective of this Brief is to obtain a better fundamental understanding of the flow boiling processes, compare the performance between microgap and c

  3. Visual analysis of music in function of music video

    Directory of Open Access Journals (Sweden)

    Antal Silard

    2015-01-01

    Full Text Available Widespread all over the planet and incorporating all music genres, the music video, the subject of this analysis, has become irreplaceable in promotion, song presentation, an artist's image, and the visual aesthetics of subcultures; today, most countries in the world have a channel devoted solely to music, i.e. to music videos. The form began to develop rapidly in the 1950s, alongside television. As it developed, its purpose changed: from a simple presentation of musicians to an independent video form.

  4. Visual gamma-ray analysis. VIPF program (WINDOW 95)

    International Nuclear Information System (INIS)

    Yamada, S.

    1998-01-01

    The VIsual Peak Fitting (VIPF) program, which runs on WINDOWS 95 and analyzes gamma-radiation peaks from Ge detectors, has been developed. Gamma-ray peaks are modeled as Gaussian functions with a 1st- or 2nd-order polynomial for the background spectrum. Further functions can be added to the parameter fitting. The VIPF program can be downloaded from http://w3.rri.kyoto-u.ac.jp/~yamada. Details of the program procedure, an explanation of the fitting functions and the peak-search routine, and manuals for the code are given. (Ohno, S.)

  5. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties, and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer-based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with barcodes distinct from those of the majority of the genome; (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using BLAST against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job-management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.

  6. Visualization analysis of author collaborations in schizophrenia research.

    Science.gov (United States)

    Wu, Ying; Duan, Zhiguang

    2015-02-19

    Schizophrenia is a serious mental illness that levies a heavy medical toll and cost burden throughout the world. Scientific collaborations are necessary for progress in psychiatric research. However, there have been few publications on scientific collaborations in schizophrenia. The aim of this study was to investigate the extent of author collaborations in schizophrenia research. This study used 58,107 records on schizophrenia from 2003 to 2012, downloaded from Science Citation Index Expanded (SCI Expanded) via Web of Science. CiteSpace III, an information visualization and analysis software package, was used to perform the visual analysis. Collaborative author networks within the field of schizophrenia were determined using published documents. We found that external author collaboration networks were more scattered, while potential author collaboration networks were more compact. Results from hierarchical clustering analysis showed that the main collaborative field was genetic research in schizophrenia. Based on these results, authors belonging to different institutions and countries should be encouraged to collaborate in schizophrenia research. This will help researchers focus their studies on key issues, and allow them to offer reasonable suggestions for making policies and providing scientific evidence to effectively diagnose, prevent, and cure schizophrenia.

  7. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin

    2017-07-01

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation, improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy, the fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Many researchers have proposed techniques to accelerate the computational speed of contingency analysis, but little work has been published on how to post-process the large volume of contingency outputs quickly. This paper proposes a parallel post-processing function that analyzes contingency analysis outputs faster and displays them in a web-based visualization tool, helping power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
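
    The idea of parallel post-processing of contingency outputs can be sketched with a worker pool that screens each contingency's results for limit violations. The result schema, branch names, and loading limit below are hypothetical, not the paper's actual data model:

```python
from concurrent.futures import ThreadPoolExecutor

def screen_contingency(result, limit=1.05):
    """Flag branch overloads in one contingency's output.
    Toy schema: {'id': ..., 'flows': {branch: per-unit loading}}."""
    overloads = {br: f for br, f in result["flows"].items() if f > limit}
    return result["id"], overloads

def parallel_screen(results, workers=4):
    """Post-process many contingency outputs concurrently, keeping only
    cases with at least one violation (a sketch of the paper's idea)."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        screened = list(pool.map(screen_contingency, results))
    return {cid: ov for cid, ov in screened if ov}

# Two hypothetical contingency results: one with an overload, one clean
results = [
    {"id": "line-12", "flows": {"b1": 0.92, "b2": 1.10}},
    {"id": "gen-3", "flows": {"b1": 0.88, "b2": 0.95}},
]
violations = parallel_screen(results)
```

A production system would more likely use process-level parallelism and stream results into the visualization layer rather than collect them in memory.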

  8. The Web system of visualization and analysis equipped with reproducibility

    International Nuclear Information System (INIS)

    Ueshima, Yutaka; Saito, Kanji; Takeda, Yasuhiro; Nakai, Youichi; Hayashi, Sachiko

    2005-01-01

    In advanced photon experimental research, a real-time visualization and steering system is considered a desirable method of data analysis. This approach is valid only for a fixed analysis performed once, or for easily reproducible experiments. In research on unknown problems, however, such as advanced photon experiments, the observation data must be analyzable many times, because a profitable analysis is difficult to achieve on the first attempt. Consequently, output data should be filed so that they can be consulted and analyzed at any time. To support such research, the following automated functions are needed: transporting data files from the data generator to data storage, analyzing the data, tracking the history of data handling, and so on. The supporting system will be an integrated database system with several functional servers distributed over the network. (author)

  9. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using a finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. The study also aimed to find out the reason behind the behavior of the PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of the PSEi; this was examined using the Granger causality test. The findings showed that the best time series model for the PSEi is ARIMA(1,1,5)-ARCH(1). The Consumer Price Index, crude oil price and foreign exchange rate were concluded to Granger-cause the PSEi.
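
    The Granger causality test asks whether lagged values of one series improve prediction of another beyond the target's own lags. A minimal lag-1 version via least squares, using synthetic data rather than the study's series (a sketch of the idea; the study would have used a full statistical package):

```python
import numpy as np

def granger_f_stat(y, x, lag=1):
    """F-statistic for a lag-1 Granger test: does past x help predict y
    beyond y's own past? Larger F = stronger evidence of Granger causality."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    Y = y[lag:]
    # Restricted model: y_t ~ const + y_{t-1}
    Xr = np.column_stack([np.ones(len(Y)), y[:-lag]])
    # Unrestricted model: y_t ~ const + y_{t-1} + x_{t-1}
    Xu = np.column_stack([Xr, x[:-lag]])
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Xu @ np.linalg.lstsq(Xu, Y, rcond=None)[0]) ** 2)
    n, q = len(Y), 1  # q = number of restrictions
    return (rss_r - rss_u) / q / (rss_u / (n - Xu.shape[1]))

rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):            # y is driven by lagged x
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()
f_xy = granger_f_stat(y, x)        # should be large: x Granger-causes y
f_yx = granger_f_stat(x, y)        # should be small: y does not cause x
```

In practice one would compare the F-statistic against the appropriate F distribution for a p-value, and test several lag orders.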

  10. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available Natural rubber is a non-wood product obtained by coagulating the latex of certain forest species, chiefly Hevea brasiliensis. Native to the Amazon Region, this species was already known to the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. As with countless other products, forecasting future prices of natural rubber has been the object of many studies. The use of univariate time series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of Brazilian natural rubber prices (R$/kg) over the Jan/1999-Jun/2006 period, in order to characterize price behavior in the domestic market; estimated a model for the monthly natural rubber price series; and forecast domestic natural rubber prices for the Jul/2006-Jun/2007 period based on the estimated models. The models studied belong to the ARIMA family. The main results were: the domestic natural rubber market is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best fit to the natural rubber price series (R$/kg); and the forecasts produced for the series were statistically adequate.
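
    The differencing-plus-autoregression core of such an ARIMA fit can be sketched as follows. This toy version differences once and fits only an AR(1) term by least squares, so it is an ARIMA(1,1,0)-style approximation; the MA term of the paper's ARIMA(1,1,1) is omitted, and the price figures are invented:

```python
import numpy as np

def ar1_diff_forecast(series, steps=3):
    """Toy ARIMA(1,1,0)-style forecast: difference the series once,
    fit an AR(1) coefficient by least squares on the differences,
    forecast the differences forward, then integrate back to levels."""
    d = np.diff(np.asarray(series, float))
    phi = np.dot(d[:-1], d[1:]) / np.dot(d[:-1], d[:-1])  # AR(1) coefficient
    forecasts, last, step = [], series[-1], d[-1]
    for _ in range(steps):
        step = phi * step          # next forecast change
        last = last + step         # cumulate back to price level
        forecasts.append(last)
    return phi, forecasts

prices = [1.0, 1.1, 1.25, 1.3, 1.42, 1.5, 1.61, 1.68]  # hypothetical R$/kg
phi, fc = ar1_diff_forecast(prices)
```

A real ARIMA(1,1,1) fit would estimate the MA term jointly by maximum likelihood, which is why dedicated time-series libraries are used in practice.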

  11. Analysis of the relationship among visual evoked potential, oscillatory potential and visual acuity under simulated weightlessness

    Directory of Open Access Journals (Sweden)

    Jun Zhao

    2013-05-01

    Full Text Available AIM: To observe the influence of head-down-tilt simulated weightlessness on the visual evoked potential (VEP), oscillatory potentials (OPs) and visual acuity, and to analyse the relationship among them. METHODS: A head-down tilt of -6° was adopted in 14 healthy volunteers. Distant visual acuity, near visual acuity, VEP and OPs were recorded before the trial and two and five days after it began. The OPs were recorded following the ISCEV standard for full-field clinical electroretinography (2008 update). RESULTS: Significant differences were detected in the amplitude of the P100 waves and ∑OPs among the various time points (P<0.05), but no relationship was observed among VEP, OPs and visual acuity. CONCLUSION: Head-down-tilt simulated weightlessness induces a redistribution of blood throughout the body, including the eyes, which can alter visual electrophysiology but not visual acuity.

  12. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and report the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between the stock markets differs across time periods, and that the similarity of the two markets became larger after these two crises. We also obtained similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. The results show that satisfactory information can be extracted from financial markets by this method. The information categorization method can be used not only on physiologic time series but also on financial time series.

  13. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    Full Text Available The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.

  14. Metaviz: interactive statistical and visual analysis of metagenomic data.

    Science.gov (United States)

    Wagner, Justin; Chelaru, Florin; Kancherla, Jayaram; Paulson, Joseph N; Zhang, Alexander; Felix, Victor; Mahurkar, Anup; Elmqvist, Niklas; Corrada Bravo, Héctor

    2018-04-06

    Large studies profiling microbial communities and their association with healthy or disease phenotypes are now commonplace. Processed data from many of these studies are publicly available but significant effort is required for users to effectively organize, explore and integrate it, limiting the utility of these rich data resources. Effective integrative and interactive visual and statistical tools to analyze many metagenomic samples can greatly increase the value of these data for researchers. We present Metaviz, a tool for interactive exploratory data analysis of annotated microbiome taxonomic community profiles derived from marker gene or whole metagenome shotgun sequencing. Metaviz is uniquely designed to address the challenge of browsing the hierarchical structure of metagenomic data features while rendering visualizations of data values that are dynamically updated in response to user navigation. We use Metaviz to provide the UMD Metagenome Browser web service, allowing users to browse and explore data for more than 7000 microbiomes from published studies. Users can also deploy Metaviz as a web service, or use it to analyze data through the metavizr package to interoperate with state-of-the-art analysis tools available through Bioconductor. Metaviz is free and open source with the code, documentation and tutorials publicly accessible.

  15. Visualization-based analysis of multiple response survey data

    Science.gov (United States)

    Timofeeva, Anastasiia

    2017-11-01

    During surveys, respondents are often allowed to tick more than one answer option for a question. Analysis and visualization of such data are difficult because multiple response variables must be processed. With standard representations such as pie and bar charts, information about the association between different answer options is lost. The author proposes a visualization approach for multiple response variables based on Venn diagrams. For a more informative representation with a large number of overlapping groups, similarity and association matrices are suggested. Aggregate indicators of dissimilarity (similarity) are proposed based on the determinant of the similarity matrix and the maximum eigenvalue of the association matrix. The application of the proposed approaches is illustrated by an analysis of advertising sources. Intersection of sets indicates that the same consumer audience is covered by several advertising sources; this information is very important for allocating the advertising budget. The differences between target groups across advertising sources are also of interest, and to identify such differences, hypotheses of homogeneity and independence are tested. Recent approaches to the problem are briefly reviewed and compared, and an alternative procedure is suggested. It is based on partitioning a consumer audience into pairwise disjoint subsets and includes hypothesis testing of the difference between population proportions. It turned out to be more suitable for the real problem being solved.
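
    The association between answer options that pie and bar charts lose can be recovered from a pairwise co-occurrence count, a building block for the association matrix described above. The survey data here are invented for illustration:

```python
from itertools import combinations

def cooccurrence_matrix(responses, options):
    """Count how often each pair of answer options is ticked together
    across respondents (a sketch of the association-matrix idea)."""
    counts = {pair: 0 for pair in combinations(options, 2)}
    for ticked in responses:
        for a, b in combinations(options, 2):
            if a in ticked and b in ticked:
                counts[(a, b)] += 1
    return counts

# Each respondent ticks any subset of advertising sources
options = ["TV", "web", "radio"]
responses = [{"TV", "web"}, {"TV"}, {"web", "radio"}, {"TV", "web", "radio"}]
m = cooccurrence_matrix(responses, options)
```

Normalizing each count by the sizes of the two option groups would turn this into the kind of similarity matrix the abstract proposes for many overlapping groups.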

  16. Time-Series Analysis of Supergranule Characteristics at Solar Minimum

    Science.gov (United States)

    Williams, Peter E.; Pesnell, W. Dean

    2013-01-01

    Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
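
    Cross-correlating two series over a range of lags, as done above for supergranule size and velocity, can be sketched as follows; synthetic sine-wave data stand in for the solar time series:

```python
import numpy as np

def lagged_correlation(x, y, max_lag=5):
    """Pearson correlation of x against y shifted by each lag in
    [-max_lag, max_lag]; returns the lag with the largest |r|."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    best = (0, 0.0)
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            a, b = x[:lag], y[-lag:]
        elif lag > 0:
            a, b = x[lag:], y[:-lag]
        else:
            a, b = x, y
        r = np.corrcoef(a, b)[0, 1]
        if abs(r) > abs(best[1]):
            best = (lag, r)
    return best

t = np.arange(100, dtype=float)
x = np.sin(t / 5.0)
y = -np.sin((t - 2) / 5.0)   # inverted copy of x, delayed by 2 samples
lag, r = lagged_correlation(x, y)
```

On these synthetic data the strongest correlation is negative and occurs at a small nonzero lag, the same qualitative signature the abstract reports for size versus velocity.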

  17. Scientific Visualization for Atmospheric Data Analysis in Collaborative Virtual Environments

    Science.gov (United States)

    Engelke, Wito; Flatken, Markus; Garcia, Arturo S.; Bar, Christian; Gerndt, Andreas

    2016-04-01

    1 INTRODUCTION The three-year European research project CROSS DRIVE (Collaborative Rover Operations and Planetary Science Analysis System based on Distributed Remote and Interactive Virtual Environments) started in January 2014. The research and development within this project is motivated by three use-case studies: landing site characterization, atmospheric science and rover target selection [1]. Currently the implementation for the second use case is in its final phase [2]. Here, the requirements were generated based on the domain experts' input and led to the development and integration of appropriate methods for the visualization and analysis of atmospheric data. The methods range from volume rendering, interactive slicing and iso-surface techniques to interactive probing. All visualization methods are integrated in DLR's Terrain Rendering application. With this, the high-resolution surface data visualization can be enriched with additional methods appropriate for atmospheric data sets. The result is an integrated virtual environment in which scientists can interactively explore their data sets directly within the correct context. The data sets include volumetric data of the Martian atmosphere, precomputed two-dimensional maps and vertical profiles. In most cases the surface data as well as the atmospheric data have global coverage and are time dependent. Furthermore, all interaction is synchronized between different connected application instances, allowing for collaborative sessions between distant experts. 2 VISUALIZATION TECHNIQUES Although the application is currently used for visualization of data sets related to Mars, the techniques can be used for other data sets as well. Currently the prototype is capable of handling 2D and 2.5D surface data as well as 4D atmospheric data. Specifically, the surface data are presented using an LoD approach which is based on the HEALPix tessellation of a sphere [3, 4, 5] and can handle data sets in the order of

  18. Visualization and analysis of eddies in a global ocean simulation

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Sean J [Los Alamos National Laboratory; Hecht, Matthew W [Los Alamos National Laboratory; Petersen, Mark [Los Alamos National Laboratory; Strelitz, Richard [Los Alamos National Laboratory; Maltrud, Mathew E [Los Alamos National Laboratory; Ahrens, James P [Los Alamos National Laboratory; Hlawitschka, Mario [UC DAVIS; Hamann, Bernd [UC DAVIS

    2010-10-15

    Eddies at a scale of approximately one hundred kilometers have been shown to be surprisingly important to understanding large-scale transport of heat and nutrients in the ocean. Due to difficulties in observing the ocean directly, the behavior of eddies below the surface is not very well understood. To fill this gap, we employ a high-resolution simulation of the ocean developed at Los Alamos National Laboratory. Using large-scale parallel visualization and analysis tools, we produce three-dimensional images of ocean eddies, and also generate a census of eddy distribution and shape averaged over multiple simulation time steps, resulting in a world map of eddy characteristics. As expected from observational studies, our census reveals a higher concentration of eddies at the mid-latitudes than the equator. Our analysis further shows that mid-latitude eddies are thicker, within a range of 1000-2000m, while equatorial eddies are less than 100m thick.

  19. Integrating sentiment analysis and term associations with geo-temporal visualizations on customer feedback streams

    Science.gov (United States)

    Hao, Ming; Rohrdantz, Christian; Janetzko, Halldór; Keim, Daniel; Dayal, Umeshwar; Haug, Lars-Erik; Hsu, Mei-Chun

    2012-01-01

    Twitter currently receives over 190 million tweets (small text-based Web posts) a day, and manufacturing companies receive over 10 thousand web product surveys a day, in which people share their thoughts regarding a wide range of products and their features. A large number of tweets and customer surveys include opinions about products and services. However, with Twitter being a relatively new phenomenon, these tweets are underutilized as a source for determining customer sentiment. To explore high-volume customer feedback streams, we integrate three time-series-based visual analysis techniques: (1) feature-based sentiment analysis that extracts, measures, and maps customer feedback; (2) a novel idea of term associations that identifies attributes, verbs, and adjectives frequently occurring together; and (3) new pixel-cell-based sentiment calendars, geo-temporal map visualizations and self-organizing maps to identify co-occurring and influential opinions. We have combined these techniques into a well-fitted solution for effective analysis of large customer feedback streams such as movie reviews (e.g., Kung-Fu Panda) or web surveys (buyers).
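
    A toy version of the term-association step, counting how often tracked terms co-occur in the same feedback item; the feedback text and vocabulary are invented, and a real system would add part-of-speech tagging to separate attributes, verbs, and adjectives:

```python
from collections import Counter
from itertools import combinations

def term_associations(feedback, vocab):
    """Count co-occurrences of tracked terms within each feedback item
    (a sketch of the term-association idea; no POS tagging)."""
    pairs = Counter()
    for text in feedback:
        present = sorted(w for w in vocab if w in text.lower().split())
        pairs.update(combinations(present, 2))
    return pairs

feedback = [
    "battery life is great",
    "screen is great but battery drains",
    "screen resolution is sharp",
]
vocab = {"battery", "screen", "great"}
assoc = term_associations(feedback, vocab)
```

The resulting pair counts are exactly what a pixel-cell calendar or map would then aggregate by time and location.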

  20. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    Science.gov (United States)

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods: Quasar, GPA II and four experts. The sensitivity, specificity and agreement (kappa) for each method were calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields of 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, sensitivity and specificity of Quasar, GPA II and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6% respectively. When suspected cases of progression were considered as progressing, sensitivity and specificity of Quasar, GPA II and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6% respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28 respectively. The degree of agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The degree of agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
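
    The reported metrics can be reproduced on toy labels: sensitivity and specificity against a reference standard, and Cohen's kappa for agreement between two raters. The binary labels below are illustrative, not the study's data:

```python
def sensitivity_specificity(truth, pred):
    """Sensitivity and specificity of binary progression calls
    against a reference standard (1 = progressing, 0 = stable)."""
    tp = sum(t and p for t, p in zip(truth, pred))
    tn = sum((not t) and (not p) for t, p in zip(truth, pred))
    fn = sum(t and (not p) for t, p in zip(truth, pred))
    fp = sum((not t) and p for t, p in zip(truth, pred))
    return tp / (tp + fn), tn / (tn + fp)

def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters: observed agreement
    corrected for the agreement expected by chance."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    pe = (sum(a) / n) * (sum(b) / n) + (1 - sum(a) / n) * (1 - sum(b) / n)
    return (po - pe) / (1 - pe)

truth = [1, 1, 1, 0, 0, 0, 0, 1]   # hypothetical expert reference
pred  = [1, 1, 0, 0, 0, 1, 0, 1]   # hypothetical program output
sens, spec = sensitivity_specificity(truth, pred)
kappa = cohen_kappa(truth, pred)
```

Treating "suspected" cases as stable versus progressing, as the study does, amounts to re-running these computations with the ambiguous labels recoded each way.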

  1. Visibility graph analysis on quarterly macroeconomic series of China based on complex network theory

    Science.gov (United States)

    Wang, Na; Li, Dong; Wang, Qiwen

    2012-12-01

    The visibility graph approach and complex network theory provide a new insight into time series analysis. The inheritance of the visibility graph from the original time series was further explored in the paper. We found that degree distributions of visibility graphs extracted from Pseudo Brownian Motion series obtained by the Frequency Domain algorithm exhibit exponential behaviors, in which the exponential exponent is a binomial function of the Hurst index inherited in the time series. Our simulations showed that the quantitative relations between the Hurst indexes and the exponents of the degree distribution function are different for different series and that the visibility graph inherits some important features of the original time series. Further, we convert some quarterly macroeconomic series, including the growth rates of value-added of three industry series and the growth rates of Gross Domestic Product series of China, to graphs by the visibility algorithm and explore the topological properties of the graphs associated with the four macroeconomic series, namely, the degree distribution and correlations, the clustering coefficient, the average path length, and community structure. Based on complex network analysis, we find that the degree distributions of the networks associated with the growth rates of value-added of the three industry series are almost exponential and the degree distributions of the networks associated with the growth rates of GDP series are scale-free. We also discuss the assortativity and disassortativity of the four associated networks as they are related to the evolutionary process of the original macroeconomic series. All the constructed networks have “small-world” features. The community structures of the associated networks suggest dynamic changes of the original macroeconomic series. We also detected the relationship among government policy changes, community structures of associated networks and macroeconomic dynamics. We find great influences of government
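
    The natural visibility algorithm that converts a series to a graph is simple to state: two observations are linked when no intermediate observation rises to or above the straight line between them. A minimal O(n³) sketch (a generic illustration, not the authors' implementation):

```python
def visibility_graph(series):
    """Natural visibility graph: nodes are time indices; a and b are linked
    when every intermediate point lies strictly below the straight line
    joining (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            ya, yb = series[a], series[b]
            if all(series[c] < ya + (yb - ya) * (c - a) / (b - a)
                   for c in range(a + 1, b)):
                edges.add((a, b))
    return edges

edges = visibility_graph([3.0, 1.0, 2.0, 0.5, 4.0])
# Degree sequence, the raw material for the degree-distribution analysis above.
degree = {v: sum(v in e for e in edges) for v in range(5)}
```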

  2. Anatomy of the ICDS series: A bibliometric analysis

    International Nuclear Information System (INIS)

    Cardona, Manuel; Marx, Werner

    2007-01-01

    In this article, the proceedings of the International Conferences on Defects in Semiconductors (ICDS) have been analyzed by bibliometric methods. The papers of these conferences have been published as articles in regular journals or special proceedings journals and in books with diverse publishers. The conference name/title changed several times. Many of the proceedings did not appear in the so-called 'source journals' covered by the Thomson/ISI citation databases, in particular by the Science Citation Index (SCI). But the number of citations within these source journals can be determined using the Cited Reference Search mode under the Web of Science (WoS) and the SCI offered by the host STN International. The search functions of both systems were needed to select the papers published as different document types and to cover the full time span of the series. The most cited ICDS papers were identified, and the overall numbers of citations as well as the time-dependent impact of these papers, of single conferences, and of the complete series, were established. The complete set of citing papers was analyzed with respect to the countries of the citing authors, the citing journals, and the ISI subject categories.

  3. Discontinuous conduction mode analysis of phase-modulated series ...

    Indian Academy of Sciences (India)

    Utsab Kundu

    domain analysis; frequency domain analysis; critical load resistance. Only fragments of the abstract are recoverable: the DCMSRC design process requires repeated circuit simulations for design ... a structured derivation of Av is presented ...

  4. How accurate are interpretations of curriculum-based measurement progress monitoring data? Visual analysis versus decision rules.

    Science.gov (United States)

    Van Norman, Ethan R; Christ, Theodore J

    2016-10-01

    Curriculum-based measurement of oral reading (CBM-R) is used to monitor the effects of academic interventions for individual students. Decisions to continue, modify, or terminate these interventions are made by interpreting time series CBM-R data. Such interpretation is founded upon visual analysis or the application of decision rules. The purpose of this study was to compare the accuracy of visual analysis and decision rules. Visual analysts interpreted 108 CBM-R progress monitoring graphs one of three ways: (a) without graphic aids, (b) with a goal line, or (c) with a goal line and a trend line. Graphs differed along three dimensions, including trend magnitude, variability of observations, and duration of data collection. Automated trend line and data point decision rules were also applied to each graph. Inferential analyses permitted the estimation of the probability of a correct decision (i.e., the student is improving - continue the intervention, or the student is not improving - discontinue the intervention) for each evaluation method as a function of trend magnitude, variability of observations, and duration of data collection. All evaluation methods performed better when students made adequate progress. Visual analysis and decision rules performed similarly when observations were less variable. Results suggest that educators should collect data for more than six weeks, take steps to control measurement error, and visually analyze graphs when data are variable. Implications for practice and research are discussed. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
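
    A trend-line decision rule of the general kind compared in the study can be sketched as follows; this is one common variant (compare the fitted weekly growth against the growth implied by the goal line), not necessarily the exact rules evaluated:

```python
def trend_decision(scores, baseline, goal, total_weeks):
    """One common trend-line rule (a sketch, not necessarily the study's):
    continue the intervention if the fitted weekly growth meets or exceeds
    the growth rate implied by the goal line."""
    n = len(scores)
    xs = range(n)
    mx, my = sum(xs) / n, sum(scores) / n
    # Least-squares slope of weekly scores = observed growth per week.
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, scores))
             / sum((x - mx) ** 2 for x in xs))
    goal_slope = (goal - baseline) / total_weeks
    return "continue" if slope >= goal_slope else "modify"

# Six weekly words-correct-per-minute scores; goal of 50 WCPM in 30 weeks.
decision = trend_decision([20, 22, 23, 26, 27, 29], 20, 50, 30)
```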

  5. The Analysis Of Personality Disorder On Two Characters In The Animation Series Black Rock Shooter

    OpenAIRE

    Ramadhana, Rizki Andrian

    2015-01-01

    The title of this thesis is The Analysis of Personality Disorder on Two Characters in the Animation Series “Black Rock Shooter”, which discusses the personality disorders of two characters from this series: Kagari Izuriha and Yomi Takanashi. The animation series Black Rock Shooter is chosen as the source of data because this animation has a psychological genre and represents the complexity of human relationships, especially when building up a friendship. It is because a human is a social...

  6. PIVOT: platform for interactive analysis and visualization of transcriptomics data.

    Science.gov (United States)

    Zhu, Qin; Fisher, Stephen A; Dueck, Hannah; Middleton, Sarah; Khaladkar, Mugdha; Kim, Junhyong

    2018-01-05

    Many R packages have been developed for transcriptome analysis but their use often requires familiarity with R and integrating results of different packages requires scripts to wrangle the datatypes. Furthermore, exploratory data analyses often generate multiple derived datasets such as data subsets or data transformations, which can be difficult to track. Here we present PIVOT, an R-based platform that wraps open source transcriptome analysis packages with a uniform user interface and graphical data management that allows non-programmers to interactively explore transcriptomics data. PIVOT supports more than 40 popular open source packages for transcriptome analysis and provides an extensive set of tools for statistical data manipulations. A graph-based visual interface is used to represent the links between derived datasets, allowing easy tracking of data versions. PIVOT further supports automatic report generation, publication-quality plots, and program/data state saving, such that all analyses can be saved, shared and reproduced. PIVOT will allow researchers with broad backgrounds to easily access sophisticated transcriptome analysis tools and interactively explore transcriptome datasets.

  7. Multi-Scale Entropy Analysis as a Method for Time-Series Analysis of Climate Data

    Directory of Open Access Journals (Sweden)

    Heiko Balzter

    2015-03-01

    Full Text Available Evidence is mounting that the temporal dynamics of the climate system are changing at the same time as the average global temperature is increasing due to multiple climate forcings. A large number of extreme weather events such as prolonged cold spells, heatwaves, droughts and floods have been recorded around the world in the past 10 years. Such changes in the temporal scaling behaviour of climate time-series data can be difficult to detect. While there are easy and direct ways of analysing climate data by calculating the means and variances for different levels of temporal aggregation, these methods can miss more subtle changes in their dynamics. This paper describes multi-scale entropy (MSE) analysis as a tool to study climate time-series data and to identify temporal scales of variability and their change over time in climate time-series. MSE estimates the sample entropy of the time-series after coarse-graining at different temporal scales. An application of MSE to Central European, variance-adjusted, mean monthly air temperature anomalies (CRUTEM4v) is provided. The results show that the temporal scales of the current climate (1960–2014) are different from the long-term average (1850–1960). For temporal scale factors longer than 12 months, the sample entropy increased markedly compared to the long-term record. Such an increase can be explained by systems theory with greater complexity in the regional temperature data. From 1961 the patterns of monthly air temperatures are less regular at time-scales greater than 12 months than in the earlier time period. This finding suggests that, at these inter-annual time scales, the temperature variability has become less predictable than in the past. It is possible that climate system feedbacks are expressed in altered temporal scales of the European temperature time-series data. A comparison with the variance and Shannon entropy shows that MSE analysis can provide additional information on the
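
    The MSE computation itself is compact: coarse-grain the series at each scale factor and take the sample entropy of each coarse-grained series. A minimal sketch (m = 2 by default; in practice the tolerance r is usually set to a fraction of the series' standard deviation):

```python
import math

def coarse_grain(x, scale):
    """Non-overlapping block means at the given scale factor."""
    return [sum(x[i:i + scale]) / scale
            for i in range(0, len(x) - scale + 1, scale)]

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    template pairs matching within tolerance r for m points also match
    for m + 1 points. Undefined (log of zero) for very short series."""
    n = len(x)
    def matches(mm):
        # Use n - m templates for both lengths (standard SampEn convention).
        templates = [x[i:i + mm] for i in range(n - m)]
        return sum(
            1
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
        )
    return -math.log(matches(m + 1) / matches(m))

# Multi-scale entropy: sample entropy of the coarse-grained series at each scale.
signal = [float(i % 4) for i in range(120)]          # a perfectly regular signal
mse = [sample_entropy(coarse_grain(signal, s)) for s in (1, 2, 3)]
```

    A perfectly periodic signal yields zero entropy at every scale; an irregular signal yields positive values, and the scale-by-scale profile is what MSE analysis interprets.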

  8. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki

    2008-01-01

    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed the Coarse Mesh Projection Method (CMPM) and can be implemented using coarse statistical bins for the acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit in the signal processing discipline and the neutron multiplication eigenvalue problem in the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three-dimensional spatial modeling of the initial core of a pressurized water reactor.

  9. Trident: scalable compute archives: workflows, visualization, and analysis

    Science.gov (United States)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era requires new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise even to novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using NodeJS, AngularJS, and HighCharts JavaScript libraries, among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub

  10. SVIP-N 1.0: An integrated visualization platform for neutronics analysis

    International Nuclear Information System (INIS)

    Luo Yuetong; Long Pengcheng; Wu Guoyong; Zeng Qin; Hu Liqin; Zou Jun

    2010-01-01

    Post-processing is an important part of neutronics analysis, and SVIP-N 1.0 (scientific visualization integrated platform for neutronics analysis) is designed to ease the post-processing of neutronics analysis through visualization technologies. The main capabilities of SVIP-N 1.0 include: (1) the ability to manage neutronics analysis results; (2) the ability to preprocess neutronics analysis results; (3) the ability to visualize neutronics analysis result data in different ways. The paper describes the system architecture and main features of SVIP-N, some advanced visualization techniques used in SVIP-N 1.0, and some preliminary applications, such as ITER.

  11. New trends in data analysis and visualization on the web

    CERN Multimedia

    CERN. Geneva; Alvarez, Sergio

    2011-01-01

    Emerging technologies are democratizing the power of server clusters to perform data analysis. At the same time new tools and new ideas are coming to data visualization on the web. Together this is creating a revolution on the Internet where data is combined with other data and visualized in amazing new ways. And this is now mainstream. The role this revolution has on science and how it is communicated is now starting to become clearer with new trends like citizen science and crowd-sourcing. As a result, researchers are exploiting a new kind of computation power, human computation. In this seminar we will talk about the story behind some of these initiatives while presenting some new projects being developed by Vizzuality. From how to find new exoplanets, to understanding climate change or doing better nature conservation, we will try to present you with as many ideas as possible so that you might find a way to use them in your field or engage the general public in your wor...

  12. Analysis and Visualization of Relations in eLearning

    Science.gov (United States)

    Dráždilová, Pavla; Obadi, Gamila; Slaninová, Kateřina; Martinovič, Jan; Snášel, Václav

    The popularity of eLearning systems is growing rapidly; this growth is enabled by the consecutive development in Internet and multimedia technologies. Web-based education became widespread in the past few years. Various types of learning management systems facilitate the development of Web-based courses. Users of these courses form social networks through the different activities performed by them. This chapter focuses on searching for the latent social networks in eLearning systems data. These data consist of students' activity records wherein latent ties among actors are embedded. The social network studied in this chapter is represented by groups of students who have similar contacts and interact in similar social circles. Different methods of data clustering analysis can be applied to these groups, and the findings show the existence of latent ties among the group members. The second part of this chapter focuses on social network visualization. Graphical representation of a social network can describe its structure very efficiently. It can enable social network analysts to determine the network's degree of connectivity. Analysts can easily determine individuals with a small or large number of relationships as well as the number of independent groups in a given network. When applied to the field of eLearning, data visualization simplifies the process of monitoring the study activities of individuals or groups, as well as the planning of educational curricula, the evaluation of study processes, etc.

  13. Interactive visualization and analysis of multimodal datasets for surgical applications.

    Science.gov (United States)

    Kirmizibayrak, Can; Yim, Yeny; Wakid, Mike; Hahn, James

    2012-12-01

    Surgeons use information from multiple sources when making surgical decisions. These include volumetric datasets (such as CT, PET, MRI, and their variants), 2D datasets (such as endoscopic videos), and vector-valued datasets (such as computer simulations). Presenting all the information to the user in an effective manner is a challenging problem. In this paper, we present a visualization approach that displays the information from various sources in a single coherent view. The system allows the user to explore and manipulate volumetric datasets, display analysis of dataset values in local regions, combine 2D and 3D imaging modalities and display results of vector-based computer simulations. Several interaction methods are discussed: in addition to traditional interfaces including mouse and trackers, gesture-based natural interaction methods are shown to control these visualizations with real-time performance. An example of a medical application (medialization laryngoplasty) is presented to demonstrate how the combination of different modalities can be used in a surgical setting with our approach.

  14. Biological time series analysis using a context free language: applicability to pulsatile hormone data.

    Directory of Open Access Journals (Sweden)

    Dennis A Dean

    Full Text Available We present a novel approach for analyzing biological time-series data using a context-free language (CFL) representation that allows the extraction and quantification of important features from the time-series. This representation results in Hierarchically AdaPtive (HAP) analysis, a suite of multiple complementary techniques that enables rapid analysis of data and does not require the user to set parameters. HAP analysis generates hierarchically organized parameter distributions that allow multi-scale components of the time-series to be quantified and includes a data analysis pipeline that applies recursive analyses to generate hierarchically organized results that extend traditional outcome measures such as pharmacokinetics and the inter-pulse interval. Pulsicons, a novel text-based time-series representation also derived from the CFL approach, are introduced as an objective qualitative comparison nomenclature. We apply HAP to the analysis of 24 hours of frequently sampled pulsatile cortisol hormone data, which has known analysis challenges, from 14 healthy women. HAP analysis generated results in seconds and produced dozens of figures for each participant. The results quantify the observed qualitative features of cortisol data as a series of pulse clusters, each consisting of one or more embedded pulses, and identify two ultradian phenotypes in this dataset. HAP analysis is designed to be robust to individual differences and to missing data and may be applied to other pulsatile hormones. Future work can extend HAP analysis to other time-series data types, including oscillatory and other periodic physiological signals.

  15. Sophisticated visualization algorithms for analysis of multidimensional experimental nuclear spectra

    International Nuclear Information System (INIS)

    Morhac, M.; Kliman, J.; Matousek, V.; Turzo, I.

    2004-01-01

    This paper describes graphical models of visualization of 2-, 3-, and 4-dimensional scalar data used in the nuclear data acquisition, processing and visualization system developed at the Institute of Physics, Slovak Academy of Sciences. It focuses on the presentation of nuclear spectra (histograms). However, it can be successfully applied for the visualization of arrays of other data types. In the paper we present conventional as well as newly developed surface and volume rendering visualization techniques (Authors)

  16. Time Series in Education: The Analysis of Daily Attendance in Two High Schools

    Science.gov (United States)

    Koopmans, Matthijs

    2011-01-01

    This presentation discusses the use of a time series approach to the analysis of daily attendance in two urban high schools over the course of one school year (2009-10). After establishing that the series for both schools were stationary, they were examined for moving average processes, autoregression, seasonal dependencies (weekly cycles),…

  17. Mapping air temperature using time series analysis of LST : The SINTESI approach

    NARCIS (Netherlands)

    Alfieri, S.M.; De Lorenzi, F.; Menenti, M.

    2013-01-01

    This paper presents a new procedure to map time series of air temperature (Ta) at fine spatial resolution using time series analysis of satellite-derived land surface temperature (LST) observations. The method assumes that air temperature is known at a single (reference) location such as in gridded

  18. Time-series analysis of Nigeria rice supply and demand: Error ...

    African Journals Online (AJOL)

    The study examined a time-series analysis of Nigeria rice supply and demand with a view to determining any long-run equilibrium between them using the Error Correction Model approach (ECM). The data used for the study represents the annual series of 1960-2007 (47 years) for rice supply and demand in Nigeria, ...
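
    The Error Correction Model approach can be illustrated with a two-step Engle-Granger sketch on synthetic data (not the study's rice series); for brevity the short-run equation is simplified to a regression of the change in demand on the lagged equilibrium error alone:

```python
def ols(y, x):
    """OLS of y on x with intercept; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def adjustment_speed(demand, supply):
    """Two-step Engle-Granger sketch: estimate the long-run relation, then
    regress the change in demand on the lagged equilibrium error. A negative
    slope indicates error correction toward the long-run equilibrium."""
    a, b = ols(demand, supply)
    err = [d - (a + b * s) for d, s in zip(demand, supply)]
    d_demand = [demand[t] - demand[t - 1] for t in range(1, len(demand))]
    _, gamma = ols(d_demand, err[:-1])
    return gamma

# Synthetic series in which demand adjusts 60% of the way toward
# its long-run level (2 + 0.5 * supply) each period.
supply = [float(t) for t in range(30)]
demand = [2.0]
for t in range(1, 30):
    demand.append(demand[-1] + 0.6 * ((2.0 + 0.5 * supply[t]) - demand[-1]))

gamma = adjustment_speed(demand, supply)   # negative: error correction
```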

  19. Taxation in Public Education. Analysis and Bibliography Series, No. 12.

    Science.gov (United States)

    Ross, Larry L.

    Intended for both researchers and practitioners, this analysis and bibliography cites approximately 100 publications on educational taxation, including general texts and reports, statistical reports, taxation guidelines, and alternative proposals for taxation. Topics covered in the analysis section include State and Federal aid, urban and suburban…

  20. SEURAT: visual analytics for the integrated analysis of microarray data.

    Science.gov (United States)

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  1. SEURAT: Visual analytics for the integrated analysis of microarray data

    Directory of Open Access Journals (Sweden)

    Bullinger Lars

    2010-06-01

    Full Text Available Abstract Background In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. Results We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. Conclusions The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  2. RECONSTRUCTION OF PRECIPITATION SERIES AND ANALYSIS OF CLIMATE CHANGE OVER PAST 500 YEARS IN NORTHERN CHINA

    Institute of Scientific and Technical Information of China (English)

    RONG Yan-shu; TU Qi-pu

    2005-01-01

    It is important and necessary to obtain a much longer precipitation series in order to study the features of drought/flood and climate change. Based on dryness and wetness grade series of 18 stations in Northern China covering 533 years, from 1470 to 2002, the Moving Cumulative Frequency Method (MCFM) was developed, moving-average precipitation series from 1499 to 2002 were reconstructed by testing three kinds of average precipitation, and the features of climate change and of dry and wet periods were studied using the reconstructed precipitation series in the present paper. The results showed a good relationship between the reconstructed precipitation series and the observed precipitation series since 1954, with relative root-mean-square errors below 1.89%. The relation between the reconstructed series and the dryness and wetness grade series was nonlinear, and this nonlinear relation implied that the reconstructed series were reliable and could become foundation data for researching the evolution of drought and flood. Analysis of climate change based on the reconstructed precipitation series revealed that, although the drought intensity of the recent dry period, from the mid-1970s until the early 21st century, was not the strongest in the historical climate of Northern China, the intensity of wet periods has greatly decreased and their duration has shortened, and the climate of Northern China is evolving toward aridification.

  3. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of physical processes, or time series data. This data lends itself to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U. S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
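
    The pre-processing steps mentioned (interpolation and aggregation) can be sketched for a regularly sampled series; this is a generic illustration, not the Toolbox's implementation:

```python
def interpolate_gaps(values):
    """Linearly fill runs of None in a regularly sampled series
    (interior gaps only; the endpoints must be present)."""
    out = list(values)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while out[j] is None:        # find the end of the gap
                j += 1
            lo, hi = out[i - 1], out[j]
            for k in range(i, j):        # fill proportionally along the line
                out[k] = lo + (hi - lo) * (k - i + 1) / (j - i + 1)
            i = j
        i += 1
    return out

def aggregate(values, width):
    """Non-overlapping block means, e.g. daily observations to weekly means."""
    return [sum(values[i:i + width]) / width
            for i in range(0, len(values) - width + 1, width)]
```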

  4. Financial time series analysis based on effective phase transfer entropy

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian; Lin, Aijing

    2017-02-01

    Transfer entropy is a powerful technique which is able to quantify the impact of one dynamic system on another system. In this paper, we propose the effective phase transfer entropy method based on the transfer entropy method. We use simulated data to test the performance of this method, and the experimental results confirm that the proposed approach is capable of detecting the information transfer between the systems. We also explore the relationship between effective phase transfer entropy and some variables, such as data size, coupling strength and noise. The effective phase transfer entropy is positively correlated with the data size and the coupling strength. Even in the presence of a large amount of noise, it can detect the information transfer between systems, and it is very robust to noise. Moreover, this measure is indeed able to accurately estimate the information flow between systems compared with phase transfer entropy. In order to reflect the application of this method in practice, we apply this method to financial time series and gain new insight into the interactions between systems. It is demonstrated that the effective phase transfer entropy can be used to detect some economic fluctuations in the financial market. To summarize, the effective phase transfer entropy method is a very efficient tool to estimate the information flow between systems.
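
    The plain transfer entropy underlying this measure can be estimated for discretized series with a simple plug-in (histogram) estimator; the phase variant would first map each series to instantaneous phases, which is omitted in this sketch:

```python
import math
import random
from collections import Counter

def transfer_entropy(src, tgt):
    """Plug-in estimate of order-1 transfer entropy T(src -> tgt), in bits:
    sum over (y1, y0, x0) of p(y1, y0, x0) * log2[ p(y1|y0, x0) / p(y1|y0) ]."""
    triples = Counter(zip(tgt[1:], tgt[:-1], src[:-1]))
    pairs_src = Counter(zip(tgt[:-1], src[:-1]))
    pairs_self = Counter(zip(tgt[1:], tgt[:-1]))
    singles = Counter(tgt[:-1])
    n = len(tgt) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        cond_full = c / pairs_src[(y0, x0)]
        cond_self = pairs_self[(y1, y0)] / singles[y0]
        te += (c / n) * math.log2(cond_full / cond_self)
    return te

# y copies x with a one-step lag, so information flows from x to y only.
random.seed(0)
x = [random.randint(0, 1) for _ in range(400)]
y = [0] + x[:-1]
te_xy = transfer_entropy(x, y)   # large: x fully determines the next y
te_yx = transfer_entropy(y, x)   # near zero: y adds nothing about future x
```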

  5. Stock price forecasting based on time series analysis

    Science.gov (United States)

    Chi, Wan Le

    2018-05-01

    Using historical stock price data to set up a sequence model that explains the intrinsic relationships in the data, future stock prices can be forecasted. The models used are the autoregressive (AR) model, the moving-average (MA) model and the autoregressive moving-average (ARMA) model. A unit root test was applied to the original data sequence to judge whether it was stationary. A non-stationary original sequence needed further processing by first-order differencing, after which the stationarity of the differenced sequence was re-inspected. If it was still non-stationary, second-order differencing of the sequence was carried out. The autocorrelation and partial autocorrelation diagrams were used to evaluate the parameters of the identified ARMA model, including the model coefficients and the model order. Finally, the model was used to fit and forecast the Shanghai Composite Index daily closing price. Results showed that the non-stationary original data series was stationary after the second-order difference, and the forecast values of the Shanghai Composite Index daily closing price were close to the actual values, indicating that the ARMA model in the paper achieved a reasonable accuracy.
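
    The workflow described (difference until stationary, identify the order, fit, forecast) can be sketched with the simplest member of the ARMA family, an AR(1) fitted by least squares; a real application would add unit-root tests and ACF/PACF-based order selection:

```python
def difference(series, d=1):
    """Apply d-th order differencing to help induce stationarity."""
    for _ in range(d):
        series = [series[t] - series[t - 1] for t in range(1, len(series))]
    return series

def fit_ar1(series):
    """Least-squares fit of x_t = c + phi * x_{t-1}; returns (c, phi)."""
    x0, x1 = series[:-1], series[1:]
    n = len(x0)
    m0, m1 = sum(x0) / n, sum(x1) / n
    phi = (sum((a - m0) * (b - m1) for a, b in zip(x0, x1))
           / sum((a - m0) ** 2 for a in x0))
    return m1 - phi * m0, phi

# Synthetic series following x_t = 1 + 0.5 * x_{t-1} exactly.
s = [0.0]
for _ in range(10):
    s.append(1.0 + 0.5 * s[-1])
c, phi = fit_ar1(s)
next_value = c + phi * s[-1]   # one-step-ahead forecast
```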

  6. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.

    2011-01-01

    This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do this. The results suggest that output and real electricity prices and a UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and -0.16 respectively, with an increasing (but at a decreasing rate) UEDT. Based on the estimated equation and different forecast assumptions, it is predicted that Turkish industrial electricity demand will be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.

  7. A unified nonlinear stochastic time series analysis for climate science.

    Science.gov (United States)

    Moon, Woosok; Wettlaufer, John S

    2017-03-13

    Earth's orbit and axial tilt imprint a strong seasonal cycle on climatological data. Climate variability is typically viewed in terms of fluctuations in the seasonal cycle induced by higher frequency processes. We can interpret this as a competition between the orbitally enforced monthly stability and the fluctuations/noise induced by weather. Here we introduce a new time-series method that determines these contributions from monthly-averaged data. We find that the spatio-temporal distribution of the monthly stability and the magnitude of the noise reveal key fingerprints of several important climate phenomena, including the evolution of the Arctic sea ice cover, the El Niño Southern Oscillation (ENSO), the Atlantic Niño and the Indian Dipole Mode. In analogy with the classical destabilising influence of the ice-albedo feedback on summertime sea ice, we find that during some time interval of the season a destabilising process operates in all of these climate phenomena. The interaction between the destabilisation and the accumulation of noise, which we term the memory effect, underlies phase locking to the seasonal cycle and the statistical nature of seasonal predictability.

  8. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analysis have become an important part of the biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher searches for articles in more than one database. It will also be useful to form a group of researchers and statisticians with expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage the readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist.

  9. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. It revealed the relationship between water quality in the mainstream and its tributaries, as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from water quality historical time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.
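
    The second part of the framework (similarity search and anomaly detection over a similarity matrix) can be sketched as follows. As a simplifying assumption, similarity between stations is taken here as the correlation of z-normalized series rather than the paper's cloud-model granulation, and the five "station" series are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical weekly DO series for five stations: four share a seasonal
# pattern, one (station 4) behaves differently and should stand out.
weeks = np.arange(104)
season = np.sin(2 * np.pi * weeks / 52)
series = np.array(
    [8 + season + 0.2 * rng.normal(size=weeks.size) for _ in range(4)]
    + [6 + 0.5 * rng.normal(size=weeks.size)]
)

def znorm(x):
    return (x - x.mean()) / x.std()

# Pairwise similarity matrix (correlation of z-normalized series).
Z = np.array([znorm(s) for s in series])
sim = Z @ Z.T / Z.shape[1]

# Anomaly detection: the station least similar, on average, to the others.
avg_sim = (sim.sum(axis=1) - 1.0) / (sim.shape[0] - 1)
anomalous_station = int(np.argmin(avg_sim))
```

    Similarity search (finding the station most like a query series) would read off the largest off-diagonal entry in the same matrix.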

  10. Time series analysis of diverse extreme phenomena: universal features

    Science.gov (United States)

    Eftaxias, K.; Balasis, G.

    2012-04-01

    The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches to different types of natural, artificial, and social systems. We suggest that earthquake, epileptic seizure, solar flare, and magnetic storm dynamics can be analyzed within similar mathematical frameworks. A central property of the generation of the aforementioned extreme events is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the corresponding constituents. Consequently, we apply Tsallis nonextensive statistical mechanics, as it proves an appropriate framework for investigating universal principles of their generation. First, we examine the data in terms of Tsallis entropy, aiming to discover common "pathological" symptoms of transition to a significant shock. By monitoring the temporal evolution of the degree of organization in time series, we observe similar distinctive features revealing a significant reduction of complexity during their emergence. Second, a model for earthquake dynamics derived from the nonextensive Tsallis formalism, starting from first principles, has recently been introduced. This approach leads to an energy distribution function (a Gutenberg-Richter-type law) for the magnitude distribution of earthquakes, providing an excellent fit to seismicities generated in various large geographic areas usually identified as seismic regions. We show that this function is able to describe the energy distribution (with a similar non-extensive q-parameter) of solar flares, magnetic storms, and epileptic and earthquake shocks. The above-mentioned evidence of a universal statistical behavior suggests the possibility of a common approach for studying space weather, earthquakes and epileptic seizures.
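
    The quantity monitored in the first step, Tsallis entropy, has a compact definition that can be computed directly; the distributions and the value of q below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1), reducing to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# A broad (disordered) distribution versus a peaked (organized) one:
broad = np.full(8, 1 / 8)
peaked = np.array([0.93] + [0.01] * 7)

q = 1.8  # illustrative non-extensivity parameter
s_broad = tsallis_entropy(broad, q)
s_peaked = tsallis_entropy(peaked, q)
```

    A drop of S_q over time, as the probability mass concentrates, is the kind of "reduction of complexity" signature the abstract describes.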

  11. Real analysis series, functions of several variables, and applications

    CERN Document Server

    Laczkovich, Miklós

    2017-01-01

    This book develops the theory of multivariable analysis, building on the single variable foundations established in the companion volume, Real Analysis: Foundations and Functions of One Variable. Together, these volumes form the first English edition of the popular Hungarian original, Valós Analízis I & II, based on courses taught by the authors at Eötvös Loránd University, Hungary, for more than 30 years. Numerous exercises are included throughout, offering ample opportunities to master topics by progressing from routine to difficult problems. Hints or solutions to many of the more challenging exercises make this book ideal for independent study, or further reading. Intended as a sequel to a course in single variable analysis, this book builds upon and expands these ideas into higher dimensions. The modular organization makes this text adaptable for either a semester or year-long introductory course. Topics include: differentiation and integration of functions of several variables; infinite numerica...

  12. Analysis of engineering cycles thermodynamics and fluid mechanics series

    CERN Document Server

    Haywood, R W

    1980-01-01

    Analysis of Engineering Cycles, Third Edition, deals principally with an analysis of the overall performance, under design conditions, of work-producing power plants and work-absorbing refrigerating and gas-liquefaction plants, most of which are either cyclic or closely related thereto. The book is organized into two parts, dealing first with simple power and refrigerating plants and then moving on to more complex plants. The principal modifications in this Third Edition arise from the updating and expansion of material on nuclear plants and on combined and binary plants. In view of increased

  13. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation, to aid in the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service and in partnership with DKRZ. The visualization and analysis of the model's performance data allows us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  14. Time series analysis of aerobic bacterial flora during Miso fermentation.

    Science.gov (United States)

    Onda, T; Yanagida, F; Tsuji, M; Shinohara, T; Yokotsuka, K

    2003-01-01

    This article reports a microbiological study of aerobic mesophilic bacteria that are present during the fermentation process of Miso. Aerobic bacteria were enumerated and isolated from Miso during fermentation and divided into nine groups using traditional phenotypic tests. The strains were identified by biochemical analysis and 16S rRNA sequence analysis. They were identified as Bacillus subtilis, B. amyloliquefaciens, Kocuria kristinae, Staphylococcus gallinarum and S. kloosii. All strains were sensitive to the bacteriocins produced by the lactic acid bacteria isolated from Miso. The dominant species among the undesirable species throughout the fermentation process were B. subtilis and B. amyloliquefaciens. It is suggested that bacteriocin-producing lactic acid bacteria are effective in preventing the growth of aerobic bacteria in Miso. This study has provided useful information for controlling bacterial flora during Miso fermentation.

  15. Visual Analysis as a design and decision-making tool in the development of a quarry

    Science.gov (United States)

    Randall Boyd Fitzgerald

    1979-01-01

    In order to obtain local and state government approvals, an environmental impact analysis of the mining and reclamation of a proposed hard rock quarry was required. High visibility of the proposed mining area from the adjacent community required a visual impact analysis in the planning and design of the project. The Visual Analysis defined design criteria for the...

  16. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theory-based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded to have high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is no common tool that can be used to evaluate the performance of the different algorithms and to guide the selection of the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.

  17. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel...... practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  18. Multi-focus and multi-level techniques for visualization and analysis of networks with thematic data

    Science.gov (United States)

    Cossalter, Michele; Mengshoel, Ole J.; Selker, Ted

    2013-01-01

    Information-rich data sets bring several challenges in the areas of visualization and analysis, even when associated with node-link network visualizations. This paper presents an integration of multi-focus and multi-level techniques that enable interactive, multi-step comparisons in node-link networks. We describe NetEx, a visualization tool that enables users to simultaneously explore different parts of a network and its thematic data, such as time series or conditional probability tables. NetEx, implemented as a Cytoscape plug-in, has been applied to the analysis of electrical power networks, Bayesian networks, and the Enron e-mail repository. In this paper we briefly discuss visualization and analysis of the Enron social network, but focus on data from an electrical power network. Specifically, we demonstrate how NetEx supports the analytical task of electrical power system fault diagnosis. Results from a user study with 25 subjects suggest that NetEx enables more accurate isolation of complex faults compared to a specially designed software tool.

  19. SensorDB: a virtual laboratory for the integration, visualization and analysis of varied biological sensor data.

    Science.gov (United States)

    Salehi, Ali; Jimenez-Berni, Jose; Deery, David M; Palmer, Doug; Holland, Edward; Rozas-Larraondo, Pablo; Chapman, Scott C; Georgakopoulos, Dimitrios; Furbank, Robert T

    2015-01-01

    To our knowledge, there is no software or database solution that supports large volumes of biological time series sensor data efficiently and enables data visualization and analysis in real time. Existing solutions for managing data typically use unstructured file systems or relational databases. These systems are not designed to provide instantaneous response to user queries. Furthermore, they do not support rapid data analysis and visualization to enable interactive experiments. In large scale experiments, this behaviour slows research discovery, discourages the widespread sharing and reuse of data that could otherwise inform critical decisions in a timely manner and encourage effective collaboration between groups. In this paper we present SensorDB, a web based virtual laboratory that can manage large volumes of biological time series sensor data while supporting rapid data queries and real-time user interaction. SensorDB is sensor agnostic and uses web-based, state-of-the-art cloud and storage technologies to efficiently gather, analyse and visualize data. Collaboration and data sharing between different agencies and groups is thereby facilitated. SensorDB is available online at http://sensordb.csiro.au.

  20. Multifractal detrended cross-correlation analysis on gold, crude oil and foreign exchange rate time series

    Science.gov (United States)

    Pal, Mayukha; Madhusudana Rao, P.; Manimaran, P.

    2014-12-01

    We apply the recently developed multifractal detrended cross-correlation analysis method to investigate the cross-correlation behavior and fractal nature between two non-stationary time series. We analyze the daily return price of gold, West Texas Intermediate and Brent crude oil, foreign exchange rate data, over a period of 18 years. The cross correlation has been measured from the Hurst scaling exponents and the singularity spectrum quantitatively. From the results, the existence of multifractal cross-correlation between all of these time series is found. We also found that the cross correlation between gold and oil prices possess uncorrelated behavior and the remaining bivariate time series possess persistent behavior. It was observed for five bivariate series that the cross-correlation exponents are less than the calculated average generalized Hurst exponents (GHE) for q0 and for one bivariate series the cross-correlation exponent is greater than GHE for all q values.
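
    The generalized Hurst exponents underlying this analysis come from a detrended fluctuation procedure. A minimal single-series sketch of the q = 2 special case (plain DFA, not the full multifractal cross-correlation method of the paper) might look like:

```python
import numpy as np

def dfa_hurst(x, scales):
    """Estimate the Hurst exponent of series x by detrended fluctuation
    analysis: build the profile, split it into windows, remove a linear
    trend per window, and fit the log-log slope of fluctuation size
    versus window size."""
    y = np.cumsum(x - np.mean(x))          # profile
    flucts = []
    for s in scales:
        n_win = len(y) // s
        f2 = []
        for w in range(n_win):
            seg = y[w * s : (w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(7)
# White noise is uncorrelated, so its exponent should come out near 0.5;
# persistent series give exponents above 0.5.
h_white = dfa_hurst(rng.normal(size=4096), scales=[16, 32, 64, 128, 256])
```

    The multifractal cross-correlation variant generalizes this by replacing the single profile with the product of two detrended profiles and the mean-square average with a q-th order average.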

  1. A Reception Analysis on the Youth Audiences of TV Series in Marivan

    Directory of Open Access Journals (Sweden)

    Omid Karimi

    2014-03-01

    Full Text Available The aim of this article is to describe the role of foreign media as agitators of popular culture. To that end, reception analysis is used to describe how young audiences decode these series. Globalization theory and reception theory in communication form the theoretical framework of the article. The methodology of this research is qualitative, and two techniques, in-depth interview and observation, were used for data collection. The results show that different people, depending on individual features and social and cultural backgrounds, are drawn to particular characters and identify with them. This inclination goes so far that some audience members follow a series because of their favorite character. There is also a strong correspondence between audience backgrounds and their receptions. A number of audience members criticized the series and pointed out its negative consequences for their society; nevertheless, they continue watching, preferring the enjoyment of the series to its risks.

  2. [Amblyopia: reading speed in comparison with visual acuity for gratings, single Landolt Cs and series Landolt Cs].

    Science.gov (United States)

    Bach, M; Strahl, P; Waltenspiel, S; Kommerell, G

    1990-01-01

    In the treatment of amblyopia in preschool children, a means of predicting later reading ability would be helpful. This prediction might be possible using a test for visual acuity where the results correlate with reading ability in adult patients with amblyopia. We measured the following four parameters in 18 experienced readers with strabismic amblyopia: (1) time spent reading ten lines of a standard text in one of three magnifications, (2) visual acuity for gratings, (3) visual acuity for single Landolt Cs, and (4) visual acuity for crowded Landolt Cs (one Landolt C flanked by two full rings on each side each at a distance of 2.6 min of arc). The reading text was presented on paper at a distance of 40 cm; the subject had a choice of three magnifications. The acuity tests were generated by a computer on a VDU at 4.6 m. The relative impairment of the amblyopic eye was defined as the quotient between the performance of the amblyopic and the good eye. In addition, the difference between the times spent reading the ten lines with the amblyopic and with the good eye was calculated.(ABSTRACT TRUNCATED AT 250 WORDS)

  3. Cerebral venous sinus thrombosis on MRI: A case series analysis

    Directory of Open Access Journals (Sweden)

    Sanjay M Khaladkar

    2014-01-01

    Full Text Available Background: Cerebral venous sinus thrombosis (CVST) is a rare form of stroke seen in young and middle-aged people, especially women, due to thrombosis of the dural venous sinuses; it can cause acute neurological deterioration with increased morbidity and mortality if not diagnosed at an early stage. Neurological deficit occurs due to focal or diffuse cerebral edema and venous non-hemorrhagic or hemorrhagic infarction. Aim and Objectives: To evaluate the role of Magnetic Resonance Imaging (MRI) and Magnetic Resonance Venography (MRV) as imaging modalities for early diagnosis of CVST, and to study the patterns of venous thrombosis, the changes in brain parenchyma, and the residual effects of CVST using MRI. Materials and Methods: A retrospective descriptive analysis of 40 patients with CVST diagnosed on brain MRI and MRV was done. Results: 29/40 (72.5%) were males and 11/40 (27.5%) were females. Most of the patients were in the age group of 21-40 years (23/40, 57.5%). Most of the patients (16/40, 40%) presented within 7 days. No definite cause of CVST was found in 24 (60%) patients in spite of detailed history. In 36/40 (90%) of cases major sinuses were involved; the deep venous system was involved in 7/40 (17.5%) cases and a superficial cortical vein in 1/40 (2.5%) of cases. Analysis of the stage of the thrombus (acute, subacute, chronic) was done based on its appearance on T1- and T2-weighted images. 31/40 (77.5%) patients showed complete absence of flow on MRV, while 9/40 (22.5%) cases showed partial flow on the MR venogram. Brain parenchyma was normal in 20/40 (50%) patients, while 6/40 (15%) cases had non-hemorrhagic infarcts and 14/40 (35%) patients presented with hemorrhagic infarcts. Conclusion: Our study concluded that brain MRI with MRV is sensitive in diagnosing both the direct signs (evidence of thrombus inside the affected veins) and the indirect signs (parenchymal changes) of CVST and in their follow-up.

  4. An analysis of fusion algorithms for LWIR and visual images

    CSIR Research Space (South Africa)

    De Villiers, J

    2013-12-01

    Full Text Available This paper presents a comparison of methods to fuse pre-registered colour visual and long wave infra-red images to create a new image containing both visual and thermal cues. Three methods of creating the artificially coloured fused images...

  5. Working Memory Enhances Visual Perception: Evidence from Signal Detection Analysis

    Science.gov (United States)

    Soto, David; Wriglesworth, Alice; Bahrami-Balani, Alex; Humphreys, Glyn W.

    2010-01-01

    We show that perceptual sensitivity to visual stimuli can be modulated by matches between the contents of working memory (WM) and stimuli in the visual field. Observers were presented with an object cue (to hold in WM or to merely attend) and subsequently had to identify a brief target presented within a colored shape. The cue could be…

  6. Chaos in Electronic Circuits: Nonlinear Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wheat, Jr., Robert M. [Kennedy Western Univ., Cheyenne, WY (United States)

    2003-07-01

    Chaos in electronic circuits is a phenomenon that has been largely ignored by engineers, manufacturers, and researchers until the early 1990’s and the work of Chua, Matsumoto, and others. As the world becomes more dependent on electronic devices, the detrimental effects of non-normal operation of these devices becomes more significant. Developing a better understanding of the mechanisms involved in the chaotic behavior of electronic circuits is a logical step toward the prediction and prevention of any potentially catastrophic occurrence of this phenomenon. Also, a better understanding of chaotic behavior, in a general sense, could potentially lead to better accuracy in the prediction of natural events such as weather, volcanic activity, and earthquakes. As a first step in this improvement of understanding, and as part of the research being reported here, methods of computer modeling, identifying and analyzing, and producing chaotic behavior in simple electronic circuits have been developed. The computer models were developed using both the Alternative Transient Program (ATP) and Spice, the analysis techniques have been implemented using the C and C++ programming languages, and the chaotically behaving circuits developed using “off the shelf” electronic components.
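
    As a hedged illustration of computer-modeling chaotic behaviour in a simple electronic circuit, the dimensionless equations of Chua's circuit (Chua's work is cited above) can be integrated with a standard Runge-Kutta scheme. The parameter values are common textbook double-scroll choices, not values from this report:

```python
import numpy as np

# Dimensionless Chua's circuit with the classic piecewise-linear nonlinearity.
alpha, beta, m0, m1 = 15.6, 28.0, -1.143, -0.714

def chua(state):
    x, y, z = state
    h = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return np.array([alpha * (y - x - h), x - y + z, -beta * y])

def rk4_step(f, s, dt):
    k1 = f(s)
    k2 = f(s + dt / 2 * k1)
    k3 = f(s + dt / 2 * k2)
    k4 = f(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, steps = 0.005, 40000
traj = np.empty((steps, 3))
traj[0] = [0.1, 0.0, 0.0]
for i in range(1, steps):
    traj[i] = rk4_step(chua, traj[i - 1], dt)
# The trajectory stays bounded but never settles, tracing the double-scroll attractor.
```

    Plotting traj[:, 0] against traj[:, 1] shows the double-scroll structure; tools like ATP or Spice, as used in the report, model the same behaviour at the circuit-component level.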

  7. Financing Human Development for Sectorial Growth: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Shobande Abdul Olatunji

    2017-06-01

    Full Text Available The role which financing human development plays in fostering the sectorial growth of an economy cannot be overstated. It is a key instrument which can be utilized to alleviate poverty, create employment and ensure the sustenance of economic growth and development. Thus financing human development for sectorial growth has taken center stage in the economic growth and development strategies of most countries. In a constructive effort to examine the in-depth relationship between the variables in the Nigerian context, this paper provides evidence on the impact of financing human development on sectorial growth in Nigeria between 1982 and 2016, using the Johansen co-integration technique to test for co-integration among the variables and the Vector Error Correction Model (VECM) to ascertain the speed of adjustment of the variables to their long-run equilibrium position. The analysis shows that long- and short-run relationships exist between financing human capital development and sectorial growth during the period reviewed. Therefore, the paper argues that, for an active foundation for sustainable sectorial growth and development, financing human capital development across each unit is urgently required through increased budgetary allocation for both the health and educational sectors, since they are key components of human capital development in a nation.
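
    The Johansen procedure itself involves a generalized eigenvalue problem, but the core idea, that co-integrated series share a long-run equilibrium toward which short-run changes adjust, can be sketched with a simpler single-equation Engle-Granger-style two-step on synthetic data (a deliberate simplification of the paper's method; all series and coefficients below are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic co-integrated pair: a random-walk "human capital spending" series
# and a "sectorial growth" series tied to it by a stationary error.
n = 300
spend = np.cumsum(rng.normal(size=n))
growth = 2.0 + 0.8 * spend + rng.normal(scale=0.5, size=n)

# Step 1: co-integrating regression growth_t = a + b*spend_t + e_t.
A = np.column_stack([np.ones(n), spend])
(a, b), *_ = np.linalg.lstsq(A, growth, rcond=None)
resid = growth - (a + b * spend)

# Step 2: error-correction equation, regressing d(growth)_t on resid_{t-1}.
dg = np.diff(growth)
ecm = np.column_stack([np.ones(n - 1), resid[:-1]])
(c, adj_speed), *_ = np.linalg.lstsq(ecm, dg, rcond=None)
# adj_speed < 0 indicates adjustment back toward the long-run equilibrium,
# the "speed of adjustment" a VECM estimates in the multivariate case.
```

    A full VECM generalizes step 2 to a system of equations with the Johansen rank test selecting the number of co-integrating relations.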

  8. Spatially adaptive mixture modeling for analysis of FMRI time series.

    Science.gov (United States)

    Vincent, Thomas; Risser, Laurent; Ciuciu, Philippe

    2010-04-01

    Within-subject analysis in fMRI essentially addresses two problems, the detection of brain regions eliciting evoked activity and the estimation of the underlying dynamics. In Makni et al., 2005, and Makni et al., 2008, a detection-estimation framework has been proposed to tackle these problems jointly, since they are connected to one another. In the Bayesian formalism, detection is achieved by modeling activating and nonactivating voxels through independent mixture models (IMM) within each region, while hemodynamic response estimation is performed at a regional scale in a nonparametric way. Instead of IMMs, in this paper we take advantage of spatial mixture models (SMM) for their nonlinear spatial regularizing properties. The proposed method is unsupervised and spatially adaptive in the sense that the amount of spatial correlation is automatically tuned from the data, and this setting automatically varies across brain regions. In addition, the level of regularization is specific to each experimental condition, since both the signal-to-noise ratio and the activation pattern may vary across stimulus types in a given brain region. These aspects require the precise estimation of multiple partition functions of underlying Ising fields. This is addressed efficiently using first path sampling for a small subset of fields and then a recently developed fast extrapolation technique for the large remaining set. Simulation results emphasize that detection relying on supervised SMM outperforms its IMM counterpart and that unsupervised spatial mixture models achieve similar results without any hand-tuning of the correlation parameter. On real datasets, the gain is illustrated in a localizer fMRI experiment: brain activations appear more spatially resolved using SMM in comparison with classical general linear model (GLM)-based approaches, while estimating a specific parcel-based HRF shape. Our approach therefore validates the treatment of unsmoothed fMRI data without fixed GLM

  9. Simultaneous determination of radionuclides separable into natural decay series by use of time-interval analysis

    International Nuclear Information System (INIS)

    Hashimoto, Tetsuo; Sanada, Yukihisa; Uezu, Yasuhiro

    2004-01-01

    A delayed coincidence method, time-interval analysis (TIA), has been applied to successive α-α decay events on the millisecond time-scale. Such decay events are part of the ²²⁰Rn → ²¹⁶Po (T₁/₂ = 145 ms) (Th series) and ²¹⁹Rn → ²¹⁵Po (T₁/₂ = 1.78 ms) (Ac series) decays. By using TIA in addition to measurement of ²²⁶Ra (U series) from α-spectrometry by liquid scintillation counting (LSC), two natural decay series could be identified and separated. The TIA detection efficiency was improved by using the pulse-shape discrimination technique (PSD) to reject β-pulses, by solvent extraction of Ra combined with simple chemical separation, and by purging the scintillation solution with dry N₂ gas. The U and Th series, together with the Ac series, were determined, respectively, from alpha spectra and TIA carried out immediately after the Ra extraction. Using the ²²¹Fr → ²¹⁷At (T₁/₂ = 32.3 ms) decay process as a tracer, overall yields were estimated from application of TIA to ²²⁵Ra (Np decay series) at the time of maximum growth. The present method has proven useful for simultaneous determination of three radioactive decay series in environmental samples. (orig.)
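
    The delayed-coincidence idea can be illustrated with a toy Monte Carlo: correlated parent-daughter decays with the 145 ms half-life of the 220Rn/216Po pair produce an excess of short inter-event times over a random background. The event rates and counts below are invented for illustration and do not correspond to the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(3)

half_life = 0.145                      # s, as for the 220Rn -> 216Po pair
tau = half_life / np.log(2)            # mean life

# Correlated events: each parent decay is followed by a daughter decay
# after an exponentially distributed delay.
parents = np.sort(rng.uniform(0, 3600, 200))
daughters = parents + rng.exponential(tau, size=parents.size)

# Uncorrelated background events at a low rate over the same hour.
background = rng.uniform(0, 3600, 400)

events = np.sort(np.concatenate([parents, daughters, background]))
intervals = np.diff(events)

# Time-interval analysis: count successive-event intervals that fall
# inside a coincidence window of a few half-lives.
window = 5 * half_life
observed = int(np.sum(intervals < window))

# Expected chance coincidences if all events were purely random
# (exponential inter-arrival times at the overall rate).
rate = events.size / 3600.0
expected_random = (events.size - 1) * (1 - np.exp(-rate * window))
```

    The statistically significant excess of observed over expected short intervals is what identifies the genetically related α-α pairs.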

  10. CRISPR-DAV: CRISPR NGS data analysis and visualization pipeline.

    Science.gov (United States)

    Wang, Xuning; Tilford, Charles; Neuhaus, Isaac; Mintier, Gabe; Guo, Qi; Feder, John N; Kirov, Stefan

    2017-12-01

    The simplicity and precision of the CRISPR/Cas9 system have brought in a new era of gene editing. Screening for desired clones with CRISPR-mediated genomic edits in a large number of samples is made possible by next generation sequencing (NGS) due to its multiplexing capability. Here we present the CRISPR-DAV (CRISPR Data Analysis and Visualization) pipeline to analyze CRISPR NGS data in a high-throughput manner. In the pipeline, the Burrows-Wheeler Aligner and Assembly Based ReAlignment are used for small and large indel detection, and results are presented in a comprehensive set of charts and an interactive alignment view. CRISPR-DAV is available at the GitHub and Docker Hub repositories: https://github.com/pinetree1/crispr-dav.git and https://hub.docker.com/r/pinetree1/crispr-dav/. xuning.wang@bms.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  11. Advances in directional borehole radar data analysis and visualization

    Science.gov (United States)

    Smith, D.V.G.; Brown, P.J.

    2002-01-01

    The U.S. Geological Survey is developing a directional borehole radar (DBOR) tool for mapping fractures and lithologic changes and for detecting underground utilities and voids. An important part of the development of the DBOR tool is data analysis and visualization, with the aim of making the software's graphical user interface (GUI) intuitive and easy to use. The DBOR software system consists of a suite of signal and image processing routines written in Research Systems' Interactive Data Language (IDL). The software also serves as a front end to many widely accepted Colorado School of Mines Center for Wave Phenomena (CWP) Seismic UNIX (SU) algorithms (Cohen and Stockwell, 2001). Although the SU collection runs natively in a UNIX environment, our system seamlessly emulates a UNIX session within a widely used PC operating system (Microsoft Windows) using GNU tools (Noer, 1998). Examples are presented of laboratory data acquired with the prototype tool in two different experimental settings. The first experiment imaged plastic pipes in a macro-scale sand tank. The second experiment monitored the progress of an invasion front resulting from oil injection. Finally, challenges to further development and planned future work are discussed.

  12. Time series analysis of wind speed using VAR and the generalized impulse response technique

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, Bradley T. [Area of Information Systems and Quantitative Sciences, Rawls College of Business and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX 79409-2101 (United States); Kruse, Jamie Brown [Center for Natural Hazard Research, East Carolina University, Greenville, NC (United States); Schroeder, John L. [Department of Geosciences and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States); Smith, Douglas A. [Department of Civil Engineering and Wind Science and Engineering Research Center, Texas Tech University, Lubbock, TX (United States)

    2007-03-15

    This research examines the interdependence in time series wind speed data measured at the same location at four different heights. A multiple-equation system known as a vector autoregression is proposed for characterizing the time series dynamics of wind. Additionally, the recently developed method of generalized impulse response analysis provides insight into the cross-effects of the wind series and their responses to shocks. Findings are based on analysis of contemporaneous wind speed time histories taken at 13, 33, 70 and 160 ft above ground level with a sampling rate of 10 Hz. The results indicate that wind speeds measured at 70 ft were the most variable. Further, the turbulence persisted longer at the 70-ft measurement height than at the other heights. The greatest interdependence is observed at 13 ft. Gusts at 160 ft showed the greatest persistence to an 'own' shock and induced the greatest persistence in the responses of the other wind series. (author)
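The two-step approach sketched in this abstract, fitting a vector autoregression by least squares and then tracing an impulse response, can be illustrated in a few lines of NumPy. This is a toy sketch only: the two-channel series, the coefficient matrix, and the noise level are synthetic stand-ins, not the authors' wind data or their (orthogonalization-aware) generalized impulse responses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-channel "wind speed" series following a known VAR(1):
# x_t = A_true @ x_{t-1} + e_t
A_true = np.array([[0.6, 0.2],
                   [0.1, 0.5]])
n = 5000
x = np.zeros((n, 2))
for t in range(1, n):
    x[t] = A_true @ x[t - 1] + rng.normal(0.0, 0.1, 2)

# Least-squares estimate of the VAR(1) coefficients: regress x_t on x_{t-1}.
Y = x[1:]
X = x[:-1]
B, *_ = np.linalg.lstsq(X, Y, rcond=None)
A_hat = B.T  # so that x_t ≈ A_hat @ x_{t-1}

# A simple (non-orthogonalized) impulse response: propagate a unit shock
# in channel 0 through the estimated dynamics for 10 steps.
h = 10
shock = np.array([1.0, 0.0])
irf = np.array([np.linalg.matrix_power(A_hat, k) @ shock for k in range(h)])
```

With a stable coefficient matrix the impulse response decays toward zero, which is the "persistence" the abstract compares across measurement heights.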

  13. Bioelectric signal classification using a recurrent probabilistic neural network with time-series discriminant component analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shima, Keisuke; Shibanoki, Taro; Kurita, Yuichi; Tsuji, Toshio

    2013-01-01

    This paper outlines a probabilistic neural network developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower-dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model that incorporates a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into a neural network so that parameters can be obtained appropriately as network coefficients according to a backpropagation-through-time-based training algorithm. The network is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time required for network training. In the experiments conducted during the study, the validity of the proposed network was demonstrated for EEG signals.

  14. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.

    1998-01-01

    Fractal analyses of indoor 222Rn time series were performed using different chaos-theory-based measures such as the time delay method, Hurst's rescaled range analysis, capacity (fractal) dimension, and the Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents, which is an indication of chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)
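Hurst's rescaled-range (R/S) analysis used above can be sketched in plain NumPy. This is a generic toy estimator run on synthetic white noise (for which H ≈ 0.5), not the radon data; window sizes are arbitrary illustrative choices.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())   # cumulative deviations from the mean
            r = z.max() - z.min()         # range of the cumulative deviations
            s = w.std()                   # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)  # H is the log-log slope
    return slope

rng = np.random.default_rng(1)
h_noise = hurst_rs(rng.normal(size=4096))    # white noise: H near 0.5
```

H well below 0.5 signals the antipersistent behaviour reported in the abstract; H above 0.5 would signal persistence.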

  15. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes auto- and cross-spectra, transfer functions and coherence functions directly from two discrete time series, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
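The same quantities can be computed today with SciPy's spectral estimators. The sketch below, on two synthetic signals sharing a 5 Hz component, is purely illustrative and unrelated to CROSAT's own implementation; sampling rate, noise level, and segment length are invented.

```python
import numpy as np
from scipy.signal import csd, coherence

fs = 100.0                                  # sampling rate, Hz
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(2)

s = np.sin(2 * np.pi * 5.0 * t)             # shared 5 Hz component
x = s + 0.2 * rng.normal(size=t.size)       # two noisy observations of it
y = s + 0.2 * rng.normal(size=t.size)

f, Pxy = csd(x, y, fs=fs, nperseg=256)      # Welch cross-spectrum
f, Cxy = coherence(x, y, fs=fs, nperseg=256)

i5 = np.argmin(np.abs(f - 5.0))             # frequency bin nearest 5 Hz
```

Coherence is near 1 in the bin carrying the shared component and small elsewhere, which is exactly the diagnostic such programs are used for.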

  16. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Directory of Open Access Journals (Sweden)

    Ada eLe

    2014-10-01

    When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007). However, it is unclear whether visual object analysis for grasp control relies more on inputs (a) from the contralateral than the ipsilateral visual field, (b) from one dominant visual field regardless of the grasping hand, or (c) from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b), consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013). But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object either with their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs) were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, preferences switched to the left visual field. What is more, MGA scaling showed greater visual field differences compared to right-hand grasping. Our data suggest that visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  17. An Interactive Analysis of Hyperboles in a British TV Series: Implications For EFL Classes

    Science.gov (United States)

    Sert, Olcay

    2008-01-01

    This paper, part of an ongoing study on the analysis of hyperboles in a British TV series, reports findings drawing upon a 90,000 word corpus. The findings are compared to the ones from CANCODE (McCarthy and Carter 2004), a five-million word corpus of spontaneous speech, in order to identify similarities between the two. The analysis showed that…

  18. URBAN CHILDHOOD ROUTINES MEDIATED BY TECHNOLOGY: A VISUAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    González-Patiño, Javier

    2011-07-01

    With the goal of documenting routines in which a technological component had some type of impact, and of observing the influence of immediate environments while exploring meaning and sense, interviews, some observations and, especially, a series of photographs taken by a young girl about her daily life were collected. The purpose of this article is to foster a critical debate about the social and developmental consequences that have been attributed to changes in the living conditions of contemporary Western urban children. For the analytical approach and the analysis of the data, it was especially important to take into consideration the characteristics of the case-study participant, a high-social-class 12-year-old pre-adolescent who lives in a home where a large variety of information and communication technology equipment is at her disposal. The conclusions presented revolve mainly around two issues. First, the emergence of multimodal communication situations of increasing semiotic complexity, promoted by digital practices and visible in different styles of mediation in the uses of technologies, identified as "convergence", "divergence" and "accessibility". Second, these technological practices facilitate transformations in space, whether public, private or virtual, altering the importance that other places traditionally held in the processes of socialization of urban children. This article is published in Spanish.

  19. Interactive Visual Analysis for Organic Photovoltaic Solar Cells

    KAUST Repository

    Abouelhassan, Amal A.

    2017-01-01

    Organic Photovoltaic (OPV) solar cells provide a promising alternative for harnessing solar energy. However, the efficient design of OPV materials that achieve better performance requires support by better-tailored visualization tools than

  20. Statistical Analysis of a Comprehensive List of Visual Binaries

    Directory of Open Access Journals (Sweden)

    Kovaleva D.

    2015-12-01

    Visual binary stars are the most abundant class of observed binaries. The most comprehensive list of data on visual binaries, compiled recently by cross-matching the largest catalogues of visual binaries, allowed a statistical investigation of the observational parameters of these systems. The dataset was cleaned by correcting uncertainties and misclassifications, and supplemented with available parallax data. The refined dataset is free from technical biases and contains 3676 presumably physical visual pairs of luminosity class V with known angular separations, magnitudes of the components, spectral types, and parallaxes. We also compiled a restricted sample of 998 pairs free from observational biases due to the probability of binary discovery. Distributions of certain observational and physical parameters of the stars in our dataset are discussed.

  1. Is Fourier analysis performed by the visual system or by the visual investigator?

    Science.gov (United States)

    Ochs, A L

    1979-01-01

    A numerical Fourier transform was made of the pincushion grid illusion and the spectral components orthogonal to the illusory lines were isolated. Their inverse transform creates a picture of the illusion. The spatial-frequency response of cortical, simple receptive field neurons similarly filters the grid. A complete set of these neurons thus approximates a two-dimensional Fourier analyzer. One cannot conclude, however, that the brain actually uses frequency-domain information to interpret visual images.

  2. PIV Analysis of Ludwig Prandtl's Historic Flow Visualization Films

    OpenAIRE

    Willert, Christian; Kompenhans, Jürgen

    2010-01-01

    Around 1930 Ludwig Prandtl and his colleagues O. Tietjens and W. M\\"uller published two films with visualizations of flows around surface piercing obstacles to illustrate the unsteady process of flow separation. These visualizations were achieved by recording the motion of fine particles sprinkled onto the water surface in water channels. The resulting images meet the relevant criteria of properly seeded recordings for particle image velocimetry (PIV). Processing these image sequences with mo...

  3. Quantitative Postural Analysis of Children With Congenital Visual Impairment.

    Science.gov (United States)

    de Pádua, Michelle; Sauer, Juliana F; João, Silvia M A

    2018-01-01

    The aim of this study was to compare the postural alignment of children with visual impairment with that of children without visual impairment. The sample studied was 74 children of both sexes ages 5 to 12 years. Of these, 34 had visual impairment and 40 were control children. Digital photos from the standing position were used to analyze posture. Postural variables, such as tilt of the head, shoulder position, scapula position, lateral deviation of the spine, ankle position in the frontal plane and head posture, angle of thoracic kyphosis, angle of lumbar lordosis, pelvis position, and knee position in the frontal and sagittal planes, were measured with the Postural Assessment Software 0.63 (SAPO, São Paulo, Brazil), with markers placed on predetermined bony landmarks. The main results of this study showed that children with visual impairment have significantly increased head tilt. Visual impairment influences postural alignment: children with visual impairment had increased head tilt, uneven shoulders, greater lateral deviation of the spine, increased thoracic kyphosis, lower lumbar lordosis, and more severe valgus deformities of the knees. Copyright © 2017. Published by Elsevier Inc.

  4. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Normally, the status of land cover is inherently dynamic, changing continuously on the temporal scale. However, disturbances or abnormal changes of land cover, caused for example by forest fire, flood, deforestation, or plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near-real-time or online disturbance detection using satellite image time series. However, most of the present methods only label the detection results with "Change/No change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite imagery with an estimation error of less than 5% and an overall accuracy of up to 90%.
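The forecast-and-flag logic of steps (2) and (3) can be illustrated with a simplified harmonic-regression stand-in for BFAST. Everything here is synthetic: the series, the 46-observations-per-year seasonal period, and the noise level are invented, and this is not the BFAST implementation itself.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "historical" vegetation-index-like series: trend + seasonality + noise.
t = np.arange(200, dtype=float)
period = 46.0  # e.g. 46 observations per year
y = 0.002 * t + 0.3 * np.sin(2 * np.pi * t / period) + 0.05 * rng.normal(size=t.size)

# Step 1 (simplified): fit a trend + harmonic model to the history by least squares.
X = np.column_stack([np.ones_like(t), t,
                     np.sin(2 * np.pi * t / period),
                     np.cos(2 * np.pi * t / period)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_std = (y - X @ beta).std()

def forecast_ci(t_new, cl=1.96):
    """Point forecast and an approximate 95% confidence interval."""
    x = np.array([1.0, t_new,
                  np.sin(2 * np.pi * t_new / period),
                  np.cos(2 * np.pi * t_new / period)])
    pred = x @ beta
    return pred - cl * resid_std, pred + cl * resid_std

# Steps 2-3 (simplified): flag a new observation outside the CI as a disturbance.
lo, hi = forecast_ci(200.0)
normal_obs = 0.002 * 200 + 0.3 * np.sin(2 * np.pi * 200 / period)
disturbed_obs = normal_obs - 0.5   # e.g. an abrupt flood-induced drop
```

An undisturbed observation falls inside the interval; a large drop falls outside and would be flagged, with the confidence level playing the role of the reliability estimate.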

  5. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most of the trend analyses that have been conducted have not considered the existence of a change point in the time series. If one exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the time series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change-point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those of the maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to the El Niño event.
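A compact NumPy implementation of the Pettitt change-point test used above might look as follows. The toy series with an artificial +1 °C mean shift is invented for illustration; the approximate p-value formula is the standard one for this test.

```python
import numpy as np

def pettitt(x):
    """Pettitt change-point test: change-point index and approximate p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = np.sign(x[None, :] - x[:, None])     # s[i, j] = sign(x[j] - x[i])
    # U_t sums the signs over all pairs straddling a candidate split at t.
    U = np.array([s[:t + 1, t + 1:].sum() for t in range(n - 1)])
    K = np.abs(U).max()
    tau = int(np.abs(U).argmax())            # last index of the first segment
    p = 2.0 * np.exp(-6.0 * K**2 / (n**3 + n**2))  # approximate significance
    return tau, min(p, 1.0)

rng = np.random.default_rng(4)
series = np.concatenate([rng.normal(25.0, 0.5, 50),    # e.g. mean temperature
                         rng.normal(26.0, 0.5, 50)])   # abrupt +1 °C shift
tau, p = pettitt(series)
```

A small p-value indicates a significant abrupt change, and tau locates it, analogous to the 1996-1998 change points reported for the Malaysian stations.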

  6. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copula theory has become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g. the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and the meta-elliptical copulas (e.g. the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been considered stationary signals, in which the time series are assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity in the data structure. The stationarity assumption is also under question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be
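As a hedged illustration of the copula idea (not tied to this study's data or chosen copula family), a Gaussian copula sample can be generated by mapping correlated normals through the normal CDF; the correlation value and the exponential margins below are arbitrary choices.

```python
import numpy as np
from scipy.special import ndtr   # standard normal CDF

rng = np.random.default_rng(5)
rho = 0.8
n = 5000

# Sample from a bivariate Gaussian copula: draw correlated standard normals,
# then map each margin to uniform (0, 1) through the normal CDF.
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
u = ndtr(z)                      # dependent uniforms: the copula sample

# The uniforms can then be pushed through any marginal quantile function,
# e.g. (purely illustrative) exponential margins for monthly flow volumes.
flows = -np.log(1.0 - u)
```

This separation of dependence (the copula) from the margins is what lets the same dependence model serve rainfall, flood, and drought variables with different univariate distributions.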

  7. Phase synchronization based minimum spanning trees for analysis of financial time series with nonlinear correlations

    Science.gov (United States)

    Radhakrishnan, Srinivasan; Duvvuru, Arjun; Sultornsanee, Sivarit; Kamarthi, Sagar

    2016-02-01

    The cross correlation coefficient has been widely applied in financial time series analysis, specifically for understanding chaotic behaviour in terms of stock price and index movements during crisis periods. To better understand time series correlation dynamics, the cross correlation matrices are represented as networks, in which a node stands for an individual time series and a link indicates cross correlation between a pair of nodes. These networks are converted into simpler trees using different schemes. In this context, Minimum Spanning Trees (MST) are the most favoured tree structures because of their ability to preserve all the nodes and thereby retain essential information imbued in the network. Although cross correlations underlying MSTs capture essential information, they do not faithfully capture dynamic behaviour embedded in the time series data of financial systems, because cross correlation is a reliable measure only if the relationship between the time series is linear. To address the issue, this work investigates a new measure called phase synchronization (PS) for establishing correlations among different time series which relate to one another, linearly or nonlinearly. In this approach the strength of a link between a pair of time series (nodes) is determined by the level of phase synchronization between them. We compare the performance of the phase synchronization based MST with the cross correlation based MST along selected network measures across a temporal frame that includes economically good and crisis periods. We observe agreement in the directionality of the results across these two methods. They show similar trends, upward or downward, when comparing selected network measures.
Though both the methods give similar trends, the phase synchronization based MST is a more reliable representation of the dynamic behaviour of financial systems than the cross correlation based MST because of the former's ability to quantify nonlinear relationships among time
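The phase-synchronization-to-MST pipeline can be sketched with SciPy. The three synthetic signals below stand in for asset series, and the epsilon guard and edge-weight choice (1 minus the phase locking value) are illustrative conventions, not necessarily the paper's exact construction.

```python
import numpy as np
from scipy.signal import hilbert
from scipy.sparse.csgraph import minimum_spanning_tree

def plv(x, y):
    """Phase locking value: 1 = perfect phase synchronization, 0 = none."""
    px = np.angle(hilbert(x))
    py = np.angle(hilbert(y))
    return np.abs(np.exp(1j * (px - py)).mean())

rng = np.random.default_rng(6)
t = np.linspace(0, 10, 2000)
a = np.sin(2 * np.pi * 3 * t)
b = np.sin(2 * np.pi * 3 * t + 0.7)   # constant phase lag: locked to a
c = rng.normal(size=t.size)           # unrelated series

series = [a, b, c]
n = len(series)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # Strong synchronization -> short edge; epsilon keeps edges nonzero.
        dist[i, j] = max(1.0 - plv(series[i], series[j]), 1e-6)

mst = minimum_spanning_tree(dist).toarray()   # keeps n - 1 shortest links
```

The MST retains every node while keeping only the strongest (shortest) links, which is the property the abstract highlights for both the correlation-based and the phase-synchronization-based trees.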

  8. Visualization and analysis of flow structures in an open cavity

    Science.gov (United States)

    Liu, Jun; Cai, Jinsheng; Yang, Dangguo; Wu, Junqiang; Wang, Xiansheng

    2018-05-01

    A numerical study is performed on the supersonic flow over an open cavity at a Mach number of 1.5. A newly developed visualization method is employed to visualize the complicated flow structures, providing insight into the major flow physics. Four types of shock/compressive waves seen in experimental schlieren images are also observed in the numerical visualization results. Furthermore, other flow structures, such as multi-scale vortices, are obtained in the numerical results, and a new type of shocklet beneath large vortices is found. The shocklet beneath a vortex originates at the leading edge and is then strengthened by successive interactions between feedback compressive waves and its attached vortex. Finally, it collides with the trailing surface and generates a large number of feedback compressive waves and intensive pressure fluctuations. It is suggested that the shocklets beneath the vortices play an important role in the cavity's self-sustained oscillation.

  9. Method and apparatus for modeling, visualization and analysis of materials

    KAUST Repository

    Aboulhassan, Amal

    2016-08-25

    A method, apparatus, and computer readable medium are provided for modeling of materials and visualization of properties of the materials. An example method includes receiving data describing a set of properties of a material, and computing, by a processor and based on the received data, geometric features of the material. The example method further includes extracting, by the processor, particle paths within the material based on the computed geometric features, and geometrically modeling, by the processor, the material using the geometric features and the extracted particle paths. The example method further includes generating, by the processor and based on the geometric modeling of the material, one or more visualizations regarding the material, and causing display, by a user interface, of the one or more visualizations.

  10. Advanced analysis of free visual exploration patterns in schizophrenia

    Directory of Open Access Journals (Sweden)

    Andreas eSprenger

    2013-10-01

    Background: Visual scanpath analyses provide important information about attention allocation and attention shifting during visual exploration of social situations. This study investigated whether patients with schizophrenia simply show restricted free visual exploration behaviour, reflected by reduced saccade frequency and increased fixation duration, or whether patients use qualitatively different exploration strategies than healthy controls. Methods: Scanpaths of 32 patients with schizophrenia and 33 age-matched healthy controls were assessed while participants freely explored six photos of daily life situations (20 seconds/photo) evaluated for cognitive complexity and emotional strain. Using fixation and saccade parameters, we compared temporal changes in exploration behaviour, cluster analyses, attentional landscapes and analyses of scanpath similarities between both groups. Results: We found fewer fixation clusters, longer fixation durations within a cluster, fewer changes between clusters, and a greater increase of fixation duration over time in patients compared to controls. Scanpath patterns and attentional landscapes in patients also differed significantly from those of controls. Generally, cognitive complexity and emotional strain had significant effects on visual exploration behaviour. This effect was similar in both groups, as were the physical properties of fixation locations. Conclusions: Longer attention allocation to a given feature in a scene and fewer attention shifts in patients suggest a more focal processing mode compared to the more ambient exploration strategy in controls. These visual exploration alterations were present in patients independently of cognitive complexity, emotional strain or physical properties of visual cues, implying that they represent a rather general deficit. Despite this impairment, patients were able to adapt their scanning behaviour to changes in cognitive complexity and emotional strain similarly to controls.

  11. Time Series Data Analysis of Wireless Sensor Network Measurements of Temperature.

    Science.gov (United States)

    Bhandari, Siddhartha; Bergmann, Neil; Jurdak, Raja; Kusy, Branislav

    2017-05-26

    Wireless sensor networks have gained significant traction in environmental signal monitoring and analysis. The cost or lifetime of the system typically depends on the frequency at which environmental phenomena are monitored. If sampling rates are reduced, energy is saved. Using empirical datasets collected from environmental monitoring sensor networks, this work performs time series analyses of measured temperature time series. Unlike previous works which have concentrated on suppressing the transmission of some data samples by time-series analysis but still maintaining high sampling rates, this work investigates reducing the sampling rate (and sensor wake up rate) and looks at the effects on accuracy. Results show that the sampling period of the sensor can be increased up to one hour while still allowing intermediate and future states to be estimated with interpolation RMSE less than 0.2 °C and forecasting RMSE less than 1 °C.
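The subsample-and-interpolate evaluation described above can be mimicked on a synthetic diurnal temperature signal. The amplitudes, noise level, and one-week duration are invented, not the deployed network's data; the RMSE bound in the check below only mirrors the scale of the reported 0.2 °C figure.

```python
import numpy as np

rng = np.random.default_rng(7)

# One week of synthetic temperature at 1-minute resolution:
# a diurnal cycle plus small sensor noise.
minutes = np.arange(7 * 24 * 60)
temp = (20.0 + 5.0 * np.sin(2 * np.pi * minutes / (24 * 60))
        + 0.05 * rng.normal(size=minutes.size))

# Sample only once per hour (waking the sensor 60x less often) ...
step = 60
t_sub = minutes[::step]
y_sub = temp[::step]

# ... then reconstruct the intermediate states by linear interpolation
# and measure the reconstruction error against the full-rate signal.
recon = np.interp(minutes, t_sub, y_sub)
rmse = np.sqrt(np.mean((recon - temp) ** 2))
```

Because the diurnal signal varies slowly relative to an hourly step, the interpolation error stays small, which is the energy-versus-accuracy trade-off the paper quantifies.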

  12. Multi-complexity ensemble measures for gait time series analysis: application to diagnostics, monitoring and biometrics.

    Science.gov (United States)

    Gavrishchaka, Valeriy; Senyukova, Olga; Davis, Kristina

    2015-01-01

    Previously, we have proposed to use complementary complexity measures discovered by boosting-like ensemble learning for the enhancement of quantitative indicators dealing with necessarily short physiological time series. We have confirmed robustness of such multi-complexity measures for heart rate variability analysis with the emphasis on detection of emerging and intermittent cardiac abnormalities. Recently, we presented preliminary results suggesting that such ensemble-based approach could be also effective in discovering universal meta-indicators for early detection and convenient monitoring of neurological abnormalities using gait time series. Here, we argue and demonstrate that these multi-complexity ensemble measures for gait time series analysis could have significantly wider application scope ranging from diagnostics and early detection of physiological regime change to gait-based biometrics applications.

  13. Harmonic Analysis of a Nonstationary Series of Temperature Paleoreconstruction for the Central Part of Greenland

    Directory of Open Access Journals (Sweden)

    T.E. Danova

    2016-06-01

    The results of investigations of a transformed series of reconstructed air temperature data for the central part of Greenland, with an increment of 30 years, are presented. Stationarization of a ~50,000-year series of reconstructed air temperature in the central part of Greenland, derived from ice core data, was performed using the mathematical expectation. To estimate the mathematical expectation, smoothing by the moving average method and by wavelet analysis was carried out. The Fourier transform was applied repeatedly to the stationarized series while changing the averaging time used in the smoothing. Three averaging times were selected for the investigations: ~400-500 years, ~2,000 years, and ~4,000 years. Stationarization of the reconstructed temperature series with the help of wavelet transformation showed the best results for averaging times of ~400 and ~2,000 years: the resulting trends characterize the initial temperature series well, thereby revealing the main patterns of its dynamics. Using an averaging time of ~4,000 years gave the worst result: significant events of the main temperature series were lost in the averaging process. The obtained results correspond well to the cyclicity known to be inherent in the planet's climatic system; the detected modes of 1,470 ± 500 years are comparable to the Dansgaard-Oeschger and Bond oscillations.

  14. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same direction. So far, it has been established for variance- and runs-based descriptors of RR interval time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to a set of 420 stationary 30-min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study, global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features; this asymmetry is not observed in physiological data after shuffling or in a group of symmetric synthetic time series.
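A plain (symmetric) DFA sketch in NumPy is shown below as the common basis of the method; the asymmetric variant additionally separates the detrending windows by the sign of the local trend to obtain the two exponents for rising and falling trends. The input here is synthetic white noise (for which the exponent is near 0.5), not RR interval data.

```python
import numpy as np

def dfa_alpha(x, scales=(16, 32, 64, 128, 256)):
    """Basic Detrended Fluctuation Analysis: returns the scaling exponent."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())          # integrated (profile) series
    log_s, log_f = [], []
    for s in scales:
        n_win = len(profile) // s
        f2 = []
        for k in range(n_win):
            seg = profile[k * s:(k + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)       # local linear trend
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        log_s.append(np.log(s))
        log_f.append(0.5 * np.log(np.mean(f2)))
    alpha, _ = np.polyfit(log_s, log_f, 1)     # slope = scaling exponent
    return alpha

rng = np.random.default_rng(8)
alpha_wn = dfa_alpha(rng.normal(size=8192))    # white noise: alpha near 0.5
```

Shuffling a physiological series destroys its correlation structure and drives the exponent toward the white-noise value, which is why shuffled data serve as the symmetric control in the study above.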

  15. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., distinguishing time series corresponding to regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) as well as the visibility algorithm, we transformed time series into networks and evaluated topological properties of the networks such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those originating from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.
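
    The visibility algorithm mentioned here (the natural visibility graph of Lacasa et al.) maps each time point to a node and links two points when the straight line between them passes above every intermediate observation. A minimal O(n²) sketch:

```python
import numpy as np

def visibility_graph(series):
    """Natural visibility graph: nodes are time points; (a, b) is an edge if
    the sight line between them stays above all intermediate observations."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b]
                + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# A strictly linear series: intermediate points lie exactly on every sight
# line (the strict inequality fails), so only neighbouring samples are linked.
edges_line = visibility_graph([1.0, 2.0, 3.0, 4.0, 5.0])

# A series with a dip: the first and last points see each other over the valley.
edges_dip = visibility_graph([3.0, 1.0, 2.0])
```

    The resulting edge set can be handed to a graph library (e.g. networkx) to compute the degree distribution, average path length, and clustering coefficient used in the study.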

  16. Statistical attribution analysis of the nonstationarity of the annual runoff series of the Weihe River.

    Science.gov (United States)

    Xiong, Lihua; Jiang, Cong; Du, Tao

    2014-01-01

    Time-varying moments models based on Pearson Type III and normal distributions, respectively, are built within the framework of generalized additive models for location, scale and shape (GAMLSS) to analyze the nonstationarity of the annual runoff series of the Weihe River, the largest tributary of the Yellow River. The detection of nonstationarities in the hydrological time series (annual runoff, precipitation and temperature) from 1960 to 2009 is carried out using a GAMLSS model, and then the covariate analysis for the annual runoff series is implemented with GAMLSS. Finally, the contribution of each covariate to the nonstationarity of annual runoff is analyzed quantitatively. The results demonstrate that (1) obvious change-points exist in all three hydrological series, (2) precipitation, temperature and irrigated area are all significant covariates of the annual runoff series, and (3) the temperature increase plays the main role in the reduction of annual runoff in the study basin, followed by the decrease in precipitation and the increase in irrigated area.
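
    The core idea of a time-varying moments model can be illustrated outside the R GAMLSS package: fit a normal distribution whose mean and log standard deviation both vary linearly with time, by maximum likelihood. This is a hedged sketch on fabricated "runoff" numbers with time as the only covariate, not the Weihe River data or the paper's Pearson Type III model.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic "annual runoff" with a declining mean and growing variability;
# all names and numbers here are illustrative.
rng = np.random.default_rng(2)
t = np.arange(50, dtype=float)                     # years since start of record
runoff = rng.normal(100.0 - 0.8 * t, np.exp(2.0 + 0.01 * t))

def neg_log_lik(params):
    """Normal model with time-varying moments: mu = a + b*t, log(sigma) = c + d*t."""
    a, b, c, d = params
    mu = a + b * t
    sigma = np.exp(c + d * t)
    return np.sum(0.5 * ((runoff - mu) / sigma) ** 2 + np.log(sigma))

# Initialize from an ordinary least-squares trend and a constant spread.
b0, a0 = np.polyfit(t, runoff, 1)
start = np.array([a0, b0, np.log(np.std(runoff)), 0.0])
fit = minimize(neg_log_lik, start, method="Nelder-Mead",
               options={"maxiter": 10000, "xatol": 1e-8, "fatol": 1e-8})
a_hat, b_hat, c_hat, d_hat = fit.x
```

    A significantly non-zero b_hat or d_hat is evidence of nonstationarity in the location or scale of the series, which is the kind of test GAMLSS formalizes with richer distributions and multiple covariates.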

  17. Analysis of clinical features and visual outcomes of pars planitis.

    Science.gov (United States)

    Berker, Nilufer; Sen, Emine; Elgin, Ufuk; Atilgan, Cemile Ucgul; Dursun, Erdem; Yilmazbas, Pelin

    2018-04-01

    To evaluate the demographic characteristics, clinical features, treatment and outcomes of patients with pars planitis in a tertiary referral center in Turkey. Medical records of patients with pars planitis were retrospectively reviewed. Data including demographic and ocular features and treatment outcomes were recorded. The distributions of clinical findings and complications were evaluated according to age and gender groups. The change in final BCVA compared to the initial BCVA was noted. Statistical analysis was performed using SPSS software (Version 18.0, SPSS Inc., Chicago, USA). Twenty-seven patients (54 eyes) were included in this study. Sixteen patients were male (59.3%), and 11 were female (40.7%). Mean age at diagnosis was 12.84 ± 8.26 (range 4-36) years. Mean follow-up period was 61.3 ± 52.15 (range 9-172) months. Mean BCVA was 0.58 ± 0.36 (range 0.03-1.00) (0.40 ± 0.45 logMAR) at presentation, and 0.81 ± 0.28 (range 0.10-1.00) (0.14 ± 0.27 logMAR) at the final visit (P = 0.001). Vitreous inflammation (100%), vitreous haze (92.6%), snowballs (74.1%), snowbanks (66.7%), anterior chamber cells (66.7%) and peripheral retinal vascular sheathing (48.1%) were the most common presentations. Ocular complications included vitreous condensation (51.9%), cystoid macular edema (22.2%), cataract (18.5%), inferior peripheral retinal detachment (11.1%), glaucoma (5.6%) and vitreous hemorrhage (3.7%). Treatments included topical, periocular, intravitreal and systemic corticosteroids, immunosuppressives, peripheral laser photocoagulation and pars plana vitrectomy when needed. Pars planitis is an idiopathic chronic intermediate uveitis mostly affecting children and adolescents. Despite its chronic nature and high potential for causing ocular complications, adequate treatment and close follow-up lead to favorable visual outcomes.

  18. Stethoscope: A platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    M.M. Gawade (Mrunal); M.L. Kersten (Martin)

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and

  19. Stethoscope: a platform for interactive visual analysis of query execution plans

    NARCIS (Netherlands)

    Gawade, M.; Kersten, M.

    2012-01-01

    Searching for the performance bottleneck in an execution trace is an error-prone and time-consuming activity. Existing tools offer some comfort by providing a visual representation of the trace for analysis. In this paper we present the Stethoscope, an interactive visual tool to inspect and analyze

  20. JavaScript and jQuery for data analysis and visualization

    CERN Document Server

    Raasch, Jon; Ogievetsky, Vadim; Lowery, Joseph

    2014-01-01

    Go beyond design concepts: build dynamic data visualizations using JavaScript. JavaScript and jQuery for Data Analysis and Visualization goes beyond design concepts to show readers how to build dynamic, best-of-breed visualizations using JavaScript, the most popular language for web programming. The authors show data analysts, developers, and web designers how they can put the power and flexibility of modern JavaScript libraries to work to analyze data and then present it using best-of-breed visualizations. They also demonstrate the use of each technique with real-world use cases, showing how to

  1. Visual attention to advertising : A segment-level analysis

    NARCIS (Netherlands)

    Rosbergen, E; Pieters, R; Wedel, M

    1997-01-01

    We propose a methodology to study the effects of physical ad properties on consumers' visual attention to advertising that accounts for heterogeneity in these effects across consumers. In an illustrative experiment, we monitor consumers' eye movements during naturalistic exposure to a consumer

  2. A visual analysis of a cultural tourism destination | Eringa | Research ...

    African Journals Online (AJOL)

    ... can help to frame the experience in all three stages. For that reason it is advisable for destinations to employ some kind of visual identity system management to package the city image into a clear brand. Keywords: European Capital of Culture, Leeuwarden 2018, Chinese visitors, destination branding, flanking research ...

  3. Prajna: adding automated reasoning to the visual- analysis process.

    Science.gov (United States)

    Swing, E

    2010-01-01

    Developers who create applications for knowledge representation must contend with challenges in both the abundance of data and the variety of toolkits, architectures, and standards for representing it. Prajna is a flexible Java toolkit designed to overcome these challenges with an extensible architecture that supports both visualization and automated reasoning.

  4. Visual artificial grammar learning in dyslexia : A meta-analysis

    NARCIS (Netherlands)

    van Witteloostuijn, Merel; Boersma, Paul; Wijnen, Frank; Rispens, Judith

    2017-01-01

    Background Literacy impairments in dyslexia have been hypothesized to be (partly) due to an implicit learning deficit. However, studies of implicit visual artificial grammar learning (AGL) have often yielded null results. Aims The aim of this study is to weigh the evidence collected thus far by

  5. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  6. Chromatic Information and Feature Detection in Fast Visual Analysis.

    Directory of Open Access Journals (Sweden)

    Maria M Del Viva

    Full Text Available The visual system is able to recognize a scene based on a sketch made of very simple features. This ability is likely crucial for survival, when fast image recognition is necessary, and it is believed that a primal sketch is extracted very early in visual processing. Such highly simplified representations can be sufficient for accurate object discrimination, but an open question is the role played by color in this process. Rich color information is available in natural scenes, yet artists' sketches are usually monochromatic, and black-and-white movies provide compelling representations of real-world scenes. Also, the contrast sensitivity of color is low at fine spatial scales. We approach the question from the perspective of optimal information processing by a system endowed with limited computational resources. We show that when such limitations are taken into account, the intrinsic statistical properties of natural scenes imply that the most effective strategy is to ignore fine-scale color features and devote most of the bandwidth to gray-scale information. We find confirmation of these information-based predictions in psychophysical measurements of fast-viewing discrimination of natural scenes. We conclude that the lack of colored features in our visual representation, and our overall low sensitivity to high-frequency color components, are a consequence of an adaptation process, optimizing the size and power consumption of our brain for the visual world we live in.

  7. Simulation and Formal Analysis of Visual Attention in Cognitive Systems

    NARCIS (Netherlands)

    Bosse, T.; Maanen, P.P. van; Treur, J.

    2007-01-01

    In this paper a simulation model for visual attention is discussed and formally analysed. The model is part of the design of a cognitive system which comprises an agent that supports a naval officer in its task to compile a tactical picture of the situation in the field. A case study is described in

  8. Interactive visual exploration and analysis of origin-destination data

    Science.gov (United States)

    Ding, Linfang; Meng, Liqiu; Yang, Jian; Krisp, Jukka M.

    2018-05-01

    In this paper, we propose a visual analytics approach for the exploration of spatiotemporal interaction patterns of massive origin-destination data. Firstly, we visually query the movement database for data at certain time windows. Secondly, we conduct interactive clustering to allow the users to select input variables/features (e.g., origins, destinations, distance, and duration) and to adjust clustering parameters (e.g. distance threshold). The agglomerative hierarchical clustering method is applied for the multivariate clustering of the origin-destination data. Thirdly, we design a parallel coordinates plot for visualizing the precomputed clusters and for further exploration of interesting clusters. Finally, we propose a gradient line rendering technique to show the spatial and directional distribution of origin-destination clusters on a map view. We implement the visual analytics approach in a web-based interactive environment and apply it to real-world floating car data from Shanghai. The experiment results show the origin/destination hotspots and their spatial interaction patterns. They also demonstrate the effectiveness of our proposed approach.
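
    The interactive-clustering step described above — agglomerative hierarchical clustering on multivariate origin-destination features, cut at an analyst-chosen threshold — can be sketched with scipy. The two synthetic flow bundles and their coordinates below are invented for illustration, not the Shanghai floating car data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# Two synthetic flow bundles: 50 trips A -> B and 50 trips C -> D.
flow1 = np.hstack([rng.normal([0, 0], 0.1, (50, 2)),     # origins near (0, 0)
                   rng.normal([5, 5], 0.1, (50, 2))])    # destinations near (5, 5)
flow2 = np.hstack([rng.normal([10, 0], 0.1, (50, 2)),
                   rng.normal([0, 8], 0.1, (50, 2))])
features = np.vstack([flow1, flow2])     # one row per trip: [ox, oy, dx, dy]

# Agglomerative (Ward) clustering on the multivariate OD features,
# cut into clusters at a level the analyst would tune interactively.
Z = linkage(features, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

    In the paper's workflow the feature set (origins, destinations, distance, duration) and the cut threshold are user-adjustable; here Ward linkage and a two-cluster cut stand in for those interactive choices.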

  9. On statistical inference in time series analysis of the evolution of road safety.

    Science.gov (United States)

    Commandeur, Jacques J F; Bijleveld, Frits D; Bergel-Hayat, Ruth; Antoniou, Constantinos; Yannis, George; Papadimitriou, Eleonora

    2013-11-01

    Data collected for building a road safety observatory usually include observations made sequentially through time. Examples of such data, called time series data, include the annual (or monthly) number of road traffic accidents, traffic fatalities or vehicle kilometers driven in a country, as well as the corresponding values of safety performance indicators (e.g., data on speeding, seat belt use, alcohol use, etc.). Some commonly used statistical techniques imply assumptions that are often violated by the special properties of time series data, namely serial dependency among the disturbances associated with the observations. The first objective of this paper is to demonstrate the impact of such violations on the applicability of standard methods of statistical inference, which leads to an under- or overestimation of the standard error and consequently may produce erroneous inferences. Moreover, having established the adverse consequences of ignoring serial dependency issues, the paper aims to describe rigorous statistical techniques used to overcome them. In particular, appropriate time series analysis techniques of varying complexity are employed to describe the development over time, relating accident occurrences to explanatory factors such as exposure measures or safety performance indicators, and forecasting the development into the near future. Traditional regression models (whether linear, generalized linear or nonlinear) are shown not to naturally capture the inherent dependencies in time series data. Dedicated time series analysis techniques, such as the ARMA-type and DRAG approaches, are discussed next, followed by structural time series models, which are a subclass of state space methods. The paper concludes with general recommendations and practice guidelines for the use of time series models in road safety research. Copyright © 2012 Elsevier Ltd. All rights reserved.
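
    The central point — that ignoring serial dependency misstates the standard error — is easy to demonstrate numerically. For an AR(1) process with coefficient φ, the variance of the sample mean is inflated by roughly (1 + φ)/(1 − φ) relative to the i.i.d. formula. A minimal sketch on simulated data (not a road-safety series):

```python
import numpy as np

def ar1(n, phi, rng):
    """Simulate an AR(1) series, a simple model of serially dependent observations."""
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = phi * x[i - 1] + rng.standard_normal()
    return x

rng = np.random.default_rng(4)
x = ar1(400, phi=0.8, rng=rng)

# Naive i.i.d. standard error of the mean vs. an AR(1)-corrected one:
# Var(mean) ~ (sigma^2 / n) * (1 + phi) / (1 - phi) for large n.
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # lag-1 autocorrelation
se_naive = x.std(ddof=1) / np.sqrt(len(x))
se_corrected = se_naive * np.sqrt((1 + rho1) / (1 - rho1))
```

    With φ = 0.8 the corrected standard error is roughly three times the naive one, which is exactly the kind of underestimation the paper warns about before introducing ARMA-type and state space models.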

  10. A Filtering of Incomplete GNSS Position Time Series with Probabilistic Principal Component Analysis

    Science.gov (United States)

    Gruszczynski, Maciej; Klos, Anna; Bogusz, Janusz

    2018-04-01

    For the first time, we introduce probabilistic principal component analysis (pPCA) for the spatio-temporal filtering of Global Navigation Satellite System (GNSS) position time series, to estimate and remove the Common Mode Error (CME) without interpolation of missing values. We used data from International GNSS Service (IGS) stations which contributed to the latest International Terrestrial Reference Frame (ITRF2014). The efficiency of the proposed algorithm was tested on simulated incomplete time series; then the CME was estimated for a set of 25 stations located in Central Europe. The newly applied pPCA was compared with previously used algorithms, which showed that this method is capable of resolving the problem of proper spatio-temporal filtering of GNSS time series characterized by different observation time spans. We showed that filtering can be carried out with the pPCA method even when two time series in the dataset share fewer than 100 common epochs of observations. The 1st Principal Component (PC) explained more than 36% of the total variance represented by the time series residuals (series with the deterministic model removed), which, compared to the variances of the other PCs (less than 8%), means that common signals are significant in GNSS residuals. A clear improvement in the spectral indices of the power-law noise was noticed for the Up component, reflected by an average shift towards white noise from -0.98 to -0.67 (30%). We observed a significant average reduction in the uncertainty of station velocities estimated from filtered residuals, by 35, 28 and 69% for the North, East, and Up components, respectively. The CME series were also analyzed in the context of environmental mass loading influences on the filtering results. Subtracting the environmental loading models from the GNSS residuals leads to a reduction of the estimated CME variance by 20 and 65% for the horizontal and vertical components, respectively.
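
    The underlying common-mode filtering idea can be sketched with classical PCA on complete data: stack the station residuals, take the first principal component as the CME estimate, and subtract its projection from every station. This is a simplified stand-in (synthetic residuals, no data gaps); the paper's contribution is the probabilistic variant that handles missing epochs without interpolation.

```python
import numpy as np

rng = np.random.default_rng(5)
n_epochs, n_stations = 500, 25
# Synthetic position residuals: a shared regional signal (the CME)
# plus station-specific noise.
cme = np.cumsum(rng.standard_normal(n_epochs)) * 0.1
residuals = cme[:, None] + 0.5 * rng.standard_normal((n_epochs, n_stations))

# Classical PCA via SVD of the centered residual matrix.
centered = residuals - residuals.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S**2 / np.sum(S**2)          # variance fraction per PC

# Reconstruct the first principal component as the CME estimate
# and subtract its contribution from every station.
cme_est = U[:, 0] * S[0]
filtered = centered - np.outer(cme_est, Vt[0])
```

    As in the study, a dominant first PC (here by construction) signals a significant common mode, and removing it reduces the residual variance at every station.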

  11. Characteristics of Articles About Human Papillomavirus Vaccination in Japanese Newspapers: Time-Series Analysis Study.

    Science.gov (United States)

    Ueda, Nao; Yokouchi, Ryoki; Onoda, Taro; Ogihara, Atsushi

    2017-12-19

    Media coverage and reports have a major influence on individual vaccination and other health-related activities. People use the media to seek information and knowledge on health-related behaviors. They obtain health-related information from media such as television and newspapers, and they trust such information. While several studies have examined the relation between media coverage and individual health, there is a lack of studies that have analyzed media reports of health information. In particular, we have found no analyses related to cervical cancer (human papillomavirus [HPV]) vaccine. This study aimed to identify mentions of cervical cancer vaccine in Japan's printed news media and to determine their characteristics. We used the archival databases of 2 Japanese newspapers, Yomiuri Shimbun (Yomidasu Rekishikan) and Asahi Shimbun (Kikuzo II Visual), for text mining. First, we created a database by extracting articles published between January 1, 2007, and December 31, 2014, that matched the terms "cervical cancer" AND "vaccination" in a keyword search. Then, we tallied the extracted articles based on the month of publication and number of characters in order to conduct a time-series analysis. We extracted a total of 219 articles. Of these, 154 (70.3%) were positive and 51 (23.3%) were negative toward HPV vaccination. Of the 51 negative articles, 4 (7.8%) were published before June 2013, when routine vaccination was temporarily discontinued due to concerns regarding side effects, and 47 (92.2%) were published since then. The negative reports commonly cited side effects, although prior to June 2013, these issues were hardly mentioned. Although foreign media reports mentioned side effects before routine vaccination was temporarily discontinued, fewer articles mentioned side effects than recommendations for vaccination. Furthermore, on June 13, 2013, the World Health Organization's advisory body Global Advisory Committee on Vaccine Safety issued a statement

  12. A Recurrent Probabilistic Neural Network with Dimensionality Reduction Based on Time-series Discriminant Component Analysis.

    Science.gov (United States)

    Hayashi, Hideaki; Shibanoki, Taro; Shima, Keisuke; Kurita, Yuichi; Tsuji, Toshio

    2015-12-01

    This paper proposes a probabilistic neural network (NN) developed on the basis of time-series discriminant component analysis (TSDCA) that can be used to classify high-dimensional time-series patterns. TSDCA involves the compression of high-dimensional time series into a lower dimensional space using a set of orthogonal transformations and the calculation of posterior probabilities based on a continuous-density hidden Markov model with a Gaussian mixture model expressed in the reduced-dimensional space. The analysis can be incorporated into an NN, which is named a time-series discriminant component network (TSDCN), so that parameters of dimensionality reduction and classification can be obtained simultaneously as network coefficients according to a backpropagation through time-based learning algorithm with the Lagrange multiplier method. The TSDCN is considered to enable high-accuracy classification of high-dimensional time-series patterns and to reduce the computation time taken for network training. The validity of the TSDCN is demonstrated for high-dimensional artificial data and electroencephalogram signals in the experiments conducted during the study.

  13. Regression and regression analysis time series prediction modeling on climate data of quetta, pakistan

    International Nuclear Information System (INIS)

    Jafri, Y.Z.; Kamal, L.

    2007-01-01

    Various statistical techniques were applied to five years of data (1998-2002) on average humidity, rainfall, and maximum and minimum temperatures, respectively. Relationships for regression analysis time series (RATS) were developed to determine the overall trend of these climate parameters, on the basis of which forecast models can be corrected and modified. We computed the coefficient of determination, as a measure of goodness of fit, for our polynomial regression analysis time series (PRATS). Correlations for multiple linear regression (MLR) and multiple linear regression analysis time series (MLRATS) were also developed for deciphering the interdependence of weather parameters. Spearman's rank correlation and the Goldfeld-Quandt test were used to check the uniformity or non-uniformity of variances in our fit to polynomial regression (PR). The Breusch-Pagan test was applied to MLR and MLRATS, respectively, which yielded homoscedasticity. We also employed Bartlett's test for homogeneity of variances on five years of rainfall and humidity data, respectively, which showed that the variances in the rainfall data were not homogeneous, while those in the humidity data were. Our results on regression and regression analysis time series show the best fit for prediction modeling of the climatic data of Quetta, Pakistan. (author)
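
    The polynomial-regression-with-R² step can be sketched in numpy. The monthly series below is fabricated for illustration (a warming trend plus noise), not the Quetta record, and the polynomial degree is an arbitrary choice for the example.

```python
import numpy as np

rng = np.random.default_rng(6)
# Illustrative five years of monthly mean temperature: trend plus noise.
months = np.arange(60, dtype=float)
temp = 20 + 0.15 * months + rng.normal(0, 1.5, 60)

# Polynomial regression fit of the series, with the coefficient of
# determination R^2 as the goodness-of-fit measure.
coeffs = np.polyfit(months, temp, deg=2)
fitted = np.polyval(coeffs, months)
ss_res = np.sum((temp - fitted) ** 2)
ss_tot = np.sum((temp - np.mean(temp)) ** 2)
r_squared = 1 - ss_res / ss_tot
```

    The same residuals would then feed the heteroscedasticity checks the paper mentions (Goldfeld-Quandt, Breusch-Pagan, Bartlett), which test whether the residual variance is uniform across the record.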

  14. Bayesian near-boundary analysis in basic macroeconomic time series models

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); R. Segers (René); H.K. van Dijk (Herman)

    2008-01-01

    textabstractSeveral lessons learnt from a Bayesian analysis of basic macroeconomic time series models are presented for the situation where some model parameters have substantial posterior probability near the boundary of the parameter region. This feature refers to near-instability within dynamic

  15. A Comparison of Missing-Data Procedures for Arima Time-Series Analysis

    Science.gov (United States)

    Velicer, Wayne F.; Colby, Suzanne M.

    2005-01-01

    Missing data are a common practical problem for longitudinal designs. Time-series analysis is a longitudinal method that involves a large number of observations on a single unit. Four different missing-data methods (deletion, mean substitution, mean of adjacent observations, and maximum likelihood estimation) were evaluated. Computer-generated…
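
    Two of the four evaluated procedures — mean substitution and the mean of adjacent observations — are simple enough to sketch directly; on a smooth, autocorrelated series the local method should recover the missing values far better than the global mean. The series and missingness pattern below are synthetic stand-ins for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
series = np.sin(np.arange(200) * 0.3) + 0.2 * rng.standard_normal(200)
missing = rng.choice(200, size=20, replace=False)    # 10% missing at random
observed = series.copy()
observed[missing] = np.nan

def mean_substitution(x):
    """Replace each missing value with the overall series mean."""
    y = x.copy()
    y[np.isnan(y)] = np.nanmean(y)
    return y

def adjacent_mean(x):
    """Replace each missing value by interpolating between the nearest
    observed neighbours (the mean of adjacent observations for single gaps)."""
    y = x.copy()
    idx = np.arange(len(y))
    ok = ~np.isnan(y)
    y[~ok] = np.interp(idx[~ok], idx[ok], y[ok])
    return y

rmse_mean = np.sqrt(np.mean((mean_substitution(observed)[missing] - series[missing]) ** 2))
rmse_adj = np.sqrt(np.mean((adjacent_mean(observed)[missing] - series[missing]) ** 2))
```

    The gap in recovery error between the two methods illustrates why the study finds simple global imputation inadequate for autocorrelated time-series designs.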

  16. Using trajectory sensitivity analysis to find suitable locations of series compensators for improving rotor angle stability

    DEFF Research Database (Denmark)

    Nasri, Amin; Eriksson, Robert; Ghandhar, Mehrdad

    2014-01-01

    This paper proposes an approach based on trajectory sensitivity analysis (TSA) to find most suitable placement of series compensators in the power system. The main objective is to maximize the benefit of these devices in order to enhance the rotor angle stability. This approach is formulated...

  17. Detrended partial cross-correlation analysis of two nonstationary time series influenced by common external forces

    Science.gov (United States)

    Qian, Xi-Yuan; Liu, Ya-Min; Jiang, Zhi-Qiang; Podobnik, Boris; Zhou, Wei-Xing; Stanley, H. Eugene

    2015-06-01

    When common factors strongly influence two power-law cross-correlated time series recorded in complex natural or social systems, using detrended cross-correlation analysis (DCCA) without considering these common factors will bias the results. We use detrended partial cross-correlation analysis (DPXA) to uncover the intrinsic power-law cross correlations between two simultaneously recorded time series in the presence of nonstationarity after removing the effects of other time series acting as common forces. The DPXA method is a generalization of the detrended cross-correlation analysis that takes into account partial correlation analysis. We demonstrate the method by using bivariate fractional Brownian motions contaminated with a fractional Brownian motion. We find that the DPXA is able to recover the analytical cross Hurst indices, and thus the multiscale DPXA coefficients are a viable alternative to the conventional cross-correlation coefficient. We demonstrate the advantage of the DPXA coefficients over the DCCA coefficients by analyzing contaminated bivariate fractional Brownian motions. We calculate the DPXA coefficients and use them to extract the intrinsic cross correlation between crude oil and gold futures by taking into consideration the impact of the U.S. dollar index. We develop the multifractal DPXA (MF-DPXA) method in order to generalize the DPXA method and investigate multifractal time series. We analyze multifractal binomial measures masked with strong white noises and find that the MF-DPXA method quantifies the hidden multifractal nature while the multifractal DCCA method fails.
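
    The DCCA coefficient that DPXA generalizes can be sketched directly: integrate both series, detrend them window by window, and normalize the detrended covariance by the two detrended variances (Zebende's ρ_DCCA). This is a sketch of plain DCCA on synthetic data; the paper's DPXA would additionally regress out the common external series before computing the coefficient.

```python
import numpy as np

def dcca_coefficient(x, y, s):
    """Detrended cross-correlation coefficient rho_DCCA at window size s."""
    X, Y = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())   # profiles
    t = np.arange(s)
    f2xy = f2xx = f2yy = 0.0
    for i in range(len(x) // s):
        seg = slice(i * s, (i + 1) * s)
        # Linearly detrend both profiles in the window.
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], 1), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], 1), t)
        f2xy += np.mean(rx * ry)
        f2xx += np.mean(rx * rx)
        f2yy += np.mean(ry * ry)
    return f2xy / np.sqrt(f2xx * f2yy)

rng = np.random.default_rng(8)
common = np.cumsum(rng.standard_normal(1024))        # shared external force
x = common + 2.0 * rng.standard_normal(1024)
y = common + 2.0 * rng.standard_normal(1024)
z = rng.standard_normal(1024)                        # independent series

rho_xy = dcca_coefficient(x, y, s=32)
rho_xz = dcca_coefficient(x, z, s=32)
```

    Here ρ_DCCA is high for the two series driven by the common force and near zero for the independent pair; DPXA's partial-correlation step is what separates intrinsic coupling from such a shared driver.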

  18. Operation States Analysis of the Series-Parallel resonant Converter Working Above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2007-01-01

    Full Text Available An operation states analysis of a series-parallel resonant converter working above the resonance frequency is described in the paper. Principal equations are derived for the individual operation states. On the basis of these, diagrams are constructed. The diagrams give a complex image of the converter's behaviour for individual circuit parameters. The waveforms may be utilised in designing the individual parts of the inverter.

  19. AAMFT Master Series Tapes: An Analysis of the Inclusion of Feminist Principles into Family Therapy Practice.

    Science.gov (United States)

    Haddock, Shelley A.; MacPhee, David; Zimmerman, Toni Schindler

    2001-01-01

    Content analysis of 23 American Association for Marriage and Family Therapy Master Series tapes was used to determine how well feminist behaviors have been incorporated into ideal family therapy practice. Feminist behaviors were infrequent, being evident in fewer than 3% of time blocks in event sampling and 10 of 39 feminist behaviors of the…

  20. Dynamic factor analysis in the frequency domain: causal modeling of multivariate psychophysiological time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1987-01-01

    Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic

  1. Harmonic analysis of dense time series of landsat imagery for modeling change in forest conditions

    Science.gov (United States)

    Barry Tyler. Wilson

    2015-01-01

    This study examined the utility of dense time series of Landsat imagery for small area estimation and mapping of change in forest conditions over time. The study area was a region in north central Wisconsin for which Landsat 7 ETM+ imagery and field measurements from the Forest Inventory and Analysis program are available for the decade of 2003 to 2012. For the periods...
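
    Harmonic analysis of a dense optical time series typically means fitting a constant plus a few annual Fourier harmonics by least squares to irregularly timed observations. A hedged sketch on a synthetic NDVI-like signal (the dates, amplitudes, and noise level are invented, not Landsat 7 ETM+ data):

```python
import numpy as np

rng = np.random.default_rng(9)
# Synthetic vegetation-index signal observed at irregular acquisition dates.
doy = np.sort(rng.choice(np.arange(1, 366), size=80, replace=False)).astype(float)
ndvi = (0.5 + 0.25 * np.sin(2 * np.pi * doy / 365)
        + 0.1 * np.cos(4 * np.pi * doy / 365)
        + 0.03 * rng.standard_normal(doy.size))

# Harmonic (Fourier) regression: constant + first two annual harmonics,
# fitted by ordinary least squares.
w = 2 * np.pi * doy / 365
design = np.column_stack([np.ones_like(doy),
                          np.sin(w), np.cos(w),
                          np.sin(2 * w), np.cos(2 * w)])
coef, *_ = np.linalg.lstsq(design, ndvi, rcond=None)
fitted = design @ coef
rmse = np.sqrt(np.mean((ndvi - fitted) ** 2))
```

    Changes in the fitted harmonic coefficients from year to year are the kind of signal used to model change in forest conditions over the decade studied.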

  2. Economic Conditions and the Divorce Rate: A Time-Series Analysis of the Postwar United States.

    Science.gov (United States)

    South, Scott J.

    1985-01-01

    Challenges the belief that the divorce rate rises during prosperity and falls during economic recessions. Time-series regression analysis of postwar United States reveals small but positive effects of unemployment on divorce rate. Stronger influences on divorce rates are changes in age structure and labor-force participation rate of women.…

  3. Operation Analysis of the Series-Parallel Resonant Converter Working above Resonance Frequency

    Directory of Open Access Journals (Sweden)

    Peter Dzurko

    2006-01-01

    Full Text Available The present article deals with a theoretical analysis of the operation of a series-parallel converter working above the resonance frequency. Principal equations are derived for the individual operation intervals. Based on these, waveforms of the individual quantities are constructed for both inverter operation at load and no-load operation. The waveforms may be utilised in designing the individual parts of the inverter.

  4. Exploratory Climate Data Visualization and Analysis Using DV3D and UVCDAT

    Science.gov (United States)

    Maxwell, Thomas

    2012-01-01

    Earth system scientists are being inundated by an explosion of data generated by ever-increasing resolution in both global models and remote sensors. Advanced tools for accessing, analyzing, and visualizing very large and complex climate data are required to maintain rapid progress in Earth system research. To meet this need, NASA, in collaboration with the Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) consortium, is developing exploratory climate data analysis and visualization tools which provide data analysis capabilities for the Earth System Grid (ESG). This paper describes DV3D, a UV-CDAT package that enables exploratory analysis of climate simulation and observation datasets. DV3D provides user-friendly interfaces for visualization and analysis of climate data at a level appropriate for scientists. It features workflow interfaces, interactive 4D data exploration, hyperwall and stereo visualization, automated provenance generation, and parallel task execution. DV3D's integration with CDAT's climate data management system (CDMS) and other climate data analysis tools provides a wide range of high-performance climate data analysis operations. DV3D expands the scientists' toolbox by incorporating a suite of rich new exploratory visualization and analysis methods for addressing the complexity of climate datasets.

  5. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
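
    Among the methods pyunicorn bundles, recurrence quantification analysis is simple to sketch from scratch: threshold the pairwise distance matrix of the series into a binary recurrence matrix and summarize it, e.g. by the recurrence rate. The plain-numpy version below (scalar series, no embedding) only illustrates the concept; pyunicorn's own classes add embeddings, metrics, and the full set of RQA measures.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 where |x_i - x_j| < eps
    (scalar series, absolute-difference metric, no embedding for brevity)."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent pairs (diagonal excluded), a basic RQA measure."""
    n = R.shape[0]
    return (R.sum() - n) / (n * (n - 1))

t = np.linspace(0, 20 * np.pi, 500)
periodic = np.sin(t)                         # strongly recurrent dynamics
rng = np.random.default_rng(10)
noise = rng.standard_normal(500)             # weakly recurrent dynamics

rr_periodic = recurrence_rate(recurrence_matrix(periodic, eps=0.1))
rr_noise = recurrence_rate(recurrence_matrix(noise, eps=0.1))
```

    A periodic signal revisits the same states far more often than white noise at the same threshold, which is the contrast RQA measures are built to quantify.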

  6. Independent component analysis: A new possibility for analysing series of electron energy loss spectra

    International Nuclear Information System (INIS)

    Bonnet, Noël; Nuzillard, Danielle

    2005-01-01

    A complementary approach is proposed for analysing series of electron energy-loss spectra that can be recorded with the spectrum-line technique, across an interface for instance. This approach, called blind source separation (BSS) or independent component analysis (ICA), complements two existing methods: the spatial difference approach and multivariate statistical analysis. The principle of the technique is presented, and illustrations are given through one simulated example and one real example.

  7. Visual physics analysis-from desktop to physics analysis at your fingertips

    International Nuclear Information System (INIS)

    Bretz, H-P; Erdmann, M; Fischer, R; Hinzmann, A; Klingebiel, D; Komm, M; Lingemann, J; Rieger, M; Müller, G; Steggemann, J; Winchen, T

    2012-01-01

    Visual Physics Analysis (VISPA) is an analysis environment with applications in high energy and astroparticle physics. Based on a data-flow-driven paradigm, it allows users to combine graphical steering with self-written C++ and Python modules. This contribution presents new concepts integrated in VISPA: layers, convenient analysis execution, and web-based physics analysis. While the convenient execution offers full flexibility to vary settings for the execution phase of an analysis, layers allow the user to create different views of the analysis already during its design phase. Thus, one application of layers is to define different stages of an analysis (e.g. event selection and statistical analysis). However, there are other use cases, such as independently optimizing settings for different types of input data in order to guide all data through the same analysis flow. The new execution feature makes job submission to local clusters as well as the LHC Computing Grid possible directly from VISPA. Web-based physics analysis is realized in the VISPA-Web project, which represents a whole new way to design and execute analyses via a standard web browser.

  8. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data are a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA, which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA, can provide better imputation in comparison to other methods.
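
The L1-norm decomposition at the heart of L1-SSA is not reproduced here, but the surrounding iterative imputation scheme can be sketched with the classical SVD-based (L2) variant; the function name and defaults below are illustrative assumptions, not the authors' code.

```python
import numpy as np

def ssa_impute(series, window, rank, n_iter=50):
    """Iterative SSA imputation (classical L2 variant).

    NaNs are first filled with the series mean; each iteration embeds
    the series into a trajectory matrix, truncates its SVD to `rank`
    components, diagonal-averages back to a series, and copies the
    reconstruction into the missing positions only. (L1-SSA, as in
    the paper, replaces the SVD with an outlier-robust L1-norm
    decomposition.)
    """
    y = np.array(series, dtype=float)  # work on a copy
    missing = np.isnan(y)
    y[missing] = np.nanmean(y)
    n, k = len(y), len(y) - window + 1
    for _ in range(n_iter):
        # Hankel (trajectory) matrix: column j holds y[j : j + window]
        X = np.column_stack([y[j:j + window] for j in range(k)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Diagonal averaging (Hankelization) back to a series
        recon = np.zeros(n)
        counts = np.zeros(n)
        for j in range(k):
            recon[j:j + window] += Xr[:, j]
            counts[j:j + window] += 1
        recon /= counts
        y[missing] = recon[missing]
    return y
```

For a pure harmonic, a rank-2 reconstruction recovers a deleted sample almost exactly, which is why SSA-family methods work well on strongly periodic series.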

  9. Analysis of Land Subsidence Monitoring in Mining Area with Time-Series Insar Technology

    Science.gov (United States)

    Sun, N.; Wang, Y. J.

    2018-04-01

    Time-series InSAR technology has become a popular land subsidence monitoring method in recent years because of its advantages, such as high accuracy, wide coverage, low cost, dense monitoring points, and freedom from accessibility restrictions. In this paper, we applied two kinds of satellite data, ALOS PALSAR and RADARSAT-2, to obtain subsidence monitoring results for the study area in two time periods using time-series InSAR technology. By analyzing the deformation range, rate, and amount, a time-series analysis of land subsidence in the mining area was realized. The results show that InSAR technology can be used to monitor land subsidence over large areas and meets the demand of subsidence monitoring in mining areas.

  10. Feasibility of bioelectrical impedance analysis in persons with severe intellectual and visual disabilities

    NARCIS (Netherlands)

    Havinga-Top, Thamar; Waninge, Aly; van der Schans, Cees; Jager, Harriët

    2015-01-01

    Background: Body composition measurements provide important information about physical fitness and nutritional status. People with severe intellectual and visual disabilities (SIVD) have an increased risk for altered body composition. Bioelectrical impedance analysis (BIA) has been evidenced as a

  12. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.
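
The perturb-and-difference scheme described above is model-agnostic and easy to sketch; the toy linear model in the test below is a hypothetical stand-in for the MWA-ANN models, not the study's code.

```python
def forcing_sensitivity(model, forcing, eps=1e-4):
    """Finite-difference sensitivity of a black-box model.

    Perturbs each forcing value in turn and returns the change in
    model response per unit change in that forcing -- the same
    perturb-and-difference scheme the abstract describes.
    """
    base = model(forcing)
    sens = []
    for i in range(len(forcing)):
        bumped = list(forcing)
        bumped[i] += eps
        sens.append((model(bumped) - base) / eps)
    return sens
```

Applied along a sliding window of forcing values, this yields a time series of sensitivities like the one the study analyses.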

  13. Application of the Allan Variance to Time Series Analysis in Astrometry and Geodesy: A Review.

    Science.gov (United States)

    Malkin, Zinovy

    2016-04-01

    The Allan variance (AVAR) was introduced 50 years ago as a statistical tool for assessing the stability of frequency standards. Over the past decades, AVAR has increasingly been used in geodesy and astrometry to assess the noise characteristics in geodetic and astrometric time series. A specific feature of astrometric and geodetic measurements, as compared with clock measurements, is that they are generally associated with uncertainties; thus, an appropriate weighting should be applied during data analysis. In addition, some physically connected scalar time series naturally form series of multidimensional vectors. For example, the three station coordinate time series X, Y, and Z can be combined to analyze 3-D station position variations. The classical AVAR is not intended for processing unevenly weighted and/or multidimensional data. Therefore, AVAR modifications, namely weighted AVAR (WAVAR), multidimensional AVAR (MAVAR), and weighted multidimensional AVAR (WMAVAR), were introduced to overcome these deficiencies. In this paper, a brief review is given of the experience of using AVAR and its modifications in processing astrogeodetic time series.
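
The classical (unweighted, scalar) AVAR that these modifications generalize fits in a few lines; this is a minimal sketch of the non-overlapping estimator.

```python
def allan_variance(y, m):
    """Non-overlapping Allan variance at averaging factor m.

    Averages the series in blocks of length m and returns half the
    mean squared difference of consecutive block means -- the
    classical scalar AVAR; the weighted and multidimensional
    variants reviewed in the paper generalize this.
    """
    n_blocks = len(y) // m
    means = [sum(y[i * m:(i + 1) * m]) / m for i in range(n_blocks)]
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(n_blocks - 1)]
    return 0.5 * sum(diffs) / len(diffs)
```

Plotting this against m on log-log axes gives the familiar Allan deviation slope from which noise types are identified.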

  14. Social network analysis of character interaction in the Stargate and Star Trek television series

    Science.gov (United States)

    Tan, Melody Shi Ai; Ujum, Ephrance Abu; Ratnavelu, Kuru

    This paper undertakes a social network analysis of two science fiction television series, Stargate and Star Trek. Television series convey stories in the form of character interaction, which can be represented as “character networks”. We connect each pair of characters that exchanged spoken dialogue in any given scene demarcated in the television series transcripts. These networks are then used to characterize the overall structure and topology of each series. We find that the character networks of both series have similar structure and topology to those found in previous work on mythological and fictional networks. The character networks exhibit small-world effects, but we found no significant support for power-law behaviour. Since the progression of an episode depends to a large extent on the interaction between each of its characters, the underlying network structure tells us something about the complexity of that episode’s storyline. We assessed the complexity using techniques from spectral graph theory. We found that the episode networks are structured either as (1) closed networks, (2) those containing bottlenecks that connect otherwise disconnected clusters or (3) a mixture of both.

  15. Principal components and iterative regression analysis of geophysical series: Application to Sunspot number (1750 2004)

    Science.gov (United States)

    Nordemann, D. J. R.; Rigozo, N. R.; de Souza Echer, M. P.; Echer, E.

    2008-11-01

    We present here an implementation of a least squares iterative regression method applied to the sine functions embedded in the principal components extracted from geophysical time series. This method appears to be a useful improvement for the quantitative analysis of periodicities in non-stationary time series. The principal components determination followed by the least squares iterative regression method was implemented in an algorithm written in the Scilab (2006) language. The main result of the method is the set of sine functions embedded in the analyzed series, in decreasing order of significance: from the most important ones, likely to represent the physical processes involved in the generation of the series, to the less important ones that represent noise components. Taking into account the need for a deeper knowledge of the Sun's history and its implications for global climate change, the method was applied to the Sunspot Number series (1750-2004). With the threshold and parameter values used here, the application of the method leads to a total of 441 explicit sine functions, among which 65 were considered significant and were used for a reconstruction that gave a normalized mean squared error of 0.146.
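
One step of such an iterative sine regression can be sketched as a linear least-squares fit over a grid of trial frequencies (amplitude and phase are linear parameters once the frequency is fixed); this is a minimal Python sketch, not the authors' Scilab code, and the grid is an illustrative assumption.

```python
import numpy as np

def fit_dominant_sine(t, y, freqs):
    """Least-squares fit of a single sine over a grid of trial
    frequencies: for each frequency, amplitude and phase follow from
    a linear fit of a*sin + b*cos, and the frequency with the
    smallest residual wins. Iterating this on the residuals yields
    sine components in decreasing order of significance.
    """
    best = None
    for f in freqs:
        A = np.column_stack([np.sin(2 * np.pi * f * t),
                             np.cos(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = np.sum((y - A @ coef) ** 2)
        if best is None or resid < best[0]:
            best = (resid, f, coef)
    resid, f, (a, b) = best
    return float(f), float(np.hypot(a, b))  # frequency and amplitude
```

Subtracting each fitted sine and repeating the search is the iterative part of the method.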

  16. Interrupted time series analysis in drug utilization research is increasing: systematic review and recommendations.

    Science.gov (United States)

    Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M

    2015-08-01

    To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
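
The segmented-regression model most of these studies apply can be sketched as an ordinary least-squares fit with level- and trend-change terms; autocorrelation and seasonality adjustments, which the review recommends, are omitted from this minimal sketch.

```python
import numpy as np

def segmented_regression(y, intervention):
    """OLS fit of the standard interrupted-time-series model:
    y_t = b0 + b1*t + b2*step_t + b3*(t - intervention)*step_t,
    where step_t is 1 from the intervention onward. Returns
    (baseline level, baseline trend, level change, trend change).
    """
    t = np.arange(len(y), dtype=float)
    step = (t >= intervention).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - intervention) * step])
    coef, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coef
```

The level-change and trend-change coefficients are the quantities of interest when assessing, say, a drug policy change.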

  17. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku University, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • A method to optimize the parameter values of the distance is also proposed. • Numerical simulations indicate that analysis based on the distance is effective.
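
The distance proposed in the paper is not reproduced here; as an illustration of the general idea of distances between marked event sequences, the sketch below implements a Victor-Purpura-style edit distance with an assumed fixed mark-substitution cost.

```python
def marked_event_distance(a, b, q=1.0, mark_cost=1.0):
    """Edit distance between two marked point processes, given as
    lists of (time, mark) pairs sorted by time.

    Dynamic programme in the spirit of the Victor-Purpura spike-train
    metric: deleting or inserting an event costs 1, shifting an event
    by dt costs q*|dt|, and matching events with different marks adds
    mark_cost. (An illustrative stand-in, not the distance proposed
    in the paper.)
    """
    na, nb = len(a), len(b)
    D = [[0.0] * (nb + 1) for _ in range(na + 1)]
    for i in range(1, na + 1):
        D[i][0] = float(i)
    for j in range(1, nb + 1):
        D[0][j] = float(j)
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            ta, ma = a[i - 1]
            tb, mb = b[j - 1]
            match = D[i - 1][j - 1] + q * abs(ta - tb) \
                + (mark_cost if ma != mb else 0.0)
            D[i][j] = min(D[i - 1][j] + 1, D[i][j - 1] + 1, match)
    return D[na][nb]
```

With a distance like this in hand, nearest-neighbour methods from nonlinear time series analysis apply directly to event data.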

  18. Equipment of visualization environment of a large-scale structural analysis system. Visualization using AVS/Express of an ADVENTURE system

    International Nuclear Information System (INIS)

    Miyazaki, Mikiya

    2004-02-01

    Data visualization is performed in many research fields, and many special-purpose visualization packages exist today, but such packages provide interfaces to only a small number of solvers. In many simulations, data conversion between analysis and visualization software is therefore required in practice. This report describes the setup of a data visualization environment in which AVS/Express was installed, in response to many requests from users of the large-scale structural analysis system that is provided as an ITBL community software. This environment makes it possible to use the ITBL visualization server as a visualization device after computation on the ITBL computer. Moreover, wide use within the community is expected once it is merged into the ITBL/AVS environment in the future. (author)

  19. Integrated Analysis and Visualization of Group Differences in Structural and Functional Brain Connectivity: Applications in Typical Ageing and Schizophrenia.

    Directory of Open Access Journals (Sweden)

    Carolyn D Langen

    Full Text Available Structural and functional brain connectivity are increasingly used to identify and analyze group differences in studies of brain disease. This study presents methods to analyze uni- and bi-modal brain connectivity and evaluate their ability to identify differences. Novel visualizations of significantly different connections comparing multiple metrics are presented. On the global level, "bi-modal comparison plots" show the distribution of uni- and bi-modal group differences and the relationship between structure and function. Differences between brain lobes are visualized using "worm plots". Group differences in connections are examined with an existing visualization, the "connectogram". These visualizations were evaluated in two proof-of-concept studies: (1) middle-aged versus elderly subjects; and (2) patients with schizophrenia versus controls. Each included two measures derived from diffusion weighted images and two from functional magnetic resonance images. The structural measures were minimum cost path between two anatomical regions according to the "Statistical Analysis of Minimum cost path based Structural Connectivity" method and the average fractional anisotropy along the fiber. The functional measures were Pearson's correlation and partial correlation of mean regional time series. The relationship between structure and function was similar in both studies. Uni-modal group differences varied greatly between connectivity types. Group differences were identified in both studies globally, within brain lobes and between regions. In the aging study, minimum cost path was highly effective in identifying group differences on all levels; fractional anisotropy and mean correlation showed smaller differences on the brain lobe and regional levels. In the schizophrenia study, minimum cost path and fractional anisotropy showed differences on the global level and within brain lobes; mean correlation showed small differences on the lobe level. Only

  20. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

    One of the important issues that plays a significant role in the hydrological study of a basin is the determination of lag time. The lag time associated with a rainfall event depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Lag time is an important parameter in many projects, such as dam design, and also in water resource studies. The lag time of a basin can be calculated using various methods; one of these is time series analysis of spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions, and harmonically significant components with individual frequencies are identified. The spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time could be due to snowmelt, as well as to ice melting caused by rainfall on freezing days. In this research, the lag time of the Jahanbin basin, which receives both rainfall and snowfall, has been determined using the spectral density method. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 month, respectively.
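
A simple time-domain counterpart of the spectral-density lag estimate is the shift that maximizes the rainfall-runoff cross-correlation; this minimal sketch (with sample-index lags, not the paper's method) illustrates the idea.

```python
def lag_by_cross_correlation(rain, runoff):
    """Estimate basin lag as the shift (in samples) that maximizes
    the cross-correlation between rainfall and runoff -- a simple
    time-domain counterpart of the spectral-density phase approach.
    """
    n = len(rain)
    best_lag, best_corr = 0, float("-inf")
    for lag in range(n):
        # correlate rain[t] with runoff[t + lag]
        corr = sum(rain[t] * runoff[t + lag] for t in range(n - lag))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    return best_lag
```

In the spectral formulation, the same lag appears as the phase of the cross-spectrum at the dominant frequency divided by that frequency.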

  1. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    Science.gov (United States)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of 10^6 objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of time series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  2. Methotrexate therapy for chronic noninfectious uveitis: analysis of a case series of 160 patients.

    Science.gov (United States)

    Samson, C M; Waheed, N; Baltatzis, S; Foster, C S

    2001-06-01

    To evaluate the outcomes of patients with chronic noninfectious uveitis unresponsive to conventional antiinflammatory therapy who were treated with methotrexate. Retrospective noncomparative interventional case series. All patients with chronic noninfectious uveitis treated with methotrexate at a single institution from 1985 to 1999. Charts of patients seen on the Ocular Immunology & Uveitis Service at the Massachusetts Eye & Ear Infirmary were reviewed. Patients with chronic uveitis of noninfectious origin treated with methotrexate were included in the study. Control of inflammation, steroid-sparing effect, visual acuity, adverse reactions. A total of 160 patients met the inclusion criteria. Control of inflammation was achieved in 76.2% of patients. Steroid-sparing effect was achieved in 56% of patients. Visual acuity was maintained or improved in 90% of patients. Side effects requiring discontinuation of medication occurred in 18% of patients. Potentially serious adverse reactions occurred in only 8.1% of patients. There was neither long-term morbidity nor mortality caused by methotrexate. Methotrexate is effective in the treatment of chronic noninfectious uveitis that fails to respond to conventional steroid treatment. It is an effective steroid-sparing immunomodulator, is a safe medication, and is well tolerated.

  3. Visual Analysis of Speeding Bumps Using Floating Car Dataset

    DEFF Research Database (Denmark)

    Kveladze, Irma; Agerholm, Niels

    2018-01-01

    measures, and also their influence on the driving behaviour of vehicles from a temporal perspective. To fill this gap and understand these aspects, we propose visual analytics techniques in this research. To explore the influence of the bumps on the speed change behaviour of vehicles, several use case studies...... in three different cities were selected in close cooperation with a traffic domain expert. The aim was first to study the influence of the bumps on vehicle volume and speeding from a time perspective, and then to establish any connection between speed bump intervals and vehicle speeding in the selected use cases...

  4. Analysis of Tire Contact Parameters Using Visual Processing

    Directory of Open Access Journals (Sweden)

    Valentin Ivanov

    2010-01-01

    The first part of this paper presents the results of experimental estimation of the contact patch area as a function of normal wheel load and inflation pressure for different car tires. The data were obtained under test bench conditions on the basis of visual processing of the tread footprint. Further, the contact length in the cohesion area during wheel rolling, for single points on the tire profile, was chosen as a benchmark criterion. The paper analyzes the influence of wheel normal load and tire inflation pressure on the contact length at small rolling velocities. The results of the investigations are given for winter and racing tires with different grades of wear.

  5. The singular nature of auditory and visual scene analysis in autism.

    Science.gov (United States)

    Lin, I-Fan; Shirama, Aya; Kato, Nobumasa; Kashino, Makio

    2017-02-19

    Individuals with autism spectrum disorder often have difficulty acquiring relevant auditory and visual information in daily environments, despite not being diagnosed as hearing impaired or having low vision. Recent psychophysical and neurophysiological studies have shown that autistic individuals have highly specific individual differences at various levels of information processing, including feature extraction, automatic grouping and top-down modulation in auditory and visual scene analysis. Comparison of the characteristics of scene analysis between auditory and visual modalities reveals some essential commonalities, which could provide clues about the underlying neural mechanisms. Further progress in this line of research may suggest effective methods for diagnosing and supporting autistic individuals. This article is part of the themed issue 'Auditory and visual scene analysis'. © 2017 The Author(s).

  6. Fractal time series analysis of postural stability in elderly and control subjects

    Directory of Open Access Journals (Sweden)

    Doussot Michel

    2007-05-01

    Full Text Available Abstract Background The study of balance using stabilogram analysis is of particular interest in the study of falls. Although simple statistical parameters derived from the stabilogram have been shown to predict risk of falls, such measures offer little insight into the underlying control mechanisms responsible for degradation in balance. In contrast, fractal and non-linear time-series analysis of stabilograms, such as estimations of the Hurst exponent (H, may provide information related to the underlying motor control strategies governing postural stability. In order to be adapted for a home-based follow-up of balance, such methods need to be robust, regardless of the experimental protocol, while producing time-series that are as short as possible. The present study compares two methods of calculating H: Detrended Fluctuation Analysis (DFA and Stabilogram Diffusion Analysis (SDA for elderly and control subjects, as well as evaluating the effect of recording duration. Methods Centre of pressure signals were obtained from 90 young adult subjects and 10 elderly subjects. Data were sampled at 100 Hz for 30 s, including stepping onto and off the force plate. Estimations of H were made using sliding windows of 10, 5, and 2.5 s durations, with windows slid forward in 1-s increments. Multivariate analysis of variance was used to test for the effect of time, age and estimation method on the Hurst exponent, while the intra-class correlation coefficient (ICC was used as a measure of reliability. Results Both SDA and DFA methods were able to identify differences in postural stability between control and elderly subjects for time series as short as 5 s, with ICC values as high as 0.75 for DFA. Conclusion Both methods would be well-suited to non-invasive longitudinal assessment of balance. In addition, reliable estimations of H were obtained from time series as short as 5 s.
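
A minimal DFA sketch follows the recipe the abstract summarizes: integrate the mean-removed series, detrend fixed-size windows linearly, and read the scaling exponent off a log-log fit of fluctuation versus window size (the window sizes below are illustrative defaults).

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32)):
    """Detrended Fluctuation Analysis scaling exponent (~ Hurst H).

    Integrates the mean-removed series into a profile, splits the
    profile into non-overlapping windows at each scale, removes a
    linear trend per window, and fits the log-log slope of the RMS
    fluctuation against the scale.
    """
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        sq = []
        for w in range(n_win):
            seg = profile[w * s:(w + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)
            sq.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope
```

White noise yields an exponent near 0.5, while persistent (correlated) fluctuations in a stabilogram push it higher.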

  7. Savant Genome Browser 2: visualization and analysis for population-scale genomics.

    Science.gov (United States)

    Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael

    2012-07-01

    High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.

  8. Knowledge-assisted cross-media analysis of audio-visual content in the news domain

    NARCIS (Netherlands)

    Mezaris, Vasileios; Gidaros, Spyros; Papadopoulos, Georgios Th.; Kasper, Walter; Ordelman, Roeland J.F.; de Jong, Franciska M.G.; Kompatsiaris, Ioannis

    In this paper, a complete architecture for knowledge-assisted cross-media analysis of News-related multimedia content is presented, along with its constituent components. The proposed analysis architecture employs state-of-the-art methods for the analysis of each individual modality (visual, audio,

  9. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  10. Visual Analysis of MOOC Forums with iForum.

    Science.gov (United States)

    Fu, Siwei; Zhao, Jian; Cui, Weiwei; Qu, Huamin

    2017-01-01

    Discussion forums of Massive Open Online Courses (MOOC) provide great opportunities for students to interact with instructional staff as well as other students. Exploration of MOOC forum data can offer valuable insights for these staff to enhance the course and prepare the next release. However, it is challenging due to the large, complicated, and heterogeneous nature of relevant datasets, which contain multiple dynamically interacting objects such as users, posts, and threads, each one including multiple attributes. In this paper, we present a design study for developing an interactive visual analytics system, called iForum, that allows for effectively discovering and understanding temporal patterns in MOOC forums. The design study was conducted with three domain experts in an iterative manner over one year, including a MOOC instructor and two official teaching assistants. iForum offers a set of novel visualization designs for presenting the three interleaving aspects of MOOC forums (i.e., posts, users, and threads) at three different scales. To demonstrate the effectiveness and usefulness of iForum, we describe a case study involving field experts, in which they use iForum to investigate real MOOC forum data for a course on JAVA programming.

  11. Mutational Analysis of Drosophila Basigin Function in the Visual System

    Science.gov (United States)

    Munro, Michelle; Akkam, Yazan; Curtin, Kathryn D.

    2009-01-01

    Drosophila basigin is a cell-surface glycoprotein of the Ig superfamily and a member of a protein family that includes mammalian EMMPRIN/CD147/basigin, neuroplastin, and embigin. Our previous work on Drosophila basigin has shown that it is required for normal photoreceptor cell structure and normal neuron-glia interaction in the fly visual system. Specifically, the photoreceptor neurons of mosaic animals that are mutant in the eye for basigin show altered cell structure with nuclei, mitochondria and rER misplaced and variable axon diameter compared to wild-type. In addition, glial cells in the optic lamina that contact photoreceptor axons are misplaced and show altered structure. All these defects are rescued by expression of either transgenic fly basigin or transgenic mouse basigin in the photoreceptors, demonstrating that mouse basigin can functionally replace fly basigin. To determine what regions of the basigin protein are required for each of these functions, we have created mutant basigin transgenes coding for proteins that are altered in conserved residues, introduced these into the fly genome, and tested them for their ability to rescue both photoreceptor cell structure defects and neuron-glia interaction defects of basigin. The results suggest that the highly conserved transmembrane domain and the extracellular domains are crucial for basigin function in the visual system while the short intracellular tail may not play a role in these functions. PMID:19782733

  12. Dynamic analysis and pattern visualization of forest fires.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2014-01-01

    This paper analyses forest fires from the perspective of dynamical systems. Forest fires exhibit complex correlations in size, space and time, revealing features often present in complex systems, such as the absence of a characteristic length-scale, or the emergence of long range correlations and persistent memory. This study addresses a public domain forest fires catalogue, containing information of events for Portugal, during the period from 1980 up to 2012. The data is analysed on an annual basis, modelling the occurrences as sequences of Dirac impulses with amplitude proportional to the burnt area. First, we consider mutual information to correlate annual patterns. We use visualization trees, generated by hierarchical clustering algorithms, in order to compare and to extract relationships among the data. Second, we adopt the Multidimensional Scaling (MDS) visualization tool. MDS generates maps where each object corresponds to a point. Objects that are perceived to be similar to each other are placed on the map forming clusters. The results are analysed in order to extract relationships among the data and to identify forest fire patterns.
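
    The MDS step of such an analysis admits a compact sketch. Below is a minimal classical (Torgerson) MDS in NumPy, applied to a small invented dissimilarity matrix standing in for the mutual-information-based distances between annual fire patterns; the matrix values and the choice of classical MDS are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def classical_mds(d: np.ndarray, k: int = 2) -> np.ndarray:
    """Classical (Torgerson) MDS: embed n objects in k dimensions from a
    symmetric n-by-n dissimilarity matrix d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:k]         # keep the k largest eigenvalues
    return vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))

# Toy dissimilarities between four fire seasons: years 0/1 resemble each
# other, as do years 2/3 (values invented for illustration).
d = np.array([[0.0, 0.1, 0.9, 1.0],
              [0.1, 0.0, 0.8, 0.9],
              [0.9, 0.8, 0.0, 0.1],
              [1.0, 0.9, 0.1, 0.0]])
coords = classical_mds(d)
```

    On the resulting map, similar years land close together, forming the clusters the abstract describes.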

  13. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    Science.gov (United States)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Fulfilling these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time- and frequency-domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei, and gamma-ray bursts will be displayed.

  14. Flood Frequency Analysis For Partial Duration Series In Ganjiang River Basin

    Science.gov (United States)

    zhangli, Sun; xiufang, Zhu; yaozhong, Pan

    2016-04-01

    Accurate estimation of flood frequency is key to effective, nationwide flood damage abatement programs. The partial duration series (PDS) method is widely used in hydrologic studies because it considers all events above a certain threshold level as compared to the annual maximum series (AMS) method, which considers only the annual maximum value. However, the PDS has a drawback in that it is difficult to define the thresholds and maintain an independent and identical distribution of the partial duration time series; this drawback is discussed in this paper. The Ganjiang River is the seventh largest tributary of the Yangtze River, the longest river in China. The Ganjiang River covers a drainage area of 81,258 km2 at the Wanzhou hydrologic station as the basin outlet. In this work, 56 years of daily flow data (1954-2009) from the Wanzhou station were used to analyze flood frequency, and the Pearson-III model was employed as the hydrologic probability distribution. Generally, three tasks were accomplished: (1) the threshold of PDS by percentile rank of daily runoff was obtained; (2) trend analysis of the flow series was conducted using PDS; and (3) flood frequency analysis was conducted for partial duration flow series. The results showed a slight upward trend of the annual runoff in the Ganjiang River basin. The maximum flow with a 0.01 exceedance probability (corresponding to a 100-year flood peak under stationary conditions) was 20,000 m3/s, while that with a 0.1 exceedance probability was 15,000 m3/s. These results will serve as a guide to hydrological engineering planning, design, and management for policymakers and decision makers associated with hydrology.
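
    The peaks-over-threshold extraction behind task (1) can be sketched as a percentile-rank threshold on daily runoff plus a simple declustering gap that keeps the retained peaks approximately independent. The gap rule and all numbers below are illustrative assumptions, not the paper's exact criteria.

```python
import numpy as np

def partial_duration_series(flow, pct=99.0, min_gap=7):
    """Extract a partial-duration (peaks-over-threshold) series from daily
    flow: group exceedances of the pct-th percentile into clusters separated
    by at least min_gap non-exceedance days, and keep one peak per cluster
    as a crude independence criterion. Returns (threshold, [(day, peak)])."""
    flow = np.asarray(flow, dtype=float)
    threshold = np.percentile(flow, pct)
    exceed = flow > threshold
    peaks, i, n = [], 0, len(flow)
    while i < n:
        if exceed[i]:
            end = i
            while end < n:                       # extend cluster while another
                nxt = end + 1                    # exceedance lies within the gap
                window = exceed[nxt:nxt + min_gap]
                if window.any():
                    end = nxt + int(np.argmax(window))
                else:
                    break
            seg = flow[i:end + 1]
            peaks.append((i + int(np.argmax(seg)), float(seg.max())))
            i = end + min_gap + 1
        else:
            i += 1
    return threshold, peaks

# Synthetic year of daily flow (m3/s) with three flood spikes; the two
# spikes four days apart are merged into one event by the gap rule.
flow = np.full(365, 100.0)
flow[100], flow[104], flow[200] = 5000.0, 4500.0, 3000.0
threshold, peaks = partial_duration_series(flow, pct=99, min_gap=7)
```

    The retained peaks would then be fitted with a distribution such as Pearson-III to read off exceedance-probability quantiles.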

  15. Time-series analysis of climatologic measurements: a method to distinguish future climatic changes

    International Nuclear Information System (INIS)

    Duband, D.

    1992-01-01

    Time-series analysis of climatic parameters such as air temperature, river flow rates, and lake or sea levels is an indispensable basis for detecting a possible significant climatic change. These observations, when they are carefully analyzed and criticized, constitute the necessary reference for testing and validating numerical climatic models which try to simulate the physical and dynamical processes of the ocean-atmosphere couple, taking continents into account. 32 refs., 13 figs

  16. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    International Nuclear Information System (INIS)

    Hultkrantz, L.; Olsson, C.

    1997-01-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs
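
    The intervention-analysis idea (testing for a level shift at a known event date) can be illustrated with a bare-bones dummy-variable regression. In the published approach the noise term would be an ARIMA process; here ordinary least squares on synthetic data shows only the step-dummy logic, and all numbers are invented.

```python
import numpy as np

# Synthetic monthly series with a trend and a level shift of -3 at month 60,
# mimicking a deterrence effect that begins at the intervention date.
rng = np.random.default_rng(0)
n, t0 = 120, 60
t = np.arange(n)
step = (t >= t0).astype(float)                 # intervention (step) dummy
y = 10.0 + 0.1 * t - 3.0 * step + rng.normal(0, 0.5, n)

# OLS on [intercept, trend, step]; beta[2] estimates the intervention effect.
X = np.column_stack([np.ones(n), t, step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    A significantly negative step coefficient, after the noise structure is modelled properly, is the kind of evidence behind the paper's "enduring deterrence effect" finding.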

  17. On-line condition monitoring of nuclear systems via symbolic time series analysis

    International Nuclear Information System (INIS)

    Rajagopalan, V.; Ray, A.; Garcia, H. E.

    2006-01-01

    This paper provides a symbolic time series analysis approach to fault diagnostics and condition monitoring. The proposed technique is built upon concepts from wavelet theory, symbolic dynamics and pattern recognition. Various aspects of the methodology, such as wavelet selection, choice of alphabet, and determination of the depth of the D-Markov machine, are explained in the paper. The technique is validated with experiments performed in a Machine Condition Monitoring (MCM) test bed at the Idaho National Laboratory. (authors)
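
    Leaving the wavelet stage aside, the symbolization and D-Markov steps can be sketched directly: partition the signal range into equal-probability bins, map samples to symbols, and estimate the depth-1 transition matrix. The alphabet size and test signal below are illustrative choices, not the authors' settings.

```python
import numpy as np

def symbolize(x, alphabet_size=4):
    """Map each sample to a symbol 0..alphabet_size-1 using
    equal-probability (quantile) partitioning of the signal range."""
    edges = np.quantile(x, np.linspace(0, 1, alphabet_size + 1)[1:-1])
    return np.digitize(x, edges)

def transition_matrix(symbols, alphabet_size=4):
    """Estimate the depth-1 (D=1) Markov transition matrix: row i holds
    P(next symbol | current symbol = i)."""
    counts = np.zeros((alphabet_size, alphabet_size))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    rows = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Nominal-condition signal; in condition monitoring, a drift of P away
# from the nominal matrix would signal a developing fault.
t = np.linspace(0, 20 * np.pi, 2000)
P = transition_matrix(symbolize(np.sin(t)), 4)
```

    A fault indicator is then typically a distance between the current and nominal transition matrices (or their stationary distributions).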

  18. A Time Series Analysis to Asymmetric Marketing Competition Within a Market Structure

    OpenAIRE

    Francisco F. R. Ramos

    1996-01-01

    As a complement to existing studies of competitive market structure analysis, the present paper proposes a time series methodology to provide a more detailed picture of marketing competition in relation to competitive market structure. Two major hypotheses were tested as part of this project. First, it was found that some significant cross-lead and lag effects of marketing variables on sales between brands existed even between different submarkets. Second, it was found that high qual...

  19. Time series analysis in road safety research using state space methods

    OpenAIRE

    BIJLEVELD, FD

    2008-01-01

    In this thesis we present a comprehensive study into novel time series models for aggregated road safety data. The models are mainly intended for analysis of indicators relevant to road safety, with a particular focus on how to measure these factors. Such developments may need to be related to or explained by external influences. It is also possible to make forecasts using the models. Relevant indicators include the number of persons killed per month or year. These statistics are closely watch...

  20. Chernobyl effects on domestic and inbound tourism in Sweden. A time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hultkrantz, L. [Department of Economics, University of Uppsala, Uppsala (Sweden); Olsson, C. [Department of Economics, Umeaa University, Umeaa (Sweden)

    1997-03-01

    This paper estimates the impact of the Chernobyl nuclear accident on domestic and international tourism in Sweden. From ARIMA time series forecasts, outlier search, and intervention analysis based on regional monthly accommodation data from 1978-1989, no effect on domestic tourism is found. However, there is an enduring deterrence effect on incoming tourism. The loss of gross revenue from incoming tourism because of the Chernobyl accident is estimated at 2.5 billion SEK. 5 figs., 7 tabs., 1 appendix, 27 refs.

  1. Time Series Modeling of Army Mission Command Communication Networks: An Event-Driven Analysis

    Science.gov (United States)

    2013-06-01

    Lehmann, D. R. (1984). How advertising affects sales: Meta-analysis of econometric results. Journal of Marketing Research, 21, 65-74. Barabási, A. L...317-357. Leone, R. P. (1983). Modeling sales-advertising relationships: An integrated time series-econometric approach. Journal of Marketing Research, 20, 291-295. McGrath, J. E., & Kravitz, D. A. (1982). Group research. Annual Review of Psychology, 33, 195-230. Monge, P. R., & Contractor

  2. RVA. 3-D Visualization and Analysis Software to Support Management of Oil and Gas Resources

    Energy Technology Data Exchange (ETDEWEB)

    Keefer, Donald A. [Univ. of Illinois, Champaign, IL (United States); Shaffer, Eric G. [Univ. of Illinois, Champaign, IL (United States); Storsved, Brynne [Univ. of Illinois, Champaign, IL (United States); Vanmoer, Mark [Univ. of Illinois, Champaign, IL (United States); Angrave, Lawrence [Univ. of Illinois, Champaign, IL (United States); Damico, James R. [Univ. of Illinois, Champaign, IL (United States); Grigsby, Nathan [Univ. of Illinois, Champaign, IL (United States)

    2015-12-01

    A free software application, RVA, has been developed as a plugin to the US DOE-funded ParaView visualization package, to provide support in the visualization and analysis of complex reservoirs being managed using multi-fluid EOR techniques. RVA, for Reservoir Visualization and Analysis, was developed as an open-source plugin to the 64 bit Windows version of ParaView 3.14. RVA was developed at the University of Illinois at Urbana-Champaign, with contributions from the Illinois State Geological Survey, Department of Computer Science and National Center for Supercomputing Applications. RVA was designed to utilize and enhance the state-of-the-art visualization capabilities within ParaView, readily allowing joint visualization of geologic framework and reservoir fluid simulation model results. Particular emphasis was placed on enabling visualization and analysis of simulation results highlighting multiple fluid phases, multiple properties for each fluid phase (including flow lines), multiple geologic models and multiple time steps. Additional advanced functionality was provided through the development of custom code to implement data mining capabilities. The built-in functionality of ParaView provides the capacity to process and visualize data sets ranging from small models on local desktop systems to extremely large models created and stored on remote supercomputers. The RVA plugin that we developed and the associated User Manual provide improved functionality through new software tools and instruction in the use of ParaView-RVA, targeted to petroleum engineers and geologists in industry and research. The RVA web site (http://rva.cs.illinois.edu) provides an overview of functions, and the development web site (https://github.com/shaffer1/RVA) provides ready access to the source code, compiled binaries, user manual, and a suite of demonstration data sets. Key functionality has been included to support a range of reservoir visualization and analysis needs, including

  3. TIME SERIES ANALYSIS ON STOCK MARKET FOR TEXT MINING CORRELATION OF ECONOMY NEWS

    Directory of Open Access Journals (Sweden)

    Sadi Evren SEKER

    2014-01-01

    Full Text Available This paper proposes an information retrieval method for economy news. The effect of economy news is researched at the word level, and stock market values are considered as the ground truth. The correlation between stock market prices and economy news is an already addressed problem for most countries. The most well-known approach is applying text mining to the news and some time series analysis techniques over stock market closing values, in order to apply classification or clustering algorithms over the features extracted. This study goes further and asks the question: what are the available time series analysis techniques for stock market closing values, and which one is the most suitable? In this study, the news and their dates are collected into a database and text mining is applied over the news; the text mining part has been kept simple, with only the term frequency-inverse document frequency method. For the time series analysis part, we have studied 10 different methods, such as random walk, moving average, acceleration, Bollinger band, price rate of change, periodic average, difference, momentum, and relative strength index, and their variations. In this study we have also explained these techniques in a comparative way, and we have applied the methods over Turkish stock market closing values for a period of more than 2 years. On the other hand, we have applied the term frequency-inverse document frequency method on the economy news of one of the highest-circulation newspapers in Turkey.
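
    A few of the ten price indicators named above are simple enough to sketch directly. The versions below (simple moving average, momentum, and a simplified single-value RSI) are generic textbook forms, not necessarily the exact variants used in the study.

```python
import numpy as np

def moving_average(close, window):
    """Simple moving average of closing prices."""
    return np.convolve(close, np.ones(window) / window, mode="valid")

def momentum(close, lag):
    """Momentum: today's close minus the close `lag` days ago."""
    return close[lag:] - close[:-lag]

def rsi(close, period=14):
    """Relative Strength Index, simplified to one value from average
    gains/losses over the whole series (not Wilder's smoothing)."""
    delta = np.diff(close)
    gain = delta[delta > 0].sum() / period
    loss = -delta[delta < 0].sum() / period
    if loss == 0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + gain / loss)

# Hypothetical closing prices for a short window.
close = np.array([10, 11, 12, 11, 13, 14, 13, 15, 16, 15], dtype=float)
ma3 = moving_average(close, 3)
```

    Features like these, aligned by date with the TF-IDF vectors of the news, form the combined input for the classification or clustering step.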

  4. The Fourier decomposition method for nonlinear and non-stationary time series analysis.

    Science.gov (United States)

    Singh, Pushpendra; Joshi, Shiv Dutt; Patney, Rakesh Kumar; Saha, Kaushik

    2017-03-01

    For many decades, there has been a general perception in the literature that Fourier methods are not suitable for the analysis of nonlinear and non-stationary data. In this paper, we propose a novel and adaptive Fourier decomposition method (FDM), based on the Fourier theory, and demonstrate its efficacy for the analysis of nonlinear and non-stationary time series. The proposed FDM decomposes any data into a small number of 'Fourier intrinsic band functions' (FIBFs). The FDM presents a generalized Fourier expansion with variable amplitudes and variable frequencies of a time series by the Fourier method itself. We propose an idea of zero-phase filter bank-based multivariate FDM (MFDM), for the analysis of multivariate nonlinear and non-stationary time series, using the FDM. We also present an algorithm to obtain cut-off frequencies for MFDM. The proposed MFDM generates a finite number of band-limited multivariate FIBFs (MFIBFs). The MFDM preserves some intrinsic physical properties of the multivariate data, such as scale alignment, trend and instantaneous frequency. The proposed methods provide a time-frequency-energy (TFE) distribution that reveals the intrinsic structure of the data. Numerical computations and simulations have been carried out and comparison is made with the empirical mode decomposition algorithms.
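
    The zero-phase filter-bank idea can be illustrated with ideal FFT masks: split the spectrum at chosen cut-off frequencies and invert each band, so that the band-limited components sum back to the signal exactly. This is a sketch of the concept only, not the FDM/MFIBF algorithm itself, and the cut-off value is an invented example.

```python
import numpy as np

def fft_band_decompose(x, cutoffs):
    """Split a real signal into band-limited components using ideal
    (zero-phase) FFT masks; cutoffs are normalized frequencies in (0, 0.5)
    defining the band edges. The components sum back to the signal."""
    n = len(x)
    freqs = np.fft.rfftfreq(n)                     # cycles per sample
    spectrum = np.fft.rfft(x)
    edges = [0.0, *cutoffs, 0.5 + 1e-12]
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)        # each bin in exactly one band
        bands.append(np.fft.irfft(spectrum * mask, n))
    return bands

# Two-tone test signal: a slow and a fast sinusoid, split at 0.02 cycles/sample.
t = np.arange(1024) / 1024.0
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
bands = fft_band_decompose(x, cutoffs=[0.02])
```

    Because the masks are applied to the magnitude spectrum symmetrically, each component is filtered with zero phase distortion, which is the property the MFDM filter bank relies on.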

  5. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis.

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distracters are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distracters. However, because the meta-analysis was based on small studies and because of the risk for publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently from their effects on the MMN.

  6. Visual Task Demands and the Auditory Mismatch Negativity: An Empirical Study and a Meta-Analysis

    Science.gov (United States)

    Wiens, Stefan; Szychowska, Malina; Nilsson, Mats E.

    2016-01-01

    Because the auditory system is particularly useful in monitoring the environment, previous research has examined whether task-irrelevant, auditory distracters are processed even if subjects focus their attention on visual stimuli. This research suggests that attentionally demanding visual tasks decrease the auditory mismatch negativity (MMN) to simultaneously presented auditory distractors. Because a recent behavioral study found that high visual perceptual load decreased detection sensitivity of simultaneous tones, we used a similar task (n = 28) to determine if high visual perceptual load would reduce the auditory MMN. Results suggested that perceptual load did not decrease the MMN. At face value, these nonsignificant findings may suggest that effects of perceptual load on the MMN are smaller than those of other demanding visual tasks. If so, effect sizes should differ systematically between the present and previous studies. We conducted a selective meta-analysis of published studies in which the MMN was derived from the EEG, the visual task demands were continuous and varied between high and low within the same task, and the task-irrelevant tones were presented in a typical oddball paradigm simultaneously with the visual stimuli. Because the meta-analysis suggested that the present (null) findings did not differ systematically from previous findings, the available evidence was combined. Results of this meta-analysis confirmed that demanding visual tasks reduce the MMN to auditory distracters. However, because the meta-analysis was based on small studies and because of the risk for publication biases, future studies should be preregistered with large samples (n > 150) to provide confirmatory evidence for the results of the present meta-analysis. These future studies should also use control conditions that reduce confounding effects of neural adaptation, and use load manipulations that are defined independently from their effects on the MMN. PMID:26741815

  7. Trend Estimation and Regression Analysis in Climatological Time Series: An Application of Structural Time Series Models and the Kalman Filter.

    Science.gov (United States)

    Visser, H.; Molenaar, J.

    1995-05-01

    The detection of trends in climatological data has become central to the discussion on climate change due to the enhanced greenhouse effect. To prove detection, a method is needed (i) to make inferences on significant rises or declines in trends, (ii) to take into account natural variability in climate series, and (iii) to compare output from GCMs with the trends in observed climate data. To meet these requirements, flexible mathematical tools are needed. A structural time series model is proposed with which a stochastic trend, a deterministic trend, and regression coefficients can be estimated simultaneously. The stochastic trend component is described using the class of ARIMA models. The regression component is assumed to be linear. However, the regression coefficients corresponding with the explanatory variables may be time dependent to validate this assumption. The mathematical technique used to estimate this trend-regression model is the Kalman filter. The main features of the filter are discussed.Examples of trend estimation are given using annual mean temperatures at a single station in the Netherlands (1706-1990) and annual mean temperatures at Northern Hemisphere land stations (1851-1990). The inclusion of explanatory variables is shown by regressing the latter temperature series on four variables: Southern Oscillation index (SOI), volcanic dust index (VDI), sunspot numbers (SSN), and a simulated temperature signal, induced by increasing greenhouse gases (GHG). In all analyses, the influence of SSN on global temperatures is found to be negligible. The correlations between temperatures and SOI and VDI appear to be negative. For SOI, this correlation is significant, but for VDI it is not, probably because of a lack of volcanic eruptions during the sample period. The relation between temperatures and GHG is positive, which is in agreement with the hypothesis of a warming climate because of increasing levels of greenhouse gases.
The prediction performance of
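
    The Kalman filter recursions underlying such structural models follow the standard predict/update cycle. The sketch below runs a local-linear-trend filter over a synthetic warming series; the noise variances and data are invented, and this is a generic filter, not the authors' full trend-regression model.

```python
import numpy as np

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Standard Kalman filter for x_t = F x_{t-1} + w_t, y_t = H x_t + v_t;
    returns the filtered state means at each step."""
    x, P = x0.copy(), P0.copy()
    out = []
    for obs in y:
        x = F @ x                                  # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                        # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + K @ (obs - H @ x)                  # update
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Local linear trend: state = [level, slope]; only the level is observed.
F = np.array([[1.0, 1.0], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = np.diag([1e-4, 1e-6])                          # assumed state noise
R = np.array([[0.25]])                             # assumed observation noise
rng = np.random.default_rng(1)
y = 0.05 * np.arange(200) + rng.normal(0, 0.5, 200)  # trend of 0.05 per step
states = kalman_filter(y[:, None], F, H, Q, R, np.zeros(2), np.eye(2))
```

    The filtered slope component is the quantity one would inspect for a significant rise, requirement (i) in the abstract.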

  8. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver [Technical Univ. of Darmstadt (Germany)

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework MATLAB and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatic researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  9. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    International Nuclear Information System (INIS)

    Ruebel, Oliver

    2009-01-01

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework MATLAB and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatic researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  10. Time series analysis as input for clinical predictive modeling: modeling cardiac arrest in a pediatric ICU.

    Science.gov (United States)

    Kennedy, Curtis E; Turley, James P

    2011-10-24

    Thousands of children experience cardiac arrest events every year in pediatric intensive care units. Most of these children die. Cardiac arrest prediction tools are used as part of medical emergency team evaluations to identify patients in standard hospital beds that are at high risk for cardiac arrest. There are no models to predict cardiac arrest in pediatric intensive care units though, where the risk of an arrest is 10 times higher than for standard hospital beds. Current tools are based on a multivariable approach that does not characterize deterioration, which often precedes cardiac arrests. Characterizing deterioration requires a time series approach. The purpose of this study is to propose a method that will allow for time series data to be used in clinical prediction models. Successful implementation of these methods has the potential to bring arrest prediction to the pediatric intensive care environment, possibly allowing for interventions that can save lives and prevent disabilities. We reviewed prediction models from nonclinical domains that employ time series data, and identified the steps that are necessary for building predictive models using time series clinical data. We illustrate the method by applying it to the specific case of building a predictive model for cardiac arrest in a pediatric intensive care unit. Time course analysis studies from genomic analysis provided a modeling template that was compatible with the steps required to develop a model from clinical time series data. The steps include: 1) selecting candidate variables; 2) specifying measurement parameters; 3) defining data format; 4) defining time window duration and resolution; 5) calculating latent variables for candidate variables not directly measured; 6) calculating time series features as latent variables; 7) creating data subsets to measure model performance effects attributable to various classes of candidate variables; 8) reducing the number of candidate features; 9
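
    Step 6, computing time series features as latent variables, can be sketched with a sliding window over a vital-sign trace. The feature choice (mean, standard deviation, least-squares slope), the window settings, and the heart-rate values below are all hypothetical illustrations, not the study's actual variables.

```python
import numpy as np

def window_features(signal, window, step):
    """Slide a window over a vital-sign series and compute simple
    time-series features per window: mean, standard deviation, and
    least-squares slope (a crude 'deterioration' indicator)."""
    feats = []
    t = np.arange(window)
    for start in range(0, len(signal) - window + 1, step):
        seg = np.asarray(signal[start:start + window], dtype=float)
        feats.append({
            "mean": seg.mean(),
            "sd": seg.std(),
            "slope": np.polyfit(t, seg, 1)[0],
        })
    return feats

# Hypothetical heart-rate trace drifting upward before an event.
hr = [80, 81, 80, 82, 84, 87, 91, 96, 102, 109]
feats = window_features(hr, window=5, step=5)
```

    A rising slope across consecutive windows is exactly the kind of deterioration signal that a multivariable snapshot model cannot capture.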

  11. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lutaif, N.A. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil); Palazzo, R. Jr [Departamento de Telemática, Faculdade de Engenharia Elétrica e Computação, Universidade Estadual de Campinas, Campinas, SP (Brazil); Gontijo, J.A.R. [Departamento de Clínica Médica, Faculdade de Ciências Médicas, Universidade Estadual de Campinas, Campinas, SP (Brazil)

    2014-01-17

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.
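
    The autoregressive modeling step can be sketched with a least-squares AR(1) fit; tracking the fitted coefficient and residual spread over time would flag a change in thermal dynamics. The model order, parameters, and synthetic series below are illustrative, not those estimated in the study.

```python
import numpy as np

def fit_ar1(x):
    """Least-squares fit of x_t = c + a*x_{t-1} + e_t;
    returns (c, a, residual standard deviation)."""
    x = np.asarray(x, dtype=float)
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (c, a), *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - (c + a * x[:-1])
    return c, a, resid.std()

# Synthetic core-temperature series with known AR(1) dynamics
# (stationary mean 3.7 / (1 - 0.9) = 37.0 degrees C).
rng = np.random.default_rng(2)
x = np.empty(2000)
x[0] = 37.0
for t in range(1, len(x)):
    x[t] = 3.7 + 0.9 * x[t - 1] + rng.normal(0, 0.05)
c, a, sd = fit_ar1(x)
```

    A diet-induced change in thermal regulation would show up as a shift in the recovered coefficient or in the residual variability of registers recorded after the intervention.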

  12. Early detection of metabolic and energy disorders by thermal time series stochastic complexity analysis

    International Nuclear Information System (INIS)

    Lutaif, N.A.; Palazzo, R. Jr; Gontijo, J.A.R.

    2014-01-01

    Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance. The thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of high-energy diet content. Observation of thermal registers of core temperature behavior, in humans and rodents, permits identification of some characteristics of time series, such as autoreference and stationarity that fit adequately to a stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, the concepts of which match those associated with physiological systems involved and applied in male HFD rats compared with their appropriate standard food intake age-matched male controls (n=7 per group). By analyzing a recorded temperature time series, we were able to identify when thermal homeostasis would be affected by a new diet. The autoregressive time series model (AR model) was used to predict the occurrence of thermal homeostasis, and this model proved to be very effective in distinguishing such a physiological disorder. Thus, we infer from the results of our study that maximum entropy distribution as a means for stochastic characterization of temperature time series registers may be established as an important and early tool to aid in the diagnosis and prevention of metabolic diseases due to their ability to detect small variations in thermal profile.

  13. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
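The time-series component added to MODVOLC compares each observation against a reference built from the pixel's own monthly history, in the spirit of the Robust Satellite Techniques. A minimal sketch of that idea follows; the data, the threshold k, and the use of a plain mean and standard deviation in place of RST's robust statistics are all assumptions.

```python
import numpy as np

def rst_anomalies(series, months, k=2.0):
    """Flag observations exceeding the monthly reference mean by k standard
    deviations -- a simplified Robust-Satellite-Techniques-style test."""
    series = np.asarray(series, float)
    months = np.asarray(months)
    flags = np.zeros(series.shape, dtype=bool)
    for m in np.unique(months):
        sel = months == m
        ref_mean, ref_std = series[sel].mean(), series[sel].std()
        flags[sel] = (series[sel] - ref_mean) > k * ref_std
    return flags

# Hypothetical brightness-temperature record: month 1 contains one hot outlier.
temps  = [270, 271, 269, 270, 300, 268, 283, 284, 282, 283, 285, 284]
months = [1, 1, 1, 1, 1, 1, 7, 7, 7, 7, 7, 7]
flags = rst_anomalies(temps, months, k=2.0)
```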

  14. Spatial analysis of precipitation time series over the Upper Indus Basin

    Science.gov (United States)

    Latif, Yasir; Yaoming, Ma; Yaseen, Muhammad

    2018-01-01

    The upper Indus basin (UIB) holds one of the most substantial river systems in the world, contributing roughly half of the available surface water in Pakistan. This water provides necessary support for agriculture, domestic consumption, and hydropower generation; all critical for a stable economy in Pakistan. This study has identified trends, analyzed variability, and assessed changes in both annual and seasonal precipitation during four time series, identified herein as: (first) 1961-2013, (second) 1971-2013, (third) 1981-2013, and (fourth) 1991-2013, over the UIB. This study investigated spatial characteristics of the precipitation time series over 15 weather stations and provides strong evidence of annual precipitation by determining significant trends at 6 stations (Astore, Chilas, Dir, Drosh, Gupis, and Kakul) out of the 15 studied stations, revealing a significant negative trend during the fourth time series. Our study also showed significantly increased precipitation at Bunji, Chitral, and Skardu, whereas such trends at the rest of the stations appear insignificant. Moreover, our study found that seasonal precipitation decreased at some locations (at a high level of significance), as well as periods of scarce precipitation during all four seasons. The observed decreases in precipitation appear stronger and more significant in autumn; having 10 stations exhibiting decreasing precipitation during the fourth time series, with respect to time and space. Furthermore, the observed decreases in precipitation appear robust and more significant for regions at high elevation (>1300 m). This analysis concludes that decreasing precipitation dominated the UIB, both temporally and spatially including in the higher areas.
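The abstract does not name the trend test used, but the non-parametric Mann-Kendall test is the standard choice for detecting monotonic trends in precipitation records, so a sketch of it may be a useful companion. The annual precipitation values below are hypothetical.

```python
import numpy as np

def mann_kendall(x):
    """Mann-Kendall S statistic and normal-approximation Z score (no ties)."""
    x = np.asarray(x, float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Hypothetical annual precipitation (mm) with a clear downward drift.
precip = [820, 800, 790, 805, 770, 760, 755, 740, 735, 720]
s, z = mann_kendall(precip)
```

A |Z| above 1.96 corresponds to a trend significant at the 5% level under the normal approximation.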

  15. Long Term Results of Visual Field Progression Analysis in Open Angle Glaucoma Patients Under Treatment.

    Science.gov (United States)

    Kocatürk, Tolga; Bekmez, Sinan; Katrancı, Merve; Çakmak, Harun; Dayanır, Volkan

    2015-01-01

    To evaluate visual field progression with trend and event analysis in open angle glaucoma patients under treatment. Fifteen-year follow-up results of 408 eyes of 217 glaucoma patients who were followed at Adnan Menderes University, Department of Ophthalmology, between 1998 and 2013 were analyzed retrospectively. Visual field data were collected for Mean Deviation (MD), Visual Field Index (VFI), and event occurrence. There were 146 primary open-angle glaucoma (POAG), 123 pseudoexfoliative glaucoma (XFG) and 139 normal tension glaucoma (NTG) eyes. MD showed significant change in all diagnostic groups (p<[…] field indices. We herein report our fifteen-year follow-up results in open angle glaucoma.

  16. Visual analysis of transcriptome data in the context of anatomical structures and biological networks

    Directory of Open Access Journals (Sweden)

    Astrid Junker

    2012-11-01

    The complexity and temporal as well as spatial resolution of transcriptome datasets are constantly increasing due to extensive technological developments. Here we present methods for advanced visualization and intuitive exploration of transcriptomics data as necessary prerequisites for facilitating the gain of biological knowledge. Color-coding of structural images based on the expression level enables fast visual data analysis against the background of the examined biological system. The network-based exploration of these visualizations allows for comparative analysis of genes with specific transcript patterns and supports the extraction of functional relationships even from large datasets. In order to illustrate the presented methods, the tool HIVE was applied for visualization and exploration of database-retrieved expression data for master regulators of Arabidopsis thaliana flower and seed development in the context of corresponding tissue-specific regulatory networks.

  17. In vivo visualization of RNA in plants cells using the λN₂₂ system and a GATEWAY-compatible vector series for candidate RNAs.

    Science.gov (United States)

    Schönberger, Johannes; Hammes, Ulrich Z; Dresselhaus, Thomas

    2012-07-01

    The past decade has seen a tremendous increase in RNA research, which has demonstrated that RNAs are involved in many more processes than were previously thought. The dynamics of RNA synthesis towards their regulated activity requires the interplay of RNAs with numerous RNA binding proteins (RBPs). The localization of RNA, a mechanism for controlling translation in a spatial and temporal fashion, requires processing and assembly of RNA into transport granules in the nucleus, transport towards cytoplasmic destinations and regulation of its activity. Compared with animal model systems little is known about RNA dynamics and motility in plants. Commonly used methods to study RNA transport and localization are time-consuming, and require expensive equipment and a high level of experimental skill. Here, we introduce the λN₂₂ RNA stem-loop binding system for the in vivo visualization of RNA in plant cells. The λN₂₂ system consists of two components: the λN₂₂ RNA binding peptide and the corresponding box-B stem loops. We generated fusions of λN₂₂ to different fluorophores and a GATEWAY vector series for the simple fusion of any target RNA 5' or 3' to box-B stem loops. We show that the λN₂₂ system can be used to detect RNAs in transient expression assays, and that it offers advantages compared with the previously described MS2 system. Furthermore, the λN₂₂ system can be used in combination with the MS2 system to visualize different RNAs simultaneously in the same cell. The toolbox of vectors generated for both systems is easy to use and promises significant progress in our understanding of RNA transport and localization in plant cells. © 2012 The Authors. The Plant Journal © 2012 Blackwell Publishing Ltd.

  18. Spectral analysis of uneven time series of geological variables; Analisis espectral de series temporales de variables geologicas con muestreo irregular

    Energy Technology Data Exchange (ETDEWEB)

    Pardo-Iguzquiza, E.; Rodriguez-Tovar, F. J.

    2013-06-01

    In geosciences the sampling of a time series tends to afford uneven results, sometimes because the sampling itself is random or because of hiatuses or even completely missing data or due to difficulties involved in the conversion of data from a spatial to a time scale when the sedimentation rate was not constant. Whatever the case, the best solution does not lie in interpolation but rather in resorting to a method that deals with the irregular data. We show here how the use of the smoothed Lomb-Scargle periodogram is both a practical and efficient choice. We describe the effects on the estimated power spectrum of the type of irregular sampling, the number of data, interpolation, and the presence of drift. We propose the permutation test as being an efficient way of calculating statistical confidence levels. By applying the Lomb-Scargle periodogram to a synthetic series with a known spectral content we are able to confirm the validity of this method in the face of the difficulties mentioned above. A case study with real data, including hiatuses, representing the thickness of the annual banding in a stalagmite, is chosen to demonstrate an application using the statistical and physical interpretation of spectral peaks. (Author)
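The (unsmoothed) Lomb-Scargle periodogram itself is compact enough to sketch directly. The irregular sampling pattern and the 0.2-cycle signal below are synthetic, in the spirit of the validation on a series with known spectral content described above.

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle normalized periodogram for unevenly sampled data."""
    y = y - y.mean()
    var = y.var()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        # Time offset tau makes the sine and cosine terms orthogonal.
        tau = np.arctan2(np.sum(np.sin(2 * w * t)), np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s)) / (2 * var)
    return power

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 300))                    # irregular sampling times
y = np.sin(2 * np.pi * 0.2 * t) + 0.3 * rng.standard_normal(300)
freqs = np.linspace(0.01, 0.5, 500)
power = lomb_scargle(t, y, freqs)
best = freqs[np.argmax(power)]
```

The permutation test proposed in the paper would shuffle `y` against `t` many times and compare the observed peak against the resulting null distribution.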

  19. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Silva, Claudio [New York Univ. (NYU), NY (United States). Computer Science and Engineering Dept.

    2013-09-30

    For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application–programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.

  20. Studies in astronomical time series analysis. I - Modeling random processes in the time domain

    Science.gov (United States)

    Scargle, J. D.

    1981-01-01

    Several random process models in the time domain are defined and discussed. Attention is given to the moving average model, the autoregressive model, and relationships between and combinations of these models. Consideration is then given to methods for investigating pulse structure, procedures of model construction, computational methods, and numerical experiments. A FORTRAN algorithm for time series analysis has been developed which is relatively stable numerically. Results of test cases are given to study the effect of adding noise and of different distributions for the pulse amplitudes. A preliminary analysis of the light curve of the quasar 3C 273 is considered as an example.
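The two model families discussed here are easy to simulate; a small sketch (coefficients chosen arbitrarily) shows their signature autocorrelation behaviour: an MA(1) correlates over exactly one lag, while an AR(1) has geometrically decaying memory.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
e = rng.standard_normal(n + 5)

# Moving-average MA(1): x[t] = e[t] + b*e[t-1]  (memory of exactly one lag)
b = 0.6
ma = e[1:n + 1] + b * e[:n]

# Autoregressive AR(1): y[t] = a*y[t-1] + e[t]  (exponentially decaying memory)
a = 0.8
ar = np.empty(n)
ar[0] = e[0]
for t in range(1, n):
    ar[t] = a * ar[t - 1] + e[t]

def autocorr(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)
```

Theory gives lag-1 autocorrelation b/(1+b^2) ≈ 0.44 for the MA(1), near zero at lag 2, and about a = 0.8 for the AR(1).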

  1. The Real-time Frequency Spectrum Analysis of Neutron Pulse Signal Series

    International Nuclear Information System (INIS)

    Tang Yuelin; Ren Yong; Wei Biao; Feng Peng; Mi Deling; Pan Yingjun; Li Jiansheng; Ye Cenming

    2009-01-01

    The frequency spectrum analysis of neutron pulse signals is a very important method in nuclear stochastic signal processing. Focused on the special '0' and '1' structure of neutron pulse signal series, this paper proposes a new rotation table and realizes a real-time frequency spectrum algorithm at a 1 GHz sample rate on a PC, using add and address operations and SSE instructions. The numerical experimental results show that, at a count rate of 3×10^6 s^-1, this algorithm is superior to FFTW in time consumption and can meet the real-time requirements of frequency spectrum analysis. (authors)
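The key observation, that a series containing only '0' and '1' lets the DFT be computed as a table lookup and summation over the pulse positions alone, can be sketched as follows. Array sizes and the pulse count are arbitrary, and the paper's SSE-optimized rotation table is reduced here to a plain precomputed matrix.

```python
import numpy as np

n = 1024
rng = np.random.default_rng(3)
pulses = np.zeros(n)
hits = rng.choice(n, size=40, replace=False)   # sparse '1' positions
pulses[hits] = 1.0

# Because the series is 0/1, the DFT collapses to a sum over pulse positions:
#   X[k] = sum over t in hits of exp(-2*pi*i*k*t/n)
k = np.arange(n)
table = np.exp(-2j * np.pi * np.outer(k, np.arange(n)) / n)  # precomputed "rotation table"
sparse_dft = table[:, hits].sum(axis=1)

# Reference: full FFT of the dense series gives the same spectrum.
full_dft = np.fft.fft(pulses)
```

With few pulses, summing 40 table columns is far cheaper than transforming all 1024 samples.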

  2. The singular nature of auditory and visual scene analysis in autism

    OpenAIRE

    Lin, I.-Fan; Shirama, Aya; Kato, Nobumasa; Kashino, Makio

    2017-01-01

    Individuals with autism spectrum disorder often have difficulty acquiring relevant auditory and visual information in daily environments, despite not being diagnosed as hearing impaired or having low vision. Recent psychophysical and neurophysiological studies have shown that autistic individuals have highly specific individual differences at various levels of information processing, including feature extraction, automatic grouping and top-down modulation in auditory and visual scene analysis...

  3. Special Issue of Selected Papers from Visualization and Data Analysis 2011

    Science.gov (United States)

    Kao, David L.; Wong, Pak Chung

    2012-01-01

    This special issue features the best papers selected from the 18th SPIE Conference on Visualization and Data Analysis (VDA 2011). This annual conference is a major international forum for researchers and practitioners interested in data visualization and analytics research, development, and applications. VDA 2011 received 42 high-quality submissions from around the world, of which twenty-four were accepted as full conference papers. The top five papers have been expanded and reviewed for this special issue.

  4. On the Impact of a Quadratic Acceleration Term in the Analysis of Position Time Series

    Science.gov (United States)

    Bogusz, Janusz; Klos, Anna; Bos, Machiel Simon; Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    The analysis of Global Navigation Satellite System (GNSS) position time series generally assumes that each of the coordinate component series is described by the sum of a linear rate (velocity) and various periodic terms. The residuals, the deviations between the fitted model and the observations, are then a measure of the epoch-to-epoch scatter and have been used for the analysis of the stochastic character (noise) of the time series. Often the parameters of interest in GNSS position time series are the velocities and their associated uncertainties, which have to be determined with the highest reliability. It is clear that not all GNSS position time series follow this simple linear behaviour. Therefore, we have added an acceleration term in the form of a quadratic polynomial function to the model in order to better describe the non-linear motion in the position time series. This non-linear motion could be a response to purely geophysical processes, for example, elastic rebound of the Earth's crust due to ice mass loss in Greenland, artefacts due to deficiencies in bias mitigation models, for example, of the GNSS satellite and receiver antenna phase centres, or any combination thereof. In this study we have simulated 20 time series, each 23 years long, with different stochastic characteristics such as white, flicker or random walk noise. The noise amplitude was assumed to be 1 mm/y^(-κ/4). Then, we added the deterministic part, consisting of a linear trend of 20 mm/y (representing the average horizontal velocity) and accelerations ranging from -0.6 to +0.6 mm/y^2. For all these data we estimated the noise parameters with Maximum Likelihood Estimation (MLE) using the Hector software package without taking the non-linear term into account. In this way we set the benchmark against which to investigate how the noise properties and velocity uncertainty may be affected by any un-modelled, non-linear term. The velocities and their uncertainties versus the accelerations for
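The effect of an un-modelled acceleration on the estimated velocity can be sketched with ordinary least squares. White noise only and arbitrary parameter values; the study itself uses MLE with the Hector package over coloured-noise models.

```python
import numpy as np

t = np.linspace(0, 23, 23 * 52)                  # 23 years of weekly positions
rng = np.random.default_rng(7)
true_v, true_a = 20.0, 0.4                       # mm/yr and mm/yr^2
pos = true_v * t + 0.5 * true_a * t**2 + 1.5 * rng.standard_normal(t.size)

# Linear-only model: the unmodelled acceleration biases the velocity estimate
# (for a span T the bias is roughly a*T/2, here about +4.6 mm/yr).
v_lin = np.polyfit(t, pos, 1)[0]

# Adding the quadratic term recovers both velocity and acceleration.
a2, v_quad, _ = np.polyfit(t, pos, 2)
accel = 2 * a2
```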

  5. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on the function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternative method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variation data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  6. Brain SPECT in mesial temporal lobe epilepsy: comparison between visual analysis and SPM (Statistical Parametric Mapping)

    Energy Technology Data Exchange (ETDEWEB)

    Amorim, Barbara Juarez; Ramos, Celso Dario; Santos, Allan Oliveira dos; Lima, Mariana da Cunha Lopes de; Camargo, Edwaldo Eduardo; Etchebehere, Elba Cristina Sa de Camargo, E-mail: juarezbarbara@hotmail.co [State University of Campinas (UNICAMP), SP (Brazil). School of Medical Sciences. Dept. of Radiology; Min, Li Li; Cendes, Fernando [State University of Campinas (UNICAMP), SP (Brazil). School of Medical Sciences. Dept. of Neurology

    2010-04-15

    Objective: to compare the accuracy of SPM and visual analysis of brain SPECT in patients with mesial temporal lobe epilepsy (MTLE). Method: interictal and ictal SPECTs of 22 patients with MTLE were performed. Visual analyses were performed in interictal (VISUAL(inter)) and ictal (VISUAL(ictal/inter)) studies. SPM analysis consisted of comparing the interictal (SPM(inter)) and ictal SPECTs (SPM(ictal)) of each patient to a control group, and of comparing the perfusion of the temporal lobes between ictal and interictal studies (SPM(ictal/inter)). Results: for detection of the epileptogenic focus, the sensitivities were as follows: VISUAL(inter)=68%; VISUAL(ictal/inter)=100%; SPM(inter)=45%; SPM(ictal)=64% and SPM(ictal/inter)=77%. SPM was able to detect more areas of hyperperfusion and hypoperfusion. Conclusion: SPM did not improve the sensitivity of detecting the epileptogenic focus. However, SPM detected different regions of hypoperfusion and hyperperfusion and is therefore a helpful tool for better understanding the pathophysiology of seizures in MTLE. (author)

  7. Behavior analysis for elderly care using a network of low-resolution visual sensors

    Science.gov (United States)

    Eldib, Mohamed; Deboeverie, Francis; Philips, Wilfried; Aghajan, Hamid

    2016-07-01

    Recent advancements in visual sensor technologies have made behavior analysis practical for in-home monitoring systems. Current in-home monitoring systems face several challenges: (1) visual sensor calibration is a difficult task and not practical in real life because of the need for recalibration when the visual sensors are accidentally moved by a caregiver or the senior citizen, (2) privacy concerns, and (3) high hardware installation cost. We propose to use a network of cheap low-resolution visual sensors (30×30 pixels) for long-term behavior analysis. The behavior analysis starts with visual feature selection based on foreground/background detection to track the motion level in each visual sensor. Then a hidden Markov model (HMM) is used to estimate the user's locations without calibration. Finally, an activity discovery approach is proposed using spatial and temporal contexts. We performed experiments on 10 months of real-life data. We show that the HMM approach outperforms the k-nearest neighbor classifier against ground truth for 30 days. Our framework is able to discover 13 activities of daily living (ADL) parameters. More specifically, we analyze mobility patterns and some of the key ADL parameters to detect improving or declining health conditions.
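The HMM location-estimation step can be illustrated with a textbook Viterbi decoder. Everything below is a toy: two hidden rooms, observations reduced to "which sensor saw the most motion", and made-up transition and emission probabilities; the paper's actual state space and training are more involved.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete-observation HMM."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])      # log-probability of best path per state
    back = np.zeros((T, N), dtype=int)            # best predecessor per (time, state)
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)        # scores[i, j]: state i -> state j
        back[t] = scores.argmax(axis=0)
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Hypothetical two-room setup: states {0: kitchen, 1: bedroom}; the observation
# is the index of the sensor reporting the highest motion level.
pi = np.array([0.5, 0.5])
A = np.array([[0.9, 0.1],      # rooms change rarely
              [0.1, 0.9]])
B = np.array([[0.8, 0.2],      # kitchen sensor fires mostly in state 0
              [0.2, 0.8]])     # bedroom sensor fires mostly in state 1
obs = [0, 0, 0, 1, 1, 1]
path = viterbi(obs, pi, A, B)
```

The sticky transition matrix makes the decoder smooth over isolated noisy observations instead of jumping rooms on every sensor glitch.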

  8. Brain SPECT in mesial temporal lobe epilepsy: comparison between visual analysis and SPM (Statistical Parametric Mapping)

    International Nuclear Information System (INIS)

    Amorim, Barbara Juarez; Ramos, Celso Dario; Santos, Allan Oliveira dos; Lima, Mariana da Cunha Lopes de; Camargo, Edwaldo Eduardo; Etchebehere, Elba Cristina Sa de Camargo; Min, Li Li; Cendes, Fernando

    2010-01-01

    Objective: to compare the accuracy of SPM and visual analysis of brain SPECT in patients with mesial temporal lobe epilepsy (MTLE). Method: interictal and ictal SPECTs of 22 patients with MTLE were performed. Visual analyses were performed in interictal (VISUAL(inter)) and ictal (VISUAL(ictal/inter)) studies. SPM analysis consisted of comparing the interictal (SPM(inter)) and ictal SPECTs (SPM(ictal)) of each patient to a control group, and of comparing the perfusion of the temporal lobes between ictal and interictal studies (SPM(ictal/inter)). Results: for detection of the epileptogenic focus, the sensitivities were as follows: VISUAL(inter)=68%; VISUAL(ictal/inter)=100%; SPM(inter)=45%; SPM(ictal)=64% and SPM(ictal/inter)=77%. SPM was able to detect more areas of hyperperfusion and hypoperfusion. Conclusion: SPM did not improve the sensitivity of detecting the epileptogenic focus. However, SPM detected different regions of hypoperfusion and hyperperfusion and is therefore a helpful tool for better understanding the pathophysiology of seizures in MTLE. (author)

  9. Development of indicators of vegetation recovery based on time series analysis of SPOT Vegetation data

    Science.gov (United States)

    Lhermitte, S.; Tips, M.; Verbesselt, J.; Jonckheere, I.; Van Aardt, J.; Coppin, Pol

    2005-10-01

    Large-scale wildfires have direct impacts on natural ecosystems and play a major role in vegetation ecology and the carbon budget. Accurate methods for describing post-fire development of vegetation are therefore essential for the understanding and monitoring of terrestrial ecosystems. Time series analysis of satellite imagery offers the potential to quantify these parameters with spatial and temporal accuracy. Current research focuses on the potential of time series analysis of SPOT Vegetation S10 data (1999-2001) to quantify the vegetation recovery of large-scale burns detected in the framework of GBA2000. The objective of this study was to provide quantitative estimates of the spatio-temporal variation of vegetation recovery based on remote sensing indicators. Southern Africa was used as a pilot study area, given the availability of ground and satellite data. An automated technique was developed to extract consistent indicators of vegetation recovery from the SPOT-VGT time series. Reference areas were used to quantify the vegetation regrowth by means of Regeneration Indices (RI). Two kinds of recovery indicators (time- and value-based) were tested for RIs of NDVI, SR, SAVI, NDWI, and pure band information. The effects of vegetation structure and temporal fire regime features on the recovery indicators were subsequently analyzed. Statistical analyses were conducted to assess whether the recovery indicators differed between vegetation types and depended on the timing of the burning season. Results highlighted the importance of appropriate reference areas and of correct normalization of the SPOT-VGT data.
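The Regeneration Index concept, the vegetation index of burned pixels relative to an unburned reference area, can be sketched as below. The formula (a simple ratio) and the reflectance values are assumptions for illustration; the paper evaluates several indices (NDVI, SR, SAVI, NDWI) within an automated extraction pipeline.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def regeneration_index(burn_nir, burn_red, ref_nir, ref_red):
    """RI as the ratio of burned-area NDVI to reference-area NDVI, so that
    RI approaches 1 as the burn recovers to the reference condition."""
    return ndvi(burn_nir, burn_red) / ndvi(ref_nir, ref_red)

# Hypothetical composites for four dates after a burn: NIR reflectance of the
# burned area recovers toward the unburned reference.
burn_nir = np.array([0.25, 0.30, 0.35, 0.40])
burn_red = np.array([0.15, 0.13, 0.11, 0.10])
ri = regeneration_index(burn_nir, burn_red, ref_nir=0.45, ref_red=0.08)
```

A time-based indicator would then be, for example, the date at which RI crosses a recovery threshold; a value-based indicator, the RI reached after a fixed interval.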

  10. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    Science.gov (United States)

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks have occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach its peak value, but remained 9.0% greater than the predicted level after the temporary effect gradually decayed. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
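The intervention-analysis idea, adding a dummy regressor that switches on at the event date so that its coefficient measures the jump, can be sketched in its simplest level-shift form. All numbers below are synthetic, and the paper combines the intervention term with a full time-series model rather than plain regression.

```python
import numpy as np

rng = np.random.default_rng(11)
n, t0 = 120, 80                          # monthly series; intervention at month t0
t = np.arange(n)
step = (t >= t0).astype(float)           # level-shift intervention dummy
# Synthetic conditional-probability series with a jump at the intervention.
y = 0.30 + 0.16 * step + 0.02 * rng.standard_normal(n)

# Regress on [1, step]: the step coefficient estimates the intervention effect.
X = np.column_stack([np.ones(n), step])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
baseline, effect = beta
```

A decaying (pulse) intervention would instead use a regressor like `delta**(t - t0)` after the event, letting the effect fade at rate `delta`.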

  11. Comparative performance analysis of shunt and series passive filter for LED lamp

    Science.gov (United States)

    Sarwono, Edi; Facta, Mochammad; Handoko, Susatyo

    2018-03-01

    The Light Emitting Diode (LED) lamp is nowadays widely used by consumers as a new innovation in lighting technology, owing to the energy saving of low-power lamps with high light intensity. However, the LED lamp produces an electric pollutant known as harmonics, generated by the rectifier that is part of the LED lamp circuit. The presence of harmonics in current or voltage distorts the source waveform from the grid. This distortion may cause inaccurate measurement, malfunction, and excessive heating of elements on the grid. This paper presents an analysis of shunt and series filters used to suppress the harmonics generated by the LED lamp circuit. The work was initiated by conducting several tests to investigate the harmonic content of voltages and currents. The measurements in this work were carried out using a HIOKI Power Quality Analyzer 3197. The measurement results showed that the harmonic currents of the tested LED lamps were above the limits of IEEE Standard 519-2014. Based on the measurement results, shunt and series filters were constructed as low-pass filters. Bode analysis was applied during construction to predict the filters' performance. Based on the experimental results, the application of the shunt filter at the input side of the LED lamp reduced the THD of the current by up to 88%, while the series filter reduced it by up to 92%.
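A minimal sketch of why a low-pass LC structure suppresses rectifier harmonics: for an ideal series-inductor, shunt-capacitor filter the voltage transfer magnitude is 1/|1 − ω²LC|, so components well above the corner frequency are strongly attenuated. The component values below are hypothetical (not those of the paper), and a practical filter needs damping near resonance.

```python
import numpy as np

def lc_lowpass_gain(f, L, C):
    """|H(jw)| of an ideal (undamped) series-L, shunt-C low-pass filter."""
    w = 2 * np.pi * np.asarray(f, float)
    return 1.0 / np.abs(1.0 - w**2 * L * C)

# Hypothetical values placing the corner near 100 Hz, between the 50 Hz
# fundamental and the rectifier's odd harmonics.
L, C = 0.1, 25.3e-6                          # 100 mH, 25.3 uF
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))      # resonant (corner) frequency
gain_fund = lc_lowpass_gain(50.0, L, C)      # fundamental: passes (slight resonant boost)
gain_h5 = lc_lowpass_gain(250.0, L, C)       # 5th harmonic: strongly attenuated
gain_h7 = lc_lowpass_gain(350.0, L, C)       # 7th harmonic: attenuated further
```

Plotting this magnitude over frequency is exactly the Bode-style check the paper describes for predicting filter performance before construction.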

  12. Work-related accidents among the Iranian population: a time series analysis, 2000-2011.

    Science.gov (United States)

    Karimlou, Masoud; Salehi, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box-Jenkins modeling to develop a time series model of the total number of accidents. There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. The final model showed that time series analysis using ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 demonstrated the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection.
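A full seasonal ARIMA fit needs a dedicated library, but two pieces of the Box-Jenkins workflow above are easy to sketch: the regular-plus-seasonal differencing implied by ARIMA(1,1,1)×(0,1,1)12, and the MAPE used to report forecast accuracy. The accident series below is synthetic.

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, the accuracy metric reported above."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def difference(x, lag):
    """x[t] - x[t-lag]: lag 1 removes trend, lag 12 removes annual seasonality."""
    x = np.asarray(x, float)
    return x[lag:] - x[:-lag]

# Toy monthly series: level + annual cycle + linear trend. One regular and one
# seasonal difference (the d=1, D=1, s=12 of the model) leave it stationary.
months = np.arange(48)
accidents = 1476 + 30 * np.sin(2 * np.pi * months / 12) + 2 * months
stationary = difference(difference(accidents, 1), 12)
```

On this noise-free toy the doubly differenced series is identically zero; on real data the remaining variation is what the AR and MA terms model.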

  13. Work-related accidents among the Iranian population: a time series analysis, 2000–2011

    Science.gov (United States)

    Karimlou, Masoud; Imani, Mehdi; Hosseini, Agha-Fatemeh; Dehnad, Afsaneh; Vahabi, Nasim; Bakhtiyari, Mahmood

    2015-01-01

    Background Work-related accidents result in human suffering and economic losses and are considered a major health problem worldwide, especially in the economically developing world. Objectives To introduce seasonal autoregressive integrated moving average (ARIMA) models for time series analysis of work-related accident data for workers insured by the Iranian Social Security Organization (ISSO) between 2000 and 2011. Methods In this retrospective study, all insured people experiencing at least one work-related accident during a 10-year period were included in the analyses. We used Box–Jenkins modeling to develop a time series model of the total number of accidents. Results There was an average of 1476 accidents per month (1476·05±458·77, mean±SD). The final ARIMA (p,d,q) (P,D,Q)s model fitted to the data was ARIMA(1,1,1)×(0,1,1)12, consisting of the first ordering of the autoregressive, moving average and seasonal moving average parameters, with a mean absolute percentage error (MAPE) of 20·942. Conclusions The final model showed that time series analysis using ARIMA models is useful for forecasting the number of work-related accidents in Iran. In addition, the forecasted number of work-related accidents for 2011 demonstrated the stability of occurrence of these accidents in recent years, indicating a need for preventive occupational health and safety policies such as safety inspection. PMID:26119774

  14. Interpretative Visual Analysis. Developments, State of the Art and Pending Problems

    Directory of Open Access Journals (Sweden)

    Bernt Schnettler

    2008-09-01

Full Text Available The article offers a brief summary of recent developments in the field of interpretative visual analysis, with emphasis on the German-speaking area and the sociological discipline. It lays a special focus on hermeneutical and genre analysis and on research with audiovisual data. Far from constituting an already closed field, the authors stress the fact that methodological advances in qualitative research based on visual data still face a number of open questions. These encompass the sequentiality, complexity and naturalness of videographic data, and extend to the respective methodological challenges for transcription, analysis and presentation of results. URN: urn:nbn:de:0114-fqs0803314

  15. Empirical mode decomposition and long-range correlation analysis of sunspot time series

    International Nuclear Information System (INIS)

    Zhou, Yu; Leung, Yee

    2010-01-01

Sunspots, which are the best known and most variable features of the solar surface, affect our planet in many ways. The number of sunspots during a period of time is highly variable and arouses strong research interest. When multifractal detrended fluctuation analysis (MF-DFA) is employed to study the fractal properties and long-range correlation of the sunspot series, some spurious crossover points might appear because of the periodic and quasi-periodic trends in the series. However, many cycles of solar activity are reflected in the sunspot time series. The 11-year cycle is perhaps the most famous cycle of sunspot activity. These cycles pose problems for the investigation of the scaling behavior of sunspot time series, and different methods of handling the 11-year cycle generally produce totally different results. Using MF-DFA, Movahed and co-workers employed Fourier truncation to deal with the 11-year cycle and found that the series is long-range anti-correlated with a Hurst exponent, H, of about 0.12. However, Hu and co-workers proposed an adaptive detrending method for the MF-DFA and discovered long-range correlation characterized by H≈0.74. In an attempt to get to the bottom of the problem, in the present paper empirical mode decomposition (EMD), a data-driven adaptive method, is applied first to extract the components with different dominant frequencies. MF-DFA is then employed to study the long-range correlation of the sunspot time series under the influence of these components. On removing the effects of these periods, the natural long-range correlation of the sunspot time series can be revealed. With the removal of the 11-year cycle, a crossover point located at around 60 months is discovered to be a reasonable point separating two different time scale ranges, H≈0.72 and H≈1.49. On removing all cycles longer than 11 years, we have H≈0.69 and H≈0.28. The three cycle-removing methods—Fourier truncation, adaptive detrending and the
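The Hurst exponents quoted above come from the log-log slope of a detrended fluctuation function against window size. As a rough illustration, here is plain (monofractal) DFA-1 — not the multifractal MF-DFA used in the paper — applied to synthetic white noise, for which the exponent should land near 0.5:

```python
import math
import random

def dfa_exponent(x, scales):
    """Plain DFA-1: integrate the series, linearly detrend it in windows of
    size n, and fit log F(n) ~ H * log n over the given scales."""
    mean = sum(x) / len(x)
    y, run = [], 0.0
    for v in x:                      # integrated profile
        run += v - mean
        y.append(run)
    logs, logF = [], []
    for n in scales:
        msq = []
        for start in range(0, len(y) - n + 1, n):   # non-overlapping windows
            seg = y[start:start + n]
            t_mean = (n - 1) / 2
            s_mean = sum(seg) / n
            num = sum((t - t_mean) * (v - s_mean) for t, v in enumerate(seg))
            den = sum((t - t_mean) ** 2 for t in range(n))
            slope = num / den        # least-squares linear trend of the window
            resid = [v - (s_mean + slope * (t - t_mean)) for t, v in enumerate(seg)]
            msq.append(sum(r * r for r in resid) / n)
        F = math.sqrt(sum(msq) / len(msq))
        logs.append(math.log(n))
        logF.append(math.log(F))
    lm, fm = sum(logs) / len(logs), sum(logF) / len(logF)
    return sum((a - lm) * (b - fm) for a, b in zip(logs, logF)) / \
           sum((a - lm) ** 2 for a in logs)

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(4096)]
H = dfa_exponent(noise, [16, 32, 64, 128, 256])
print(round(H, 2))  # uncorrelated noise should give H close to 0.5
```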

  16. Porcupine : A visual pipeline tool for neuroimaging analysis

    NARCIS (Netherlands)

    van Mourik, Tim; Snoek, Lukas; Knapen, T; Norris, David G

    The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer

  17. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  18. Detrended fluctuation analysis based on higher-order moments of financial time series

    Science.gov (United States)

    Teng, Yue; Shang, Pengjian

    2018-01-01

In this paper, a generalized method of detrended fluctuation analysis (DFA) is proposed as a new measure to assess the complexity of a complex dynamical system such as the stock market. We extend DFA and local scaling DFA to higher moments such as skewness and kurtosis (labeled SMDFA and KMDFA), so as to investigate the volatility scaling properties of financial time series. Simulations are conducted over synthetic and financial data to provide a comparative study. We further report the results of volatility behaviors in three American, three Chinese and three European stock markets using the DFA and LSDFA methods based on higher moments. They demonstrate the dynamic behaviors of time series in different aspects, which can quantify the changes in complexity of stock market data and provide more meaningful information than a single exponent. The results also reveal some higher-moment volatility and higher-moment multiscale volatility details that cannot be obtained using the traditional DFA method.
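SMDFA and KMDFA generalize DFA's usual second-moment (variance-based) fluctuation statistic to the third and fourth moments. As a minimal reminder of what those per-window statistics are, here is a sketch of sample skewness and kurtosis on illustrative data:

```python
def sample_moments(x):
    """Mean, variance, skewness and kurtosis of a sample
    (population normalization, for illustration only)."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    m3 = sum((v - mean) ** 3 for v in x) / n    # third central moment
    m4 = sum((v - mean) ** 4 for v in x) / n    # fourth central moment
    skew = m3 / var ** 1.5
    kurt = m4 / var ** 2
    return mean, var, skew, kurt

# A symmetric sample: skewness must be exactly zero.
mean, var, skew, kurt = sample_moments([1, 2, 3, 4, 5])
print(mean, var, skew, kurt)  # → 3.0 2.0 0.0 1.7
```

In the SMDFA/KMDFA setting, statistics like these replace the mean-square residual inside each detrending window before the scaling fit.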

  19. Time series analysis of pressure fluctuation in gas-solid fluidized beds

    Directory of Open Access Journals (Sweden)

    C. Alberto S. Felipe

    2004-09-01

Full Text Available The purpose of the present work was to study the differentiation of states of typical fluidization (single bubble, multiple bubble and slugging) in a gas-solid fluidized bed, using spectral analysis of pressure fluctuation time series. The effects of the method of measuring (differential and absolute) pressure fluctuations and of the axial position of the probes in the fluidization column on the identification of each of the regimes studied were evaluated. The Fast Fourier Transform (FFT), which expresses the behavior of a time series in the frequency domain, was the mathematical tool used to analyse the pressure fluctuation data. Results indicated that the plenum chamber was a place for reliable measurement and that care should be taken with measurements in the dense phase. The method allowed fluid dynamic regimes to be differentiated by their dominant frequency characteristics.
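As an illustration of the spectral approach, the sketch below picks out the dominant frequency of a synthetic signal with a discrete Fourier transform. The sampling rate and frequencies are made up, and a real analysis would use an optimized FFT routine rather than this O(n²) DFT:

```python
import cmath
import math

def dominant_frequency(signal, fs):
    """Return the frequency (Hz) of the largest-magnitude DFT bin, excluding DC."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

fs = 64.0  # Hz, hypothetical sampling rate chosen so the test tones fall on exact bins
t = [i / fs for i in range(64)]
# Synthetic "pressure fluctuation": a strong 3 Hz oscillation plus a weak 11 Hz ripple.
x = [math.sin(2 * math.pi * 3.0 * ti) + 0.2 * math.sin(2 * math.pi * 11.0 * ti)
     for ti in t]
print(dominant_frequency(x, fs))  # → 3.0
```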

  20. Analysis of Data from a Series of Events by a Geometric Process Model

    Institute of Scientific and Technical Information of China (English)

    Yeh Lam; Li-xing Zhu; Jennifer S. K. Chan; Qun Liu

    2004-01-01

Geometric process was first introduced by Lam[10,11]. A stochastic process {Xi, i = 1, 2,…} is called a geometric process (GP) if, for some a > 0, {a^(i−1)Xi, i = 1, 2,…} forms a renewal process. In this paper, the GP is used to analyze the data from a series of events. A nonparametric method is introduced for the estimation of the three parameters in the GP. The limiting distributions of the three estimators are studied. Through the analysis of some real data sets, the GP model is compared with three other homogeneous and nonhomogeneous Poisson models. It seems that on average the GP model is the best model among these four models in analyzing the data from a series of events.
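One simple nonparametric route to the ratio a — not necessarily the exact estimator of the paper — follows from the definition: if {a^(i−1) Xi} is a renewal process, then E[log Xi] is linear in i with slope −log a, so a least-squares slope recovers a. A sketch on simulated data:

```python
import math
import random

def estimate_gp_ratio(x):
    """Least-squares slope of log X_i against i; for a GP the slope is -log a,
    so a_hat = exp(-slope)."""
    n = len(x)
    ys = [math.log(v) for v in x]
    i_mean = (n - 1) / 2
    y_mean = sum(ys) / n
    num = sum((i - i_mean) * (y - y_mean) for i, y in enumerate(ys))
    den = sum((i - i_mean) ** 2 for i in range(n))
    return math.exp(-num / den)

random.seed(7)
a = 1.05
# Simulate X_i = a^-(i-1) * Y_i with i.i.d. exponential Y_i,
# so that {a^(i-1) X_i} is a renewal process by construction.
x = [a ** (-i) * random.expovariate(1 / 30.0) for i in range(2000)]
a_hat = estimate_gp_ratio(x)
print(round(a_hat, 3))  # recovers a ratio close to the true a = 1.05
```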

  1. The Relative Importance of the Service Sector in the Mexican Economy: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Flores

    2014-01-01

Full Text Available We conduct a study of the secondary and tertiary sectors with the goal of highlighting the relative importance of services in the Mexican economy. We consider a time series analysis approach designed to identify the stochastic nature of the series, as well as to define their long-run and short-run relationships with Gross Domestic Product (GDP). The results of cointegration tests suggest that, for the most part, activities in the secondary and tertiary sectors share a common trend with GDP. Interestingly, the long-run elasticities of GDP with respect to services are on average larger than those with respect to secondary activities. Common cycle tests identify the existence of common cycles between GDP and the disaggregated sectors, as well as with manufacturing, commerce, real estate and transportation. In this case, the short-run elasticities of secondary activities are on average larger than those corresponding to services.

  2. Investigation of interfacial wave structure using time-series analysis techniques

    International Nuclear Information System (INIS)

    Jayanti, S.; Hewitt, G.F.; Cliffe, K.A.

    1990-09-01

    The report presents an investigation into the interfacial structure in horizontal annular flow using spectral and time-series analysis techniques. Film thickness measured using conductance probes shows an interesting transition in wave pattern from a continuous low-frequency wave pattern to an intermittent, high-frequency one. From the autospectral density function of the film thickness, it appears that this transition is caused by the breaking up of long waves into smaller ones. To investigate the possibility of the wave structure being represented as a low order chaotic system, phase portraits of the time series were constructed using the technique developed by Broomhead and co-workers (1986, 1987 and 1989). These showed a banded structure when waves of relatively high frequency were filtered out. Although these results are encouraging, further work is needed to characterise the attractor. (Author)
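The phase portraits mentioned above start from delay vectors built out of the scalar film-thickness signal; Broomhead and co-workers then apply an SVD to the matrix of such vectors. The sketch below shows only the basic embedding step, on a synthetic sine wave:

```python
import math

def delay_embed(x, dim=3, tau=5):
    """Time-delay embedding: map sample t to the vector
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    last = len(x) - (dim - 1) * tau
    return [tuple(x[t + k * tau] for k in range(dim)) for t in range(last)]

# A clean periodic "film thickness" signal; its 2-D delay portrait traces an ellipse.
wave = [math.sin(0.2 * t) for t in range(200)]
portrait = delay_embed(wave, dim=2, tau=8)
print(len(portrait))  # → 192 (200 - (2-1)*8 embedded points)
```

For real film-thickness data the embedding dimension and delay must be chosen carefully, and the SVD step of Broomhead et al. is then used to find a noise-robust basis for the reconstructed attractor.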

  3. Robustness Analysis of Visual QA Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-09-14

Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research only focuses on accuracy because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  4. Robustness Analysis of Visual Question Answering Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-11-01

Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research only focuses on accuracy because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  5. Visualization and analysis of lipopolysaccharide distribution in binary phospholipid bilayers

    Energy Technology Data Exchange (ETDEWEB)

    Henning, Maria Florencia [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Sanchez, Susana [Laboratory for Fluorescence Dynamics, University of California-Irvine, Irvine, CA (United States); Bakas, Laura, E-mail: lbakas@biol.unlp.edu.ar [Instituto de Investigaciones Bioquimicas La Plata (INIBIOLP), CCT-La Plata, CONICET, Facultad de Ciencias Medicas, UNLP, Calles 60 y 120, 1900 La Plata (Argentina); Departamento de Ciencias Biologicas, Facultad de Ciencias Exactas, UNLP, Calles 47 y 115, 1900 La Plata (Argentina)

    2009-05-22

Lipopolysaccharide (LPS) is an endotoxin released from the outer membrane of Gram-negative bacteria during infections. It has been reported that LPS may play a role in the outer membrane of bacteria similar to that of cholesterol in eukaryotic plasma membranes. In this article we compare the effect of introducing LPS or cholesterol in liposomes made of dipalmitoylphosphatidylcholine/dioleoylphosphatidylcholine on the solubilization process by Triton X-100. The results show that liposomes containing LPS or cholesterol are more resistant to solubilization by Triton X-100 than the binary phospholipid mixtures at 4 °C. The LPS distribution was analyzed on GUVs of DPPC:DOPC using FITC-LPS. Solid and liquid-crystalline domains were visualized by labeling the GUVs with LAURDAN, and GP images were acquired using a two-photon microscope. The images show a selective distribution of LPS in gel domains. Our results support the hypothesis that LPS could aggregate and concentrate selectively in biological membranes, providing a mechanism to bring together several components of the LPS-sensing machinery.

  6. Robustness Analysis of Visual Question Answering Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong

    2017-01-01

Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research only focuses on accuracy because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  7. Robustness Analysis of Visual QA Models by Basic Questions

    KAUST Repository

    Huang, Jia-Hong; Alfadly, Modar; Ghanem, Bernard

    2017-01-01

Visual Question Answering (VQA) models should have both high robustness and accuracy. Unfortunately, most of the current VQA research only focuses on accuracy because there is a lack of proper methods to measure the robustness of VQA models. There are two main modules in our algorithm. Given a natural language question about an image, the first module takes the question as input and then outputs the ranked basic questions, with similarity scores, of the main given question. The second module takes the main question, image and these basic questions as input and then outputs the text-based answer of the main question about the given image. We claim that a robust VQA model is one whose performance is not changed much when related basic questions are also made available to it as input. We formulate the basic question generation problem as a LASSO optimization problem, and also propose a large-scale Basic Question Dataset (BQD) and Rscore (a novel robustness measure) for analyzing the robustness of VQA models. We hope our BQD will be used as a benchmark to evaluate the robustness of VQA models, so as to help the community build more robust and accurate VQA models.

  8. Analysis on causes of visual fatigue in 3 502 cases

    Directory of Open Access Journals (Sweden)

    Yue-Lan Feng

    2016-02-01

Full Text Available AIM: To observe the causes of visual fatigue in the Inner Mongolian population in order to provide epidemiological data for the prevention and treatment of asthenopia. METHODS: This was a retrospective case-controlled study. From January 2011 to December 2014, the clinical data of 3 502 patients aged 7-50 who were diagnosed with asthenopia were analyzed. The subjects were divided into 4 groups according to age: (1) 7-20 years old: 712 cases; (2) 21-30 years old: 603 cases; (3) 31-40 years old: 694 cases; (4) 41-50 years old: 1 493 cases. The patients were examined for the condition of the anterior and posterior segments, Schirmer Ⅰ test, break-up time, computer optometry, subjective refraction, horizontal convergence and divergence, distance and near phoria, near point of convergence, accommodative convergence/accommodation ratio, accommodative facility, relative accommodation, amplitude of accommodation and accommodative response. The causes of asthenopia were analyzed by the Kruskal-Wallis H test first; comparisons among groups were then conducted by the Nemenyi test. RESULTS: The causes of asthenopia were eye-related diseases (49.37%), ametropia (23.36%), accommodation and convergence dysfunction (21.70%) and disorders of extraocular muscle function (5.57%). The Kruskal-Wallis H test showed significant differences in the prevalence of asthenopia caused by the different reasons among the four groups (P…). CONCLUSION: The causes of asthenopia are eye-related diseases, ametropia, accommodation and convergence dysfunction and disorders of extraocular muscle function.

  9. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximum likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evades classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from the IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
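A direct transcription of the abstract's estimator, checked on a synthetic pair where X2 drives X1 one-way (the AR coefficients below are illustrative, not from the paper):

```python
import random

def cov(a, b):
    """Sample covariance."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def liang_t21(x1, x2, dt=1.0):
    """Information flow T_{2->1} per the abstract's formula:
    (C11*C12*C2,d1 - C12^2*C1,d1) / (C11^2*C22 - C11*C12^2),
    with dX1/dt approximated by the Euler forward difference."""
    d1 = [(x1[i + 1] - x1[i]) / dt for i in range(len(x1) - 1)]
    a, b = x1[:-1], x2[:-1]
    c11, c12, c22 = cov(a, a), cov(a, b), cov(b, b)
    c2d1, c1d1 = cov(b, d1), cov(a, d1)
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

# Synthetic one-way coupling: x2 is an AR(1) driver, x1 responds to x2 with a lag.
random.seed(3)
x1, x2 = [0.0], [0.0]
for _ in range(5000):
    x2.append(0.7 * x2[-1] + random.gauss(0, 1))
    x1.append(0.5 * x1[-1] + 0.4 * x2[-2] + random.gauss(0, 1))

t21 = liang_t21(x1, x2)  # flow from the driver x2 to x1: clearly nonzero
t12 = liang_t21(x2, x1)  # flow back to the driver: near zero
print(abs(t21) > abs(t12))
```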

  10. Phase correction and error estimation in InSAR time series analysis

    Science.gov (United States)

    Zhang, Y.; Fattahi, H.; Amelung, F.

    2017-12-01

During the last decade several InSAR time series approaches have been developed in response to the non-ideal acquisition strategies of SAR satellites, such as large spatial and temporal baselines with non-regular acquisitions. The small baseline tubes and regular acquisitions of new SAR satellites such as Sentinel-1 allow us to form fully connected networks of interferograms and simplify the time series analysis into a weighted least squares inversion of an over-determined system. Such a robust inversion allows us to focus more on understanding the different components in InSAR time series and their uncertainties. We present an open-source python-based package for InSAR time series analysis, called PySAR (https://yunjunz.github.io/PySAR/), with unique functionalities for obtaining unbiased ground displacement time series, geometrical and atmospheric correction of InSAR data, and quantifying the InSAR uncertainty. Our implemented strategy contains several features, including: 1) improved spatial coverage using a coherence-based network of interferograms; 2) unwrapping error correction using phase closure or bridging; 3) tropospheric delay correction using weather models and empirical approaches; 4) DEM error correction; 5) optimal selection of the reference date and automatic outlier detection; 6) InSAR uncertainty due to the residual tropospheric delay, decorrelation and residual DEM error; and 7) the variance-covariance matrix of final products for geodetic inversion. We demonstrate the performance using SAR datasets acquired by COSMO-SkyMed, TerraSAR-X, Sentinel-1 and ALOS/ALOS-2, with application to the highly non-linear volcanic deformation in Japan and Ecuador (figure 1). Our results show precursory deformation before the 2015 eruptions of Cotopaxi volcano, with a maximum uplift of 3.4 cm on the western flank (fig. 1b) and a standard deviation of 0.9 cm (fig. 1a), supporting the finding by Morales-Rivera et al. (2017, GRL); and a post-eruptive subsidence on the same

  11. Modern Methods of Multidimensional Data Visualization: Analysis, Classification, Implementation, and Applications in Technical Systems

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2016-01-01

Full Text Available The article deals with theoretical and practical aspects of solving the problem of visualization of multidimensional data as an effective means of multivariate analysis of systems. Several classifications of visualization techniques are proposed, according to data types, visualization objects, and the method of transformation of coordinates and data. Charts with links to the relevant work are used to illustrate the classifications. The article also proposes two classifications of modern trends in display technology, including the integration of visualization techniques as one of the modern development trends, along with the introduction of interactive technologies and the dynamics of development processes. It describes some approaches to the visualization problem driven by needs arising from tasks such as information retrieval in global networks, the development of bioinformatics, the study and control of business processes, regional development, etc. The article highlights modern visualization tools that are capable of improving the efficiency of multivariate analysis and the search for solutions in multi-objective optimization of technical systems, but are not very actively used for such studies; these include horizontal graphs and quantile-quantile plots. The paper proposes to use choropleth maps, traditionally used in cartography, for the simultaneous presentation of the distribution of several criteria in parameter space. It notes that visualizations of graphs in network applications could be used more actively to describe control systems. The article suggests using heat maps to provide a graphical representation of the sensitivity of system quality criteria under variation of options (multivariate analysis of technical systems). It also mentions that it is useful to extend heat maps to the task of estimating the quality of identification in constructing system models. A

  12. Age-related macular degeneration with choroidal neovascularization in the setting of pre-existing geographic atrophy and ranibizumab treatment. Analysis of a case series and revision paper

    Directory of Open Access Journals (Sweden)

    Miguel Hage Amaro

    2012-12-01

Full Text Available PURPOSE: To report the response of choroidal neovascularization (CNV) to intravitreal ranibizumab treatment in the setting of age-related macular degeneration (AMD) with extensive pre-existing geographic atrophy (GA), together with a review of the literature. METHODS: This is a literature review and a retrospective case series of 10 eyes in nine consecutive patients from a photographic database. The patients were actively treated with ranibizumab for neovascular AMD with extensive pre-existing GA. Patients were included if they had GA at or adjacent to the foveal center that was present before the development of CNV. The best corrected visual acuity and optical coherence tomography (OCT) analysis of the central macular thickness were recorded for each visit. Serial injections of ranibizumab were administered until there was resolution of any subretinal fluid clinically or on OCT. Data over the entire follow-up period were analyzed for overall visual and OCT changes. All patients had been followed for at least 2 years since diagnosis. RESULTS: The patients received an average of 6 ± 3 intravitreal injections over the treatment period. Eight eyes had reduced retinal thickening on OCT. On average, the central macular thickness was reduced by 94 ± 101 µm. Eight eyes had improvement of one or more lines of vision, whereas one eye had dramatic vision loss and one had no change. The average treatment outcome for all patients was -0.07 ± 4.25 logMAR units, which corresponded to a gain of 0.6 ± 4.4 lines of Snellen acuity. The treatment resulted in a good anatomic response with the disappearance of the subretinal fluid, improved visual acuity, and stabilized final visual results. CONCLUSION: The results of this case series suggest that the use of an intravitreal anti-vascular endothelial growth factor (VEGF) agent (ranibizumab) for CNV in AMD with extensive pre-existing GA is effective. Our results are not as striking as published results from large-scale trials of anti

  13. Visualizing solvent mediated phase transformation behavior of carbamazepine polymorphs by principal component analysis

    DEFF Research Database (Denmark)

    Tian, Fang; Rades, Thomas; Sandler, Niklas

    2008-01-01

    The purpose of this research is to gain a greater insight into the hydrate formation processes of different carbamazepine (CBZ) anhydrate forms in aqueous suspension, where principal component analysis (PCA) was applied for data analysis. The capability of PCA to visualize and to reveal simplified...

  14. Design Improvements on Graded Insulation of Power Transformers Using Transient Electric Field Analysis and Visualization Technique

    OpenAIRE

    Yamashita, Hideo; Nakamae, Eihachiro; Namera, Akihiro; Cingoski, Vlatko; Kitamura, Hideo

    1998-01-01

This paper deals with design improvements on graded insulation of power transformers using transient electric field analysis and a visualization technique. The calculation method for transient electric field analysis inside a power transformer impressed with impulse voltage is presented: initially, a lumped electric network for the power transformer is constructed by dividing the transformer windings into several blocks and computing the electric circuit parameters.

  15. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    Science.gov (United States)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize patterns through the dissimilarity matrix based on modified cross-sample entropy; three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, that is, multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta based cross-sample entropy and permutation based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals a clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can the time series generated by different models be distinguished, those generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions, respectively, that is, Europe, North America, South America, the Asia-Pacific (with the exception of mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.

  16. Radial artery pulse waveform analysis based on curve fitting using discrete Fourier series.

    Science.gov (United States)

    Jiang, Zhixing; Zhang, David; Lu, Guangming

    2018-04-19

    Radial artery pulse diagnosis has long played an important role in traditional Chinese medicine (TCM). Because it is non-invasive and convenient, pulse diagnosis is also of great significance for disease analysis in modern medicine. Practitioners sense the pulse waveform at the patient's wrist and make diagnoses based on subjective personal experience. With advances in pulse-acquisition platforms and computerized analysis methods, objective study of pulse diagnosis can help TCM keep pace with the development of modern medicine. In this paper, we propose a new method for extracting features from the pulse waveform based on the discrete Fourier series (DFS). It regards the waveform as a signal composed of sub-components represented by sine and cosine terms of different frequencies and amplitudes. After the pulse signals are collected and preprocessed, we fit the average waveform of each sample with a discrete Fourier series by least squares. The feature vector comprises the coefficients of the fitted Fourier series. Compared with fitting using a Gaussian mixture function, the fitting errors of the proposed method are smaller, indicating that our method represents the original signal better. The classification performance of the proposed feature is also superior to other features extracted from the waveform, such as the auto-regression model and the Gaussian mixture model. The coefficients of the optimized DFS function used to fit the arterial pressure waveforms thus model the waveforms better and hold more potential information for distinguishing different psychological states. Copyright © 2018 Elsevier B.V. All rights reserved.
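    The DFS fitting step described above amounts to an ordinary least-squares fit of a truncated Fourier series. A minimal sketch under assumed details (the harmonic count, period, and synthetic waveform are illustrative; the paper's preprocessing and classifier are omitted):

```python
import numpy as np

def fit_fourier_series(t, y, n_harmonics=5, period=None):
    """Least-squares fit of a truncated Fourier series to samples (t, y).

    Returns (coef, y_fit) where coef = [a0, a1, b1, ..., aN, bN] such that
    y ~ a0 + sum_k (a_k * cos(2*pi*k*t/T) + b_k * sin(2*pi*k*t/T)).
    """
    T = period if period is not None else t[-1] - t[0]
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2.0 * np.pi * k * t / T
        cols.append(np.cos(w))
        cols.append(np.sin(w))
    A = np.column_stack(cols)                       # design matrix
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares solution
    return coef, A @ coef

# Synthetic "pulse-like" waveform built from two harmonics (illustrative data)
t = np.linspace(0.0, 1.0, 200, endpoint=False)
y = 1.0 + 0.8 * np.sin(2 * np.pi * t) + 0.3 * np.cos(4 * np.pi * t)
coef, y_fit = fit_fourier_series(t, y, n_harmonics=3, period=1.0)
```

    The coefficient vector `coef` (here of length 1 + 2 x 3 = 7) plays the role of the feature vector fed to the classifier.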

  17. Quad-Tree Visual-Calculus Analysis of Satellite Coverage

    Science.gov (United States)

    Lo, Martin W.; Hockney, George; Kwan, Bruce

    2003-01-01

    An improved method has been developed for analyzing the coverage of areas of the Earth by a constellation of radio-communication or scientific-observation satellites. The method is intended to supplant an older one in which the global-coverage-analysis problem is solved from a ground-to-satellite perspective. The present method, derived instead from a satellite-to-ground perspective, provides rapid and efficient analysis through a unique combination of two techniques for multiresolution representation of map features on the surface of a sphere.

  18. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    Science.gov (United States)

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses, and undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis with the same interactive data visualizations used by practicing scientists. We examined 22 upper-level undergraduates in a genomics course as they engaged in case-based inquiry with an interactive heat map. We analyzed students’ visual-analytic behaviors, reasoning, and outcomes, both qualitatively and quantitatively, to identify performance patterns, commonly shared efficiencies, and task completion. We also related students’ successes and difficulties in applying knowledge and skills relevant to the visual-analytics case, and the associated gaps in knowledge and skill, to specific tool designs. The findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through the tool usability improvements we identify. We also speculate on instructional considerations that our findings suggest may further enhance visual analytics in case-based modules. PMID:26877625

  19. Lake Area Analysis Using Exponential Smoothing Model and Long Time-Series Landsat Images in Wuhan, China

    Directory of Open Access Journals (Sweden)

    Gonghao Duan

    2018-01-01

    The loss of lake area significantly influences regional climate change, and this loss poses a serious and unavoidable challenge to maintaining ecological sustainability as lakes are filled in. Mapping and forecasting changes in lake area is therefore critical for protecting the environment and mitigating ecological problems in the urban district. We created an accessible map displaying area changes for 82 lakes in Wuhan city using remote sensing data in conjunction with visual interpretation, combining field data with Landsat 2/5/7/8 Thematic Mapper (TM) time-series images for the period 1987–2013. In addition, we applied a quadratic exponential smoothing model to forecast lake-area changes in Wuhan city. The map provides, for the first time, estimates of lake development in Wuhan using data suited to local-scale studies. The model predicted a lake-area reduction of 18.494 km2 in 2015. The average error reached 0.23 with a correlation coefficient of 0.98, indicating that the model is reliable. The paper provides a numerical analysis and forecasting method for a better understanding of lake-area changes. The modeling and mapping results can help assess aquatic habitat suitability and support property planning for Wuhan lakes.
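    Quadratic exponential smoothing (Brown's one-parameter method with a quadratic trend) applies three successive smoothing passes and extrapolates a second-order polynomial. A minimal sketch; the series values, smoothing constant, and horizon below are illustrative, not the paper's data:

```python
def brown_quadratic_forecast(y, alpha=0.3, m=1):
    """Brown's one-parameter quadratic exponential smoothing, forecast m steps ahead."""
    s1 = s2 = s3 = y[0]                      # initialise all three smoothers
    for v in y:
        s1 = alpha * v + (1 - alpha) * s1    # single smoothing
        s2 = alpha * s1 + (1 - alpha) * s2   # double smoothing
        s3 = alpha * s2 + (1 - alpha) * s3   # triple smoothing
    # Level, linear-trend, and quadratic-trend estimates
    a = 3 * s1 - 3 * s2 + s3
    b = (alpha / (2 * (1 - alpha) ** 2)) * (
        (6 - 5 * alpha) * s1 - (10 - 8 * alpha) * s2 + (4 - 3 * alpha) * s3)
    c = (alpha / (1 - alpha)) ** 2 * (s1 - 2 * s2 + s3)
    return a + b * m + 0.5 * c * m ** 2

# Hypothetical yearly lake-area series (km2), shrinking at an increasing rate
areas = [24.1, 23.5, 22.8, 22.0, 21.1, 20.1, 19.0]
forecast = brown_quadratic_forecast(areas, alpha=0.5, m=2)
```

    For a series declining at an accelerating rate, the quadratic term lets the forecast continue the curvature rather than just the last linear trend.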

  20. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robert A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Based on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...